Forward voltage at rated current is 450 volts. Even at 30 kV that's 1.5% of throughput -- some serious loss. The specified risetime of 10 ns into a resistive load isn't bad, but the falltime isn't specified, and the interesting loads are all inductive -- falltime into those is tricky because of snubbing losses and Miller capacitance.
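For scale, here's the back-of-the-envelope arithmetic behind that 1.5% figure (the gigawatt throughput is just an illustrative number, carried down from the plant-scale discussion below):

```python
# Conduction loss implied by a 450 V forward drop on a 30 kV line.
# Loss fraction is just V_forward / V_line, since both see the same current.
V_FORWARD = 450.0       # V, forward drop at rated current (from the spec)
V_LINE = 30_000.0       # V, line voltage

loss_fraction = V_FORWARD / V_LINE
print(f"conduction loss: {loss_fraction:.1%} of throughput")   # 1.5%

# At plant scale that small fraction is an enormous absolute heat load:
P_THROUGHPUT = 1e9      # W, one gigawatt (illustrative)
print(f"dissipation at 1 GW: {loss_fraction * P_THROUGHPUT / 1e6:.0f} MW")  # 15 MW
```

And that's conduction loss alone, before switching losses even enter the picture.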
Others are rather less precisely specified, but generally similar.
Rather more to the point, though, is that they don't get you usable voltage conversion. You still need a transformer, so the semiconductor losses come on top of the transformer losses. And all of that lovely high-frequency switching causes problems when you're dealing with transformer cores weighing tons -- which you need in order to keep the Q of the transformer up (inductive loss is pretty much a pure function of how much copper you're willing to pay for).
The loss of efficiency is acceptable for applications like PC power supplies or lighting ballasts, because the added functionality such as flexible regulation makes up for it. But when you're looking to handle the output of gigawatt power plants, you really don't want to dissipate several percent of your output (pure loss) into a solid-state system that has to be kept below 70 degrees at peak load -- which around here means with an ambient temperature close to 50 degrees. That's, for one, a big direct cost for the inefficiency. Also a honking enormous cooling system prone to catastrophic failure through thermal runaway. And, finally, a maintenance nightmare. What's the service MTBF of one of those switches? Now figure it for an array capable of handling a gigawatt. And don't forget that you can't just take the system down for safe maintenance.
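To see why the cooling system ends up "honking enormous", work the temperature figures through (a rough sketch; the 15 MW heat load is the illustrative 1.5% of a gigawatt, not a measured number):

```python
# Rough sizing of the cooling problem from the temperatures above.
T_MAX = 70.0        # degC, maximum allowed device temperature
T_AMBIENT = 50.0    # degC, peak local ambient
HEAT_LOAD = 15e6    # W, heat to remove (illustrative: ~1.5% of 1 GW)

margin = T_MAX - T_AMBIENT           # only 20 degC to work with
r_thermal = margin / HEAT_LOAD       # required array-to-ambient thermal resistance, K/W

print(f"thermal margin: {margin:.0f} degC")
print(f"required thermal resistance: {r_thermal*1e6:.2f} microK/W for the whole array")
```

A thermal resistance measured in microkelvins per watt means active cooling on an industrial-plant scale, and any cooling failure drives the devices hotter, lowering their ratings further -- the thermal-runaway path mentioned above.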
Much as I love transistors, this isn't happening in my lifetime.