0.9 is the historical capacity factor for nuclear in the U.S. Yeah, nuclear really is best for base load. It's slow to ramp up or down, so it's not very good for peaking load (the hourly and instantaneous spikes and dips in overall grid consumption). Peaking load is usually handled by hydro and gas plants, sometimes coal.
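To make the capacity-factor point concrete, here's a quick back-of-the-envelope sketch. The 0.9 figure is the one cited above; the 1 GW plant size is just an illustrative assumption:

```python
# Back-of-the-envelope: what a capacity factor means for actual output.
# Capacity factor = energy actually delivered / energy if running flat-out.

nameplate_gw = 1.0      # hypothetical 1 GW plant (illustrative assumption)
capacity_factor = 0.9   # historical U.S. nuclear figure cited above
hours_per_year = 8760

annual_gwh = nameplate_gw * hours_per_year * capacity_factor
print(f"Annual output: {annual_gwh:.0f} GWh")  # 7884 GWh, vs 8760 GWh at 100%
```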
The difference is that nuclear power proponents do not advocate making 100% of power generation nuclear. They are fine with using hydro, gas, wind, and solar to handle peaking load. Renewable power proponents, OTOH, advocate that 100% of our power come from hydro, wind, and solar, even though none of them is suitable for base load. Geothermal was really the only viable renewable for base load, but it has become collateral damage in environmentalists' war against fracking. (The energy of earthquakes triggered by fracking was already in the earth; if fracking hadn't released it, it would've been released in a natural earthquake at some point in the future. But in their zeal to shut down fracking for oil, by incorrectly blaming fracking for all the energy released in a quake, they've poisoned public perception to the point that geothermal would be blamed for earthquakes too.)
As for rates for different power sources, wind is getting close to nuclear, but solar is still nowhere near. And as mentioned above, neither is suitable for base load. Most of the articles I've read proclaiming that renewables will overtake nuclear and fossil fuels in cost mistakenly omit capacity factor from the comparison. They wind up comparing peak generating capacity, which has very little to do with rates. Theoretically you could use renewables for base load if you had sufficient storage capacity. But the most efficient storage system (pumped storage) is only about 75% efficient, so power delivered from storage automatically costs at least 1/0.75 ≈ 1.33x as much as its source.
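The storage and capacity-factor arithmetic can be spelled out. This sketch uses the 75% pumped-storage round-trip efficiency and the 0.9 nuclear capacity factor from the text; the wind and solar capacity factors are rough illustrative assumptions, not measured values:

```python
# Why round-trip storage efficiency inflates cost: for every 1 kWh delivered
# from storage, 1/efficiency kWh must be generated to charge it.
round_trip_efficiency = 0.75  # pumped storage, per the text
cost_multiplier = 1 / round_trip_efficiency
print(f"Cost multiplier: {cost_multiplier:.2f}x")  # ~1.33x the source's rate

# Separately: comparing sources by peak (nameplate) capacity ignores how
# often they actually run. Dividing nameplate by capacity factor shows how
# much capacity you'd have to build to match a given average output.
capacity_factors = {
    "nuclear": 0.9,   # historical U.S. figure from the text
    "wind": 0.35,     # rough illustrative assumption
    "solar": 0.25,    # rough illustrative assumption
}
target_avg_gw = 1.0
for source, cf in capacity_factors.items():
    needed = target_avg_gw / cf
    print(f"{source}: need {needed:.2f} GW nameplate for 1 GW average output")
```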