I should not have used "real power" earlier for VA; I did screw that up. Watts are real power, VA is apparent power.
Quoting the wiki article:
"For example, to get 1 kW of real power, if the power factor is unity, 1 kVA of apparent power needs to be transferred (1 kW ÷ 1 = 1 kVA). At low values of power factor, more apparent power needs to be transferred to get the same real power. To get 1 kW of real power at 0.2 power factor, 5 kVA of apparent power needs to be transferred (1 kW ÷ 0.2 = 5 kVA). This apparent power must be produced and transmitted to the load in the conventional fashion, and is subject to the usual distributed losses in the production and transmission processes."
A straight reading of that would seem to say that the generator is doing 5 times the work to deliver the same real power. Is this an incorrect interpretation?
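The arithmetic in the quote is just S = P ÷ PF, which is easy to sanity-check. A minimal sketch (function name is mine, not from the article):

```python
# Apparent power (kVA) needed to deliver a given real power (kW)
# at a given power factor: S = P / PF
def apparent_power_kva(real_power_kw, power_factor):
    return real_power_kw / power_factor

for pf in (1.0, 0.7, 0.2):
    s = apparent_power_kva(1.0, pf)
    print(f"PF {pf}: {s:.1f} kVA for 1 kW")  # 1.0, 1.4, 5.0 kVA respectively
```

This reproduces the article's numbers: unity power factor needs 1 kVA per kW, and 0.2 power factor needs 5 kVA per kW.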
From further down in the same article:
"The significance of power factor lies in the fact that utility companies supply customers with volt-amperes, but bill them for watts. Power factors below 1.0 require a utility to generate more than the minimum volt-amperes necessary to supply the real power (watts). This increases generation and transmission costs. For example, if the load power factor were as low as 0.7, the apparent power would be 1.4 times the real power used by the load. Line current in the circuit would also be 1.4 times the current required at 1.0 power factor, so the losses in the circuit would be doubled (since they are proportional to the square of the current). Alternatively all components of the system such as generators, conductors, transformers, and switchgear would be increased in size (and cost) to carry the extra current."
My experience with electrical systems is in terms of building design, transformers, and standby generator sizing, which is all about designing around the largest load.
While doing additional searches, I came across this post, which does a nice job of explaining it with a beer metaphor.