Really, why are people surprised? Just because it's Apple doesn't mean the first release is going to be flawless. Shiny, maybe, but certainly not flawless.
Quoting the wiki article:
"For example, to get 1 kW of real power, if the power factor is unity, 1 kVA of apparent power needs to be transferred (1 kW ÷ 1 = 1 kVA). At low values of power factor, more apparent power needs to be transferred to get the same real power. To get 1 kW of real power at 0.2 power factor, 5 kVA of apparent power needs to be transferred (1 kW ÷ 0.2 = 5 kVA). This apparent power must be produced and transmitted to the load in the conventional fashion, and is subject to the usual distributed losses in the production and transmission processes."
A straight reading of that would seem to say that the generator is doing 5 times the work to deliver the same real power. Is this an incorrect interpretation?
From further down in the same article:
"The significance of power factor lies in the fact that utility companies supply customers with volt-amperes, but bill them for watts. Power factors below 1.0 require a utility to generate more than the minimum volt-amperes necessary to supply the real power (watts). This increases generation and transmission costs. For example, if the load power factor were as low as 0.7, the apparent power would be 1.4 times the real power used by the load. Line current in the circuit would also be 1.4 times the current required at 1.0 power factor, so the losses in the circuit would be doubled (since they are proportional to the square of the current). Alternatively all components of the system such as generators, conductors, transformers, and switchgear would be increased in size (and cost) to carry the extra current."
My experience with electrical systems is in terms of building design, transformers, and standby generator sizing, which is all about designing around the largest load.
While doing additional searches, I came across this post, which does a nice job of explaining it with a beer metaphor.
If two devices place the same VA load on a generator, the generator must produce that amount of power, regardless of the wattage rating of either device.
Please explain how that is erroneous.
Now, on the other hand, I have a CFL rated to draw 50 watts. But it has a power factor of 0.5, which means it actually pulls 100 VA from the generator.
In short, they place the exact same load on the generator and burn the exact same amount of fuel.
In any case, you can't run a generator unloaded or underloaded, so the utility has line reactors and capacitor banks to balance the difference.
If you can sit there and argue that the CFL is still somehow more efficient in terms of electrical requirements (not light output), then you obviously have no idea what you're talking about when it comes to actual electrical systems.
So, yes, VA matters very much, because your 20W CFL is drawing 40 VA, whereas a 20W incandescent is drawing 20 VA (a pure resistive load has a PF of 1).
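If it helps, that comparison is a single division. A minimal Python sketch, using the PF of 0.5 implied by the CFL numbers above (the function name is mine, just for illustration):

    def apparent_power_va(watts, power_factor):
        """VA the generator must actually supply for a given real power draw."""
        return watts / power_factor

    print(apparent_power_va(20, 0.5))  # 20 W CFL at PF 0.5       -> 40.0 VA
    print(apparent_power_va(20, 1.0))  # 20 W incandescent, PF 1  -> 20.0 VA
    print(apparent_power_va(50, 0.5))  # 50 W CFL at PF 0.5       -> 100.0 VA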
Please, if you know an EE, ask the difference between watts and VA and why power factor is a big deal.
The math:

VA = W / PF
W = VA * PF
PF = W / VA
For example, a theoretical 1 megawatt alternator provides 1000 A at 1000 V. So, you get 1,000,000 VA, or at a PF of 1 you get 1,000,000 watts. If the entire grid-connected load for this alternator had a power factor of 0.5, then the delivered, usable power would only be 500 kW, and that is the amount the meters would read for billing. The alternator still had to provide 1 MVA.

This is why large commercial users have surcharges for low power factor and discounts if they have a high power factor. Hence the move to variable frequency drives in large commercial gear. VFD systems allow them to bring the power factor of the motors much closer to unity.
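To put that same hypothetical alternator into code (the numbers are the ones above; the function name is mine):

    # Hypothetical 1 MVA alternator: 1000 V at 1000 A.
    capacity_va = 1000 * 1000  # 1,000,000 VA the machine must provide

    def billable_watts(apparent_va, power_factor):
        """Real power (W) delivered, which is what the meter bills for."""
        return apparent_va * power_factor

    print(billable_watts(capacity_va, 1.0))  # 1,000,000 W: full output is billable
    print(billable_watts(capacity_va, 0.5))  # 500,000 W: same 1 MVA supplied, half the billing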
So, right now a CFL may have an internal efficiency of 50+%, but the power factor is so bad that it's not much better than an incandescent when you look at the actual grid load.
Of the links provided by the GP, this one is probably the best, if long.
Similarly fully spec'd machines from Apple and Dell are around $1,000 apart:
Dell Precision T7400 vs. Apple Mac Pro:
2x Quad Core Xeon 3.2 GHz
32 GB of RAM
4x 1 TB HDD
NVIDIA Quadro FX 5600 (1.5 GB)
No other addons
3-year AppleCare or Dell 3-year 24x7 4-hour on-site support
Yes, there is an Apple premium. Always has been, and always will be.
But honestly, only someone who has no concerns about final build cost buys extra memory and hard drives from Apple (or even Dell, to be honest), since companies tend to charge some fairly heavy markups on those upgrades. As a note, both charged the same price for the Quadro: $2,850.