I managed to grill one of them on that. The "by law" part is bullshit. It's a store policy, adopted after they allegedly burned out a customer's ECU doing a code read. So someone higher up made it policy not to perform that service anymore.

News at 11.

Really, why are people surprised? Just because it's Apple doesn't mean first release is going to be flawless. Shiny maybe, but certainly not flawless.

That's a very large unit if the running load is 10 kW. Just grabbing submittal data for an old Carrier heat pump (Model 50JS, 10 SEER; end of life): a 5 ton unit with no backup electric heat requires a 20 amp max breaker at 460 volts (commercial 3 phase power). So that puts a cap on it at 9.2 kW, and that's an overload condition. Its normal run load amps rating is 12 amps, for a normal operating power of 5.5 kW. If we use a unit designed for 230 volts (3 phase commercial), the run load amps rating is 27.1, so the normal operating power works out to just over 6.2 kW. At 10 kW you're talking about a heat pump between 7.5 and 9 tons sensible capacity (depending on efficiency rating and operating voltage).
In residential applications a 5 ton unit is sufficient to cool a 2500 to 3000 square foot house as a single zone (this does depend on climate zone and location, but is a good generalization); often a 3 ton unit will be installed and the house split into two 1500 square foot zones. A residential 5 ton split system (Carrier 24ABA060 and Carrier FC4CNC060), operating at 230 volts single phase has a run load amps of 33 (compressor and indoor unit combined) giving us a normal operating power just under 7.6 kW.
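A minimal sketch of the run-power arithmetic, using the residential single-phase figures above (the helper names are mine; note that a full balanced three-phase calculation would also include a √3 factor):

```python
import math

def single_phase_watts(volts, amps):
    # P = V * I for a single-phase load
    return volts * amps

def three_phase_watts(volts, amps):
    # P = sqrt(3) * V * I for a balanced three-phase load
    return math.sqrt(3) * volts * amps

# Residential 5 ton split system (24ABA060 + FC4CNC060),
# 33 run load amps at 230 V single phase:
p = single_phase_watts(230, 33)
print(f"{p / 1000:.2f} kW")  # just under 7.6 kW, as stated above
```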

A modern 2.5" drive @ 7200 RPM has an idle power usage of 1-2 watts and a seek (not peak) power usage of approximately 2-3 watts. Read/write power usage is also approximately 2-3 watts.
Most optical drives are rated at a minimum of 1 amp @ 5 volts, so that's 5 watts, nearly twice the high-end figure for a hard disk.
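As rough arithmetic (the figures are the ones quoted above):

```python
# Power figures from above, in watts
hdd_idle_max = 2        # 2.5" 7200 RPM drive, idle
hdd_active_max = 3      # seek / read-write, high end
optical_watts = 1 * 5   # 1 A @ 5 V minimum rating

ratio = optical_watts / hdd_active_max
print(f"optical vs. HDD active: {ratio:.2f}x")  # ~1.67x, "nearly twice"
```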

Since it's orbiting I expect that it has a blackout period similar to that encountered by the Apollo spacecraft. Makes sense that it would have as fast a link as possible to offload data before the next blackout period.

Local Safety Nazis must be involved; I bet someone found the MSDS for vacuum.

I should not have used "real power" earlier for VA, I did screw that up. Watts is real power, VA is apparent power.

Quoting the wiki article:

"For example, to get 1 kW of real power, if the power factor is unity, 1 kVA of apparent power needs to be transferred (1 kW ÷ 1 = 1 kVA). At low values of power factor, more apparent power needs to be transferred to get the same real power. To get 1 kW of real power at 0.2 power factor, 5 kVA of apparent power needs to be transferred (1 kW ÷ 0.2 = 5 kVA). This apparent power must be produced and transmitted to the load in the conventional fashion, and is subject to the usual distributed losses in the production and transmission processes."

A straight reading of that would seem to say that the generator is doing 5 times the work to deliver the same real power. Is this an incorrect interpretation?

From further down in the same article:

"The significance of power factor lies in the fact that utility companies supply customers with volt-amperes, but bill them for watts. Power factors below 1.0 require a utility to generate more than the minimum volt-amperes necessary to supply the real power (watts). This increases generation and transmission costs. For example, if the load power factor were as low as 0.7, the apparent power would be 1.4 times the real power used by the load. Line current in the circuit would also be 1.4 times the current required at 1.0 power factor, so the losses in the circuit would be doubled (since they are proportional to the square of the current). Alternatively all components of the system such as generators, conductors, transformers, and switchgear would be increased in size (and cost) to carry the extra current."

My experience with electrical systems is in terms of building design, transformers, and standby generator sizing, which is all about designing around the largest load.

While doing additional searches, I came across this post which does a nice job of explaining it as a beer metaphor.
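The relationships in the quoted passages are easy to check numerically; a minimal sketch (the function name is mine):

```python
def apparent_power_kva(real_kw, pf):
    # Apparent power needed to deliver a given real power at power factor pf
    return real_kw / pf

assert apparent_power_kva(1, 1.0) == 1   # 1 kW at unity PF -> 1 kVA
assert apparent_power_kva(1, 0.2) == 5   # 1 kW at 0.2 PF -> 5 kVA

# At PF 0.7, line current scales as 1/PF and resistive (I^2 * R)
# losses as the square of the current:
pf = 0.7
current_ratio = 1 / pf            # ~1.43x the current at unity PF
loss_ratio = current_ratio ** 2   # ~2.04x the losses, i.e. roughly doubled
print(f"current x{current_ratio:.2f}, losses x{loss_ratio:.2f}")
```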

You must have missed the part of my first post where I said watts are not VA. Perhaps you also missed where I posted the formula: watts = VA × power factor?

If two devices place the same VA loading on a generator, the generator must produce that amount of power, regardless of the wattage rating of the device.

Please explain how that is erroneous?

Here is (again) a real example: I have a 100W incandescent bulb. Because it is a pure resistive load, it has a power factor of 1. This means that the 100W = 100VA.

Now, on the other hand, I have a CFL rated to draw 50 watts. But it has a power factor of .5, which means it still requires 100VA supply (50 / .5 = 100).
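That arithmetic as a quick sketch (the wattages and the 0.5 power factor are the assumptions above):

```python
def volt_amps(watts, power_factor):
    # Apparent power a device presents to the supply
    return watts / power_factor

incandescent_va = volt_amps(100, 1.0)  # pure resistive load, PF = 1
cfl_va = volt_amps(50, 0.5)            # assumed PF of 0.5

# Both present the same 100 VA apparent load to the supply
assert incandescent_va == cfl_va == 100
print(incandescent_va, cfl_va)
```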

In short, they require the exact same fuel supply from the generator and require the exact same fuel consumption.

In any case, you can't run a generator unloaded or underloaded, so the utility has line reactors and capacitor buffers to balance the difference.

If you can sit there and try to argue that the CFL is still somehow more efficient in terms of electrical requirements (not light output) then you obviously have no idea what you're talking about when it comes to actual electrical systems.

VA is not "peak current." VA is the apparent power; watts is the real, _usable_ power. With a PF of .5 you burn the same amount of fuel to deliver half the usable power. How is that NOT an efficiency issue?

So, yes, VA matters very much, because your 20 W CFL is drawing 40 VA, whereas a 20 W incandescent is drawing 20 VA (a pure resistive load has a PF of 1).

Please, if you know an EE, ask the difference between watts and VA and why power factor is a big deal.

A 20 W device with a PF of .5 will draw 40 VA. Volt-amps is the rating used for generating and distribution equipment (transformers, line reactors, switching stations, etc.). Volt-amps is the raw apparent power produced, whereas watts is the delivered, usable power. This seems to be a common misunderstanding when people start throwing around watts, volt-amps, and power factor.

The math: VA = W / PF; W = VA * PF; PF = W / VA

For example, a theoretical 1 Megawatt alternator provides 1000A at 1000V. So, you get 1,000,000 VA, or at a PF of 1 you get 1,000,000 watts. If the entire grid connected load for this alternator had a power factor of 0.5, then the delivered, usable power would only be 500kW, and that is the amount the meters would read for billing. The alternator still had to provide 1MVA. This is why large commercial users have surcharges for low power factor and discounts if they have a high power factor. Hence the move to variable frequency drives in large commercial gear. VFD systems allow them to bring the power factor of the motors much closer to unity.
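The alternator example as a quick calculation (the numbers are the ones used above):

```python
# Theoretical 1 MVA alternator: 1000 A at 1000 V
volts, amps = 1000, 1000
apparent_va = volts * amps       # what the alternator must provide
pf = 0.5                         # grid-wide power factor assumed above
billed_watts = apparent_va * pf  # real, metered power (W = VA * PF)

assert apparent_va == 1_000_000
assert billed_watts == 500_000   # only 500 kW shows up on the meters
```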

So, right now a CFL may have an internal efficiency of 50+%, but the power factor is so bad that they are not much better than incandescent when you look at the actual grid load.

Power Factor

Of the links provided by the GP, this one is probably the best, if long.

Similarly, fully spec'd machines from Apple and Dell are around $1,000 apart:

Dell Precision T7400 vs. Apple Mac Pro:

2x Quad Core Xeon 3.2 GHz

32 GB of RAM

4x 1 TB HDD

DVDRW drive

nVidia 1.5 GB Quadro 5600

No monitors

No other addons

3-year AppleCare, or Dell 3-year 4-hour on-site 24x7

Dell: $17,231

Apple: $18,248

Yes, there is an Apple premium. Always has been, and always will be.

But honestly, only someone with no concerns about final build cost buys extra memory and hard drives from Apple (or even Dell, to be honest), since companies tend to charge some fairly heavy markups on those upgrades. As a note, both charged the same price for the Quadro: $2,850.

/usr/news/gotcha