>> Voltage multiplied by current in Amps equals Watts.
NO. For God's sake will people stop making this mistake.
Voltage multiplied by current in Amps equals VA, not Watts. If you want Watts, you have to multiply Voltage in Volts, Current in Amps, and the cosine of the angle between them (which is more commonly known as the power factor).
VA = V*A
Watts = V*A*PF
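A quick numeric sketch of those two formulas, using assumed example values (120 V RMS, 5 A RMS, current lagging by 30 degrees - none of these numbers come from the thread):

```python
import math

# Assumed example: 120 V RMS supply, 5 A RMS load,
# current lagging the voltage by 30 degrees.
v_rms = 120.0
i_rms = 5.0
phi = math.radians(30)

va = v_rms * i_rms    # apparent power in volt-amperes: VA = V*A
pf = math.cos(phi)    # power factor: cosine of the angle between V and I
watts = va * pf       # real power: Watts = V*A*PF

print(round(va, 1))     # 600.0
print(round(pf, 3))     # 0.866
print(round(watts, 1))  # 519.6
```

So a meter reading VA would show 600, while the load only dissipates about 520 W.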
No, Watts really is Voltage times Current - instantaneously, power is just v(t) times i(t). But when referring to AC systems, definitions get all screwed up. Just look at "kWh" - what a mess. It's like electricians have their own definitions for these units. I suppose it is understandable - using a single number to approximate a waveform and then performing calculations using Ohm's Law makes most tasks much easier.
So pointing out the difference between Watts and VA is good - thanks for that. But don't be calling the real definition for Watts wrong. Also, your definition for power factor is not correct - or at least it is dated. It only applies to AC systems where the current waveform is phase-shifted relative to the voltage. Power factor also applies to waveforms that are distorted in other ways. For example, a computer power supply without power factor correction draws pulses of current around the peaks of the sine wave. This changes the shape of the current waveform without any phase shift. With power factor correction, a control circuit draws current throughout the entire waveform so that it stays close to a sine.
I wonder what they used to measure power usage for this test. Did the instrument record true RMS power? Those instruments are much more expensive, but they're required for accurate results on non-sinusoidal loads. Guess I should rtfa.
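For what it's worth, here's why true-RMS instruments matter. A cheap average-responding meter rectifies the signal and scales the mean by the form factor of a sine (pi / 2*sqrt(2), about 1.111), which is only correct for sine waves. A quick sketch on voltage (same idea applies to power readings):

```python
import math

N = 100000
sine = [math.sin(2 * math.pi * n / N) for n in range(N)]
square = [1.0 if n < N // 2 else -1.0 for n in range(N)]

def true_rms(xs):
    # Actual root-mean-square of the samples.
    return math.sqrt(sum(x * x for x in xs) / len(xs))

def avg_responding(xs):
    # Average-responding meter: rectified mean scaled by the
    # form factor of a sine wave (pi / (2*sqrt(2)) ~ 1.111).
    return (sum(abs(x) for x in xs) / len(xs)) * math.pi / (2 * math.sqrt(2))

# The two methods agree on a sine wave...
print(round(true_rms(sine), 3), round(avg_responding(sine), 3))      # 0.707 0.707
# ...but the cheap meter reads 11% high on a square wave.
print(round(true_rms(square), 3), round(avg_responding(square), 3))  # 1.0 1.111
```

With the pulsed currents a non-PFC power supply draws, the error goes the other way and can be much larger, which is why the article's numbers are only trustworthy if they used a true-RMS instrument.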