Volt-Amps vs Watts?

RossGr

Diamond Member
Jan 11, 2000
3,383
1
0
What is the significance of Volt-Amp ratings? How do they differ from Watts?
 

Fandu

Golden Member
Oct 9, 1999
1,341
0
0
VA ratings take into account the power factor of a load. Power factor comes from the difference in phase between the voltage waveform and the current waveform. In a purely resistive load, the current draw is exactly in phase with the voltage, the power factor is 1, and VA is equivalent to Watts. However, if you're dealing with a capacitive or inductive load, the current draw is not in phase with the voltage waveform; the power factor is the cosine of that phase difference. If you multiply the VA of a device by its power factor, you get Watts.

VA is used because with large loads that are not completely resistive, a rating in watts alone can be deceiving about how much current the device actually draws. VA is the 'apparent' power: voltage times current, regardless of phase. If you purchased a large motor, for example, say it's rated at 1 kVA and 0.9 PF. It consumes 900 watts of real power, but it draws as much current as a 1000-watt resistive load would, because of the current lag (or lead, depending on whether the load is inductive or capacitive).

Does that make any sense?
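As a quick sanity check of that relationship (real power is the VA rating multiplied by the power factor), here's a short Python sketch using the 1 kVA / 0.9 PF motor above as the example:

```python
# Quick numeric check: watts = volt-amps x power factor
# (example figures: a motor rated 1 kVA at 0.9 PF).
apparent_power_va = 1000.0   # volt-amps (VA)
power_factor = 0.9           # cosine of the phase angle

real_power_w = apparent_power_va * power_factor
print(round(real_power_w, 1))   # 900.0
```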
 

arcas

Platinum Member
Apr 10, 2001
2,155
2
0
Depending on the load, they're either the same or they're different. :p

If you have a pure resistive load...say a regular light bulb or heater coil...then Watts = Volt Amps. If you have a load that consists of magnetic coils...motors, transformers...then Watts != Volt Amps.

When you're comparing V-A and Watts, you have to consider what's called "Power Factor", which is the ratio of real power (the actual work performed, in watts) to apparent power (volts times amps). That is, it kind of measures how much of the power you pull from the wall does useful work. So in the case of a heater coil, the "Power Factor" or ratio between real and apparent power is 1.0. There's no waste: all the input power is converted to heat. The PF for inductive loads like motors and transformers is always (someone correct me here) less than 1.0. In the case of a motor, the PF will vary depending on the motor's load. An unloaded motor might have a PF < 0.5, meaning that although you might be pulling 100 VA from the wall, you're only getting 50 W worth of work; the rest goes towards maintaining the magnetic fields and such. A fully-loaded motor will have a PF close to 1.0 (but still < 1).

Personal computer power supplies generally have lousy PFs. According to apc.com, they come in at around 0.65, which means that a 1400 VA UPS can only handle around 900 W worth of computers with these low-PF power supplies.
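That UPS arithmetic can be checked in one line (the 0.65 figure is the value quoted above from apc.com, not a universal constant):

```python
# UPS sizing sketch: supported real load = VA rating x load power factor.
ups_rating_va = 1400.0   # UPS apparent-power rating
assumed_pf = 0.65        # low PF typical of uncorrected PC supplies (per apc.com)

supported_watts = ups_rating_va * assumed_pf
print(round(supported_watts))   # 910, i.e. "around 900W"
```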


 

PowerEngineer

Diamond Member
Oct 22, 2001
3,598
774
136
The explanations given are pretty much on target, but let me clarify a couple of points.

When we talk about voltages and currents in an AC system, we use their RMS (root-mean-square) values, which for a sine wave are the peak value divided by the square root of two. The RMS values are used because multiplying the RMS voltage by the RMS current gives you the average watts. By average here, I mean the average power delivered over a complete cycle. At each instant, the actual power is equal to the actual voltage times the actual current. Assuming the voltage and current are in phase, the resulting plot of instantaneous power looks like a rectified sine wave. Again, the average value of that rectified sine wave is equal to the RMS voltage times the RMS current. Pretty similar to simple DC circuits so far...
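Those RMS relationships are easy to verify numerically by sampling one full cycle of in-phase sine waves (the peak values below are arbitrary examples, roughly a 120 V RMS mains waveform):

```python
import math

# Sample one full cycle of in-phase sinusoidal voltage and current.
V_PEAK, I_PEAK = 170.0, 10.0   # example peak values
N = 100_000                    # samples per cycle

v = [V_PEAK * math.sin(2 * math.pi * k / N) for k in range(N)]
i = [I_PEAK * math.sin(2 * math.pi * k / N) for k in range(N)]

v_rms = math.sqrt(sum(x * x for x in v) / N)   # ~ V_PEAK / sqrt(2)
i_rms = math.sqrt(sum(x * x for x in i) / N)
p_avg = sum(a * b for a, b in zip(v, i)) / N   # average instantaneous power

print(round(v_rms, 1), round(V_PEAK / math.sqrt(2), 1))   # both ~ 120.2
print(round(p_avg, 1), round(v_rms * i_rms, 1))           # both ~ 850.0
```

The averaged instantaneous power lands exactly on RMS volts times RMS amps, as described above.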

Now if you introduce a phase shift of 90 degrees between the current and voltage waves (either way; it doesn't matter), then you'll see something interesting. The instantaneous power values will be positive for one half of the cycle and negative for the other half of the cycle, making the average power delivery zero! Phase shifts between 0 and 90 degrees will show more positive power and less negative power, so that the average power delivery is positive but less than what it would be at 0 degrees. So we need to take the phase angle between the current and voltage waves into account when calculating the actual power delivered. The equation becomes RMS voltage times RMS current times the cosine of the angle between them. And the value of the cosine is also called the "power factor".
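The cosine relationship can be confirmed numerically by shifting the sampled current wave (RMS values here are arbitrary examples):

```python
import math

# Numeric check of: watts = RMS voltage x RMS current x cos(angle).
V_RMS, I_RMS = 120.0, 5.0   # example RMS values
N = 100_000                 # samples per cycle

def avg_power(theta_deg):
    """Average of instantaneous power v(t)*i(t), current lagging by theta."""
    theta = math.radians(theta_deg)
    total = 0.0
    for k in range(N):
        wt = 2 * math.pi * k / N
        v = math.sqrt(2) * V_RMS * math.sin(wt)
        i = math.sqrt(2) * I_RMS * math.sin(wt - theta)
        total += v * i
    return total / N

for deg in (0, 45, 90):
    expected = V_RMS * I_RMS * math.cos(math.radians(deg))
    print(deg, round(avg_power(deg), 2), round(expected, 2))
```

At 0 degrees both columns give the full 600 W; at 90 degrees the simulated average is (numerically) zero, matching the positive-half/negative-half cancellation described above.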

Of course, this is important in determining how much (RMS) current will actually be needed to deliver a desired amount of power to a load that has significant inductive characteristics (such as electric motors), and therefore in anticipating voltage drops and in selecting transformer and wire sizes. Conductors and transformers are essentially capable of carrying a certain amount of current regardless of the power that current ends up delivering to a load. Transformers usually have a so-called kVA (kilo-volt-amp) rating, which is just their rated RMS voltage times their rated RMS current.

So, using Fandu's numbers, a 1 KVA load with a 0.9 PF would actually draw just 900 watts of power (on average over each cycle). The difference between the 1000 volt-amps and the 900 watts does NOT mean that 100 watts is "lost" somewhere. It does mean that the motor is pulling more current than the minimum that would be required to deliver that same 900 watts to a purely resistive load (where the PF = 1), and that means some loss in efficiency due to greater resistive losses in the conductors and higher voltage drops. As Arcas says, this difference between volt-amps and watts is being used to build up and drain off the magnetic fields from inductances (and electric fields from capacitances) every cycle; real power is being taken and stored during one half of the cycle but then delivered back during the next half cycle -- making the average power delivered by this "extra" current equal to zero.

This may be more than anyone wants to know, but another way of talking about this difference between volt-amps and watts is to introduce the concept of "VARs" (Volt-Amps-Reactive). The calculation for VARs is RMS voltage times RMS current times the sine of the phase angle between them. Flows on electrical power lines are most often measured in megawatts and megavars.
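The watts/VARs split is just right-triangle trigonometry on the phase angle; a small Python sketch using the earlier 1 kVA / 0.9 PF example:

```python
import math

# "Power triangle" arithmetic for the VAR concept above,
# using the earlier example of a 1 kVA load at 0.9 power factor.
s_va = 1000.0            # apparent power (volt-amps)
pf = 0.9                 # power factor = cos(theta)
theta = math.acos(pf)    # phase angle between voltage and current

p_watts = s_va * math.cos(theta)   # real power (watts)
q_vars = s_va * math.sin(theta)    # reactive power (VARs)

print(round(p_watts, 1))   # 900.0
print(round(q_vars, 1))    # 435.9
# The triangle closes: sqrt(P^2 + Q^2) recovers the apparent power.
print(round(math.hypot(p_watts, q_vars), 1))   # 1000.0
```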

One more aside: customer meters only measure the average watts delivered; typical residential customers are not charged more because of the larger current needed to deliver power to reactive loads (i.e. not charged for VARs).

I hope this answers the questions even more completely. :)
 

RossGr

Diamond Member
Jan 11, 2000
3,383
1
0
An addendum to this question. Apparently, from the apc link mentioned above, the PC power supply is capacitive. Assuming that a standard monitor is inductive (flyback transformer), is there some natural load balancing with the pair?
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
PC power supplies are inherently a capacitive load, but every PSU I've seen has had power factor correction (usually a big inductor in series with the PSU), although some use a sophisticated active system. Where I've had a chance to measure the PF, it's been well above 0.95.

Modern CRTs are also capacitive: they are powered from an SMPS (just like the one in your PC), and sometimes these are power factor corrected. Older ones did indeed present an inductive load to the mains, often the flyback transformer, but an SMPS and a subsequently smaller flyback transformer are now the preferred method.
 

TurboMan

Member
Feb 17, 2002
31
0
0
The instantaneous power values will be positive for one half of the cycle and negative for the other half of the cycle, making the average power delivery zero! Phase shifts between 0 and 90 degrees will show more positive power and less negative power, so that the average power delivery is positive but less than what it would be at 0 degrees.
I think this is what they (hackers) use to slow their power meters down. I bought a book with schematics for doing this but haven't had time to really study it yet.