
Power Factor - what happens to the other percentage?

Maverick2002

Diamond Member
After reading through this: http://en.wikipedia.org/wiki/Power_factor

I still don't understand what happens to the "rest" of the power. That is, the difference between the actual power and the apparent power ... where does it "go"? Does it go back through the grid? Is it given off as heat? Or am I not understanding this correctly?
 
Back to the grid, I think.

Due to energy stored in the load and returned to the source, or due to a non-linear load that distorts the wave shape of the current drawn from the source, the apparent power can be greater than the real power.
 
Even as an EE in school they never really explained this to me. Basically, you want your PF to be as close to one as possible. Most of the time in school we did this by adding in capacitor banks, which lower the phase angle of the load and bring the PF closer to 1. The way I imagine the "imaginary" power (called Q) is almost like a backwards-flowing current in the system. It just causes losses in the power that actually shows up. It's not really given off as heat per se; it's not really a physical thing. Somebody can correct me if I am wrong though, because I would like to know myself. Hope I helped a little.
 
You pay in kWh (kilowatt-hours), so you are paying for the real power you get. Apparent power ("S") is measured in kVA (which is how transformers are rated). If you are curious, the imaginary power ("Q") is measured in kVAR (kilovolt-amps reactive).
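These three quantities form the classic power triangle: S² = P² + Q², and the power factor is P/S. A quick sketch (the 80 kW / 60 kVAR figures are purely illustrative):

```python
import math

def power_triangle(p_kw, q_kvar):
    """Given real power P (kW) and reactive power Q (kVAR),
    return apparent power S (kVA) and the power factor P/S."""
    s_kva = math.hypot(p_kw, q_kvar)  # S = sqrt(P^2 + Q^2)
    pf = p_kw / s_kva
    return s_kva, pf

# Example: 80 kW real, 60 kVAR reactive
s, pf = power_triangle(80, 60)
print(s, pf)  # 100 kVA apparent, power factor 0.8
```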
 
Originally posted by: Maverick2002
Hmm.. maybe I'm not understanding this correctly and "apparent power" is really just some theoretical number that "doesn't exist", whereas the actual power is what's used and what you pay for. I have no clue though really.

The way to think about it is as follows:

An ideal device takes energy from the grid as it uses it - this means that the current is exactly proportional to the voltage.

Some devices, however, store energy within themselves - e.g. electric motors, capacitors, electronics. In effect, they draw too much power from the grid at one point, get full, and then stop drawing power for a bit. The problem is that grid stress is proportional to I^2, so taking 2x the current for half the time is more stressful than taking 1x the current all the time. In this case, taking 2x the current for half the time causes as much stress as taking about 1.4x the current all the time - so the power factor would be about 0.7.

In extreme cases, like an idling electric motor, or a big capacitor connected directly to AC, so much energy gets stored in the device, and so little gets used, that virtually all the energy gets given back to the grid in the next half-cycle of AC. As a result, these devices take a lot of current but use no energy - they 'borrow' a lot of energy, then repay it 1/120 second later. The grid has to work hard to deliver the power, then haul it away again - but no energy is actually consumed.
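The "2x current for half the time" point can be checked numerically: heating stress tracks the RMS (root mean square) of the current, not its average:

```python
import math

# "2x the current for half the time" vs "1x the current all the time":
# heating stress in the grid goes as the RMS of the current.
samples_bursty = [2.0, 0.0]   # 2x current half the time, 0 the rest
samples_steady = [1.0, 1.0]   # 1x current all the time

def rms(samples):
    return math.sqrt(sum(i * i for i in samples) / len(samples))

print(rms(samples_steady))                        # 1.0
print(rms(samples_bursty))                        # ~1.414, as stressful as a steady 1.4x
print(rms(samples_steady) / rms(samples_bursty))  # ~0.707, the power factor
```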
 
Thanks for the explanation. How does it make the grid "work harder" though? Does more energy get used somewhere? I guess I'm not really seeing the benefit of a high PFC quite yet.
 
i think what he's saying is that you're not using any real power, but the power grid still has to work hard to provide this "reactive power," due to your device with poor power factor.
 
Maverick2002:

Did you consider moving this thread to the "Highly Technical" Forum? While the inputs here are certainly good, there are some basic-science-type folks who hang out in Highly Technical that might offer some additional thoughts.

I can move the whole Topic for you if you'd like.
 
Originally posted by: Maverick2002
Thanks for the explanation. How does it make the grid "work harder" though? Does more energy get used somewhere? I guess I'm not really seeing the benefit of a high PFC quite yet.


- What is power factor?
Instantaneous power is the product of voltage and current. Think of current and voltage supplied as sine waves. If they are in phase the instantaneous power is always positive.

As soon as one is phase-shifted by a capacitive or inductive element, you will get instances where the instantaneous power is negative - that is, power is returning to the source. This phase shift defines the power factor. A phase shift of 90 degrees (power factor of 0, the cosine of 90) will have all the power returned to the source every second half-cycle.
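This can be checked numerically by averaging p(t) = v(t)·i(t) over one cycle for a few phase shifts (unit-amplitude sine waves, purely illustrative):

```python
import math

def avg_power(phase_deg, n=10000):
    """Average of p(t) = v(t)*i(t) over one cycle for unit-amplitude
    sine waves, with the current shifted by phase_deg."""
    phi = math.radians(phase_deg)
    total = 0.0
    for k in range(n):
        wt = 2 * math.pi * k / n
        total += math.sin(wt) * math.sin(wt - phi)
    return total / n

print(avg_power(0))    # ~0.5  (= Vpk*Ipk/2, everything is real power)
print(avg_power(90))   # ~0    (all power returned to the source)
print(avg_power(60))   # ~0.25 (average scales with cos(phi) = 0.5)
```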


- Why does this happen?
In resistors the voltage-current relationship is linear. That is, the higher the voltage, the higher the current. So the current and voltage waves will be in phase.

Capacitors, on the other hand, store charge. A capacitor's charge-to-voltage relationship is linear. However, its current is proportional to the rate of change of the voltage. Think of DC: if the voltage is constant, the capacitor will charge up and there will eventually be no current. In a capacitor the current leads the voltage, hence why some call it a leading power factor etc.

Inductors are a bit more tricky. (And for some reason I have a brain fart every time it comes to magnetics 😛) In inductors the flux linkage (total flux * number of turns) is proportional to the current, and the voltage is proportional to the rate of change of the current. In inductors the current lags the voltage.
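The leading current of a capacitor falls straight out of i = C·dv/dt; a numerical sketch (the scale factors C and ω are folded away, so only the phase relationship matters):

```python
import math

# For a capacitor, i = C * dv/dt. With v(t) = sin(wt), the current is
# proportional to cos(wt), which peaks a quarter cycle *before* the
# voltage does - i.e. the current leads the voltage by 90 degrees.
n = 360
v = [math.sin(math.radians(d)) for d in range(n)]
# numerical derivative dv/dt (central difference over a periodic cycle)
i = [(v[(k + 1) % n] - v[(k - 1) % n]) / (2 * math.radians(1)) for k in range(n)]

v_peak = v.index(max(v))   # voltage peaks at 90 degrees
i_peak = i.index(max(i))   # current peaks at 0 degrees
print(v_peak, i_peak)      # 90 0 -> current leads by a quarter cycle
```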


- What is the net effect?
The grid has to provide the current. For example, say you have 11 kW or 11 kVAR (reactive component) on an 11 kV system. Either way, the grid still has to provide 1 A of current. However, in the kW case you actually have real power to do work, while in the reactive case it's just returned to the source every second half-cycle. In either case the losses over the power line are the same, since the same current flows through lines that have resistance (I^2 * R losses).

So the lower the power factor of the device, the higher the current (or volt-amps) you have to provide to get the same amount of real power. This translates to power losses in the grid, the need for larger transformers, higher voltage drops (it depends, as capacitive systems or very long lines can have voltage rise), etc.
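The current-versus-power-factor relationship is just I = P / (V · pf); a quick sketch with illustrative numbers:

```python
# Current needed to deliver the same real power at different power factors:
# I = P / (V * pf). All figures here are illustrative.
p_watts = 80_000
volts = 240

for pf in (1.0, 0.9, 0.8, 0.7):
    amps = p_watts / (volts * pf)
    print(f"pf={pf}: {amps:.0f} A")   # pf=1.0: 333 A ... pf=0.7: 476 A
```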
 
RebateMonger - sure, move it if you think it's necessary.

Dark Cupcake - thanks for the explanation. Still, regarding this "power loss in the grid", where does it go exactly? That's what I'm trying to figure out - does a lower PFC actually reduce the efficiency of the grid in the sense that X amount of energy is "lost" via heat (?) or something else?
 
As I understand it, a lower Power Factor does reduce grid efficiency, but in a somewhat indirect manner. With a Power Factor less than one, the actual work you get out of an electrically-powered device (like a motor moving product down a conveyor belt) is less than the simple product of supply voltage times current. Now, the supply voltage is fixed by the system, and the work being done by the motor is set by the application (e.g., how many pounds per minute of product are being moved by the conveyor), so the only variable that can change is the current drawn by the motor. So it consumes more current than a simple I = P/V calculation predicts. The excess energy apparently is really returned to the electrical supply system.

But that has two costs associated with it. One is that the actual currents required are a bit higher, so the wire size for the supply cable, plus the transformer size feeding this area, all need to be bigger and thus cost more. Moreover, since the current is higher, the heat losses in all the wiring (with real, non-zero resistance) are higher, so there is a small excess real loss of electrical energy (as heat) to the surroundings. Some of that loss will happen in the user's facility (that is, after the electrical consumption meter), and some will happen in the utility company's equipment (before the meter).

The combination of larger equipment required and real excess energy loss creates increased costs to the utility company, so they charge more for a Power Factor outside their tolerance range. The customer still has their own increased wiring-system losses to deal with, but they usually don't bother calculating those and figuring the dollar impact on their own side of the meter.
 
Originally posted by: Paperdoc
As I understand it, a lower Power Factor does reduce grid efficiency, but in a somewhat indirect manner. With a Power Factor less than one, the actual work you get out of an electrically-powered device (like a motor moving product down a conveyor belt) is less than the simple product .....


QFT

A simple example would be to compare, say, an 80 kW motor with a power factor of 1 and of 0.8.
With a power factor of one you only have to supply 80 kVA; with 0.8 you need to supply 100 kVA to achieve the same real power to the motor.

Let's say, for example, it's a 240 volt system.

For the 1 pf motor the current supplied will be ~333 amps (80000/240).
For the 0.8 pf motor the current supplied will be ~417 amps (100000/240), even though the same real power is delivered. Let's also assume, for example, that the power line resistance is 0.01 ohms.

Line loss goes as I^2 * R, so the extra loss is (417^2 - 333^2) * 0.01 ≈ 630 watts. So just by having a motor with a power factor of 0.8 instead of 1, there is roughly 0.6 kW of extra loss in the power system in this example. These losses just show up as heat in the conductors, transformers etc.
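A loss comparison of this kind can be sketched in a few lines (all figures, including the 0.01 Ω line resistance, are illustrative assumptions):

```python
# Extra I^2*R line loss from running at a lower power factor.
# Assumed figures: 80 kW load, 240 V supply, 0.01 ohm line resistance.
p_watts, volts, r_line = 80_000, 240, 0.01

def line_loss(pf):
    amps = p_watts / (volts * pf)   # current needed at this power factor
    return amps ** 2 * r_line       # watts dissipated in the line

extra = line_loss(0.8) - line_loss(1.0)
print(round(extra))   # 625 -> ~0.6 kW of extra heating for the same useful output
```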
 
Thanks, this is what I was looking for. So in summary there are 2 costs to the utility company associated with this, but no change in cost to you (unless you have devices with a lower PF than they specify). Those being:

1) higher cost due to "bigger" equipment needed to deliver more power.
2) some energy loss (heat) through the wires on the way back to the grid.
 
I am not quite sure if there is heat loss through the wires themselves... maybe at the insulators, though. (They are the "bell shaped" gray objects you see on the poles where the wires meet.) I just say this because even over a 2 mile span, the reactance of an ACSR line isn't even 1 ohm, so it's barely anything - probably not much heat being produced. Please somebody correct me if I am wrong, I am just using an educated guess here.
 
Originally posted by: z1ggy
I am not quite sure if there is heat loss through the wires themselves... maybe at the insulators, though. (They are the "bell shaped" gray objects you see on the poles where the wires meet.) I just say this because even over a 2 mile span, the reactance of an ACSR line isn't even 1 ohm, so it's barely anything - probably not much heat being produced. Please somebody correct me if I am wrong, I am just using an educated guess here.

Reactance doesn't produce heat, although it does affect current flow.

It is resistance that produces heat, as well as affecting current flow. Heat losses are significant in high power lines. You've probably noticed big lines with wires in bundles of 2 or 4. It's done that way because 4 small wires have a bigger surface-area-to-volume ratio, so they maintain a lower temperature than a single big wire.

Ever noticed that birds don't sit on the primary conductors on big power lines? It's not because they'll get shocked (they won't - they're quite happy to sit on 33 or 12 kV regional power lines) - but it's because on big interstate power lines the currents are so high, that the lines operate at a temperature of about 200 F, hotter in Summer. Birds don't sit on them because they'd burn their feet.
 
I remember my EE professor saying something about using an inductor to "cheat" the power company over this apparent/actual power thing, but didn't really understand what he was going on about.
 
Originally posted by: z1ggy
I am not quite sure if there is heat loss through the wires themselves... maybe at the insulators, though. (They are the "bell shaped" gray objects you see on the poles where the wires meet.) I just say this because even over a 2 mile span, the reactance of an ACSR line isn't even 1 ohm, so it's barely anything - probably not much heat being produced. Please somebody correct me if I am wrong, I am just using an educated guess here.

All conductors have heat loss through them, and all of them also have a resistive, inductive and capacitive component. It's just that capacitance only plays a big role on long lines. All conductors also have a heat rating, but the primary use for that is so they don't sag below clearance limits.

On the other hand the resistance of conductors is fairly small, but it does add up. Some common conductors have the following DC resistances (AC will be slightly higher):
Moon (7/4.75 AAC) 0.232 ohms/km @20C - common conductor for 11kV here
Oxygen (19/4.75 AAAC) 0.0884 ohms/km @20C
7/2.00 CU HDBC 0.815 ohms/km - crappy old conductors which need to be replaced

When you have a few hundred thousand km of the stuff strung up, the losses do add up, even though percentage-wise it's usually less than 5%.

The other thing to consider is power loss through transformers. I've seen a 25 MVA unit which has been uprated to 34 MVA. If you want 34 MVA out of it you have to input 40 MVA. That's a heck of a lot of power loss.
 
The rest of the power eventually returns to the source, although some of it dissipates as what's referred to as losses. The portion that returns to the source is referred to as reactive current. This reactive current is undesirable in the sense that it tends to counter the voltage output of the generator. To compensate, the generator is equipped with a PF compensator, which consists mostly of large capacitors. Except for capacitors, most electrical loads are inductive and have "lagging" reactive loads. Capacitive loads, on the other hand, have "leading" reactive loads. This is the reason large capacitors are used as PF compensators.
 
Originally posted by: Dark Cupcake
The other thing to consider is power loss through transformers. I've seen a 25MVA unit which has been uprated to 34MVA. If you want 34MVA out of it you have to input 40MVA. Thats a heck of a lot of power loss.
LOL. That's crazy!

Utility transformers are some of the most efficient industrial machines ever built. A typical 30 MVA utility transformer has an efficiency of around 99.5%. I.e. to get 30 MW out, you need an input of about 30.2 MW.

Modern transformers are even more efficient due to better materials and manufacturing.

A transformer with 20% losses? Who in their right mind would ever use something like that? That transformer would be burning through about $250/hr in wasted energy. Spending the $2 million to replace it with a proper transformer would pay for itself in a few months.
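As a rough sanity check on that $250/hr figure (the $0.04/kWh energy price is an assumed, purely illustrative number):

```python
# Sanity check on the "$250/hr" figure: a transformer passing ~34 MVA with
# ~6 MW of loss, at an assumed wholesale price of $0.04 per kWh.
loss_mw = 40 - 34          # MVA in minus MVA out (treated as MW of loss)
price_per_kwh = 0.04       # assumed energy price, purely illustrative
cost_per_hour = loss_mw * 1000 * price_per_kwh
print(round(cost_per_hour))  # 240 -> about $250/hr, as claimed
```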
 
lol, we would probably never have gone with AC if we had such crazy power loss with transformers... Just think of how many transformers sit between us and the power plants...
 
Originally posted by: Mark R
LOL. That's crazy!

Utiltiy transformers are some of the most efficient industrial machines ever built. A typical 30 MVA utility transformer has an efficiency of around 99.5%. I.e. to get 30 MW out, you need an input of 30.2 MW in.

Modern transformers are even more efficient due to better materials and manufacturing.

A transformer with 20% losses? Who in their right mind would ever use something like that. That transformer would be burning though about $250/hr in wasted energy. Spending the $2 million to replace it with a proper transformer would pay for itself in a few months.

Yes, larger transformers can reach ~99% efficiency; however, you also don't typically overload them by 45% and call it their nominal rating.

Truth be told it was an extreme example, but things like that do happen. This transformer I mentioned is in fact a 15/18/25 MVA unit (depending on configuration), but because the bushings, CTs and cooling are sufficient, it can be used up to 34 MVA.

The other thing to understand is that load is cyclic. Even if the 34 MVA figure were the peak load (it currently only peaks at 23 MVA), that would only occur for around 30-60 minutes of the day; most of the time it sits in the low to high teens, where the transformer doesn't have such losses. So unless I went up to my boss and gave him 2 mil bucks to replace it, I think it's here to stay for a while. Plus if I did have 2 mil dollars I would not be using it to buy a new transformer anyway 😛

I have a ton more stories about some weird/crazy things out there, obviously can only tell the generic ones.

Originally posted by: fffblackmage
lol, we would probably never have gone with AC if we had such crazy power loss with transformers... Just think of how many transformers sit between us and the power plants...

As opposed to using what? 95 - 98% efficiency of transformer is sure as hell better than what was achievable otherwise at the time.
 
Originally posted by: Dark Cupcake
Originally posted by: fffblackmage
lol, we would probably never have gone with AC if we had such crazy power loss with transformers... Just think of how many transformers sit between us and the power plants...

As opposed to using what? 95 - 98% efficiency of transformer is sure as hell better than what was achievable otherwise at the time.

You could use DC for delivering power...? That's what I meant.
 
Originally posted by: fffblackmage
Originally posted by: Dark Cupcake
Originally posted by: fffblackmage
lol, we would probably never have gone with AC if we had such crazy power loss with transformers... Just think of how many transformers sit between us and the power plants...

As opposed to using what? 95 - 98% efficiency of transformer is sure as hell better than what was achievable otherwise at the time.

You could use DC for delivering power...? That's what I meant.

What I meant was that there was no way of efficiently stepping DC voltages up or down; even now the costs are very high. Without being able to use high voltage for transmission, you don't have much load reach.

 