AC vs. DC power transmission

onix

Member
Nov 20, 2004
66
0
0
It has been well established by Tesla that AC for power lines is both more convenient and more efficient than DC. While it is easy to understand the former, I have never fully understood why AC should be more efficient than DC.

The following excerpt (from http://en.wikipedia.org/wiki/Alternating_current) also confuses me:

"Use of a higher voltage leads to more efficient transmission of power. The power losses in the conductor are due to the current and are described by the formula P = I2 * R, implying that if the current is doubled, the power loss will be four times greater. Therefore it is advantageous when transmitting large amounts of power to convert the power to extremely high voltages (sometimes as high as hundreds of kilovolts)."

Power equals I*I*R, but it also equals V*V/R. It seems to me that R (the resistance in power lines) will be fixed and should be kept as low as possible, so that increasing V will also increase power consumption according to the second formula.

Thoughts?

UPDATE:

The only detail from the comments below that makes sense to me is the non-linearity of the resistance with current.

I understand the transformer utility for AC, but I think Tesla also argued that AC was more efficient (i.e., used less energy) than DC. Maybe it had nothing to do with the mode of transmission, but with the "inefficiency" of DC-to-DC conversion.
 

blahblah99

Platinum Member
Oct 10, 2000
2,689
0
0
Originally posted by: onix

It has been well established by Tesla that AC for power lines is both more convenient and more efficient than DC. While it is easy to understand the former, I have never fully understood why AC should be more efficient than DC.

The following excerpt (from http://en.wikipedia.org/wiki/Alternating_current) also confuses me:

"Use of a higher voltage leads to more efficient transmission of power. The power losses in the conductor are due to the current and are described by the formula P = I2 * R, implying that if the current is doubled, the power loss will be four times greater. Therefore it is advantageous when transmitting large amounts of power to convert the power to extremely high voltages (sometimes as high as hundreds of kilovolts)."

Power equals I*I*R, but it also equals V*V/R. It seems to me that R (the resistance in power lines) will be fixed and should be kept as low as possible, so that increasing V will also increase power consumption according to the second formula.

Thoughts?

For one thing, converting high voltage AC down to low voltage AC and vice versa is easily done with a transformer. You can't do that with DC.

Also, as far as efficiency is concerned, if R is fixed like you said, the only way to reduce power loss through the R is to increase V, which will decrease the current through the resistor, and ultimately decrease power loss.

 

fishmonger12

Senior member
Sep 14, 2004
759
0
0
P = IV.

They try to keep the current down. The more current there is, the hotter the wire gets and the higher its resistance becomes, because the wire doesn't strictly obey Ohm's law. The more resistance there is, the more electrical energy is changed to thermal energy.

So you increase voltage, decrease current, and end up with the same amount of power :)
 

bobsmith1492

Diamond Member
Feb 21, 2004
3,875
3
81
It would probably be just as efficient to transmit power in DC as AC. The efficiency comes from the high voltage. However, it's hard to convert DC to high voltage and back.

Doh; they already said all that... :(

Ok, fishmonger was kind of onto it but it wasn't really in English ;) so here's the translation:

When you transmit high amounts of current through a wire, it heats up and you need huge cables to handle it. Therefore, lower current is preferable, which is where higher voltage comes into play. From P=IV, the same amount of power is being transmitted.

Also, think of it this way: let's say you are sending 500 watts (25 volts @ 20 amps) across a long line that has a resistance of 10 ohms. V=IR; the drop across the wire will be 20 * 10, or 200 volts! That is, you'd need to put 225 volts in to get 25 volts out at 20 amps. I think this works out; it's all in my head, someone correct me otherwise.

Now consider the voltage is boosted to 500 V and current is dropped to 1 amp. This is the same power, 500 watts. The voltage drop will be 10 volts, so only 510 volts input would be required. That's why higher voltages are used.

Look at the efficiency: 1st case, power in = 225 * 20 = 4500 watts; efficiency = power out / power in = 500 / 4500 = 0.111... (BAD)
2nd case, power in = 510 * 1 = 510 watts; efficiency = 500 / 510 = 0.98

98% instead of 11% - that's why high voltages are used.

Ok, I realize that's not EXACTLY how to calculate all that, but it should be close...... :eek:
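
If anyone wants to double-check those numbers, here's a minimal Python sketch of the same model. It assumes an ideal source and a purely resistive line; the 500 W / 25 V / 10 ohm values come straight from the example above.

```python
# Minimal resistive-line model (ideal source, purely resistive line).

def line_efficiency(p_load, v_load, r_line):
    """Deliver p_load watts at v_load volts over a line of r_line ohms."""
    i = p_load / v_load              # line current (from P = I*V)
    v_drop = i * r_line              # voltage lost along the line (V = I*R)
    p_loss = i ** 2 * r_line         # power burned in the line (P = I^2 * R)
    v_source = v_load + v_drop       # required sending-end voltage
    efficiency = p_load / (p_load + p_loss)
    return v_source, p_loss, efficiency

# Case 1: 500 W delivered at 25 V -> (225 V in, 4000 W lost, ~11% efficient)
print(line_efficiency(500, 25, 10))

# Case 2: same 500 W delivered at 500 V -> (510 V in, 10 W lost, ~98% efficient)
print(line_efficiency(500, 500, 10))
```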
 

Vee

Senior member
Jun 18, 2004
689
0
0
Originally posted by: onix
Power equals I*I*R, but it also equals V*V/R. It seems to me that R (the resistance in power lines) will be fixed and should be kept as low as possible, so that increasing V will also increase power consumption according to the second formula.

Thoughts?

You're falling into one of them mind traps here.
When calculating power loss by V*V/R, V is not the voltage relative to ground. V is the voltage DRIVING the current through the line. And that voltage is the DROP in voltage from one end of the line to the other.
Check out the preceding post by bobsmith1492 for more details.
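
To put numbers on that trap, here's a small Python sketch using the first case from bobsmith1492's post (purely resistive line assumed): V*V/R gives the correct loss only when V is the drop along the line, not the line-to-ground voltage.

```python
# The "mind trap" in numbers (first case from bobsmith1492's post).
r_line = 10.0                   # line resistance, ohms
i = 20.0                        # current through the line, amps
v_drop = i * r_line             # 200 V lost ALONG the line

print(i ** 2 * r_line)          # I*I*R            -> 4000 W line loss
print(v_drop ** 2 / r_line)     # V*V/R, V = drop  -> 4000 W, same thing

v_source = 225.0                # sending-end voltage (relative to ground)
print(v_source ** 2 / r_line)   # 5062.5 W -- NOT the line loss
```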
 

onix

Member
Nov 20, 2004
66
0
0
The only thing above that makes sense to me is the non-linearity of the resistance with current.

I understand the transformer utility for AC, but I think Tesla argued that AC was more efficient (i.e., used less energy) than DC. Maybe it had nothing to do with the mode of transmission, but with the "inefficiency" of DC-to-DC conversion.
 

damonpip

Senior member
Mar 11, 2003
635
0
0
Basically, power transmission is more efficient at high voltage for a number of different reasons. High-tension power lines carry a voltage of more than 200k volts, and even neighborhood lines carry something like 7000. Now, 7000 volts is obviously too high to be used by the consumer, and the power plant does not generate 200k volts. The only simple, efficient method of increasing and decreasing voltage is with transformers. Transformers only work with AC electricity; therefore AC is much better for power transmission. Don't quote me on this, but I don't believe that AC of a certain voltage is more efficient than DC at that same voltage.
 

JSSheridan

Golden Member
Sep 20, 2002
1,382
0
0
bobsmith1492 is correct, essentially. Another reason that voltages are boosted so high is that a higher voltage on the line will allow you to send more power down a line than a lower voltage will. In AC transmission, inductance is the largest component of the line's impedance, rather than resistance or capacitance; generally L >> R >> C. The inductance is all you need to consider in shorter lines. The amount of power you can send down a line is found by taking the product of the voltage magnitudes at both ends of the line, dividing by the impedance magnitude, and multiplying by the sine of the difference between the phase angles of the two voltages: P = ((|V1| * |V2|) / |Z|) * sin(Theta1 - Theta2)
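
Here's a quick Python sketch of that formula. The 230 kV / 50 ohm / 10 degree numbers are made up purely for illustration; only the formula itself comes from the post.

```python
import math

def line_power(v1, v2, z, theta1_deg, theta2_deg):
    """P = (|V1| * |V2| / |Z|) * sin(theta1 - theta2), angles in degrees."""
    delta = math.radians(theta1_deg - theta2_deg)
    return (v1 * v2 / z) * math.sin(delta)

# Illustrative values: 230 kV at both ends, 50-ohm line impedance,
# 10-degree angle difference -> roughly 1.84e8 W (~184 MW).
print(line_power(230e3, 230e3, 50.0, 10.0, 0.0))
```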

Don't forget about 3-phase power. It's really useful for industrial purposes. I remember reading somewhere that DC transmission was more efficient than single-phase AC. There are solid-state DC-to-DC converters now, but they didn't exist back then.
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
There are plenty of HVDC lines in existence and they are very efficient for long distance transmission of power because they do not have changing electric fields (there is no inductive or capacitive component). Some of the larger hydroelectric power plants use HVDC. Canada also has several large scale HVDC links. Due to the conversion efficiency issue that others mentioned, they are generally only used on really long distance transmission - where the losses from an AC system would outweigh the losses from AC->DC->AC conversion.

The best online article on power delivery that I have seen is:
http://www.americanscientist.org/template/AssetDetail/assetid/14726/page/1

There's some interesting history on page 2. There are also plenty of really good books on the topic out there.
 

Calin

Diamond Member
Apr 9, 2001
3,112
0
0
Over really long distances (comparable to the wavelength of the AC power), some very interesting phenomena appear. A standing wave forms on the line, with voltage minima and maxima along its length (due to interference). For Europe (50 Hz), the wavelength is something like 6,000 km (or maybe a bit less); for the USA I think it is about 3,000 miles, due to the higher frequency.
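
Those figures check out if you assume the wave travels at roughly the speed of light (real lines are somewhat slower), as this quick Python sketch shows:

```python
# Wavelength = propagation speed / frequency; assume roughly light speed.
C = 299_792_458.0   # speed of light, m/s

for hz in (50, 60):
    print(f"{hz} Hz -> ~{C / hz / 1000:,.0f} km")
# 50 Hz -> ~5,996 km  (Europe)
# 60 Hz -> ~4,997 km  (North America, about 3,100 miles)
```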
 

AluminumStudios

Senior member
Sep 7, 2001
628
0
0
I either read in a book or saw on a special somewhere that one major drawback to DC was the fact that a + and - pair of lines had to make a complete run to/from the power plant - that distribution stations were more difficult to implement, you couldn't actually use the ground as a "ground", and major short circuits could cause long runs of wire to burn or be damaged. DC in general just wasn't as flexible and friendly to implement on a large scale as AC was.
 

Calin

Diamond Member
Apr 9, 2001
3,112
0
0
With alternating current, a pair of lines must also go from the power plant to the distribution stations (or rather a set of four lines: one "ground" and three phases). But that is only needed if you want a well-defined voltage between the lines (at the user), and to avoid harming wildlife by creating large paths of current through water and so on.
The line that is called "ground" in an electric circuit is seldom at the same voltage as the earth outside.
 

genghislegacy

Member
Jan 21, 2005
100
0
0
Originally posted by: blahblah99
Originally posted by: onix

It has been well established by Tesla that AC for power lines is both more convenient and more efficient than DC. While it is easy to understand the former, I have never fully understood why AC should be more efficient than DC.

The following excerpt (from http://en.wikipedia.org/wiki/Alternating_current) also confuses me:

"Use of a higher voltage leads to more efficient transmission of power. The power losses in the conductor are due to the current and are described by the formula P = I2 * R, implying that if the current is doubled, the power loss will be four times greater. Therefore it is advantageous when transmitting large amounts of power to convert the power to extremely high voltages (sometimes as high as hundreds of kilovolts)."

Power equals I*I*R, but it also equals V*V/R. It seems to me that R (the resistance in power lines) will be fixed and should be kept as low as possible, so that increasing V will also increase power consumption according to the second formula.

Thoughts?

For one thing, converting high voltage AC down to low voltage AC and vice versa is easily done with a transformer. You can't do that with DC.

Also, as far as efficiency is concerned, if R is fixed like you said, the only way to reduce power loss through the R is to increase V, which will decrease the current through the resistor, and ultimately decrease power loss.


Yeah, that's what they taught in university.
 

JTWill

Senior member
Feb 2, 2005
327
0
0
DC voltages require larger lines to go the same distance as AC. The main reason very high voltages are used in line transmission is wire size: a 20-amp line is the same size at 480 volts as at 120 volts. If it were not done this way, the power lines for a city of 1 million would add 250 million in costs. Not everything runs at the same voltage as your residence; industry has greater demands. With a little loss, every time you drop the voltage in half with a transformer you double the current capability (see the sketch after the link).
http://www.rfcafe.com/references/electrical.htm
RF Cafe - Electrical Conversions Formulas & References
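
A minimal Python sketch of that last point, assuming an ideal (lossless) transformer; a real one loses a few percent, and the 480 V / 20 A values are just for illustration:

```python
def transform(v_in, i_in, turns_ratio):
    """Ideal transformer: power in equals power out.
    turns_ratio = N_secondary / N_primary."""
    v_out = v_in * turns_ratio
    i_out = i_in / turns_ratio      # keeps v_in * i_in == v_out * i_out
    return v_out, i_out

# Halve the voltage -> double the current capability.
print(transform(480.0, 20.0, 0.5))   # (240.0, 40.0)
```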
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
AC is used for mains power primarily because it is very easy to increase the voltage for transmission and then reduce it for distribution and further use. As has previously been mentioned, high voltages are more efficient for transmission than low voltages.

However, it can actually be more efficient to transmit large volumes of power by DC than by AC, provided that the quantity of power is high enough and the distance long enough. With AC the voltage constantly swings between its peaks and zero, but the transmission lines have to be rated for the peak voltage. With DC it's possible to maintain the maximum voltage continuously, allowing greater power transfer. There are several other advantages as well, related to changing electric fields and parasitic capacitance and inductance.

The disadvantage is that you need a rectifier or inverter at each end of the link, and as you might guess these are very expensive, and they do waste power. Nevertheless, there are a number of very high voltage DC power lines because of the overall improvements in cost and efficiency.

The other use for DC is transferring power from one grid to another. The problem with AC is that the rotation of every generator on a grid has to be synchronized with every other generator, and different grids aren't synchronized to each other (in Japan, the two grids don't even have the same frequency: 50 Hz in the East, 60 Hz in the West). In that case the only way to transmit power between them is to convert to DC and convert back to AC with a grid-synchronized inverter.
 

PowerEngineer

Diamond Member
Oct 22, 2001
3,606
786
136
Mark R is right.

AC power systems are more practical because transformers make voltage changes doable, and because AC motors are much simpler and more reliable than their DC counterparts.

Conductors are generally limited by temperature, which is a function of their inherent resistance (and environmental factors) and the current flow. Therefore conductors can essentially carry the same current at any voltage level.

As a general rule, the amount of power that can be carried down a conductor increases as the square of the (line-to-neutral) voltage (AC or DC).

DC makes better use of a conductor because its voltage is constantly at its peak value, whereas the AC voltage reaches its peak only very briefly. AC power calculations are based on RMS (root-mean-square) values, which are the peak AC voltage divided by the square root of 2; same with the current waveform. So for a fixed maximum voltage and a fixed maximum current magnitude, the DC line will deliver twice the power of the best that AC can do. And AC (real) power delivery drops when the phase angle between the voltage and current waveforms is anything other than zero (P = V * I * cos[angle]).
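
Here's a short Python check of that RMS argument (the 100 V / 10 A peak values are arbitrary): sampling one cycle of a sine wave recovers peak / sqrt(2), and at the same peak voltage and current, DC carries twice the average power of unity-power-factor AC.

```python
import math

v_peak, i_peak = 100.0, 10.0
n = 100_000

# RMS of one full cycle of a sine wave, computed numerically.
samples = [v_peak * math.sin(2 * math.pi * k / n) for k in range(n)]
v_rms = math.sqrt(sum(s * s for s in samples) / n)
print(v_rms)                        # ~70.71, i.e. 100 / sqrt(2)

p_dc = v_peak * i_peak                                     # 1000 W, steady
p_ac = (v_peak / math.sqrt(2)) * (i_peak / math.sqrt(2))   # 500 W average
print(p_dc / p_ac)                                         # 2.0
```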

Of course, losses due to a steady current constantly at the maximum value (DC) will also be higher than the losses for the AC current, but you still end up with lower losses for every megawatt delivered on DC.

As Mark said, there are big disadvantages to DC transmission -- particularly the cost of the converter stations. That's why they are only applied where power needs to be transmitted over long distances (where the reduced transmission line costs due to better utilization of the conductor and lower per-MWh losses really pay off) or where power needs to be exchanged between two AC systems that are not synchronized (as between the Eastern and Western Interconnections in North America -- the split runs down the west edge of the Great Plains states).

 

Calin

Diamond Member
Apr 9, 2001
3,112
0
0
The current-carrying limit of a conductor is set by its ability to shed the heat generated by transmission. Since the effective (RMS) value of an AC current is defined as the value of the DC current that produces the same heating, a conductor can carry the same number of "measured" amperes regardless of the kind of current (DC, or AC at various frequencies). An AC line with the current and voltage in phase will deliver the same power. If you measure the peak voltage on an AC line, you will find it is 1.4142... times greater than the declared voltage (which is the effective voltage).
DC is more efficient simply because of the capacitive and inductive properties of AC power lines, which consume some of the transmitted power. On a DC line, the inductive and capacitive properties matter only during transients (when the current and/or voltage is changing), whereas AC lines are affected by them all the time.
 

JonB

Platinum Member
Oct 10, 1999
2,126
13
81
www.granburychristmaslights.com
Unrelated really to the efficiency question, there is a good reason for DC transmission in long distance and some short distance applications.

In an AC system, the entire "Grid" must be synchronized not only to the same frequency, but also in phase. All generators are synchronized and stay locked to that frequency. This isn't a big deal locally, but when one Grid borders another, they can't share the power load unless both Grids are synchronized. If you connect two Grids that aren't exactly in sync (voltage, frequency, and phase), the connecting device will literally destroy itself from the excessive current and heat.

So, to connect two Grids, the easiest way to share power is to convert the AC to high-voltage DC, send it down the line (not always a long one; sometimes just across a state border), and then convert the DC back to AC that is synchronized to the receiving Grid.