Why are modern videocards so power hungry?

Throckmorton

Lifer
Aug 23, 2007
16,829
3
0
It's been a couple years since I've bought a videocard, so I've been researching for the past week. I'm shocked and appalled at how power hungry they are now.

Remember when video cards were powered by the AGP bus? Then at some point, a few cards started sprouting Molex connectors. Then came the 6 pin connectors on the PCIE cards. Fine, I can see the need to pump more power than the motherboard can supply.

But now high-end videocards have two power connectors and draw 250W+. It's been getting worse with each recent generation, and apparently that will continue.

Meanwhile, CPUs have become more efficient, hard drives are gradually transitioning to solid state, cars are more efficient even while putting out twice the horsepower of a decade ago, and diesel trucks are cleaner than ever. So why are video cards the exception? Why is an 850W power supply now the norm for running a single videocard? Will it not end until PSUs aren't capable of keeping up?

Is there some physical reason that videocards can't become more powerful without requiring more and more power, or is it just a cost tradeoff?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
That's because both ATI and Nvidia went with claims of "Moore's Law cubed," saying they would double performance every 6 months, while in reality they were making ever bigger, ever more power-hogging chips to achieve it. The performance gains were impressive, but the power cost came with them.

Plus, Moore's Law is about doubling transistors, and the GPU performance-doubling pace has since slowed to 2x per year, obviously limited by power.
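
For scale, here's a quick back-of-the-envelope comparison of those two growth rates (purely illustrative; the 5-year horizon is arbitrary):

```python
# "Double every 6 months" vs. "double every year", compounded over 5 years.
# Purely illustrative; the horizon is arbitrary.
years = 5
every_six_months = 2.0 ** (2 * years)  # two doublings per year -> 2^10
every_year = 2.0 ** years              # one doubling per year  -> 2^5
print(f"After {years} years: {every_six_months:.0f}x vs. {every_year:.0f}x")
```

That's 1024x vs. 32x over the same stretch, which is why the marketing claim couldn't hold without ballooning chips and power budgets.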
 

fffblackmage

Platinum Member
Dec 28, 2007
2,548
0
76
I would say the new 68xx cards are less power hungry than the 58xx, but the 69xx will probably end up using more power anyway.

An 850W PSU seems more normal for an SLI setup than anything else. A 500-650W PSU is sufficient for most single-GPU setups.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,682
329
126
Videocard performance has increased much more than CPU performance in the last few years.

Still, look at the 5870's performance/watt compared to the 4870 X2 and GTX 295, or the 5850's performance/watt vs. the 4870/GTX 285, etc.
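
As a sketch of what that metric looks like in code (the FPS and wattage figures below are hypothetical placeholders to show the arithmetic, not real benchmark results):

```python
# Performance-per-watt helper. The FPS and wattage values below are
# hypothetical placeholders, chosen only to illustrate the comparison.
def perf_per_watt(avg_fps: float, load_watts: float) -> float:
    """Higher is better: frames per second delivered per watt drawn."""
    return avg_fps / load_watts

# Placeholder numbers (NOT benchmarks): a newer single-GPU card vs. an
# older dual-GPU card with similar raw performance but higher draw.
newer_single = perf_per_watt(avg_fps=60.0, load_watts=190.0)
older_dual = perf_per_watt(avg_fps=62.0, load_watts=290.0)
print(f"newer single-GPU: {newer_single:.3f} FPS/W")
print(f"older dual-GPU:   {older_dual:.3f} FPS/W")
```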
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
The Nvidia 480 has a TDP of 250W or something, but that's at average load... if you stress it completely it'll go up to about ~320 watts. AMD cards use a bit less, and the TDP they quote is the maximum they can get the card to use (measured at the wall), so their ratings usually come in lower than their Nvidia counterparts'.

So when you look at TDP, be aware that Nvidia isn't measuring the maximum possible draw, just average load(s), while AMD shows the most they can pull from a card. Why does Nvidia do it differently? Because their current cards use more power per unit of performance, and they don't want to look bad.

There are YouTube videos of a guy with 4x SLI 480s and a watt meter, where he has a PSU that only powers the graphics cards. He gets over 1600 watts used: 1600W / 4 cards = 400W per card.

That means a current 480, if overvolted to overclock, can draw over 400 watts.
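
A quick sketch of that arithmetic, plus one correction the video can't show: wall readings include the PSU's conversion losses, so the actual DC draw per card is somewhat lower. The 85% efficiency figure below is my assumption, not from the video:

```python
# Per-card draw from a wall reading on a PSU feeding only the graphics cards.
# psu_efficiency is an assumed figure (~85%); wall watts include the PSU's
# AC-to-DC conversion losses, so actual card draw is lower than wall draw.
wall_watts = 1600
num_cards = 4
psu_efficiency = 0.85  # assumption, not from the video

per_card_wall = wall_watts / num_cards
per_card_dc = per_card_wall * psu_efficiency
print(f"Per card at the wall: {per_card_wall:.0f} W")
print(f"Estimated DC draw per card: {per_card_dc:.0f} W")  # ~340 W
```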


Why do newer graphics cards draw more power? Because apparently they can't figure out a way to get big improvements without using more energy. Neither Nvidia nor AMD can make huge performance improvements without drawing more.

Anyway, if power is expensive where you are and you want a card that doesn't use as much, you're better off with an AMD product.
 

tyl998

Senior member
Aug 30, 2010
236
0
0
I have a 750W Antec power supply. It supplies my 460 SLI rig just fine. With one card, even 550W would be fine.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Is there some physical reason that videocards can't become more powerful without requiring more and more power, or is it just a cost tradeoff?

I suspect it may have already been addressed, but I need to pad my post-count so I'll chime in with the following:

They are, and you can.

You could dial the clockspeed down on any GPU to the point that its power consumption was low enough to be fed solely by the mobo, and even at those paltry clocks the card would still outperform any prior-generation video card benched inside the same power footprint.
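
The reason this works is the standard first-order CMOS dynamic-power relation, roughly P ∝ C·V²·f: lowering the clock usually lets you lower the voltage too, so power falls faster than performance. A minimal sketch, with made-up scale factors:

```python
# First-order CMOS dynamic power: P is roughly proportional to C * V^2 * f.
# Dropping frequency usually permits a voltage drop too, so power falls
# faster than clock speed. Scale factors below are made up for illustration.
def relative_power(f_scale: float, v_scale: float) -> float:
    """Power relative to stock for given frequency/voltage scale factors."""
    return f_scale * v_scale ** 2

downclocked = relative_power(f_scale=0.6, v_scale=0.85)  # assumed undervolt
print(f"At 60% clocks and 85% voltage: {downclocked:.0%} of stock power")
```

That's roughly 43% of stock power for 60% of the clocks, which is why a downclocked modern GPU inside a 75W slot budget can still beat an older card that was designed for that budget.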

What changed was that people were willing to buy >400W PSUs and deal with >75W of dissipation from their video cards. Where there is a market, there will be capitalism.

So video card performance has grown faster than it might have if video cards had been TDP-limited like CPUs (where max power is on the order of 140W for retail cooling solutions).

But even CPUs can reach >300W power dissipation if you're willing to spend $ on a 3rd-party cooling solution.
 

Throckmorton

Lifer
Aug 23, 2007
16,829
3
0
Good points about videocards becoming more powerful much faster than CPUs.

How big is the power usage difference between AMD and nVidia? How do midrange cards like the 6850 and GTX 460 compare? (Those are equivalent, right?)
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
RAM eats energy too, and GDDR5 at high voltages/clocks even more so. RAM size and speed have shot up over the years.
 

KingstonU

Golden Member
Dec 26, 2006
1,405
16
81
That is a good point. How does GDDR5 compare to DDR3 in terms of power usage and performance? Are they even comparable?
 

nenforcer

Golden Member
Aug 26, 2008
1,767
1
76
It's called 3 billion transistors needed to accelerate my virtual girlfriend.

adrianne.jpg
 

Vdubchaos

Lifer
Nov 11, 2009
10,408
10
0
It's also worth noting that even though video cards use A LOT more power, video game graphics quality in general hasn't really taken a BIG leap AT ALL.
 

Throckmorton

Lifer
Aug 23, 2007
16,829
3
0
It's also worth noting that even though video cards use A LOT more power, video game graphics quality in general hasn't really taken a BIG leap AT ALL.

That's one thing I've noticed for the past 2-5 years. I go back and play old games like Far Cry and they look amazing because of the artistic quality. Newer games have more effects, but they don't really LOOK better. Even technically, the effects these new cards are driving don't seem that impressive.

Another thing is when you turn the options down so new games can run on older hardware, they look much worse than the old games did.
 

Throckmorton

Lifer
Aug 23, 2007
16,829
3
0
I remember when the Luclin expansion came out for EverQuest in 2001. It was a huge leap forward in graphics quality. I think I upgraded my video card to a TNT2 (or maybe a GF2?) so I could run it.

Well, early this year I fired up EverQuest for old times' sake. It still looks great. The graphics quality and the artistic quality are BETTER than World of Warcraft's.
 

Zap

Elite Member
Oct 13, 1999
22,377
2
81
Is there some physical reason that videocards can't become more powerful without requiring more and more power, or is it just a cost tradeoff?

You can ask the same thing about CPUs. My first system didn't even need a heatsink on the CPU. Then we went to tiny heatsinks/fans. Now even Intel has a tower-heatpipe monstrosity in the lineup.

That's one thing I've noticed for the past 2-5 years. I go back and play old games like Far Cry and they look amazing because of the artistic quality. Newer games have more effects, but they don't really LOOK better.

That's because companies are throwing tech at it instead of throwing better artists at it. Also, some games (BFBC2?) just don't lend themselves to looking good. After all, how good can dirt and camouflage look?
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
That's because companies are throwing tech at it instead of throwing better artists at it. Also, some games (BFBC2?) just don't lend themselves to looking good. After all, how good can dirt and camouflage look?

I thought the greenery and water in the game looked pretty great. That being said, Battlefield IMO has always had better sound than graphics.
 

dualsmp

Golden Member
Aug 16, 2003
1,627
45
91
The 6850 is the most powerful card using a single six-pin PCI-E connector.
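
For context on why a single six-pin is a meaningful cutoff, here's a sketch of the power budget involved; the 75W figures are the PCI-E spec limits as I recall them, and the 6850 draw is the max-load number quoted later in the thread:

```python
# Power budget for a card with one 6-pin connector: the PCI-E x16 slot
# supplies up to 75 W and a 6-pin connector another 75 W (spec limits,
# quoted from memory), capping such a card at roughly 150 W.
SLOT_W = 75
SIX_PIN_W = 75
budget = SLOT_W + SIX_PIN_W
card_max_load = 125  # HD 6850 max load, from the table later in the thread
print(f"Budget: {budget} W, headroom over the 6850: {budget - card_max_load} W")
```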
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
That's because companies are throwing tech at it instead of throwing better artists at it. Also, some games (BFBC2?) just don't lend themselves to looking good. After all, how good can dirt and camouflage look?

I've often wondered why companies don't simply outsource texture-making or something. Why do X companies need to make X number of dirt textures? Why hasn't a middleware company emerged that specializes in making scalable textures (from crap to photorealistic)? Companies could further mod the textures too, like how Valve modded Havok physics, so we don't see literally the same textures in each game. Maybe if companies didn't each have to spend a fortune on graphics and licensed it from the middleware company instead, they'd have more money left over to spend on trivial things like, I dunno, making the game fun?
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
GPU architectures went parallel earlier and to a much greater degree than CPU architectures did, and are hitting the "multi-core scaling wall" earlier as a result. It's not like we're going to get 16-core desktop CPUs anytime soon, either.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Good points about videocards becoming more powerful much faster than CPUs.

How big is the power usage difference between AMD and nVidia? How do midrange cards like the 6850 and GTX 460 compare? (Those are equivalent, right?)


I have a picture with a table showing the watts, measured off the wall, where they calculated out the system use (they ran FurMark and measured at the wall). I can't for the life of me remember where the hell I got it.



Edit: (goes off to look for more charts for the rest)

power_maximum.gif




Source: http://www.techpowerup.com/ (just search for a card, go to the review under "power consumption")

Max load (measured at the wall, all cards at stock GPU core/mem/shader clocks, etc.):

GTX 480 = 320W
GTX 470 = 232W
GTX 460 = 155W

HD 5870 = 212W
HD 6870 = 163W
HD 6850 = 125W (from the same site but a different table)


These are all from one site, instead of what I had written down myself from various places, so they probably give a better representation than what I posted at first. For some reason the numbers can vary a bit from site to site.
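
For anyone who wants to poke at these numbers, here's a minimal sketch using the figures quoted above; the AMD/Nvidia tier matchups in `pairs` are my own rough assumption, not from the review:

```python
# Max-load wall-power figures quoted above from TechPowerUp (watts).
# The tier matchups in `pairs` are a rough assumption, not from the review.
max_load = {
    "GTX 480": 320, "GTX 470": 232, "GTX 460": 155,
    "HD 5870": 212, "HD 6870": 163, "HD 6850": 125,
}
pairs = [("GTX 480", "HD 5870"), ("GTX 460", "HD 6870")]  # assumed rivals
for nv, amd in pairs:
    delta = max_load[nv] - max_load[amd]
    sign = "more" if delta >= 0 else "less"
    print(f"{nv} pulls {abs(delta)} W {sign} at max load than {amd}")
```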
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I have a picture with a table showing the watts, measured off the wall, where they calculated out the system use. I can't for the life of me remember where the hell I got it.


5850: TDP 151W / 27W
6850: TDP 127W / 19W
5870: TDP 188W / 27W
6870: TDP 151W / 19W

460: TDP 155W / 15W ____ wiki/Nvidia say: 160W
470: TDP 232W / 29W ____ wiki/Nvidia say: 215W
480: TDP 320W / 54W ____ wiki/Nvidia say: 250W

Those are from TechPowerUp and seem to list TDPs, not actual power draws. TDP doesn't necessarily equal power draw.
 

tvdang7

Platinum Member
Jun 4, 2005
2,242
5
81
Only the super high end, like the GTX 480 and 5970. The high end tends to stay about the same as the generation before it.