Why are modern video cards so power hungry?

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
http://www.xbitlabs.com/articles/video/display/radeon-hd6870-hd6850.html

[Chart: hd6800_power.png - Radeon HD 6870/6850 power consumption]

IIRC Xbit said they got a defective 6850 or something, hence the idle watts being high on their sample
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
That's one thing I've noticed for the past 2-5 years. I go back and play old games like Far Cry and they look amazing because of the artistic quality. Newer games have more effects, but they don't really LOOK better. Even technically, the effects these new cards are driving don't seem that impressive.

Power consumption has been close to the same in the last 2-5 years as well. HD2900XT and 8800GTX use about the same as an HD 5870 and GTX 470. Actually I can't recall the exact usage of the 2900XT; it might be closer to the GTX 480.

Midrange cards usually operate in the 90-130W range, and low-end and low-midrange are below that.
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I like your chart, Lonyo, where they're side by side; that makes it a lot easier to compare. It's also nice that it shows three stats at different amounts of GPU usage, so you can see how they scale.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Shot in the dark... about 105 watts or something...
Based on TechPowerUp's "max system with 8800GT" minus "idle system with 8800GT", and factoring in about 20 watts for the card's idle or so.

Checked wiki for the 8800GT and that was 105 watts :)
Lmao, educated guesstimate landed right on the watt :p (if you believe wiki)

It might be a bit more.. you'd need to find a wattmeter measurement taken at the wall to be sure. Those reviews are kinda hard to find for an 8800GT; people weren't as concerned about power draw back then.
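
For what it's worth, a minimal sketch of that back-of-the-envelope method (in Python, with made-up wall readings; only the arithmetic mirrors the post):

```python
# Rough sketch of the estimation above. The wall readings are illustrative
# placeholders, not TechPowerUp's actual figures; only the method is shown.

wall_idle_w = 150.0   # whole system at idle, measured at the wall (assumed value)
wall_load_w = 235.0   # whole system at max GPU load, measured at the wall (assumed value)
card_idle_w = 20.0    # guess for the card's own idle draw, as in the post

# Difference at the wall ~= extra power the card pulls going from idle to load.
load_delta_w = wall_load_w - wall_idle_w

# Add back the card's assumed idle share to approximate its total max draw.
card_max_estimate_w = load_delta_w + card_idle_w

print(f"Estimated 8800 GT max draw: ~{card_max_estimate_w:.0f} W")  # ~105 W with these inputs
```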
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
*** nvm, saw that it's the Blu-ray playback chart. Yeah, Nvidia is dominating in that area.
Thought it was the idle chart that was posted <.<.
 
Last edited:

KingstonU

Golden Member
Dec 26, 2006
1,405
16
81
Another thing is when you turn the options down so new games can run on older hardware, they look much worse than the old games did.

I have actually been finding the opposite. With a game like Gears of War, even with the graphics turned all the way down I was still massively impressed, and when I ran it on better hardware at a friend's place with all the eye candy turned up, it did not look much better, yet it required a massive increase in performance to deliver just that small increase in eye candy.

Graphics are going to face diminishing returns on image quality for the performance required as characters and environments get more and more realistic, because achieving that extra realism becomes more and more complex.
 

W2zzard

Junior Member
Nov 5, 2010
1
0
0
Source: http://www.techpowerup.com/ (just search for a card, go to the review under "power consumption")

max load: (measured at the wall, all cards at stock GPU core/mem/shaders, etc.)

all my recent reviews are card-only power measurements (not at the wall, graphics card only)

Those are from TechPowerUp and seem to list TDPs, not actual power draws. TDP doesn't necessarily equal power draw.

my measurements are always actual measurements, not quoted TDPs

Shot in the dark... about 105 watts or something...
Based on TechPowerUp's "max system with 8800GT" minus "idle system with 8800GT", and factoring in about 20 watts for the card's idle or so.

check my 9800 GT power numbers from a recent review. 9800 GT == 8800 GT
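
As an aside for anyone comparing the two approaches: an at-the-wall delta is inflated by PSU losses, so card-only numbers come out lower. A rough correction looks something like the sketch below (the 85% efficiency figure and the wall readings are assumed round numbers, not anything from W1zzard's test setup):

```python
# Converting an at-the-wall power delta to an approximate card-only figure.
# All values here are illustrative assumptions, not measured data.

wall_idle_w = 150.0      # system idle, measured at the wall (assumed)
wall_load_w = 235.0      # system under GPU load, measured at the wall (assumed)
psu_efficiency = 0.85    # assumed PSU efficiency at this load level

# Extra AC power drawn at the wall when the GPU goes from idle to load.
wall_delta_w = wall_load_w - wall_idle_w

# Only ~85% of that AC power reaches the components as DC power,
# so the card-side increase is smaller than the wall-side delta.
card_delta_w = wall_delta_w * psu_efficiency

print(f"Wall delta: {wall_delta_w:.0f} W, approx. DC delta at the card: {card_delta_w:.0f} W")
```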
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
my measurements are always actual measurements, not quoted TDPs

Hi W1zz. :)

Although that may be the case, the poster I responded to used a mixture of official TDPs and your measured power draws. E.g., 151W and 188W are the TDPs for the 5850 and 5870, respectively, but 320W was what you measured as peak load for the GTX 480. Very inconsistent.

The poster apparently thinks maximum power draw means something in real-life usage. Imo, sustained power draw is what matters more (assuming the PSU can handle the peak load of the system, including peak graphics card load). Intel knows this, which is why their newest designs allow temporary spikes in wattage that would go over TDP if sustained (see the sketch at the end of this post). See: http://www.anandtech.com/show/3922/intels-sandy-bridge-architecture-exposed/7 So personally I care more about the 3DMark sustained average load. My case has powerful cooling that can smooth out any peaks in power usage and the corresponding heat output.

(The previous paragraph was to the poster, not to you, W1zzard. By the way, did you lose your account or something, or why are you using W2zzard? And is the beta TriXX basically done, or is it a real beta with bugs and crashes? My Sapphire 6850 is scheduled to be delivered later today, and I am excited about the card but less excited about having to use a beta version of TriXX.)
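
To illustrate the sustained-vs-peak point above, here is a toy model of the general idea that short spikes over TDP are fine as long as the averaged power stays within the limit. This is only an illustration of the concept, not Intel's actual turbo algorithm; the TDP value and smoothing factor are arbitrary assumptions:

```python
# Toy model: short power spikes over the TDP are tolerated as long as the
# *sustained* (smoothed) power stays within the limit. Concept sketch only,
# with an assumed TDP and smoothing factor.

TDP_W = 95.0   # assumed sustained power limit in watts
ALPHA = 0.1    # assumed smoothing factor for the running average

def within_sustained_limit(power_trace, tdp=TDP_W, alpha=ALPHA):
    """True if the exponentially smoothed power never exceeds the TDP,
    even when individual samples do."""
    if not power_trace:
        return True
    avg = power_trace[0]
    for p in power_trace[1:]:
        avg = alpha * p + (1 - alpha) * avg
        if avg > tdp:
            return False
    return True

# A 70 W baseline with a brief 120 W burst: the instantaneous peak is above
# TDP, but the smoothed average never is, so the spike would be allowed.
trace = [70.0] * 50 + [120.0] * 5 + [70.0] * 50
print(max(trace) > TDP_W)              # True  - peak alone exceeds TDP
print(within_sustained_limit(trace))   # True  - sustained power stays under it
```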
 
Last edited:

JimmiG

Platinum Member
Feb 24, 2005
2,024
112
106
Intel and AMD (CPU division) discovered that 125-140W was as high as anyone would accept for a high-end CPU. That has been the upper limit ever since the Pentium 4 days.

I think Nvidia and AMD's GPU division are coming to the same realization - there's a limit to what people will accept in terms of heat and power consumption. AMD, of course, already started touting efficiency with the HD 3000 and 4000 series of Radeon cards. Nvidia is now finally starting to talk about performance per watt and promising some improvement already with the 5xx series. I think we've reached the limit of what people will accept, at 250W for a single-GPU card.
 

biostud

Lifer
Feb 27, 2003
19,925
7,036
136
Just don't buy a high-end card. Same with CPUs. CPUs and GPUs with low power draw are more often used in laptops. There's a limit to performance per watt at each fabrication node, which means the more performance you want, the more power you're going to use.

If high-end cards didn't exist, then power usage wouldn't be very high and we would all be able to game fine on a 5770, but since there's a demand for more performance, it would be stupid not to make products that meet those demands.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Intel and AMD (CPU division) discovered that 125-140W was as high as anyone would accept for a high-end CPU. That has been the upper limit ever since the Pentium 4 days.

That was actually an economic upper limit at the time, from the standpoint of stock coolers (no heatpipes yet, remember) and the cost of binning and validating silicon at the requisite Tjunction max temps.

140W is not an issue with today's heatpipe HSFs, but the damage was done in terms of creating a mindset in the consumer that more power was bad (as happened with GHz), so they've shied away from releasing SKUs that approach those TDPs.

2S or 4S systems plow through 300-600W anyway, and an overclocked rig can easily touch 250W for a single CPU.

But CPU power consumption has been so badly vilified by both companies that neither is about to make it seem like they are eating crow and contradicting their prior marketing mantra by going to >140W SKUs.

It's called painting yourself into a corner. In your career you are guided to never put into email anything definitive when eliminating options, because you never know when you'll find yourself wanting to pursue tomorrow the option you demonized today.

The GPU guys did a better job of not demonizing power consumption, so they still have that engineering parameter at their disposal when it comes to maximizing the performance vector of their platforms. The CPU guys effed themselves there.

So even though they could release 200W CPUs at 4GHz that would be perfectly stable and generate higher revenue, they can't, because of the perception issue that still persists from the Prescott days.
 

Vdubchaos

Lifer
Nov 11, 2009
10,408
10
0
That's one thing I've noticed for the past 2-5 years. I go back and play old games like Far Cry and they look amazing because of the artistic quality. Newer games have more effects, but they don't really LOOK better. Even technically, the effects these new cards are driving don't seem that impressive.

Another thing is when you turn the options down so new games can run on older hardware, they look much worse than the old games did.

To an extent yes.

IMO newer games look better, but the difference is MINIMAL at best.

Like I said, the advancement of graphics has not been very great AT ALL.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Compare the GTX 460 to the GTX 285 and you will see that the 460 performs as well as the 285 but consumes less electricity. Compare the Radeon 4870 to the Radeon 5850 and you will see the 5850 performs better but doesn't use more electricity. That means newer GPUs do have a better performance-per-watt ratio.

Now compare a Q6600 (2.4GHz) to an i7-920 (2.66GHz): at load it is 160W vs. 226W.
Data from bit-tech.

What is wrong again?
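
For concreteness, the comparison above boils down to a simple ratio. Here is a minimal sketch with placeholder numbers; the performance indices and wattages are assumptions for illustration, not benchmark results or measured draws:

```python
# Performance-per-watt sketch. The values below are placeholder assumptions,
# not real benchmark or power data; only the ratio arithmetic reflects the
# argument above (newer cards: similar or better perf at similar or lower power).

cards = {
    # name: (relative performance index - assumed, board power in watts - assumed)
    "GTX 285": (100, 200),
    "GTX 460": (100, 150),
    "HD 4870": (90, 160),
    "HD 5850": (115, 160),
}

for name, (perf, watts) in cards.items():
    print(f"{name}: {perf / watts:.2f} performance points per watt")
```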
 

JimmiG

Platinum Member
Feb 24, 2005
2,024
112
106
So even though they could release 200W CPUs at 4GHz that would be perfectly stable and generate higher revenue, they can't, because of the perception issue that still persists from the Prescott days.

I guess there might be some truth to that, seeing as Intel CPUs especially are ridiculously overclockable. There's clearly a lot of headroom for those willing to put up with the increased power consumption and heat.

I don't think anything over 160-180W is practical for a CPU. Clearly, validating them for higher temperatures, as you say, would increase cost. Phenom II is only good for up to 62-64C, while GPUs can go as high as 90C or more. Even with heatpipe coolers, you can quite easily hit temperatures in the mid-50s in a mid-sized case without too many case fans. My Phenom II hovers around the 55C mark running Prime95, with a ridiculously huge Zalman heatpipe cooler strapped on.
 

bamacre

Lifer
Jul 1, 2004
21,029
2
81
I've been gaming for the past 3 years on a Dell with a 375W PSU (with one 6-pin). It hasn't been a problem for me. If you're gaming on multiple monitors and/or at a resolution greater than 1920x1080, I could see where you'd need more. Currently my Q6600 (stock) and "green" GTX 260 are just fine for 1080p gaming. If you think power requirements are too high, reconsider your needs.