How come graphics cards are getting more powerful but require more power than ever? Where's the efficiency?

Hajpoj

Senior member
Dec 9, 2006
288
0
0
Are we really advancing in technology here? CPUs are getting smaller, cheaper, and more powerful while consuming less energy.

It seems graphics cards are getting more powerful, but also larger, hotter, costlier, and more power-hungry than ever. From a technological standpoint this doesn't make sense.

 

Gomce

Senior member
Dec 4, 2000
812
0
76
Graphics cards (and we're talking advanced 3D graphics cards, not embedded 2D) are machines of their own. A CPU by itself can't do much without RAM, a northbridge, etc. A graphics card has a dedicated processor, dedicated RAM, a memory controller, everything; hell, if Nvidia/ATI wanted, they could attach a hard drive and call it a PC :). We're talking massively parallel processing of data here: RAM speeds approaching 2GHz with low latencies, 128 pipelines, 80GB/s+ of bandwidth, etc. In 10 years we'll have graphics cards able to render Toy Story or Final Fantasy (Advent Children) in real time, which today requires a server farm!

The new G80 and R600 will be able to run applications like Folding@home.

Point is, it's a totally different ball game. But someone with actual knowledge could explain this far better than I can.
 

The Sauce

Diamond Member
Oct 31, 1999
4,741
34
91
I have wondered this myself. I think the timing of my next upgrade will be limited by when efficiency improvements arrive in the video card area. I mean, hell... I spent $120 on a great 500W power supply about 2 years ago, and it may just barely cut it with current video cards. Hopefully a transistor process shrink will net some savings. Aren't the video card manufacturers all still running 90-120nm processes?
 

Aries64

Golden Member
Jul 30, 2004
1,030
0
0
While technological advances have allowed graphics cards to become much more powerful than ever before, their power has largely come from an ever-increasing number of transistors. Die shrinks notwithstanding, more transistors mean greater power requirements, greater heat output, and larger dies. And don't forget that the large (512MB+) on-board memory requires not only more power but a larger PCB, which adds to the cost of the videocard beyond the GPU and other components.

Also, contrary to what many out there believe, high-end gaming videocards DO NOT cost more than ever before. There were $600.00 videocards 10 years ago. Those of you who aren't still wet behind the ears and have been into high-end gaming cards for a while might remember the $600.00 Quantum Obsidian3D X-24 SLI card, an add-in 3D-only card. $600.00 and you still needed another card for 2D!

It was a marvel of technology in its day: playing Quake at 1,024x768 (which was/is the highest resolution supported by that game) at 75+ fps was the ultimate back then. I still own one of those cards for nostalgia, along with the Merlin DP server board and the dual Pentium Pro 200MHz/512K CPUs that powered that system. It still runs, too!
 

Hajpoj

Senior member
Dec 9, 2006
288
0
0
I guess we're still working backwards from a technological standpoint. Gaming rigs just keep consuming more and more power as time goes on; I think there's something wrong here, as efficiency has been thrown out the door.

These high-end cards are designed for people who have tons of money to throw at them. And people with tons of money don't give a squat about efficiency; they only care about MORE RAW POWER. The graphics companies know this and feed the beasts.
 

Roguestar

Diamond Member
Aug 29, 2006
6,045
0
0
We need more power in to get more power out even before we start refining efficiency. Look at today's cars: they're many times more efficient than the first chugging, spluttering beasts, but they're also many times more powerful. We can't expect technology to be perfected as soon as it's developed. The graphics card industry moves so fast that it's often not worth refining an old design, because a new one has emerged that, while less efficient than the improved old design would have been, offers more in terms of raw performance. There'd be very little point in ATI bringing out a range of new, super-low-power, low-noise Radeon 8500s, would there?
 

StopSign

Senior member
Dec 15, 2006
986
0
0
Originally posted by: Roguestar
First we get more powerful, then we get more efficient. Look at the Pentium 4.
No, it goes something like...

First you get the heat. Then you get the power. Then you get the FPS.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Originally posted by: Hajpoj
I guess we're still working backwards from a technological standpoint. Gaming rigs just keep consuming more and more power as time goes on, I think there's something wrong here as efficiency has been thrown out the door.

These high end cards are designed for people that have tons of money to throw at them. And people with tons of money don't give a squat about efficiency, they only care about MORE RAW POWER, the gfx companies know this and feed the beasts.

Who's to say that these cards aren't more efficient? Just because a card uses more power, does that automatically make it less efficient? The newer DX10 cards (bar the R600, since we technically don't know its specs, although it should be at least on par with the G80) are much faster than their predecessors. When they compare processors on AnandTech, they look at performance per unit of power consumed (FPS/W, for example). If you look at the same comparison for, say, the 8800GTX vs. the X1950XTX, you'll see which one is more power efficient :p.
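To make that metric concrete, here's a minimal sketch of an FPS-per-watt comparison; the card names and figures below are made up for illustration, not actual benchmark results.

```python
# Rough illustration of a performance-per-watt comparison.
# The FPS and wattage figures are made-up placeholders, not measured
# benchmark numbers; plug in real review data to compare actual cards.

cards = {
    "Card A (older gen)": {"fps": 60.0, "watts": 120.0},
    "Card B (newer gen)": {"fps": 110.0, "watts": 180.0},
}

for name, data in cards.items():
    perf_per_watt = data["fps"] / data["watts"]
    print(f"{name}: {perf_per_watt:.2f} FPS per watt")

# Card B draws more power in absolute terms, but if its FPS/W is higher,
# it is still the more efficient card by this metric.
```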
 

43st

Diamond Member
Nov 7, 2001
3,197
0
0
Efficiency is a big deal, and hardware manufacturers are doing pretty well in my opinion. The Intel Core 2 Duo blew the AMD X2 processors out of the water with regard to power consumption, and AMD is striking back with the 65-watt-rated 5000+ X2, which boggles the mind considering what we were using 20+ years ago.

As for graphics processors, they're a step behind but still doing pretty well. The Nvidia 7900GS holds the crown right now for maximum performance per watt consumed. As the GPU gets closer to the CPU in architecture, with DX10, it can also start sharing low-power design techniques.

Also keep an eye on PC power supplies. They're coming very close to the 90% efficiency mark, especially Sea Sonic and now the new Corsair PSU line.
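As a quick back-of-the-envelope illustration of why that efficiency figure matters (the 400W load below is just an example number), wall draw is simply the DC load divided by the supply's efficiency:

```python
# Back-of-the-envelope PSU efficiency math: the wattage a system pulls
# from the wall is the DC load divided by the supply's efficiency.
# The 400W load is an example figure, not a measured system.

dc_load_watts = 400.0

for efficiency in (0.70, 0.80, 0.90):
    wall_draw = dc_load_watts / efficiency
    wasted = wall_draw - dc_load_watts
    print(f"{efficiency:.0%} efficient PSU: {wall_draw:.0f}W from the wall, "
          f"{wasted:.0f}W lost as heat")
```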

But you're right... graphics cards are a step behind. They do seem to be in line with the energy consumption trends though, and that's a good thing.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Things are getting less power efficient. It's a fundamental physical problem more than an architectural problem. The recent drop in processor power use is only temporary; it will go back up in the not-too-distant future.

For a long time, little regard was given to integrated circuit power consumption. The only reason CPU companies started caring about power was that they were forced to: the huge amount of heat given off by processors was impeding clock scaling. CPU power consumption could be moderated because companies actually started designing parts for lower power consumption. However, design improvements can only get you so far. Fundamentally, the cause of the problem lies with the laws of physics, not with the designs. As feature sizes shrink, the surface-area-to-volume ratio gets worse and gate insulators get thinner, both of which lead to more leakage. Also, dynamic power consumption increases with frequency.
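For what it's worth, the usual first-order model of switching power is P ≈ α·C·V²·f, so a clock bump that also requires a voltage bump costs disproportionately more power. Here's a tiny sketch of that relationship; the 20% clock and 10% voltage increases are just example numbers:

```python
# First-order CMOS dynamic power model: P ~ alpha * C * V^2 * f.
# Absolute numbers would need real capacitance/activity values, so this
# only looks at power *relative* to a baseline when clock and voltage scale.

def relative_dynamic_power(freq_scale: float, voltage_scale: float) -> float:
    """Dynamic power relative to baseline for the given frequency/voltage multipliers."""
    return freq_scale * voltage_scale ** 2

# Example: a 20% clock increase that also needs a 10% voltage increase.
print(f"{relative_dynamic_power(1.2, 1.1):.2f}x the baseline dynamic power")  # ~1.45x
```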

GPU manufacturers have done little to address power consumption so far. They use block design rather than full-custom circuit design, so they cannot tune non-critical-path transistors for lower power consumption. They also have less advanced fabs in many respects. Eventually, GPU manufacturers will have to start making full-custom designs, because otherwise high heat output will stop them from attaining higher performance. I think Nvidia may have already taken a step in that direction, given the high clock speed of the shaders in the G80. When full-custom designs arrive, expect a big initial leap in performance coupled with longer design cycles. Clock speeds of GPUs will also start to approach the clock speeds of CPUs.

Are we really advancing in technology here?
Historically, advancing technology has come with greater energy usage across all fields of technology.
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
CPU power consumption keeps climbing as well; just look at it over a longer time span.
In the 486 era and earlier, CPU fans were rare and not needed. I have no idea what wattage those chips consumed.

With the Pentium (the P54, not the original, which was phased out fairly quickly), you could still get by with a large heat sink, or a smaller one with a low-speed 40mm fan; heat-sink-only cooling became increasingly rare as speeds went up. We're at around 10-15W here.

P2s could be passively cooled because the slot design provided space for enormous heat sinks; the ones with smaller sinks all had fans.
Coppermines came exclusively with fans, aside from some custom OEM jobs, and consumed upwards of 30 watts.

Athlons? Well, forget running those passively without some serious custom cooling setup; we're above the 50W line here.

And of course Prescott; no comment needed.

It's a steady rise. Now and then it drops a little, but over the course of any 5 years or so, the trend has been steadily upward.
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
Because we (consumers) want a 75-100% increase in performance from video cards every generation, otherwise we won't buy them. If AMD or Intel manage a 15-25% increase in performance, we're lucky.