Go Back   AnandTech Forums > Hardware and Technology > Video Cards and Graphics


Old 11-04-2010, 03:50 PM   #1
Throckmorton
Lifer
 
Throckmorton's Avatar
 
Join Date: Aug 2007
Location: Houston, TX
Posts: 16,599
Default Why are modern videocards so power hungry?

It's been a couple of years since I last bought a video card, so I've been researching for the past week. I'm shocked and appalled at how power hungry they are now.

Remember when video cards were powered by the AGP bus? Then at some point, a few cards started sprouting Molex connectors. Then came the 6 pin connectors on the PCIE cards. Fine, I can see the need to pump more power than the motherboard can supply.

But now high end videocards have two power connectors and draw 250W+, and it's been getting worse with each of the recent generations, and apparently that will continue.

Meanwhile, CPUs have become more efficient, hard drives are gradually transitioning to solid state, cars are more efficient even while putting out 2x the horsepower of a decade ago, and diesel trucks are cleaner than ever. So why are video cards the exception? Why is an 850W power supply now the norm for running a single video card? Will it not end until PSUs can't keep up?

Is there some physical reason that videocards can't become more powerful without requiring more and more power, or is it just a cost tradeoff?
__________________
K&N air filter = 45x as much dirt in your engine (working link)
http://i52.tinypic.com/50lf0y.gif
"It has already been proven that the concept of a "living wage" is a liberal fantasy." - Patranus

Last edited by Throckmorton; 11-04-2010 at 03:56 PM.
Throckmorton is offline   Reply With Quote
Old 11-04-2010, 03:53 PM   #2
TemjinGold
Platinum Member
 
Join Date: Dec 2006
Posts: 2,406
Default

Quote:
Originally Posted by Throckmorton View Post
Is there some physical reason that videocards can't become more powerful without requiring more and more power, or is it just a cost tradeoff?
Nope. Just look at AMD's current cards.
__________________
Antec P182 | Seasonic X-660 | Cyberpower 1000PFCLCD |
Intel i5 2500k @4.2 w/ Corsair H55 | ASUS Maximus IV Gene-Z |
16gb G.SKILL DDR3 1600 | ASUS GTX580 |
Sharp AQUOS LC-32LE700UN 32" LED HDTV | 1x 960 gb Crucial M500 |
1x 1TB Samsung F3 | Samsung SATA DVD Burner | Logitech Z-5500
TemjinGold is offline   Reply With Quote
Old 11-04-2010, 03:56 PM   #3
IntelUser2000
Elite Member
 
IntelUser2000's Avatar
 
Join Date: Oct 2003
Posts: 3,395
Default

That's because both ATI and Nvidia went with claims of "Moore's Law cubed," saying they would double performance every 6 months, while in reality they were building ever bigger, more power-hungry chips to achieve it. The performance gains were impressive, but the power cost came with them.

Plus, Moore's Law is about doubling transistors, and the GPU performance-doubling rate slowed to 2x/year, obviously limited by power.
__________________
Core i7 2600K + Turbo Boost | Intel DH67BL/GMA HD 3000 IGP | Corsair XMS3 2x2GB DDR3-1600 @ 1333 9-9-9-24 |
Intel X25-M G1 80GB + Seagate 160GB 7200RPM | OCZ Modstream 450W | Samsung Syncmaster 931c | Windows 7 Home Premium 64-bit | Microsoft Sidewinder Mouse | Viliv S5-Atom Z520 WinXP UMPC
IntelUser2000 is offline   Reply With Quote
Old 11-04-2010, 03:57 PM   #4
fffblackmage
Platinum Member
 
fffblackmage's Avatar
 
Join Date: Dec 2007
Posts: 2,535
Default

I would say the new 68xx cards are less power hungry than the 58xx, but the 69xx will probably end up using more power anyway.

An 850W PSU seems more normal for an SLI setup than anything else. A 500-650W PSU is sufficient for most single-GPU setups.
fffblackmage is offline   Reply With Quote
Old 11-04-2010, 04:01 PM   #5
GaiaHunter
Diamond Member
 
GaiaHunter's Avatar
 
Join Date: Jul 2008
Posts: 3,369
Default

Video card performance has increased much more than CPU performance in the last few years.

Still, look at the 5870's performance/watt compared to the 4870X2 and GTX 295, or the 5850's performance/watt vs. the 4870/GTX 285, etc.
GaiaHunter is online now   Reply With Quote
Old 11-04-2010, 04:04 PM   #6
Arkadrel
Diamond Member
 
Join Date: Oct 2010
Posts: 3,632
Default

The Nvidia 480 has a TDP of 250W or something, but that's at average load... if you stress it completely it will go up to about ~320 watts. AMD cards use a bit less. The TDP they show is the max they can get the card to use (measured at the wall), and they usually have lower ratings than their Nvidia counterparts.

So when you look at TDP, be aware that Nvidia isn't measuring the maximum possible draw but average load(s), while AMD shows the most they can possibly get out of a card. Why does Nvidia do it differently? Because their current cards use more power per unit of performance, and they don't want to look bad.

There are YouTube videos of a guy with 4x SLI 480s and a watt meter, with a PSU that powers only the graphics cards. He gets over 1600 watts used: 1600 watts / 4 cards = 400 watts per card.

That means a current 480, if overvolted to overclock, can draw over 400 watts.

Why do newer graphics cards draw more power? Because apparently nobody can figure out a way to get big improvements without using more energy. Neither Nvidia nor AMD can make huge performance gains without drawing more.

Anyway, if power is expensive where you are and you want a card that doesn't use as much, you're better off with an AMD product.
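One caveat on that 400W-per-card figure: a wall meter reads AC watts, which include the PSU's conversion losses, so the DC power the cards actually draw is somewhat lower. A rough back-calculation, assuming a hypothetical ~85% PSU efficiency at that load:

```python
# At-the-wall readings include PSU conversion losses, so the cards
# themselves draw less DC power than the wall meter shows.
wall_watts = 1600      # four GTX 480s on a dedicated PSU, per the video
num_cards = 4
psu_efficiency = 0.85  # assumed for illustration, not measured

dc_watts_per_card = wall_watts * psu_efficiency / num_cards
print(round(dc_watts_per_card))  # -> 340
```

Still enormous, but "over 400W per card" is really the wall figure, not what each card dissipates.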

Last edited by Arkadrel; 11-04-2010 at 04:12 PM.
Arkadrel is offline   Reply With Quote
Old 11-04-2010, 04:16 PM   #7
tyl998
Senior Member
 
tyl998's Avatar
 
Join Date: Aug 2010
Posts: 236
Default

I have a 750W Antec power supply. It supplies my 460 SLI rig just fine. With one card, even 550W would be fine.
__________________
Current rig:
CPU - Intel i7-930 @ 4.0 ghz
MOBO - Gigabyte GA-X58A-UD3R rev2.0
GPU - GIGABYTE GV-N460OC-1GI GeForce GTX 460 1GB x2 SLI
RAM - CORSAIR XMS3 6GB (3 x 2GB) DDR3 1333 (PC3 10666) Triple Channel Kit Model TR3X6G1333C9
HDD - Samsung Spinpoint F3 1TB 7200 rpm
PSU - Antec TruePower New TP-750 Blue
Case - Antec 300
CPU Cooler - Antec Kuhler Box
tyl998 is offline   Reply With Quote
Old 11-04-2010, 04:21 PM   #8
Idontcare
Administrator
Elite Member
 
Idontcare's Avatar
 
Join Date: Oct 1999
Location: 台北市
Posts: 20,120
Default

Quote:
Originally Posted by Throckmorton View Post
Is there some physical reason that videocards can't become more powerful without requiring more and more power, or is it just a cost tradeoff?
I suspect it may have already been addressed, but I need to pad my post-count so I'll chime in with the following:

They are, and you can.

You could dial the clock speed down on any GPU to the point that its power consumption was low enough for it to be fed solely by the mobo, and even at those paltry clocks the card would still outperform any prior-generation video card benched with its clocks dialed down to fit inside the same power footprint.

What changed was that people became willing to buy >400W PSUs and deal with >75W of dissipation from their video cards. Where there is a market, there will be capitalism.

So we've had performance in video cards grow at a rate faster than it might have had video cards been TDP-limited like CPUs (where the max power is on the order of 140W for retail cooling solutions).

But even CPUs can reach >300W power dissipation if you are willing to spend $ on a 3rd-party cooling solution.
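The downclocking trade-off Idontcare describes can be sketched with the standard dynamic-power relation, P ≈ C·V²·f. All numbers below are illustrative, not measurements of any real card:

```python
def scaled_power(base_w, base_v, base_mhz, v, mhz):
    """Scale dynamic power from a known operating point: P ~ V^2 * f."""
    return base_w * (v / base_v) ** 2 * (mhz / base_mhz)

# A hypothetical 250 W card at 1.00 V / 700 MHz, undervolted and
# downclocked to fit within the 75 W a PCIe slot can supply by itself.
p = scaled_power(250, 1.00, 700, v=0.75, mhz=350)
print(round(p, 1))  # -> 70.3
```

Because power scales with the square of voltage, dropping voltage alongside clocks buys back a disproportionate amount of headroom, which is why a quartered power budget costs only about half the clock speed in this sketch.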
Idontcare is online now   Reply With Quote
Old 11-04-2010, 06:00 PM   #9
Throckmorton
Lifer
 
Throckmorton's Avatar
 
Join Date: Aug 2007
Location: Houston, TX
Posts: 16,599
Default

Good points about videocards becoming more powerful much faster than CPUs.

How big is the power usage difference between AMD and Nvidia? How do midrange cards like the 6850 and GTX 460 compare? (Those are equivalent, right?)
Throckmorton is offline   Reply With Quote
Old 11-04-2010, 06:23 PM   #10
blastingcap
Diamond Member
 
blastingcap's Avatar
 
Join Date: Sep 2010
Posts: 5,570
Default

RAM eats energy too, and GDDR5 at high voltages/clocks even more so. RAM size/speed has shot up over the years.
__________________
Quote:
Originally Posted by BoFox View Post
We had to suffer polygonal boobs for a decade because of selfish corporate reasons.
Main: 3570K + R9 290 CF + Crucial 16GB 1866 + AsRock Extreme4 Z77 + Eyefinity 5760x1080 eIPS

NAS and HTPC/workstation: Supermicro MBD-X9SCM + G530 + 16GB ECC; ASUS P8B WS + i3-3220; 1.1TB of Intel/Crucial/Samsung SSDs + 26TB of WD/Hitachi HDDs
blastingcap is offline   Reply With Quote
Old 11-04-2010, 06:30 PM   #11
KingstonU
Golden Member
 
KingstonU's Avatar
 
Join Date: Dec 2006
Posts: 1,224
Default

That is a good point, how does GDDR5 compare to DDR3 in terms of power usage and performance? Are they even comparable?
__________________
Quote:
Originally Posted by flvinny521 View Post
Has anyone really been far even as decided to use even go want to do look more like?
Quote:
Originally Posted by DixyCrat View Post
Youíve got to be kidding me. Iíve been further even more decided to use even go need to do look more as anyone can. Can you really be far even as decided half as much to use go wish for that? My guess is that when one really been far even as decided once to use even go want, it is then that he has really been far even as decided to use even go want to do look more like. Itís just common sense.
KingstonU is offline   Reply With Quote
Old 11-04-2010, 06:37 PM   #12
nenforcer
Golden Member
 
nenforcer's Avatar
 
Join Date: Aug 2008
Posts: 1,454
Default

It's called 3 billion transistors needed to accelerate my virtual girlfriend.

__________________
nForcer 2
======
AMD Sempron 3300+ @ 2.2GHz Barton Sock A 512Kb L2
ASUS A7N8X-E Deluxe nForce 2 MB
Seagate 7200.10 7200 RPM 250GB IDE w/ 8MB Cache Perpendicular
EVGA Geforce 7800GS 256MB AGP 8X
BFGTech Ageia Physx 128MB PPU PCI
1GB (512MBx2) Crucial Ballistix DDR400 4-4-4 Dual Channel
nVidia Soundstorm Dolby Digital Coaxial
Sony CPD-E540 21" CRT Monitor 1600x1200 85Hz VSYNC Off
Windows XP SP3
nenforcer is offline   Reply With Quote
Old 11-04-2010, 06:40 PM   #13
Vdubchaos
Diamond Member
 
Join Date: Nov 2009
Posts: 8,259
Default

It's also worth noting that even though video cards use A LOT more power... video game graphics quality in general hasn't really taken a BIG leap AT ALL.
Vdubchaos is offline   Reply With Quote
Old 11-04-2010, 06:44 PM   #14
Throckmorton
Lifer
 
Throckmorton's Avatar
 
Join Date: Aug 2007
Location: Houston, TX
Posts: 16,599
Default

Quote:
Originally Posted by Vdubchaos View Post
It's also worth noting that even though video cards use A LOT more power... video game graphics quality in general hasn't really taken a BIG leap AT ALL.
That's one thing I've noticed for the past 2-5 years. I go back and play old games like Far Cry and they look amazing because of the artistic quality. Newer games have more effects, but they don't really LOOK better. Even technically, the effects these new cards are driving don't seem that impressive.

Another thing is when you turn the options down so new games can run on older hardware, they look much worse than the old games did.
Throckmorton is offline   Reply With Quote
Old 11-04-2010, 06:48 PM   #15
Throckmorton
Lifer
 
Throckmorton's Avatar
 
Join Date: Aug 2007
Location: Houston, TX
Posts: 16,599
Default

I remember when the Luclin expansion came out for Everquest in 2001. It was a huge leap forward in graphics quality. I think I upgraded my video card to a TNT 2 (or maybe GF2?) so I could run it.

Well early this year I fired up Everquest for old time's sake. It still looks great. The graphics quality and the artistic quality are BETTER than World of Warcraft.
Throckmorton is offline   Reply With Quote
Old 11-04-2010, 06:51 PM   #16
nenforcer
Golden Member
 
nenforcer's Avatar
 
Join Date: Aug 2008
Posts: 1,454
Default

and this bitch too

nenforcer is offline   Reply With Quote
Old 11-04-2010, 06:52 PM   #17
Zap
Super Moderator
Off Topic
Elite Member
 
Zap's Avatar
 
Join Date: Oct 1999
Location: Somewhere Gillbot can't find me
Posts: 22,378
Default

Quote:
Originally Posted by Throckmorton View Post
Is there some physical reason that videocards can't become more powerful without requiring more and more power, or is it just a cost tradeoff?
You can ask the same thing about CPUs. My first system didn't even need a heatsink on the CPU. Then, we went with tiny heatsinks/fans. Now even Intel has a tower heatpipe monstrosity in the lineup.

Quote:
Originally Posted by Throckmorton View Post
That's one thing I've noticed for the past 2-5 years. I go back and play old games like Far Cry and they look amazing because of the artistic quality. Newer games have more effects, but they don't really LOOK better.
That's because companies are throwing tech at it instead of throwing better artists at it. Also, some games (BFBC2?) just don't lend themselves to looking good. After all, how good can dirt and camouflage look?
__________________
The best way to future-proof is to save money and spend it on future products. (Ken g6)

SSD turns duds into studs. (JBT)
Zap is offline   Reply With Quote
Old 11-04-2010, 06:58 PM   #18
Gloomy
Golden Member
 
Gloomy's Avatar
 
Join Date: Oct 2010
Posts: 1,127
Default

Quote:
Originally Posted by Zap View Post
That's because companies are throwing tech at it instead of throwing better artists at it. Also, some games (BFBC2?) just don't lend themselves to looking good. After all, how good can dirt and camouflage look?
I thought the greenery and water in the game looked pretty great. That being said, Battlefield imo has always had better sound than graphics.
Gloomy is online now   Reply With Quote
Old 11-04-2010, 07:01 PM   #19
dualsmp
Golden Member
 
dualsmp's Avatar
 
Join Date: Aug 2003
Location: SC
Posts: 1,514
Default

The 6850 is the most powerful card using a single six pin PCI-E connector.
dualsmp is offline   Reply With Quote
Old 11-04-2010, 07:02 PM   #20
blastingcap
Diamond Member
 
blastingcap's Avatar
 
Join Date: Sep 2010
Posts: 5,570
Default

Quote:
Originally Posted by Zap View Post
That's because companies are throwing tech at it instead of throwing better artists at it. Also, some games (BFBC2?) just don't lend themselves to looking good. After all, how good can dirt and camouflage look?
I've often wondered why companies don't simply outsource texture-making or something. Why do X companies need to make X sets of dirt textures? Why hasn't a middleware company emerged that specializes in making scalable textures (from crap to photorealistic)? Companies could further mod the textures too, like how Valve modded Havok physics, so we don't see literally the same textures in each game. Maybe if companies didn't each have to spend a fortune on graphics and licensed it from the middleware company instead, they would have more money left over to spend on trivial things like, I dunno, making the game fun?
blastingcap is offline   Reply With Quote
Old 11-04-2010, 07:15 PM   #21
aka1nas
Diamond Member
 
Join Date: Aug 2001
Posts: 4,335
Default

GPU architectures went parallel earlier and to a much greater degree than CPU architectures did, and are hitting the "multi-core scaling wall" earlier as a result. It's not like we're going to get 16-core desktop CPUs anytime soon, either.
__________________
Main Rig:
I7 920 D0 @ 4.22Ghz
Asus P6T6 Revolution X58
24GB GSkill DDR3-1333
Radeon 6870
Enermax Galaxy Evo 1250w
GSkill Falcon 128GB SSD x2 (RAID 0)
Intel X25-M 160GB SSD
Seagate 7200.11 1.5TB
Windows 7 Pro 64-bit
aka1nas is offline   Reply With Quote
Old 11-04-2010, 07:23 PM   #22
Arkadrel
Diamond Member
 
Join Date: Oct 2010
Posts: 3,632
Default

Quote:
Originally Posted by Throckmorton View Post
Good points about videocards becoming more powerful much faster than CPUs.

How big is the power usage difference between AMD and nVidia? How do midrange cards like the 6850 and GT460 compare? (those are equivalent right?)

I have a picture with a table showing the watts, measured at the wall, where they backed out the rest of the system's draw (they ran FurMark and measured at the wall). I can't for the life of me remember where the hell I got it.



**Edit: (goes off to look for more charts for the rest)





Source: http://www.techpowerup.com/ (just search for a card and go to the review's "power consumption" page)

Max load (measured at the wall, all cards at stock GPU core/mem/shader clocks, etc.):

480 = 320W
470 = 232W
460 = 155W

5870 = 212W
6870 = 163W
6850 = 125W (from the same site, but a different table)

These are all from one site, instead of what I had written down myself from various places, so they probably give a better representation than the numbers I posted at first. For some reason they can vary a bit from site to site.
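Those max-load draws make it easy to compare efficiency directly. A small sketch; the relative-performance indices below are placeholders for illustration, not benchmark results:

```python
# Max-load wall draws from the TechPowerUp numbers above (watts).
max_load_w = {"GTX 480": 320, "GTX 470": 232, "GTX 460": 155,
              "HD 5870": 212, "HD 6870": 163, "HD 6850": 125}

# Hypothetical relative performance indices (GTX 480 = 100).
perf_index = {"GTX 480": 100, "GTX 470": 82, "GTX 460": 62,
              "HD 5870": 88, "HD 6870": 72, "HD 6850": 62}

for card, watts in sorted(max_load_w.items()):
    print(f"{card}: {perf_index[card] / watts:.3f} perf/W")
```

Even with made-up performance numbers, the shape of the comparison holds: a card drawing half the watts only needs half the frame rate to break even on efficiency.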

Last edited by Arkadrel; 11-04-2010 at 08:01 PM.
Arkadrel is offline   Reply With Quote
Old 11-04-2010, 07:25 PM   #23
blastingcap
Diamond Member
 
blastingcap's Avatar
 
Join Date: Sep 2010
Posts: 5,570
Default

Quote:
Originally Posted by Arkadrel View Post
I have a picture with a table showing the watts, measured off the wall, where they calculated out the system use. I cant for the life of me remember where the hell I got it.


5850 TDP 151w/27w
6850 TDP 127w/19w
5870 TDP 188w/27w
6870 TDP 151w/19w

460 TDP 155w/15w____wiki/nvidia say: 160
470 TDP 232w/29w____wiki/nvidia say: 215
480 TDP 320w/54w____wiki/nvidia say: 250
Those are from TechPowerUp and seem to list TDPs not actual power draws. TDP doesn't necessarily equal power draw.
blastingcap is offline   Reply With Quote
Old 11-04-2010, 07:33 PM   #24
Lonyo
Lifer
 
Lonyo's Avatar
 
Join Date: Aug 2002
Posts: 21,462
Default

http://www.xbitlabs.com/articles/vid...70-hd6850.html

__________________
CPU: Q3570K @ 4.1GHz 1.23v // Mobo: Asus P8Z77-V // GFX: Sapphire Tri-X 290 @ 1000/5200 // RAM: Corsair DDR3 @ 1600MHz 9-9-9-24 // SSD: Samsung 830 128GB
Video cards: TNT2, Ti4400, 9800, 7800GT(+7200GS), HD4850(+HD2400), HD6850, HD7950 (Laptops: GF6150, HD3200, GMA500)
Lonyo is offline   Reply With Quote
Old 11-04-2010, 07:38 PM   #25
tvdang7
Platinum Member
 
Join Date: Jun 2005
Posts: 2,179
Default

Only the super-high-end cards like the GTX 480 and 5970. The high end tends to stay about the same as the generation before it.
__________________
Lenovo Y510p
Intel Haswell 4700QM 2.4GHz
16GB RAM
1TB hard drive + 24GB SSD
nvidia 750m SLI
1080p
tvdang7 is offline   Reply With Quote
Powered by vBulletin® Version 3.8.7
Copyright ©2000 - 2014, vBulletin Solutions, Inc.