
New GPUs have stupid power requirements

Stiganator

Platinum Member
I think it is retarded for nvidia and ati to release a product that uses 270W. Seriously, my Quadro FX 560 uses only 35W and it can power two 1920x1200 displays. Why are they making such inefficient parts?
 
I think it is retarded for nvidia and ati to release a product that uses 270W. Seriously, my Quadro FX 560 uses only 35W and it can power two 1920x1200 displays. Why are they making such inefficient parts?

...because your Quadro FX560 has probably 1/10 the 3D performance of an 8800GTX, if even that?

First off, no video card uses 270W. HardOCP measured a whole system with an 8800GTX, and at the wall (that is, not taking the PSU's efficiency into account) the whole system used 345W under load. NVIDIA gives a maximum 'thermal load' of 185W for the 8800GTX, and HardOCP estimated the actual load (taking the PSU's efficiency into account) around 150W in their system.

http://enthusiast.hardocp.com/article.html?art=MTIxOCwxNiwsaGVudGh1c2lhc3Q=

An 8800GTX SLI setup might be pulling close to 300W from your PSU. But no single card is close to that.
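For a rough sense of how an at-the-wall reading maps to actual DC load, here's a quick back-of-envelope sketch. The ~80% efficiency figure is an assumption (typical of PSUs of that era), not a measured number:

```python
def dc_load_watts(wall_watts, efficiency=0.80):
    """Estimate the DC-side load from wall-socket draw.

    PSU efficiency is assumed (~80%, typical for the era); the PSU
    dissipates the remainder as heat before it reaches the components.
    """
    return wall_watts * efficiency

# HardOCP's 345W at-the-wall figure for the whole 8800GTX system:
print(dc_load_watts(345))  # 276.0 W of actual DC load
```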

If you look at, say, a GF1, it was probably drawing something like 10-20W. The 8800GTX is drawing 10-20x as much power, but is at least 100x more powerful if you look at the actual amount of work it can do (high AA/AF, pixel shaders, etc.) Graphics cards have been getting (for the most part) more efficient, not less. It's just that the overall computational power has gone up more than efficiency has compensated for.
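Putting rough numbers on that efficiency argument (all figures here are the post's own illustrative estimates, not measurements):

```python
# Illustrative figures from the paragraph above (assumed, not measured)
gf1_watts = 15      # GeForce 1, mid-point of the 10-20W guess
gtx_watts = 150     # 8800GTX, HardOCP's in-system estimate
perf_ratio = 100    # 8800GTX doing ~100x the work

power_ratio = gtx_watts / gf1_watts            # 10x the power...
perf_per_watt_gain = perf_ratio / power_ratio  # ...but ~10x the work per watt
print(perf_per_watt_gain)  # 10.0
```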

If you just want to run dual monitors for 2D work, you do not need a high-end gaming card. Modern integrated graphics can handle that at a fraction of the power usage.
 
First off, no video card uses 270W. HardOCP measured a whole system with an 8800GTX, and at the wall (that is, not taking the PSU's efficiency into account) the whole system used 345W under load. NVIDIA gives a maximum 'thermal load' of 185W for the 8800GTX, and HardOCP estimated the actual load (taking the PSU's efficiency into account) around 150W in their system.

Yup, this whole "I gotta have a huge 700+W PSU" trend of late is way overrated. We're running our main system (FX55, X1950XT, etc.) w/ an FSP 450W and our backup (OC'ed 3000+, X850XT, etc.) w/ an Antec 350W, and both have plenty of power to spare. As far as PSUs go, quality is far better than quantity IMHO.

 
It won't be long before you'll have to upgrade the electrical wiring in your house in order to run your gaming box. Most house circuits are 15-amp circuits. At 120V, that yields 1800W max, or about 1440W with the usual 80% continuous-load safety margin. With gamer power supplies starting to reach the 1000+W range, this is going to start causing problems (remember, you'll most likely have other things plugged into that circuit too...your monitor, a lamp, a stereo...).

Who's going to want to install a dedicated 20A circuit in their bedroom or home office in order to run the latest games? Better yet, I can envision some boutique power supply companies switching to 220V so that the line currents are halved (a 15A@220V circuit can deliver around 2600W with the same safety margin). Have we reached the point where having 220V outlets in the home office is a selling point of the house?
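arcas's circuit math can be sketched like this. The 80% figure is the standard continuous-load derating rule of thumb (e.g. in the US NEC); the exact margin varies by local electrical code:

```python
def circuit_capacity_watts(volts, amps, derate=0.80):
    """Continuous power a branch circuit can safely deliver.

    The 80% derating is the usual continuous-load rule of thumb;
    local electrical codes may differ.
    """
    return volts * amps * derate

print(circuit_capacity_watts(120, 15))  # 1440.0 W - typical US bedroom circuit
print(circuit_capacity_watts(220, 15))  # 2640.0 W - same breaker rating at 220V
```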

 
Since when do either of the 8800 series cards use 270W of power?
Anyways...
arcas, you make some good points. Both ATi and nVidia have image quality and performance worked out at this point. What discrete graphics developers need to start doing is moving to a 65nm process for their upcoming cards in order to lower heat and power requirements. A smaller die would also reduce the need for giganto power-eating dual-slot cooling solutions.
 
Originally posted by: conlan
First off, no video card uses 270W. HardOCP measured a whole system with an 8800GTX, and at the wall (that is, not taking the PSU's efficiency into account) the whole system used 345W under load. NVIDIA gives a maximum 'thermal load' of 185W for the 8800GTX, and HardOCP estimated the actual load (taking the PSU's efficiency into account) around 150W in their system.

Yup, this whole "I gotta have a huge 700+W PSU" trend of late is way overrated. We're running our main system (FX55, X1950XT, etc.) w/ an FSP 450W and our backup (OC'ed 3000+, X850XT, etc.) w/ an Antec 350W, and both have plenty of power to spare. As far as PSUs go, quality is far better than quantity IMHO.

Very true...lots of people buy into it, buy some 600W no-name PSU, and wonder why their system blows up. They'd have been much better off getting a 450W from a well-regarded and proven company.
 
Originally posted by: arcas
It won't be long before you'll have to upgrade the electrical wiring in your house in order to run your gaming box. Most house circuits are 15-amp circuits. At 120V, that yields 1800W max, or about 1440W with the usual 80% continuous-load safety margin. With gamer power supplies starting to reach the 1000+W range, this is going to start causing problems (remember, you'll most likely have other things plugged into that circuit too...your monitor, a lamp, a stereo...).

Who's going to want to install a dedicated 20A circuit in their bedroom or home office in order to run the latest games? Better yet, I can envision some boutique power supply companies switching to 220V so that the line currents are halved (a 15A@220V circuit can deliver around 2600W with the same safety margin). Have we reached the point where having 220V outlets in the home office is a selling point of the house?

Forget that, bypass your power company and get a personal naquadah reactor. You'll need one to power the X4500XT and an Athlon FX 85. 😛

 
Originally posted by: AgentJean
Originally posted by: arcas
It won't be long before you'll have to upgrade the electrical wiring in your house in order to run your gaming box. Most house circuits are 15-amp circuits. At 120V, that yields 1800W max, or about 1440W with the usual 80% continuous-load safety margin. With gamer power supplies starting to reach the 1000+W range, this is going to start causing problems (remember, you'll most likely have other things plugged into that circuit too...your monitor, a lamp, a stereo...).

Who's going to want to install a dedicated 20A circuit in their bedroom or home office in order to run the latest games? Better yet, I can envision some boutique power supply companies switching to 220V so that the line currents are halved (a 15A@220V circuit can deliver around 2600W with the same safety margin). Have we reached the point where having 220V outlets in the home office is a selling point of the house?

Forget that, bypass your power company and get a personal naquadah reactor. You'll need one to power the X4500XT and an Athlon FX 85. 😛

And what if the last naquadah reactor you can find is in the hands of the Ori? No X4500XT for you.
 
Originally posted by: Zenoth
Originally posted by: AgentJean
Originally posted by: arcas
It won't be long before you'll have to upgrade the electrical wiring in your house in order to run your gaming box. Most house circuits are 15-amp circuits. At 120V, that yields 1800W max, or about 1440W with the usual 80% continuous-load safety margin. With gamer power supplies starting to reach the 1000+W range, this is going to start causing problems (remember, you'll most likely have other things plugged into that circuit too...your monitor, a lamp, a stereo...).

Who's going to want to install a dedicated 20A circuit in their bedroom or home office in order to run the latest games? Better yet, I can envision some boutique power supply companies switching to 220V so that the line currents are halved (a 15A@220V circuit can deliver around 2600W with the same safety margin). Have we reached the point where having 220V outlets in the home office is a selling point of the house?

Forget that, bypass your power company and get a personal naquadah reactor. You'll need one to power the X4500XT and an Athlon FX 85. 😛

And what if the last naquadah reactor you can find is in the hands of the Ori? No X4500XT for you.

ZPM FTW?
 
Originally posted by: arcas
Have we reached the point where having 220V outlets in the home office is a selling point of the house?

I should be glad that here in the UK, I have 230-240volts by default anyway! 😀
 
Yeah, it's pretty lame. That's why I'm skipping out on the NVIDIA 8800. I'm going to wait until the 8600 Ultra comes out. I'm gonna "vote" with my money and only purchase energy-efficient, cool-running components from now on.

I currently have a 7800 GTX OC + AMD 4800+ and it heats the room like a space heater.
 
Originally posted by: TrandM
Originally posted by: Zenoth
Originally posted by: AgentJean
Originally posted by: arcas
It won't be long before you'll have to upgrade the electrical wiring in your house in order to run your gaming box. Most house circuits are 15-amp circuits. At 120V, that yields 1800W max, or about 1440W with the usual 80% continuous-load safety margin. With gamer power supplies starting to reach the 1000+W range, this is going to start causing problems (remember, you'll most likely have other things plugged into that circuit too...your monitor, a lamp, a stereo...).

Who's going to want to install a dedicated 20A circuit in their bedroom or home office in order to run the latest games? Better yet, I can envision some boutique power supply companies switching to 220V so that the line currents are halved (a 15A@220V circuit can deliver around 2600W with the same safety margin). Have we reached the point where having 220V outlets in the home office is a selling point of the house?

Forget that, bypass your power company and get a personal naquadah reactor. You'll need one to power the X4500XT and an Athlon FX 85. 😛

And what if the last naquadah reactor you can find is in the hands of the Ori? No X4500XT for you.

ZPM FTW?

Wasn't the last ZPM depleted though? 🙁

 