Is the power usage of a GPU a major factor in your purchase decision?

Page 4 of the AnandTech Forums discussion thread.

Is the power usage of a GPU a major factor in your purchase decision?

  • Yes

  • No


Feb 19, 2009
10,457
10
76
Broad questions like this are pointless; power use is always a factor, and people who say it isn't are being silly. What if a GPU used 300W but performed like another that uses 150W? It certainly matters.

Power by itself is meaningless; it's PERF/W that matters.

The R290 uses 15W of extra power compared to the 780 in gaming loads; that difference is ~7%, not even 10%.

http://www.techpowerup.com/reviews/AMD/R9_290/24.html
[Chart: power_average.gif, TechPowerUp average gaming power draw]


For reference's sake (for those with short-term memory), the GTX 480 used ~90% more power than its competitor, the 5870.

The R290X uses 37W extra by itself while generating a big performance advantage (which also pushes the CPU and the rest of the system harder, raising overall system consumption).

From [H]'s results at 1600p, Eyefinity and 4K gaming, the R290X utterly destroys the 780 and Titan. Is that extra power consumption justified? Hell yes, unless you're some kind of green freak.

The 780 Ti leaks put its power use on par with the R290X for ~8% above Titan performance. That should land it at roughly similar perf/W, versus a CRAP and HOT-running AMD reference R290X...

So, where's the beef?
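The percentage arithmetic above is easy to sanity-check. Note the 210W GTX 780 baseline below is an assumption inferred from the quoted "15W extra is ~7%" figure, not a number stated in the post:

```python
# Sanity check of the "15W extra = ~7%" claim quoted above.
# ASSUMPTION: ~210 W GTX 780 gaming baseline, inferred from the quoted figure.

def pct_extra(a: float, b: float) -> float:
    """How much more power (in percent) draw a represents versus draw b."""
    return (a - b) / b * 100

gtx_780_w = 210.0            # assumed baseline, watts
r9_290_w = gtx_780_w + 15.0  # "15W extra" per the post

print(f"R9 290 draws {pct_extra(r9_290_w, gtx_780_w):.1f}% more power")
```

With that assumed baseline, 15W works out to about 7.1% extra, consistent with the post's claim.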
 

spat55

Senior member
Jul 2, 2013
539
5
76
I don't mind the power draw; what matters is how well the cooler dissipates it. In the winter it makes a nice heater, though, as we hardly ever use central heating. Blame the fire downstairs and free wood :(
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Broad questions like this are pointless; power use is always a factor [...] So, where's the beef?
Great summary, really shows how some people are trying to rewrite history. The GTX 480 was an inefficient pig compared to the rest of the market. The 290X uses a bit more power because it's faster, nothing else to say about it.

Personally, unless it's something outrageous like the GTX 480, power consumption really doesn't matter much to me.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Power by itself is meaningless; it's PERF/W that matters. The R290 uses 15W of extra power compared to the 780 in gaming loads; that difference is ~7%, not even 10%.

Correct. The power usage of the 290 and 290X cards is not astoundingly higher than the Titan's in gaming loads, which means AMD could have created an acoustic solution, if not as good as the Titan shroud, then close to it. I know I'm beating a dead horse, but I seriously don't get the design decisions at AMD. With the Titan drawing only 10-15W less, AMD could easily have delivered similar acoustics; again, the power difference is not huge unless the 290 cards are overclocked/overvolted.

I still like AMD GPUs, and the 290 is still amazing price/performance with noise compromises. It seems to me that Nvidia focuses a bit more on the user experience in terms of software and acoustics, and I really wish AMD would step up here. IMHO, nobody would blink an eye at a 290/290X costing $20-30 more but with a much better acoustic solution; that would also help offset the quiet-mode performance issues (as compared to uber mode). Again, I don't get the design decisions. Dead horse, I know, I know.

In my mind, maybe a few folks care about power at the high end, but once you get into the realm of SLI and super-high-end halo stuff, I think a lot of people care less; what they *DO* care about are indirect effects such as heat and noise. That's the category I fall under.
 

Pandora's Box

Senior member
Apr 26, 2011
428
151
116
It's not a major purchase decision, but it is a consideration. If it's reasonable I won't give it a second thought, "reasonable" being a slight increase over the last generation. If it's a large jump, however, I will question it before buying. Usually I base it on whether or not my power supply can handle two of the cards.
 
Feb 19, 2009
10,457
10
76
AMD doesn't need to focus on the "user experience" in terms of acoustics, because that's what their AIBs do, with a plethora of models coming soon. It's not news; look at the 7970 GHz Edition and 7950 Boost. The reference models were an utter joke: they used anywhere from 20 to 50W EXTRA compared to AIB models, and were loud.

An R290/X on a good cooler will end up cooler, quieter, and using less power, thanks to less leakage than when operating at 95°C.

I do agree AMD needs to put extra effort into their reference design. They are clearly not even trying.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
As I get older I care more about idle power consumption, as that's where my PC spends most of its time these days. Load power consumption doesn't really bother me, as I only play 5-7 hours per week, which won't add much to my power bill. The noise factor is huge for my HTPC, but not so much for my gaming box, as I wear headphones while gaming.
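The "won't add much" claim checks out with some quick arithmetic. Here's a sketch using hypothetical round numbers (a 250W card and $0.15/kWh; neither figure is from the post):

```python
# Rough monthly cost of 7 h/week of gaming on a hypothetical 250 W card.
card_load_w = 250        # watts under load (assumed)
hours_per_week = 7       # upper end of the post's 5-7 h/week
rate_per_kwh = 0.15      # $/kWh (assumed)

kwh_per_month = card_load_w / 1000 * hours_per_week * 52 / 12
cost_per_month = kwh_per_month * rate_per_kwh
print(f"~{kwh_per_month:.1f} kWh/month, roughly ${cost_per_month:.2f}/month")
```

Even doubling the assumed wattage only pushes this to a couple of dollars a month, which supports the point.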
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
Power draw isn't a major issue, but obviously if two cards perform similarly, I'll go for the one with less power draw, as that means less heat (and subsequently less noise). :)
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
It does, because it translates to heat expelled into the room I keep the PC in. My previous GPU was a GTX 460 1GB OC'd to 875MHz, and the room was pretty darn warm during gaming sessions. I replaced it with a GTX 670 that I keep at stock speeds, and the room is cooler during the hot months.

If it is Nvidia then it bothers me; if it is AMD then I'm ok with it :colbert:

I know it's meant as a joke, but it does seem to be the prevailing attitude among some. The current gen from both is a little disappointing in that regard. The GTX 600 and HD 7000 series both had a very nice drop in power consumption versus the performance increase over the GTX 500 and HD 6000 series. While performance is up a fair amount with the GTX 700 and R9 series, it's unfortunate that power consumption has to start creeping back up as well. I kept hoping each release would be more efficient and faster.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,386
32
91
Yes.
The CRT goes away and the CPU gets undervolted in the summer because of the heat they dump into the room. A 300W GPU? No thanks.
 

tolis626

Senior member
Aug 25, 2013
399
0
76
I don't mind the power draw; what matters is how well the cooler dissipates it. In the winter it makes a nice heater [...]

It does, because it translates to heat expelled into the room I keep the PC in. [...]

Honestly guys, is the whole "room getting warm" thing a real issue? I always thought it was a joke. The only high-power system I've ever built was in a large (about 50 sq. m) room in my parents' house, so any extra heat can't have mattered. Would a rig with a slightly overclocked 4770K and a 290X really heat the ~15 sq. m room I'll use it in? If so, I'm all for it. It gets quite cold here in the winter, and I spend 95% of my time at home in my office room (apart from sleeping, that is), so I really could cut down on heating the house itself. Cold doesn't bother me much anyway.

Yeah, I sound cheap, but my funds are limited and I'd rather spend all I've got on stuff I care about, not on warming the house. :p
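It is a real effect: essentially every watt a PC draws ends up as heat in the room, so a loaded gaming rig is literally an electric heater of the same wattage. A zero-loss upper-bound sketch, with an assumed 450W system draw and 2.5m ceiling for the ~15 sq. m room from the post, shows why it's noticeable:

```python
# Upper bound: how fast the room air warms if no heat escaped at all.
# ASSUMED: 450 W total system draw, 2.5 m ceiling; standard air properties.
system_load_w = 450
room_area_m2 = 15
room_height_m = 2.5
air_density = 1.2          # kg/m^3
air_heat_capacity = 1005   # J/(kg*K)

air_mass_kg = room_area_m2 * room_height_m * air_density      # ~45 kg of air
deg_per_hour = system_load_w * 3600 / (air_mass_kg * air_heat_capacity)
print(f"~{deg_per_hour:.0f} deg C/hour if no heat escaped")
```

Real rooms leak heat through walls and ventilation, so you'll see nowhere near that figure, but a few degrees over a long gaming session is entirely plausible.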
 

smackababy

Lifer
Oct 30, 2008
27,024
79
86
I am a bit late, but I agree with BrightCandle on the heat part. I watercool everything, mostly because I enjoy building the setups and the quiet, cool systems I get. I consider TDP (and power usage, by proxy), but only because I have to account for the heat that will be produced. I would be thrilled if I could get 290X-level performance at 120W; however, that is a tad unrealistic.

That said, I am still buying at least one 290. I have plenty of rad space available, and if I can ever find another PA120.3 for sale, I will have three of those in my rig. I should be prepared for anything, with 5°C deltas to boot!
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Honestly guys, is the whole "room getting warm" thing a real issue? I always thought it was a joke. [...]

Sometimes I run Furmark and Prime95 purely to get the room a little warmer when I'm the only one awake in the house. Thankfully my air cooling is efficient enough that neither my CPU nor my GPU goes above 66°C.
 

PliotronX

Diamond Member
Oct 17, 1999
8,883
107
106
Typically I only pay attention to power consumption as it relates to the TDP and overclockability of certain chips. My only concern with power consumption is heat output, because it warms up a bedroom fast; then again, even a laptop at full tilt can do that, so it's really not a consideration for me. Electric-bill-wise it probably ends up a few pennies a month. I try to overclock to the hairy edge every time anyway :D
 

Subyman

Moderator <br> VC&G Forum
Mar 18, 2005
7,876
32
86
For my gaming rig, not really. It used to be, somewhat, when custom coolers weren't very good (5-10 years ago). I put everything under water now, so I don't care about heat/power as much.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
R290 uses 15W extra power compared to the 780 in gaming loads, that difference is ~7%, not even 10% extra.

I figured something was fishy about that TPU chart. It turns out they were using Crysis 2 to test power load, which is pathetic, because that game is far too old and doesn't stress these modern cards.

Crysis 3 is much better, and that's what AnandTech used; of course, they got a much more significant difference between the 290 and the 780.

[Chart: 59709.png, AnandTech Crysis 3 load power consumption]

That's a 16.5% difference.
 
Feb 19, 2009
10,457
10
76
That's a 16.5% difference.

16.5% total system power difference.
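The distinction matters because the rest of the system draws the same power in both rigs, which dilutes the card-level difference at the wall. A sketch with hypothetical round numbers (none taken from the charts):

```python
# Why a total-system (at-the-wall) delta understates the card-only delta.
rest_of_system_w = 150   # CPU, board, drives, etc. (assumed, identical in both rigs)
card_a_w = 200           # hypothetical GPU A
card_b_w = 250           # hypothetical GPU B

sys_a_w = rest_of_system_w + card_a_w   # 350 W at the wall
sys_b_w = rest_of_system_w + card_b_w   # 400 W at the wall

system_delta = (sys_b_w - sys_a_w) / sys_a_w * 100   # delta measured at the wall
card_delta = (card_b_w - card_a_w) / card_a_w * 100  # delta of the cards alone
print(f"system: +{system_delta:.1f}%  card alone: +{card_delta:.1f}%")
```

With these made-up figures, a 25% card-level gap shows up as only ~14% at the wall, so a 16.5% system-level difference implies an even larger gap between the GPUs themselves.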
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
Absolutely. To me it reflects the true value of the card in terms of money spent.

If a card performs amazingly well but has ridiculous power consumption, it had better still plot in line with every other card out there; otherwise I'll reconsider.

See sig. I beat this topic to death already. In short, I argued: why even care about the initial cost of a video card if you don't care about the money spent on the power that drives it? They should go hand in hand. :)
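That argument is easy to make concrete. Here's a sketch of purchase price plus electricity over a card's life, with every figure a hypothetical round number:

```python
# Total cost of ownership: sticker price plus power cost over the card's life.
def total_cost(price_usd, load_w, hours_per_day, years, rate_per_kwh=0.15):
    kwh = load_w / 1000 * hours_per_day * 365 * years
    return price_usd + kwh * rate_per_kwh

# Two imaginary cards with equal performance:
cheap_hot = total_cost(price_usd=400, load_w=300, hours_per_day=3, years=3)
pricey_cool = total_cost(price_usd=450, load_w=200, hours_per_day=3, years=3)
print(f"${cheap_hot:.0f} vs ${pricey_cool:.0f} over three years")
```

With these made-up numbers, a $50 price gap is almost exactly erased by a 100W difference in draw, which is precisely the hand-in-hand point.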
 

beginner99

Diamond Member
Jun 2, 2009
5,315
1,760
136
This obviously isn't a yes/no question.

Yes, power consumption has a certain influence. Given two cards with the same price and performance, I will obviously choose the one that uses less power.

However, if a card costs a bit less and offers about the same performance, I would still buy it, provided power consumption isn't through the roof. Yes, Radeons use more power, but not that much more.

I'm not buying an R9 290 currently, mainly due to the heat and noise issues, and because they are not available here at all. I will wait for AIB cards and the release of Mantle. The GTX 780 also looks nice after the price cuts + game bundle.