Is the power usage of a GPU a major factor in your purchase decision?


Is the power usage of a GPU a major factor in your purchase decision?

  • Yes

  • No


Results are only viewable after voting.

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
If I were air-cooling I would care, because if it were bad enough the noise would be annoying. I don't care about the actual power consumption; I only care if it makes the cooling system noisy because it can't keep the card cool at a reasonable sound level.

Once I started water cooling I stopped caring. I run Scythe 1850RPM GTs on my radiators and never have them going full speed. The only noise my computer makes now is a beep when it posts. :cool:
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
If you do a forum search for GTX480 launch I think that would be a great indicator of who stands where on this subject.


Then simply switch your focus over to a GTX470. No need for the wall.
Regardless of what you just said, it would still be a good indicator of who cares about power consumption.


Save the forum time on this pointless task and do it yourself (since it matters to you, for some reason), then post the equally pointless results. Looking forward to it.

Why not add to the discussion like everyone else in this thread? What are you trying to start?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Power isn't directly a concern of mine, but high power cards tend to run hotter and louder. Noise certainly is a big factor for me.
 

tolis626

Senior member
Aug 25, 2013
399
0
76
Take this scenario:

In a room there are two monitors, both showing the same game. No Fraps or other framerate-monitoring software is running. Both GPUs are above 60fps with the game on max settings: one is at 70fps, the other at 75fps. This, however, is kept from the viewer, who is asked on which monitor they think the game is running better. The viewer can't tell a difference between the two.

Then the viewer is taken into the room where the two desktops driving the monitors are. The one on the left, averaging 75fps, is very noisy due to high heat and power demands; the one on the right, averaging 70fps, is very quiet due to running cooler and drawing less power. Again, the viewer does not know which one is driving which monitor, and is only asked which they prefer, given that the price difference between the two is no more than $50 either way.

Which one do you think he/she chose? Which one would you choose?

I'd choose the noisy one. Being able to hear it work makes me feel sure that it's indeed working, whereas if it was silent, I wouldn't know if it had stopped working (e.g. after a blackout). Noise is a feature!

Joking aside, I don't really care about power consumption per se, except in extreme cases. Same goes for noise. As long as it's within reasonable levels, I can handle it. If power consumption and noise get out of this world on my PC, then the GPU gets out of the PC, simple as that. It's not like I get mad at the slightest noise, even though my hearing is pretty good. Perhaps I've spent too little time next to really noisy, high-powered systems...
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Do I care about power consumption? Hell yes I do.

Power consumption, heat and noise are all intertwined. Years ago, when NVidia first came out with Fermi, I bought two GTX 470s and overclocked them as high as they would go. Max temps when overclocked were a blistering 102C.

This prompted me to use the EVGA Step-Up program to swap them for a pair of GTX 480s, which ran cooler due to a superior heatsink design. I overclocked them as high as they could go, and the result was that they (along with the other components) overloaded my UPS backup, which is good for up to 900 watts.

Then I sold the 480s and bought a pair of 580s, which were much better in every respect. Now I have a pair of GTX 770 4GB cards that are not only much faster than my previous 580s, but much cooler and quieter as well.

Power, noise and heat reduction are all worthy engineering goals... which is why AMD screwed up with the 290 series, as there is regression on all fronts except performance and features.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Carfax83 said:
Do I care about power consumption? Hell yes I do.

Power consumption, heat and noise are all intertwined. Years ago, when NVidia first came out with Fermi, I bought two GTX 470s and overclocked them as high as they would go. Max temps when overclocked were a blistering 102C.

This prompted me to use the EVGA Step-Up program to swap them for a pair of GTX 480s, which ran cooler due to a superior heatsink design. I overclocked them as high as they could go, and the result was that they (along with the other components) overloaded my UPS backup, which is good for up to 900 watts.

Then I sold the 480s and bought a pair of 580s, which were much better in every respect. Now I have a pair of GTX 770 4GB cards that are not only much faster than my previous 580s, but much cooler and quieter as well.

Power, noise and heat reduction are all worthy engineering goals... which is why AMD screwed up with the 290 series, as there is regression on all fronts except performance and features.

I also had 470s, and hated them. I was so relieved when I got 680s last year.

I don't really believe AMD screwed up. They just didn't have the capability to achieve the performance they needed while also keeping power, heat and noise levels great. They had to get the performance, so that is what they did, just like Nvidia did with Fermi.
 

mindbomb

Senior member
May 30, 2013
363
0
0
To me, yes.
It is principally why I never recommend CrossFire/SLI with old graphics cards over a single new graphics card, even though the former solution may be much cheaper.
For example, GTX 560 SLI is still pretty decent in terms of pure performance, but that's 300 watts right there.
You can easily get better performance from a GTX 770 with less power usage.

And as for this generation, it is why I'm not really a big fan of the large-die GK110 and Hawaii based GPUs. Sure, they perform better than GK104 and Tahiti, but I still feel you don't get enough to offset the increased power usage.

And it is certainly more important than the related measurements of noise and GPU temperature. Noise can be directly controlled with the driver, an application like Afterburner or a BIOS editor, or can be fixed with a larger heatsink. Temperature can also be fixed with a bigger heatsink, but I think this measurement is almost completely meaningless. Obviously they aren't going to sell a GPU that can't sustain its own temperatures, so what relevance does this really have to your average consumer?
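To put a rough number on that SLI-versus-single-card trade-off, here is a minimal performance-per-watt sketch; the wattages and frame rates in it are illustrative assumptions, not benchmark figures.

```python
# Rough fps-per-watt comparison of an older SLI pair vs. a single newer card.
# Every number here is an assumed placeholder, not a measured value.

def fps_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Average frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

configs = {
    "GTX 560 SLI (assumed ~300 W)": (60.0, 300.0),     # (assumed avg fps, assumed watts)
    "GTX 770 single (assumed ~230 W)": (65.0, 230.0),
}

for name, (fps, watts) in configs.items():
    print(f"{name}: {fps_per_watt(fps, watts):.3f} fps/W")
```

Swapping in measured figures for your own cards is the only way to make the comparison meaningful; the point of the sketch is just that the metric is a simple ratio.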
 
Last edited:

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Carfax83 said:
Do I care about power consumption? Hell yes I do.

Power consumption, heat and noise are all intertwined. Years ago, when NVidia first came out with Fermi, I bought two GTX 470s and overclocked them as high as they would go. Max temps when overclocked were a blistering 102C.

This prompted me to use the EVGA Step-Up program to swap them for a pair of GTX 480s, which ran cooler due to a superior heatsink design. I overclocked them as high as they could go, and the result was that they (along with the other components) overloaded my UPS backup, which is good for up to 900 watts.

Then I sold the 480s and bought a pair of 580s, which were much better in every respect. Now I have a pair of GTX 770 4GB cards that are not only much faster than my previous 580s, but much cooler and quieter as well.

Power, noise and heat reduction are all worthy engineering goals... which is why AMD screwed up with the 290 series, as there is regression on all fronts except performance and features.

How is that screwing up? The hugely inefficient Fermi (not even close to the 290/290X) got a double sale out of you. One would expect the same now? Has something changed?
 

nightspydk

Senior member
Sep 7, 2012
339
19
81
Power is heat, so it matters not only for cost but also for component resilience. It is indeed very important. Maybe that is not what you intended, but no matter how you twist it, the less power, the better the overall system stability.

Think on that. :)

To give you an example of what is going on, I remember the first-generation NVIDIA 8800 ran extremely hot.

As a second thought, electricity prices vary greatly from country to country. Here it's pricey, and there's no more cheap electricity from Germany.
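Since the cost side depends entirely on the local tariff, here is a minimal sketch of the arithmetic; the 100 W delta, 3 hours per day and 0.30 per kWh are assumptions meant to be replaced with your own numbers.

```python
# Back-of-the-envelope yearly cost of a GPU's extra power draw.
# All inputs are assumptions; substitute your own card, hours and local tariff.

def yearly_cost(extra_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Cost per year of drawing `extra_watts` for `hours_per_day` hours at `price_per_kwh`."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Example: 100 W of extra draw, 3 hours of gaming per day,
# at an assumed high European tariff of 0.30 per kWh (in whatever currency applies).
print(f"{yearly_cost(100, 3, 0.30):.2f} per year")
```

With those assumed inputs the difference works out to roughly 110 kWh, or about 33 currency units per year, which is why the answer to the poll varies so much by country.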
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
And just like the first-generation Fermi cards were considered a disaster, so too should the 290 series be.

Nobody cares. They'll still be purchased by the truckload. Maybe even sell out too, like the 290X.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
It's important, but price, performance and noise are more important. As others said, it impacts noise. Since most GPUs now throttle down pretty intelligently to limit draw at idle, the cost of electricity is actually not a big concern to me anymore; that feature takes care of it, IMO. The other concerns are needing a PSU that can drive it (cost), and I like to have a battery backup to protect against brownouts, which also needs to be up to snuff (cost again).

^Pretty much this^

Back in the day, when I had a 650W Antec TP New, I could easily run two 170W cards, but since it blew (well before its time) I had to invest in a 500W BeQuiet (which cost me more than the Antec) that just about handles my overvolted GPU and overclocked CPU in intensive games like Crysis 3, leaving 100W of headroom so I don't overstress it.

It would be nice to have the option for SLI in case I needed it, but unless my Antec suddenly comes back to life I don't think this will be possible.

Plus, British electricity prices are currently a hot debate topic over here, and sharing a house with two other people makes me really appreciate a card's ability to adapt its power draw to the task at hand.

I actually plug my PC into a power monitor so I can see exactly what wattage it is pulling at any given time.
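For anyone doing the same kind of PSU budgeting described above, here is a minimal sketch of the headroom arithmetic; every wattage in it is an assumed example rather than a measured figure.

```python
# Quick PSU headroom check: sum assumed worst-case component draws and compare
# them against the PSU rating minus a safety margin. All wattages are examples.

PSU_WATTS = 500          # e.g. a 500W unit like the one mentioned above
SAFETY_MARGIN = 0.20     # leave ~20% of capacity unused at full load

assumed_draws_w = {
    "overvolted GPU": 200,
    "overclocked CPU": 120,
    "board, RAM, drives, fans": 60,
}

total = sum(assumed_draws_w.values())
budget = PSU_WATTS * (1 - SAFETY_MARGIN)
status = "OK" if total <= budget else "over budget"
print(f"Estimated load: {total} W, budget: {budget:.0f} W -> {status}")
```

A wall power meter, as mentioned above, gives the real total at the socket (including PSU losses), so it is a good sanity check on any estimate like this.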
 
Last edited:

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
tolis626 said:
I'd choose the noisy one. Being able to hear it work makes me feel sure that it's indeed working, whereas if it was silent, I wouldn't know if it had stopped working (e.g. after a blackout). Noise is a feature

Funny you say that, because when I was younger I had become accustomed to noisy computers. Back in the day, when we used to walk to school 3 miles in 3-foot-deep snow and use floppy drives... fans on performance GPUs (and PSUs) were ridiculous. But to us it was just normal. We didn't know any better.

I remember the first time I built a silent, high performance, computer...when I turned the power on I thought I had done something wrong and my heart skipped a beat :p

Now that I know you don't have to compromise in order to have high performance I will never put another loud component in my computer again.
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Power use? No, I couldn't care less. Noise, yes; noise is important.
 
Aug 11, 2008
10,451
642
126
I understand the point of those who say power consumption is not a major factor, but I don't agree that it should be ignored, even if you consider only the environmental impact. If two cards give the same performance at the same price, I would consider power usage a factor. If one wants to use more power to get better performance, then fine, I understand that. Personally, I am running a relatively efficient system with an HD7770 and a SB i5. It plays the games I want at the image quality I want, runs easily on a 460W power supply, and is relatively quiet. If I decided this was insufficient, though, I would go for a card that uses more power. When I bought the card, the considerations were: 1. can it give the performance I need, 2. what is the cost, and 3. how much power does it use, so I would not have to upgrade other components.
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
Didn't vote.
IMO both NV and AMD are on 28nm, so they are at the limit of 28nm; so this is really about the noise of AMD cards, right?
Power draw is whatever you need FOR YOUR GAME SETTINGS.
Play at 1050 on low and where is the power draw?
Play at 1600 on ultra and it's up to you, IMO.
 

sniffin

Member
Jun 29, 2013
141
22
81
Depends what the purpose of the system is. If the goal is performance then power consumption isn't something I want limiting that. For mITX builds I would choose parts that can do more with less.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I don't think it's so much about not caring as looking at the price difference and seeing what you can do with the money you save. A basic 290 will beat or equal Titan; just slap on a better $100 cooler and you're good to go.

Wait, what? A 290 equal to Titan? At what, 100% fan speed? Here we go again with users defending AMD's design decision not to improve reference acoustics. I'm sorry, where were we again? Ah yes, the thread which is 81 pages long, with 81 pages of complaints about NOISE. I wonder how this could have been prevented? It doesn't take a genius engineer to figure out that there's a wide range of cooler designs between "cheap and garbage" and "excellent and more costly", and AMD went with the extreme cheap-and-garbage cooler. The reference blower that AMD uses was acceptable in 2010; it's not acceptable by today's standards. You'd think that after reading 500 web reviews and non-stop threads on EVERY forum about NOISE, this design decision would not be defended. People will pay slightly more for a better cooler, as it improves the user experience. This is why people buy NVIDIA products: excellent user experience.

Don't get me wrong, the 290 performs outstandingly, but maximizing performance requires 47% fan speed, which is not quiet. Therefore the user experience is not optimal for a great number of users, as web reviews and forum posts clearly indicate.

It's just... I dunno. The performance is outstanding, but AMD could have and should have made a better cooler. I was REALLY excited about the 290 series prior to launch and I wanted AMD to hit a home run. Performance-wise, AMD did a good job, but the overall package falls well short of a home run. I keep saying this repeatedly because I'm truly baffled as to why they didn't: a $420-430 290 non-X would sell EXCEPTIONALLY well with a Titan-esque cooler. But here we are, 500 web reviews complaining about NOISE, and for some reason some folks defend this design decision. I don't get it. I'm sure the card will sell well based on price/performance, as it is simply excellent in that respect, but it still compromises the user experience. It should not do this, even if the cost is slightly higher, period.

As far as the connection between this and the topic goes, I will point out that the 290 and 290X cards use ABOUT the same power as the Titan: the 290/X uses about 10 more watts on average during games, from what I've seen. Yes, the Titan is better during Blu-ray playback and FurMark, but the main metric I'm looking at is games. With the power consumption between these two cards (290 and Titan) being roughly equal, AMD easily COULD have made a cooler that didn't necessarily match the Titan shroud but came close to it in terms of acoustics. Instead, the 290 shroud is basically 100% identical under the hood to the 5870, 6970 and 7970 shrouds. That's kinda stupid, really: using a nearly four-year-old shroud on a halo graphics card? Really?
 
Last edited:

sniffin

Member
Jun 29, 2013
141
22
81
The problem goes away completely when custom designs are launched though. At that point the 290/290X as a complete package are just fine.

What they should have done is allow custom designs at launch. Noise issues wouldn't exist and everybody would be focused on performance. Ideally though they'd just fix up the cooler. Reference designs have their place and I actually prefer them.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
sniffin said:
The problem goes away completely when custom designs are launched though. At that point the 290/290X as a complete package are just fine.

What they should have done is allow custom designs at launch. Noise issues wouldn't exist and everybody would be focused on performance.

I agree, aftermarket availability on day one would have prevented this somewhat, although a lot of users do prefer blower style coolers for small form factors. So I think they should have done both, although day 1 availability of aftermarket shrouds definitely would have helped.

Even if they had done that, it wouldn't excuse them from using a 4 year old shroud design on their 2013 halo graphics card. But, it would definitely have lessened the impact of noise complaints by a country mile.
 

nightspydk

Senior member
Sep 7, 2012
339
19
81
So you push the hardware and cool it accordingly, but why not look at the heat dissipation, since that is in some respects lost performance? That's why a CPU/GPU performs best in extremely cold conditions. I don't understand why it shouldn't matter to those who push things all the way; that's exactly why some use extreme measures to cool the system. Noise? How about water cooling?
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
blackened23 said:
I agree, aftermarket availability on day one would have prevented this somewhat, although a lot of users do prefer blower style coolers for small form factors. So I think they should have done both, although day 1 availability of aftermarket shrouds definitely would have helped.

Even if they had done that, it wouldn't excuse them from using a 4 year old shroud design on their 2013 halo graphics card. But, it would definitely have lessened the impact of noise complaints by a country mile.

Nothing you can say will change the fact that what's done is done. You're just beating a dead horse. The only thing to do now is wait for better cooling solutions, or just buy what you want to buy. What I've learned, however, is people will find another thing to complain about. You can take that one to the bank.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Save the forum time on this pointless task and do it yourself (since it matters to you, for some reason), then post the equally pointless results. Looking forward to it.

Why not add to the discussion like everyone else in this thread? What are you trying to start?

I don't understand. Can you elaborate?
 

dust

Golden Member
Oct 13, 2008
1,328
2
71
If it is Nvidia then it bothers me; if it is AMD then I'm OK with it. :colbert:

Jokes aside, Nvidia did this with early Fermi and asked a very cheeky price for it (the highest at the time). It sort of translated to "Yeah, yeah! We know we're late, power hungry and hot! Shut up and give us your money."

AMD at least had the decency to lower the price versus the competition. Whether they saw the power draw/cooling/heat as a failure or simply wanted to undercut the competition by lowering prices, I don't care.

The result is that we get to buy cheaper/faster AMD cards and the whiners get to buy cheaper-than-before Nvidia cards. Win-win! Wtf is wrong with you guys? :confused: