GTX 480 nearly as power efficient as 6970


dust

Golden Member
Oct 13, 2008
1,339
2
71
That's quite a difference compared to the Anand bench. They do say there might be something wrong with the sample card they got, although I don't really know how that might affect the power draw.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
This review found the same results:
http://www.techspot.com/review/348-amd-radeon-6970/page12.html

I noted they used the performance comparison that used to be argued against the GTX 480 all the time:
Against the Radeon HD 5970, the HD 6970 consumed 3% less power yet we found that on average it was 15% slower, 26% slower in Crysis Warhead, which is what we used for stress testing. The Radeon HD 6970 used roughly the same amount of power as the GeForce GTX 570 consuming 1% more when stressed and 4% less at idle. Given that both graphics cards provide similar performance, this time around it seems they are equally matched.
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Both of these cards have similar performance, hence the similar power consumption. What does die size have to do with this? It's none of our concern, unless you are implying that the smaller die size somehow should affect how much power a GPU consumes? I don't think that's how it works, but enlighten me.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
:thumbsup: Ya, seriously. People who drop $300 on CPUs and overclock them, like a Core i7 920 @ 4.0-4.2GHz, or run $200 Phenom II systems that suck power when overclocked, and then use the "power savings" argument for GPUs make me laugh!! I expect every single one of them to convert to SB immediately. :colbert:

It's funny how none of the same people advocate moving towards the Core i5 661 for gaming...


Do you recommend a dual-core over a quad core for gaming?

I don't think you do.

Why did you get a GTX470 over a 5870?

Because it was cheaper, wasn't it?

What if the 5870 was only slightly more expensive, would your choice be the same?

Why do some people buy AMD CPUs over Intel offerings?

Because they are considerably cheaper, not only the individual CPU but the motherboard as well. When they are similarly priced (and perform similarly/have the same core count), you should go Intel.


I've seen this post from you countless times now.

At similar price points and performance, it makes no sense to buy the product that consumes more power.
 

WorldExclusive

Senior member
Nov 19, 2009
449
0
71
At stock clocks the difference isn't that much between the 6970 and 480.
But when the 480 is overclocked, the power used increases a great deal.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
Really no card was competing with the GTX 480 until the 6970. 8 months? The 5870 was some 18% slower than the GTX 480, going by your math in another thread.
Unless you count the 5970? If you do, then I guess the GTX 295 was in competition with the 5870 in those 6 months.
LOL, the point went entirely above your head.

But I'll humor you: The GTX 295 was even worse in terms of performance per watt when compared to the 5870 than the 480 was. It was also a dual card with less actual memory, and it was more expensive for hardly any extra performance. So it has worse trade-offs than if you were to compare the 5970 to the GTX 480 - where the 5970 was typically faster by a good margin and also used less power.

HD3870 --> HD4870 was accomplished on the same 55nm. The performance jump was enormous.
ROFL I knew you would say something bogus like this. The 3800 was a die shrink from the 2900, so please try again. Going by your standards the 3000 series wasn't worthy of a new generation either, and there were only like 4 cards released for the entire series. So it's quite funny you would use this for your argument, because it really doesn't support it at all.

I'm actually appalled you would use that as an example to back up your argument. You really love to restrict perspective to fit whatever you're trying to say, even if by doing so you contradict yourself.

Not necessarily. I can tell you with certainty, when you bought a mid-high end or a high-end card, the performance difference was substantial.

- $500 9700Pro was 50-70% faster than a $200 9500Pro.
- $425 6800 Ultra was 70-100% faster than a $200 6600GT. $350 6800GT was an even better value.
- $500 X850XT was a good 30% faster than a $350 X850Pro
- $500 5950 Ultra was 70-100% faster than the $200 5700 Ultra.
- $300 HD4870 was 30-40% faster than the $200 HD4850
- $400 7800GTX was 70%+ faster than a $200 7600GT.
^^^There are far too many examples to list.

Today, the increase in price is not commensurate with increased performance of these $350-500 cards over the $200 offerings. This generation especially, the price/performance for high end cards is one of the worst in the past 10 years. Just think about it, HD5870 (high-end) was at least 70% faster on average than an HD5770 (mid-range). Is HD6970 70% faster than an HD6850? Not even close.

It was almost unthinkable in the past to take a mid-range card and overclock it to high-end performance. The only time this happened was the GeForce4 Ti 4200, if I remember correctly (unless you consider 9500Pro unlocking into the 9700 series). It wasn't until the 8800GT that mid-range cards started to be so powerful. Before that, when you paid a premium for a high-end card, you got a massive performance increase!! It was really the 8800GT that opened our eyes to the world of mid-range $200 goodness.

Wow, you really exaggerated on the 4850 to 4870 jump. It's more like 25%, not 30-40%.

http://techpowerup.com/reviews/HIS/Radeon_HD_6970/29.html

Let's compare the GTX 580 ($500) to the GTX 460 ($200). The 580 is 70% faster! The 6970 ($370) is 40% (a lot more in some games and situations) faster than the 6850 ($180), which doesn't seem too far off from your shining example of "$500 9700Pro was 50-70% faster than a $200 9500Pro." In fact that seems like a better price to performance ratio gap if you ask me. Things seem to line up with history. And history has fluctuations, as you well know.
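Just to put some rough numbers on that, here's a quick throwaway Python sketch (the prices and performance deltas are the approximate figures quoted above, not fresh benchmark data):

```python
# Quick sketch: how much of each price premium is "paid back" in performance.
# Prices and performance multipliers are the rough figures quoted above.
comparisons = {
    "GTX 580 vs GTX 460":   {"price_hi": 500, "price_lo": 200, "perf": 1.70},
    "HD 6970 vs HD 6850":   {"price_hi": 370, "price_lo": 180, "perf": 1.40},
    "9700 Pro vs 9500 Pro": {"price_hi": 500, "price_lo": 200, "perf": 1.60},  # midpoint of 50-70%
}

for name, c in comparisons.items():
    price_ratio = c["price_hi"] / c["price_lo"]
    value = c["perf"] / price_ratio  # >1.0 would mean the pricier card is also the better deal
    print(f"{name}: {price_ratio:.2f}x the price for {c['perf']:.2f}x the performance (ratio {value:.2f})")
```

By that crude measure the 6970 sits about where the old favorites did, which is exactly my point.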

"Just think about it, HD5870 (high-end) was at least 70% faster on average than an HD5770 (mid-range). Is HD6970 70% faster than an HD6850? Not even close."

This is a worthless question. The question means nothing as does the answer within the confines you have created. The price gap between the 6800 and 6900 cards is smaller than the one we had between the 5700 and 5800 cards. The performance is also smaller. See that correlation? The price gap between the 5700 and 5800 cards was bigger. The performance difference was bigger.

I have been building computers for 10 years. I am of the view that in the past a high-end card really did justify the $400-500 price. Today, a $350 card is barely 20% faster than a $200 card. $500 cards are barely 40-50% faster than $200 cards. It's a free market, so people are free to buy those $500 cards. That doesn't change the fact that they are poor value. You may think my expectations are too high, but I think the market has changed like you said. Nowadays, consumers just expect way less. Not sure why that is.

I think pricing should be a tad more aggressive, but currently it is not as out of line with history as you are making it out to be. The $150-$250 market is crowded, and this is what is driving the prices of these midrange cards down.

It really isn't an issue for anyone, unless you live in the boonies, eat cold food, hand wash and air dry your clothes, and don't use lights because you go to bed when the sun sets. Until then, you're not allowed to speak of power savings when it comes to hardwired (non battery powered) electronic devices.

Your video card uses 1/100th of the power of all the other appliances and commodities you use on a daily basis that you could easily survive without. If you are concerned with saving money on your power bill, shut off your AC, shut off your electric water heater, don't use the oven, don't use the electric cooktop, don't use the dryer, and replace all your mechanical switches with electronic dimmers or replace all your light bulbs with ones of half the wattage.

That's how you save money, not by bragging that ATI uses 50W less than nvidia. 50W is the wattage of one halogen light bulb in your kitchen for **** sakes.

And people are beginning to shut off those other appliances, and are definitely replacing those 50W bulbs. So yeah... bunk'd.
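If you want to put actual numbers to that 50W claim rather than argue in the abstract, the math is simple enough. A minimal sketch, where the gaming hours per day and the electricity rate are assumptions you'd plug in for yourself:

```python
# Rough monthly cost of an extra 50W of GPU draw while gaming.
# hours_per_day and rate_per_kwh are assumptions - adjust for your own situation.
extra_watts = 50
hours_per_day = 4        # assumed gaming time
rate_per_kwh = 0.12      # assumed electricity rate, $/kWh

kwh_per_month = extra_watts / 1000 * hours_per_day * 30
print(f"{kwh_per_month:.1f} kWh/month -> ${kwh_per_month * rate_per_kwh:.2f}/month")
# 6.0 kWh/month -> $0.72/month at these assumptions
```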

Ya, seriously. People who drop $300 on CPUs and overclock them, like a Core i7 920 @ 4.0-4.2GHz, or run $200 Phenom II systems that suck power when overclocked, and then use the "power savings" argument for GPUs make me laugh!! I expect every single one of them to convert to SB immediately.
1. Go ahead and show me these people who overclock like mad and use the power argument. I doubt there are many of them, because obviously they prioritize performance.

2. We're concerned with energy efficiency. If one chip is giving similar or better performance than another chip while needing less power, why wouldn't it be valid (even for those who overclock) to take that into consideration as a pro (or con)? The 5870 <-> 470 comes to mind here.

GaiaHunter said it best:
"At similar price points and performance, it makes no sense to buy the product that consumes more power."

It's funny how none of the same people advocate moving towards the Core i5 661 for gaming...
What's funny is that I haven't come across a proper review of processor power consumption during gaming. You simply can't use the typical power consumption figures that reviewers publish, because they test processors under full 100% load, and that is definitely not the same situation a processor undergoes when gaming. So again, efficiency comes into play. The real question is: what power does a processor need when it's 10% loaded? 20%? 30%? 40%? 50%? 60%? 70%? 80%?

For example, under a full load obviously a Core i7 870 is going to use more power than a Core i5 661. However, when gaming, it's quite possible for games to only use 40% of the i7 870's performance (and thus power) while the i5 661 will be loaded to 80-100%. Let's compare power figures now, as well as the performance of each.

It's the same kind of situation some reviewers have touched on with video encoding. A chip may use more power than another, but it finishes much quicker, so the end result is that it actually uses less energy to complete the task. Gaming is different in that it's not time-limited. However, another metric produces a similar result to getting something done quicker: a base framerate level that a user expects. If the i5 661 isn't going to give a user 60 fps in a game and an i7 870 will, then obviously that person needs to weigh the cost of power against the desired performance. It's no different than weighing the initial cost of purchase against performance.

So this statement is not as clear cut as you make it. It's a balancing act and there's a lot to consider. There's performance. There's power consumption of the chip. There are platform (motherboard) power consumption differences. There's the type of load a game requires. There's the performance you get out of a game from a certain setup.
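To illustrate the kind of comparison I'm talking about, here's a toy sketch with completely made-up wattages and framerates (none of these are review numbers):

```python
# Energy = power x time: a chip that draws more power but finishes faster
# can still use less total energy. All numbers below are made up for illustration.

def encode_energy_wh(power_w, minutes):
    return power_w * minutes / 60

fast = encode_energy_wh(power_w=130, minutes=10)  # hypothetical "hungrier but faster" chip
slow = encode_energy_wh(power_w=80, minutes=20)   # hypothetical "frugal but slower" chip
print(f"fast chip: {fast:.1f} Wh, slow chip: {slow:.1f} Wh")  # 21.7 Wh vs 26.7 Wh

# Gaming isn't time-limited, so the equivalent yardstick is a target framerate:
# only compare power among setups that actually reach it.
target_fps = 60
setups = {"hypothetical dual-core": (52, 85), "hypothetical quad-core": (72, 115)}  # (fps, watts)
for name, (fps, watts) in setups.items():
    verdict = "meets" if fps >= target_fps else "misses"
    print(f"{name}: {fps} fps at {watts} W -> {verdict} the {target_fps} fps target")
```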

http://www.xbitlabs.com/articles/vid..._13.html#sect0

The 6970 performs similar to a 570 yet uses more power. So their smaller more efficient chip strategy seems to have gone out the window.

http://techpowerup.com/reviews/HIS/Radeon_HD_6970/27.html

6970 > 570 in performance
6970 < 570 in power

Say what?

And look at the efficiency of the 6950.
It's faster than the 5870
http://techpowerup.com/reviews/HIS/Radeon_HD_6950/29.html
and uses less power:
http://techpowerup.com/reviews/HIS/Radeon_HD_6950/27.html

So I LOL @ you thinking they took a step back.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
http://www.xbitlabs.com/articles/video/display/radeon-hd6970-hd6950_13.html#sect0

The 6970 performs similar to a 570 yet uses more power. So their smaller more efficient chip strategy seems to have gone out the window.

Quoting HappyMedium: "No real reviewer uses Furmark any more."

Considering that both AMD and NVIDIA artificially reduce power consumption in Furmark for the GTX580/570 and 6970/6950, it is just a question of who reduces power consumption more.

As one of our crazy members says, although in a much more unintelligible way: where are the performance figures for Furmark?

That is why we should look at other ways of measurement.

From http://www.hardocp.com/article/2010/12/14/amd_radeon_hd_6970_6950_video_card_review/8 using BFBC2.

[attached power consumption charts]


So now we see that those who tested power consumption with Furmark reached the conclusion that the 6970 consumes more power than the GTX570, while those who tested without it reached a different conclusion. (Although curiously AT's Furmark results are quite different. I wonder if that is related to people messing with PowerTune settings.)

Still, NVIDIA improved quite a bit in this department over the GTX480/470.

Of course, at 2560x1600 the GTX580 is only 5% faster than the 6970, which is 9% faster than the GTX570. With 4xAA NVIDIA cards fare slightly better, but at 8xAA AMD cards strike back.

http://www.computerbase.de/artikel/...950/26/#abschnitt_performancerating_ohne_aaaf
 
Last edited:

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
According to that chart above from HardOCP, the 6970 uses 50 more watts than the 6950. I've read that 6950s unlocked to 6970s can use an additional 25 or so. That's 50-75 watts more, an amount that's being argued over by the 'green power savers'.
So do owners who unlock their 6950 become wasteful? Does that argument about needing a better PSU come into play?
I'm being playfully sarcastic with some of these rhetorical questions. :)
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
According to that chart above from HardOCP, the 6970 uses 50 more watts than the 6950. I've read that 6950s unlocked to 6970s can use an additional 25 or so. That's 50-75 watts more, an amount that's being argued over by the 'green power savers'.

Where did you read this nonsense? That's impossible; it's the exact same chip at the same clocks, at the exact same voltage. The only difference is that the memory chips used are rated for a slower speed, and that has nothing to do with power consumption.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
According to that chart above from HardOCP, the 6970 uses 50 more watts than the 6950. I've read that 6950s unlocked to 6970s can use an additional 25 or so. That's 50-75 watts more, an amount that's being argued over by the 'green power savers'.
So do owners who unlock their 6950 become wasteful? Does that argument about needing a better PSU come into play?
I'm being playfully sarcastic with some of these rhetorical questions. :)

But they are saving $175 to get similar performance, and that is something you can't ignore.

You need to consider initial price, performance, power consumption, among other things.

Why did the GTX470 drop in price so much in the first place?

Because people had the option of buying a cheaper 5850 that was only slightly slower and consumed much less power, or buying a more expensive but faster 5870 that also consumed much less power.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Where did you read this nonsense? That's impossible; it's the exact same chip at the same clocks, at the exact same voltage. The only difference is that the memory chips used are rated for a slower speed, and that has nothing to do with power consumption.
It's called INEFFICIENCY. You see it in heat and added power use from silicon that is less perfect. It's similar to any electrical situation where added resistance is introduced.

http://www.semiaccurate.com/2010/12/27/radeon-hd-6950-flashable-6970/
Also make sure you have a decent power supply, as the card will go from using about 169W at default settings to 202W with the Shaders unlocked all the way to 252W with the card overclocked to the same speeds as a 6970. That means you're actually using 24W more than what a real 6970 would do, again, not likely to be a huge concern for anyone that just wants a bit of extra performance out of their card.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
According to that chart above from HardOCP, the 6970 uses 50 more watts than the 6950. I've read that 6950s unlocked to 6970s can use an additional 25 or so. That's 50-75 watts more, an amount that's being argued over by the 'green power savers'.
So do owners who unlock their 6950 become wasteful? Does that argument about needing a better PSU come into play?
I'm being playfully sarcastic with some of these rhetorical questions.

You better be joking entirely. You don't provide any conclusive links to data, and the argument isn't the same one a sane person would be making.

If you want more performance, you use more power. That's why a 5870 > 5450. The real argument is when one product uses more power but doesn't outperform another product, or performs worse while using more power. Or if there's some significant (non-linear) relationship between power use and performance between stock cards.


It's called INEFFICIENCY. You see it in heat and added power use from silicon that is less perfect. It's similar to any electrical situation where added resistance is introduced.

Quote: http://www.semiaccurate.com/2010/12/...lashable-6970/
Also make sure you have a decent power supply, as the card will go from using about 169W at default settings to 202W with the Shaders unlocked all the way to 252W with the card overclocked to the same speeds as a 6970. That means you're actually using 24W more than what a real 6970 would do, again, not likely to be a huge concern for anyone that just wants a bit of extra performance out of their card.

Alright, so you provided some data but that data isn't very conclusive. This is the real source:
[power.jpg - SemiAccurate's Furmark power/performance chart]


Notice he used FURMARK

Also notice that the 6950, modded, at +20% is getting higher framerates than the 6970 at 0%, so the extra power seems justified; and compared to the 6970 at +20%, it's using less power and delivering the same frames. So yeah, what you just said has been debunked even by these bunked FURMARK results.

It would be nice for people to actually do a proper analysis of the data. We are discussing performance per watt here.
 
Last edited:

VirtualLarry

No Lifer
Aug 25, 2001
56,229
9,990
126
I was one of the ones that actually fell for the power issue being a huge concern. I figured I would be spending an extra $20 a month in electricity. Man what a fool I was. That marketing is some clever stuff. Makes me wonder what else I've foolishly believed over the years. :confused:

I dunno. I was running a Q6600 @ 3.6GHz, and two GTX460 1GB cards @ 820MHz, for the entire month of December.

My electric bill was $120, for 810 kWh. My normal usage, with just my dual-core E2140 @ 2.8GHz running 24/7 doing DC, was around $60. In December, I had stopped doing DC with the E2140, so it was running idle nearly all of the time (MagicJack computer). I didn't re-enable the power-saving features, since it was OCed.

So running a single power-hungry computer rig DID significantly change my power bill.
(Also heated up my apt something fierce.)
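If you run the numbers from that bill, crudely assuming the whole increase over my normal ~$60 came from the new rig, a rough sketch:

```python
# Back-of-the-envelope check on the bill above. Crudely assumes the entire
# increase over the normal ~$60 came from the OC'd rig.
december_bill = 120.0   # $
december_kwh = 810.0    # kWh used in December
normal_bill = 60.0      # $ in a typical month

rate = december_bill / december_kwh               # ~$0.148 per kWh
extra_kwh = (december_bill - normal_bill) / rate  # ~405 kWh attributed to the rig
avg_extra_watts = extra_kwh * 1000 / (31 * 24)    # spread over the whole month
print(f"~${rate:.3f}/kWh, ~{extra_kwh:.0f} extra kWh, ~{avg_extra_watts:.0f} W extra around the clock")
# works out to roughly 540 W of continuous extra draw - consistent with a loaded, overclocked rig running 24/7
```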
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
We see this all the time: better-binned CPUs/GPUs overclock further with less voltage, ergo they use less power.
You take a GPU that was binned as a 6950, probably because it's an inferior piece, clock it to non-standard clocks, and it uses more power.
Are we in denial over that?
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
We see this all the time: better-binned CPUs/GPUs overclock further with less voltage, ergo they use less power.
You take a GPU that was binned as a 6950, probably because it's an inferior piece, clock it to non-standard clocks, and it uses more power.
Are we in denial over that?

No, but why don't you actually analyze the data for yourself?

[power.jpg - SemiAccurate's Furmark power/performance chart]


Give it a try; you'll see interesting results if you look at performance per watt, or you can read my "stealth" edit above. Although I'm not sure if it's even worth it as there are probably driver limitations being activated since we're comparing almost useless FURMARK numbers.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
In the above scenarios, I would unlock my 6950, just like I OC my GTX 460's, and accept the power penalties.
I do use the power-saving features built into my Intel CPU, because I recognize that is where I would be most wasteful.
That's my 2 cents on power. I think under-load GPU power usage, basically when you're gaming, is more trivial than most power concerns.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
In the above scenarios, I would unlock my 6950, just like I OC my GTX 460's, and accept the power penalties.
You're also accepting more performance, which has to be taken into consideration along with the increase in power. I've already explained this...

However, with your SemiAccurate link you were trying to state a claim as certainty when in fact the data doesn't even point to the conclusion you quoted: "That means you're actually using 24W more than what a real 6970 would do, again"

Just look at the data...

6950 modded: 202W - 88 fps
6950 mod+20: 252W - 115 fps
6970 stock+0: 228W - 92 fps

I wonder what would happen if you set the modded 6950 to +10%? If we split the difference between mod+0% and mod+20%, we get 227W at 101 fps.
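And if you work out performance per watt from those same figures:

```python
# Performance per watt from the figures listed above (Furmark-derived,
# so take them with the usual grain of salt).
cards = {
    "6950 modded (+0%)":  (202, 88),   # (watts, fps)
    "6950 modded (+20%)": (252, 115),
    "6970 stock (+0%)":   (228, 92),
}
for name, (watts, fps) in cards.items():
    print(f"{name}: {fps / watts:.3f} fps per watt")
# 0.436, 0.456, 0.404 - the modded 6950 beats the stock 6970 on efficiency,
# even when it's overclocked past 6970 clocks.
```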
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
At similar price points and performance, it makes no sense to buy the product that consumes more power.

That's true. I was just saying that sometimes the argument over power consumption is so heated, yet from those LegionHardware graphs, the power consumption differences between the GTX470/570/480/6970/5970/580 amounted to 36 watts at load (from least to most). Differences that minute shouldn't even be weighed ahead of card warranties, game bundles, card features, etc.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Wow, you really exaggerated on the 4850 to 4870 jump. It's more like 25%, not 30-40%.

The 4870 1GB is 33% faster on average than a 4850. That's why I said 30-40%.

Let's compare the GTX 580 ($500) to the GTX 460 ($200). The 580 is 70% faster!

A $200 GTX460 has 850MHz clock speeds.

It is faster than a GTX470 at those speeds. You think GTX580 is 70% faster on average than a GTX470? http://www.anandtech.com/show/3987/...renewing-competition-in-the-midrange-market/8

The 6970 ($370) is 40% (a lot more in some games and situations) faster than the 6850 ($180), which doesn't seem too far off

Except that a bunch of 6850s on newegg are $160. You can't just conveniently ignore comparing the best bang-for-the-buck mid-high end cards like the 850MHz GTX460, the GTX470, or even the HD5870, which can be had for $225. The HD6850 is just one card that I used as an example.

How does a $199 850MHz GTX460 or a $225 HD5870 compare to the $360 HD6970? :p I mean seriously, you can have 2x HD6850s or 2x 850MHz GTX460s for almost the same price as a single HD6970/GTX570... Similarly, you can buy 2x HD6950s for nearly the price of a single GTX580. In each of those cases, the single high-end card gets a beating of a lifetime.
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
Noise is big for me, so 80W of power will make a huge difference if you ever think about running a card passive. My 5770 is quiet, so I no longer consider it, but if I ever got a noisy card I'd think about putting on a huge heatsink and letting case cooling take care of it. I'm sure that eliminates most high-end cards though, since the 6970 is probably too hot for passive cooling.

As an aside, I'm surprised how well CPUs can run passive lately. I can OC my Q9300 from 2.5 to 3.5GHz and run it passive with a Xigmatek 1283 and still keep safe load temps.