First AMD 7970 review up: Now with AT official review


Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
The potential in this card lies in its OC ability. Honestly, people buying a card at this level are enthusiasts, so I think most will try to push the limits.

That is true! This card overclocks easily into GTX 590 territory... and that card costs a lot more.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
The potential in this card lies in its OC ability. Honestly, people buying a card at this level are enthusiasts, so I think most will try to push the limits.

The problem IMO is the reference cooler; it's already sacrificing quite a bit on noise to keep temps around the 6950 range, which isn't exactly commendably low. So either it's going to get even louder and/or temps are going to start to really climb.

Honestly I wouldn't be happy pushing a 7970 to any sort of extreme without water or at least a triple-slot air cooler to keep temps and noise in check.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The problem IMO is the reference cooler; it's already sacrificing quite a bit on noise to keep temps around the 6950 range, which isn't exactly commendably low. So either it's going to get even louder and/or temps are going to start to really climb.

Honestly I wouldn't be happy pushing a 7970 to any sort of extreme without water or at least a triple-slot air cooler to keep temps and noise in check.

They're getting 1100MHz+ @ stock volts though. Whatever model someone goes with, there won't really be much additional heat/noise if they don't increase the voltage.
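
For anyone wondering why a stock-voltage overclock adds so little heat: CMOS dynamic power scales linearly with clock but with the square of voltage (P ≈ C·V²·f). A back-of-the-envelope sketch in Python; the 190W baseline draw and 1.175V stock voltage are illustrative guesses, not AMD specs:

```python
# Back-of-the-envelope CMOS dynamic power: P ~ C * V^2 * f.
# Every number below is an illustrative guess, not an AMD spec.

def scaled_power(base_w, base_mhz, oc_mhz, base_v, oc_v):
    """Scale dynamic power linearly with clock, quadratically with voltage."""
    return base_w * (oc_mhz / base_mhz) * (oc_v / base_v) ** 2

BASE_W = 190.0  # assumed gaming-load draw at the stock 925MHz

# Stock-voltage overclock to 1125MHz: power grows only linearly with clock.
print(round(scaled_power(BASE_W, 925, 1125, 1.175, 1.175)))  # ~231 W (+41 W)

# The same clock bump plus a voltage increase costs far more.
print(round(scaled_power(BASE_W, 925, 1125, 1.175, 1.30)))   # ~283 W (+93 W)
```

That quadratic voltage term is why volt-modded overclocks blow up the power budget while stock-voltage ones barely move the needle.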
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
They're getting 1100MHz+ @ stock volts though. Whatever model someone goes with, there won't really be much additional heat/noise if they don't increase the voltage.

Was there any review with power consumption figures after OCing? I wonder why they didn't use up some of that headroom, unless they wanted to fit under the HD 6970 power specs (and potentially save a faster part as an answer to Kepler).
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Was there any review with power consumption figures after OCing? I wonder why they didn't use up some of that headroom, unless they wanted to fit under the HD 6970 power specs (and potentially save a faster part as an answer to Kepler).

I think Legit Reviews, which did the overclocking piece, mentioned that it looked good when overclocked too.

So power use doesn't seem to explode when overclocking at stock voltage.

Also, there was some kind of mixup with the thermal paste on some of the review cards (really, right?), but I swear I read that somewhere. Apparently these people are goofballs to make mistakes like this.


Here:

Computerbase.de

http://translate.google.es/translat...abschnitt_anisotrope_filterung_in_der_theorie

AMD told us a few hours before the NDA expired that our test cards (and others') may have had an error in the application of the thermal compound, so that the cooler does not sit perfectly on the GPU. Accordingly, we cannot guarantee that the measured noise and temperature values reflect reality. Our follow-up test with replaced thermal compound shows the best case for the Radeon HD 7970, in which the card would run much cooler and a little quieter. It is unlikely to change our verdict that the 3D accelerator is very loud, however. We remain in contact with AMD and will try to correct the problem as quickly as possible in order to take new measurements.

They applied new thermal paste to the cards and got temperatures 7°C lower and noise 3 dB lower.
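
For scale, that 3 dB is about half the acoustic power, since sound level is logarithmic in power:

ΔL = 10·log10(P2/P1) = −3 dB  →  P2/P1 = 10^(−0.3) ≈ 0.5

Perceptually that reads as noticeably quieter rather than half as loud, but for a paste reseat it's still a big win.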
 

grimpr

Golden Member
Aug 21, 2007
1,095
7
81
I think Legit Reviews, which did the overclocking piece, mentioned that it looked good when overclocked too.

So power use doesn't seem to explode when overclocking at stock voltage.

Also, there was some kind of mixup with the thermal paste on some of the review cards (really, right?), but I swear I read that somewhere. Apparently these people are goofballs to make mistakes like this.


Here:

Computerbase.de

http://translate.google.es/translat...abschnitt_anisotrope_filterung_in_der_theorie



They applied new thermal paste to the cards and got temperatures 7°C lower and noise 3 dB lower.

That's huge and worth the effort. Thanks for sharing.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Was there any review with power consumption figures after OCing? I wonder why they didn't use up some of that headroom, unless they wanted to fit under the HD 6970 power specs (and potentially save a faster part as an answer to Kepler).

I didn't see any. Many of these reviews looked rushed and didn't give the typical full rundown.

I too believe the clocks are artificially low. My guess is the process is still a bit leaky at this point and will improve as it matures. Because of that, they set the clocks only as high as they needed to beat the 580 by a comfortable enough margin that reviewers wouldn't be able to find a way to make the 580 look faster. Once the process and yields improve, clocks can be raised. Maybe there are even some inactive sections of the chip, as apoppin suggested, that can be activated later. I doubt it, though; I think he mistook reduced clocks for physically crippled chips.
 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
Finally AMD is charging some good money for their new card and getting a good margin off it. God knows they need it, considering BD is such a disaster. Now the ATI acquisition is looking better by the day.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Was there any review with power consumption figures after OCing?

Computerbase.de mentions clocking it to the limit (at stock voltage) and it using about 35 watts more, which is still under the 580 in their tests.

Just go to Computerbase.de and use Google Translate (or use the link in the post above). It's under their "Power" tab.
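
That ~35W is about what a simple linear-with-clock estimate predicts, since voltage is unchanged. A quick sanity check in Python; the 185W stock load draw is a made-up baseline, not Computerbase's exact figure:

```python
# Stock-voltage OC: dynamic power should scale roughly linearly with clock.
base_w, base_mhz, oc_mhz = 185.0, 925, 1100  # baseline draw is an assumed figure
print(f"predicted increase: {base_w * (oc_mhz / base_mhz - 1):.0f} W")  # ~35 W
```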


Legit Reviews shows the OCed 7970 beating a stock 580 by around 40% at those settings.
 

WMD

Senior member
Apr 13, 2011
476
0
0
Yes, it's a bit frustrating to see people pull one review (sometimes one bench from one review) and declare victory or defeat from it. Also the rhetoric and double standards we see being posted. We all have biases; it's only human, and difficult to overcome. It takes effort to be a BS artist, though. Some folks need to quit trying so hard. :D

I am well aware there are other benchmarks where it fares better, but you know, I couldn't care less if it beats the GTX 580 by a significant margin in Witcher 2 and Skyrim. Those are DX9 games with horrible graphics. BF3, Metro 2033, and other DX11 titles are the ones that matter more to me. They are the hardest to run and have the best leading-edge graphics IMO.

Whichever way I look at it, beating a card that launched over a year ago by 10-20% isn't impressive at all, especially considering it's on a much smaller 28nm process.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Bottom line is if you were in the market for a card right now in the ~$500 price range, would you buy a 580 or 7970?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Folks, this is your "fair warning" heads-up message from your local moderator.

As we have historically done with all pre-release threads and speculation threads that remain active past the review release time, we intend to lock this thread soon.

Please migrate your discussions, if they need to continue, over to the stickied "official 7970 review thread".

I'll be back to lock this one down tonight. Get your final arguments in like Flynn. ;)

Administrator Idontcare
 

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
I am well aware there are other benchmarks where it fares better, but you know, I couldn't care less if it beats the GTX 580 by a significant margin in Witcher 2 and Skyrim. Those are DX9 games with horrible graphics.

Witcher 2 has horrible graphics? Have you played it?

Witcher 2 looks really good, and thankfully it's not a console port.
 
Oct 4, 2004
10,515
6
81
How is this a failure? It costs 10% more than a GTX 580 and delivers 25% more performance. How often does that happen with ANY computer upgrade (CPU/GPU/Mobo/RAM/SSD/HDD)?

I can see the line of thinking that goes, "Nvidia will soon (?) launch something (?) that will offer ?% more performance for $?" and sure, that's inevitable. For a card announced today and available in January, this is a solid launch. AMD would be stupid to sell it for less money while they still have the fastest single-GPU card. Whether they'll be able to cut the price if/when Nvidia launches their counterattack remains to be seen.

As for me, I bought a Radeon 5850 two years ago when they were selling for $339 and am perfectly fine playing BF3 on the High preset. I have tried the Ultra preset (it works fine in single-player and on an empty MP server :p). Sure, it looks better, but not worth the cost/hassle of selling my current GPU and paying the difference. I need to wait for the next big leap in graphics, one that renders GTX 4xx/Radeon 5xxx class cards unusable.

Will probably end up waiting until the inevitable 8870/8850.
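
For what it's worth, the round numbers above make the perf-per-dollar math easy to check. A quick sketch; the ~$550 vs ~$500 prices are assumptions pulled from the "10% more" framing, not confirmed street prices:

```python
# Perf-per-dollar using the round numbers from the post (prices approximate).
price_7970, price_580 = 550.0, 500.0  # assumed launch street prices
perf_7970, perf_580 = 1.25, 1.00      # 7970 ~25% faster, per the post above

value = (perf_7970 / price_7970) / (perf_580 / price_580)
print(f"7970 offers {value - 1:.0%} more performance per dollar")  # ~14%
```

A ~14% perf-per-dollar gain at the top end of the stack is rarer than it sounds; flagships usually charge a premium per frame.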
 

Capt Caveman

Lifer
Jan 30, 2005
34,543
651
126
How is this a failure? It costs 10% more than a GTX 580 and delivers 25% more performance. How often does that happen with ANY computer upgrade (CPU/GPU/Mobo/RAM/SSD/HDD)?

I can see the line of thinking that goes, "Nvidia will soon (?) launch something (?) that will offer ?% more performance for $?" and sure, that's inevitable. For a card announced today and available in January, this is a solid launch. AMD would be stupid to sell it for less money while they still have the fastest single-GPU card. Whether they'll be able to cut the price if/when Nvidia launches their counterattack remains to be seen.

As for me, I bought a Radeon 5850 two years ago when they were selling for $339 and am perfectly fine playing BF3 on the High preset. I have tried the Ultra preset (it works fine in single-player and on an empty MP server :p). Sure, it looks better, but not worth the cost/hassle of selling my current GPU and paying the difference. I need to wait for the next big leap in graphics, one that renders GTX 4xx/Radeon 5xxx class cards unusable.

Will probably end up waiting until the inevitable 8870/8850.

I got my 580 for $340.
 