***AMD Radeon HD7970 GHZ Edition - Official Reviews Thread***

Page 6 - AnandTech Forums

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
The big question then is how much does this have to do with GCN's compute chops, and how much does this have to do with Codemasters targeting GCN? If it's more of the former it could be an indication of the way future games might go, if it's more of the later it could just be an extreme example of one company gaining an advantage due to heavy optimization.

It has nothing to do with GCN. It's AMD's way of cheating to put their cards ahead of the competition. They did it with Dragon Age 2 and recently with Shogun 2.

I find it funny because the same people who always complain about nVidia have no problem with this strategy. :whistle:
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
According to ComputerBase, this card has a 20% advantage over the GTX 680 at Eyefinity resolutions with high AA.

Then if you run Eyefinity and can get this card to run quietly enough, it would indeed be the card to get, but at 25x16 or 19x10, I would say no!
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
As a Surround user myself: it's nice in a couple of titles and garbage in many, but it's great in the genres it suits, like FPS and racing sims. Not really a major selling point, since our numbers are few and far between even on enthusiast forums.


ComputerBase isn't listing actual FPS. My guess is that at 5760x1080 with 8xAA/16xAF, 99.9% of games worth playing on a triple-screen setup would equate to unplayably low frame rates on either card.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
It has nothing to do with GCN. It's AMD's way of cheating to put their cards ahead of the competition. They did it with Dragon Age 2 and recently with Shogun 2.

I find it funny because the same people who always complain about nVidia have no problem with this strategy. :whistle:

lol [citation required]

I wouldn't touch the GHz 7970, but apparently AMD is cheating in DA2 and Shogun 2? If you say so ;) If you bother with a reply you should probably include some type of proof, of which I'm sure there is either none (likely) or only highly circumstantial BS.
 
Last edited:

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Would like a response on this, http://techreport.com/articles.x/23150/11

Did they edit the article, last paragraph for me is:

"For now, we at least have a fresh reminder of how close the battle for GPU supremacy is in this generation of chips. You can't go wrong with either team this time around, although the mojo is liable to change hands again at any moment. "

Might want to reconsider throwing out the troll moniker. Making up article quotes, if that is indeed what has happened, whiffs of Wreckage-worthy material to me.

Edit: After some interneting it seems you are quoting Techpowerup while Will Robinson is quoting TechReport, now I am sure that the troll slam was uncalled for.

Quote the last paragraph of that same article, same page troll.

EDIT: I'll do it for you, since I know you won't.
AMD's HD 7970 GHz edition is priced at $499, the same as NVIDIA's GTX 680. While this might seem enticing at first glance, since the GHz Edition is faster, NVIDIA's card wins at power-draw, noise, and manual overclocking, with the better card overall. I find $499 is just too high to really draw away much attention from the GTX 680, if the HD 7970 GHz Edition was $450 I'd definitely consider it, until that happens I'll happily take a GTX 680, or even GTX 670, which offers better price/performance at not much lower performance.
 
Last edited:

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
lol [citation required]

I wouldn't touch the GHz 7970, but apparently AMD is cheating in DA2 and Shogun 2? If you say so ;) If you bother with a reply you should probably include some type of proof, of which I'm sure there is either none (likely) or only highly circumstantial BS.

Let me guess you forgot the famous Dragon Age 2 performance of the Fermi cards at the start: http://www.pcgameshardware.de/aid,8...n-Age-2-um-knapp-60-Prozent/Grafikkarte/News/

Look, 60% more performance.

And Shogun 2:
Kepler performance before the latest patch: http://www.hardware.fr/articles/857-21/benchmark-total-war-shogun-2.html
Kepler performance after the latest patch: http://www.hardware.fr/articles/866-19/benchmark-total-war-shogun-2.html
Kepler performance with the latest beta driver: http://www.hardware.fr/articles/869-17/benchmark-total-war-shogun-2.html

BTW, it's a 60% performance increase, too. :biggrin:

/edit: Statement from Ryan Shrout about the Kepler performance in Dirt:Showdown:
People inside NVIDIA are telling me that there is an unpatched bug in the game that is causing these massive performance hits on NVIDIA's Kepler architecture
http://pcper.com/reviews/Graphics-C...on-Review-Taking-GTX-680/New-Games-and-Comput
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I suspected you wouldn't post proof of AMD cheating, and you didn't. Again, if you're going to bother replying, post proof. Such bold claims warrant explicit proof that AMD is cheating, not just performance differences between drivers.

I understand you hate AMD, which is cool; they deserve flak. But posting such a bold claim without good proof to back it up doesn't make sense. I asked for proof that AMD is cheating, not circumstantial differences between driver revisions; stuff like that happens all the time. But you bust out with "AMD is cheating." Whatever :rolleyes:
 
Last edited by a moderator:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Maybe you should read my posting again.

LOL... I'll get right on reading your posting again. What you posted isn't proof that AMD is cheating. Do any of your links pinpoint a specific cheat on AMD's part that intentionally cripples the game on nVidia hardware? No? Highly circumstantial BS? Yes.

edit: I'm also curious as to how a game that uses PhysX (Dragon Age 2) employs an AMD cheat

I love it when you guys make good technical points, but it's all for naught if you're going to mix them with inflammatory rhetoric. Please stick to the former and avoid the latter.

And for the record, the idea that a patch in a year-old game that reduces Kepler performance is an AMD conspiracy is really stretching it. Critical thinking, guys!:p

-ViRGE
 
Last edited by a moderator:

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Nvidia should start looking for an underground ocean in that Dirt game, imo.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
No, it's a reading comprehension issue on your end.

The stock GHz Edition is at 429W. When they were testing max overclocks they got 485W. :rolleyes:

Yeah. An overclocked GHz Edition draws more power than a dual-GPU card with roughly 2x the speed. Does that seem normal to you?
 

KompuKare

Golden Member
Jul 28, 2009
1,235
1,610
136
ComputerBase isn't listing actual FPS. My guess is that at 5760x1080 with 8xAA/16xAF, 99.9% of games worth playing on a triple-screen setup would equate to unplayably low frame rates on either card.

They only list the actual FPS in the 'Anhang' (appendix), but there it's per game (which is logical, since average FPS over multiple games would be strange).

But it's '+ Anhang' with a little arrow because, I suspect, their reviews would otherwise end up just too long. If you click on 'Drucken' (Print) you get the whole review on one page; it would print out at 72 pages, which is almost a small novel.
 

Abwx

Lifer
Apr 2, 2011
12,000
4,954
136
Yeah. An overclocked GHz Edition draws more power than a dual-GPU card with roughly 2x the speed. Does that seem normal to you?

The power-draw graph you quoted uses Metro 2033 as the benchmark; now show us a GTX 690 that does (roughly!) double the performance of the overclocked 7970 in this game... :biggrin:

Without overclocking :

[Metro 2033 benchmark chart]

http://www.hardware.fr/articles/869-15/benchmark-metro-2033.html
 
Last edited:

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
Oh, I don't know. Because 10 million-plus people are probably playing that game at this moment.

Also, to the poster above: Warhead? Way to link a 4-year-old game. No one cares about Warhead.

Here's a more recent game for you. It uses DirectCompute for Forward+ rendering.

 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Would like a response on this, http://techreport.com/articles.x/23150/11

Did they edit the article, last paragraph for me is:

"For now, we at least have a fresh reminder of how close the battle for GPU supremacy is in this generation of chips. You can't go wrong with either team this time around, although the mojo is liable to change hands again at any moment. "

Might want to reconsider throwing out the troll moniker. Making up article quotes, if that is indeed what has happened, whiffs of Wreckage-worthy material to me.

Edit: After some interneting it seems you are quoting Techpowerup while Will Robinson is quoting TechReport, now I am sure that the troll slam was uncalled for.
Thanks Ves, I figured that was the case, but I knew tviceman was just blindly lashing out at a perceived negative comment about NVDA.
Not much likelihood of an apology there, though... :|
 

HurleyBird

Platinum Member
Apr 22, 2003
2,818
1,553
136
Here's a more recent game for you. It uses DirectCompute for Forward+ rendering.



I can see it now. Overuse of tessellation was used as cudgel when Fermi came out. Only makes sense that AMD would start using their overarching compute advantage in the same way.

If your architecture isn't balanced as well for games, then change the way games are balanced!

Eventually I can see compute being used as a cudgel by both sides, working with devs to exploit the specific idiosyncrasies of their architectures. Unfortunately for the green team, the best-case compute scenarios only seem to give them parity with GCN. They could just use CUDA, but excluding over half of the market is a tough sell for a game dev, and exclusive features get turned off in reviews anyway to ensure equal settings across cards.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Oh, this thread. Lots of stockholders here at Anand.

[Dirt Showdown FPS chart]


A+ title, would recommend to everyone.

I've enjoyed seeing Metro crop up again as anything other than a broken, horribly coded game. Back in my day we used to mock how 470s got 88 fps and tri-SLI 580s got 100.
 
Last edited:

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
I can see it now. Overuse of tessellation was used as cudgel when Fermi came out. Only makes sense that AMD would start using their overarching compute advantage in the same way.

If your architecture isn't balanced as well for games, then change the way games are balanced!

Eventually I can see compute being used as a cudgel by both sides, working with devs to exploit the specific idiosyncrasies of their architectures. Unfortunately for the green team, the best-case compute scenarios only seem to give them parity with GCN. They could just use CUDA, but excluding over half of the market is a tough sell for a game dev, and exclusive features get turned off in reviews anyway to ensure equal settings across cards.

Yeah, there's no way developers are going to use a closed API when DirectCompute is available on all hardware. What's so great about DirectCompute on the GCN architecture is that it can really enhance both the performance and the quality of a game. Dirt Showdown is a great example of the direction the industry is heading.
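For anyone wondering what Forward+ actually involves: a DirectCompute pass divides the screen into tiles and builds, per tile, a list of the lights that can reach it, so the forward shading pass only evaluates that short list instead of every light in the scene. Here's a minimal CPU-side sketch of the tile-culling step (pure Python; `TILE`, `cull_lights`, and all the numbers are illustrative assumptions, not anything from Codemasters' actual implementation):

```python
# Minimal sketch of Forward+ tile light culling. On the GPU this runs as
# one compute-shader thread group per tile; here it's a plain nested loop.
TILE = 16  # tile size in pixels

def cull_lights(screen_w, screen_h, lights):
    """Return {(tile_x, tile_y): [light indices]} for lights whose
    screen-space circle (x, y, radius) overlaps the tile's rectangle."""
    tiles = {}
    for ty in range(0, screen_h, TILE):
        for tx in range(0, screen_w, TILE):
            hits = []
            for i, (lx, ly, r) in enumerate(lights):
                # Closest point on the tile rectangle to the light centre,
                # then a circle-vs-rectangle overlap test.
                cx = min(max(lx, tx), tx + TILE)
                cy = min(max(ly, ty), ty + TILE)
                if (lx - cx) ** 2 + (ly - cy) ** 2 <= r * r:
                    hits.append(i)
            if hits:
                tiles[(tx // TILE, ty // TILE)] = hits
    return tiles

# Two lights on a tiny 64x32 "screen": each ends up in exactly one tile.
lists = cull_lights(64, 32, [(8, 8, 4), (60, 30, 10)])
```

On real hardware the per-tile light list is accumulated in group-shared memory by a compute-shader thread group; the sketch only shows the intersection logic that decides which lights a tile's pixels need to shade.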