
The rumors about the product schedule are false -- AMD


VulgarDisplay

Diamond Member
Apr 3, 2009
6,194
2
76
I predict the rewritten driver will push the 7970GE to an average of 0.5% faster overall than an overclocked (975-1000 MHz) Titan. That's how incredible it is going to be, because AMD has been mismanaging the memory subsystem on Tahiti since launch day and nobody realized it. Now that AMD has fired a large percentage of their staff, the programmers who are left had no choice but to eventually pick up on the huge oversight.
Yes, you get exactly what I was trying to say.
 

RussianSensation

Elite Member
Sep 5, 2003
19,460
744
126
I guess Crysis 3 is not using DirectCompute and is not a good benchmark for comparing graphics, unlike, for example, Sniper Elite 2.
Huh? There are games that have a lot of tessellation (Batman AC, Lost Planet 2, Civ5, Crysis 3) and games that don't. Similarly, there are games that are very DirectCompute-heavy, and some that are not. If you want to know which games have a lot of compute shader effects, you have to read AMD's blog. Crysis 3 was only recently added as a Gaming Evolved title for marketing reasons. It's obviously nothing like Sniper Elite V2 or Dirt Showdown, which used DirectCompute for many graphical enhancements.

No one says games that are not Compute heavy are not good benchmarks. Stop trolling.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,597
1
81
I doubt AMD's blog is a good source of information for this; it is far too superficial and lacks in-depth detail. I don't have the proper background or information, but I would assume Crysis 3 uses DirectCompute as well as tessellation. To what extent, I cannot say, and neither can anyone else.

Recent AMD-promoted titles use OGSSAA, which is the most inefficient supersampling method I know of. Probably that's because only at higher resolutions can AMD bring their advantage in raw shading power to bear. I prefer SGSSAA, which provides a far better image quality/performance ratio than OGSSAA (the method downsampling also uses). The latter should be a last resort only. I own Sleeping Dogs, and with SSAA on high I still see pixel crawl, yet lose two-thirds of my performance versus no AA. SGSSAA usually gets rid of all jaggies and shimmering at a performance cost of about 45%.
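To put those two figures on the same scale, here is a quick sketch (my own arithmetic, not from the post) converting an FPS loss into a frame-time multiplier: "losing two-thirds of performance" means each frame takes 3x as long, while a 45% cost is roughly 1.8x.

```python
# Convert a fractional FPS loss into a frame-time multiplier.
# E.g. losing 2/3 of your FPS means frames take 1/(1 - 2/3) = 3x as long.

def frametime_multiplier(perf_loss):
    """Frame-time multiplier implied by losing `perf_loss` of FPS."""
    return 1.0 / (1.0 - perf_loss)

ogssaa = frametime_multiplier(2 / 3)   # "lose two-thirds of performance"
sgssaa = frametime_multiplier(0.45)    # "performance cost of about 45%"

print(f"OGSSAA-style hit: {ogssaa:.2f}x frame time")  # 3.00x
print(f"SGSSAA-style hit: {sgssaa:.2f}x frame time")  # 1.82x
```

So by the poster's own numbers, the OGSSAA setting costs roughly two-thirds more frame time than SGSSAA does.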
Interestingly, AMD's GCN is unusually weak with SGSSAA: it makes a difference whether you render smaller pixels once (higher resolution, OGSSAA, good for GCN) or the same pixels repeatedly (native resolution, SGSSAA, bad for GCN). Almost every benchmark with SGSSAA that I know of shifts the picture in favor of Nvidia. Recently, AMD has promoted SGSSAA in their SDK, starting with slide 51:
http://www.pcgameshardware.de/AMD-Radeon-Hardware-255597/Downloads/AMD-Demo-DX11-Download-1045674/
Hopefully this replaces the current SSAA implementations soon. That would be a really nice development.
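For readers unfamiliar with the ordered-grid vs sparse-grid distinction above, here is an illustrative sketch (my own, using the classic rotated-grid layout as a stand-in for a sparse pattern; offsets are hypothetical, not any vendor's actual pattern) of why a sparse 4x pattern covers edges better per sample:

```python
# 4x sample offsets relative to a pixel center, in pixel units.
# OGSSAA: a regular (ordered) grid of sub-samples.
ordered_grid_4x = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]

# Sparse (rotated) grid: every sample lands in a distinct row AND column,
# so near-horizontal and near-vertical edges get four gradient steps
# instead of two from the same number of samples.
sparse_grid_4x = [(-0.375, -0.125), (0.125, -0.375), (0.375, 0.125), (-0.125, 0.375)]

def distinct_columns(pattern):
    """Distinct x offsets: a proxy for gradient steps on a near-vertical edge."""
    return len({x for x, _ in pattern})

print(distinct_columns(ordered_grid_4x))  # 2
print(distinct_columns(sparse_grid_4x))   # 4
```

Both patterns cost four shaded samples per pixel, which is why the sparse variant gives the better quality/performance ratio the post describes.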
 
