AMD's Gaming Evolved snags Far Cry 3

Feb 19, 2009
Benchmarks or it didn't happen. CURRENT BENCHMARKS TOO PLEASE.

He derived that from a GTX680 vs. a standard 7970, and on older benches. This is the same nonsense he spouted in another thread, where it was disproved, but he argued the 7970 GHz isn't a competitor to the 680...
 

RussianSensation

Elite Member
Sep 5, 2003
That's not how software engineering works at all. To gain better performance from your hardware you must show the developers how to extract that performance in the first place; throwing money won't help that. Anyone who has used Intel/NV/AMD compilers will understand that.

It's not physically "throwing $ at developers" in the way you think. It's an exchange of a good for a service. For example, AMD may pay for the AMD Gaming Evolved game title that comes bundled with the GPUs later (See Dirt Showdown, Nexuiz, Sleeping Dogs). NV does the same with Batman bundle, Borderlands 2, Just Cause 2, Mafia II gaming bundles.

Notice how both companies tend to bundle these "free" games with their GPUs? Those games aren't free. Someone has to pay for them and that someone isn't the developer or publisher of that game. In return, AMD/NV get to dictate what features they want added in the game and/or alternatively get access to early development code and their programmers work closely with the developer to alter that code for their specific architecture.

These types of developer relations aren't free. It costs AMD and NV $ to bundle these games (goes to the firm which made the game), then pay the programmers (hire additional programmers) that can go out into the field so to speak and work with these developers. It's basically a game of who throws more $ at their developer relationship program, and some of that $ goes to the developers indirectly via these NV/AMD sponsored game bundles.

Also, when JHH presented the GTX690, Cevat Yerli (CEO of Crytek) shook his hand, and before he left the stage he said, "JHH, I look forward to working with you in the future to help sell more videocards," or something along those lines. You think Yerli's team put insane tessellation into the game, out of the blue, after it had already shipped, via a DX11 patch, because they have all this time on their hands and are bored?

Nowadays, when PC game sales are dwindling, I don't see how throwing money is going to help either AMD or NV. I think they do an excellent job of marketing the games, though. AMD/NV are better known than most of the game devs, so if they advertise your game, it's great for them. I mean, AMD advertises SD and NV advertises BL2; what more can the devs hope for?

What does it mean that NV advertises BL2 and AMD advertises SD? Those 2 games come bundled for "free" with NV and AMD GPU purchases. Who pays for those games? It sure isn't the developer. The $ from those Steam coupon-code sales goes to the developer/publisher from which company? You think the development houses are just giving these games away to AMD and NV for free, out of the blue? When you say AMD and NV advertise these games, what it means is that AMD and NV pay for these game bundles and later get to call the shots on what features are added, get the perk of early access to game code, etc. This type of access isn't free!

You are saying that's not indirectly influencing the developer by throwing $ at their game?

Please keep in mind AMD is rumored to have a clean sweep of the Wii/Sony/Xbox next-gen consoles. In fact, we already know the Wii U uses AMD 7xxx-series technology. This obviously does not help hardware PhysX market penetration. However, NV's architecture has historically been more flexible at more varied computational tasks, so don't count them out... I doubt they will fall behind AMD by much, if at all, even if games get coded in ways that favor GCN.

Is there a source that confirms the Wii U has an HD7000 GPU? I thought it was some custom-made Radeon HD card, but with DX10.1 support only? The codename for the GPU is "GPU7", which is probably why people confused it with the HD7000 series.
 

RussianSensation

Elite Member
Sep 5, 2003
Standard 7970 and 12.3 drivers? Link please!

He can't spin harder even if he tried. Every reputable review on the planet shows that HD7970 GE > GTX680. The more games are included, the more this lead is cemented. Of course, if you mostly play the games in which NV is fast, it's a great card. Interestingly enough, the HD7970GE is faster in 4 of the 5 best looking games on the PC (Metro 2033, Crysis 1/Warhead, Trine 2 and Witcher 2 Enhanced Edition, vs. BF3 for NV). Then you have Skyrim with mods and Arma II - very popular games in the gaming community - and in these 2 games the HD7970GE leads by 20-30%.

Actually it is AMD who is falling behind nVidia. Losing in 16 of 23 DX11 games is not really a sign of a great architecture for DX11.

Your list is 100% wrong. Everyone knows AMD right now has more titles in which it's faster, not fewer. You included Dirt 3, Deus Ex: HR and Sleeping Dogs when the HD7970GE is faster in those games without question. Shows how out of touch with recent reviews you are.

Here is the real list - AMD HD7970 GE wins in all of these, and in some of these DX9-DX11 games by 20-30%:

- Aliens vs. Predator
- Crysis 1 / Warhead
- Anno 2070
- Bulletstorm
- Serious Sam 3
- Alan Wake series
- Dirt 3
- Skyrim + mods + MSAA
- Batman AC + MSAA
- Arma II - Day Z, Reinforcements and Operations
- Risen 2 Dark Waters
- Darksiders II
- Fireball
- Deus Ex: Human Revolution
- Metro 2033
- STALKER COP
- Battleforge
- Trine 2
- Dirt Showdown
- Sniper Elite V2
- The Witcher 2 Enhanced Edition
- Civilization V
- Brink
- Mafia II
- Nexuiz

NV wins in these:
- WOW
- Hard Reset
- Project Cars
- Starcraft II
- Battlefield 3 (it's debatable who wins this at 1600P though)
- Lost Planet 2
- HAWX 2
- Dragon Age II
- Max Payne 3
- TrackMania 2
- Total War Shogun 2
- Portal 2
- World of Planes
- Prototype 2
- Wargame: European Escalation

* Crysis 2 and Diablo 3 I call a tie, since a ton of reviews show the HD7970 GE winning, a ton show the GTX680 winning, and some show basically a tie.

Oh, and now the HD7970GE beats the GTX680 into the ground in Half-Life 1: Black Mesa mod - one of the best FPS games ever made - with a 29% lead for the HD7970 GE at 1600P. Half-Life 1 is a 100x better game than the trio of LP2, HAWX 2 and Dragon Age II. Even an 860mhz HD7950 FLEX is as fast as a GTX680 in this game.

[Chart: Black Mesa benchmark at 2560x1600]


The more games you include in a review, the more the HD7970 GE cements its performance lead over the GTX680. In the games where the 680 trails, the delta is often very large - 15-25%, sometimes 30%. When the 680 leads, it's usually by 10%, maybe 15%.

Actually at TPU, in their GTX660 testing, HD7970 GE wins 16 out of 18 gaming tests against the 680. The more games you play outside of the BF3, Crysis 2, WOW camp, the better 7970 GE looks. The main card that puts GTX680 into the running for the fastest single-GPU is the MSI Lightning 680 @ 1360-1390mhz overclock.
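
For what it's worth, summary charts like TPU's "performance relative" graph are typically built by normalizing each game's fps to one card and averaging the ratios, with a geometric mean so a single 200-fps title can't drown out the rest. A quick Python sketch of that method - the fps numbers are made up for illustration, and TPU's exact weighting may differ:

```python
# Build a "relative performance" summary: normalize each game's fps to a
# baseline card, then take the geometric mean of the ratios so no single
# high-fps game dominates the average. All numbers are made up.
from math import prod

fps = {
    "Metro 2033":     {"HD7970GE": 42,  "GTX680": 36},
    "BF3":            {"HD7970GE": 78,  "GTX680": 82},
    "Crysis Warhead": {"HD7970GE": 55,  "GTX680": 48},
    "Portal 2":       {"HD7970GE": 180, "GTX680": 205},
}

def relative_perf(results, card, baseline="GTX680"):
    ratios = [game[card] / game[baseline] for game in results.values()]
    return prod(ratios) ** (1 / len(ratios))  # geometric mean

for card in ("HD7970GE", "GTX680"):
    print(f"{card}: {100 * relative_perf(fps, card):.1f}%")
```

The upshot is exactly the argument above: the summary number moves with the game pool, so which titles a site tests decides which card "wins" the chart.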
 

Jaydip

Diamond Member
Mar 29, 2010
@RS
If Intel chooses to manipulate a developer, it can, since AMD's market share in CPUs is negligible compared to Intel's. But neither NV nor AMD can do this, as the other one has quite a good market share as well. If they purposely hurt their competitor's performance, overall sales are going to be quite low indeed.
 

SlowSpyder

Lifer
Jan 12, 2005
That can't be right. I sure as hell don't get 27 fps in Black Mesa: Source, and I run at 6048x1080.

Just thought I'd chip in there.


Dual-GPU cards can have odd results. The 6990 is slower than the 6970 there; my guess is the CF profile wasn't working for that bench, or something along those lines.
 
Feb 19, 2009
@RS: Nice list, but for Max Payne 3 I've seen recent reviews where the 7970 GHz wins or ties. Same for BF3 at 1080p; at 1600p it's a win.

Another recent title: I Am Alive - a huge win for AMD.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
In some of the titles where nvidia wins, both brands are going to get more than 100 fps anyway. I personally couldn't care less that your gtx680 gets 130 fps in Portal 2 when my 7970 gets 110 fps.

Made up numbers, but you get the point.
 

Jacky60

Golden Member
Jan 3, 2010
You gotta love it, RS methodically dismantling all the specious marketing speak and fanboi bs from the NV camp with facts. All those inconvenient and annoying truths that the NV marketing dollar normally removes from view. GJ!
 

SirPauly

Diamond Member
Apr 28, 2009
In return, AMD/NV get to dictate what features they want added in the game and/or alternatively get access to early development code and their programmers work closely with the developer to alter that code for their specific architecture.

Imho,

And if the IHVs dictated to the developers, surely there would be tons of blogs from the developers discussing this.

It's probably more that developers may desire assistance or help from the IHVs, and the IHVs offer ways to improve the titles and create awareness for them - which makes sense for developer and IHV in cooperation. In most cases it is fidelity settings that don't change the fundamental game-play components. Where is the harm? AMD customers can enjoy some improved fidelity that takes advantage of the GCN architecture or AMD's software vision.

The key is not to do harm to the original code, but one may optimize for a specific architecture at launch for a more out-of-the-box advantage for a specific company, where the other may need some time after release to optimize or to work with the developer - see Dragon Age 2.

Where is the harm? AMD customers enjoyed more out-of-the-box performance when Dragon Age 2 was released.

Idealism 101: No, customers should all be equal - everything should be standardized - it's only fair when all customers have the same features - all customers should have similar efficiency! If not, features shouldn't be offered, because they may divide the PC market. Cobblestones should look real! :) Ignore the good, and the consumers enjoying the hard work the IHVs and developers do. Allow idealism to be the enemy of good.

The good is that some PC gamers may enjoy the hard work, and it pushes innovation instead of waiting for everyone to be on the same page, where everyone has to wait. Innovation doesn't wait for anyone - AMD or nVidia. Both pushing hard creates more innovation and improved gaming experiences for the PC gamer overall.
 

tviceman

Diamond Member
Mar 25, 2008
He can't spin harder even if he tried. Every reputable review on the planet shows that HD7970 GE > GTX680. [...]

Your list is 100% wrong. Everyone knows AMD right now has more titles in which it's faster, not fewer. [...]

[Full post, including both game lists, quoted from RussianSensation's reply above.]

Pretty sure Deus Ex runs better on nvidia hardware now - at least the more demanding DLC for it does. And just as many websites show Batman AC performing as well or better on Kepler cards, so from a performance perspective I call that a tie.

Metro, Mafia, and Batman are all PhysX titles. You can play Batman and Mafia with high amounts of AA, or you can enable PhysX effects, add in substantially more eye candy, and still get >60 fps. I call that a win for Nvidia. Metro is punishing enough on both cards at 2560 that I don't think either is particularly playable at the highest settings. Dropping down to 1920 doesn't change the performance disparity that much, but again Nvidia has PhysX, which does not exact much of a performance hit in this game. So I think that is more of a draw insofar as who "wins" that benchmark. Borderlands 2? Another "automatic" win in my book: http://www.youtube.com/watch?v=EWFkDrKvBRU

As others have pointed out, Nvidia offers so many more things to enhance image quality or the playability experience via their drivers (ambient occlusion, FXAA, adaptive vsync) that if they lose 90 fps to 105 fps, it doesn't really matter; there is still plenty of headroom for Nvidia's image-quality-enhancing drivers to kick in without sacrificing fluid gameplay.

AMD has the overall fastest GPU, that I agree with. But Nvidia's ecosystem and PC-centric focus has value that is apparently worth more to other people than it is to you.

EDIT: Half those games you listed literally do not matter at all because the frame rates are so ridiculously high in the first place. And when you have to start saying GAME + MODS + 16X AA + OVERCLOCKING + SURROUND + WATER COOLING it's reeeeeeeeeaally starting to stretch things just a little bit.
 

RussianSensation

Elite Member
Sep 5, 2003
@RS: Nice list, but for Max Payne 3 I've seen recent reviews where the 7970 GHz wins or ties. Same for BF3 at 1080p; at 1600p it's a win.

Another recent title: I Am Alive - a huge win for AMD.

Ya, I am not going to split hairs over MP3 and BF3. I feel that on the whole, in most reviews, the GTX680 is faster at 1080P. Sontin's list was extremely biased - it left out a ton of games out of the blue and even included games where the 680 is slower.
 

SirPauly

Diamond Member
Apr 28, 2009
AMD has the overall fastest GPU, that I agree with.

imho,

Indeed! AMD/ATI have world-class engineers and immense talent; this has happened many times in the past and probably will in the future. Congratulations - nVidia and AMD are both working hard trying to create improved gaming experiences for their customers and for the PC platform, and this is a good thing.
 

Red Hawk

Diamond Member
Jan 1, 2011
Pretty sure Deus Ex runs better on nvidia hardware now - at least the more demanding DLC for it does. And just as many websites show Batman AC performing as well or better on Kepler cards, so from a performance perspective I call that a tie.

The 7970 GHz Edition edges out the 680 at 1200p, and both the standard 7970 and the GHz Edition beat the 680 at 1600p.

[Charts: Deus Ex: Human Revolution benchmarks at 1200p and 1600p]

Metro, Mafia, and Batman are all PhysX titles. You can play Batman and Mafia with high amounts of AA, or you can enable PhysX effects, add in substantially more eye candy, and still get >60 fps. I call that a win for Nvidia. Metro is punishing enough on both cards at 2560 that I don't think either is particularly playable at the highest settings. Dropping down to 1920 doesn't change the performance disparity that much, but again Nvidia has PhysX, which does not exact much of a performance hit in this game. So I think that is more of a draw insofar as who "wins" that benchmark. Borderlands 2? Another "automatic" win in my book: http://www.youtube.com/watch?v=EWFkDrKvBRU

So...are Eyefinity games that Nvidia wins at general performance, like Dragon Age II, "automatic wins" for AMD as well?

As others have pointed out, Nvidia offers so many more things to enhance image quality or the playability experience via their drivers (ambient occlusion, FXAA, adaptive vsync) that if they lose 90 fps to 105 fps, it doesn't really matter; there is still plenty of headroom for Nvidia's image-quality-enhancing drivers to kick in without sacrificing fluid gameplay.

So does AMD. Driver tessellation control and MLAA.
 

RussianSensation

Elite Member
Sep 5, 2003
Pretty sure Deus Ex runs better on nvidia hardware now - at least the more demanding DLC for it does. And just as many websites show Batman AC performing as well or better on Kepler cards, so from a performance perspective I call that a tie.

I haven't seen many people test Deus Ex recently, but I see that at ComputerBase an HD7970 gets >74 fps at 2560x1600 and the 680 is slower than that. At 1080P they are within a hair of each other, which leads me to believe a 7970 at 1050mhz would take it easily. Not that it matters, since both cards are crushing this game (so really it doesn't matter, like you say).
http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-670/26/

Batman AC is much faster on AMD cards with MSAA after Cat 12.6, especially once you go to 1440/1600P. I'll give you that NV has PhysX, but with MSAA the 680 does not beat a 7970 GE. Again, both of these cards crush Batman AC anyway, so I'll give the edge to NV for having PhysX - but that's IQ, not performance, which was the whole point.

Metro, Mafia, and Batman are all PhysX titles. You can play Batman and Mafia with high amounts of AA, or you can enable PhysX effects, add in substantially more eye candy, and still get >60 fps.

OK, great - if you care about PhysX, by all means. I think PhysX in Batman AC is decent, but it is terrible in Mafia II. Regardless, I don't want to debate that; AMD gives you faster performance in all 3, and if you want PhysX in them, get an NV card. The GTX680 gets slapped around pretty badly in Metro 2033. It'll be interesting to see how this plays out in Metro: Last Light. Again, if you care about PhysX in Metro 2033, by all means get the NV card, but the 7970 GE is faster than the 680 in this game performance-wise.

Dropping down to 1920 doesn't change the performance disparity that much, but again Nvidia has PhysX, which does not exact much of a performance hit in this game. So I think that is more of a draw insofar as who "wins" that benchmark. Borderlands 2? Another "automatic" win in my book: http://www.youtube.com/watch?v=EWFkDrKvBRU

Ya, for sure, I think BL2 will offer more IQ on the NV side. Still, to me the only 'automatic' win is bitcoin mining. It means 'free' GPUs on the AMD side, my friend. :p If PhysX is worth $400-500 to you, by all means... To me it wasn't, even when I had the 470s.

What you are missing here is that as long as bitcoin mining works, AMD GPUs are free - the HD8000 series too. You know, when you decide to sell your 670 and upgrade to a 770/780, it's going to be free for us to get an 8970. As long as this perk exists, I see no reason to actually spend $ I'd rather spend on vacations, going out, booze, restaurants, a Haswell CPU platform upgrade, a larger SSD, a bigger PSU, videogames, etc. Again, if you don't care for this perk and are willing to pay $400-500 to experience PhysX, AO, FXAA and adaptive Vsync, you have that option :). The point Sontin made was that AMD is losing in performance in 16 of 23 DX11 games. This is not true. If he had stated that in his eyes the extra performance of the HD7970GE is not enough to compensate for losing the IQ perks the NV ecosystem offers, that would have been a more objective statement. Instead, he made up facts about performance.
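
The "free GPU" claim is just payback arithmetic, and it's easy to sanity-check with the standard expected-mining-revenue formula. A rough Python sketch; every input (hashrate, difficulty, prices, power draw) is a ballpark 2012-era guess rather than a measurement, and difficulty climbs constantly, so treat the output as illustrative only:

```python
# Expected bitcoin mining revenue:
#   BTC/day = hashrate * 86400 / (difficulty * 2**32) * block_reward
# All inputs below are rough, era-dependent assumptions, not measurements.

hashrate      = 685e6   # H/s, ballpark for a stock HD 7970
difficulty    = 2.5e6   # network difficulty (rises constantly)
block_reward  = 50      # BTC per block (halved to 25 in Nov 2012)
btc_price     = 11.0    # USD, hypothetical
power_draw_kw = 0.25    # card under mining load, rough guess
kwh_cost      = 0.10    # USD per kWh, varies by region
card_price    = 450.0   # USD, e.g. the Vapor-X 7970 mentioned later

btc_per_day   = hashrate * 86400 / (difficulty * 2**32) * block_reward
gross_per_day = btc_per_day * btc_price
net_per_day   = gross_per_day - power_draw_kw * 24 * kwh_cost

print(f"{btc_per_day:.3f} BTC/day -> ${net_per_day:.2f}/day net of power")
print(f"payback in ~{card_price / net_per_day:.0f} days "
      f"(only if difficulty and price held, which they won't)")
```

With these guesses the card pays for itself in roughly half a year; whether that actually happens depends entirely on where difficulty and the BTC price go.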

Again, strictly from a performance point of view, the 7970 GE is faster. Now, if you want to argue that a GTX670/680 is a better buy for your personal needs because you like PhysX, that's a fair point - but he specifically ignored the performance difference in all of the games I listed without PhysX.

As others have pointed out, Nvidia offers so many more things to enhance image quality or the playability experience via their drivers (ambient occlusion, FXAA, adaptive vsync) that if they lose 90 fps to 105 fps, it doesn't really matter; there is still plenty of headroom for Nvidia's image-quality-enhancing drivers to kick in without sacrificing fluid gameplay.

Ya, these things do add value for some gamers, while other gamers don't think they are worth spending extra $ on. To me, the advantages you listed mean little this generation, since with bitcoin mining you can get a $500 GPU for free if you go the AMD route. Why would I spend $500 on a slower card just for adaptive vsync, PhysX in 1-2 games a year, and TXAA/FXAA that blurs textures? FXAA works on AMD cards in many games anyway. This is how I look at it: I've got to spend $500 of my money for a GTX680, or $0 for a 7970. Hmm... tough, tough choice.

AMD has the overall fastest GPU, that I agree with. But Nvidia's ecosystem and PC-centric focus has value that is apparently worth more to other people than it is to you.

Ya, that's a fair point. If some people want those specific NV features and don't care for any specific AMD features, by all means. However, no one in this thread said anything negative about or disregarded NV's specific features. We were talking from a performance point of view and price/performance. So if you are going to cite the NV ecosystem, I am going to cite free or substantially discounted AMD GPUs. It's only fair. To some people, paying $0 for a 7970 is better than $400 for a GTX670 to get PhysX. Now, if $400 is the same as $0 to you because you are a high-net-worth individual, then you don't care about price/performance or value anyway, and I presume you'd be gaming on GTX690 SLI or quad-fire 680s.

EDIT: Half those games you listed literally do not matter at all because the frame rates are so ridiculously high in the first place. And when you have to start saying GAME + MODS + 16X AA + OVERCLOCKING + SURROUND + WATER COOLING it's reeeeeeeeeaally starting to stretch things just a little bit.

In that case a GTX570 or unlocked HD6950 is fast enough for most of those games anyway. If you don't crank settings, game at 1440/1600P, or use mods, you don't need a $500 videocard. You buy one either because you want the added image quality and 60+ fps in games, because you want to run the most demanding games faster (Metro 2033, Witcher 2 Enhanced, BF3, the Crysis games), or maybe because you run GPU-specific applications that use CUDA, compute, double precision, etc.

Even if we use your argument that older games don't matter and focus on modern games, $380 for a 1Ghz HD7970 makes the $480+ GTX680 overpriced. It would be far smarter to save $100 and get a 7970 OC (or, if you must have NV, a GTX670 OC) and put that $100 toward the next GPU upgrade. If you place additional value on the NV ecosystem, how much is it worth?

- How much slower would you accept an NV card to be and still remain in their ecosystem - 15, 20, 35%?
- How much extra are you willing to pay for the NV ecosystem at similar performance - $50, 100, 150, 200?

What you are saying is that NV feels the 680 can command a $100 price premium over the 1Ghz HD7970 for the extra NV features?

I'm not saying you have to use 8xMSAA in Batman AC, but even for people with a single 1440/1600P monitor, the HD7950 vs. the 670 and the 7970/7970GE vs. the 680 clearly provide superior price/performance - and in the case of the 7970 GE, simply unbeatable performance at 1600P:

[Chart: TPU relative performance summary at 2560x1600]


The Sapphire Vapor-X 7970 costs $450. That just made the entire GTX680 line-up irrelevant for 1440/1600P users (unless they get the $580 GTX680 Lightning, or plan on going SLI). I somehow doubt you can even use PhysX in modern games at 1440/1600P and maintain good framerates. Looking forward to the BL2 benchmarks to be proven wrong on this :)
 

NIGELG

Senior member
Nov 4, 2009
Imho,

And if the IHVs dictated to the developers, surely there would be tons of blogs from the developers discussing this. [...] Allow idealism to be the enemy of good.

[Full post quoted from SirPauly's reply above.]
''Allow idealism to be the enemy of good.''

You keep repeating this over and over....

Idealism... standards in the computer hardware industry. Otherwise chaos results. Idealism is not the enemy of good. Idealism IS good; it is what we should all strive for. Evil, chaos, fragmentation and unethical practices are the enemy of good... not idealism.

AMD and nVIDIA are going down a sickening road with proprietary crap...

I don't want to buy two gaming rigs to enjoy my games....do you?
 

RussianSensation

Elite Member
Sep 5, 2003
AMD and nVIDIA are going down a sickening road with proprietary crap...
I don't want to buy two gaming rigs to enjoy my games....do you?

Big difference: DirectCompute is not proprietary. It's an open standard.

Now you have a situation where AMD is throwing $ at AMD Gaming Evolved titles to bring out the specific architectural advantages of its SKUs: the superior dynamic scheduler plus the 32 Compute Units in Tahiti XT performing HDAO, contact-hardening shadows and dynamic lighting calculations.

By using this approach they mitigate the performance hit that would normally be incurred using traditional methods. GCN can also run a separable bilateral filter, which further improves the performance of the lighting simulation. That's simply because GCN was designed from the ground up as a general-purpose compute GPU, making it better at multi-tasking. The architecture has massive flexibility, while Kepler GK104 is stuck using a static scheduler to order the SIMDs. You can also use the compute units to accelerate post-processed AA on GCN.
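
On the separable-filter point: this is the classic filtering optimization where an NxN kernel is split into a horizontal pass followed by a vertical pass, cutting the work per pixel from N^2 taps to 2N. (A true bilateral filter is not exactly separable; compute-based implementations rely on an approximation.) A small NumPy sketch of the idea using a box filter, where the two-pass version produces the same image for a fraction of the arithmetic:

```python
import numpy as np

def blur_2d(img, k):
    """Direct 2D box blur: k*k taps per pixel."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def blur_separable(img, k):
    """Same result as two 1D passes: only 2*k taps per pixel."""
    pad = k // 2
    p = np.pad(img, ((0, 0), (pad, pad)), mode="edge")
    h = sum(p[:, d:d + img.shape[1]] for d in range(k)) / k    # horizontal pass
    p = np.pad(h, ((pad, pad), (0, 0)), mode="edge")
    return sum(p[d:d + img.shape[0], :] for d in range(k)) / k  # vertical pass

img = np.random.rand(256, 256).astype(np.float32)
assert np.allclose(blur_2d(img, 9), blur_separable(img, 9), atol=1e-4)
print("9x9 blur: direct = 81 taps/pixel, separable = 18 taps/pixel")
```

The GPU version of the same trick runs each pass as a compute shader, which is where the extra compute throughput on GCN would pay off.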

Here are the 2 biggest differences between GK104 and Tahiti XT:

1) Dynamic vs. Static scheduler for compute tasks:

Tahiti XT has 32 Compute Units and a dynamic scheduler that can cover up dependencies and other types of stalls. VLIW's weakness was that it was statically scheduled ahead of time by the compiler. Kepler GK104 has scalar SIMDs but still uses a static scheduler, so it's only halfway there. In GCN, it is the CU that now schedules execution within its domain (see the toy sketch below the diagram).

[Diagram: GCN Compute Unit]
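
To see why a dynamic scheduler matters, here's a deliberately toy Python model (nothing like GCN's actual hardware): a few "wavefronts", each a dependency chain of instruction latencies that a compiler can't fully predict at build time. A fixed, precompiled round-robin order must insert bubbles whenever the scheduled wavefront isn't ready; a dynamic scheduler simply issues from whichever wavefront is ready:

```python
# Toy single-issue model of static vs. dynamic instruction scheduling.
# Each wavefront is a dependency chain: an op can't issue until the
# previous op in the same wavefront has completed.
WAVEFRONTS = [[6, 1, 1], [1, 6, 1], [1, 1, 6]]  # per-op latencies in cycles

def simulate(wfs, dynamic):
    n = len(wfs)
    ready_at = [0] * n      # cycle when wavefront i may issue its next op
    pc = [0] * n            # next instruction index per wavefront
    # "compiler-scheduled" fixed round-robin order for the static case
    # (assumes equal-length wavefronts)
    order = [i for _ in range(len(wfs[0])) for i in range(n)]
    k = 0                   # position in the static order
    cycle, issued, total = 0, 0, sum(len(w) for w in wfs)
    while issued < total:
        if dynamic:
            # issue from ANY wavefront whose previous result has landed
            ready = [i for i in range(n)
                     if pc[i] < len(wfs[i]) and ready_at[i] <= cycle]
            i = ready[0] if ready else None
        else:
            # must follow the precompiled order; bubble if not ready yet
            i = order[k] if ready_at[order[k]] <= cycle else None
        if i is not None:
            ready_at[i] = cycle + wfs[i][pc[i]]
            pc[i] += 1
            issued += 1
            if not dynamic:
                k += 1
        cycle += 1
    return max(ready_at)    # cycle when the last result lands

print("static (precompiled order):", simulate(WAVEFRONTS, False), "cycles")
print("dynamic (issue when ready):", simulate(WAVEFRONTS, True), "cycles")
```

With the staggered latencies above, the static order finishes in 20 cycles while the dynamic one finishes in 11, purely because each stall gets covered by another wavefront's work. That's the pitch of GCN's CU-level scheduling in miniature.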


2) GCN is a brand-new architecture built for compute from the ground up, and Kepler GK104 is not. That gives AMD a huge leg up in compute performance, since it's simply a much more modern architecture:

"From a high level overview Kepler is identical to Fermi: it’s still organized into CUDA cores, SMs, and GPCs, and how warps are executed has not significantly changed. Nor for that matter has the rendering side significantly changed" ~ Anandtech

All Kepler GK104 is, is a fantastically rebalanced Fermi on a 28nm node without shader clocks, with some improvements such as NVIDIA doubling some execution resources - 2 warp schedulers becoming 4, and the register file doubling from 32K entries to 64K. Fundamentally, however, it's an architecture that goes back to at least 2010.

For compute, GCN Tahiti XT is 1 full generation ahead:

[Diagram: AMD GCN architecture overview]


"On the compute side, AMD’s new Asynchronous Compute Engines serve as the command processors for compute operations on GCN. The principal purpose of ACEs will be to accept work and to dispatch it off to the CUs for processing. As GCN is designed to concurrently work on several tasks, there can be multiple ACEs on a GPU, with the ACEs deciding on resource allocation, context switching, and task priority." ~ AnandTech

Most of us thought this was all marketing fluff from AMD, but now that they are throwing $ at developers, they are accelerating graphical workloads via DirectCompute a lot sooner than anticipated. Even 4-5 months ago I thought NV would continue piling extreme tessellation into games; I didn't imagine that AMD would improve its drivers so much and, at the same time, start working with developers to extract faster performance from GCN's DirectCompute capabilities.

There is nothing proprietary about this - Kepler GK104 was just never designed for flexible computational work like this. It's good at single-precision mathematical calculations but is not as sophisticated. GK110 is much better, since it has a dynamic scheduler.

The worst-case scenario for GK104 is game code specifically written to take full advantage of Tahiti XT's 32 Compute Units, its ACEs and its dynamic scheduler to accelerate graphical calculations (or mitigate their corresponding performance hit). The end result is this:

[Chart: Dirt Showdown benchmark]


No one is stopping NV from making a better compute architecture, reintroducing a dynamic scheduler, or compensating by simply adding more functional units. AMD cannot fix its PhysX performance, however.

Either way, we saw how NV took full advantage of its superior tessellation performance with Fermi, and now it appears AMD is trying to do the same with compute. Of course, NV is not stupid, and if they feel compute is the future of GPU evolution, you can bet the marbles that Maxwell may very well end up more advanced than GCN (and it should be, since it will be even more modern), with AMD trying to catch up through incremental/evolutionary upgrades to GCN. From what I've read, Maxwell should be a brand-new architecture from NV, and beastly indeed. NV still has those 8 PolyMorph 2.0 engines in GK104 (i.e., 8 tessellation engines).

The biggest marketing bluff NV pulled this round was convincing the world that Kepler is a brand-new architecture. No sir, it's just Fermi enhanced. This goes to show just how excellent Fermi was, that it can still hang with the more modern GCN. I am still waiting for NV to pull its trump card with an insanely tessellated game. If they throw enough $ at a developer, we could see tessellated frogs that bring out the best features of Kepler (wink).
 

thilanliyan

Lifer
Jun 21, 2005
As long as both sides can run these extra features without massive performance hits, I'm all for it. I don't play DiRT, but hopefully some other game I play will use this kind of stuff.
 

Red Hawk

Diamond Member
Jan 1, 2011
And of course there is the possibility of Nvidia finding some sort of driver workaround to even out the performance, or even exceed AMD's. They did that quite ably with Dragon Age II.
 

3DVagabond

Lifer
Aug 10, 2009
''Allow idealism to be the enemy of good.''

You keep repeating this over and over.... [...]

AMD and nVIDIA are going down a sickening road with proprietary crap...

I don't want to buy two gaming rigs to enjoy my games....do you?

DirectCompute and OpenCL are available to both companies, just as tessellation is. Because nVidia leveraged their advantage with tessellation, AMD was forced to correct that weakness in their designs. Now it's up to nVidia to do the same with DirectCompute and OpenCL. This is actually a good thing: neither company can afford to put out a product that isn't well balanced, because if they do, the other will nail them to the wall. We are going to get better GPUs because of this competition. I agree that proprietary features and vendor locks are bad for us consumers. The latter (vendor lockouts) should be illegal and fall under anti-competition laws, IMO.

I don't think this "compute advantage" that AMD is exploiting right now will be a long-lasting one. nVidia has done really well competing with GK104, but it's just not as powerful a chip all around as Tahiti is. AMD took a bit of a lackadaisical approach with Tahiti at first, and nVidia was able to beat them with a smaller, less balanced chip by exploiting GK104's strengths - but I think they (Rory the Rookie ;)) learned that nVidia is a bit more resourceful than they (he) anticipated.

What will come out of this for nVidia customers is that nVidia will have to extend compute capabilities further down their range. Pitcairn beats GK104 in most OpenCL benches (Cape Verde can trade blows with GK104). If AMD's 2nd- and 3rd-tier cards are capable performers, then GK114 and GK116 are going to have to be as well, or they'll just get clobbered by AMD's midrange and lower cards.
 

Lonbjerg

Diamond Member
Dec 6, 2009
Nowadays, when PC game sales are dwindling, I don't see how throwing money is going to help either AMD or NV. I think they do an excellent job of marketing the games, though. AMD/NV are better known than most of the game devs, so if they advertise your game, it's great for them. I mean, AMD advertises SD and NV advertises BL2; what more can the devs hope for?


Bollocks - PC gaming is doing better than ever... stop posting uninformed opinions as "facts"...
 

Jaydip

Diamond Member
Mar 29, 2010
"PC gaming is doing better than ever" indeed but compared to consoles not so much.
 

SPBHM

Diamond Member
Sep 12, 2012
That can't be right. I sure as hell don't get 27 fps in Black Mesa: Source, and I run at 6048x1080.

Just thought I'd chip in there.

Did you play the entire game?

Overall the game runs at over 100 fps in many places, but there are some spots with fill-rate and dynamic-light issues that cause huge fps drops for me.


BUT, this "Game GPU" website, I'm not sure, but from what I see their testing methodology is far from ideal, they have different people with different combination of CPUs and VGAs making the tests? (anyone who understands Russian can clarify that?)



Anyway, about FC3: I'm not excited at all. FC2 was so boring...
FC1 was so much better, but it was made by Crytek; FC2 and FC3 are made by Ubisoft.