[CB] Evolve Co-op Performance Review Findings - Radeon 290s Ace High-Res Gaming


Jaydip

Diamond Member
Mar 29, 2010
Throwing GE/GW features in after the game has launched sounds like:

"You know what, we have these 10 programmers that are underutilized and all this marketing $ budget left over from our record gross margins in Q4 2015. Can we spend that $ on some new AAA game that might be popular to help us increase GPU sales/brand image? Oh there is this game Evolve coming out - seems to have won a lot of awards at E3. Let's call the developer....."

With NV posting record net income and having ample cash flow, what I feared is slowly becoming a reality - the GPU maker that has more $ to throw at developers will eventually win. I never viewed that as fair competition, because what if one GPU firm has 5-10X the financial resources of its competitor - well, it can simply bribe developers with marketing dollars and send dozens of programmers to make sure all the popular AAA games run faster on its product. Unfortunately, this is exactly what's happening to PC gaming today.

This is much worse than Sony/MS paying developers to have exclusive titles on their consoles. This is basically going in and changing the natural flow of game development with $$$ and software engineers to alter the normal PC gaming coding process. I truly wish for AMD GE and NV GW to be banned from PC gaming like the good old days. This is about as bad as Intel paying companies to optimize software for its compilers, but now AMD and NV are doing it in the open, and everyone can see it. The ironic part is AMD GE titles run well on both brands' products for the most part.

DA2, TR, Dirt Showdown come to mind.
 

Unoid

Senior member
Dec 20, 2012
It's good to see AMD kicking butt.

BUT

CrossFire failed and made Evolve a no-buy for me due to bad performance. I require CrossFire as I play at 1440p/90Hz.
 

RussianSensation

Elite Member
Sep 5, 2003
DA2, TR, Dirt Showdown come to mind.

All 3 of those games were fixed by NV's driver team once they had access to the open-source code for the game. NV's performance in all 3 of those games is excellent today. AMD cannot fix the unoptimized black-box SDK of GW in any game. Notice the difference in the approaches?

Also, you didn't address the key part of my point which is the firm that has the most money can afford to work most closely with developers to optimize games for its products. How is that fair competition?
 

Jaydip

Diamond Member
Mar 29, 2010
All 3 of those games were fixed by NV's driver team once they had access to the open-source code for the game. NV's performance in all 3 of those games is excellent today. AMD cannot fix the unoptimized black-box SDK of GW in any game. Notice the difference in the approaches?

Also, you didn't address the key part of my point which is the firm that has the most money can afford to work most closely with developers to optimize games for its products. How is that fair competition?

Nowadays I am glad that games are coming to PC, tbh, and I think if NV/AMD didn't pour money in they would be in an even [worse] state.

Profanity isn't allowed in the technical forums.
-- stahlhart
 

RussianSensation

Elite Member
Sep 5, 2003
Games are obviously bandwidth limited at high resolutions. I won't be paying a cent for this game until it's on the $4.99 heap due to the disgusting launch day DLC model.

Oh, just realized the full Evolve gaming experience costs $215. Ha! This game is an epic fail. I guess we have NV premiums, game DLC premiums, premiums all around! Yup, another game for the $5 bargain bin. Between this and GW titles, PC gaming has never been "cheaper" - $5 Steam bargain bin sales here I come. :thumbsup:

Nowadays I am glad that games are coming to PC, tbh, and I think if NV/AMD didn't pour money in they would be in an even shittier state.

I don't think NV/AMD pouring $ into PC games does much to change the PC industry for the better. Developers cater 95% to consoles anyway. Honestly, things like $130 of DLC are enough to turn me off and skip the game entirely, regardless of whether it is a GE/GW title. Name 1 GE/GW title since Crysis 3 with true next-gen graphics, AI, or physics. Even FC4 has worse foliage physics effects than the original 2007 Crysis.
 

Jaydip

Diamond Member
Mar 29, 2010
Oh, just realized the full Evolve gaming experience costs $215. Ha! This game is an epic fail. I guess we have NV premiums, game DLC premiums, premiums all around! Yup, another game for the $5 bargain bin. Between this and GW titles, PC gaming has never been "cheaper" - $5 Steam bargain bin sales here I come. :thumbsup:



I don't think NV/AMD pouring $ into PC games does much to change the PC industry for the better. Developers cater 95% to consoles anyway. Honestly, things like $130 of DLC are enough to turn me off and skip the game entirely, regardless of whether it is a GE/GW title. Name 1 GE/GW title since Crysis 3 with true next-gen graphics, AI, or physics. Even FC4 has worse foliage physics effects than the original 2007 Crysis.

To be very honest with you, I always prefer gameplay over graphics; I didn't enjoy Crysis 3 at all. DAI has good graphics but terrible PC controls. The only game I am interested in this year is W3 and nothing else.
 
Feb 19, 2009
DA2, TR, Dirt Showdown come to mind.

NV was pretty quick on those, because TressFX & Global Illumination are open-sourced. AMD even held conferences explaining how those features work. About a month after release, those games ran as well as or better on NV hardware.

There's ONE AMD GE game that still runs like crap on NV after all this time: Company of Heroes 2.

Why? AMD played NV's dirty game and used DirectCompute for snow physics & rendering, optimized for GCN. It shows they can play dirty, but they haven't since; CoH2 was a one-off occurrence, and after that AMD's optimizations in GE have all been open source.
 

Pneumothorax

Golden Member
Nov 4, 2002
To be very honest with you, I always prefer gameplay over graphics; I didn't enjoy Crysis 3 at all. DAI has good graphics but terrible PC controls. The only game I am interested in this year is W3 and nothing else.

Try DAI with an Xbox 360 controller... It's like playing it on the Xbone, except with better graphics... :D
 

cmdrdredd

Lifer
Dec 12, 2001
NV was pretty quick on those, because TressFX & Global Illumination are open-sourced. AMD even held conferences explaining how those features work. About a month after release, those games ran as well as or better on NV hardware.

There's ONE AMD GE game that still runs like crap on NV after all this time: Company of Heroes 2.

Why? AMD played NV's dirty game and used DirectCompute for snow physics & rendering, optimized for GCN. It shows they can play dirty, but they haven't since; CoH2 was a one-off occurrence, and after that AMD's optimizations in GE have all been open source.

Or maybe they just can't afford to? Who knows.
 

Makaveli

Diamond Member
Feb 8, 2002
Try DAI with an Xbox 360 controller... It's like playing it on the Xbone, except with better graphics... :D

Agreed

A proper PC gaming setup should also have an Xbox controller as a default item.

I also have a street fighter fight pad for the fighting games.
 

monstercameron

Diamond Member
Feb 12, 2013
Ofc I don't know why they chose to go open source on GE, but I'm glad they did. GE games tend to run very well on all hardware.

Sorry, but the Linux geek in me forces me to interject and state that you are using the term "open source" incorrectly. Then again, the pragmatist in me fully understands what you mean.

Maybe it should be called "source available," like this:
"Ofc I don't know why they chose to make the source available on GE, but I'm glad they did. GE games tend to run very well on all hardware."
 

garagisti

Senior member
Aug 7, 2007
NV was pretty quick on those, because TressFX & Global Illumination are open sourced. AMD even have conferences explaining how those features work. About a month after release, those games ran as well or better on NV hardware.

There's ONE AMD GE game that still run like crap on NV after all this time: Company of Heroes 2.

Why? AMD played NV's dirty game and used DirectCompute for snow physics & rendering, optimized for GCN. It shows they can play dirty, but they haven't, CoH2 was a once occurence, after that AMD's optimizations in GE have all been open source.

AMD at the time was pushing GPU-compute-based physics, but later moved to physics that relies on the CPU, which is more efficient since CPUs have a lot more cores now and not many games are CPU-limited - that is, if the CPU is actually being used properly and the code is not just some poorly optimised piece of garbage.

I don't think I have read anything to the effect that AMD did that deliberately, and in fact this is the second time I have heard it from you. Would you like to suggest a source where I could read more about it? Mind, most of Nvidia's gaming cards have lower compute capability; if a game leans on compute and the hardware can handle it, how is that deliberate? It is not the same as over-tessellated concrete blocks or Batman's cape and so on, is it? Surely compute should work fine on the full-fat chips like the 780 Ti, Titan and the ones before them. A 290X has more than 5 teraflops of single precision IIRC. The 7970 was not bad either; IIRC the 7970 had 1/4-rate DP while the 290X has 1/8.

There was a reason why AMD went from VLIW5 to VLIW4 and kept hopping between architectures: they were simply trying to find more efficient cores for gaming.
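
For reference, those single/double precision figures work out roughly as follows - a back-of-the-envelope Python sketch that assumes reference shader counts and clocks for the cards named above, plus the 1/24 DP rate Nvidia ships on the GeForce 780 Ti:

Code:
# Rough theoretical throughput: FLOPS = shaders * 2 ops/clock (FMA) * clock
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

# (shader count, clock in GHz, DP rate relative to SP) - spec-sheet values
cards = {
    "R9 290X":     (2816, 1.000, 1 / 8),
    "HD 7970 GHz": (2048, 1.050, 1 / 4),
    "GTX 780 Ti":  (2880, 0.928, 1 / 24),
}

for name, (shaders, clock, dp_rate) in cards.items():
    sp = tflops(shaders, clock)
    print(f"{name:12s} SP ~{sp:.1f} TFLOPS, DP ~{sp * dp_rate:.2f} TFLOPS")

That lines up with the 290X being a 5+ TFLOPS single-precision part despite its 1/8 DP rate, while the 7970's 1/4 rate gives it the stronger DP figure.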
 

Attic

Diamond Member
Jan 9, 2010


From page 4 of the review:

Increasing the resolution to 2560x1600 dragged every single-GPU solution below 60fps, with the R9 290X falling just short with 57fps while the GTX 980 was slightly slower at 55fps. The R9 290 or GTX 780 Ti are about as low as you'll want to go at this resolution.

Disappointingly, the GTX 970 averaged just 44fps, being only marginally faster than the old HD 7970 GHz at 41fps. We believe this massive reduction in performance is due to the GTX 970's partitioned memory configuration.


That's the ugly truth on the 970. You have to worry about how often this is going to play out on the 970. I doubt nVidia is going to provide support down the road to cover up the 970's shortcomings in any kind of timely manner. While the 970 is new I'm sure nVidia will put resources behind fixing Evolve, but there's no way that will happen once their next cards come out.
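
For anyone who hasn't followed the 970 saga, the "partitioned memory configuration" Techspot refers to is the 3.5 GB + 0.5 GB split Nvidia disclosed: seven of the eight memory controllers serve the large segment, one serves the small one, and the two can't be accessed at the same time. A rough sketch of the peak numbers (whether that split is actually what's hurting Evolve here is exactly what gets debated below):

Code:
# GTX 970 memory layout as disclosed by Nvidia (peak theoretical figures)
gbps_per_pin = 7.0           # effective GDDR5 data rate

fast_segment_bits = 7 * 32   # 7 of 8 memory controllers -> 224-bit path, 3.5 GB
slow_segment_bits = 1 * 32   # remaining controller      -> 32-bit path, 0.5 GB

fast_bw = fast_segment_bits / 8 * gbps_per_pin   # ~196 GB/s
slow_bw = slow_segment_bits / 8 * gbps_per_pin   # ~28 GB/s

print(f"3.5 GB segment: ~{fast_bw:.0f} GB/s")
print(f"0.5 GB segment: ~{slow_bw:.0f} GB/s")
# The segments can't be read simultaneously, so a working set that spills
# past ~3.5 GB sees effective bandwidth drop sharply.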


I read reviews of this game at Metacritic; a lot of folks are not liking the DLC scheme attached to this game. Agreed.
 

Erenhardt

Diamond Member
Dec 1, 2012

Thanks.

Someone, some time ago, said that Hawaii would ridicule Titan. Well, that happened:
[image: 2560.png]


And that has me thinking... AMD didn't even release a driver for this game.
 

SuiGeneris83

Junior Member
Feb 12, 2015
As usual, judging game performance this early isn't exactly wise. It takes months for a game's final performance to be determined, after patches and driver revisions have reached their peak optimization.
 

Gloomy

Golden Member
Oct 12, 2010
There is nothing in that graph that suggests the 970's insane memory setup is the reason behind its poor performance.

It's more likely that Cryengine's new PBR renderer just favors GCN.

Which isn't really weird as it's an engine that has to run well on consoles.

Even that might be a stretch, considering we've only had two games so far using Cryengine 3.6+, Ryse and this. But at least it's a better conclusion than jumping on the 970 hatewagon with no evidence... :thumbsdown:
 

SuiGeneris83

Junior Member
Feb 12, 2015
290X is kicking ass lately. With the current abysmal state of Nvidia's driver support for Kepler cards, I wish I'd gone with dual 290X for the better performance and the superior frame pacing of AMD's XDMA multi-GPU setups.

I don't think drivers are responsible for Kepler's recent performance. I think it's because more and more games are using compute shaders. That explains why the GTX 780 Ti's performance hasn't diminished as much as that of the lower-end Kepler models: the GTX 780 Ti had strong compute performance, though not as strong as Hawaii or Maxwell.
 

SuiGeneris83

Junior Member
Feb 12, 2015
There is nothing in that graph that suggests the 970's insane memory setup is the reason behind its poor performance.

It's more likely that Cryengine's new PBR renderer just favors GCN.

Which isn't really weird as it's an engine that has to run well on consoles.

Even that might be a stretch, considering we've only had two games so far using Cryengine 3.6+, Ryse and this. But at least it's a better conclusion than jumping on the 970 hatewagon with no evidence... :thumbsdown:

I think it's probably an issue of bandwidth. Both Maxwell cards have very strong performance at 1080p, but as you ratchet up to higher resolutions, the performance drops significantly. Hawaii has both a 512 bit bus and strong compute performance, so this game favors that combination.

Anyway, I'm sure NVidia will continue to optimize for the title, especially for the L2 cache which is supposed to help mitigate bandwidth issues.

[image: 1920.png]
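
As a quick sanity check on the resolution jump itself - simple arithmetic, nothing card-specific:

Code:
# Pixels per frame at the two tested resolutions
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_1600p = 2560 * 1600   # 4,096,000 pixels

print(f"1600p / 1080p pixel ratio: {res_1600p / res_1080p:.2f}x")
# ~1.98x the pixels per frame, so per-frame bandwidth demand roughly
# doubles going from 1080p to 1600p, all else being equal.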
 

3DVagabond

Lifer
Aug 10, 2009
As usual, judging game performance this early isn't exactly wise. It takes months for a game's final performance to be determined, after patches and driver revisions have reached their peak optimization.

Well, there are some who think we can judge tech demos from games that are still at least a year away from being released and condemn an entire company's driver program (Starswarm). Why can't we judge anything from a game that's released? People want to play the game now, not months from now, and today's performance is what matters.
 

3DVagabond

Lifer
Aug 10, 2009
There is nothing in that graph that suggests the 970's insane memory setup is the reason behind its poor performance.

It's more likely that Cryengine's new PBR renderer just favors GCN.

Which isn't really weird as it's an engine that has to run well on consoles.

Even that might be a stretch, considering we've only had two games so far using Cryengine 3.6+, Ryse and this. But at least it's a better conclusion than jumping on the 970 hatewagon with no evidence... :thumbsdown:

This is Techspot's conclusion. This isn't coming from some nVidia haters on a forum.
Increasing the resolution to 2560x1600 dragged every single-GPU solution below 60fps, with the R9 290X falling just short with 57fps while the GTX 980 was slightly slower at 55fps. The R9 290 or GTX 780 Ti are about as low as you'll want to go at this resolution.

Disappointingly, the GTX 970 averaged just 44fps, being only marginally faster than the old HD 7970 GHz at 41fps. We believe this massive reduction in performance is due to the GTX 970's partitioned memory configuration.

As much as people want to push that the memory on the 970 doesn't matter, we are seeing that it indeed does.
 

SuiGeneris83

Junior Member
Feb 12, 2015
This is Techspot's conclusion. This isn't coming from some nVidia haters on a forum.


As much as people want to push that the memory on the 970 doesn't matter, we are seeing that it indeed does.

If VRAM was indeed the cause, then the GTX 780 Ti with even less VRAM than the GTX 970 would also be affected.

I have no doubt that the GTX 970's peculiar memory setup is a problem here, but I don't think it's because of the VRAM. The GTX 970's memory bus is 224 bit, compared to the full 256 bit bus of the GTX 980.

GTX 780 Ti has a 384 bit bus and more raw bandwidth than the Maxwell cards. Between all three of them, the GTX 780 Ti has the least amount of performance decline going from 1080p to 1600p. Then comes the GTX 980, and finally the GTX 970 in the rear with a massive performance decrease.

So I'm convinced it's more a bandwidth issue than a VRAM issue.
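
Putting rough numbers on that bus-width comparison - a sketch using spec-sheet memory data rates, where the 970 row uses the 224-bit path to its 3.5 GB fast segment rather than the advertised 256-bit figure:

Code:
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

cards = {
    "GTX 780 Ti":            (384, 7.0),
    "GTX 980":               (256, 7.0),
    "GTX 970 (3.5 GB pool)": (224, 7.0),
    "R9 290X":               (512, 5.0),
}

for name, (bus, gbps) in cards.items():
    print(f"{name:22s} ~{bandwidth_gbs(bus, gbps):.0f} GB/s")
# ~336, ~224, ~196 and ~320 GB/s respectively, which tracks the ordering of
# the 1080p-to-1600p decline described above.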
 

3DVagabond

Lifer
Aug 10, 2009
If VRAM was indeed the cause, then the GTX 780 Ti with even less VRAM than the GTX 970 would also be affected.

I have no doubt that the GTX 970's peculiar memory setup is a problem here, but I don't think it's because of the VRAM. The GTX 970's memory bus is 224 bit, compared to the full 256 bit bus of the GTX 980.

GTX 780 Ti has a 384 bit bus and more raw bandwidth than the Maxwell cards. Between all three of them, the GTX 780 Ti has the least amount of performance decline going from 1080p to 1600p. Then comes the GTX 980, and finally the GTX 970 in the rear with a massive performance decrease.

So I'm convinced it's more a bandwidth issue than a VRAM issue.

They said it was because of the configuration, which is what you are saying. I can't see what you are disagreeing with.