[CB] Evolve Co-op Performance Review Findings - Radeon 290s Ace High-Res Gaming

Page 2 - AnandTech community discussion thread.

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
According to CB.de, NV is going to add more GameWorks features to it in upcoming updates.

While CB says they can't know how it will change the lineup... personally, going by the history of GameWorks, I can safely say: expect Kepler to be fully neutered when that happens and GCN parts to fall in line behind Maxwell.

Also, it's not cool to ship games unfinished and then have to release patches to add rendering features. :/

Fingers crossed the GW update doesn't break anything for AMD cards. I don't mind if it improves things for Nvidia; that's fine and expected. I'll be curious to see how this goes.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
If they add GameWorks later, this may be our first chance to see whether it has a negative impact on AMD cards, since we'll have a before and after.
 

DiogoDX

Senior member
Oct 11, 2012
747
279
136
[Image: qpC9web.jpg - benchmark chart]

http://www.pcgameshardware.de/Evolve-PC-258203/Specials/Test-Benchmarks-1150387/
 

S.H.O.D.A.N.

Senior member
Mar 22, 2014
205
0
41
If they add GameWorks later, this may be our first chance to see whether it has a negative impact on AMD cards, since we'll have a before and after.

I'd think there would be some difference between libraries that are embedded deep in the code (as with AC or Dying Light) and something you can just tack onto a finished product.

Then again, maybe the code is already there and they just locked it out for whatever reason.
 

Makaveli

Diamond Member
Feb 8, 2002
4,798
1,263
136
The 780 is time and again being embarrassed by the R9 280X.

I'm starting to notice this trend a lot now. It seems that as time goes on, Tahiti is erasing the lead the 780 held when it launched.
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
I'm starting to notice this trend a lot now. It seems that as time goes on, Tahiti is erasing the lead the 780 held when it launched.

It's rather interesting considering that at launch the 7970 was only slightly faster than the 580. I was disappointed in the 7970/7950 launch benchmarks; now they are so much faster than what we saw at launch.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126

The day the 285 came out, many warned that a 2GB card is DOA for 2015 gaming. Then when the 960 2GB came out, gamers once again pointed out that 2GB is a dead end and that 2015-2016 games would soon make Swiss cheese out of 2GB cards. And now the proof just keeps on coming. 2015 is officially the 'game over' year for 2GB cards. The list of games that drop 2GB cards is growing and growing. It's a real shame so many professional reviews of the 285/960 completely dropped the ball, despite history showing us exactly what happened to the X1900XT 256MB, 8800GT 256MB, 8800GTS 320MB, and GTX 470/480/570/580 1.28-1.5GB. History has a tendency to repeat itself.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Note: triple-monitor is borked in Evolve. They don't properly scale the 2D UI, so it assumes your monitor is as tall as it is wide (with 3x monitors...)
 

zlatan

Senior member
Mar 15, 2011
580
291
136
CryEngine has no issues with CF/SLI so that isn't even a valid excuse. It's almost as if they CBF.

It depends on the CryEngine version. The newer code doesn't really like forced AFR; that's the primary reason they implemented Mantle.
Same with UE4: there are a number of effects that are incompatible with forced AFR techniques. With an API like Mantle it is possible to copy all necessary data directly from GPU1's memory to GPU2's memory. With DX11 the data has to be copied to main RAM first, and then the second GPU can access it by copying it into its own VRAM. While this option is possible for the new engines, the scaling will be very bad in some situations.
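The cost of that extra hop can be sketched with a toy bandwidth model (all numbers here are illustrative assumptions, not measured figures from any engine):

```python
# Toy model of per-frame inter-GPU copy cost for AFR.
# All figures are illustrative assumptions, not measurements.

def copy_time_ms(size_gb, hop_bandwidths_gbps):
    """Total transfer time when a copy traverses each hop in sequence."""
    return sum(size_gb / bw * 1000 for bw in hop_bandwidths_gbps)

FRAME_DATA_GB = 0.25   # shared render-target data per frame (assumed)
PCIE_GBPS = 16.0       # roughly PCIe 3.0 x16

# Mantle-style direct copy: one hop, GPU1 VRAM -> GPU2 VRAM.
direct = copy_time_ms(FRAME_DATA_GB, [PCIE_GBPS])

# DX11-style staging: GPU1 VRAM -> system RAM, then system RAM -> GPU2 VRAM.
staged = copy_time_ms(FRAME_DATA_GB, [PCIE_GBPS, PCIE_GBPS])

print(f"direct: {direct:.2f} ms, staged: {staged:.2f} ms per frame")
```

Even the direct copy eats a large slice of a 16.7 ms (60 fps) frame budget, and the detour through system RAM doubles it, which is why effects that share data between GPUs can wreck AFR scaling under DX11.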
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
The 290X is kicking ass lately. With the current abysmal state of Nvidia's driver support for Kepler cards, I wish I'd gone with dual 290Xs for the better performance and the superior frame pacing of AMD's XDMA multi-GPU setups.

Wow

In this game, the 780ti falls exactly where it was when the gm204 launched:
Faster than the 970

I mean, people are gonna say whatever no matter what the results are. Just keep the misinformation coming. But here is a clear-cut case against the "nvidia abandoned Kepler" claims, yet you're still trying to force it in here somehow?

There is no denying that there has been a wave of games that have Nvidia cards all over the place. But there is also the fact that AMD promised hefty performance gains in a major driver update. Could we just once give credit where it's due?

AMD is doing better across the board. Not just against Kepler cards. But instead of them getting credit, it must be nvidia turning their back on Kepler.

Hmmmm, but AMD is not only catching up with Kepler cards. They are also gaining on Maxwell. This is a trend across many games, which one has to look at when talking about the performance of a graphics card. See, taking a single game is complete cherry-picking. It's really easy to make some random claim and use one game, chart, or graph to proclaim it's proven. But it always helps if your one game actually supports your claims. In this case, it's a total failure. The Kepler flagship falls exactly where it did when Maxwell launched.

The truth,
We see nvidia cards struggling on several new console ports. AMD has made real progress with a major driver update. There are games maxwell seems to handle better than Kepler. But then there are these cases, a brand new game that has Kepler right where it was at launch. It's always a game by game basis.

We can look at all of those factors and many more
Or
We can ignore everything and just keep repeating things until they become popular.

This game doesn't even support your theory but I don't guess that even matters.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
If they add GameWorks later, this may be our first chance to see whether it has a negative impact on AMD cards, since we'll have a before and after.


This game is centered in my hit box, but the price is a bit too high, particularly if it will be an Nvidia GameWorks game. I'd expect Nvidia to get stronger performance in a follow-up driver update; I just hope that doesn't involve Nvidia's shenanigans within its GameWorks black box. The use of GameWorks in current games is just silly, and it is no surprise we see anomalies in AMD cards' performance in GameWorks vs non-GameWorks games. We'd all expect Nvidia to exploit things when allowed to like this; it's just standard.

Currently, performance is very nice on AMD cards and the game looks like a lot of fun. In its gameplay and environment it reminds me of Giants: Citizen Kabuto, which is a standout classic.

But the bottom line for this gamer is that GameWorks is a deal breaker; I was bummed to hear it will be put into Evolve. I'll hold off until the dust settles surrounding the GameWorks implementation in this one. Disappointing to find out this will be added after release, especially given the strong correlation between GameWorks titles and lousy overall performance, particularly poor performance on AMD cards. Grrrrrrr, but it will save me $60 today.
 

psolord

Platinum Member
Sep 16, 2009
2,080
1,232
136
I am not getting max gpu usage on either my GTX 970 or my 7950.

Both fall about 10% short of maximum usage.

Am I looking at something localized here, or has it been observed by others as well?

It does not seem to be a cpu limit, since the cpu has a lot more cycles to spare.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Either flawed testing methodology or 970/780Ti SLI setups are running into a pixel fill-rate bottleneck compared to 980 SLI at 4K:

[Image: Evolve_3840.jpg - gamegpu.ru benchmark chart at 3840x2160]


Kepler's performance at high resolutions is abysmal: 280X > 780, the 780 Ti manages just 5 fps more than a 280X (wow!), 290 > 780 Ti, the 7870 posts higher minimums than a 680, and 7950 > 770.

[Image: Evolve_2560.jpg - gamegpu.ru benchmark chart at 2560x1440]


At 1080p, the 280X is 29% faster than a 680, and 285 > 680.

[Image: Evolve_1920.jpg - gamegpu.ru benchmark chart at 1920x1080]


Almost every Kepler card besides the 780 Ti is seriously starting to show very poor performance in modern games, especially considering the premium pricing of those Kepler chips during the HD 7000 vs. 600/700 generation. I am honestly very surprised at how well CryEngine runs on GCN cards, considering Crysis 2 made mincemeat of the HD 5870/6970 compared to NV's competing cards.
 
Feb 19, 2009
10,457
10
76
AMD needs to stay on the ball, I'm not going to accept "GameWorks" excuses anymore for lack of CF support on release.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
If they add GameWorks later, this may be our first chance to see whether it has a negative impact on AMD cards, since we'll have a before and after.

Not only that, but also what its effect may be on Nvidia cards and how their performance shakes out. GameWorks by and large seems to cause large performance issues for both vendors, as well as in-game bugs. AFAIK they still have not updated the GameWorks implementation in AC Unity to include the promised tessellation effects, probably because the game is still broken and who knows how badly adding them would cripple it further.

I'm sure gamegpu.ru will have benches for Evolve pre/post GameWorks being added.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
What's shocking is that games like Evolve take years to develop, yet NV couldn't work with the developer to add GW features over the last 12 months?
 

darkfalz

Member
Jul 29, 2007
181
0
76
Games are obviously bandwidth limited at high resolutions. I won't be paying a cent for this game until it's on the $4.99 heap, due to the disgusting launch-day DLC model.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
That's because nvidia has a bunch of knotheads working for them, such as ... Tom Petersen.

Throwing GE/GW features in after the game has launched sounds like:

"You know what, we have these 10 programmers that are underutilized and all this marketing $ budget left over from our record gross margins in Q4 2015. Can we spend that $ on some new AAA game that might be popular to help us increase GPU sales/brand image? Oh there is this game Evolve coming out - seems to have won a lot of awards at E3. Let's call the developer....."

With NV posting record net income and ample cash flow, what I feared is slowly becoming a reality: the GPU maker with more money to throw at developers will eventually win. I never viewed that as fair competition, because what if one GPU firm has 5-10X the financial resources of its competitor? It can simply bribe developers with marketing dollars and send dozens of programmers to make sure all the popular AAA games run faster on its product. Unfortunately, this is exactly what's happening to PC gaming today.

This is much worse than Sony/MS paying developers for exclusive titles on their consoles. This is basically going in and changing the natural flow of game development with $$$ and software engineers, altering the normal PC game coding process. I truly wish AMD GE and NV GW were banned from PC gaming like in the good old days. This is about as bad as Intel paying companies to optimize software for its compilers, except now AMD and NV are doing it in the open, and everyone can see it. The ironic part is that AMD GE titles run well on both brands' products for the most part.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
I won't be paying a cent for this game until it's on the $4.99 heap due to the disgusting launch day DLC model.

Halo 2 on Xbox is the game that ruined it all for me. The only new game for consoles or PC that I have bought since that cold November morning is Borderlands 2 on Steam... but technically I bought a four-pack with friends to get a discount, so I guess I didn't pay full price after all...
 

stuff_me_good

Senior member
Nov 2, 2013
206
35
91
Cool!

Seems like GCN on all consoles is finally starting to pay off with interests. There is still so much value on those 3 years old 7970 chips and nvidia's "superior" drivers are nowhere to be seen. :) To bad amd's marketing suck so badly and nvidia's is in whole new level. Despite mining bloom and everything, nv cards are selling like hotcakes but amd is struggling to move inventory out without selling on abysmal profit.