ComputerBase Dirt Rally Benchmark


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Better question: Why is the 290X below the 970 in that graph when the 290X is faster?

That's something users brought up on that site. Their answer was that they sort the graphs by minimum fps, and the numbers are rounded for the purposes of the graph. It could mean, for example, that the 290X got a 54.5-55.0 fps minimum while the 970 got a 55.1-55.4 fps minimum.
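As a rough illustration of how that plays out, here's a tiny sketch (with made-up fps numbers) of a chart that sorts by the exact minimum but labels the bars with rounded values:

```python
# Made-up numbers: sort by the exact minimum fps, but display rounded values,
# the way the site's chart apparently does.
cards = [
    {"name": "R9 290X", "min_fps": 54.9, "avg_fps": 63.4},
    {"name": "GTX 970", "min_fps": 55.3, "avg_fps": 61.8},
]

# Sorting key is the unrounded minimum, so the 970 lands above the 290X...
cards.sort(key=lambda c: c["min_fps"], reverse=True)

for c in cards:
    # ...even though both rows display the same rounded "55" minimum.
    print(f"{c['name']}: min {round(c['min_fps'])} fps, avg {round(c['avg_fps'])} fps")
```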
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
GameWorks effects are like Baskin-Robbins or Dove: better than your average ice cream or chocolate, but not Gelato or Godiva.
They offer sweet features in a less-than-ideal multi-platform world. They may not be the sweetest or most ideal, but some of us don't let idealism be the enemy of the good.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,541
10,167
126
Open-source development means you are required to spend more money and resources, and put in-house talent/brains to use, which is what's required to make true next-gen games from the ground up.

considering no GameWorks game of 2015 (or ever) looks as good as SW:BF, an open-source game.

Methinks that you are using the term "open-source" incorrectly. I think that you should be using the term "in-house", as opposed to "GPU vendor proprietary".

Unless, of course, I can get the full source code to SW:BF. Got links?
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
Are there special features you can enable on AMD cards, like HairWorks?


From the AMD TressFX 2.0 release.

https://community.amd.com/community/gaming/blog/2015/05/12/tressfx-hair-cross-platform-and-v20

TressFX Hair was certainly impressive from a visual perspective, but less discussed is the operational efficiency that compels appreciation on both technical and philosophical grounds. To put a fine point on that, we wanted to illustrate the actual performance impact of AMD’s TressFX Hair contrasted against NVIDIA’s Hairworks.

In the below diagram, we isolated the specific routine that renders these competing hair technologies and plotted the results. The bars indicate the portion of time required, in milliseconds, to render the hair from start to finish within one frame of a user’s total framerate. In this scenario, a lower bar is better as that demonstrates quicker time to completion through more efficient code.


[Image: tfx_tr_perf.png - TressFX vs. HairWorks hair render time per frame]


In the diagram, you can see that TressFX Hair exhibits an identically low performance impact on both AMD and NVIDIA hardware at just five milliseconds. Our belief in “doing the work for everyone” with open and modifiable code allowed Tomb Raider’s developer to achieve an efficient implementation regardless of the gamer’s hardware.

In contrast, NVIDIA’s Hairworks technology is seven times slower on AMD hardware with no obvious route to achieve cross-vendor optimizations as enabled by open access to TressFX source. As the code for Hairworks cannot be downloaded, analyzed or modified, developers and enthusiasts alike must suffer through unacceptably poor performance on a significant chunk of the industry’s graphics hardware.

With TressFX Hair, the value of openly-shared game code is clear.
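To put those per-pass millisecond figures into frame-rate terms, here's a rough sketch. Only the 5 ms TressFX pass and the "seven times slower" figure come from the quoted post; the 12 ms baseline frame time is an illustrative assumption:

```python
# Rough sketch: what a per-frame hair pass costs in fps.
# 5 ms and "7x slower" are from the quoted post; the 12 ms baseline frame
# time is an assumption for illustration only.
def fps_with_hair(base_frame_ms, hair_pass_ms):
    """Frame rate once the hair pass is added to the rest of the frame."""
    return 1000.0 / (base_frame_ms + hair_pass_ms)

base_ms = 12.0                               # ~83 fps with hair disabled (assumed)
print(round(fps_with_hair(base_ms, 0.0)))    # 83  - no hair rendering
print(round(fps_with_hair(base_ms, 5.0)))    # 59  - a 5 ms TressFX-style pass
print(round(fps_with_hair(base_ms, 35.0)))   # 21  - a pass seven times slower than that
```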
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Also, I don't think AMD is big on going out there, helping with a game, then branding features in the game and calling it "AMD Hairworks!!!!!"

I think AMD usually just works with devs to make the game better, not brand a bunch of settings so forum warriors can say "Nvidia let me enable Ultra Godrays!!! You know, the setting I can only use at 1080p with a GTX 980 Ti!"

Whereas in an AMD game, the godrays would simply be there, work well for everyone, and not be some insanely performance-impacting feature for one small effect.

But I'm sure people have more evidence to back this up.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
There's a big difference, actually: Project Cars is unplayable on most AMD cards, while this is at least playable on the cards it should be playable on. This game is clearly extremely unbalanced, but not in the same way as GameWorks titles. You guys are just ignoring the actual problem with GameWorks titles.

So, AMD cards are not as good as nVidia's and you blame the game? This doesn't make sense.

A GTX970 with an OC (around 1250MHz) gets 35 FPS in 4K with 20 CPU-controlled cars and dry conditions in Project Cars: http://www.computerbase.de/2015-05/...diagramm-grafikkarten-benchmarks-in-3840-2160

The same card gets the same frames in Dirt:Rally with only one car on the track (the GTX970 is not overclocked here, so you can add ~15%): http://www.computerbase.de/2015-12/dirt-rally-benchmarks/2/#diagramm-dirt-rally-3840-2160

With 20 other cars on the track, Dirt:Rally wouldn't even be close to playable on an nVidia card.
 

PhonakV30

Senior member
Oct 26, 2009
987
378
136
sontin, here:

Project Cars, 1920x1080:
R9 290X = 42.2 fps
GeForce GTX 970 = 74.2 fps
74.2 / 42.2 = 1.76 (GTX 970 is 76% faster than the R9 290X)

Dirt Rally, 1920x1080:
AMD R9 290X = 92.6 fps
GeForce GTX 970 = 80.4 fps
92.6 / 80.4 = 1.15 (R9 290X is 15% faster than the GTX 970)

That is a huge difference.
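For anyone checking the arithmetic, here is the same comparison in a few lines of Python (fps numbers as quoted above):

```python
# Relative performance from the fps figures quoted above.
def pct_faster(a, b):
    """How much faster card a is than card b, in percent."""
    return (a / b - 1.0) * 100.0

# Project Cars, 1920x1080
print(round(pct_faster(74.2, 42.2)))  # 76 -> GTX 970 ~76% faster than the R9 290X

# Dirt Rally, 1920x1080
print(round(pct_faster(92.6, 80.4)))  # 15 -> R9 290X ~15% faster than the GTX 970
```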
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
That's the reason why I used the 4K numbers.
More cars on the track mean more draw calls. nVidia's DX11 driver can push nearly twice as many draw calls as AMD's, and AMD cards are limited by AMD's bad DX11 driver.
Reducing the game's quality settings, which reduces the number of draw calls, helps AMD cards: http://www.hardware.fr/articles/937-20/benchmark-project-cars.html

Project Cars is well optimized. It is not the fault of the developer that AMD's drivers are worse than nVidia's and that an nVidia-optimized game runs better on nVidia hardware.

BTW: The difference between the GTX970 and 290X in 4K is exactly the same in both games:
Project Cars: GTX970 33% faster
Dirt:Rally: 290X 33% faster
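For context on the draw-call argument, here's a back-of-envelope sketch of how a draw-call-bound scene caps fps differently per driver. All numbers below are illustrative assumptions (the 2x throughput gap just mirrors the claim above), not measurements:

```python
# Back-of-envelope: if a driver can submit N draw calls per second and each
# car on track adds some draw calls per frame, the achievable frame rate in a
# draw-call-bound scene is roughly N / calls_per_frame.
def fps_cap(driver_calls_per_sec, base_calls, per_car_calls, cars):
    calls_per_frame = base_calls + per_car_calls * cars
    return driver_calls_per_sec / calls_per_frame

# Illustrative throughputs only; the 2x gap mirrors the claim above.
AMD_DX11_CALLS_PER_SEC = 1_000_000
NV_DX11_CALLS_PER_SEC = 2_000_000

print(round(fps_cap(AMD_DX11_CALLS_PER_SEC, 3000, 600, 20)))  # ~67 fps cap
print(round(fps_cap(NV_DX11_CALLS_PER_SEC, 3000, 600, 20)))   # ~133 fps cap
```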
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
So, AMD cards are not as good as nVidia's and you blame the game? This doesn't make sense.

A GTX970 with an OC (around 1250MHz) gets 35 FPS in 4K with 20 CPU-controlled cars and dry conditions in Project Cars: http://www.computerbase.de/2015-05/...diagramm-grafikkarten-benchmarks-in-3840-2160

The same card gets the same frames in Dirt:Rally with only one car on the track (the GTX970 is not overclocked here, so you can add ~15%): http://www.computerbase.de/2015-12/dirt-rally-benchmarks/2/#diagramm-dirt-rally-3840-2160

With 20 other cars on the track, Dirt:Rally wouldn't even be close to playable on an nVidia card.

Yes, because people are going to play a racing game at 4K, 30 FPS (or less), with a single 970 or 290X. Really realistic and relevant comparison there!

BTW: The difference between the GTX970 and 290X in 4K is exactly the same in both games:
Project Cars: GTX970 33% faster
Dirt:Rally: 290X 33% faster

See, this would be a good point if the 970 and 290X performed exactly the same at 4k (which, again, is irrelevant for these cards anyway) in a typical scenario. However, that's not the case; the 290X is normally 13% faster than the 970 at 4k:

[Image: perfrel_3840_2160.png - relative performance summary at 3840x2160]


So, more accurately:

Dirt: Rally - 290X gains 20% over norm
Project Cars - 970 gains 46% over norm

Your point only really makes sense if Project Cars is the most CPU-intensive game ever released for reasons not related to the forced CPU-heavy PhysX.
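A quick sketch of the "gain over norm" arithmetic. The figures above add/subtract the percentages directly (33 - 13 = 20, 33 + 13 = 46); the ratio-based version below lands in the same ballpark:

```python
# "Gain over norm": how far a card's showing in one game deviates from its
# usual 4K standing. The post's 20%/46% figures add/subtract percentages
# directly; the ratio form here gives roughly +18% and +50%.
def gain_over_norm(in_game_ratio, norm_ratio):
    return (in_game_ratio / norm_ratio - 1.0) * 100.0

norm_290x_vs_970 = 1.13         # 290X normally ~13% faster than the 970 at 4K
norm_970_vs_290x = 1.0 / 1.13   # the same norm seen from the 970's side

print(round(gain_over_norm(1.33, norm_290x_vs_970)))  # 18 -> Dirt Rally, where the 290X is 33% faster
print(round(gain_over_norm(1.33, norm_970_vs_290x)))  # 50 -> Project Cars, where the 970 is 33% faster
```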
 
Last edited:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Maybe I'm missing something...

Is the argument that AMD cards can't use the Nvidia features, that current cards are not fast enough to use the extra GameWorks features, or that the GameWorks features make AMD cards perform slowly?

Or all of the above?

Fill me in.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
So the answer to my question is "current cards are not fast enough to handle gameworks"?

That's what you guys are pissed about?

You're trying to play with words. The issue is that the features are purposefully poorly optimized (evidenced by the fact that they often completely destroy frame times and minimum frame rates) and that they make the main game harder to optimize in many cases, and the only party that really benefits in the end is Nvidia, because people are forced to buy their flagship every year to game at 1080p with better-than-console graphics. If they wanted to, they could make the features run better across more hardware; however, they don't. GameWorks exists for no other reason than Nvidia's planned obsolescence, and it takes advantage of the fact that some devs are lazy and can use those features as a selling point. These are facts. I'm not going to explain it again, so if you're going to continue with the willful ignorance, just don't reply.
 
Last edited:

tg2708

Senior member
May 23, 2013
687
20
81
I'll have to side with Techhog since his posts are insightful and just make sense. But correct me if I'm wrong; we're all here to be informed, right? To my understanding, GameWorks is a crapshoot, mainly because it degrades performance with a minimal increase in graphics fidelity. GameWorks, it seems, was developed mainly to help lazy developers (rather than working alongside them to make sure it's properly optimized, it just gets injected into the game's code with no care) and to hurt the competition. But my question is: why add such features when only the top two cards of this architecture (Maxwell 2) can run them at a somewhat decent frame rate, and even then performance tanks? To me, if a feature marketed as next-gen takes me from 60 to 45 fps with no discernible difference in IQ, while also tanking the performance of cards that were also marketed as supporting said features, it needs to be dropped, plain and simple.

I tried the Dying Light demo for the first time last night, and my conclusion is that GameWorks sucks: the two features that tanked the game's performance were the ones with Nvidia's name beside them. With those features turned off I went from 35-ish minimums to 60 fps at minimum. Granted, there will be areas that cause drops below 60 fps, but the performance gain was quite high, and the game did not look much different (or I'm just blind). This was at 1440p, stock 290, map size I think was a notch down, and all other settings were maxed for the demo.
 
Last edited:

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
See, this would be a good point if the 970 and 290X performed exactly the same at 4k (which, again, is irrelevant for these cards anyway) in a typical scenario. However, that's not the case; the 290X is normally 13% faster than the 970 at 4k:

So, more accurately:

Dirt: Rally - 290X gains 20% over norm
Project Cars - 970 gains 46% over norm

This benchmark suite doesn't include any racing game. Using it to make a statement about a racing game is misleading.
If we use just Dirt:Rally and Project Cars, then the summary would be: there is no difference in performance between nVidia and AMD. If you include other games like Formula 1 2015 and Assetto Corsa, the average performance would stay the same, but Dirt:Rally and Project Cars would each show more and more of a bias towards one company.

In the end there is no difference between Dirt:Rally and Project Cars.

Your point only really makes sense if Project Cars is the most CPU-intensive game ever released for reasons not related to the forced CPU-heavy PhysX.

Project Cars eats CPUs. With ultra settings it generates a huge number of draw calls when you have other cars on the track.
 

dark zero

Platinum Member
Jun 2, 2015
2,655
140
106
sontin, here:

Project Cars, 1920x1080:
R9 290X = 42.2 fps
GeForce GTX 970 = 74.2 fps
74.2 / 42.2 = 1.76 (GTX 970 is 76% faster than the R9 290X)

Dirt Rally, 1920x1080:
AMD R9 290X = 92.6 fps
GeForce GTX 970 = 80.4 fps
92.6 / 80.4 = 1.15 (R9 290X is 15% faster than the GTX 970)

That is a huge difference.
And better not to start talking about the console versions, which ended up being a disaster and literally destroyed the game...
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
This benchmark suite doesn't include any racing game. Using it to make a statement about a racing game is misleading.
If we use just Dirt:Rally and Project Cars, then the summary would be: there is no difference in performance between nVidia and AMD. If you include other games like Formula 1 2015 and Assetto Corsa, the average performance would stay the same, but Dirt:Rally and Project Cars would each show more and more of a bias towards one company.

In the end there is no difference between Dirt:Rally and Project Cars.


Project Cars eats CPUs. With ultra settings it generates a huge number of draw calls when you have other cars on the track.

*waits patiently for you to back up these statements*

EDIT: Nevermind, found some.

[Image: F1-2014-4K-3840x2160-Ultra-Benchmark.jpg - F1 2014 benchmark at 4K Ultra]


[Image: 7diwc29.png]


And my point is proven. Do you have anything else to add? :)
 
Last edited:
Aug 11, 2008
10,451
642
126
You're trying to play with words. The issue is that the features are purposefully poorly optimized (evidenced by the fact that they often completely destroy frame times and minimum frame rates) and that they make the main game harder to optimize in many cases, and the only party that really benefits in the end is Nvidia, because people are forced to buy their flagship every year to game at 1080p with better-than-console graphics. If they wanted to, they could make the features run better across more hardware; however, they don't. GameWorks exists for no other reason than Nvidia's planned obsolescence, and it takes advantage of the fact that some devs are lazy and can use those features as a selling point. These are facts. I'm not going to explain it again, so if you're going to continue with the willful ignorance, just don't reply.

Are these "facts" or your inferences and opinion? Do you have documented proof of these accusations? You are certainly entitled to your opinions, but I dont see any proof that they are "facts". You can certainly give framerates on various games, but personally, I dont have an inside contact in nVidia to tell me their motivations. Do you?
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Are these "facts" or your inferences and opinion? Do you have documented proof of these accusations? You are certainly entitled to your opinions, but I dont see any proof that they are "facts". You can certainly give framerates on various games, but personally, I dont have an inside contact in nVidia to tell me their motivations. Do you?

There's literally no other explanation beyond Nvidia's programmers being completely incompetent.
 

finbarqs

Diamond Member
Feb 16, 2005
3,617
2
81
GTX 780 with an i7-4770K hits about 70 fps, dropping to about 45 at the low, during the benchmark... I was going to upgrade to the GTX 980 Ti, but I'm waiting for the next round of video cards.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I'll have to side with Techhog since his posts are insightful and just make sense. But correct me if I'm wrong; we're all here to be informed, right? To my understanding, GameWorks is a crapshoot, mainly because it degrades performance with a minimal increase in graphics fidelity. GameWorks, it seems, was developed mainly to help lazy developers (rather than working alongside them to make sure it's properly optimized, it just gets injected into the game's code with no care) and to hurt the competition. But my question is: why add such features when only the top two cards of this architecture (Maxwell 2) can run them at a somewhat decent frame rate, and even then performance tanks? To me, if a feature marketed as next-gen takes me from 60 to 45 fps with no discernible difference in IQ, while also tanking the performance of cards that were also marketed as supporting said features, it needs to be dropped, plain and simple.

I tried the Dying Light demo for the first time last night, and my conclusion is that GameWorks sucks: the two features that tanked the game's performance were the ones with Nvidia's name beside them. With those features turned off I went from 35-ish minimums to 60 fps at minimum. Granted, there will be areas that cause drops below 60 fps, but the performance gain was quite high, and the game did not look much different (or I'm just blind). This was at 1440p, stock 290, map size I think was a notch down, and all other settings were maxed for the demo.


Just read the recent posts. It's only going to get worse. People are perfectly fine with a GTX 980 Ti doing 60 fps at 1080p, as long as Nvidia > AMD in a game, because at least they chose the right GPU vendor!

The argument is completely lost on people here because they're too focused on Nvidia vs. AMD to realize that they're purchasing a faster Nvidia GPU than ever before to play games that are graphically inferior to what they were playing before. And they're happy to do it.

Gamers are such natural fanboys that we defend AC Unity, Fallout 4 (even after they removed a lot of shadows from the game), Batman: Arkham Knight, etc., and even run out to purchase the sequels no matter what. Do you really expect game quality to improve? The rates of preordering have only increased, despite people being burned more often than ever before.

Now, we're even paying to play games during their development?

We're just going to keep getting lower and lower quality products, and really the gaming community as a whole only has itself to blame for being so team-centric that it can't even demand better quality from everyone.

It's the same parallel as in politics. People are so caught up in Democrat vs. Republican that they don't realize both sides have massive issues (although with Trump on one side... just lol).

So don't expect it to get any better.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
So what's your plan? What's your constructive criticism?

Constructive criticism for Nvidia: I would like to see the GameWorks features be more open, like they did with CPU PhysX, to improve adoption.

Constructive criticism for AMD: I would like to see them innovate more and improve the experience with features like TressFX.

Developers' major focus is on consoles, so it is paramount for the independent hardware vendors to differentiate and innovate to improve the PC experience.

Constructive criticism for developers: I would like them to invest more in the PC platform and give it more of a major focus.

If PC gaming continues to grow, developers may shift more focus to the PC platform; I'm looking forward to that day.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
So what's your plan?

Spend your money wisely; it's your only vote. I will probably buy this game pretty soon myself, but I was always going to get it. I might also get the Star Wars game, and that had zero appeal to me before launch.