PCPERDX12 GPU and CPU Performance Tested: Ashes of the Singularity Benchmark

Page 11 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Do you think Project CARS is a GW game? It is not part of the GW suite, but many here claim it is a GW game. In any case, the developers say both Nvidia and AMD had access during development:

I have asked for a GW title that AMD had access to before beta and was allowed to contribute optimized code for. If Project CARS is not a GW game, why did you mention it??
 

Mercennarius

Senior member
Oct 28, 2015
466
84
91
We are at the mercy of review sites to figure out the current state of Ashes, I guess. And apparently none of them are going to bother checking. All we'll get are some Russian and German sites. The first big news was one chart from a German site well after release; the current one is an article that just popped up this month from a Russian site.

I am not even certain about those DX11 gains from AMD. There is no way for that to be so close to DX12 for them, IMO, unless something changed in the game.

You can just purchase the game on Steam right now if you want to see for yourself. I have it and get about 29 FPS on my 280X with DX12 and about 22 FPS with DX11 on the latest update (.63).
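For what it's worth, those two figures work out to roughly a 30% uplift. A quick sanity check, using only the FPS numbers quoted above:

```python
# DX12 vs DX11 FPS from the post above (280X, update .63).
dx12_fps = 29.0
dx11_fps = 22.0

# Relative speedup of DX12 over DX11.
speedup_pct = (dx12_fps / dx11_fps - 1) * 100
print(f"DX12 is about {speedup_pct:.1f}% faster than DX11")  # about 31.8%
```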
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
I have asked for a GW title that AMD had access to before beta and was allowed to contribute optimized code for. If Project CARS is not a GW game, why did you mention it??

You claimed that Project CARS was a GW game in this post, now you are claiming it is not? Why the flippy floppy, AtenRa? :'(
 

PhonakV30

Senior member
Oct 26, 2009
987
378
136
AMD is not allowed to optimize the game engine for their cards if the game is a GW game? Rofl! Only GW is off-limits, not the whole game engine. GW is a tiny part of the game engine, so AMD should have access to the source code.
 

mysticjbyrd

Golden Member
Oct 6, 2015
1,363
3
0
AMD is not allowed to optimize the game engine for their cards if the game is a GW game? Rofl! Only GW is off-limits, not the whole game engine. GW is a tiny part of the game engine, so AMD should have access to the source code.
A small part can have an enormous impact. God Rays in Fallout 4 can easily impact performance by over 30%!

AMD isn't allowed to see the source code of GW. The developers aren't even allowed to optimize for AMD if it hurts Nvidia's performance, either.

It's purely a monopolistic tool to sabotage the competition and further control the market. I really can't see how an intelligent person could defend this...
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
You claimed that Project CARS was a GW game in this post, now you are claiming it is not? Why the flippy floppy, AtenRa? :'(

Because nvidia is flippy flopping, and people are getting confused.
169s18x.jpg

source: anandtech

Nice callout btw. :sneaky:
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
First time I have seen someone suggest this on here.

Bottom line is certain Kepler cards are SLOWER under Directx12. How much slower doesn't matter because slower=bad.

That is notable, and there is no spinning or sugar-coating that will fix it. Only Nvidia drivers will fix it, I assume.

Oxide is responsible for the performance under DX12, and they don't care about nVidia users.

nVidia can't fix anything. It's a low-level API, after all.
 

werepossum

Elite Member
Jul 10, 2006
29,873
463
126
I hope nobody is surprised at this..
Well - I was. Supposedly AMD had designed for DX12, yet it appears that the relative balance of power is largely unchanged. (Which is a good thing considering that NVidia has vastly more market share; we don't need a new technology that hurts the majority of gamers.)

Well, this shows me DX12 isn't going to change the world. Here we have the DX12 poster-child game, designed to be beyond what DX11 can render, and it's had lots of effort put into it by both GPU manufacturers. What has all that hassle and low-level coding bought? 10% faster on either manufacturer's GPUs, at best?

Sure, DX12 is a minor improvement, but so was DX10 over DX9 and DX11 over DX10. It's not looking like it's going to do the magical things many have claimed.
That's a good point - unfortunately. Nonetheless I'm still hoping that as developers gain more experience with DX12 and the new API, we see the promised huge improvements. Especially since it's apparent that NVidia isn't going to be harmed by it.
 

Spjut

Senior member
Apr 9, 2011
931
160
106
Is Aots DX12 feature level 11_0 only or does it support higher feature levels?
 

linkgoron

Platinum Member
Mar 9, 2005
2,598
1,238
136
Kepler below the 780/Titan gets crushed in Directx12 mode:

http--www.gamegpu.ru-images-stories-Test_GPU-strategy-Ashes_of_the_Singularity-test-Ashes_1920_extr.jpg


Tahiti gets a nice boost.

I'm not so sure about those numbers. Look here for example:

1080p.png


All cards get higher FPS, the Fury X gets much higher scores.
 
Last edited:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Games with mixed DX11 + DX12 engines won't see the full benefit of DX12, much like how games targeting PC + Xone + PS4 + 360 + PS3 don't see the full benefit of the newer consoles and PC hardware.

DX11 just becomes the new lowest common denominator, and it will stay that way for a while.

Mixed DX11/DX12 engines will see the most benefit in CPU-bound situations on slower CPUs, as the press has been reporting for a while now. The really cool stuff, like games that natively push 5x-10x higher draw call counts, will require DX12-only engines, and that won't happen for a while yet. GPU-bound tests like the above are going to come out a lot like Mantle tests did: a small benefit when GPU bound and a larger one when CPU bound.

It's actually quite atypical that we can see benefits from a new DX API this early in its lifecycle. Let's not get greedy: this time we get some of the benefits up front with more to come, as opposed to the traditional very little up front and everything on the back end (DX9, DX10, DX11).
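The CPU-bound vs GPU-bound distinction can be sketched with a toy model: a frame can't finish faster than its slowest stage, so cutting CPU submission cost only helps until you hit the GPU ceiling. The numbers below are purely illustrative, not measured from the game:

```python
def fps(cpu_ms, gpu_ms):
    # A frame is limited by whichever stage takes longer.
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 20.0                           # GPU budget per frame: a 50 FPS ceiling
dx11_cpu_ms, dx12_cpu_ms = 25.0, 10.0   # assume DX12 slashes driver/submission cost

print(fps(dx11_cpu_ms, gpu_ms))   # CPU-bound under DX11: 40.0 FPS
print(fps(dx12_cpu_ms, gpu_ms))   # capped at the GPU ceiling under DX12: 50.0 FPS
```

On a slower CPU (say 35 ms per frame under DX11), the same API change yields a much larger relative gain, which matches why the biggest DX12 wins keep showing up on weaker CPUs.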
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Well, this shows me DX12 isn't going to change the world. Here we have the DX12 poster-child game, designed to be beyond what DX11 can render, and it's had lots of effort put into it by both GPU manufacturers. What has all that hassle and low-level coding bought? 10% faster on either manufacturer's GPUs, at best?

Sure, DX12 is a minor improvement, but so was DX10 over DX9 and DX11 over DX10. It's not looking like it's going to do the magical things many have claimed.

DX12 does offer a few things not likely shown in the benchmarks given. Frame times under DX12 are likely more consistent, and the minimums are likely not as low. And as usual, benchmarks are always run on high-end systems; those with aging CPUs may find a bigger improvement, particularly in the minimums.
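One way to see why consistency matters: average FPS can hide spikes that percentile ("1% low") metrics catch. A minimal sketch with made-up frame-time traces (nothing here is measured from Ashes):

```python
import statistics

def avg_fps(frame_ms):
    return 1000.0 / statistics.mean(frame_ms)

def one_percent_low_fps(frame_ms):
    # FPS at the 99th-percentile (near-worst) frame time.
    worst = sorted(frame_ms)[int(len(frame_ms) * 0.99)]
    return 1000.0 / worst

steady = [16.7] * 100                 # consistent pacing
spiky = [14.0] * 90 + [41.0] * 10     # same average, periodic stutter

print(round(avg_fps(steady), 1), round(avg_fps(spiky), 1))   # ~59.9 for both
print(round(one_percent_low_fps(spiky), 1))                  # ~24.4: the stutter shows up
```

Two traces with identical average FPS can feel completely different to play, which is why minimums and frame-time consistency are worth reporting alongside the averages.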
 

jpiniero

Lifer
Oct 1, 2010
16,493
6,987
136
Well, this shows me DX12 isn't going to change the world. Here we have the DX12 poster-child game, designed to be beyond what DX11 can render, and it's had lots of effort put into it by both GPU manufacturers. What has all that hassle and low-level coding bought? 10% faster on either manufacturer's GPUs, at best?

Sure, DX12 is a minor improvement, but so was DX10 over DX9 and DX11 over DX10. It's not looking like it's going to do the magical things many have claimed.

It's never really been a benefit for extreme high end processors like the 5960X... slower stuff is what benefits.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
It's never really been a benefit for extreme high end processors like the 5960X... slower stuff is what benefits.

You noticed how every new console port as of late has massive frame drops, even on very high-end systems? It's because consoles can stream textures to the GPU without the GPU having to stop what it's doing. So yes, we won't get huge improvements in FPS, but if the drops stop (and I get ~10% on top), that will be more than enough for me.

The video explains how the console API differs from DX11 and why we need DX12, so it's worth watching from the beginning.
https://www.youtube.com/watch?feature=player_detailpage&v=H1L4iLIU9xU#t=941
 

Goatsecks

Senior member
May 7, 2012
210
7
76
These results validate that the efforts AMD put into Mantle/Vulkan are beginning to pay off. Without Mantle/Vulkan/DX12, AMD could never have caught up to Nvidia, given that Nvidia has a higher R&D budget and focuses primarily on GPUs, while AMD, with a smaller R&D budget, has to cover both high-performance x86 CPU cores (Zen/Zen+) and GPUs (GCN). DX12 levels the playing field for AMD against Nvidia.

I agree @raghu78!

AMD was criticized for designing GCN to be so future-focused, with poor efficiency for current workloads, but they really had no choice. They don't have the budget to constantly come up with new uarchs. GCN was made to last, and it's made to excel with a new API beyond DX11.

They gambled on GCN and Mantle's success, later giving Mantle away for free to various players (Apple, Khronos), which would force MS to also adopt the approach or fall behind. A good move given their financial situation.

Looks like AMD's DX12 strategy is really starting to pay off. :awe:
 
Last edited:

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
All cards get higher FPS, the Fury X gets much higher scores.

But why would Tonga lose frames? It's GCN1.2, yet the 390 gains frames.

It's too bad they used 2GB cards for the 380 and 960.

I suspect that the 380 would get a boost as well if it had 4GB of VRAM.
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
TechSpot's results use an older game build and older drivers than GameGPU's...

Edit - Not sure about TechSpot's game version, as they carelessly leave it out.
 
Last edited:

Azix

Golden Member
Apr 18, 2014
1,438
67
91
TechSpot's results use an older game build and older drivers than GameGPU's...

Edit - Not sure about TechSpot's game version, as they carelessly leave it out.

Wouldn't this suggest their scores should be lower?

One issue with GameGPU might be their mention of Fraps and Afterburner. Neither needs to be running to benchmark this game with its built-in tool, but I don't know what the text says.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
But why would Tonga lose frames? It's GCN1.2, yet the 390 gains frames.

It's too bad they used 2GB cards for the 380 and 960.

I suspect that the 380 would get a boost as well if it had 4GB of VRAM.

With reduced settings, which are the ones those cards should be using, they could show more improvement under DX12.