PCPer DX12 GPU and CPU Performance Tested: Ashes of the Singularity Benchmark


tential

Diamond Member
May 13, 2008
7,348
642
121
Yea, that's my recollection as well. I thought it was interesting to share here due to the enormous amount of hoo-ha regarding Maxwell being hamstrung on the DX12/async compute front.

There was A LOT of crazy speculation.
I feel bad for 980 Ti owners who threw it out of the window for a Fury X. Looks like they're throwing the Fury X out the window and hoping their 980 Ti is still there.
 
Feb 19, 2009
10,457
10
76
The problem with NV's cards was that they had worse performance in DX12 than in DX11.

Now looking at this new benchmark,

Comparing *only the older drivers'* results, this no longer seems to be the case: DX12 is now faster than DX11 even on the old drivers. So what's happening? Is this a new benchmark version?

Reading that ComputerBase article, it seems they are using a new version of the benchmark with performance improvements that don't depend on the new drivers, as well as the bug fixes they mentioned.

Here you can compare to their old benchmark: http://www.computerbase.de/2015-08/...diagramm-ashes-of-the-singularity-1920-x-1080

Nothing much has changed except every GPU is faster across the board.

Note that the Fury X is normally 10-20% behind the 980 Ti at 1080p in current games (it's well known that it sucks at low resolutions). Here it's matching it under DX12. Oxide has said they only use a small amount of AC compared to some console devs they know, who use up to 30% of it. I'm going to guess it's similar to Fable, ~10-15%. That alone would allow the Fury X to catch up at 1080p.
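To put rough numbers on that, here is a minimal Python sketch; the 10-20% gap and the ~10-15% async compute gain are just the guesses from this post, not measured figures:

```python
# Back-of-envelope estimate using the guesses from the post above:
# the Fury X trails the 980 Ti by 10-20% at 1080p without async compute,
# and async compute might add roughly 10-15% on GCN in this title.

def remaining_deficit(base_gap_pct, async_gain_pct):
    """Deficit (%) of the Fury X vs the 980 Ti after an assumed AC speedup."""
    fury_x = 100.0 - base_gap_pct              # normalise the 980 Ti to 100
    fury_x *= 1.0 + async_gain_pct / 100.0     # apply the assumed async gain
    return 100.0 - fury_x                      # positive = still behind

for base_gap in (10, 15, 20):
    for async_gain in (10, 15):
        print(f"gap {base_gap:2d}% + AC gain {async_gain:2d}% -> "
              f"deficit {remaining_deficit(base_gap, async_gain):+.1f}%")
```

Under those assumptions, a 10-15% async compute gain is roughly enough to erase a 10-15% deficit, which lines up with the Fury X pulling level at 1080p.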

This is the new 1080p data:
[image: new 1080p benchmark chart]


Compare to the old 1080p data:
[image: old 1080p benchmark chart]


The order of the GPUs hasn't changed. NV GPUs perform better at 1080p, and the Fury X matches the 980 Ti.

Old 1440p data:
[image: old 1440p benchmark chart]


Old 4K data:
[image: old 4K benchmark chart]


No new 1440p or 4K data to compare?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I think what we can take away from all this is we need to just sit back and wait until these games are released before making judgements. A lot is going to change with game code and drivers.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
You have the 1080p results flipped but those are pretty huge gains compared to the old bench.

Hilarious that some of our forum members think the game is basically in its final stage when there is still a lot of performance to be gained from updates to the game.

Dunno why people believe beta = final product.
 
Feb 19, 2009
10,457
10
76
I think what we can take away from all this is we need to just sit back and wait until these games are released before making judgements. A lot is going to change with game code and drivers.

Now that we know Maxwell doesn't do parallel graphics & compute in AC mode, thanks to the GPUView investigations, the DX12 situation could be described as similar to the DX11 era with tessellation, where AMD GPUs were worse at it than NV GPUs.

Recall that if games use a lot of tessellation, AMD GPUs run it worse.

The question isn't judgement or anything like that; it's all about how frequently DX12 games use AC and how much of it they use. That will determine how the benchmarks stack up.

For example, if it's a UE4 game, I still expect NV to be faster and no AC to be used (except for Fable, due to its heavily customized engine). Imagine ARK in DX12: NV should still have a massive lead thanks to GameWorks.
 
Feb 19, 2009
10,457
10
76
You have the 1080p results flipped but those are pretty huge gains compared to the old bench.

Hilarious that some of our forum members think the game is basically in its final stage when there is still a lot of performance to be gained from updates to the game.

Dunno why people believe beta = final product.

The only interesting thing out of these benches (including Fable) is the side investigations done by users regarding Maxwell's lack of AC in hardware.

Whether it has an impact depends on whether games use it and how much of it they use.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Now that we know Maxwell doesn't do parallel graphics & compute in AC mode, thanks to the GPUView investigations, the DX12 situation could be described as similar to the DX11 era with tessellation, where AMD GPUs were worse at it than NV GPUs.

Recall that if games use a lot of tessellation, AMD GPUs run it worse.

The question isn't judgement or anything like that; it's all about how frequently DX12 games use AC and how much of it they use. That will determine how the benchmarks stack up.

For example, if it's a UE4 game, I still expect NV to be faster and no AC to be used (except for Fable, due to its heavily customized engine). Imagine ARK in DX12: NV should still have a massive lead thanks to GameWorks.

One aspect of a GPU does not determine the end result. We'll have to wait and see just how much AC is used, and how big that weakness is. It's not like Nvidia GPUs can't run AC; it just isn't an advantage on their GPUs.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Yea, that's my recollection as well. I thought it was interesting to share here due to the enormous amount of hoo-ha regarding Maxwell being hamstrung on the DX12/async compute front.

There was A LOT of crazy speculation.

One aspect of a GPU does not determine the end result. We'll have to wait and see just how much AC is used, and how big that weakness is. It's not like Nvidia GPUs can't run AC; it just isn't an advantage on their GPUs.

Do nVidia GPUs now run async compute? I haven't seen this. AFAIK they aren't capable. I keep asking because I might have missed it, but nobody has shown me where they got it working. Please, I'd really like to know if someone can show it.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Do nVidia GPUs now run async compute? I haven't seen this. AFAIK they aren't capable. I keep asking because I might have missed it, but nobody has shown me where they got it working. Please, I'd really like to know if someone can show it.

They can run the commands. Whether that will allow them to gain any performance is another issue, but the commands are there in one form or another.
 

Goatsecks

Senior member
May 7, 2012
210
7
76
Many people seem desperate to claim that Nvidia can or can't implement async compute, or that it merely emulates such a process. For now, I don't know and do not care.

I've said it before and I'll say it again: conclusions based on data generated from a pre-beta benchmark are likely to be erroneous or otherwise contain flaws.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Async does not need to work on the video card, as long as you have a better CPU than the one in the consoles.
Sure, it will make a difference in benchmarks, because there the CPU can run at 100% and async running on the GPU could make a difference, but in real games CPU usage is much lower, so async will not interfere with the execution of the game code.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Many people seem desperate to claim that Nvidia can or can't implement async compute, or that it merely emulates such a process. For now, I don't know and do not care.

I've said it before and I'll say it again: conclusions based on data generated from a pre-beta benchmark are likely to be erroneous or otherwise contain flaws.

I'm just wondering if they have done it at all. Through drivers, hardware, any way at all. I know they claim they can make it work, but have they?
 

littleg

Senior member
Jul 9, 2015
355
38
91
Async does not need to work on the video card, as long as you have a better CPU than the one in the consoles.
Sure, it will make a difference in benchmarks, because there the CPU can run at 100% and async running on the GPU could make a difference, but in real games CPU usage is much lower, so async will not interfere with the execution of the game code.

There will be a latency penalty for going over to the CPU. What you've said is basically a load of rubbish dressed up in a mildly technical way and presented as fact.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
I'm just wondering if they have done it at all. Through drivers, hardware, any way at all. I know they claim they can make it work, but have they?
It does not matter whether they have or not. The main thing is that without AC Nvidia is still beating AMD, so the point is that even AC cannot help AMD that much.
 

littleg

Senior member
Jul 9, 2015
355
38
91
Well, no. Take a look at the results over the whole range of cards and compare with the DX11 benches.
 

TXBPNT

Junior Member
Sep 19, 2015
19
0
36
It does not matter whether they have or not. The main thing is that without AC Nvidia is still beating AMD, so the point is that even AC cannot help AMD that much.
Beating? In the latest 1080p results from ComputerBase, the 980 Ti ties with the Fury X (72.5 vs 72.1 fps). But at the sub-$300 price point, where the real battle is, the 390 is still about 8% faster than the 970 (54.6 vs 50.6 fps).
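As a quick sanity check on those quoted averages, here is a minimal Python sketch (the fps figures are simply the ones cited above from ComputerBase, not re-measured data):

```python
# Relative gap between the two cards in each pairing, using the average-fps
# figures quoted above from the ComputerBase 1080p results.

def lead_pct(faster_fps, slower_fps):
    """Percentage lead of the faster result over the slower one."""
    return (faster_fps / slower_fps - 1.0) * 100.0

print(f"980 Ti vs Fury X:  {lead_pct(72.5, 72.1):.1f}% apart")  # ~0.6%, effectively a tie
print(f"R9 390 vs GTX 970: {lead_pct(54.6, 50.6):.1f}% ahead")  # ~7.9% lead for the 390
```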
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
So, async compute is now working? Where are the benches?

The benchmarks that show AC on are the ones that show it works. It may not run in parallel, or gain any advantage over not using it, but the commands will work, so the games will run.

It is one aspect of a GPU, one that Nvidia is clearly behind on, but that is not the only aspect of the GPU. What performance in games will be like is not something we can know until AFTER the games are released and drivers are improved.

BETA is not a time to make too many claims.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
You have the 1080p results flipped but those are pretty huge gains compared to the old bench.

Hilarious that some of our forum members think the game is basically in its final stage when there is still a lot of performance to be gained from updates to the game.

Dunno why people believe beta = final product.

I think they know very well that beta doesn't equal final product.

This has always been an effort to prop up and beat down. A cheap shot. Take a look at it. There was a whole lot of energy spent across the internet, forum after forum. Of course it is a beta product... but that was completely buried.

But let's really lay it out, because more importantly:
This is not only a campaign spent on a "beta" program; this wasn't even the game. It was actually an alpha benchmark of a game that was nowhere near launching. We are talking a year (or more) away. Yet this alpha benchmark was sent to several media outlets, resulting in a push across nearly every PC tech forum on the internet.

It may just be a coincidence that the creator of this benchmark is involved with AMD and has openly admitted to a marketing agreement. Or perhaps the plan that AMD exec claimed they have had in place is finally revealing itself. I have been pretty convinced that this isn't all some coincidence, but that is just my opinion, which I hope I am allowed to state.

Perhaps this was all just convenient, but I do believe that there was a real strategy behind the push.

For many, it is always about propping up one team and trying anything to push down another. This has been done for years, and a lot of it is a result of the passion one has for the technology and their favorite team. But my opinion is that there were seeds planted, and I think there is enough that leads back for any open-minded person to at least wonder.

It isn't strange to me that once Nvidia improves their performance, there is not much being said about it. Most of the energy spent had an opposing purpose, and Nvidia gaining performance is totally at odds with the big fuss that was being made. I fully expect that there will be other cases very similar to this; we already see some trying to pop up.

The real takeaways here:
- After all the effort to "prove" that Nvidia was simply screwed and that not having this all-important feature made them helpless, it turns out not to be the case. Nvidia did and can gain performance. Perhaps there is validity in the Nvidia response: "wait until the actual DX12 games come out."

- There is a real effort to put down Nvidia and their products; there are fans on both sides that do this endlessly.

- But whether it's DX12 drivers, console-based game engines built around GCN, or async... Nvidia is capable.

They have some of the best software engineers in the world and they are very competent.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
This is why I've said: buy whichever brand of GPU you prefer. R9 390 vs GTX 970, buy whichever you prefer. I like the R9 390 better personally, but for my bro, I'd get him a GTX 970 for LightBoost gaming.

For the high end, I like the Fury X (maybe X2!?), because I like FreeSync + 4K + big monitors. But the GTX 980 Ti is clearly the fastest card. The real overclocker's dream didn't have to let anyone know it; it just let the numbers do the talking. Not only that, DSR is actually flexible. No VSR above 4K? Is that serious? I've been trying to find info on the Fury X2 card, and then I realized that all that GPU horsepower is stuck at 4K for older games, while DSR can go as high as you want.

And woah, there are pros and cons to each vendor... so just get what works for you.
People hoping that 980 Ti owners will regret their purchase because the Fury X ends up being "superior", or trying to make it seem like Kepler owners now own dirt GPUs, is just irritating. A GPU has a long lifecycle of use, and a couple of forum posts trying to make it seem like a "bad purchase" can't possibly take in the multitude of factors that decide whether a person enjoyed a product and got good value out of it.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The benchmarks that show AC on are the ones that show it works. It may not run in parallel, or gain any advantage over not using it, but the commands will work, so the games will run.

It is one aspect of a GPU, one that Nvidia is clearly behind on, but that is not the only aspect of the GPU. What performance in games will be like is not something we can know until AFTER the games are released and drivers are improved.

BETA is not a time to make too many claims.

Why does this seem so difficult? Async compute was disabled for nVidia. It still is, isn't it? There are no benchmarks, are there? The hardware simply isn't capable of it, is it?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I think they know very well that beta doesn't equal final product.

This has always been an effort to prop up and beat down. A cheap shot. Take a look at it. There was a whole lot of energy spent across the internet, forum after forum. Of course it is a beta product... but that was completely buried.

But let's really lay it out, because more importantly:
This is not only a campaign spent on a "beta" program; this wasn't even the game. It was actually an alpha benchmark of a game that was nowhere near launching. We are talking a year (or more) away. Yet this alpha benchmark was sent to several media outlets, resulting in a push across nearly every PC tech forum on the internet.

It may just be a coincidence that the creator of this benchmark is involved with AMD and has openly admitted to a marketing agreement. Or perhaps the plan that AMD exec claimed they have had in place is finally revealing itself. I have been pretty convinced that this isn't all some coincidence, but that is just my opinion, which I hope I am allowed to state.

Perhaps this was all just convenient, but I do believe that there was a real strategy behind the push.

For many, it is always about propping up one team and trying anything to push down another. This has been done for years, and a lot of it is a result of the passion one has for the technology and their favorite team. But my opinion is that there were seeds planted, and I think there is enough that leads back for any open-minded person to at least wonder.

It isn't strange to me that once Nvidia improves their performance, there is not much being said about it. Most of the energy spent had an opposing purpose, and Nvidia gaining performance is totally at odds with the big fuss that was being made. I fully expect that there will be other cases very similar to this; we already see some trying to pop up.

The real takeaways here:
- After all the effort to "prove" that Nvidia was simply screwed and that not having this all-important feature made them helpless, it turns out not to be the case. Nvidia did and can gain performance. Perhaps there is validity in the Nvidia response: "wait until the actual DX12 games come out."

- There is a real effort to put down Nvidia and their products; there are fans on both sides that do this endlessly.

- But whether it's DX12 drivers, console-based game engines built around GCN, or async... Nvidia is capable.

They have some of the best software engineers in the world and they are very competent.

So was it better to have FCAT used and pushed all over the internet while hiding that it was actually nVidia themselves who created the software and supplied the equipment? At least the level of AMD's involvement is known. Also, there is zero prior evidence that AMD hinders developers from working with nVidia. They allow full cooperation, sharing of source code, and changing of source code, as long as it doesn't actually hinder their own performance.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
So, can I assume NV money-hatted Oxide? They saw an 11% gain at 1080p medium settings with the new build + driver update, versus AMD's 6%.

And AMD saw -3% in DX11 at 1080p medium.

Woof. AMD must not be happy that their competitor is seeing bigger gains in the game they are probably going broke funding.
 

Goatsecks

Senior member
May 7, 2012
210
7
76
This forum bristles at any opportunity to lecture any would-be reader about Nvidia's inferior and poorly maintained products (I would give examples but suspect I will be warned). One of the tools used to do this is the new DX12 benchmarks, starting with AotS.

AotS has not been released; it has not even been beta tested, and there is not a single DX12 title on the market (fully released with DX12 implemented). In spite of this, the forum strains to put emphasis on these immature DX12 benchmarks. I have even seen people giving buying advice to others that factors in Nvidia's purportedly poor DX12 performance, with no mention of DX11 in spite of it being the current standard.

On topic: this second set of benchmark data indicates, unsurprisingly, that the first set of data was not accurate. Since we are still dealing with an immature API and a pre-beta benchmark, it is reasonable to conjecture that this second set of data is similarly not going to be accurate. This is why the new data is interesting. Yet the forum is busy setting up a straw man along the lines of "marginal differences" and "AMD is still winning". I was naive to think that this new benchmark info would make people stop and think.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
There is a lot of negative writing on the wall for AMD. Long-time fans are probably grasping at every possible straw they can to spin the negative news.

They're looking for a win. And if the first DX12 titles hit the scene and NV wins the benchmarks, I think they'll make good on their threats and go console gaming. Haha.