ComputerBase & GameGPU - Rise of the Tomb Raider: DX11 vs DX12 + VXAO Tested

Oxide added the option to turn on/off Async Compute. That's a clear sign they are not anti-NVIDIA.

If they were paid to be anti-NV, they would not have such options. Think about it.

@TheELF

How did you equate this:

A DirectX 12-capable system will allow players to turn on the high-fidelity features.

To a slider to adjust unit quantity? o_O Do you understand what FIDELITY means?
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
And how exactly does the quoted material imply they will add a slider to reduce unit count?

Comparing unit quantity/draw-call count in an RTS to a tessellation factor is just BS. Tessellation is eye candy that can easily be controlled with a slider, while unit quantity is a core element of an RTS; you can't cut draw calls without compromising it, and doing so just so Maxwell doesn't look so bad would be asinine.

Do people really think for a second before smashing the keyboard like this?
 

casiofx

Senior member
Mar 24, 2015
369
36
61
Oxide doesn't give a crap about anything, they intentionally cripple performance on Nvidia hardware.

IO-Interactive doesn't give a crap about GCN3; it's not even much faster than a cheaper GCN2 card, so GCN3 users get the short end of the stick.

These companies are crap and pretty much write bad code to cripple performance on Nvidia.
Then the first one to complain would be Nvidia, but they are silent. That says a lot.

Secondly, that bolded part is an utterly hypocritical statement. Why? Because the one who has been doing exactly that for the past few years is Nvidia, with GameWorks.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
Oxide doesn't give a crap about anything, they intentionally cripple performance on Nvidia hardware.
Where do you come up with this? In Ashes, NV crushes AMD in DX11 and loses performance in DX12. Is the latter the "sabotage" you speak of? Which other game shows NV losing performance in DX12? :sneaky:

(hint: it's a Gameworks title that's the subject of this thread.)
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
And how exactly does the quoted material imply they will add a slider to reduce unit count?

Comparing unit quantity/draw-call count in an RTS to a tessellation factor is just BS. Tessellation is eye candy that can easily be controlled with a slider, while unit quantity is a core element of an RTS; you can't cut draw calls without compromising it, and doing so just so Maxwell doesn't look so bad would be asinine.

Do people really think for a second before smashing the keyboard like this?
Units are made of polygons, which are in turn made of triangles. NVIDIA GPUs have a superior GTris/s rate, so a scene with many units favors NVIDIA.

Each unit has its own light source; light sources require shaders, and shaders are compute intensive. Compute-intensive work favors AMD.

The ground is made up of tessellated, geometry bound, triangle meshes. NVIDIA have superior tessellation.

Smoke effects are compute bound. They require a shader.

AotS uses NVIDIA and AMD optimized shaders.

The advantage AMD have is not due to any deliberate bias on the part of Oxide. AMD focused their architecture on compute parallelism, and when many compute effects permeate a scene, the result is a compute-bound scenario; that in no way indicates a bias towards AMD. Look at the DX11 performance in AotS: NVIDIA dominate, because DX11 can't feed AMD GCN's CUs.

Move over to DX12 and we've got a different situation. AMD GCN can be properly fed, while NVIDIA are still executing the same shaders optimized for their architecture.

Add Async Compute + Graphics and AMD GCN can execute compute and graphics work in parallel. This isn't a bias, it's simply part of the DX12 multi-engine specifications.

NVIDIA lacking this capability is sad but that's NVIDIA lacking a truly beneficial aspect of DX12.
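To put a rough number on what that overlap buys you, here is a toy frame-time model. This is purely illustrative arithmetic under the idealized assumption of perfect overlap, not real D3D12 code, and the function name is made up:

```cpp
#include <algorithm>

// Idealized frame-time model: without async compute, the graphics and
// compute workloads run back to back on a single engine; with DX12
// multi-engine submission they can overlap, so the frame takes roughly
// as long as the larger of the two workloads.
double frame_ms(double graphics_ms, double compute_ms, bool async_compute) {
    return async_compute ? std::max(graphics_ms, compute_ms)
                         : graphics_ms + compute_ms;
}
```

With, say, 10 ms of graphics and 4 ms of compute effects per frame, serial execution gives a 14 ms frame while overlapped execution gives 10 ms, which is the kind of gain GCN's multiple compute queues are after.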
 

casiofx

Senior member
Mar 24, 2015
369
36
61
Nvidia's market share has been around 80% for a while now. Looking at these benchmarks, I can see why.

Sorry, but it's more like 50% or less.

Consider that only Maxwell can run GameWorks games well enough. Kepler cards and below are grouped with AMD, where their performance is "magically horrible".

Please explain that to Kepler owners.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
731
126
What makes Ashes of the Singularity different from other RTS games?
Until now, terrestrial strategy games have had to substantially limit the number of units on screen. As a result, these RTS's could be described as battles.

Thanks to recent technological improvements such as multi-core processors and 64-bit computing combined with the invention of a new type of 3D engine called Nitrous, Ashes of the Singularity games can be described as a war across an entire world without abstraction. Thousands or even tens of thousands of individual actors can engage in dozens of battles simultaneously.
Until now, low unit counts; now, with new tech, lots of units...
Most gamers have the necessary hardware to run Ashes. A DirectX 11, 64-bit system should do it. A DirectX 12-capable system will allow players to turn on the high-fidelity features.
Most gamers = low units = DX11; a DirectX 12-capable system = lots of units.
How was this going to work, if not with a slider?
The only other possibility would be the DX11 .exe having fewer units, without the option to have more.



P.S. Yes, Mahigan, the units in Civ5 are just as hard to display as 3D units with lights, effects, independent movement and so on...
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
The underlying problem is probably experience. AMD has a lot, because they have supported nine game releases with one or more explicit APIs. NV doesn't really have experience with this; Rise of the Tomb Raider is their first supported game with an explicit API, and NV is still trying to understand which implementation choices help performance. Most gamers are right that this patch is not good and that the DX12 implementation is mostly rubbish, but this is a start for them. NV needs a lot of time and a lot of testing before they can provide the same quality of knowledge that AMD now provides to their partners.

Sure. Or AMD's DX12 driver is just a mess and needs more optimized code to not be left behind.

Gears of War and Tomb Raider are nearly unplayable on AMD hardware.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Gears of War and Tomb Raider are nearly unplayable on AMD hardware.

Stop lying. There are people on the OcUK forums who have benchmarked the game on AMD hardware and say it runs fine, even with the latest DX12 addition. You don't believe me? Go look over there.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
@sontin

Worse performance in most of the game, same performance in one area...

If you think this is a good implementation of DX12, you are welcome to champion it. If NV sponsorship brings DX12 like that, you are all welcome to enjoy it. That's pushing the gaming industry forward for gamers to enjoy right there!! lol
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
http://www.computerbase.de/2016-03/rise-of-the-tomb-raider-directx-12-benchmark/2/

45 FPS with a Fury X in the second test. This is a CPU-heavy scene. The GTX 980 Ti is 63% faster and loses no performance with DX12.

But I guess this is AMD's experience with a low-level API. :\

Yes, like all the people who actually started playing the game and found it was smoother on both AMD and Nvidia cards with older CPUs.

You might want to fire up Tomb Raider yourself and give it a try! I assume you have it? I already have 20+ hours in it myself.

Interesting how we have ZERO frametime testing with this game. Hmm.
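Since frametime numbers keep coming up and nobody publishes them: the percentile figure reviews usually quote is trivial to compute from a frametime log. A minimal sketch (the function name and the idea of a flat list of per-frame milliseconds are my own assumptions, not any tool's actual format):

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Worst frametime within the best p fraction of frames; e.g. p = 0.99
// gives the "99th percentile" frametime. Takes the log by value
// because it sorts its own copy.
double percentile_ms(std::vector<double> frametimes, double p) {
    std::sort(frametimes.begin(), frametimes.end());
    std::size_t idx = static_cast<std::size_t>(p * (frametimes.size() - 1));
    return frametimes[idx];
}
```

Feed it a capture from any frametime logger and the high-percentile number will expose stutter that an average-FPS benchmark hides.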
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
I have Tomb Raider. Soviet at 1080p is the best-case scenario for DX12. Look at the CPU benchmarks and how well the AMD FX processor performs with the GTX 980 Ti.

Maybe AMD's DX12 driver has problems with multi-threaded rendering and the synchronisation process. That would explain the performance in Gears of War before they released the 60% driver...
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
I have Tomb Raider. Soviet at 1080p is the best-case scenario for DX12. Look at the CPU benchmarks and how well the AMD FX processor performs with the GTX 980 Ti.

Maybe AMD's DX12 driver has problems with multi-threaded rendering and the synchronisation process. That would explain the performance in Gears of War before they released the 60% driver...

Yeah, but the problem is that even on OcUK people are saying the actual benchmark is not very reliable. The performance seems to be better in-game, which is why we need some frametime graphs.

Edit to post:

The only problem is that now that I've finished the game, apart from a quick jaunt to see how DX12 fared, I won't be playing it any more.

I gave up on The Division since the GTX 960 sucks in it, so I won't be buying it. I have no choice with Ashes, since it will be a LAN game and it's out in a few weeks.

Hoping Nvidia releases a driver for it soon.
 

Game_dev

Member
Mar 2, 2016
133
0
0
Sorry, but it's more like 50% or less.

Consider that only Maxwell can run GameWorks games well enough. Kepler cards and below are grouped with AMD, where their performance is "magically horrible".

Please explain that to Kepler owners.

It hasn't been close to 50% in years. Recently AMD was down to 18%.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
Until now, low unit counts; now, with new tech, lots of units...

Most gamers = low units = DX11; a DirectX 12-capable system = lots of units.
How was this going to work, if not with a slider?
The only other possibility would be the DX11 .exe having fewer units, without the option to have more.

P.S. Yes, Mahigan, the units in Civ5 are just as hard to display as 3D units with lights, effects, independent movement and so on...
Yes, because alleviated CPU bottlenecks allow the CPU to issue more draw calls to the GPU, in order to draw more triangles, texture and color them, and run their individual AI algorithms.

NVIDIA's DX11 driver included some hidden driver threads, which were largely sufficient for feeding draw calls to their GPUs by taking advantage of Kepler's and Maxwell's static (software-driven) scheduling, which was multi-threaded.

AMD, on the other hand, couldn't implement the same driver feature and instead had to optimize on a per-game basis under DX11.

DX12, provided a title takes advantage of multi-threaded rendering, alleviates this bottleneck. DX12 benefits AMD GPUs more than NVIDIA's right now, but as GPUs grow (more ROPs, TMUs, rasterizers, compute units, etc.), both AMD and NVIDIA will benefit strongly from DX12.
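The multi-threaded recording idea is easy to sketch. Below is a toy model of DX12-style parallel command-list building, with plain threads and counters standing in for real command lists; all names here are made up for illustration:

```cpp
#include <cstddef>
#include <numeric>
#include <thread>
#include <vector>

// Each worker thread records its own slice of the frame's draw calls
// into its own "command list" (here just a counter), the way DX12 lets
// each thread fill an independent command list before a single submit.
// Under DX11, one driver thread would have recorded all of them serially.
std::size_t record_draws_parallel(std::size_t total_draws, std::size_t num_threads) {
    std::vector<std::size_t> per_list(num_threads, 0);
    std::vector<std::thread> workers;
    for (std::size_t t = 0; t < num_threads; ++t) {
        workers.emplace_back([&per_list, total_draws, num_threads, t] {
            // An even slice of the draw calls for this thread.
            std::size_t begin = total_draws * t / num_threads;
            std::size_t end = total_draws * (t + 1) / num_threads;
            per_list[t] = end - begin;
        });
    }
    for (auto& w : workers) w.join();
    // All lists together still cover every draw call exactly once.
    return std::accumulate(per_list.begin(), per_list.end(), std::size_t{0});
}
```

The point is only the structure: the per-thread lists never contend with each other, so recording cost scales with the number of cores instead of serializing on one driver thread.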
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
It hasn't been close to 50% in years. Recently AMD was down to 18%.

That's units shipped, not the actual market share of installed units. When will people start reading these reports correctly?
 

linkgoron

Platinum Member
Mar 9, 2005
2,305
822
136
As stated, some users are actually saying that the game feels better in actual play even though the benchmarks say the FPS is lower.

For example, this post: https://forums.overclockers.co.uk/showpost.php?p=29276293&postcount=108

So maybe there is some flaw in how the benchmark works, and its workload is not representative of the actual game.
Either way, things can only improve from here. I assume the DX12 code will be improved for both Nvidia and AMD in future patches, and AMD performance will also get better if/when they add async compute.
 

dark zero

Platinum Member
Jun 2, 2015
2,655
138
106
So does that mean DX12 is slower if you have a fast CPU. :D If the DX12 version of a game is not performing better than the DX11 version, then it's not the API at fault but the developer. The key here is that developers ought to put in the effort to write a robust DX12 code path which exploits the low CPU overhead and multithreaded design of the DX12 API and provides better performance than DX11. We saw that with AOTS and Hitman. Rise of the Tomb Raider does not do that.
This. They definitely need to improve their performance a lot, since nearly all cards show some degree of regression.
AMD also needs to improve their drivers.
 

nvgpu

Senior member
Sep 12, 2014
629
202
81
GameWorks pushes PC gaming forward; it's not Nvidia's fault that AMD can't do tessellation well at all, despite claiming multiple generations of tessellators since the DX10 GPU era, the Xbox 360's tessellator, DX8's TruForm abomination, etc.

http://i.imgur.com/ltuij49.jpg
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
GameWorks pushes PC gaming forward; it's not Nvidia's fault that AMD can't do tessellation well at all, despite claiming multiple generations of tessellators since the DX10 GPU era, the Xbox 360's tessellator, DX8's TruForm abomination, etc.

http://i.imgur.com/ltuij49.jpg
No, GameWorks black-boxes effects and hinders open-market competition. There is no reason any GPU on the market today couldn't be optimized for those effects and take advantage of them. GameWorks uses currently available technology to implement its effects; that's not pushing the industry forward, that's remaining in the present (a.k.a. stagnation).

AMD CAN perform tessellation well. There is basically no visually discernible reason to use a tessellation factor beyond x16: there is no discernible image-quality improvement going from x16 to x64.

Another thing worth mentioning is that NVIDIA aren't even really doing x64 tessellation. Their geometry processors perform primitive discard and cull the small triangles that go beyond x16, so setting a tessellation factor of x64 effectively has NVIDIA performing x16 plus culling (removing/not processing) unseen triangle meshes.

Oh and Open Standards move the industry forward.
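To put a number on the x16 vs x64 point: triangle count grows roughly with the square of the edge tessellation factor. The exact count depends on the partitioning mode, so this is just the scaling law, and the function name is my own:

```cpp
#include <cstdint>

// Rough triangle count for tessellated patches: an edge factor of N
// subdivides each patch into on the order of N*N sub-triangles.
std::uint64_t approx_triangles(std::uint64_t patches, std::uint64_t tess_factor) {
    return patches * tess_factor * tess_factor;
}
```

So going from x16 to x64 multiplies the triangle load by roughly 16 for no visible gain, which is exactly why the factor cap matters.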
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
GameWorks pushes PC gaming forward; it's not Nvidia's fault that AMD can't do tessellation well at all, despite claiming multiple generations of tessellators since the DX10 GPU era, the Xbox 360's tessellator, DX8's TruForm abomination, etc.

http://i.imgur.com/ltuij49.jpg

Your sense of direction is really screwed up in that case. There is no doubt that GameWorks is pure crap. Not a single GameWorks title has been universally accepted, even by reviewers, as moving PC gaming forward. Even people with Kepler and Maxwell cards agree with that. I dare you to start a poll on this forum, or OCN, or any other popular site, on whether GameWorks is pushing PC gaming forward or backward.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
GameWorks pushes PC gaming forward; it's not Nvidia's fault that AMD can't do tessellation well at all, despite claiming multiple generations of tessellators since the DX10 GPU era, the Xbox 360's tessellator, DX8's TruForm abomination, etc.

I personally like extra features that differentiate the PC version from a mere console port running at higher res with better textures. Having only used Radeon graphics cards over the years, I missed out on some of that. Bring on GameWorks. Fanboys should be asking AMD to come up with a similar solution (open or not) instead of bashing it.
 

nvgpu

Senior member
Sep 12, 2014
629
202
81
That's just your uneducated and irrelevant opinion.

GameWorks pushes PC gaming forward. The GCN-based game consoles make game devs extremely lazy, stuck in the GCN2 (GCN 1.1) era with no Conservative Rasterization, no Rasterizer Ordered Views, and no Tiled Resources Tier 3.

Nvidia & Intel both support FL12_1 fully.