
ComputerBase & GameGPU: Rise of the Tomb Raider: DX11 vs DX12 + VXAO Tested

Ranking the best DX12 implementations so far:

1. Ashes of the Singularity
2. Hitman
3. Rise of the Tomb Raider
4. Gears of War

People won't accept the Tomb Raider results because only NVIDIA has shown a performance loss from DX12 so far. If we're talking tin-foil hats, this was another one that got pushed out before it was ready (like Gears of War). I doubt we'll hear more from them on this. You'd think, given their blog post, it would be a performance increase all around. Nothing fancy, just better/smoother gameplay. Nope. It barely works.
Yeah, but I would knock Tomb Raider down two spots; we've gone backwards here on all hardware. D:
 
Sweepr, it doesn't count, obviously. Anything that shows NVIDIA possibly being good for future DX12 titles is clearly not a "true" DX12 title, don't you know? 😉

(obviously this is playful sarcasm, thanks for sharing Sweepr!)
True NVIDIA DX12 titles feature performance regression for their own GPUs. Yay!
 
This is a troll thread. Why isn't a 390X beating a 980 Ti? Stupid game! Guys, just remember: if AMD isn't winning in a DX12 title, it is inherently flawed and can't exist.

Trolling is not allowed here, not to mention that this is flamebait.
Markfw900
 
The performance loss does not seem worth the ambient occlusion. Most people won't tell the difference in motion, so there's no reason to go this hard on it. When it works quickly, great. When it's killing performance, no thanks. It's not something that makes that big a difference unless it's off or set really low.

Yeah, but I would knock Tomb Raider down two spots; we've gone backwards here on all hardware. D:

I failed to clarify that that was what I meant. NVIDIA losing performance in DX12 isn't unexpected, but AMD losing it, when the game is supposed to be taking advantage of DX12's basic API overhead reduction, doesn't make sense. DX12 isn't adding any new graphics features, AFAIK.
 
Remember, ComputerBase tested all graphics cards with the best gaming CPU out there, the Core i7-6700K. Reading their blog post about the patch, slower CPUs should see improved performance under DX12 mode, so it would be interesting to see all dGPUs tested again with something more mundane like a Core i3-4330 or Core i7-2600K.

As an example to illustrate the point, below is a screenshot of a scene in the game running on an Intel Core i7-2600 processor with 1333 MHz memory, paired with a GTX 970. Using DirectX 11 at High settings, we would only get 46 fps.

DX11:
[screenshot: tumblr_inline_o3vsjnzHO71qij4lt_1280.jpg]

DX12:
[screenshot: tumblr_inline_o3vsk0mwJl1qij4lt_1280.jpg]


http://tombraider.tumblr.com/post/140859222830/dev-blog-bringing-directx-12-to-rise-of-the-tomb


This is a troll thread. Why isn't a 390X beating a 980 Ti? Stupid game! Guys, just remember: if AMD isn't winning in a DX12 title, it is inherently flawed and can't exist.

Sums up the discussion perfectly.
 
This is a troll thread. Why isn't a 390X beating a 980 Ti? Stupid game! Guys, just remember: if AMD isn't winning in a DX12 title, it is inherently flawed and can't exist.

It's not normal that the GTX 970 is close to the Fury X. If DX12 removes CPU overhead, you should get more FPS, but instead we see fewer FPS than under DX11.
 
It's pretty simple, really. Nobody wants crappy, unoptimized junk. That's the reason I don't like GameWorks. If DX12 offers no performance benefit over DX11, don't do it. Don't release a patch that reduces performance across the board for no reason. It makes no sense. Don't be like GameWorks.
 
DX12 improves performance by 17% @ Soviet Base using a dual-core Core i3:

[chart: CPU scaling benchmark, cpu scaling_zpsdubeg6ov.jpg]

Ahh, I see. I guess the performance gain really is targeted at the consoles; those weak cores need all the help they can get. But it comes at the cost of absolute performance on higher-end PCs.
 
DX12 is faster if you've got a weak CPU.

That doesn't mean that if you have a stronger CPU you're supposed to lose 20% of your frame rate compared to DX11. Something is wrong here.
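A simple way to see why only weaker CPUs benefit: a frame can't finish faster than its slowest stage, so frame time is bounded by whichever of the CPU work (game logic, draw-call submission) and the GPU work takes longer. DX12's lower submission overhead shrinks the CPU side only. Here is a minimal illustrative model in Python; all the millisecond numbers are hypothetical, chosen just to show the CPU-bound vs GPU-bound cases:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the frame time is the max of the two stages."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Weak CPU: CPU-bound under DX11. Suppose DX12 halves submission cost.
dx11_weak = fps(cpu_ms=25.0, gpu_ms=15.0)   # 40.0 fps, limited by the CPU
dx12_weak = fps(cpu_ms=12.5, gpu_ms=15.0)   # ~66.7 fps, now GPU-bound

# Fast CPU: already GPU-bound, so cutting CPU time changes nothing.
dx11_fast = fps(cpu_ms=6.0, gpu_ms=15.0)    # ~66.7 fps
dx12_fast = fps(cpu_ms=3.0, gpu_ms=15.0)    # ~66.7 fps, no gain

print(dx11_weak, dx12_weak, dx11_fast, dx12_fast)
```

Under this model, DX12 can only ever help while the CPU is the bottleneck; it doesn't explain an outright regression on fast CPUs, which would have to come from the game's DX12 renderer itself being less efficient per frame.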



Like this one? The FX-8370 is 30% faster vs DX11 and equal to a Core i7-6700K at the Soviet Base.


All those cores with 'enough' IPC, plus AMD's pincer movement of getting Mantle in as the base for DX12 and getting their hardware into the consoles, is going to pay dividends for them in benchmarks. The FX chips have aged very well despite their shortcomings.
 
This is not exclusive to AMD, as demonstrated by the developer above: a >30% performance bump for a stock Core i7-2600K under DX12 (above) with a GeForce GTX 970. Probably even more had they used a GeForce GTX 980 Ti like CB did. 😉
 
Then don't you find it odd that the 390 is so close to the 980 Ti in Hitman?

If you mean ComputerBase's bench, those are reference cards, both running around 1000–1050 MHz, while PCGamesHardware's bench shows boost clocks for the Asus GTX 980 Ti. All of their GTX 980 Ti cards are overclocked versions, and they're still faster than the R9 390 by around 15–30%, depending on card frequency. But it's an AMD title, and you don't see them losing to DX11; all cards (except some) in Hitman get more FPS than under DX11.
 
That doesn't mean that if you have a stronger CPU you're supposed to lose 20% of your frame rate compared to DX11. Something is wrong here.

All those cores with 'enough' IPC, plus AMD's pincer movement of getting Mantle in as the base for DX12 and getting their hardware into the consoles, is going to pay dividends for them in benchmarks. The FX chips have aged very well despite their shortcomings.

Given the current trend, with an NVIDIA GPU your CPU will perform no better than it would as a 6700K. The only way to benefit from DX12 with a fast GPU will be to go AMD for the GPU.
 
Why is this DX12 patch getting praise? Lower-end CPUs benefit, sure, but it's strange that Mantle, which did the same thing, wasn't so positively received, not to mention that other DX12 implementations (Ashes, Hitman) effected the same change, and yet there seems to be plenty of backlash over those two. VXAO? It looks decent, but the IQ gain is not proportional to the massive performance hit. Apart from that, nearly everyone on this forum is going to see lower performance using the DX12 path. VXAO is the only unique DX12 feature, and I don't see anyone too excited about it.
 
Why is this DX12 patch getting praise? Lower-end CPUs benefit, sure, but it's strange that Mantle, which did the same thing, wasn't so positively received, not to mention that other DX12 implementations (Ashes, Hitman) effected the same change, and yet there seems to be plenty of backlash over those two. VXAO? It looks decent, but the IQ gain is not proportional to the massive performance hit. Apart from that, nearly everyone on this forum is going to see lower performance using the DX12 path. VXAO is the only unique DX12 feature, and I don't see anyone too excited about it.
It's an NVIDIA-sponsored game in which NVIDIA GPUs are doing better than AMD's. So, praise!
Whether it improves the game is irrelevant; as long as it has a GameWorks label, praise it!
 
DX12 improves performance by 17% @ Soviet Base using a dual-core Core i3:

[chart: CPU scaling benchmark, cpu scaling_zpsdubeg6ov.jpg]
This is what I dislike: why pick and choose? Why point out the Soviet Base improvement and ignore the top half of the graph you posted, where it lost 20% FPS in Base Camp?

That doesn't explain anything; it's essentially the same as quoting someone out of context.

Back to my original question: has anyone here read the German article, and does it explain why the performance is the way it is?
 
Damn, the spin is really hilarious 😀

NV and AMD cards are both getting worse performance, but just because the NV cards are "better", it's a win. This is so funny 😀

Again, a game associated with NVIDIA and its GameWorks program that shows no benefit.

DX12, with NVIDIA/MS in the making since 1995. Well done, guys. Is this NVIDIA's "the way it's meant to be DX12'ed"? Wow, I actually don't want this kind of DX12 future 😀
 