For some reason the CPU usage goes to 100% in DX12 versus half that in DX11, while also being slower... Something seems to be causing the CPU usage to go nuts.
Somebody forgot to tell those guys that DX12 is supposed to reduce CPU overhead, so you know, it achieves higher performance while using less CPU resources...
Why would they release the DX12 patch in such a broken state?!
At least those Ark guys had the right idea to delay it.
Weird, I had a 14fps drop switching from DX11 to DX12.
So the 970 is on par with a 390X in directx 12.
You mean when DX12 is broken and all cards lose FPS instead of gaining... sure. Glad you like being king of broken games!
Ahh, I see it, The CPU spikes just as the RAM usage rises.
It looks like the CPU spikes due to a lack of framebuffer space (spilling over the 4GB framebuffer and into dynamic memory).
That's just like what the review showed. The DX12 patch is broken thus far. Needs better memory management.
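For anyone wondering what "better memory management" means under DX12: unlike DX11, the application (not the driver) is responsible for keeping resources inside the VRAM budget, and anything over the budget spills into much slower system memory. Below is a toy Python sketch of that idea - none of this is the game's actual code, and `ResidencyManager`, the texture names, and the sizes are all made up for illustration:

```python
from collections import OrderedDict

VRAM_BUDGET_MB = 4096  # assumed budget for the 4GB cards discussed above

class ResidencyManager:
    """Toy model of DX12-style explicit memory management: keep resident
    textures under the VRAM budget, evicting least-recently-used ones
    instead of silently spilling into slow system memory."""

    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = OrderedDict()  # texture name -> size in MB, ordered by last use

    def request(self, name, size_mb):
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return "hit"
        # Evict LRU textures until the new one fits inside the budget.
        while self.resident and sum(self.resident.values()) + size_mb > self.budget:
            self.resident.popitem(last=False)
        self.resident[name] = size_mb
        return "loaded"
```

A title that just loads everything up front and lets it overflow the budget (which is what the spikes above suggest) skips this bookkeeping entirely, and pays for it on cards with less than 8GB.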
Not weird. Even stranger - http://i.imgur.com/bc5vHti.png
Keeping the game open between runs removed the huge CPU spike at the start of each area in the benchmark. The CPU usage got less severe each time it ran.
What's odd is there was no difference in the FPS when keeping the game open, but that huge CPU usage went away between runs - http://i.imgur.com/bc5vHti.png
So it must be loading up every texture it possibly can, even though they aren't used yet.
Well your image shows that there is no Multi-threaded rendering going on as well.
13, 0, 3, 2, 0, 2, 2, 28. In other words, Rise of the Tomb Raider DX12 only scales across two cores.
I find it odd that two NVIDIA-sponsored games now show no implementation of multi-threaded rendering.
1. I don't want to put a conspiracy hat on, but so far it looks like two NVIDIA-sponsored titles are maintaining an API overhead on AMD GPUs.
2. Also two titles now, sponsored by NV, with horrible memory management.
Maintaining an API overhead hurts all GCN GPUs.
Bad memory management hurts Fiji (but also older GCN GPUs with less than 8GB of framebuffer).
That's quite interesting... If this is maintained then "Gimpworks" could be truer than ever.
NVIDIA looks to be ahead in DirectX 12 after seeing this and Gears of War. Their full support of DirectX 12.1 is paying off. That's why I only look at benchmarks from released games.
Ahh, gotcha.

Oh, don't use the numbers at the end - that's the current usage (after testing); just use the graphs. So it does spread usage well, but, well... so does the DX11 version.
There is definitely some ugliness going on, though; there's no reason all this hardware should be losing framerate in an API designed to improve it.
How did you come to the conclusion that this has anything to do with feature levels? Do Tomb Raider or Gears of War use 12_1 features?
He's trolling. Don't give him the attention and replies he's seeking.
I guess by this logic, AMD is ahead in DX11 because they support 11_3 and Kepler doesn't. Feature level support is just some magical number that automatically means your cards run better in the API now.
By my reckoning, Hitman is a released game that NV sucks dirt in. Oh, that's not valid because it's Gaming Evolved? OK, then Tomb Raider isn't valid because it's GameWorks.
I don't see NV getting any better in Ashes between now and its March 31st release date either.
I tested it with AMD CPUs and it increased the performance on both. I tried an FX-8120 OC'd to 4.77GHz and a Phenom 960T with all 6 cores unlocked, OC'd to 4.22GHz. I saw about a 10 fps increase on both. They are running with a GTX 980, btw.
http://m.imgur.com/a/Zv2L6