I'd like to see how a Fury X changes in performance in this game going from DX11 to DX12.
So the 980ti overclock capability will keep it well in front then?
Well, in DX11 a GTX 960 beats the Fury X, while in DX12 the Fury X beats the 980 Ti. And of course, Oxide said all of their code was reviewed, etc.
Also, there's a big inconsistency in that the 980 Ti gained performance going from DX11 to DX12, while all the other Nvidia cards lost performance.
Unless we're ready to accept that as a new reality, I think this game has serious issues with optimization on multiple levels. I have never seen a shipping DX11 game where a Fury or 390X got clobbered by a 960.
It is unusual, but I think right now we are seeing raw GPU performance with little to no optimization in the driver, or maybe AMD has the optimizations and Nvidia doesn't. Nvidia is clearly behind at this point when it comes to this particular game.
For example, when Nvidia noticed that a specific shader was taking a particularly long time on their hardware, they offered an optimized shader that made things faster which we integrated into our code.
Crosspost; let's keep DX12 talk in this thread:
Oxide clearly said NV did optimize the game. They even gave an example.
When they've had the source code for over a year, lack of optimization is no excuse.
Since there's no game out yet, maybe Nvidia didn't put much work into it? It is pre-alpha, right? They might have just been lazy.
Oxide doesn't know what Nvidia did with their driver either; Nvidia offered code for the game, but that doesn't mean they've done any work in their drivers for it yet. Nvidia usually releases a Game Ready driver for new games right at release. I don't know how long they hold those optimizations back.
Game Ready
Best gaming experience for Ashes of the Singularity preview
http://www.nvidia.com/download/driverResults.aspx/89124/en-us
From what I read, and I cannot verify any of this, Nvidia has some driver voodoo going on for DX11 in general to help with CPU bottlenecks and relieve some of that overhead. AMD doesn't have the same thing going on in their drivers, which is why DX11 shows such a wide margin in favor of Nvidia. When we go to DX12 there's very little driver optimization right now, if any at all, so it comes down to the raw performance of the GPU. We're simply seeing that AMD's GPUs are hindered by a certain amount of driver overhead in DX11 without game-specific optimizations to help out.
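For what it's worth, that "less driver in the way" idea matches how the D3D12 programming model looks from the application side: the game records its own command lists (potentially on several threads) and submits them to a queue in one batch, instead of pushing every draw through a single DX11 immediate context and counting on the driver to spread the work around. Here's a minimal sketch of that submission model, assuming a default adapter and deliberately leaving out PSOs, render targets and fencing, so it only shows the shape of it, not a working renderer:

```cpp
// Minimal D3D12 submission sketch (assumption-heavy, not a working renderer):
// each worker thread records its own command list, and everything is submitted
// to the queue in one batch. Build against d3d12.lib; no swap chain, no PSO,
// no fence, and the command lists stay empty on purpose.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Default adapter, minimum feature level.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_COMMAND_QUEUE_DESC queueDesc = {};  // zeroed desc = direct queue
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    const int kWorkers = 4;  // arbitrary choice for this sketch
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(kWorkers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kWorkers);
    std::vector<std::thread> workers;

    for (int i = 0; i < kWorkers; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        // Command lists are created in the recording state.
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        // In a real engine, per-object state changes and draw calls would be
        // recorded here on this thread. Under DX11 that same work funnels
        // through one immediate context plus whatever threading the driver
        // does internally.
        workers.emplace_back([&lists, i] { lists[i]->Close(); });
    }
    for (auto& t : workers) t.join();

    // One cheap submission of everything the workers recorded. A real app
    // would follow this with a fence before reusing the allocators.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    return 0;
}
```

None of that tells us what either vendor's driver is actually doing for this game; it just illustrates why DX12 numbers lean more on the GPU and the engine's own code than on driver-side tricks.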
It is pretty strange that performance drops going to DX12 on their cards. Nvidia must know why, since they have the code and tools; they just aren't being straight about it.
They did have time to release an official reviewer's guide to the media, so they had ample opportunity to optimize for it, instead of attacking the game for not being representative of DX12 gaming... which prompted a correction from Oxide.
This reminds me of Tessellation. Recall how NV pretty much said Tessellation wasn't important... until Fermi. Then it was all about Tessellation.
I'm expecting the same with Pascal: once Pascal is out, it'll be all about DX12 & VR.
From what I read, and I cannot verify any of this, Nvidia has some driver voodoo going on for DX11 in general to help with CPU bottlenecks and relieve some of that overhead. AMD doesn't have the same thing going on in their drivers, which is why DX11 shows such a wide margin in favor of Nvidia. When we go to DX12 there's very little driver optimization right now, if any at all, so it comes down to the raw performance of the GPU. We're simply seeing that AMD's GPUs are hindered by a certain amount of driver overhead in DX11 without game-specific optimizations to help out.
While I believe that may be true to some extent, this is a much bigger gap than you'd see in other DX11 games that don't support Mantle. I'd have to believe AMD just decided not to try to do anything to help them develop for DX11 in this game.
Well, here's a GTX 770 with an i5-3570K eating a Titan X / i7-5960X for lunch.
I find it hard to believe any of these benchmarks are meaningful.
After looking closely, did you notice the Titan X wasn't running fullscreen? I wonder if that has an effect on the performance results. Also, the Titan X run was a full system test and the 770 run a CPU test, so the settings are not identical. Dunno what to say.
Should be fixed now.
Source is here: http://wccftech.com/ashes-singularity-alpha-dx12-benchmark/