positivedoppler
Golden Member
- Apr 30, 2012
- 1,148
- 256
- 136
Oh wow, Nvidia threw a complete gutter ball in DirectX 12, I wonder what went wrong? This won't be pretty if they don't fix it; there will be a lot of pissed-off Maxwell owners.
Actually, if you read Dan Baker's latest blog he says that Nvidia, AMD, Microsoft, and Intel have had access to all AoTS source code for a year.
ExtremeTech has Fury X vs 980Ti AoTS benches.
http://www.extremetech.com/gaming/2...he-singularity-amd-and-nvidia-go-head-to-head
Oh wow, Nvidia threw a complete gutter ball in DirectX 12, I wonder what went wrong? This won't be pretty if they don't fix it; there will be a lot of pissed-off Maxwell owners.
So after the AOTS benchmarks, want to change your hopes? :awe:
A single, alpha-stage benchmark and we've already made a conclusion???
How useful is the benchmark?
It should not be considered that because the game is not yet publicly out, it's not a legitimate test. While there are still optimizations to be had, Ashes of the Singularity in its pre-beta stage is as optimized as, or more optimized than, most released games. What's the point of optimizing code 6 months after a title is released, after all? Certainly, things will change a bit until release. But PC games with digital updates are always changing; we certainly won't hold back from making big changes post launch if we feel it makes the game better!
DirectX 11 vs. DirectX 12 performance
There may also be some cases where D3D11 is faster than D3D12 (it should be a relatively small amount). This may happen under lower CPU load conditions and does not surprise us. First, D3D11 has 5 years of optimizations where D3D12 is brand new. Second, D3D11 has more opportunities for driver intervention. The problem with this driver intervention is that it comes at the cost of extra CPU overhead, and can only be done by the hardware vendor’s driver teams. On a closed system, this may not be the best choice if you’re burning more power on the CPU to make the GPU faster. It can also lead to instability or visual corruption if the hardware vendor does not keep their optimizations in sync with a game’s updates.
While Oxide is showing off D3D12 support, Oxide also is very proud of its DX11 engine. As a team, we were one of the first groups to use DX11 during Sid Meier's Civilization V, so we've been using it longer than almost anyone and know exactly how to get the most performance out of it. However, it took 3 engines and 6 years to get to this point. We believe that Nitrous is one of the fastest, if not the fastest, DX11 engines ever made.
It would have been easy to engineer a game or benchmark that showed D3D12 simply destroying D3D11 in terms of performance, but the truth is that not all players will have access to D3D12, and this benchmark is about yielding real data so that the industry as a whole can learn. We've worked tirelessly over the last years with the IHVs and quite literally seen D3D11 performance more than double in just a few years' time. If you happen to have an older driver lying around, you'll see just that. Still, despite these huge gains in recent years, we're just about out of runway.
Unfortunately, our data is telling us that we are near the absolute limit of what it can do. What we are finding is that if the total dispatch overhead can fit within a single thread, D3D11 performance is solid. But eventually, one core is not enough to handle the rendering. Once that core is saturated, we get no more performance. Unfortunately, the constructs for threading in D3D11 turned out not to be viable. Thus, if we want to get beyond 4-core utilization, D3D12 is critical.
http://www.oxidegames.com/2015/08/16/the-birth-of-a-new-api/
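To make Oxide's threading point concrete, here's a minimal sketch (not Oxide's or any shipping engine's actual code, and the helper name is made up) of the pattern D3D12 enables: each worker thread records its own command list, and only the final submission is a single call. In D3D11 the equivalent draw-submission work ends up funneled through the immediate context on one thread, which is why one saturated core caps the whole renderer.

```cpp
// Minimal sketch, not production code: record D3D12 command lists on several
// worker threads, then submit them in one call. Device/queue/PSO setup,
// fencing, and error handling are omitted; RecordDrawsForChunk is a
// hypothetical helper standing in for "record this thread's share of draws".
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordDrawsForChunk(ID3D12GraphicsCommandList* cl, int chunk);  // hypothetical

void BuildAndSubmitFrame(ID3D12Device* device, ID3D12CommandQueue* queue, int workerCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);
    std::vector<std::thread>                       workers;

    for (int i = 0; i < workerCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));

        // Each worker records its own command list independently -- this is
        // exactly the work that D3D11 effectively serializes onto one core.
        workers.emplace_back([&, i] {
            RecordDrawsForChunk(lists[i].Get(), i);
            lists[i]->Close();
        });
    }
    for (auto& t : workers) t.join();

    // Submission is still a single call, but the expensive recording above
    // was spread across however many cores we gave it.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```

D3D11's deferred contexts were supposed to offer something similar, but as the blog says, in practice they never scaled, so the hard per-core ceiling stayed.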
A *single* benchmark. We do need other games.
Gives some idea of why DX11 sometimes does better than DX12.
Ever tried turning shadows all the way up in Rome 2/Attila Total War? Doesn't matter what graphics card you have or how many CPU cores you have, it will cause PCs with mid-level clock speeds to chug.
Lol at the consistent nvidia hate.
If you're drawing any conclusions from this I feel bad for you.
Sour, sour grapes from Nvidia; they sure don't like it when they can't control the source code and prop up their hardware.
So much for DX12 being the panacea for AMD's low-IPC CPUs.
It's alright, the FX will be the best choice once DirectX 13 comes out. (/sarcasm, courtesy of the CPU forums, where we're constantly told that once better threaded support arrives, AMD processors will crush Intel ones. We obviously know better...)
I think the DX11 results are pretty depressing for AMD; they show how bad the disadvantage versus NV can actually be in heavy scenes. And while it's nice that they are doing so well with DX12, it's good to remember that most games until late next year will still be using DX11.
If DX12 adoption turns out better than expected, that's a great help for AMD GPUs, but the CPUs didn't like it all that much in this game; the 8-core FX is still behind the i3, worse than in some DX11 games.
Lol at the consistent nvidia hate.
If you're drawing any conclusions from this I feel bad for you.
I don't understand GPU manufacturer bias; how does it benefit the consumer?
All the IHVs had the source code for over a year. That's more than most devs would offer, so any poor performance is definitely the IHVs' own fault.