
Hitman DirectX-12 Benchmarks update (ComputerBase)

AnandTech Forums, page 4
Another DX12 game not showing the AOTS effect, how surprising.
Are you sure you're looking at the same thing I am? That's a GTX 980 Ti running 1,380MHz barely besting an R9 390x running 1,070MHz.

That is the AotS effect. There's no Fury or Fury X here.
 

I assume you didn't follow the thread. They discarded the original benches and now run them with the frame lock (AMD) and Render Target Reuse (NVIDIA) disabled.

And it seems they had trouble getting DX12 to work on Fury.
 
And what do you think of their results now?
 

Considering GCN 1.2 doesn't seem to work in DX12, not much so far.

GCN 1.0 performance doesn't seem special either. All the fun is in GCN 1.1, but that shouldn't be a surprise since it's a console port.

So if you want to play console ports, use 390/390X and forget everything else? 🙂
 
So maybe no DX12 GCN 1.2 results. :/

Maybe not, I don't think it's a big deal until tomorrow, when the game launches.

Then, it's the beginning of a big deal. Because if GCN 1.2 didn't get in here, then what else will not get in here?
Will Kepler get attention when Pascal releases?
Will Tonga get attention?
 

Can Intel IGP run it?

Another interesting part will be seeing DX12 with Polaris/Pascal, if they can even run it. They may suffer the same.

It's not really that surprising with a low-level API. Just look at all the Tonga/Fury issues with Mantle in BF4/Thief. And I think it's just a DX12 side effect we have to deal with: it needs optimization from the devs for every single uarch.

For DX12 console ports, get a GCN 1.1 card. I said this before, because devs will be lazy or money/time limited. And MS and Sony want everyone on the consoles anyway.
 

Imagine getting Pascal/Polaris, and you can't play older DX12 games without updates to support them?

I sometimes wait for a new GPU release to play older games. In fact, I'll be primarily playing DX11 games with Polaris the first year I own it.
 

I wouldn't be surprised. This is also why a DX11 fallback path is so important. But some games will be DX12 only. And that will bring problems.
 
Considering GCN 1.2 doesn't seem to work in DX12, not much so far.

GCN 1.0 performance doesn't seem special either. All the fun is in GCN 1.1, but that shouldn't be a surprise since its a console port.

So if you want to play console ports, use 390/390X and forget everything else? 🙂

Rofl... GCN 1.2 doesn't work? Because it's constantly capping at 60 fps? Wow, I wish all games "didn't work" like that.

Seriously, you can't make up comedy this good.

The game works great with it; it's just capped at 60 fps, so the benchmarks aren't valid because it can't go above that. For anyone actually playing the game, 60 fps is fine for now, because it's not action-heavy like an FPS, not to mention most people don't have >60Hz monitors anyway.
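The cap argument is easy to demonstrate with a toy frame loop. This is a minimal sketch in plain Python with an invented `run_benchmark` helper, not anything from Hitman's engine: once the simulated render cost drops below the cap interval, every faster "card" reports the same average fps, which is exactly why average-fps benchmarks against a 60 fps cap say nothing about headroom above 60.

```python
import time

def run_benchmark(frame_cost_ms, duration_s=1.0, cap_hz=60):
    """Simulate a render loop behind an engine-side frame cap.

    frame_cost_ms is how long one frame would take uncapped; the cap
    sleeps out the remainder of each 1/cap_hz interval, so any card
    faster than the cap reports the same average fps.
    """
    min_interval = 1.0 / cap_hz
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        frame_start = time.perf_counter()
        # simulated render cost (busy-wait)
        while time.perf_counter() - frame_start < frame_cost_ms / 1000.0:
            pass
        # frame cap: wait out the rest of the interval
        remaining = min_interval - (time.perf_counter() - frame_start)
        if remaining > 0:
            time.sleep(remaining)
        frames += 1
    return frames / duration_s
```

A card that could do ~500 fps and one that could do ~125 fps both report roughly the cap; only a card slower than the cap shows its real number.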
 

GCN 1.0 cards like the HD 7970/R9 280X have 2 ACEs, while GCN 1.1 cards like the R9 290X/R9 390X have 8 ACEs (just like the PS4). If you look at the perf improvement for the R9 280X from DX11 to DX12, it's negligible. The R9 390 gains a solid 10% when moving from DX11 to DX12. Those 8 ACEs are improving resource utilization and perf. GCN 1.2 cards should show similar benefits, as they have the same 8 ACEs as the R9 390X. Given that it's a Gaming Evolved title, the frame lock issue should be resolved soon by AMD. I am sure the GCN 1.2 cards will also have excellent performance.
 
Boy, I sure hope Pascal doesn't suck. I do not want to have to sell my G-Sync monitor at a huge loss in order to switch to AMD.

What do you mean? The 980 is getting pounded by the 390, and the only way to get team green on top is to use a heavily OC'd 980 Ti.

1,380MHz is not a heavy OC. That is most likely the in-game boost clock of that card out of the box. You can likely get another 10-15% out of it without any trouble.
 

That's an interesting view on it.

980 Ti (10%), 390 (10%), and 390X (5%) gain.
280X/980 are the same.
The 960 loses a tad (5%).
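The percentages being traded here are just relative changes between the two API runs; a one-liner makes the arithmetic explicit. The fps pairs below are hypothetical, purely to show the calculation, not the ComputerBase numbers.

```python
def dx12_gain(dx11_fps, dx12_fps):
    """Relative change, in percent, when moving from DX11 to DX12."""
    return (dx12_fps - dx11_fps) / dx11_fps * 100.0

# Hypothetical fps pairs purely to show the arithmetic --
# these are NOT the ComputerBase numbers.
samples = {
    "card_a": (50.0, 55.0),   # gains 10%
    "card_b": (60.0, 57.0),   # loses 5%
}
gains = {name: dx12_gain(d11, d12) for name, (d11, d12) in samples.items()}
```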
 

I don't recommend ANYONE sell their panel to swap vendors once they've invested in G-Sync/FreeSync; it's a wash then. Does it matter if you're getting 42 FPS and an R9 390 user gets 10 more FPS than you at 1440p? You're still in G-Sync range, so it's all gravy, baby.
 

Yeah, that part surprised me, considering the constant "Maxwell can't do DX12!!!1!!!" posts around here. It gained almost 10%. Not too bad for a card that supposedly can't do DX12.
 
What do you mean? 980 is getting pounded by 390 and the only way to get team green on top is to use heavily OC 980Ti.

This: a 390 and a 980 Ti are not even in the same category of GPUs.

And yet it looks like a 390X might surpass a 980 Ti.

Then there is the halo Fury/Fury X...
 
Yeah that part surprised me. Considering the constant "Maxwell can't do DX12!!!1!!!" posts around here. It gained almost 10%. Not too bad for a card that supposedly can't do DX12.

It's never been that it can't do DX12; the issue is that it can't do async compute, even though NVIDIA is selling it as a card that can.

You can see the DX12 gains without async compute in Ashes.
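The win from async compute (and the extra ACEs discussed above) is overlapping independent work instead of serializing it. This is a toy CPU-thread sketch in plain Python, not actual D3D12 queue code; it only illustrates the scheduling idea that two independent jobs finish in roughly max(a, b) when overlapped rather than a + b when serialized.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def gpu_job(seconds):
    """Stand-in for a GPU workload; time.sleep releases the GIL,
    so two jobs can genuinely overlap in wall-clock time."""
    time.sleep(seconds)

def serialized(graphics_s, compute_s):
    # one queue: compute waits for graphics to finish
    start = time.perf_counter()
    gpu_job(graphics_s)
    gpu_job(compute_s)
    return time.perf_counter() - start

def overlapped(graphics_s, compute_s):
    # two queues: both jobs in flight at once
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=2) as pool:
        jobs = [pool.submit(gpu_job, graphics_s),
                pool.submit(gpu_job, compute_s)]
        for j in jobs:
            j.result()
    return time.perf_counter() - start
```

On hardware (or drivers) that can't actually run the queues concurrently, the "overlapped" submission degrades back to the serialized timing, which is the crux of the Maxwell async-compute complaint.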
 
1380mhz is not a heavy OC. That is most likely the in game boost clocks of that card out of the box. You can likely get another 10-15% out of it without any trouble.

I think 15% might be pushing it, but the important thing is that the 980 Ti doesn't have much more headroom than the 390, so we can probably expect the 390 not to lose ground with OC.

I don't recommend ANYONE sell their panel to swap vendors if you invest in Gsync/Freesync. It's a wash then. Does it matter if you're getting 42 FPS and an R9 390 user gets 10 more FPS than you at 1440p? You're still in Gsync range, so it's all gravy baby.

I'd consider it going from G-Sync to FreeSync, because the FreeSync screens tend to be cheaper. I wouldn't consider it going the other way. That's still a heavy decision to have to make.

Its never been that it can't do DX12, the issue is it can't do async compute even though NVidia is selling it as a card that can.

You can see the DX12 gains without async compute in Ashes.

Yeah. It's very important to remember that. NV is already gaining. They'll gain more when they make a contemporary architecture ready to take full advantage, but even now, DX12 is a gain for all gamers. AMD users gaining more does not change the fact that NV users are gaining, and that's fantastic news. Performance isn't a zero-sum game; there's room for both sides to win.
 
Its never been that it can't do DX12, the issue is it can't do async compute even though NVidia is selling it as a card that can.

You can see the DX12 gains without async compute in Ashes.

You should revisit the older locked Ashes thread..
 
I don't recommend ANYONE sell their panel to swap vendors if you invest in Gsync/Freesync. It's a wash then. Does it matter if you're getting 42 FPS and an R9 390 user gets 10 more FPS than you at 1440p? You're still in Gsync range, so it's all gravy baby.

Fortunately, NVIDIA tech typically sells for more than it's worth. A G-Sync monitor can become a FreeSync monitor at a profit.
 

Are you going to keep switching monitor tech to match the faster GPU, when it has little to no effect on the actual user experience?

Meh, I can't recommend anyone do that. G-Sync/FreeSync is the ultimate equalizer in my eyes.
 
Yeah that part surprised me. Considering the constant "Maxwell can't do DX12!!!1!!!" posts around here. It gained almost 10%. Not too bad for a card that supposedly can't do DX12.
You can see those gains degrade to negatives at 1440p ultrawide and above. My guess is that there's a bit of CPU bottlenecking at those low resolutions, and NV can take advantage of the lower overhead.
 

Woof, didn't notice that. Guess the meme will ride again!
 