
FX CPUs and Mantle/DirectX 12...?

jamesgalb

Member
Are there any/many test results on the CPU gains of AMD FX vs Intel when using Mantle (or DirectX 12)?

I know many current games only utilize 4 cores/modules on an AMD FX CPU (like Assassin's Creed), and I am wondering if this next wave of APIs has the ability to thread across AMD CPU cores any better.
 
There are a couple of main things to keep in mind:

1) AMD finally gets multithreaded rendering with Mantle (their DX11 drivers do not support this).

2) It mitigates their performance deficit vs. Intel.
 
I was surprised to see this in AnandTech's newest DX12 preview. Note those are Intel cores in the chart, not AMD.

[Three benchmark charts (71448.png, 71449.png, 71454.png) from the linked AnandTech DX12 preview]



http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/4
 
Yeah, that's because they're finally getting widespread support for multithreaded rendering.
 
Sort of bittersweet to see this finally coming about. Wonder how it'd have affected the FX's success (or lack of it) if this had happened two or three years ago. I used Mantle briefly in BF4 with my 9590 and CrossFire 280Xs; while it was unstable, and the game played pretty well without it, there was a noticeable improvement when it was working.

Also wonder how much of a life extension this might be for the higher-clocked FX chips.
 
You'll notice in the charts that more cores actually matter LESS in DX12 than in DX11; possibly because of reduced CPU overhead.
(But those are powerful Intel cores, an AMD chart might look different)
 

Running prescripted benchmarks is one thing; running actual gameplay is another. The Tomb Raider demonstration comparing the benchmark against actual gameplay on a Haswell at 800MHz and at 4700MHz was pretty extreme.

Also any bets on how fast the recovered overhead is being used for something else?
 
And that's a good thing. I want my hardware to be used to the fullest of its capability on interesting and entertaining things. Or, in mobile, same performance, less power.
 

No idea; between the data shown so far and my brief experience with BF4 Mantle, it's promising though. No doubt game developers will use up the "extra", as is the goal, but it's gotta be some help.
 
Soo.... no test results/research anywhere yet? 🙁

On a counter-note... do we know of any games that use all 8 cores, vs something like Assassin's Creed that only uses 4...?
 