More cores = worse DX12 performance?

ioni

Senior member
Aug 3, 2009
619
11
81
Unless I'm completely misreading the graphs from anandtech's Fable DX12 analysis, more cores seem to equate to worse DX12 performance. Anyone have any thoughts on this? Do you think drivers will be updated to fix this? Does it not really matter because the performance differences are so close?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Hmm?

980TiScaling.png
 

R0H1T

Platinum Member
Jan 12, 2013
2,582
163
106
Unless I'm completely misreading the graphs from anandtech's Fable DX12 analysis, more cores seem to equate to worse DX12 performance. Anyone have any thoughts on this? Do you think drivers will be updated to fix this? Does it not really matter because the performance differences are so close?
Nope, what's certain though is that the Fable Legends beta isn't multi-core friendly, looking at the results above!
 

R0H1T

Platinum Member
Jan 12, 2013
2,582
163
106
Most likely talking about this one.

Not scaling in FPS at higher resolutions doesn't necessarily mean it's not multi-core friendly. It might just mean it's not CPU bottlenecked.
Yes except that scaling at lower resolutions (720p) isn't much better.
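
For what it's worth, the "do the numbers cluster or spread" check can be written down as a quick sketch. This is just an illustration in Python with a made-up 3% threshold and placeholder numbers (not the article's data): if FPS across the CPUs at one resolution sits within a few percent of each other, the run is almost certainly not CPU-limited.

def looks_gpu_bound(fps_by_cpu, spread_threshold=0.03):
    # True if every CPU lands within a small relative spread of the fastest,
    # which usually means the limiter is the GPU (or something else), not the CPU.
    values = list(fps_by_cpu.values())
    spread = (max(values) - min(values)) / max(values)
    return spread <= spread_threshold

# Placeholder numbers for illustration only, not the AnandTech results:
print(looks_gpu_bound({"i7": 60.0, "i5": 59.6, "i3": 59.8}))  # True -> FPS clustering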
 

ALIVE

Golden Member
May 21, 2012
1,960
0
0
Most likely talking about this one.
FuryScaling.png

Not scaling in FPS at higher resolutions doesn't necessarily mean it's not multi-core friendly. It might just mean it's not CPU bottlenecked.

Except it seems to favor the i3 by a tiny bit of FPS.
If the bottleneck were somewhere else, wouldn't these numbers be closer?
Or better yet, wouldn't the tiny difference be in favor of the i7?

In the past, when we had a graphics bottleneck on the GPU side, the chart would show the CPUs clustering around the same number, but the strongest CPU would still be on top by a tiny amount.

Most probably the beta drivers are not that optimized, or there is a serious problem with scaling.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Doesn't seem like a CPU-hungry game at all

It's not a game at all (yet), it's a GPU benchmark to highlight the graphics engine. Have you ever run an in-game benchmark and watched CPU/GPU usage with RivaTuner or something similar?
There's not all that much going on on the CPU, only just enough to get the GPU to 100%.
And with DX12 it's even easier to get a card to 100%, so less CPU is needed. Remember that it (Mantle/DX12) was made for the consoles with their 1.5GHz, almost ARM-class, CPUs.
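
Here's roughly what that check looks like if you'd rather log it than eyeball an overlay; a minimal Python sketch assuming psutil and nvidia-smi are available (RivaTuner/Afterburner show the same data on screen):

import subprocess
import time
import psutil

def sample_utilization(seconds=30, interval=1.0):
    # Print average CPU usage and GPU usage once per interval while a benchmark runs.
    for _ in range(int(seconds / interval)):
        cpu = psutil.cpu_percent(interval=interval)  # averaged across all cores
        gpu = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
        print(f"CPU {cpu:5.1f}%  GPU {gpu}%")

if __name__ == "__main__":
    sample_utilization()
    # If the GPU sits at ~100% while the CPU idles along, the benchmark is
    # GPU-limited and a faster or wider CPU won't change the FPS much.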


(Are all CPUs in this test running at the same GHz?)
 

Seba

Golden Member
Sep 17, 2000
1,596
258
126
Also take note that those are not real i7, i5 and i3 chips. They used an i7 Extreme Edition (with 15MB of L3 cache) and then disabled cores and/or HT on it.
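
If you want to sanity-check what a "simulated" i3/i5 configuration actually ended up with, something like this quick sketch (psutil assumed installed, not part of the review's methodology) reports physical cores vs logical threads, i.e. whether HT is really off:

import psutil

physical = psutil.cpu_count(logical=False)
logical = psutil.cpu_count(logical=True)
print(f"{physical} physical cores, {logical} logical threads "
      f"(HT {'on' if logical > physical else 'off'})")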