Hi All!
XBitLabs have recently published an article on PileDriver CPUs:
http://www.xbitlabs.com/articles/cpu/display/fx-8350-8320-6300-4300_6.html#sect0
But I found it interesting from a different point of view: their test bench uses a single GTX 680 as the GPU, a powerful card to say the least, and yet all their gaming benchmark graphs show a massive GPU bottleneck! How can this be??
(I'm referring to the UHQ parts of each graph - not the 1280x800 benches.)
Examine graph 1: Batman: AC - a TWIMTBP game - the flagship single GPU can't handle 8xAA @ 1080p? CPU power keeps rising, but performance stays roughly the same.
Moar coars? Batman don't care. But that's the game's fault, not the GPU's.
Moving on to graph 2: Borderlands 2 - a Pentium G2120 scores the same as an FX-4300 PD? And within 10% of the top PD, the FX-8350?? Really?? And that's even without conventional AA - just FXAA, and probably with PhysX on maximum.
Graph 3: Crysis 2 - another TWIMTBP game - seriously??? 1920x1080 and 1280x800 score the same across all CPUs?? Is the game artificially capped at 60 FPS?
Graph 4: DiRT Showdown - 8xAA again. The CPUs keep getting faster, yet the FPS stay the same.
Graph 5: Far Cry 2 - same deal as Crysis 2, but at least it shows logical scaling with processing power.
Graph 6: Metro 2033 - I honestly don't know what to say at this point...
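For what it's worth, here's the sanity check I'm mentally running on each graph, as a minimal Python sketch. The FPS numbers below are hypothetical placeholders (not XBitLabs data), and the 5% tolerance is an arbitrary assumption - but the logic is the same reasoning as above: flat at both resolutions suggests a cap or engine limit, scaling at low res but flat at high res suggests the GPU is the wall.

```python
# Heuristic: classify a benchmark from per-CPU FPS at two resolutions.
# All FPS numbers here are HYPOTHETICAL, chosen only to illustrate the logic.

def classify(fps_low_res, fps_high_res, tolerance=0.05):
    """Take two {cpu: fps} dicts (low res, high res) and guess the bottleneck."""
    lo = list(fps_low_res.values())
    hi = list(fps_high_res.values())

    def flat(values):
        # "Flat" = spread across CPUs is within tolerance of the best result.
        return (max(values) - min(values)) / max(values) <= tolerance

    # Same FPS at both resolutions AND across all CPUs -> likely a frame cap.
    if flat(lo) and flat(hi) and abs(max(lo) - max(hi)) / max(lo) <= tolerance:
        return "capped (or engine-limited)"
    # Scales with CPU at low res but flat at high res -> GPU-bound at UHQ.
    if not flat(lo) and flat(hi):
        return "gpu-bound at high res"
    if not flat(hi):
        return "cpu-bound"
    return "unclear"

# Hypothetical numbers shaped like the Crysis 2 graph (identical everywhere):
crysis_like_low  = {"G2120": 60, "FX-4300": 60, "FX-8350": 60}
crysis_like_high = {"G2120": 60, "FX-4300": 60, "FX-8350": 60}
print(classify(crysis_like_low, crysis_like_high))  # capped (or engine-limited)

# Hypothetical numbers shaped like the Batman: AC graph:
batman_like_low  = {"G2120": 40, "FX-4300": 55, "FX-8350": 70}
batman_like_high = {"G2120": 58, "FX-4300": 60, "FX-8350": 60}
print(classify(batman_like_low, batman_like_high))  # gpu-bound at high res
```

By this reading, most of the graphs in the article land in the "gpu-bound at high res" or "capped" buckets, which is exactly what surprises me given the GTX 680.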
I'm completely stumped. Can anyone offer an explanation for these results?
I generally hold XBitLabs to be a reputable site. Could it be that they completely messed up?
