
Thief 4 CPU benchmarks

which is why going from 4 threads to 8 threads gains you nothing . . . wait, what?

I meant it more that quads are now well utilised: 4 cores are 4 cores, and (so far) are more than enough to push past 60 FPS, i.e. 4 cores are not a bottleneck (so far). As for 8 cores, well, we shall see with Watch Dogs and The Witcher 3.
 
surprise surprise FX is actually excellent for once

Not surprising for me. I've been telling the Intel supporters for quite some time now that FX is good for games and is going to get better with the improved MT coding game devs are implementing.
 
Not surprising for me. I've been telling the Intel supporters for quite some time now that FX is good for games and is going to get better with the improved MT coding game devs are implementing.

This is just begging to start a flame war. Let's not go there.


On a different note, I'm not entirely sure what is going on there because of the German, which I don't speak. I can't find any English reviews yet. Anyone find some?
 
the consoles have the game locked at 30fps max (900p for xbox one, 1080p for ps4), and I've seen people complaining of poor performance while playing on both!

anyway, I guess Mantle is needed to save the Core 2 Quad!?

pcgameshardware seem to have picked a testing scenario with lower CPU dependency (70 fps minimum for all!?) than these guys

[image: gamegpu.ru Thief CPU benchmark chart (test-proz.jpg)]


[image: gamegpu.ru Thief AMD CPU benchmark chart (test-amd.jpg)]


http://gamegpu.ru/action-/-fps-/-tps/thief-test-gpu.html
 
Hyper-threading is worse than useless yet again...

Although I'm always skeptical of that Russian site and how fast they bench all those different parts.
 
I have a hard time seeing Mantle helping much in this kind of game.
I can see it helping in games where the object count in the FOV can be very high.

Haven't played this version, but I can't see it doing much here.

Interesting results on the CPU % load graphs. On Intel it seems the game can handle up to 8 cores and then falls off.
The AMD side has issues with its modules; the 8350 struggles even though it has 8 too.
 
the consoles have the game locked at 30fps max (900p for xbox one, 1080p for ps4), and I've seen people complaining of poor performance while playing on both!

anyway, I guess Mantle is needed to save the Core 2 Quad!?

pcgameshardware seem to have picked a testing scenario with lower CPU dependency (70 fps minimum for all!?) than these guys

[image: gamegpu.ru Thief CPU benchmark chart (test-proz.jpg)]


[image: gamegpu.ru Thief AMD CPU benchmark chart (test-amd.jpg)]


http://gamegpu.ru/action-/-fps-/-tps/thief-test-gpu.html

Strange, very, very different results from the other test. These results actually make more sense to me, because it is an Unreal-based engine, and Intel has historically done better in those games. I wish Anand would do a test.

Also the system requirements are very weird, based on these results. The game seems way less CPU-dependent than the system requirements would suggest. It also seems a strange game for Mantle support, since it doesn't seem to require much CPU power, especially on Intel.
 
Even an i3 2100 is enough and performs better than any of the consoles. Console ports are pointless in benchmarks.

And it's a terrible game. I really regret buying this as a preorder. It's simply another flop.

And again we see the PC easily outperform consoles with the same GPU.
 
Not much credibility in the benches if the 2500K outperforms the 2600K. And the other one (pcgameshardware) seems to think CPU performance @ 1280x720 is relevant to its readers in 2014.
 
This is just begging to start a flame war. Let's not go there.


On a different note, I'm not entirely sure what is going on there because of the German, which I don't speak. I can't find any English reviews yet. Anyone find some?

You could Google it, then run the link through Translate?

http://translate.google.co.uk/trans...ler-auf-Diebestour-1110793/&biw=2048&bih=1090

Not surprising for me. I've been telling the Intel supporters for quite some time now that FX is good for games and is going to get better with the improved MT coding game devs are implementing.

By the time this happens, the i5 3570K I've had for almost 2 years will need replacing, and then I'll get whatever is best at that point 🙂
 
Not much credibility in the benches if the 2500K outperforms the 2600K. And the other one (pcgameshardware) seems to think CPU performance @ 1280x720 is relevant to its readers in 2014.

It's actually too close; it could be called the same performance, within a margin of error. Or maybe HT is having a small negative impact on both quad cores with HT tested. Anyway, it looks like the game loves 4 cores and doesn't need more than 4c/4t. Same for AMD: look at the 4300, which even with a lower clock and half the L3 is not far behind the 8350.
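For what it's worth, here's a quick sketch of the sanity check I mean; the FPS numbers and the 3% margin are made up for illustration, not taken from either review:

```python
# Hypothetical sanity check: are two benchmark results close enough to
# call "the same performance, within a margin of error"?

def within_margin(fps_a: float, fps_b: float, margin_pct: float = 3.0) -> bool:
    """True if the two FPS results differ by less than margin_pct percent."""
    diff_pct = abs(fps_a - fps_b) / min(fps_a, fps_b) * 100
    return diff_pct < margin_pct

# Illustrative numbers only (not from gamegpu or pcgameshardware):
print(within_margin(96.0, 94.0))  # ~2.1% apart -> True, call it a tie
print(within_margin(96.0, 80.0))  # 20% apart -> False, a real gap
```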

I don't have a problem with 720p. My problem with the pcgameshardware test is that it looks like they made their tests with the built-in benchmark, or a poorly selected location, since all the CPUs (apart from the C2Q) are running at over 70 fps minimum. That makes the gamegpu test look a little more relevant, but who knows. I would love to see the big, more credible sites like Anandtech testing games, I mean really testing: putting in the effort to play the entire game, find the good spots for testing, and then going across multiple CPUs. Unfortunately, most websites with the hardware in hand are not interested in this kind of stuff.

I'm also looking forward to the console version analysis from Digital Foundry, as always.
 
Not much credibility in the benches if the 2500K outperforms the 2600K. And the other one (pcgameshardware) seems to think CPU performance @ 1280x720 is relevant to its readers in 2014.

The game seems to be terrible at handling HT. That's why you see the 2500K and 4670K outperform the i7s. But it's a tiny variance overall.
 
The in-game benchmark appears to be nothing like the actual gameplay, according to the Hardforum review. Fairly useless.
 
One of the major reasons the C2Qs are so far behind is their extremely low stock clock speed. They usually get anywhere from a 20-50% fps boost when overclocked. But I agree, they are not really considered gaming CPUs anymore, since any i3 is very likely to outperform them in most gaming benchmarks.
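To put the overclocking point in rough numbers (a back-of-envelope sketch; the base FPS and clocks are hypothetical, and linear scaling is a best-case assumption that real games rarely reach):

```python
# Best-case estimate: if a game is fully CPU-bound, FPS scales at most
# linearly with clock speed. Real gains are usually lower.

def oc_fps_estimate(base_fps: float, stock_ghz: float, oc_ghz: float) -> float:
    """Upper-bound FPS estimate assuming perfectly linear CPU-bound scaling."""
    return base_fps * (oc_ghz / stock_ghz)

# Hypothetical C2Q: 40 fps at 2.4 GHz stock, overclocked to 3.4 GHz
print(round(oc_fps_estimate(40.0, 2.4, 3.4), 1))  # 56.7, i.e. about +42%
```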
 
And once again, gamegpu.ru tested CPU scaling without AA filters. That is nice, but it doesn't represent how anyone with a high-end GPU like the GTX 780 Ti would play the game.

FXAA and SSAA are disabled.

[image: gamegpu.ru Thief 1920 settings/CPU chart (test-1920_nastr_proz.jpg)]
 
You would need a comparison with a dual-core chip to make that statement.

Perhaps a C2D is just as fast as a C2Q in Thief 4.

I still have a couple of C2D processors and a C2Q processor, and there are almost no newer, or even not-so-new, games that the C2Ds run just as well.


Pretty weak showing by AMD's top-end processor, being so far behind a two-generation-old i5...
 
One of the major reasons the C2Qs are so far behind is their extremely low stock clock speed. They usually get anywhere from a 20-50% fps boost when overclocked. But I agree, they are not really considered gaming CPUs anymore, since any i3 is very likely to outperform them in most gaming benchmarks.

The QX9650 has a high enough clock speed, 3 GHz; I wouldn't expect it to be so slow when all the others have a 70 fps minimum.

tomshardware compared 2.83 GHz and 3.4 GHz versions of the same chip against Ivy Bridge:

http://media.bestofmicro.com/I/G/395944/original/Combined-Average-Gaming-Performance.png

Even in the worst CPU-bound part (with more than 4 cores in use) of Crysis 3:
http://media.bestofmicro.com/E/5/381533/original/Crysis-3-Very-High-FPS.png

The difference looks abnormally higher to me in the Thief test from pcgameshardware.
Or maybe I'm not looking at it correctly.


And once again, gamegpu.ru tested CPU scaling without AA filters. That is nice, but it doesn't represent how anyone with a high-end GPU like the GTX 780 Ti would play the game.

FXAA and SSAA are disabled.

[image: gamegpu.ru Thief 1920 settings/CPU chart (test-1920_nastr_proz.jpg)]

FXAA, as far as I know, is not going to decrease performance significantly, so it wouldn't change much. The test was made at 1080p very high; the pcgameshardware test was at 720p, and with strange results, indicating a lower CPU-load test.

Anyway, the performance difference is quite clear and actually somewhat expected considering other games. It's a useful test: it shows the Intel CPUs (Sandy Bridge i5 and higher) are much more likely to achieve a full 60 FPS experience (given enough GPU power or well-balanced settings) than any AMD CPU.
 
And once again, gamegpu.ru tested CPU scaling without AA filters. That is nice, but it doesn't represent how anyone with a high-end GPU like the GTX 780 Ti would play the game.

FXAA and SSAA are disabled.

[image: gamegpu.ru Thief 1920 settings/CPU chart (test-1920_nastr_proz.jpg)]


Worthless canned benchmark.

Hardforum gives an actual in-game performance preview, with all settings turned up. The game plays on practically anything.
 
I have always wondered why a PC gamer would buy a high(er)-end CPU with a high(er)-end GPU and then play games like Thief/BF4/etc. without AA and turned-up settings. What is the point of buying such a graphics card in that case?
 