[Tom's Hardware] CPU bottlenecking/frame latency benchmarks

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
Link: http://www.tomshardware.com/reviews/gaming-processor-frame-rate-performance,3427.html

Interesting article over at Tom's on CPU bottlenecking, testing Intel dual and quad cores, and AMD 3, 4, 6, and 8 cores.

Tom's has adopted a frame latency benchmarking technique similar to one pioneered by TechReport (Edit: but not the same - it's testing consistency in frame time for consecutive frames - thanks to ThePeasant for noting that). And the big surprise - AMD CPUs are beating Intel pretty badly in regard to frame consistency, even while losing in frames per second, as shown below:

[Image: average.png]


The conclusion is that the i3-2120 is faster than AMD's best chip, the FX8350, but the 8350 beats it in all the frame latency testing. So which is the real winner? Note that the frame latency testing can't be summarized in one nice graph, so you'll have to look at the article to see it game by game, but here's an example:

[Image: farcry3.png]


[Image: farcry3cfl.png]



Looks like this could lead to some discussion...proceed, ladies and gentlemen.
 

inf64

Diamond Member
Mar 11, 2011
3,884
4,691
136
Oops :D. Suddenly the "somewhat lower fps but smoother gameplay" argument many AMD users made in the past doesn't look so ridiculous any more? ;)
 

Ed1

Senior member
Jan 8, 2001
453
18
81
Termie said:

Link: http://www.tomshardware.com/reviews/gaming-processor-frame-rate-performance,3427.html

Interesting article over at Tom's on CPU bottlenecking, testing Intel dual and quad cores, and AMD 3, 4, 6, and 8 cores.

Tom's has adopted the frame latency benchmarking technique pioneered by TechReport - looks like this may be a real sea change in the way game benchmarking will be done in the future, even when testing CPUs. And the big surprise - AMD CPUs are beating Intel pretty badly in regard to frame latency, even while losing in frames per second, as shown below:

[Image: average.png]


The conclusion is that the i3-2120 is faster than AMD's best chip, the FX8350, but the 8350 beats it in all the frame latency testing. So which is the real winner?

Looks like this could lead to some discussion...proceed, ladies and gentlemen.

Well, you're going to get slanted results since there's a $200 cap, so no 3570K or 3770s. Most of the AMD chips are clocked faster within a $200 ceiling.
I would have liked to see other Intel chips too, just to see the scaling.

Also, the results change drastically from Metro to Far Cry 3, but overall the fast chips still show low latency.
 

Ed1

Senior member
Jan 8, 2001
453
18
81
inf64 said:

Oops :D. Suddenly the "somewhat lower fps but smoother gameplay" argument many AMD users made in the past doesn't look so ridiculous any more? ;)

But both/all systems ran a GTX 680.

I know I have seen reports of smoother gameplay with Nvidia versus AMD video cards.
I forget which site it was that ran slow-motion playback showing slight stuttering with AMD compared to Nvidia.
 

inf64

Diamond Member
Mar 11, 2011
3,884
4,691
136
It was on TR, and it was purely an AMD (GPU) driver issue. AMD has fixed their Catalysts with the latest beta, and frame latency is now comparable to NV cards.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Don't get all giddy here. The differences are in the low single digits of milliseconds. It would be quite a feat if someone could actually perceive the difference between results that are so close.

Also, with differences this small, I would have liked them to run the tests at least twice to confirm the values. Some fluctuation is normal.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Hmm in that chart graphic posted here, how could it be possible that some results show that the "minimum" FPS for a CPU is *greater than* the average?

If something is called "minimum", shouldn't it be *less than or equal to* the average, not greater as shown?
 

sefsefsefsef

Senior member
Jun 21, 2007
218
1
71
KingFatty said:

Hmm in that chart graphic posted here, how could it be possible that some results show that the "minimum" FPS for a CPU is *greater than* the average?

If something is called "minimum", shouldn't it be *less than or equal to* the average, not greater as shown?

That's the % difference of that CPU compared to the Pentium. Since the processors all have different ratios of min:avg framerates, it's not surprising that some CPUs are "more better" in the min framerate department than the average framerate department, as compared to the Pentium.
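A quick worked example with made-up numbers (Python, purely hypothetical values, nothing from the article) shows how that happens:

```python
# Made-up numbers to show how a "minimum" bar can sit above the "average" bar when
# both are plotted as percentage gains over the baseline Pentium (hypothetical values).
pentium = {"min_fps": 30.0, "avg_fps": 60.0}   # hypothetical baseline chip
fx      = {"min_fps": 45.0, "avg_fps": 72.0}   # hypothetical faster chip

min_gain = (fx["min_fps"] / pentium["min_fps"] - 1) * 100   # +50%
avg_gain = (fx["avg_fps"] / pentium["avg_fps"] - 1) * 100   # +20%

print(f"min FPS gain vs Pentium: {min_gain:.0f}%, avg FPS gain vs Pentium: {avg_gain:.0f}%")
# The chart plots these relative gains, so the "minimum" bar (+50%) ends up above the
# "average" bar (+20%) even though 45 fps is, of course, still below 72 fps in absolute terms.
```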
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Termie said:

The conclusion is that the i3-2120 is faster than AMD's best chip, the FX8350, but the 8350 beats it in all the frame latency testing. So which is the real winner?

It depends on the actual data, which isn't in your graph, but it's possible that the 8350 is preferable to the i3-2120. Raw FPS isn't everything in gaming: if frames are delivered sporadically, it looks like far fewer FPS than it really is.

edit: I looked at the article. Tom's has a ways to go on providing meaningful data in that realm. You can't separate out the differences without providing the base amounts and still get meaningful data. We'd have to go back in and add the total frame time (not just the variation they report) to get anything useful out of it. I'm not going to try to say one way or the other based on the way they present the data.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
I don't have any benchmarks to back it up, but my Phenom II 960T @ 4.0GHz was noticeably faster than my i3-2100 in games like GTA 4, Skyrim, and FC3. Not by much though - 5-10 FPS at most. I do realize that all of the games I play take advantage of 4 cores.
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
Agreed, I would like to see more CPUs tested, and hopefully more are coming. It would also be interesting to see what the results look like with a Radeon...

As someone with a Phenom II X6, it doesn't look like I'll be upgrading for a while (well, for gaming at least). Though to be fair, I only have a 5670 so...
 

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
I'm trying, but having a hard time seeing the great win/vindication for AMD here. The differences don't seem that significant except for the very worst cases.

Also, wouldn't you expect more consistent performance from a slower chip than a faster one anyway? It would seem to be easier to have lower spreads at a lower overall average than a higher one.
 

ThePeasant

Member
May 20, 2011
36
0
0
The latency graphs depict the average, 75th, and 95th percentile differences in consecutive frame times. This is more a measure of consistency than of the absolute values of the frame times, which is what TechReport uses. I think it is a poorer metric of the experience.
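For anyone who wants to play with their own frame time logs, here's a rough Python sketch of what the two approaches compute, run against made-up numbers (my own interpretation, not either site's actual code):

```python
# My interpretation of the two approaches, not Tom's or TechReport's actual code.
# frame_times are per-frame render times in milliseconds (e.g. from a FRAPS frametimes log).
import numpy as np

frame_times = np.array([16.5, 17.1, 16.8, 33.0, 16.9, 17.0, 16.6, 55.3, 16.7, 16.9])  # made-up data

# TechReport-style: look at the distribution of the frame times themselves
tr_99th = np.percentile(frame_times, 99)                        # 99th percentile frame time
tr_beyond_50 = (frame_times[frame_times > 50.0] - 50.0).sum()   # roughly, "time spent beyond 50 ms"

# Tom's-style (per this article): look at how much consecutive frames differ from each other
deltas = np.abs(np.diff(frame_times))      # consecutive frame time differences
th_avg, th_75, th_95 = deltas.mean(), np.percentile(deltas, 75), np.percentile(deltas, 95)

print(f"TR-style: 99th pct frame time {tr_99th:.1f} ms, time beyond 50 ms {tr_beyond_50:.1f} ms")
print(f"TH-style: consecutive-delta avg {th_avg:.1f} / 75th {th_75:.1f} / 95th {th_95:.1f} ms")
```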
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
In Far Cry 3 the A4 looks like one of the best; in Skyrim, the worst. It's the other way around for the Pentiums.

[Image: Skyrim-CFL.png]


It also calls the testing method into question.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I remember reading on these forums a while ago, maybe a few months back, that someone mentioned multi-card setups always felt smoother on AMD platforms (aside from the GPUs). It could certainly have been a placebo effect, or maybe there is a bit of truth to it - who knows.

But what I can tell you is that while I understand the point of low-res benchmarks when testing a CPU, I do wish there were more whole-platform tests alongside the low-res tests when it comes to gaming. Testing the games as I would play them may not isolate the CPU, but it would provide numbers to compare based on how I would actually use the two platforms - after all, none of us game on just a GPU and CPU. Maybe having the PCIe lanes on the northbridge has some effect that helps here?
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
Correct me if I'm wrong, but the 3550 and 2500 both run at 3.3GHz (with turbo as high as 3.7), as do the 3220 and 2120.
It's interesting that the difference from Sandy Bridge to Ivy Bridge is much bigger for the i5s; perhaps Ivy Bridge runs at a higher turbo clock most of the time?
 

Puppies04

Diamond Member
Apr 25, 2011
5,909
17
76
Termie said:

The conclusion is that the i3-2120 is faster than AMD's best chip, the FX8350, but the 8350 beats it in all the frame latency testing. So which is the real winner?

Sorry, but I am a bit confused - why are you comparing the 2120 to the 8350? Are they the same price? Have Intel released a successor to that chip? Can you not pick up an Intel quad for the same as or less than an 8350?

Looks like this could lead to some discussion, proceed ladies and gentlemen (or we could just read the conclusion from the article).

"Intel still holds the aces. For your dollar, the Core i5 has no competition above $160. At $130, the Core i3-3220 is tough to beat. It no longer humiliates the FX line-up in games thanks to AMD's most recent architectural update, but it's still cheaper, faster, and more power-friendly than most of the Vishera-based models."
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
There is a problem with this statistical approach that makes it flawed: it doesn't show what Tom's wants it to show. For example, take CPU1 and CPU2, where CPU1 is twice as fast as CPU2 when the game is CPU limited. When the game becomes GPU limited, the frame rate of both systems drops to the same point, so CPU1 jumps much further than CPU2 and thus shows more inconsistency. That does not mean it isn't smooth; in fact, most of the time it was better, because the thing being measured is not the interframe stuttering.

I tried a lot of approaches to find a way to average or signal-process the data, but there isn't one; they all lose critical information. The frame time graph and its difference (the interframe difference graph) remain the best way to look at this to determine stutter and microstutter events. If you want to process it a bit more for comparison, you can sort the interframe graph and take the absolute values; that gives a very good idea of how much microstutter and stutter there is.

Tom's hasn't produced a single graph that provides the resolution necessary to determine stutter and microstutter; I think it's clear from this review that they don't know what they are doing. They are still doing FPS and combining it with another chart that is fatally flawed and doesn't summarize what they think it does. I'll be the first to jump on perceivable stutter as an issue, but this approach isn't good enough to determine it. (Not that Anandtech is any better - they use average FPS across an entire run. Both of these historically good review sites are really poor places to go for graphics reviews today.)
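Something along these lines, assuming a plain list of frame times in milliseconds (my own Python sketch of the idea, not anyone's published tool, and "frametimes.csv" is a hypothetical file name):

```python
# Rough sketch of the sorted absolute interframe-difference idea described above.
import numpy as np

# "frametimes.csv" is a hypothetical file with one frame time in ms per line (e.g. a FRAPS export).
frame_times = np.loadtxt("frametimes.csv")

interframe = np.diff(frame_times)          # frame-to-frame change in render time
sorted_abs = np.sort(np.abs(interframe))   # take absolute values, then sort ascending

# A smooth run stays near zero almost all the way along this curve; a stuttery run
# ramps up well before the end. The tail is where stutter and microstutter live.
worst_10_percent = sorted_abs[int(0.9 * len(sorted_abs)):]
print("worst 10% of interframe jumps (ms):", np.round(worst_10_percent, 1))
```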
 

sefsefsefsef

Senior member
Jun 21, 2007
218
1
71
I actually think this methodology is superior to TR's. If you want the average frame latency, that's just the inverse of the average frame rate, so nothing is lost there. Showing the stats for the deltas between successive frames does more to communicate "jerkiness" than TR's methodology, which may still give a bad score to a very smooth experience.

For example, many low-latency frames followed by a steady increase to higher-latency frames, and then many high-latency frames, will get a bad score from TR even though the experience will be smooth. TH would give this scenario a good score, because the frame latency would change gradually, which is the very definition of smoothness.
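A toy Python example with made-up numbers (nothing to do with the article's data) shows how the two scores can point in opposite directions:

```python
# Toy illustration of the ramp argument above; all numbers are made up.
import numpy as np

ramp   = np.linspace(10.0, 30.0, 200)        # frame times climb smoothly from 10 ms to 30 ms
jitter = np.tile([15.0, 25.0], 100)          # frame times alternate 15 ms / 25 ms every frame

for name, ft in [("smooth ramp", ramp), ("alternating jitter", jitter)]:
    deltas = np.abs(np.diff(ft))
    print(f"{name:18s} 95th pct frame time: {np.percentile(ft, 95):5.1f} ms | "
          f"95th pct consecutive delta: {np.percentile(deltas, 95):5.2f} ms")

# The ramp looks worse on the absolute frame times (TR-style) but near-perfect on the
# consecutive deltas (TH-style); the alternating case is the opposite.
```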
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
inf64 said:

Oops :D. Suddenly the "somewhat lower fps but smoother gameplay" argument many AMD users made in the past doesn't look so ridiculous any more? ;)

Is there a way to limit frame rate of a game to see if you get a smoother experience? I think Borderlands 2 had something like that.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
TuxDave said:

Is there a way to limit frame rate of a game to see if you get a smoother experience? I think Borderlands 2 had something like that.

MSI Afterburner, dxtory (the one I'm always using), and some other software can do that,
and it definitely can make games a lot smoother and more enjoyable...
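Conceptually, a limiter just refuses to start the next frame before its time slot comes up. A bare-bones Python pseudo-game-loop sketch of the idea (not how Afterburner or dxtory actually implement it):

```python
# Bare-bones sketch of what a frame-rate limiter does conceptually
# (not how Afterburner or dxtory actually implement it).
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS              # ~16.7 ms per frame

def render_frame():
    pass                                     # stand-in for the game's real per-frame work

next_deadline = time.perf_counter()
for _ in range(300):                         # a few seconds' worth of frames
    render_frame()
    next_deadline += FRAME_BUDGET
    spare = next_deadline - time.perf_counter()
    if spare > 0:
        time.sleep(spare)                    # idle away the leftover budget so frames pace evenly
    else:
        next_deadline = time.perf_counter()  # fell behind; reset the deadline so we don't spiral
```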
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
How can a dual core beat a 6-8 core chip in games that use 4+ threads?
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
SlowSpyder said:

But what I can tell you is that while I understand the point of low-res benchmarks when testing a CPU, I do wish there were more whole-platform tests alongside the low-res tests when it comes to gaming. Testing the games as I would play them may not isolate the CPU, but it would provide numbers to compare based on how I would actually use the two platforms - after all, none of us game on just a GPU and CPU. Maybe having the PCIe lanes on the northbridge has some effect that helps here?

Just ran FC3 with an FX8350 vs. a Core i7 3770K, both with an HD7950 @ 1GHz (Catalyst 13.1), on Win 8 64-bit.

Game settings
[Image: fc3settings.jpg]

[Image: fc3settings2.jpg]

[Image: fc3optimal1080phd79501g.jpg]


FX8350
Min: 31 fps
Avg: 44.17 fps

Core i7 3770K
Min: 36 fps
Avg: 44.91 fps
 
Aug 11, 2008
10,451
642
126
ShintaiDK said:

In Far Cry 3 the A4 looks like one of the best; in Skyrim, the worst. It's the other way around for the Pentiums.

[Image: Skyrim-CFL.png]

It also calls the testing method into question.

The whole thing just seems "off" somehow. According to this, an A4 is better than either the i5-2500K or the FX-8350. Even the old Phenoms are better than either of the newer processors. It's an interesting metric, but it just doesn't add up. I can't remember where, but I saw similar data previously, and Intel was ahead - especially the 3570K, which was quite a bit better than the 2500K.