E5200 or E8400


ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: TekDemon
I wish people ran benchmarks like that, but the way cache affects performance is exactly like that: when the data it needs isn't in cache it takes a HUGE performance hit, and when it is, there isn't a hit.

The only article I've seen showing this was one by X-bit Labs, where they looked at the difference between overclocking a dual-core Celeron and buying top of the line. The overclocked Celeron has more raw computing power, but its frame rate in games is consistently lower than the top-end processor's.

http://xbitlabs.com/articles/c...y/celeron-e1200_8.html

Like you stated a few posts ago, the lack of cache cripples the hell out of the CPU when it comes to running the most popular game engines. For raw processing power, the E6750 is about 18% slower than an overclocked Celeron, but it still manages to bench faster in every game: 26% faster in Quake 4, 43% faster in Half-Life 2, 31% faster in Crysis, 40% faster in Unreal Tournament 3, 29% faster in World in Conflict.

HL2 and UT3 are the most telling. HL2 covers all of Valve's games, including TF2, L4D, and any other games they will make on that engine. UT3 is even bigger than that; here is a list of games that use the UT3 engine. All of those games will run like garbage if you don't have enough cache memory.
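
Here's a rough back-of-the-envelope sketch of why misses hurt so much; the latencies below are made-up round numbers for illustration, not measurements from any of these chips:

```python
# Toy model: average memory access time (AMAT) as a function of L2 hit rate.
# Every latency here is an illustrative round number, not a measurement of
# these specific CPUs.

def amat(l2_hit_rate, l2_latency_ns=5.0, ram_latency_ns=70.0):
    """Average access time when a request either hits L2 or falls through to RAM."""
    return l2_hit_rate * l2_latency_ns + (1.0 - l2_hit_rate) * ram_latency_ns

# A bigger L2 mostly shows up as a higher hit rate on a game's working set.
for hit_rate in (0.99, 0.95, 0.90):
    print(f"L2 hit rate {hit_rate:.0%}: ~{amat(hit_rate):.1f} ns per access")

# Prints roughly 5.7, 8.3, and 11.5 ns per access for the three hit rates.
# A few percent more misses roughly doubles the average memory latency, which
# is why cache-hungry engines punish the small-cache chips so hard.
```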
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: TekDemon
Originally posted by: cusideabelincoln
The E7200, on the average, performs about 8% faster than the E5200. In the game tests, it is on average about 15% faster.

Now let's take a look at the E7500 vs. the E8400, which have fairly close clock speeds:
http://www.anandtech.com/bench...3.44.45.46.47.48.49.50

The E8400 has 100% more L2 cache than the E7500 (plus a small clockspeed advantage), and yet on average it is only about 7% faster in all of AT's benchmarks. In the game benchmarks, it is about 12% faster.

So, the move from 2MB to 3MB of L2 cache yields greater improvements than the move from 3MB to 6MB for these stock processors.

Seriously why does everyone care about averages?!?
The worst case scenarios determine when your system is outdated, not the averages.
And the problem is that in VERY POPULAR game engines like UT3 and HL2 the penalty is huge.

Here's Anandtech's Left 4 Dead chart:
http://www.anandtech.com/bench/default.aspx?b=48
The E8200 gets 108FPS...the E5200 gets 77FPS.
And Left 4 Dead is THE MOST POPULAR GAME OUT RIGHT NOW.

They didn't benchmark a TON of other Unreal and HL2 based games either. It really gives the wrong impression about how much of a performance hit you'll suffer, because you're MUCH more likely to play good hit games based on popular rendering engines than just any benchmark game. So you need to weight those benchmarks much more heavily.

Go look at the xbit and polish benches I linked the post above yours...it's really ugly in a ton of engines where the difference is 30-40% average. And again, that's just AVERAGE, except I guaran-fricking-tee you that there'll be parts of the game where you'll be running 50%+ slower just like there's parts where you'll be running only 15% slower. Jeez.

I think you are over-rating the importance just as much as the other guy is under-rating it. I looked at the PCGH benchmarks, and even if you look at minimum framerates only there is not a 30-40% advantage in favor of the processors with 6MB of L2 cache over the 2, 3, or 4MB versions.

I also don't see any minimum framerate numbers in those polish benchmarks. Since you are claiming the difference is most noticeable at the worst case scenario, you better be finding some benchmarks which do have minimum framerate numbers to back up your claim. So far the only set I see is the PCGH one, and again the difference is not what you are claiming.
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Originally posted by: ShawnD1
Originally posted by: TekDemon
I wish people ran benchmarks like that, but the way cache affects performance is exactly like that-when the data it needs isn't in cache it gets a HUGE performance hit, and when it is there isn't a hit.

The only article I've seen showing this was an article by X-Bit where they looked at the difference between overclocking a dual core celeron and buying top of the line. The overclocked celeron has more raw computing power, but the frame rate in games is consistently lower than the top end processor.

http://xbitlabs.com/articles/c...y/celeron-e1200_8.html


Yeah I linked to this article in my long-ass post above:
http://www.xbitlabs.com/articl...pdc-e5200_6.html#sect0

It's seriously driving me crazy that people keep trying to convince other people that the E8400 doesn't offer a significant performance boost in THE PROGRAMS AND GAMES THAT PEOPLE RUN MOST OFTEN. Why would anybody care about the average performance hit over a bunch of apps and games they don't run, when it's taking a 30% hit in Adobe apps (because nobody ever uses PDFs or Photoshop?!?), iTunes, and a lot of other popular apps that weren't benchmarked? They just point to random synthetic benchmarks that only show a 10% hit, except I don't run mystery synthetic tests; I run iTunes and Acrobat and Photoshop.

And I play Left 4 Dead and other HL2 based games. Not "average game".
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Originally posted by: cusideabelincoln
Originally posted by: TekDemon
Originally posted by: cusideabelincoln
The E7200, on the average, performs about 8% faster than the E5200. In the game tests, it is on average about 15% faster.

Now let's take a look at the E7500 vs. the E8400, which have fairly close clock speeds:
http://www.anandtech.com/bench...3.44.45.46.47.48.49.50

The E8400 has 100% more L2 cache than the E7500 (plus a small clockspeed advantage), and yet on average it is only about 7% faster in all of AT's benchmarks. In the game benchmarks, it is about 12% faster.

So, the move from 2MB to 3MB of L2 cache yields greater improvements than the move from 3MB to 6MB for these stock processors.

Seriously why does everyone care about averages?!?
The worst case scenarios determine when your system is outdated, not the averages.
And the problem is that in VERY POPULAR game engines like UT3 and HL2 the penalty is huge.

Here's Anandtech's Left 4 Dead chart:
http://www.anandtech.com/bench/default.aspx?b=48
The E8200 gets 108FPS...the E5200 gets 77FPS.
And Left 4 Dead is THE MOST POPULAR GAME OUT RIGHT NOW.

They didn't benchmark a TON of other Unreal and HL2 based games either. It really gives the wrong impression about how much of a performance hit you'll suffer, because you're MUCH more likely to play good hit games based on popular rendering engines than just any benchmark game. So you need to weight those benchmarks much more heavily.

Go look at the xbit and polish benches I linked the post above yours...it's really ugly in a ton of engines where the difference is 30-40% average. And again, that's just AVERAGE, except I guaran-fricking-tee you that there'll be parts of the game where you'll be running 50%+ slower just like there's parts where you'll be running only 15% slower. Jeez.

I think you are over-rating the importance just as much as the other guy is under-rating it. I looked at the PCGH benchmarks, and even if you look at minimum framerates only there is not a 30-40% advantage in favor of the processors with 6MB of L2 cache over the 2, 3, or 4MB versions.

I also don't see any minimum framerate numbers in those polish benchmarks. Since you are claiming the difference is most noticeable at the worst case scenario, you better be finding some benchmarks which do have minimum framerate numbers to back up your claim. So far the only set I see is the PCGH one, and again the difference is not what you are claiming.

Dude, forget about the minimum FPS. I already said that's not the point since the minimum likely has nothing to do with the cache, or overflows both caches, or is GPU limited, or a billion other possibilities since it's just ONE second in the game. What's important is all the other slowdowns in the game.

That benchmark happened to have numbers all in the playable zone. But imagine a few years go by, and the minimum is 15FPS, and the average is 35FPS. Now do you see why it would matter that there are more low-FPS scenes on the slower CPU?

And I did prove that there are more low FPS scenes on the slower CPU. Go read the post again if you don't get it, because I'm seriously tired of people wanting to believe nonsense. You can't have a 1FPS difference in some frames (proven by the minimums) and have a 4FPS average difference without there being frames with much higher FPS differences. That's the point of having the minimum FPS there as a reference, just to show that the drop in FPS isn't a constant set percentage, it varies depending on the demand.
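
Here's a tiny made-up illustration of that point (every number below is invented, purely to show the arithmetic):

```python
# Two invented frame-rate traces whose minimums differ by 1 FPS and whose
# averages differ by 4 FPS -- which forces some individual scenes to differ
# by quite a bit more than 4 FPS.
from statistics import mean

more_cache = [30, 43, 50, 58, 47]   # min 30, avg 45.6
less_cache = [29, 40, 45, 54, 40]   # min 29, avg 41.6

deltas = [a - b for a, b in zip(more_cache, less_cache)]
print("per-scene FPS deltas:", deltas)    # [1, 3, 5, 4, 7]
print("average delta:", mean(deltas))     # 4
print("biggest delta:", max(deltas))      # 7
```

The minimum by itself tells you almost nothing about how the rest of the frames are distributed; the average difference has to come from somewhere.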

Plus the average fps is 30-40% slower anyway in a ton of very popular games so what's your point exactly? That saving $80 is worth having 30% worse FPS in Left 4 Dead, Unreal Tournament 3, STALKER, and every game based on the HL2 and Unreal engines? If that's seriously your argument then go buy all the 1MB cache CPUs to your heart's desire.

Telling someone building a gaming rig that it doesn't matter that a whole buncha popular games run 30% slower is patently ridiculous. It makes no sense to save 10% on your system's price to have many popular apps and games run 30% slower. It's not like it's just ONE popular game, it's a LOT of popular games. And a LOT of popular applications. If you just wanna cling to an average performance drop that's your choice, but you're giving lousy advice for someone building a gaming rig.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: TekDemon
And I play Left 4 Dead and other HL2 based games. Not "average game".

...

Source games are great, but there are more people playing other games than you care to acknowledge. "Average game" comparisons are completely valid for any gamer. Just because you don't play them doesn't mean others don't, or won't. You are purely recommending a product based on your own needs and not the needs of others.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: TekDemon
Originally posted by: cusideabelincoln
Originally posted by: TekDemon
Originally posted by: cusideabelincoln
The E7200, on the average, performs about 8% faster than the E5200. In the game tests, it is on average about 15% faster.

Now let's take a look at the E7500 vs. the E8400, which have fairly close clock speeds:
http://www.anandtech.com/bench...3.44.45.46.47.48.49.50

The E8400 has 100% more L2 cache than the E7500 (plus a small clockspeed advantage), and yet on average it is only about 7% faster in all of AT's benchmarks. In the game benchmarks, it is about 12% faster.

So, the move from 2MB to 3MB of L2 cache yields greater improvements than the move from 3MB to 6MB for these stock processors.

Seriously why does everyone care about averages?!?
The worst case scenarios determine when your system is outdated, not the averages.
And the problem is that in VERY POPULAR game engines like UT3 and HL2 the penalty is huge.

Here's Anandtech's Left 4 Dead chart:
http://www.anandtech.com/bench/default.aspx?b=48
The E8200 gets 108FPS...the E5200 gets 77FPS.
And Left 4 Dead is THE MOST POPULAR GAME OUT RIGHT NOW.

They didn't benchmark a TON of other Unreal and HL2 based games either. It really gives the wrong impression about how much of a performance hit you'll suffer, because you're MUCH more likely to play good hit games based on popular rendering engines than just any benchmark game. So you need to weight those benchmarks much more heavily.

Go look at the xbit and polish benches I linked the post above yours...it's really ugly in a ton of engines where the difference is 30-40% average. And again, that's just AVERAGE, except I guaran-fricking-tee you that there'll be parts of the game where you'll be running 50%+ slower just like there's parts where you'll be running only 15% slower. Jeez.

I think you are over-rating the importance just as much as the other guy is under-rating it. I looked at the PCGH benchmarks, and even if you look at minimum framerates only there is not a 30-40% advantage in favor of the processors with 6MB of L2 cache over the 2, 3, or 4MB versions.

I also don't see any minimum framerate numbers in those polish benchmarks. Since you are claiming the difference is most noticeable at the worst case scenario, you better be finding some benchmarks which do have minimum framerate numbers to back up your claim. So far the only set I see is the PCGH one, and again the difference is not what you are claiming.

Dude, forget about the minimum FPS. I already said that's not the point since the minimum likely has nothing to do with the cache, or overflows both caches, or is GPU limited, or a billion other possibilities since it's just ONE second in the game. What's important is all the other slowdowns in the game.

That benchmark happened to have numbers all in the playable zone. But imagine a few years go by, and the minimum is 15FPS, and the average is 35FPS. Now do you see why it would matter that there are more low-FPS scenes on the slower CPU?

And I did prove that there are more low FPS scenes on the slower CPU. Go read the post again if you don't get it, because I'm seriously tired of people wanting to believe nonsense. You can't have a 1FPS difference in some frames (proven by the minimums) and have a 4FPS average difference without there being frames with much higher FPS differences. That's the point of having the minimum FPS there as a reference, just to show that the drop in FPS isn't a constant set percentage, it varies depending on the demand.

Plus the average fps is 30-40% slower anyway in a ton of very popular games so what's your point exactly? That saving $80 is worth having 30% worse FPS in Left 4 Dead, Unreal Tournament 3, STALKER, and every game based on the HL2 and Unreal engines? If that's seriously your argument then go buy all the 1MB cache CPUs to your heart's desire.

Telling someone building a gaming rig that it doesn't matter that a whole buncha popular games run 30% slower is patently ridiculous. It makes no sense to save 10% on your system's price to have many popular apps and games run 30% slower. It's not like it's just ONE popular game, it's a LOT of popular games. And a LOT of popular applications. If you just wanna cling to an average performance drop that's your choice, but you're giving lousy advice for someone building a gaming rig.

I don't even think you are trying to understand what I'm saying.
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Originally posted by: cusideabelincoln
Originally posted by: TekDemon
And I play Left 4 Dead and other HL2 based games. Not "average game".

...

Source games are great, but there are more people playing other games than you care to acknowledge. "Average game" comparisons are completely valid for any gamer. Just because you don't play them doesn't mean others don't, or others will. You are purely recommending a product based off your own needs and not the needs of others.

No I'm recommending a product that'll play all games well, not just some.
You're the one recommending a lower-cache CPU that'll only fit your needs.
Most shooters run off a licensed engine, so your point is invalid when the HL2 and Unreal engines are used in a ton of FPS games, everything from Bioshock to Left4Dead.

If he buys an E8400 he'll be able to run every game well, so my suggestion does not actually prevent him from satisfying another need.

Here's the list of Unreal Engine games:
http://en.wikipedia.org/wiki/L...of_Unreal_Engine_games
Here's the list of Source engine games:
http://en.wikipedia.org/wiki/C...ry:Source_engine_games

Plus it's slower in other engines too. How exactly does buying an E8400 hurt his needs? He said he wanted to build a gaming rig...to play games.

So because I'm trying to recommend a CPU that'll play all those games anywhere from 10-40% faster, I'm apparently wrong. Wow. Don't become a teacher.
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Originally posted by: cusideabelincoln

I don't even think you are trying to understand what I'm saying.
No, I do: you're saying the minimum FPS is the worst-case scenario and the difference wasn't that great in those benchmarks.
My point is that one singular dip in FPS isn't the issue. All the other unacceptably low frame rates are, and there are more of them on the E5200. And I've shown mathematically, twice, that there are more near-minimum frame rates on the E5200. If you don't understand how averages work I can't help you.

If there were two CPUs and one had an 8FPS minimum and one had a 9FPS minimum, but one had a 31FPS average and one had a 26FPS average, the game would be rather unplayable on the one averaging 26FPS.
That's a scenario much like the one the benchmarks showed: both had similar minimum FPSes, but there was a larger gap in the averages, showing that more frames were being drawn closer to the minimum on the E5200.

Neither one would be playable in the particular part where it hit 9FPS, but for most of the game one system would be dropping under playable rates while the other one likely isn't.
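
To make that concrete, here's a quick hypothetical sketch; the two traces are invented, and only the 8 vs 9 FPS minimums and the roughly 31 vs 26 FPS averages come from the example above:

```python
# Hypothetical per-scene FPS traces matching the example above: 9 vs 8 FPS
# minimums and 31 vs 26 FPS averages. Everything else is made up purely to
# illustrate "time spent near the minimum", not taken from any benchmark.
from statistics import mean

cpu_a = [9, 28, 35, 40, 33, 38, 31, 34]    # min 9,  avg 31
cpu_b = [8, 21, 23, 24, 22, 30, 40, 40]    # min 8,  avg 26

PLAYABLE = 25  # arbitrary "feels playable" threshold for this sketch

for name, trace in (("cpu_a", cpu_a), ("cpu_b", cpu_b)):
    below = sum(1 for fps in trace if fps < PLAYABLE)
    print(f"{name}: min={min(trace)} avg={mean(trace)} "
          f"scenes under {PLAYABLE} FPS: {below} of {len(trace)}")

# The minimums differ by only 1 FPS, but cpu_b spends most of the run under
# the playable threshold while cpu_a dips below it once -- similar minimums,
# very different amounts of time spent near them.
```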

If you really still don't get why I addressed your point then just forget it and look at the fact that it's 30-40% slower in a lot of popular benchmarks and leave it at that. Gotta go back to Endocrinology so I can dispense advice on medicine instead of CPUs. Hahaha.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Learn to read? I never recommended anything. Do you even understand the point of my posts? If not... then let me break it down for you: I was making the argument that an increase in L2 cache size does provide noticeable improvements in performance at the same clock speed. Hence, if you had actually read my replies, you'd see why I compared the E5200 to the E7200 and the E7500 to the E8400. Since he is overclocking, performance per clock speed is important.
 

heyheybooboo

Diamond Member
Jun 29, 2007
6,278
0
0
Phenom II 720BE / Gigabyte 790GX combo deal: $239

Kingston HyperX 4GB (2 x 2GB) DDR2 1066: $57

Antec Case w/ 380w Power Supply: $60 - free shipping
With Promo Code EMCLNNT37

Pioneer 2MB Cache SATA 20X DVD±R DVD Burner: $23

Western Digital 640GB SATA Hard Drive: $65
with promo code EMCLNNT38

Acer 20" LCD Monitor: $150

- - - - - - >>> Around $600 so far

Throw in an HD4830 for $95 and the OS of your choice and off yah go ...

Overclocking a 'BE' microprocessor is as simple as increasing the CPU multiplier, and if you get lucky you may get a magic Phenom II 720 with an operable 4th core.
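
For anyone sanity-checking the "around $600" figure above, here's a quick tally of the listed prices (GPU, OS, and shipping not included):

```python
# Quick tally of the parts above, using the prices as listed.
parts = {
    "Phenom II 720BE + Gigabyte 790GX combo": 239,
    "Kingston HyperX 4GB DDR2 1066": 57,
    "Antec case w/ 380W power supply": 60,
    "Pioneer 20X DVD burner": 23,
    "WD 640GB hard drive": 65,
    'Acer 20" LCD monitor': 150,
}

subtotal = sum(parts.values())
print(f"Parts so far: ${subtotal}")              # $594 -- the "around $600" figure
print(f"With a $95 HD 4830: ${subtotal + 95}")   # $689, before the OS and shipping
```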
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Originally posted by: cusideabelincoln
Learn to read? I never recommended anything. Do you even understand the point of my posts? If not... then let me break it down for you: I was making the argument that an increase in L2 cache size does provide noticeable improvements in performance at the same clock speed. Hence, if you would have actually read my replies, why I compared the E5200 to the E7200 and the E7500 to the E8400. Since he is overclocking, performance per clock speed is important.

That's not what you said.
You said "I looked at the PCGH benchmarks, and even if you look at minimum framerates only there is not a 30-40% advantage in favor of the processors with 6MB of L2 cache over the 2, 3, or 4MB versions."

And I said that it's not just minimum frame rates, which I explained in detail in several posts by now.

Then you said:
"You are purely recommending a product based off your own needs and not the needs of others."

Which...again, I addressed in detail.

You're not even arguing the same point in each post. You might have it all in your head, but I can only see what you're posting, and what you're posting doesn't invalidate my advice at all.

Let me make it simple: What is your recommendation and why?
And explain why your recommendation addresses his needs better than mine, because you criticized me for supposedly only addressing my needs.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: TekDemon
Originally posted by: cusideabelincoln

I don't even think you are trying to understand what I'm saying.
No I do, you're saying the minimum FPS is the worst case scenario and the difference wasn't that great in those benchmarks.
My point is that one singular dip in FPS isn't the issue. All the other unacceptable low frame rates are, and there's more of them on the E5200. And I proved it mathematically twice that there are more near-minimum frame rates on the E5200. If you don't understand how averages work I can't help you.

If there were two CPUs and one had an 8FPS minimum and one had a 9FPS minimum, but one had a 31FPS average and one had a 26FPS average, the game would be rather unplayable on the 26FPS average one.
A scenario much like the one the benchmarks showed-both had similar minimum FPSes, but the average had a larger gap, showing that more frames were being drawn closer to minimum on the E5200.

Neither one would be playable in the particular part where it hit 9FPS, but for most of the game one system would be dropping under playable rates while the other one likely isn't.

If you really still don't get why I addressed your point then just forget it and look at the fact that it's 30-40% slower in a lot of popular benchmarks and leave it at that. Gotta go back to Endocrinology so I can dispense advice on medicine instead of CPUs. Hahaha.

Uh, that proves NOTHING! I think you need to understand how averages work as well. In the "proof" you provided, the lower average could easily be accounted for by a lower maximum framerate. Your hypothetical proofs mean nothing. To determine what you are saying, we would need to see a graph of the framerate over time, which is something you have not provided. Get some real-world data and then we can start talking, because you are making assumptions.
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Originally posted by: cusideabelincoln
Uh, that proves NOTHING! I think you need to understand how averages work as well. In the "proof" you provided, the low average could easily be accounted for a lower maximum value in framerate. Your hypothetical proofs mean nothing. To determine what you are saying, we would need to see the graph of the framerate over time, which is something you have not provided. Get some real world data and then we can start talking, because you are making assumptions.
I'm talking about FPS deltas being applied in game situations where the average is closer to the minimum acceptable rate.
Again, address my above post or stop spewing BS criticisms. You're completely ignoring the fact that it's 30-40% slower on average. Period. And pointing out stuff I don't have data for would be great if you had any data for the opposite instead of pointing out an extremely unlikely possibility to explain the FPS difference. Wow. And maybe all those FPS differences aren't because of cache but because of invisible leprechauns. You can't disprove that either, but I think we can both agree that's rather unlikely

Let me make it simple: What is your recommendation and why? And explain why your recommendation addresses his needs better than mine, because you criticized me for supposedly only addressing my needs.

You still have posted ZERO recommendations with an explanation of why it's better than the E8400. None. When you do that you can continue pointing out the flaws in my posts about why it's important that a gaming rig not run popular games 30-40% slower.

Quit trolling, seriously.

Edit: Apparently he's an AMD fanboy of some sort who doesn't think CPU performance matters to gamers. Wow.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: TekDemon
Originally posted by: cusideabelincoln
Uh, that proves NOTHING! I think you need to understand how averages work as well. In the "proof" you provided, the low average could easily be accounted for a lower maximum value in framerate. Your hypothetical proofs mean nothing. To determine what you are saying, we would need to see the graph of the framerate over time, which is something you have not provided. Get some real world data and then we can start talking, because you are making assumptions.
I'm talking about FPS deltas being applied in game situations where the average is closer to the minimum acceptable rate.
Again, address my above post or stop spewing BS criticisms. You're completely ignoring the fact that it's 30-40% slower on average. Period. And pointing out stuff I don't have data for would be great if you had any data for the opposite instead of pointing out an extremely unlikely possibility to explain the FPS difference. Wow. And maybe all those FPS differences aren't because of cache but because of invisible leprechauns. You can't disprove that either, but I think we can both agree that's rather unlikely

Let me make it simple: What is your recommendation and why? And explain why your recommendation addresses his needs better than mine, because you criticized me for supposedly only addressing my needs.

You still have posted ZERO recommendations with an explanation of why it's better than the E8400. None. When you do that you can continue pointing out the flaws in my posts about why it's important that a gaming rig not run popular games 30-40% slower.

Quit trolling, seriously.

Your logic is, truthfully, fucked up.

And your point about whatever being 30-40% slower is... moot. Why? BECAUSE HE WILL BE OVERCLOCKING. To compare the stock E5200 to the stock E8400 is... retarded. We have to look at performance per clock speed as well as the performance per price.

My advice? I didn't come into this thread to give my advice, but I'll give it now that you're badgering me. It's simple: He should get the fastest processor he can afford. For the Intel route, if he can afford the E7400 then he should get it over the E5200. If he can afford the E8400 then he should get that. If he's stretching his dollar already, the E5200 will suffice.

Originally posted by: TekDemon
Oh I see now, you're an AMD fanboy, so in your world apparently slower CPUs don't matter to gamers. Well that explains everything.

LOL you know nothing. You make way too many assumptions and then try to pass them off as fact.
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Originally posted by: cusideabelincoln
Your logic is, truthfully, fucked up.

And your point about whatever being 30-40% slower is... moot. Why? BECAUSE HE WILL BE OVERCLOCKING. To compare the stock E5200 to the stock E8400 is... retarded. We have to look at performance per clock speed as well as the performance per price.

My advice? I didn't come into this thread to give my advice, but I'll give it now that you're badgering me. It's simple: He should get the fastest processor he can afford. For the Intel route, if he can afford the E7400 then he should get it over the E5200. If he can afford the E8400 then he should get that. If he's stretching his dollar already, the E5200 will suffice.


Are you retarded? It's 30-40% SLOWER AT THE SAME CLOCK. Those xbit labs benchmarks compare the E8200 to the E7300...they're CLOCKED AT THE SAME SPEED. And the E5200 is only 166Mhz slower in that chart.

Those polish benchmarks COMPARE ALL THREE CPUs AT THE SAME CLOCK SPEED.

Wow...your stupidity is actually greater than I thought. Wow.

Are you seriously that stupid? Seriously? If I was comparing stock speeds you'd see a much larger difference...I was comparing similar speeds to show clock for clock differences you'd see if overclocking.

Go learn to read before you post again because your inability to read benchmarks is unbelievable.

Of all the retarded advice I've seen given on any tech forum ever, this is the greatest failure. No wonder I couldn't understand your point, your point is based on being an illiterate idiot who can't read benchmarks. If my logic is fucked up then by comparison your logic is that of a mentally retarded toothpick. Not just the logic of something retarded mind you, the logic of a retarded inanimate object. It's that bad.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: TekDemon
Are you retarded? It's 30-40% SLOWER AT THE SAME CLOCK. Those xbit labs benchmarks compare the E8200 to the E7300...they're CLOCKED AT THE SAME SPEED. And the E5200 is only 166Mhz slower in that chart.

Those polish benchmarks COMPARE ALL THREE CPUs AT THE SAME CLOCK SPEED.

Wow...your stupidity is actually greater than I thought. Wow.

Are you seriously that stupid? Seriously? If I was comparing stock speeds you'd see a much larger difference...I was comparing similar speeds to show clock for clock differences you'd see if overclocking.

Go learn to read before you post again because your inability to read benchmarks is unbelievable.

Of all the retarded advice I've seen given on any tech forum ever, this is the greatest failure. No wonder I couldn't understand your point, your point is based on being an illiterate idiot who can't read benchmarks. If my logic is fucked up then by comparison your logic is that of a mentally retarded toothpick. Not just the logic of something retarded mind you, the logic of a retarded inanimate object. It's that bad.

[face_laugh_my_fucking_ass_off]

Are you talking about the following Polish benchmarks?
http://translate.google.com/tr...&tl=en&history_state0=

They are not run at the same clock speed.

And these Xbit benchmarks?
http://www.xbitlabs.com/articl...pdc-e5200_6.html#sect0

There is not a 30-40% performance advantage for the E8200 over the E7300. Where the fuck did you learn to do math? And did you even read my other posts? Here are the results:

Originally posted by: cusideabelincoln
I disagree. The extra 1MB is a 50% increase in L2 cache size, and the performance difference is pretty big and makes it a very, very good middle option.

Just look at the E5200 vs. the E7200, which run at as close of a clock speed as you'll probably find:
http://www.anandtech.com/bench...3.44.45.46.47.48.49.50

The E7200, on the average, performs about 8% faster than the E5200. In the game tests, it is on average about 15% faster.

Now let's take a look at the E7500 vs. the E8400, which have fairly close clock speeds:
http://www.anandtech.com/bench...3.44.45.46.47.48.49.50

The E8400 has 100% more L2 cache than the E7500 (plus a small clockspeed advantage), and yet on average it is only about 7% faster in all of AT's benchmarks. In the game benchmarks, it is about 12% faster.

So, the move from 2MB to 3MB of L2 cache yields greater improvements than the move from 3MB to 6MB for these stock processors.

Where is this magical 30% advantage you are claiming, due solely to L2 cache size?

You even posted these PCGH benchmarks, which are all run at the same clock speed:
http://www.pcgameshardware.com...viewed/Reviews/?page=4

And there is no 30% performance advantage for the 6MB chip.
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Originally posted by: cusideabelincoln
Originally posted by: TekDemon
Are you retarded? It's 30-40% SLOWER AT THE SAME CLOCK. Those xbit labs benchmarks compare the E8200 to the E7300...they're CLOCKED AT THE SAME SPEED. And the E5200 is only 166Mhz slower in that chart.

Those polish benchmarks COMPARE ALL THREE CPUs AT THE SAME CLOCK SPEED.

Wow...your stupidity is actually greater than I thought. Wow.

Are you seriously that stupid? Seriously? If I was comparing stock speeds you'd see a much larger difference...I was comparing similar speeds to show clock for clock differences you'd see if overclocking.

Go learn to read before you post again because your inability to read benchmarks is unbelievable.

Of all the retarded advice I've seen given on any tech forum ever, this is the greatest failure. No wonder I couldn't understand your point, your point is based on being an illiterate idiot who can't read benchmarks. If my logic is fucked up then by comparison your logic is that of a mentally retarded toothpick. Not just the logic of something retarded mind you, the logic of a retarded inanimate object. It's that bad.

[face_laugh_my_fucking_ass_off]

Are you talking about the following Polish benchmarks?
http://translate.google.com/tr...&tl=en&history_state0=

They are not run at the same clock speed.

And these Xbit benchmarks?
http://www.xbitlabs.com/articl...pdc-e5200_6.html#sect0

There is not a 30-40% performance advantage for the E8200 over the E7300. Where the fuck did you learn to do math? And did you even read my other posts? Here are the results:

Originally posted by: cusideabelincoln
I disagree. The extra 1MB is a 50% increase in L2 cache size, and the performance difference is pretty big and makes it a very, very good middle option.

Just look at the E5200 vs. the E7200, which run at as close of a clock speed as you'll probably find:
http://www.anandtech.com/bench...3.44.45.46.47.48.49.50

The E7200, on the average, performs about 8% faster than the E5200. In the game tests, it is on average about 15% faster.

Now let's take a look at the E7500 vs. the E8400, which have fairly close clock speeds:
http://www.anandtech.com/bench...3.44.45.46.47.48.49.50

The E8400 has 100% more L2 cache than the E7500 (plus a small clockspeed advantage), and yet on average it is only about 7% faster in all of AT's benchmarks. In the game benchmarks, it is about 12% faster.

So, the move from 2MB to 3MB of L2 cache yields greater improvements than the move from 3MB to 6MB for these stock processors.

Where is this magical 30% advantage you are claiming, due solely to L2 cache size?

You even posted these PCGH benchmarks, which are all run at the same clock speed:
http://www.pcgameshardware.com...viewed/Reviews/?page=4

And there is no 30% performance advantage for the 6MB chip.

You're right that the polish benchmarks aren't all the same clock speed, only 2 of the CPUs are. I confused them with the PCGH set from too many hours of arguing with your idiocy.

On the xbit benchmarks the 8200 gets 76FPS while the 5200 gets 56.
Now, the 8200 is clocked 6.4% faster, so I'll give you the full benefit of the doubt and correct the 56fps, giving us 59.5FPS (which is generous because in real life it doesn't move linearly since the CPU is bottlenecking).

Which still makes the 8200 28% faster. What's hard to understand?

Or the UT3 benchmark where the 8200 gets 91FPS vs 68FPS.

Those aren't even totally CPU limited benches

This graph where it's actually CPU limited since it's run at 1024x768 shows the real ugliness:
http://www.egielda.com.pl/imag...f18bea785b5fb22ff2.png

Or the anandtech benchmark for Left 4 Dead comparing the 8200 vs the 5200:
http://www.anandtech.com/bench/default.aspx?p=58&p2=66
What's 108 divided by 77? 108 divided by 81 (that's 77 * 1.064)?

How is that not 30%+? Are you also mathematically illiterate?

Do you fail to comprehend that when a CPU that's 166Mhz faster is getting 30 more FPS that it's far faster at the same clock speed? Or does the E5200 magically gain 25FPS when it's overclocked 6%?

Seriously stop smoking crack. I never said every benchmark, I said some of the most popular games. Which are indeed 30+% faster clock for clock. Correcting the FPS by multiplying by the clockspeed difference already gives a best case scenario correction since in reality the non-CPU performance factors don't change. In order for the average FPS to actually go up 6.4% every component has to go 6.4% faster. Even being generous you're looking at 28-35% gains...so how again is that not 30%?

Some of those benchmarks aren't even run at really CPU dependent resolutions, it'd be much worse if they were all 1024x768 benchmarks.

How hard is it to understand that it's 30% faster in a lot of gaming benchmarks clock for clock?
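
If you want the arithmetic spelled out: the FPS numbers below are the ones quoted above, the ~6.4% clock gap is the E8200's 2.66GHz vs the E5200's 2.5GHz (the 166MHz difference), and scaling the slower chip linearly with clock is the generous, best-case assumption.

```python
# Clock-for-clock correction as described above: credit the E5200 with the
# E8200's ~6.4% clock advantage, then see how much faster the E8200 still is.
CLOCK_RATIO = 2.66 / 2.50   # ~1.064

def remaining_gain(e8200_fps, e5200_fps):
    """E8200's remaining advantage after adjusting the E5200 for the clock gap."""
    adjusted = e5200_fps * CLOCK_RATIO
    return adjusted, (e8200_fps / adjusted - 1.0) * 100.0

benches = [
    ("X-bit benchmark", 76, 56),
    ("X-bit UT3 benchmark", 91, 68),
    ("AnandTech Left 4 Dead", 108, 77),
]

for name, e8200, e5200 in benches:
    adjusted, gain = remaining_gain(e8200, e5200)
    print(f"{name}: {e5200} FPS -> ~{adjusted:.0f} FPS adjusted; "
          f"E8200 still ~{gain:.0f}% faster")

# Roughly: 56 -> 60 (E8200 ~28% ahead), 68 -> 72 (~26%), 77 -> 82 (~32%),
# which is where the "about 30% clock for clock" figure comes from.
```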
 

lady513

Junior Member
Feb 25, 2009
6
0
0
WOW, checking back in on the comments. That is some argument there!
However, I learned a few things, so thanks!
I haven't had the time to play any games since DAOC a couple of years ago. The reason for building this new system is to play Warhammer with a friend (at least the trial period, to see if it's any good).
BUT since I am splurging on a new rig I wanna play Half-Life, Left 4 Dead, and any other games that catch my eye.
Now, what I can't stand is lag. That one instant when your screen freezes can be the difference between life and death.
Obviously my budget is low and I understand that I have to put up with some.
I hadn't considered the difference in cache before when putting together my build, and was almost convinced that an OCed E5200 was just about the same as an E8400.
Now that I know, an E8400 is definitely looking more appealing. Heading back to Newegg to put something else together.




 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: TekDemon
You're right that the polish benchmarks aren't all the same clock speed, only 2 of the CPUs are. I confused them with the PCGH set from too many hours of arguing with your idiocy.

On the xbit benchmarks the 8200 gets 76FPS while the 5200 gets 56.
Now, the 8200 is clocked 6.4% faster, so I'll give you the full benefit of the doubt and correct the 56fps, giving us 59.5FPS (which is generous because in real life it doesn't move linearly since the CPU is bottlenecking).

Which still makes the 8200 28% faster. What's hard to understand?

Or the UT3 benchmark where the 8200 gets 91FPS vs 68FPS.

Those aren't even totally CPU limited benches

This graph where it's actually CPU limited since it's run at 1024x768 shows the real ugliness:
http://www.egielda.com.pl/imag...f18bea785b5fb22ff2.png

Or the anandtech benchmark for Left 4 Dead comparing the 8200 vs the 5200:
http://www.anandtech.com/bench/default.aspx?p=58&p2=66
What's 108 divided by 77? 108 divided by 81 (that's 77 * 1.064)?

How is that not 30%+? Are you also mathematically illiterate?

Do you fail to comprehend that when a CPU that's 166Mhz faster is getting 30 more FPS that it's far faster at the same clock speed? Or does the E5200 magically gain 25FPS when it's overclocked 6%?

Seriously stop smoking crack. I never said every benchmark, I said some of the most popular games. Which are indeed 30+% faster clock for clock. Correcting the FPS by multiplying by the clockspeed difference already gives a best case scenario correction since in reality the non-CPU performance factors don't change. In order for the average FPS to actually go up 6.4% every component has to go 6.4% faster. Even being generous you're looking at 28-35% gains...so how again is that not 30%?

Some of those benchmarks aren't even run at really CPU dependent resolutions, it'd be much worse if they were all 1024x768 benchmarks.

How hard is it to understand that it's 30% faster in a lot of gaming benchmarks clock for clock?

Get your facts straight before you start calling people out. My argument was for processors with the same (or very, very, very, very similar) clock speeds. YOU DO NOT SEEM TO UNDERSTAND THAT, and you keep trying to compare chips at different clock speeds, cache sizes, and FSB speeds. You even said, and I fucking quote, "It's 30-40% SLOWER AT THE SAME CLOCK." And that is completely FALSE. Moron.

And what you are presenting is not real-world. I feel sorry for people who actually game at 1024x768. At least PCGH's and Anandtech's benchmarks are run at settings people would be using. I know, as most people do, that the Source engine is very CPU-dependent. But guess what? It's also very scalable. 77 fps in L4D is very, very playable, and for a budget build this framerate is acceptable.

Now I am done with this thread. You must like arguing and being over-zealous for no reason, because my initial reply was a debate for the usefulness of more L2 cache.
 

xCxStylex

Senior member
Apr 6, 2003
710
0
0
This is the most thorough and detailed opinion/advice I've read and agreed with in a long time. IMHO this opinion should be added to the "dual core vs quadcore" sticky :D

props to you!


Originally posted by: TekDemon
I strongly disagree with the stuff about quads, because in most daily use a higher-clocked dual core with more cache per core is going to feel snappier than a quad-core with cut-down cache that can't clock as high.

And I have both an E4300 and E8400 (well the xeon version), and even at the SAME clocks the E8400 feels much snappier in real-world usage. Yes, I know there were tweaks to wolfdale, but they're less significant than the cache bump, and I believe that even if I clocked the E4300 slightly higher it would still feel less snappy in daily use.

The benchmarks might show 20% or 15% or 10% or whatever difference, but that's averaging all the frames together. The truth is that some calls to the CPU will be MUCH faster, while some will run at the same speed (since it fits in the cache) and it AVERAGES out to 20%.

You have to realize that on the slow parts you'll be screaming murder because your computer will feel much slower than one with more cache. Looking at averages is silly because as long as it's "fast enough" it'll feel fine, it's the parts where it's not fast enough that you'll notice. The average hit in speed in an app might be 15%, but that means a lot of parts run at the same speed while some parts run like 40% slower while it has to hunt through RAM for the data because it's not in cache.

Judging how fast a CPU is based on the average FPS or average performance of an app on benchmarks is just not reflective of REAL WORLD performance, where the larger cache will FEEL much faster in use because it's not hitting slow points. If Unreal Tournament 3 runs 20% slower on average that means there's parts where it's going to be 40% slower because the CPU is trying to do a whole buncha stuff it can't fit in cache, whereas there'll also be parts where it's running at 95% or 100%. You might average it out to look at benchmarks, but when you're playing the game and it tanks to 5FPS because a buncha people just fired fancy guns at you, it's going to feel a hell of a lot more than 20% slower than the E8400.

BTW OP, if your budget is $800 you can EASILY afford the E8400 if you shop around. Video cards are very cheap now, RAM is very cheap now, go look at the hot deals forum, so as long as you don't want a particularly fancy monitor you should be able to build a good system.

And while the E7x00 series is cheaper, the jump from 2MB to 3MB still isn't as awesome as getting the full 6MB Intel figured was optimal. The whole reason they increased from 4MB to 6MB was because 4MB still hit slowdowns.

You don't have to believe me, if you have a friend with a 6MB cache system go use their system, and then go use a system with 2MB of cache. Run the apps you actually run, etc.

The way I look at the value here is that the 6MB cache CPU will make your system run about 15-20% faster. 20% of $800 is $160 man, and you're not saving $160 by buying the cheaper CPU, so it's really not worth it. This isn't even considering the fact that the slowest parts are going to be more than 20% slower and those are the parts that count (since the stuff that runs fast on both CPUs probably runs fast on almost any CPU anyway and wouldn't feel slow on any system).

I dunno what video card you were planning on using, but the 4850 or even 4870 can be had very cheaply nowadays-$125 after a rebate for the 4870 and cheaper for the 4850. Or you can go 4830 and overclock it, you can get one for like $80 at newegg ($90 if you're too lazy to do a rebate).

Unlike the GPU the CPU is used 100% of the time (ok well, the GPU is used 100% of the time but the 2D really doesn't matter), so you'll feel it being faster or slower 100% of the time. I'm not saying you should go buy the most ripped off CPU in the world, but as long as the performance change still makes sense in relationship to the cost of your system I think it's worth it.

The same logic is why quad-cores aren't always worth it, since most apps simply can't use all 4 cores and most quad-cores cannot overclock as high. And if you DO want to overclock as high you have to invest extra $$$ in better cooling because trying to run a quad-core at 4Ghz takes a lot more money in cooling than running a dual-core at 4Ghz (also, you need a lot of luck if you wanna hit 4Ghz on a cheap quad).

In the real world an E8400 clocked at 3.8GHz will feel faster about 98% of the time versus something like a Q9300, because it'll be clocked faster AND have more cache per core. Sure the Q9300 will feel faster if you're encoding a video, or if you're lucky enough to play a game that happens to use all 4 cores correctly. But the other 98% of the time your system will feel slower, so it makes no sense to me to go quad unless you know you'll be using programs that actually use 4 cores. And even in those programs the E8400 will make up some of the performance gap with its higher clocks.

Some real world benches with apps people often run:
http://www.tomshardware.com/ch...-Professional,825.html
http://www.tomshardware.com/ch...3-2008/iTunes,827.html
http://www.tomshardware.com/ch...hotoshop-CS-3,826.html
As you can tell, in the real world quad cores don't mean much...only the extremes can beat the E8400 here (and overclocking the E8400 easily gives you the best performance).

Look at where the E7200 is on those charts and realize that the E5200 is even more cache-starved; you're looking at a HUGE performance hit in very commonly used programs. And that's just the average hit, imagine the parts where the hit is even worse.

Synthetic benchmarks where they actually use all 4 cores just don't reflect real-world usage. The only apps where you'll notice real-world benefits are encoding/rendering type apps.

I know the way I use my computer my system is probably running all those programs way faster than anybody with a budget quad-core, and it's probably faster than most expensive quad cores as well since it overclocks better. The E8400 is just a really sweet processor for people who want TOP NOTCH performance in programs for the lowest price. The E8600 sitting on the very top of the Photoshop chart is just a higher clocked E8400, and it's 2nd on the iTunes chart, and 3rd on the Acrobat chart. All the other CPUs in that territory are $1000+!!!

Obviously if you're seriously limited by budget the E5200 is a good choice, but if you use your computer for the apps that most people run an overclocked E8400 hangs with $1500 CPUs. Which is why I think it's a killer value even though the CPU costs 2x as much as the E5200.

 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81

Originally posted by: TekDemon
...an overclocked E8400 hangs with $1500 CPUs. Which is why I think it's a killer value even though the CPU costs 2x as much as the E5200.


:thumbsup:
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
Originally posted by: jaredpace

Originally posted by: TekDemon
...an overclocked E8400 hangs with $1500 CPUs. Which is why I think it's a killer value even though the CPU costs 2x as much as the E5200.


:thumbsup:

That's true, but for a budget PC build, the extra ~$75 on a video card would make far more sense than going 8400 vs. 5200. At ~3.5GHz, the E5200 is going to be video card limited for gaming outside of somewhat rare situations, unless he's going to pop SLI'd GTX 280s out of his butt.