E5200 or E8400

lady513

Junior Member
Feb 25, 2009
6
0
0
As of today the E5200 is $72.99 and the E8400 is $164.99.
Is it worth the extra $90 for the E8400 on a low-budget build?
Do I plan on overclocking? Not sure; maybe, if it is easy enough to figure out, as I have never done it before.
Will be using the CPU in a low-budget gaming PC.
 
Feb 24, 2009
36
0
0
I would not get the E8400 because, as of right now, it is overpriced compared to the AMD Phenom II X3 720 triple-core CPU. You can get the Phenom II 720 for $144.99 at Newegg.com, $20 cheaper than the E8400, which leaves you with the E5200.
 

13Gigatons

Diamond Member
Apr 19, 2005
7,461
500
126
If you are going to spend $164.99 you might as well get the Q9400 and overclock it. Microcenter had them for $175.
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
Originally posted by: lady513
Will be using the CPU in a low-budget gaming PC.

E5200 should be fine for this. Plenty of room for OCing, and quite forgiving if you're a newb to the OCing art!

If this is a new build, and you think your usage requires more than two cores, then AMD's X3 and X4 chips + mobos are nicely priced. But for a basic low budget gaming rig: E5200 + a modern graphics card (best within your budget).
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Welcome to the forums!

The reason everyone is asking whether this is a new build is that, if it is, you have to buy a new motherboard and everything anyway, so you might as well get an AMD X3 720 and save some money.

If you're doing a processor-only upgrade (you already have a motherboard that supports the E5200 and E8400), then you would want to buy one of those chips. Of the two, I would suggest the E5200 and learning how to overclock. (It's ridiculously easy and gives you much higher performance than you paid for.)
 

nerp

Diamond Member
Dec 31, 2005
9,865
105
106
I have an E8400 and don't regret it. I could have saved money, but at the time, I was upgrading from an E2180. The cache difference is quite noticeable.

Now, the E5200 is a better performer than the E2180, so if I was at an E5200, I'd have a hard time justifying the jump to an E8400.

So you should probably get the E5200.
 

nOOky

Diamond Member
Aug 17, 2004
3,231
2,287
136
I'd say we need a little more info: whether you have an existing mobo (and memory, etc.) you plan to reuse, or whether you plan to replace those components. Even the hard drive type is important; if you have an older drive, a newer, faster drive can give the system a boost in addition to a new CPU.
 

Kraeoss

Senior member
Jul 31, 2008
450
0
76
Originally posted by: EarthBoy
e8400 is not worth 2x the price of e5200

Well, it does have 3x the cache and a 3.0 GHz stock clock, but indeed it's really not worth it, since the E5200 can reach the same stable clock speed as the E8400 when overclocked.
 

djnsmith7

Platinum Member
Apr 13, 2004
2,612
1
0
The E8400 is worth every cent, but it's not the only option in its price range... Consider all options before pulling the trigger...

E8400s OC very well with proper cooling...
 
Feb 24, 2009
36
0
0
Originally posted by: djnsmith7
The E8400 is worth every cent, but it's not the only option in its price range... Consider all options before pulling the trigger...

E8400s OC very well with proper cooling...

But the E8400 is overpriced right now because of the Phenom II X3 chips.
 

lady513

Junior Member
Feb 25, 2009
6
0
0
Thanks for the replies. I posted this on the way to work and didn't have time to check back on it.
Yes, I am building a whole new system. Trying not to spend more than $800, but that is for EVERYthing.
Don't think I'm in the spot just yet for a quad core, and everyone I know says it is best to go Intel at the moment.
 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
Originally posted by: 13Gigatons
If you are going to spend $164.99 you might as well get the Q9400 and overclock it. Microcenter had them for $175.

I agree here. If you're spending that much, like $170, save up and get a quad. Otherwise I'd just get the E5200 and OC it to 3.5 GHz; that will be plenty for games. Save your cash for a strong graphics card.
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
I strongly disagree with the stuff about quads, because in most daily use a higher-clocked dual core with more cache per core is going to feel snappier than a quad-core with cut-down cache that can't clock as high.

And I have both an E4300 and an E8400 (well, the Xeon version), and even at the SAME clocks the E8400 feels much snappier in real-world usage. Yes, I know there were tweaks to Wolfdale, but they're less significant than the cache bump, and I believe that even if I clocked the E4300 slightly higher it would still feel less snappy in daily use.

The benchmarks might show 20% or 15% or 10% or whatever difference, but that's averaging all the frames together. The truth is that some calls to the CPU will be MUCH faster, while some will run at the same speed (since it fits in the cache) and it AVERAGES out to 20%.

You have to realize that on the slow parts you'll be screaming murder because your computer will feel much slower than one with more cache. Looking at averages is silly because as long as it's "fast enough" it'll feel fine; it's the parts where it's not fast enough that you'll notice. The average hit in speed in an app might be 15%, but that means a lot of parts run at the same speed while some parts run like 40% slower while the CPU has to hunt through RAM for data that isn't in cache.

Judging how fast a CPU is based on the average FPS or average performance of an app in benchmarks just isn't reflective of REAL WORLD performance, where the larger cache will FEEL much faster in use because it's not hitting slow points. If Unreal Tournament 3 runs 20% slower on average, that means there are parts where it's going to be 40% slower because the CPU is trying to do a whole bunch of stuff it can't fit in cache, and there will also be parts where it's running at 95% or 100%. You might average it out to look at benchmarks, but when you're playing the game and it tanks to 5 FPS because a bunch of people just fired fancy guns at you, it's going to feel a hell of a lot more than 20% slower than the E8400.
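
To make that concrete, here's a quick Python sketch with made-up frame times (illustrative numbers only, nothing measured from these CPUs) showing how a handful of cache-miss-heavy slow frames can hide inside a decent-looking average:

# Made-up frame times (ms), for illustration only -- not measured data.
big_cache_frames   = [20.0] * 95 + [22.0] * 5    # big-cache chip: mostly smooth
small_cache_frames = [20.0] * 80 + [50.0] * 20   # small-cache chip: 20% of frames miss badly

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_fps(frame_times_ms):
    return 1000.0 / max(frame_times_ms)

for name, frames in (("6MB cache", big_cache_frames), ("2MB cache", small_cache_frames)):
    print(f"{name}: avg {avg_fps(frames):.0f} FPS, worst frame {worst_fps(frames):.0f} FPS")
# 6MB cache: avg ~50 FPS, worst ~45 FPS
# 2MB cache: avg ~38 FPS, worst 20 FPS -- the average is only ~25% lower,
# but the slow stretches run at less than half the speed.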

BTW OP, if your budget is $800 you can EASILY afford the E8400 if you shop around. Video cards are very cheap now, RAM is very cheap now (go look at the Hot Deals forum), so as long as you don't want a particularly fancy monitor you should be able to build a good system.

And while the E7x00 series is cheaper, the jump from 2MB to 3MB still isn't as good as getting the full 6MB Intel figured was optimal. The whole reason they increased from 4MB to 6MB was that 4MB still hit slowdowns.

You don't have to believe me: if you have a friend with a 6MB cache system, go use their system, and then go use a system with 2MB of cache. Run the apps you actually run, etc.

The way I look at the value here is that the 6MB cache CPU will make your system run about 15-20% faster. 20% of $800 is $160, man, and you're not saving anywhere near $160 by buying the cheaper CPU, so skimping there really isn't worth it. This isn't even considering the fact that the slowest parts are going to be more than 20% slower, and those are the parts that count (since the stuff that runs fast on both CPUs probably runs fast on almost any CPU anyway and wouldn't feel slow on any system).

I dunno what video card you were planning on using, but the 4850 or even 4870 can be had very cheaply nowadays: $125 after a rebate for the 4870 and cheaper for the 4850. Or you can go 4830 and overclock it; you can get one for like $80 at Newegg ($90 if you're too lazy to do a rebate).

Unlike the GPU, the CPU is used 100% of the time (OK, well, the GPU is used 100% of the time too, but the 2D part really doesn't matter), so you'll feel it being faster or slower 100% of the time. I'm not saying you should go buy the most ripped-off CPU in the world, but as long as the performance change still makes sense in relation to the cost of your system, I think it's worth it.

The same logic is why quad cores aren't always worth it, since most apps simply can't use all 4 cores and most quad cores cannot overclock as high. And if you DO want to overclock that high, you have to invest extra money in better cooling, because trying to run a quad core at 4GHz takes a lot more money in cooling than running a dual core at 4GHz (also, you need a lot of luck if you wanna hit 4GHz on a cheap quad).

In the real world an E8400 clocked at 3.8GHz will feel faster about 98% of the time versus something like a Q9300, because it'll be clocked faster AND have more cache per core. Sure, the Q9300 will feel faster if you're encoding a video, or if you're lucky enough to play a game that happens to use all 4 cores correctly. But the other 98% of the time your system will feel slower, so it makes no sense to me to go quad unless you know you'll be using programs that actually use 4 cores. And even in those programs the E8400 will make up some of the performance gap with its higher clocks.

Some real world benches with apps people often run:
http://www.tomshardware.com/ch...-Professional,825.html
http://www.tomshardware.com/ch...3-2008/iTunes,827.html
http://www.tomshardware.com/ch...hotoshop-CS-3,826.html
As you can tell, in the real world quad cores don't mean much... only the extreme ones can beat the E8400 here (and overclocking the E8400 easily gives you the best performance).

Look at where the E7200 is on those charts and realize that the E5200 is even more cache-starved; you're looking at a HUGE performance hit in very commonly used programs. And that's just the average hit; imagine the parts where the hit is even worse.

Synthetic benchmarks where they actually use all 4 cores just don't reflect real-world usage. The only apps where you'll notice real-world benefits are encoding/rendering type apps.

The way I use my computer, my system is probably running all those programs way faster than anybody with a budget quad core, and it's probably faster than most expensive quad cores as well since it overclocks better. The E8400 is just a really sweet processor for people who want TOP NOTCH performance in programs for the lowest price. The E8600 sitting on the very top of the Photoshop chart is just a higher-clocked E8400, and it's 2nd on the iTunes chart and 3rd on the Acrobat chart. All the other CPUs in that territory are $1000+!

Obviously if you're seriously limited by budget the E5200 is a good choice, but if you use your computer for the apps most people run, an overclocked E8400 hangs with $1500 CPUs. That's why I think it's a killer value even though it costs 2x as much as the E5200.
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Here's some help to keep your system under budget:
4830 for $79.99/89.99 (with/without rebate) http://www.newegg.com/Product/...x?Item=N82E16814102803
Caviar Black 32MB Cache $109.99 http://accessories.us.dell.com...925t1215623f0fp0c0s558 You can get this for $99 or so if you wait...ZipZoomFly had it but ran outta stock.
Heck if you don't want 1TB you could just buy this at Newegg since it's already one of the fastest drives out there (single platter):
640GB AAKS for $65: http://www.newegg.com/Product/...x?Item=N82E16822136218
Use coupon code EMCLNNT38 to knock $5 off
http://www.newegg.com/Product/...x?Item=N82E16820220293 that's CL4 timings for $20 with coupon code EMCLNNP26


Honestly, if you want to cut money from anywhere, I'd cut it from the RAM, since the price/performance of fancy DDR3 etc. is terrible. You shouldn't be paying more than $40 or so for GOOD DDR2. I'd probably only spend $20 and just buy the DDR2 that's on sale, since you can relax the timings and hit whatever speed you need. Some people don't even have to relax timings to get DDR2-1000+.

The other thing I'd cut would be the motherboard, since those crazy high-end 50-heatpipe, 900-phase-power motherboards that cost like $350 are just a terrible return on investment. Get one with the features you actually need and good overclocking, and you shouldn't be spending more than like $150 for a quality board. And that's a fairly high estimate.

This is a pretty good board btw for $115:
http://www.newegg.com/Product/...=Gigabyte+GA-EP45-UD3P

There's no way that board doesn't have enough features for a budget gaming rig... it has multi-phase power for everything (gets very, very good overclocks) and has 8 SATA ports. The only thing it's missing is x16/x16 CrossFire (it has x8/x8), but no budget system is going to feel that.

I actually think you could probably spring for two 4830s in CrossFire if you control your case budget well. Probably not if you included the monitor in the $800, though.

For monitors, there's everything from that Dell eIPS 22" to random cheap 24" ones. It's really a great time to be building a rig since the economy is sucking so badly.
 

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
Originally posted by: nerp
Definitely a good middle option. The extra cache is great for gaming and the price is a nice middle spot between both.

I disagree

One MB of extra L2 cache won't make it great for gaming. You would get a 5% increase at the most with 2~3% increase on average. You also get SSE 4.1 but unless you use an application that can really take advantage of that, you will only get a tiny boost that would likely be noticeable only on benchmarks (running at the same MHz and FSB)


If Fry's had a sale on an E7400 and motherboard for $100 for the combo, that would be a good deal.


I think the E5200 and the Q9450 are the best Intel deals currently

 

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
Originally posted by: TekDemon
The benchmarks might show 20% or 15% or 10% or whatever difference, but that's averaging all the frames together. The truth is that some calls to the CPU will be MUCH faster, while some will run at the same speed (since it fits in the cache) and it AVERAGES out to 20%.

do you have any links to cpu benchmarks that show minimum fps to back that up?


 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Originally posted by: edplayer
Originally posted by: TekDemon
The benchmarks might show 20% or 15% or 10% or whatever difference, but that's averaging all the frames together. The truth is that some calls to the CPU will be MUCH faster, while some will run at the same speed (since it fits in the cache) and it AVERAGES out to 20%.

do you have any links to cpu benchmarks that show minimum fps to back that up?

I wish people ran benchmarks like that, but the way cache affects performance is exactly like that: when the data it needs isn't in cache it takes a HUGE performance hit, and when it is, there isn't one.

But seriously it's just a logic exercise with averages. With an average of 20% it's clear that there are parts much greater than 20% slower. You know that the lower cached CPU isn't going to be running stuff faster than 100%, so any numbers being averaged in are capped at 100%.

And you also know that it doesn't consistently run 20% slower, because the way cache affects performance is all based on whether or not it manages to grab data from the cache instead of system RAM. So whenever there are computations that constantly miss the cache and have to go out to RAM, you'll see performance drop well below that 20% average.

Unless you believe that a game NEVER has a computation that's in cache (in which case it would seriously run like 1FPS) it's obvious that there'll be parts significantly more than 20% slower, because cache misses are a huge penalty.

The cache size doesn't modify performance the way clock speed does: with clock speed, if a CPU is 10% slower it's always 10% slower (assuming everything else is identical). But with cache sizes it's a hit-or-miss thing, which is why some apps have such large performance penalties and some apps run about the same, since some trigger tons of cache misses and some don't.

But that's just an average performance penalty. In actual use, when you need the program to do stuff that causes a ton of cache misses you'll take a huge penalty, and when you don't it'll run at the same speed.

Basically, with more cache misses you increase the volatility of your performance. It'll swing up and down much more than on a CPU with a cache large enough to keep it going at maximum performance.
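
To put rough numbers behind that volatility, here's a back-of-the-envelope sketch; the hit rates and latencies are assumed illustrative values, not measured figures for these chips:

# Assumed, illustrative latencies -- not measured E5200/E8400 numbers.
CACHE_LATENCY_NS  = 5.0     # assumed L2 hit latency
MEMORY_LATENCY_NS = 100.0   # assumed main-memory latency after a miss

def effective_latency(hit_rate):
    """Average memory access time for a given cache hit rate."""
    return hit_rate * CACHE_LATENCY_NS + (1 - hit_rate) * MEMORY_LATENCY_NS

# A bigger cache mostly shows up as a higher hit rate.
for hit_rate in (0.99, 0.95, 0.90):
    print(f"hit rate {hit_rate:.0%}: {effective_latency(hit_rate):.1f} ns average access")
# 99% -> ~6.0 ns, 95% -> ~9.8 ns, 90% -> ~14.5 ns: a few percent more misses
# and average memory access in the miss-heavy stretches gets ~60-140% slower.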

Forget minimum FPS, actually, since the parts that cause the minimum FPS are often too big for even the larger cache, or have nothing to do with cache performance (which would give the same or similar minimum FPS). But above the minimum there'll be a lot more LOW FPS situations on the CPU with less cache.

The best example I can find is here:
http://www.pcgameshardware.de/..._08_CoD4_1920x1200.PNG
You see that the E8x00 (clocked to the same speed) and the E7x00 have similar minimum FPS (only 1 FPS apart), but their averages are 4 FPS apart. That's because not every part of the game runs only 1 FPS slower: some parts run a lot slower, and some parts run like the minimum does, at almost the same speed.

If the average difference is 4 FPS and you know there are parts, like the minimum, where it's only 1 FPS, then there have to be parts where it's 7+ FPS slower for it to average out to 4 FPS across the board. If it were always 1 FPS slower like the minimum, it wouldn't average out to 4 FPS slower.
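
Here's that arithmetic written out; the 50/50 split between cache-friendly and miss-heavy stretches is an assumption for illustration, not something the chart actually reports:

avg_gap       = 4.0   # average FPS deficit across the whole run (from the chart above)
calm_gap      = 1.0   # deficit during cache-friendly stretches, like at the minimum
calm_fraction = 0.5   # assumed: half the run is cache-friendly

# avg_gap = calm_fraction * calm_gap + (1 - calm_fraction) * rough_gap, solved for rough_gap:
rough_gap = (avg_gap - calm_fraction * calm_gap) / (1 - calm_fraction)
print(rough_gap)  # 7.0 -- the miss-heavy stretches must average about 7 FPS behind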

I really wish someone would show a trace graph of the FPS variance, because trying to show this concept with crappy graphs not meant to show this is really tough.

It's really more problematic in games where most of the time you're running close to the minimum FPS you'll tolerate, so you don't want dips into unplayable territory.

HardOCP shows those graphs on video card reviews but they unfortunately didn't bother with CPUs =(
 

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
Originally posted by: TekDemon

But seriously it's just a logic exercise with averages. With an average of 20% it's clear that there are parts much greater than 20% slower. You know that the lower cached CPU isn't going to be running stuff faster than 100%, so any numbers being averaged in are capped at 100%.


but averages are closer to the 5% area (using equal MHz)
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Originally posted by: edplayer
Originally posted by: TekDemon

But seriously it's just a logic exercise with averages. With an average of 20% it's clear that there are parts much greater than 20% slower. You know that the lower cached CPU isn't going to be running stuff faster than 100%, so any numbers being averaged in are capped at 100%.


but averages are closer to the 5% area (using equal MHz)

No they're not.
http://www.pcgameshardware.de/..._08_CoD4_1920x1200.PNG 12%
http://www.pcgameshardware.com...viewed/Reviews/?page=5 12% again

And if I wanted to be a jerk:
http://www.egielda.com.pl/imag...e972b8123d31f58bd8.png
http://www.egielda.com.pl/imag...f18bea785b5fb22ff2.png
http://www.egielda.com.pl/imag...92bfa544c3d50f9a37.png
Notice how the 3.75GHz E5200 is barely faster than the 2.67GHz E8200... and how the E8200 beats the E5200 into the ground at stock in those games. And that's just the average FPS; imagine playing those games in the parts where the E5200 is 50% slower.

The thing is that most people don't care or maybe don't even notice whether one system is 15-20% faster. But you'll definitely notice it when you try to run the game that's 50% faster on the E8x00.
Or you get to the part of some game where the E8x00 has a 30% advantage. When you get there it won't matter that you only average a 15% deficit, because your game is now unplayable.

The extra cache minimizes the number of worst-case scenarios, and those matter a lot more than average performance differences.

Anyway, I'm not going to post any more, gotta get back to work, but if you still don't believe me then there's no point in trying to convince you. Averages don't matter.
It's kind of like how an A- average GPA isn't much comfort when you flunk that last required class senior year. That'll bother you a lot more than if your average GPA had been a little bit lower, because the huge drop is what hurts.
And that's the same idea with computer apps and games: as long as it runs decently you probably won't notice the difference, but when it runs like total butt because it's missing cache requests you'll notice it greatly.

If you wanna look at all the benches:
http://translate.google.com/tr...&tl=en&history_state0=
http://www.egielda.com.pl/?str=art&id=4678-20
It's not entirely fair since it's 2.67GHz vs 2.5GHz, but even correcting for that, the average penalty is nowhere near 5%. So unless you don't wanna play new Valve games or Unreal Engine games, I think the E8400 is easily worth it.

And in case you don't believe that website since it's in Polish:
http://www.xbitlabs.com/articl...pdc-e5200_6.html#sect0
Again, the E8200 (2.67GHz) annihilates the E7300 and the E5200.

And look at the app charts I linked above to see the app performance differences in popular apps.

I don't care if some crazy synthetic benchmark shows a 5% difference, because in the real world your performance will be swinging up and down on the E5200 as you run programs. I don't want one program to run well and then the next one to run 30% slower. If you're cool with that, that's great, but trying to argue that the E8400 isn't worth the extra money is silly, since the extra money clearly buys real performance boosts in a ton of apps and games. Intel engineered it with 6MB because that was the optimal amount. It's seriously the best gamer value if you don't just wanna play the games where cache doesn't matter.

If you already bought an E5200 or E7300 or something because someone on these forums or elsewhere claimed that cache was just a small performance hit, well I'm sorry but that doesn't mean that you should keep insisting that it's a small 5% hit or that it doesn't matter.

It matters a lot. It matters enough that I kept holding out for a price cut on the E8400 series before I bought, because I just didn't wanna settle for reduced cache again after experiencing what it was like on the E4300. It matters so, so much more than any stupid benchmark average graph will show you.

For one thing, even the best benchmark graphs don't show you subsecond performance. Meaning if the game slows to 10FPS for half a second before bouncing back to 50FPS it'll just show you that it averaged 30FPS. Which might not sound much worse than a 40FPS average. Except when you're playing and it slows to 10FPS for that half second you'll notice.

I don't mean to cause people any sadness about having bought an E5x00 or E7x00, but enough with the cognitive dissonance. I'm just trying to tell someone who's deciding what CPU to buy that the jump to 6MB of cache is a big performance boost. She's building an $800 system; how does it make any sense to save $80 (10% of the system price) on a CPU if it drops her FPS 30-40% in pretty popular games like Half-Life 2, or STALKER, or UT3? It's not like these are some crazy games nobody plays; they're huge games, and a ton of other games are based on the Unreal or Half-Life 2 engine.

And if you're still feeling annoyed at the difference, then just look at some AMD benchmarks and I guarantee you'll feel better =P Cuz those crappy E5x00s are still faster than most of the AMD lineup.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: edplayer
Originally posted by: nerp
Definitely a good middle option. The extra cache is great for gaming and the price is a nice middle spot between both.

I disagree

One MB of extra L2 cache won't make it great for gaming. You would get a 5% increase at the most with 2~3% increase on average. You also get SSE 4.1 but unless you use an application that can really take advantage of that, you will only get a tiny boost that would likely be noticeable only on benchmarks (running at the same MHz and FSB)


If Fry's had a sale on an E7400 and motherboard for $100 for the combo, that would be a good deal.


I think the E5200 and the Q9450 are the best Intel deals currently

I disagree. The extra 1MB is a 50% increase in L2 cache size, and the performance difference is pretty big and makes it a very, very good middle option.

Just look at the E5200 vs. the E7200, which run at about as close a clock speed as you'll probably find:
http://www.anandtech.com/bench...3.44.45.46.47.48.49.50

The E7200, on the average, performs about 8% faster than the E5200. In the game tests, it is on average about 15% faster.

Now let's take a look at the E7500 vs. the E8400, which have fairly close clock speeds:
http://www.anandtech.com/bench...3.44.45.46.47.48.49.50

The E8400 has 100% more L2 cache than the E7500 (plus a small clockspeed advantage), and yet on average it is only about 7% faster in all of AT's benchmarks. In the game benchmarks, it is about 12% faster.

So, the move from 2MB to 3MB of L2 cache yields greater improvements than the move from 3MB to 6MB for these stock processors.
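
Putting those rough averages side by side on a per-megabyte basis (using the approximate percentages quoted above, so treat it as illustrative rather than exact):

# Approximate gains quoted above, normalized per MB of L2 added (illustrative only).
steps = {
    "2MB -> 3MB (E5200 -> E7200)": (1, 0.08),   # +1 MB L2, ~8% average gain
    "3MB -> 6MB (E7500 -> E8400)": (3, 0.07),   # +3 MB L2, ~7% average gain
}
for label, (added_mb, gain) in steps.items():
    print(f"{label}: ~{gain:.0%} overall, ~{gain / added_mb:.1%} per added MB")
# ~8.0% per MB vs ~2.3% per MB: each extra megabyte buys less, which is the
# diminishing-returns point above (the game-only averages shift the numbers
# but not the shape).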
 

TekDemon

Platinum Member
Mar 12, 2001
2,296
1
81
Originally posted by: cusideabelincoln
The E7200, on the average, performs about 8% faster than the E5200. In the game tests, it is on average about 15% faster.

Now let's take a look at the E7500 vs. the E8400, which have fairly close clock speeds:
http://www.anandtech.com/bench...3.44.45.46.47.48.49.50

The E8400 has 100% more L2 cache than the E7500 (plus a small clockspeed advantage), and yet on average it is only about 7% faster in all of AT's benchmarks. In the game benchmarks, it is about 12% faster.

So, the move from 2MB to 3MB of L2 cache yields greater improvements than the move from 3MB to 6MB for these stock processors.

Seriously why does everyone care about averages?!?
The worst case scenarios determine when your system is outdated, not the averages.
And the problem is that in VERY POPULAR game engines like UT3 and HL2 the penalty is huge.

Here's Anandtech's Left 4 Dead chart:
http://www.anandtech.com/bench/default.aspx?b=48
The E8200 gets 108FPS...the E5200 gets 77FPS.
And Left 4 Dead is THE MOST POPULAR GAME OUT RIGHT NOW.
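
In percentage terms (quick arithmetic on the two numbers quoted above):

e8200_fps, e5200_fps = 108, 77   # the Left 4 Dead averages quoted above
print(f"E8200 is {e8200_fps / e5200_fps - 1:.0%} faster")    # ~40% faster
print(f"E5200 is {1 - e5200_fps / e8200_fps:.0%} slower")    # ~29% slower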

They didn't benchmark a TON of other Unreal- and HL2-based games either. It really gives the wrong impression about how much of a performance hit you'll suffer, because you're MUCH more likely to play good hit games based on popular rendering engines than just any benchmark game. So you need to weight those benchmarks much more heavily. Unless you don't think you'll ever play a Valve game for some crazy reason, or any Unreal Engine games... in which case you really have no business building a gaming rig at all.

Go look at the X-bit and Polish benches I linked in the post above yours... it's really ugly in a ton of engines, where the difference is 30-40% on average. And again, that's just the AVERAGE; I guaran-fricking-tee you there'll be parts of the game where you'll be running 50%+ slower, just like there are parts where you'll be running only 15% slower.

Cache matters in the most popular 3D engines out there. It matters less in engines like CryEngine, because Crysis spends so much time murdering your GPU that the CPU handicap isn't as obvious.

I feel like I'm taking crazy pills, with people ignoring benchmarks for the most popular engines on the planet and constantly telling other people that the average penalty is only xx%. You don't play an average game; you play specific games, and the most popular games on the planet happen to take massive, huge hits to FPS.