E8400 vs. X2 4200+

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
For a lot of users sitting on the fence with aging X2 systems, this shows that even without overclocking you'd get close to double the performance. (Note: an 8800GTX was used for the game tests, so the gaming advantage will be smaller with a slower graphics card.)

***See gaming benchmarks at higher resolutions below this post***

---------------------------------------------------------

The E8400's advantage over the X2 4200+ is shown below:

SYSMark 2007 - 155 vs. 88 (76%)
3DMark06 - 11601 vs. 8195 (42%)
3DMarkCPU - 2813 vs. 1635 (72%)

Quake 4 1024x768 HQ - 155 vs. 83 (87%)
HL 2: Episode 2 1024x768 - 181 vs. 99 (83%)
Crysis, 1024x768 MQ - 69 vs. 35 (98%)
Unreal Tournament 3, 1024x768 - 101 vs. 52 (94%)
World in Conflict, 1024x768 MQ - 89 vs. 36 (147%)

MP3 Encoding, iTunes 7.4 - 108s vs. 203s (88%)
DivX 6.8, fps - 67 vs. 34 (97%)
Xvid 1.2, fps - 46 vs. 24 (92%)
Mainconcept H.264 Encoder - 35 vs. 19 (84%)

3ds Max 9, Rendering - 5.21 vs. 2.73 (91%)
CINEBENCH R10, Rendering - 6318 vs. 3536 (79%)
Photoshop CS3 - 67s vs. 149s (122%)
After Effects CS3 - 383s vs. 840s (119%)

Mathematica 6 - 3.40 vs. 1.63 (109%)
WinRAR 3.7 - 287s vs. 479s (67%)

System Power Consumption (Idle)
8400 = 171W
4200+ = 172W

System Power consumption (load)
8400 = 214W
4200+ = 218W

Average Advantage: 91.5% (~92%)

Source: Xbitlabs
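
For anyone who wants to double-check the math, here's a rough Python sketch of how those percentages and the average can be reproduced (my own reckoning, not Xbitlabs' methodology; for score-based tests the advantage is faster/slower, for time-based tests it's slower/faster):

# Rough sketch (my math, not Xbitlabs' methodology) of the percentages above.
def advantage(e8400, x2, lower_is_better=False):
    # For "higher is better" scores: e8400/x2 - 1; for times: x2/e8400 - 1.
    ratio = (x2 / e8400) if lower_is_better else (e8400 / x2)
    return round((ratio - 1) * 100)

print(advantage(155, 88))                           # SYSMark 2007 -> 76
print(advantage(108, 203, lower_is_better=True))    # iTunes MP3 encode (seconds) -> 88

# Average of the 18 percentages listed above:
pcts = [76, 42, 72, 87, 83, 98, 94, 147, 88, 97, 92, 84, 91, 79, 122, 119, 109, 67]
print(sum(pcts) / len(pcts))                        # 91.5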
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Holy cow... the E8xxx demolishes all the older dual cores big time in Crysis... imagine with a 50% OC at 4.5GHz... sheesh.

edit: also, great find russian!

The E8400 is also the furthest off the median towards the performance end... it has the best price/performance ratio, with 2nd place being the E2200.

And it overclocks the highest as well (not the highest margin, but the highest overall speed).
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
I'd have to say the gaming results are slightly skewed as they were tested at low resolutions on the 8800GTX to show more CPU scaling.

In real world gaming, where users tend to play at the highest resolution and details their GPU would allow, the difference won't be as great. I've compared my E4400 @ 2GHz vs 3.33GHz at 1680x1050 on my 8800GTS, and whilst certain games (mainly online shooters and RTS games) definitely showed greater performance, others (like racing and single player FPS) were still GPU bottlenecked once I cranked the details up.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I'd like to see gaming benchmarks at a higher resolution. I understand that lower resolutions highlight the CPU performance by removing the video card bottleneck, but I'd like to see higher resolution benchmarks to see what a CPU upgrade does in more 'real life' gaming situations. My guess is that the upgrade in performance will be much less.

Also, the Photoshop and encoding tasks are obviously much faster, but either way it'd still take long enough that I'd start it, then go do something else for a few minutes and come back when it's done. If I'm reading that right, the benchmarks are done in fps?

*edit - Not to mention, based on the examples above, you're paying about 400% more for less than 100% more performance.
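
Rough back-of-the-envelope version of that math, with made-up placeholder prices (just for illustration, not actual street prices):

# Placeholder prices for illustration only -- NOT actual street prices.
e8400_price = 200.0    # hypothetical price of a new E8400
x2_4200_price = 40.0   # hypothetical price of a used/clearance X2 4200+
perf_gain = 0.92       # ~92% average advantage from the OP

price_premium = e8400_price / x2_4200_price - 1
print(f"price premium: {price_premium:.0%}")        # 400%
print(f"performance premium: {perf_gain:.0%}")      # 92%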
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: jaredpace

edit: also, great find russian!

Thanks Jared. E8400 is a sweet beast! If I was still on an X2 system, I'd be all over it right now!

Oh, check out Crysis C2D scaling:
1440x900 High Quality DX10 8800GTX
C2D 2.66ghz = 45.4
C2D 1.60ghz = 41.7

1920x1200 High Quality DX10 8800GTX
C2D 2.66ghz = 29.6
C2D 1.60ghz = 27.6

A GF9 or GF10 card is necessary for this game to show any advantage for a C2D processor ;)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: harpoon84
I'd have to say the gaming results are slightly skewed as they were tested at low resolutions on the 8800GTX to show more CPU scaling.

In real world gaming, where users tend to play at the highest resolution and details their GPU would allow, the difference won't be as great. I've compared my E4400 @ 2GHz vs 3.33GHz at 1680x1050 on my 8800GTS, and whilst certain games (mainly online shooters and RTS games) definitely showed greater performance, others (like racing and single player FPS) were still GPU bottlenecked once I cranked the details up.

I agree that the GPU is still the most important component. However, in 3 of the 5 games tested (Crysis, UT3 and WiC) the X2 delivered lower than 60fps (and was especially too slow in Crysis and WiC). So although the performance increase in gaming won't be as drastic when comparing within the C2D family, when comparing a C2D to an X2, the X2's minimum frames will likely be tons lower if its average is in the 35s! But let's see...

-------------------------------------------------------

EDIT: Here are the benchmarks with higher resolutions for anyone interested:
(highlighted where the performance difference is significant)

NEWER GAMES

World in Conflict - 1600x1200 Very High Quality
C2D 2.4ghz = 36fps
X2 2.6ghz = 23fps
X2 2.0ghz = 18fps
A64 single 2.4ghz = 10fps

Crysis - 1600x1200 High Quality
C2D 2.93ghz = 26fps
C2D 1.86ghz = 26fps
X2 2.6ghz = 25fps
X2 2.0ghz = 24fps
A64 single 2.4ghz = 18fps
*Crysis just kills all the videocards today!
TIE

COD 4 - 1280x1024 Maximum Quality
C2D 2.4ghz = 77fps
C2D 1.8ghz = 54fps
X2 2.6ghz = 60fps
X2 2.0ghz = 48fps
A64 single 2.4ghz = 26fps

Unreal Tournament 3 2048x1536 Maximum Quality
C2D 2.93ghz = 85fps
C2D 2.4ghz = 85
C2D 1.86ghz = 84
X2 2.6ghz = 83
X2 2.0ghz = 79
A64 single 2.4ghz = 76
*Not aligned with Xbitlabs benchmarks - but that's why it's good to use more than 1 website to see what makes sense and where.
TIE

Bioshock 1600x1200 High Quality
C2D 2.93ghz = 65
C2D 2.4ghz = 63
C2D 1.86ghz = 63
X2 2.6ghz = 61
X2 2.0ghz = 61
A64 single 2.4ghz = 44
TIE

Supreme Commander 1600x1200 High Quality
C2Quad 2.66ghz = 43fps
C2D 2.93ghz = 38
C2D 1.86 = 27
X2 2.6ghz = 20
A64 single 2.8ghz = 17

Command and Conquer 3 2048x1536 Very High Quality, 4AA
C2D 2.93ghz = 30
C2D 1.86ghz = 30
X2 2.6ghz = 30
A64 single 2.8ghz = 27
TIE

OLDER GAMES

Prey - 1280x1024 High Quality
C2D 2.9ghz = 97
C2D 2.1ghz = 94
X2 2.8ghz = 90
X2 2.4ghz = 85
X2 2.0ghz = 80
TIE

BF 2142 - 1600x1200 4AA/16AF
C2D 2.93ghz = 63
C2D 1.9ghz = 63
X2 2.8ghz = 63
X2 2.0ghz = 63
A64 single 2.4ghz = 63
TIE

Neverwinter Nights 2 1600x1200, Medium Settings
C2D 2.93ghz = 44
C2D 2.4ghz = 37
C2D 1.87ghz = 29
X2 2.8ghz = 30
X2 2.4ghz = 27
X2 2.0ghz = 23

Company of Heroes - 1600x1200 Ultra Settings
C2D 2.93ghz = 47
C2D 2.4ghz = 47
C2D 1.9ghz = 45
X2 2.8ghz = 45
X2 2.4ghz = 44
TIE

As has been suggested above, the performance difference varies by game, but the C2D processors certainly do not show as great an advantage in games due to videocard limitations. We'll need GF9 to see whether C2D offers significant performance differences at higher resolutions!
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Yes, thanks for the high-res benches. It would be nice to see a GPU offering equivalent to the CPU offerings these days. Then high-res games like Crysis *wouldn't* be so GPU limited and CPUs would scale better.

 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: jaredpace
Yes, thanks for the high-res benches. It would be nice to see a GPU offering equivalent to the CPU offerings these days. Then high-res games like Crysis wouldn't be so GPU limited and CPUs would scale better.

This certainly tells us that the C2D architecture will be more than sufficient for GF9 and possibly even GF10 (especially the overclocked versions).
 

tallman45

Golden Member
May 27, 2003
1,463
0
0
Those findings are not a surprise at all.

Consider some basic tuning tricks.

Any data that does not have to be retrieved from the HDD greatly speeds up a system's performance.

1MB of L2 cache vs. 6MB of L2 cache.

Five additional MB of data are readily available to the processor and do not need to be fetched from your HDD subsystem.

That's the biggest reason why the E8400 is a much better performer at stock speeds.
 

magreen

Golden Member
Dec 27, 2006
1,309
1
81
Umm... 5 additional MB doesn't need to be fetched from the HDD? You're talking about systems with 2000 MB of memory to keep things from being fetched from the HDD. That's not what cache is useful for.
 

zsdersw

Lifer
Oct 29, 2003
10,505
2
0
One thing I noticed about the E8400 when I put mine in is that the IHS has changed a little. It seems like it's flatter, which should be good news for you overclockers.

Edit: ...compared to the E6300, first batch of its kind, that it replaced.
 
Dec 30, 2004
12,553
2
76
Originally posted by: RussianSensation
snip

Now what I'd REALLY like to see is which platform has the lowest FPS dips in those games where it's a tie. Wonder if the memory controller still charms...
 
Dec 30, 2004
12,553
2
76
Originally posted by: magreen
Umm... 5 additional MB doesn't need to be fetched from the HDD? You're talking about systems with 2000 MB of memory to keep things from being fetched from the HDD. That's not what cache is useful for.

lol!
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Goes to show my X2 3800+ @ 2.6GHz needs to be replaced for games like SupCom and WiC :(
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: harpoon84
I'd have to say the gaming results are slightly skewed as they were tested at low resolutions on the 8800GTX to show more CPU scaling.

In real world gaming, where users tend to play at the highest resolution and details their GPU would allow, the difference won't be as great. I've compared my E4400 @ 2GHz vs 3.33GHz at 1680x1050 on my 8800GTS, and whilst certain games (mainly online shooters and RTS games) definitely showed greater performance, others (like racing and single player FPS) were still GPU bottlenecked once I cranked the details up.


Even though I don't game a lot anymore because of my health, I don't know what you're going on about. One gamer I know who is among the best of the best games at 800x600 only.
He doesn't play with eye candy. He says it's too big of a performance hit. This is in online gaming, not single player.

I have read that the world's top players use low res in tournaments, but I don't know that for a fact.

But if it's true, then the best of the best use REAL WORLD low-res gaming.

You sound a little bit like me back when AMD was owning the P4 and the review sites used 800x600 for CPU benchies. I argued that at high res that advantage would go away. At the time I had a 19" Sony monitor that would do high res, so the monitors were available for higher-res testing. Funny how the fanbois have now picked up on "real world" high-res testing.

Tell me something. If I am playing online running, say, 200fps and another player is running 100fps, I understand the human eye can't see the difference. BUT! What does the server see? The player running 100fps will see me ghosting, turning a corner way before I am actually there. To hit me he has to shoot behind me. Now that's real world. Many get this confused with lag, but it's not the kind of lag they understand.

I game at 1280x1024 most of the time, but if it's a big match I go low res. It makes a big difference. The server operator, if he's good, can throttle anyone's bandwidth. So what I do is this: if a big match is coming up, I game there for hours daily at high res, constantly complaining about bandwidth. Once the server gives me the bandwidth required to compete, on the day of the big matches I lower the res. The server hasn't readjusted to my res, so I become super fast. But it only lasts about a day before they throttle me again.

For you guys to talk "real world" is, frankly, a false statement. It's not the real world, it's your world. Look at [H] and its so-called real-world tests. The guy thinks he is an engineer, but alas he is nothing but a self-righteous hardware idiot. Note I mentioned no name. I don't know what's going on with this server, but that darkened area has happened twice now, both times trying to put an H with its brackets, and it doesn't show up in the post. So, just so there's no misunderstanding, I was referring to Kyle.




 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Nemesis, just because the 'best of the best' game at low res doesn't mean the majority of people do. I don't care what John Wendel games at; most of us game with at least some eye candy on. I game with as much eye candy as I can get away with and still have good frame rates.

Also, the big problem with the P4 wasn't that the A64s had better average frame rates at low res; I remember the P4s had terrible minimum frame rates compared to the A64s. If a P4 had a 50fps average and the A64 had a 60fps average, that doesn't sound like much of a difference, but if those same systems' minimum frame rates were 25 on the A64 and 10 on the P4, that's a performance issue. I'm pulling those numbers out of my ass, but I do remember seeing reviews where the minimum frame rate differences between the two architectures were very dramatic.

I'd like to see minimum frame rate benchmarks; I'd be curious to see the difference between an A64 and a C2D.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
1280x1024 is as high as my CRT goes; I would like to game higher but can't. 1366x768 is the highest my TV goes, so I can't go higher than that even if I'd like to. I only found it a drawback in CoD4, because I knew it could look better. I would average over 60fps at 1280 with max quality, 4xAA and 16xAF. I went back and played HL2: Ep2 at max quality with 16xQ AA and 16xAF with vsync on. Hahah, it looked sweet. FPS stayed high too.
 

drebo

Diamond Member
Feb 24, 2006
7,034
1
81
My problem with this is that you're benchmarking a chip that's more than two years old against one that's not even available in bulk quantities yet.

Yes, I realize people want to see how their particular processor stacks up, but benchmarking a 3.0ghz brand new CPU against a 2.2ghz low end processor from last generation isn't exactly fair.

I mean, seriously...what, exactly, were you expecting?
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: drebo
My problem with this is that you're benchmarking a chip that's more than two years old against one that's not even available in bulk quantities yet.

Yes, I realize people want to see how their particular processor stacks up, but benchmarking a 3.0ghz brand new CPU against a 2.2ghz low end processor from last generation isn't exactly fair.

I mean, seriously...what, exactly, were you expecting?

Well, the point of it was to show that it's worth upgrading from your two-year-old processor to the new stuff, depending on what you do. But to tell you the truth, I still think the X2s perform well enough to hold off on an upgrade... the benchmarks are there; anyone with an X2 can make their own decision.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Good point. You know what I would like to see? A review of the P4 going up against the Athlon 64.

Using game benchies that take advantage of more than one thread. I will pick the P4 as the winner at low res. I don't know what's going on in the hardware world, but when a company has 75% market share, game developers should have gone multi-threaded when the P4 with HT was introduced. I don't much like programmers. Where do viruses come from? Users like us? Or would it be more likely that the people who write virus-protection software actually create the viruses? $$$$ is the root of all evil.

Too many higher-education people are graduating who aren't qualified. Hence the problems we're suffering from.

Look at the analysts. They come out and downgrade Intel. What is going on with that? I know.

It's about $$$ and friendships built in school. It's crooked and can't be stopped.

Two highly respected analysts, both good friends, but both taking opposite positions on a stock. They work together to raise and lower stock prices. As long as they keep their mouths shut, they will never get caught. This is the normal situation, not the abnormal one, but proving it is impossible unless someone puts his foot in his mouth.

Look at what's going on now with AMD and MS. They have cuddled up pretty tight. They damn well better, too.

Now Futuremark is talking about getting into gaming using new IP. What do you want to bet that Apple and Intel will use this IP? If I were Futuremark and I was getting into gaming, I would jump in bed with Intel. A lot of people are saying some strange things about real-time ray tracing, that it has limited uses, this and that. But not one of them will mention collision detection, which ray tracing does very well. Then there is Havok, which Intel bought, and which will require great collision detection for game physics to work properly.

Now MS is buying Yahoo. So to make it perfect we need Google and its billions to create an OS. I understand digital, and I understand MS hasn't opened up its source code. But knowing Bill and the gang way back when, the source code for MS is probably something akin to Morse code, dot-dash vs. 1/0. Whoever it was that invented the mouse, Bill bought the rights for $5000, because the big company couldn't figure out what to do with it. Now, I don't know if it was Jobs or Gates who thought of creating a WINDOW inside DOS that didn't require DOS in the window. I suspect it was Jobs, because we all know Bill stole ideas from Apple's Lisa. But I guess Bill gets the credit.

It's a fact, guys. The bed partners have been picked. All are cosy, with one exception: NV.

So we have Intel, Apple, and Sun all getting into bed together. Look at Sun's improvement since going to Intel. Look at what Intel did for Dell. Look at the staggering market share Apple is gaining. Looks to me like being in bed with Intel is good.

Point is, we all view the world from our own little private world. If you really understood the real world, you'd probably turn down the res a little, because viewing the real world can be ugly too.

I am not arguing against high res. What I am arguing against is the use of the term "real world."

Cold hard fact, and it's off topic, but a fact: Americans look in horror at what Hitler did to Jews and non-Christians during WW2. Where are the Mohican Indians that once lived in the Hudson Valley? Or any of the other Indian nations that were wiped off the face of the earth? Now that's real world. It's ugly, isn't it?
 

heyheybooboo

Diamond Member
Jun 29, 2007
6,278
0
0
I didn't want to say anything and start a flame war, but I, too, was a little disappointed at the "E8400 vs. X2 4200+" comparison as far as clock speed, cache, etc.

Thanks for the benchies at higher resolutions :thumbsup: