Phenom II X4 3.8GHz vs. Core i7 920 3.4GHz - Gaming Benchmarks by PureOC

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
With Core i5 just around the corner, I thought this PureOC article did a good job of highlighting that for quad-core users (i.e., Phenom II, Core i7, overclocked 45/65nm C2Qs), the videocard is still the most limiting component.

***All benchmarks are at 1920x1200*** with a 4890 videocard

ArmA 2
Phenom 955 3.2GHz = 21.4
Phenom 955 3.8GHz = 23.8
Core i7 2.66GHz = 23.2
Core i7 3.4GHz = 26.1 (+22% fastest vs. slowest)

Left 4 Dead - 8AA/16AF
Phenom 955 3.2GHz = 84.3
Phenom 955 3.8GHz = 85.1 (+1% fastest vs. slowest)
Core i7 2.66GHz = 84.9
Core i7 3.4GHz = 84.4

Call of Duty: World at War - 4AA/16AF
Phenom 955 3.2GHz = 62.8
Phenom 955 3.8GHz = 63.1 (+0.5% fastest vs. slowest)
Core i7 2.66GHz = 62.9
Core i7 3.4GHz = 63.0

Tom Clancy's H.A.W.X - 8AA/16AF
Phenom 955 3.2GHz = 42
Phenom 955 3.8GHz = 42 (+2.4% fastest vs. slowest)
Core i7 2.66GHz = 42
Core i7 3.4GHz = 41

Crysis: Warhead - 0AA
Phenom 955 3.2GHz = 23.4
Phenom 955 3.8GHz = 23.7 (+1.3% fastest vs. slowest)
Core i7 2.66GHz = 23.6
Core i7 3.4GHz = 23.7

X3: Terran Conflict - highest AA/AF
Phenom 955 3.2GHz = 74.4
Phenom 955 3.8GHz = 82.1 (+15% fastest vs. slowest)
Core i7 2.66GHz = 71.3
Core i7 3.4GHz = 78.6

Average difference between the fastest and slowest system: 7.03%

Say hello to the GPU bottleneck. Looks like a decent quad-core system still has a lot of legs left in it ;) Too bad they didn't throw an E8400 or something similar in there.
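For anyone who wants to check the math, here's a quick Python sketch (mine, not from the PureOC article) of how that ~7% figure falls out, assuming it's simply the mean of the per-game fastest-vs-slowest gaps:

```python
# Sanity-check of the "fastest vs. slowest" gaps quoted above, assuming the
# average is a simple mean of the per-game percentage gaps (my assumption).
# Values are the 1920x1200 results listed in this post.
results = {
    "ArmA 2":              [21.4, 23.8, 23.2, 26.1],
    "Left 4 Dead":         [84.3, 85.1, 84.9, 84.4],
    "CoD: World at War":   [62.8, 63.1, 62.9, 63.0],
    "H.A.W.X":             [42.0, 42.0, 42.0, 41.0],
    "Crysis: Warhead":     [23.4, 23.7, 23.6, 23.7],
    "X3: Terran Conflict": [74.4, 82.1, 71.3, 78.6],
}

gaps = []
for game, fps in results.items():
    gap = (max(fps) - min(fps)) / min(fps) * 100  # % gap, fastest rig vs. slowest
    gaps.append(gap)
    print(f"{game:20s} +{gap:4.1f}%")

print(f"Average gap: {sum(gaps) / len(gaps):.2f}%")  # ~7%, matching the figure above
```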

EDIT (Updated September 8, 2009):
Anandtech tests GTX 275 on Core i7 975 vs. Core i7 870 vs. Phenom II X4 965

2560x1600 maximum settings

Crysis Warhead (ambush)
i7 975 = 20.8
i7 870 = 20.8
X4 965 = 20.9

Crysis Warhead (avalanche)
i7 975 = 23.0
i7 870 = 22.9
X4 965 = 23.0

Crysis Warhead (frost)
i7 975 = 21.4
i7 870 = 21.5
X4 965 = 21.5


 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Need to throw a 5890 or GT300 at the same rigs and see if it separates the men from the boys.

It's interesting that if you arbitrarily draw a line at 60fps and say "below this, gameplay is unacceptable; above this, gameplay is indistinguishable", all these rigs land on the same side of that line for any given game.

Doesn't matter what you have; ArmA 2 @ 26.1fps has got to suck. Likewise, L4D @ 85fps is not giving you much versus L4D @ 75fps.
 

konakona

Diamond Member
May 6, 2004
6,285
1
0
Not only is the PII on equal footing with the i7, but OCing it has minimal impact according to this review. I shouldn't feel so bad that my 710 only managed to eke out a measly 3.12 :)
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: Idontcare
Need to throw a 5890 or GT300 at the same rigs and see if it separates the men from the boys.

It's interesting that if you arbitrarily draw a line at 60fps and say "below this, gameplay is unacceptable; above this, gameplay is indistinguishable", all these rigs land on the same side of that line for any given game.

Doesn't matter what you have; ArmA 2 @ 26.1fps has got to suck. Likewise, L4D @ 85fps is not giving you much versus L4D @ 75fps.

Yep, which is why for the games running at greater than 60 FPS they should really throw in the minimum framerates (well, minimums should always be included).
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Drop down to 4X AA and you'll probably see some separation, particularly in L4D. Of course everything is going to be close when you're spending vast amounts of GPU resources on increasingly irrelevant levels of AA.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: ViRGE
Drop down to 4X AA and you'll probably see some separation, particularly in L4D. Of course everything is going to be close when you're spending vast amounts of GPU resources on increasingly irrelevant levels of AA.

You can see that the 4890 is getting over 60 frames in Left 4 Dead, Call of Duty 5 and X3, so there will hardly be any benefit from lowering to 4AA; these games are already playable with 8AA. In much the same way that you called 8AA 'irrelevant', 100 frames over 80, or 80 over 60, is also irrelevant unless it impacts minimum framerates significantly.

In Crysis and ArmA 2 the performance is so low that there was no AA to begin with. Future games will likely be just as intensive, if not more so.

Also remember, when more intensive games come out, the 4890/GTX 275/285 will only become more stressed, shifting the bottleneck even further to the GPU. In other words, sooner or later, even without AA, any modern quad-core CPU will be more than sufficient because the videocard will be overstressed first.

I realize that tons of people want to get a Core i7 and therefore need to justify that purchase. That doesn't mean in any way that a Phenom II or, say, a 45nm 3.4GHz C2D will provide any less performance with today's videocards. Of course, with 4890s in CF or GTX 275s in SLI, a faster CPU will yield better results. But these benchmarks are strictly a reference for single-videocard users.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Those are the minimums; mind taking the time to show the averages too? :D
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: yh125d
Those are the minimums; mind taking the time to show the averages too? :D

yh125d, the lower numbers you are seeing in these graphs just correspond to the average framerate at the highest resolution; the higher numbers in the graphs are for the lower resolutions.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Chart-reading fail; I assumed the different resolutions would be in different charts.
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
Originally posted by: yh125d
Chart-reading fail; I assumed the different resolutions would be in different charts.

I like it. You don't have to jump around between charts to compare (and it's really pretty easy to figure out).

I liked the statement at the end. Just a reminder that, in spite of the minor performance differences between the two, the difference in cost can get you either another GPU (as they stated) or a much better single card. Or you can just keep the difference in your checking account (which is what I did).

 

DrMrLordX

Lifer
Apr 27, 2000
22,772
12,781
136
I'm a little surprised that they didn't do any benchmarks on CPU-bound games like World of Warcraft or what have you. Wasn't MS Flight Simulator X heavily CPU-bound as well?

edit: and yeah, I would have liked to have seen a heavily overclocked E8500 or something in there (4-4.5GHz+).
 

Bill Brasky

Diamond Member
May 18, 2006
4,324
1
0
Excellent article, and thanks for posting that, OP. Reality checks like that are very necessary for me because I constantly try to justify an i7 upgrade for my next rig. I hope we can get similar numbers when the i5 is released.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
I wonder how my PhII will hold up when the ATI 5xxx series hits. I'd imagine, judging from synthetic benches, that the i5s/i7s will start to run away from the PhII.
 

geokilla

Platinum Member
Oct 14, 2006
2,012
3
81
Should've made the comparison on CPU-intensive applications and stuff instead of gaming. Or better yet, some distributed computing goodness.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: geokilla
Should've made the comparison on CPU-intensive applications and stuff instead of gaming. Or better yet, some distributed computing goodness.

Why? It shows that the majority of users are still capped.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Zstream
Originally posted by: geokilla
Should've made the comparison on CPU-intensive applications and stuff instead of gaming. Or better yet, some distributed computing goodness.

Why? It shows that the majority of users are still capped.
It shows the majority of users in GPU-bound situations are still capped, which is a "well duh" kind of thing.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: geokilla
Should've made the comparison on CPU-intensive applications and stuff instead of gaming. Or better yet, some distributed computing goodness.

Without a doubt, the Core i7 platform with HT will be significantly faster than the Phenom system for any type of audio/video work and distributed computing. But the intention here was to see what the best value is if you are on a budget; in other words, get the second-fastest processor and put the rest into the videocard. Of course, with the Core i5 750 at $200 and some 1156 mobos at $80-100, Phenom II is now more or less irrelevant.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Scoop
What's the point of OCing the 920 only to 3.4GHz? These results are meaningless.

:confused: The performance at 3.4GHz was barely improved over 2.66GHz on a 4890, and it was also in line with the Phenom II. So running it at 4.0GHz would have produced little to no difference in these games.

It's very clear that in the games tested, any modern quad-core processor will be bottlenecked by the videocard. That was the whole point of the benchmarks.

For example, performance in your rig would improve by 3-4x if you had a 4890 or GTX 275, while getting a 4.0GHz Core i7 would give only a fractional boost.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Atechie
Nice...a GPU test masked as CPU test...

I see your point. However, I think it's fair to say a lot of people are going to be gaming at 1680x1050 with at least 4AA on a decent graphics card. Perhaps as future games become more multi-threaded, the Core i7 architecture will pull away.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
I should test some of these so you can get a comparison with my 4850 X2 2GB...