[Techspot] Then and Now: A decade of Intel CPUs compared, from Conroe to Haswell


MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
In multithreading and multitasking, in server and workstation environments. The FX8350, with its low IPC and slow caches, was not created to compete on the desktop, and especially not in games.
The Bulldozer architecture was created for throughput, not single-thread performance. It's simple: AMD couldn't create another dedicated desktop SKU, so they just used the server design for the desktop.
If GloFo had continued with 22nm SOI, a Steamroller 4-5 module SKU in 2013-2014 would have been very nice. Unfortunately for us, GloFo and AMD changed their plans and took the FinFET and mobile road.

Agreed -- had Vishera simply moved to 20nm as a stopgap product until Zen was ready, there would probably be a lot of people happy with it. Vishera was quite solid -- its biggest handicap IMO was the amount of heat generated, which a smaller fab process would have greatly improved.
 
Aug 11, 2008
10,451
642
126
Did you miss all the other games? Let's go over it again:

i3 4330 = 40 fps average / 33 minimum
FX9370 (at speeds achievable by an 8320/8350) = 59 fps average (nearly 50% faster) / 40 minimum
http://www.gamegpu.ru/images/stories/Test_GPU/Action/Ryse_Son_of_Rome/test/Ryse_proz.jpg


FX8350 at stock beating 2600K in Crysis 3.
proz.jpg


Project CARS: the FX series is 30% faster than an i3 on a 970.

pcars_win_nv_t2.png


Let's expand the testing:

i5-4690K@4.8 vs FX-8350@4.7 in 9 games (GTX 970)
https://www.youtube.com/watch?v=20QPpIqo-YY

Face the truth - an i3 is garbage compared to the FX8320/FX8350 @ 4.4-4.7GHz for budget gaming with a modern graphics card, yet they cost very similarly.

Let's also ignore how the FX8300 series wipes the floor with an i3 in productivity.

7zip-comp-oc.gif

7zip-decomp-oc.gif

x264-1-oc.gif

x264-2-oc.gif


This is just a stock 8350 trading blows with a 2600K in rendering.
cinebench.gif


PovRay
pov-chess.gif

pov-bench.gif


But ya, let's ignore FX8000 series overclocking, let's ignore that people do other things with their PCs besides gaming, let's ignore all well-threaded modern AAA games so that the i3 looks good again. No need to roll your eyes and ignore the key trends in gaming and PC usage. This forum is amazing: it will forever defend the garbage i3, just like it will forever defend how the i5 is good enough for games and the i7/X99 platform is a waste of $. I've been there before, when so many people defended dual cores, and they were all wrong. It's only going to get worse for the i3 when games move on to DX12. If a gamer can step up to the i5/i7, that's a good foundation for a solid gaming rig, but an i3 isn't, unless one specifically knows he/she will ONLY play dual-threaded games for the next 5 years.

No wonder so many tech enthusiasts have left AT for other forums. When people twist reality to fit their biased Intel/NV distortion field, who the heck wants to post here?

And BTW, I have just sold 2x 4790K and 2x 5820K systems to friends in the last 45 days.

OK, if the FX is so great, how come you are trying to justify it by comparison to a mid-range -- well, lower mid-range actually -- Intel i3? Which, by the way, uses about half the power of an 8350 and probably 1/3 or 1/4 that of a 9590 or an overclocked lower-tier FX.

And I don't really care what budget AMD has, or if the FX is an old design, or if they are stuck with a node disadvantage. Not my problem. I make my comparison product to product, with what is currently available in the marketplace. Now, there are certain scenarios where the FX makes sense, but they are *very* limited, which is reflected in its market share. Even for gaming, you have to cherry-pick highly multithreaded games to make the FX look better, or at least competitive, against even an i3. And a *stock* 4690K, which you could overclock, pretty much destroys any FX in gaming.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
You really can't look at a CPU purely based on power draw...

You need to know the performance of said CPU to get a performance : power ratio.

I agree it makes the AMD CPU look absolutely terrible without doing a P : P ratio, which is completely UNFAIR.


Yea, that's why I posted the benches for those power draw numbers. AMD obviously doesn't match Intel in performance per watt. But when you consider that those power draw numbers are for CPUs that finished 2nd and 4th in those benches, it's not as bad as the OP makes it look based on the power graphs alone. It also placed respectably in a lot of the other benches, too.
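
To make the P : P idea concrete, here's a minimal sketch of the ratio calculation. The scores and wattages below are hypothetical placeholders, not figures from the review:

# Toy performance : power ratio. Scores and wattages are made-up
# placeholders, not measurements from the article.
systems = {
    "FX-8350":  {"score": 680, "watts": 200},
    "i5-4690K": {"score": 720, "watts": 120},
}

for name, s in systems.items():
    print(f"{name}: {s['score'] / s['watts']:.2f} points per watt")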
 

BigDaveX

Senior member
Jun 12, 2014
440
216
116
The Bulldozer architecture was created for throughput, not single-thread performance. It's simple: AMD couldn't create another dedicated desktop SKU, so they just used the server design for the desktop.

Um, Intel also uses the same architecture for their desktop CPUs as they do for their server ones. The basic cores in a desktop i7 and a Xeon are one and the same; the latter just have vastly beefed-up uncores.

Besides, designing for servers first wasn't the issue. That strategy worked incredibly well for K8, and would likely have worked for K10 if not for the first revision being too slow and too late, and the second coming up against such a beast as Nehalem. Bulldozer was just badly designed, and the succeeding revisions couldn't close the gap well enough.

Oh yeah, you can't compare CPUs from different time periods, even though that was the basis of the article in the OP.

What a joke -- fanboys contort themselves into such absurd positions.

So, a heavily overclocked CPU has worse power consumption than a stock CPU released four years later. And that proves... what, exactly?
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Um, Intel also uses the same architecture for their desktop CPUs as they do for their server ones. The basic cores in a desktop i7 and a Xeon are one and the same; the latter just have vastly beefed-up uncores.

Besides, designing for servers first wasn't the issue. That strategy worked incredibly well for K8, and would likely have worked for K10 if not for the first revision being too slow and too late, and the second coming up against such a beast as Nehalem. Bulldozer was just badly designed, and the succeeding revisions couldn't close the gap well enough.

The architecture is the same, but the SKUs are not. Much like AMD's server-derived Vishera FX8350 versus the Trinity APUs.

Intel's Sandy Bridge Core i7 2600K and Core i7 3820 use the same architecture, but the SKUs are way different.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
OK, if the FX is so great, how come you are trying to justify it by comparison to a mid range, well lower mid range actually, Intel i3? Which by the way uses about half the power of an 8350 and probably 1/3 or 1/4 that of the 9590, or an overclocked lower tier FX.

Because, generally speaking, when you put together an FX8000 series platform with a mobo, the cost is much closer to an i3 than to a K series i5. Did you not read my post where I said that if someone can stretch their budget to a K series i5 or i7, I would recommend those? Did I at all recommend the 9590? I am only using it as a stand-in for an overclocked FX8300 series chip, since I can't readily find an FX8350 @ 4.7GHz result in 5 minutes, but it doesn't change my point.

And I don't really care what budget AMD has, or if the FX is an old design, or if they are stuck with a node disadvantage. Not my problem.

OK, so you operate in a vacuum then. It shouldn't be surprising that the FX8000/9000 series uses way more power than modern i5/i7 CPUs, since AMD has an R&D and node disadvantage.

I make my comparison product to product, with what is currently available in the marketplace. Now, there are certain scenarios where the FX makes sense, but they are *very* limited, which is reflected in its market share.

Market share doesn't always reflect that one product is better than another; oftentimes it's the perception that a product is better. For example, millions of people buy the 2015 Toyota Corolla with drum brakes, or Bose and Beats products, but anyone who knows anything about cars, speakers or headphones will straight up tell you those products are all trash. It doesn't stop Beats from having close to 65% of all profits/market share in the headphone market.

Even for gaming, you have to cherry-pick highly multithreaded games to make the FX look better, or at least competitive, against even an i3.

No, you don't need to cherry-pick. In almost all games, the FX8320/8350 @ 4.4-4.5GHz will beat any i3 system for gaming and easily trade blows with a stock i5 2500K. Is it a great outcome? No, obviously not, when an i5 4690K @ 4.5GHz is a better CPU, but the amount of hate FX8000 series products get is not commensurate with their price/performance and actual performance.

And a *stock* 4690K, which you could overclock, pretty much destroys any FX in gaming.

No wonder, and it should. Intel has 100X the market cap, tens of thousands more engineers and billions of dollars more money. You can't be serious if you expect a firm with a $1.5B market cap to have a CPU in 2015 that can beat Intel's best products. :rolleyes: You keep saying it doesn't matter, but in this industry performance, performance/watt and overclocking often go hand-in-hand with the manufacturing node. Intel specifically lists its manufacturing node as a key competitive advantage.

If AMD had the resources or capability to move the FX8320 to a 20-22nm node in 2015, then at least the comparison to the i5 4690K would be a lot more reasonable.

The Bulldozer architecture came out October 12, 2011. Haswell came out June 1, 2013. Haswell had a huge node advantage, a new architecture, and it was 1.5 years newer. Now you want to compare the i5 4690K vs. the FX8300 series and conclude that AMD is not competitive? Duh! Thanks for stating the obvious, Mr. Obvious.

That's why some posters in this thread responded to the crazy claims about Vishera's power usage. In the era when it was created, it was never meant to compete with 22nm Haswell. The reason it continues to sell today isn't that AMD intended for Bulldozer/Vishera, etc. to compete with 14nm-22nm Haswell/Skylake processors, but that AMD ran out of $ to refresh their CPU architectures.

We might as well start comparing the Pentium 3/4 to the Athlon 64/X2 and conclude that Intel is done for; but Intel continued to pour money into new architectural designs. The difference is that AMD doesn't have that $, which is why a largely unchanged CPU architecture from Q4 2011 is what AMD is selling today.

Still, regardless of 2015, you seem to have missed how so many people bashed the FX8000 series not in 2015, but in 2011, 2012 and 2013, when the gap in performance wasn't nearly as large as it is today. I mean, one could overclock an FX8150 to 4.4GHz all the way back in 2011. I am in no way suggesting anyone build an FX8000/9000 series system over a K series i5 today; I'm simply reflecting on the overall context.
 
Last edited:

BigDaveX

Senior member
Jun 12, 2014
440
216
116
The architecture is the same, but the SKUs are not. Much like AMD's server-derived Vishera FX8350 versus the Trinity APUs.

Intel's Sandy Bridge Core i7 2600K and Core i7 3820 use the same architecture, but the SKUs are way different.

"Way different?" The latter had two extra memory channels, another 20 PCIe lanes, and 2MB extra L3 cache. They were nice things to have, but they didn't make a huge difference in performance terms.

Granted, the 3820 also lacked an iGPU, but considering most enthusiasts probably used the 2600K with a dGPU, that would have made even less difference.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
In multithreading and multitasking, in server and workstation environments. The FX8350, with its low IPC and slow caches, was not created to compete on the desktop, and especially not in games.
The Bulldozer architecture was created for throughput, not single-thread performance. It's simple: AMD couldn't create another dedicated desktop SKU, so they just used the server design for the desktop.
If GloFo had continued with 22nm SOI, a Steamroller 4-5 module SKU in 2013-2014 would have been very nice. Unfortunately for us, GloFo and AMD changed their plans and took the FinFET and mobile road.

All the above is revisionist history. Bulldozer and its derivatives killed AMD.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
"Way different?" The latter had two extra memory channels, another 20 PCIe lanes, and 2MB extra L3 cache. They were nice things to have, but they didn't make a huge difference in performance terms.

Granted, the 3820 also lacked an iGPU, but considering most enthusiasts probably used the 2600K with a dGPU, that would have made even less difference.

I'm sorry, but all those differences you just mentioned put one die at 240mm2 and the other at 300mm2. Those two SKUs are way different and made for different segments. In server workloads where cache and memory bandwidth are needed, the Core i7 3820 would be a lot faster than the Core i7 2600K. Not to mention you can have dual- or quad-socket versions of it and double the memory capacity per socket.

A quad-module, 8-thread Trinity would have sold more than the FX8350 back in 2012-2013.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Still, regardless of 2015, you seem to have missed how so many people bashed the FX8000 series not in 2015, but in 2011, 2012 and 2013, when the gap in performance wasn't nearly as large as it is today. I mean, one could overclock an FX8150 to 4.4GHz all the way back in 2011. I am in no way suggesting anyone build an FX8000/9000 series system over a K series i5 today; I'm simply reflecting on the overall context.

My early 2012 FX8150 gaming review, from a time when people were saying the FX cannot game, simply because reviews at the time were benchmarking at sub-720p at low settings :rolleyes:

zn0vhi.jpg


330y1ch.jpg


dxflzl.jpg


1zgsqbs.jpg


149s9kh.jpg


29x985k.jpg


2mrwbw4.jpg


2aam2bk.jpg
 

cytg111

Lifer
Mar 17, 2008
26,163
15,587
136
Gaming_02.png


Core 2 Duo/Quad is just not enough ;)

Something is fishy: Q9650 at 22 -> i5-760 at 60, that's almost a threefold increase from Core 2 to Nehalem... it's not AVX.
edit: lots of other inconsistencies, I can't put my finger on the common denominator just yet
 
Last edited:

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Besides, designing for servers first wasn't the issue. That strategy worked incredibly well for K8, and would likely have worked for K10 if not for the first revision being too slow and too late, and the second coming up against such a beast as Nehalem. Bulldozer was just badly designed, and the succeeding revisions couldn't close the gap well enough.

Anyone saying that Bulldozer is suitable for servers is either clueless about the server market or has commercial interests in selling the chip. Once you put the right pieces together, it's not hard to understand why AMD's server market collapsed when Sandy Bridge arrived on 32nm in 2012. Bulldozer and its descendants are everything a server processor should *not* be.
 
Last edited:

Makaveli

Diamond Member
Feb 8, 2002
4,975
1,571
136
The article's intention was to include Intel CPUs over the last decade. That should include Pentium Ds (some models are from 2006).

And saying it's hard to get hold of a Pentium D is a really poor excuse. They are quite easy to come by, e.g. from eBay or whatever.


So you think your personal experiences should determine which CPUs they should include in the test? I know people using both Pentium 4 and old Conroe models, but not many of either. Still, it doesn't matter, since the article's intention was to compare "Intel CPUs over the last decade", not "Intel CPUs over the last decade that are still commonly used by the social network of some selected AnandTech forum members". ;)

Finally, if this was a test to compare Intel CPUs, why did they cherry-pick a single AMD CPU with high power consumption, for the sole purpose of "winning" the highest-power-consumption award? Either they should keep it strictly to Intel CPUs, or include several other CPUs from AMD's lineup over the years too. Failing to do so is obviously biased, whether intentional or not.

PS. I noticed that they added an "alibi" 7870K too, but that does not really change much. I'm not sure what it's doing in their "Intel test" to begin with, and if it's an "Intel and AMD test", then there should be plenty more AMD CPUs included.

Then I recommend you write them an email about said bias and send them a P4 so they can update the article...
 

Azuma Hazuki

Golden Member
Jun 18, 2012
1,532
866
131
Use the right tool for the job. This requires knowing yourself, your workload, your OS, and your budget. There is (almost) no such thing as a bad CPU in the modern era, rather, bad matches between hardware and workload.

I'm a heavy Linux user, Gentoo specifically, and would be able to take advantage of properly-optimized code on Vishera. Most people are not and would not, and for performance would be forced to spend more on an i5 or i7.

For average everyday users the A8 is a much better value than any i3, and the A10 is better in most regards for the same money.

Intel wins the performance crown, and definitely performance per watt, but not always performance per currency-unit.

For the best experience, all aspects of your system must be in harmony to eliminate bottlenecks. I'd rather have an A8-7600 with an SSD than an i5-4460 with a standard HDD, for example.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
No, you don't need to cherry-pick. In almost all games, the FX8320/8350 @ 4.4-4.5GHz will beat any i3 system for gaming and easily trade blows with a stock i5 2500K. Is it a great outcome? No, obviously not, when an i5 4690K @ 4.5GHz is a better CPU, but the amount of hate FX8000 series products get is not commensurate with their price/performance and actual performance.

There's a significant number of games where the FX either doesn't beat the i3 or doesn't offer significantly more performance, especially when ST performance is necessary. You can use smaller form factors and a cheaper PSU with Intel chips, and then there's the ancient AM3+ platform; not everyone wants to stick with it.

The FX was a decisively inferior product from AMD. Not everyone wants to deal with the ancient platform, the atrocious power consumption and the subpar ST performance that affects a lot of popular games and applications out there, and this is just talking about the enthusiast market, which is quite a small niche to start with. The average Joe shouldn't touch the FX series even with a barge pole. All that to save, what, 30-40 bucks on an FX processor? That's enough to buy a pizza in most places.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
All the above is revisionist history. Bulldozer and its derivatives killed AMD.

Even that is revisionist history. The decline of the PC is what is killing AMD. The Stars core could have continued (and it was in fact moved to 32nm for Llano, so a lot of the engineering work for the die shrink had already been performed) if Bulldozer was truly that bad. The bottom line is -- it isn't.

The reality is FX ended up being an average product, not the game changer that it was (over)hyped to be.
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Data taken from the review in the OP:

Performance per Watt (higher is better)


54buk5.jpg


But the best way to measure perf/watt is at the same performance or at the same power consumption.

Another interesting point to note is that the Core i5 2500K has higher perf/watt than the Core i7 2700K. That is most probably because Sandy Bridge's HT doesn't scale performance enough (unlike Haswell's) to overcome the quad-core i5 2500K.

Also note that, for a 125W TDP, the FX 8350 has a nice perf/watt in this application, being the second fastest CPU in the review.

7-zip is a terrible measure of general performance as it is so dependent on the caching and memory subsystem of the CPU.

7-zip performance on Intel CPUs has also not increased appreciably since SB.

The last thing to remember is that the above test includes platform power, so lower-power CPUs are unjustly penalized, as the active CPU power is a smaller fraction of system power compared to that of a higher-power CPU.

This is why the Haswell i3 is so far behind the i7 when, to all intents and purposes, they should be roughly equal, and why the Haswell Pentium shows half the efficiency of the much higher-clocked i7.
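
To put numbers on that platform-power effect, here's a rough sketch. The wall-power figures, scores and the 45 W platform floor are all hypothetical assumptions, not the review's data:

# Hypothetical illustration: wall power includes a roughly fixed platform
# floor (board, RAM, drives, PSU losses). Subtracting an assumed floor
# shows how low-power CPUs are penalized by wall-only measurements.
PLATFORM_FLOOR_W = 45  # assumed constant platform power, in watts

systems = {
    "Haswell Pentium": {"score": 310, "wall_watts": 75},
    "Haswell i7":      {"score": 980, "wall_watts": 150},
}

for name, s in systems.items():
    at_wall = s["score"] / s["wall_watts"]
    cpu_only = s["score"] / (s["wall_watts"] - PLATFORM_FLOOR_W)
    print(f"{name}: {at_wall:.2f} pts/W at the wall, "
          f"{cpu_only:.2f} pts/W after subtracting the platform floor")

At the wall the Pentium looks barely half as efficient as the i7; once the assumed floor is removed, the two land in the same ballpark.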
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Even that is revisionist history. The decline of the PC is what is killing AMD. The Stars core could have continued (and it was in fact moved to 32nm for Llano, so a lot of the engineering work for the die shrink had already been performed) if Bulldozer was truly that bad. The bottom line is -- it isn't.

The reality is FX ended up being an average product, not the game changer that it was (over)hyped to be.

FX was noticeably worse than PII x6 at the same clocks (legacy software) and would have been significantly worse than a PII x8 with tweaks (32nm, AVX, power saving features).

BD really was that bad. But AMD had/has nothing in the pipeline until Zen. That's precisely why Zen is going to be much more similar to PII than to BD.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
My early 2012 FX8150 gaming review, from a time when people were saying the FX cannot game, simply because reviews at the time were benchmarking at sub-720p at low settings :rolleyes:

http://i58.tinypic.com/zn0vhi.jpg

http://i61.tinypic.com/330y1ch.jpg

http://i62.tinypic.com/dxflzl.jpg

http://i60.tinypic.com/1zgsqbs.jpg

http://i60.tinypic.com/149s9kh.jpg

http://i60.tinypic.com/29x985k.jpg

http://i60.tinypic.com/2mrwbw4.jpg

http://i60.tinypic.com/2aam2bk.jpg
Nonsense, because on low settings the 8150 would look better, as many settings impact the CPU in modern games. And there are plenty of games where an 8150 can't even average anywhere near 60 fps, never mind stay above it, while any i5 can.
 
Last edited:

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
7-zip is a terrible measure of general performance as it is so dependent on the caching and memory subsystem of the CPU.

It is so bad at measuring performance that AnandTech uses it to estimate server performance in ST and MT.

So it is totally relevant; it's just that it doesn't suit your preferences. Last time it was Fritzbench that you deemed ancient and irrelevant, so what's next? Something like this, perhaps:

I'm surprised we made it to 18 posts without Hardware.fr graphs.

http://forums.anandtech.com/showpost.php?p=37550856&postcount=2782

LoL...
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
It is so bad at measuring performance that AnandTech uses it to estimate server performance in ST and MT.

So it is totally relevant; it's just that it doesn't suit your preferences. Last time it was Fritzbench that you deemed ancient and irrelevant, so what's next? Something like this, perhaps:

http://forums.anandtech.com/showpost.php?p=37550856&postcount=2782

LoL...

Do I really have to post what AT says for something like the 3rd or 4th time?

7-zip is very low-IPC and very bandwidth-intensive. It is good for showing some aspects of IPC, but it is not a good general-purpose indicator.

74921.png


WinRAR (very similar to 7-zip) shows similar gains -- unless Broadwell really is capable of >35% IPC gains over Haswell. AT's benchmark uses a lot of small files, which the cache helps with immensely.

The 7-zip benchmark operates on a 32 MB file, meaning that Broadwell-C can fit the entire dataset in its 128 MB eDRAM cache and run without accessing main memory at all.

You can easily see why compression benchmarks suck at measuring pure IPC.

4%20WinRAR%204.2.png


The problem with Fritzbench was that it showed almost no IPC gain from Conroe to Haswell, which is fine as far as a benchmark goes. But to use Fritz and solely Fritz to determine performance is wrong; most (nearly all) real-world software will show much greater gains going from Conroe to Haswell.
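
As a back-of-the-envelope check on the cache point, here's a trivial sketch; the LLC figures below are the published sizes as I recall them, so treat them as assumptions:

# Does the 7-zip benchmark's ~32 MB working set fit in the last-level
# cache? Cache sizes are assumptions based on published specs.
WORKING_SET_MB = 32

last_level_cache_mb = {
    "i7-4770K (Haswell, 8 MB L3)": 8,
    "i7-5775C (Broadwell-C, 128 MB eDRAM)": 128,
    "FX-8350 (8 MB L3)": 8,
}

for chip, mb in last_level_cache_mb.items():
    verdict = "fits in cache" if mb >= WORKING_SET_MB else "spills to DRAM"
    print(f"{chip}: {verdict}")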
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Do I really have to post what AT says for something like the 3rd or 4th time?

7-zip is very low-IPC and very bandwidth-intensive. It is good for showing some aspects of IPC, but it is not a good general-purpose indicator.



WinRAR (very similar to 7-zip) shows similar gains -- unless Broadwell really is capable of >35% IPC gains over Haswell. AT's benchmark uses a lot of small files, which the cache helps with immensely.

The 7-zip benchmark operates on a 32 MB file, meaning that Broadwell-C can fit the entire dataset in its 128 MB eDRAM cache and run without accessing main memory at all.

You can easily see why compression benchmarks suck at measuring pure IPC.


The problem with Fritzbench was that it showed almost no IPC gain from Conroe to Haswell, which is fine as far as a benchmark goes. But to use Fritz and solely Fritz to determine performance is wrong; most (nearly all) real-world software will show much greater gains going from Conroe to Haswell.


Hardware.fr and the AT desktop reviewers do not use 7-Zip's integrated bench; they compress an actual folder and measure the time. This does indeed bias the bench, since memory and any caches can have an influence.

As for Fritzbench, it is variable; it did show a healthy 8% for the Haswell i5, which is not negligible for an integer improvement, as that's above average. Most improvements are in FP, and that's why such benches are misleading, as they help to cancel out the much lower improvement in anything integer -- that is, in 99% of the software used by PC users.

http://www.hardware.fr/articles/897-25/gains-moyennes-cpu.html
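
For what it's worth, that folder-compression methodology is trivial to reproduce at home. A minimal sketch, assuming the 7z command-line tool is installed; the folder and archive names are placeholders:

# Time a real-world compression run the way those reviews do: compress
# an actual folder with 7z and measure wall-clock time.
import subprocess
import time

SRC_FOLDER = "test_data"  # placeholder: any folder of mixed files
ARCHIVE = "out.7z"        # placeholder output archive name

start = time.perf_counter()
subprocess.run(["7z", "a", ARCHIVE, SRC_FOLDER],
               check=True, stdout=subprocess.DEVNULL)
elapsed = time.perf_counter() - start
print(f"Compressed {SRC_FOLDER} in {elapsed:.1f} s")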
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Hardware.fr and the AT desktop reviewers do not use 7-Zip's integrated bench; they compress an actual folder and measure the time. This does indeed bias the bench, since memory and any caches can have an influence.

As for Fritzbench, it is variable; it did show a healthy 8% for the Haswell i5, which is not negligible for an integer improvement, as that's above average. Most improvements are in FP, and that's why such benches are misleading, as they help to cancel out the much lower improvement in anything integer -- that is, in 99% of the software used by PC users.

http://www.hardware.fr/articles/897-25/gains-moyennes-cpu.html

Yep. You can see in AT's test the effect that the cache has on Broadwell-C.

7-zip will be similar but even more synthetic. Thus it is not suitable for a general indication of IPC (but it is very useful to have as a subtest).

Fritz showed some gains for Haswell but minimal gains for other Intel chips. Fritz is also one of the few (and decreasing in number) tests where the PII whoops BD.

IMG0041504.png


The PII x4 whoops Trinity, and the PII and the 8350 are roughly equal in terms of perf/GHz (not normalizing to the number of cores).

IMG0047549.png


It can also be seen that for AMD, the improvements in Fritz are around half of the general performance improvements of the 8350 over the PII x6. Fritz simply does not scale well with general performance improvements. (Note that there are basically no IPC improvements in Fritz from SB to IVB -- compare the i3-2130 vs. the 3240, both at 3.4 GHz.)
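
To make the perf/GHz comparison explicit, a trivial sketch; the scores and clocks are illustrative placeholders, not the exact chart values:

# Normalize a benchmark score by clock speed to compare per-clock
# throughput. Numbers are illustrative placeholders only.
chips = {
    "Phenom II X6 1100T": {"score": 8.2, "ghz": 3.3},
    "FX-8350":            {"score": 9.9, "ghz": 4.0},
}

for name, c in chips.items():
    print(f"{name}: {c['score'] / c['ghz']:.2f} points per GHz")

Both land around 2.5 points per GHz, which is the "roughly equal per clock" point above.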
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Something is fishy: Q9650 at 22 -> i5-760 at 60, that's almost a threefold increase from Core 2 to Nehalem... it's not AVX.
edit: lots of other inconsistencies, I can't put my finger on the common denominator just yet

Core 2 did scale well in Tomb Raider (however, this game is known to be light on the CPU):

Gaming_04.png


P.S. Nehalem did really well in this thread (against modern processors), but there were no Core 2 processors for comparison --> http://forums.anandtech.com/showthread.php?t=2437733