[Techspot] Then and Now: A decade of Intel CPUs compared, from Conroe to Haswell

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Aug 11, 2008
10,451
642
126
And what if they didn't have any P4s, or weren't able to get one?

Those systems are 10+ years old.

I don't see the problem with stopping at Conroe. I don't know anyone still using a P4, while I know people still on Conroe. I think you are looking for bias where there isn't any.

So who cares if they had used some Pentium D to make FX look better? Intel isn't still selling the Pentium D, while the FX-8350 is top of the line for AMD. But who is surprised? The usual claims of bias against AMD by the usual suspect. Doesn't he ever get tired of trying to spin AMD's failings?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Core 2 Duo/Quad is just not enough ;)

Once I saw SC2 results, I ditched my Q6600 @ 3.4GHz in no time. There is a certain group on our forum that continued to spread BS about how the move from Core 2 Quad to Nehalem/Lynnfield was not that big while the move from 1st-gen i5/i7 to SB was somehow huge, but the reality was the complete opposite.

For anyone who plays SC2 games, Core 2 Quad was a pile of junk - from the same site for consistency:
[Image: StarCraft II CPU benchmark chart]


Not everyone plays SC2 or strategy games, though, but the low IPC and small/slow cache of C2Q were known 5 years ago.

Some of their other gaming conclusions, such as there being little difference between SB and Haswell or between i5s and i7s, are flawed, since some gamers use 980 Ti SLI and 4K gaming, at which point a 5930K @ 4.5GHz would destroy an i5 4690K in some well-threaded games. Also, some of us might have 20-50 tabs open for work/productivity and then, instead of closing all of those programs, run a game. At that point an i7 would be superior to an i5. They did test their platform with a 980, which is enough to show how far behind the older stock processors are, but certainly not enough to differentiate the more modern generations, i.e. i5 2500K/2600K vs. 5820K-5960X.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So who cares if they had used some Pentium D to make FX look better? Intel isn't still selling the Pentium D, while the FX-8350 is top of the line for AMD. But who is surprised? The usual claims of bias against AMD by the usual suspect. Doesn't he ever get tired of trying to spin AMD's failings?

It hasn't been for a long time. The top of the line is the 9590.

Considering the FX8000 series often costs as much as an i3 (the FX8320 is only $119 and can easily overclock to 4.4-4.5GHz), no one in their right mind would expect it to compete with an overclocked i5 4690K, 4790K, or 5820K.

[Image: GameGPU Armored Warfare CPU benchmark]


[Image: GameGPU F1 2015 CPU benchmark]


[Image: GameGPU The Witcher CPU benchmark]


With Windows 10, performance is improving for multi-core CPUs in many DX11 games.
[Image: GameGPU World of Tanks 9.9 CPU benchmark (Windows 10)]


Sure, we all know that a 4.5GHz modern i5 smokes the FX8000 series, but what happens if you take an FX8150/8350 and overclock it to 4.5-4.7GHz? It'll destroy an i3.

[Image: GameGPU Grand Theft Auto V CPU benchmark]


[Image: GameGPU Ryse: Son of Rome CPU benchmark]


[Image: CPU benchmark chart]


[Image: Crysis 3 CPU benchmark]


There is wayyyy too much hatred for the FX8000/9000 series on AT forums, usually coming from the same suspects that hate all things made by AMD, including their graphics going back to the HD4800 series. It's a miracle AMD's CPU division even survived this long competing two nodes behind, but for a budget gamer there is no doubt that the FX8300 series + OC is far superior to any i3 Intel offers.

Certainly the i3 doesn't get the same hate as the FX8150/8320/8350/9370 CPUs, but an i3 is a worse product overall compared to those CPUs when it comes to a balance of work/productivity, overclocking, and gaming performance.

It's amusing how so many are delusional in thinking AMD, with its market cap of $1.5B, should be able to compete with and beat both NV ($10.73B) and Intel ($137B). In no other industry in the world could a $1.5B firm be expected to outperform two competing firms, one of which is basically a pure GPU business with a market value of nearly $11B, and the other a company nearly 100x larger in market cap.

Let's be thankful AMD is at least still providing competition to NV and Intel in some form, and hopefully Zen is an improvement over the 9590.
 
Last edited:

ninaholic37

Golden Member
Apr 13, 2012
1,883
31
91
"In 2006 the Core 2 Duo E6600 retailed for $316 and in its place today we have the Core i7-4790K for roughly the same price at $339. The 4790K is clocked at almost twice the frequency, features twice as many cores, and four times as many threads.

Actual performance gains are more impressive.

The Core i7-4790K is eleven times faster in Excel 2013 and Hybrid x265, six times faster in 7-Zip, Photoshop CC, and HandBrake. When it came to gaming, the 4790K was twice as fast in BioShock Infinite and Crysis 3, seven times faster in Metro Redux, three times faster when testing with Hitman Absolution and just barely faster in Tomb Raider."
Since the Core 2 Duo E6600 is 2.4GHz and the i7-4790K is 4.0/4.4GHz, and BioShock Infinite and Crysis 3 are only twice as fast, does that mean IPC hasn't really increased at all? And does 'barely faster' in Tomb Raider mean it's a regression?
 

BigDaveX

Senior member
Jun 12, 2014
440
216
116
Since the Core 2 Duo E6600 is 2.4GHz and the i7-4790K is 4.0/4.4GHz, and BioShock Infinite and Crysis 3 are only twice as fast, does that mean IPC hasn't really increased at all? And does 'barely faster' in Tomb Raider mean it's a regression?

No, it means that the GPU is the limiting factor in those situations. In situations that are mostly CPU-dependent, we get things like the Excel test (~11x speedup) or the various encoding tests (around 7x to 10x speedup, depending on the software).
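The back-of-envelope arithmetic here is easy to check. A rough sketch (using only the clock speeds and speedup figures quoted above; turbo behavior is simplified to the 4.4GHz maximum):

```python
# Back-of-envelope: split a benchmark speedup into the part explained
# by clock speed alone and the residual left for IPC/core-count gains.
# Clock figures are the ones quoted in the thread.

e6600_clock = 2.4     # GHz, Core 2 Duo E6600
i7_4790k_clock = 4.4  # GHz, i7-4790K max turbo (simplified)

clock_ratio = i7_4790k_clock / e6600_clock  # ~1.83x from frequency alone

def residual_gain(total_speedup):
    """Speedup remaining after factoring out the frequency advantage."""
    return total_speedup / clock_ratio

# CPU-bound test (Excel, ~11x total): a large residual, i.e. genuine
# IPC and core-count gains on top of the higher clocks.
print(round(residual_gain(11.0), 2))  # 6.0

# GPU-limited game (~2x total): residual is ~1.09x, which says nothing
# about IPC -- the GPU, not the CPU, is setting the frame rate.
print(round(residual_gain(2.0), 2))   # 1.09
```

The point is that a ~2x result in a GPU-limited game is a floor set by the graphics card, not a measurement of CPU progress.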
 

ninaholic37

Golden Member
Apr 13, 2012
1,883
31
91
No, it means that the GPU is more the limiting factor in those situations. In situations that are mostly CPU-dependant, we get things like the Excel test (~11x speedup) or the various encoding tests (around 7x to 10x speedup depending on the software).
OK, I thought those tests were better because they used multithreading, and that the gaming tests were the true test of actual speed. But it's hard to know for sure.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Did you just use an OCed chip as an excuse?

You need to have your eyesight checked.

At stock clocks the i7 920 pulls the same amount of juice as the FX-8350. When the i7 is overclocked to the same clock speed as the FX, it pulls close to double the electricity of the FX. So this thread is a ton of drama over nothing.

Considering the stock clock speed of the FX and its die size, its desktop power consumption is pretty average. As others have already noted, a single-core Pentium 4 draws a similar amount of electricity to an eight-core FX... so gains have clearly been made over the past 10 years.

BTW, crappy Vishera? LOL. I guess the i7 3770K is crappy too, since we Linux users know that an FX-8350 runs roughly as fast as the i7 3770K under Linux (the FX is artificially gimped by the Intel compiler under Windows most of the time).

http://www.phoronix.com/scan.php?page=article&item=amd_fx8350_visherabdver2&num=1
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Data taken from the review in the OP:

Performance per Watt (higher is better)


[Image: performance-per-watt comparison chart]


But the best way to measure perf/watt is at the same performance or at the same power consumption.

Another interesting point to note is that the Core i5 2500K has higher perf/watt than the Core i7 2700K. That is most probably because Sandy Bridge HT doesn't have high enough performance scaling (unlike Haswell's) to overcome the quad-core i5 2500K.

Also note that, for a 125W TDP part, the FX 8350 has a nice perf/watt in this application, being the second-fastest CPU in the review.
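The iso-performance point is worth spelling out. A toy calculation (all numbers below are invented for illustration; none come from the review):

```python
# Toy illustration of why a raw perf/watt ratio can mislead, and why
# comparing at matched performance (or matched power) is fairer.
# All numbers here are made up for the example.

def perf_per_watt(score, watts):
    return score / watts

# Chip A: slower but frugal. Chip B: faster but power-hungry.
a_ppw = perf_per_watt(100, 50)   # 2.0 points/W
b_ppw = perf_per_watt(150, 100)  # 1.5 points/W
print(a_ppw > b_ppw)  # True: the naive reading says A "wins"

# But dynamic power scales roughly with frequency * voltage^2, so
# chip B throttled down to A's performance level would draw far less
# than 100 W, and the naive ranking can flip at matched performance.
# E.g. B at reduced clock and voltage might hypothetically draw ~45 W:
b_matched_ppw = perf_per_watt(100, 45)  # hypothetical downclocked B
print(b_matched_ppw > a_ppw)  # True under this (made-up) assumption
```

This is why the post argues that perf/watt is best measured at the same performance or the same power consumption, rather than at each chip's stock operating point.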
 

BigDaveX

Senior member
Jun 12, 2014
440
216
116
At stock clocks the i7 920 pulls the same amount of juice as the FX-8350. When the i7 is overclocked to the same clock speed as the FX, it pulls close to double the electricity of the FX. So this thread is a ton of drama over nothing.

I'm sure that comparison would be meaningful if the FX-8350 had actually been competing against Bloomfield. But Intel were on the Ivy Bridge core by the time AMD released the Piledriver FXes, and at both stock and overclocked speeds, Ivy Bridge's power consumption was much lower than the FX-8350's (to say nothing of the 9000 series that came along later).

Really, all your argument goes to show is that AMD were wrong to pursue the low-IPC, high-GHz strategy. And yeah, that's pretty much correct, but we all worked that out already.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
You need to have your eyesight checked.

At stock clocks the i7 920 pulls the same amount of juice as the FX-8350. When the i7 is overclocked to the same clock speed as the FX, it pulls close to double the electricity of the FX. So this thread is a ton of drama over nothing.

Considering the stock clock speed of the FX and its die size, its desktop power consumption is pretty average. As others have already noted, a single-core Pentium 4 draws a similar amount of electricity to an eight-core FX... so gains have clearly been made over the past 10 years.

BTW, crappy Vishera? LOL. I guess the i7 3770K is crappy too, since we Linux users know that an FX-8350 runs roughly as fast as the i7 3770K under Linux (the FX is artificially gimped by the Intel compiler under Windows most of the time).

http://www.phoronix.com/scan.php?page=article&item=amd_fx8350_visherabdver2&num=1

You can't cross-compare completely different setups. Everyone knows that. Especially not from such a different time period.

And it was you talking about 4GHz.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I'm sure that comparison would be a meaningful one if the FX-8350 had been actually competing against Bloomfield. But Intel were on the Ivy Bridge core by the time AMD released the Piledriver FXes, and at both stock and overclock, Ivy Bridge's power consumption was much lower than the FX-8350 (to say nothing of the 9000-series that came along later).

Really, all your argument goes to show is that AMD were wrong to pursue the low-IPC high-GHz strategy. And yeah, that's pretty much correct, but we all worked that out already.

The problem with the FX8350 is that people are comparing it to the wrong Sandy Bridge SKUs. The 125W-TDP, no-iGPU FX8350 should be compared to the 130W, no-iGPU Core i7 3820.
Both of those are made for the same segment (servers and workstations), they have the same die size, and their platforms are comparable in features.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
With Windows 10, performance is improving for multi-core CPUs in many DX11 games.
Indeed, that will be awesome someday. Of course, Windows 10 is still an alpha. Anyone know when they plan on getting off their duffs and turning it into a beta?

[Image: GameGPU World of Tanks 9.9 CPU benchmark (Windows 10)]


Sure, we all know that a 4.5GHz modern i5 smokes the FX8000 series, but what happens if you take an FX8150/8350 and overclock it to 4.5-4.7GHz? It'll destroy an i3.
Yes, like the fastest AMD CPU destroyed the i3 in the above graph? (Being higher on these graphs means higher performance, btw.) So, if they overclock their FX 8350s to 4.7GHz, they will be nearly as fast as an unoverclocked i3. Wow, that really is inspiring, isn't it? Maybe with some LN2 or $400 of custom water, they could actually overtake the i3! :rolleyes:
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
That old i3 2100 still owns, and the i3 4330 that replaced it is a direct killer.

If anyone ever wondered why AMD is down to ~2% of x86 revenue, they got another answer to add to the huge pile that keeps building up.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106

It doesn't matter that you picked a single example. The i3 2100 still beats 4- and 6-core FX CPUs on a daily basis. And we're talking about the cheapest i3 from 4½ years ago, with a 65W TDP. Just look at how the more current, yet still outdated, 54W i3 4330 fares.

There is a reason why AMD sells as it does. Or rather, doesn't sell.
 
Last edited:

JM Popaleetus

Senior member
Oct 1, 2010
375
47
91
heatware.com
lol, I was waiting for the first 'my parents/grandparents still use a P4' :)
My aunt is still using her Athlon XP machine daily for email and web surfing.

Work provides her with the workstation she needs for her graphic design. I have no clue how she manages to jump between using an eight-core monster and her personal relic.
 

BigDaveX

Senior member
Jun 12, 2014
440
216
116
The problem with the FX8350 is that people are comparing it to the wrong Sandy Bridge SKUs. The 125W-TDP, no-iGPU FX8350 should be compared to the 130W, no-iGPU Core i7 3820.
Both of those are made for the same segment (servers and workstations), they have the same die size, and their platforms are comparable in features.

Except that people don't look at CPUs in such artificially limited terms. When comparing an i7 3820 and an FX-8350, most people's reaction would likely be to choose neither, and go for either a 3770K (which would perform equal to or better than the other two in most scenarios at lower power usage) or a 3930K (which would be a lot more expensive, but completely annihilate the 3820 and FX-8350 in multi-threaded scenarios).

By your logic, no one should have been comparing the original Phenom to the Core 2 Quad, because the former was a native quad-core chip designed for servers and workstations, while the latter was just two laptop chips crudely slapped together on an LGA775 package.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Except that people don't look at CPUs in such artificially limited terms. When comparing an i7 3820 and an FX-8350, most people's reaction would likely be to choose neither, and go for either a 3770K (which would perform equal to or better than the other two in most scenarios at lower power usage) or a 3930K (which would be a lot more expensive, but completely annihilate the 3820 and FX-8350 in multi-threaded scenarios).

By your logic, no one should have been comparing the original Phenom to the Core 2 Quad, because the former was a native quad-core chip designed for servers and workstations, while the latter was just two laptop chips crudely slapped together on an LGA775 package.

AMD doesn't have the luxury of creating a ton of different SKUs like Intel does; the FX8350 was not created to compete against the Intel APUs.
Zen next year will have the same fate: people will compare it to Socket 1151 Skylake instead of Socket 2011-v3.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,067
3,574
126
You really can't look at a CPU purely based on power draw.

You need to know the performance of said CPU to get a performance-to-power ratio.

I agree it makes the AMD CPU look absolutely terrible without doing a P:P ratio, which is completely UNFAIR.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
AMD doesn't have the luxury of creating a ton of different SKUs like Intel does; the FX8350 was not created to compete against the Intel APUs.
Zen next year will have the same fate: people will compare it to Socket 1151 Skylake instead of Socket 2011-v3.

The performance is not there and was never there with the FX chips. They weren't in the same price bracket either. In which metrics was it supposed to compete?
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
The performance is not there and was never there with the FX chips. They weren't in the same price bracket either. In which metrics was it supposed to compete?

In multithreading and multitasking, in server and workstation environments. The FX8350, with its low IPC and slow caches, was not created to compete on the desktop, and especially not in games.
The Bulldozer architecture was created for throughput, not single-threaded performance. It's simple: AMD couldn't create another dedicated desktop SKU, so they just used the server design for the desktop.
If GloFo had continued with 22nm SOI, a 4-5 module Steamroller SKU in 2013-2014 would have been very nice. Unfortunately for us, GloFo and AMD changed their plans and took the FinFET and mobile road.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
You can't cross-compare completely different setups. Everyone knows that. Especially not from such a different time period.

And it was you talking about 4GHz.

Oh yeah, you can't compare CPUs from different time periods, even though that was the basis of the article in the OP.

What a joke -- fanboys contort themselves into such absurd positions.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Yes, like the fastest AMD CPU destroyed the i3 in the above graph? (Being higher on these graphs means higher performance, btw.) So, if they overclock their FX 8350s to 4.7GHz, they will be nearly as fast as an unoverclocked i3. Wow, that really is inspiring, isn't it? Maybe with some LN2 or $400 of custom water, they could actually overtake the i3!

This Zalman CNPS14X cooler can easily cope with an FX8320/8350 at 4.5-4.7GHz, and it often goes on sale for $10-20. This cooler was also on sale for $20 just in the last 2 weeks, and it can do the same.

Did you miss all the other games? Let's go over it again:

i3 4330 = 40 fps average / 33 fps minimum
FX9370 (speeds achievable by an 8320/8350) = 59 fps average (nearly 50% faster) / 40 fps minimum
[Image: GameGPU Ryse: Son of Rome CPU benchmark]


An FX8350 at stock beating a 2600K in Crysis 3.
[Image: Crysis 3 CPU benchmark]


In Project CARS, the FX series is 30% faster than an i3 on a 970.

[Image: Project CARS CPU benchmark (GTX 970)]


Let's expand the testing:

i5-4690K @ 4.8GHz vs. FX-8350 @ 4.7GHz in 9 games (GTX 970)
https://www.youtube.com/watch?v=20QPpIqo-YY

Face the truth: an i3 is garbage compared to the FX8320/FX8350 @ 4.4-4.7GHz for budget gaming with a modern graphics card, yet they cost very similarly.

Let's also ignore how the FX8300 series wipes the floor with an i3 in productivity.

[Image: 7-Zip compression benchmark (overclocked)]

[Image: 7-Zip decompression benchmark (overclocked)]

[Image: x264 pass 1 benchmark (overclocked)]

[Image: x264 pass 2 benchmark (overclocked)]


This is just a stock 8350 trading blows with a 2600K in rendering.
[Image: Cinebench rendering benchmark]


PovRay
[Image: POV-Ray chess scene benchmark]

[Image: POV-Ray benchmark scene]


But yeah, let's ignore FX8000-series overclocking, let's ignore that people do other things with their PCs besides gaming, let's ignore all well-threaded modern AAA games so that the i3 looks good again. No need to roll your eyes and ignore the key trends in gaming and PC usage. This forum is amazing: it will forever defend the garbage i3, just like it will forever defend the idea that the i5 is good enough for games and the i7/X99 platform is a waste of $. I've been there before, when so many people defended dual cores, and they were all wrong. It's only going to get worse for the i3 when games move on to DX12. If a gamer can step up to the i5/i7, that's a good foundation for a solid gaming rig, but an i3 isn't, unless one specifically knows he/she will ONLY play dual-threaded games for the next 5 years.

I remember last year on AT there was this ludicrous craze for the G3258 and how it was the best thing since sliced bread. The more experienced system builders saw it for what it was: a fun $70 toy, but as a CPU, a POS. For someone who wants a well-rounded CPU for various tasks and plays modern games, but cannot quite afford the i5/i7 setups, the FX8000 is an easy choice over the i3.

And BTW, I have sold 2x 4790K and 2x 5820K systems to friends in the last 45 days. I keep building i5/i7 rigs and overclocking them, since the long-term cost of ownership is very low, but people who recommend i3s over the FX8000/9000 series for gaming, or for a balance of gaming + productivity, clearly haven't done their research most of the time.

I stand by the statements I have continued to make since Core 2 Duo E6300/6400 days: a Core i3 is a CPU in no-man's land. It's worthless for a gamer who plays a wide variety of genres/AAA games, worthless for serious productivity, worthless for overclocking, and it costs more long-term to own an i3 than to just get a K-series i5 and keep the system for 5 years. The i3 generally sells because it doesn't have the negative stigma associated with the Pentium or Celeron brand names for most consumers, and it's the cheapest point of entry into the Intel ecosystem. However, as an overall product, it's a worse buy than the FX8300 series.
 
Last edited:

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
The performance is not there and was never there with the FX chips. They weren't in the same price bracket either. In which metrics was it supposed to compete?

To reiterate, it depends on the environment you're measuring in. Under Linux, the FX-8350 will rival an i7 3770K across the board and outrun it in several tasks. The 3770K usually sells for roughly twice the price of an 8350.