[Techspot] Then and Now: A decade of Intel CPUs compared, from Conroe to Haswell


myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Even in these worst case examples, the FX 8350 still provides more performance per dollar than a 4770k or 4790k.

Yeah, but I'm not hurting for money. This is my hobby, and $300 is a pittance, as far as I'm concerned. Spread that out over the 4 or 5 years I'll likely have this system, and it's the equivalent of a night out to dinner, once per year.

Which is obviously the problem: AMD would much rather have been able to get $300+ for these; there's a reason they're cheaper... AMD's design is just too unbalanced compared to Intel's. But in my opinion that doesn't necessarily make it a bad CPU.

Yeah, but you live up by Canada, and it's cold up there! Here is the 10-day forecast for where I live: http://www.weather.com/weather/tenday/l/USTX0327:1:US Yeah, 104° is the lowest. When it's that hot, adding that many extra watts of heat to any room in the entire house is something that no sane person would ever do. :)
 
Aug 11, 2008
10,451
642
126
Yeah, except for *very* rare circumstances, I just don't get this quibbling over 50 or even a hundred or two dollars extra for a CPU in a system that will be used for several years, especially when a portion of the extra cost will be recouped in power savings. Let's face it: compared to a lot of other things people spend money on, the price of a computer system spread out over several years is actually a pretty cheap hobby.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
They release much better than specs on paper datasheets; they provide the Spice models of their products. Just put the model in the simulator, connect a parametrised power supply, set the clock frequencies with a generator, and run all the simulations you want; dynamic behaviour under load is perfectly modelled to within 1%...

I guess that people should do some homework before posting nonsense. Currently, without such models it is impossible to reliably design anything that uses even a few dozen electronic components, let alone a motherboard. This has become so strategic for getting products to market rapidly that even basic models of generic components like VRM MOSFETs are often encrypted and no longer provided to anyone who is not a customer...

So the answer is no. Not a surprise after the 140W discovery.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
It took a decade just to double performance in games. Meh.

In any other industry, the only thing you get (as a consumer) after a decade is a doubling of the MSRP for essentially the same product/performance... the fact that we even get a doubling of performance, at a much lower price, with our CPUs is somewhat amazing and pretty much not replicated in any other market outside of semiconductor devices.

(I would love to get 2x the food, 2x the gas mileage, 2x the square-ft/$, 2x the energy efficiency from my appliances, 2x the education, etc for the same price as 10 years ago)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Gotta love the E8600 vs Q6600. I still remember the arguments back in the day to pay twice as much for the slower Q6600 because "it'll be future-proof when games start taking advantage of multiple cores". Now look, both are as irrelevant as each other.

Even more interestingly, Crysis 3 is a well-threaded modern game, yet the dual-core E8600 still beats the Q6600.

The real difference is that the E8600 was faster for its entire usable life and the money saved could buy an entire Core i5-2500K, with $48 spare to go towards a new motherboard. That was a real upgrade.

Trying to future-proof yourself by buying expensive things for situations that don't exist is stupid and never works. This is a point I repeatedly made back in the day.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
So the answer is no. Not a surprise after the 140W discovery.

You are hell-bent on myths. Show us a single site that supports this number with actual measurements; like your buddy, you won't find a single one.

On the other hand, in the review I linked in the SKL thread we can see that the 4790K's power draw is roughly 105W in the Prime 95 test and also in Blender rendering under Linux:

http://www.computerbase.de/2015-08/intel-core-i5-6600k-i7-6700k-test-benchmark-skylake/8/

http://www.computerbase.de/2015-08/...-skylake/8/#abschnitt_leistungsaufnahme_linux

So keep on spreading urban legends as a smokescreen for Intel lying about their TDP specs; the numbers are here to show that it's a fact.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Idontcare, MSI, ASRock... all wrong... we know ;)

Even AMD won't tell you what temperature or voltage gives the 125W TDP. And for good reason, since they go over 125W under regular conditions.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
In any other industry, the only thing you get (as a consumer) after a decade is a doubling of the MSRP for essentially the same product/performance... the fact that we even get a doubling of performance, at a much lower price, with our CPUs is somewhat amazing and pretty much not replicated in any other market outside of semiconductor devices.

(I would love to get 2x the food, 2x the gas mileage, 2x the square-ft/$, 2x the energy efficiency from my appliances, 2x the education, etc for the same price as 10 years ago)
Well, if you compare the 10 years before that to the last 10, you will understand why people have different expectations for the computer industry.
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
If a CPU sucks +100w more power again and again and again and the effect of that includes mild additional whole system consumption via PSU AC/DC power conversion losses / VRM & motherboard loads, etc, on top, well that's just tough. That's simply the price you pay for running a thirstier 32nm chip & platform, and then overclock it even more. Your electricity company bills you for what you draw at the wall. They / Asus / MSI, etc, don't give you an energy rebate for "pretending" your PSU or VRM's don't exist. Nor do people calculate non-PC appliance running costs by excluding the power supply out of sub-component brand fanboyism. And for corporations, the "T" in "TCO" means "Total".

Let's cut out the fake "objectivity" pretense. The people pushing these lame "PCO" (Partial Cost of Ownership) "measurements" (but only for AMD CPUs) are mainly those who try to artificially skew AMD's high figures downwards whilst still quoting Intel's "84w TDP" ratings, even though the same hardware.fr "calculations" result in "46w" i5-3570s or "54w" 2500Ks, etc., whose figures are conveniently never used in the same context. Usually the same people who compare "240v at the wall" Intel loads vs "12v Fritz Chess" AMD loads, who talk about undervolting AMDs alongside "you can knock 23% off with regular non-Prime loads" one minute, then promote "105w" LinX over-volted AVX "power virus" Intel loads the next, etc. The same people who quote hardware.fr charts like this one to death, page after page, day after day, yet "accidentally forget" to quote other charts like this or this from the same page even once. Nor does the "12v" reading show the FXs consistently using more at the platform level (even at idle), another 20-30w that 'needs' to be "shaped" out of the equation (even though the motherboard platform is a direct result of which CPU you choose, hardly "unrelated")...

This so-called "proper" way of measuring power is so inconsistently applied, and the 'correct presentation' bias so obvious, that it's laughable, especially given that "at the wall" figures were deemed accurate back when the "space heater" roles were reversed and AMD's 100w-lower power + better single-thread performance was declared "a very potent combo", along with happy discussion of AMD's $531 launch prices... Obviously no one cared about perf/watt, and AMD never priced anything above $200-$340 i5s-i7s when their market share was larger... :sneaky:

This is my hobby, and $300 is a pittance, as far as I'm concerned. Spread that out over the 4 or 5 years I'll likely have this system, and it's the equivalent of a night out to dinner, once per year.

Yea, except for *very* rare circumstances, I just dont get this quibbling over 50 or even a hundred or two dollars extra for a cpu for a system that will be used for several years, especially when a portion of the extra cost will be recouped in power savings.
^ These +1000. ANY other component - BenQ vs LG monitor, Asus vs MSI motherboard, Kingston vs GSkill memory, Seasonic vs Corsair PSU, a $100 vs a $30 case, Epson vs Canon vs HP printer, Coolermaster vs Thermaltake cooler, Sony vs Panasonic TV, LG vs Toshiba Blu-Ray player, etc = zero emotional hysteria, even with $150 "same component" price differences (like a 1TB Samsung Pro 850 vs a 1TB Crucial BX100). But with AMD vs Intel CPUs it's suddenly a religious cult shaking with fervor down to the last $10, as if your food money and the imminent starvation of your children depended on it... :D
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Well, according to some of you here, it seems Intel is lying about the TDP of the following:

Core i7 4790K 88W TDP ------> 110W delta
Core i7 4930X 130W TDP ------> 152W delta
Core i7 3930X 130W TDP -------> 157W delta
Core i7 3960X 130W TDP -------> 162W delta
Core i7 4960X 130W TDP -------> 176W delta

[attached chart: idle-to-load power deltas, including the FX-8350]


:p
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Gotta love the E8600 vs Q6600. I still remember the arguments back in the day to pay twice as much for the slower Q6600 because "it'll be future-proof when games start taking advantage of multiple cores". Now look, both are as irrelevant as each other.

Even more interestingly, Crysis 3 is a well-threaded modern game, yet the dual-core E8600 still beats the Q6600.

The real difference is that the E8600 was faster for its entire usable life and the money saved could buy an entire Core i5-2500K, with $48 spare to go towards a new motherboard. That was a real upgrade.

Trying to future-proof yourself by buying expensive things for situations that don't exist is stupid and never works. This is a point I repeatedly made back in the day.

The E8600 was a workhorse. My wife's son was using my old E8600 to play games like World of Tanks. He has since moved on to my old i5-2500K, which is IMO another workhorse of a CPU.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Well, according to some of you here, it seems Intel is lying about the TDP of the following:

Core i7 4790K 88W TDP ------> 110W delta
Core i7 4930X 130W TDP ------> 152W delta
Core i7 3930X 130W TDP -------> 157W delta
Core i7 3960X 130W TDP -------> 162W delta
Core i7 4960X 130W TDP -------> 176W delta

[attached chart: idle-to-load power deltas, including the FX-8350]


:p

You have to multiply the deltas by 0.81 and add the idle power consumption of the CPU, which can be found at Hardware.fr, and you'll get the picture to within a few %.

Indeed, I already pointed out that the delta × 0.81 alone can reach 105W with a 4790K, which should be about 110W at the CPU level if we include its idle power consumption.

That's 25% more than the official spec, and not only in Prime 95 but also under Linux with Blender as the load.


http://www.computerbase.de/2015-08/intel-core-i5-6600k-i7-6700k-test-benchmark-skylake/8/

http://www.computerbase.de/2015-08/...-skylake/8/#abschnitt_leistungsaufnahme_linux
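To make that arithmetic easy to follow, here is a minimal sketch of the estimate in Python, using placeholder wall readings rather than real measurements; the 0.81 factor is the assumed PSU/VRM conversion efficiency from the description above, and the CPU idle figure stands in for the hardware.fr-style idle number:

# Rough sketch of the "delta x 0.81 + idle" estimate described above.
# All figures are placeholders, not measurements.
PSU_VRM_FACTOR = 0.81  # assumed wall-to-CPU conversion factor

def estimate_cpu_power(wall_idle_w, wall_load_w, cpu_idle_w):
    """Estimate CPU package power (W) from whole-system wall readings (W)."""
    delta = wall_load_w - wall_idle_w           # extra draw the load adds at the wall
    return delta * PSU_VRM_FACTOR + cpu_idle_w  # strip PSU/VRM losses, add back CPU idle

# Example with made-up numbers in the ballpark discussed in this thread:
print(estimate_cpu_power(wall_idle_w=40, wall_load_w=170, cpu_idle_w=5))  # ~110 W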


What's the take of our usual TDP experts?

To answer actual measurements with urban legends and blank statements?
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
undervolting AMD's alongside "you can knock 23% off with regular non-Prime loads" one minute then promote "105w" LinX over-volted AVX "power virus" Intel loads the next,

Intel knows a lot about undervolting; all their fastest CPUs are factory-undervolted, with voltage margins as low as 10.5%. FTR, 10% is the minimum for a consumer-grade product.

So Intel is selling CPUs that are on the verge of instability, and likely below accepted consumer specs, since at higher temps the 10% margin collapses.

No wonder X99 CPUs are all "overvolted" by motherboard manufacturers that don't want to take the blame for Intel's own lack of engineering rigor.



Edit: 110W under CPU load for the "95W" Skylake; the TDP is under-specced from day one of launch:

http://www.pcgameshardware.de/Core-...re-i7-6700K-i5-6600K-1166741/galerie/2413684/



[PCGH screenshot: HWMonitor power readings for the Core i7-6700K]
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
Well according to some of you here, it seems Intel is lying about the TDP
But... but... but... hardware.fr shows Intel's i5s using only 46w (and everyone else on the net must be wrong)! LOL. All you've done is further highlight my point about the instinct of some fanboys to benchmark AMDs according to hardware.fr's criteria, then cherry-pick an "apples & oranges" comparison to the worst Intel results you can find, based on the artificial over-volting effect of AVX power viruses specifically on Haswell's FIVR, which typically results in +40w of inflation over normal 100% CPU x264 loads. :rolleyes:

Examples:-
i7-4770K = 43w Idle / 101w Load. Delta = 58w calculating Prime
i7-4770K = 41w Idle / 102w Load. Delta = 61w x264 encoding
i7-4770K = 33.6w Idle / 94.6w Load. Delta = 61w running Cinebench
i7-4770K = 59w Idle / 125w x264 Load / 164w AVX LinX. Delta = 66w (x264 encoding) / 105w (over-volted "power virus")

i7-4790K = 70w Idle / 156w Load. Delta = 86w running Cinebench
i7-4790K = 36w Idle / 128w x264 Load / 182w AVX LinX. Delta = 96w (x264 encoding) / 146w (over-volted "power virus")

So you have hardware.fr "measuring" regular 77-95w TDP Intel quad-cores at only 45-55w consumed in their own special funny way, and then you trying to hold up "110w" from 84-88w TDP chips under abnormal over-voltage conditions as "typical usage". Meanwhile the real truth, as usual, is somewhere in between the two extremes, as demonstrated above, where every normal load case is within 10% of the TDP rating (and for the 4770, often 20% lower). You expend a huge amount of energy trying to show how "evil" Intel is for +21w above rated TDP under abnormal over-voltage conditions, but there's the "125w" FX-8350 clearly pulling a 191w delta (255w load - 64w idle) = a whopping +66w over rating under the same LinX "power virus" conditions. And still pulling a 147w delta (211w - 64w) even under real-world x264. You want to exclude all the motherboard & PSU losses for AMD to "bring down" that 147-191w AMD delta to nearer 125w, but keep them in for Intel's "110w", even though a 96-110w LinX over-volted delta minus 15-20% or so would end up mere single-digit watts away from the official 84-88w rating (and unlike the FX, the Intel chips also have iGPUs to power, which can inflate the figure in rigs without a dGPU). See what I mean about these lame double standards? If you're deliberately over-volting only the Intels via AVX power viruses (which wouldn't occur in real-world usage) whilst leaving the FX-8350 at stock during power consumption tests, then you're doing it wrong.
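As a quick sanity check of that delta arithmetic, here is a small sketch using only the wall figures already quoted in this post (nothing below is a new measurement):

# Idle-to-load deltas vs rated TDP, using the wall readings quoted above (watts).
readings = {
    # name: (idle_w, load_w, rated_tdp_w)
    "i7-4770K, x264 encode":            (59, 125,  84),
    "i7-4770K, LinX AVX 'power virus'": (59, 164,  84),
    "FX-8350, x264 encode":             (64, 211, 125),
    "FX-8350, LinX AVX 'power virus'":  (64, 255, 125),
}

for name, (idle_w, load_w, tdp_w) in readings.items():
    delta_w = load_w - idle_w   # extra draw the workload adds over idle
    over_w = delta_w - tdp_w    # positive = delta exceeds the rated TDP
    print(f"{name:35s} delta = {delta_w:3d}w  ({over_w:+d}w vs {tdp_w}w TDP)")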

Meanwhile a more revealing & useful overall "Task Energy Consumed" figure (not "peak" power, but the energy used over the whole task) ends up barely half that of an FX-8350:-
http://techreport.com/r.x/core-i7-4770k/power-task-energy.png
http://media.bestofmicro.com/Z/I/387486/original/wh.png

Even ignoring AMD and looking purely at the 2600K (9.4KJ / 212.6Wh used) vs the 4770K (8.0KJ / 176.8Wh used) in the above charts shows 32nm is less energy efficient than 22nm, so this comical act of pretending AMD's 32nm is "highly competitive but merely being misread by the enemy" is hilarious, and openly debunked even by hardware.fr themselves. Seriously, you are fooling no one with this "creative accounting power measurement" apples vs oranges stuff anymore. What AMD needs is a new architecture on at least a 22nm process, not 'goalpost relocation' from salesmen who can only make 32nm FX power consumption look relatively "good" by skewing 14-22nm figures for Intel via one-sided over-volting. ;)
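Since the two linked charts use different units (kJ on one site, Wh on the other) and presumably different test workloads, only the within-chart ratios are directly comparable; a tiny sketch with the numbers quoted above:

# Task energy figures quoted above; 1 Wh = 3.6 kJ, so the unit mismatch just
# reflects two different benchmarks, not an error.
task_energy = {
    "chart 1 (kJ)": {"i7-2600K": 9.4, "i7-4770K": 8.0},
    "chart 2 (Wh)": {"i7-2600K": 212.6, "i7-4770K": 176.8},
}

for chart, figs in task_energy.items():
    ratio = figs["i7-4770K"] / figs["i7-2600K"]
    print(f"{chart}: the 4770K uses {ratio:.0%} of the 2600K's energy for the same task")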
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
Intel s knows a lot about undervolting, all their fastest CPUs are factory undervolted with voltages margins as low as 10.5%, FTR 10% is the minimum for a consumer grade product.
Ordinary VID binning is not "undervolting". As you said yourself, 10% is normal. Most people don't overclock, so more than a basic +10% "buffer" to maintain stability in the event of someone having the world's worst PSU, whose 12v output swings from 10.5-13.5v, is just pure waste. If you think, say, 30% is "normal", then try feeding a constant 312v (+30% extra voltage margin over a 240v AC mains supply) into most of your household appliances. You might want to open your windows to get rid of that funny burning smell coming from half your appliances... :biggrin:
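To put rough numbers on that margin argument (the 1.20v "minimum stable" voltage below is purely hypothetical; only the percentages come from this thread):

# Toy illustration: the factory voltage sits some margin above the minimum the
# silicon needs to stay stable, plus the mains-voltage analogy from the post.
def with_margin(nominal, margin):
    return nominal * (1.0 + margin)

v_min_stable = 1.20  # hypothetical minimum stable voltage for some chip
for margin in (0.10, 0.105, 0.30):
    print(f"{margin:.1%} margin -> factory voltage ~ {with_margin(v_min_stable, margin):.3f} v")

print(f"240v mains + 30% = {with_margin(240, 0.30):.0f} v")  # the appliance analogy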
 
Mar 10, 2006
11,715
2,012
126
Intel knows a lot about undervolting; all their fastest CPUs are factory-undervolted, with voltage margins as low as 10.5%. FTR, 10% is the minimum for a consumer-grade product.

So Intel is selling CPUs that are on the verge of instability, and likely below accepted consumer specs, since at higher temps the 10% margin collapses.

No wonder X99 CPUs are all "overvolted" by motherboard manufacturers that don't want to take the blame for Intel's own lack of engineering rigor.

"Verge of instability"? Come on, man.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
..............

It would save you all that time if you would just take a closer look at the graph above and see that there is an FX-8350 in it as well.

So if AMD lied about their FX-8350's 125W TDP because people measure the idle-to-full-load power delta, then Intel lied as well, and the above graph is all the evidence we really need (according to some users here). Simple as that ;)
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Ordinary VID binning is not "undervolting". As you said yourself, 10% is normal.

Most people don't overclock, so more than a basic +10% "buffer" to maintain stability in the event of someone having the world's worst PSU, whose 12v output swings from 10.5-13.5v, is just pure waste. If you think, say, 30% is "normal", then try feeding a constant 312v (+30% extra voltage margin over a 240v AC mains supply) into most of your household appliances. You might want to open your windows to get rid of that funny burning smell coming from half your appliances... :biggrin:


Paradoxically, 10% would be normal for reasonably clocked CPUs, since they put low pressure on the VRM's regulation loop and hence create lower voltage variations.

Yet, because they work in the favourable part of the curves, they have the higher voltage margins while needing them less, contrary to high-performance CPUs, which push current and power requirements to levels that are much more difficult to deal with.

So your argument that people do not overclock is moot; this has nothing to do with overclocking but with the fact that the most expensive CPUs are the most lightly specced. Their underrated TDP would be trivial in comparison if this number weren't the cause of those metrics being downgraded to poor levels.

This requires using expensive boards whose good regulation characteristics, and the added margin they could have provided, are ruined by Intel's sloppy specs. We are talking about expensive setups, after all, that are branded high end by Intel; experimental high end would be more accurate...
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
It would save you all that time if you would just take a closer look at the graph above and see that there is an FX-8350 in it as well.
I posted several graphs so people can see what the real issue is with so-called triple-digit Intel power consumption. Haswell has been out two years now and the AVX 'power virus' voltage-alteration effects are well known (and Haswell-specific), which is why, if you want to know the genuine workload of, say, video encoding on an i7-4770, you don't use LinX, etc. Conversely, those with an agenda of trying to make the chips look as bad as possible will jump on one single chart that tells them what they want to hear, whilst selecting a very different set of charts for FX chips that give them a consistently skewed perspective... ;)
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
So your argument that people do not overclock is moot,
Of course it isn't. Increasing voltage increases wattage at any frequency. It makes sense to optimize for the 95% of non-OCers vs the 5% who will increase vCore manually anyway, regardless of what the VID table says. If AMD is pumping 1.5v into a chip that will work fine for non-overclockers at, say, 1.25v and still have a 10% margin, and overclockers will still be able to adjust voltage manually, then that's just being wasteful (and certainly explains why they hardly get any mobile contracts...)

This requires using expensive boards whose good regulation characteristics, and the added margin they could have provided, are ruined by Intel's sloppy specs. We are talking about expensive setups, after all, that are branded high end by Intel; experimental high end would be more accurate...
I know you desperately try, day after day, to portray some "the end is nigh" scenario where millions of Intel chips will imminently fail and it's all being kept hushed up, but that simply isn't reality. People's chips work just fine "out of the box". You're simply being paranoid over your "theoretical disasters" again, and over your arbitrary "I want 30% thresholds on all CPUs purely because I once owned an AMD CPU that OC'd well at stock voltage" personal obsession.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
"Verge of instability"? Come on, man.

For the X99-dedicated CPUs that's the case; at stock voltage they are not stable enough if the cooling is not high end as well. Browse here and there in the relevant threads; the consensus is that WC is the cure.

Perhaps you'll understand better why Intel is eagerly promoting WCL, which is, after all, a means to get away with annoyingly razor-slim voltage margins.

It's not that decent air coolers couldn't dissipate 150-200W; the thing is that they did so at higher SKU temps...
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Conversely, those with an agenda of trying to make the chips look as bad as possible will jump on one single chart that tells them what they want to hear, whilst selecting a very different set of charts for FX chips that give them a consistently skewed perspective... ;)

You said agenda?

How is Prime 95 relevant for testing the FX-8350's TDP yet irrelevant once we talk about Intel?

Or are you assuming that the FX will draw 125W whatever the MT load?

Fact is that under Prime 95 the FX is within its rated TDP, while Intel's is exceeded by as much as 25% for the 4790K. The debate is not about power in real software but about Intel publishing underrated TDP specs.

That regular software won't go over the official limit is irrelevant; the CPUs can exceed their TDP when used for FP applications, for instance, in which case they'll crash if the cooling apparatus is sized for the official TDP.
 

Dresdenboy

Golden Member
Jul 28, 2003
1,730
554
136
citavia.blog.de
Are the numbers in the idle-to-AVX power graph peaks or averages?

And there is another spoiler: if one CPU has a high idle power level, it creates a smaller delta. This could become even worse if, while OC'ing the CPU, one also adjusts the lower-power P-states' voltages to undervolted values.

Some thoughts:
Add to that 1xx W TDP the 300W of a GPU + memory. How does that affect the yearly cost when running the PC at home for 2h per day at max load on average and, say, 4h at surfing/office power levels?
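A back-of-the-envelope sketch of that cost question; the 60W light-use draw and the $0.15/kWh electricity price are assumptions picked purely for illustration:

# Rough yearly running-cost estimate for the usage pattern described above.
PRICE_PER_KWH = 0.15     # assumed electricity price, $/kWh

def yearly_cost(load_w, hours_per_day):
    kwh_per_year = load_w / 1000 * hours_per_day * 365
    return kwh_per_year * PRICE_PER_KWH

full_load_w = 150 + 300  # "1xx W" CPU plus 300W of GPU + memory
light_use_w = 60         # assumed draw for surfing/office use

total = yearly_cost(full_load_w, 2) + yearly_cost(light_use_w, 4)
print(f"~${total:.0f} per year")  # about $62/year with these assumptions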
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I posted several graphs so people can see what the real issue is with so-called triple-digit Intel power consumption. Haswell has been out two years now and the AVX 'power virus' voltage-alteration effects are well known (and Haswell-specific), which is why, if you want to know the genuine workload of, say, video encoding on an i7-4770, you don't use LinX, etc. Conversely, those with an agenda of trying to make the chips look as bad as possible will jump on one single chart that tells them what they want to hear, whilst selecting a very different set of charts for FX chips that give them a consistently skewed perspective... ;)

If you remember, the whole thing about the 125W TDP started with Idontcare and his FX-8350 run on LinX. Then certain people here took it and started the "AMD lied about its TDP" FUD.

As for Intel CPU power consumption, this is the closest to real you can actually get: the actual watt-hours each system used to finish the benchmark.
And if you follow my posts, I'm not the one to confuse TDP with power consumption. ;)

[attached chart: watt-hours used by each system to finish the benchmark]