[Techspot] Then and Now: A decade of Intel CPUs compared, from Conroe to Haswell


AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Are the numbers in the idle-to-AVX power graph peak values or averages?

And there is another catch: if one CPU has a high idle power level, it produces a smaller delta. This gets even worse if, while overclocking the CPU, you also undervolt the lower-power P-states.

Some thoughts:
Add to that 1xx W TDP the ~300W of a GPU + memory. How does that affect the yearly costs when the PC runs at home for an average of 2h per day at max load and, say, 4h at surfing/office power levels?

I believe those are peak power measured at the wall. For actual energy consumption of the system see my x264 graph above.
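As a rough back-of-the-envelope sketch of the yearly-cost question above (the electricity price, the light-load wall draw and the exact CPU figure here are assumptions for illustration, not measurements from the thread):

```python
# Rough yearly running-cost sketch for the scenario described above.
# All inputs are assumptions for illustration, not measured figures.

price_per_kwh = 0.25        # assumed electricity price in EUR/kWh
full_load_w   = 100 + 300   # ~1xx W CPU plus ~300 W for GPU + memory
light_load_w  = 80          # assumed wall draw for surfing/office use
hours_full    = 2           # hours per day at max load
hours_light   = 4           # hours per day at light load

kwh_per_year = ((full_load_w * hours_full + light_load_w * hours_light) / 1000) * 365
cost_per_year = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.0f} kWh/year, roughly {cost_per_year:.0f} EUR/year")
# -> ~409 kWh/year, roughly 102 EUR/year with these assumptions
```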
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
How is Prime95 relevant for testing the FX-8350's TDP, yet irrelevant once we talk about Intel..?
Same way that demanding people use AVX Prime for 240 V at-the-wall testing of Intel Haswells (knowing it will deliberately over-volt only a Haswell and give readings up to 50w higher than normal), then suddenly switching to Fritz Chess 12 V-rail load testing when it's hardware.fr measuring AMD chips, whilst ignoring that the same hardware.fr methodology produces impressively low readings like a "61w" i5-4670K, "55w" i5-2500K, "46w" i5-3570K, "35w" i3-4340, etc., which are then promptly forgotten in favour of hunting for "less palatable" '110w' Haswells... And you can "play the same game" there - take a "110w Haswell", knock off 10% for PSU AC/DC losses, and that's suddenly down to 100w. How much do an Intel motherboard, memory chips, HDDs, fans, USB peripherals, an idling dGPU, etc., draw under load? If it's 12w or more, then the Intel is also suddenly "within rating". You get the picture...
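(For what it's worth, a minimal sketch of that subtraction, using the figures from the paragraph above; the efficiency and platform-draw values are the post's assumptions, not measurements.)

```python
# Back-of-envelope decomposition of a "110w" wall reading, per the post above.
wall_delta_w    = 110    # load-minus-idle delta measured at the wall (AC side)
psu_efficiency  = 0.90   # assume ~10% AC/DC conversion loss
platform_load_w = 12     # assumed extra draw from board, RAM, fans, drives, etc.

dc_delta_w = wall_delta_w * psu_efficiency      # ~99 W on the DC side
cpu_only_w = dc_delta_w - platform_load_w       # ~87 W attributable to the CPU itself

print(f"DC-side delta: {dc_delta_w:.0f} W, CPU-only estimate: {cpu_only_w:.0f} W")
```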

Of course, the easiest way is to test with realistic apps that load all cores 100% without over-volting either chip (x264, Cinebench, etc.) - and that's what I posted earlier from several different sites. But when you do that, those artificially inflated "110w" Intel numbers drop like a rock to nearer 60-85w (e.g. i5-4670K = 52w, i7-4770K = 74w, i7-4790K = 86w, etc.), and some daily Intel bashers suddenly have less to moan about when things are actually compared "like for like" in a genuinely fair manner - one that doesn't involve over-volting one CPU more than another "accidentally on purpose", then cynically declaring it "cheating" for being +25% higher than normal as a direct consequence of an over-volt you could have avoided but chose not to... ;)

That regular software won't get over the official limit is irrelevant
It's 100% relevant to most normal people, who only run regular software and are not remotely interested in "playing" the figures via one-sided power-virus over-volting to try and prove a point.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Same way that demanding people use AVX Prime for 240 V at-the-wall testing of Intel Haswells (knowing it will deliberately over-volt only


It's 100% relevant to most normal people, who only run regular software and are not remotely interested in "playing" the figures via one-sided power-virus over-volting to try and prove a point.

Power viruses do not over-volt Intel CPUs; it's the Intel CPUs that auto over-volt when they detect a demanding load that increases voltage variations. I like how you are literally inverting reality - not that it's unusual for you...

I won't even bother commenting on what is a textbook case of bad faith; more balanced people will notice that this over-volting is necessary because the CPU takes advantage of lighter loads to decrease its voltage margin, since voltage variations are lower in that case.

Well, keep on spreading the good word and the physics as explained by Intel's marketers - surely very entertaining for the converted masses, but certainly a facepalm generator for anyone with the slightest ounce of logic...
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
I'm not being funny, Abwx, but I'm honestly struggling to understand what you're saying sometimes (language barrier, I mean).
 

Nothingness

Diamond Member
Jul 3, 2013
3,301
2,374
136
Prime95 is not a power virus; it's a real application. The Intel Linpack library (also real code, not a power virus) reaches similar power consumption to P95.
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
Wow. I promised myself not to get involved with this stupid debate but...

TDP (Thermal Design Power) is NOT a direct measure of power consumption.

Some people don't seem to get that (not naming names here). TDP numbers from different manufacturers are not directly comparable, because each company measures TDP differently. Some companies may measure TDP using a different workload than others, and some companies' stated TDP isn't even defined in the same manner as other companies'.

It's unfortunate that there isn't a standard method for testing and determining TDP, but that's why I only pay attention to TDP when CPUs/GPUs from the same company, and even the same generation, are being compared.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
For the CPUs dedicated to X99 that's the case: at stock voltage they are not stable enough if the cooling is not high-end as well. Browse here and there in the relevant threads; the consensus is that WC is the cure.

Perhaps you'll understand better why Intel is eagerly promoting water cooling, which is after all a means of getting away with annoyingly razor-thin voltage margins.

It's not that decent air coolers couldn't dissipate 150-200W; the thing is that they did so at higher chip temps...

I see no facts backed up with numbers here. Just fake conjecture.

I guess those AMD 9570 CPUs that shipped with WC in the box must have been unstable. Using your logic...
 

Ranulf

Platinum Member
Jul 18, 2001
2,864
2,514
136
Yeah, but I'm not hurting for money. This is my hobby, and $300 is a pittance, as far as I'm concerned. Spread that out over the 4 or 5 years I'll likely have this system, and it's the equivalent of a night out to dinner, once per year.



Yeah, but you live up by Canada, and it's cold up there! Here is the 10 day forecast for where I live: http://www.weather.com/weather/tenday/l/USTX0327:1:US Yeah, 104° is the lowest. When it's that hot, adding that many extra watts of heat to any room in the entire house is something that no sane person would ever do.:)

There are many factors that go into power usage and price/perf beyond just CPU draw. At 13 cents/kWh it is not that big of a deal to me, and I'd never make back the price difference over the service life of the chip. Please tell me you have turned off all superfluous electronics and use LED or CFL lights if you're that concerned about 20-40W of CPU power. My Wii and cable box together add 22W in standby mode.

While I'm in Oregon, not Texas, we have had several 100-105°F days here this summer. I'm pretty sure my AC would be running just as much if I were using my 2500K instead of my 8350. Gaming would probably put them on an even keel because of the video cards, or if I overclocked the 2500K. I bet I'd get no more than 1-2 degrees of leeway either way in room temp. I figure using LED lights did more to keep things cool and use less power in my home office than worrying about CPU power draw. That light swap alone covers one of my servers/NAS at load, with about 45W left over.

At near idle right now on the 8350 (which I'm typing this on), power is varying between 21 and 49W for the CPU according to HWMonitor. My UPS says the computer is using 65W (not including the monitor) with one tab of Firefox open.
 

dmens

Platinum Member
Mar 18, 2005
2,275
965
136
They release much better than on-paper spec sheets: they provide the SPICE models of their products. Just put the model in the simulator, connect a parameterised power supply, set the clock frequencies with a generator, and run all the simulations you want; dynamic behaviour under load is modelled to within 1%...

Link to "spice model"? What does it "simulate"? ASM code? That would be pretty sweet.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Link to "spice model"? What does it "simulate"? ASM code? That would be pretty sweet.

Those models are provided to motherboard manufacturers and are not available to customers.

As for what can be simulated, it's the CPU's dynamic behaviour: consumption under load, current peaks when the frequency is suddenly increased, and so on.

https://en.wikipedia.org/wiki/SPICE

An example of a SPICE model for an FDSOI transistor:

www-leti.cea.fr/en/content/download/2003/25809/file/Model_Description_UTSOI_v113_light.pdf

Page 23 and the following pages show the parameters on the left; the mathematical formulae are the ones the simulator applies to compute the behaviour.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
There are many factors that go into power usage and price/perf beyond just CPU draw. At 13 cents/kWh it is not that big of a deal to me, and I'd never make back the price difference over the service life of the chip. Please tell me you have turned off all superfluous electronics and use LED or CFL lights if you're that concerned about 20-40W of CPU power. My Wii and cable box together add 22W in standby mode.

While I'm in Oregon, not Texas, we have had several 100-105°F days here this summer. I'm pretty sure my AC would be running just as much if I were using my 2500K instead of my 8350. Gaming would probably put them on an even keel because of the video cards, or if I overclocked the 2500K. I bet I'd get no more than 1-2 degrees of leeway either way in room temp. I figure using LED lights did more to keep things cool and use less power in my home office than worrying about CPU power draw. That light swap alone covers one of my servers/NAS at load, with about 45W left over.

At near idle right now on the 8350 (which I'm typing this on), power is varying between 21 and 49W for the CPU according to HWMonitor. My UPS says the computer is using 65W (not including the monitor) with one tab of Firefox open.

myocardia said:
Yeah, but I'm not hurting for money. This is my hobby, and $300 is a pittance, as far as I'm concerned. Spread that out over the 4 or 5 years I'll likely have this system, and it's the equivalent of a night out to dinner, once per year.

Do you not read English? The post that you quoted said ^^^ (that), and yet you drivel on for a few paragraphs about a few dollars per year worth of electricity? Seriously?
 

dmens

Platinum Member
Mar 18, 2005
2,275
965
136
Those models are provided to motherboard manufacturers and are not available to customers.

As for what can be simulated, it's the CPU's dynamic behaviour: consumption under load, current peaks when the frequency is suddenly increased, and so on.

https://en.wikipedia.org/wiki/SPICE

An example of a SPICE model for an FDSOI transistor:

www-leti.cea.fr/en/content/download/2003/25809/file/Model_Description_UTSOI_v113_light.pdf

Page 23 and the following pages show the parameters on the left; the mathematical formulae are the ones the simulator applies to compute the behaviour.

Seems like there is a bit of a leap from a single transistor to an entire CPU.
 
Aug 11, 2008
10,451
642
126
There are many factors that go into power usage and price/perf beyond just CPU draw. At 13 cents/kWh it is not that big of a deal to me, and I'd never make back the price difference over the service life of the chip. Please tell me you have turned off all superfluous electronics and use LED or CFL lights if you're that concerned about 20-40W of CPU power. My Wii and cable box together add 22W in standby mode.

While I'm in Oregon, not Texas, we have had several 100-105°F days here this summer. I'm pretty sure my AC would be running just as much if I were using my 2500K instead of my 8350. Gaming would probably put them on an even keel because of the video cards, or if I overclocked the 2500K. I bet I'd get no more than 1-2 degrees of leeway either way in room temp. I figure using LED lights did more to keep things cool and use less power in my home office than worrying about CPU power draw. That light swap alone covers one of my servers/NAS at load, with about 45W left over.

At near idle right now on the 8350 (which I'm typing this on), power is varying between 21 and 49W for the CPU according to HWMonitor. My UPS says the computer is using 65W (not including the monitor) with one tab of Firefox open.

We go through this red herring every time power use comes up. What really matters is the cost of the power saved compared to the price difference of the CPUs. If you assume 15 cents per kWh with all the taxes and fees, a 100-watt power delta for 5 hours per day is 7.5 cents per day, or about $27 per year. Within 2 or 3 years one can easily save the difference in cost, which is what is really relevant to the purchase decision. What kind of light bulbs you use is irrelevant to the comparison of CPUs.
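A quick sketch of that arithmetic (figures taken straight from the post above):

```python
# Payback arithmetic from the post above.
price_per_kwh = 0.15   # USD per kWh, including taxes and fees
delta_w       = 100    # assumed power difference between the two CPUs, in watts
hours_per_day = 5

cost_per_day  = delta_w / 1000 * hours_per_day * price_per_kwh   # $0.075/day
cost_per_year = cost_per_day * 365                               # ~$27/year

print(f"${cost_per_day:.3f}/day, about ${cost_per_year:.0f}/year")
```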
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
a 100-watt power delta for 5 hours per day is 7.5 cents per day, or about $27 per year. Within 2 or 3 years one can easily save the difference in cost, which is what is really relevant to the purchase decision. What kind of light bulbs you use is irrelevant to the comparison of CPUs.

What is irrelevant is using... irrelevant numbers specially tailored to fit the argument.

That said, taking your corner case, let's assume your roughly $2/month is relevant and let's assume the power-efficient CPU costs $100 more.

By your logic the cost is recouped in about 4 years, but what if I live in Chicago, where the heating runs 8 months a year?

I will save $10 over the 4 summer months, and the rest of the time I save nothing, as I must compensate for those 100W by turning up the heating.

The cost will then be recouped only after about 10 years - and that's with a 100W delta whose origin you would be hard pressed to explain; with more realistic numbers you'll recoup nothing at all.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
What is irrelevant is using... irrelevant numbers specially tailored to fit the argument.

That said, taking your corner case, let's assume your roughly $2/month is relevant and let's assume the power-efficient CPU costs $100 more.

By your logic the cost is recouped in about 4 years, but what if I live in Chicago, where the heating runs 8 months a year?

I will save $10 over the 4 summer months, and the rest of the time I save nothing, as I must compensate for those 100W by turning up the heating.

The cost will then be recouped only after about 10 years - and that's with a 100W delta whose origin you would be hard pressed to explain; with more realistic numbers you'll recoup nothing at all.

Heaters don't run 8 months a year in Chicago.

And heating (using natural gas) is done very efficiently and cheaply compared to air conditioning.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Heaters don't run 8 months a year in Chicago.

And heating (using natural gas) is done very efficiently and cheaply compared to air conditioning.

Natural-gas heating costs about the same; for years providers have computed kWh per BTU and simply adjusted prices to sit slightly below electricity, but without its safety...

As for Chicago, I guess your remark holds for someone who is heating his apartment at his neighbours' expense...

After all, with just 200 m² of shared walls, floors, and ceiling, 30 cm of concrete thickness, and a 5°C difference with the other apartments, that's still enough to hijack 6 kW of power from unsuspecting neighbours...

Well, I digress somewhat; let's focus on those dozen watts that seem to be enough to save the whole world...
 

dark zero

Platinum Member
Jun 2, 2015
2,655
140
106
The hell?
Now we are talking about air conditioning?

We are supposed to talk about Intel...

Remember when the Core 2 Duo arrived and the lowest tier became the Pentium 4?
And then when Sandy Bridge came, the 45 nm Core 2 Duos were the minimum?
Now with Skylake, the Nehalem Core i3s are the lowest chips you can bear. The real lowest tier is Sandy Bridge.
 

crashtech

Lifer
Jan 4, 2013
10,695
2,293
146
Instead of selling ice to Eskimos, selling them AMD CPUs should be a piece of cake.
 
Aug 11, 2008
10,451
642
126
Natural-gas heating costs about the same; for years providers have computed kWh per BTU and simply adjusted prices to sit slightly below electricity, but without its safety...

As for Chicago, I guess your remark holds for someone who is heating his apartment at his neighbours' expense...

After all, with just 200 m² of shared walls, floors, and ceiling, 30 cm of concrete thickness, and a 5°C difference with the other apartments, that's still enough to hijack 6 kW of power from unsuspecting neighbours...

Well, I digress somewhat; let's focus on those dozen watts that seem to be enough to save the whole world...

Good, I'll make you a deal: you pay my AC cost this summer. For the summer months, it runs about 3x my highest heating bill, and I live in Minnesota. And the other poster is right as well: electricity is a very expensive way to heat in the winter compared to natural gas.

As for the dozen watts saving the world, that's just another of your typically absurd comments. My point was simply that the most logical way to compare the cost of CPUs is to compare purchase price plus cost of operation. Other expenses are irrelevant to that analysis. The bottom line is that worse performance *and* higher energy use is the worst of both worlds, as evidenced by AMD's market share. Not even rock-bottom pricing is enough to persuade consumers to buy them.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
For the summer months, it runs about 3x my highest heating bill,


So I guess that 40W less will surely cut that bill significantly, especially as it's only for the 5 hours/day you quoted...


As for the dozen watts saving the world, that's just another of your typically absurd comments. My point was simply that the most logical way to compare the cost of CPUs is to compare purchase price plus cost of operation. Other expenses are irrelevant to that analysis. The bottom line is that worse performance *and* higher energy use is the worst of both worlds, as evidenced by AMD's market share. Not even rock-bottom pricing is enough to persuade consumers to buy them.

As your own example proves, there are no savings to be made from a few dozen watts; you're talking about $100/month and you think that $1-2/month will make a difference?

Seriously?

A piece of advice: take a Kill A Watt and measure your microwave oven's consumption - and not when it's on, I mean when it's plugged into the mains but not in use...
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Seems like there is a bit of a leap from a single transistor to an entire CPU.

Indeed; fortunately the OEMs are given models that are actually black boxes. They don't know what is inside; the electrical characteristics at the terminals are all a simulator needs.

It would take years to simulate even a single clock cycle on any PC if the file were the full CPU netlist, that is, the complete electrical schematic "spiced" down to individual transistors...
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Instead of selling ice to Eskimos, selling them AMD CPUs should be a piece of cake.

That's not even funny...

At least less funny than people who want to recoup $100 through 40W savings...

Here an FX costs €115 less than a 4690K, and electricity costs in France/Germany are 15/25 cents per kWh...

115 / (0.04 × 0.15) ≈ 19,166 hours and 115 / (0.04 × 0.25) = 11,500 hours respectively.

At 5 h/day that's 3,833 and 2,300 days respectively.

And that's assuming you live somewhere with no winter at all - perhaps that's Minnesota, I don't know the climate there...

The other possibility is to skip those complex calculations entirely and rely on urban legends, yet again...
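A quick sketch reproducing the break-even arithmetic above (the €115 price gap, the 40W delta and the 15/25 cent prices are the figures from the post):

```python
# Break-even sketch using the figures quoted in the post above.
price_gap_eur = 115      # price difference between the two CPUs, in euros
delta_kw      = 0.040    # assumed 40 W power difference between them
hours_per_day = 5

for price_per_kwh in (0.15, 0.25):   # quoted France / Germany electricity prices
    hours = price_gap_eur / (delta_kw * price_per_kwh)
    print(f"{price_per_kwh} EUR/kWh: {int(hours)} h, {int(hours / hours_per_day)} days")
# -> 19166 h (3833 days) at 0.15 EUR/kWh, 11500 h (2300 days) at 0.25 EUR/kWh
```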
 

coercitiv

Diamond Member
Jan 24, 2014
7,359
17,445
136
Wow. I promised myself not to get involved with this stupid debate but...

TDP (Thermal Design Power) is NOT a direct measure of power consumption.

Some people don't seem to get that (not naming names here). TDP numbers from different manufacturers are not directly comparable, because each company measures TDP differently. Some companies may measure TDP using a different workload than others, and some companies' stated TDP isn't even defined in the same manner as other companies'.

It's unfortunate that there isn't a standard method for testing and determining TDP, but that's why I only pay attention to TDP when CPUs/GPUs from the same company, and even the same generation, are being compared.
What you said about TDP numbers from different manufacturers not being directly comparable only applies to older products. When chips could not accurately measure their own power usage, TDP was defined using a certain workload, which could indeed lead to situations where different manufacturers had different standards for measuring TDP. Modern chips rely on very accurate power metering to abide by their TDP rating, which means they can adjust their speed under load to make sure they use exactly as much power as they are allowed to. (lots of juicy info here)

One can still attempt to convince peers that a processor from company A or B is breaching its TDP spec by measuring power usage over short periods, when CPUs take advantage of the heat capacity of their coolers and temporarily operate beyond their normal energy budget.
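For readers who want a feel for how that works, here is a minimal sketch of a power-limited boost-then-throttle loop. It is not Intel's actual algorithm, and the limit and time-constant values are made up for illustration; the point is only that instantaneous power can exceed the sustained limit while a running average is held near or below it.

```python
# Toy model: instantaneous power may reach a short-term limit (PL2) while an
# exponentially weighted running average is kept near the sustained limit (PL1).
# All values are illustrative, not real Intel parameters.

PL1, PL2, TAU = 84.0, 120.0, 8.0   # sustained limit (W), burst limit (W), time constant (s)
DT = 0.1                            # control-loop step (s)

avg_power = 0.0
requested = 120.0                   # workload asks for PL2-level power the whole time

for step in range(200):             # simulate 20 seconds
    # Crude bang-bang control: allow PL2 while the average is under PL1, else clamp to PL1.
    granted = min(requested, PL2 if avg_power < PL1 else PL1)
    # Exponentially weighted moving average with time constant TAU.
    avg_power += (DT / TAU) * (granted - avg_power)
    if step % 50 == 0:
        print(f"t={step * DT:4.1f}s  granted={granted:5.1f} W  avg={avg_power:5.1f} W")
```

Real governors are smoother than this bang-bang sketch, but the shape is the same: a burst above the sustained limit early on, then throttling back once the averaged power catches up.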

But let's not get into such petty details when a far more interesting discussion about Air Conditioning is at hand. It's hot!