IB is not 77W but 95W!!


IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Distributed Computing?

Sure, but aside from that you'd be hard pressed to find a non-synthetic application capable of fully stressing the CPU, at least for PCs. And in the very few cases that do, power management will keep it under the limit.
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,346
136
www.teamjuchems.com
We aren't talking about the same chip, we're talking about SB vs IB. To make the inference that IB uses more power simply because it runs hotter is invalid.

A 4850 has a higher temperature than a 4870:

[Chart: 4850 vs 4870 temperatures]

Following your reasoning, I guess you think the 4850 uses more power? Because it doesn't - it actually uses 29W less under load:

[Chart: 4850 vs 4870 load power consumption]

The 4850 and 4870 are also on the same process node and of the same generation, unlike SB vs IB.

Again, you can't infer anything about power usage solely on temperature. The cooler could be different, ambient temperature could be different, etc.


That is not my reasoning. Clearly. Read what I said and please don't put words in my mouth.

If people are looking for reasons why IVB is drawing more power than expected, then it is reasonable to conclude that the temperature issues some have seen may be the culprit, per IDC's research. Making your conclusion about "invalid" points "invalid."

Reading the post Lonyo made, which you quoted, I believe that Lonyo and I are trying to convey the same point. If you don't get it, don't try to make me sound like an idiot.

Lastly, I posed my reply as a question. I am not an expert, but IDC is. He posted a great thread on this, which I believe directly conflicts with your line of thinking. I was trying to remind you of it; if you weren't aware of it, please take a look. Maybe you'd look at it and be able to tell me that I made the wrong conclusion - but I don't think so :)

Indulging your example, I would posit that a 4850, all other things held constant, would use less power if its operating temperature were lowered, and that the inverse is also true. I would cite IDC's research :)

I do think it would be interesting to test that (with a GPU) - if I ever get in gear and get my main 5870 under water, I'll be sure to try it and report my findings here! But silicon should be silicon, no?
 

Smoblikat

Diamond Member
Nov 19, 2011
5,184
107
106
That is not my reasoning. Clearly. Read what I said and please don't put words in my mouth.

If people are looking for reasons why IVB is drawing more power than expected, then it is reasonable to conclude that the temperature issues some have seen may be the culprit, per IDC's research. Making your conclusion about "invalid" points "invalid."

Reading the post Lonyo made, which you quoted, I believe that Lonyo and I are trying to convey the same point. If you don't get it, don't try to make me sound like an idiot.

Wouldn't higher power = higher temps, not higher temps = higher power?
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,346
136
www.teamjuchems.com
Wouldn't higher power = higher temps, not higher temps = higher power?

I am not going to track it down just this instant, but IDC has an incredible thread where, keeping all other things constant, increasing the operating temperature of a chip drastically affects its power draw (negatively).

It's here, somewhere :)

Granted, that is for SB, so I think we are all waiting on IDC to do his thing for IVB :)
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
The relationship between power consumption and temperature is not linear. We won't know until Ivy Bridge is out and can be tested.
 

Don Karnage

Platinum Member
Oct 11, 2011
2,865
0
0
The only issue with high temps I'm having is with IBT. Playing BF3, temps max in the low 70s C at 4.6GHz.
 

daveybrat

Elite Member
Super Moderator
Jan 31, 2000
5,821
1,035
126
Sooo basically what I've gathered from this thread is this:

1) If you are building a new computer and prices are the same, then take an IB over an SB CPU.

2) If you currently own a SB processor, there is little to no reason to upgrade till Haswell.

Correct?
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
My SB has a lot of problems in the winter because I keep my water below 0°C. Is it OK for me to upgrade to IB? I hear it doesn't have a cold bug...
 

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,711
4,671
75
I wonder what the tdp of the human brain is. :D
20W, including memory and semi-solid-state storage. But its power supply is not very efficient. ;)

@daveybrat: Correct. As far as we know right now.
 

AkumaX

Lifer
Apr 20, 2000
12,648
4
81
What's more power efficient (lower power consumption): an i7-2600K @ 4.2GHz or an IB ???K @ 4.2GHz (no voltage adjustment)?
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
What's more power efficient (lower power consumption): an i7-2600K @ 4.2GHz or an IB ???K @ 4.2GHz (no voltage adjustment)?

Not guaranteed both will do 4.2 at stock voltage, but if they did, Ivy would probably use less.

I think even with both at 1.35V, Ivy would have less power draw.
 

PlasmaBomb

Lifer
Nov 19, 2004
11,636
2
81
Going from aluminum to copper won't do anything but raise prices. Right now the heat is so concentrated in one small area that air and liquid can't dissipate it fast enough since the surface area is so small.

It already is copper; it's nickel-plated for corrosion resistance. Have a look at a lapping thread -

[Image: DSCN0455Q9505Lapped.jpg - image belongs to Idontcare]
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
If people are looking for reasons why IVB is drawing more power than expected, then it is reasonable to conclude that the temperature issues some have seen may be the culprit, per IDC's research. Making your conclusion about "invalid" points "invalid."
Reading the post Lonyo made, which you quoted, I believe that Lonyo and I are trying to convey the same point. If you don't get it, don't try to make me sound like an idiot.
I’m afraid it’s you who needs to take a look at the arguments being presented here. I stated that you can’t infer TDP by simply looking at temperatures, and this is absolutely correct. No amount of your appeal-to-authority fallacy will change this.

I can prove this right now from a leaked review: http://www.tweaktown.com/reviews/46...th_ivy_bridge_motherboard_review/index11.html

IB has higher temperatures compared to SB, so I guess IB uses more power too? Actually no, it doesn’t. What more proof do you need that you can’t infer TDP solely from temperature?

Indulging your example, I would posit that a 4850, all other things held constant, would use less power if its operating temperature were lowered, and that the inverse is also true.
Not relevant. Again, the issue here is claiming ‘A’ uses more power than ‘B’ simply because ‘A’ has a higher temperature. You absolutely cannot make that inference and I’ve shown two real examples (4850 vs 4870 and SB vs IB) that show the higher temperature parts using less power.

If you can’t understand the significance of these examples then you don’t even understand what the issue is.

IB could have a smaller heatsink mass, a lowered fan speed profile, or they could’ve run it under higher ambient temperatures. There are many possible reasons why it could have a higher temperature that are not caused by a higher TDP, contrary to what the article seems to imply.

I do think it would be interesting to test that (with a GPU) - if I ever get in gear and get my main 5870 under water, I'll be sure to try it and report my findings here! But silicon should be silicon, no?
It’s not really relevant to anything but sure, you do whatever floats your boat.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I am not going to track it down just this instant, but IDC has an incredible thread where, keeping all other things constant, increasing the operating temperature of a chip drastically affects its power draw (negatively).

It's here, somewhere :)

Granted, that is for SB, so I think we are all waiting on IDC to do his thing for IVB :)

You pretty much nailed it in a nutshell already ;), I'll toss in the link to the thread just for good measure :D

Effect of Temperature on Power-Consumption with the i7-2600K

[Chart: PowerversusTemperature.png]


Temperature directly impacts power-consumption because of the fundamental physics involving the Poole-Frenkel effect and leakage.

[Chart: PtotalVccTGHz.png]


This is fundamental to all CMOS devices, be they CPUs, memory, GPUs, the ICs in your microwave oven, etc.

The higher the operating temperature, the higher the power-consumption, so all TDP specifications must incorporate and comprehend this unavoidable reality when being defined with respect to TJmax and heat-transfer specs.

Temperature alone does not define TDP, no one parameter on its own can or does dictate TDP. But you are assured that the same chip operating at the same clockspeed and same voltage will require a higher TDP spec if it is to operate at a higher temperature than a cooler one.

(This is part of the motivating factor for spec'ing a lower max operating temperature with AMD CPUs versus Intel's: by spec'ing a lower max operating temp, AMD is able to operate the CPU at a higher clockspeed while staying within their spec'ed TDP footprint... but it only works if you can keep the temps below that max spec'ed operating temp.)
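
To put some shape on that, here's a minimal toy model of the effect, assuming a simple exponential leakage term - every constant below is purely illustrative, not measured from any actual chip:

[code]
# Toy model: total CPU power vs. die temperature at fixed voltage and clock.
# Dynamic (switching) power is roughly temperature-independent, while leakage
# grows roughly exponentially with temperature (Poole-Frenkel and friends).
# All constants here are illustrative assumptions, not measured values.
import math

P_DYNAMIC = 60.0   # W, switching power at a fixed Vcc and clock (assumed)
P_LEAK_REF = 15.0  # W, leakage at the reference temperature (assumed)
T_REF = 50.0       # deg C, reference die temperature (assumed)
K = 0.02           # 1/deg C, leakage growth rate (assumed)

def total_power(t_die):
    leakage = P_LEAK_REF * math.exp(K * (t_die - T_REF))
    return P_DYNAMIC + leakage

for t in (40, 60, 80, 95):
    print("%3d C -> %5.1f W" % (t, total_power(t)))
[/code]

Same chip, same clockspeed, same voltage - only the die temperature changes, and total power climbs with it. That's the whole point of the chart above.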
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
Temperature alone does not define TDP, no one parameter on its own can or does dictate TDP.
Yep, that's exactly the point I was making.

But you are assured that the same chip operating at the same clockspeed and same voltage will require a higher TDP spec if it is to operate at a higher temperature than a cooler one.
I saw an example of this a few years ago:

[Chart: amp_power.png]

The Zotac actually has a factory overclock over the reference GTX480, yet it still uses less power because of its superior cooler.
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,346
136
www.teamjuchems.com
I’m afraid it’s you who needs to take a look at the arguments being presented here. I stated that you can’t infer TDP by simply looking at temperatures, and this is absolutely correct. No amount of your appeal-to-authority fallacy will change this.

I can prove this right now from a leaked review: http://www.tweaktown.com/reviews/46...th_ivy_bridge_motherboard_review/index11.html

IB has higher temperatures compared to SB, so I guess IB uses more power too? Actually no, it doesn’t. What more proof do you need that you can’t infer TDP solely from temperature?


Not relevant. Again, the issue here is claiming ‘A’ uses more power than ‘B’ simply because ‘A’ has a higher temperature. You absolutely cannot make that inference and I’ve shown two real examples (4850 vs 4870 and SB vs IB) that show the higher temperature parts using less power.

If you can’t understand the significance of these examples then you don’t even understand what the issue is.

IB could have a smaller heatsink mass, a lowered fan speed profile, or they could’ve run it under higher ambient temperatures. There are many possible reasons why it could have a higher temperature that are not caused by a higher TDP, contrary to what the article seems to imply.


It’s not really relevant to anything but sure, you do whatever floats your boat.

We are talking about two different things. Completely. I am not talking about SB vs IVB, despite your repeated attempts to say that I am. Nor am I talking about 4850s vs 4870s, but rather about one CPU family.

What I get out of this thread is that a certain SKU, the 3770k specifically, is running hotter than expected. It also appears to be getting a 95W TDP rating instead of the 77W TDP rating that was expected.

Does temperature play a role in this? Maybe. That is all I am saying. If the Intel thermal solution isn't performing as well as expected, the chip could be using more power than what was initially anticipated for a given clock speed. (Just one of many possible reasons a chip might run warmer than intended, no?)

Which you are basically agreeing with? (i.e., a better-cooled 480 uses less power.) I don't see what the argument is.

A separate issue is this: at higher clock speeds, where the temperatures look to be an even bigger issue, there might be a situation where the power leakage of IVB becomes too great and SB becomes the better-performing chip for a given amount of power (as it might have a large advantage in clock speed).

But that wasn't what I was talking about, and I am not sure how it came across that way. I am also a bit confused as to how that assertion could be so aggravating; we are still in speculation land. We'll just have to see how it shakes out.

Effect of Temperature on Power-Consumption with the i7-2600K

Temperature directly impacts power-consumption because of the fundamental physics involving the Poole-Frenkel effect and leakage.

Temperature alone does not define TDP, no one parameter on its own can or does dictate TDP. But you are assured that the same chip operating at the same clockspeed and same voltage will require a higher TDP spec if it is to operate at a higher temperature than a cooler one.

(This is part of the motivating factor for spec'ing a lower max operating temperature with AMD CPUs versus Intel's: by spec'ing a lower max operating temp, AMD is able to operate the CPU at a higher clockspeed while staying within their spec'ed TDP footprint... but it only works if you can keep the temps below that max spec'ed operating temp.)

I think that what I am saying jibes completely with this series of statements.

If the 3770k can't run at its target temp under load, Intel would simply raise its TDP to whatever ceiling is needed to accommodate the additional power drawn at the temperature it actually runs at under load with the provided cooling solution. Who knows under what conditions that additional ceiling might be needed.

Heck, that increase in TDP might simply come from deciding to consolidate to a single cooler SKU for the entire IVB line, one that is a little underbuilt for the 3770k. All sorts of reasons could be postulated that have little negative bearing on the chip itself.

That's the only point I am making, please don't interpret it as anything else. :)


Not relevant. Again, the issue here is claiming ‘A’ uses more power than ‘B’ simply because ‘A’ has a higher temperature. You absolutely cannot make that inference and I’ve shown two real examples (4850 vs 4870 and SB vs IB) that show the higher temperature parts using less power.

It was relevant to the point I was making, which is completely different than what you are talking about, evidently.

If we are talking about such disjointed topics, there is not much to be resolved here, I guess.

I agree. There is no call for saying that IVB uses more power than SB simply because it has a higher running temperature. Abso-freaking-lutely.

Evidently it is not this thread, but another thread, where folks actually have retail boxes with the 3770k displaying a 95W TDP. My mistake, I thought that was somewhere in this thread.

http://forums.anandtech.com/showthread.php?t=2237805&page=14

There it is, a ways down the page...
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Current attempts to explain this focus on the cooler. Apparently the CPU really has a TDP of 77W but the cooler is still certified for 95W.
I know, sounds weird, but this is what I've been hearing from several sources.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
I mentioned it already, boxleitnerb: I think IB will be a lower-power chip, but the new FinFET node needs better cooling for a given TDP. By specifying it as 95W, there is less risk of OEMs under-speccing the required cooler.

Although, I will still be curious to see retail power results. It may be that it falls somewhere between 77W and 95W at stock settings. Hard to justify a change in rated TDP if IB is actually an 85-90W chip.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Although, I will still be curious to see retail power results. It may be that it falls somewhere between 77W and 95W at stock settings. Hard to justify a change in rated TDP if IB is actually an 85-90W chip.

I found it strange that they would change the TDP to 77W for Ivy Bridge, then back to 95W for Haswell. I've also seen documents saying Ivy Bridge never changed TDP figures and it was always 95W.

According to this figure, Ivy Bridge only lowers voltage at intermediate frequency levels - not the lowest, not base, not Turbo, but in between: http://www.heise.de/newsticker/meld...dge-Prozessoren-1438747.html?view=zoom;zoom=2

That could be the reason why we're seeing such results. Most usage scenarios will show power reductions, since they don't fully load the chip. But applications that fully stress the chip won't show any power savings over the previous generation.
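
As a rough back-of-the-envelope sketch of why that would only show up at partial load: dynamic CMOS power scales roughly as C*V^2*f, so trimming voltage only at the intermediate P-states cuts power there while leaving the bottom and the full-Turbo figures untouched. The voltage/frequency pairs below are hypothetical, purely to show the shape of the effect:

[code]
# Toy comparison: dynamic power ~ C * V^2 * f for two hypothetical
# voltage/frequency tables. All numbers are illustrative, not Intel's.
C = 1.0  # effective switched capacitance, arbitrary units

# (frequency in GHz, core voltage in V) per P-state -- hypothetical values;
# the "ivy" table only drops voltage at the two intermediate states.
sandy = [(1.6, 0.90), (2.5, 1.05), (3.4, 1.15), (3.8, 1.25)]
ivy   = [(1.6, 0.90), (2.5, 0.95), (3.4, 1.05), (3.8, 1.25)]

def dyn_power(freq_ghz, volts):
    return C * volts ** 2 * freq_ghz

for (f_s, v_s), (f_i, v_i) in zip(sandy, ivy):
    print("%.1f GHz: SB %.2f vs IB %.2f (arb. units)" %
          (f_s, dyn_power(f_s, v_s), dyn_power(f_i, v_i)))
[/code]

At the lowest and highest states the two rows are identical, which lines up with seeing savings in everyday use but none under a full-load stress test.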
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
Something seems wrong with Ivy Bridge if the temperature concerns are true. I don't remember a process shrink ever creating such a sharp rise in thermal density. As others have pointed out, maybe it's that the Tri-Gate transistors are behaving differently.