Don Karnage
Distributed Computing? CPUs are so powerful nowadays it's hard to find a real-world application that can keep one at TDP levels. As such, modern CPUs stay far under TDP.
"He doesn't need to. It's not that uncommon. My 2600K does 4.8GHz w/HT and 5.0GHz w/o HT on 1.424V. The NH-D14 or any high-end tower cooler can handle that."
Show me :biggrin:
Why on Earth did you go from SB-E to IB?
So damn what? Almost everyone here is going to overclock the piss out of those chips anyways and will see higher wattage than stock.
" At 4.6 - 4.7 Ghz the temperature gets so high the processor throttles "
But, but, Ivy is just as fast as a higher-clocked Sandy at 4.6GHz... clearly just because it hits 80C at 1.24V doesn't mean a thing... screw all these retail samples we have floating around, let's wait till reviewers get their cherry-picked chips. Ivy is teh ish, people are just ignorant.
The explanation they gave about die size makes no sense.
Are you finished with the axe grinder? I need it for another thread....
As bad as it is that Ivy Bridge failed, AMD needs that leeway to catch up.
It won't be enough. AMD is so far behind it's not funny. Intel would have to screw up IB and their next CPU as well, I'm thinking, before AMD could get into position. Besides, IB is not a failure yet. I kinda laugh how people are whining that IB is not hitting 5GHz like they want. Like 4.6GHz is not enough?? <insert laughing face here>
The explanation they gave about die size makes no sense. A smaller die not being able to dissipate heat as well might make temperatures increase, but it won't increase the actual power the processor uses, AKA its TDP.
Ivy Bridge is basically a die-shrunk Sandy Bridge. They didn't add any features that would make it use significantly more power and offset that die shrink.
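To put some rough numbers behind that: in the usual CMOS switching-power approximation, die area never appears directly; power comes from switched capacitance, voltage and frequency, and a shrink tends to lower the first two. A quick sketch, with every capacitance/voltage/frequency value made up purely for illustration (not real SB or IB figures):

```python
# Back-of-the-envelope CPU power sketch. Every number here is an illustrative
# assumption, not a measured Sandy Bridge or Ivy Bridge spec.

def dynamic_power(activity, c_eff_farads, voltage_v, freq_hz):
    """Classic CMOS switching-power approximation: P = a * C * V^2 * f."""
    return activity * c_eff_farads * voltage_v ** 2 * freq_hz

# Hypothetical "big die" chip vs. the same design shrunk: the shrink lowers the
# switched capacitance (and usually the voltage), so dynamic power goes DOWN.
p_big    = dynamic_power(activity=0.5, c_eff_farads=40e-9, voltage_v=1.20, freq_hz=3.5e9)
p_shrunk = dynamic_power(activity=0.5, c_eff_farads=28e-9, voltage_v=1.05, freq_hz=3.5e9)

print(f"big die:    {p_big:.0f} W")    # ~101 W with these made-up numbers
print(f"shrunk die: {p_shrunk:.0f} W")  # ~54 W -- area only shows up via capacitance
```

What the smaller die does change is how concentrated that power is and how easily the heatsink can pull it out, which shows up as temperature, not as extra watts.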
Coup27 said: "I find it very unlikely that Intel will release a batch of 95W CPUs while they are still refining the manufacturing process and in a few months the same CPUs will be 77W. That's what they spend all those months doing QA, validation and testing for. Smells more like BS to me at the moment."
"They've not shown there's higher heat. Heat is not the same as temperature; heat is measured in Joules, not C."
As shown by IDC, higher heat = higher power, so actually it would increase the actual power use.
"Again, their explanation makes no sense. You can't infer that a higher temperature means a higher TDP."
That doesn't mean it is the explanation for the increase in TDP, but it can't be ruled out that they are running hotter than anticipated and that's contributed to an increase in power consumption.
They've not shown there's higher heat. Heat is not the same as temperature; heat is measured in Joules, not C.
All they’ve shown is temperature and you can’t infer anything about TDP from that alone.
Again, their explanation makes no sense. You can’t infer that a higher temperature means a higher TDP.
Take a passively cooled Intel IGP on air and a GTX580 in a liquid nitrogen bath. Because the GTX580 has a lower temperature, does that mean it uses less power than the Intel IGP? Nope, of course not, because temperature alone doesn't tell you anything about power consumption.
A smaller die might make it harder for the heatsink to draw heat away (AKA higher temperature), but the processor is still using the same amount of power regardless.
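The same point in equation form: at steady state the die temperature is just ambient plus power times the thermal resistance of the cooling path, so identical wattage can land at very different temperatures. A minimal sketch, where every thermal-resistance value is an assumption chosen only to illustrate the relationship:

```python
# Steady-state junction temperature: T_junction = T_ambient + P * R_theta,
# where R_theta is the total junction-to-ambient thermal resistance (C per watt).
# All values below are illustrative assumptions, not measured figures.

def junction_temp(ambient_c, power_w, r_theta_c_per_w):
    return ambient_c + power_w * r_theta_c_per_w

same_power = 77.0  # the electrical power is identical in every case

print(junction_temp(25.0, same_power, 0.45))    # big die, good tower cooler:      ~60 C
print(junction_temp(25.0, same_power, 0.65))    # same cooler, smaller/denser die: ~75 C
print(junction_temp(-150.0, same_power, 0.10))  # LN2 pot: way colder, same 77 W
```

Same 77 W in all three cases; only the thermal path changed, which is exactly the IGP-vs-GTX580 comparison above.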
"We aren't talking about the same chip, we're talking about SB vs IB. To make the inference that IB uses more power simply because it runs hotter is invalid."
If a chip is running hotter (temperature), it will use more power than it would if it were cooled to a lower temperature, for a given clock speed and voltage.
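That last bit is the leakage effect: transistor leakage current climbs steeply with temperature, so the very same chip at the same volts and clocks does draw somewhat more power when it runs hotter. A crude sketch using a common rule-of-thumb exponential model; the baseline dynamic/leakage split and the doubling interval are assumed for illustration, not measured SB/IB data:

```python
# Crude temperature-dependent power model: fixed dynamic power plus a leakage
# term that roughly doubles every `doubling_c` degrees. All constants are
# illustrative assumptions, not measurements.

def total_power(temp_c, dynamic_w=65.0, leak_at_25c_w=8.0, doubling_c=25.0):
    leakage_w = leak_at_25c_w * 2 ** ((temp_c - 25.0) / doubling_c)
    return dynamic_w + leakage_w

print(f"{total_power(25):.0f} W at 25 C")  # ~73 W
print(f"{total_power(50):.0f} W at 50 C")  # ~81 W -- same V and f, just hotter
print(f"{total_power(80):.0f} W at 80 C")  # ~102 W -- leakage is doing the extra work
```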
