First complete review of Haswell i7-4770K


galego

Golden Member
Apr 10, 2013
1,091
0
0
You are talking about gamers, right? Crysis 3 is the first piece of software that will actually make use of those 4+ cores... As I see it, the need for those cores has only just begun.

Agree. Many people have considered the effect of the new consoles on Nvidia, but few have considered the effect on Intel.

The effect here is double. On the one hand, the gap between the cheap AMD chips and the expensive top i7 has narrowed drastically, and Haswell is not going to change that.

On the other hand, AMD has proven that they have the technology to compete with Intel and that the main problem was in the software.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I have to disagree. Cause and effect are backwards here. The reason computer sales are down is that nobody can increase CPU performance significantly. Nobody needs to upgrade when newer machines aren't any faster than older machines. It's not that there's no money in the market so nobody bothers trying to improve things. We have been knocking on the end of Moore's Law for a couple of years now. Data farms and supercomputers are only a small slice of the market, even if they carry the highest profit margins. And besides, it looks like Intel failed at increasing performance/watt too.
Is it broken? I just recommended an upgrade earlier today, not because an E4500 w/ 1GB RAM and IGP was too slow, but because it was a consumer price-point box and, if others have been any indication, it's going to start risking failure from aging components soon.

A PC of similar cost today could include an i5-3330/3450, which is insanely faster than an E4500 (a close match in Bench). Yet about the only thing the old box won't do is flawless full-screen YouTube playback. I know some folks who are still more than pleased with their A64 X2s, as well.

There will continue to be demand for more raw performance, but for the time being and the near future the masses want more, better, and cheaper smart devices (once everything is connected and portable devices are limited mostly by their UIs, then what?), and businesses want to reduce the TCO of the next round.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
For people like me with an Intel i3-2120, Haswell is GOOD!

How can that upgrade really be bad?

First, we (me at least) were considering the difference between the i7-3770K and the i7-4770K. The review in the OP compared both chips.

About upgrading from an i3: since the 4770K gives virtually the same performance as Ivy but consumes more power, maybe requires another PSU and a new mobo (by new here I mean untested, not the fine-tuned engineering of the dozens of Sandy and Ivy mobos), and will be more expensive, the upgrade to the 3770K looks much better.
 

bononos

Diamond Member
Aug 21, 2011
3,939
190
106
First, we (me at least) were considering the difference between the i7-3770K and the i7-4770K. The review in the OP compared both chips.

About upgrading from an i3: since the 4770K gives virtually the same performance as Ivy but consumes more power, maybe requires another PSU and a new mobo (by new here I mean untested, not the fine-tuned engineering of the dozens of Sandy and Ivy mobos), and will be more expensive, the upgrade to the 3770K looks much better.

Out of curiosity, what are your computer specs? Just the CPU/mobo.

You are trying too hard here. The difference in peak power consumption is very small (23W) compared to AMD's power usage. Do you know that a stock 8350 can consume up to 200W despite its 125W rating?
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Out of curiosity, what are your computer specs? Just the CPU/mobo.

You are trying too hard here. The difference in peak power consumption is very small (23W) compared to AMD's power usage. Do you know that a stock 8350 can consume up to 200W despite its 125W rating?

Not that I disagree with your overall point and direction, but just a minor correction: TDP isn't power consumption.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Not that I disagree with your overall point and direction, but just a minor correction: TDP isn't power consumption.

I've seen this said so many times, but it's just not true.

TDP is very relevant when you are running applications that are extremely demanding, or using an iGPU setup and playing games.

For CPU-only tests, Intel Burn Test/LinX will get to TDP easily. Sure, for a very short time it may exceed it in some circumstances, but the power control unit should average it out over long periods.

Some shoddy designs like the Pentium 4 chips and even Bulldozer can exceed TDP with an application that isn't a power virus (like Prime95), but most chips don't do that. If an application demands power above TDP, the power manager will force the chip to back off Turbo to keep it at TDP levels.

Since the 4770K gives virtually the same performance as Ivy but consumes more power,

I bet you the power consumption outside of AVX2-optimized applications will not be much higher, if at all. That's demonstrated by the Orthos paging test running only 0.6W higher on the 4770K. Intel Burn Test will fully load the AVX2 units in Haswell, though, and performance there will approach 2x Ivy Bridge, so the increase in power will be justified.
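
For anyone who would rather measure than argue: Sandy Bridge and later chips expose an on-package energy counter (RAPL), and a reasonably recent Linux kernel publishes it through the powercap sysfs interface. Below is a minimal sketch of polling it in Python; the sysfs path, the one-second interval, and running it as root are assumptions about a typical setup, and RAPL itself is a model-based estimate rather than a lab-grade meter.

# Rough package-power sampler using Intel RAPL via the Linux powercap interface.
# Assumes the intel_rapl driver is loaded and that domain 0 is the CPU package;
# reading energy_uj usually requires root.
import time

ENERGY_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj():
    # Cumulative package energy in microjoules; the counter wraps eventually.
    with open(ENERGY_FILE) as f:
        return int(f.read())

def package_watts(interval_s=1.0):
    e0 = read_energy_uj()
    time.sleep(interval_s)
    e1 = read_energy_uj()
    delta = e1 - e0
    # Skip a sample that spans a counter wrap.
    return (delta / 1e6) / interval_s if delta >= 0 else None

if __name__ == "__main__":
    # Run the load (LinX, IBT, a game) in another window and watch the numbers.
    for _ in range(10):
        watts = package_watts()
        if watts is not None:
            print("package power: %.1f W" % watts)

Running it alongside Intel Burn Test is a quick way to see for yourself whether the package really settles around its rated TDP.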

Now, I'm not denying the 4770K is looking like not much of an upgrade.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
No, no, no. Please look up the references on what TDP is. You are not correct: TDP isn't power consumption. It's a measure of how capable the cooling system has to be. The prior poster implied that since the TDP is 125W, power consumption has to stay under 125W. That is not the case.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
No, no, no. Please look up the references on what TDP is. You are not correct: TDP isn't power consumption. It's a measure of how capable the cooling system has to be. The prior poster implied that since the TDP is 125W, power consumption has to stay under 125W. That is not the case.

You are confusing workload-dependent power with general power use. Modern processors have sophisticated power management systems that allow great variance in power use depending on what program you are running.

But that doesn't mean it can't reach TDP, or that it'll exceed it (for sustained periods of time). Since Nehalem, Intel's chips have had a Power Control Unit (PCU) to forcefully keep the CPU from going above TDP if the workload is demanding enough to push it past that level.

Intel's datasheets also say that for a few seconds or so it can exceed TDP levels, but the power manager makes sure that the long-term average power remains no greater than TDP.

A lot of references claim that TDP is "practical power use" rather than "max power", but when there's a dedicated piece of hardware making sure the chip doesn't exceed the TDP limits, the older definitions are pointless to talk about. Again, the key is "sustained", or "long-term average", power when referring to TDP.
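
To illustrate the "long-term average" idea, here is a toy sketch of an average-power limiter: short bursts above the limit are tolerated, but turbo is pulled back once the running average creeps over it. This is only a conceptual model, not Intel's actual PCU algorithm, which works with firmware-configured power limits (PL1/PL2) and a weighted averaging window.

# Toy model of an average-power limiter, for illustration only -- not Intel's
# actual PCU algorithm. Bursts above the limit pass; turbo is cut whenever the
# running average over the window exceeds the limit.
from collections import deque

class AveragePowerLimiter:
    def __init__(self, limit_w, window_samples):
        self.limit_w = limit_w
        self.samples = deque(maxlen=window_samples)

    def update(self, measured_w):
        """Feed one power sample; return True if turbo may stay enabled."""
        self.samples.append(measured_w)
        average = sum(self.samples) / len(self.samples)
        return average <= self.limit_w

# Example with an 84 W limit: a single 110 W burst passes because the running
# average stays under 84 W, but a sustained 95-100 W load trips the limiter.
limiter = AveragePowerLimiter(limit_w=84.0, window_samples=8)
for sample in (60, 65, 110, 80, 90, 95, 100, 98):
    verdict = "turbo OK" if limiter.update(sample) else "throttle"
    print("%3d W -> %s" % (sample, verdict))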
 
Last edited:

Atreidin

Senior member
Mar 31, 2011
464
27
86
You are confusing workload-dependent power with general power use. Modern processors have sophisticated power management systems that allow great variance in power use depending on what program you are running.

But that doesn't mean it can't reach TDP, or that it'll exceed it (for sustained periods of time). Since Nehalem, Intel's chips have had a Power Control Unit (PCU) to forcefully keep the CPU from going above TDP if the workload is demanding enough to push it past that level.

Intel's datasheets also say that for a few seconds or so it can exceed TDP levels, but the power manager makes sure that the long-term average power remains no greater than TDP.

A lot of references claim that TDP is "practical power use" rather than "max power", but when there's a dedicated piece of hardware making sure the chip doesn't exceed the TDP limits, the older definitions are pointless to talk about. Again, the key is "sustained", or "long-term average", power when referring to TDP.

Blackened23 did not say that processors can't reach or exceed TDP. In fact, the opposite was claimed; that was the point of the original statement.

You can't say that TDP is power consumption and then go and contradict yourself. Well, you can, but then you make no sense.

The point is that TDP does NOT mean maximum power draw of a processor. It doesn't even really mean "average" power draw.

You also can't compare AMD and Intel TDP definitions; they aren't the same. Intel plays with thermal power definitions a lot; I mean, recently they invented a totally new term with its own weird definition so they could claim to have 7W processors.

Manufacturers put weasel words in their definitions to define what "normal use" means. Any company that uses processors in its product designs should employ knowledgeable, competent engineers who know these definitions and how to build in safety margins for their own designs.
 
Last edited:

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
You also can't compare AMD and Intel TDP definitions; they aren't the same. Intel plays with thermal power definitions a lot; I mean, recently they invented a totally new term with its own weird definition so they could claim to have 7W processors.
Yeah, that was called SDP & I was like, WTH, seriously!
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
But that doesn't mean it can't reach TDP, or that it'll exceed it

TDP does not have a strict meaning; the definition is variable. From Intel themselves:

And from Intel’s datasheet for Prescott CPUs:

“Thermal Design Power (TDP) should be used for processor thermal solution design targets. The TDP is not the maximum power that the processor can dissipate.”

First, TDP's definition is variable. Second, TDP is not power consumption. You are still wrong to insist that it is.
 
Last edited:

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
The point is that TDP does NOT mean maximum power draw of a processor. It doesn't even really mean "average" power draw.

Sorry, but TDP *is* an average by definition:

Intel's: "The upper point of the thermal profile consists of the Thermal Design Power (TDP) and the associated Tcase value. Thermal Design Power (TDP) should be used for processor thermal solution design targets. TDP is not the maximum power that the processor can dissipate. TDP is measured at maximum TCASE."


AMD's: "TDP. Thermal Design Power. The thermal design power is the maximum power a processor can draw for a thermally significant period while running commercially useful software. The constraining conditions for TDP are specified in the notes in the thermal and power tables."

So I guess the two companies are settled on this.

You also can't compare AMD and Intel TDP definitions, they aren't the same. Intel plays with thermal power definitions a lot, I mean, recently they invented a totally new term with its own weird definition so they could claim they have 7W processors.

Intel didn't mess with their TDP definition, as they know better than to mess with a variable used by engineers to design cooling solutions.

Intel did say that *in some very specific* scenarios you could bring IVB down to 7W and design a cooling solution for that, and that's why it's called SDP, or Scenario Design Power. By no means is it a standard measure of a standard scenario, and Intel clearly stated that after the initial snafu.

But it seems that your memory is too short. Intel never messed with TDP, but AMD did. Once they were stuck with inferior products with higher TDP, they tossed away their own TDP measure and started to quote something called ACP. The problem? You couldn't design a cooling solution with ACP in mind unless you calculated an ACP for your specific workload and never deviated from it.

A certain former senior marketing director also lied about AMD's TDP definition on forums, and lied about the usefulness of ACP in designing thermal solutions for AMD chips. AMD has since shelved the ACP scam.
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
I know what you are all saying, that TDP is thermal power and not electrical power. But via the law of conservation of energy, there is no way a CPU consuming 200 W only puts out 125 W of heat. Where would you propose the rest of that energy went?
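
To put it as a rough balance (ballpark reasoning, not a measured figure): P_drawn = P_heat + P_leaving_the_package, and the second term, the signal energy actually carried off the pins into the board, is a tiny fraction of the total. So P_heat is essentially equal to P_drawn, and a chip pulling 200 W hands its cooler essentially 200 W.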
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
Out of curiosity, what are your computer specs? Just the CPU/mobo.

You are trying too hard here. The difference in peak power consumption is very small (23W) compared to AMD's power usage. Do you know that a stock 8350 can consume up to 200W despite its 125W rating?
I also am waiting for the poster to reveal his CPU/GPU specs. What is sad is that I showed my specs below, and in another thread I asked the poster to reveal his when he argued over whether or not the FX-8350 was better than the 3770K, etc.

Now we have to be subjected to statements such as a 4770K using more power than an Ivy Bridge? How was that statement arrived at when Haswell has not even been released? P.S. I read the thread analysis of some early testing by someone. I'll wait until Haswell is officially released and the NDAs expire to pass judgment on its power usage compared to Ivy.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
That's ancient news. Anyway, if Intel does go down this road for any length of time, AMD could, but more importantly should, catch up, if not overtake them, in the not-too-distant future :p

At 14nm, all Intel has to do is increase their core count and the fanciful notion of AMD catching up will be ended.
 

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
At 14nm, all Intel has to do is increase their core count and the fanciful notion of AMD catching up will be ended.
The core count will not be increased, and you know that, because they have to increase their IGP performance to match low-end discrete GPUs. So AMD does have a realistic chance here, though whether you accept that or not is a different matter altogether!
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
The core count will not be increased, and you know that, because they have to increase their IGP performance to match low-end discrete GPUs. So AMD does have a realistic chance here, though whether you accept that or not is a different matter altogether!
Huh?

Why can't Intel offer a 6 core model to compete against AMD's Steamroller?

At 14nm, Intel will have almost twice the transistors to play with that they had at 22nm, so they could give 50% of that to the CPU and 50% to the IGP (which will be significantly redesigned as well).
 

zebrax2

Senior member
Nov 18, 2007
977
70
91
I know what you are all saying, that TDP is thermal power and not electrical power. But via the law of conservation of energy, there is no way a CPU consuming 200 W only puts out 125 W of heat. Where would you propose the rest of that energy went?

Used in switching the transistors
 

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
Huh?

Why can't Intel offer a 6 core model to compete against AMD's Steamroller?

At 14nm, Intel will have almost twice the transistors to play with that they had at 22nm, so they could give 50% of that to the CPU and 50% to the IGP (which will be significantly redesigned as well).
They can, but will they? Talk realistically here, because we're not talking about the enthusiast market anymore!