First complete review of Haswell i7-4770K


blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
AHAHAHAHAHAHAHAHAHAAHAHAHAHAHAHAHAHAHAHAHAHAHA :whistle:

*goes off looking at dual ivy cpu boards*

Guys, I've known coolaler for a while now...
If something was wrong, he probably would have said something is wrong with your CPU.

Instead he's saying it's UGLY... lolol... I pretty much trust his assessment.

Calm down there sparky ;)

Let's wait for final reviews before making broad assumptions. Doom and gloom was predicted pre-release for Ivy Bridge as well, but the 3770K actually overclocks very well, IMHO. We will just have to wait and see what reputable reviewers think. :)
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Here's the GT 630 GDDR5 going against HD 4000 using DDR3-1600: http://technewspedia.com/intel-hd-graphics-4000-vs-nvidia-geforce-gt-630/

The average (excluding 3DMark) shows a 77% gain for the GT 630 GDDR5. The latest drivers bring a 10-20% gain in games (and also fix the low clock bug), while Haswell GT2 will add a further 30% or so on top of that.

It makes the other benchmarks of the HD 4600 look weird (probably because of the high AA).

[Chart: i5-3570 game benchmarks at 720p]


Based on their performance in games, the GeForce GT 630 performs 55.56% better (75.33% better if we include the synthetic test) than the Intel HD Graphics 4000 IGP.

Remove the synthetic test and you are looking at a roughly 56% performance deficit.

Better drivers (updated ones).

A DDR3 630 (which practically every single 630 is; why would they test a GDDR5 version when almost no 630 cards use GDDR5? It's like testing a DDR3 7750).

Probably something like a 25-35% performance advantage for the HD 4600 vs the HD 4000.

You could see the HD 4600 catch the tail end of the 630.
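
A quick side note on those percentages (my own arithmetic, not from the article): "55.6% faster" and a "56% deficit" are not the same relation. If the GT 630 is 55.6% faster than the HD 4000, the HD 4000 trails it by roughly 36%:

$$\frac{P_{\text{GT630}}}{P_{\text{HD4000}}} = 1.556 \;\Rightarrow\; 1 - \frac{P_{\text{HD4000}}}{P_{\text{GT630}}} = 1 - \frac{1}{1.556} \approx 0.36.$$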
 

beginner99

Diamond Member
Jun 2, 2009
5,319
1,765
136
For high-resolution laptop displays not even the HD 4000 is enough. Look at the Retina MacBook Pros.

Yeah, and ignoring the fact, as others said, that the problems aren't really from the HD 4000: how many such laptops exist? You got it, only the Retina MacBook Pros. I don't even know what form factors they come in, 13, 15 and 17 inch? That's 3 models worldwide out of probably thousands. Pretty much confirms what I said: the iGPU is only relevant in niche cases.

"Retina" displays will only become usable (on Windows) once MS finally gets their shit together and does proper scaling instead of relying on the useless font-DPI increase.
 

colonelciller

Senior member
Sep 29, 2012
915
0
0
Business junior. We buy a lot more PCs than the home user, and power consumption is very important to us.

:rolleyes:
Let's refrain from calling others "junior" when having difficulty arguing from logic. As a counterpoint to your comment, and as a former purchaser of computers in "business" (whatever that is supposed to mean), power consumption was NEVER on the radar. Never ever.
 

colonelciller

Senior member
Sep 29, 2012
915
0
0
Looks like we need a new CPU paradigm, as Intel seems to be incapable of delivering significant performance improvements on silicon.

There is nothing built into the laws of the universe that mandates that CPUs be designed and constructed as they currently are.
 

NTMBK

Lifer
Nov 14, 2011
10,461
5,846
136
Looks like we need a new CPU paradigm, as Intel seems to be incapable of delivering significant performance improvements on silicon.

There is nothing built into the laws of the universe that mandates that CPUs be designed and constructed as they currently are.

Intel are very far from incapable, it just isn't their priority right now.

From their point of view, the most profitable customers who demand performance improvements are those who run massive data farms or build supercomputers: the kind who buy literally thousands of high-margin Xeon chips. And what do they care about most? Performance/W. Look at why they shut down Roadrunner: it cost too much to run compared to modern designs. When your compute cluster uses 2.5 megawatts of power and runs flat out 24/7, damn straight you care about efficiency. They want Intel to make chips which are as efficient as possible (maximum performance/W), and then they buy thousands and thousands of cores.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I don't understand why people are referring to mobiles, servers, tablets, laptops, etc. in a high-end desktop CPU topic.

Yes, others may need performance/watt or lower power consumption, but WE are DESKTOP users and this topic is about the Core i7 4770K, the highest-end socket 1150 Intel CPU, selling at $300+.
 

colonelciller

Senior member
Sep 29, 2012
915
0
0
Intel are very far from incapable, it just isn't their priority right now.

From their point of view, the most profitable customers who demand performance improvements are those who run massive data farms or build supercomputers: the kind who buy literally thousands of high-margin Xeon chips. And what do they care about most? Performance/W. Look at why they shut down Roadrunner: it cost too much to run compared to modern designs. When your compute cluster uses 2.5 megawatts of power and runs flat out 24/7, damn straight you care about efficiency. They want Intel to make chips which are as efficient as possible (maximum performance/W), and then they buy thousands and thousands of cores.

For the desktop market & workstation market (excluding companies with insane numbers of computers) this performance/watt discussion is complete nonsense and a poorly disguised marketing misdirection.

Performance/watt is also a great way to redefine performance. When you add the "per watt", you can keep the same performance, cut the watts in half, and declare from the mountaintops that "performance has doubled!!!!" Those same marketers know full well that the people hearing that pronouncement understand performance to be performance... not some arbitrary new definition favored by Intel's Marketing Group.

I do understand that Intel would love to milk silicon into the ground for another 10-20 years even if they already had an alternative CPU-making method ready to go that would quadruple performance overnight... completely understandable corporate logic would dictate keeping that tech locked down until the silicon could be milked no longer.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I don't really care about performance per watt, I doubt most enthusiasts do.

The total power used by the CPU does kind of make a difference, because it impacts the cooling and thus the noise the system produces. There is also a very real limit to how much you can cool on air and then on water, and I doubt anyone wants to move to noisy and unreliable phase-change cooling just to run a modern CPU. So I think they had to stop increasing the power consumption, but I don't particularly care about performance per watt as a metric in itself, only to some extent the impact it has on a system and its cooling. I would say there is a maximum around 150 watts that I don't want the CPU to exceed by default. Beyond that it doesn't matter to me, and I doubt it does to many others.

However I do care about performance. I have programs today that take tens of minutes to compile due to their size and complexity. If someone gave me a CPU with 100x as many cores or 100x more frequency, I could put it to work immediately. What I can't get working is 100 machines on the same activity; it doesn't scale well due to the ratio of the size of the data to the size of the calculation. The overhead of the network dominates. I need more performance, like a million times more at least.

If Intel were to continue to push on with 10% increases, it would take 8 product releases before they even double performance. In the 1995 to 2005 period we could have relied on getting that within a single product release. To get the 1000x we have enjoyed over the past decade or so, it would now take 73 product releases. 73! SB, IB and Haswell are all kind of disappointing products from a computing perspective. We can already see the impact: software has moved predominantly onto the server side, into massive clusters of machines.
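
A quick check of that compounding arithmetic (my own back-of-the-envelope numbers, assuming a flat 10% gain per release):

$$1.1^{n} \ge 2 \;\Rightarrow\; n \ge \frac{\ln 2}{\ln 1.1} \approx 7.3 \;\Rightarrow\; 8 \text{ releases}, \qquad 1.1^{n} \ge 1000 \;\Rightarrow\; n \ge \frac{\ln 1000}{\ln 1.1} \approx 72.5 \;\Rightarrow\; 73 \text{ releases}.$$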
 

NTMBK

Lifer
Nov 14, 2011
10,461
5,846
136
For the desktop market & workstation market (excluding companies with insane numbers of computers) this performance/watt discussion is complete nonsense and a poorly disguised marketing misdirection.

It's the same cores, from their ultrathin laptops up to their massive HPC clusters. They need one design for the core, and that design prioritises efficiency because that is what suits the majority of their customers the best.

Also, higher performance/watt means you can cram more cores into the same TDP. Just look at Ivy Bridge-E: the new Xeon E5s will have up to 12 cores, up from 8 in the Sandy Bridge E5s.
 

NTMBK

Lifer
Nov 14, 2011
10,461
5,846
136
I don't understand why people are referring to mobiles, servers, tablets, laptops, etc. in a high-end desktop CPU topic.

Yes, others may need performance/watt or lower power consumption, but WE are DESKTOP users and this topic is about the Core i7 4770K, the highest-end socket 1150 Intel CPU, selling at $300+.

It's the same cores, that's why. Tablets, laptops, servers, desktops. All the same core design. Intel's design priorities for the core are not dictated by the tiny enthusiast desktop market.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Intel are very far from incapable, it just isn't their priority right now.

This is more of the typical mantra: Intel is going to...

Haswell is here after so much speculation and the result is disappointing.

http://www.extremetech.com/computin...rmance-boosts-mixed-with-odd-design-decisions

Haswell is another Intel (product). They always promise, and never deliver. It is not about performance in a synthetic (benchmark); it is (that) they don't support essential features we need, and (we) don't need to worry about compatibility of (our) code: Fusion, Radeon and GeForce have it all covered, and if we have issues they resolve them or work (with us). Forget it, they live in a mobile world right now.

http://www.brightsideofnews.com/new...u-results-show-disappointing-performance.aspx

In a couple of months we will read, on lots of sites, about how good Haswell's successor will be...
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
It's the same cores, that's why. Tablets, laptops, servers, desktops. All the same core design. Intel's design priorities for the core are not dictated by the tiny enthusiast desktop market.

They may be the same design but they don't have the same performance.

We don't care about 17W CPUs in high-end desktops even though they are the same cores. We want high-end products (CPUs) at 130W for the same price we had them at 3-4 years ago, like the Core i7 920 at $300.

Why not have a standard 130W desktop part at $300 with every CPU generation? I want Intel to introduce the 130W $300 products at the same time they introduce the 77W products.

Is that unreasonable? We had the Core i7 920 before the 1156 socket.
 

NTMBK

Lifer
Nov 14, 2011
10,461
5,846
136
This is more of the typical mantra: Intel is going to...

Haswell is here after so much speculation and the result is disappointing.

http://www.extremetech.com/computin...rmance-boosts-mixed-with-odd-design-decisions



http://www.brightsideofnews.com/new...u-results-show-disappointing-performance.aspx

In a couple of months we will read, on lots of sites, about how good Haswell's successor will be...

Haswell is good, just not for desktop. Desktop isn't their priority. They're integrating more parts (VRMs), producing a true SoC for ultrabooks/tablets, and improving the graphics massively: all important in the low-power market. But they're also beefing up the computation engine with AVX2, FMA3, gather instructions, etc., which will go down very, very well with the HPC crowd. But enthusiast desktop? Intel aren't prioritising it, so it basically gets shafted.
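
To make that concrete, here is a minimal illustrative sketch (my own example, not Intel sample code or anything from the linked articles) of the two instruction families in question, using the standard AVX2/FMA intrinsics from immintrin.h:

/* Illustrative only: FMA3 fused multiply-add and AVX2 gather on Haswell.
 * Build with: gcc -O2 -mavx2 -mfma fma_gather.c
 */
#include <immintrin.h>
#include <stdio.h>

int main(void)
{
    float a[8]   = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8]   = {8, 7, 6, 5, 4, 3, 2, 1};
    float c[8]   = {1, 1, 1, 1, 1, 1, 1, 1};
    float table[16];
    int   idx[8] = {0, 2, 4, 6, 8, 10, 12, 14};
    float out[8];

    for (int i = 0; i < 16; i++)
        table[i] = (float)(i * 10);

    /* FMA3: a*b + c on eight floats in a single fused instruction */
    __m256 va  = _mm256_loadu_ps(a);
    __m256 vb  = _mm256_loadu_ps(b);
    __m256 vc  = _mm256_loadu_ps(c);
    __m256 fma = _mm256_fmadd_ps(va, vb, vc);
    _mm256_storeu_ps(out, fma);
    printf("fma:    %.0f %.0f ... %.0f\n", out[0], out[1], out[7]); /* 9 15 ... 9 */

    /* AVX2 gather: eight loads from scattered indices in one instruction */
    __m256i vidx     = _mm256_loadu_si256((const __m256i *)idx);
    __m256  gathered = _mm256_i32gather_ps(table, vidx, 4); /* scale = sizeof(float) */
    _mm256_storeu_ps(out, gathered);
    printf("gather: %.0f %.0f ... %.0f\n", out[0], out[1], out[7]); /* 0 20 ... 140 */

    return 0;
}

Of course, software only sees these gains once it is rebuilt with -mavx2/-mfma (or uses runtime dispatch), which is part of why the HPC crowd benefits long before typical desktop software does.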
 

NTMBK

Lifer
Nov 14, 2011
10,461
5,846
136
They may be the same design but they don't have the same performance.

We don't care about 17W CPUs in high-end desktops even though they are the same cores. We want high-end products (CPUs) at 130W for the same price we had them at 3-4 years ago, like the Core i7 920 at $300.

Why not have a standard 130W desktop part at $300 with every CPU generation? I want Intel to introduce the 130W $300 products at the same time they introduce the 77W products.

Is that unreasonable? We had the Core i7 920 before the 1156 socket.

The 3930k is there if you really want that 130W desktop chip.

And yes, the low end matters. The reason Intel can hit those 17W models is because they have prioritised efficiency in the inherent core design. This is why their 130W performance part is 6 cores, instead of 4 cores.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
The 3930k is there if you really want that 130W desktop chip.

And yes, the low end matters. The reason Intel can hit those 17W models is because they have prioritised efficiency in the inherent core design. This is why their 130W performance part is 6 cores, instead of 4 cores.

The 3930K is now like two generations old, and its price is a far cry from $300.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
The 3930k is there if you really want that 130W desktop chip.

The 3930K is not a $300 product; the 3820 is.

And yes, the low end matters. The reason Intel can hit those 17W models is because they have prioritised efficiency in the inherent core design. This is why their 130W performance part is 6 cores, instead of 4 cores.

Those 6-core 130W parts at 32nm would be 90W at 22nm. I want the same 130W with every CPU release. We could even have an 8-core at 130W at 22nm.
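
Rough arithmetic behind that (my own estimate, taking the roughly 0.7x shrink factor implied above at face value rather than any official Intel figure):

$$130\,\mathrm{W} \times 0.7 \approx 91\,\mathrm{W}, \qquad 6 \text{ cores} \times \frac{130\,\mathrm{W}}{91\,\mathrm{W}} \approx 8.6,$$

i.e. room for roughly 8 of those cores in the same 130W budget.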
 

MisterMac

Senior member
Sep 16, 2011
777
0
0
Well those OC results look extremely disappointing.

We knew it'd be mobile-focused, and we knew gains in workloads not using the new pipeline features or ISA extensions would be minimal-ish, but still decent.


...but I sure as hell hoped that, like 32nm did, 22nm would mature into a monster overclocker.

That'd be enough to justify an SB upgrade for me, but this looks like it'll OC less than Ivy?

Ouch, ouch ouch :C
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
This is more of the typical mantra: Intel is going to...

I'd estimate that about 80% of your posts in this forum consist of "AMD is going to..." type mantra, so you posting what you did above is just forehead-smacking irony.
 

Rvenger

Elite Member, Super Moderator, Video Cards
Apr 6, 2004
6,283
5
81
Is the verdict here that I should hold onto my 3770K because it may be a sidegrade at best?