[THG] Core i7-4770K: Haswell's Performance, Previewed


tipoo

Senior member
Oct 4, 2012
245
7
81
CPU-wise it looks almost like a tick instead of the tock it was supposed to be. Pretty minor changes in overall CPU performance.

GPU-wise, I wonder how much further the GT3e with eDRAM would push performance?
 

Roadrunners

Junior Member
Jan 7, 2013
10
0
61
CPU-wise it looks almost like a tick instead of the tock it was supposed to be. Pretty minor changes in overall CPU performance.

GPU-wise, I wonder how much further the GT3e with eDRAM would push performance?

The lack of competition is slowing the CPU market progression right down.
 

tipoo

Senior member
Oct 4, 2012
245
7
81
The lack of competition is slowing the CPU market progression right down.

I'm not sure I completely buy that; Intel still has to convince owners of previous Intel platforms to buy the new ones. Regardless of what AMD is doing, Intel will want to keep pushing performance forward, otherwise IT departments and better-informed consumers will stretch their upgrade cycles even further, which means less revenue for Intel.

Generations with sideways rather than upward performance changes have happened before; I think this is just one of those. Perhaps Intel put so much emphasis on this being a great mobile architecture that little was left for the enthusiast segment.

And to be fair, some future workloads will run faster on Haswell than these results suggest, since the gains from the new extensions are bigger than the general-purpose improvements.

It is disappointing though. All that waiting and wondering and it turns out again to be nothing an IVB or even Sandy Bridge owner should envy.
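To put some shape on the "new extensions" point, here's a minimal sketch (my own illustration, nothing from the preview) of the kind of loop AVX2 speeds up -- AVX on SNB/IVB only gives you 256-bit floats, while Haswell's AVX2 extends the 256-bit vectors to integer math:

Code:
/* Minimal AVX2 sketch: add two int32 arrays 8 elements at a time.
 * Hypothetical build line: gcc -std=c99 -O2 -mavx2 avx2_add.c */
#include <immintrin.h>
#include <stdio.h>

static void add_i32(const int *a, const int *b, int *out, int n)
{
    int i = 0;
    for (; i + 8 <= n; i += 8) {                      /* 8 ints per 256-bit op */
        __m256i va = _mm256_loadu_si256((const __m256i *)(a + i));
        __m256i vb = _mm256_loadu_si256((const __m256i *)(b + i));
        _mm256_storeu_si256((__m256i *)(out + i), _mm256_add_epi32(va, vb));
    }
    for (; i < n; i++)                                /* scalar tail */
        out[i] = a[i] + b[i];
}

int main(void)
{
    int a[16], b[16], c[16];
    for (int i = 0; i < 16; i++) { a[i] = i; b[i] = 2 * i; }
    add_i32(a, b, c, 16);
    printf("c[15] = %d\n", c[15]);                    /* expect 45 */
    return 0;
}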
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,695
136
They "featurize" through blown fuses. All i can make from Intels decision disabling TSX on K models is simple greed, they dont want TSX users to buy cheap desktop unlocked chips to get the extra performance for free but want them to move to 6cores up in the 500-1000$ price range, they keep AVX2 in the entire Core lineup since its essential for the new extension to be supported and further distance them from AMD but they segment on TSX/VT-d, seems they think that TSX is a more business/server oriented feature rathen than something useful and essential for the home desktop.

Quite right. And can we please get AVX2 on the inevitable Celeron/Pentium variety of Haswell...? (asking very nicely :p)

I mean, AMD already has AVX support in the ultra-low-end A4-5300. The more chips out there that support AVX, the more likely developers are to support it. (Hopefully.)
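And until the whole lineup has it, developers end up writing a runtime check and keeping a fallback path around, roughly like this (my own sketch -- kernel_avx2/kernel_scalar are made-up stand-ins, and __builtin_cpu_supports needs GCC 4.8 or newer):

Code:
/* Runtime dispatch sketch: only take the AVX2 path if the CPU has it. */
#include <stdio.h>

static void kernel_scalar(void) { puts("scalar path"); }   /* stand-in */
static void kernel_avx2(void)   { puts("AVX2 path"); }     /* stand-in */

int main(void)
{
    __builtin_cpu_init();                 /* populate the CPU feature cache */
    if (__builtin_cpu_supports("avx2"))
        kernel_avx2();                    /* Haswell and later */
    else
        kernel_scalar();                  /* anything without AVX2 */
    return 0;
}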
 

mikk

Diamond Member
May 15, 2012
4,300
2,383
136
The lack of competition is slowing the CPU market progression right down.


Not the only reason, imo. The market has shifted towards mobile: notebooks, tablets, smartphones. The key in that market is power efficiency, and Haswell was made with power efficiency in mind. Performance increases also aren't that easy anymore, at least in IPC and frequency... but they could at least develop a 6-core mainstream CPU. That would really only suit desktops rather than notebooks, so it's probably not lucrative for Intel. Maybe in 2-3 years with Skylake.
 

videogames101

Diamond Member
Aug 24, 2005
6,783
27
91
Who's waiting for software? GCC 4.7 supports AVX2, and I know one of my friends will be recompiling his Gentoo installation when these chips are released.

You should be able to recompile most open-source software with AVX2 instructions on day 0.
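For Gentoo it's basically just a CFLAGS change rather than touching any code -- a rough sketch under that assumption (GCC 4.7 calls the Haswell target -march=core-avx2):

Code:
/* Auto-vectorization sketch: no intrinsics, just compiler flags.
 * Hypothetical build line: gcc -std=c99 -O3 -march=core-avx2 saxpy.c
 * (in make.conf terms: add -march=core-avx2 to CFLAGS and rebuild).
 * GCC's vectorizer can then emit AVX2/FMA code for plain loops like this. */
#include <stdio.h>
#include <stddef.h>

void saxpy(float a, const float * restrict x, float * restrict y, size_t n)
{
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];           /* one FMA per element on Haswell */
}

int main(void)
{
    float x[8], y[8];
    for (int i = 0; i < 8; i++) { x[i] = (float)i; y[i] = 1.0f; }
    saxpy(2.0f, x, y, 8);
    printf("y[7] = %.1f\n", y[7]);        /* expect 15.0 */
    return 0;
}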
 

Hulk

Diamond Member
Oct 9, 1999
5,143
3,742
136
Should be a decent performance upgrade from my 2500K in the applications where I actually need the speed, namely video apps. That, coupled with the fact that this could be the last non-BGA chip, means I'll most likely upgrade. If I had originally gone with a 2600K the decision would be harder.

While I have to say it's a little disappointing that we're not seeing enormous IPC increases, I think we also need to temper that feeling with reality. I think many of us are basing our expectations for every "tock" on the Core 2 Duo release, when we saw huge IPC gains over NetBurst. That was nearly 100% in some applications for me. But besides NetBurst being a very inefficient dead end for Intel, they also had the added pressure of actually being behind AMD at the time. It was a perfect storm for them to blow us away with C2D.

Fast forward 7 years and they have been steadily improving that same C2D architecture. It's getting hard to make giant IPC strides, not to mention that there is basically no x86 competition for them at the high end. I guess they could double the length of the tick-tock cycle and we'd see 10-20% IPC increases, but then we'd be asking, "is that all after 3, or 6, years of development?"

Outside of applications using the new instructions I think it's unrealistic to expect 20% performance gains. It seems like 5-10% will be the norm, with more emphasis on power and the GPU for the time being.

Of course if Haswell turns out to be a great overclocker then I bet that most people will get on board despite modest IPC gains.

Personally I'm very interested in GPU compute on Haswell. I use Sony Vegas Pro quite a bit, and it currently uses CUDA acceleration; I'm wondering whether it will be able to use Haswell's GPU, and whether that GPU is strong enough compute-wise to actually be helpful in Vegas.

I'm not familiar with NVIDIA and AMD discrete cards, I'm not a gamer. Can anyone guess at the relative compute performance of the various Haswell GPUs compared to NVIDIA and AMD discrete cards?
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Not that great for desktops, but amazing for mobile.

IGP improvements are stunning: more than a 30% gain in GPU performance.

20% gain in Hitman at 768p, 50% gain in Dirt 3 at 768p, 28% gain in Skyrim at 768p, 16% gain in WoW.

And that's with pre-release drivers and less memory bandwidth than the Ivy Bridge HD 4000 (even the HD 3000 shows a small gain from extra bandwidth, so the roughly 20% bandwidth deficit the Haswell pre-production sample is working with is hurting it quite badly). I wouldn't be surprised to see an additional 10-20% gain (probably more) with better drivers and the memory bandwidth issue fixed.

Unlike Ivy Bridge, all desktop i5s and i7s get the HD 4600. If the i3s also get the HD 4600, AMD is going to be in a rougher position than they are now (considering the A10 is only about 10-20% faster as of now, and that lead will be smaller by Haswell's release). The A10 will still win on value, but it won't sweep the table the way it does against the current i3 chips, many of which have the HD 2500.

For mobile, the HD 4600 will, with proper drivers, beat Trinity (considering Trinity is about 20-30% faster than the HD 4000, at about 2/3 of the desktop A10's power) and tie with Richland (really Trinity with some power optimizations and slightly higher GPU clocks, 5-7%). The GT3 (if it's 40 EUs at 800 MHz) will eat Trinity alive.
If the GT2 HD 4600 is on the i3 mobile chips as well, then AMD is going to lose the graphics edge on their APUs. And if they get full GT2 performance at ULV (17 W) levels, AMD is going to be left in the dust for ultraportable GPU performance.
 

CurrentlyPissed

Senior member
Feb 14, 2013
660
10
81
Very glad I purchased the 3770K instead of waiting. Phew. I really hope there are improvements still to come. That's terrible. Seems like 90% of the focus was on the iGPU.
 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
*looks at his E8400*... can't wait! :D
I just want to know one little thing before I fork over my money: does it heat up like IVB with OC and extra voltage???

And when will we see a decent implementation of iGPU+dGPU working together and actually improving performance?
 

Sherlockwing

Member
Aug 11, 2012
38
0
0
8% improvement, with a 9% increase in TDP. Meh. Should be nice for certain workloads once AVX2 becomes common, but that won't be for a while.

Disable the iGPU and the overall motherboard+CPU power draw should stay about the same between Haswell and Ivy; remember that part of the TDP increase is the on-die VRM, which means the on-motherboard VRM power loss is correspondingly lower.
 

grimpr

Golden Member
Aug 21, 2007
1,095
7
81
Quite right. And can we please get AVX2 on the inevitable Celeron/Pentium variety of Haswell...? (asking very nicely :p)

I mean, AMD already has AVX support in the ultra-low-end A4-5300. The more chips out there that support AVX, the more likely developers are to support it. (Hopefully.)

Celerons and Pentiums are out of the question; AVX2 is a "feature" for the Core lineup and gets disabled on the bottom of the line. Core i3s will get AVX2, but I think not TSX...
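For reference, the TSX they're fusing off is the lock-elision (RTM) stuff, which looks roughly like this -- my own sketch from the published intrinsics, not anything from the article; real code also has to check CPUID for RTM first, since _xbegin() faults on parts without TSX:

Code:
/* Rough TSX/RTM sketch: try a hardware transaction, fall back to a lock.
 * Hypothetical build line: gcc -std=c99 -O2 -mrtm tsx_sketch.c */
#include <immintrin.h>
#include <stdio.h>

static long counter = 0;
static volatile int fallback_lock = 0;     /* flag lock for the slow path */

static void increment(void)
{
    unsigned status = _xbegin();           /* start a hardware transaction */
    if (status == _XBEGIN_STARTED) {
        if (fallback_lock)                 /* someone holds the real lock... */
            _xabort(0xff);                 /* ...so abort and take it too */
        counter++;                         /* speculative update, lock-free */
        _xend();                           /* commit */
    } else {                               /* transaction aborted: locked path */
        while (__sync_lock_test_and_set(&fallback_lock, 1))
            ;                              /* spin until the lock is free */
        counter++;
        __sync_lock_release(&fallback_lock);
    }
}

int main(void)
{
    for (int i = 0; i < 1000; i++)
        increment();
    printf("counter = %ld\n", counter);    /* expect 1000 */
    return 0;
}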
 

grimpr

Golden Member
Aug 21, 2007
1,095
7
81
*looks at his E8400*... can't wait! :D
I just want to know one little thing before I fork over my money: does it heat up like IVB with OC and extra voltage???

And when will we see a decent implementation of iGPU+dGPU working together and actually improving performance?

I doubt it will get down to Sandy Bridge thermals, but I believe it will be better than IB.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
Overall pretty much what was expected, but disappointing. And it is stupid of Tom's to keep comparing it to a six-core. A very efficient quad-core can compete with a less efficient CPU that has more cores, but not with a six-core built on basically the same efficient architecture.

Come on Intel, give us a clockspeed bump and six cores in the mainstream.

And as far as the new instruction sets go, who knows whether they will ever be widely useful. "Our CPU is great, the software just hasn't caught up yet" -- we've heard that one before. Just make it better for current software.

Maybe final silicon will be slightly better, but not holding out much hope.

Ivy Bridge-E will come out with 6 to 12 cores, the latter on a Xeon. gl
 

Unoid

Senior member
Dec 20, 2012
461
0
76
Mr. 2600K, looks like we'll keep chugging along together at 4.8GHz for a while longer.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
I doubt it will get down to Sandy Bridge thermals, but I believe it will be better than IB.

Let's hope.
I couldn't care less about buying one for my desktop; my 3570K is the last quad-core I'm buying. I am very excited to have this in my work laptop, though. Time to retire the hot 1st-gen i3.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
Mr. 2600K, looks like we'll keep chugging along together at 4.8GHz for a while longer.

No reason to, unless this review is bunk.

I am guessing here for IPC: Sandy @ 4.8GHz = Ivy @ 4.5GHz = Haswell @ 4.2GHz, i.e. roughly a 6-7% IPC gain per generation.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
For mobile, the HD 4600 will, with proper drivers, beat Trinity (considering Trinity is about 20-30% faster than the HD 4000, at about 2/3 of the desktop A10's power) and tie with Richland (really Trinity with some power optimizations and slightly higher GPU clocks, 5-7%). The GT3 (if it's 40 EUs at 800 MHz) will eat Trinity alive.
If the GT2 HD 4600 is on the i3 mobile chips as well, then AMD is going to lose the graphics edge on their APUs. And if they get full GT2 performance at ULV (17 W) levels, AMD is going to be left in the dust for ultraportable GPU performance.

Richland is supposed to bring a 20-40% increase in GPU performance over Trinity :p

But GT3e might eat it for lunch, maybe even Kaveri... and many low-end dGPUs.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
I have a system with an old E6400 that I will be putting a 4570K/mobo in, but as a high-end gaming desktop part this is not exciting compared with what is out there right now.

As far as modern games that support multi-threading go, SB-E is still going to be king of the hill until we see IB-E.
 

Ventanni

Golden Member
Jul 25, 2011
1,432
142
106
For mobile, the HD 4600 will, with proper drivers, beat Trinity (considering Trinity is about 20-30% faster than the HD 4000, at about 2/3 of the desktop A10's power) and tie with Richland (really Trinity with some power optimizations and slightly higher GPU clocks, 5-7%). The GT3 (if it's 40 EUs at 800 MHz) will eat Trinity alive.
If the GT2 HD 4600 is on the i3 mobile chips as well, then AMD is going to lose the graphics edge on their APUs. And if they get full GT2 performance at ULV (17 W) levels, AMD is going to be left in the dust for ultraportable GPU performance.

Don't get too excited just yet. The HD 4000 in the low-wattage IB mobile parts is just as TDP-limited as the Trinity parts, so you should also expect Haswell's desktop GT2 performance to be a bit faster than its mobile counterpart.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
Not that great for desktops, but amazing for mobile.

IGP improvements are stunning: more than a 30% gain in GPU performance.

20% gain in Hitman at 768p, 50% gain in Dirt 3 at 768p, 28% gain in Skyrim at 768p, 16% gain in WoW.
It still won't play these games well though...

Unlike Ivy Bridge, all desktop i5s and i7s get the HD 4600. If the i3s also get the HD 4600, AMD is going to be in a rougher position than they are now (considering the A10 is only about 10-20% faster as of now, and that lead will be smaller by Haswell's release). The A10 will still win on value, but it won't sweep the table the way it does against the current i3 chips, many of which have the HD 2500.

For mobile, the HD 4600 will, with proper drivers, beat Trinity (considering Trinity is about 20-30% faster than the HD 4000, at about 2/3 of the desktop A10's power) and tie with Richland (really Trinity with some power optimizations and slightly higher GPU clocks, 5-7%). The GT3 (if it's 40 EUs at 800 MHz) will eat Trinity alive.
If the GT2 HD 4600 is on the i3 mobile chips as well, then AMD is going to lose the graphics edge on their APUs. And if they get full GT2 performance at ULV (17 W) levels, AMD is going to be left in the dust for ultraportable GPU performance.
All these comparisons are only true for selected mainstream games. As soon as you leave mainstream, you hit walls like this one:
http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review/17
And that has only gotten worse; current Minecraft 1.5 will not run above 15 fps on an i7-3517U, even with OptiFine and everything tuned for speed. That's ridiculous considering that Trinity will run it fine even with Sonic shaders.

Granted, I ranted about a single game only, but I'm really disappointed so far. Pick something not covered by the media and performance is a gamble at best. Also, while you can squeeze another 10-30% out of Trinity with a slight OC, you're dead in the water if a game doesn't run well on your IB processor.
 