[THG] Core i7-4770K: Haswell's Performance, Previewed


Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,695
4,657
75
I'm kind of surprised everyone is so down on Haswell. This is pretty close to the performance I expected: 0% improvement in existing FPU-intensive apps, ~10% improvement in single-threaded integer apps, and ~20% improvement in integer apps with Hyper-Threading. Actually, I had hoped for more improvement with HT - maybe it's bandwidth-limited, which seems to be the cause of some of the other low numbers.

A lot of apps will require recompiling - though in many cases not redesigning - to take advantage of Haswell's AVX2 and FMA. Actually, I wouldn't be surprised to see a tool come out that converts multiply-add pairs in existing code into FMAs.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
I'll probably still end up picking one of these up unless Steamroller is a very large improvement on Piledriver...

Well, Steamroller will probably be a huge improvement on Piledriver (edit: did that even make sense? :)). But Intel already has plenty of room for more GHz or more cores, and by the time Steamroller hits the market Intel will already be at the next step. What Steamroller can give us is perhaps a little bit of competition - like the AMD K6 LOL
 
Last edited:

Piroko

Senior member
Jan 10, 2013
905
79
91
First, this is even with pre-beta drivers (look at the 7970 at release vs. now, and then consider that an engineering sample will score even lower) and with reduced bandwidth (about 19% less than IB, which will make a small but noticeable difference).
Your point being? HD 4600 isn't a new architecture; HD 7000 was. Also, Intel's driver updates have historically taken a long time to release and did little for performance.


From Notebookcheck on Trinity's TDP: without the GPU enabled, the chip can't even run at its rated clock speed.
Being cynical here: at least it does hit its rated speed for a short time. That is more than what I can report.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Your point being? HD 4600 isn't a new architecture; HD 7000 was. Also, Intel's driver updates have historically taken a long time to release and did little for performance.
There's still a revision that's taken place, and now that there are more resources, there is room to change the drivers to take advantage of the larger unit count. And there are other variables that are introduced with Haswell, like the decoupled L3 and eDRAM. There's absolutely room for improvement and optimization.

Whether they've already made those improvements is an entirely different question.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Haswell GT2 has 20 EUs, 25% more than Ivy Bridge GT2's 16 EUs. On the desktop they don't have a chance against Trinity, not to mention Richland.

Also, the Core i3 Haswell parts will not be released until the end of 2013, close to Kaveri's release.

Mobile will be very interesting, with Haswell GT3 and GT3 + Crystalwell against Richland/Kaveri + Hybrid CrossFire.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
AFR cannot be compared to a single GPU solution. I would never recommend AFR to someone outside of the enthusiast market. It's a bad solution for the average consumer who has no idea about the problems associated with it.
 
Last edited:

Zap

Elite Member
Oct 13, 1999
22,377
7
81
CPU-wise it looks almost like a tick instead of the tock it was supposed to be. Pretty minor changes in overall CPU performance.

Wasn't Ivy Bridge more of a tick+? If it was more of a normal tick, then you can compare Haswell to Sandy Bridge in CPU performance, and see more of the droids tock you were looking for.
 

Hulk

Diamond Member
Oct 9, 1999
5,143
3,743
136
Wasn't Ivy Bridge more of a tick+? If it was more of a normal tick, then you can compare Haswell to Sandy Bridge in CPU performance, and see more of the droids tock you were looking for.


That is a good point. We've gotten used to ticks increasing IPC a bit.
 

Meekers

Member
Aug 4, 2012
156
1
76
Wasn't Ivy Bridge more of a tick+? If it was more of a normal tick, then you can compare Haswell to Sandy Bridge in CPU performance, and see more of the droids tock you were looking for.

The "+" only referred to the iGPU. It had nothing to do with overall CPU performance.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Doesn't matter what they call it. The bottom line is that over two generations there has been only about a 10% improvement in CPU performance. We used to see more than that in one "tock," plus another few percent from the "tick."

Tick/tock has pretty much become tick... tick... tick.
Only if you're silly enough to define everything by performance increase in existing applications. Thankfully for the rest of us, semiconductor firms are smarter than that.
 
Last edited:

Meekers

Member
Aug 4, 2012
156
1
76


Here are a bunch of sources that disagree with you.

Ivy Bridge is a "tick," taking the 32 nm "Sandy Bridge" architecture and scaling it down to 22 nm. But Ivy Bridge goes further than past ticks—it includes an extensively improved GPU architecture, leading Intel graphics architect Tom Piazza to describe it as a "tick plus."
HD Graphics 4000: The Plus In Intel's Tick+ said:
The Intel fellow who unveiled Ivy Bridge's graphics subsystem at last year's IDF made the case that even though we're looking at a die shrink - a tick in the company's cadence, by definition - the integrated GPU is more accurately characterized as a tock.

Meanwhile, Intel graphics architect Tom Piazza isn't content to call Ivy Bridge a "tick" at all. He calls it a "tick+" because the graphics architecture has been extensively overhauled, more along the lines of what happens with a "tock" refresh on the CPU side.
The company calls Ivy Bridge a “tick plus” because the chip brings significantly better graphics and new power-management capabilities.
Where the CPU component follows perfectly on the tick tock Intel development model, with the simpler die-shrink tick following the more in-depth architectural revamp tock, the GPU part is being characterised as a 'tick plus'.
You are, no doubt, quite familiar with Intel's CPU-release "cadence" of tick-tock by now. If not, the short story is that every tock brings a major breakthrough, while ticks are decent upgrades but nothing to Twitter home about. That's not necessarily the case with Intel's latest tick, the Ivy Bridge CPU. Sure, the performance enhancements on the x86 side of the aisle won't exactly knock you on your tuchus, but they're still decent. The upgrades to the graphics core, however, make Ivy Bridge more noteworthy.
But the bigger step-up is in the integrated graphics processor, which makes a stronger case for doing without a discrete graphics chip and relying solely on this facility for onscreen action. Inspired by the larger graphics improvement over Sandy Bridge, Intel refers to this upgrade as ‘Tick-Plus’.
Mooly called the Ivy Bridge a "Tick Plus", not just the regular die shrink from the Intel's well known tick-tock cadence. The graphics performance is significantly increased, and for the first time Intel will be compatible with Microsoft's DirectX 11 API.

I think I have found enough sources. Let me know if you want more; I am only 30 results into my Google search.
 
Last edited:

Piroko

Senior member
Jan 10, 2013
905
79
91
There's still a revision that's taken place, and now that there are more resources, there is room to change the drivers to take advantage of the larger unit count. And there are other variables that are introduced with Haswell, like the decoupled L3 and eDRAM. There's absolutely room for improvement and optimization.
Ever since GMA was released there has been lots of room for improvement. The X3000 (? can't remember) was supposed to take off with the more efficient DX10. The 4500 should have fixed that, but never did. The original HD Graphics - prematurely crowned as a discrete killer - chugged along with flaky hardware acceleration, terrible image quality and, compared to actual in-game fps, overblown benchmark scores.

I can't remember a single bug that was fixed after the respective chips were released. I'm sure some of you will now search for one to try to disprove my point, but honestly, trust has to be earned, and Intel has the worst track record of any company. Not even Matrox or 3dfx failed that hard to deliver.

--- Ranted enough for now; I'm probably picking on them harder than needed, but I am miffed that my low expectations were undercut by quite a lot. Consider that I played Minecraft, Civ5 and Sins of a Solar Empire just fine on my 780G chipset with SidePort when my old graphics card broke, and the HD 4000 should be several times faster according to benches...
 
Last edited:

Ventanni

Golden Member
Jul 25, 2011
1,432
142
106
If you're going to consider GPU performance, you're absolutely going to have to consider the company's track record in delivering quality drivers.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Here are a bunch of sources that disagree with you.
"Tick-tock" is the shorthand Intel uses to describe the cadence on its product roadmap that delivers chips with smaller transistors, or "ticks," and chips with major architectural changes, or "tocks," in alternating years. Ivy Bridge chips—the first Intel will produce using its next-generation 22-nanometer process—are a "tick," but they'll also include some significant "tock"-like design improvements, so the chip giant is calling them a "tick+."
http://www.pcmag.com/article2/0,2817,2392932,00.asp

But yes, it appears that you're right. Seems like a few of the tech sites got it wrong.
If you're going to consider GPU performance, you're absolutely going to have to consider the company's track record in delivering quality drivers.
And if you're going to involve yourself in an online discussion, it is important that you don't take things out of context.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Just curious, what do you expect from Broadwell that makes you want to wait for it?

Lower power for the same performance. With the integrated VRMs combined with the 14nm core logic, I would think the idle-to-active power will be impressively low (20W).
 

grimpr

Golden Member
Aug 21, 2007
1,095
7
81
I think IDC is talking about desktop Core i7s hitting 20W idle states at Intel's 14nm node. :)
 

beginner99

Diamond Member
Jun 2, 2009
5,318
1,763
136
Doesn't matter what they call it. The bottom line is that over two generations there has been only about a 10% improvement in CPU performance. We used to see more than that in one "tock," plus another few percent from the "tick."

Tick/tock has pretty much become tick... tick... tick.

I see your point. But consider that while CPU performance did not increase much, a whole GPU was added to the chip between Core 2 Duo and Haswell, on top of the CPU gains. Without the GPU, we could easily have 8-core mainstream CPUs. It's actually sad that there's no such thing...
 

mikk

Diamond Member
May 15, 2012
4,300
2,383
136
I can't remember a single bug that was fixed after the respective chips were released. I'm sure some of you will now search for one to try to disprove my point, but honestly, trust has to be earned, and Intel has the worst track record of any company.


No need for it, you are plain wrong.