Haswell model specs leaked


podspi

Golden Member
Jan 11, 2011
1,982
102
106
Intel's model naming has blown goat barf ever since they discovered the word "Core". It is so ridiculously overloaded at this point that nobody knows what anyone else is talking about half the time. It would be like if Ford suddenly decided to call every vehicle that rolls off the assembly line a variation of "Car" with a number on it.

Hey guys: there are lots of words in the English language. Pick a few of them and use them for model names. If you don't like any of them, you can even make up new ones!


In my opinion, Intel's naming scheme hit its peak of ridiculousness with the Core 2 Duo. The model numbers they use now are meaningless, but given the huge number of models they now field simultaneously (as opposed to letting chips trickle down the price curve), it makes sense. At any time, there are generally only a few that most enthusiasts would be interested in anyway.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Intel's model naming has blown goat barf ever since they discovered the word "Core". It is so ridiculously overloaded at this point that nobody knows what anyone else is talking about half the time. It would be like if Ford suddenly decided to call every vehicle that rolls off the assembly line a variation of "Car" with a number on it.

Hey guys: there are lots of words in the English language. Pick a few of them and use them for model names. If you don't like any of them, you can even make up new ones!
I disagree completely. I like the nomenclature. An i7 is better than an i5 which is better than an i3. The only part that gets confusing is with their mobile chips.
 

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
The i3/i5/i7 stuff isn't so bad. As podspi said, it was worse a few years ago, when we had Core Solo and Core Duo and Core 2 Duo and Core microarchitecture on top of it.

I just hate the use of the word "Core" for a brand name. It's too generic, and that makes it confusing.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
The i3/i5/i7 stuff isn't so bad. As podspi said, it was worse a few years ago, when we had Core Solo and Core Duo and Core 2 Duo and Core microarchitecture on top of it.

I just hate the use of the word "Core" for a brand name. It's too generic, and that makes it confusing.

LOL, so true, but they outdid themselves by going one step further with "Atom" :|

But "Core" is a good moniker for capturing the present era of "race for more cores" in comparison to the preceding era of "race for more MHz".
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,277
614
126
But "Core" is a good moniker for capturing the present era of "race for more cores" in comparison to the preceding era of "race for more MHz".

Except that lately we've not seen either increase much in the mainstream CPUs... :\
 

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
But "Core" is a good moniker for capturing the present era of "race for more cores" in comparison to the preceding era of "race for more MHz".

*Is* there even a "race for more cores" any more? Seems that has stagnated as well. (ETA: Fjodor said this too, I'm still waking up.)

All we have now is a "race for less power consumption" and a "race for slightly less mediocre integrated GPUs".

No wonder the CPU world is getting so dull. :)
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,277
614
126
*Is* there even a "race for more cores" any more? Seems that has stagnated as well. (ETA: Fjodor said this too, I'm still waking up.)

All we have now is a "race for less power consumption" and a "race for slightly less mediocre integrated GPUs".

No wonder the CPU world is getting so dull. :)

The question is whether there is anything on the horizon that could lead to substantial CPU performance increases in the next 5-10 years.

Or will we have to wait for quantum computers to make an entrance before things get exciting again?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Except that lately we've not seen either increase much in the mainstream CPUs... :\

*Is* there even a "race for more cores" any more? Seems that has stagnated as well. (ETA: Fjodor said this too, I'm still waking up.)

All we have now is a "race for less power consumption" and a "race for slightly less mediocre integrated GPUs".

No wonder the CPU world is getting so dull. :)

Well I was more just thinking of the generalized case of what we have seen since going sub-100nm.

90nm was all about dual-cores, 65nm was all about quads, 45nm saw AMD with hex-cores, and 32nm brought 8 cores.

Intel followed this up through 45nm if we consider the addition of HT to their quad-cores, making them 8-thread capable and so on.

And of course the core counts per socket for x86 server parts continue to increase with every node.

But I do agree, mainstream products have lately been targeting lower socket power usage more than higher thread counts per socket.

Performance/watt is definitely the dominant metric nowadays, spanning the entire gamut of compute products from TOP500 supercomputers to servers to desktops to tablets to smartphones.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
That's funny, IDC. I love some of the stuff coming up. I know Avoton is a server chip, but with eight cores it should be fun. Then Airmont supposedly equals PhI performance on the 6-core chip. I can hear myself saying, hey, that's great performance, even though we passed that point long ago. It's the CPU world turned upside down, and the only justice in this is that it was Apple that caused it all. I love that Apple is bigger and better than MS. I hate that Jobs died, but he did live to see his company where it should have always been. I hope Gates lives to see his company go bankrupt. The thief.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I hope Gates lives to see his company go bankrupt. The thief.

You can't cheat an honest man. Gates may have stolen IP but practically everyone on the planet was willing to reward him for it (to enrich themselves in the end because they wanted the enhanced productivity that came with the questionably appropriated software).
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Yeah, I understand, IDC. Sorry, but an accounting shall be his reward. That's just the way of it. Now, do I hold it against him? Not so much, after he stepped up and did the right thing. As a result, look at Apple today. That does not change accountability. He gives a lot to charity, but you cannot buy accountability. One thing that people should know about a true witness: they have accountability that is held to an even higher standard. So things are not real good for them. Many times, to know evil they have to become evil.

You can't cheat an honest man. I agree; when I meet one I will tell you about it. We cheat ourselves, IDC, all of us, by not honoring our parents and their parents. I'm going somewhere else right now. We don't honor the past. Only some things change from the past. Evolution, if that's what it is, doesn't happen overnight; it took billions of years. Some things just never change.
Every generation thinks it's better than the one before. LOL! In the world of high tech that's a good point. But high tech, when you really get down to it, is all about people, isn't it?
What if history as we know it is nothing but lies? Food for thought. The 12/21/12 thing. It's amusing how people LOL at other people. Passing judgment is not honorable. Witnessing, however, is!
Some observations; we don't have years here, just a few words.
Year 0, the year of Christ's death. Herod died in 4 BC; the Roman emperor at Christ's death came to power in 3 AD. Problem: a lie exists here. Fast forward to the mini ice age, the year 1492. The Spanish rape South America and take ancient relics back to Rome.
The Mayan calendar shows up. Back then they knew more about symbol writing than we did. Rome tried its hardest to destroy the truth through the Latin language. Jump to 1582: the pope screws with the calendar under the lie that it was to correct for Easter in relationship to the stars and spring. All lies. Xmas never happened in the Christian manner. That's a 6,000-year-old story; the only changes were that the names were altered. "None know the hour, day, or year" was the teaching. The calendar had to be altered to make the Mayan calendar meaningless. Hear the laughter in the world. I did, and I witnessed.

Real world: every year on New Year's Eve I make a prediction. Not last year, but this year I did. I called it the year of the meteors. I also made this prediction some time ago because I understood I had to. It's a matter of record here, before we knew what science is telling us now. They are calling this the year of the comets, and there is so much more, so much, and I have to stand by and witness. This is not 2013; it is in fact 2012. It is not the end either. But God almighty, we all shall desire death.
Yep, I'm crazy and off topic. But I tell you this, IDC: there is an accounting and a reckoning. There shall also be a wonderful change at the end of the day. If you bother to look, it's been 4 years since I started telling about the year coming. It would probably be better if you don't look and see. Last year I skipped the prediction, but I did account for it in my writings; I really didn't think a drought was New Year's Eve worthy. But I still told the people and got laughed at, which is fine by me, so long as at the end of that day I can laugh in joy.
 
Last edited:

Haserath

Senior member
Sep 12, 2010
793
1
81
You can't cheat an honest man. Gates may have stolen IP but practically everyone on the planet was willing to reward him for it (to enrich themselves in the end because they wanted the enhanced productivity that came with the questionably appropriated software).

Evolution gives the cheaters the upper hand. As long as other humans don't care, the cheater is rewarded.

And now we have Intel squashing AMD.
 

NTMBK

Lifer
Nov 14, 2011
10,480
5,896
136
In my opinion, Intel's naming scheme hit its peak of ridiculousness with the Core 2 Duo. The model numbers they use now are meaningless, but given the huge number of models they now field simultaneously (as opposed to letting chips trickle down the price curve), it makes sense. At any time, there are generally only a few that most enthusiasts would be interested in anyway.

Oh heck no, I wish they had stuck with Core 2 Solo/Duo/Quad. It made sense! Solo/Duo/Quad is an easy way of telling how many cores you have, and the number in the middle tells you what generation. It means the man on the street can at least quickly grasp roughly what his processor is like. Try explaining to someone that a laptop i3 and i5 are essentially the same processor, just one has a 200MHz clock bump. Unless they're from different generations, in which case the Ivy Bridge i3 will be way better than the Westmere i5. Especially when you have Sandy Bridge and Ivy Bridge on the market at the same time.

At least on the desktop side, there's the clear demarcation of Pentium = 2C2T, i3=2C4T, i5=4C4T, i7=4C8T. In mobile, anything from an i3 to an i7 is a dual core with hyperthreading, and only minor clock differences between them. Apart from the i7s with a Q buried in the obscure and consumer-unfriendly code number. Seriously, try explaining to someone that the i3-3120M will outperform an i7-2617M.
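
Just to make that concrete, here's a rough sketch (my own toy mapping, using only the tier names and counts mentioned above, not anything official from Intel) of how cleanly the desktop tiers map to cores/threads, and why the same trick doesn't work for mobile:

Code:
# Rough sketch of the desktop demarcation described above.
# Tier names and counts come from this thread, not from official Intel data.
DESKTOP_TIERS = {
    "Pentium": (2, 2),  # 2C2T
    "i3": (2, 4),       # 2C4T (Hyper-Threading)
    "i5": (4, 4),       # 4C4T
    "i7": (4, 8),       # 4C8T (Hyper-Threading)
}

def describe(tier):
    cores, threads = DESKTOP_TIERS[tier]
    return "Desktop %s: %d cores / %d threads" % (tier, cores, threads)

for tier in ("Pentium", "i3", "i5", "i7"):
    print(describe(tier))

# No such table exists for mobile: most mobile i3/i5/i7 parts of this era
# are 2C4T and differ mainly in clocks, so the tier name alone tells you
# very little about actual performance.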
 

coffeejunkee

Golden Member
Jul 31, 2010
1,153
0
0
At least on the desktop side, there's the clear demarcation of Pentium = 2C2T, i3=2C4T, i5=4C4T, i7=4C8T. In mobile, anything from an i3 to an i7 is a dual core with hyperthreading, and only minor clock differences between them. Apart from the i7s with a Q buried in the obscure and consumer-unfriendly code number. Seriously, try explaining to someone that the i3-3120M will outperform an i7-2617M.

You forgot about the i5 3470T (and the i5 6xx line before that).

The mobile line-up is a bit more complicated for sure, but an easy way to explain that one is: the i3 is newer.
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,739
156
106
A guy over at overclock.net just wrote up this nice blog on the FIVR:
FIVR BLOG
I thought this was a nice read, and more detailed than any others I've seen so far.
 
Last edited:

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
Except that lately we've not seen either increase much in the mainstream CPUs... :\

Because they are too busy cramming terrible GPUs and other shit we don't want into them.

What happened to their 2% more performance for every 1% of die space rule?

I think a 6-core Haswell with no shitty GPU would just slightly outperform a 4-core Haswell with one :p
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
What happened to their 2% more performance for every 1% of die space rule?

I think you misread the rule somewhere. It's a 2% performance for 1% power rule. The point is to keep power consumption in check and, even better, on a dropping trend.
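
As a rough illustration (my own sketch of how the rule is usually described, not anything out of Intel's documentation), you can think of it as a simple gate on proposed features: a feature only goes in if it buys at least twice as much performance as the power it costs.

Code:
def passes_2_for_1_rule(perf_gain_pct, power_cost_pct):
    # A proposed feature is accepted only if it delivers at least 2% more
    # performance for every 1% of extra power it burns.
    return perf_gain_pct >= 2.0 * power_cost_pct

# Hypothetical numbers, purely for illustration:
print(passes_2_for_1_rule(5.0, 2.0))  # True: 5% perf for 2% power clears the bar
print(passes_2_for_1_rule(3.0, 2.0))  # False: only 1.5% perf per 1% power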
 

NTMBK

Lifer
Nov 14, 2011
10,480
5,896
136
Because they are too busy cramming terrible GPUs and other shit we don't want into them.

What happened to their 2% more performance for every 1% of die space rule?

I think a 6-core Haswell with no shitty GPU would just slightly outperform a 4-core Haswell with one :p

They managed to finally get acceptable performance from integrated graphics, used by the majority of PCs and vitally important in laptops to avoid wasting battery life and space on a discrete GPU, and you think that people don't want that? :rolleyes:
 

tommo123

Platinum Member
Sep 25, 2005
2,617
48
91
Indeed. If they can get acceptable gaming performance (720p, min 30 fps at not-crappy settings) then I'll take the die space loss. I'll stick in a discrete card anyway, but if that dies then I'll still have a working PC and can still game (albeit at lower performance).

Hell, right now I still haven't put my card into my PC and am running off the Intel iGPU. I don't game yet, but for everything bar one flaw it's no different than having my 6970 in there. That one flaw being it doesn't like stopping and resuming videos in MPC-HC. It's fine with my ATI card though :eek:
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Anything official on overclocking yet? Sitting on a 2500K @ 5.0GHz, I feel like I won't be upgrading. However, if the IGP turns out to be competitive with my 6550m, I might consider upgrading my laptop.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Anything official on overclocking yet? Sitting on a 2500K @ 5.0GHz, I feel like I won't be upgrading. However, if the IGP turns out to be competitive with my 6550m, I might consider upgrading my laptop.

If you are sitting with a 5GHz 2500k now then it is unlikely you'll find much performance benefit to come from upgrading for another 4 years or so.

In late 2006 I bought a QX6700 (the first quad-core chip on the market) and, using vapor phase cooling, I OC'ed it to 4GHz. What I noticed was that because of the OC'ing there wasn't anything worth upgrading to until 2011, when Intel rolled out Sandy Bridge (the chip you have now); going to a 5GHz 2600K was the first upgrade that made sense after having a 4GHz QX6700.

So I suspect a similar situation will hold for you too (and for me, sitting with my 5GHz 2600K, it'll be years before I will benefit much from upgrading)...now performance/watt is a whole different story of course.

If you are looking to reduce your wattage while hitting roughly the same performance then stepping up to Haswell will do that for you (or the 14nm stuff). Otherwise you should be looking to the 10nm chips before thinking you'll see much that is going to really best your 5GHz 2500K IMO.