Haswell model specs leaked

Page 14 - AnandTech forums

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Oh, I didn't know you were staying with the 2600K over the 3770K; that makes me feel even more sound in my decision. :cool:

Truth be told, I actually keep the 2500K at 4.8GHz most of the time because it only needs a smidgen of voltage to get there, but for most of the benchmarking I do I keep it at 5.0GHz because it's a nice round number. Really, I haven't found any application that taxes the CPU at 4.8GHz, never mind 5.0GHz. I had an i5-750 in between, but I agree that I probably could have kept my Q6600 @ 3.6GHz from circa 2007 until now and still have had a pleasant gaming experience today.

Performance/watt interests me, but since my entire system already idles at 75W from the wall, I'm not sure how much more benefit I'll get. On the flip side, it doesn't seem like I'll be able to squeeze much more performance out of the high end due to the silicon limitations we're already seeing with Ivy Bridge (hopefully that changes). While higher efficiency under load is desirable, most of my load comes from running my GPU @ 100% mining while my CPU idles.

I think the most tangible benefit for me will come from the new mobile processors. However, Intel is in a tough spot: my Acer 3820TG 13.3" notebook, at just under 4 lbs., still gets 8+ hrs. of usable battery life, its 400-shader 6550m plays BF3 and other modern games easily, and the Sandy Bridge chip can overclock to 3.4GHz+. I'd only upgrade to something with similar battery life and performance that is lighter and comes with an IPS screen in the same price range (<$800). I'm not sure if my notebook was way ahead of its time, or if tech has stagnated so much that it's a tough act to follow. Either way, Intel will have to work for my dollar.

Thanks for the info. :thumbsup:
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Oh, I didn't know you were staying with the 2600K over the 3770K; that makes me feel even more sound in my decision. :cool:

I'm replacing an aging fleet of five 3.3GHz Q6600's. So I did go with a 3770k, two of them in fact, as well as an FX-8350 and the 2600k, but none of them are necessarily "over" the others as they all kinda perform roughly within the same class in comparison to the Q6600's they are replacing.

But I think I'm done replacing chips for a while now. I suspect the now modernized fleet will last me until the 10nm generation. I skipped all things 45nm and will probably skip all things 14nm.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
They managed to finally get acceptable performance from integrated graphics, used by the majority of PCs and vitally important in laptops to avoid wasting battery life and space on a discrete GPU, and you think that people don't want that? :rolleyes:

Not for performance parts.

It makes sense on the low end and for mobile. It does not make sense on a 3770K.

Also, acceptable is a matter of opinion.
 
Last edited:

Soulkeeper

Diamond Member
Nov 23, 2001
6,712
142
106
I'm replacing an aging fleet of five 3.3GHz Q6600's. So I did go with a 3770k, two of them in fact, as well as an FX-8350 and the 2600k, but none of them are necessarily "over" the others as they all kinda perform roughly within the same class in comparison to the Q6600's they are replacing.

But I think I'm done replacing chips for a while now. I suspect the now modernized fleet will last me until the 10nm generation. I skipped all things 45nm and will probably skip all things 14nm.

you say this now ...
wait till you read the benchmarks :)
or get the itch in 6-8 months
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
We all know what happens to IDC when Haswell arrives and people start posting Linpack numbers... :D
 

NTMBK

Lifer
Nov 14, 2011
10,208
4,940
136
Not for performance parts.

It makes sense on the low end and for mobile. It does not make sense on a 3770K.

Also, acceptable is a matter of opinion.

Not everyone who wants high CPU performance wants or needs a high end graphics card. And Intel has brought out a couple of high performance quad cores without IGP, and they haven't sold well.
 
Aug 11, 2008
10,451
642
126
Not everyone who wants high CPU performance wants or needs a high end graphics card. And Intel has brought out a couple of high performance quad cores without IGP, and they haven't sold well.

If I recall correctly, the quads without an IGP offered no major improvements over the regular quad cores, not even a much cheaper price, so why would you choose one without an IGP? I bet if Intel offered an i5 without the IGP for $140 or so, it would sell like crazy.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
If I recall correctly, the quads without an IGP offered no major improvements over the regular quad cores, not even a much cheaper price, so why would you choose one without an IGP? I bet if Intel offered an i5 without the IGP for $140 or so, it would sell like crazy.

People forget that the IGP costs close to nothing. The development cost is rather minuscule, and the die space isn't costing much either. Yet it serves something at least 65% of all consumers directly demand.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
People forget that the IGP costs close to nothing. The development cost is rather minuscule, and the die space isn't costing much either. Yet it serves something at least 65% of all consumers directly demand.

So you think that an i7 3790K that is 6 core with no integrated graphics for the same price as the 3770K wouldn't sell?

The die space does cost quite a bit: remove the graphics, add two cores, and you STILL have a smaller chip.

Some mapped die shots.

http://www.itproportal.com/2012/04/24/picture-ivy-bridge-vs-sandy-bridge-gpu-die-sizes-compared/
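As a back-of-the-envelope illustration of that area argument, the sketch below plugs in made-up numbers; the die, GPU, and per-core areas are all illustrative assumptions, not figures taken from the linked die shots:

```python
# Rough sketch of the "drop the GPU, add two cores" area argument.
# All numbers are illustrative assumptions, not measured values.

DIE_4C_GT2 = 160.0   # assumed total die area of a 4-core + GT2 part, mm^2
GPU_AREA   = 45.0    # assumed area of the integrated GPU block, mm^2
CORE_AREA  = 12.0    # assumed area of one CPU core incl. its L3 slice, mm^2

# Remove the GPU, add two cores:
die_6c_no_gpu = DIE_4C_GT2 - GPU_AREA + 2 * CORE_AREA

print(f"hypothetical 6C no-GPU die: {die_6c_no_gpu:.0f} mm^2")
print(f"smaller than the 4C+GT2 die: {die_6c_no_gpu < DIE_4C_GT2}")
```

Under these assumptions the GPU-less six-core does come out smaller, which is the point being made here; whether that makes it cheaper to sell is a separate question.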
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
So you think that an i7 3790K that is 6 core with no integrated graphics for the same price as the 3770K wouldn't sell?

The die space does cost quite a bit: remove the graphics, add two cores, and you STILL have a smaller chip.

Some mapped die shots.

http://www.itproportal.com/2012/04/24/picture-ivy-bridge-vs-sandy-bridge-gpu-die-sizes-compared/

Smaller die, yes, but also a new (unique) mask set, which drives up cost on low-volume parts like nothing else.

If Intel sold it for the same ASP as a 3770K then they would be making much less money on it.
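The mask-set point can be made concrete with a toy amortization calculation; the mask-set cost below is a made-up placeholder, not an actual 22nm figure:

```python
# Toy sketch of why a unique mask set hurts low-volume parts: the fixed
# up-front cost is amortized over every chip sold. The dollar figure is an
# illustrative assumption, not a real foundry quote.

MASK_SET_COST = 3_000_000  # assumed one-time cost of a mask set, USD

def mask_cost_per_chip(units_sold: int) -> float:
    """Amortized mask-set cost per chip at a given sales volume."""
    return MASK_SET_COST / units_sold

for units in (100_000, 1_000_000, 10_000_000):
    print(f"{units:>10,} units -> ${mask_cost_per_chip(units):,.2f} per chip")
```

At high volume the fixed cost vanishes into the ASP; at niche-enthusiast volume it becomes a meaningful per-unit overhead, which is the argument being made.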
 

cytg111

Lifer
Mar 17, 2008
23,049
12,720
136
If you are sitting with a 5GHz 2500K now then it is unlikely you'll find much performance benefit from upgrading for another 4 years or so.

.. At the very least 4 years, I would think. We don't have much coming in the IPC/clocks department for the foreseeable future, which again points towards more cores for more performance. For that to make sense, we have some software engineering troubles to overcome first. Is that happening within the next 4 years? 5, 6, 7? I don't see it.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Not for performance parts.

It makes sense on the low end and for mobile. It does not make sense on a 3770K.

Yes, I think it is somewhat ironic that the K-model desktop quad cores get HD 4000, but all the low-end desktop dual cores come with HD 2500 (with the exception of one Core i3 desktop SKU).

If anything, the people forking out $200+ for K-model quad cores would also be the ones willing to pay $200+ for a discrete video card (and not use the supplied iGPU). In contrast, anyone buying a budget dual-core desktop would probably want the best iGPU they could get.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
In contrast, anyone buying a budget dual-core desktop would probably want the best iGPU they could get.

And presumably they do get the best iGPU they can get (AMD), unless they actually aren't all that interested in getting the best iGPU they could get.

There is no shortage of people who want more without paying for it; for the people who want more and are willing to pay for it there are plenty of "priced-appropriately" options.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Yes, I think it is somewhat ironic that the K-model desktop quad cores get HD 4000, but all the low-end desktop dual cores come with HD 2500 (with the exception of one Core i3 desktop SKU).

If anything, the people forking out $200+ for K-model quad cores would also be the ones willing to pay $200+ for a discrete video card (and not use the supplied iGPU). In contrast, anyone buying a budget dual-core desktop would probably want the best iGPU they could get.
I don't think it's ironic. I do see a place for strong integrated graphics being coupled with a strong CPU. That's where AMD would like to be, and honestly, how awesome would it be to buy one inexpensive chip with decent graphics and a great processor for consumer workloads?

Do you really need a 7970? Not really. Having a top of the line card made sense years ago, but the hardware these days is overshooting software, leading to IGPs and other lower performance graphics units being acceptable for gaming. So get an A10 or i5 K series processor and be done with it. That would be fantastic for budget builds.

Ideally, bigger monitors with more pixels and more graphically intensive games will curb this... but that's not really happening right now.
 
Aug 11, 2008
10,451
642
126
And presumably they do get the best iGPU they can get (AMD), unless they actually aren't all that interested in getting the best iGPU they could get.

There is no shortage of people who want more without paying for it; for the people who want more and are willing to pay for it there are plenty of "priced-appropriately" options.

I think what was meant was "the best IGP" available with any given CPU architecture, that is, the best IGP possible with the Intel CPU. If I understand correctly, it would not cost Intel any more to enable HD 4000 on the i3 and i5 chips than to go with the HD 2500. I mean, that is basically what they do on the majority of the mobile chips, so I don't see what it would cost them to do the same on the desktop. That said, I don't really care, because I would add a discrete card to any desktop I buy or build, but the way they manage the IGP on the desktop seems very strange to me too.
 
Aug 11, 2008
10,451
642
126
I don't think it's ironic. I do see a place for strong integrated graphics being coupled with a strong CPU. That's where AMD would like to be, and honestly, how awesome would it be to buy one inexpensive chip with decent graphics and a great processor for consumer workloads?

Do you really need a 7970? Not really. Having a top of the line card made sense years ago, but the hardware these days is overshooting software, leading to IGPs and other lower performance graphics units being acceptable for gaming. So get an A10 or i5 K series processor and be done with it. That would be fantastic for budget builds.

Ideally, bigger monitors with more pixels and more graphically intensive games will curb this... but that's not really happening right now.

I would love it too if there were a CPU with an IGP equal to something like an HD 7770, which is pretty much what I consider the minimum for gaming with current titles (and future titles seem to be getting more demanding graphically). Unfortunately, current IGPs are far from this level and will be inadequate even for some current titles, and presumably even more so for future ones.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
If I understand correctly, it would not cost Intel any more to enable HD 4000 on the i3 and i5 chips than to go with the HD 2500.

Actually, Intel has separate dies for the dual core with HD 2500 and the dual core with HD 4000.

http://www.anandtech.com/show/5876/the-rest-of-the-ivy-bridge-die-sizes

http://forums.anandtech.com/showthread.php?t=2234017&highlight=ivy+bridge+cut+lines <---Also see this excellent thread on Intel die configurations.

In contrast, AMD seems to be the company that sells full dies with disabled portions. I believe this, in part, has to do with their low volumes (compared to Intel) vs. the price of making specific mask sets for derivatives.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
In contrast, AMD seems to be the company that sells full dies with disabled portions. I believe this, in part, has to do with their low volumes (compared to Intel) vs. the price of making specific mask sets for derivatives.
That's pretty much the exact reason. That and their budget simply being smaller in the first place.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Smaller die, yes, but also a new (unique) mask set, which drives up cost on low-volume parts like nothing else.

If Intel sold it for the same ASP as a 3770K then they would be making much less money on it.

I'm sure there are many like me who have had no incentive to upgrade from Sandy because Intel is failing to beat it.

Unless Haswell overclocks beyond 5GHz it is going to be yet another pass for me... I'm definitely not the type of person to sit on my hands for three CPU generations.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
So you think that an i7 3790K that is 6 core with no integrated graphics for the same price as the 3770K wouldn't sell?

The die space does cost quite a bit: remove the graphics, add two cores, and you STILL have a smaller chip.

Some mapped die shots.

http://www.itproportal.com/2012/04/24/picture-ivy-bridge-vs-sandy-bridge-gpu-die-sizes-compared/

You forget all the other physical properties, so it's a silly comparison at best. A "3790K", for example, would quickly break the board TDP limit and/or need lower clocks. The heat density would be crazy. Not to mention 2 additional stops on the ring bus instead of 1, possible memory starvation, and so on.
 
Last edited:

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
I don't think it's ironic. I do see a place for strong integrated graphics being coupled with a strong CPU. That's where AMD would like to be, and honestly, how awesome would it be to buy one inexpensive chip with decent graphics and a great processor for consumer workloads?

Do you really need a 7970? Not really. Having a top of the line card made sense years ago, but the hardware these days is overshooting software, leading to IGPs and other lower performance graphics units being acceptable for gaming. So get an A10 or i5 K series processor and be done with it. That would be fantastic for budget builds.

Ideally, bigger monitors with more pixels and more graphically intensive games will curb this... but that's not really happening right now.

4K displays are on the way. The A10 cannot play a huge number of titles acceptably (40 fps on low at native resolution).
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
You forget all the other physical properties, so it's a silly comparison at best. A "3790K", for example, would quickly break the board TDP limit and/or need lower clocks. The heat density would be crazy. Not to mention 2 additional stops on the ring bus instead of 1, possible memory starvation, and so on.

What.

The performance difference in most apps between DDR3-1333 and DDR3-2133 is 0-4%, save synthetics and 7-Zip.

The board TDP limit? It would be a smaller die?

Are you suggesting we haven't done this before? Or that Intel isn't doing it right now with the same technology and just overcharging?

http://www.newegg.com/Product/Produc...82E16819174635

130w

http://ark.intel.com/products/52576/Intel-Xeon-Processor-X5690-12M-Cache-3_46-GHz-6_40-GTs-Intel-QPI
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
What.

The performance difference in most apps between DDR3-1333 and DDR3-2133 is 0-4%, save synthetics and 7-Zip.

The board TDP limit? It would be a smaller die?

Are you suggesting we haven't done this before? Or that Intel isn't doing it right now with the same technology and just overcharging?

http://www.newegg.com/Product/Produc...82E16819174635

130w

http://ark.intel.com/products/52576/Intel-Xeon-Processor-X5690-12M-Cache-3_46-GHz-6_40-GTs-Intel-QPI

The LGA1155 platform is 95W only. And die size is irrelevant to power consumption as such. There is a reason why cores 2 and 3 run hotter than 1 and 4, with core 1 being the coldest core on LGA1155 SB/IB.

The IGP only uses 8W peak, and it functions as "dark silicon" in terms of heat. Replacing it with 2 cores would raise both power consumption and hotspot temperatures quite radically.

[Attached image: SandyBridgeTempsandDarkSilicon.jpg]
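The dark-silicon argument boils down to power density. Here is a toy comparison in which every number except the quoted 8W IGP peak (the areas and the per-core load wattage) is an illustrative assumption:

```python
# Toy power-density comparison for the "dark silicon" point: a lightly used
# IGP spreads little heat over a large area, while two extra active cores
# would add concentrated hotspots. All figures besides the 8W IGP peak quoted
# above are illustrative assumptions.

IGP_AREA,  IGP_PEAK_W  = 45.0, 8.0    # assumed IGP block: large area, low power
CORE_AREA, CORE_LOAD_W = 12.0, 15.0   # assumed per-core area and load power

igp_density  = IGP_PEAK_W / IGP_AREA      # W/mm^2 across the GPU block
core_density = CORE_LOAD_W / CORE_AREA    # W/mm^2 across one active core

print(f"IGP block: {igp_density:.2f} W/mm^2")
print(f"CPU core:  {core_density:.2f} W/mm^2")

# Swapping the IGP for two cores trades one cool region for two hot ones and
# adds net power to the package:
extra_power = 2 * CORE_LOAD_W - IGP_PEAK_W
print(f"net package power added: {extra_power:.0f} W")
```

Even with these rough numbers, an active core dissipates several times more power per unit area than the idle-ish GPU block, which is why the GPU region reads cool in die-temperature shots like the one attached.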
 
Last edited: