Intel Haswell-E price list available


cbn

Lifer
Mar 27, 2009
12,968
221
106
Anybody care to speculate as to whether we may actually see an X99 board in the mini-ITX form factor? There hasn't been a mini-ITX HEDT board thus far, but mini-ITX hadn't really caught fire back when X58/X79 were introduced. I've had my NCASE M1 sitting here collecting dust, and a HW-E would be tempting to throw in it.

Interesting thought, because there will be fewer RAM slots, but I'm guessing that the overall TDP of the HEDT CPUs may make that a bit prohibitive for "most" mini-ITX cases. (How many mini-ITX cases exist that can take aftermarket tower heatpipe coolers?)

1.) Maybe mini-ITX and two DIMM slots would be doable (even for the bandwidth needs of an octo-core) now that RAM is DDR4?

Quad-channel DDR3 ~= dual-channel DDR4, right? (Rough numbers in the sketch below.)

(With that mentioned, I think the wild card out of the gate would be the cost of the fast DDR4)

2.) There are mini-ITX cases capable of taking 160 mm tower coolers. (If not, there is always H2O to consider.) The main thing (IMHO) would be having the necessary power delivery on the motherboard.
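
For what it's worth, the back-of-the-envelope peak-bandwidth math behind that "quad-channel DDR3 ~= dual-channel DDR4" guess looks like this (theoretical peaks only; the speed grades are just illustrative picks, not anything X99 boards are confirmed to ship with):

```c
/* Peak memory bandwidth = channels * transfer rate (MT/s) * 8 bytes per transfer.
   Speed grades below are illustrative only. */
#include <stdio.h>

static double peak_gb_per_s(int channels, int mt_per_s)
{
    return channels * (double)mt_per_s * 8.0 / 1000.0;   /* GB/s */
}

int main(void)
{
    printf("4ch DDR3-1866: %.1f GB/s\n", peak_gb_per_s(4, 1866));  /* ~59.7 */
    printf("2ch DDR4-3000: %.1f GB/s\n", peak_gb_per_s(2, 3000));  /* ~48.0 */
    printf("2ch DDR4-3733: %.1f GB/s\n", peak_gb_per_s(2, 3733));  /* ~59.7 */
    return 0;
}
```

So the rough equivalence only holds once the DDR4 kits get quite fast, which loops back to the cost wild card.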
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
Yeah -- I'm having second thoughts, or option inspirations, about my plan to build a Haswell-E next year with custom water-cooling. I'm skeptical about the OC scalability of these chips. Only some software would use the power of a hex-core, and it's beginning to look like an overkill luxury. Unless there's a specific application that would benefit massively from six or eight cores, I could see scaling my ambitions back to four.

As for the 3570K as a "futureproof" status quo, I feel the same way about my i7-2600K system. Not only did I "build it" and overclock it in 2011, but it has "evolved" since then. If I had to "let go," it would be hard. That rig continues to amaze me. And I can see how Ivy and Haswell have improved raw performance and likely augmented the instruction sets and features.

Well. Heck. According to my "project schedule," I have a good part of a year to think it over . . . Don't I? Yup.

Your machine, with hyperthreading and being clocked higher than mine, should last you for quite a while yet, unless you decide to upgrade when you don't really need to.

Win 7 still seems very fresh and fully functional to me, so even if Microsoft fixes up what is wrong with Win 8 in Win 9, there is still no reason to upgrade.

Also, having an SSD makes everything seem so blazingly fast.

The only upgrade I am likely to do anytime soon is a new graphics card (I currently have an AMD HD 7850 2GB), but only after Half-Life 3 comes out, as nothing else would tempt me to upgrade.

So let's see what Win 10 and 10nm CPU's look like. :biggrin:
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
10 Gb Ethernet would add a few hundred dollars to the cost of a motherboard, is my guess. But maybe Asus will produce a model with it. I can't see it coming as standard on every X99 board though; it's an expensive feature.

I seriously doubt we'll see 10 Gb-e on an enthusiast motherboard - that'll jack the price of a ROG board into the stratosphere! On the plus side, it's likely that most boards will have 2x 1 Gb-e; hopefully a few will have the drivers/software to gang them together.

If one really needs that 10 Gb-e, it can be had in an x4 PCIe config or by getting a server board plus a Haswell Xeon (though no overclocking - sadly).
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,239
595
126
With SSDs reaching beyond 1 GByte/s speeds, 10 Gbit/s Ethernet is already obsolete. We need 100 GBit/s instead. :biggrin:
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
With the IPC difference between Ivy and Haswell on the 6-core models, I don't think we will see that; maybe DDR4 will help.

As for the 8-core model, I think that is doable.

Is Cinebench updated to support AVX2? The version I have is from 2010, so I think not - otherwise, Haswell would kick butt.

This brings up an important point: for software that uses AVX2, HW-E is going to have ~30% higher throughput than IVB-E. Now that is a nice bump in performance (plus another ~15% for the 5960X) on a clock-normalized basis.

I think this was already mentioned vis-à-vis games, but this will apply to people who use certain workstation apps on DIY systems as well.
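
For anyone wondering what "software that uses AVX2" actually means at the source level, here's a minimal toy sketch (my own example, not taken from Cinebench or any real workload; the fused multiply-add is technically the separate FMA extension, but it also arrived with Haswell and is where much of the throughput bump comes from):

```c
/* Toy AVX2/FMA loop: y[i] = a*x[i] + y[i], 8 floats per iteration.
   Build with something like: gcc -O2 -mavx2 -mfma fma_example.c */
#include <immintrin.h>
#include <stdio.h>

static void saxpy_avx2(float a, const float *x, float *y, int n)
{
    __m256 va = _mm256_set1_ps(a);              /* broadcast a into all 8 lanes */
    for (int i = 0; i + 8 <= n; i += 8) {
        __m256 vx = _mm256_loadu_ps(x + i);
        __m256 vy = _mm256_loadu_ps(y + i);
        vy = _mm256_fmadd_ps(va, vx, vy);       /* one fused a*x + y per 8 floats */
        _mm256_storeu_ps(y + i, vy);
    }
}

int main(void)
{
    float x[8] = {1, 2, 3, 4, 5, 6, 7, 8}, y[8] = {0};
    saxpy_avx2(2.0f, x, y, 8);
    printf("%.1f\n", y[0]);                     /* prints 2.0 */
    return 0;
}
```

An older Cinebench build obviously won't be compiled with those flags, which is why the version matters.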
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
With SSDs reaching beyond 1 GByte/s speeds, 10 Gbit/s Ethernet is already obsolete. We need 100 GBit/s instead. :biggrin:

I'm going to disagree with you; I already max out my 1 Gb-e link backing up my workstation to my server. By the time I build my next system (probably a year out, unless I wait for BW-E), I'm sure an M.2 x4 SSD will be pumping out > 1.5 GB/s. :eek: :awe:
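
Just as a sanity check on the units (raw line rate only, ignoring Ethernet/TCP overhead, so real transfers land a bit lower):

```c
/* Link speed vs. SSD throughput, back-of-the-envelope (decimal units). */
#include <stdio.h>

static double link_mb_per_s(double gbit_per_s)
{
    return gbit_per_s * 1000.0 / 8.0;           /* Gbit/s -> MB/s */
}

int main(void)
{
    printf("1 Gb-e:  ~%.0f MB/s\n", link_mb_per_s(1.0));   /* ~125 MB/s - a single SATA SSD already exceeds this */
    printf("10 Gb-e: ~%.0f MB/s\n", link_mb_per_s(10.0));  /* ~1250 MB/s - roughly one fast M.2/PCIe SSD */
    return 0;
}
```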

I just think it's unlikely that even ASUS will be willing to add $200 to their already high HEDT board prices.
 

Makaveli

Diamond Member
Feb 8, 2002
4,978
1,571
136
Is Cinebench updated to support AVX2? The version I have is from 2010, so I think not - otherwise, Haswell would kick butt.

This brings up an important point: for software that uses AVX2, HW-E is going to have ~30% higher throughput than IVB-E. Now that is a nice bump in performance (plus another ~15% for the 5960X) on a clock-normalized basis.

I think this was already mentioned vis-à-vis games, but this will apply to people who use certain workstation apps on DIY systems as well.

I'm not sure, as there are a few versions floating around.

But if it was added, you are 100% correct; it should be a nice boost.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
I seriously doubt we'll see 10 Gb-e on an enthusiast motherboard - that'll jack the price of a ROG board into the stratosphere! On the plus side, it's likely that most boards will have 2x 1 Gb-e; hopefully a few will have the drivers/software to gang them together.

If one really needs that 10 Gb-e, it can be had in an x4 PCIe config or by getting a server board plus a Haswell Xeon (though no overclocking - sadly).

Yeah, I feel 10 Gb-e at home is a looong way out. You're also going to need a switch with support, and every machine you have on the network needs the hardware support too. Very expensive for the very few people who need those kinds of transfer speeds. I would guess it's mostly something people who do a lot of work with video or consistently make huge backups find a use for.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,239
595
126
Yeah, I feel 10 Gb-e at home is a looong way out. You're also going to need a switch with support, and every machine you have on the network needs the hardware support too. Very expensive for the very few people who need those kinds of transfer speeds. I would guess it's mostly something people who do a lot of work with video or consistently make huge backups find a use for.

But isn't 10 Gbps Ethernet expensive because of low volumes? If it were included on mainstream motherboards in high volumes, would it really be that much more expensive than 1 Gbps Ethernet?

It seems like 1 Gbps Ethernet has been around for "ages", so it's about time that we move along to 10 Gbps.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,239
595
126
At least Intel will be using soldered heatspreaders for Haswell-E, see:

www.bit-tech.net/news/hardware/2014/07/28/intel-heatspreaders/

As you know there have been some discussions on why Intel moved from solder to TIM in the first place. The article mentions an interesting explanation:

Although many considered the move [from solder to TIM] to have been made out of a desire to boost profits, there were sound engineering reasons relating to smaller die sizes causing cracking of the solder and the formation of heat-trapping voids that can damage the chip.

Those issues appear to have been resolved in time for the Haswell-E enthusiast chip family, thankfully.

Has that explanation been mentioned elsewhere before? At least I did not know about it.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
At least Intel will be using soldered heatspreaders for Haswell-E, see:

www.bit-tech.net/news/hardware/2014/07/28/intel-heatspreaders/

As you know there have been some discussions on why Intel moved from solder to TIM in the first place. The article mentions an interesting explanation:

Has that explanation been mentioned elsewhere before? At least I did not know about it.

IB-E uses solder too. Kinda destroys the article completely in every way.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
IB-E uses solder too. Kinda destroys the article completely in every way.

Well, not every way. The LGA 2011 CPUs have larger dies than the consumer CPUs, so it may be half right.

Either way, it's just like I said before. It's soldered. It doesn't make sense to use TIM on a platform used for servers, and I'd imagine that it would actually be more expensive to have two different heatspreader designs for the same die.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,239
595
126
IB-E uses solder too. Kinda destroys the article completely in every way.

Yes, IB-E uses solder too. But you don't know the reasons for that. It could be that, because IB-E has a higher TDP than IB, they had to use solder instead of TIM (since solder transports heat more efficiently than TIM), despite knowing about the risks mentioned in the article.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Yes, IB-E uses solder too. But you don't know the reasons for that. It could be that, because IB-E has a higher TDP than IB, they had to use solder instead of TIM (since solder transports heat more efficiently than TIM), despite knowing about the risks mentioned in the article.

The article is irrelevant since it fails on even basic facts.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Care to enlighten us as to in what way, and how that for certain makes their theory about the switch from solder to TIM on IB incorrect?

When the article says HW-E got solder again after the previous model didn't (read: IB-E), then you kinda disqualify the rest of the article for any purpose. Not to mention everything else is nothing but random speculation.
 

TrulyUncouth

Senior member
Jul 16, 2013
213
0
76
When the article says HW-E got solder again after the previous model didn't (read: IB-E), then you kinda disqualify the rest of the article for any purpose. Not to mention everything else is nothing but random speculation.

Is it not possible that they needed to switch to TIM for IB and just said they might as well go TIM on IB-E for simplicity? And now that they have had time to think, and maybe work on better solder application, they can go back to it? Perhaps IB-E was large enough (assuming their theory is correct), but Intel just chose to stick to the same setup for supply-line simplicity or to be safe?

My point is that the article clearly makes assumptions, but just assuming something in return on your end doesn't immediately debunk the entire article.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Is it not possible that they needed to switch to TIM for IB and just said they might as well go TIM on IB-E for simplicity?

This is IB-E delidded:

[attached image: 03.JPG]


4960X to be exact.
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,342
265
126
The combination of minimal performance gains + laziness to take down my current build = no upgrades for me for a while. Well, maybe a larger SSD, but that's almost plug and play.

Maybe I'll feel differently if I see any $375 hexcores.
 

Redstorm

Senior member
Dec 9, 2004
293
0
76
Either way, it's just like I said before. It's soldered.

I think you will find HW-E is using conductive epoxy; in the photo it doesn't even look like solder (i.e. silver and shiny in color). See the above shot of IB-E; now that is solder.

RoHS regs limit the amount of lead in solder, and for Intel to meet the EU requirements I doubt they could use solder.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
I think you will find HW-E is using conductive epoxy; in the photo it doesn't even look like solder (i.e. silver and shiny in color). See the above shot of IB-E; now that is solder.

RoHS regs limit the amount of lead in solder, and for Intel to meet the EU requirements I doubt they could use solder.

If they met the requirements before, why wouldn't they now?

Either way, delidding the CPU destroyed the die, so we at least know that the contact is good.
 