Who's getting Haswell?


VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
I might still be rocking S775 in 2020, if the trend towards restrictive DRM being embedded deeper and deeper into computer systems keeps up.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
I know this is a for fun thread but it's still funny that Intel has your money even without knowing the performance of the chip or what apps will be out to utilize it. Damn those *ews are good.

Because they have a brilliant track record, unlike the other competitor, who thinks they are in the construction sector?
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Likely not! Probably because of the 360 replacement, which should be out around the same time.
 

dealcorn

Senior member
May 28, 2011
247
4
76
Haswell will do whatever it does on the CPU side, but the GPU side will start to get interesting. All that Larrabee stuff that didn't do what Intel wanted at 45nm is back, and it is amazing what you can do at 22nm with rebranded IP. The technical discussion of the relationship between AVX2 and the vector processing unit will follow in due course, and I anticipate getting a kick out of the part where they get into how, from a certain perspective, what Intel is doing is inherently more powerful and more efficient than anything offered by its "more advanced" competitors. My expectation is that it will take out the bottom third of the discrete graphics card market.
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
Haswell will do whatever it does on the CPU side, but the GPU side will start to get interesting. All that Larrabee stuff that didn't do what Intel wanted at 45nm is back, and it is amazing what you can do at 22nm with rebranded IP. The technical discussion of the relationship between AVX2 and the vector processing unit will follow in due course, and I anticipate getting a kick out of the part where they get into how, from a certain perspective, what Intel is doing is inherently more powerful and more efficient than anything offered by its "more advanced" competitors. My expectation is that it will take out the bottom third of the discrete graphics card market.

You think Intel is going to fully integrate its GPU+CPU into one unified instruction set? I suppose there is (much) precedent.
 

Predicament

Junior Member
Oct 21, 2011
9
0
0
Haswell's speed isn't even needed for today's games :(
Hopefully fewer and fewer console ports will be released!
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
You think Intel is going to fully integrate its GPU+CPU into one unified instruction set? I suppose there is (much) precedent.

Isn't that AMD's plan with Fusion? It only stands to reason that Intel would be making parallel moves.
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
Isn't that AMD's plan with fusion? It only stands to reason that Intel would be making parallel moves.


That is my understanding -- but I am surprised that Intel is "following" them in this regard. I expected them to downplay GPGPU because quite frankly, they don't have it atm, and their attempts at building a GPU are ... disappointing.

The fact that they are (may be) serious about GPU-compute says a lot about future Intel GPU performance. Which is a good thing.
 

dealcorn

Senior member
May 28, 2011
247
4
76
You think Intel is going to fully integrate its GPU+CPU into one unified instruction set? I suppose there is (much) precedent.

I doubt Intel will fully integrate its GPU+CPU into one unified instruction set. But AVX2 and its joined-at-the-hip vector co-processor unit will perform some functions traditionally thought of as GPU tasks. Intel's starting point was not "let's take a class-leading discrete GPU and put it on the same piece of silicon as our CPU for improved efficiency." Instead, Intel assumed it would all be done on one piece of silicon and looked at what needs to be done, and where it should be done, with the understanding that it must be scalable. My understanding is that a separate GPU remains, but it is used more selectively. By abandoning some of the legacy nonsense that comes from a rich discrete heritage, Intel hopes to beat the discrete vendors at their own game using a bottom-up approach with better engineering.

Perhaps Intel is just blowing smoke. I intend to stay tuned and find out.
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
I'll get Ivy unless it turns out (unlikely) to be incompatible with my system; in that case I'll possibly go for Haswell or its subsequent die shrink. Not that what I have now is blowing chunks - far from it.
 

zlejedi

Senior member
Mar 23, 2009
303
0
0
Only if I feel like the speed increase would be noticeable.
With a 2500K @ 4.5 GHz I think I'm set for the next few years.
Though I might still upgrade out of boredom, but then maybe IB first and a 14nm Haswell refresh later ;)
 

khon

Golden Member
Jun 8, 2010
1,318
124
106
I usually skip one generation, and I have a SB for my desktop right now, so I'll probably skip IB and go for Haswell.

For my laptop I have Westmere, so there I will probably skip SB and go for IB.

Thus I will likely get both IB and Haswell.