
Who's getting Haswell?

I might still be rocking S775 in 2020, if the trend towards restrictive DRM being embedded deeper and deeper into computer systems keeps up.
 
I know this is a for-fun thread, but it's still funny that Intel has your money without you even knowing the chip's performance or what apps will be out to utilize it. Damn, they're good.

Because they have a brilliant track record, unlike the other competitor, who thinks they're in the construction sector?
 
Haswell will do whatever it does on the CPU side, but the GPU side is where it starts to get interesting. All that Larrabee stuff that didn't do what Intel wanted at 45nm is back, and it's amazing what you can do at 22nm with rebranded IP. The technical discussion of the relationship between AVX2 and the vector processing unit will follow in due course, and I anticipate getting a kick out of the part where they explain how, from a certain perspective, what Intel is doing is inherently more powerful and more efficient than anything offered by its "more advanced" competitors. My expectation is that it will take out the bottom third of the discrete graphics card market.
 

You think Intel is going to fully integrate its GPU+CPU into one unified instruction set? I suppose there is (much) precedent.
 
Haswell's speed isn't even needed for today's games 🙁
Hopefully fewer and fewer console ports will be released!
 
Isn't that AMD's plan with Fusion? It only stands to reason that Intel would be making parallel moves.


That is my understanding -- but I am surprised that Intel is "following" them in this regard. I expected them to downplay GPGPU because quite frankly, they don't have it atm, and their attempts at building a GPU are ... disappointing.

The fact that they are (may be) serious about GPU-compute says a lot about future Intel GPU performance. Which is a good thing.
 
You think Intel is going to fully integrate its GPU+CPU into one unified instruction set? I suppose there is (much) precedent.

I doubt Intel will fully integrate its GPU+CPU into one unified instruction set. But AVX2 and its joined-at-the-hip vector co-processor unit will perform some functions traditionally thought of as GPU tasks. Intel's starting point was not "let's take a class-leading discrete GPU and put it on the same piece of silicon as our CPU for improved efficiency." Instead, Intel assumed it would all be done on one piece of silicon, then looked at what needs to be done and where it should be done, with the understanding that it must be scalable. My understanding is that a separate GPU remains, but it is used more selectively. By abandoning some of the legacy baggage that comes with a rich discrete heritage, Intel hopes to beat the GPU makers at their own game using a bottom-up approach with better engineering.

Perhaps Intel is just blowing smoke. I intend to stay tuned and find out.
 
I'll get Ivy unless it turns out (unlikely) to be incompatible with my system; in that case I'll possibly go for Haswell or its subsequent die shrink. Not that what I have now is blowing chunks - far from it.
 
Only if I feel the speed increase would be noticeable.
With my 2500K @ 4.5 GHz I think I'm set for the next few years.
Though I might still upgrade out of boredom, but then maybe IB first and the 14nm Haswell refresh later 😉
 
I usually skip one generation, and I have a SB for my desktop right now, so I'll probably skip IB and go for Haswell.

For my laptop I have Westmere, so there I will probably skip SB and go for IB.

Thus I will likely get both IB and Haswell.
 