
Who's getting Haswell?

I still haven't decided on Haswell; I'll probably just skip it until Skylake. I'll definitely get Skymont though (the die shrink of Skylake).
 
My rig is fine, has been fine for four years, and will continue to be adequate for my DAW workstation and 1080p gaming at 8x AA / 16x AF, high detail, multisampling, vsync on with triple buffering.

I get 60 fps capped no matter what I do or where I go in any game, including Crysis 2 and Mass Effect 2.

Also, games take up about 60 percent of CPU power, so there's 40 percent not even being used. Guys just wanna keep upgrading. The only difference is video rendering will be faster, but I'm an audio guy and gamer, so that doesn't apply to me. My dad's 2600K @ 3.8 GHz with a 6 Gb/s SSD renders fast, but I couldn't care less about a one-minute difference or so. My friend renders with Premiere on his lowly Q8300 @ 2.8 GHz, locked BIOS lol. He has a 9800 GT and is happy. Sonar X1c x64 with a plethora of synths, plugins, and NI stuff runs at 25 percent CPU load on an average project I do.

Getting a 24-bit/192 kHz mix to render takes less than the song's runtime. So yeah, most people are throwing their money away; instead of being productive behind their rig, they just wanna get the latest and run benchmarks 24/7... gl
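For perspective, the raw data rate of an uncompressed 24-bit/192 kHz export is tiny by modern disk standards, which is part of why the render finishes faster than realtime. A back-of-envelope sketch (stereo and a 4-minute song length are assumptions, not from the post):

```python
# Raw data rate of an uncompressed 24-bit / 192 kHz PCM export.
sample_rate = 192_000          # samples per second
bytes_per_sample = 24 // 8     # 24-bit PCM = 3 bytes per sample
channels = 2                   # stereo (assumed)

bytes_per_second = sample_rate * bytes_per_sample * channels
print(bytes_per_second)        # 1,152,000 bytes/s, about 1.1 MiB/s

# A 4-minute song (assumed length) is only ~264 MiB of raw PCM:
song_bytes = bytes_per_second * 240
print(round(song_bytes / 2**20))  # ~264 MiB
```

At roughly 1.1 MiB/s of output, even a mid-2000s hard drive is never the bottleneck; the render speed is limited by how fast the CPU can run the synths and plugins.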

I have a friend who is obsessed with using GPUs for his audio mixing; he says it would let him layer far more effects in real time than with his CPU. Thoughts on this?
 
I'm hoping to be back on an AMD platform by the time Haswell comes out. *Hoping*
 
I'll be getting a new rig in 2020. It will be codenamed Sharon Bridge, Sandy's older sister. LOL

The rig will have 32 cores + HT for 64 threads, so many that SpeedFan can't even show them all.

128 GB RAM and a 1 TB Intel SSD that does 2 GB/s sequential read and write, 0.00000001 ms latency. A couple more SSDs for backup and DAW storage. Windows 11. DX14. 😛 D:
 
I'm debating whether to get Ivy or Haswell; I think I'll let time tell. If I can survive my Ivy envy, I think I'll get Haswell in 2013. But regardless, Intel's got my money this round for sure.
 
I have a friend who is obsessed with using GPUs for his audio mixing; he says it would let him layer far more effects in real time than with his CPU. Thoughts on this?
As long as a many-millisecond buffer is OK (in other words, "real time" meaning processing faster than real time, not fast response time, which GPUs are bad at for several reasons), there are people working on exactly that, though I haven't kept up enough to know whether any commercial products have materialized.
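The latency side of that tradeoff is easy to quantify: each buffer of N frames adds N / sample_rate seconds of delay, so a GPU round trip that forces large buffers quickly lands in the tens of milliseconds. A minimal sketch (the 48 kHz rate and the buffer sizes are assumptions for illustration):

```python
# Delay contributed by one audio buffer: frames / sample_rate.
SAMPLE_RATE = 48_000  # Hz (assumed)

def buffer_latency_ms(frames: int, sample_rate: int = SAMPLE_RATE) -> float:
    """Milliseconds of delay added by a single buffer of `frames` samples."""
    return frames / sample_rate * 1000.0

for frames in (64, 256, 1024, 4096):
    print(f"{frames:>5} frames -> {buffer_latency_ms(frames):6.2f} ms")
```

At 64 frames that's about 1.3 ms per buffer, comfortable for live monitoring; at 4096 frames it's over 85 ms, which is why large-buffer GPU processing suits offline rendering better than interactive playing.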
 
Do we have any concrete information on whether the performance platform for Haswell will support six-core CPUs? We already know Ivy Bridge won't, but I hope we won't get stuck with quad cores for the next "tock".
 
<-- is getting Haswell; it doesn't seem like the i7 920 is going to bottleneck any single 28nm GPU.
 
Sorry, can we see pictures of this? I'm calling you out on this unless you have proof...

If you look at AT Bench, the 560 Ti gets around 43 fps with an i7 920 @ 3.33 GHz on Crysis: Warhead at 1920x1200, which is as fast as or faster than your C2Q. That's with 4x AA, not even 8x AA!!

I don't like calling people out, but this statement is pretty obviously a lie.

Ok, good call-out. I have 21 games and will soon have Battlefield. I'm very picky when it comes to latency, smoothness, mouse smoothness, and the framerate staying at 60 fps, not dropping to the 40s, just to the low 50s sometimes. All my games do this.

The proof I can give is taking a video of me playing and showing the FPS ticker at the top.

BTW, I use max Ultra game settings but disable motion blur. I don't use any DX patch; I just play what the guys made at first, plus I have no problems with my stock version. I usually don't update games unless it's a multiplayer title that requires an update to connect to servers.

Also look at my video card's OC. I'll have that video soon, my friend.
 
Ok, good call-out. I have 21 games and will soon have Battlefield. I'm very picky when it comes to latency, smoothness, mouse smoothness, and the framerate staying at 60 fps, not dropping to the 40s, just to the low 50s sometimes. All my games do this.

The proof I can give is taking a video of me playing and showing the FPS ticker at the top.

BTW, I use max Ultra game settings but disable motion blur. I don't use any DX patch; I just play what the guys made at first, plus I have no problems with my stock version. I usually don't update games unless it's a multiplayer title that requires an update to connect to servers.

Also look at my video card's OC. I'll have that video soon, my friend.

Might want to look at all of the threads you've started about crashes, etc., and then look at "no patches" and your video card's OC. 😉
 
This question is basically, "who is going to be buying a new computer in ~1.5 years".

If I get SB-E, I won't be getting Haswell. If I don't get SB-E, I'll be more likely to get Haswell than IB, since I'm not at all excited about IB (improved IGP, big whoop; save it for the OEMs).
 
I'm actually more interested in Haswell's implications on the laptop side of things. I can see myself picking up an MBA-type notebook based on Haswell, and that would be enough for my daily driver aside from gaming. If they figure out a way to attach a discrete card via Thunderbolt or some other high-speed interface, I'd say there's no longer any need to have a desktop.
 
I know this is a for-fun thread, but it's still funny that Intel has your money without your even knowing the performance of the chip or what apps will be out to utilize it.
 
If Haswell is all that they are making it out to be, I want one and will wait for one.

8 cores idling at 10 watts of total system power consumption sounds too good to be true to me. If they can actually do it, I'm in. 🙂
 