i7 2600K Sandy Bridge vs. i7 980X Extreme


Dean

Platinum Member
Oct 10, 1999
2,757
0
76
I have not decided whether to get the 2500K or the 2600K yet. My son is putting pressure on me, as he knows my E8400/HD4870 system will be his hand-me-down. Who can blame him? He is running an old AMD 3700+/HD4650 system! hehe
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
So Ivy Bridge will be out in 9 months? Um, Sandy Bridge 2011 will just be coming out then, so do you have those confused?

It won't. Ivy Bridge should release by CES 2012, so as to keep a one-year lifespan for Sandy Bridge.

This has me curious: how exactly does it have more bandwidth than the triple-channel setup?

That depends on the benchmark; in some, like Sysmark's max memory bandwidth test, the tri-channel setup is higher: http://www.techspot.com/review/353-intel-sandy-bridge-corei5-2500k-corei7-2600k/page9.html

Sandy Bridge doubled the load and store address units on the CPU, so it achieves higher memory bandwidth in practice, though the theoretical maximum should still be higher with three channels.
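
For a rough sanity check on the ceilings (my assumptions: DDR3-1600 on the dual-channel SB board, DDR3-1066, the official supported speed, on tri-channel X58, and 8 bytes per transfer per 64-bit channel):

    dual-channel DDR3-1600: 2 x 1600 MT/s x 8 B = 25.6 GB/s peak
    tri-channel DDR3-1066:  3 x 1066 MT/s x 8 B = 25.6 GB/s peak
    tri-channel DDR3-1600:  3 x 1600 MT/s x 8 B = 38.4 GB/s peak

So on paper tri-channel wins once the sticks run at the same speed; how close either platform gets to its ceiling is where the doubled load/store units come in.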
 

Edrick

Golden Member
Feb 18, 2010
1,939
230
106
Sandy Bridge doubled the load and store address units on the CPU, so it achieves higher memory bandwidth in practice, though the theoretical maximum should still be higher with three channels.

This is very noticeable. I used the same RAM at the same timings on both my i5 750 and i7 2600 builds (1600MHz 7-7-7-21 T1). The difference was about 2.5GB/s in read/write and 5.2ns in latency (using AIDA64).
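
If anyone without AIDA64 wants a ballpark figure, here's a minimal sketch in C of the usual streaming-read trick: time sequential passes over a buffer far bigger than cache. It's not AIDA64's actual method, just an illustration (assumes a POSIX system for clock_gettime):

    /* Rough sequential read-bandwidth estimate. Build with optimization,
       e.g. gcc -O2 membw.c -o membw (add -lrt on older glibc). */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>

    #define BUF_BYTES (256UL * 1024 * 1024)   /* 256 MB, far larger than L3 */
    #define PASSES 8

    int main(void)
    {
        size_t n = BUF_BYTES / sizeof(unsigned long long);
        unsigned long long *buf = malloc(BUF_BYTES);
        volatile unsigned long long sink = 0;  /* volatile so reads survive -O2 */
        if (!buf) return 1;
        memset(buf, 1, BUF_BYTES);             /* fault the pages in first */

        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int p = 0; p < PASSES; p++)
            for (size_t i = 0; i < n; i++)
                sink += buf[i];                /* 64-bit sequential reads */
        clock_gettime(CLOCK_MONOTONIC, &t1);

        double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        printf("~%.2f GB/s sequential read\n",
               PASSES * (double)BUF_BYTES / secs / 1e9);
        (void)sink;
        free(buf);
        return 0;
    }

Expect it to land well under the theoretical peak; a single thread of plain scalar reads won't saturate either memory controller.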
 

Pohemi

Lifer
Oct 2, 2004
10,962
17,142
146
I wasn't willing to upgrade for a while because I felt like I had an i5 already, with my Q9650 and its 12MB cache... but these Sandy Bridge chips are looking tasty; I might have to do it...
 

Makaveli

Diamond Member
Feb 8, 2002
5,026
1,624
136
Sandy Bridge doubled the load and store address units on the CPU, so it achieves higher memory bandwidth in practice, though the theoretical maximum should still be higher with three channels.

This is what I was looking for thanks.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Here, this pic should explain the results:
[Image: comparisons.jpg (pics courtesy of Xbitlabs)]
 

Koniakki

Junior Member
Jan 11, 2011
4
0
0
Here, this pic should explain the results:
[Image: comparisons.jpg (pics courtesy of Xbitlabs)]

+1 You just wanna be on my TOP favorite user list, don't you? :D

Seriously, thank you for saving me 30-40 minutes of looking, organizing, and posting. Appreciated.

Taken from Guru3D. More sources are available, of course, and you can always Google and read all the reviews; I read most of them.
[Image: 2s0hkc3.jpg (from Guru3D)]
 

Koniakki

Junior Member
Jan 11, 2011
4
0
0
This part was confusing to me as well, but considering the crazy fervent SB love going on around here lately, it's no surprise it was posted.

Well, considering that I am on a G31 board with 4GB of DDR2-667 and an ATI 4770, waiting for Bulldozer and socket 2011 before I finally upgrade, I wouldn't say I love SB. I love technology, but I'm not in love with it... no one should be. :p

I think Intel should send 980X owners a free SSD. I mean, come on, Intel. The 980X has been out a few months and they put out a new midrange platform that competes with it for €519.26 less. I'd be pissed at Intel if I forked over a grand and 3 months later the equivalent were available for €255.75.

You have a few key points. But SB is newer, and we know the technology keeps advancing; yesterday's high-end is today's midrange... :cool:
But the 980X is the BETTER CPU here, and there's no doubt about it. For those with i7 9xx/8xx systems, there's no reason to upgrade.
The real-world performance differences are negligible, unless you have 10 benchmark suites lined up and waiting for your 24/7 benchmark madness... :rolleyes:


Haha. Not only does Tweakboy have to annoy everyone in the Video forum, but here too. He used to create a thread for every new Nvidia driver update and talk about how much it improved his gaming on an 8800GT, with no benchmarks, ignoring the fact that the drivers provided no improvement for his card.

Don't be that harsh... he's annoyingly amusing. :biggrin:

P.S.: Come on, guys. As Maximilian very wisely said about the Q6600, I would like to add that we should respect everyone's rigs. If it works for them, let them be, as long as they don't provide exaggerated, misleading, or inaccurate information about them. :)

I would love to have a Q6600/980X/SB, but since my Sammy LCD broke because of a f*****g Wii remote, my s****y system suits me well at my resolution until my next complete upgrade.
 

PlasmaBomb

Lifer
Nov 19, 2004
11,636
2
81
Haha. Not only does Tweakboy have to annoy everyone in the Video forum, but here too. He used to create a thread for every new Nvidia driver update and talk about how much it improved his gaming on an 8800GT, with no benchmarks, ignoring the fact that the drivers provided no improvement for his card.

Wouldn't that be ban-worthy?
 

Axon

Platinum Member
Sep 25, 2003
2,541
1
76
My jump from an i7 920 to a 2500K wasn't that much of a change. The biggest deal is the beastly overclock, which does make some difference. Maybe I'll sell it and grab a 2600K. :)
 

AdamK47

Lifer
Oct 9, 1999
15,846
3,638
136
920 to 2500K sounds like an odd upgrade. The newness of it all must have been one of the driving forces. 2600K would have been the wiser choice.
 

aigomorla

CPU, Cases & Cooling Mod / PC Gaming Mod / Elite Member
Super Moderator
Sep 28, 2005
21,134
3,668
126
What is the actual power draw at the outlet for your comp?

They had a solution @ CES and I picked it up!!! It was aimed at iPads, but I had the Doc mod it to plug into my AX1200 PSU!

[Image: 40-mr_fusion.jpg]


Finally found it! Now to find a flux capacitor so I can take my 990X back in time and break all the WRs back in 1970!


Non-troll part:
Roughly a lot... it depends on idle versus load, and on the overclock.
The Gulftowns (and I'm sure Adam has a B2 stepping) have a feature that can disable unused cores at idle.
So technically it can be just one 32nm core in use, with the other five disabled, for a lower power draw.
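
If you're on Linux and curious whether the cores really do get parked, here's a rough sketch that dumps per-thread idle-state residency from sysfs. It assumes the kernel's cpuidle interface is present (paths and state names vary by kernel), and it's just an eyeball check, not a power meter:

    /* Print how long each logical CPU has spent in each idle state.
       A parked core shows nearly all its time in the deepest C-state. */
    #include <stdio.h>

    int main(void)
    {
        char path[128], name[32];
        unsigned long long usec;

        for (int cpu = 0; cpu < 12; cpu++) {      /* 6 cores + HT on a 990X */
            for (int state = 0; ; state++) {
                snprintf(path, sizeof path,
                         "/sys/devices/system/cpu/cpu%d/cpuidle/state%d/name",
                         cpu, state);
                FILE *f = fopen(path, "r");
                if (!f) break;                    /* no more states for this CPU */
                if (fscanf(f, "%31s", name) != 1) name[0] = '\0';
                fclose(f);

                snprintf(path, sizeof path,
                         "/sys/devices/system/cpu/cpu%d/cpuidle/state%d/time",
                         cpu, state);
                f = fopen(path, "r");
                if (!f) break;
                if (fscanf(f, "%llu", &usec) != 1) usec = 0;
                fclose(f);

                printf("cpu%-2d %-10s %12llu us\n", cpu, name, usec);
            }
        }
        return 0;
    }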

But you guys are really funny.
You do know the 990X I am on clocks higher than the average 2600K, with 2 extra cores, right?

Replacement? I think not...
 