2600k 3GPU vs 3930k 2GPU


omeds

Senior member
Dec 14, 2011
Interesting thread guys. I made a new thread without seeing this one, so I'll repost here:

Hey guys, long time lurker first time poster :p

I have a dilemma. Long story short: I had an X58 system with Tri-SLI 580s, the mobo got damaged, so I sold the CPU in favour of building a new system.

What to build for a gaming rig? A Z68 system with a 2600K/2700K, or an X79 system with a 3930K?

Money is a concern, but so is performance; I don't want to gimp my 3 GPUs too badly.

Basically I can build a Z68 system (Maximus IV Extreme-Z + 2600K) for just over HALF the cost of an X79 system (Rampage IV Formula + 3930K).

My concern is PCIe lanes and how they will affect performance. I only play on a single display, either 1080p @ 120Hz or 2560x1440, so I'm hoping the PCIe lanes won't affect performance too much the way they possibly would with an uber-high-res surround setup.

Halp pls!

The Z68 board in question runs the 3 GPUs at 8+16+16 thanks to the NF200. I have been looking at some benchmarks, and a Tri-SLI Z68 system outperforms a Tri-SLI X58 system despite the latter having more native PCIe lanes; both CPUs were clocked to 4GHz and the X58 system had a 200 bclk and triple-channel memory. I suspect the same would hold true against the X79 platform too?

SB generally clocks better than SB-E from what I've seen, at least on modest cooling (H100 or less), and I feel that on a single display it may be the better performer in most games by virtue of faster core speed. In surround-type setups you might want the extra PCIe bandwidth, but for a single display I don't think it's needed as much, or at least I'm hoping so, as it will save me $500!
 

Concillian

Diamond Member
May 26, 2004
I think the answer to that question is to wait for the 4-core LGA2011 CPUs to come out. They will be the right answer for a Tri-SLI setup that doesn't need more than a 4C/8T CPU (which I believe covers just about every game known to man).
 

RussianSensation

Elite Member
Sep 5, 2003
Concillian said:
I think the answer to that question is to wait for the 4 core LGA2011 CPUs to come out. They will be the right answer to Tri SLI that doesn't need more than a 4C / 8T CPU (which I believe is just about every game known to man.)

Well, the quad-core chip won't be out for a while, so the alternative also has to be weighed: the Z77 chipset with native PCIe 3.0 support. It only has x16 lanes, but those lanes operate at PCIe 3.0 speed (2x faster), so the PCIe bottleneck pretty much becomes a non-issue. On top of that, Ivy Bridge will have better overclocking and better IPC than SB-E, making it faster for games.
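As a back-of-envelope check on the "2x faster" claim, here is a quick sketch (the per-lane transfer rates and line encodings are standard PCIe 2.0/3.0 figures, not numbers from this thread). The gist: eight PCIe 3.0 lanes move roughly as many bytes as sixteen PCIe 2.0 lanes.

```python
# Rough usable per-lane PCIe bandwidth, ignoring packet/protocol overhead
# beyond the physical-layer line coding.
def lane_bw_mb_s(gt_per_s, coding_efficiency):
    """Usable MB/s per lane: transfer rate * encoding efficiency / 8 bits per byte."""
    return gt_per_s * 1e9 * coding_efficiency / 8 / 1e6

pcie2 = lane_bw_mb_s(5.0, 8 / 10)     # PCIe 2.0: 5 GT/s with 8b/10b encoding
pcie3 = lane_bw_mb_s(8.0, 128 / 130)  # PCIe 3.0: 8 GT/s with 128b/130b encoding

print(f"PCIe 2.0 x16: {pcie2 * 16:.0f} MB/s")  # 8000 MB/s
print(f"PCIe 3.0 x8:  {pcie3 * 8:.0f} MB/s")   # 7877 MB/s
```

So a Z77 board splitting its sixteen 3.0 lanes as x8/x8 gives each of two cards close to the bandwidth of a full 2.0 x16 slot.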

[Chart: Cinebench single-threaded results]

[Chart: Cinebench multi-threaded results]

Source

If $ is a concern, then for gaming a 2500K + Z68 board with NF200 is the best bang for the buck, followed by the 2600K. HardOCP benchmarks show that the 2600K with Z68 and Tri-SLI had no problem dominating the X58 platform, despite fewer native PCIe lanes. So if PCIe were an issue, how could that happen?

The biggest limitation for 3x GTX 580s is actually the CPU clock speed of modern SB processors. GTX 580s are very powerful and benefit from all the CPU speed you can throw at them. Most 3930Ks don't overclock past 4.5-4.6GHz, with very few hitting 4.7GHz, whereas a lot more 2600Ks hit 4.8-5.0GHz. Once IVB is introduced, the X79 platform will be an even harder sell imho. We might be looking at IVB CPUs hitting 5.2-5.3GHz, and on top of that an average 6% increase in IPC, or more.
 

omeds

Senior member
Dec 14, 2011
In the end I went for a Maximus IV Extreme P67 board ($100 cheaper than the Z68 version) and a 2700K (on sale for the 2600K price). I know the board doesn't support PCIe 3.0, but maybe I'll sell it once IVB arrives, if needed.
 

tweakboy

Diamond Member
Jan 3, 2010
www.hammiestudios.com
I've said this before: the onboard GPU is only useful for big companies and basic users, but it will never equal the performance of an actual video card, which is why it's disabled once the system detects a video card... pointless, in a way...