SLI Question...

AmongthechosenX

Senior member
Aug 10, 2008
468
0
76
Basically, all I'm asking is whether the CPU becomes the bottleneck when running SLI at high resolutions.

The CPUs in question are:
E6600 overclocked to 3.6GHz
E8400 overclocked to 4GHz

Would there be a difference, when using SLI, between 4GHz and 6MB of L2 cache (e.g. E8400 OC'd) versus 3.6GHz and 4MB of L2 cache (e.g. E6600 OC'd), when running at a resolution of 1920x1200?
 

geoffry

Senior member
Sep 3, 2007
599
0
76
I would say the difference would be relatively small. A 3.6GHz 65nm C2D is still a VERY nice gaming processor.
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
It still depends on the individual game/app in question, and on the GPUs running in SLI.

It would be easier to answer if the OP were specific, but in general most modern games at 1920x1200 (high-ultra detail) are still GPU limited with 2x 8800GT in SLI, in which case the CPU difference is minimal. However, with 2x GTX 280 in SLI, the CPU is more likely to make a difference.
 

AmongthechosenX

Senior member
Aug 10, 2008
468
0
76
Well, I currently have 1x 8800GTX. No OC for now (driver/stability issues).

I'm planning on SLI'ing that card. 2x 8800GTXs are about 30% faster than a single GTX 280, and the second card costs much less than half the price of a GTX 280. Both cards have 768MB of RAM on board.

I'm currently running an E2180 at 3.2GHz. I'd love to upgrade to an E8400, but I don't want to drop more money into hardware (I'd rather save for a nice 24" TN LCD, and then a second 8800GTX). I just want to make sure the CPU isn't going to be a SEVERE bottleneck.
The E6600 I'd be getting off a trade, and it would only cost me about $60.

I'd be gaming at 1920x1200 Resolution.
 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
I can tell you I've seen virtually no difference when using a Q6600 @ 3.6GHz vs. an E8400 @ 4GHz with two G92 8800GTS 512MB cards in SLI.

I actually debate with myself all the time about which CPU I should keep in my "main gaming rig".
It's currently the E8400, but there are times when I think about swapping the Q6600 back in.

And I think you're making the smart move going 8800GTX SLI; they have enough VRAM and it's cheaper than getting a GTX 280.

Personally, I'm moving to a single GTX 280 because the 512MB of VRAM on my cards is causing stuttering when textures load in games like Oblivion and Fallout 3.

My two 8800GTS 512MB cards do bench faster than the GTX 280 (just look at the many reviews).
But that doesn't mean squat when the reviews leave out the fact that the card with more VRAM may not have a higher average FPS, but it definitely gives a smoother gameplay experience without the framerate being all over the place.

For example, my wife has an 8800GTX OC'd to Ultra speeds; it benches slower than my setup in Fallout 3, but her FPS is rock steady while playing, while mine is all over the place.
I almost stole that 8800GTX back from her, then decided just to move on to a GTX 280 and leave hers be. ;)


 

AmongthechosenX

Senior member
Aug 10, 2008
468
0
76
Wow, thank you so much MTDEW, that's exactly what I was looking for! Even Google couldn't help me there! lol.

Did you ever run any benchmarks with your SLI setup to compare the E8400 in SLI versus the Q6600 in SLI?
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
Originally posted by: MTDEW
I actually debate with myself all the time about which CPU I should keep in my "main gaming rig".

LOL

Good MTDEW: Keep the E8400@4GHz for sweet gaming. :sun:
Evil MTDEW: No, swap in the Q6600 and see if you can push even further. :evil:

 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
Originally posted by: AmongthechosenX
Wow, thank you so much MTDEW, that's exactly what I was looking for! Even Google couldn't help me there! lol.

Did you ever run any benchmarks with your SLI setup to compare the E8400 in SLI versus the Q6600 in SLI?

I did when I made the swap, but I never kept the numbers since I was never expecting to share them; they were just personal notes.

And I remember that past 3.2GHz the CPU speed didn't really matter anymore; the gains from going higher were very minimal on both CPUs.

So I overclocked the cards instead for the speed boost. :D

Nothing really took advantage of the four cores over the two cores either, enough to make a difference.
I've read that Supreme Commander does, but I don't own that.
My games, which are mostly FPS, never benefited from the extra cores; if they used them at all, they just spread the workload out thinner across all four cores as opposed to loading the two cores more.

I kept the E8400 in here, but I never really could get myself to sell the Q6600, just in case games start making enough use of the extra cores.

Sadly, by the time that actually happens, the Q6600 will probably be too old to matter anyway.

So my Q6600 is in my "backup" rig with an 8800GT, and I use it to encode my videos and stuff too.

And before anyone asks, no, I won't sell my Q6600 for fear I may regret it later... LOL
Besides, it looks like my two 8800GTS cards are gonna need a new home when my GTX 280 gets here.

No wonder my light bill is always so high. :laugh:


EDIT: Oh yeah, I forgot to add: I'm gaming on a 24" too, so my rez is also 1920x1200, in case you were wondering.

 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
Originally posted by: betasub
Originally posted by: MTDEW
I actually debate with myself all the time about which CPU I should keep in my "main gaming rig".

LOL

Good MTDEW: Keep the E8400@4GHz for sweet gaming. :sun:
Evil MTDEW: No, swap in the Q6600 and see if you can push even further. :evil:

LOL, this Q6600 isn't going any higher unless there's frost on the case.
Believe me, I tried... LOL
 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
Originally posted by: AmongthechosenX
I didn't see any benchies in there :(
The links are in jaredspace's post.

Here.
http://www.pcgameshardware.com...CPU_benchmarks/?page=2

Now, I didn't see any difference in Crysis, Oblivion, STALKER, or any of the games I played at the time I switched.
But it seems Far Cry 2 likes CPUs with more cache, the more the better.
As for how many future games it will matter for... well, your guess is as good as mine.



Edited for really poor typing... LOL