Core 2 Quad: Pair with P35, 680i, or wait for newer SLI chipset?

CZroe

Lifer
Jun 24, 2001
24,195
856
126
My AN8SLI Athlon X2 system was never stable, but I'd hate to leave the dual 7800GT for a single 8800GTS just because the CPU was a steal and the SLI motherboards are inferior.

I wasn't really in the market, but I couldn't pass up an Intel Core 2 Quad + ECS/VIA mobo for $300. I went ahead and got two "Superclocked" 8800GTS 320MB cards at $300 each to go with it.

I've been led to believe that the performance advantage of the P35 chipset motherboards is significant enough to ignore nVidia's SLI boards. In that case, I intend to use the 8800GTS cards in my older A64 X2 3800+ system until game performance demonstrates that a better CPU is needed; then, if a decent SLI board is available, I'll move them to the Core 2 Quad system. Until then, I'll use the C2Q for my heavy encoding tasks.

Even then, I'm torn about pairing it with a P35 only to have to buy an SLI board later. Am I crazy if I want to run it in the ECS board for a few months? Does the other plan sound crazy too (IOW, should I go with the 680i)? What do you guys suggest?

nVidia really needs to open SLI up to other manufacturers if they hope to sell extra video cards when they have an inferior motherboard chipset!
 

theYipster

Member
Nov 16, 2005
137
0
0
P35 boards tend to support a higher max FSB. Unless you're an extreme overclocker, this advantage will only play out on cheaper Core 2 Duos with lower max multipliers. A 680i board should have absolutely no trouble hitting a 400MHz FSB, which is enough to get to 3.6GHz on an E6600/Q6600 with the max multiplier (and a Q6600 isn't going to reach 3.6GHz unless an advanced cooling solution is used).
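For reference, the math is just FSB times multiplier (the E6600 and Q6600 both run a 9x multiplier, i.e. 266MHz x 9 = ~2.4GHz at stock):

core clock = FSB clock x CPU multiplier
400MHz x 9 = 3,600MHz = 3.6GHz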

I by no means believe that the FSB advantage is greater than the SLI advantage. On the other hand, all things being equal, a higher-clocked FSB will always perform better than a lower-clocked one, yet having SLI may not always be better than not having it. Better advice would be to determine whether or not you would benefit from a modern SLI setup. For this, I go by two criteria: monitor resolution and willingness to purchase high-end cards. If you game at 1920x1200 or higher and are willing to buy two 8800 GTX cards, then yes, SLI is the way to go. Otherwise, the best high-end card you can afford will provide a better solution. SLI really only kicks into gear at high resolutions, and even if you meet that first criterion, the additional complexity and heat output usually isn't worth the small performance gain over the single next-best card (others may say there is a benefit to SLI'd 8800 GTS 640s, but I would disagree).

Hope this clears things up,

Mark.
 

cyman83

Member
Jun 8, 2003
32
0
0
Hey man, I jumped in on the same deal yesterday! But sorry to say, that ECS motherboard is not compatible with quad cores. I'm assuming Fry's wasn't aware that it's not compatible.
 

CZroe

Lifer
Jun 24, 2001
24,195
856
126
Originally posted by: theYipster
P35 boards tend to support a higher max FSB. Unless you're an extreme overclocker, this advantage will only play out on cheaper Core 2 Duos with lower max multipliers. A 680i board should have absolutely no trouble hitting a 400MHz FSB, which is enough to get to 3.6GHz on an E6600/Q6600 with the max multiplier (and a Q6600 isn't going to reach 3.6GHz unless an advanced cooling solution is used).

I by no means believe that the FSB advantage is greater than the SLI advantage. On the other hand, all things being equal, a higher-clocked FSB will always perform better than a lower-clocked one, yet having SLI may not always be better than not having it. Better advice would be to determine whether or not you would benefit from a modern SLI setup. For this, I go by two criteria: monitor resolution and willingness to purchase high-end cards. If you game at 1920x1200 or higher and are willing to buy two 8800 GTX cards, then yes, SLI is the way to go. Otherwise, the best high-end card you can afford will provide a better solution. SLI really only kicks into gear at high resolutions, and even if you meet that first criterion, the additional complexity and heat output usually isn't worth the small performance gain over the single next-best card (others may say there is a benefit to SLI'd 8800 GTS 640s, but I would disagree).

Hope this clears things up,

Mark.

Are you saying that an SLI 8800GTS 320MB setup is not worth the $580 I already paid? My intended displays are 1680x1050 (Dell 2007WFP) and 16:9 1080p (Sony 52" XBR LCD HDTV). Please let me know while I can still return these.

I intend to play stuff like Doom 3, Quake 4, Enemy Territory: Quake Wars, Lost Planet, Halo 2, Shadowrun, the Half-Life 2 episodes, Resident Evil 4, and a whole lot of older console ports in an HTPC/desktop PC hybrid environment. I also expect to be doing a lot of Blu-ray encoding and decoding work (I'm aware that the hardware decode support on the lower-end models would benefit that).

Originally posted by: cyman83
Hey man, I jumped in on the same deal yesterday! But sorry to say, that ECS motherboard is not compatible with quad cores. I'm assuming Fry's wasn't aware that it's not compatible.

Are you talking about my thread? Fry's said that it was compatible and that they are in contact with ECS to get the website updated.
 

theYipster

Member
Nov 16, 2005
137
0
0
An 8800 GTS 320 SLI setup is a bad idea and a waste of money. Return both cards and, for $80 less, get an 8800 GTX, which will be a great deal faster and will last much longer. In fact, at 1680x1050, all you really need is a single 8800 GTS 640meg, which will be as fast in the pixel-pushing department as the 320 SLI setup but will last longer (and perform better) due to the increased video memory of the single card (and you'll save several hundred $$$). With either non-SLI setup, you'll also benefit from a simpler one-card setup that produces less heat.

The 8800 GTS 320MB is specifically limited in SLI due to its low video memory. Keep in mind that having two cards does not equal having one card that's twice as powerful (or that has twice the amount of RAM); in SLI, each card keeps its own copy of the textures and frame data, so two 320MB cards still effectively give you 320MB to work with. As newer DirectX 10 games come out, you'll quickly find that the 320MB limit will hinder your ability to play with high settings, despite having two cards in tow. Also, as I mentioned above, SLI in general is really only worth it if you are gaming at high res (1920x1200 or above) and are willing to buy two of the fastest cards available (thus increasing your video processing power beyond what is possible with a single card). In all other cases, you'll be better off with a better single-card solution for less money.

Mark.
 

CZroe

Lifer
Jun 24, 2001
24,195
856
126
Thanks a lot. That really helps push me toward a DDR2 P35 board. Other than my laptop, which shipped with 2GB, this'll be my first DDR2 system, and yet there are already DDR3 P35 boards. :D The DDR2 was cheap, performance vs. price seems to favor DDR2, and there's no money for DDR3 anyway...

BTW, I noticed a substantial difference in benchmark numbers between the standard GTS and the GTS 320. Is there a pipeline or two disabled in the 320MB version? Memory capacity can't be the only factor with the games being tested today.

"all you really need is a single 8800 GTS 640meg"
Oh, and just to clarify, did you mean a single GTS 640meg or a single GTX 640meg card?
 

theYipster

Member
Nov 16, 2005
137
0
0
I'm not sure of the pipeline differences... I think memory may be the only difference. However, it can easily make a huge difference given how intricate textures are in games today. Regardless of the other specs (# of stream processors, clock speed, memory speed, fill rate, etc.), if there isn't enough memory to handle the processing, things are going to slow down.

EDIT: to answer your last question.

There are four different 8800 cards available: the 8800 GTS 320MB, the 8800 GTS 640MB, the 8800 GTX 768MB, and the 8800 Ultra (which is a slightly higher-clocked 8800 GTX 768MB for a lot more money). I was referring in my last post to the GTS 640MB, as there isn't a GTX 640MB.

Mark.