What are the downsides to SLI?

PCJake

Senior member
Apr 4, 2008
319
0
0
I just got my GTX 480 today, and I love it, and it made me start thinking about SLI. I wouldn't get another one for a while (I'd have to get a new motherboard, CPU, RAM, and PSU to do it), but some of the reviews I've read claim excellent SLI scaling with the GTX 480 (100% in some cases). What are the major drawbacks to going SLI (besides cost and power)?
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Yeah, I would suggest an overclocked i7/i5 CPU if you want to get the best scaling with a GTX 480 SLI setup. And of course a 1000 watt PSU to be safe.

I think SLI has come a long way, but there are still some micro stutter issues with any multi GPU setup.
 

PCJake

Senior member
Apr 4, 2008
319
0
0
toyota said:
Yeah, I would suggest an overclocked i7/i5 CPU if you want to get the best scaling with a GTX 480 SLI setup. And of course a 1000 watt PSU to be safe.

I think SLI has come a long way, but there are still some micro stutter issues with any multi GPU setup.

Actually, micro-stuttering is the main thing I was curious about. My understanding of it is limited, but is it correct to say that micro-stuttering disappears if a game's FPS gets above your monitor's refresh rate (which would usually happen with 480s in SLI except maybe in Metro 2033 and Crysis)?
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
PCJake said:
Actually, micro-stuttering is the main thing I was curious about. My understanding of it is limited, but is it correct to say that micro-stuttering disappears if a game's FPS gets above your monitor's refresh rate (which would usually happen with 480s in SLI except maybe in Metro 2033 and Crysis)?

Microstutter will never disappear with a multi-GPU setup using AFR; its effect can only be lessened. Running with Vsync and triple buffering is the best bet to limit microstutter, or perhaps using a different rendering mode (split frame, supertiling, etc.). I got used to it on my 4870X2, which I'd run with Vsync and triple buffering, and it didn't bother me. However, later on when I threw in a single card (GTS250) I absolutely noticed the difference, as the single GPU was 'smoother' at the same framerate. So if you don't have a side by side comparison it probably won't bother you, and you'll get used to it.
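To make the AFR point concrete, here's a minimal sketch (Python, with made-up frame intervals rather than measurements from real cards) of why an AFR pair can report the same average framerate as a single GPU and still feel less smooth:

[code]
# Illustrative only: alternate-frame rendering (AFR) tends to deliver
# frames in short/long pairs. The intervals below are invented.

single_gpu = [33.3] * 8      # one GPU: a frame every 33.3 ms (~30 fps)
afr_pair = [8.0, 58.6] * 4   # AFR pair: same ~30 fps average, but the
                             # presentation intervals alternate

def describe(name, intervals_ms):
    avg = sum(intervals_ms) / len(intervals_ms)
    spread = max(intervals_ms) - min(intervals_ms)
    print(f"{name}: {1000.0 / avg:.1f} fps average, "
          f"frame-to-frame spread {spread:.1f} ms")

describe("single GPU", single_gpu)
describe("AFR pair", afr_pair)
[/code]

Both lines print ~30 fps, but the AFR timeline delivers frames in an uneven short/long cadence - that cadence is the microstutter. Higher framerates shrink the gaps, which is why it becomes less noticeable, but the alternation itself never goes away.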
 

PCJake

Senior member
Apr 4, 2008
319
0
0
Sylvanas said:
Microstutter will never disappear with a multi-GPU setup using AFR; its effect can only be lessened. Running with Vsync and triple buffering is the best bet to limit microstutter, or perhaps using a different rendering mode (split frame, supertiling, etc.). I got used to it on my 4870X2, which I'd run with Vsync and triple buffering, and it didn't bother me. However, later on when I threw in a single card (GTS250) I absolutely noticed the difference, as the single GPU was 'smoother' at the same framerate. So if you don't have a side by side comparison it probably won't bother you, and you'll get used to it.

That makes sense. I hate using Vsync, though (even with triple buffering), because it introduces noticeable mouse lag in every game I play. Would split frame be a feasible option (is it tricky to enable in games)? I don't mind tearing as much as I mind mouse lag. And I've never heard of supertiling; is that a good option?
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
PCJake said:
That makes sense. I hate using Vsync, though (even with triple buffering), because it introduces noticeable mouse lag in every game I play. Would split frame be a feasible option (is it tricky to enable in games)? I don't mind tearing as much as I mind mouse lag. And I've never heard of supertiling; is that a good option?

Have a look through this thread and this thread for some good info on the different rendering modes for multi-GPU. evolucion8 has already done some testing, with some interesting results between the different modes. I'm not too sure about Nvidia and supertiling; I think they may call it something different (it's not mentioned on Wiki: SLI). There's probably not much you can do to get around the input lag, but then again the microstutter may not bother you that much (or you'll get used to it), so you wouldn't want Vsync in the first place.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
You won't get double the performance, just know that much. SLI is something like 25 percent more power...

25%? It's usually much higher than that and can approach 90% or more in very GPU intensive games, especially at high resolutions.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I don't think the P35 chipset supports SLI (only Crossfire). The only Intel chipsets you can run SLI on are X58 and newer.

To run your Q6600 with SLI'ed cards you need either nForce 780i or 790i. The main differences between those are that 780i is DDR2 and 790i is DDR3, and that 780i is basically 680i plus a PCIe 2.0 bridge chip while 790i is a single chip. The reason 680i is not compatible is that they re-worked the voltage regulators on 780i to meet quad core specs.

edit: It's been a while since I've even thought about socket 775/SLI, so take all of this as "if I recall correctly".
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Currently, Crossfire and SLI scale so well that sometimes they can even double the performance.
 

PCJake

Senior member
Apr 4, 2008
319
0
0
nitromullet said:
I don't think the P35 chipset supports SLI (only Crossfire). The only Intel chipsets you can run SLI on are X58 and newer.

To run your Q6600 with SLI'ed cards you need either nForce 780i or 790i. The main differences between those are that 780i is DDR2 and 790i is DDR3, and that 780i is basically 680i plus a PCIe 2.0 bridge chip while 790i is a single chip. The reason 680i is not compatible is that they re-worked the voltage regulators on 780i to meet quad core specs.

edit: It's been a while since I've even thought about socket 775/SLI, so take all of this as "if I recall correctly".

Yeah, that all sounds right, but I wouldn't mess around with getting another 775 board. If I were to upgrade in the next few months I would probably go with an ASUS P6X58D Premium and an i7 930. I don't know what all Intel is planning for upcoming processors, though. Will LGA 1366 be obsolete before long, anyone?
 

Puffnstuff

Lifer
Mar 9, 2005
16,187
4,871
136
Well, in Just Cause 2 I get 100% scaling, as in adding the second card doubled my fps.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Sylvanas said:
Microstutter will never disappear with a multi-GPU setup using AFR; its effect can only be lessened. Running with Vsync and triple buffering is the best bet to limit microstutter, or perhaps using a different rendering mode (split frame, supertiling, etc.). I got used to it on my 4870X2, which I'd run with Vsync and triple buffering, and it didn't bother me. However, later on when I threw in a single card (GTS250) I absolutely noticed the difference, as the single GPU was 'smoother' at the same framerate. So if you don't have a side by side comparison it probably won't bother you, and you'll get used to it.

Ditto, my thoughts exactly. FYI, I'm a long time dual GPU user too.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
PCJake said:
Yeah, that all sounds right, but I wouldn't mess around with getting another 775 board. If I were to upgrade in the next few months I would probably go with an ASUS P6X58D Premium and an i7 930. I don't know what all Intel is planning for upcoming processors, though. Will LGA 1366 be obsolete before long, anyone?

that would be a sweet setup :thumbsup:

IIRC, LGA 1366 is here to stay until Q4 of 2011.
 

Friendo

Banned
Nov 24, 2009
121
0
0
PCJake said:
I just got my GTX 480 today, and I love it, and it made me start thinking about SLI. I wouldn't get another one for a while (I'd have to get a new motherboard, CPU, RAM, and PSU to do it), but some of the reviews I've read claim excellent SLI scaling with the GTX 480 (100% in some cases). What are the major drawbacks to going SLI (besides cost and power)?

If the game/driver doesn't support SLI, you can end up getting slower performance compared to a single card setup, especially if your mobo isn't PCIe x16/x16, and idle will chew up more watts.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Like others said, SLI is game dependent. Regardless of what Nvidia marketing will tell you, it's not supported in lots of games, modern and old. In fact, in some games you have to disable SLI to get good performance.

SLI/Crossfire is simply not worth it. Never has been. The only time a multi-card setup worked well was with Voodoo cards back in the day. But with today's powerhouse machines, 90% of games are fine even on midrange cards.

The cost vs. performance gain is just not there, unless you plan on dropping roughly $1000 on a setup for ONE game you play all the time.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
imaheadcase said:
Like others said, SLI is game dependent. Regardless of what Nvidia marketing will tell you, it's not supported in lots of games, modern and old. In fact, in some games you have to disable SLI to get good performance.

SLI/Crossfire is simply not worth it. Never has been. The only time a multi-card setup worked well was with Voodoo cards back in the day. But with today's powerhouse machines, 90% of games are fine even on midrange cards.

The cost vs. performance gain is just not there, unless you plan on dropping roughly $1000 on a setup for ONE game you play all the time.

Bullshit.

The only thing where you do have a point is older games. Even though there are SLI profiles for LOTS of games, new drivers sometimes tend to break older games.

But with newer games SLI will scale quite nicely, and so will CrossFire. That's because driver teams know those games get benchmarked, but that doesn't matter much IMO. Also, older games, like say Modern Warfare, will run fine even if SLI or CrossFire isn't working.

Personally, I don't notice any micro-stuttering, and I've been running both SLI and CF rigs for over a year now. Scaling has been quite good if not perfect in most games, but that's probably also because I usually only play the latest games.

The only new, graphically demanding game I know SLI/CF sucks in is Assassin's Creed 2.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
imaheadcase said:
Like others said, SLI is game dependent. Regardless of what Nvidia marketing will tell you, it's not supported in lots of games, modern and old. In fact, in some games you have to disable SLI to get good performance.

SLI/Crossfire is simply not worth it. Never has been. The only time a multi-card setup worked well was with Voodoo cards back in the day. But with today's powerhouse machines, 90% of games are fine even on midrange cards.

The cost vs. performance gain is just not there, unless you plan on dropping roughly $1000 on a setup for ONE game you play all the time.

Yeah, that's wrong, especially if you're not the type to buy high end cards. Two cheaper cards can get you great performance for the dollar.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
BD231 said:
Yeah, that's wrong, especially if you're not the type to buy high end cards. Two cheaper cards can get you great performance for the dollar.
And the quality is cheaper too. Never buy into a multi-GPU setup when there's a single GPU setup that can give you similar performance.
 

darXoul

Senior member
Jan 15, 2004
702
0
0
Drawbacks of SLI, besides cost, power, and potentially (when on air) noise?

1. Relying on driver releases or profiles to get good scaling in some games. Usually, scaling is very good, sometimes close to 100%. In some games, SLI scaling doesn't come with the game "out of the box"; you have to wait for a driver or patch. The worst case scenario is that the game never really supports SLI and provides no or very lousy scaling. That's pretty rare, though.

2. Random problems or constraints in some games that don't happen on a single GPU. Flickering textures/shadows or alt-tabbing issues are examples - usually patches take care of those, but not always. Examples: flickering HDR sky in Fallout 3, no Bokeh filter in JC2. Most of these issues are not deal breakers, but they can be annoying.

3. Microstuttering and input lag. Both are IMO blown out of proportion, but they do exist. In games like Crysis or Metro 2033, where even SLI systems don't offer stellar fps in eye candy high res modes, games can often feel smoother on a single GPU despite the lower frame rate (there's a rough sketch of how to measure this after the summary below).

In a nutshell:

A. Modern SLI setups are fast, and scaling is usually great.
B. IMO, it's only reasonable to buy a high-end SLI setup (like 480 SLI) if you can take the power consumption, heat, and possibly noise. Buying midrange SLI setups doesn't make sense; I'd take a single GPU high end card any day, even if it's slightly slower.
C. SLI isn't, and probably never will be, as robust and reliable as a single GPU card. Problems are usually minor and rare, but they're present.
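To put rough numbers on point 3, here's a small Python sketch of how you could check a frame-time log for microstutter. The log format (one timestamp in milliseconds per frame, the kind of dump tools like FRAPS can produce) and all the numbers are assumptions for the example:

[code]
# Check a per-frame timestamp log (milliseconds) for microstutter:
# even pacing gives adjacent frame-time ratios near 1.0, AFR-style
# stutter shows up as ratios well above that.

def stutter_report(name, timestamps_ms):
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg = sum(deltas) / len(deltas)
    ratios = [max(x, y) / min(x, y) for x, y in zip(deltas, deltas[1:])]
    print(f"{name}: {1000.0 / avg:.1f} fps, "
          f"worst adjacent frame-time ratio {max(ratios):.2f}")

# Invented numbers: a single GPU at ~30 fps with even pacing versus an
# AFR setup at ~40 fps whose frames arrive in 40 ms / 10 ms pairs.
single = [i * 33.3 for i in range(20)]
afr, t = [], 0.0
for i in range(20):
    t += 40.0 if i % 2 == 0 else 10.0
    afr.append(t)

stutter_report("single GPU", single)
stutter_report("AFR pair", afr)
[/code]

The AFR run wins on fps but loses badly on pacing, which is exactly the "feels smoother on a single GPU despite lower frame rate" effect.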
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
imaheadcase said:
Like others said, SLI is game dependent. Regardless of what Nvidia marketing will tell you, it's not supported in lots of games, modern and old. In fact, in some games you have to disable SLI to get good performance.

SLI/Crossfire is simply not worth it. Never has been. The only time a multi-card setup worked well was with Voodoo cards back in the day. But with today's powerhouse machines, 90% of games are fine even on midrange cards.

The cost vs. performance gain is just not there, unless you plan on dropping roughly $1000 on a setup for ONE game you play all the time.

What a LOAD. rofl.

I've never had to disable SLI to get good performance, kid. The global setting for applications without a profile is SINGLE GPU rendering, which means SLI is not working and you are rendering with one card. And this almost never happens, because Nvidia has profiles for almost every game, and on top of that you can create profiles for new games in nHancer, using the profile for a game on the same engine as a baseline guide for compatibility flags.

Second, it's not true that it's not worth it. It all depends on what resolution you want to run and what kind of AA filtering you want to apply. Multi GPU becomes more and more valuable the further you move up in res and AA. The whole point of combining multiple GPUs is to reach the 60 fps you want to vsync with your display. If one card does the job for your resolution and AA, then multi GPU is not needed.

And I don't agree with BD231 either: buying two slower cards as opposed to a single high end card (at the same price level) is stupid; it's less power efficient, and there's more chance of poor performance due to scaling issues. For example, buying two 5830s or 5770s instead of one 5870 is a stupid, stupid idea. You only combine GPUs when the best single GPU card can't provide enough performance.


darXoul said:
Drawbacks of SLI, besides cost, power, and potentially (when on air) noise?

1. Relying on driver releases or profiles to get good scaling in some games. Usually, scaling is very good, sometimes close to 100%. In some games, SLI scaling doesn't come with the game "out of the box"; you have to wait for a driver or patch. The worst case scenario is that the game never really supports SLI and provides no or very lousy scaling. That's pretty rare, though.

2. Random problems or constraints in some games that don't happen on a single GPU. Flickering textures/shadows or alt-tabbing issues are examples - usually patches take care of those, but not always. Examples: flickering HDR sky in Fallout 3, no Bokeh filter in JC2. Most of these issues are not deal breakers, but they can be annoying.

3. Microstuttering and input lag. Both are IMO blown out of proportion, but they do exist. In games like Crysis or Metro 2033, where even SLI systems don't offer stellar fps in eye candy high res modes, games can often feel smoother on a single GPU despite the lower frame rate.

In a nutshell:

A. Modern SLI setups are fast, and scaling is usually great.
B. IMO, it's only reasonable to buy a high-end SLI setup (like 480 SLI) if you can take the power consumption, heat, and possibly noise. Buying midrange SLI setups doesn't make sense; I'd take a single GPU high end card any day, even if it's slightly slower.
C. SLI isn't, and probably never will be, as robust and reliable as a single GPU card. Problems are usually minor and rare, but they're present.

I just want to say that this is well written.

However, microstuttering does exist, even on a single GPU, and it's a huge problem that requires vsync to fix. Unfortunately, that brings input lag, and we need a way to resolve it. Frame limiting to 1 frame short of your refresh rate (i.e. 59 fps for 60 Hz) eliminates the input lag; it would be great if Nvidia and ATI gave an option to cap the frame rate at the driver level.
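For what it's worth, the frame-limiting idea is simple enough to sketch. There's no driver-level cap, so a limiter has to live in (or hook) the application's render loop; this standalone Python loop just shows the timing logic, with the 59 fps target from above and a dummy render function standing in for a real game:

[code]
# Sketch of a 59 fps frame limiter: render, then sleep off whatever is
# left of the frame's time budget. Keeping the cap just under the 60 Hz
# refresh stops the render queue from backing up, which is the claimed
# input-lag benefit.

import time

TARGET_FPS = 59.0
FRAME_BUDGET = 1.0 / TARGET_FPS        # ~16.9 ms per frame

def render_frame():
    time.sleep(0.005)                  # stand-in for the game's real work

deadline = time.perf_counter()
for _ in range(120):                   # ~two seconds' worth of frames
    render_frame()
    deadline += FRAME_BUDGET
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)          # frame finished early: wait it out
    else:
        deadline = time.perf_counter() # frame ran long: don't try to catch up
[/code]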
 