What are the downsides to SLI?


tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
Wow guys, nice responses. I was obviously incorrect, as you guys know.

I was told before that I can do CrossFire since I have 2 PCIe slots.

But the problem is I don't want the big red, never trusted them and I don't trust their drivers. Shame really.

I had bad experiences with the X800 XT PE, which was 600 bones at that time. I mean, when it's working the card is as fast as Nvidia's, and the 59xx blows Nvidia away. I know nVidia has something up their sleeves. They might do a triple-GPU single card.
 

PCJake

Senior member
Apr 4, 2008
319
0
0
that would be a sweet setup :thumbsup:

IIRC, LGA 1366 is here to stay until Q4 of 2011.

Hmm... *evil chin rub*

Drawbacks of SLI, besides cost, power and potentially (when on air) noise?

1. Relying on driver releases or profiles to get good scaling in some games. Usually scaling is very good, sometimes close to 100%. In some games, SLI scaling doesn't come with the game "out of the box"; you have to wait for a driver or a patch. The worst-case scenario is that the game never really supports SLI and provides no or very lousy scaling. That's pretty rare, though.

2. Random problems or constraints in some games that don't happen on a single GPU, like flickering textures/shadows or alt-tabbing issues; usually patches take care of that, but not always. Examples: the flickering HDR sky in Fallout 3, no bokeh filter in JC2. Most of these issues are not deal breakers but can be annoying.

3. Microstuttering and input lag. Both are IMO blown out of proportion, but they do exist. In games like Crysis or Metro 2033, where even SLI systems don't offer stellar fps in eye-candy high-res modes, games can often feel smoother on a single GPU despite the lower frame rate (see the sketch below).
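To see why a high fps counter can still feel rough, compare an even frame cadence with an uneven one at the same average. A minimal sketch with hypothetical frame times (the 8/25 ms cadence is just an illustration of AFR at its worst, not measured data):

```python
# Illustrative sketch, hypothetical numbers: why an uneven frame cadence
# can "feel" slower than the fps counter suggests.

even = [16.6, 16.6] * 30    # evenly paced: ~60 fps
uneven = [8.0, 25.0] * 30   # AFR-style cadence: same ~60 fps average

def avg_fps(frame_times_ms):
    # what an fps counter reports: frames divided by elapsed time
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def worst_interval_fps(frame_times_ms):
    # crude proxy for perceived smoothness: the longest gap dominates
    return 1000 / max(frame_times_ms)

for name, ft in (("even", even), ("uneven", uneven)):
    print(f"{name}: counter {avg_fps(ft):.0f} fps, "
          f"worst-interval {worst_interval_fps(ft):.0f} fps")
```

Both cadences report ~60 fps, but the uneven one never delivers a gap shorter than 25 ms, which is 40 fps territory.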

In a nutshell:

A. Modern SLI setups are fast and scaling is usually great.
B. IMO, it's only reasonable to buy a high-end SLI setup (like 480 SLI) if you can take the power consumption, heat and possibly noise. Buying midrange SLI setups doesn't make sense; I'd take a single high-end GPU any day, even if it's slightly slower.
C. SLI is not, and probably never will be, as robust and reliable as a single GPU card. Problems are usually minor and rare, but they're present.

Thanks for this, I'll start doing some more research on microstuttering and input lag. It just makes sense in my mind to get another GTX 480 (after some upgrades to my system, of course) once the price has come down as opposed to waiting a couple of years for a new single card that could beat two of the 480s.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Hmm... *evil chin rub*



Thanks for this, I'll start doing some more research on microstuttering and input lag. It just makes sense in my mind to get another GTX 480 (after some upgrades to my system, of course) once the price has come down as opposed to waiting a couple of years for a new single card that could beat two of the 480s.

Bad idea. Microstutter is mostly hype; it got blown out of proportion. Do NOT look into it. The most you should do is check out an SLI rig from a friend, or at a shop, play a few games you usually play, and just play.

Chances are you won't notice a thing, and that's probably a 90% chance. If you consciously start looking for it, you might actually notice it at times. Me, I've never seen a thing, and I couldn't do without SLI/CF (running 5760x1080 requires a lot of horsepower).

Oh, and it won't take years for a video card to come out that's as fast as two GTX 480s. It probably won't even take a year. The moment ATI releases a 28nm GPU, it will most likely beat dual GTX 480s most of the time, save for some games where SLI scales exceptionally well. And if Nvidia catches up, I'm also quite sure their GPU will be just as quick.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Bad idea. Microstutter is mostly hype; it got blown out of proportion. Do NOT look into it. The most you should do is check out an SLI rig from a friend, or at a shop, play a few games you usually play, and just play.

Chances are you won't notice a thing, and that's probably a 90% chance. If you consciously start looking for it, you might actually notice it at times. Me, I've never seen a thing, and I couldn't do without SLI/CF (running 5760x1080 requires a lot of horsepower).

Oh, and it won't take years for a video card to come out that's as fast as two GTX 480s. It probably won't even take a year. The moment ATI releases a 28nm GPU, it will most likely beat dual GTX 480s most of the time, save for some games where SLI scales exceptionally well. And if Nvidia catches up, I'm also quite sure their GPU will be just as quick.


Microstutter is not hype. If you can't see it, that's just your insensitivity to it.
Go play any Source engine game or Call of Duty game, and tell me that while your fps counter says 80 it doesn't feel more like 40, I dare you. And then turn on vsync and watch how 60 fps all of a sudden is smoother than 80 fps. Just experiment for yourself.

And if you can't see the difference, then check frame times with FRAPS. Tell me that a 16.6-16.6 frame cadence is the same as an 8-25-8-25 cadence. For the love of god, the problem is there and it is massive, and I could tell right away if my vsync was turned off even if there wasn't any tearing.
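A few lines of script are enough to turn a FRAPS frametimes log into a cadence check. A minimal sketch, assuming a CSV with a header row and cumulative milliseconds in the second column (the file name is hypothetical):

```python
import csv

# Sketch: summarize the frame cadence from a FRAPS-style frametimes log.
# Assumes two columns (frame number, cumulative time in ms) and a header row.
with open("frametimes.csv") as f:      # hypothetical file name
    stamps = [float(row[1]) for row in list(csv.reader(f))[1:]]

intervals = [b - a for a, b in zip(stamps, stamps[1:])]
mean = sum(intervals) / len(intervals)
print(f"avg {mean:.1f} ms (~{1000 / mean:.0f} fps), "
      f"min {min(intervals):.1f} ms, max {max(intervals):.1f} ms")
# An even 16.6-16.6 cadence shows min ~= max; an 8-25-8-25 cadence shows
# the same average with a ~17 ms spread between min and max.
```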
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
There is no benefit to a CF/SLI setup other than the fact that it is faster than a single-card setup. In short, if you want better FPS than the fastest card can deliver, get more of them and go CF/SLI.

The downside is that doubling the price != doubling the performance. 2 cards = double the price, but games are not optimized for that kind of power, meaning power is wasted while still drawing electricity. Some games are not even optimized for a CF/SLI config at all, resulting in stutter or even a drop in performance. You will probably need to wait for a patch or two on new games before you see the extra performance.
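The cost arithmetic is easy to sanity-check. A sketch with made-up numbers; the price, baseline fps and 80% scaling figure are all assumptions, not measurements:

```python
# Hypothetical numbers: price per frame, single card vs. CF/SLI.
card_price = 500       # assumed price of one card, USD
single_fps = 60.0      # assumed fps on one card
scaling = 0.8          # assumed scaling: second card adds 80%, not 100%

dual_fps = single_fps * (1 + scaling)
print(f"single: {single_fps:.0f} fps, ${card_price / single_fps:.2f}/fps")
print(f"dual:   {dual_fps:.0f} fps, ${2 * card_price / dual_fps:.2f}/fps")
# Doubling the price buys ~1.8x the frames at best, so cost per frame rises.
```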
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
IMO anyone in the market for enthusiast-level hardware is far more likely to notice both input lag and microstutter than Joe Average. Yes, it may well be that only 1 in 10 people is sensitive to these issues - but high-end gamers aren't even 1 in 20.

The reason we go for low-latency network connections, high-refresh-rate displays and high-end video hardware is precisely because we can easily tell the difference between 30 fps and 60 fps. And even 80 fps to 120 fps. Heck, in the olden days I could reliably tell when error correction & compression was turned on with my 2400 baud modem just by the keystroke latency. And that latency increase per keystroke was about the same as a single frame on a 60 Hz monitor.
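For what it's worth, the arithmetic behind that comparison holds up, assuming 10 bits per character (8N1 framing):

```python
# Worked numbers for the modem-vs-frame comparison (8N1 framing assumed).
frame_ms = 1000 / 60               # one frame at 60 Hz
char_ms = 10 / 2400 * 1000         # one 10-bit character at 2400 baud
print(f"one 60 Hz frame:       {frame_ms:.1f} ms")
print(f"one char at 2400 baud: {char_ms:.1f} ms")
# Error correction/compression buffers data before sending, so if it adds
# on the order of 15-20 ms per keystroke, that is roughly one frame.
```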

Then again, some people claim not to see > 30 frames/sec and aren't bothered by several frames of input lag. For them, multi-GPU is a godsend.

To my eyes AFR looks bad. No better than a single GPU. But to a benchmark, whoa, you're pushing double the frames, man! Once SFR implementations become more the norm I'll consider multi-GPU.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Your main consideration these days in going SLI or CF with either company's top-end card is: do I own a 30" monitor, or in the case of ATI, am I going Eyefinity?

SLI 480 or CF 5870 is a waste under 2560x1600; one of either card is enough.

Spend the money on a bigger monitor first. I went 5870 CF and still had my 24", realized it was insane overkill, so I bought a 30" rather than selling the extra 5870. ^^
 

PCJake

Senior member
Apr 4, 2008
319
0
0
IMO anyone in the market for enthusiast-level hardware is far more likely to notice both input lag and microstutter than Joe Average. Yes, it may well be that only 1 in 10 people is sensitive to these issues - but high-end gamers aren't even 1 in 20.

The reason we go for low-latency network connections, high-refresh-rate displays and high-end video hardware is precisely because we can easily tell the difference between 30 fps and 60 fps. And even 80 fps to 120 fps. Heck, in the olden days I could reliably tell when error correction & compression was turned on with my 2400 baud modem just by the keystroke latency. And that latency increase per keystroke was about the same as a single frame on a 60 Hz monitor.

Then again, some people claim not to see > 30 frames/sec and aren't bothered by several frames of input lag. For them, multi-GPU is a godsend.

To my eyes AFR looks bad. No better than a single GPU. But to a benchmark, whoa, you're pushing double the frames, man! Once SFR implementations become more the norm I'll consider multi-GPU.

Well put, I'm definitely one of the people that would notice the input lag and microstuttering and be completely put off by it. Some of my friends (console gamers) think I'm crazy for spending hundreds of dollars on a new graphics card just because it bugs me when my frame rate drops below 60, even if it's only by one or two FPS, but there it is. I'm definitely going to put all of my focus into doing a CPU/motherboard upgrade that will unlock the full potential of the single GTX 480 I have now before I even consider getting another card.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Bullshit.

The only thing where you do have a point is older games. Even though there are SLI profiles for LOTS of games, with new drivers older games sometimes tend to get broken.

But with newer games, SLI will scale quite nicely, and so will CrossFire. That's because the driver teams know those games get benchmarked, but that doesn't matter much IMO. Also, older games, like let's say Modern Warfare, will run fine even if SLI or CrossFire isn't working.

Personally, I don't notice any micro-stuttering, and I've been running both SLI and CF rigs for over a year now. Scaling has been quite good if not perfect in most games, but that's probably also because I usually only play the latest games.

The only new, graphically demanding game I know SLI/CF sucks in is Assassin's Creed 2.


In those games where you double the FPS, do you notice any difference vs playing on one card? Nope. If you think you do, it's the placebo effect, aka bullshitting yourself.

Heck, even World of Warcraft, that huge game you might have heard of, has the SLI/CrossFire problem.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Microstutter is not hype. If you can't see it, that's just your insensitivity to it.
Go play any Source engine game or Call of Duty game, and tell me that while your fps counter says 80 it doesn't feel more like 40, I dare you. And then turn on vsync and watch how 60 fps all of a sudden is smoother than 80 fps. Just experiment for yourself.

And if you can't see the difference, then check frame times with FRAPS. Tell me that a 16.6-16.6 frame cadence is the same as an 8-25-8-25 cadence. For the love of god, the problem is there and it is massive, and I could tell right away if my vsync was turned off even if there wasn't any tearing.

I'm not saying it doesn't exist; I'm saying chances are he won't even notice it. I can't think of the right example, but it's exactly the same with many other things: you won't notice it unless you're explicitly looking for it, and even then a lot of people still won't see it.

This discussion isn't about differences in fps. I don't use vsync because it introduces mouse lag. Anyway, I could probably notice microstuttering if I looked for it. But I don't, and my dual-GPU setup isn't bothering me...

In those games where you double the FPS, do you notice any difference vs playing on one card? Nope. If you think you do, it's the placebo effect, aka bullshitting yourself.

Heck, even World of Warcraft, that huge game you might have heard of, has the SLI/CrossFire problem.

Sorry, I'm a bad example: I game at 5760x1080, which is 6 megapixels, 1.5x as much as 2560x1600. If I turn off CF on my HD 5970, games become unplayable.
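For reference, the pixel counts behind that 1.5x figure:

```python
# Pixel counts for the two resolutions mentioned above.
surround = 5760 * 1080     # triple 1920x1080
thirty = 2560 * 1600       # a 30" panel
print(f"{surround:,} vs {thirty:,} pixels -> {surround / thirty:.2f}x")
```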
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
I'm not saying it doesn't exist; I'm saying chances are he won't even notice it. I can't think of the right example, but it's exactly the same with many other things: you won't notice it unless you're explicitly looking for it, and even then a lot of people still won't see it.

This discussion isn't about differences in fps. I don't use vsync because it introduces mouse lag. Anyway, I could probably notice microstuttering if I looked for it. But I don't, and my dual-GPU setup isn't bothering me...



Sorry, I'm a bad example: I game at 5760x1080, which is 6 megapixels, 1.5x as much as 2560x1600. If I turn off CF on my HD 5970, games become unplayable.

There's a difference between 'not noticing it' and 'getting used to it'. I'd suggest you are probably part of the latter group. I had my 4870X2 for quite a while (I've also owned 3850 CrossFire), and I didn't notice it at first - all I saw was more fps in games over my old setup. However, upon playing the same games with a single GPU at the same framerate, there was an obvious difference.

In future I will not be using a multi-GPU setup, as I now know that my playing experience was sub-optimal with one. I may have been able to run at higher in-game settings, but after seeing the difference, it's not worth it with the microstutter.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
There's a difference between 'not noticing it' and 'getting used to it'. I'd suggest you are probably part of the latter group. I had my 4870X2 for quite a while (I've also owned 3850 CrossFire), and I didn't notice it at first - all I saw was more fps in games over my old setup. However, upon playing the same games with a single GPU at the same framerate, there was an obvious difference.

In future I will not be using a multi-GPU setup, as I now know that my playing experience was sub-optimal with one. I may have been able to run at higher in-game settings, but after seeing the difference, it's not worth it with the microstutter.


Exactly what he said. Microstutter is not something you see or can look for either; it's something you feel.

I had no idea what microstutter was when I was using my 8800GTX SLI system about 3-4 years back. I was always getting 100+ fps in Source engine games like DoD or TF2, and I always thought that's what 100 fps ought to feel like. Well, as soon as I read about microstuttering and turned on vsync, I noticed immediately that the game was way smoother. The input lag was brutal, but there are ways to compensate for it, like fps capping.

So now I play at 59 fps vsynced at 60 Hz, and I get the best of both worlds: smooth, responsive, tearing- and input-lag-free gameplay. You should try it and see what a difference it makes, MarcVenice.
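The idea behind an fps cap is just to sleep off whatever is left of each frame's time budget so frames arrive a hair under the refresh rate. A minimal sketch of that logic, not how any particular limiter is actually implemented:

```python
import time

CAP = 59                   # target cap, as in the post above
BUDGET = 1.0 / CAP         # seconds allowed per frame

def render_frame():
    time.sleep(0.005)      # stand-in for the game's actual frame work

for _ in range(300):       # a few seconds of "gameplay"
    start = time.perf_counter()
    render_frame()
    # Sleep away the leftover budget so frames are delivered just under
    # 60 Hz, which keeps vsync from queuing frames and adding input lag.
    leftover = BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)
```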
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Microstutter is not hype. If you can't see it, that's just your insensitivity to it.
Go play any Source engine game or Call of Duty game, and tell me that while your fps counter says 80 it doesn't feel more like 40, I dare you. And then turn on vsync and watch how 60 fps all of a sudden is smoother than 80 fps. Just experiment for yourself.

And if you can't see the difference, then check frame times with FRAPS. Tell me that a 16.6-16.6 frame cadence is the same as an 8-25-8-25 cadence. For the love of god, the problem is there and it is massive, and I could tell right away if my vsync was turned off even if there wasn't any tearing.

It's completely subjective. I've had 3 SLI rigs and have never been able to perceive it.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
It's completely subjective. I've had 3 SLI rigs and have never been able to perceive it.

Have you ever tried to fix what you can't perceive, to see if there was an improvement over what you thought was fine?

If not, then you can't say you can't perceive it, because you have no basis for comparison.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Have you ever tried to fix what you can't perceive, to see if there was an improvement over what you thought was fine?

If not, then you can't say you can't perceive it, because you have no basis for comparison.

I need the damn NV surround driver to come out so I can have SLI enabled with all 3 of my screens, so I run my setup in single GPU mode half of the time anyway. I have plenty of valid comparison experience to draw upon when I say I can't notice it.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
I need the damn NV surround driver to come out so I can have SLI enabled with all 3 of my screens, so I run my setup in single GPU mode half of the time anyway. I have plenty of valid comparison experience to draw upon when I say I can't notice it.

And what tests have you done?

Have you tried 3-way SLI with and without vsync in a scenario where you are getting between 60 and 120 fps? That's the region where microstutter is most evident, because your perceived frame rate will be less than 60 while your counter says otherwise. Below 60 fps you don't notice it as much, because the frame rate is already low and your brain perceives stuttering anyway (kind of like a CRT flickering at 30 or 40 Hz, where your eyes perceive the flicker). Mismatched frame intervals only add to that stutter, but they're very hard to perceive separately. And above 120 fps there are enough frames rendered to make all intervals less than 16.6 ms, so your brain perceives the same smoothness as 60 fps.
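That 120 fps claim is easy to check numerically. A sketch assuming the intervals alternate at a 1:3 ratio, the same shape as the 8-25 cadence mentioned earlier:

```python
# Assumed worst case: AFR alternates short/long intervals at a 1:3 ratio.
def cadence(avg_fps, ratio=3.0):
    mean_ms = 1000 / avg_fps
    short = 2 * mean_ms / (1 + ratio)   # short + long = 2 * mean
    return short, short * ratio

for fps in (60, 80, 120):
    short, long_ = cadence(fps)
    print(f"{fps:>3} fps avg: {short:.1f} / {long_:.1f} ms "
          f"(16.7 ms = one 60 Hz frame)")
# At 120 fps even the long interval (12.5 ms) fits inside one 60 Hz frame;
# at 60-80 fps the long intervals (25 / 18.8 ms) do not.
```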

Launch an fps game where you know you get high fps, like a Source engine game or a COD, then cap your fps at 80 and play, move around. Then enable vsync and see how it feels. Come back and let me know.