Hey, SLI users, does SLI still have issues?


FalllenAngell

Banned
Mar 3, 2006
132
0
0
Originally posted by: BFG10K
Yup, definitely no issues with SLI there. :roll:

Years after its introduction, SLI still has critical problems affecting major release titles and even system stability.

LOL

Of course, you could post a similar list for ANY hardware or software, but who cares? You're trying to make a point!

Out of curiosity, which of those issues would you consider "critical"?

 

darXoul

Senior member
Jan 15, 2004
702
0
0
Damn, why do you even mention Crossfire, dude? Who gives a crap? I started this thread to talk about SLI and X2 CPUs, not to ignite another moronic nVidia vs. ATi flame war.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
@darXoul (aka OP)
Yes, SLI and X2 CPUs still have issues.

Most CPU issues are fixed with the hotfix and drivers.

SLI still has some tearing and other vsync problems.

But I don't think there are enough problems to warrant not buying either an X2 or SLI. So IMO, I'd get either.
 

FalllenAngell

Banned
Mar 3, 2006
132
0
0
Originally posted by: darXoul
Damn, why do you even mention Crossfire, dude? Who gives a crap? I started this thread to talk about SLI and X2 CPUs, not to ignite another moronic nVidia vs. ATi flame war.

You're right, edited. I was just trying to point out that, as multi-GPU solutions go, SLI is comparatively user-friendly.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Yeah, and ATI doesn't have any
This is a strawman argument.

Most of those issues aren't even real issues.
So most of them are imaginary issues then? nVidia driver engineers just decided to make up most of them for the hell of it?

but your lack of tact kinda sucked imho.
Lack of tact... let's see, that would be criticizing nVidia, would it?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Of course, you could post a similar list for ANY hardware or software,
Yes but you claimed the only SLI issue was with Serious Sam 2.

but who cares?
That would be the original poster when he asked whether SLI has issues.

You're trying to make a point!
Yes, Rollo, I'm debunking your AEG propaganda. You should know this because it's been happening since you registered in 2001 and were claiming nobody needs resolutions above 1024x768, AF or AA.
 

FalllenAngell

Banned
Mar 3, 2006
132
0
0
Originally posted by: BFG10K
Of course, you could post a similar list for ANY hardware or software,
Yes but you claimed the only SLI issue was with Serious Sam 2.

Your "issues" are largely a joke.

Oh noes! The load balancing line is corrupt in Civ 4 at 25X16!

The fact of the matter is that SLI is very user friendly, configurable, and stable these days. The issues are rare and trivial for the most part.
 

darXoul

Senior member
Jan 15, 2004
702
0
0
Here, I just have to agree. Most of the "issues" listed are very minor ones; I wouldn't even call them real issues. These are negligible imperfections, usually occurring in pretty exotic configurations/circumstances.

The only thing that is bothering me a bit is the problem with F.E.A.R. This is, of course, one of the "critical" games that I would like my potential SLI setup to work with. With older games, it's no big deal if I encounter tearing. Like I said, 16*12/4/16 is more than enough to satisfy me in terms of IQ, so I can run older games with SLI disabled to get rid of tearing.

BTW, does anyone know how SLI works with Oblivion? As a big time RPG fan (also pen & paper), I'm definitely getting this game. I'd like to know:

1) if it has any issues with SLI (tearing, corruption, etc.);
2) how much of a boost it gets thanks to SLI, compared to a single card.

If anyone can share some insight, I'll be grateful.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Originally posted by: darXoul
Here, I just have to agree. Most of the "issues" listed are very minor ones; I wouldn't even call them real issues. These are negligible imperfections, usually occurring in pretty exotic configurations/circumstances.

The only thing that is bothering me a bit is the problem with F.E.A.R. This is, of course, one of the "critical" games that I would like my potential SLI setup to work with. With older games, it's no big deal if I encounter tearing. Like I said, 16*12/4/16 is more than enough to satisfy me in terms of IQ, so I can run older games with SLI disabled to get rid of tearing.

BTW, does anyone know how SLI works with Oblivion? As a big time RPG fan (also pen & paper), I'm definitely getting this game. I'd like to know:

1) if it has any issues with SLI (tearing, corruption, etc.);
2) how much of a boost it gets thanks to SLI, compared to a single card.

If anyone can share some insight, I'll be grateful.

I'll let you know on Thursday when my 7900GTs come.

DAMN UPS! :|
 

darXoul

Senior member
Jan 15, 2004
702
0
0
Great :/ I've just read on the nVidia boards that SLI doesn't seem to work with Oblivion at all; it simply doesn't improve the game's performance over a single card.

This is really disappointing because Oblivion was one of the games I wanted to get SLI for...
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: CP5670
Originally posted by: munky
With dual cores, the AMD driver fixes some of the problems, but some games still have pauses/hitches. The original Unreal is one of them, and you have to set the app to run on only one core, but so far I haven't had any dual-core problems with other games.

Do you have to do that every time you open the program or just once? I'm thinking of picking up a 165 soon with all the discounts Monarch is running, but I play the Unreal-based Deus Ex fairly often and could definitely do without any further compatibility hassles in that.

You have to do it every time you run the app. But to make it easier, I have a launcher app that I set to always run the game on one core, so instead of alt-tabbing out of the game, I just run it from the launcher.
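If anyone wants to roll their own, here is roughly what such a launcher boils down to. This isn't the launcher I actually use, just a bare-bones sketch of the idea: it goes through the Win32 OpenProcess/SetProcessAffinityMask calls via Python's ctypes, and the game path and one-core mask are placeholders you'd swap for your own setup.

# Toy launcher sketch: start a game and pin it to the first core only.
import ctypes
import subprocess

GAME_EXE = r"C:\Games\Unreal\System\Unreal.exe"   # placeholder path, change to your game
AFFINITY_MASK = 0x1                               # bit 0 set = run on CPU 0 only
PROCESS_SET_INFORMATION = 0x0200                  # access right SetProcessAffinityMask needs

kernel32 = ctypes.windll.kernel32

proc = subprocess.Popen(GAME_EXE)                 # launch the game normally
handle = kernel32.OpenProcess(PROCESS_SET_INFORMATION, False, proc.pid)
if handle:
    kernel32.SetProcessAffinityMask(handle, AFFINITY_MASK)
    kernel32.CloseHandle(handle)

The same idea works for any game that misbehaves on two cores; just point it at a different exe or use a different mask.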
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: munky
Originally posted by: CP5670
Originally posted by: munky
With dual cores, the AMD driver fixes some of the problems, but some games still have pauses/hitches. The original Unreal is one of them, and you have to set the app to run on only one core, but so far I haven't had any dual-core problems with other games.

Do you have to do that every time you open the program or just once? I'm thinking of picking up a 165 soon with all the discounts Monarch is running, but I play the Unreal-based Deus Ex fairly often and could definitely do without any further compatibility hassles in that.

You have to do it every time you run the app. But to make it easier, I have a launcher app that I set to always run the game on one core, so instead of alt-tabbing out of the game, I just run it from the launcher.

This is very useful:

http://www.robpol86.com/Pages/imagecfg.php
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: BFG10K
I downloaded the 81.94 release notes and did a search for 'dual', 'dual-core', 'disable' and 'X2', and nowhere in the release notes does NVIDIA recommend users disable any dual-core optimizations, nor do they specify any known issues with dual-core CPUs.
This is a joke, right?

http://download.nvidia.com/Windows/84.21/84.21_Forceware_Release_Notes.pdf

Different ForceWare versions probably explain that (not sure why I grabbed an old version)... I don't really spend my time reading driver release notes, but then again I'm not looking for problems that I don't have. I've been running dual core for almost a year now with no issues I can link directly to the dual core, and no more issues than I've had with any other rig I've built (and fewer than with some).
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
In BF2 and HL2, I have to set the affinity to one core instead of two in order to get good framerates.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Matt2
In BF2 and HL2, I have to set the affinity to one core instead of two in order to get good framerates.

I thought I had an issue with dual core in HL2 (which is how I discovered the app I posted above), but then I got hold of an FX-55 and it stuttered just as much as the X2 4200+. As a matter of fact, I've seen HL2 stutter with an X2 4200, FX-55, X2 4400+, with single and dual ATI and NVIDIA cards.

Don't know about BF2, I don't play it.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: nitromullet
Originally posted by: Matt2
In BF2 and HL2, I have to set the affinity to one core instead of two in order to get good framerates.

I thought I had an issue with dual core in HL2 (which is how I discovered the app I posted above), but then I got hold of an FX-55 and it stuttered just as much as the X2 4200+. As a matter of fact, I've seen HL2 stutter with an X2 4200, FX-55, X2 4400+, with single and dual ATI and NVIDIA cards.

Don't know about BF2, I don't play it.

It's more than a stutter. If I leave the affinity on Core 0 and Core 1, the main menu flashes... I know it sounds weird, but it flashes. In-game it does the same thing: it flashes, and the cl_showfps 1 counter shows I'm getting like 33fps, and it dips even lower than that.

If I Ctrl-Esc at the main menu and set the affinity to Core 1 (unchecking Core 0) in Task Manager, it's smooth as a baby's butt. No weird flashing, just smooth, and the fps counter is pegged at 150fps.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: nitromullet
Originally posted by: munky
Originally posted by: CP5670
Originally posted by: munky
With dual cores, the AMD driver fixes some of the problems, but some games still have pauses/hitches. The original Unreal is one of them, and you have to set the app to run on only one core, but so far I haven't had any dual-core problems with other games.

Do you have to do that every time you open the program or just once? I'm thinking of picking up a 165 soon with all the discounts Monarch is running, but I play the Unreal-based Deus Ex fairly often and could definitely do without any further compatibility hassles in that.

You have to do it every time you run the app. But to make it easier, I have a launcher app that I set to always run the game on one core, so instead of alt-tabbing out of the game, I just run it from the launcher.

This is very useful:

http://www.robpol86.com/Pages/imagecfg.php

Thanks, I'll give that a try.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Matt2
Originally posted by: nitromullet
Originally posted by: Matt2
In BF2 and HL2, I have to set the affinity to one core instead of two in order to get good framerates.

I thought I had an issue with dual core in HL2 (which is how I discovered the app I posted above), but then I got hold of an FX-55 and it stuttered just as much as the X2 4200+. As a matter of fact, I've seen HL2 stutter with an X2 4200, FX-55, X2 4400+, with single and dual ATI and NVIDIA cards.

Don't know about BF2, I don't play it.

It's more than a stutter. If I leave the affinity on Core 0 and Core 1, the main menu flashes... I know it sounds weird, but it flashes. In-game it does the same thing: it flashes, and the cl_showfps 1 counter shows I'm getting like 33fps, and it dips even lower than that.

If I Ctrl-Esc at the main menu and set the affinity to Core 1 (unchecking Core 0) in Task Manager, it's smooth as a baby's butt. No weird flashing, just smooth, and the fps counter is pegged at 150fps.

You should try that app I posted. That way you won't have to exit the game to set the affinity.
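If you'd rather script it than patch the exe, something like this rough sketch does the same thing as the Task Manager clicks. It assumes Windows and that you grab the game's PID from Task Manager first, and it goes through SetProcessAffinityMask via Python's ctypes; take it as an illustration, not a polished tool.

# Rough sketch: pin an already-running process (PID from Task Manager) to Core 1 only.
import sys
import ctypes

PROCESS_SET_INFORMATION = 0x0200   # access right SetProcessAffinityMask needs
CORE1_ONLY = 0x2                   # bit 1 set = second core (Core 1) only

pid = int(sys.argv[1])             # pass the game's PID on the command line
kernel32 = ctypes.windll.kernel32
handle = kernel32.OpenProcess(PROCESS_SET_INFORMATION, False, pid)
if handle:
    kernel32.SetProcessAffinityMask(handle, CORE1_ONLY)
    kernel32.CloseHandle(handle)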
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Your "issues" are largely a joke.
I don't see nVidia driver engineers putting smilies anywhere, do you?

Oh noes! The load balancing line is corrupt in Civ 4 at 25X16!
Don't forget to tell us about your son's bleeding eyes.

The fact of the matter is that SLI is very user friendly, configurable, and stable these days. The issues are rare and trivial for the most part.
Good Rollo, now go back to your cage like the good little AEG parrot you are.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Different ForceWare versions probably explain that (not sure why I grabbed an old version)...
No, you were using a beta driver that doesn't have full release notes. AFAIK every official nVidia driver since dual-core optimizations were added has that section in it.

I don't really spend my time reading driver release notes, but then again I'm not looking for problems that I don't have.
Again, this is why I tend to rely on official documentation rather than personal anecdotes.
 

FalllenAngell

Banned
Mar 3, 2006
132
0
0
Originally posted by: BFG10K
Your "issues" are largely a joke.
I don't see nVidia driver engineers putting smilies anywhere, do you?

Hmmmm.

darXoul seems to agree with me:

Originally posted by: darXoul
Here, I just have to agree. Most of the "issues" listed are very minor ones; I wouldn't even call them real issues. These are negligible imperfections, usually occurring in pretty exotic configurations/circumstances.

and I notice you didn't bother to tell us which of those "issues" you actually consider important.

I must be right, as all you seem to have as a response are bizarre, irrelevant requests.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
and I notice you didn't bother to tell us which of those "issues" you actually consider important.
If you actually played games, you wouldn't need to be spoon-fed this information.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
OK, I'm going to try to get back to the subject without being interrupted by the war between FalllenAngell and BFG10K. F.E.A.R. is a game I just recently beat, but I've been trying to tweak it just right for my rig and I've been noticing things. I set my GPU rendering mode to SLI AA, the AA to SLI 8x with 16xAF, High Performance rather than High Quality, and Transparency AA Supersampling. I run it fine frame-rate-wise, yet the vsync is bad. Most of the time I don't even turn it on because it hits pretty hard. Vsync in general just seems to drop the frames, even if Fraps says it's running at 30~40fps. When I had a single card, I got used to being able to distinguish between 20 and 30 fps; it's a pretty obvious distinction. But even when I'm playing BF2 with vsync on, Fraps will say I'm getting like 40fps, and when I look at the gameplay it seems more like 20. ???

I LOVE F.E.A.R., but I'm stuck playing it without vsync because I'd rather have some screen tearing than sluggish frames. Is there any kind of tweak that works well on someone's system and performs as well as it looks? I like the performance and quality I get with mine, but when I turn on that vsync it seems completely different. I can get some good things from having vsync turned on, but it comes at the cost of degrading some of the IQ I have.
(Note: sometimes it seems like the game's settings override my video card's. For example, FSAA turned off in F.E.A.R. but SLI 8x turned on in the video properties results in no FSAA during F.E.A.R. gameplay.)

I eventually work around it and it seems to be just a minor bug, but I still have to sacrifice for that damn vsync. Is this a common problem or what?