GT200b to launch at Nvision 08 (Aug 25-27)


SickBeast

Lifer
Jul 21, 2000
14,377
19
81
It's stuff like this that shows you just how much NV hates losing. If they can meet that release date it's quite impressive IMO.
 

BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
Originally posted by: Extelleron

With the 4870 X2, you get 90% the performance of the GTX 280 even when Crossfire scales 0%. When Crossfire scales even horribly, you get much better performance.
Not when performance is less than a single card, as is often the case when multi-GPU scales poorly. Plus with ATi not offering end-user profiles there's no easy way to disable Crossfire in such games; at least with nVidia I can set a problematic profile to single GPU if need be.

And thankfully this time around there are no microstutter issues.
You don't have any valid evidence to make that claim. What we need is a demonstration that at a given framerate where previous solutions stuttered they don't stutter on a 4xxx at a similar framerate, otherwise it's just wishful thinking.

So there is really no reason NOT to get 4870 X2 at this time.
Is this some kind of a joke? Aside from the issues of input lag and micro-stutter plus general Crossfire scaling issues, there's also the issue of general driver compatibility and robustness, especially for legacy titles. The fact is multi-GPU will always be less robust than a single GPU because it needs more driver coddling to prop it up.

Of course if GTX 280+ offers the same level of performance within a single GPU as the 4870 X2 does when it scales well, then that is another story and I'd recommend it over R700.
I'd say the GTX280+ will be quite competitive and even if so I'll still be considering a single 4870 1 GB depending on what happens.

Then all the games in which CF does scale.... there is no competition.
A drop in the bucket compared to the 90+ games I have under constant play rotation. You see, to me a video card needs to do more than just provide good performance in a few cherry-picked new titles.

There are no real downsides.
That statement is not even close to being accurate.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
You come off a little strong, BFG. All these 90 games you're talking about, why aren't you reporting them in the thread made by Derek, so anandtech can bench them, and have ATI release a magic hotfix? I don't think I've seen you post a single game where crossfire doesn't scale like it should. In fact, the whole thread is a bit of a disappointment, because barely anyone is reporting any games that don't scale well with SLI/CF, even though people like you keep touting the big horn on how horrible CF scaling is. Come on man, own up, or stop coming off strong and demolishing people's posts when YOU don't show any evidence either.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: bryanW1995
9800gx2~gtx280, right? if 4870x2 is 45% faster than them, how is that roughly equivalent? :confused:
No, the GX2 typically benches out higher than the GTX 280, remember that was the basis for all the fuss around GT200 not being worth it over older SLI/multi-GPU parts. GTX+ in SLI benches extremely well and scales very close to 4870CF/X2. And we've already seen 4850CF perform almost identically as 4870CF in many cases for half the price.

The real problem for NV is that ATI raised the bar with AA and in benches that show 8x or higher, they will most likely lose badly to RV770. This is the same situation we're seeing with the 9800GTX+ compared to the 4850. With 4xAA or less the GTX 280 trades wins with 4870X2 and a 15-20% faster GTX 280+ will only increase that lead or decrease any deficit. I would consider trading wins roughly equivalent, especially if one solution is prone to multi-GPU scaling problems and the other is not.


What I find interesting is that the most vocal proponents on this forum for CF/SLI seem to have little to no experience with multi-GPU. Then there are the ones who have tried it and are constantly "upgrading", looking for a better solution. I haven't seen anyone who has a GTX 280 complain about it yet, except for those few who received defective cards. Some of the best testimonials, though, come from places like the EVGA forums where GX2/SLI users can't get rid of their multi-GPU set-ups fast enough. Even though they're giving up some FPS, they're overall much happier with performance. Personally I think people who don't have any experience with multi-GPU but recommend it to others are giving some really irresponsible advice and really shouldn't comment on its merits until they've tried it out for themselves.


 

hooflung

Golden Member
Dec 31, 2004
1,190
1
0
I see the band playing the microstutter music but I can't hear it. My SLI system doesn't have microstutter. My friends with 3870X2s, X1900XT CFs and 7800GTs don't see it either. Mine is an 8800GS SC setup in SLI running on an x8 x2 NF4 SLI board. Until it becomes a problem, and until single GPUs no longer have massive problems from BOTH brands, you can talk about microstutter until the cows are tipped; it doesn't really mean jack squat.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: hooflung
I see the band playing the microstutter music but I can't hear it. My SLI system doesn't have microstutter. My friends with 3870X2s, X1900XT CFs and 7800GTs don't see it either. Mine is an 8800GS SC setup in SLI running on an x8 x2 NF4 SLI board. Until it becomes a problem, and until single GPUs no longer have massive problems from BOTH brands, you can talk about microstutter until the cows are tipped; it doesn't really mean jack squat.

Just because you don't hear the music or don't see microstutter doesn't mean the problem doesn't exist... it's already been proven the phenomenon exists and is easily tested. If you're so confident you don't have the issue on any of your SLI/CF setups, use FRAPS to log some frame dumps and compare the intervals, or post them up here so we can have a look. Make sure to note the resolution and whether Vsync is enabled as well, since frame rates above refresh will reduce the impact of microstutter by normalizing frame rates.
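To make the FRAPS suggestion above concrete, here is a minimal sketch of the kind of frame-interval comparison being described. It assumes a two-column frametimes log (frame number, cumulative time in milliseconds), which is roughly what FRAPS writes when frametime logging is enabled; the exact column layout, file name and the helper names here are my own assumptions, so treat the parsing as something to adjust.

```python
# Rough micro-stutter check from a FRAPS-style frametimes log (a sketch,
# not the tool anyone in the thread used). Assumes a CSV whose rows are
# "frame, cumulative time in ms" with one header line; adjust if yours differs.
import csv
import statistics
import sys

def load_timestamps(path):
    """Return the cumulative per-frame timestamps (ms) from the log."""
    times = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader, None)  # skip the header row
        for row in reader:
            if len(row) >= 2:
                times.append(float(row[1]))
    return times

def report(times):
    # Intervals between successive frames, in milliseconds.
    intervals = [b - a for a, b in zip(times, times[1:])]
    mean = statistics.mean(intervals)
    stdev = statistics.stdev(intervals)
    # Average jump from one interval to the next; a value that is a large
    # fraction of the mean interval is the short/long/short/long pacing
    # usually described as micro-stutter, even when average FPS looks fine.
    jitter = statistics.mean(abs(b - a) for a, b in zip(intervals, intervals[1:]))
    print(f"frames: {len(times)}   average FPS: {1000.0 / mean:.1f}")
    print(f"mean frame interval: {mean:.2f} ms   stdev: {stdev:.2f} ms")
    print(f"mean interval-to-interval jump: {jitter:.2f} ms")

if __name__ == "__main__":
    report(load_timestamps(sys.argv[1]))
```

Running it on logs captured with Vsync off and then on would show how much of the unevenness refresh-locking hides, which is the point made above.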

Besides, microstutter is just the top of the list of issues with multi-GPU solutions; I was actually referring to scaling issues, which are still a very valid concern. Basing performance on an ATI-mandated limit of 4 popular titles isn't exactly what I'd call an accurate representation of performance and scaling.
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
WOW !!

I haven't been quoted so many times before.

But just to clarify, I was not trashing Nvidia at all. I was just asking how the new card can keep up with the X2 on core and shader clock increases alone. And the laughing icon was just meant like... yeah, right.

And just to prove my point... I will be purchasing an Asus GTX 260 within 4 weeks :thumbsup:
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Not when performance is less than a single card, as is often the case when multi-GPU scales poorly. Plus with ATi not offering end-user profiles there's no easy way to disable Crossfire in such games; at least with nVidia I can set a problematic profile to single GPU if need be.

And how often does that happen? I've seen that sometimes with Quad SLI and in the rarest circumstances, 2-way SLI/CF. But that is a driver issue and it is not at all a common occurrence. Unless you can name several modern games where the performance of the 4870 X2 is lower than the 4870 and this impacts gameplay, this is not a valid argument against it.

You don't have any valid evidence to make that claim. What we need is a demonstration that at a given framerate where previous solutions stuttered they don't stutter on a 4xxx at a similar framerate, otherwise it's just wishful thinking.

Sure I do. I am the one quoting evidence from people who have R700 cards in their hands and have done the tests, you are the one with no valid evidence to refute that claim. Unless you have an R700 in your hands and can show it has the same microstutter as R680, then you shouldn't be making such a statement.

Is this some kind of a joke? Aside from the issues of input lag and micro-stutter plus general Crossfire scaling issues, there's also the issue of general driver compatibility and robustness, especially for legacy titles. The fact is multi-GPU will always be less robust than a single GPU because it needs more driver coddling to prop it up.

Once again unless you can prove microstutter exists on R700, then don't be using it as an argument against purchasing it. Legacy titles don't need Crossfire support. They are fast enough as it is. Multi-GPU is less robust, sure, but that doesn't refute what I said. You aren't sacrificing anything with R700, because you get between GTX 260 & 280 performance even when Crossfire doesn't work at all.

A drop in the bucket compared to the 90+ games I have under constant play rotation. You see, to me a video card needs to do more than just provide good performance in a few cherry-picked new titles.

If you have a list of 90+ games that don't scale with Crossfire where the 4870 X2 does not provide good enough performance without it, I'd like to see it. Quake III running at 300 FPS @ 2560x1600 24xAA not scaling with Crossfire does not count. ATI's driver team focuses on the titles where performance is necessary. The difference between 20 FPS and 30 FPS in Crysis is more important to consumers than the difference between 200 FPS and 300 FPS in a 5 year old game. I highly doubt that you can name more than a few games where the performance of the 4870 X2 is not adequate, and in those, I doubt the GTX 280 is any better.

That statement is not even close to being accurate.

It is accurate in terms of performance. The only problem I see with R700 is power consumption. Other than that.... if you want ultimate performance there is no other option. If it consumed less power and I hadn't purchased a GTX 280 w/ step-up path I'd buy it as soon as it comes out.




 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
Originally posted by: Extelleron
Not when performance is less than a single card, as is often the case when multi-GPU scales poorly. Plus with ATi not offering end-user profiles there's no easy way to disable Crossfire in such games; at least with nVidia I can set a problematic profile to single GPU if need be.

And how often does that happen? I've seen that sometimes with Quad SLI and in the rarest circumstances, 2-way SLI/CF. But that is a driver issue and it is not at all a common occurrence. Unless you can name several modern games where the performance of the 4870 X2 is lower than the 4870 and this impacts gameplay, this is not a valid argument against it.

You don't have any valid evidence to make that claim. What we need is a demonstration that at a given framerate where previous solutions stuttered they don't stutter on a 4xxx at a similar framerate, otherwise it's just wishful thinking.

Sure I do. I am the one quoting evidence from people who have R700 cards in their hands and have done the tests, you are the one with no valid evidence to refute that claim. Unless you have an R700 in your hands and can show it has the same microstutter as R680, then you shouldn't be making such a statement.

Is this some kind of a joke? Aside from the issues of input lag and micro-stutter plus general Crossfire scaling issues, there's also the issue of general driver compatibility and robustness, especially for legacy titles. The fact is multi-GPU will always be less robust than a single GPU because it needs more driver coddling to prop it up.

Once again unless you can prove microstutter exists on R700, then don't be using it as an argument against purchasing it. Legacy titles don't need Crossfire support. They are fast enough as it is. Multi-GPU is less robust, sure, but that doesn't refute what I said. You aren't sacrificing anything with R700, because you get between GTX 260 & 280 performance even when Crossfire doesn't work at all.

A drop in the bucket compared to the 90+ games I have under constant play rotation. You see, to me a video card needs to do more than just provide good performance in a few cherry-picked new titles.

If you have a list of 90+ games that don't scale with Crossfire where the 4870 X2 does not provide good enough performance without it, I'd like to see it. Quake III running at 300 FPS @ 2560x1600 24xAA not scaling with Crossfire does not count. ATI's driver team focuses on the titles where performance is necessary. The difference between 20 FPS and 30 FPS in Crysis is more important to consumers than the difference between 200 FPS and 300 FPS in a 5 year old game. I highly doubt that you can name more than a few games where the performance of the 4870 X2 is not adequate, and in those, I doubt the GTX 280 is any better.

That statement is not even close to being accurate.

It is accurate in terms of performance. The only problem I see with R700 is power consumption. Other than that.... if you want ultimate performance there is no other option. If it consumed less power and I hadn't purchased a GTX 280 w/ step-up path I'd buy it as soon as it comes out.

I completely agree Extelleron. Side note, I've never seen a moderator go to such extreme lengths to downplay a card before, it REALLY makes me question whether or not this guy should be moderating this forum. We have enough paid Nvidia zealots as it is that sour any non-Nvidia related discussion.

You really have a 90+ game rotation? I call bullshit man, what a joke. At least try and be believable. All the older legacy titles that CrossFire doesn't scale well in are fast enough on a single 4870 to turn up nearly all eye candy anyways so it's a moot point. In the games that matter (i.e. the newer titles that push the limits of current technology), CrossFire usually works. ATI is committed to getting compatibility better with each driver revision and now that multi-GPU is the way of the future it only makes sense for them to improve.

Rule 13, don't call out the moderators please.

-ViRGE
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
I completely agree Extelleron. Side note, I've never seen a moderator go to such extreme lengths to downplay a card before, it REALLY makes me question whether or not this guy should be moderating this forum. We have enough paid Nvidia zealots as it is that sour any non-Nvidia related discussion.

Let's not get into a big argument like there was with keys being a moderator & focus group member. BFG just doesn't like multi-GPU, he's not an nVidia fanboy. And anyway that is not going to impair his ability to moderate fairly.

But that is my point, the newer games are what matters. I wouldn't want ATI to be working on providing CF support for a game that I can already play at 100 FPS+. I want them to concentrate on support for games like Crysis where the performance is needed.
 

g00n

Member
Mar 16, 2004
52
0
0
EVGA Step-up question...

I was wondering, does EVGA even consider the GTX+ a NEW base model? I know you can't step up from, say, an 8800GT to an 8800GT SUPERCLOCKED "SC" model as they are considered the same base model. Does EVGA acknowledge the GTX+ as a whole new line - the successor to the GTX - or simply an enhanced version of that base? They should... If so, then you technically still couldn't go from a 9800GTX+ to a GTX 260+ (GT200b 55nm) and jump over the regular 260, could you? Or, since the 9800GTX+ did come out AFTER the 260, do they acknowledge that and let you jump to the next best release within 90 days even though there is an upgrade model in between?
 

BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
Originally posted by: MarcVenice

All these 90 games you're talking about, why aren't you reporting them in the thread made by Derek, so anandtech can bench them, and have ATI release a magic hotfix?
So you think ATi will release hotfixes for five-plus-year-old games to allow Crossfire to work properly if I tell Derek about it? What if they don't? What then?

I don't think I've seen you post a single game where crossfire doesn't scale like it should. In fact, the whole thread is a bit of a disappointment, because barely anyone is reporting any games that don't scale well with SLI/CF, even though people like you keep touting the big horn on how horrible CF scaling is.
That's great for you but I'm willing to bet I have much stricter standards around backwards compatibility than you do. As I browse through nVidia's list of SLI supported games I can see many of my titles not there and I'm almost certain nVidia have more profiles than ATi.

Then there's the fact that even having a profile and performance scaling doesn't guarantee a problem-free experience.

Come on man, own up, or stop coming off strong and demolishing people's posts when YOU don't show any evidence either.
So if I run out and buy a 4870 X2 will you reimburse me if I'm not satisfied? No? Then why do I need to throw money at a problem just to prove you or anyone else wrong?

I've used plenty of multi-GPU setups and they aren't for me as there are simply too many drawbacks with them. They simply aren't robust enough compared to single cards.

Someone who only plays modern games straight off the shelf, plays through them once and then moves on might well find multi-GPU is great, but for me that situation is not even close to being good enough.
 

BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
Originally posted by: Extelleron

And how often does that happen? I've seen that sometimes with Quad SLI and in the rarest circumstances, 2-way SLI/CF. But that is a driver issue and it is not at all a common occurrence.
It doesn't matter how often it happens; the point is that it happens, and it's naïve to think you'll get the same performance and compatibility as a single card when the driver doesn't scale.

Unless you can name several modern games where the performance of the 4870 X2 is lower than the 4870 and this impacts gameplay, this is not a valid argument against it.
Modern titles? As in the dozen or so cherry-picked ones that are passed around in these kinds of discussions and that reviewers use? Like I said, that's not even close to being good enough for me.

Sure I do. I am the one quoting evidence from people who have R700 cards in their hands and have done the tests, you are the one with no valid evidence to refute that claim.
Where? Show me these tests.

Are these "tests" done by the same people that claim they couldn't even see micro-stutter on systems that were known to have it? The same kind of people that claim the human eye only needs 30 FPS and AA over 4x makes no difference?

Like I said, if multi-GPU fits their low standards that's great for them, but it doesn't fit my standards.

Unless you have an R700 in your hands and can show it has the same microstutter as R680, then you shouldn't be making such a statement.
Once again unless you can prove microstutter exists on R700, then don't be using it as an argument against purchasing it.
Uh, no. Micro-stutter is inherent to AFR and as such the burden of proof is on you to show it doesn't happen. Otherwise you have no business claiming otherwise.

Legacy titles don't need Crossfire support.
Utter rubbish. Why don't they need Crossfire support? I expect a single card to support them and deliver optimal performance, so why should multi-GPU be treated differently, especially if you're paying more money for it?

You're the one telling us how great multi-GPU is but then turning around and telling us to accept various limitations. Sorry but that doesn't fly. If it doesn't run old games properly it's worse than a single card that does run them properly.

They are fast enough as it is.
Fast enough for whom? What if nVidia put a driver cap of 30 FPS on your GTX 280 and told you to accept 30 FPS as fast enough? Would you be happy with that?

Of course not; you expect your single GPU to deliver as much performance as possible at all times so why should standards towards multi-GPU be any different?

You aren't sacrificing anything with R700, because you get between GTX 260 & 280 performance even when Crossfire doesn't work at all.
Like I said earlier, claiming everything works like a single card when it doesn't scale is naive. In fact, based on your comments I wonder whether you've used a recent multi-GPU implementation in a good cross-section of games.

Quake III running at 300 FPS @ 2560x1600 24xAA not scaling with Crossfire does not count.
If the single card delivers optimal performance in the same situation then multi-GPU shouldn?t be any different.

ATI's driver team focuses on the titles where performance is necessary.
And there it is. "Multi-GPU is great except for all these situations...", and that's when proponents of the technology start listing justifications as to why you should accept said limitations.

I highly doubt that you can name more than a few games where the performance of the 4870 X2 is not adequate, and in those, I doubt the GTX 280 is any better.
Uh, how about when the driver doesn't work properly and you get performance worse than a single card? How about general driver incompatibilities such as rendering glitches?

Just look at the forums to see things like flickering problems with Crossfire when single cards have no such issues.

Other than that.... if you want ultimate performance there is no other option.
If by ultimate you mean "gets bigger graphs in a few modern games in exchange for input lag, micro-stutter and/or more driver issues in legacy titles", I agree with you.
 

BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
Originally posted by: OneOfTheseDays

Side note, I've never seen a moderator go to such extreme lengths to downplay a card before,
Apparently you still don't get that I'm criticizing multi-GPU in general.

it REALLY makes me question whether or not this guy should be moderating this forum. We have enough paid Nvidia zealots as it is that sour any non-Nvidia related discussion.
Look, I'm getting mighty tired of this shit. Last month it was claimed I'm a paid ATi shill and now I'm a paid nVidia zealot.

My posting history speaks for itself which is why fanboys from both camps get annoyed. I buy hardware from my own pocket and I have no affiliation to any camp so please refrain from posting such nonsense in the future.

You really have a 90+ game rotation? I call bullshit man, what a joke.
I don't really care what you "call". You think what you like while I'll continue to play my games and pick the hardware that works best for me.

ATI is committed to getting compatibility better with each driver revision and now that multi-GPU is the way of the future it only makes sense for them to improve.
If they're so committed, why have they still not allowed custom profiles like nVidia? Why do ATi users have to resort to renaming executables and playing the guessing game in an effort to get scaling in unsupported games?
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
BFG10K, I have to agree with Extelleron. If you're playing older games and your SLI/CF setup isn't scaling, yet you're getting 100+ FPS then what's the problem? I realize that in a perfect world SLI/CF should be working for all games so that you're getting your full value for purchasing the two cards. But if the only way you can tell the difference in gameplay between one and two cards is with FRAPS, then it's not a real issue because it's not taking anything away from your gaming experience. I would have to call it more of a pet peeve than an actual issue.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Like I said, own up and name your 90+ games. You're not the only one; like I said, the whole thread where Derek asks for people to report games that don't scale well is a huge disappointment. I'm not calling you a fanboy, but the only things you have against multi-GPU in general are microstutter (only discovered recently, and not many people care) and 'input lag', once again something only anal people care about because the rest of us can't see it. Then again, if THOSE two are the reason you won't buy a HD4870X2 over a GTX 280, even though it most likely gives better performance 90% of the time, fine. But perhaps you should stop making the card out to be a piece of junk?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I am sensing both sides of argument here have good points:

1) BFG - if I am paying $500 for a graphics card setup, I want to get "$500" gaming performance in all the games I play, not just the newer titles. Even though the 4870X2 outperforms the GTX280 in the currently popular games, I also play other titles in which a dual-card scenario could be inferior. I enjoy going back to replay older titles at the highest settings I can, in which case a single card will guarantee that I am getting optimal performance. And although most people see marginal benefit from going beyond 4xAA, I play at the highest resolutions, where I like to get the highest quality settings possible (such as supersampling). In addition, I don't just play the most popular games for which ATI/NVIDIA target their drivers, so I don't want to wait for ATI/NV to release a new driver so it will scale with my game appropriately. At least NV provides custom profiles, while in ATI's case you are stuck either renaming the .exe file to get "fake" scaling or playing with a single card. And even if not all games experience micro-stutter, I don't want to pay for the possibility that in some games I will experience a problem which could have been avoided with a single card.

vs.

Sure if you play older titles, your dual card setup will likely provide no tangible benefit over a single 4870 for example. Having said that, a 3 or 4 year old game even at 2560x1600 24xAA or otherwise will have inferior graphics to any recent game at 1024x768 4xAA. In addition, there is no game out there that old that is unplayable with a single 4870 in any setting you can think of (without going overboard). As it stands now for the majority of "popular" selling titles out of the box 4870x2 delivers the best performance for the $ between it and GTX280. Since most people tend to buy the fastest graphics setup for the latest games (since even 3870 and 8800GT will be sufficient for most older games), the understanding is that performance in the latest games is more important for making a purchasing decision based on value.

----

At the end of the day, it's a personal preference.
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
I think BFG is very nice cos he could have given some folks a vacation for accusing him of being paid by Nvidia without basis.
I've seen him be accused of being an Nvidia fanboy one moment and an ATI/AMD fanboy the next.

His only problem is that he likes to dig deeper into issues and so comes to a definitive opinion -- which is annoying to people who only skim over an issue. (ooh! green! I like green better than the color red...ATI SUXX!!)

As for his game collection, I have like 30 PC games (200+ total) and don't see how it is far-fetched to have 90+ games -- especially if one has been gaming for a long time and has a game-collecting hobby.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: gersson
I think BFG is very nice cos he could have given some folks a vacation for accusing him of being paid by Nvidia without basis.
Rofl, I don't think they can take action because they've already let such baseless accusations go for months by many of the same individuals. Taking action now would come off as extremely preferential.

Originally posted by: RussianSensation
As it stands now for the majority of "popular" selling titles out of the box 4870x2 delivers the best performance for the $ between it and GTX280.
If the games are over a month old or have been reviewed a few times showing poor scaling, maybe. New games out of the box historically have horrible CF/SLI support, so you basically have to wait for a month or more hoping for a profile and driver update. You won't hear much about it here because people recommend CF/SLI but don't actually use it. Game-specific or vendor forums are much better resources for problems and fixes.

Originally posted by: BFG10K
Someone who only plays modern games straight off the shelf, plays through them once and then moves on might well find multi-GPU is great, but for me that situation is not even close to being good enough.
Actually I find SLI/CF worst in exactly that situation, as many games I purchase fall into this category. From what I've seen it can take months before SLI/CF is supported in new releases, at which point I may have moved on to a different game. I certainly would hate waiting for a CF/SLI fix before I could take advantage of performance I had already paid more for.

 

CP5670

Diamond Member
Jun 24, 2004
5,508
586
126
There is one further issue that has not been mentioned yet: triple buffering. This is supposed to be another inherent limitation of AFR and has been a problem on other multi GPU configurations in the past. There is no indication that the 4870 X2 is any different. I have this on in almost every game I play and don't care how fast the card is if it does not support this, since it would be unable to provide sufficiently smooth motion for me despite the framerate.

However, I can see how the 4870 X2 is a compelling deal for someone who doesn't use vsync and triple buffering. The nice thing about it is that a single 4870 is already quite fast, so in the worst case you still get pretty good performance. The jury is still out on what exactly is going on with microstutter though.

90 games is quite believable. My main games directory currently has 54 games from a variety of time periods, and I have many more that aren't installed. Someone in the PC Gaming forum posted a screenshot of over 400 games he had installed. Most of these games will be fine on a single card, but some definitely stand to benefit from the second GPU. Far Cry is one example, and that game actually had scaling problems with SLI last I checked.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Since this thread has evolved into a single vs. multi-GPU discussion I'll go ahead and post up a short compilation of the pros/cons I made in a different thread. I have not used SLI/CF in its current AFR form, the last time I had multi-GPU was with 3Dfx and their SFR version of SLI. My list is based on common issues I read about on forums, here and especially game-specific and vendor forums.

Honestly I challenge anyone who has no experience with CF/SLI to try it yourself and prove these issues don't exist before recommending others to go multi-GPU. You can google any of the bullet captions and find tons of evidence on each of the issues listed below. I think the last guy who called BS is off trying to get a single GPU to run, much less a 2nd. I've personally never made a huge issue about CF/SLI because I've never seriously considered going that route even though I had an SLI capable board.


  • Profiles/Scaling - SLI/CF rely on driver profiles for their performance and, in the case of ATI, you can't change these yourself. So if your particular game doesn't have a pre-defined profile you may see no benefit or even *worse* performance than with a single card. In the case of relying on two individually slower cards versus your single card, you can see that you may actually be paying more for *worse* performance, which is unacceptable to me. Scaling is typically erratic as well even when CF/SLI is working, so seeing 2x the performance is rare (although RV770 does show improvements here).
  • Micro-stuttering- Pretty heated debate about the significance of this problem on this board and others although it pops up infrequently. Basically the timing of each frame from the different GPU in AFR can be erratic, leading to this effect. Apparently some people are very sensitive to it and some aren't. I certainly wouldn't be happy if I spent $400-600 for SLI/CF only to find I couldn't stand micro-stutter.
  • Heat/Power/Space - Typically not an issue for most enthusiasts, but it can become a problem when you have 2 or even 3x the power draw and heat of high-end cards. The PSU issue can be a total wattage issue, but also a power connector issue, with so many high-end parts needing 6- or even 8-pin PCI-E connections. Many cases and motherboards can also have problems accommodating one 9"+ card, much less 2, 3 or 4.
  • Multi-Monitor (NV only) - NV multi-GPU solutions do not support multiple monitors. I don't know if this is a superficial driver limitation to prevent desktop cards being used in professional workstations or a truly technical issue, but I'm leaning towards driver limitation, as I'm assuming the Quadro GX2 would support more than 1 monitor... Multi-monitor support is important to me as I play full screen on my 1920 and use my 2nd monitor for various monitoring tools, surfing the web, desktop productivity etc.
  • Bandwidth/Frame Buffer - Not as big a deal at 1920, but one of the major reasons to upgrade to the fastest GPU is for ultra-high resolutions with AA. With a GX2 or SLI/CF solution, you're still limited to the same bus width and frame buffer as the individual cards even if you have more rendering horsepower. This limitation is apparent at higher resolutions with AA when comparing a GTX 280 with a true 512-bit bus and 1GB frame buffer to the X2/SLI solutions with a 256-bit bus and 512MB buffer. R700 addresses this problem somewhat with a larger frame buffer and GDDR5.
  • Chipset specific limitations - ATI CF requires an Intel/AMD chipset and NV SLI requires an NV chipset. This unnecessarily ties your platform to your GPU between generations and in the case of SLI, to NV's flaky chipsets.
  • Overclocking ability? - NV used to have problems overclocking in SLI under Vista, but I think it's been fixed. Not sure if ATI has similar problems, although I know many of their parts are clock-locked via BIOS. SLI/CF on a single card can also have heat issues that prevent further overclocking.
  • Low Minimum Framerates (CP5670 mentioned) - SLI/CF for whatever reason tend to have much lower minimum framerates than single-GPU solutions. You may not see the difference in AVG FPS as the CF/SLI solution may make up the difference by rendering less intensive frame sequences at a higher rate.
  • Vsync and Triple Buffering consistency problems (CP5670) - seems this can be hit or miss with SLI (not sure about CF) and is certainly an important feature at both high and low FPS. At low FPS, triple buffering helps mitigate the fixed framerate steps that Vsync imposes, and at high FPS it can prevent screen tearing on LCDs.
  • AFR Input Lag (BFG10K mentioned) - seems inherent to AFR due to the latency between when pre-rendered frames are generated by the CPU and when they are rendered by the GPU and displayed on-screen (a rough latency sketch follows this list).
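As a rough illustration of the input-lag point above (back-of-the-envelope numbers, not anything measured in this thread), the usual approximation is that every extra frame buffered ahead of display adds about one frame time of latency:

```python
# Back-of-the-envelope AFR input-lag estimate: assume each additional frame
# queued ahead of display costs roughly one frame time of extra latency
# on top of a single-GPU baseline. Illustrative only.
def added_latency_ms(fps: float, frames_queued: int) -> float:
    frame_time = 1000.0 / fps          # ms per frame at this frame rate
    return frames_queued * frame_time  # extra latency from the queue

for fps in (30, 60, 120):
    print(f"{fps:>3} FPS: about +{added_latency_ms(fps, 1):.1f} ms per extra queued frame")
```

By this rough estimate the penalty is largest at lower frame rates, which is exactly where a second GPU is supposed to help.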

Again, there's really only one person on these forums shouting from the rooftops about how great SLI/CF is. Most everyone else is basing their opinions on how awesome a few numbers and graphs look from a few games without ever having tried it themselves. Then there are the silent few that have tried it, moved on and are quiet about their experience. If you look around and actually read the replies from people who have actually used multi-GPU, I think you'll find that the vast majority of responses are quite different. In any of the threads over on the EVGA forums about stepping up from SLI/GX2 to a single GT200, it's quite obvious how much happier people are with their single-GPU set-ups.


 

CP5670

Diamond Member
Jun 24, 2004
5,508
586
126
Low Minimum Framerates (CP5670 mentioned) - SLI/CF for whatever reason tend to have much lower minimum framerates than single-GPU solutions. You may not see the difference in AVG FPS as the CF/SLI solution may make up the difference by rendering less intensive frame sequences at a higher rate.

Just to elaborate on this, the low minimum framerates seem to be specific to the 9800GX2 and G92-based SLI configurations. I don't know if that was/is a problem with other SLI or CF setups.

The real issue can be best described as inconsistent performance within a game. Even if the hardware sites show great performance in benchmarks, on both averages and minimums, they can only test a very small part of the game. There seem to be certain situations in games where multi GPU setups inexplicably take a heavier performance hit than you would expect. The Far Cry thing I brought up in the sticky thread is one example, but I remember Splinter Cell: CT also having similar issues.

This is probably not very common, especially in the high profile games that are commonly benchmarked on sites, but it's highly noticeable and irritating when it does occur.
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
Originally posted by: Stoneburner
Nvidia has a history of doing a paper launch to coincide with the release of better performing ATI parts.

I hope they break with their tradition :)

What? You are confused! That was the ATI x18 & x19 series!
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: g00n
EVGA Step-up question...

I was wondering, does EVGA even consider the GTX+ a NEW base model? I know you can't step up from, say, an 8800GT to an 8800GT SUPERCLOCKED "SC" model as they are considered the same base model. Does EVGA acknowledge the GTX+ as a whole new line - the successor to the GTX - or simply an enhanced version of that base? They should... If so, then you technically still couldn't go from a 9800GTX+ to a GTX 260+ (GT200b 55nm) and jump over the regular 260, could you? Or, since the 9800GTX+ did come out AFTER the 260, do they acknowledge that and let you jump to the next best release within 90 days even though there is an upgrade model in between?

Pretty sure you will. EVGA only has this stipulation on their OCed models. If NVIDIA puts out a "GTX 280+" with higher clocks than the current GTX 280 it will still be a reference card that you could step-up to. You wouldn't, however, be able to step up to an EVGA "GTX 280+ SSC FTWWTFBBQ" though.

Originally posted by: SolMiester
Originally posted by: Stoneburner
Nvidia has a history of doing a paper launch to coincide with the release of better performing ATI parts.

I hope they break with their tradition :)

What? You are confused! That was the ATI x18 & x19 series!

Well... don't forget the 4870X2... Sure, they called it a "preview" when they managed to get 4870X2's in the hands of review sites and not actually release a real product, but we all know what it really is.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Well, there is a simple enough way to check... does EVGA allow people to step up to the 9800GTX+?
If it does, then it would probably allow a step-up to the GTX 260/280+ as well.