SLI GT Reviews and questions


Pacfanweb

Lifer
Jan 2, 2000
Originally posted by: aka1nas
That Bit-tech review is pretty suspect: they weren't getting much, if any, SLI scaling, and they were unable to get SLI working at all with the Crysis demo. Most of the other reviews managed it, and I would trust those results more.

Edit: The GT SLI setup was still beating everything but GTX SLI pretty soundly in most of their other tests on top of that. Not bad when a pair of GTs will cost about as much as a single GTX.
They admittedly had problems with their test system, so to me, that renders all of their results suspect.
They had the GT SLI slower than a single GT. That right there tells you something isn't right, especially when every other site that did the same test had the SLI GT's spanking everything but SLI GTX's.

I always understood the argument against SLI before the GT came out, which was it didn't make sense because the two cards it took to beat a single GTX cost more than the single GTX did, and since they didn't really kill the single GTX, why bother?
But the GT rendered that argument obsolete, since you can get two GT's for what one GTX costs.
 

taltamir

Lifer
Mar 21, 2004
Originally posted by: aka1nas
All of the new DX10 games are going to be increasingly shader bound, so a pair of GTs is probably only going to look better and better against a single GTX. That said, neither setup will be desirable in 2-3 years, but when isn't that true?

I agree. The GT is a much newer architecture, and they DO know what they are doing. It's only going to get comparatively BETTER as time goes by, not worse, especially because it's a shader powerhouse (in exchange for everything else), which is exactly where the performance is going to be needed most for true DX10 games.

Keep in mind that the G80 parts were all developed for a theoretical DX10 standard before any actual games existed that used it. Once games were being developed, the G92 came about, changing the architecture by sacrificing memory bandwidth to focus on shaders, and it absolutely rips G80 to shreds.
 

Pacfanweb

Lifer
Jan 2, 2000
Originally posted by: JAG87


haha omg that's fantastically worded. props to you, brother.

I'm just trying to say: once DX10 is upon us, don't come crying to the forums making threads like "omg I got 8800GT SLI and Alan Wake or Hellgate: London or Crysis or whatever other DX10 mumbo jumbo runs slow".

It's not like I'll be sailing with my GTXs; they're pretty terrible at DX10 as well, but I guarantee you I'll get double the FPS, and suddenly your GT isn't a single-slot GTX anymore...
But at that point, it'll be irrelevant, since there will be more current options that beat today's GT's and GTX's.....it's likely neither the GT nor the GTX will ever do a stellar job with DX10...but who cares? Trying to future-proof a system for stuff that'll be out next year or the year after is like spitting into the wind.
But a single GTX isn't going to gain ground on SLI GT's, no matter what the future games are, nor are SLI GT's going to ever compete with SLI GTX's.

IMO, though, I'll get more mileage out of SLI GT's than anyone with a single GTX will, and I won't have to be the one crying that I can only sell my 1100 bucks worth of SLI GTX's for 250 when it's time to upgrade, either.
 

Acanthus

Lifer
Aug 28, 2001
ostif.org
Yeah, because getting a 4% gain over a single card in a game that just came out and is OBVIOUSLY CPU LIMITED, AS IT IS SINGLE THREADED IN THE DEMO, is a perfectly good example of how performance is going to flesh out in Crysis.

You can't tell me that a card that's about 10% behind a GTX in properly working AFR/SFR multi-GPU rendering is going to be "crushed" by its big brother, especially in the scenario where SLI shines (high resolutions).

I'm not sure what you're trying to prove, Jag...

SLI GTs are faster than a single GTX; there's no argument.

SLI GTXs are faster than SLI GTs... big shocker. That also comes with more than double the sticker price and isn't even along the same lines as my comparison.
 

JAG87

Diamond Member
Jan 3, 2006
Originally posted by: Pacfanweb
But at that point, it'll be irrelevant, since there will be more current options that beat today's GT's and GTX's.....it's likely neither the GT nor the GTX will ever do a stellar job with DX10...but who cares? Trying to future-proof a system for stuff that'll be out next year or the year after is like spitting into the wind.
But a single GTX isn't going to gain ground on SLI GT's, no matter what the future games are, nor are SLI GT's going to ever compete with SLI GTX's.

IMO, though, I'll get more mileage out of SLI GT's than anyone with a single GTX will, and I won't have to be the one crying that I can only sell my 1100 bucks worth of SLI GTX's for 250 when it's time to upgrade, either.


First of all, you will see for yourself that SLI GTs will lose to a single GTX in the near future. Just look at Crysis to begin with.

Second of all, most people don't swap video cards every 6 months; some keep them for years, like 2 or 3, and most people buying 8800GTs today are looking at a 2-3 year investment. Frankly, since the card gets absolutely killed in current DX10 titles, I doubt it will do any better in future ones.

And last but not least, by the time I sell my 1100 worth of GTXs I'll have played more than a whole year with them, while you are just getting your first taste of G80. Your mouth still smells like milk, so put a pacifier in it and quiet down.
 

Acanthus

Lifer
Aug 28, 2001
ostif.org
Originally posted by: JAG87
First of all, you will see for yourself that SLI GTs will lose to a single GTX in the near future. Just look at Crysis to begin with.

Second of all, most people don't swap video cards every 6 months; some keep them for years, like 2 or 3, and most people buying 8800GTs today are looking at a 2-3 year investment. Frankly, since the card gets absolutely killed in current DX10 titles, I doubt it will do any better in future ones.

And last but not least, by the time I sell my 1100 worth of GTXs I'll have played more than a whole year with them, while you are just getting your first taste of G80. Your mouth still smells like milk, so put a pacifier in it and quiet down.

The... Crysis... Demo... Is... Single... Threaded...

The... Game... Will... Not... Be...
 

JAG87

Diamond Member
Jan 3, 2006
Originally posted by: Acanthus

The... Crysis... Demo... Is... Single... Threaded...

The... Game... Will... Not... Be...

So what are you saying, that it's CPU bound?

So explain these, then.

EDIT

Particularly, explain the 1920x1200 framerates. Look, Theo from the Inq might be a complete idiot, but he got this one right:

256-bit bus could become a bottleneck in next batch of games

 

Acanthus

Lifer
Aug 28, 2001
ostif.org
Originally posted by: JAG87

So what are you saying, that it's CPU bound?

So explain these, then.

EDIT

Particularly, explain the 1920x1200 framerates. Look, Theo from the Inq might be a complete idiot, but he got this one right:

256-bit bus could become a bottleneck in next batch of games

You know what, Jag? We will see when Crysis launches, won't we.

The demo engine and the game engine are not the same.

If you think for a SECOND it's going to perform identically utilizing 1 core and 4 cores... I'm not even gonna go there.
 

taltamir

Lifer
Mar 21, 2004
What's to explain? Your link shows a single GT getting a slightly worse score than a single GTX... which is completely in line with his argument. He is specifically saying DUAL GT > single GTX. And every test of it so far shows dual GT getting over a 70% improvement compared to a single GTX... and dual GTX will of course beat dual GT... When the game actually supports SLI, there is absolutely no reason to even imagine that a single GTX will outperform dual GT.
 

aka1nas

Diamond Member
Aug 30, 2001
If it was a memory size or bandwidth-related issue, the GTS 640 would be beating the 512MB GT in that benchmark, not the other way around.
 

Acanthus

Lifer
Aug 28, 2001
ostif.org
Originally posted by: aka1nas
If it was a memory size or bandwidth-related issue, the GTS 640 would be beating the 512MB GT in that benchmark, not the other way around.

I was saving that for him later, you ruined it :|
 

n7

Elite Member
Jan 4, 2004
Jag is making a very valid point.

In ridiculously GPU-heavy situations, the GTS 640 MB closes the gap between it & the GT.

In SLI, same thing, & the GTX pulls far ahead in some of those cases.

Obviously, I suspect by the time it really matters we'll all be running much better cards anyway, but the GT is designed around only a 256-bit interface, & like Jag has pointed out, that's an issue in those insanely high resolution + extremely GPU-intensive situations.

I do not see the GT getting closer to GTX performance over time; rather the opposite.
 

Cookie Monster

Diamond Member
May 7, 2005
When you SLI the GT, you not only double the effective bandwidth but both the pixel and texel fillrates.

Since the GTs pack quite a bit of shader performance, they're the better buy, hands down. The only reason it performs worse in some games is immature SLI drivers. SLI also doesn't work in Crysis as of now.
 

MichaelD

Lifer
Jan 16, 2001
Can we honestly say whether or not SLI works in Crysis, since Crysis is not out yet? Allegedly, the code in the DEMO is NOT the same as in the retail version.

Admittedly, I'm a bit confused by this... code is what makes the game, so did they make two games? :confused:

I'm actually hoping it is true, b/c the demo is SINGLE threaded... and whoops on my rig like no tomorrow. I'm hoping the retail game will be multi-threaded and will take advantage of 2 or more CPU cores. There's never been a better time to go quad-core, you know. :D

Anyway, back OT: when Crysis retail hits and NVidia/AMD get SLI/Crossfire working with it, I will probably upgrade to an SLI MB and quad-core at that point.
 

taltamir

Lifer
Mar 21, 2004
Not different code, just unfinished, unimplemented code... they do confirm that the demo is one thread only and the final version is multi-threaded.
 

Margo1

Member
Nov 4, 2007
Your mouth still smells like milk, so put a pacifier in it and quiet down.

Boys, boys, please be civil. Why do men talk about computer parts like cars or football teams? It is becoming clear to me that a video card's performance is subjective. But reading this thread I started is kind of fun.
 

Pacfanweb

Lifer
Jan 2, 2000
Originally posted by: Acanthus
Jag is just plain wrong in this case.

GT SLI slaughters a single GTX across the board.
Just thought this needed repeating, since he keeps changing his criteria to make the GT look not as good as 99% of the review sites say it is.

And even the last link he provided "wholeheartedly recommended" the GT.

Those Crysis figures....IT IS A DEMO. NOT A REAL GAME YET. And the game isn't playable on any hardware at 1920 res, so it's an irrelevant comparison at this point.

And besides, how many people play at those extremely high resolutions? Less than 1%, I'm willing to bet.

Bottom line, OP has been provided with links to GT's in SLI, and all of the legitimate reviews have dual GT's crushing a single anything else in the vast majority of games AND demos that are out there.

While there MAY be a very few select scenarios where the GT comes off second to a single GTX, these are few and far between, and the overwhelming majority of tests show the dual GT is a better performer than a single anything.
In another year that probably won't be the case, but you could say the same about any current card, including dual GTXs.

 

JAG87

Diamond Member
Jan 3, 2006
Originally posted by: Pacfanweb
Originally posted by: Acanthus
Jag is just plain wrong in this case.

GT SLI slaughters a single GTX across the board.
Just thought this needed repeating, since he keeps changing his criteria to make the GT look not as good as 99% of the review sites say it is.

And even the last link he provided "wholeheartedly recommended" the GT.

Those Crysis figures....IT IS A DEMO. NOT A REAL GAME YET. And the game isn't playable on any hardware at 1920 res, so it's an irrelevant comparison at this point.

And besides, how many people play at those extremely high resolutions? Less than 1%, I'm willing to bet.

Bottom line, OP has been provided with links to GT's in SLI, and all of the legitimate reviews have dual GT's crushing a single anything else in the vast majority of games AND demos that are out there.

While there MAY be a very few select scenarios that the GT might come off second to a single GTX, these are few and far between, and the overwhelming amount of tests show the dual GT is a better performer than a single anything.
In another year, that probably won't be the case, but you could say the same thing about any current card right now, including dual GTX's.

and that was the sentence I started my original reply with. SLI is only good for that, and in those scenarios the 8800GT seems to struggle because of its limited frame buffer and bandwidth.

Acanthus, what exactly were you saving for later about the 8800GTS? Look at the benchmark again: at 1920x1200 the GTS gets 13.2 fps while the GT gets 12 fps. Weird for a card that's as good as a GTX, no? Shouldn't it be getting close to 20 fps?

To Cookie Monster, who said that bandwidth doubles with SLI:
You are dead wrong; you have no idea how SLI works. SLI splits the load between the 2 cards, but the card your monitor is plugged into has the duty of putting the split frame together, or the alternate frames together, and outputting the final full-resolution image. This happens in the frame buffer of ONE card, not both.

I think I am in a position to make these comments since I have the hardware, and I have decided to stick with my year-old GTXs for the time being, despite losing a lot of resale value, until nvidia releases new cards that will actually outperform the 8800GTXs in my scenario.

n7 seems to be the only one who understands what I am trying to say; I just hope the OP got it too. Bottom line: SLI is only worth it when running VERY HIGH resolutions with high anti-aliasing, and at those settings the 8800GT struggles. Get it through your head: 512MB and 256-bit is not enough to render high res + high AA, which defeats the whole purpose of having SLI in the first place.

Do you even have SLI, by the way? Or am I talking to a "don't worry, I read AT and I know it all" type of person?

EDIT

Thought I'd throw this into the mix:
http://forums.anandtech.com/me...=2115260&enterthread=y
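The 512MB/256-bit claim above is easy to sanity-check with back-of-envelope math. The sketch below is hypothetical and not from the thread: the buffer layout and byte sizes are simplified assumptions, and real drivers use compression, so treat the result as an order-of-magnitude figure only.

```python
# Rough VRAM budget for the render targets alone at 1920x1200 with 4x MSAA.
# Simplified assumptions: 4 bytes per color sample, 4 bytes per depth sample,
# one resolved back buffer; no compression, textures, or geometry counted.
w, h, msaa = 1920, 1200, 4
bpp_color = bpp_depth = 4                    # bytes per sample

color_target = w * h * msaa * bpp_color      # multisampled color buffer
depth_target = w * h * msaa * bpp_depth      # multisampled depth buffer
resolved = w * h * bpp_color                 # resolved back buffer

total_mb = (color_target + depth_target + resolved) / 2**20
print(round(total_mb, 1))  # ~79.1 MB before any textures or geometry
```

Even on these simplified assumptions, the render targets alone eat a noticeable slice of a 512MB card before a single texture is loaded, which is the shape of the frame-buffer argument, whatever one concludes from it.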

 

Acanthus

Lifer
Aug 28, 2001
ostif.org
Originally posted by: JAG87
and that was the sentence I started my original reply with. SLI is only good for that, and in those scenarios the 8800GT seems to struggle because of its limited frame buffer and bandwidth.

Acanthus, what exactly were you saving for later about the 8800GTS? Look at the benchmark again: at 1920x1200 the GTS gets 13.2 fps while the GT gets 12 fps. Weird for a card that's as good as a GTX, no? Shouldn't it be getting close to 20 fps?

To Cookie Monster, who said that bandwidth doubles with SLI:
You are dead wrong; you have no idea how SLI works. SLI splits the load between the 2 cards, but the card your monitor is plugged into has the duty of putting the split frame together, or the alternate frames together, and outputting the final full-resolution image. This happens in the frame buffer of ONE card, not both.

I think I am in a position to make these comments since I have the hardware, and I have decided to stick with my year-old GTXs for the time being, despite losing a lot of resale value, until nvidia releases new cards that will actually outperform the 8800GTXs in my scenario.

n7 seems to be the only one who understands what I am trying to say; I just hope the OP got it too. Bottom line: SLI is only worth it when running VERY HIGH resolutions with high anti-aliasing, and at those settings the 8800GT struggles. Get it through your head: 512MB and 256-bit is not enough to render high res + high AA, which defeats the whole purpose of having SLI in the first place.

Do you even have SLI, by the way? Or am I talking to a "don't worry, I read AT and I know it all" type of person?

EDIT

thought Id throw this into the mix
http://forums.anandtech.com/me...=2115260&enterthread=y

You are retarded.

ONE GTX < TWO GTs

At no point did I even come close to saying that a GT was faster than a GTX. Stop trying to twist it in that direction. The only places you see a GTX beating GT SLI is in places where SLI is obviously not working.

It is crystal clear that:
1. You're too stubborn to admit you're wrong.
2. You have some skewed reason to think the GTX is magic (probably because you own 2).
 

JAG87

Diamond Member
Jan 3, 2006
Originally posted by: Acanthus

You are retarded.

ONE GTX < TWO GTs

At no point did I even come close to saying that a GT was faster than a GTX. Stop trying to twist it in that direction. The only places you see a GTX beating GT SLI is in places where SLI is obviously not working.

It is crystal clear that:
1. You're too stubborn to admit you're wrong.
2. You have some skewed reason to think the GTX is magic (probably because you own 2).

From the other thread I linked:

Originally posted by: munky
Originally posted by: deerhunter716
PLENTY of benchmarks that show it does very well at 1920x1200 which is what I run. I have YET to see where running AA or AF makes a difference that I can really notice. It will do just fine even with today's games.

http://sg.vr-zone.com/articles..._GT_Review/5369-7.html

It BEATS the GTX on Crysis at 1920x1200 -- yes with no AA and no AF which again I have tried with my GTX and see no visible difference when playing.

You may not see a difference, but I see plenty of difference between AA and no AA at that resolution, and I have seen plenty of benches where the 8800gt falls far behind a gtx even at 16x12, never mind 19x12. Besides being 2x as slow in Crysis, it also falls on its face here, here, here and here, just to name a few examples. There is no way that card would last me even a year without having to turn down settings or give up AA. I'm going to need a real high-end card for that resolution, and even the gtx does not represent what a high-end card should be at this point in time.

We're only talking about SLI and 'high res high AA' scenarios here, which is what SLI SHOULD BE USED FOR; otherwise it's a waste of money. Wow, look at Call of Juarez 1920x1200 4xAA: from 8.4 fps (GT SLI) to 17.5 fps (single GTX). AM I DREAMING? DiRT 1600x1200 4xAA: from 4.5 fps to 24.5 fps. Whoa, kind of skewed for a card that's just as good as a GTX.

It's crystal clear that:
1. You are too stubborn to admit that the 8800GT doesn't excel at 'high res high AA'.
2. You have some skewed reason to think that a 512MB card with a 256-bit bus can render high-res frames, because as we have discussed, SLI doesn't double the frame buffer NOR the bandwidth.

And take it easy on the insults. Let's not start now, shall we.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: JAG87
To Cookie Monster, who said that bandwidth doubles with SLI:
You are dead wrong; you have no idea how SLI works. SLI splits the load between the 2 cards, but the card your monitor is plugged into has the duty of putting the split frame together, or the alternate frames together, and outputting the final full-resolution image. This happens in the frame buffer of ONE card, not both.

What are you on about? What you just said is how rendering works with SLI, and I don't see how that's related to what I was saying.

SLI does double your effective bandwidth and pixel/texel fillrate, NOT the frame buffer size.

A good example is the 7950GX2 (which was connected through a SLi bridge).
Link

Notice the pixel/texel fillrates are 16000MP/s and 24000MT/s, and the bandwidth is an effective 76.8GB/s.

Two G71s clocked at 500MHz/1200MHz (core/mem).
So basically one G71 provides pixel/texel fillrates of 8000MP/s and 12000MT/s, not to mention 38.4GB/s of bandwidth.

Now multiply that by two.

edit - If you still don't believe this, then you or I can go ask the big boys at B3D directly. ;)
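The arithmetic in that post can be reproduced from the per-GPU numbers. A minimal sketch; the unit counts for G71 (16 ROPs, 24 TMUs, 256-bit bus) are assumptions taken from public spec sheets, not stated in the thread:

```python
# Recompute the quoted 7950GX2 figures from single-G71 numbers.
core_mhz, mem_mhz_eff = 500, 1200       # clocks quoted in the post
rops, tmus, bus_bits = 16, 24, 256      # assumed G71 configuration

pixel_fill = core_mhz * rops            # MP/s per GPU
texel_fill = core_mhz * tmus            # MT/s per GPU
bandwidth_gb = mem_mhz_eff * bus_bits / 8 / 1000  # GB/s per GPU

print(pixel_fill, texel_fill, bandwidth_gb)              # 8000 12000 38.4
print(2 * pixel_fill, 2 * texel_fill, 2 * bandwidth_gb)  # 16000 24000 76.8
```

Doubling each per-GPU figure does land exactly on the GX2 numbers quoted above (16000MP/s, 24000MT/s, 76.8GB/s).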




 

JAG87

Diamond Member
Jan 3, 2006
Originally posted by: Cookie Monster

What are you on about? What you just said is how rendering works with SLI, and I don't see how that's related to what I was saying.

SLi does double your effective bandwidth and pixel/texel fillrate. NOT FRAME BUFFER SIZE.

A good example is the 7950GX2 (which was connected through a SLi bridge).
Link

Notice the pixel/texel fillrates are 16000MP/s and 24000MT/s, and the bandwidth is an effective 76.8GB/s.

Two G71s clocked at 500MHz/1200MHz (core/mem).
So basically one G71 provides pixel/texel fillrates of 8000MP/s and 12000MT/s, not to mention 38.4GB/s of bandwidth.

Now multiply that by two.


GX2 DOES NOT EQUAL SLI. not even close.

notice how the GX2 has 1024MB of frame buffer. does memory add up with SLI?

the memory bus in the GX2 is connected internally like you said, and this had to be done since the single card has to output a finished frame.

SLI doesn't work like that, the driver splits/alternates the frames between the two cards, then retrieves the information from the second card, sends it back to first card which then synthesizes the two pieces together to output the final finished video stream. Thats why although there are 2 cards rendering the image, the bandwith does not double, and neither does the frame buffer, because yes the rendering workload is split but the compiling is still done in the primary card.

I hope this helps you understand why high resolution output with 8800GT SLI cannot perform well, and why it's a waste of money.


edit
As a matter of fact, to back up my statements you can take a look in the NVIDIA control panel. With a GX2 card there is NO REFERENCE made to SLI: the option is called "enable Multi-GPU", while when you have two physical cards it says "enable SLI".
 

Pacfanweb

Lifer
Jan 2, 2000
13,158
59
91
I guess the problem I have with the examples that Jag is using is this: None of them, on any current card, are playable. None. So it's irrelevant whether or not the GT starts wheezing at 1920 res with everything turned on....because so does the GTX and every other card. And that isn't going to change as more games that are so hard for a video card to run come out....those cards are still not going to let you play them at those high resolutions....and you end up buying a next-generation card, whatever that may be, to play those resolutions.

So, all you can do is look at the resolutions that ARE playable, and the GT comes off quite nicely there. And it will continue to fare very well...the current games aren't going to magically get harder to run, and Crysis isn't really relevant yet, anyway, just like UT3, because we don't have the full game.

You're sitting here arguing that the GT isn't good because it won't run an unreleased game as well as a GTX can, yet neither card can hack that resolution.....you might as well compare a Geforce 4600 to a Radeon 9700 Pro and argue about which one runs Crysis better.....it doesn't matter, because neither is playable.

Stop cherry-picking specific scenarios to back up your argument and look at the whole picture....which is, the GT is the best bang for the buck currently (and only a complete idiot would argue against that), and in 99% of benchmarks can't be beaten in SLI by any current single card. (GTX)

So if you want to play the games that are out NOW and the near future, you'll be perfectly okay in the vast majority of cases with a dual GT setup, and in fact, with a single GT, particularly if you're like the majority of people and run 16x12 or lower.

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Its memory bus ISN'T connected internally. I never said anywhere that it is, nor did I state that memory adds up. Why are you putting words in my mouth?

It's connected by an SLi bridge. Why do you think the GX2 suffered from the same SLi limitations, e.g. Vsync issues, or not performing as well in titles where SLi doesn't work, etc.?

Anyway, I've been discussing this with ChrisRay and others who have profound knowledge of multi-GPU technology. Things I've come across:

-SLi is technically very effective in fillrate-bound situations, i.e. high resolutions and high AA, because it effectively "doubles" the fillrate thanks to the secondary GPU.
-What you described is only how SFR (Split Frame Rendering) works, which isn't as efficient as AFR. AFR is used the most, and here is what AFR is:
"Alternate Frame Rendering (AFR): One Graphics Processing Unit (GPU) computes all the odd video frames, the other renders the even frames. (i.e. time division)"

"Alternate Frame Rendering (AFR), the second rendering method. Here, each GPU renders entire frames in sequence - one GPU processes even frames, and the second processes odd frames, one after the other. When the secondary card finishes work on a frame (or part of a frame) the results are sent via the SLI bridge to the master GPU, which then outputs the completed frames. Ideally, this would result in the rendering time being cut in half, and thus performance from the video cards would double."

-This is why bandwidth/shader performance/fillrates effectively double (assuming the load is distributed evenly, and especially in fillrate-bound situations, i.e. high-resolution/AA environments), BUT not frame buffers, because the output is sent out from the master GPU (the completed frame from the second GPU is sent to the first GPU via the SLi bridge).
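The AFR scheme quoted above reduces to a simple even/odd schedule. A minimal sketch (function name is mine, purely illustrative): GPU 0 takes the even frames and GPU 1 the odd ones, so each GPU gets roughly two frame-times per frame, which is why throughput roughly doubles while the frame buffer does not.

```python
# Toy AFR schedule: frame N goes to GPU (N mod 2).
# Models the even/odd frame division described in the quote above.
def afr_schedule(num_frames, num_gpus=2):
    return [(frame, frame % num_gpus) for frame in range(num_frames)]

print(afr_schedule(6))  # [(0, 0), (1, 1), (2, 0), (3, 1), (4, 0), (5, 1)]
```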

Conclusion:
8800GT SLi is a MUCH better buy than a single 8800GTX because the benefits heavily outweigh the cons, especially since an 8800GT SLI setup is priced around a single GTX.

ChrisRay's 8800GT SLi vs 8800GTX SLi preview

For the same price as a single 8800GTX, you can get up to 60-70% more performance.

Lastly, the 7950GX2 IS SLi:

Link

You see, at the heart of the GX2 is NVIDIA's SLI technology. To put it simply, the GX2 is basically two 512MB GeForce 7900 cards stacked one on top of the other and linked together via a sort of expanded SLI bridge. There is no revolutionary GPU hiding under the twin coolers, it's essentially the same G71 with all the same features that we already have on the 7900 GTX and 7900 GT. And because its heart is SLI, it has all the same disadvantages that come with a regular dual-card SLI setup.


Now close this thread, because I think this just ended the whole debate.

;)

edit - And sorry for going a bit OT.
 

Margo1

Member
Nov 4, 2007
30
0
0
I don't understand the technical aspect of most of the posts here. But I have a general idea that SLI would be worth it for me: Mostly higher frame rates at 1920x1200 resolution. Thanks to all who participated in the conversation. It was entertaining.