Nvidia SLI

Coherence

Senior member
Jul 26, 2002
337
0
0
In the simplest terms, SLI splits the screen in half so one card draws half the screen and the other card draws the other half, thereby increasing overall performance since each card has less work to do. (If memory serves, this was first done a few years ago by 3dfx, which we know was bought out by Nvidia.)
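
Conceptually the work split looks something like this (just my own rough sketch in Python of the idea, not Nvidia's actual scheme - the resolution and function names are made up for illustration):

    # Rough illustration only: divide the screen into horizontal bands
    # and hand each GPU one band of scanlines to render.
    SCREEN_HEIGHT = 1200  # assumed resolution for the example

    def assign_scanlines(num_gpus=2):
        band = SCREEN_HEIGHT // num_gpus
        return {gpu: range(gpu * band, (gpu + 1) * band) for gpu in range(num_gpus)}

    # With two cards: GPU 0 draws lines 0-599, GPU 1 draws lines 600-1199,
    # so each card only rasterizes half the pixels per frame.
    print(assign_scanlines())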

Not to start a flame war, but why does this make anyone excited? Doesn't it seem like a lazy way for the engineers to increase performance? More cards, more performance. Well, DUH!!

Why not split the screen into thirds, or even quarters? They could sell even more cards then! :roll:

Sorry, but I'm just not impressed. Honestly, it just sounds to me like the engineers Nvidia picked up from 3dfx have hit some kind of roadblock and can't think of anything better to increase their benchmarks.

The cost of both the extra card needed plus the space inside the PC seems like an awful waste for such a setup.

Opinions?

(Oh, and just so people don't accuse me of being an ATi fanboi, I have 3 PC's in my home, and 2 of them run Nvidia.)
 

Lazy8s

Golden Member
Jun 23, 2004
1,503
0
0
I have to agree. I think dual graphics cards will go the way of dual processors in the gaming department. I mean can you imagine the requirements on the boxes?

Requirements:
Intel 12.9GHz or higher (or compatible)
3 gigs of RAM (PC9000 or higher)
2x NVIDIA FX 72000 512MB (or higher)

I just don't see that happening to tell ya the truth. I mean it SOUNDS like a good idea until you realize that in a year you could spend half the money and buy a single card that can do the same thing.
 

Megatomic

Lifer
Nov 9, 2000
20,127
6
81
I guess SMP systems don't excite you either? Sometimes two is just flat out better than one... :p

And if you don't think that a lot of engineering was spent coming up with this solution then you are not seeing things clearly. This is a monumental leap forward and it required them (NV) to innovate in hardware and software. I AM impressed.
 

Steg55

Member
May 13, 2004
96
0
0
See here. All the nVidia SLI discussion you could need.

And the answer to your question? HELL YES we are excited. nVidia doesn't actually desperately need to improve their benchmark scores, as they are on a level with (or beating) ATi in more or less every benchmark. This SLI has been engineered in since the birth of the NV40 - it's just taken a while for it to surface.

Steg
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
1. Voodoo. Not your average cheap PC dealer.
2. The gaming use is to get people excited. But look, you'll need a Xeon. What does this really mean? People who need the performance on their workstations can get it. As in Quadro.
3. Also, ~77% scaling is excellent, and this is certainly an easier and cheaper way to do the job than trying to fab a 400m transistor chip and use a 512-bit RAM interface--the 220m part is already causing them headaches due to low yields. I wouldn't be surprised if they release a low-end model w/ 8 pipes pretty soon.
 

Megatomic

Lifer
Nov 9, 2000
20,127
6
81
Originally posted by: Cerb
2. The gaming use is to get people excited. But look, you'll need a Xeon. What does this really mean? People who need the performance on their workstations can get it. As in Quadro.
There are "rumors" floating around that say that the NF4 chipset will allow dual (or more) PCI Express slots. I am very excited about that.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Depends on how well NVidia is able to bring it to market, and what they do with it once it makes it. If they support the feature well, bring it to lower end cards, and continue to develop its feature set, then it should be a welcome addition. Unlike dual CPUs, which require application support (no current games) to truly benefit, this should always improve performance by quite a bit in all video-card-limited applications, past, present, and future. If NVidia gives it crap support like ATi did with its MAXX card and doesn't make it more financially accessible to the masses, then this will quickly die off and simply be a footnote for the gaming community.
 

Lazy8s

Golden Member
Jun 23, 2004
1,503
0
0
That was where I was going with this. I don't see myself or any of my friends or even most "hardcore gamers" (though our week-long LANs get us almost to that level) shelling out $300+ per card, plus paying for the connector as well as a new MoBo. I mean you'd pay closer to $700 to do this with a 6800. Couldn't even the most hardcore gamer get the framerate they desire for a lot less than $700?

My only concern would be the exact sync of the top and bottom half. Is it impossible to get a frame off? I would assume so, but I think I would have to see real benefits to this technique before I shelled out for it.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Yes!!!!

It is VERY exciting!!!!

Stop thinking about GAMES for a bit . . .

(really) . . .



. . . NOW picture a 3D render Workstation; now picture that same 3D workstation nearly DOUBLING its work output by making a few HW changes . . . .

. . . it's a CHEAP upgrade for the new MB and Xeon procs. ;)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Megatomic
Originally posted by: apoppin
Stop thinking about GAMES for a bit . . .
You sure do know how to kill a guy's buzz... :(



:D
Well, i was taking a BREAK from playing Thief: Deadly Shadows (III) when i accidentally wandered in here. :p

:roll:

apoppin OUT!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

:D

(apoppin blackjacks another fanboi, looks furtively around, listens for clues, and sneaks away without signing out . . . . ) :D

(yeah sure)

back to da buzz . . . .
 

0roo0roo

No Lifer
Sep 21, 2002
64,795
84
91
Not to start a flame war, but why does this make anyone excited? Doesn't it seem like a lazy way for the engineers to increase performance? More cards, more performance. Well, DUH!!

No, not realistically. If the competitor has one card that can match your SLI, you are doomed. It requires anything but laziness to pull off. But it also gives you a new upgrade path.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Already the CURRENT single GPUs are *so* sophisticated that HARDLY ANY game around yet even takes advantage of their technologies. Why would it be BETTER to put in 2 or more cards?

Most of us 'geeks' probably have hardware rated at 10 stars.....which runs software rated at 4 or 5 stars, with engines that are 'backwards compatible' so every low-end dummy can run 'em too. Doom3 and HL2 are NOT yet reality...just a reminder.

You would have a HELL of a time trying to sell me SLI with the slogan 'more power'..since *already* the majority of hardware 'power' goes totally UNUSED. Again: SOFTWARE (engines) <---- is the priority, and not another 'trick' to somehow scam the geeks/users out of more money for something there is not really a use/need for.

I am really getting sick of that game...slightly OT...but it's the same with 64-bit CPUs, which they sell for $500 or more at Newegg; the problem is just that Longhorn is not even HERE....

Besides...as said in another thread...i think SLI is retarded.... (2x more power, heat, noise, $$$)....no more features etc...etc...etc...
 

0roo0roo

No Lifer
Sep 21, 2002
64,795
84
91
No, not really. At very high resolutions with all the AF/AA turned on, even these new cards drop way below 100fps in the newer games already.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Yeah, I have to sort of agree. It's so expensive to have 2 cards and it's really not a technological feat. It's simply having more of the same.

But there are some good points that I will admit:
1. Why not? It doesn't seem like it would be hard to do so you might as well do it. It's a lot easier than implementing new technology.
2. Good for workstation people.
3. After the card has become old, it may be cheap to upgrade your system to 2 cards if you already have 1.

And by the way - shouldn't we not be calling this SLI? It doesn't interleave lines. It just splits the screen into halves.
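
To put the naming point in concrete terms (my own rough sketch; the mappings are hypothetical illustrations, not either vendor's actual implementation): 3dfx's Scan-Line Interleave really did alternate scanlines between the two cards, while the new approach hands each card one contiguous half of the frame.

    # Hypothetical line-to-GPU mappings for a tiny 8-line screen.
    HEIGHT = 8

    def scanline_interleave(line):   # 3dfx-style "Scan-Line Interleave"
        return line % 2              # even lines -> GPU 0, odd lines -> GPU 1

    def split_frame(line):           # what the new cards appear to do
        return 0 if line < HEIGHT // 2 else 1   # top half -> GPU 0, bottom half -> GPU 1

    print([scanline_interleave(l) for l in range(HEIGHT)])  # [0, 1, 0, 1, 0, 1, 0, 1]
    print([split_frame(l) for l in range(HEIGHT)])          # [0, 0, 0, 0, 1, 1, 1, 1]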
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Nobody has mentioned this, but what about the power requirements?

Surely two of these cards will push 99% of PSUs over the edge in a millisecond. What are they going to do about this?
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: flexy
Already the CURRENT single GPUs are *so* sophisticated that HARDLY ANY game around yet even takes advantage of their technologies. Why would it be BETTER to put in 2 or more cards?

Most of us 'geeks' probably have hardware rated at 10 stars.....which runs software rated at 4 or 5 stars, with engines that are 'backwards compatible' so every low-end dummy can run 'em too. Doom3 and HL2 are NOT yet reality...just a reminder.

You would have a HELL of a time trying to sell me SLI with the slogan 'more power'..since *already* the majority of hardware 'power' goes totally UNUSED. Again: SOFTWARE (engines) <---- is the priority, and not another 'trick' to somehow scam the geeks/users out of more money for something there is not really a use/need for.

I am really getting sick of that game...slightly OT...but it's the same with 64-bit CPUs, which they sell for $500 or more at Newegg; the problem is just that Longhorn is not even HERE....

Besides...as said in another thread...i think SLI is retarded.... (2x more power, heat, noise, $$$)....no more features etc...etc...etc...
Unused is hard to quantify in this case. If you define "unused" as features, then this is true, as no one has really maximized the use of Shader Model 2.0 features yet. But if you define it as load, then current video cards are always getting loaded down; it may take 4x AA to get there, but they are getting loaded and used.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: desertfox04
Will this combine the graphics cards' memory to 512MB?
I highly doubt it. Even the original V2 could only share its frame buffer memory, not its texture memory. The V5 and MAXX had the same issue, so I don't think Nvidia is doing any better, although I'm surprised there was no mention of it.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SickBeast
Nobody has mentioned this, but what about the power requirements?

Surely two of these cards will push 99% of PSUs over the edge in a millisecond. What are they going to do about this?
according to the specs at THG, 550w'll do

not massive;

my TT 480w would (happily) do 2 GTs

;)

Power Requirements:
The power consumption of such an SLI system will be immensely high. In addition to the CPU, the power supply will have to be powerful enough to feed the two x16 slots with 75 Watts each, in addition to the GF 6800 cards' auxiliary power connectors. On top of that, there are of course the remaining components such as hard drives, optical drives etc. that also draw power.

A power draw of 250 Watts for the 6800 Ultra SLI solution is very realistic. NVIDIA is confident that a PC equipped with sundry drives and a 6800U SLI configuration should be able to run with a power supply rated at 550 Watts. Consequently, smaller power supplies should be sufficient for a 6800 GT or 6800 (standard) SLI configuration.
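
Rough back-of-the-envelope math on those numbers (my own guesswork for the split between slot and auxiliary power; only the 75W-per-slot figure, the ~250W estimate for the pair, and the 550W recommendation come from the article):

    # Rough power budget for a 6800 Ultra SLI box (illustrative numbers only).
    slot_power     = 2 * 75   # up to 75 W delivered through each x16 slot
    aux_power      = 2 * 50   # assumed draw through each card's auxiliary connector
    cards_total    = slot_power + aux_power        # ~250 W, in line with the estimate above
    rest_of_system = 200                           # assumed: CPU, drives, fans, etc.
    print(cards_total + rest_of_system)            # ~450 W, leaving headroom under a 550 W PSU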

not too much cheese for a 3D RENDER WORKSTATION :p

:roll:

They are OVERclockABLE!!! (sure, why not?)
 

batmanuel

Platinum Member
Jan 15, 2003
2,144
0
0
Originally posted by: apoppin
Yes!!!!

It is VERY exciting!!!!

Stop thinking about GAMES for a bit . . .

(really) . . .



. . . NOW picture a 3D render Workstation; now picture that same 3D workstation nearly DOUBLING its work output by making a few HW changes . . . .

. . . it's a CHEAP upgrade for the new MB and Xeon procs. ;)

Isn't most production 3D rendering done on clustered 1U rackmounts with RAGE chips? I thought that the Quadros are only put in the workstations that are used for modeling and animation, and those machines are only put into service rendering during evenings when they are not being used by the animators. As far as I know, Renderman and similar rendering software still rely solely on the system's CPU to do rendering tasks. I think the new Quadro cards can assist the CPU if the rendering software is written to support it, but in the end I think it is still more cost effective for most production houses to simply buy more render clients to speed up rendering time than to outfit all the existing machines with Quadros. Someone please correct me if I am wrong.
 

txxxx

Golden Member
Feb 13, 2003
1,700
0
0
It doesn't really get me excited - in fact, it's quite sad just how inefficient modern architecture really is! Maybe they should address these power thirst issues first.
 
Feb 10, 2000
30,029
67
91
I'm preliminarily inclined to say that yes, we should be excited, assuming the real-world performance gains are close to what's been promised.

It seems to me for those of us with aging systems, it makes the nVidia upgrade path very attractive indeed.

At this point, the 6800 cards appear to be very comparable to their X800 competitors. You'd have to think nVidia will incorporate support for dual PCIe video slots into its next-gen AMD chipsets, so planning ahead for SLI will just mean buying a newer nVidia-chipset mobo and nVidia vid card. For my part, I'd probably get a 6800GT at first, secure in the knowledge that the card's price will drop. Once the card came down to at or near $200, I could just drop in a second one for a huge boost in performance. If, by that time, ATi or nVidia has introduced a new card that betters the performance of 6800GTs in SLI, then no harm done, and I could upgrade to the new card instead if I chose to.

I remember paying $600 for my two Voodoo 2s back in the day . . .
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Steg55
See here. All the nVidia SLI discussion you could need.

And the answer to your question? HELL YES we are excited. nVidia doesn't actually desperately need to improve their benchmark scores, as they are on a level with (or beating) ATi in more or less every benchmark. This SLI has been engineered in since the birth of the NV40 - it's just taken a while for it to surface.

Steg

Weren't there some persistent rumors around the time that NV absorbed 3Dfx, that although the up-and-coming NV products in the pipeline at the time wouldn't have it, later products would incorporate some of 3Dfx's technology? I guess perhaps this is what was rumored - NVidia SLI tech? I like it. :) All they need to do now is support multi-monitor rigs for true "surround gaming". Hardcore gamers would pay through the nose for a rig like that.