Nvidia SLI


VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: apoppin
. . . NOW picture a 3D render workstation; now picture that same 3D workstation nearly DOUBLING its work output by making a few HW changes . . . .

. . . it's a CHEAP upgrade for the new MB and Xeon procs. ;)

That actually makes an awful lot of sense, considering NVidia's "Gelato" technology, oriented towards the 3D rendering/digital-film industry, which most likely already runs Xeon-based workstations wherever Intel platforms are in use.

NV could make some very significant inroads into those industries, ahead of ATI, because of this graphics solution. I'm not really a fanboy either way, but color me impressed; NV has done some excellent engineering here, if it works out as well as they claim. The only concern is a return of the GF FX5800 Ultra scenario: too much heat and power.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: ViRGE
Originally posted by: desertfox04
Will this combine the graphics cards memory to 512mb?
I highly doubt it. Even the original V2 could only share its frame buffer memory, not its texture memory. The V5 and MAXX had the same issue, so I don't think Nvidia is doing any better, although I'm surprised there was no mention of it.

In order for each card's onboard memory and rendering pipeline to be used efficiently, textures will obviously have to be uploaded to both cards. I'm not sure if the same geometry data is sent to both and the cards then communicate between themselves, or, more likely, the software drivers only send the appropriate geometry to each card. However, I would think it would be faster/more efficient, as well as easier to implement in the software drivers, if the same geometry were sent to both cards and each card's T&L engine did the necessary clipping/culling (different static clip planes on each card?), rather than the host system's CPU.

After all, if applications are already pushing so much geometry data as to be CPU-limited on a single NV-based card, then asking the host CPU to juggle/split/modify that same geometry in order to send it to two cards would seem to load it down even more, leaving the application more CPU-bound than before. No wonder this solution is also oriented towards dual-CPU workstation-class boards.
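For concreteness, here is a minimal sketch of the broadcast-and-clip scheme described above, assuming a simple horizontal split-frame arrangement; every structure and function name here is hypothetical and purely illustrative, not NVIDIA's actual driver interface:

```c
#include <stdio.h>

/* Hypothetical sketch of split-frame SLI geometry handling: the screen
   spans y = 0..599; card 0 owns the top band, card 1 the bottom band. */

typedef struct { float y0, y1, y2; } Triangle;      /* only y matters here */
typedef struct { float band_min, band_max; } Card;  /* static clip band    */

/* Stand-in for work done on each card's own T&L engine: trivially
   reject any triangle that lies entirely outside this card's band. */
static int card_accepts(const Card *c, const Triangle *t) {
    float lo = t->y0, hi = t->y0;
    if (t->y1 < lo) lo = t->y1;
    if (t->y1 > hi) hi = t->y1;
    if (t->y2 < lo) lo = t->y2;
    if (t->y2 > hi) hi = t->y2;
    return !(hi < c->band_min || lo > c->band_max);
}

int main(void) {
    Card cards[2] = { { 0.0f, 299.0f }, { 300.0f, 599.0f } };
    Triangle scene[] = {
        {  10.0f,  50.0f,  90.0f },  /* top band only                   */
        { 280.0f, 310.0f, 350.0f },  /* straddles the split: both keep  */
        { 500.0f, 520.0f, 580.0f },  /* bottom band only                */
    };
    int n = (int)(sizeof scene / sizeof scene[0]);

    /* The host submits the SAME list to both cards (no per-triangle
       CPU work); each card then culls against its own clip band.    */
    for (int c = 0; c < 2; c++) {
        int kept = 0;
        for (int i = 0; i < n; i++)
            if (card_accepts(&cards[c], &scene[i])) kept++;
        printf("card %d rasterizes %d of %d triangles\n", c, kept, n);
    }
    return 0;
}
```

The point of the sketch is that the host uploads one identical triangle list and each card's own clip test discards the geometry outside its band, so no per-triangle splitting work ever lands on the host CPU.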
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: apoppin
according to the specs at THG, 550w'll do

not massive;

you're saying 550w is NOT massive? i've never heard of a PSU greater than 550w, so it sounds pretty serious to me. it makes my 350 watter look like an AA battery. :p
 

caz67

Golden Member
Jan 4, 2004
1,369
0
0
Yes..!!! Very excited, just think of what you could do with the graphics in games..
 

imported_ozone

Junior Member
Jun 15, 2004
8
0
0
Yeah, the best part about Nvidia SLI is that each 6800 takes up (2) slots, so (2) cards take up (4) slots! And if you look at that image, you're left with only one available PCI slot! W00t, I can't wait!
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Originally posted by: desertfox04
Will this combine the graphics cards memory to 512mb?
We already had that. GeForce 2 Ultra, GeForce 3, early GF4 cards...
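As a back-of-the-envelope illustration of ViRGE's point above, if textures must be duplicated on each card, then the two cards' memory does not simply pool; the figures below are assumed example values, not specs from the article:

```c
#include <stdio.h>

/* Why 2 x 256 MB of SLI video memory is not 512 MB for textures:
   every texture must live on BOTH cards, so only the frame-buffer
   work is split. All figures below are assumed example values.    */
int main(void) {
    const int per_card_mb = 256;  /* assumed onboard memory per card */
    const int num_cards   = 2;
    const int framebuf_mb = 32;   /* assumed per-card frame buffer   */

    int total_physical = per_card_mb * num_cards;    /* 512 MB on paper  */
    int usable_texture = per_card_mb - framebuf_mb;  /* one card's share */

    printf("total physical memory: %d MB\n", total_physical);
    printf("memory usable for textures: %d MB (duplicated on each card)\n",
           usable_texture);
    return 0;
}
```

Under those assumptions the "combined" number only ever applies to the marketing sheet; an application still sees a single card's worth of texture space.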
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Originally posted by: flexy
Already the CURRENT single GPUs are *so* sophisticated that HARDLY ANY game around yet even takes advantage of their technologies. Why would it be BETTER to put in 2 or more cards?

Most of us 'geeks' probably have hardware rated at 10 stars.....which runs software rated at 4 or 5 stars, with engines that are 'backwards-compatible' so every low-end dummy can run 'em too. Doom3 and HL2 are NOT yet reality...just a reminder.

You would have a HELL of a time trying to sell me SLI with the slogan 'more power', since *already* the majority of hardware 'power' goes totally UNUSED. Again: SOFTWARE (engines) <---- is the priority, not another 'trick' to somehow scam geeks/users out of more money for something there is no real use/need for.

I am really getting sick of that game...slightly OT, but it's the same with 64-bit CPUs, which they sell for $500 or more at Newegg; the problem is that Longhorn is not even HERE yet....

Besides, as said in another thread, I think SLI is retarded.... (2x the power, heat, noise, $$$)....no new features, etc...etc...etc...
Bah. I have an FX 5900 XT @ 440/900, and I can bring it to its knees with Serious Sam: The First Encounter. Now, the 6800U should be able to play newer games at decent framerates, but it will still be easy to bring it down when Doom 3 and HL2 finally get here. Performance from video cards still leaves a bit to be desired. Sure, the features are waiting, but we can use the raw power of today's cards even with older games.