Nvidia brings SLI back to life. Coming in September.

TourGuide

Golden Member
Aug 19, 2000
1,680
0
76
The uptake on this idea will be minimal at best. Who in their right mind would shell out twice the price of current leading-edge technology? For heaven's sake - you can build an entire box for what they want to make this happen.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: TourGuide
The uptake on this idea will be minimal at best. Who in their right mind would shell out twice the price of current leading-edge technology? For heaven's sake - you can build an entire box for what they want to make this happen.

Two less-than-top-end cards could make for a very attractive value/performance option, perhaps?
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Err, how would AA/AF work between the two cards? If a texture is split between them, would they know which card is doing what, or if there is a boundary between the two cards, how would they work out what to do?
With AA you take samples from surrounding pixels, so does that mean there would be overlap at the break between the two sections of the screen, so that each card had enough information to be able to do AA?
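
As a rough sketch of the overlap idea being asked about here (purely illustrative, with made-up names and numbers, and not NVIDIA's actual implementation), a split-frame renderer could pad each card's slice with a small guard band of extra scanlines so that AA filtering near the split still has the neighboring samples it needs:

```python
# Illustrative sketch only - not NVIDIA's actual SLI logic.
# Each card renders a few rows past the split boundary so its AA filter
# never has to sample rows it didn't render.

def split_with_guard_band(height, split_ratio=0.5, aa_filter_radius=1):
    """Return the (start, end) scanline range each GPU renders, padded for AA."""
    boundary = int(height * split_ratio)
    top = (0, min(height, boundary + aa_filter_radius))
    bottom = (max(0, boundary - aa_filter_radius), height)
    return top, bottom

print(split_with_guard_band(1200, split_ratio=0.5, aa_filter_radius=2))
# ((0, 602), (598, 1200)): rows 598-601 get rendered by both cards, but
# each card only outputs the final pixels on its own side of row 600.
```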
 

Slick5150

Diamond Member
Nov 10, 2001
8,760
3
81
At the resolutions this setup would be able to play at while keeping high framerates, you really wouldn't even need anti-aliasing anyway.
 

Rent

Diamond Member
Aug 8, 2000
7,127
1
81
Originally posted by: rbV5
Originally posted by: TourGuide
The uptake on this idea will be minimal at best. Who in their right mind would shell out twice the price of current leading-edge technology? For heaven's sake - you can build an entire box for what they want to make this happen.

Two less-than-top-end cards could make for a very attractive value/performance option, perhaps?

Keep in mind it's not like you have to do it all at one time... As time goes on and performance starts to drag, hey, go pick up another card and give your performance a healthy boost. The longer the cards are out, the more the price will go down...
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Rent
Originally posted by: rbV5
Originally posted by: TourGuide
The uptake on this idea will be minimal at best. Who in their right mind would shell out twice the price of current leading-edge technology? For heaven's sake - you can build an entire box for what they want to make this happen.

Two less-than-top-end cards could make for a very attractive value/performance option, perhaps?

Keep in mind it's not like you have to do it all at one time... As time goes on and performance starts to drag, hey, go pick up another card and give your performance a healthy boost. The longer the cards are out, the more the price will go down...


Give this man a cigar!!!!!!

And besides, if you bought two 6800nu's for, say, $600.00, you would only be paying $100.00 more than if you bought a single 6800 Ultra. When PCI Express becomes more common, there will be plenty of OEMs making dual PCIe x16 slot boards, and the prices will come down. I would rather have two 6800nu's than a single 6800U any day of the week. Of course, two 6800U's would be the pinnacle, but at a $1,000.00 price point.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: PinwiZ
At the resolutions this setup would be able to play at while keeping high framerates, you really wouldn't even need anti-aliasing anyway.

You would, however, need a very good monitor.

Since most people are using LCDs now (or buying them anyway), unless you had the $$$ for a 20" LCD for 1600x1200 you would be screwed.
1280x1024 is the standard 17~19" LCD res, IIRC. This is not a very good resolution if you cannot use AA/AF, since the extra graphics power would be wasted.
AA/AF would almost be needed to make the extra purchase worth it - or why not just go for a single card that can hit good frame rates at that res?
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Lonyo
Err, how would AA/AF work between the two cards? If a texture is split between them, would they know which card is doing what, or if there is a boundary between the two cards, how would they work out what to do?
With AA you take samples from surrounding pixels, so does that mean there would be overlap at the break between the two sections of the screen, so that each card had enough information to be able to do AA?

I would guess that the card responsible for rendering a certain percentage of the screen would also be responsible for the AA on it. And I don't believe textures will be "split". Each card keeps the same textures, but only uses its full power to render its load-balanced percentage of the final image.
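
A minimal sketch of the arrangement guessed at above (the names and the 60/40 split are invented for illustration, not taken from NVIDIA's drivers): both GPUs keep the full texture set, but each one renders, and resolves AA for, only the scanlines inside its load-balanced slice of the frame.

```python
# Minimal sketch, with made-up names and a made-up 60/40 split.
# Textures are not divided; only responsibility for scanlines is.

def owner_of_scanline(y, split_row):
    """Return which GPU renders (and anti-aliases) scanline y."""
    return "gpu0" if y < split_row else "gpu1"

FRAME_HEIGHT = 1200
split_row = int(FRAME_HEIGHT * 0.60)   # current load-balanced split point

assert owner_of_scanline(100, split_row) == "gpu0"   # top slice
assert owner_of_scanline(900, split_row) == "gpu1"   # bottom slice
```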
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,777
31,786
146
nV pci-E SLI nForce4+Soundstorm2+A64=me sporting a chub :laugh:
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Lonyo
Originally posted by: PinwiZ
At the resolutions this setup would be able to play at while keeping high framerates, you really wouldn't even need anti-aliasing anyway.

You would, however, need a very good monitor.

Since most people are using LCDs now (or buying them anyway), unless you had the $$$ for a 20" LCD for 1600x1200 you would be screwed.
1280x1024 is the standard 17~19" LCD res, IIRC. This is not a very good resolution if you cannot use AA/AF, since the extra graphics power would be wasted.
AA/AF would almost be needed to make the extra purchase worth it - or why not just go for a single card that can hit good frame rates at that res?

Where does it say that most people are using LCDs now? How did you come to that conclusion?
Hardcore gamers tend to steer clear of LCDs, and the gamers who do use them have to spend a ton of cash to get an LCD screen that is even 70% capable of performing like a good-quality CRT. I bought two 18" Dell LCDs this past Christmas, one for my wife and one for my mother. They love them, but they don't game, so they didn't care. I still have my 19" Dell CRT flat screen that handles 1600x1200 just fine.

And anyway, as stated in the article, this setup is for the true enthusiast, which means money is no object when it comes to getting the best of the best, no matter what it takes.
 

webmal

Banned
Dec 31, 2003
144
0
0
"And I have to say, if I were about to pull the trigger on a new ATI video card purchase that I knew would be for a long-time install, this NVIDIA SLI technology would certainly make me stop and rethink my position." - ATI no 1 fan
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
And once again, THG manages to screw up the facts.




Originally posted by: Tom's Hardware Guide

Like Alienware's solution, the NVIDIA setup has one card rendering the top half of the scene while the other one renders the bottom. The important difference here is that this division is not fixed at 50/50 but is flexible. Using a dynamic load-balancing technique, NVIDIA divides the load evenly between the two cards, letting each work at full capacity.

It will be interesting to see what will happen to Alienware's recently announced graphics array technology. Compared to SLI, this system is much more complex and less effective due to the lack of load balancing.




Originally posted on: Alienware VideoArray FAQ

Does each video card process 50% of the screen?

Video Array uses a "Predictive Load Balancing" technology that evaluates, on each frame, the processing load for each GPU. Based on this, it "predicts" the load distribution for the next frames, and adjusts the "Split Ratio" accordingly. While the system always starts at a 50% split, as the content of the screen changes, the ratio changes accordingly (75/25, 85/15, 80/20, etc.). This logic enables Video Array to maximize the use of the graphics processing power from each card.




Does the THG staff even bother looking for facts, or do they just make things up as they go if it sounds good?
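
For what it's worth, here is a toy model of the "predictive load balancing" the Alienware FAQ describes (and of what THG calls dynamic load balancing): measure how long each GPU took on the previous frame, then nudge the split ratio toward the point where both would finish at the same time. The function name, the damping factor, and the sample timings are invented for illustration; the actual algorithms are not spelled out in either quote.

```python
# Toy model of predictive/dynamic load balancing; names and numbers are
# invented, and the real drivers are certainly more involved than this.

def next_split_ratio(current_ratio, time_gpu0, time_gpu1, damping=0.5):
    """Predict the next frame's split ratio from last frame's render times,
    aiming for both GPUs to finish at roughly the same moment."""
    cost0 = time_gpu0 / current_ratio          # cost per unit of screen, GPU 0
    cost1 = time_gpu1 / (1.0 - current_ratio)  # cost per unit of screen, GPU 1
    balanced = cost1 / (cost0 + cost1)         # split where times would match
    # Move only part of the way there to avoid oscillating frame to frame.
    return current_ratio + damping * (balanced - current_ratio)

ratio = 0.50                                   # always start at a 50/50 split
for t0, t1 in [(20.0, 10.0), (16.0, 12.0), (14.5, 13.5)]:   # ms per frame
    ratio = next_split_ratio(ratio, t0, t1)
    print(round(ratio, 3))                     # GPU 0's share shrinks toward ~0.38
```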
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: rbV5
Originally posted by: TourGuide
The uptake on this idea will be minimal at best. Who in their right mind would shell out twice the price of current leading-edge technology? For heaven's sake - you can build an entire box for what they want to make this happen.

Two less-than-top-end cards could make for a very attractive value/performance option, perhaps?

Not at this point. Two vanilla 6800's would cost more than one Ultra, and then you still have to take into consideration the price and availability of motherboards... and of course, one would have to consider the power requirements as well, even with lower-end NV40s.

To me, where it has the most potential to be an attractive solution is that when your GT/Ultra begins to show its age, you could simply add another card (which by then would be priced more reasonably than a new-gen card) instead of having to wait forever as we are this gen, or having to pay an excessive premium for new technology...
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: keysplayr2003
Originally posted by: Rent
Originally posted by: rbV5
Originally posted by: TourGuide
The uptake on this idea will be minimal at best. Who in their right mind would shell out twice the price of current leading-edge technology? For heaven's sake - you can build an entire box for what they want to make this happen.

Two less-than-top-end cards could make for a very attractive value/performance option, perhaps?

Keep in mind it's not like you have to do it all at one time... As time goes on and performance starts to drag, hey, go pick up another card and give your performance a healthy boost. The longer the cards are out, the more the price will go down...


Give this man a cigar!!!!!!

And besides, if you bought two 6800nu's for, say, $600.00, you would only be paying $100.00 more than if you bought a single 6800 Ultra. When PCI Express becomes more common, there will be plenty of OEMs making dual PCIe x16 slot boards, and the prices will come down. I would rather have two 6800nu's than a single 6800U any day of the week. Of course, two 6800U's would be the pinnacle, but at a $1,000.00 price point.


Although in theory this would work, it is doubtful that either ATI or Nvidia would allow its own products to compete with one another in this regard. There would be some catch so that you couldn't just buy, for example, two low-profit (for Nvidia) 6800nu's and link them up to greatly exceed the performance of one high-profit 6800 Ultra/Extreme.

So there would have to be some catch - they'd charge $50+ for the connector, or (most likely) only allow "SLI" (or whatever dual-card technology they use) on the high-end cards.

It makes no business sense for either company to allow 'linkable value cards.' If you could have put two GF4 Ti4200's together in their heyday, for example, and had them significantly outperform the GF4 Ti4600, Nvidia would have been losing money, because the margin on top-end cards is significantly larger than on value cards, where competition limits profits.

Edit: Then again, as long as they are still charging $300-400 for a 6800nu, they will be making good money. If it ever becomes a true 'value' card in the sense of the GF4 Ti4200 (which was $150-200 as a current-generation card, and under $100 after the FX came out), then selling SLI-equipped 6800nu's will start cutting into Nvidia's margins.

A third dynamic could be the availability of dual PCIe motherboards. If Nvidia is the only mass producer of mainstream boards if/when this technology is available, then they can make up for it by charging exorbitant amounts for the motherboard (i.e., $50+ over the norm).
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: jiffylube1024
Originally posted by: keysplayr2003
Originally posted by: Rent
Originally posted by: rbV5
Originally posted by: TourGuide
The uptake on this idea will be minimal at best. Who in their right mind would shell out twice the price of current leading-edge technology? For heaven's sake - you can build an entire box for what they want to make this happen.

Two less-than-top-end cards could make for a very attractive value/performance option, perhaps?

Keep in mind it's not like you have to do it all at one time... As time goes on and performance starts to drag, hey, go pick up another card and give your performance a healthy boost. The longer the cards are out, the more the price will go down...


Give this man a cigar!!!!!!

And besides, if you bought two 6800nu's for, say, $600.00, you would only be paying $100.00 more than if you bought a single 6800 Ultra. When PCI Express becomes more common, there will be plenty of OEMs making dual PCIe x16 slot boards, and the prices will come down. I would rather have two 6800nu's than a single 6800U any day of the week. Of course, two 6800U's would be the pinnacle, but at a $1,000.00 price point.


Although in theory this would work, it is doubtful that either ATI or Nvidia would allow its own products to compete with one another in this regard. There would be some catch so that you couldn't just buy, for example, two low-profit (for Nvidia) 6800nu's and link them up to greatly exceed the performance of one high-profit 6800 Ultra/Extreme.

So there would have to be some catch - they'd charge $50+ for the connector, or (most likely) only allow "SLI" (or whatever dual-card technology they use) on the high-end cards.

It makes no business sense for either company to allow 'linkable value cards.' If you could have put two GF4 Ti4200's together in their heyday, for example, and had them significantly outperform the GF4 Ti4600, Nvidia would have been losing money, because the margin on top-end cards is significantly larger than on value cards, where competition limits profits.

Sounds good, but it's also just guessing. Oh, and Nvidia WANTS you to buy the lower-end mainstream cards. That's their bread and butter, not the ultra high end.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: jiffylube1024
A third dynamic could be the availability of dual PCIe motherboards. If Nvidia is the only mass producer of mainstream boards if/when this technology is available, then they can make up for it by charging exorbitant amounts for the motherboard (i.e., $50+ over the norm).

That seems pretty far-fetched, considering Nvidia doesn't even make motherboards.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,777
31,786
146
Originally posted by: biostud666
Originally posted by: DAPUNISHER
nV pci-E SLI nForce4+Soundstorm2+A64=me sporting a chub :laugh:

You forgot A64 Dual core :p
Now you did it! I have to go change pants now :eek:
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: TheSnowman
Originally posted by: jiffylube1024
A third dynamic could be the availability of dual PCIe motherboards. If Nvidia is the only mass producer of mainstream boards if/when this technology is available, then they can make up for it by charging exorbitant amounts for the motherboard (i.e., $50+ over the norm).

That seems pretty far-fetched, considering Nvidia doesn't even make motherboards.

And the fact that an extra $50.00 for a dual PCIe x16 board is more than an acceptable price to pay for the ability to utilize two PCI Express Nvidia cards. At least for me. Intel is already producing a server board that has dual PCIe x16 slots, but it's extremely expensive at the moment. OEMs will follow suit, but I don't know how much demand there will be for such mobos. Most people don't know what's what.
Only the high-end user/enthusiast/overclocker would know about such things.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Lonyo
Originally posted by: PinwiZ
At the resolutions this setup would be able to play at while keeping high framerates, you really wouldn't even need anti-aliasing anyway.

You would, however, need a very good monitor.

Since most people are using LCDs now (or buying them anyway), unless you had the $$$ for a 20" LCD for 1600x1200 you would be screwed.
1280x1024 is the standard 17~19" LCD res, IIRC. This is not a very good resolution if you cannot use AA/AF, since the extra graphics power would be wasted.
AA/AF would almost be needed to make the extra purchase worth it - or why not just go for a single card that can hit good frame rates at that res?

Those of us who don't want to pay for an LCD comparable to our 22" CRTs for gaming wouldn't hate this....

Hmmm, add this to the list of things you're missing when you buy that X800XT mysteriously below MSRP at launch.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,777
31,786
146
Originally posted by: Homerboy
Am I going to need a 1000-watt PSU now?
That point has been made on every forum I've been to today. I would just add a second TP330 at that point, as opposed to buying a single, expensive, higher-powered unit.
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: Homerboy
Am I going to need a 1000-watt PSU now?

Doubtful. You'll still probably be able to run it on a 550W high-end PSU... however, if there were a 1000W unit, I'd buy one just because I'd be able to justify the badass high-voltage sticker I've got lying around :p