jiffylube1024
Diamond Member
- Feb 17, 2002
- 7,430
- 0
- 71
Personally, I was unaware that NForce4 would have dual PCI-e (nor was I aware of any mainstream, affordable dual PCI-e 16X boards on the near horizon).
With that said, if dual PCI-e 16X boards do come out this fall at an affordable price (ie not way higher than single PCI-e 16X solutions), then 'SLI' will be a viable alternative.
However, there are a few issues that make it not as simple as the Voodoo2 boards, and I think some people forget that.
1. Voodoo2 SLI used PCI cards, and virtually every board back then had 5-6 PCI slots. GeForce6 SLI will require dual PCI-e 16X slots, which zero mainstream boards offer right now, and only some will offer in the future. And the pricing of those boards is still unknown.
2. Increased power consumption. I don't think this will be much of an issue for people with 450W+ brand-name PSUs, but overclocking a power-hungry CPU along with two power-hungry GPUs will put much more strain on the PSU, for sure. Plus the added heat.
3. Possible incompatibility with top-end cards due to dual-slot designs. This is mostly a logistical issue: most 6800 Ultras are dual-slot cards, so it probably wouldn't be possible to link them up if the two PCI-e 16X slots sat side by side. And if boards instead left a one-slot space between the PCI-e slots, the two cards would take up four (out of 6) brackets on the back of your PC.
Who are the people most likely to be able to afford SLI? Probably the same people who can afford the highest-priced cards.
4. Performance is still unknown. I have full confidence in Nvidia delivering excellent performance in 'SLI' mode, but true 2X performance won't be possible due to overhead, interfacing, etc.
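That overhead point can be put in rough numbers. Here's a back-of-envelope sketch (my own Amdahl's-law-style model, not anything Nvidia has published): if only a fraction of each frame's work splits cleanly across the two cards, the rest (driver overhead, geometry processed on both cards, synchronization) caps the speedup well below 2X.

```python
# Hypothetical scaling model -- illustrative only, not NVIDIA's numbers.
# If a fraction of per-frame work parallelizes across the GPUs and the
# remainder is serial (overhead, duplicated work, sync), the best-case
# speedup follows Amdahl's law.

def sli_speedup(parallel_fraction: float, gpus: int = 2) -> float:
    """Ideal speedup when only part of the per-frame work scales."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / gpus)

# Even if 90% of the work splits cleanly across two cards:
# sli_speedup(0.9) -> ~1.82x, not 2x
```

So even a very well-parallelized frame leaves you noticeably short of doubling, which is why I'd expect something in the 1.5-1.9X range rather than 2X.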
Personally, I'm intrigued by GF6 'SLI,' especially if Nvidia can pull it off in a successful top-to-bottom fashion (ie it would be possible using 6800 and 6600 series cards, with dual 6800nu's and dual 6600GT's looking to be the best bang/buck). However, it's not here yet, it will require a motherboard upgrade from everyone (will this even be possible with an Intel CPU - ie. are there dual PCI-e 16X boards for Intel on the horizon?), and there will be some inevitable kinks to work out in drivers. Also, if Nvidia runs with SLI, they will have to keep excellent driver support for it up to date.
And regardless of whether everyone here goes out and buys a dual 6600GT/6800nu/6800GT setup when available, ATI's claim that:

"The truth is that these kind of exotic arrangements are of interest to only a tiny minority of gamers."

will still be 100% correct, because the vast majority of gamers are still running GF4 MX setups, as well as other DX7/8 configurations.
A final note - should this configuration even be called "SLI" (scan-line interleaving)? I thought that was 3dfx's approach, where one V2 board rendered one scan line and the other board rendered the next, in alternating fashion, while Nvidia's solution has one card render the top half of the screen while the other does the bottom. Either way, this is just a semantics issue.
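The difference between the two schemes is easy to sketch. This toy partitioning (function names and the line-based model are my own illustration, not driver internals) shows which scanlines each card would own under 3dfx-style interleaving versus a top/bottom split:

```python
# Toy illustration of the two work-splitting schemes: 3dfx-style
# scan-line interleaving hands alternating lines to each card, while
# split-frame rendering hands each card one contiguous half of the frame.

def scanline_interleave(height: int, card: int) -> list[int]:
    """Lines rendered by `card` (0 or 1) under V2-style interleaving."""
    return [y for y in range(height) if y % 2 == card]

def split_frame(height: int, card: int) -> list[int]:
    """Lines rendered by `card` under a top/bottom frame split."""
    half = height // 2
    return list(range(0, half)) if card == 0 else list(range(half, height))

# For an 8-line frame:
# scanline_interleave(8, 0) -> [0, 2, 4, 6]
# split_frame(8, 1)         -> [4, 5, 6, 7]
```

One practical note on the split-frame approach: since scene complexity usually isn't spread evenly between the top and bottom of the screen, the split point would presumably have to be load-balanced rather than a fixed 50/50.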
Rollo, what did you honestly expect ATI's stance to be on GF6 'SLI' - a feature they have no counter to (yet?)? "We strongly support Nvidia SLI and we think it is an excellent technology to look into?" Puh-lease!
Just look at Intel's complete 180 on 64-bit extensions for a reference in a similarly competitive field! Intel spent 2+ years downplaying 64-bit only to covertly slip it into their new chips under a confusing name. Double-speak and downplaying the competition is the name of the game!