
Crossfire motherboards to support SLI? Even DUAL 3D1 cards?!

Chocolate Pi

Senior member
Man, this SLI vs. Crossfire war sure has been crazy, especially considering IT HASN'T STARTED YET! First, nVidia does some voodoo summoning and pulls SLI back from the grave, except it's buggy and near impossible to work with. Then ATI announced their makeshift solution that actually looks vastly superior, as it sets up flawlessly and adds new modes and Super AA. But as Crossfire is delayed... and delayed... and delayed... SLI gets driver improvement upon improvement until it ends up with most of the functionality Crossfire is supposed to have, and in some cases more. If the pattern continues, ATI will be scared into improving the final version of Crossfire, which will force nVidia to make even more improvements. And on, and on, and on. This of course assumes that Crossfire actually comes out.

There, I think that summarized the situation.

But as the topic title points out, Anandtech's Crossfire article mentioned SLI working on the Gigabyte Crossfire board, or so Gigabyte said. I'd say that this is a big win for ATI, but it's not; now nVidia will just be forced to allow Crossfire on their chipsets to stay competitive. It's a win for consumers. (Unless of course this is just Gigabyte's doing, which neither ATI nor nVidia wants to support with drivers. That could be the case.)

But what really got me was the supposed support for dual 3D1 cards. Think about that for a minute. 4 GPUs. That means that Crossfire boards would be more capable at running nVidia solutions than SLI boards! How humorous!

But think of the functionality! FOUR GPUs! The old, highly incompatible 6600GT version of the 3D1 is not worth talking about, but the new 6800GT version is! And dare I mention that the 7800GTX is relatively low power/heat and single slot, making it a candidate for the 3D1 treatment.

Whose motherboards will the nVidia fans buy then?
 
Well, it's all just speculation at this point.

Theoretically, there is nothing stopping SLI from working on any motherboard that has two PCIe x16 slots. It's all in the drivers, and the cards communicate through the hardware bridge, not over the PCIe bus. NVIDIA, however, is currently only allowing it to work on NF4 SLI chipsets (although Intel's dual-PCIe is supposedly going through testing/validation right now). That's why it no longer works on modded NF4 Ultras, and at least at first would probably not work on ATI's chipsets.

Crossfire supposedly requires some amount of support from the northbridge/PCIe controller. It's unclear if other companies will be able to produce Crossfire-compatible chipsets. I wouldn't be shocked to see cross-licensing between ATI and NVIDIA like Intel and AMD have.

As for running dual 3D1 cards... somehow I doubt it. AFAIK, the inter-card bridges on those GPUs are already hardwired to the other GPU on the same card.

And considering that Crossfire seems to more or less work in the previews, it's almost certainly coming out sometime in the next few months.
 
That's why it no longer works on modded NF4 Ultras

I'm sorry but that's incorrect.

It will not work on un-modded NF4 Ultra chipsets, due to Nvidia disabling it in their drivers.

It used to run SLI just fine but won't work with newer drivers.

If you mod the chipset, however, you get yourself the exact same chipset the SLI motherboard has; they are the same thing.

The board will even boot up as the SLI motherboard.

http://www.vr-zone.com/Stratix/nf4_sli_mod2.jpg

I have tested SLI on the motherboard and it works.
 
Originally posted by: BouZouki
That's why it no longer works on modded NF4 Ultras

I'm sorry but that's incorrect.

It will not work on un-modded NF4 Ultra chipsets, due to Nvidia disabling it in their drivers.

It used to run SLI just fine but won't work with newer drivers.

If you mod the chipset, however, you get yourself the exact same chipset the SLI motherboard has; they are the same thing.

Perhaps a bad choice of terms. The 'easy' mod of just switching an NF4 chipset into x8/x8 mode will no longer work, and SLI will not work with x16/x4 or x16/x2 configurations anymore. If you somehow manage to convince the driver that you really have an NF4 Ultra SLI chipset, it will work.
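The gating described above can be sketched as a simple allowlist check: the driver enables SLI only if the chipset reports an approved ID and the slots are in a supported lane configuration. The IDs, names, and logic below are invented for illustration; this is not NVIDIA's actual driver code.

```python
# Hypothetical sketch of a driver-side SLI gate. All IDs are made up.

NF4_ULTRA = 0xA0   # made-up chipset ID for the Ultra
NF4_SLI = 0xA1     # made-up chipset ID for the SLI variant

APPROVED_FOR_SLI = {NF4_SLI}

def sli_enabled(reported_chipset, lane_split):
    """Both checks must pass: an approved chipset AND a supported lane split."""
    return reported_chipset in APPROVED_FOR_SLI and lane_split in {(8, 8), (16, 16)}

# A modded Ultra that now reports the SLI ID passes; an unmodded Ultra
# forced into x8/x8 no longer does.
print(sli_enabled(NF4_SLI, (8, 8)))    # True
print(sli_enabled(NF4_ULTRA, (8, 8)))  # False
```

This matches the behavior described in the thread: flipping the board into x8/x8 alone is no longer enough, but making the chipset identify itself as the SLI part satisfies the driver.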
 
Originally posted by: Matthias99
Originally posted by: BouZouki
That's why it no longer works on modded NF4 Ultras

I'm sorry but that's incorrect.

It will not work on un-modded NF4 Ultra chipsets, due to Nvidia disabling it in their drivers.

It used to run SLI just fine but won't work with newer drivers.

If you mod the chipset, however, you get yourself the exact same chipset the SLI motherboard has; they are the same thing.

Perhaps a bad choice of terms. The 'easy' mod of just switching an NF4 chipset into x8/x8 mode will no longer work, and SLI will not work with x16/x4 or x16/x2 configurations anymore. If you somehow manage to convince the driver that you really have an NF4 Ultra SLI chipset, it will work.



You have to take a #2 pencil and mod the chipset.

The Ultra chipset is the same exact thing as the SLI chipset.

The mod works perfectly as shown in the pic.

http://myweb.cableone.net/mondo/sli1.jpg

http://myweb.cableone.net/mondo/sli2.jpg


Anandtech's Morphing the NF4 Ultra into SLI
 
I've never quite understood the desire to mod the Ultra chipsets into SLI ones. If you're going to shell out at least another $200-$300 for a second mid- to high-end graphics card, why not just bite the bullet and pay the extra money for a true SLI mobo? This seems especially true since it is pretty pointless to SLI anything less than a 6800GT now that the 7800GTX is out.
 
Originally posted by: batmanuel
I've never quite understood the desire to mod the Ultra chipsets into SLI ones. If you're going to shell out at least another $200-$300 for a second mid- to high-end graphics card, why not just bite the bullet and pay the extra money for a true SLI mobo? This seems especially true since it is pretty pointless to SLI anything less than a 6800GT now that the 7800GTX is out.

You're right. If they have the money to SLI, they shouldn't even blink at the cost of an SLI mobo.
It's not so much the saving money part that gets these guys going. It's the combination of "Ha Ha, screw you big corporation!" and the challenge to see if it can be done.

 
Originally posted by: batmanuel
I've never quite understood the desire to mod the Ultra chipsets into SLI ones. If you're going to shell out at least another $200-$300 for a second mid- to high-end graphics card, why not just bite the bullet and pay the extra money for a true SLI mobo? This seems especially true since it is pretty pointless to SLI anything less than a 6800GT now that the 7800GTX is out.

You don't think 2x 6800GT is pointless? A 7800GTX is much cheaper than two 6800GTs, and oftentimes faster. Being over $100 cheaper, cooler, quieter, using less power, and less hassle, all while being faster most of the time, the 7800GTX (to me) is a much better deal than 2x 6800GTs.

The only SLI that "makes sense" to me right now is 2x 7800GTXs.
 
Originally posted by: batmanuel
I've never quite understood the desire to mod the Ultra chipsets into SLI ones. If you're going to shell out at least another $200-$300 for a second mid- to high-end graphics card, why not just bite the bullet and pay the extra money for a true SLI mobo? This seems especially true since it is pretty pointless to SLI anything less than a 6800GT now that the 7800GTX is out.

From a rational perspective, I agree. But it is the challenge of it all:evil:
 
I never really understood the point of SLI besides getting higher frame rates. I mean, you spend 2x the price for something that gives 50% more performance at best, and in a year or so each GPU company comes out with a new card that performs just as well as the previous-gen cards in SLI. You might as well buy the top-end card every year and save $500 for something else, like a girl, a car, or whatever payments you have. Yeah, I know, bragging rights: "I can get 700 FPS in Quake 3 Team Arena instead of your 500" 😉. But the thing is, you can't really tell the difference between 50 frames and 75 frames in most games at max settings (yes, I know it's the average FPS). And if you can, you're lying, because the human eye only sees 30 FPS. Things might appear to look smoother, but they really are the same. Now, the AA16 or whatever Nvidia has done with SLI is the only real advantage of having 2 cards.

So, getting to my point: why does anyone really need these things, unless you have a huge screen or are just someone who wants the best of the best?

I would get neither SLI nor Xfire, and OC a single card every generation.
 
Originally posted by: Soccerman06
I mean, you spend 2x the price for something that gives 50% more performance at best

Best-case performance improvements for NVIDIA SLI are in the neighborhood of 80-90%. Often people look at a couple of CPU-limited benchmarks and conclude that SLI is worthless, which is not true.

Also, if you want/need more performance than a single high-end card can supply, then there's not really any other choice, is there?
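The point about CPU-limited benchmarks can be sketched with a bit of Amdahl's-law arithmetic: only the GPU-bound share of each frame benefits from a second GPU. The fractions below are illustrative assumptions, not benchmark data.

```python
# Amdahl's-law sketch: only the GPU-bound fraction of frame time scales
# with a second GPU. All percentages here are illustrative, not measured.

def sli_speedup(gpu_bound_fraction, n_gpus=2):
    """Overall speedup when only the GPU-bound share of frame time scales."""
    return 1.0 / ((1.0 - gpu_bound_fraction) + gpu_bound_fraction / n_gpus)

# Heavily GPU-bound game (high resolution + AA): near-ideal scaling.
print(round(sli_speedup(0.95), 2))  # ~1.90, the "80-90%" best case

# CPU-limited benchmark: the second GPU looks almost useless.
print(round(sli_speedup(0.30), 2))  # ~1.18
```

Which is exactly why a handful of CPU-limited numbers makes SLI look pointless while GPU-bound settings show it near doubling performance.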

and in a year or so each GPU company comes out with a new card that performs just as well as the previous-gen cards in SLI.

Not true. A single 7800GTX is not faster than a 6800GT or 6800U SLI.

But the thing is, you can't really tell the difference between 50 frames and 75 frames in most games at max settings (yes, I know it's the average FPS). And if you can, you're lying, because the human eye only sees 30 FPS. Things might appear to look smoother, but they really are the same.

Wrong, wrong, wrong. Also, you're wrong. 😛

Please read any of the long threads discussing this (conclusion: "you can't see over 30FPS" is crap), and/or look around for the program you can download (FPSCompare, I think) that lets you compare frame rates in a simple OpenGL app. Most people seem able to discern differences up to ~100FPS or so.
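The frame-time arithmetic behind this is simple enough to sketch: the gap between frames keeps shrinking well past 30 FPS, which is what people are actually perceiving.

```python
# Frame time in milliseconds for a given frame rate: the gap between
# consecutive frames keeps shrinking noticeably well past 30 FPS.

def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 50, 75, 100):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.1f} ms/frame")

# 30 FPS leaves 33.3 ms between frames versus 13.3 ms at 75 FPS, a gap
# large enough that motion smoothness differs visibly for most people.
```

So 50 vs. 75 FPS is a 20 ms vs. 13.3 ms frame gap, not two identical-looking streams.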

So, getting to my point: why does anyone really need these things, unless you have a huge screen or are just someone who wants the best of the best?

They don't. Of course, you don't really "need" a 3D card at all, unless you use it professionally (in which case I'm sure you would welcome the extra horsepower).

Please go away and read some of the actual threads and articles on SLI.
 