SLI patch: makes all dual pci-e mobos SLI capable

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Matt2
Originally posted by: munky
Well, I did the SLI mod on my Ultra-D earlier this week, and it worked. But it's a physical mod, where you connect 2 pins on top of the nf4 chip, using a pencil. I'd be skeptical of any other similar mod that did not involve connecting the 2 pins. And it only worked because the Ultra-D is identical to the real SLI board except for the chipset.

Did yours have some "goop" on top of the pins you had to connect?

Mine did, had to take a razorblade to my NF4 chip. :shocked:

Yup, mine had the goop also. I had to scrape it off
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: AthlonAlien
Originally posted by: Matt2
Originally posted by: AthlonAlien
Originally posted by: nitromullet
Yeah, I think the point is non-nForce4 chipsets... The DFI Ultra-D is unique in that it is an nForce4 Ultra chipset mobo with two physical PCIe x16 slots.

This is cool, but even better would be a patch that lets you run CrossFire in an SLI mobo.

This is incorrect. It has 1 PCIe x16 slot. DFI doesn't make any nForce4 motherboards with 2 PCIe x16 slots. The Ultra-D (in stock form) can run in normal mode (which is 1 graphics card in the top PCIe x16 slot) or in "alternate" mode (which is 1 graphics card in the bottom x8 PCIe slot). It can also run in DXG mode (which is 2 graphics cards... 1 in the top PCIe slot running at x16 and 1 in the bottom PCIe slot running at x2). To make the Ultra-D capable of SLI, you need to remove the chipset cooler and connect 2 points (a CircuitWriter pen is recommended, but a pencil will do okay). After the SLI mod, the board is capable of running full SLI (which is 2 cards, both running at PCIe x8 each). If you want to run SLI x16 (which is both cards running at PCIe x16), you will need to look somewhere other than DFI... like ASUS, for example.


DFI Expert is 16x/16x

Wrong!!

not according to AT's article, which clearly states 2 x16 slots:

"Then, at CES, DFI was displaying both nForce4 SLI and nForce4 Ultra motherboards with two x16 PCIe slots. We were told that Epox also had an nForce4 Ultra motherboard with another semi-SLI solution based on the cheaper Ultra chipset. DFI told us that they used the same PCB for both versions of the nForce4 boards for economy"

other reviews state the same thing. the specs:

2 PCI Express x16
1 PCI Express x4
1 PCI Express x1
2 32-bit/33MHz
 

Yreka

Diamond Member
Jul 6, 2005
4,084
0
76
Originally posted by: CaiNaM
Originally posted by: AthlonAlien
Originally posted by: Matt2
Originally posted by: AthlonAlien
Originally posted by: nitromullet
Yeah, I think the point is non-nForce4 chipsets... The DFI Ultra-D is unique in that it is an nForce4 Ultra chipset mobo with two physical PCIe x16 slots.

This is cool, but even better would be a patch that lets you run CrossFire in an SLI mobo.

snip

DFI Expert is 16x/16x

Wrong!!

not according to AT's article, which clearly states 2 x16 slots:

"Then, at CES, DFI was displaying both nForce4 SLI and nForce4 Ultra motherboards with two x16 PCIe slots. We were told that Epox also had an nForce4 Ultra motherboard with another semi-SLI solution based on the cheaper Ultra chipset. DFI told us that they used the same PCB for both versions of the nForce4 boards for economy"

other reviews state the same thing. the specs:

2 PCI Express x16
1 PCI Express x4
1 PCI Express x1
2 32-bit/33MHz

It's somewhat confusing. We are, as usual, wading through marketing bullshant.

My understanding is the Expert has 2 x16 slots, but does not use the new NF4 X16 chipset like the Asus, Abit, and MSI boards. They count the SLI bridge bandwidth instead of having true x16 x 2 support on the board.

DFI Expert

Chipset
NVIDIA nForce4 SLI
- Supports NVIDIA SLI (Scalable Link Interface)

- Each x16 slot operates at x8 bandwidth. When the graphics cards are connected
via the SLI bridge, the total bandwidth of the two graphics cards is x16.

Asus A8N32-SLI

Chipset
NVIDIA nForce™4 SLI X16

2 x PCI-E x16 with SLI™ support at full x16, x16 mode

I think the two parties are both technically correct. One is saying yes, it has two x16 slots, while the other is saying the slots do not operate at x16 together in tandem.
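The split being argued over can be sketched as simple lane arithmetic (a hypothetical illustration of the behavior described in this thread, assuming the single 16-lane graphics budget of the nForce4; the function and names are mine, not anything from the chipset or its tools):

```python
# Sketch of how a fixed 16-lane PCIe budget is divided between two slots.
# Assumes the nForce4's single x16 graphics link, as described in the posts above.

def split_lanes(total_lanes: int, sli_enabled: bool) -> tuple[int, int]:
    """Return (top_slot, bottom_slot) electrical widths."""
    if sli_enabled:
        # SLI mode: the one x16 link is divided evenly, x8 + x8.
        return (total_lanes // 2, total_lanes // 2)
    # Single-card mode: all lanes go to the top slot.
    return (total_lanes, 0)

print(split_lanes(16, sli_enabled=False))  # (16, 0)
print(split_lanes(16, sli_enabled=True))   # (8, 8)
```

Either way the widths always sum to 16, which is why "two x16 slots" and "x8/x8 in SLI" can both be true statements about the same board.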
 

AthlonAlien

Senior member
Nov 10, 2004
428
0
0
DFI does NOT have ANY nf4 boards that are SLI x16. It doesn't matter what articles have appeared. The Ultra-D, SLI-D, Expert, and NEW Venus (yes, there is another one) ALL have 2 "long" PCIe slots... HOWEVER, one slot is PCIe x16 and the other slot is PCIe x8. When run in SLI mode, BOTH slots run at PCIe x8. That's it. :)
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: AthlonAlien
DFI does NOT have ANY nf4 boards that are SLI x16. It doesn't matter what articles have appeared. The Ultra-D, SLI-D, Expert, and NEW Venus (yes, there is another one) ALL have 2 "long" PCIe slots... HOWEVER, one slot is PCIe x16 and the other slot is PCIe x8. When run in SLI mode, BOTH slots run at PCIe x8. That's it. :)

no one is arguing they can both run @ x16 at the same time; afaik there are only 2 boards (shipping) that can do that at this time. the only reason i even brought it up was the fact you were quick to jump on the OP for saying that (twice even), when in fact, he wasn't incorrect within the context he stated it. it does in fact have 2 x16 slots; they just can't run in x16 mode simultaneously.

furthermore, it's been shown there is no performance advantage of simultaneous x16 vs simultaneous x8 when running sli (at least at this time).

 

AthlonAlien

Senior member
Nov 10, 2004
428
0
0
Originally posted by: CaiNaM
Originally posted by: AthlonAlien
DFI does NOT have ANY nf4 boards that are SLI x16. It doesn't matter what articles have appeared. The Ultra-D, SLI-D, Expert, and NEW Venus (yes, there is another one) ALL have 2 "long" PCIe slots... HOWEVER, one slot is PCIe x16 and the other slot is PCIe x8. When run in SLI mode, BOTH slots run at PCIe x8. That's it. :)

no one is arguing they can both run @ x16 at the same time; afaik there are only 2 boards (shipping) that can do that at this time. the only reason i even brought it up was the fact you were quick to jump on the OP for saying that (twice even), when in fact, he wasn't incorrect within the context he stated it. it does in fact have 2 x16 slots; they just can't run in x16 mode simultaneously.

furthermore, it's been shown there is no performance advantage of simultaneous x16 vs simultaneous x8 when running sli (at least at this time).

I didn't mean for it to sound like I was "jumping on" the OP. I was just trying to imply that possibly he hadn't worded it the best, and I was attempting to clarify exactly what the DFI boards are capable of (since I have personally performed the Ultra-D to SLI-D mod myself). The wording was just weird... as it is with your post above: "they just can't run in x16 mode simultaneously."... That is true, however, they also cannot run x16 mode independently either (only the top slot can ever be x16... the bottom slot can never be x16... even with one card in the bottom slot and nothing in the top slot, the bottom will always run at x8). Not trying to beat a dead horse here, just making sure everyone understands. I'll let it go now :) :beer:

-LaTeR
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
First off let me say:

PROUDLY BANNED FROM NGOHQ FOR NO APPARENT REASON. Seriously, I was trying to have a conversation and ask them why they made these claims, and that idiot Regeneration guy, who typed more curse words and obscene remarks in one post than I have ever seen in my life, banned me.

I wrote a formal letter to the Editor of NGO asking how he can accept this, but the guy didn't even give me the courtesy of e-mailing me back, and I never even remotely insulted his forums.

As for the patch: I am running the Ultra-D right now with the patch. While it seems to be working, I can't really say much because I'm only running one card ;) ... By working, I mean they haven't killed my system... yet.

only the top slot can ever be x16... the bottom slot can never be x16

Where in the hell did you get that information from? AFAIK all the correct traces for full x16 operation are there. However, when using two cards you have to move the block jumpers, which effectively converts it to x8/x8. The slot itself is also x16. Although it may not run at full x16 speed when in SLI, it is still a physical x16 connection.

At any rate, the other one is most definitely a full x16 slot. Why would the slots be x16 and x8, only for SLI to knock the second one down further? The NB can only handle a set number of PCI-E lanes: it goes from x16 (for either slot, but not both at the same time) to x8/x8 (notice it still adds up to 16 ;)). Your way, the board would start out with more lanes than the NF4 northbridge can handle and then bump them down only when SLI is used; it doesn't work that way.

-Kevin

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: AthlonAlien
:D

Well, it's kind of misleading... Perhaps one could say it physically has 1 x16 slot and 1 x8 slot (since there is no way the second PCIe slot can do x16).

That would be incorrect though. If it had an x8 physical slot, the second slot would be smaller... You are confusing physical and electrical interfaces. The Ultra D has two physical PCIe x16 slots. Period. The fact that they are electrically x16 and x2 (by default) doesn't change that.

I'm not just pulling this out of thin air either.

http://www.interfacebus.com/Design_Connector_PCI_Express.html

The PCI Express [PCIe] bus defines the Electrical, topology and protocol for the physical layer of a point to point serial interface

...and

PCIe uses 4 different sizes of connector, all of which are card-edge type to accept a PCI Express card using card-edge fingers spaced on a 1.00mm pitch. The 1x size is the smallest with 36 contact positions. The x4 uses 64 contacts, the x8 uses 98 contacts, and the x16 has 164 contact positions. The nominal height of the connector above the PWB is 11mm.

The confusion comes from the fact that the dual x8 SLI/CrossFire motherboards basically break from the PCIe standard. IMO, SLI actually created its own standard, which CrossFire adopted as well. What DFI did with the Ultra D board was a complete bastardization of the standard though. Obviously, they did this to allow people to mod their nForce4 Ultra chipsets to run in SLI.
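The physical-vs-electrical distinction can be made concrete with the contact counts from the quoted spec (a minimal sketch; the dictionary and helper are hypothetical, with the figures taken from the interfacebus.com quote above):

```python
# Physical PCIe connector sizes and their card-edge contact counts,
# per the interfacebus.com figures quoted above.
PCIE_CONTACTS = {"x1": 36, "x4": 64, "x8": 98, "x16": 164}

def is_physical_x16(contacts: int) -> bool:
    """A slot is physically x16 if it has the full 164 contact positions,
    regardless of how many lanes are electrically wired to it."""
    return contacts == PCIE_CONTACTS["x16"]

# The Ultra-D's second slot: physically x16, electrically x2 by default.
print(is_physical_x16(164))  # True
```

By this definition both of the Ultra-D's long slots are x16 connectors, even though only the top one is ever wired for 16 lanes.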
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
As for the patch: I am running the Ultra-D right now with the patch. While it seems to be working, I can't really say much because I'm only running one card... By working, I mean they haven't killed my system... yet.

I'm not sure why you would run that patch on an Ultra D, since it doesn't have a ULi chipset. If you want to run the Ultra D as an SLI board, you should do the "pencil trick". That will pretty much give you a "real" nForce4 SLI chipset, without needing a patch. The point of this patch is to run SLI on non-NVIDIA chipsets.

What I would really love to see is a patch that makes CrossFire work on SLI boards. While the A8R-MVP works fine, I still have an A8N-SLI Premium (for now) that is a nicer board and has considerably more features than the MVP.
 

AthlonAlien

Senior member
Nov 10, 2004
428
0
0
Originally posted by: Gamingphreek
First off let me say:

PROUDLY BANNED FROM NGOHQ FOR NO APPARENT REASON. Seriously, I was trying to have a conversation and ask them why they made these claims, and that idiot Regeneration guy, who typed more curse words and obscene remarks in one post than I have ever seen in my life, banned me.

I wrote a formal letter to the Editor of NGO asking how he can accept this, but the guy didn't even give me the courtesy of E-Mailing me back, and i never even remotely insulted his forums.

As for the patch: I am running the Ultra-D right now with the patch. While it seems to be working, I can't really say much because I'm only running one card ;) ... By working, I mean they haven't killed my system... yet.

only the top slot can ever be x16... the bottom slot can never be x16

Where in the hell did you get that information from? AFAIK all the correct traces for full x16 operation are there. However, when using two cards you have to move the block jumpers, which effectively converts it to x8/x8. The slot itself is also x16. Although it may not run at full x16 speed when in SLI, it is still a physical x16 connection.

At any rate, the other one is most definitely a full x16 slot. Why would the slots be x16 and x8, only for SLI to knock the second one down further? The NB can only handle a set number of PCI-E lanes: it goes from x16 (for either slot, but not both at the same time) to x8/x8 (notice it still adds up to 16 ;)). Your way, the board would start out with more lanes than the NF4 northbridge can handle and then bump them down only when SLI is used; it doesn't work that way.

-Kevin

I have PERSONALLY verified this. Do you have a DFI board? If so, give it a try. You will see the second PCIe slot will NEVER run at x16 (only x8). I'm not sure why people keep arguing this; it's simple to understand, really. If you still don't believe me, do some searching. I have seen some posts in THIS forum about the issue. Also, if you visit DFI-STREET.com, I have seen several threads about it there. Here is a link... RGone (one of the two main moderators there) confirms this in the thread.

http://www.dfi-street.com/forum/showthread.php?t=44028&highlight=slot


nitromullet,

I see where you are coming from. The board does have 2 "physical" PCI-E x16 slots on it. Even though 1 of them can never do x16, it is still "as long" as a true x16 slot... so by physical definition, it is an x16 slot. I personally still don't consider this a "true x16" slot though :)

"What DFI did with the Ultra D board was a complete bastardization of the standard though. Obviously, they did this to allow people to mod their nForce4 Ultra chipsets to run in SLI."... At first, DFI was promoting DXG for the Ultra-D. It ran two cards in an x16/x2 configuration and gave about 90% of the performance of SLI. This allowed DFI to provide all users (even the ones not wanting to spend $200+ on the first SLI-DRs) with the 'advantage' of 2 graphics cards. I believe that was the real reason for the dual x16 PCIe slots. The fact that DFI also used the same PCB layout for both boards was icing on the cake :D

Once people found out about the "true" SLI mod and how easy it was to do, they started doing that instead and got 100% of the performance of SLI (instead of the 90% with DXG) because of the x8/x8 configuration (x2 will bottleneck a current PCIe card, whereas x8 will not... and as you stated above, x8 doesn't hurt cards at all vs. x16).

Another thing worth mentioning: once Nvidia found out about this "DXG" mode and how it provided 90% of the performance of SLI (at the basic cost of an Ultra-D motherboard), they feared it would hurt SLI sales, so they disabled DXG in their drivers from that point on. That is another reason people started doing the "hard mod" to true SLI. No drivers can disable a hard mod. Then Nvidia learned of that and started making DFI apply epoxy over the 2 points that need to be connected for SLI to function. However, as we all know, you are still able to remove the epoxy (if done carefully) and make the board full SLI :cool:
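For a rough sense of why x2 bottlenecks a card while x8 does not, first-generation PCIe moves about 250 MB/s per lane per direction (a standard PCIe 1.x figure, not from this thread; the function is a hypothetical sketch):

```python
# Rough per-direction bandwidth of a first-generation PCIe link,
# at ~250 MB/s per lane (standard PCIe 1.x figure).
MB_PER_LANE = 250

def link_bandwidth_mb(lanes: int) -> int:
    """Approximate one-direction link bandwidth in MB/s."""
    return lanes * MB_PER_LANE

# DXG mode ran the second card at x2; the SLI mod runs both cards at x8.
print(link_bandwidth_mb(2))   # 500 MB/s -- narrow enough to starve a card
print(link_bandwidth_mb(8))   # 2000 MB/s -- effectively no penalty vs. x16
```

So the hard mod's x8/x8 split gives each card four times the link bandwidth that the DXG x2 slot did, which lines up with the 90%-vs-100% performance gap described above.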
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
I have a DFI NF4 Infinity SLI board. I'm not sure if it counts, but I have my X800GT plugged into the upper PCI-E slot and it runs at x8 speed instead of the expected x16 (I think CPU-Z told me this, though I'm not sure).

Anyway, back on topic: I now regret buying an SLI mobo. I should have bought a CrossFire one and used this patch to run SLI if needed.