PCIe x8

Viper GTS

Lifer
Oct 13, 1999
38,107
433
136
What kind of cards are you trying to put in that require x16 slots?

Also, what server platform is this? Many will have an option of, say, 2 x8 slots vs. 1 x16 via a riser change.

Viper GTS
 

Lean L

Diamond Member
Apr 30, 2009
3,685
0
0
It is an IBM System x3500. I can get a good price on it, but I'd like the ability to add a nice video card in case I decide to use it for more graphics-intense purposes later. AFAIK most video cards are x16. I don't see the point of an x1 card, as it doesn't offer much improvement over integrated video.

From what I've read so far, there are two types of x8 slots: one that is full length but half speed, and one that is half length and half speed. I know the full-length ones can be modded to fit an x16 card, but I haven't read anything on the half-length ones. I assume I can just knock out the end, but I want to be sure.
 

Viper GTS

Lifer
Oct 13, 1999
38,107
433
136
What you're looking at are x16 physical/x8 electrical slots. They are fairly common on server platforms where there aren't a ton of PCIe lanes to pass around. You are vastly underestimating the x1 cards, though; they are a MASSIVE improvement over the onboard video. I used some x1 ATI Fire boards in an R200 recently when they needed to drive digital displays at a trade show. It worked fine (where NVIDIA's x1 Quadro NVS did not).

Bottom line is you should not expect to get anything to work. Servers aren't designed to run desktop graphics cards. It may work OK, or it may not work at all. If you need a workstation, buy a workstation.

Viper GTS
 

PingSpike

Lifer
Feb 25, 2004
21,758
603
126
In general, from what little information is available out there and my own testing: a 16x graphics card will work in an 8x slot, whether that's facilitated through an 8x-electrical/16x-physical slot, a 16x-to-8x adapter board, or the ghetto "knock out the back" method.

PCIe is supposed to be backwards compatible. Tom's Hardware did a couple of PCIe scaling analysis articles where they used scotch tape to block, and therefore "turn off," the extra lanes. This was tested in a 16x graphics card slot, however.

The only thing that should be able to trip you up is the power requirements of graphics cards. A 16x PCIe slot is supposed to provide something like 75W of power through the slot alone. Others are only required to provide less; I believe it's 25W? This is why all the 1x PCIe graphics cards you see, in addition to being overpriced, also suck major ass: they need to fit into the 25W envelope. Midrange cards use more than 25W...but they also don't have external power connectors.
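The slot-power reasoning above can be sketched as a quick check. This is a hypothetical helper, not anything from the PCIe spec; the 75W/25W figures are the (hedged) budgets from the post, so verify them against the actual spec before relying on them:

```python
# Rough sketch of the slot-power logic discussed above (hypothetical helper).
# Budgets follow the post's figures: ~75W from an x16 slot, ~25W otherwise.
SLOT_BUDGET_W = {"x16": 75, "x8": 25, "x4": 25, "x1": 25}

def card_can_be_powered(card_draw_w, slot, has_external_power=False):
    """A card fits if the slot budget covers its draw, or if external
    PCIe power connectors can supply the remainder from the PSU."""
    if card_draw_w <= SLOT_BUDGET_W[slot]:
        return True
    return has_external_power  # extra draw comes from the PSU cable

# A card drawing ~65W with an external power connector in an x1 slot:
print(card_can_be_powered(65, "x1", has_external_power=True))   # True
print(card_can_be_powered(65, "x1", has_external_power=False))  # False
```

This is exactly the theory tested further down the thread: cards with external power connectors don't have to live inside the slot's budget.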

My own experience is limited to using a 16x-to-1x adapter to run full-size graphics cards in a 1x slot. I had a theory, backed up with some research, that if you had a 16x card with an external power connection, the 25W envelope would not be a concern: the card would draw any extra power it needed from the external power source.

I tested this with an x800 Pro, an x850 XT, and I think I even had my 8800 GTS in there once, but I'm less sure on that since it wasn't my goal. I'm also fairly sure I tried my x1900 XT in there once. The cards all had external power and they worked fine. I ran them through 3DMark for a few hours and played a few short runs of games with them.

Some notes:
1) The adapter boards are pretty expensive, and the card doesn't mount securely or elegantly when using them.
2) There is a performance hit when using fewer lanes; check Tom's Hardware for the scaling analysis...it varies between card types.
3) I have no idea whether the ULi-chipset-based board I did most of this testing on had an unusual PCIe 1x port that provided additional power beyond the spec. In other words, my results may very well have been totally different on another motherboard.
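For a sense of why lane count matters in note 2, here is a back-of-envelope bandwidth calculation. The per-lane figures are the standard PCIe raw rates (~250 MB/s per lane per direction for PCIe 1.x at 2.5 GT/s with 8b/10b encoding, doubled for 2.0); the real-world frame-rate hit varies per card, as the scaling articles show:

```python
# Theoretical PCIe link bandwidth for the lane counts discussed in the thread.
# PCIe 1.x: ~250 MB/s per lane per direction; PCIe 2.0: ~500 MB/s per lane.
PER_LANE_MB_S = {"1.x": 250, "2.0": 500}

def link_bandwidth_mb_s(lanes, gen="1.x"):
    """Raw one-direction bandwidth of a link with the given lane count."""
    return lanes * PER_LANE_MB_S[gen]

for lanes in (1, 8, 16):
    print(f"x{lanes} PCIe 1.x: {link_bandwidth_mb_s(lanes)} MB/s")
# x1: 250 MB/s, x8: 2000 MB/s, x16: 4000 MB/s
```

So an x8 link has half the raw bus bandwidth of x16 (and a 2.0 x8 link matches a 1.x x16 link), which is why the performance hit is often modest for cards that don't saturate the bus.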