How do I get 16x on my Biostar TA790GX-128 board?

v3rax

Member
Sep 24, 2007
64
0
0
I am running a single Nvidia 9600 GT on this board, and according to the instruction manual I am supposed to put the dummy card in PCIe slot #2, which is actually the TOP slot, and put the video card in PCIe slot #1, which is the bottom slot.

I did this; however, CPU-Z and GPU-Z are telling me that my card is running in 8x mode. In CPU-Z, under the Mainboard tab, under "Graphic Interface" > "Link Width", it shows 8x, and under "Max Supported" it says 16x.

I can't figure out how to get my single GFX card to run in 16x mode.

There are some BIOS settings, one being to disable onboard graphics, which I did, but there is another that asks me to choose a certain order for GFX and it isn't explained in the manual.

Does anyone know how I can get this single 9600 GT to run in 16x mode in the single slot?

Thanks

It is the Biostar TA790GX-128 board.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
I have heard of this issue. Trying to remember where, but in the meantime: did you disable the on-board GPU? Since you're using a 9600 GT, you don't need the on-board GPU.
 

v3rax

Member
Sep 24, 2007
64
0
0
Yes, I disabled onboard video and tried a hundred different combinations under the chipset settings in the BIOS, and nothing works. Biostar tech support said they could not duplicate it, so I would have to send them my mobo AND video card... yeah, right?? Oh well, hopefully somebody who has had experience with this will come across this and give me an answer.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Hi,

I still don't remember where I saw it, but I'd suggest a couple of things you could try:

- Change the PCIe clock. If it's set to 'Auto', change it to 100 or some other value (but no lower than 90 and no higher than 115) <- be careful with this tweak. I would disconnect the HDDs other than the system drive, and then some. There is always a risk in toying with the PCIe clock.
- Give more voltage to the NB and/or HT and/or PCIe (if available)
- Switch the slots around, and if you have any other PCIe device available, try sticking it in the open PCIe slots

IMO, it's most likely a BIOS quirk rather than something systemic. I haven't had an issue with graphics cards, but I once had gigabit LAN speed drop to 10 Mbit/s. It just came back to normal the next day.
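
Also, if you happen to have a Linux live CD handy, you can double-check what the slot is actually negotiating without CPU-Z; the kernel exposes the same numbers in sysfs. A minimal sketch (Linux only, assuming the standard sysfs link-width attributes):

```python
#!/usr/bin/env python3
# Print negotiated vs. maximum PCIe link width for every PCI device
# that exposes the standard sysfs attributes (Linux only).
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    cur, mx = dev / "current_link_width", dev / "max_link_width"
    if cur.exists() and mx.exists():  # non-PCIe devices lack these files
        print(f"{dev.name}: running x{cur.read_text().strip()} "
              f"of a possible x{mx.read_text().strip()}")
```

Find your video card's bus address in the output and see whether it reports x8 or x16 there too; that rules out a reporting quirk in CPU-Z.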
 

v3rax

Member
Sep 24, 2007
64
0
0
Originally posted by: Flipped Gazelle
I have the TA790GX-128m board, and there's zero performance difference between 8x and 16x PCI-e.

That is interesting. Unfortunately, since I can't get the 16x, I can't run benchmarks to test it, but I suppose, like another poster said, the 9600 GT may not be capable of utilizing the bandwidth anyway... however, it's a psychological thing; I sit there thinking it should be running at 16x and it bothers me that I can't get it to do it. Oh well, I will try the things the other poster said and see if there is a difference; if not, I guess I will run 8x until I can get an HD 4870 or something.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
There can be a difference between x16 and x8. Actually, it is easy to produce a benchmark showing a difference. But the caveat is this: the difference shows when the video card needs to access system memory (or even the HDD).

Some people say higher-end cards will be bottlenecked by x8, but it's more the opposite. Because these higher-end cards tend to be equipped with large amounts of fast RAM, data need not travel through PCIe too often. (Of course, one can always configure a system to show the difference, like running 16xAA @ 2560x1600.)

Where x8 shows its performance loss most prominently is when data needs to be swapped constantly from system RAM. For instance, if you have a 9800 GT with 256MB trying to run 1920x1200/4xAA, you will see a measurable difference between x16 and x8.

But you already see what's wrong. In both scenarios (x16 and x8), the experience will be choppy and unpleasant. A 9800 GT should be equipped with 512MB of frame buffer.
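
To put a rough number on that 256MB example, here is some back-of-envelope buffer math. Illustrative only: it assumes RGBA8 color and a 32-bit depth/stencil buffer, and ignores driver-side compression, so treat it as an upper-bound sketch.

```python
# Back-of-envelope buffer math for a 256MB card at 1920x1200 with 4xAA.
# 4x MSAA stores 4 color and 4 depth/stencil samples per pixel.
width, height, msaa = 1920, 1200, 4
bytes_color = 4      # RGBA8
bytes_depth = 4      # 24-bit depth + 8-bit stencil

color = width * height * msaa * bytes_color
depth = width * height * msaa * bytes_depth
print(f"~{(color + depth) / 2**20:.0f} MB for the AA color+depth buffers alone")
# -> ~70 MB before any textures or geometry, so a 256MB card soon starts
#    spilling assets into system RAM, and that is where x8 vs. x16 shows.
```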

I am very interested in this subject and am planning to test it in the near future.
 

v3rax

Member
Sep 24, 2007
64
0
0
Originally posted by: lopri
There can be a difference between x16 and x8. Actually, it is easy to produce a benchmark showing a difference. But the caveat is this: the difference shows when the video card needs to access system memory (or even the HDD).

Some people say higher-end cards will be bottlenecked by x8, but it's more the opposite. Because these higher-end cards tend to be equipped with large amounts of fast RAM, data need not travel through PCIe too often. (Of course, one can always configure a system to show the difference, like running 16xAA @ 2560x1600.)

Where x8 shows its performance loss most prominently is when data needs to be swapped constantly from system RAM. For instance, if you have a 9800 GT with 256MB trying to run 1920x1200/4xAA, you will see a measurable difference between x16 and x8.

But you already see what's wrong. In both scenarios (x16 and x8), the experience will be choppy and unpleasant. A 9800 GT should be equipped with 512MB of frame buffer.

I am very interested in this subject and am planning to test it in the near future.

So basically you are saying that if I had a card with 512MB, 896MB, or 1GB of onboard memory, there wouldn't be any difference between 8x and 16x? I have a 22-inch monitor and always run games at 1680x1050.

Regardless, it seems weird to me that this board even has this issue. I mean, the slot the card is supposed to go in is a 16x slot, so theoretically the card should be running at 16x, right? Biostar keeps telling me to update my BIOS, but the only BIOS available is the original release that I already have. However, there are A LOT of GFX settings in the BIOS that I have never seen before, and I am sure one of them has to have something to do with this, but the Biostar tech team doesn't seem to know. I am about ready to RMA this board and just get a 790FX (non-Biostar).
 

Flipped Gazelle

Diamond Member
Sep 5, 2004
6,666
3
81
I have an 8800 GT. Do you guys want me to run some tests? Which ones?

When I installed the mobo, I ran 3DMark06 at 8x and 16x and found no difference.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
You will not see the difference at the default settings. And I agree with you, Gazelle: x8 and x16 don't make a practical difference. Both are equally enough to move data between the GPU and CPU. The difference, from my limited testing, occurs when data needs to access system RAM, which isn't an ideal situation anyway (for both x8 and x16).
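
For reference, here are the raw numbers behind that claim, sketched out with the standard per-lane PCIe figures (the 790GX slots are PCIe 2.0):

```python
# Theoretical per-direction PCIe bandwidth: 8b/10b encoding gives
# 250 MB/s per lane on PCIe 1.x (2.5 GT/s) and 500 MB/s on 2.0 (5 GT/s).
per_lane_mb = {"PCIe 1.x": 250, "PCIe 2.0": 500}
for gen, mb in per_lane_mb.items():
    for lanes in (8, 16):
        print(f"{gen} x{lanes}: ~{mb * lanes / 1000:.1f} GB/s per direction")
# Even x8 on a PCIe 2.0 board is ~4 GB/s each way, far more than a game
# needs unless the card is constantly paging textures out of system RAM.
```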
 

v3rax

Member
Sep 24, 2007
64
0
0
Well, if you could just put the 8800 GT on the motherboard, get it running at 16x, and then tell me how you have it set up... which slots? What BIOS settings, etc... to get it to run in 16x.

I am not so concerned with whether 16x is remarkably faster than 8x... I just want to be able to get it to run at 16x so that I know nothing is wrong with the mobo. I have tried 2 different 9600 GTs and tried them in both the upper and lower GFX slots, and regardless of what I do, I still get 8x... it's kind of a principle thing to me... this mobo SHOULD run my card at 16x, as advertised, and if it isn't, either I have something set up wrong or the mobo is defective...

Either way, if some of you are using an Nvidia card on this same mobo and ARE getting 16x, please tell me how you did it... if none of your suggestions work, then I know the mobo is bad and I can RMA it.

Thanks!
 

Flipped Gazelle

Diamond Member
Sep 5, 2004
6,666
3
81
Originally posted by: v3rax
Well, if you could just put the 8800 GT on the motherboard, get it running at 16x, and then tell me how you have it set up... which slots? What BIOS settings, etc... to get it to run in 16x.

I am not so concerned with whether 16x is remarkably faster than 8x... I just want to be able to get it to run at 16x so that I know nothing is wrong with the mobo. I have tried 2 different 9600 GTs and tried them in both the upper and lower GFX slots, and regardless of what I do, I still get 8x... it's kind of a principle thing to me... this mobo SHOULD run my card at 16x, as advertised, and if it isn't, either I have something set up wrong or the mobo is defective...

Either way, if some of you are using an Nvidia card on this same mobo and ARE getting 16x, please tell me how you did it... if none of your suggestions work, then I know the mobo is bad and I can RMA it.

Thanks!

I just put the little paddle card into the top PCI-e slot, and the system automatically changes the PCI-e speed for the existing card from 8x to 16x. No BIOS changes necessary.

FWIW, in 3DMark06 I score 14456 @ 8x and 14452 @ 16x. As lopri pointed out, 3DMark06 at default settings isn't going to push things enough to actually test the effect of 8x vs 16x.
 

v3rax

Member
Sep 24, 2007
64
0
0
Well, I don't know what the problem is, but I have done everything suggested here, including putting the paddle card in the top slot and changing various BIOS settings, and I still can't get 16x. It is beginning to look like the problem might be caused by the 9600 GT; I wish I had an ATI card to try out.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
I ran into this problem back when I had an 8800 GTX, and really I have only seen Nvidia users with this problem. It could be a driver issue, so make sure you have the latest drivers from Nvidia, or perhaps try the latest stable WHQL drivers. Also try setting your PCI-E frequency in the BIOS above 100, to about 105; this has sometimes caused it to go back to 16x.
 

eduecris

Junior Member
Apr 23, 2009
2
0
0
Hi guys!

I purchased the same board and also an ATI HD 4830. The mobo simply does not recognize the video card. I tried the 2 PCIe slots, tried with and without the paddle card, changed the memory, changed the power supply, and nothing!! I also tried all the configuration options in the BIOS and different BIOS versions.
No OS recognized the network card either.
Do you agree with me that the card is defective?
Regards,
Ed
 

Gman39

Junior Member
Apr 29, 2009
1
0
0
You're correct in that the top PCIE slot is for your card, and the bottom slot is for the paddle card.

Try this:

In the BIOS, under Advanced Chipset Settings, select "AMD 790GX Config." and set "Internal Graphics Mode Config" to "Disable."

Now select "PCI Express Confg." and select "Port #02 Features." Set "Gen2 High Speed Mode" to "Advertised RC".

Save and Exit. You should now be running at 16X.

Hope this helps,

Glenn
 

eduecris

Junior Member
Apr 23, 2009
2
0
0
Hi Glenn,

I tried this, and when I turn on the computer it starts beeping because it cannot find any video card.
 

sanders4617

Member
Mar 3, 2009
87
0
66
The paddle card goes into the slot closest to the CPU... and the actual GPU goes into the lower slot, away from the CPU.