Any reason a GF2 won't work on an LX board? Make it a BX now.

McCarthy

Platinum Member
Oct 9, 1999
2,567
0
76
Updated stuff up here:
Went to do the same as below, only this time broke out my Soyo BA+ BX motherboard (1 x 32-bit AGP slot (v1.0 compliant), AGP 1x/2x). Exact same problem. Now I do turn up plenty of results with "BX" and "GF2" as the search terms in Google - 99% of them say all is well, same as I'd find with any other chipset. So now I'm doubting the amperage thing, especially since the card works in safe mode, as a standard VGA card, etc. It's only when the nVidia drivers (Dets .32 to .42, I think) are loaded that it won't get into Windows.

I could kind of accept that when it was just the LX board, but now it doesn't feel right. I mean, why would it be drawing a lot more power to display Windows in 640x480x16 colors with the nVidia drivers than with the standard VGA driver? If the card requires a later AGP spec, it'd be real nice if Gainward would mention it, but they don't, so I'm not going to assume it does. Had it running in 1x AGP mode on my main computer for a while when I did something stupid to my 4-in-1 drivers, so it can't be a 1x/2x/4x problem that I can see.

Ran Detonator Destroyer to start over fresh and got through everything fine: standard PCI VGA setup, restart, install Dets, restart. It posts, shows the Windows splash screen, goes black for just an instant while switching from splash to desktop mode (as every card I've ever had does), and that's where it stops: black screen, never gets to the desktop.

Surely someone else has seen this problem and figured out how to fix it, even if it's a software fix instead of a hardware one, which is what I'm thinking it might be now. If it is a power problem, it certainly is baffling to me.

--Mc

-------------------
My spare computer is a Celeron 600 on an LX board. Tonight I was going to throw my GF2 in there to see if the TV out was better quality than the old Canopus in there (the main computer isn't close enough to the TV). Eventually I plan to upgrade the video on the main system, and the GF2 would go in there anyway.

Running W98se on the spare. Fresh install, no previous drivers. Got to Windows, it detected the card, went through and installed the drivers. All was well.

Reboot. Go through POST, boot-up screens, W98 splash screen, and just at that instant where it's supposed to go to the desktop, all I get is a flashing _ _ _ _ in the upper left-hand corner of the screen.

Went to safe mode, redid the drivers multiple times, no apparent software problem. Went to Google and searched for "gf2 geforce2 LX chipset" in various combinations; mostly found vendor pages, but didn't see anything to answer my question.

So, is there something the LX is missing that the GF2 requires? Am I missing something infinitely basic here? Surely a GF2 can work in 1x or 2x AGP mode just fine, can't it? I know, I know, but I'm not playing Quake on it; it's just a repository for old hardware I use for other stuff.

--Mc
 

Aquaman

Lifer
Dec 17, 1999
25,054
13
0
I can't really answer your question, but weren't the AGP slots on the LX boards like the first ones out?

Anyhow, good luck with it :)

Cheers,
Aquaman
 

AndyHui

Administrator Emeritus / Elite Member / AT FAQ M
Oct 9, 1999
13,141
17
81
Just about all LX chipset-based boards, apart from the Intel Atlanta AL440LX motherboard, cannot supply enough amps for any present-day full-featured video card.

At the time, motherboard manufacturers were not expecting the huge power draw that arrived with the nVidia TNT cards. They took the cheap way out and did not implement a linear voltage regulator, since prior to the TNT no video card came anywhere near the maximum AGP 1.0 power specification.

There are some workarounds for this problem; I know that ASUS describes a way to supply enough power to the AGP slot directly from the ATX power supply for their P2L97 board. Other manufacturers may offer something similar, but as I remember, boards from Gigabyte and Abit are out of luck.
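
To put rough numbers on it, here is the back-of-the-envelope check in Python. The 10W figure is the one nVidia quotes for the GTS elsewhere in this thread; the 6A rail rating is my assumption for a typical AGP connector, not a measured figure for any of these boards:

    # Back-of-the-envelope AGP power check (illustrative only).
    card_watts = 10.0        # nVidia's quoted GTS consumption
    rail_voltage = 3.3       # the AGP 3.3v rail
    rail_limit_amps = 6.0    # ASSUMED typical connector rating

    draw_amps = card_watts / rail_voltage
    print(f"~{draw_amps:.1f} A on the {rail_voltage} V rail")
    if draw_amps > rail_limit_amps:
        print("over the assumed rail rating: power is a plausible culprit")
    else:
        print("within the assumed rating, but a board that skimped on the")
        print("regulator can still sag under a ~3 A load")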
 

McCarthy

Platinum Member
Oct 9, 1999
2,567
0
76
Bump post (new info above, but edits don't bump) and further update. Just tried a fresh install of W2K with it, thinking maybe there was some glitch with my combo and the 9x drivers. Same deal. Perfectly fine until the nVidia drivers are introduced, then nothing but a black screen at the switch-to-desktop moment.

This really sucks; guess I'll be buying a different card for my spare machine instead of demoting this nVidia thing to it, since my Radeon already left. It worked fine in there; I know it worked in the BX, and I think I used it in the LX as well. But not a GF2. Shoulda stuck with ATI. One of you nVidia followers want to change my mind? Tell me how to make a GF2 work in a BX.

--Mc
 

Insane3D

Elite Member
May 24, 2000
19,446
0
0
Just curious, have you tried the older nVidia drivers...like non-20.XX? IIRC, there have been issues with the newer Detonators (Detonator 4) and some older cards. Maybe try the Detonator 3's? I would suggest trying the 12.XX series. I also remember something about the newer Detonators not working properly in AGP 1.0 systems. :)


Edit: Go here for the 12.xx series of drivers....

3D Chipset
 

RossGr

Diamond Member
Jan 11, 2000
3,383
1
0
I ran my GF2 GTS in BX boards (an Abit BP6 and a BE6) for at least 6 months with no similar problems. The BP6/Celeron 400MHz @ 500MHz would tend to freeze up after a bit of 3D gaming, so it ended up in the BE6/PIII 450; there it ran flawlessly until I put it in a 1.2GHz T-bird/Shuttle AK31 system.
 

MadRat

Lifer
Oct 14, 1999
11,965
279
126
AGP slots support multiple voltages in their specs. However, the LX boards do not support the 1.5v that the nVidia family has used since the GeForce introduction. Intel doesn't support 3.3v on the Pentium 4 chipsets, which means you cannot run a lot of older video cards on your brand-new board! But who cares? It's all about progress. :)

A lot of video cards cannot run on LX motherboards because of this specific problem.
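
To lay it out plainly, here's a toy Python check of signalling-voltage overlap. The table entries just restate what I said above, nothing from Intel or nVidia documentation:

    # Toy signalling-voltage check, restating the claim above.
    slot_voltages = {
        "LX board": {"3.3v"},         # per my post: no 1.5v support on LX
        "Pentium 4 board": {"1.5v"},  # per my post: no 3.3v on P4 chipsets
    }
    card_needs = {"1.5v"}  # per my post, GeForce-family cards

    for board, supplied in slot_voltages.items():
        ok = bool(supplied & card_needs)
        print(f"{board}: {'should work' if ok else 'out of luck'}")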
 

McCarthy

Platinum Member
Oct 9, 1999
2,567
0
76
Where did 1.5v power come into the AGP equation? Apparently it's there on BX boards, or is it just some BX boards and not others? If it's a power problem, why does it work fine until the drivers are introduced? That seems strange; half the card never gets powered up until the nVidia drivers tell it to?

Tried the older series drivers; they didn't work any differently. Might still have parts of the newer ones in there, Det Destroyer or not, though. At this point I'm thoroughly frustrated with the whole endeavour. It's one thing to change the specs so part A doesn't work with part B even though they physically fit together; it's another to change it and not make it clear to the end user. All I could find at nVidia was some blurb in their FAQ for the GTS saying it only consumes 10W, but not at what voltage or what was required. The AGP slot and card are keyed the same, so to my way of thinking they should work together.

As to your question, MadRat, I care. Trying my best not to be sarcastic about nVidia and anyone else who supplied hardware for my little boggle here, but it's hard to restrain myself. When my main computer is quieter and has sharper output with my old V3-2000 PCI (which has worked in every PCI slot I've ever tried it in) than with the GF2 in question as I type this, I really have to question the whole concept of "progress".

--Mc
 

Boogak

Diamond Member
Feb 2, 2000
3,302
0
0
Weird... my GeForce2 Pro runs fine on my BX6-2; I use it for TV-out as well. Do you have all the correct BIOS settings? (Assign IRQ to VGA, etc.) Are you booting up with just the TV-out and no monitor? If so, try it with a monitor instead of the TV. Good luck!
 

AgentofEvil

Senior member
Jun 5, 2001
390
0
0
I had this problem with my MX on a BX board. I eventually just gave up and stuck it in my Duron box. Sorry I can't offer any help.
 

McCarthy

Platinum Member
Oct 9, 1999
2,567
0
76
TV out cables detached, just hooked up to a standard monitor.

The AGP slot is keyed 3.3v (2x); the card is keyed universal (two notches), so it should work in 3.3v or 1.5v slots. But not all of them, it seems.

ATI is nice enough to include a chart. Still looking for such a thing from nVidia. BTW, on that chart my combo would be "D", which, if the card were ATI, would work (and did).

Leadtek shed a little light on the matter in a FAQ stating: "All GeForce line of graphics cards required AGP 2.0 slot support from motherboard. The main different between the AGP 1.0 and 2.0 is the 2.0 has extra 3.3v support. Without this extra 3.3v from the AGP slot, GeForce graphics cards will not function properly. Most recent released motherboards should have AGP 2.0 slot design, but some old motherboard sometime will show up as 1.0. These motherboards sometime will work if the makers have enabled the 3.3V support for the AGP slot in the 1.0 design. Users with these type motherboards should contact their maker for more information on compatibility issues."

So you need AGP 2.0, or AGP 1.0 with the extra 3.3v, I guess. Seems some keying of the slot/card was in order. I'm guessing the guys who got GeForce cards to work on their AGP 1.0 slot BX boards were using later revisions of the boards with that unofficial change to the AGP 1.0 slot, or perhaps even a 2.0. Last night in my searching I found someone using a Soyo BA+100 (same AGP description as my BA+: "1 x 32 bit AGP slot (v1.0 compliant) AGP 1x/2x") with a GeForce2. Similar to mine, but a bit later revision of the board. Writing Leadtek's rule down makes the open question obvious; see the sketch below.
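
A quick Python sketch of that rule; the version numbers come from the board descriptions above, and the 3.3v flag is exactly the thing nobody documents:

    # Leadtek's rule, as quoted: a GeForce wants an AGP 2.0 slot, or an
    # AGP 1.0 slot where the maker enabled the extra 3.3v support.
    def geforce_should_work(agp_version, has_extra_3v3):
        return agp_version >= 2.0 or (agp_version == 1.0 and has_extra_3v3)

    # My boards as described above; whether either really has the extra
    # 3.3v enabled is the open question.
    print(geforce_should_work(1.0, has_extra_3v3=False))  # -> False
    print(geforce_should_work(1.0, has_extra_3v3=True))   # -> True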

Later:

In an effort to prove myself wrong and Andy right, I'm going to try giving it more 3.3v power as described here. The GA-6LX7 I have has the same 3.3v pad as shown on other boards, so if the multimeter says all's well I'll give it a shot. No such pad on the Soyo, so I wouldn't know what to do to it, but then that wasn't the goal to start with. Heck, I'll probably blow both up and be shopping later tonight.

Later Still:

Nope, that didn't do it. Nothing blew up, but it's reacting exactly the same. So it's definitely not short on 3.3v power, which is the shortcoming Gigabyte had and changed; there's no mention of 1.5v anywhere in their fix. So I'm back at square one, only now knowing it's got plenty of 3.3v power on the LX. That isn't the problem there, so it's doubtful 3.3v is the problem on the BX. As for this 1.5v: as I understand it, a card runs off either 3.3v or 1.5v, not both at once.

And still a later thought:

So if 2x is 3.3v and 4x is 1.5v, and the card can operate off either, maybe it's in 2x until the drivers try to kick it up to 4x. Anyone know how to force the card itself to always stay at 2x to test this? If there's a switch, my guess is it'd be a registry value the Detonators read; see the sketch below.
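
To be clear up front: the key path and value name in this sketch are pure guesses on my part for illustration. I haven't confirmed the Detonators read either one, so don't take it as gospel:

    # WARNING: the key path and value name are GUESSES for illustration
    # only; I have not confirmed the Detonators actually read them.
    # Back up your registry before experimenting with anything like this.
    import winreg

    KEY = r"SOFTWARE\NVIDIA Corporation\Global\System"  # assumed location

    with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
        # Hypothetical value capping the AGP transfer rate at 2x.
        winreg.SetValueEx(key, "ReqAGPRate", 0, winreg.REG_DWORD, 2)

    print("Wrote hypothetical AGP rate cap; reboot to test.")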

BTW, yes, video is the only card in the system. All others pulled (both boards) to eliminate conflict problems.

And after that:

Well, tried again with the older drivers AND the extra power. Still no go. Had tried to give up on it, found myself not able to; now I think I'm finally ready to call it a bust. Guess I'll just sell the card when the time comes to upgrade and buy a new one for the main; the spare's gonna have to stick with the RIVA128 under W98 for now, which has nice enough TV out when it's not crashing. Would love to move the spare to W2K, but you guessed it, no drivers (that support the TV out, anyway).

MadRat - AGP's never been 5v that I can find; it was 3.3v to start with, then switchable 3.3/1.5, and now moving to straight 1.5. Can't find any REASON this won't work, just posts on other forums from people saying they've run into the same thing (one guy with a P4 board, same deal: fine at 640x480x16 but a blank screen once the drivers are installed) with no solution. Dunno, the whole thing's weird. Gonna go put the computer back together before I get mad about it again. :)

--Mc