Converting back to 3dfx? Opinions needed...


Soccerman

Elite Member
Oct 9, 1999
No, actually, no one is required to follow the reference design. They're only required to use the same default chip and memory clocks as the reference (I think).

This is one reason why we see terrible 2D image quality on most nVidia-based cards (actually, I haven't seen it myself, so I'm only going on what others have said): the board makers go el cheapo in that area, and 2D quality suffers.

Creative is the devil! Don't buy from them (ok, their Nomad is, I think, the best MP3 player, but that's it).

j/k, they aren't the devil, they just suck (imho).

As for switching, I'd recommend trying as much as possible to solve these problems before giving up.

Why? Well, nVidia cards have only been good for people who are willing to coax them to full health. I have a friend who had a GeForce (when the original came out), and he returned it and bought a CD burner. He had some crappy Hercules 128 card, which I think was based on an S3 Trio chip; needless to say, it sucked balls. I felt sorry for him and lent him my Voodoo Banshee, which wasn't in use at the time, and he came back the next day saying, "Wow, that's fast!" LOL.

Moral of the story? People who don't like spending time installing driver updates, modding their hardware (for better 2D), and troubleshooting problems might want to go 3dfx rather than nVidia.

The speed difference now (with the 1.04.00 drivers with tweaks enabled) is much smaller than before, yet no one has bothered to do a comparison (even FiringSquad, who normally does that kind of thing, hasn't yet).

And as DaveB3D stated in another thread, the GeForce isn't very future-proof at all. It's not very compatible with DirectX 8, which is where most T&L-enabled games will come from (if I'm not mistaken, OpenGL doesn't fully support T&L, does it? Not unless you program for individual T&L units, I think).
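
(For the curious, here's roughly what I mean by the T&L question on the OpenGL side. This is just a sketch I threw together, not anything from DaveB3D's thread, and the GLUT/GLU setup is purely my own example. As far as I understand it, with OpenGL's fixed-function pipeline you feed the standard transform and lighting state, and the driver decides whether the card's hardware T&L or the CPU actually does the work.)

/* Rough sketch, not from anyone's actual code: standard fixed-function
 * OpenGL transform + lighting. A card with hardware T&L (like a GeForce)
 * would accelerate these calls through its driver; nothing here targets
 * a specific vendor's T&L unit.
 * Build with something like: gcc tnl_sketch.c -lGL -lGLU -lglut
 */
#include <GL/glut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    /* Standard transform pipeline: projection + modelview matrices. */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45.0, 4.0 / 3.0, 1.0, 100.0);

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(0.0f, 0.0f, -5.0f);
    glRotatef(30.0f, 0.0f, 1.0f, 0.0f);

    /* Fixed-function lighting: no vendor-specific code involved. */
    GLfloat light_pos[] = { 1.0f, 1.0f, 1.0f, 0.0f };
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, light_pos);

    glutSolidTeapot(1.0);   /* any lit geometry will do */
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutCreateWindow("fixed-function T&L sketch");
    glEnable(GL_DEPTH_TEST);
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}

(Whether that ends up faster than doing the math on the CPU is a driver and hardware question, which is kind of the point of the whole 3dfx vs. nVidia argument.)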