Originally posted by: vj8usa
Originally posted by: sandorski
Better quality output. In 2D, speed is pretty much a non-issue.
What do you mean? Was there some issue with the GF2's connectors? I don't recall any output issues back when I used mine.
Back in those days, the analog low-pass output filters weren't integrated into the GPU; they were discrete components on the PCB.
GeForce2 cards were notorious for poorly designed filters and components that performed below spec. The result was that at high resolutions and/or refresh rates (remember, back then anything over 1024x768 at 60 Hz could be considered 'high'), the image got blurry. You had poor contrast, and text would appear 'washed out', especially black-on-white.
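To see why higher resolutions and refresh rates stress the analog filter, here's a rough back-of-the-envelope pixel-clock estimate. The ~25% blanking overhead is an assumption typical of CRT timings, not an exact VESA GTF/CVT calculation:

```python
# Rough pixel-clock estimate: the filter must pass frequencies up to
# roughly the pixel clock, so higher res/refresh = more blur if the
# filter's cutoff is too low or its rolloff is sloppy.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    # blanking_overhead is an assumed figure (~25%), typical of CRT timings
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

for mode in [(1024, 768, 60), (1024, 768, 85), (1600, 1200, 85)]:
    w, h, hz = mode
    print(f"{w}x{h}@{hz} Hz: ~{pixel_clock_mhz(w, h, hz):.0f} MHz")
```

Going from 1024x768@60 (~59 MHz) to 1600x1200@85 (~204 MHz) more than triples the bandwidth the output stage has to handle cleanly, which is exactly where the marginal filters fell apart.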
ATi built its own cards back then and always used high-quality components. Both ATi and Matrox had an excellent reputation for delivering a very sharp picture to your monitor, regardless of resolution and refresh rate. ATi was actually recommended by photo companies such as Kodak.
I had an Asus GeForce2 GTS which suffered from the problem... Having come from a Matrox card, it was almost unacceptable. There were mods around: removing some components from the filter would fix the blurriness. I performed that mod on my card, and the result was an image about as sharp as a proper Matrox/ATi card.