Got computer back from repair shop

RESmonkey

Diamond Member
My old GPU (radeon 9600) died and the dude threw in a Rage 128 Pro. I have a Geforce 2 MX in the closet.

Should I swap the Rage 128 Pro w/ my old Geforce 2 MX? Which one is faster/better? I plan on using 1680x1050 as desktop resolution.
 
The Rage probably displays Desktop Apps better. "Faster" shouldn't matter much unless you try to Game with it, then the GF2 MX would be better, but very limited still.
 
The Rage 128 Pro is a dinosaur from the Super 7 and Pentium 2 days (1998); the GF2 MX will provide miles better performance, but both cards suck considerably compared to the 9600.
 
Originally posted by: sandorski
The Rage probably displays Desktop Apps better.

What makes you say that? The 128 Pro competed with the TNT2. The GF2 is a generation newer (though both are ancient now). My GF2 MX400 served me well (all the way up until I got my X1900). I was only using it at 1024x768, though.
 
Originally posted by: vj8usa
Originally posted by: sandorski
The Rage probably displays Desktop Apps better.

What makes you say that? The 128 Pro competed with the TNT2. The GF2 is a generation newer (though both are ancient now). My GF2 MX400 served me well (all the way up until I got my X1900). I was only using it at 1024x768, though.

Better Quality Output. In 2D "Speed" is pretty much a non-Issue.
 
Originally posted by: sandorski
Better Quality Output. In 2D "Speed" is pretty much a non-Issue.

What do you mean? Was there some issue with the GF2's connectors? I don't recall any output issues back when I used mine.
 
ATI cards had better 2D back in the day. Geforce 2 era just had crappy 2D. I suggest you try both cards and see which you like best.
 
Originally posted by: vj8usa
Originally posted by: sandorski
Better Quality Output. In 2D "Speed" is pretty much a non-Issue.

What do you mean? Was there some issue with the GF2's connectors? I don't recall any output issues back when I used mine.

He is talking about this:

Originally posted by: sandorski
The Rage probably displays Desktop Apps better. "Faster" shouldn't matter much unless you try to Game with it, then the GF2 MX would be better, but very limited still.

and see post above mine by Azn.

By quality output he does not mean the build quality/connectors.
 
So which one would play the best video?

BTW, it's my dad's comp now so he's not going to game. Just watch movies I guess.
 
Originally posted by: RESmonkey
So which one would play the best video?

BTW, it's my dad's comp now so he's not going to game. Just watch movies I guess.

What kind of videos? I also hope you didn't pay somebody to work on that relic. :Q
 
Rage 128 Pro...LOL, that brings back memories. I had one in a G3 Mac I used to use. It was nice for Quake 3 and GLQuake, but that was about it. 😉

The GF2 MX will be faster for 3D, but in reality neither card can possibly replace a 9600 for gaming.
 
Paid a guy at a computer shop just to switch out video cards? Shouldn't have been that hard to diagnose the problem and realize it was a dead video card, buy a new one, replace, done.
 
How did the GF2 MX stack up to the GF4 MX?

Weren't they essentially the same arch with an enhanced mem crossbar?

I have no fond recollections of my MX440. In fact, I remember being very excited when a mate got an FX 5200 and I got to see the true beauty of the water in FarCry (I am assuming it uses DX9 shaders?), albeit very slowly 😉

 
Well let's do a little comparison..

Geforce2 MX:
Clock= 175mhz
MemClock= 166mhz
Fillrate= 700 MT/s
DirectX= 7
OpenGL= 1.2
Bus Type= SDR
Bus Width= 128bit
Bandwidth= 2.7 GB/s
Timings= 0:2:4:2
Fabrication= 180nm

Rage 128 Pro:
Clock= 125mhz
MemClock= 143mhz
Fillrate= 250 MT/s
DirectX= 6
OpenGL= 1.2
Bus Type= SDR
Bus Width= 128bit
Bandwidth= 2.28 GB/s
Timings= 0:2:2:2
Fabrication= 250nm


Based on the specs, and allowing for some variance in performance due to architectural differences, I'd say the GF2 MX is far enough ahead on paper to outweigh those variables.
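For what it's worth, the quoted bandwidth numbers check out if you just multiply memory clock by bus width (a rough back-of-the-envelope sketch in Python; SDR means one transfer per clock, and real-world throughput is lower than this theoretical peak):

```python
def bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s for an SDR bus (one transfer per clock)."""
    return mem_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# GeForce2 MX: 166 MHz x 128-bit -> ~2.66 GB/s (quoted above as 2.7)
gf2_mx = bandwidth_gb_s(166, 128)

# Rage 128 Pro: 143 MHz x 128-bit -> ~2.29 GB/s (quoted above as 2.28)
rage_128_pro = bandwidth_gb_s(143, 128)

print(f"GF2 MX:       {gf2_mx:.2f} GB/s")
print(f"Rage 128 Pro: {rage_128_pro:.2f} GB/s")
```

So the two cards are much closer in memory bandwidth than in fillrate, which is where the GF2 MX's real advantage shows up.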


 
Originally posted by: dug777
How did the GF2 MX stack up to the GF4 MX?

Weren't they essentially the same arch with an enhanced mem crossbar?

I have no fond recollections of my MX440. In fact, I remember being very excited when a mate got an FX 5200 and I got to see the true beauty of the water in FarCry (I am assuming it uses DX9 shaders?), albeit very slowly 😉

The GF4 MX 440 and 460 were almost as fast in DX7 games as the GF3 Ti 200.

IIRC the GF4 MX was just a really supercharged GF2 MX.
 
Originally posted by: RESmonkey
I plan on using 1680x1050 as desktop resolution.

These older cards do not support this resolution. You'll probably need to buy another card.
 
The computer died because the power supply went out and killed an HDD and the GPU (9600). He was kind enough to do it for free since we know him as a family friend.

I guess I'll just keep the ATi inside for now. 🙂
 
I was using a GeForce 2 MX until 2005/2006, so it's still good! I'm not sure if it does widescreen though. GeForce 2 MX and GeForce 4 MX were essentially the same, to answer dug's question...

oh man, GOOD times with those old school cards. loved them. 🙂
 
The repair bill probably cost more than the computer.

Rage 128 Pro. Holy crap - that was literally last century technology.
 
Originally posted by: vj8usa
Originally posted by: sandorski
Better Quality Output. In 2D "Speed" is pretty much a non-Issue.

What do you mean? Was there some issue with the GF2's connectors? I don't recall any output issues back when I used mine.

Back in those days, the analog output filters and things weren't integrated into the GPU, but were separate components on the PCB.
GeForce2 cards were notorious for having poorly designed filters and components that performed below spec. The result was that at high resolutions and/or refresh rates (remember, back then anything over 1024x768 at 60 Hz could be considered 'high'), the image got blurry. You got poor contrast, and text appeared 'washed out', especially black-on-white.

ATi built its own cards back then, and always used high-quality components. Both ATi and Matrox had an incredible reputation for delivering very sharp pictures to your monitor, regardless of resolution and refresh rate. ATi was actually recommended by photo companies such as Kodak.

I had an Asus GeForce2 GTS which suffered from the problem... Having come from a Matrox card, it was almost unacceptable. There were mods around: removing some components from the filter would fix the blurriness. I performed that mod on my card, and the result was an image about as sharp as a proper Matrox/ATi card's.
 