Image Quality vs. FPS - V5, Radeon, GeForce2

miniMUNCH

Diamond Member
Nov 16, 2000
In the drag-out scrap over FPS between nVIDIA and 3dfx users (and some Radeon users), there hasn't been much talk about image quality.

Personally, I would never game on a GeForce2 chip because the image quality bites big-time compared to the V5 and Radeon. I mean, hell! I don't care if the GeForce2s get high FPS in Quake if the image looks like frickin' ATARI PONG.


"I'm the joystick-stuck-to-my-hand guy! Now gimme some eyecandy!!" (rendition of Adam Sandler)

And what kind of name is "nVIDIA" anyway?! What the hell does that mean?! What FAR SIDE character came up with that name?! (Now I'm just being silly.)

Actually, I'm just kidding; I like nVIDIA. But I want to know what people think of the big three when image quality is taken into consideration.
 

SleepyTim

Member
Oct 27, 2000
Ahhh, an easy question for once.

2D Quality.

#1 - Radeon
#2 - Voodoo5
#3 - GeForce2 (GeForce2 cards are a distant 3rd here. Radeon and V5 are much better.)

3D Quality. (gaming)

#1 - Radeon
#2 - Voodoo5
#3 - GeForce2 (GeForce2, while still 3rd here, is not as bad as some people claim, making it a closer 3rd.)
 

SleepyTim

Member
Oct 27, 2000
Well Adam, right now I have a 64MB Radeon in one machine and an ULTRA in the other, so actually only two out of three at the moment. I had an AGP V5 5500 when it first came out, but returned it. I'm considering getting the PCI version of the V5 5500 for my Glide games now that the price has dropped a lot, and will add it to one of those systems. Hope that answers your questions.
 

AdamK47

Lifer
Oct 9, 1999
I have the Radeon 32MB DDR, GeForce2 GTS 32MB DDR, and the Voodoo5 5500. I've noticed that rendering quality differs with color depth on all three boards. Of the three in 16-bit color, it would go like this:

1- Voodoo 5
2- GeForce 2
3- Radeon

In 32-bit color....

1- Radeon
2- GeForce 2
3- Voodoo 5
 

SleepyTim

Member
Oct 27, 2000
Actually Adam, I completely agree. Radeon is #1 in 32-bit quality but not 16-bit. I just realized the reason I didn't break them down into two categories is that I don't ever play in 16-bit anymore, so I forgot it existed. The Voodoo5 definitely has the best 16-bit quality. Sorry about that. My fault for overlooking it.
 

pidge

Banned
Oct 10, 1999
I just sold my ATI AIW Radeon and went back to a GeForce2 GTS, and all my problems went away. No more weird flickering in D3D games. No more choppiness in D3D games during intense fighting scenes (this improved my gameplay quite a bit). No more strange reboot problems. FSAA runs much more stably, so I can finally play NFS: Porsche Unleashed with FSAA again. I didn't notice any difference when playing Quake III Arena, but Unreal Tournament did seem more colorful on the ATI AIW Radeon. That doesn't mean I miss it, though; UT plays so much smoother on my GeForce2 GTS. If you like your Radeons, then congratulations. I, however, prefer the GeForce2 GTS for now. I just got a 21-inch Mitsubishi, so I will be upgrading to the NV20 when it comes out.
 

RoboTECH

Platinum Member
Jun 16, 2000
I've had two GTSes and two 5500s. My bro, who lives down the street, has a Radeon.

Here's how they rank in my eyes (on a scale of 1-10, with 10 being the best). Games tested include MDK2, Q3, Q2, Q1, UT, Unreal, Deus Ex, 3DMark2000, NFS-PU, and Evolva.

2d quality
5500 - 10
Radeon - 10
GTS - 8 (up to 1024)
GTS - 6 (above 1024)

Both the 5500 and Radeon look awesome. It's almost hysterical to see the difference @ 1600x1200 between the GTS and the 5500 or Radeon.

16-bit 3d
5500 - 10
GTS - 7.5
Radeon - 6 (ugh.....)

With a few driver tweaks (lodbias "-0.5", alpha blending "sharper", mipmap dithering "enabled", 3D filter quality "high"), 16-bit on a 5500 looks so much like 32-bit it's not even funny.

Note: WITH DEFAULT driver settings, the 5500's 16-bit looks LAME. You gotta go in and adjust the drivers. Yet another reason why 99% of the website "reviewers" need a solid <thwap> upside their heads.
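For anyone wondering what that lodbias setting actually does: conceptually it just shifts which mipmap level gets picked. A negative bias selects a sharper (higher-resolution) level than the texel footprint alone would. Here's a minimal C sketch of that selection math; the function and values are purely illustrative, not actual 3dfx driver code.

```c
#include <math.h>
#include <stdio.h>

/* Illustrative sketch only -- not actual 3dfx driver code. Shows how a
 * negative LOD bias (like the "-0.5" tweak above) shifts mipmap
 * selection toward sharper, higher-resolution levels. */
static int select_mip_level(float texels_per_pixel, float lod_bias,
                            int num_levels)
{
    /* Base LOD is the log2 of the screen-space texel footprint. */
    float lod = log2f(texels_per_pixel) + lod_bias;
    if (lod < 0.0f) lod = 0.0f;                        /* clamp to base level */
    if (lod > (float)(num_levels - 1)) lod = (float)(num_levels - 1);
    return (int)lod;
}

int main(void)
{
    /* A footprint of 4 texels/pixel normally picks mip level 2;
     * a -0.5 bias drops that to level 1: sharper, but more shimmer. */
    printf("bias  0.0 -> level %d\n", select_mip_level(4.0f,  0.0f, 10));
    printf("bias -0.5 -> level %d\n", select_mip_level(4.0f, -0.5f, 10));
    return 0;
}
```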

32-bit 3d
Radeon - 10
GTS - 9
5500 - 8

This one is tough. The Radeon's anisotropic looks a lot better than the GTS's. Thankfully, the postfilter is no longer enabled in 32-bit on the 5500. lodbias "-1.0" + mipmap dithering "enabled" ALMOST makes up for the lack of trilinear. The GTS seems to take a pretty hefty hit with anisotropic enabled, more so than the Radeon.

2xFSAA
5500 - 10
GTS - 5
Radeon - D'OH!!!!!

16-bit (with the above tweaks) + 2xFSAA on the 5500 removes all the banding and dithering associated with 16-bit.

I've taken screenshots of rocket smoke trails, water, lights, and fog at 1024x768x16 w/2xFSAA. They look 32-bit. Quite incredible. The GTS's 1.5 x 1.5 is worthless, IMHO. Someone said there was a game where the 1.5 x 1.5 looks decent. I never saw it.
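If you want to see why supersampling pulls extra shades out of 16-bit, here's a tiny C toy of my own (nothing to do with the actual VSA-100 downsample hardware): quantizing an 8-bit channel to the 5 bits of RGB565 creates banding, and averaging two quantized samples recovers the in-between shades.

```c
#include <stdio.h>

/* Toy example, not VSA-100 hardware: quantizing an 8-bit channel to
 * 5 bits (as in RGB565) causes banding; averaging two supersamples
 * recovers intermediate shades, which is roughly why 16-bit + 2xFSAA
 * can pass for 32-bit. */
static unsigned quant5(unsigned v8)   /* 8-bit -> 5-bit -> back to 8-bit */
{
    unsigned v5 = v8 >> 3;
    return (v5 << 3) | (v5 >> 2);     /* replicate top bits so 255 stays 255 */
}

int main(void)
{
    /* Walk a smooth gradient: the single-sample column bands in steps
     * of 8, while the averaged column lands between the bands. */
    for (unsigned v = 0; v <= 32; v += 4)
        printf("in %3u  16-bit %3u  2x-avg %3u\n",
               v, quant5(v), (quant5(v) + quant5(v + 4)) / 2);
    return 0;
}
```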

Also note: bump the lodbias slightly, to "-0.75" or "-1.0", with FSAA @ 2x. This eliminates much of the "issues" that Ben just loves to harp on WRT RGSS and FSAA in general. Of course, he hasn't used the 5500 nearly enough, so he wouldn't know. heh...
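For reference, rotated-grid supersampling just means shading each pixel at diagonally offset subpixel positions and averaging the results. The sketch below is my own conceptual C illustration with assumed sample offsets, not the actual VSA-100 sample positions.

```c
#include <stdio.h>

/* Conceptual sketch of 2x rotated-grid supersampling; the diagonal
 * subpixel offsets here are assumed for illustration, not the actual
 * VSA-100 sample positions. */
typedef float (*shade_fn)(float x, float y);

static float rgss2x(shade_fn shade, int px, int py)
{
    /* two diagonally offset taps per pixel, then average */
    float s0 = shade(px + 0.25f, py + 0.25f);
    float s1 = shade(px + 0.75f, py + 0.75f);
    return 0.5f * (s0 + s1);
}

/* toy scene: a hard diagonal edge that aliases with a single tap */
static float edge(float x, float y) { return (x + y > 3.0f) ? 1.0f : 0.0f; }

int main(void)
{
    for (int x = 0; x < 4; x++)
        printf("pixel (%d,1): 1 tap %.2f  2x RGSS %.2f\n",
               x, edge(x + 0.5f, 1.5f), rgss2x(edge, x, 1));
    return 0;
}
```

The averaged column shows the partial-coverage grays along the edge that a single center tap misses entirely.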

The drivers my bro has don't have a "2x"-type option on the Radeon.

4xFSAA
5500 - 10
Radeon - 7
GTS - 6

Again, the lodbias slider makes all the difference here. It's not about "jaggies" with FSAA; it's about the obnoxious texture swimming and whatnot. The 5500's 2x removes it almost entirely, and at 4x... it's gone.


I think it is WAYYYYY extreme to say the GTS's image quality is poor. It's not. It's quite good in regular apps. It has a major advantage in a game like Evolva, which uses Dot3. Of course, the Radeon does it a bit better, although I wonder if that's the 16-tap anisotropic of the Radeon vs. the 8-tap "anisotropic" of the GTS.

The GTS looks "bad" in 2d, "okay" in FSAA, "good" in 16-bit, and "excellent" in 32-bit.
The Radeon looks "excellent" in 2d, "okay" in FSAA, "okay" in 16-bit, and "superlative" (that's better than "excellent") in 32-bit.
The 5500 looks "excellent" in 2d, "superlative" in FSAA, "superlative" in 16-bit, and "very good" in 32-bit.

There. That's my opinion. If you disagree, piss off. <g>