16-bit on ATI cards isn't "disgusting"!

Canuck

Junior Member
Nov 25, 2000
6
0
0
I have an ATI Rage Fury Pro (soon to be replaced with V5 or Radeon)
and yes, as it turns out I play a lot of Half-Life (err, Counter-Strike), and while there is a little problem with it, it's not that baaaddddd. Yes, if you concentrate and stare at a corner of the screen for a while you can see a bit of blurriness that shouldn't be there, but in fast-paced games like CS that's the last thing you worry about, especially for hardcore veterans like myself (kinda ironic/weird since I'm only 12 yrs. old, lol). And while ATI isn't the best for drivers, they don't suck; I haven't encountered many serious problems or glitches yet! So the Voodoo cards are the kings of stability, but ATI is still a great company. As for Nvidia: get out of my town!

(Crazy [====]:::::>) {CANUCK}
 

Looney

Lifer
Jun 13, 2000
21,938
5
0
Ack, you're a disgrace to all Canadians, Canuck :p

Actually, ATI's Rage Fury Pro support is crappy... but I can't say the same about the Radeon, though. There have been regular driver releases every few weeks, with the latest being a huge improvement IMO.

If you played a lot more games or had other cards to compare to, I'm sure you'd say the same thing.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
3dfx's 16-bit colour is by far the best, thanks to its 22-bit upsampling technique. I think when people say 16 bit on the Radeon sucks, they mean that you don't see a large performance boost when switching from 32 bit colour to 16 bit colour.
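
(For the curious, the trick is that the framebuffer stays plain dithered 16-bit; a postfilter averages neighbouring pixels on scan-out, recovering roughly 2 extra bits per channel: 5+2 / 6+2 / 5+2 = "22 bit". Here's a rough C sketch of the general idea; these aren't 3dfx's actual filter taps, just my illustration:)

#include <stdint.h>
#include <stdio.h>

/* Unpack one RGB565 pixel into its 5/6/5-bit channels. */
static void unpack565(uint16_t p, int *r, int *g, int *b)
{
    *r = (p >> 11) & 0x1F;  /* 5 bits red   */
    *g = (p >> 5)  & 0x3F;  /* 6 bits green */
    *b = p         & 0x1F;  /* 5 bits blue  */
}

/*
 * Hypothetical 2x2 box postfilter over a dithered 16-bit
 * framebuffer: summing four neighbours yields 7/8/7-bit
 * channel values, smoothing the dither pattern back into
 * the in-between shades it encoded. Illustration only --
 * not 3dfx's real filter kernel.
 */
static void postfilter(const uint16_t *fb, int w, int x, int y,
                       int *r7, int *g8, int *b7)
{
    int rs = 0, gs = 0, bs = 0, cr, cg, cb, dx, dy;
    for (dy = 0; dy < 2; dy++)
        for (dx = 0; dx < 2; dx++) {
            unpack565(fb[(y + dy) * w + (x + dx)], &cr, &cg, &cb);
            rs += cr; gs += cg; bs += cb;
        }
    *r7 = rs;  /* 0..124: four 5-bit values summed, ~7 bits */
    *g8 = gs;  /* 0..252: four 6-bit values summed, ~8 bits */
    *b7 = bs;  /* 0..124 */
}

int main(void)
{
    /* Tiny 2x2 framebuffer alternating two dithered greys. */
    uint16_t fb[4] = { 0x8410, 0x8C51, 0x8C51, 0x8410 };
    int r, g, b;
    postfilter(fb, 2, 0, 0, &r, &g, &b);
    printf("filtered: r=%d/124 g=%d/252 b=%d/124\n", r, g, b);
    return 0;
}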
 

WetWilly

Golden Member
Oct 13, 1999
1,126
0
0
I think when people say 16 bit on the Radeon sucks, they mean that you don't see a large performance boost when switching from 32 bit colour to 16 bit colour.
Nope, we mean it sucks. I had an Xpert 128/Rage Fury, and 16-bit stuff looked like poor dithering. They didn't fix the problem on the Rage Fury Pro, and I don't believe they fixed it on the Radeon. Some reviews mentioned it only in passing, because they ran all their tests at 32-bit. On the Rage Fury, if you had fast or decent framerates it wasn't too bad, but when things got slow you could definitely notice it. There was a driver switch to lessen it somewhat, but it impacted performance a bit.
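
(By "poor dithering" I mean the grainy checker pattern you get when the card quantises its higher-precision internal colours down to a 16-bit framebuffer. A minimal sketch of ordered dithering in C, my own illustration rather than ATI's actual hardware logic:)

#include <stdint.h>
#include <stdio.h>

/* 2x2 Bayer matrix of ordered-dither thresholds. */
static const int bayer2[2][2] = { {0, 2}, {3, 1} };

/*
 * Quantise an 8-bit channel down to 'bits' bits, adding a
 * position-dependent bias first so flat gradients break up
 * into a fine checker pattern instead of hard colour bands.
 */
static int dither_channel(int v8, int bits, int x, int y)
{
    int step = 256 >> bits;  /* size of one quantisation step */
    int v = v8 + (bayer2[y & 1][x & 1] * step) / 4;
    if (v > 255) v = 255;
    return v >> (8 - bits);  /* keep the top 'bits' bits */
}

/* Pack one dithered pixel into an RGB565 framebuffer word. */
static uint16_t pack565(int r8, int g8, int b8, int x, int y)
{
    return (uint16_t)((dither_channel(r8, 5, x, y) << 11) |
                      (dither_channel(g8, 6, x, y) << 5)  |
                       dither_channel(b8, 5, x, y));
}

int main(void)
{
    /* The same mid-tone grey lands on different 16-bit values
       at neighbouring pixels -- that alternation is the grain. */
    printf("pixel (0,0): 0x%04x\n", pack565(100, 100, 100, 0, 0));
    printf("pixel (1,0): 0x%04x\n", pack565(100, 100, 100, 1, 0));
    return 0;
}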

BTW, just (re)played some Half-Life on my new V5500 at 1280x960x16 with 4xFSAA. NICE and smooth.
 

nortexoid

Diamond Member
May 1, 2000
4,096
0
0
I owned an ATI Rage Fury Pro 32MB before my Savage 2000 and finally my GeForce SDR. All I have to say is that the Fury Pro had by far the worst 16-bit 3D quality. In CS it was blurry all over the ground like crazy, etc., and even worse depending on the quality settings used (I'm referring to the highest). The Savage 2000 is excellent; in fact, I'm still quite partial to my little passively cooled Savage 2K card. I miss it: dirt-cheap price, excellent performance (in Unreal-engine games, Q3-engine games, and Half-Life). Why'd I get rid of it? No idea.


Athlon 700@784 (112FSB)
Asus K7M
Asus V6600 Pure (Det 6.35)
Diamond MX300
128MB PC100 generic
20.4GB Fujitsu 7200RPM / 8.4GB Maxtor
Win98SE
 

nortexoid

Diamond Member
May 1, 2000
4,096
0
0
And why are you hating on nVidia anyway?
The company makes damn fine cards, however overpriced they may be; but so are the competition's...

Come on, support Trident and Cirrus Logic video chipsets instead.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
WetWilly:

and I don't believe they fixed it on the Radeon

Really? I knew the Rage 128 had poor 16-bit colour quality, but it was pretty much written off as yet another symptom of ATi's lackluster drivers in those days. I had no idea it was carried over to the Radeon. :frown:
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
BFG, not only is 16-bit on the Radeon slow, it is ugly.

Just plain ugly.

They do 32-bit up just right, though. Best 32-bit going right now.

Their anisotropic filtering runs circles around both my GTS with trilinear + anisotropic set and the 5500 using LOD bias and all the driver tweaks.

Just a shame it's a bit on the buggy side.

My bro says it's not too bad in Win98 in most games, aside from his older ones, but in Win2K it's buggy and unstable.
 

WetWilly

Golden Member
Oct 13, 1999
1,126
0
0
BFG,

It's true. From Anand's Radeon 64MB DDR Review:

Still present from the Rage128 days is the texture shimmering when in 16-bit color. In addition, the card still performs poorly at 16-bit color, once again a result of the poor driver sets. While this makes running in 32-bit color more attractive (as shown in the benchmark sections), it does limit the options one can play at.

I don't believe it's a driver issue, because:
1) The "fix" in the Rage Fury's drivers didn't totally resolve the problem
2) ATI keeps saying "ignore 16-bit because 32-bit is just as fast"
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
WetWilly:

I don't believe it's a driver issue, because:

Hmmm... that really does suck. A lot of old games don't support 32 bit colour. The Rage 128's 16 bit colour was horribly pixelated and I would never want to play at that setting.
 

WetWilly

Golden Member
Oct 13, 1999
1,126
0
0
BFG,

A lot of old games don't support 32 bit colour

Exactly. At least Anand mentioned the problem at all. Lots of sites get wrapped up in the minimal speed difference between 16-bit and 32-bit (to the Radeon's credit) and don't even raise the issue of older games. I should qualify my comments above about driver issues: it may be possible to fix the problem in drivers, but ATI's emphasis on 32-bit performance suggests it's not a priority.
 

PlunX

Golden Member
May 26, 2000
1,001
0
0
I may be alone here, but... I -always- use 16-bit color in games like UT and Quake3 with my Radeon. I get better framerates, and the only difference I see between 16-bit and 32-bit is that 32-bit is darker and costs me about five frames per second. I don't see anything bad about 16-bit color.
 

BW

Banned
Nov 28, 1999
254
0
0
That just goes to show how differently people see things. When I tried to play Half-Life Counter-Strike it looked horrible, all grainy, like my old Riva 128 did back in the day.
 

Mday

Lifer
Oct 14, 1999
18,647
1
81
So, I like my Rage Fury.

I like my AIW Radeon.

So what's your point, you crazy Canuck?
 

EvilDonnyboy

Banned
Jul 28, 2000
1,103
0
0
So ATi engineered their cards and drivers towards 32-bit performance instead of 16-bit performance/quality. I prefer it this way.

Besides, how many games worth playing can't support above 16-bit colour? HL can do it with a console command; Q2 does 32-bit on my G400 for some reason.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
Besides, how many games worth playing can't support above 16-bit colour? HL can do it with a console command; Q2 does 32-bit on my G400 for some reason.

Actually, the vast majority of games DON'T support 32-bit color.

And don't kid yourself: Quake2 may have a "32-bit color" setting, but that doesn't mean you're actually getting 32-bit precision.

Quake2 has a couple hundred thousand shades of brown, not a couple million. :)
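
(Strictly speaking, a 16-bit RGB565 framebuffer tops out at 65,536 colours, so the point stands with room to spare. Quick back-of-the-envelope check, illustrative only:)

#include <stdio.h>

int main(void)
{
    /* RGB565: 5 bits red, 6 bits green, 5 bits blue. */
    long c16 = 32L * 64L * 32L;      /* = 65,536 colours     */
    /* "32-bit" modes carry 24 bits of real colour (8/8/8);
       the remaining 8 bits are alpha or padding. */
    long c32 = 256L * 256L * 256L;   /* = 16,777,216 colours */
    printf("16-bit mode: %ld colours\n", c16);
    printf("32-bit mode: %ld colours\n", c32);
    return 0;
}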
 

SteelyKen

Senior member
Mar 1, 2000
540
0
0
I think that System Shock 2 and Thief are games WELL worth playing. They do not support 32 bit color (How I wish they did!).
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
So ATi engineered their cards and drivers towards 32-bit performance instead of 16-bit performance/quality

(1) As was pointed out, there are still a lot of 16-bit colour games.
(2) Prior to the Radeon, ATi's 32-bit performance wasn't exactly spectacular.