What's so hard about forcing a 32 bit RGBA framebuffer through the drivers?

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
I've always been pissed that nvidia and ATi won't either emulate dithering at the driver level or simply force a 32-bit framebuffer. Does anyone know why they won't fix the 16-bit color issue, other than "they're old games and no one wants to play them anymore"?

I'd really like to play Rayman 2 or Prince of Persia 3D, but they look like crap on recent graphics hardware, so I don't bother.

Some people say it should be up to the developer of the game, but I disagree. There are too many affected games to expect each developer to patch their own.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
What are you saying? That modern cards can't run in 16- or 15-bit mode? I hadn't heard that, I thought that they were backwards compatible. Is this not true?
 

Elixer

Lifer
May 7, 2002
10,371
762
126
Erm, what are you talking about?
If the game uses 16-bit color (5-5-5 or 5-6-5), then converting that to 8-8-8 is going to do what besides slow everything down?
The source material is 16-bit, so you can't improve on it.
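To make that point concrete, here's a toy Python sketch (purely illustrative, not what any driver does) of expanding a packed 5-6-5 pixel to 8-8-8 by bit replication. Every distinct 16-bit value maps to exactly one 24-bit value, so the conversion adds zero new detail:

```python
def rgb565_to_rgb888(pixel):
    """Expand a packed 16-bit RGB565 pixel to an (r, g, b) 8-bit tuple."""
    r5 = (pixel >> 11) & 0x1F   # top 5 bits: red
    g6 = (pixel >> 5) & 0x3F    # middle 6 bits: green
    b5 = pixel & 0x1F           # bottom 5 bits: blue
    # Bit replication: copy the high bits into the low bits so that
    # 0 maps to 0 and the maximum (31 or 63) maps to exactly 255.
    r8 = (r5 << 3) | (r5 >> 2)
    g8 = (g6 << 2) | (g6 >> 4)
    b8 = (b5 << 3) | (b5 >> 2)
    return (r8, g8, b8)
```

The mapping is one-to-one, which is exactly why upconverting after the fact can't recover anything: the banding was baked in when the game rendered at 16-bit.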
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Elixer said:
> Erm, what are you talking about?
> If the game uses 16-bit color (5-5-5 or 5-6-5), then converting that to 8-8-8 is going to do what besides slow everything down?
> The source material is 16-bit, so you can't improve on it.

No no no, what he's talking about is a problem with a lot of old games.
Back in the day, 16-bit reigned, but the video cards used dithering to make it look better. (3dfx, for instance, dithered their 18-bit output down to 16-bit.)
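For anyone curious what that kind of dithering actually does, here's a toy Python sketch of 4x4 ordered (Bayer) dithering while quantizing an 8-8-8 color down to 5-6-5. This is illustrative only; real hardware used its own matrices and pipelines:

```python
# Classic 4x4 Bayer threshold matrix, values 0..15.
BAYER4 = [
    [0,  8,  2, 10],
    [12, 4, 14,  6],
    [3, 11,  1,  9],
    [15, 5, 13,  7],
]

def dither_to_565(r8, g8, b8, x, y):
    """Quantize an 8-bit-per-channel color to 5-6-5 with ordered dithering,
    using the pixel's (x, y) screen position to pick a threshold."""
    t = BAYER4[y % 4][x % 4] / 16.0  # threshold in [0, 1)

    def quant(v, bits):
        levels = (1 << bits) - 1      # 31 for 5 bits, 63 for 6 bits
        scaled = v * levels / 255.0   # ideal output level as a float
        base = int(scaled)
        # Round up when the leftover fraction exceeds this pixel's threshold.
        if scaled - base > t and base < levels:
            base += 1
        return base

    return (quant(r8, 5), quant(g8, 6), quant(b8, 5))
```

The trick is that neighboring pixels of the same in-between color get rounded up or down depending on screen position, so the eye averages them out instead of seeing a hard band.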

I would think it would be possible to just force 32-bit color in place of dithering though. Try 3danalyzer, it might have that as an option.
I know the old Kyro cards didn't even support 16-bit; everything was just automatically rendered as 32-bit.