The 'True colour' rendering roadmap?

MadAd

Senior member
Oct 1, 2000
429
1
81
This one's an appeal to the vid techs out there: how is true colour rendering going to be achieved? Will it be a case of increasing the bit count to huge levels and calling it 'near true', or will there be 'other' methods used? If so, what? (Wild guesses allowed.)

Today we have 32-bit (3x8 RGB + 8A), and it's natural that at some point soon the bar will be pushed to 40 or even 48 bits (e.g. 4x8 CMYK + 16A) as gamers and card companies both figure out the next nasty thing to pick on (last year jaggies, next maybe banding?) and market the hell out of it. However, even 48-bit images don't exactly fill a sky with smooth colour transitions, so where will they go from there?
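Just to make those layouts concrete, something like this in C (the 64-bit format is pure guesswork on my part):

    #include <stdint.h>

    /* Today's 32-bit pixel: 8 bits each of red, green, blue, alpha. */
    typedef struct {
        uint8_t r, g, b, a;
    } Pixel32;

    /* A speculative deeper pixel: 16 bits per channel (64 bits total).
       This layout is just my guess at what such a format might look like. */
    typedef struct {
        uint16_t r, g, b, a;
    } Pixel64;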

Also, what will the processing needs be to fulfil 48-bit? Are we talking something that can be reworked out of the current specs of cards/CPUs today? Would tile-based systems like the Kyro, redesigned for a higher bit depth, succeed where others would choke?

And what of the development cycle? True colour is going to be great for skies, large areas of land or sea, skin tones etc. in T&L situations, and is likely to be backward compatible with earlier bit depths, but what of the implementation? Would it add much more to the design of a world/model, or would OGL/D3D handle the hard work, leaving only lower-level commands to deal with?

Lastly, what of tomorrow? Are there any developments outside of traditional rendering systems we might see?
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
24-bit rendering is already pretty much all the average human eye can distinguish. There are some exceptional people who can tell the difference between 24-bit and 30-bit (which Matrox is using), but most average Joes can't; probably only around 40% of the population can even tell the difference between 16-bit and 24-bit.

What 30-bit is REALLY useful for is the internal calculation. If you store colours as 24-bit, you're going to get rounding errors, which are going to multiply if you put them through too many calculations (add lighting, fog, smoke, coloured lighting, bump mapping, distance, anisotropic filtering and all that, and you have a fair few transformations of the original pixel), and you might get artifacting or banding or whatnot. The easiest way to prevent this is to just bump up the internal bit depth.
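To see why the internal precision matters, here's a toy C example: darken a channel a few times, then brighten it back. In floating point the value survives; at 8 bits the truncation from each pass gets amplified on the way back up. (The operations themselves are invented purely to show the rounding loss.)

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint8_t c8 = 200;     /* 8-bit channel, truncated every pass */
        double  cf = 200.0;   /* high-precision internal value       */

        for (int i = 0; i < 5; i++) { c8 /= 2; cf *= 0.5; }  /* darken   */
        for (int i = 0; i < 5; i++) { c8 *= 2; cf *= 2.0; }  /* brighten */

        printf("8-bit: %u   float: %.1f\n", c8, cf);  /* 192 vs 200.0 */
        return 0;
    }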

I don't know how you can say skies are banded in 32-bit; I've seen hundreds of landscape shots, renders, and plain old digital photographs with absolutely no banding at 24 bits.
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
Originally posted by: Shalmanese
I don't know how you can say skies are banded in 32-bit; I've seen hundreds of landscape shots, renders, and plain old digital photographs with absolutely no banding at 24 bits.
My eyes must be rather sensitive, then: on a plain gradient on a good monitor, I can see the color banding in 8 bits/channel RGB. However, sky scenes are usually dithered in such a way that the banding is not noticeable.
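To show what I mean about the dithering, here's a minimal C sketch: quantize a smooth ramp to 8 levels, once directly and once with noise added before quantizing. The noisy version trades banding for grain, which the eye forgives. (The parameters are arbitrary, just for illustration.)

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        const int levels = 8;
        for (int x = 0; x < 16; x++) {
            double v = x / 15.0;                   /* smooth ramp, 0..1 */
            int plain = (int)(v * (levels - 1) + 0.5);
            /* random dither of +/- half a quantization step */
            double noise = (rand() / (double)RAND_MAX - 0.5) / (levels - 1);
            int dithered = (int)((v + noise) * (levels - 1) + 0.5);
            if (dithered < 0) dithered = 0;
            if (dithered > levels - 1) dithered = levels - 1;
            printf("x=%2d  plain=%d  dithered=%d\n", x, plain, dithered);
        }
        return 0;
    }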
BTW, I heard something about the new R300 chip from ATI: that it will use 32 bits/channel RGB for internal calculations in 3D. Dunno if it's true or not, but that would be f-ing awesome! Hopefully it will also provide the capability to output more than the status quo of 8 bits/channel RGB. :)
 

sao123

Lifer
May 27, 2002
12,653
205
106
The current limit of mainstream (non-server) PCs is 32-bit processing.
In order for higher-than-32-bit color to appear, hardware that can make use of 64-bit instructions has to become mainstream.
64-bit CPU
64-bit databus
64-bit D-word RAM
64-bit AGP/PCI replacement bus
64-bit OS
etc. etc.
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Originally posted by: sao123
The current limit of mainstream (non-server) PCs is 32-bit processing.
In order for higher-than-32-bit color to appear, hardware that can make use of 64-bit instructions has to become mainstream.
64-bit CPU
64-bit databus
64-bit D-word RAM
64-bit AGP/PCI replacement bus
64-bit OS
etc. etc.

I disagree... the colors come from the textures, and those are handled by the video card. The card could do 256-bit if you really wanted it to on a 32-bit CPU; it just wouldn't help humans ;)
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
Originally posted by: sao123
The current limit of mainstream (non-server) PCs is 32-bit processing.
In order for higher-than-32-bit color to appear, hardware that can make use of 64-bit instructions has to become mainstream.
64-bit CPU
64-bit databus
64-bit D-word RAM
64-bit AGP/PCI replacement bus
64-bit OS
etc. etc.
No, Adobe Photoshop can work with 16 bits/channel images (aka 64-bit) right now, and 3DS Max can render 16 bits/channel on consumer-level (32-bit) hardware. The limitation is purely in the graphics cards. Now, when graphics cards that can do more bits per channel of color become available, the limitation will then be in the CPU, which will need more power to throw around those larger chunks of pixel data. Also, the bit-ness of the RAM, AGP, and PCI bus doesn't really matter, AFAIK, as long as it is capable of transferring enough data to keep the video card saturated with things to do.
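For instance, blending two 16-bit channels needs at most a 32-bit intermediate result, which is exactly what a 32-bit CPU's registers already provide. (blend16 here is just a made-up example, not any real API.)

    #include <stdint.h>

    /* Blend two 16-bit channels; alpha runs 0..65535. The intermediate
       products fit in 32 bits, so a 32-bit CPU handles this natively. */
    uint16_t blend16(uint16_t src, uint16_t dst, uint16_t alpha) {
        uint32_t out = ((uint32_t)src * alpha
                      + (uint32_t)dst * (65535u - alpha)) / 65535u;
        return (uint16_t)out;
    }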
 

Kolio

Junior Member
Jun 29, 2002
8
0
0
Originally posted by: CTho9305
Originally posted by: sao123

I disagree... the colors come from the textures, and those are handled by the video card. The card could do 256-bit if you really wanted it to on a 32-bit CPU; it just wouldn't help humans ;)

Modern cards are manufactured with 256-bit GPUs and 256-bit memory, but this doesn't mean that the color will be 256-bit. The future cards coming later this year will have a 128-bit color option, though.
 

Moohooya

Senior member
Oct 10, 1999
677
0
0
There are so many colours that cannot be represented with RGB. First we need display devices that can show these colours before we need to worry too much.

As for printing, RGB just doesn't cut it. I'm not sure how they know what the image will look like until they print it, if their display is incapable of showing those colours.
 

lifeguard1999

Platinum Member
Jul 3, 2000
2,323
1
0
Sorry, but 24-bit rendering is NOT all the average human eye can distinguish. That is only 256 shades of red, 256 shades of green, and 256 shades of blue. Remember, the human eye is more sensitive to green and red than it is to blue. 256 shades of blue may be fine, but 256 shades of red is not.
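A rough back-of-the-envelope in C (assuming a typical 2.2 display gamma and an eye that can spot roughly a 1% luminance step -- both approximations) shows how uneven those 256 steps really are:

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        for (int code = 10; code <= 250; code += 60) {
            double lo = pow(code / 255.0, 2.2);
            double hi = pow((code + 1) / 255.0, 2.2);
            /* relative luminance jump between adjacent 8-bit codes */
            printf("code %3d -> %3d: step %.1f%%\n",
                   code, code + 1, 100.0 * (hi - lo) / lo);
        }
        return 0;
    }

The dark end jumps by over 20% per code while the bright end moves under 1%, which fits the observation elsewhere in this thread that banding shows up in the darker shades.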
 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
Originally posted by: sao123
The current limit of mainstream (non-server) PCs is 32-bit processing.
In order for higher-than-32-bit color to appear, hardware that can make use of 64-bit instructions has to become mainstream.

64-bit databus
The original 5V Pentium had this (and so has every PC processor since).

64-bit D-word RAM
See above... RAM width has to match bus width unless you require modules to be used in pairs/quads/etc. Or does that technically make it a 64-bit word and a 128-bit D-word? We'll have to go back to 486s!!

64-bit AGP/PCI replacement bus
The AGP bus has always been 64-bit

 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
Can you see the banding in this pic (in "truecolor" mode)?

114 greens in a GIF
256 greens in a RAR'ed BMP

Note: MS Paint shows the BMP as blue (different programs seem to disagree on BMP channel order...). MS PhotoEdit, WinJPEG and most other viewers show it as green.

I have to look pretty close to notice it, and even then I can only see it on the darker half of the ramp. Green, being smack in the middle of the visible spectrum, is the color our eyes are most sensitive to.

The pic is only 256 shades of pure green (8-bit monochrome).

Odds are that the times when you think you are seeing banding in a gradient, you are either in high-color (15-bit) mode, or the program creating the gradient is taking the shortcut of drawing 50 or so rectangles rather than actually calculating ALL the possible 8-bit-per-channel shades.

You can zoom in on this pic and verify there actually are 256 green bars -- there is no dithering hiding the gradient.

Edit: changed gradient orientation to vertical rather than horizontal to improve the GIF compression ratio -- this also has the interesting effect of making the banding far more visible on an analog monitor, as signal noise hides it less. I still can't see it on the brighter half of the ramp...
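Edit 2: if anyone wants to generate their own test ramp, here's a quick C program that writes an equivalent 256-shade green BMP (my own throwaway code, not the exact program that made the posted image):

    #include <stdio.h>
    #include <stdint.h>

    static void put16(FILE *f, uint16_t v) { fputc(v & 255, f); fputc(v >> 8, f); }
    static void put32(FILE *f, uint32_t v) { put16(f, (uint16_t)v); put16(f, (uint16_t)(v >> 16)); }

    int main(void) {
        const int w = 256, h = 256;  /* 768-byte rows: already 4-aligned */
        uint32_t imgsize = (uint32_t)w * h * 3;
        FILE *f = fopen("green_ramp.bmp", "wb");
        if (!f) return 1;

        fputc('B', f); fputc('M', f);    /* BITMAPFILEHEADER           */
        put32(f, 54 + imgsize);          /* total file size            */
        put32(f, 0); put32(f, 54);       /* reserved; offset to pixels */
        put32(f, 40);                    /* BITMAPINFOHEADER size      */
        put32(f, (uint32_t)w); put32(f, (uint32_t)h);
        put16(f, 1); put16(f, 24);       /* 1 plane, 24 bpp            */
        put32(f, 0); put32(f, imgsize);  /* no compression             */
        put32(f, 2835); put32(f, 2835);  /* ~72 dpi                    */
        put32(f, 0); put32(f, 0);        /* no palette                 */

        /* one green shade per row; BMP rows run bottom-up and each
           pixel is stored blue-green-red, hence the channel mixups */
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++) {
                fputc(0, f); fputc(y, f); fputc(0, f);  /* B, G, R */
            }
        fclose(f);
        return 0;
    }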
 

Remnant2

Senior member
Dec 31, 1999
567
0
0
That's a good example image: it makes the banding effect very obvious to the naked eye.

I could see us moving to a 32-bit framebuffer with no alpha channel (something like 11/11/10 bits per channel), which would give you thousands of shades of each primary -- that should be quite sufficient to avoid visible banding.
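Packing that into one 32-bit word is trivial; a sketch in C (the field order here is my arbitrary choice):

    #include <stdint.h>

    /* 11 bits red (0..2047), 11 bits green, 10 bits blue (0..1023) */
    uint32_t pack_11_11_10(uint32_t r, uint32_t g, uint32_t b) {
        return (r & 0x7FF) | ((g & 0x7FF) << 11) | ((b & 0x3FF) << 22);
    }

    void unpack_11_11_10(uint32_t p, uint32_t *r, uint32_t *g, uint32_t *b) {
        *r = p & 0x7FF;
        *g = (p >> 11) & 0x7FF;
        *b = (p >> 22) & 0x3FF;
    }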

For 3D accelerators doing many-pass texturing and blending, I don't think the trend will stop until we hit full single-precision 32-bit floating point per channel (96-bit color). This gives sufficient accuracy to do some fairly sophisticated calculations without bad rounding error.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
The AGP bus has always been 64-bit
WRONG! AGP is, and always has been, 32-bit! AGP is faster than PCI due to a 66 MHz clock speed, with clock doubling for 2x and clock quadrupling for 4x (32 bits x 66 MHz x 4 works out to roughly 1 GB/s for AGP 4x, versus ~133 MB/s for 32-bit/33 MHz PCI). See the AGP 2.0 specs.

I believe the industry is moving towards 64-bit color with four 16-bit floating-point color channels. Currently, banding in color gradients can be seen in games because the numerous layers of color calculations lead to round-off errors. These are visible if you look for them. This is going to get worse as games employ more light sources in scenes, so it's off to 64-bit we go!
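For the curious, a 16-bit float channel might look like this: 1 sign bit, 5 exponent bits, 10 mantissa bits. Here's a simplified C conversion from an ordinary 32-bit float (it truncates instead of rounding and flushes tiny values to zero, and I'm not claiming this is any particular card's actual format):

    #include <stdint.h>
    #include <string.h>

    uint16_t float_to_half(float f) {
        uint32_t x;
        memcpy(&x, &f, sizeof x);                        /* raw float bits */
        uint16_t sign = (uint16_t)((x >> 16) & 0x8000);
        int32_t  exp  = (int32_t)((x >> 23) & 0xFF) - 127 + 15;  /* rebias */
        uint16_t mant = (uint16_t)((x >> 13) & 0x3FF);   /* top 10 mantissa bits */

        if (exp <= 0)  return sign;                      /* underflow -> zero     */
        if (exp >= 31) return sign | 0x7C00;             /* overflow  -> infinity */
        return sign | (uint16_t)(exp << 10) | mant;
    }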
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Originally posted by: zephyrprime
There are only 114 colors in that green image that was posted.
There is definitely banding -- not nearly enough colors are used.
 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
Originally posted by: zephyrprime
There are only 114 colors in that green image that was posted.

Good catch. That'll teach me to trust PhotoEdit to preserve colors in GIF compression.

The RAR-compressed bitmap I am posting now has all 256 green shades and looks much better.
 

kazeakuma

Golden Member
Feb 13, 2001
1,218
0
0
After reading this I thought of the supposed NV30 specs -- the 4:1 colour compression deal. Of course we don't really know what it is, but we can speculate. I wouldn't think compression would actually help in terms of colourspace, though; I'm thinking it's more for bandwidth saving. But if Nvidia is able to fit four times the colour info in the same bits, then wouldn't we be looking at better IQ?
 

Moohooya

Senior member
Oct 10, 1999
677
0
0
None of this addresses the issue that a CRT cannot display all the colours that the human eye can see. It is easy to make a video card that will do 2^48 colours; 16-bit D/A converters are cheap. If that isn't good enough, let's make a 2^60 colour video card.

But what about all those other colours?

Check out Colour Faq for a decent colour FAQ.
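To put the gamut problem in code terms: take a highly saturated cyan-green (the XYZ values below are hand-picked for illustration) and run it through the standard XYZ-to-linear-sRGB matrix. The red component comes out negative, meaning no amount of bit depth will let an RGB monitor show that colour; it's simply outside the triangle.

    #include <stdio.h>

    int main(void) {
        double X = 0.2, Y = 0.5, Z = 0.3;  /* a saturated cyan-green */

        /* standard XYZ -> linear sRGB matrix (D65 white point) */
        double R =  3.2406 * X - 1.5372 * Y - 0.4986 * Z;
        double G = -0.9689 * X + 1.8758 * Y + 0.0415 * Z;
        double B =  0.0557 * X - 0.2040 * Y + 1.0570 * Z;

        printf("linear RGB = (%.3f, %.3f, %.3f)%s\n", R, G, B,
               (R < 0 || G < 0 || B < 0) ? "  <- out of gamut!" : "");
        return 0;
    }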