
What bit depth are CRTs?

Originally posted by: Stumps
I don't think it counts, as CRTs are analog, so they wouldn't have a digital bit depth

yeah... I'm a bit confused by it too, but essentially I assume that if your graphics card can output it, a CRT can display it...
 
Originally posted by: dug777
Originally posted by: Stumps
I don't think it counts, as CRTs are analog, so they wouldn't have a digital bit depth

yeah... I'm a bit confused by it too, but essentially I assume that if your graphics card can output it, a CRT can display it...

This is mostly true due to their analog nature. A CRT is definitely capable of displaying more colors than an 8-bit LCD. I had an 8-bit LCD next to my CRT and noticed gradient banding on the LCD that didn't occur on the CRT. You'd need an expensive 10-bit LCD to get closer to CRT gradients, but of course those models have horrendous response times.
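As a rough illustration of why banding shows up at lower bit depths (this is just a sketch, not tied to any particular panel): quantize a smooth grayscale ramp to a given number of bits and count how many distinct shades survive.

```python
def quantize(value, bits):
    """Snap a 0.0-1.0 intensity to the nearest level a `bits`-deep panel can show."""
    levels = 2 ** bits - 1
    return round(value * levels) / levels

# A smooth 1001-sample ramp collapses to visible "bands" at low bit depths.
ramp = [i / 1000 for i in range(1001)]
for bits in (6, 8, 10):
    distinct = len({quantize(v, bits) for v in ramp})
    print(f"{bits}-bit panel: {distinct} distinct shades across the ramp")
```

At 6 bits the ramp collapses to 64 shades, at 8 bits to 256; whether you can see the jumps depends on the display and your eyes, which is exactly the side-by-side comparison described above.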

P.S. There are still people to this day who claim old analog records sound better than CDs; I wouldn't doubt it 😉
 
Infinite. The CRT's inputs are analog; the digital-to-analog conversion is handled by your graphics card.

Right now the voltage is divided into 256 steps for each color, represented by 8 bits each, but it's the graphics card that does the digital part. I believe some graphics cards can do 10 bits, which would give you 1024 steps per color. For most of us, 8 bits is fine, as we can't spot the difference between adjacent shades out of the 256.
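If you want to check that arithmetic, a quick Python sketch of steps per channel and total colors across the three channels:

```python
# Steps per color channel (2^bits) and total RGB combinations for common depths.
for bits in (8, 10):
    steps = 2 ** bits        # voltage levels per channel
    colors = steps ** 3      # independent R, G, B channels
    print(f"{bits}-bit: {steps} steps per channel, {colors:,} colors total")
# 8-bit:  256 steps per channel, 16,777,216 colors total
# 10-bit: 1024 steps per channel, 1,073,741,824 colors total
```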

8-bit LCDs handle the conversion internally, and the problem there is that the transitions produced by twisting the liquid crystal are sometimes not as smooth and predictable as a CRT varying the voltage on its guns, so even though 8 bits might normally be adequate, the result comes out banded on an LCD.
 
Of course, in practice nothing is infinite. But color gamut is not the limitation.

Gamut is the range; more bits just give you more steps within that range, smaller and smaller steps. By the time you reach 12 bits there is probably no one left on the planet who could distinguish the steps between adjacent colors, so going further than that is pointless.

 
Can you tell right away whether a panel is 6-bit or 8-bit? I mean, if you had two LCDs next to each other with the same desktop picture, one being 6-bit and one being 8-bit, could most people tell the difference? The reason I ask is that at Fry's they have the NEC 1970GX, which is 6-bit, next to the ViewSonic VP930B, which is 8-bit. Both had the same desktop picture running, and to me the 6-bit NEC actually looked better: the colors looked more alive, and the text also looked better on the NEC.
 
Originally posted by: JRW
P.S. There are still people to this day who claim old analog records sound better than CDs; I wouldn't doubt it 😉
That's not a good example, because those are the same people who buy snake-oil cabling. 😛 Also, it's easy to get a decent-sounding digital recording for cheap, but getting decent sound out of analog records takes a lot of money in higher-end equipment (turntable, stylus, preamp, etc.). Nobody claims records sound better using a cheap turntable in an all-in-one desktop stereo system from the 80s.
 