
Adapter: 32-bit color faster than 16-bit color?

BadThad

Lifer
I read this and am perplexed:

LCD/CRT Article

When using 32-bit color, more video RAM is consumed. How can it be that a display will run "faster" at 32-bit color? There are more colors to render and less RAM is available.
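The RAM side of this is easy to put numbers on. A quick sketch in Python (the resolution is just an example, not from the article): doubling the bytes per pixel doubles the framebuffer size.

```python
# Rough framebuffer size at different color depths.
# 16-bit = 2 bytes per pixel, 32-bit = 4 bytes per pixel.
def framebuffer_bytes(width, height, bits_per_pixel):
    return width * height * (bits_per_pixel // 8)

for bpp in (16, 32):
    size = framebuffer_bytes(1024, 768, bpp)
    print(f"1024x768 @ {bpp}-bit: {size / 2**20:.1f} MiB")
# 1024x768 @ 16-bit: 1.5 MiB
# 1024x768 @ 32-bit: 3.0 MiB
```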

The article also says to use a SLOW refresh rate! 60Hz is recommended for an LCD so that the screen is brighter and more colorful.

This goes against everything I've believed in. Run LESS COLOR to free video RAM and improve performance, and run as high a refresh rate as possible to reduce eye strain. 😕

Opinions?
 
This isn't that technical... but I'll dip my finger in the topic for your sake...


As for 16-bit color speed vs. 32-bit color, I have NO IDEA where you heard that... it's kind of obvious that 16-bit requires less power... i.e., play Quake and check the framerate drop...
However, I think what you're basing yourself on is DRIVERS... because the majority of games are 32-bit now, programmers have no real reason to concentrate on the 16-bit side of them. Take my 64MB DDR Radeon. With some drivers, it performs worse on 16-bit settings.


As for the refresh rate, remember that LCDs are NOT CRTs and therefore have different properties. Look at a CRT @ 60Hz and then an LCD... big difference, huh?
Some LCDs run at 57-60Hz sometimes....

EDIT: wrong term
 
I'm kind of perplexed at your post now that I've read the article... it explains it right there...

As for 32-bit color... for reference, it's the same as 24-bit but adds an alpha channel... i.e., 24-bit color is screwy sometimes on most monitors.
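To make the 24-bit vs. 32-bit point concrete: both use 8 bits each for red, green, and blue, so the color range is identical; the extra byte in a 32-bit pixel is alpha (transparency), not more colors. A minimal sketch in Python (the ARGB packing order is an assumption; it varies by API):

```python
# Pack an ARGB pixel: 8 bits each of alpha, red, green, blue.
# The RGB color information is exactly what 24-bit gives you;
# alpha is the fourth byte that makes it 32-bit.
def pack_argb(a, r, g, b):
    return (a << 24) | (r << 16) | (g << 8) | b

opaque_orange = pack_argb(0xFF, 0xFF, 0xA5, 0x00)
print(hex(opaque_orange))  # 0xffffa500
```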

As for LCDs, the lower refresh rate doesn't heavily strain the screen. Even on a 42'' plasma I set up for someone, the refresh rate was low and the native resolution was perfect... every other setting looked like crap...

 


<< it is kind of obvious that 16 requires less power...ie....play quake and check >>



Not always, though. Remember that the original Radeon 64 DDR card played much faster when switched to 32-bit mode. And I really don't think the drivers were at fault.
 
The card might render 32-bit internally. When you use 16, you're actually calculating at 32 and then dithering down to 16.
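Here's roughly what that "down to 16" step looks like, assuming the common RGB565 layout (5 bits red, 6 green, 5 blue) and simple truncation rather than true dithering, so it's a sketch of the conversion, not of any particular card's hardware:

```python
# Reduce 8-bit-per-channel RGB to 16-bit RGB565 by dropping low-order bits.
# Real hardware may dither to hide the resulting banding; this just truncates,
# which is the extra work 16-bit mode can cost on a 32-bit-internal renderer.
def rgb888_to_rgb565(r, g, b):
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

print(hex(rgb888_to_rgb565(0xFF, 0xFF, 0xFF)))  # 0xffff (white survives intact)
```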
 
Hmm....well, the video card thing is this:
Higher-end video cards, such as the GF3+ and Radeon 8500+, are actually designed with 32-bit mode in mind. 32-bit colors are generally at the top of the proverbial "stack" from which the graphics card gets its color combinations on newer cards. This means less searching (it's actually not quite like this, but you get the idea, I think) and a faster path into the video RAM for rendering. 32-bit colors are not inherently larger, either. It's not necessary to use colors that are always out of the 16-bit range. That's why you can run 16-bit color apps on a 32-bit color desktop. By setting your graphics card to 32-bit you enable it to draw from the full color palette. Also, drivers play a large part in the whole process as well. As games move to 32-bit, so do driver opcodes, and the 16-bit color ones sometimes get dropped for the 32-bit ones.

The LCD/TFT thing has been explained 🙂
 