Originally posted by: Navid
I see that I did not word my question properly. Let me try again.
When you calibrate brightness on a CRT, you can reduce the vertical size so that unscanned black borders appear above and below the picture. Then you display an all-black image (RGB = 0,0,0) and gradually decrease the brightness until the scanned black area becomes indistinguishable from the unscanned borders. At that point, you know your brightness is right.
I know how to change the brightness on the LCD. I don't know how to tell when the brightness is right since you cannot adjust the scanned area size on an LCD.
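A common substitute on an LCD, where there is no unscanned border to compare against, is a near-black step pattern: bars at levels just above RGB 0 on a true-black background, with brightness lowered until only the darkest bars merge into the background. Below is a minimal sketch of such a pattern generator, assuming Python with the Pillow imaging library; the bar levels, image size, and file name are illustrative choices, not anything from this thread.

```python
# Near-black test pattern for judging LCD black level ("brightness").
# Assumes Pillow (pip install Pillow); levels and sizes are arbitrary
# illustrative choices, not specified anywhere in this thread.
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 1280, 720
LEVELS = range(0, 9)  # gray levels 0..8 out of 255

img = Image.new("RGB", (WIDTH, HEIGHT), (0, 0, 0))  # true-black background
draw = ImageDraw.Draw(img)

bar_w = WIDTH // len(LEVELS)
for i, level in enumerate(LEVELS):
    # Each bar is a slightly brighter near-black; the level-0 bar should
    # always blend into the background, which serves as the reference.
    draw.rectangle(
        [i * bar_w, HEIGHT // 3, (i + 1) * bar_w - 1, 2 * HEIGHT // 3],
        fill=(level, level, level),
    )

img.save("near_black_steps.png")
```

Viewed full screen in a dark room, the brightness would typically be lowered until the level-1 and level-2 bars are only just distinguishable from the level-0 background; the exact cutoff is a judgment call.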
Originally posted by: Matthias99
Actually, that's how you should adjust your *contrast*. Brightness affects the white level, and you need to adjust that so that you are not crushing white values below 255. Adjusting brightness based on the black level could give you results that are significantly nonoptimal.
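To check the white end that Matthias99 describes, a near-white pattern works the same way in reverse: bars just below RGB 255 on a full-white background, with the control backed off until the dimmer bars reappear. A minimal sketch, again assuming Pillow, with illustrative level choices:

```python
# Near-white test pattern for spotting crushed whites (values below 255
# rendering as solid white). Assumes Pillow; the specific levels are an
# illustrative choice, not from this thread.
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 1280, 720
LEVELS = range(247, 256)  # gray levels 247..255

img = Image.new("RGB", (WIDTH, HEIGHT), (255, 255, 255))  # full-white background
draw = ImageDraw.Draw(img)

bar_w = WIDTH // len(LEVELS)
for i, level in enumerate(LEVELS):
    # If the white level is set too high, the dimmer bars vanish into
    # the full-white background ("crushed" whites).
    draw.rectangle(
        [i * bar_w, HEIGHT // 3, (i + 1) * bar_w - 1, 2 * HEIGHT // 3],
        fill=(level, level, level),
    )

img.save("near_white_steps.png")
```

If the bars at 250 through 254 all render as solid white, values below 255 are being crushed; backing the setting off until they separate from the background is the usual remedy.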
The black level is the brightness and the white level is the contrast. Why can't they just call things what they really are?
Originally posted by: Navid
Originally posted by: Matthias99
Actually, that's how you should adjust your *contrast*. Brightness affects the white level, and you need to adjust that so that you are not crushing white values below 255. Adjusting brightness based on the black level could give you results that are significantly nonoptimal.
Not according to the dozens of web pages that come up if you search for monitor calibration.
This is just an example.
