Can someone please explain LCD and CRT refresh rates? Does it have anything to do with frames per second? If the Hz is higher, do you see more frames per second?
edit: googling right now.
"The refresh rate is the number of times a display's image is repainted or refreshed per second. The refresh rate is expressed in hertz so a refresh rate of 75 means the image is refreshed 75 times in a second. The refresh rate for each display depends on the video card used. You can change the refresh rate in the display properties. However, if you change the refresh rate to a setting that the display or video card cannot support, the display goes blank or the image becomes distorted. It is recommended to consult the display and video card manuals before changing the settings to determine the supported refresh rates."
so if you have your monitor at 100 Hz you can see 100 frames per second, correct? so why do people buy LCDs that max out around 85 Hz, and why do some resolutions only support 60 Hz?
edit: update
"Note: Do not confuse the refresh rate with the term "frame rate", often used for games. The frame rate of a program refers to how many times per second the graphics engine can calculate a new image and put it into the video memory. The refresh rate is how often the contents of video memory are sent to the monitor. Frame rate is much more a function of the type of software being used and how well it works with the acceleration capabilities of the video card. It has nothing at all to do with the monitor.
The refresh rate is important because it directly impacts the viewability of the screen image. Refresh rates that are too low cause annoying flicker that can be distracting to the viewer and can cause fatigue and eye strain. The refresh rate necessary to avoid this varies with the individual, because it is based on the eye's ability to notice the repainting of the image many times per second."
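here's a toy model i put together to convince myself (my own sketch in python, not from the article; the rates are made up): the engine drops finished frames into video memory whenever one is done, and the monitor just grabs whatever is in video memory every 1/refresh_rate seconds.

# toy model: frame rate (engine) vs refresh rate (monitor)
frame_rate = 200      # engine finishes a frame every 5 ms (made-up number)
refresh_rate = 60     # monitor repaints every ~16.7 ms

duration = 1.0        # simulate one second
video_memory = None   # holds the number of the latest finished frame
shown = set()         # frames the monitor actually displayed

render_t = 0.0
refresh_t = 0.0
frame_no = 0
while refresh_t < duration:
    # engine renders every frame it can finish before the next repaint
    while render_t <= refresh_t:
        frame_no += 1
        video_memory = frame_no
        render_t += 1.0 / frame_rate
    shown.add(video_memory)          # monitor repaints from video memory
    refresh_t += 1.0 / refresh_rate

print(f"engine produced {frame_no} frames, monitor showed {len(shown)}")
# -> prints something like: engine produced 197 frames, monitor showed 60
#    past the refresh rate, the extra frames never reach the screen

so whatever the engine's frame rate is, the screen can never show more than refresh_rate distinct images per second. that answers my 100 Hz question above.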
ok, it's starting to make sense. what about the ms times listed for LCDs?
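from what i can tell, the ms number is the pixel response time (how long a pixel takes to change from one color to another), and it's a separate spec from the refresh rate. my own back-of-envelope check in python (the 25 ms panel is a made-up example):

# does the pixel response time fit inside one refresh?
refresh_hz = 60
response_ms = 25                   # hypothetical slow panel

frame_ms = 1000 / refresh_hz       # ~16.7 ms per repaint at 60 Hz
if response_ms > frame_ms:
    print(f"pixels need {response_ms} ms but a repaint lasts {frame_ms:.1f} ms"
          " -> transitions smear across repaints (ghosting)")
else:
    print(f"{response_ms} ms response finishes within one {frame_ms:.1f} ms repaint")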
