Originally posted by: Jeff7
I think it's pretty funny that 75 Hz was seen as ideal a few years ago, but now it's just not good enough.
Yeah, now I can see flicker at 85Hz. I've been spoiled by 120Hz @ 1024x768. That's on a 21" Nokia 445Pro, by the way. Just checked - for the record, it can do 1600x1200 @ 85Hz. The flicker is noticeable, but not serious.
That's part of the problem though, ironically enough. High-end monitors with enough video bandwidth to do extremely high refresh rates at lower resolutions often also have a shorter-persistence phosphor coating to match, so that you don't get "CRT ghosting". The flip side is that if you run at a really high resolution, and therefore at a necessarily lower refresh rate, the flicker becomes more noticeable than it would be on a longer-persistence tube. Flicker on larger screens also tends to be more noticeable simply because, no matter what part of the screen you're focused on, some other portion of it falls in your peripheral vision, and peripheral vision is more sensitive to flicker.
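To put rough numbers on that bandwidth tradeoff: the pixel clock a mode needs is roughly width x height x refresh rate, times a blanking-overhead factor. Here's a quick back-of-the-envelope sketch in Python; the 1.32 overhead factor is just a typical GTF-era ballpark I'm assuming, not an exact figure for any particular monitor, and pixel_clock_mhz is a throwaway helper of my own.

```python
# Back-of-the-envelope pixel-clock estimate for a CRT video mode.
# The blanking overhead is an assumed ~32% (a typical ballpark for
# GTF-style timings, not exact for any specific monitor).
BLANKING_OVERHEAD = 1.32

def pixel_clock_mhz(width: int, height: int, refresh_hz: float,
                    overhead: float = BLANKING_OVERHEAD) -> float:
    """Approximate pixel clock in MHz for the given mode."""
    return width * height * refresh_hz * overhead / 1e6

# The two modes mentioned above, plus the one the tube can't do:
for w, h, hz in [(1024, 768, 120), (1600, 1200, 85), (1600, 1200, 120)]:
    print(f"{w}x{h} @ {hz}Hz -> ~{pixel_clock_mhz(w, h, hz):.0f} MHz")
```

Run that and 1024x768 @ 120Hz needs only ~125 MHz, 1600x1200 @ 85Hz is already ~215 MHz, and 1600x1200 @ 120Hz would take ~304 MHz, presumably more than that tube's electronics can drive. Hence the necessarily lower refresh rate at high resolutions.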
I used to play spaceship shoot-em-ups on my game consoles when I was younger, on a nice little monochrome composite display. For one, the display was small and easy to fit in the space I had for it, and for another, it probably actually improved my gameplay, because of the phosphor trails streaming off the moving pixels. (Mentioned as an example of CRT ghosting, due to long-persistence phosphors.)