Any monitor purchased within the past few years is a "multi-scanning" (multisync) monitor. Basically, there is a syncing circuit inside the monitor that reads the incoming signal and figures out its refresh rate. If the monitor can match that rate, it will; otherwise, it will tell you the mode isn't supported.
Earlier monitors would try to sync with anything, sometimes permanently damaging themselves if the refresh rate were set too high.
In ANY case, higher refresh rates are harder on the monitor. They force the electron beams to work harder, and the circuitry within the monitor must process the signal faster. At very high refresh rates, the electron beams may no longer line up perfectly with the phosphor dots, causing the blurriness or lack of focus common at those rates on most monitors.
Most people suggest that refresh rates between 72 and 85 Hz are optimal. They prevent the screen flicker caused by interaction with 60 Hz AC lighting (especially fluorescent lights) and are not too difficult for newer multi-scanning monitors to display.
In my opinion, refresh rates higher than 85 Hz aren't terrifically useful: the human eye can't differentiate changes that quickly, and screen flicker should be completely eliminated by 85 Hz (25 Hz faster than the 60 Hz interference signal).
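To make the arithmetic above concrete, here's a quick sketch of the "beat" between a refresh rate and 60 Hz lighting, i.e. how fast the two signals drift in and out of phase. The function name is my own, and this is a simplification of the actual flicker interaction, not how a monitor works internally:

```python
def beat_frequency(refresh_hz: float, lighting_hz: float = 60.0) -> float:
    """Difference frequency between the monitor refresh and the lighting flicker.

    A small difference means a slow, highly visible shimmer; a large one
    (like the 25 Hz gap at 85 Hz) is too fast to notice.
    """
    return abs(refresh_hz - lighting_hz)

for rate in (60, 72, 85):
    print(f"{rate} Hz refresh vs 60 Hz lighting -> {beat_frequency(rate):.0f} Hz beat")
```

At exactly 60 Hz the beat is 0 Hz, which is why a 60 Hz refresh under fluorescent lights looks so bad; at 85 Hz the 25 Hz beat is well past what the eye tracks.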
Kyle