How high should I set my refresh rate?

Cruze8

Member
Jan 15, 2002
111
0
0
I'm just wondering how high I should set my refresh rate for Direct3D and OpenGL games that run at 1024x768 and 1280x1024. Thanks for any and all input.
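One way to find out what your particular monitor/card combo will actually accept is to ask Windows for its mode list. Here's a minimal sketch in C using the documented Win32 call (the 1024x768 filter is just an example resolution; link with user32.lib):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    int i;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* EnumDisplaySettings walks the driver's mode table one entry at a time */
    for (i = 0; EnumDisplaySettings(NULL, i, &dm); i++) {
        if (dm.dmPelsWidth == 1024 && dm.dmPelsHeight == 768)
            printf("1024x768 @ %lu Hz, %lu-bit color\n",
                   dm.dmDisplayFrequency, dm.dmBitsPerPel);
    }
    return 0;
}

Anything the driver lists there is fair game; whether it still looks sharp at the top end is another question, as the posts below get into.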
 

rimshaker

Senior member
Dec 7, 2001
722
0
0
99% of people don't see flickering at 75Hz. While many say to use the highest rate your monitor supports, most people would notice no difference going up to 85 or 100Hz. Monitor manufacturers have confirmed that higher rates don't hurt or shorten monitor life... but what they don't mention is that your monitor consumes more power at higher refresh rates. So I just stick with an even 75Hz at all resolutions.
 

gunf1ghter

Golden Member
Jan 29, 2001
1,866
0
0
Some folks insist they can see the difference even between 85Hz and 100Hz. Generally speaking, if you are satisfied with the image quality, I wouldn't worry about it. Another thing to keep in mind is that with V-sync on, your refresh rate determines the FPS cap in first-person shooters, so you might need to experiment with turning V-sync on and off (see the sketch below). Also, 60 or 75Hz might look like crap on one monitor and quite acceptable on another. My Samsung only supports 60Hz at 1600x1200 but looks better at that resolution than my higher-end Sony, which can run it at 75Hz.
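For the curious, here's a minimal sketch of how that FPS cap gets toggled under OpenGL on Windows, assuming a GL context is already current and the driver exposes the WGL_EXT_swap_control extension (Direct3D has an equivalent present-interval setting). An interval of 1 waits for vertical blank, capping FPS at the refresh rate; 0 runs uncapped:

#include <windows.h>
#include <GL/gl.h>

/* function pointer type for the extension entry point */
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void set_vsync(int on)
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");

    if (wglSwapIntervalEXT)
        wglSwapIntervalEXT(on ? 1 : 0);  /* 1 = cap at refresh, 0 = uncapped */
}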
 

vitocorleone

Junior Member
Jul 6, 2001
8
0
0
The minimum recommended refresh rate - also the VESA-recommended standard - is 85Hz. Just because you don't "see" flicker doesn't mean you aren't experiencing eye strain. Long-term eye strain leads to vision problems. Is it worth it?

It's true that people vary in their sensitivity to flicker (I need 90Hz+). I personally don't think my eyes are worth sacrificing in order to save a few watts on my power bill.

Each monitor has different bandwidth capabilities. My 17" Viewsonic PS775 can do 1152x864@100Hz, but the text is a bit fuzzy. Knock it down to 95Hz and it's sharp as can be. That same bandwidth limit is why I dropped to 90Hz when I moved up to 1280x1024. This could be one big reason for the difference at different resolutions between the Samsung and the Sony. The rough arithmetic is below.
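As a back-of-the-envelope, the pixel clock a mode needs is roughly width x height x refresh rate, plus blanking overhead. The ~32% blanking figure here is an assumption (real timings vary per monitor), but it shows why pushing refresh or resolution eats bandwidth fast:

#include <stdio.h>

int main(void)
{
    /* assumed ~32% blanking overhead; actual timings differ per monitor */
    double w = 1152.0, h = 864.0, hz = 100.0, blanking = 1.32;
    double mhz = w * h * hz * blanking / 1e6;

    printf("%.0fx%.0f @ %.0fHz needs roughly %.0f MHz of pixel clock\n",
           w, h, hz, mhz);  /* ~131 MHz: near the limit of many 17" CRTs */
    return 0;
}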

75Hz is at the low end. If your monitor can handle it and it still looks good, use 85Hz+.
 

WA261

Diamond Member
Aug 28, 2001
4,631
0
0
I always use 100Hz... anything else (lower) I can tell and don't like.
 

tazdevl

Golden Member
Mar 1, 2000
1,651
0
0
Depends... desktop at 1280x1024 and just about everything else: 100Hz. CS at 1024x768: 140Hz.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Use 75Hz as a minimum, with 85Hz being preferred. Of course it'll depend on what your monitor can actually do.
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
I normally use 85Hz. I've tried 100Hz and above, and to be honest my monitor is slightly sharper at 85Hz, so I set all my resolutions to that.

:)
 

zsouthboy

Platinum Member
Aug 14, 2001
2,264
0
0
85Hz seems to be the sweet spot for most monitors... I prefer 100Hz, but...

zs
 

rimshaker

Senior member
Dec 7, 2001
722
0
0
I think some of you are exaggerating when you say 85 or 100Hz gives you headaches, or that you can even TELL there's flickering above 75Hz. So let me guess: you practically have seizures watching TV (60Hz) or a movie (24fps). Only a minute percentage of people are sensitive enough to see flickering above 75Hz. The degree of eyestrain is different for everyone, of course, and basing your level of comfort on a hard number like 75, 85, or 100 is ridiculous. It's not a test score, it's not a contest. We're analog, not digital.

All you have to do is set your monitor to display something bright in a dark environment, look off to the side a little, and if your peripheral vision notices flickering, adjust accordingly. Like I said, most people don't see the peripheral flickering at 75Hz, much less at 85.

If you notice anything at 100Hz... either you're NOT actually at 100Hz, you're Superman, or your monitor/video card really sucks :)
 

Amused

Elite Member
Apr 14, 2001
57,118
18,646
146
I was never able to tell the difference between the 75Hz and 85Hz settings before installing WinXP.

Now, under WinXP, I can tell the difference between 85Hz and 100Hz. Something in XP just isn't right when it comes to refresh rates. Anything below 100Hz is now unbearable to look at, when previously I could run at 75Hz and be happy.
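If XP isn't holding the rate you picked, one workaround is to request it explicitly through the documented Win32 call. A sketch, with 1024x768@85Hz as placeholder values (this isn't necessarily what the XP-era refresh-fix utilities do under the hood):

#include <windows.h>

/* Ask Windows for a specific mode and refresh rate; returns 1 on success.
   CDS_FULLSCREEN makes the change temporary, like a game's mode switch. */
int force_refresh(void)
{
    DEVMODE dm;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth = 1024;           /* placeholder resolution */
    dm.dmPelsHeight = 768;
    dm.dmDisplayFrequency = 85;      /* placeholder refresh rate */
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    return ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL;
}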