Is 100hz better for the eyes than 85hz?

bandXtrb

Banned
May 27, 2001
2,169
0
0
Thanks for reminding me about that. I set my monitors from 60 (default) up to 85. There are always some things I forget to do when I reinstall my operating system.
 

Zim Hosein

Super Moderator | Elite Member
Super Moderator
Nov 27, 1999
64,975
388
126


<< in theory, I believe so.

but if your monitor is fuzzy at 100hz then heck no.
>>



Agreed, GtPrOjEcTX.

d1abolic, IMHO half of it is the science, while the other half is personal preference, meaning what the individual person is sensitive to. I personally am very bothered by low refresh rates, while a good friend of mine can tolerate far lower ones than I can :confused:
 

Rallispec

Lifer
Jul 26, 2001
12,375
10
81
Yeah, less flicker. I can really tell the difference between 65 and 85; 100 should be even better.
 

joohang

Lifer
Oct 22, 2000
12,340
1
0
My Matrox allows 1 Hz increments in refresh rate, so I can crank it up to 1024x768 @ 89 Hz on my Viewsonic E771, but I run it at 75 Hz because I found the text fuzzier at higher refresh rates. Probably because the monitor is cheap.
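
As an aside for anyone curious how utilities request rates like that: below is a minimal Win32 sketch in C. It assumes Windows and a driver that actually exposes the mode, and the 89 Hz value is only joohang's example, not a recommendation.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Start from the mode the desktop is currently running. */
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        fprintf(stderr, "could not read the current display mode\n");
        return 1;
    }

    /* Keep the resolution, ask for 89 Hz (joohang's example rate). */
    dm.dmDisplayFrequency = 89;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    if (ChangeDisplaySettings(&dm, 0) != DISP_CHANGE_SUCCESSFUL)
        fprintf(stderr, "the driver rejected that refresh rate\n");
    return 0;
}

If the driver doesn't list the exact mode, the call fails cleanly rather than forcing the monitor out of range.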
 

ThaGrandCow

Diamond Member
Dec 27, 2001
7,956
2
0
It depends on how good the monitor is. A cheap monitor may be able to hit 100hz, but the image may be fuzzy there. The best refresh rate is different for every monitor, so bump it up one step at a time: you want the highest refresh rate that gives no fuzzy text or screen irregularities.
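
If you'd like to see what those "steps" actually are before clicking through the dropdown, here's a rough C sketch, assuming Windows: EnumDisplaySettings walks the driver's mode table, and this prints every refresh rate it reports at your current resolution and color depth.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE cur, dm;
    DWORD i;

    ZeroMemory(&cur, sizeof(cur));
    cur.dmSize = sizeof(cur);
    EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &cur);

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Walk the driver's mode list and keep the entries matching the
       current resolution and color depth. */
    for (i = 0; EnumDisplaySettings(NULL, i, &dm); i++) {
        if (dm.dmPelsWidth == cur.dmPelsWidth &&
            dm.dmPelsHeight == cur.dmPelsHeight &&
            dm.dmBitsPerPel == cur.dmBitsPerPel)
            printf("%lux%lu @ %lu Hz\n", dm.dmPelsWidth,
                   dm.dmPelsHeight, dm.dmDisplayFrequency);
    }
    return 0;
}

That's essentially the same list the display properties dialog filters, so anything missing here is the driver's doing.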
 

zayened

Diamond Member
Feb 28, 2001
3,931
0
0
is it BAD to uncheck that box that says "only show refresh rates the monitor can handle" (or something like that) and set the refresh rate to, say, 75 even though the only rate showing was 60 when the box was checked?
 

CocaCola5

Golden Member
Jan 5, 2001
1,599
0
0
If your video card is fast and handles very high refresh rates well, then a higher rate will look better; a cheaper card will do better with the refresh rate kept low, well under its maximum rate.
 

Nefrodite

Banned
Feb 15, 2001
7,931
0
0
Well, probably; over the long run your eyes grow less tired.

As for the video card thing, well, 100hz is 100hz :p If your vidcard sucks then it sucks :)
 

Jerboy

Banned
Oct 27, 2001
5,190
0
0
You should experience less eye fatigue after a long period of use; a higher frequency means less flicker. One on-off cycle per second is considered one hertz. A typical fluorescent light has a "refresh rate" of 120Hz, because on 60Hz mains it goes on and off twice in every cycle. You can use fluorescent lamps with electronic ballasts and drive them at 40,000Hz, and you will not see any flicker at that rate. I wonder if it is possible to get the refresh rate up to 40,000Hz on computer monitors... :D:D Anyhow, LCD monitors have almost no flicker.
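
The arithmetic in that post, as a throwaway C sketch (nothing assumed beyond the post's own 60Hz-mains numbers): the 120Hz lamp figure is just 2 x 60Hz, and the flash-to-flash gap is 1000/rate milliseconds, which is why 40,000Hz can't be seen.

#include <stdio.h>

int main(void)
{
    /* Rates from the post: common monitor settings, a fluorescent lamp
       on 60Hz mains (on-off twice per cycle), and an electronic ballast. */
    const double rates[] = { 60.0, 85.0, 100.0, 2.0 * 60.0, 40000.0 };
    size_t i;

    for (i = 0; i < sizeof(rates) / sizeof(rates[0]); i++)
        printf("%8.0f Hz -> one flash every %.4f ms\n",
               rates[i], 1000.0 / rates[i]);
    return 0;
}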
 

ZeroBurn

Platinum Member
Jul 29, 2000
2,892
0
0
I think I read an article somewhere that said only 10% of people can tell the difference between 75hz and 85hz, and that beyond that you won't be able to tell the difference at all.

 

d1abolic

Banned
Sep 21, 2001
2,228
1
0


<< is it BAD to uncheck that box that says "only show refresh rates the monitor can handle" (or something like that) and set the refresh rate to, say, 75 even though the only rate showing was 60 when the box was checked? >>

If it says 60 when the box is checked, that means 60 is all your monitor supports.
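
One cautious way to check before forcing anything: ask the driver to validate the mode without actually switching to it. A hedged C sketch, assuming Windows; the CDS_TEST flag makes ChangeDisplaySettings test the mode and return without touching the screen. Note it only tests what the driver claims to support (the same list that checkbox filters), so it can't prove the monitor itself will sync; 75 is just zayened's example rate.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm);

    /* Probe 75 Hz at the current resolution without switching. */
    dm.dmDisplayFrequency = 75;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    if (ChangeDisplaySettings(&dm, CDS_TEST) == DISP_CHANGE_SUCCESSFUL)
        printf("the driver says 75 Hz should work here\n");
    else
        printf("the driver says no; forcing it is at your own risk\n");
    return 0;
}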
 

ndee

Lifer
Jul 18, 2000
12,680
1
0


<< I think I read an article somewhere that said only 10% of people can tell the difference between 75hz and 85hz, and that beyond that you won't be able to tell the difference at all. >>



lol.
 

kduncan5

Golden Member
Apr 22, 2000
1,794
0
0
A question along the same lines: using Windows 98SE, isn't it true that under normal circumstances the only refresh rates shown will be the ones your video card can handle? Also, what exactly is "Optimal"? :confused: -kd5-
 

Nefrodite

Banned
Feb 15, 2001
7,931
0
0
"Optimal" is generally bullsh*t. And if you set your refresh too high, no harm; you just don't see anything for 15 secs until it reverts.
 

Bullhonkie

Golden Member
Sep 28, 2001
1,899
0
76
It's probably marginally easier on your eyes, but I think most people wouldn't notice the difference. I always use 85hz where possible, as I can still notice flicker sometimes at 75hz. I haven't seen any difference above 85hz though, so I just stick to 85hz. I'd notch it up one step at a time until you no longer notice a significant improvement, as others have recommended. :)
 

duragezic

Lifer
Oct 11, 1999
11,234
4
81


<< is it BAD to uncheck that box that says "only show refresh rates the monitor can handle" (or something like that) and set the refresh rate to, say, 75 even though the only rate showing was 60 when the box was checked? >>


You'd have to check the specs in the manual or on the manufacturer's website to be sure it can handle higher; if it can and Windows isn't letting you, try reinstalling the monitor drivers. A few times mine looked to be installed right (a Samsung 900NF, not some unknown monitor or whatever), but it only gave me 60hz and 75hz even though the monitor can go much higher than that. So no, it's not bad, as long as you know for sure your monitor (and video card, though that usually isn't what's limiting it) can handle the rate you force.
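
A quick way to check whether the monitor driver took at all is to list what Windows thinks is attached. A rough C sketch, assuming Windows 98/2000 or later where EnumDisplayDevices exists; if a monitor like duragezic's 900NF shows up as a generic "Default Monitor", that's often exactly why only 60hz and 75hz get offered.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICE adapter, monitor;
    DWORD a, m;

    adapter.cb = sizeof(adapter);
    /* Outer loop: video adapters. Inner loop: monitors on each one. */
    for (a = 0; EnumDisplayDevices(NULL, a, &adapter, 0); a++) {
        printf("adapter: %s\n", adapter.DeviceString);
        monitor.cb = sizeof(monitor);
        for (m = 0; EnumDisplayDevices(adapter.DeviceName, m, &monitor, 0); m++)
            printf("  monitor: %s\n", monitor.DeviceString);
    }
    return 0;
}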
 

MikeO

Diamond Member
Jan 17, 2001
3,026
0
0



<< I think I read an article somewhere that said only 10% of people can tell the difference between 75hz and 85hz, and that beyond that you won't be able to tell the difference at all. >>



I sure as hell can tell the difference between 85hz and 100hz. Sometimes when I disable the Radeon's tv-out it sets the refresh to 85hz instead of the 100hz it's supposed to, and I notice it immediately. I've had this monitor over a year; I guess my eyes are just that used to 100hz already :)