what's the best refresh rate for lcds?

Barfo

Lifer
Jan 4, 2005
27,539
212
106
I used the max possible with my old crt but I'm not sure if I should do the same now that I got an lcd.
 

Vesper8

Senior member
Apr 29, 2005
253
0
0
as far as I know, LCDs don't use a refresh rate

so the default is 60Hz and it's the only one that works (and the only one that should be available)
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Of course LCDs use refresh rates. Apparently most can be run at 75Hz even when not specified. Running the highest possible refresh rate on an LCD is pretty important for gaming, as vsync is necessary to avoid the noticeable tearing which is the result of the crystals' relatively slow response time. The higher the refresh rate, the higher fps you get with vsync on.

 

Barfo

Lifer
Jan 4, 2005
27,539
212
106
Originally posted by: Dethfrumbelo
Of course LCDs use refresh rates. Apparently most can be run at 75Hz even when not specified. Running the highest possible refresh rate on an LCD is pretty important for gaming, as vsync is necessary to avoid the noticeable tearing which is the result of the crystals' relatively slow response time. The higher the refresh rate, the higher fps you get with vsync on.

uh huh, so that's why 3D games have been running choppy since I lowered it to 60Hz :p

thanks
 

Auric

Diamond Member
Oct 11, 1999
9,591
2
71
Tearing is not a result of the display's abilities (it can happen with CRTs just as with LCDs) but of some combination of graphics software and hardware. Regardless of the display, if there is tearing, vsync can be enabled to prevent it. However, if the graphics system then cannot maintain a framerate at least equal to the refresh rate, choppiness (severe framerate dips) can occur. To minimize that, triple buffering may be enabled at the expense of greater VRAM usage, which may in turn overflow into slower system RAM and hamper performance, which may in turn lead to even slower paging to the HDD. Since it all depends on the specific system and game, it is best to use per-game settings profiles for the best experience.
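For what it's worth, here's a minimal sketch of how a game requests vsync, assuming SDL2 with an OpenGL context (triple buffering is usually a driver control-panel toggle rather than an API call, so it isn't shown):

```c
/* Minimal sketch (assuming SDL2) of requesting vsync at context creation. */
#include <SDL.h>

int main(void) {
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("vsync demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        640, 480, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    /* 1 = sync buffer swaps to the display's refresh rate (vsync on),
       0 = swap immediately (vsync off, tearing possible). */
    if (SDL_GL_SetSwapInterval(1) != 0) {
        SDL_Log("vsync not supported: %s", SDL_GetError());
    }

    /* ... render loop: draw, then SDL_GL_SwapWindow(win); ... */

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```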
 

Fraggable

Platinum Member
Jul 20, 2005
2,799
0
0
LCDs don't work like CRTs do, where each frame needs to be continuously redrawn for the screen to display a steady image. LCDs can hold a frame until the next one is sent, resulting in a 'flicker-free' image.

60Hz will look just like 75Hz, assuming your LCD can take either rate. If you're talking about vsync, I'm not sure whether raising it to 75Hz would make a difference or not; I have no experience there. Assuming that turning on vsync actually does show 75 FPS instead of 60 FPS, who can tell the difference anyway? 60 FPS already looks perfectly smooth.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
If you're using DVI, the best refresh rate is the highest the monitor supports. With DVI, LCDs will not suffer blur from the added bandwidth requirements of higher resolution or refresh rate. CRTs will have blurrier text as bandwidth increases, and I assume the effect is similar on LCDs running over VGA, where there will certainly be added strain and stronger analog noise.

Flicker-free refers to the fact that crystals are static until requested otherwise, which is quite irrelevant to how they interpret frames or whether they wait for the sync signal. The screen will update faster at 75 Hz than 60 Hz, even on an LCD. A couple of models have problems with the overdrive function slowing down at higher refresh rates, but the only one that comes to mind at the moment is the Samsung 970P. The ViewSonic VP930b works fine at 75 Hz with no noticeable extra blur.
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
I know that CRTs will also show tearing; it just tends to be less noticeable because the pixel transitions are faster. Tearing occurs when the graphics card outputs a newly rendered frame while the monitor is still drawing the previous one, so part of the old frame and part of the new frame are momentarily displayed together.
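A toy illustration of that race, with characters standing in for scanlines (a made-up single-buffer model, not any real driver API):

```c
/* Toy sketch of how tearing happens: a "scanout" reads the buffer
 * top to bottom, and an unsynchronized "swap" lands midway through.
 * Purely illustrative; rows stand in for scanlines. */
#include <stdio.h>
#include <string.h>

#define ROWS 8

int main(void) {
    char front[ROWS];                 /* what the monitor reads */
    memset(front, 'A', ROWS);         /* old frame: all 'A' rows */

    char displayed[ROWS + 1];
    for (int row = 0; row < ROWS; row++) {
        if (row == 3)                 /* GPU swaps in new frame mid-scan */
            memset(front, 'B', ROWS); /* new frame: all 'B' rows */
        displayed[row] = front[row];  /* monitor keeps scanning */
    }
    displayed[ROWS] = '\0';

    /* Prints AAABBBBB: top of old frame, bottom of new one - a tear. */
    printf("%s\n", displayed);
    return 0;
}
```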

Higher refresh rates will make a difference if the graphics card is outputting fewer frames per second than the refresh rate of the monitor. Vsync will always cause you to lose rendered frames in this situation, as the video card has to wait for the start of the next refresh to output its current frame. That lost time adds up and reduces your overall framerate. Someone running a 60Hz refresh rate with the video card rendering at 40 fps will NOT get the full 40 fps - they'll get something closer to 32 fps. Triple buffering can alleviate this by allowing the video card to keep rendering additional frames even when the monitor is not ready for them.
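A rough sketch of that quantization, assuming simple double buffering where a finished frame has to wait for the next refresh boundary (real numbers vary with frame-to-frame timing):

```c
/* Back-of-envelope sketch of vsync frame-rate quantization at 60 Hz.
 * Assumes classic double buffering: a finished frame must wait for the
 * next refresh boundary before it can be displayed. */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double refresh_hz = 60.0;
    const double interval_ms = 1000.0 / refresh_hz;   /* ~16.67 ms */
    const double render_fps[] = {75.0, 55.0, 40.0, 25.0};

    for (int i = 0; i < 4; i++) {
        double render_ms = 1000.0 / render_fps[i];
        /* Frame completion is rounded up to a whole refresh interval. */
        double displayed_ms = ceil(render_ms / interval_ms) * interval_ms;
        printf("render %5.1f fps -> displayed %5.1f fps\n",
               render_fps[i], 1000.0 / displayed_ms);
    }
    return 0;
}
/* Output: 75 -> 60, 55 -> 30, 40 -> 30, 25 -> 20.  Real frame times vary
 * frame to frame, so measured averages land between these steps, which is
 * how a nominal 40 fps ends up measuring in the low 30s. */
```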

With my CRT I prefer not to use vsync, because it really hurts my framerates and gives some games a laggier feel: the buffering introduces a delay between the frame currently on screen and the input that produced it, since that input occurred one or two frames back in the buffer.




 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: xtknight
Flicker-free refers to the fact that crystals are static until requested otherwise, which is quite irrelevant to how they interpret frames or whether they wait for the sync signal. The screen will update faster at 75 Hz than 60 Hz, even on an LCD.

Thank you. Why do so many people have a hard time understanding this? An LCD's crystals stay in a constant state until a change is requested, so with a static image they are perfectly flicker-free and the refresh rate is irrelevant. With CRTs, the refresh rate is relevant all the time, as the electron guns need to keep repeatedly striking the phosphors to maintain a steady emission of light.

Refresh rate becomes very relevant for LCDs when it comes to fast moving images, especially with vsync on.



 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
doesn't a faster refresh rate on LCDs hurt response time? I'd stick with 60 unless you KNOW that it works perfectly @ 75
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: gersson
doesn't a faster refresh rate on LCDs hurt response time? I'd stick with 60 unless you KNOW that it works perfectly @ 75

Well, if you can't notice the extra blur (whether it occurs or not), it doesn't really matter, does it? ;)
 

Barfo

Lifer
Jan 4, 2005
27,539
212
106
I did notice a drop in 3D game performance when I lowered the refresh rate to 60Hz, so I'm running it at 75Hz and it seems fine. I'm on DVI, btw.
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: wizboy11
Originally posted by: BFG10K
Put it as high as possible.

no matter what kind of display

/thread

Not exactly true for CRTs. CRTs will lose clarity and become blurry if you set the refresh rate too high. On my 930SB there is a noticeable loss in sharpness when moving from 75Hz to 85Hz, although not bad at all. 100Hz looks horrible, however.


 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Originally posted by: Dethfrumbelo
Originally posted by: wizboy11
Originally posted by: BFG10K
Put it as high as possible.

no matter what kind of display

/thread

Not exactly true for CRTs. CRTs will lose clarity and become blurry if you set the refresh rate too high. On my 930SB there is a noticeable loss in sharpness when moving from 75Hz to 85Hz, although not bad at all. 100Hz looks horrible, however.

Alright, let me revise the statement.

Choose the highest one on an LCD (since they don't really get that blurry on DVI), but for CRTs choose the highest one that looks the best.

/thread?