refresh rates on LCDs?

X14

Senior member
Aug 17, 2000
360
0
0
What is the significance of refresh rates on LCD monitors? I read people saying that refresh rates don't apply to LCDs the way they do to CRTs. If you have an LCD, should you use the highest refresh rate available even if it doesn't matter?
 

X14

Senior member
Aug 17, 2000
360
0
0
My LCD gives me a few different refresh rates to choose from, ranging from 60 to 72 Hz. Is 72 any better than 60?
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
The situation is a little confusing. For CRTs, refresh rates matter because higher refresh rates mean lower flicker and consequently lower eye strain. But LCDs don't have any flicker at any refresh rate, so in this sense refresh rate doesn't matter for LCDs. But LCDs have a ghosting problem which CRTs don't have, and ghosting is limited by rise/fall times. The thing is, refresh rates on LCDs are also determined by rise/fall times (refresh rate = 1 second / (rise time + fall time)). So even though you don't need to care about an LCD's refresh rate with regard to flicker, you still need to care about it because it determines the amount of ghosting on the screen. Got it? So higher refresh rates are better on LCDs. All LCDs have very similar refresh rates, but this will be changing shortly with the advent of feed-forward LCDs. (Feed-forward is discussed in one of the other threads and I won't rehash things here.)
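To put rough numbers on that relationship (the figures below are purely illustrative, not from any particular panel), a quick Python sketch:

# Hypothetical panel with a 15ms rise time and a 10ms fall time.
rise_time = 0.015   # seconds
fall_time = 0.010   # seconds

# The rule of thumb above: the fastest full dark-light-dark cycle the panel
# can complete is limited by rise time + fall time.
max_cycle_rate = 1.0 / (rise_time + fall_time)
print(f"Maximum full-transition rate: {max_cycle_rate:.0f} Hz")   # 40 Hz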
 

dszd0g

Golden Member
Jun 14, 2000
1,226
0
0
zephyrprime, that post is a little misleading IMO. The pixel response time of an LCD (also called a rate, even though it is not one), which is what the rise/fall times are generally stated as and which is what determines the ghosting, is a hardware limitation of the LCD itself and is independent of what refresh rate one sets the video card to. I haven't seen the refresh rate the video card is set to make any difference on LCDs, but maybe someone else will have a better answer, as I really don't know.

However, IMO, if one has a 25ms pixel response time LCD, it is the equivalent of a 40Hz monitor as far as ghosting goes (1/0.025s = 40Hz). That doesn't mean it is the same thing as a 40Hz CRT; it is a different measurement. A 40Hz CRT would flicker enough to make a nun go on a killing spree, but an LCD has no flicker at all. I just wouldn't want to game on it. See LCD + gaming = okay? for a more in-depth discussion of all this.

[Edit:

When I thought about it, the refresh rate of the video card may actually make a small difference, since it decreases the amount of time it takes for an update to be sent to the LCD. However, the LCD only responds so fast, so there is no way the LCD is going to keep up with even a 60Hz refresh rate -- well, a 16ms response time LCD will, and they do currently make them down to 12ms, but the standard 25ms and higher (as linked to in this thread) won't. A higher refresh rate may decrease the propagation delay before the pixel responds.
]
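To put numbers on that edit (the response times below are example figures, not measurements), a rough Python check of whether a panel can finish a transition within one frame:

def keeps_up(response_time_ms, refresh_hz=60):
    # Rough check: can the panel complete a pixel transition within one frame period?
    frame_period_ms = 1000.0 / refresh_hz
    return response_time_ms <= frame_period_ms

# The frame period at 60Hz is about 16.7ms.
for rt_ms in (25, 16, 12):
    print(f"{rt_ms}ms panel keeps up with 60Hz: {keeps_up(rt_ms)}")
# 25ms -> False, 16ms -> True (just barely), 12ms -> True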
 

Kingofcomputer

Diamond Member
Apr 6, 2000
4,917
0
0
http://www.hitachidisplays.com/how_monitors/flatpanel.htm

You will also notice that, unlike CRTs, most vendors recommend using a lower refresh rate (60 or 70 Hz) if this setting can provide an acceptable picture with your application. For example, the LC153 will run at 1024 x 768 with a 75Hz refresh rate. However, the picture sometimes looks better at 60Hz, depending upon the application. This is true with all flat panels, and other vendors also suggest using lower settings, which can sometimes attain better pictures and text than higher refresh rates. The reason lower refresh rates work for LCDs is that the liquid crystal molecules dampen flicker, so higher refresh rates are not as important. So you can choose a refresh rate that looks best with your application.
 

adhoc

Member
Sep 4, 2002
86
0
0
There is a difference between refresh rate and pixel response time (yes, even with CRTs). Look Here near/at the end of the thread for more info.
 

adhoc

Member
Sep 4, 2002
86
0
0
X14,

To answer your question, though: setting the refresh rate to 72Hz instead of 60Hz as you stated is not going to gain anything, and may even hinder the monitor's performance... If you are using a digital interface (I don't know if you are or not), you aren't really sending a 60Hz signal to the monitor anyways, no matter what setting you set in Windows.
 

dszd0g

Golden Member
Jun 14, 2000
1,226
0
0
adhoc, you don't think it makes any difference in propagation delay? (I doubt it would be humanly noticeable, but I would think it would make some difference)

God, trying to draw a timing diagram in these forums:


VID_CLK = Refresh rate of the video card
SIG_tp = Propagation delay for signal
SIG = Pixel signal
LCD_r = Rise time propagation delay
DIS = Pixel displayed on screen

VID_CLK = 60Hz

VID_CLK  __|--|__|--|__|--|__|--|__|--|__
SIG      _____|---------------------------
DIS      _______|-------------------------

           |--|  = SIG_tp
              |-| = LCD_r

VID_CLK = 120Hz

VID_CLK  _|-|_|-|_|-|_|-|_|-|_|-|_|-|_|-|_
SIG      ____|----------------------------
DIS      ______|--------------------------

          |--|  = SIG_tp
             |-| = LCD_r


I would think that between a 60Hz and a 120Hz video card refresh rate, the pixel would change about 4.1ms sooner. I haven't actually looked at timing diagrams for LCDs, so this is just a wild guess. I don't think this will manage to stay formatted, and it is probably impossible to get it to stay formatted on everyone's systems because of different fonts and such, but I would hope any CEs or EEs would be able to follow it.
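A rough calculation of where a figure in that ballpark could come from, assuming the only change is the frame period (back-of-the-envelope arithmetic, not measured data):

# Frame periods at the two refresh rates, in milliseconds.
period_60_ms = 1000.0 / 60     # ~16.7 ms
period_120_ms = 1000.0 / 120   # ~8.3 ms

# On average a pixel waits about half a frame period before its update is
# scanned out, so the average improvement from doubling the refresh rate is:
avg_improvement_ms = (period_60_ms - period_120_ms) / 2
print(f"Average earlier update: {avg_improvement_ms:.1f} ms")   # ~4.2 ms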
 

Cadaver

Senior member
Feb 19, 2002
344
0
0
Of course, this all only applies if you're using an LCD via the VGA connector.
It's a moot point if you use a DVI connection. The "refresh rate" is locked at 60Hz.
 

dszd0g

Golden Member
Jun 14, 2000
1,226
0
0
Originally posted by: Cadaver
Of course, this all only applies if you're using an LCD via the VGA connector.
It's a moot point if you use a DVI connection. The "refresh rate" is locked at 60Hz.

Most manufacturers currently seem to use 60Hz. The Digital Display Working Group (DDWG) Digital Visual Interface (DVI) 1.0 specification states that 640x480@60Hz must be supported (Section 2.2.4), but it does use clock recovery (Sections 3.3.1 and 4.7.3). The DVI specification even has some examples of DVI CRTs running at 85Hz, and I have seen some companies working on those, where the DAC is inside the CRT instead of on the graphics card.

So your statement is currently true, but it should be possible for future products to use something other than a fixed 60Hz frequency once LCD response times actually get faster than that.

you aren't really sending a 60Hz signal to the monitor anyways, no matter what setting you set in Windows.

Actually, you are most likely sending a 60Hz signal if you are using DVI -- well, sort of. The actual transmission takes place much faster (up to dual 165MHz from what I read), but the carried signal is 60Hz. If you are using the HD15 (VGA) connector, then it is whatever you have set in Windows.
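As a rough sanity check on those numbers (the blanking overhead is my assumption based on typical timings, and the resolution is just an example):

# Hypothetical 1280x1024 @ 60Hz mode; ~25% extra for horizontal/vertical blanking.
width, height, refresh_hz = 1280, 1024, 60
blanking_overhead = 1.25

pixel_clock_mhz = width * height * refresh_hz * blanking_overhead / 1e6
print(f"Approximate pixel clock: {pixel_clock_mhz:.0f} MHz")   # ~98 MHz

# A single DVI link allows up to a 165MHz pixel clock, so a 60Hz signal like
# this fits on one link with room to spare; dual-link doubles that ceiling.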
 

X14

Senior member
Aug 17, 2000
360
0
0
Thanks everyone. I'm using a DVI connection, so I guess that answers my question; I'll set everything to 60.
 

Bovinicus

Diamond Member
Aug 8, 2001
3,145
0
0
What about when playing games with V-sync enabled? A higher refresh rate might be useful in that situation. However, if the image is degraded as a result, that would counter the effectiveness of V-sync in the first place. Some knowledgeable posters seem to be involved in this thread. What is your take on my stated situation?
 

dszd0g

Golden Member
Jun 14, 2000
1,226
0
0
Originally posted by: Bovinicus
What about when playing games with V-sync enabled? A higher refresh rate might be useful in that situation. However, if the image is degraded as a result, that would counter the effectiveness of V-sync in the first place. Some knowledgeable posters seem to be involved in this thread. What is your take on my stated situation?

I almost always play games with vsync on. I can't stand texture tearing and other visual artifacts that occur when vsync is off. However, vsync does limit the game to that rate. Since things are only displayed so fast on the screen in most games there really isn't much advantage to having the game output more frames than are displayed to the screen. Now some games may limit your movement or movement response time by your fps, so a higher fps may increase your movement rate or response. One won't see the additional movement as it occurs, but it may occur in game and you would see the results of the additional movement. I have seen some people claim that as a reason they disable vsync. I don't know what games that may apply to and I'm not a serious enough FPS player to really care, but it's something to think about if you are. Maybe someone else has more information about this or whether it is complete bs.

Now the same thing applies with a higher refresh rate and LCDs. Does an increased fps in the game help even when it isn't displayed to the screen? Some people say yes.
 

Bovinicus

Diamond Member
Aug 8, 2001
3,145
0
0
I almost always play games with vsync on. I can't stand texture tearing and other visual artifacts that occur when vsync is off. However, vsync does limit the game to that rate. Since things are only displayed so fast on the screen in most games there really isn't much advantage to having the game output more frames than are displayed to the screen. Now some games may limit your movement or movement response time by your fps, so a higher fps may increase your movement rate or response. One won't see the additional movement as it occurs, but it may occur in game and you would see the results of the additional movement. I have seen some people claim that as a reason they disable vsync. I don't know what games that may apply to and I'm not a serious enough FPS player to really care, but it's something to think about if you are. Maybe someone else has more information about this or whether it is complete bs.
That's not what I meant. What I meant was, if you force the LCD to use a higher refresh rate, then games will get higher FPS with V-sync enabled. I always have it enabled too. I was not suggesting that one disables V-sync.
 

dszd0g

Golden Member
Jun 14, 2000
1,226
0
0
Originally posted by: Bovinicus
That's not what I meant. What I meant was, if you force the LCD to use a higher refresh rate, then games will get higher FPS with V-sync enabled. I always have it enabled too. I was not suggesting that one disables V-sync.

Sorry, I was using that as an analogy; I didn't think that was what you meant. Turning vsync off runs the game faster than the video card displays. Upping the refresh rate on the video card runs the refresh rate faster than the LCD displays. See the analogy? Maybe it's not a very good one. A number of the LCD manufacturers' FAQs recommend running at 60Hz (or whatever the native rate for the LCD is, which in some cases is 70Hz or 75Hz).

Philips
Cyco
Mirror

Sorry for the two lesser known brands, a number of the major brands don't even mention refresh rates in their LCD FAQs.
 

adhoc

Member
Sep 4, 2002
86
0
0
dszd0g,

Referring back to your timing diagrams: I'm not sure if those are correct or not; I'll take your word for it. However, the problem I see with that theory is that the technology behind the LCD input system is different from, say, a CRT. There are most likely a number of registers and some memory to hold the serial input from the DVI (and most likely an ADC to convert a D-SUB analog input).

From what I remember, the video card sends clocked data to a TMDS transmitter, which serializes the data from 24 bits of color (RGB), Hsync, Vsync, and a few others that are currently unused onto three high-speed differential transmission pairs, plus a clock. Actually, the transmitter is divided up into three logical encoders (red, green, blue) of 8 bits each (24 bits total). There are two control signals per encoder (6 total) and a global clock. Of the six control signals, only two are used (the others are driven low for compliance and for future support), and these two are Hsync and Vsync.

The TMDS receiver then recovers this data stream and converts it back to 24-bit RGB plus Hsync and Vsync. As far as I understand, the only reason hsync and vsync are required by the spec is that DVI is compatible with VESA standards, and if we had a digital-input CRT, we would need the hsync and vsync signals. LCDs, for better or worse, do not need the hsync and vsync signals (they use the clock for synchronization). This is why I believe changing the frequency in Windows will have no effect if you are using an LCD with a DVI interface...

Now, I'm not sure about analog, because the LCD has to have a circuit to convert the analog signal to digital... I do not know if raising the refresh rate would allow a better sampling rate or not... I haven't looked into it.

Let me know if I am not making any sense!
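If I have the encoding right (TMDS carries 10 serial bits per 8-bit color channel per pixel clock; treat the exact figures as assumptions), the serial rates work out roughly like this:

# Assumed single-link DVI parameters.
pixel_clock_hz = 165e6    # maximum single-link pixel clock
bits_per_channel = 10     # TMDS encodes each 8-bit channel into 10 serial bits
data_pairs = 3            # red, green and blue differential pairs

per_pair_bit_rate = pixel_clock_hz * bits_per_channel   # 1.65 Gbit/s per pair
total_bit_rate = per_pair_bit_rate * data_pairs         # ~4.95 Gbit/s across the link
print(f"Per pair: {per_pair_bit_rate/1e9:.2f} Gbit/s, total: {total_bit_rate/1e9:.2f} Gbit/s")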

 

dszd0g

Golden Member
Jun 14, 2000
1,226
0
0
Originally posted by: adhoc

Now, I'm not sure about analog, because the LCD has to have a circuit to encode the analog singal to digital... I do not know if raising the refresh-rate would allow a better sampling rate or not... I haven't looked into it.

Let me know if I am not making any sense!

Your post seems about right. I am not going to dig up my copy of the DVI specification to nit-pick, as I can tell your post isn't even close to complete and I'm pretty sure it has a few inaccuracies, but the concept is there. However, my timing diagram above was for the case where one is using the analog input. And I wasn't saying that one would get a better sampling rate; I was saying it would decrease the propagation delay slightly (<5ms is definitely not humanly noticeable).

According to the DVI 1.0 spec:

2.2.10 HSync, VSync and Data Enable Required

It is expected that digital CRT monitors will become available to connect to the DVI interface. To ensure display independence, the digital host is required to separately encode HSync and Vsync in the T.M.D.S. channel.

The analog part in DVI-I must also have the hsync and vsync. I think it is pretty clear what the hsync and vsync signals are for in DVI and it has nothing to do with LCDs.

When one is using DVI the refresh rate is locked so changing the refresh rate makes no difference. Even with analog input most LCDs only accept specific refresh rates (some only 60Hz, some will take 70Hz or 75Hz). So we are talking about two different things. Of course refresh rate makes no difference with the DVI input, because as far as I can tell on most cards it doesn't change anything.