Guys - help me out. Interesting debate regarding LCDs

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
I have a discussion going on at prad.de, a very big German site where they test LCD monitors. It's a well-known site.

Anyway, I am talking with one of the head guys from there, and it's about the vertical frequency of LCD monitors, which (as we all know) is set to 60Hz; there are only one or two LCDs that are actually capable of displaying more than 60Hz at a given resolution.

The argument (on his side) goes like this:

1) A graphics card doing 200 FPS in, e.g., Crysis would be totally useless for people on an LCD/TFT, since the monitor can only "give out" 60Hz anyway.

My argument is that there is indeed a big difference when playing at 120+ or 200+ FPS, a difference which is very real and noticeable, even if my LCD itself is only "capable" of displaying a 60Hz vertical frequency.

2) I also brought up the fact that we have the option of "Vsync on" or "Vsync off"... which allows any game engine and graphics card to run as fast as possible, without a forced sync to the monitor's limited rate.

3) Some people ask whether certain LCDs "can handle XYZ vertical frequency"... and I say it's totally irrelevant, since there is no such thing as "vertical frequency" for LCDs, unlike on CRTs. The picture on an LCD is static, so it doesn't really matter what vertical frequency it's at (assuming you play with Vsync off).

They say an LCD that can do 75Hz at 1440 would be better than 99.999% of all the other LCDs that can only do 60Hz. I say this thinking is nonsense.

4) Question... how does the vertical frequency of LCDs actually come into play? Is it REALLY nonsense to play a game at 200+ FPS because (according to some people) we wouldn't see more than 60 frames/second, since the LCD can't display more than that due to its hardware limits?

If I play with "Vsync off"... I am playing 100% independently of whatever vertical frequency the monitor is capable of... but the guy(s) still argue it doesn't matter. They say the monitor can only display 60 frames/second and there is no visible advantage.

How does the fact that an LCD usually has a lower vertical frequency than a CRT really translate into real life and the playability of games??

Georg.

 

ajaidevsingh

Senior member
Mar 7, 2008
563
0
0
1. The word is "tearing". You are right: basically, unless v-sync is enabled, your FPS can be higher than the refresh rate. But this does not apply to LCDs, haha. In LCDs only the response rate matters. LCDs have separate illumination (at 200Hz) and imaging (at the response time) components. So an LCD with 60Hz can have a response time of, say, 6ms and get an FPS rate of more than 60.

2. See 1.

3. You are right but even if Vsync is ON it would not matter much as UB would be the same.

4. See 1.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I'm pretty sure LCDs can tear, given tearing is caused by the GPU sending a new frame before the display has finished displaying the old frame. Even if the LCD buffers the image first, it's feasible the GPU could send a new frame before the buffering is complete, thereby leading the LCD to buffer two or more partial images.

As for refresh rate, technically LCDs have none, but they still have a response time which dictates how quickly their pixels can change. So if it's 16ms, then the LCD is limited to roughly 60 full frames per second.
 

nevbie

Member
Jan 10, 2004
150
5
76
Originally posted by: ajaidevsingh
1. The word is "tearing". You are right: basically, unless v-sync is enabled, your FPS can be higher than the refresh rate. But this does not apply to LCDs, haha. In LCDs only the response rate matters. LCDs have separate illumination (at 200Hz) and imaging (at the response time) components. So an LCD with 60Hz can have a response time of, say, 6ms and get an FPS rate of more than 60.

I'm pretty sure that is complete BS.


Here is how I think it works:

When an LCD works at 60Hz, it means that it updates its picture every 1/60 of a second. That is approximately 16.7ms of waiting time between the "put data on screen" commands.

Response time means the delay between the moment when a pixel receives the new color signal and the moment when the transition to the new color has ended. This response time depends on what the previous color was and what the requested new color will be. So in a typical game situation, certain pixels change to the new frame a bit faster than the ones that get a more difficult transition.

Overdrive and similar technologies probably 'boost' the transition by telling the pixels to change to more extreme colors first (depending on the transition), and then relax the request during the color-changing process, so that they end up reaching the desired color faster. Overshooting would mean that the control mechanism lets the pixels change a bit more than desired, so that the pixels actually show the more extreme colors for a short moment before "going back" to the desired color.

I suppose input lag means that the data the video card sends to the monitor is kept in some kind of buffer inside the monitor (perhaps so that something can be done to it, like "dynamic contrast" and whatever), and that the monitor either does not show the newest data it has (multiple buffers) or there is simply a delay somewhere in the process. With multiple buffers, the input lag would only take certain values like 0ms, 17ms and so on.

Now I believe that the people who say "you get only 60 frames, it cannot show more!!!1" are correct. But they miss one point: the source data is different between a game that runs at 60fps (Vsync on) and one that runs at some other value like 200fps. When my CRT has a refresh rate of 85 or 100 or whatever, I can definitely see differences in the flow of games even with higher-than-refresh-rate fps values. The pictures ARE different, even if there are still 60 of them per second (or whatever refresh rate you have).

For example, if the game runs at 200fps and your monitor eventually outputs image data at 60fps, the corresponding times are:

The 200fps game updates the framebuffer on the video card at: 0ms, 5ms, 10ms, 15ms, 20ms, 25ms...
The 60fps monitor shows these moments of the game, assuming instant delivery:
0ms: "0ms"
16.7ms: "15ms"
33.3ms: "30ms"
...and so on.

Eventually, as they are not synchronized, the output will not be steady, and there will be small "jumps" (in 17ms of time you see less OR more than 17ms of game happening).
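If it helps, here is a tiny Python sketch of that idea (my own simplification, assuming the monitor simply shows whichever game frame was finished most recently):

import math

# Sketch of the timing argument above (idealized, hypothetical numbers):
# a 60Hz monitor shows whichever game frame was finished most recently.
REFRESH_MS = 1000.0 / 60.0            # ~16.7ms between monitor updates

def sampled_game_times(game_fps, refreshes=6):
    frame_ms = 1000.0 / game_fps
    times = []
    for i in range(refreshes):
        t = i * REFRESH_MS
        # newest game frame completed at or before this refresh
        latest = math.floor(t / frame_ms + 1e-9) * frame_ms
        times.append(round(latest, 1))
    return times

print(sampled_game_times(60))    # [0.0, 16.7, 33.3, 50.0, 66.7, 83.3] -> steady ~16.7ms steps
print(sampled_game_times(200))   # [0.0, 15.0, 30.0, 50.0, 65.0, 80.0] -> jumps of 15ms or 20ms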

And there is one more point: the game may behave differently at different fps values; for example, the default "OSP" physics in Quake 3 depend on your fps value. Certain other games may also behave a tiny bit differently at differing fps values.

And I believe it's these small differences, plus the fact that the monitor is not synchronized with the game output, that generate the differences in game "flow" at different fps values, even when using a lower-Hz monitor.

edit:
so in my theory, if your LCD gets the data from the video card, let's say, 6ms after it displayed the last image, the time it takes for each individual pixel to change to the desired color of the new image is:
(16.666...ms - 6ms) + the individual pixel response time. If we assume that every pixel finalizes its change in 5ms, the new image will be on your screen in (16.666...ms - 6ms) + 5ms ≈ 15.7ms.

Just my theory though.
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
I see tearing big time. If Vsync is enabled, most programs stutter or pause (very annoying).

Also, the backlighting for LCDs - based on CCFL - is driven at a very high frequency, so it has no perceptible flicker. With a CRT, the refresh rate also affects how often the light source - phosphors on a shadow mask - is actually turning on and off. A phosphor with a longer persistence will appear to flicker less; however, the afterglow can appear to leave tails on small, fast-moving objects on dark backgrounds. The solution was to ratchet up the refresh rate to 85Hz or higher. I am the sensitive type and need 100Hz or better before it looks "solid". The problem is that higher refresh rates and resolutions require more video bandwidth, and the connectors as well as the cable become very important. Any reflections result in ghost images displayed to the right of vertical lines, which gives everything a fuzzy, out-of-focus appearance. Not all CRTs were equal either, and some could only display a crystal-clear image at 60-75Hz, which meant sensitive users (me!) had to deal with either blurriness or flicker (or use a lower resolution). Thank goodness LCDs eliminated this!
 

will889

Golden Member
Sep 15, 2003
1,463
5
81
^Higher refresh rates on a CRT require an output device (a monitor with correctly loaded INF files) capable of higher video bandwidth, provided the input source is also capable of said refresh rate.
 

nevbie

Member
Jan 10, 2004
150
5
76
Does tearing originate from reading the graphics card's memory in an "unlocked" manner, or from reading the monitor's buffer that stores the image (in an unlocked manner)?

If it's the latter, then my theory would have to be amended so that the screen does not update as one object, but rather each row of pixels (or even each single pixel) updates in a controlled order. If it's the former, then my theory stays unknown for that part. (What size is one update unit in an LCD?)
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
4) Question... how does the vertical frequency of LCDs actually come into play? Is it REALLY nonsense to play a game at 200+ FPS because (according to some people) we wouldn't see more than 60 frames/second, since the LCD can't display more than that due to its hardware limits?

If I play with "Vsync off"... I am playing 100% independently of whatever vertical frequency the monitor is capable of... but the guy(s) still argue it doesn't matter. They say the monitor can only display 60 frames/second and there is no visible advantage.

The other guys are correct. The video card is always outputting information at a fixed rate (the refresh rate) regardless of what type of monitor it's connected to. The 60hz refresh rate places an upper limit on how high the actual, displayed framerate can go, and this is in addition to any limitations created by the response time. An LCD (say 19") with a 60hz rate will never display more than 60fps even if it had a 0ms response time, since the video card will always be sending exactly 1280x1024x60 pixels per second through the DVI port.

This is not necessarily 60 internal frames by the way, since it depends on how you define a frame. If the video card's framerate goes over the refresh rate, it will drop pixel data from some frames in order to match up with the overall 60hz rate, which results in the tearing effects you see.

Whether or not all this is actually noticeable depends on how sensitive you are to these things. I think it most certainly affects the smoothness in certain games. A few 19" LCDs support 75hz (without dropping frames) and there is a noticeable improvement in just the mouse movements in Windows.

People usually talk about refresh rates on CRTs in terms of the flickering, but they actually also affect the framerate in this way.
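To put rough numbers on that (a back-of-the-envelope sketch in Python; it counts only active pixels and ignores the blanking intervals a real DVI link also carries):

# The scanout rate is fixed by resolution x refresh rate, no matter how fast
# the GPU renders internally. (Active pixels only; blanking ignored.)
width, height, refresh_hz = 1280, 1024, 60

pixels_per_second = width * height * refresh_hz
print(f"{pixels_per_second:,} pixels per second, regardless of render fps")   # 78,643,200

for render_fps in (30, 60, 200):
    displayed = min(render_fps, refresh_hz)
    print(render_fps, "rendered fps ->", displayed, "full refreshes shown per second")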
 

recoiledsnake

Member
Nov 21, 2007
52
0
0
Originally posted by: CP5670
4) Question... how does the vertical frequency of LCDs actually come into play? Is it REALLY nonsense to play a game at 200+ FPS because (according to some people) we wouldn't see more than 60 frames/second, since the LCD can't display more than that due to its hardware limits?

If I play with "Vsync off"... I am playing 100% independently of whatever vertical frequency the monitor is capable of... but the guy(s) still argue it doesn't matter. They say the monitor can only display 60 frames/second and there is no visible advantage.

The other guys are correct. The video card is always outputting information at a fixed rate (the refresh rate) regardless of what type of monitor it's connected to.

Sorry, that is just plain wrong. The only time the video card does that is when Vsync is turned on. The whole phenomenon of tearing happens because the video card sends many more frames than the monitor can display at its refresh rate. Hence the monitor displays the top part of one frame and the bottom part of the next frame at a single instant, resulting in a 'tear' in the middle. Video cards can and do push more fps than the refresh rate can handle.

The rest of your post is correct in noting that the monitor cannot display more frames than its refresh rate allows.

 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Originally posted by: recoiledsnake
Sorry, that is just plain wrong. The only time the video card does that is when Vsync is turned on. The whole phenomenon of tearing happens because the video card sends many more frames than the monitor can display at its refresh rate. Hence the monitor displays the top part of one frame and the bottom part of the next frame at a single instant, resulting in a 'tear' in the middle. Video cards can and do push more fps than the refresh rate can handle.

No, vsync has nothing to do with this. Think about it this way. The whole reason that it's cutting off parts of certain frames (which is essentially what tearing is) is that the frames in total contain more information than is supported by the refresh rate. The video card needs to remove pixel data from them in order to match up with the refresh rate.

If the monitor actually displaying the image was causing the tearing as you say, we would only be seeing it on CRTs and not LCDs, which is of course not what happens.

The way current video cards operate, information is sent at a fixed rate that does not depend on any internal framerate, which is why the refresh rate makes a difference even in 2D things like Windows.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I've personally seen no difference between 60hz and 120hz LCDs side-by-side. That's not to say that a difference does not exist, just simply that I haven't noticed one.

The other problem with 120Hz LCDs is that they would require a video card that could output, say, Crysis at 60fps (and preferably 120fps) for them to make any discernible difference.

IMO there are many more important factors in deciding which LCD you want.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
There are no 120hz LCDs out there. The ones marketed as such aren't 120hz at all, but just 60hz with a black frame thrown in between every normal frame.

The best refresh rates are attained by the 17/19" 1280x1024 LCDs (maybe also the 1440x900 ones, not sure), which support 75hz, but it seems that many of those just discard every fifth frame. However, a few do actually show the full 75hz, and I find the difference to be pretty noticeable on those. Crysis is a bit of an exception on current hardware but most games can run a lot faster, especially older ones.
 

nevbie

Member
Jan 10, 2004
150
5
76
Originally posted by: CP5670
No, vsync has nothing to do with this. Think about it this way. The whole reason that it's cutting off parts of certain frames (which is essentially what tearing is) is that the frames in total contain more information than is supported by the refresh rate. The video card needs to remove pixel data from them in order to match up with the refresh rate.

If the monitor actually displaying the image was causing the tearing as you say, we would only be seeing it on CRTs and not LCDs, which is of course not what happens.

The way current video cards operate, information is sent at a fixed rate that does not depend on any internal framerate, which is why the refresh rate makes a difference even in 2D things like Windows.

Cutting off parts of frames? Needs to remove pixel data? That's quite vague.

I thought it goes like this:
There is a frame buffer in the graphics card. It holds the pixel data of what is currently on screen. Programs write to this memory location when they want to change what is shown on the screen. I guess the video card sends the contents of the frame buffer to the monitor-side buffer according to the refresh rate setting (set on the graphics card by drivers or something). Now, tearing would occur if a program writes to the frame buffer at the same time as the buffer is being sent to the monitor. And VSync would be a setting that locks the frame buffer so that the program can't write to it while the transfer is in progress.

And if the video card sends more than one frame per 16.666...ms, it would just overwrite the buffer at the monitor end. The monitor just updates when it wants to... but it seems to lock its internal buffer (otherwise there would be tearing with VSync on).

Would that be correct? =P
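In toy form, what I mean (completely made-up sizes and timing, just to show where the "tear" would land):

# Toy model: the scanout reads the framebuffer row by row; if the game swaps in
# a new frame mid-scan (VSync off), the top of the output comes from the old
# frame and the bottom from the new one.
ROWS = 10

framebuffer = ["old"] * ROWS          # what the game last finished drawing
scanned_out = []

for row in range(ROWS):
    if row == 4:                      # game finishes a new frame mid-scanout
        framebuffer = ["new"] * ROWS  # with VSync off it overwrites immediately
    scanned_out.append(framebuffer[row])

print(scanned_out)
# ['old', 'old', 'old', 'old', 'new', 'new', ...] -> the old/new boundary is the tear.
# With VSync on, the swap would wait until the scanout finished, so each refresh
# would come entirely from one frame.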
 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
If I am getting way more than 60FPS in a game consistently, I always enable vsync. It looks so much cleaner. Tearing sucks.


If you are not getting more than 60FPS consistently, then the framerate is forced to a factor of 60 (or a sum of factors), I think?

10, 15, 20, 30, 45, 50?


Whatever. Just play with the settings that look the best to you. You can get brain numbness from arguing with people on the internet too much. ;)
 

The Bakery

Member
Mar 24, 2008
145
0
0
So...

With a monitor refreshing at 60Hz, the viewable performance ceiling for ANY card is 60fps?

If that's true, then no one should be concerned about breaking 60+ fps unless they have a 60Hz+ refresh rate on their monitor (or unless they lack other means to massage their ego).

I don't honestly recall seeing many frame rates above 60 on benchmarks for recent games and GPUs - most (from memory) hover between 20 and 50, depending on the game and card.

So... realistically, the only people this affects are enthusiasts trying to achieve top-end performance on a monitor with a lower refresh rate.

I just think it's interesting that the performance junkies haven't made this a major issue before fumbling thousands of dollars away to break 60fps.

That is to say, if it's true to begin with.
Seems simple though - you can't put 10 gallons of water in an 8-pound bucket. But we're talking refresh rates and visual data, not water and buckets.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: The Bakery
if it's true
It's true.

Many enthusiasts won't even use an LCD because of limitations like this. Others just like to crank up the AA/AF and run at 2560x1600. :beer:
 

birdog

Member
Jul 11, 2001
65
0
0
Hz = frequency in times per second, so fps directly relates to Hz and vice versa (e.g. 100Hz is the same as 100fps).

BUT

If your video card is only doing 40 FPS but your monitor is 60Hz, then it will still show 60 frames every second, so 20 of the 60 frames it shows will be repeats.

On the other hand, if your video card is doing 80FPS but your monitor is still only 60Hz, then it still shows 60 frames, so 20 of the images generated by the video card are wasted/not displayed.
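In rough numbers (a simple sketch that assumes evenly paced frames and ignores tearing):

# Bookkeeping for the two examples above (evenly paced frames assumed):
refresh_hz = 60

for card_fps in (40, 80):
    repeated = max(0, refresh_hz - card_fps)   # refreshes that re-show an old frame
    wasted = max(0, card_fps - refresh_hz)     # rendered frames never shown in full
    print(card_fps, "fps ->", repeated, "repeated,", wasted, "wasted per second")
# 40 fps -> 20 repeated, 0 wasted per second
# 80 fps -> 0 repeated, 20 wasted per second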


Vsync (I'm not sure on this, but the following is my understanding):

Your monitor could refresh right in the middle of a frame, so it would show the last half of the 39th frame (for example) and the first half of the 40th frame, causing what they call tearing. So Vsync uses buffering (single, double, ...) so that a whole frame is displayed every time. Buffering means storing each image in a buffer, which can cause a delay.


ms is generally a measurement for ghosting only.

The ms rating on a monitor is the time it takes for one pixel to go from white to black. If a monitor has a 2ms white-to-black response time, then it is a very good quality panel.

2ms = 2/1000 seconds = 0.002 seconds, so frequency = 1/period = 1/0.002s = 500Hz.

So when LCD specs talk about response time, they aren't talking about the refresh rate of the monitor, which is given in Hz. The response time of an LCD has to do with ghosting, and there is some sort of formal test that gives those response times in ms, such as 2ms. The difference between an 8ms, 5ms or 2ms screen is not very noticeable. It used to be a big issue when response times were on the order of 30ms+, but the marketing hype has carried on.

Most 2ms monitors use a technology called RTA, which fakes the response time and produces undesirable graphical defects in order to cheat in the aforementioned test. These LCD screens usually achieve around 5ms when RTA is turned off.

Screens with a response time of 8ms or less are generally 6-bit panels, which means they have 6-bit red, green and blue channels rather than the standard 8-bit. This means they can only display approximately 262,000 colours and use dithering techniques, such as those used in newspaper printing, to fake the rest, whereas 8-bit panels support 16.7 million colours. Note: manufacturers will often claim 16 million colours on 6-bit panels, even though this is achieved through dithering. Normally, true 8-bit panels will say 16.7 million colours, but I have seen this label increasingly on 6-bit panels.
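The colour counts follow straight from the bit depths (quick sanity check):

# Colours implied by panel bit depth (per-channel bits, three channels):
for bits in (6, 8):
    colours = (2 ** bits) ** 3
    print(f"{bits}-bit panel: {colours:,} colours")
# 6-bit panel: 262,144 colours   (the "16 million colours" claims rely on dithering)
# 8-bit panel: 16,777,216 colours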
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Originally posted by: nevbie
Cutting off parts of frames? needs to remove pixel data? that's quite vague.

I thought it goes like this:
There is a frame buffer in the graphics card. It holds the pixel data of what is currently on screen. Programs write to this memory location when they want to change what is shown on the screen. I guess the video card sends the contents of the frame buffer to the monitor-side buffer according to the refresh rate setting (set on the graphics card by drivers or something). Now, tearing would occur if a program writes to the frame buffer at the same time as the buffer is being sent to the monitor. And VSync would be a setting that locks the frame buffer so that the program can't write to it while the transfer is in progress.

And if the video card sends more than one frame per 16.666...ms, it would just overwrite the buffer at the monitor end. The monitor just updates when it wants to... but it seems to lock its internal buffer (otherwise there would be tearing with VSync on).

Would that be correct? =P

Yeah, that is basically right. If a new frame is written to the buffer before the existing frame has been fully sent, the card will output part of the old frame and part of the new one, which results in tearing at the boundary between the two pieces of the frames. The end result is that the video card discards some part of each frame in the process, but the number of actual pixels sent per second, controlled by the refresh rate setting, remains constant.

Seems simple though - you can't put 10 gallons of water in an 8-pound bucket.

Exactly. :D
 

DerekWilson

Platinum Member
Feb 10, 2003
2,920
34
81
i still owe bfg some timing diagrams wrt what happens between the framebuffer and the screen (i.e. when things get written out over TMDS, how monitors usually treat the data they receive, etc.)

I haven't forgotten -- i've just been busy ... i might actually do an article on it when i finally get the info and go through it ...

a lot can change depending on when and how fast the framebuffer is read and sent to the display wrt tearing, how many frames you see, and how fast you see those frames.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
>>>
The video card is always outputting information at a fixed rate (the refresh rate) regardless of what type of monitor it's connected to. The 60hz refresh rate places an upper limit on how high the actual, displayed framerate can go,
>>>
I don't think that the video card is dependent on the monitor's refresh rate... this would defy everything I know about graphics cards. It would also make the "Vsync" option pointless.
It's clear that a monitor can't *display* more than its refresh rate, but you're basically implying that a fixed vertical refresh caps the output of a graphics card.

If that were the case, then we would not have tearing, since tearing is a result of the uncapped output of the card.

Also... let's say I am playing some older game capable of 200+ FPS (HL2) on my 60Hz LCD... you're saying 140+ of the calculated frames are wasted, since I can only "see" 60 FPS anyway... and the poor graphics card is shooting 200+ FPS out of the framebuffer, 140 of them for nothing??

It's extremely interesting to hear people's opinions on this, btw. Very interesting debate.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Originally posted by: flexy
I don't think that the video card is dependent on the monitor's refresh rate... this would defy everything I know about graphics cards. It would also make the "Vsync" option pointless.
It's clear that a monitor can't *display* more than its refresh rate, but you're basically implying that a fixed vertical refresh caps the output of a graphics card.

If that were the case, then we would not have tearing, since tearing is a result of the uncapped output of the card.

Well, there are two different things you're referring to as output here. One is the GPU's internal frame generation, which is normally uncapped and writes frames into the frame buffer as fast as it can (unless Vsync is on). The other is what the video card's DSP actually sends through the DVI port, by reading off the frame buffer. This is what is capped at the refresh rate.

Also... let's say I am playing some older game capable of 200+ FPS (HL2) on my 60Hz LCD... you're saying 140+ of the calculated frames are wasted, since I can only "see" 60 FPS anyway... and the poor graphics card is shooting 200+ FPS out of the framebuffer, 140 of them for nothing??

That is essentially correct. I think the best way to think of it is in terms of the pixels though. It's a bit ambiguous to say that 140 frames are dropped, because it's still sending small pieces of most of those frames, but chopping and changing them. However, the rate at which pixels are sent is always fixed.
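As a rough illustration of those two rates (made-up assumptions: evenly spaced rendered frames, and a scanout that always reads the newest framebuffer contents row by row):

# With Vsync off, the render loop fills the framebuffer as fast as it can while
# the scanout engine reads it out at a fixed 60Hz, so each refresh ends up
# stitched together from pieces of several rendered frames.
refresh_hz, render_fps, rows = 60, 200, 1024

refresh_ms = 1000.0 / refresh_hz
frame_ms = 1000.0 / render_fps
row_ms = refresh_ms / rows                      # time to scan out one row

frames_seen = set()
for row in range(rows):
    t = row * row_ms                            # moment this row is read out
    frames_seen.add(int(t // frame_ms))         # which rendered frame is in the buffer

print(len(frames_seen), "different rendered frames contribute to one 60Hz refresh")
# -> 4: slices of a few frames get shown; the rest of each frame is discarded.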

I haven't forgotten -- i've just been busy ... i might actually do an article on it when i finally get the info and go through it ...

Great idea. There is often a lot of confusion over these issues.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Same thing, just that it's now the video card's RAMDAC that becomes capped.

In theory, it might be possible to have a variable refresh rate system (at least with DVI LCDs), so the video card outputs frames at the same rate as they're generated and the monitor also displays them as fast as it gets them, but no current cards or port standards work that way.