120hz lcd best for FPS games?

sim651

Junior Member
Mar 10, 2011
5
0
0
Was wondering if anyone who has experience with both 120Hz LCD monitors and the normal 60-75Hz LCD monitors, and who plays FPS games like CSS, notices a difference? Is it worth the extra money to buy the 3D monitors that run at 120Hz just for this purpose? If not, why, and what monitor would you recommend?
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
Mods: move this to Video Cards and Graphics? Might get more responses there...
 

sim651

Junior Member
Mar 10, 2011
5
0
0
Oops. Well if this isn't the right place for this question can an admin move it to the appropriate spot?
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Honestly you need to make the decision for yourself. Most people on here do indeed see a difference. I personally can't play games at 60fps because it looks like a slideshow and even worse has ridiculous input delay. I won't even use Google Chrome because it scrolls at 60fps.

One time I was reinstalling drivers and didn't restart my computer for a few days. After I did restart, I moved my mouse from the default middle position to the start menu. I honestly thought I had a virus because the movement was so choppy; I had forgotten the monitor was running at 60Hz. Eleven inches. The pointer moved eleven inches on my screen and I knew something was up.

Not everyone can see a difference. My friend cannot tell the difference between 60Hz and 170Hz. Thousands of people on this forum love Fallout 3.
 

Bryf50

Golden Member
Nov 11, 2006
1,429
51
91
I personally can't play games at 60fps because it looks like a slideshow and even worse has ridiculous input delay.
I can't possibly imagine 60fps looking like a slideshow. You must be superhuman.
 

thescreensavers

Diamond Member
Aug 3, 2005
9,916
2
81
Honestly you need to make the decision for yourself. Most people on here do indeed see a difference. I personally can't play games at 60fps because it looks like a slideshow and even worse has ridiculous input delay. I won't even use Google Chrome because it scrolls at 60fps.

One time I was reinstalling drivers and didn't restart my computer for a few days. After I did restart, I moved my mouse from the default middle position to the start menu. I honestly thought I had a virus because the movement was so choppy; I had forgotten the monitor was running at 60Hz. Eleven inches. The pointer moved eleven inches on my screen and I knew something was up.

Not everyone can see a difference. My friend cannot tell the difference between 60Hz and 170Hz. Thousands of people on this forum love Fallout 3.

I can't possibly imagine 60fps looking like a slideshow. You must be superhuman.

^ This?

:hmm: The eye sees a max of 30fps?

But a quick Google turns up http://www.boallen.com/fps-compare.html, and 60fps looks a lot smoother :hmm:
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Slideshow may be too strong a word, but it's definitely not smooth. It sucks that so many things are being optimized for 60Hz nowadays.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
^ This?

:hmm: The eye sees a max of 30fps?

But a quick Google turns up http://www.boallen.com/fps-compare.html, and 60fps looks a lot smoother :hmm:
Dead horse: your eyes do not see frames. Your eyes see changes over time. If a change is fast enough, it cannot be processed as motion, and that threshold will vary person to person, both naturally and by training (playing twitch games all the time). To mimic this, you need one of two things: (1) accurate motion blur (crappy motion blur exists; you'd need, at the least, a real-time game engine to make it work well), or (2) hundreds of frames per second (saturation).

It is logistically simpler to add frames.
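
To make the two options concrete, here's a toy sketch in Python. Nothing in it comes from a real engine; render_dot() and all the numbers are made up for illustration. The point is that option 1 has to know how much time the frame covers so it can smear the object across that interval, while option 2 just presents more sharp samples.

Code:
# Toy illustration of the two options above. render_dot() is a hypothetical
# stand-in for a real renderer; resolution, speed, and sample count are arbitrary.
import numpy as np

WIDTH, HEIGHT = 320, 240

def render_dot(x):
    """Render one sharp frame: a single lit pixel at horizontal position x."""
    frame = np.zeros((HEIGHT, WIDTH))
    frame[HEIGHT // 2, int(x) % WIDTH] = 1.0
    return frame

def sharp_frame(t, speed=600.0):
    """Option 2: one perfectly sharp sample at time t (needs a high frame
    rate before it starts to read as continuous motion)."""
    return render_dot(speed * t)

def blurred_frame(t, dt, speed=600.0, samples=8):
    """Option 1: approximate motion blur by averaging sub-frame samples across
    the frame interval dt -- which means the engine must know how long the
    frame will cover."""
    subs = [render_dot(speed * (t + dt * i / samples)) for i in range(samples)]
    return np.mean(subs, axis=0)

# At 60Hz (dt = 1/60 s) the blurred frame smears the dot along its path,
# hinting at the motion between frames; the sharp frame leaves those gaps
# for your brain to fill in.
f_sharp = sharp_frame(0.5)
f_blur = blurred_frame(0.5, dt=1 / 60)
print(np.count_nonzero(f_sharp), "lit pixels sharp vs",
      np.count_nonzero(f_blur), "lit pixels blurred")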
 
Last edited:

simonizor

Golden Member
Feb 8, 2010
1,312
0
0
Dead horse: your eyes do not see frames. Your eyes see changes over time. If a change is fast enough, it cannot be processed as motion, and that threshold will vary person to person, both naturally and by training (playing twitch games all the time). To mimic this, you need one of two things: (1) accurate motion blur (crappy motion blur exists; you'd need, at the least, a real-time game engine to make it work well), or (2) hundreds of frames per second (saturation).

It is logistically simpler to add frames.

You're pretty much contradicting yourself there. FPS is a measure of how many frames are changed per second, and you are saying that our eyes see change, so they should be able to see frames being changed. What you said makes no sense. The amount of movement that our eyes see can be measured in frames per second, as can any other amount of movement.
 

Caerid

Junior Member
Dec 21, 2009
18
0
0
I'm surprised this argument about whether the human eye can see more than 30fps still goes on. As Cerb was trying to explain, our eyes perceive changes over time... they don't do shutter motion capture like a camera would (although you could argue that if you go to a more micro level).

The bigger issue seems to stem from people saying human eyes can't perceive more than 30fps. But if the object you are viewing is moving at a fast pace, those 30fps captures wouldn't give a smooth depiction of what's happening.

http://www.jazzil.se/forum/60vs24fps.avi.zip

That's a good example of how much of a difference fps makes... and I would argue that effect is still noticeable up to 120, maybe even a little beyond in games that are faster paced. If you can't tell the difference between the 24fps video and the 60fps one, then I don't know what to say. There's a good reason why a lot of sports are filmed at 60fps.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
You're pretty much contradicting yourself there. FPS is a measure of how many frames are changed per second, and you are saying that our eyes see change, so they should be able to see frames being changed.
No, changes over time. Think analog signaling: there are rising and falling levels, not ons and offs.

If you actually move something slowly in front of you, you have a nice gradual change. You detect motion. The other sensors in your eyes can be used to confirm this and complement it. If you could see it as images, they would be blurry. Your eyes' response is fairly slow, and the processing of all the information together in your brain is what creates what you perceive as a clear image of a thing moving.

Now, move that something very fast. You don't see motion. A very fast fan, for instance, simply looks translucent.

On the opposite side of things, if you take something from a stop to moving extremely fast to a stop, again, it will almost look like it instantly moved... but you will usually see a little bit of it over the course of the movement, giving your brain enough hints. The faster this movement, the more perfectly focused static images you would need to perceive motion (in the absence of added motion blur).

What you said makes no sense. The amount of movement that our eyes see can be measured in frames per second, as can any other amount of movement.
Er, no. If and only if you record or render enough frames to saturate your ability to detect that these images are not being instantly swapped out (typically, with peripheral vision not accounted for, that's somewhere in the low 20s), and those frames have enough motion blur to mimic what your eyes might otherwise see, then there is enough information that you don't see that there are separate frames.

In games, the above is not true, due to the lack of motion blur, and that we are often close enough to the monitor to utilize peripheral vision. Every image is like a perfect photograph. Watch a high-profile sports event, or average new action movie, and you can see this, too: each frame looks great paused, but it is difficult to perceive motion. Then watch an old movie--old war movies are good for this, as they tend to have a lot of jarring movements--and you can see the opposite, with fuzzy still frames, but where you can perceive fine details and even read small print when the movie is playing. Recent animated CGI movies will tend to be on one side or the other of this. While you don't see each and every frame, you end up with a few frames effectively mashed on top of one another, but still without the information that your brain wants to be able to see that things are moving. It's a fitting analogy to say that you see the dots, but not lines connecting them.

In games, the above is not true, due to the lack of motion blur and the fact that we are often close enough to the monitor to utilize peripheral vision. Every image is like a perfect photograph. Watch a high-profile sports event, or the average new action movie, and you can see this, too: each frame looks great paused, but it is difficult to perceive motion. Then watch an old movie--old war movies are good for this, as they tend to have a lot of jarring movements--and you can see the opposite, with fuzzy still frames, but where you can perceive fine details and even read small print while the movie is playing. Recent animated CGI movies will tend to be on one side or the other of this. While you don't see each and every frame, you end up with a few frames effectively mashed on top of one another, but still without the information that your brain wants in order to see that things are moving. It's a fitting analogy to say that you see the dots, but not the lines connecting them.

So far, there have been very few attempts to create a game engine that is either truly real-time (varying rendered detail per frame to meet a known deadline, like maybe 15ms/frame) or able to predict how long the current and next frames will take to render (which needs to be known for accurate motion blur with varying frame rates); one of those would be required to implement good motion blur.

However, there is another way to get around that problem: render enough frames quickly enough that motion blur, like good films have, is not necessary.

Finally, there are natural differences among us, and also it is something that can be trained. Playing fast-paced games all the time, on systems that can put out high frame rates, can make you more sensitive to frame rate dips and input lag.

Also, 120Hz monitors can reduce perceived input lag even when you are actually rendering fewer frames than that, especially when combined with vsync.
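
To sketch what that deadline-driven idea might look like, here is an assumption-laden toy in Python, not anything from a shipping engine; render_scene() and the control numbers are invented. Each frame, you adjust a detail level so the next one is likely to land inside a fixed budget. On the vsync point, the arithmetic is simple: a frame that just misses a refresh waits up to about 16.7ms at 60Hz but only about 8.3ms at 120Hz.

Code:
# Rough sketch of a deadline-driven render loop: adjust a detail level so each
# frame is likely to finish inside a ~15ms budget. render_scene() is a
# hypothetical placeholder that just sleeps to simulate work.
import time

FRAME_BUDGET = 0.015   # ~15 ms per frame
detail = 1.0           # 0.0 = lowest detail, 1.0 = highest

def render_scene(detail):
    """Fake renderer: more detail costs more time."""
    time.sleep(0.005 + 0.015 * detail)

for frame in range(30):
    start = time.perf_counter()
    render_scene(detail)
    elapsed = time.perf_counter() - start

    # Simple control: back off detail if the budget was blown, creep it back
    # up when there is headroom. A real engine would also need this kind of
    # frame-time estimate to place motion blur across the frame interval.
    if elapsed > FRAME_BUDGET:
        detail = max(0.0, detail * 0.9)
    else:
        detail = min(1.0, detail + 0.02)

    print(f"frame {frame:2d}: {elapsed * 1000:5.1f} ms, detail -> {detail:.2f}")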
 
Last edited:

Nvidiaguy07

Platinum Member
Feb 22, 2008
2,846
4
81
I won't even use Google Chrome because it scrolls at 60fps.

Is this really true? I just upgraded to 120Hz and definitely noticed how much better scrolling looked. I am using Google Chrome. I tried scrolling in Firefox after seeing this post and it looks the same to me...
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Scroll by clicking the middle mouse button. I use that religiously, which is why I hate Chrome.
 

Nvidiaguy07

Platinum Member
Feb 22, 2008
2,846
4
81
Scroll by clicking the middle mouse button. I use that religiously, which is why I hate Chrome.

I see. I never scroll like that, so it doesn't affect me. Scrolling with the bar works fine. I wonder why it's like that though.
 

LiuKangBakinPie

Diamond Member
Jan 31, 2011
3,903
0
0
Again. LCDs don't have a refresh rate. They don't have an electron gun. They only have one to set to be compatible with your GPU. A GPU will only send the frames once it gets a signal from the LCD. They have to run in sync, because if they go out of sync you'll get tearing and ghosting issues. That's why we use vsync, but it has a problem: on a lot of occasions it drops the fps to half the refresh rate instead of just under it. That can be solved by triple buffering. So 120mhz means your LCD can receive 120fps before it's ready again, and if you don't use triple buffering, running above 60fps won't make a difference. But if you run it at 60mhz and 30fps you can see a difference. So at 120mhz you have more headroom to play with before you notice the difference when the frames drop. The reason we only see 120mhz and not much higher is that the connection bandwidth is limited. DVI is limited to 60. HDMI and dual DVI can do 120mhz.
To work out the real emulated refresh rate an LCD can do, use
1000 / response time
but manufacturers set it lower due to bandwidth limits and to give themselves headroom with its performance.
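
Worked out with a few illustrative response times (these are made-up figures, not measurements of any particular panel), that rule of thumb gives:

Code:
# Worked example of the 1000 / response-time rule of thumb above.
# Response times here are illustrative only.
for response_ms in (16, 8, 5, 2):
    print(f"{response_ms} ms response -> {1000 / response_ms:.0f} Hz")
# 16 ms -> 62 Hz, 8 ms -> 125 Hz, 5 ms -> 200 Hz, 2 ms -> 500 Hz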
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
LCDs do have a refresh rate.

A GPU sends frames out to LCDs the exact same way it did to CRTs: line by line, continuously.

120 Hertz, not mhz

DVI has the bandwidth to pump out thousands of frames per second, depending on the resolution (rough numbers at the end of this post).

That calculation would be the maximum ghost-free refresh rate.

Stop talking about things you don't know about. It degrades the forums.
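
For a sense of scale on the bandwidth point, here is some back-of-the-envelope arithmetic. The 165 MHz single-link and 330 MHz dual-link pixel clocks are the DVI spec limits; the 25% blanking overhead is only a rough assumption, so treat the results as ballpark figures.

Code:
# Rough maximum refresh rate a DVI link can drive at a given resolution.
# 165/330 MHz are the single/dual-link TMDS clock limits; the 25% blanking
# overhead is an assumption, so these are ballpark numbers only.
SINGLE_LINK_HZ = 165_000_000
DUAL_LINK_HZ = 330_000_000
BLANKING_OVERHEAD = 1.25  # approximate extra pixels per frame for blanking

def max_refresh(width, height, pixel_clock):
    pixels_per_frame = width * height * BLANKING_OVERHEAD
    return pixel_clock / pixels_per_frame

for w, h in ((640, 480), (1280, 1024), (1920, 1080)):
    print(f"{w}x{h}: ~{max_refresh(w, h, SINGLE_LINK_HZ):.0f} Hz single link, "
          f"~{max_refresh(w, h, DUAL_LINK_HZ):.0f} Hz dual link")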
 

Nvidiaguy07

Platinum Member
Feb 22, 2008
2,846
4
81
"IF YOU CAN'T CONVINCE THEM, CONFUSE THEM" ...........yup

@LiuKangBakinPie Seriously, I had no idea what you were saying. Grammar and content. Work on that.
 

yakapo99

Junior Member
Oct 4, 2005
9
0
0
Hope I'm asking this in the right place... I'm considering getting a 120Hz monitor. First, a little background. Have you ever read someone post "if you can tell the difference between 120Hz and 60Hz, then fluorescent lights must drive you crazy"? Well, fluorescent lights drive me totally crazy. I wear sunglasses in stores and it's not a fashion statement. There's more to it, but that's another story.

Normally FPS games give me headaches too. Right now my gaming is limited mostly to RTS games. I've noticed that the choppier a game is, the more likely I am to get a headache. I'm hoping that switching to a monitor that's capable of more than 100fps would help. Right now I'm using an i5-2500K @ 4.6GHz and a 5850. I'll probably upgrade to a 6950 or CrossFire 6850s when I get a 120Hz monitor to help hit the high fps I would need. From what I understand, 5850s don't support 120Hz monitors anyway.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
If most fluorescent lights drive you crazy, and not just a couple that are failing in the back of Walmart, it's safe to say 120Hz will help you out substantially.

5850s DO support 120Hz. ATi had multiple driver problems getting 120Hz support working on quite a few of their cards, but that is mostly behind them. The problem stems from ATi cards not being able to force resolutions. DalNonStandardModes helps a bit, but it doesn't fix the root problem. No one cares anymore (or ever did), but if you want to take your monitor to the limits, you have to go Nvidia.