Question on 120Hz vs FPS

kamikazekyle

Senior member
Feb 23, 2007
538
0
0
I tried Googling, but it seems like all the answers are generic "120 Hz vs 60 Hz" or are on gaming sites that are blocked here at work.

Now that some 23" 1080p 120Hz monitors are skimming just over the $300 mark, I'm kinda toying with the idea of trying one out (especially since my HP IPS panel is annoying the hell outta me). I suppose I'm willing to sacrifice IPS quality since I do a lot of gaming, and 120Hz seems to be highly recommended. I can always run my other IPS panels as secondaries. I'd like a U2711 but $799 on sale + a GPU upgrade is a little out of the league at the moment :p

If I'm getting, say, 40-50 FPS on average without vsync in a game and don't plan to do 3D, will a 120Hz monitor offer any noticeable improvement in motion quality? Basically, if the GPU isn't pushing >60fps, is there a point?

From other threads I've read around here I've heard arguments both ways. I wish I could compare in person, but the only 120Hz monitors I have access to are only used for static image 3D stereo, not for actual motion use. Playing around on an XP desktop, stuff did seem a lot smoother and the mouse was more responsive. I didn't have any IPS panels nearby to compare image quality, but the loss in IQ didn't seem too bad compared to my HP (of course my HP is kinda screwy, but that's another rant I've already been over). By the way, those 120Hz monitors I installed were the 22" Viewsonic 1680x1050s.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
On older CRT monitors, I could swear I'd see the refresh rate of the screen when it was near the 60Hz mark; as soon as you hit 75~85Hz, I couldn't anymore.

On my current LCD, even at 60Hz I don't notice a thing going up to, say, 75Hz.
 

kamikazekyle

Senior member
Feb 23, 2007
538
0
0
I agree with you on the CRT. 60Hz was certainly flickery. I actually used active shutter glasses back in the day with a 60Hz refresh on a CRT. That meant each eye only saw 30Hz, and that equaled PAIN.

On LCDs there's obviously no flickering, so it's a slightly different beast. The general consensus is that on an LCD 120Hz is better than 60Hz when it comes to fluidity of motion, especially for gamers. I'm just curious whether you need your system putting out framerates equal to (or an even fraction of) the 120Hz mark.

I did enjoy the brief time I had on a 120Hz monitor using the desktop, though I dunno if I could justify going from an IPS panel (even if it is an annoying beast) to a TN just for mostly static desktop use. Hence the gaming FPS question :)

Edit: bunnyfubbles answered my question here. It looks like the effects are minimal unless you are pushing a good chunk of FPS. So even if I did get a 120Hz monitor, I'd probably have to give up settings or get a new GPU anyway.
 
Last edited:
Nov 26, 2005
15,197
403
126
If I'm getting, say, 40-50 FPS on average without vsync in a game and don't plan to do 3D, will a 120Hz monitor offer any noticeable improvement in motion quality? Basically, if the GPU isn't pushing >60fps, is there a point?

3D on a 120Hz, what's that?! Pffff, that was the last thing on my mind when I bought my Viewsonic 120Hz vx2265wm. I came from a 2ms Viewsonic and the improvement was nice... very smooth gameplay, it took away some input lag, and my eyes are more at ease. All on a slow game called Unreal Tournament III, lol
 

kamikazekyle

Senior member
Feb 23, 2007
538
0
0
3D on a 120Hz, what's that?! Pffff, that was the last thing on my mind when I bought my Viewsonic 120Hz vx2265wm. I came from a 2ms Viewsonic and the improvement was nice... very smooth gameplay, it took away some input lag, and my eyes are more at ease. All on a slow game called Unreal Tournament III, lol

I don't really do much in the way of arena or fast-paced FPS anymore. Anything FPS that I play nowadays is usually something like Oblivion or Fallout. Otherwise it's mostly 3rd person stuff. And the only online gaming I do is MMOs.

I might get the ASUS VG236HE from Amazon. At least there, if I don't like it, I'm only out return shipping. It seems like if I want to go either of my potentially desired routes (U2711 or 120Hz) I'll probably need a new video card. I also had the idea of buying a 32" plasma, wall mounting it, and moving my desk back a bit. I have a swing-mounted monitor already I could use for daily work, then push it out of the way to game on the plasma. But finding a 32" plasma is hella hard, and anything bigger than that would be pointless.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
On older CRT monitors, I could swear I'd see the refresh rate of the screen when it was near the 60Hz mark; as soon as you hit 75~85Hz, I couldn't anymore.

On my current LCD, even at 60Hz I don't notice a thing going up to, say, 75Hz.

It's because the two are fundamentally different.

Refresh rate on a CRT is the rate at which the CRT repaints the image onto the screen, whereas refresh rate on an LCD is the rate at which the image is updated.

You can notice the refresh rate on a CRT @60Hz because it's effectively a very fast strobe; think of a flip book that you can't stop flipping, even if it's just a depiction of something static.
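A rough back-of-envelope sketch of the timing behind that point (an editorial illustration, not something from the thread; the numbers are just 1000 ms divided by the refresh rate):

```python
def refresh_period_ms(refresh_hz: float) -> float:
    """Time between screen updates, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 120):
    print(f"{hz:>3} Hz -> one refresh every {refresh_period_ms(hz):.1f} ms")

# 60 Hz  -> ~16.7 ms between repaints; on a CRT each repaint is a brief flash,
#           so this is a visible strobe
# 75 Hz  -> ~13.3 ms; the strobe becomes hard to see, as Arkadrel describes
# 120 Hz -> ~8.3 ms; an LCD holds the old frame until the new one arrives,
#           so there is no flicker either way, just more frequent updates
```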
 

LiuKangBakinPie

Diamond Member
Jan 31, 2011
3,903
0
0
LCDs don't have a refresh rate in the CRT sense. They don't have electron guns and don't need to redraw the picture the way a CRT does, but you still see a refresh rate in Windows so the monitor stays compatible with your GPU.
Now you'll see 120Hz or 60Hz because the connection limits it to that: single-link DVI can only carry 1080p at 60Hz, whereas dual-link DVI/HDMI can carry 120Hz.
Basically the GPU will only send the images through once it receives a signal from the LCD. The refresh rate it emulates is there to keep in sync with the GPU; if they're out of sync, the frames will overlap each other, leading to tearing and such. That's why we use vsync so often with LCDs these days. Vsync caps the framerate at your refresh rate, 60Hz or 120Hz, but it has the knack of cutting your fps to half of your refresh rate whenever the card can't keep up, i.e. 30fps or 60fps. At 30fps you can see the difference, and if you went overkill on the card you'll feel you wasted your money... To combat that you can use triple buffering, which runs just below the refresh rate because the GPU uses an extra buffer to give the LCD time for the signal.
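A minimal sketch of the vsync fps-halving effect described above, assuming a fixed render time per frame and classic double buffering (the function names and numbers are hypothetical, just to show the mechanics):

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh

def vsync_double_buffer_fps(render_ms: float) -> float:
    """With vsync + double buffering, a finished frame can only be shown on a
    refresh boundary, so the rate snaps down to refresh/1, refresh/2, ..."""
    refreshes_per_frame = math.ceil(render_ms / REFRESH_MS)
    return REFRESH_HZ / refreshes_per_frame

def triple_buffer_fps(render_ms: float) -> float:
    """With triple buffering the GPU keeps rendering into a spare buffer,
    so you get roughly your raw framerate, capped at the refresh rate."""
    return min(1000.0 / render_ms, REFRESH_HZ)

for render_ms in (14.0, 20.0, 25.0):
    raw = 1000.0 / render_ms
    print(f"raw {raw:5.1f} fps -> vsync {vsync_double_buffer_fps(render_ms):5.1f} fps, "
          f"triple buffered {triple_buffer_fps(render_ms):5.1f} fps")

# raw  71.4 fps -> vsync 60.0, triple buffered 60.0
# raw  50.0 fps -> vsync 30.0, triple buffered 50.0  (the half-refresh drop)
# raw  40.0 fps -> vsync 30.0, triple buffered 40.0
```

On a 120Hz panel the same drop lands at 60fps instead of 30fps, which is the point being made.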

Now you see what happens if you buy a high-end GPU without looking at your monitor: at 1080p with a 6950 or 570 etc. you see something like 30-50fps. At 120Hz you'll see higher fps, which is better because you have more headroom when the card has to work a scene really hard.

So whenever you buy a GPU: first your native resolution -> a GPU that can power it -> then a CPU that can keep up with the GPU, RAM that can feed the CPU, and disks that can keep up.
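As a rough sanity check on the connection-limit point above, here is some hedged pixel-clock arithmetic for 1080p at 60Hz vs 120Hz against the usual published DVI limits (the timing and link figures are standard reference numbers, not something stated in this thread):

```python
SINGLE_LINK_DVI_MHZ = 165.0     # max TMDS clock for single-link DVI
DUAL_LINK_DVI_MHZ = 2 * 165.0   # dual-link roughly doubles the bandwidth

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: int) -> float:
    """Pixel clock = total pixels per frame (including blanking) * refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# Standard CEA timing for 1080p is 2200 x 1125 total pixels per frame.
for hz in (60, 120):
    clk = pixel_clock_mhz(2200, 1125, hz)
    print(f"1080p{hz}: ~{clk:.0f} MHz pixel clock | "
          f"single-link ok: {clk <= SINGLE_LINK_DVI_MHZ} | "
          f"dual-link ok: {clk <= DUAL_LINK_DVI_MHZ}")

# 1080p60  -> ~149 MHz: fits on single-link DVI
# 1080p120 -> ~297 MHz: needs dual-link DVI (or a newer HDMI/DisplayPort spec)
```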
 

kamikazekyle

Senior member
Feb 23, 2007
538
0
0
Now you see what happens if you buy a high-end GPU without looking at your monitor: at 1080p with a 6950 or 570 etc. you see something like 30-50fps. At 120Hz you'll see higher fps, which is better because you have more headroom when the card has to work a scene really hard.

So whenever you buy a GPU: first your native resolution -> a GPU that can power it -> then a CPU that can keep up with the GPU, RAM that can feed the CPU, and disks that can keep up.

That's pretty much what I do. I was budgeting this system out as a duplicate of my old system (Q6700, 285GTX) to install at my girlfriend's house, but with modern components. So I had a goal of 40fps average at max settings, no AA, at 1920x1200. I've pretty much exceeded that goal with my 460 GTX and its paired Athlon.

Rift right now is the most demanding game I play, hanging around 40-60 fps without vsync in most areas at some custom settings. That's why I was wondering whether I'd see any apparent or subjective smoothness increase or reduction in ghosting. There are some older games I load up that certainly exceed 60 fps.

Well, I still have a day or two to toy with the idea.
 

Emulex

Diamond Member
Jan 28, 2001
9,759
1
71
The GeForce 430 sends a dual image (1920x1080 * 2) using HDMI 1.4 so it can be switched really fast on the TV, instead of sequentially alternating frames or half-resolution. Since the 3D is rendered on the GPU, I don't believe it requires double the horsepower to give the effect of depth?
 

LiuKangBakinPie

Diamond Member
Jan 31, 2011
3,903
0
0
The GeForce 430 sends a dual image (1920x1080 * 2) using HDMI 1.4 so it can be switched really fast on the TV, instead of sequentially alternating frames or half-resolution. Since the 3D is rendered on the GPU, I don't believe it requires double the horsepower to give the effect of depth?

No, it's limited by your connection. 120Hz / 120fps is the max you will get at any resolution on an LCD.

The actual refresh rate an LCD can emulate you can work out as
1000 / response time.
E.g., if it has an 8ms response time:
1000 / 8 = 125Hz, or 125fps.
But the connections limit it to 60Hz or 120Hz, and manufacturers like to play it safe, so they set it lower.
The GPU will not send anything until the LCD sends the signal. When it does, the GPU flips its buffers: as one clears, the second takes its place. Now if the LCD receives it while they're out of sync (the GPU doesn't know the LCD's limitations), you get frames overlapping each other. That's the tearing we normally see.
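For what it's worth, here is just the arithmetic from the post above (the 1000 / response-time formula is the poster's rule of thumb, not an official spec):

```python
def max_transitions_per_second(response_ms: float) -> float:
    """How many full pixel transitions fit into one second, per the rule above."""
    return 1000.0 / response_ms

for response_ms in (8.0, 5.0, 2.0):
    print(f"{response_ms:.0f} ms response time -> "
          f"{max_transitions_per_second(response_ms):.0f} transitions per second")

# 8 ms -> 125, 5 ms -> 200, 2 ms -> 500; the refresh rate you actually get is
# still whatever the scaler and connection support (60 Hz or 120 Hz here)
```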
 

kamikazekyle

Senior member
Feb 23, 2007
538
0
0
Just saw the Radeon 5870 deal on the Hot Deals forum. I'm now seriously considering grabbing two of those for crossfire and then maybe trying out that Asus.

Edit: Meh. My HP now has significant brightness differences between the left 3/4 and right 1/4, on top of the bluish hue on the left side and a reddish one on the right. It seems to be getting worse. My Dell SP2309 has better color accuracy now, and that's kinda blah even for a TN (at least it's a consistent blah). I went ahead and splurged on a 5870 crossfire setup and I think I'll give that Asus a whirl. If I don't like the Asus, I'm only out a return shipping fee, and the 5870 crossfire setup should be OK to drive a U2711 or U3011 setup if I really wanna go all out. And I still have my three NEC 20MWGX's to fall back on for IPS panels...they're pretty godlike in image quality. Just a bit smaller and lower resolution than I'd like for gaming.
 
Last edited:

LiuKangBakinPie

Diamond Member
Jan 31, 2011
3,903
0
0
Just saw the Radeon 5870 deal on the Hot Deals forum. I'm now seriously considering grabbing two of those for crossfire and then maybe trying out that Asus.

Edit: Meh. My HP now has significant brightness differences between the left 3/4 and right 1/4, on top of the bluish hue on the left side and a reddish one on the right. It seems to be getting worse. My Dell SP2309 has better color accuracy now, and that's kinda blah even for a TN (at least it's a consistent blah). I went ahead and splurged on a 5870 crossfire setup and I think I'll give that Asus a whirl. If I don't like the Asus, I'm only out a return shipping fee, and the 5870 crossfire setup should be OK to drive a U2711 or U3011 setup if I really wanna go all out. And I still have my three NEC 20MWGX's to fall back on for IPS panels...they're pretty godlike in image quality. Just a bit smaller and lower resolution than I'd like for gaming.

Why not get the 6950 2GB version rather than messing with a 5870, which has given all sorts of problems since day 1? Are you going for a 3-way monitor setup?
 

kamikazekyle

Senior member
Feb 23, 2007
538
0
0
I jumped on the 5870 deal. The 6950 is only marginally faster than a single 5870, yet a lot more expensive (again, compared to the deal). When I can get a 5870 CF setup for just a bit more than a single 6950 and smoke its performance, well, there ya go. If the 5870 weren't on sale, it might be another story.

I also wound up getting an Asus 120Hz monitor. It's pretty safe to say that I love 120Hz now, even just at the desktop. Gaming is certainly smoother as well, even at <60fps. I still hate the TN panel and the monitor does have some backlight bleed (which seems to be getting better?), but color-wise it's not too far off from my IPS. I still need to tweak it some, but I dunno if I can go back to 60Hz after playing around with 120. The 1920x1080 resolution at 23" isn't bad. I do notice the lost pixels, but the pixel density is a smidgen better, so it balances out a bit.