Do frame rates over 60 matter on non-120hz LCDs? The per-pixel refresh rate is still 60hz, right?
I need to get in touch with some local people with 120hz or IPS LCDs to see if it's worth upgrading my Samsung 226BW (I lost the panel lottery, not that I was even aware of it when I bought it).
I could just go on the recommendation of a serious gamer as to whether they prefer 120hz or IPS, since the two are still mutually exclusive. 120hz with an LED backlight might be really nice too.
120hz is the better choice if gaming is your concern. And fps above 60 on a 60hz monitor is still noticeable if you're playing FPSs.
I'm not so sure about that, but I'll be the first to admit that I'm not an expert.
My understanding is that pixel response is essentially the same between 60Hz and 120Hz; the advantage of 120Hz is smoother-looking motion, which the panel achieves primarily by inserting interpolated frames. The perception is of a faster pixel response, when in effect it is simply an extra interpolated frame inserted to reduce judder artifacts.
It's more visually pleasing, but the pixel response isn't necessarily any quicker.
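To make that concrete, here's a toy Python sketch of frame insertion, assuming a plain 50/50 blend (real 120Hz sets use motion-compensated interpolation, and `interpolate`/`to_120hz` are made-up names for illustration):

```python
# Toy illustration: synthesize an in-between frame by averaging two
# real frames. The panel shows an extra synthesized frame; it doesn't
# actually respond any faster.

def interpolate(frame_a, frame_b):
    """Average two frames pixel by pixel (frames as lists of 0-255 values)."""
    return [(a + b) // 2 for a, b in zip(frame_a, frame_b)]

def to_120hz(frames_60hz):
    """Insert one interpolated frame between each pair of source frames."""
    out = []
    for a, b in zip(frames_60hz, frames_60hz[1:]):
        out.extend([a, interpolate(a, b)])
    out.append(frames_60hz[-1])
    return out

print(to_120hz([[0, 0], [100, 200], [200, 240]]))
# [[0, 0], [50, 100], [100, 200], [150, 220], [200, 240]]
```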
This is why I was extremely puzzled by a poster in a previous thread who complained about a Radeon 4870 or 4890 and said he was going back to his 8800 (I think it was an 8800) because it got 140hz vs the Radeon's 120hz. I think he's CPU bound, but I asked him what monitor he had and all I heard was crickets chirping. Most people gaming today have LCDs that are locked at 60hz, though I believe some monitors can go into an "overclocked" state at 72hz. They are starting to sell 120hz LCD monitors now, though.
I am always a bit mystified by those who demand 100+ fps when not running 120Hz LCDs. Perhaps flooding your monitor with frames faster than it can display them hides display latency? Or they see screen tearing and think it's "low fps"?
Don't mice poll at 100hz or so? Wouldn't it be beneficial to have your frames synced to that too?
IPS panels are nice. Most HD televisions are IPS or MVA, so to get a feel for the technology, just look at some TVs. The main selling point of both is viewing angles.
LED backlighting isn't necessarily better quality (unless it's RGB LED); it's just lower power. CCFL has a wider color gamut than white LEDs.
IPS and especially MVA are going to have more input delay than TN, though, since there has to be some image processing to reduce ghosting.
So wouldn't it make sense (for the most part) to always have VSync enabled on LCDs, then?

For basically everyone, VSync with triple buffering is probably the best solution. If your video card is RAM bound, though, triple buffering may lower performance, and people who want the absolute maximum performance out of their system may prefer to leave it off.
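For anyone curious how that plays out, here's a toy Python model of a triple-buffered swap under VSync (a simplified sketch; real swap-chain behavior is driver- and API-specific, and the buffer names are made up):

```python
# Toy model of triple buffering under VSync: with two back buffers the
# GPU always has somewhere to render, so it never stalls waiting for
# the vertical blank the way double buffering can. The newest finished
# frame wins; an older unshown frame just gets recycled.

from collections import deque

free = deque(["buf_a", "buf_b"])  # back buffers available for rendering
front = "buf_c"                   # buffer currently being scanned out
completed = None                  # newest fully rendered, unshown frame

def render_frame():
    global completed
    target = free.popleft()       # a spare buffer is always available
    # ... draw the scene into `target` here ...
    if completed is not None:
        free.append(completed)    # drop/recycle the older unshown frame
    completed = target

def on_vblank():
    global front, completed
    if completed is not None:     # show the newest frame, reuse old front
        free.append(front)
        front, completed = completed, None

for _ in range(3):
    render_frame()                # GPU keeps busy between vblanks
on_vblank()                       # display picks up the newest frame
```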
I always try to maintain a MIN of 60fps, which means your AVG fps usually has to be around 100.
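Putting made-up numbers on that rule of thumb in Python (hypothetical frame times, just to show why the average has to sit well above the minimum):

```python
# Hypothetical frame-time capture (ms): the average can sit near
# 100 fps while the single worst frame decides whether you ever
# dip below 60 fps.

frame_times_ms = [8, 9, 10, 9, 16, 8, 10, 9, 15, 9]  # made-up numbers
avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = 1000 / max(frame_times_ms)
print(f"avg {avg_fps:.1f} fps, worst-frame {min_fps:.1f} fps")
# avg 97.1 fps, worst-frame 62.5 fps
```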
Does vsync and buffering matter a lot for strategy/RPG games if your frame rate is already decent? I thought tearing didn't really become noticeable unless it's fast-paced stuff with a lot of screen distance covered in a short time (such as in an FPS)?

You tell us.
Some older LCD monitors will go up to 75hz. Other than reducing input delay or letting the game engine run optimally, the answer is no: framerates over your refresh rate do not matter.
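A rough way to see the input-delay part, under a simple assumed model (no vsync, no render pipelining):

```python
# At each 60Hz scanout the display grabs the most recent completed
# frame; the faster you render, the "fresher" that frame is. On
# average the newest frame is half a frame interval old.

refresh_ms = 1000 / 60
for fps in (60, 120, 300):
    avg_age_ms = (1000 / fps) / 2
    print(f"{fps:>3} fps: newest frame is ~{avg_age_ms:.1f} ms old "
          f"at each {refresh_ms:.1f} ms scanout")
# 60 fps: ~8.3 ms, 120 fps: ~4.2 ms, 300 fps: ~1.7 ms
```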
I have a 2233RZ and FW900. I didn't reply because the thread was actually going places and didn't feel like going into another pointless driver debate.
I've decided to keep using the 4890, as the OJB port of CS:S changed the netcode too much for my friends and me to handle, and we didn't feel like learning the game over again. Because of the OJB engine I made the complete switch to the 2233RZ, since I don't play first-person shooters anymore and the LCD is much nicer for everything that isn't raw FPS or color accuracy.
The 2233RZ doesn't overclock worth crap compared to the FW900. Overclocking the monitor was the only reason the 8800GT displayed more FPS than the 4890. Since the 8800GT was never mine in the first place (I revived my friend's card by baking it), and I wouldn't see much gain with the LCD, I gave him the card back since he needed it more.
To get back on topic:
Some game engines benefit from being run at a higher framerate, e.g. Source. You also eliminate some input delay by drawing the scene faster. For most people it is negligible, but if you take the time to reduce input delay in other places, those milliseconds add up.
The default USB polling rate is 125hz. Some mice targeted at gamers will set their polling rate to 500hz or 1000hz automatically.
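The same arithmetic applied to polling, as a quick sketch (assuming the worst case is waiting one full polling interval and the average is half of one):

```python
# Input latency contributed by USB mouse polling alone.

for rate_hz in (125, 500, 1000):
    interval_ms = 1000 / rate_hz
    print(f"{rate_hz:>4} Hz: worst ~{interval_ms:.1f} ms, "
          f"avg ~{interval_ms / 2:.1f} ms")
# 125 Hz: worst ~8.0 ms, avg ~4.0 ms
# 500 Hz: worst ~2.0 ms, avg ~1.0 ms
# 1000 Hz: worst ~1.0 ms, avg ~0.5 ms
```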