
30-60 vs 60 vs 60+ FPS: what is the truth?


nextJin

Golden Member
Apr 16, 2009
1,848
0
0
At least you can read and comprehend, unlike that other fool.

If you do not care about the science behind it, point taken; simply skip those posts. Others may want to know the science.

As for the troll tag, to each his own. BTW, try to keep your posts at a professional level.
You can't get around the fact that we are discussing video card rendering in real time, not cinema-style recording. Wasting your time talking about recording a PC screen with a 30fps camera is one of the dumbest arguments I've seen outside P&N. The bottom line is that the OP is discussing frame rates relative to video game rendering.
 

angevil

Member
Sep 15, 2012
29
0
0
Didn't read the whole thread, but a few points:

1) The human eye can see well over 200FPS given the right conditions. Our vision is based more so on contrast and movement, but you can certainly put frames well over 60FPS to use.

2) Fluidity (or lack thereof) is extremely subjective and is the root cause of a lot of the discussions/arguments around here. Some people see it and are bothered by it, others can't and aren't, some can see it and aren't bothered by it.

3) Just because your monitor can't display more than 60/120FPS doesn't mean those extra frames won't be put to use. Higher FPS render side and server side can make a game "feel" more fluid even though it doesn't LOOK more fluid.

4) Rendering below your refresh rate is easily detectable by most people. That's why Vsync exists - tearing drives some people nuts (it doesn't bother me, however).

5) To me, it doesn't matter how "smooth" an image is if it looks like crap in the first place. That's why I have an IPS monitor. But that is just one example of the many contrasting opinions on this subject.
These are all valid points, which I have observed since I started PC gaming 15 years ago. I personally can make out individual frames at 60 fps, which is my monitor's refresh rate. If you pay attention during a melee attack animation in a 3D game, you will see some ghosting of the previous frame.

I notice a big difference from 30 to 60 fps, and while I have never seen a 120 Hz monitor, I expect that fluidity at 120 fps on a 120 Hz monitor can be better than at 60 fps on 60 Hz. When I had a CRT monitor I could quickly tell the difference between 60 and 75 Hz, 75 and 85 Hz, 85 and 100 Hz, and even between 75 and 80 Hz. I know that because I "overclocked" my last CRT from 75 Hz to 80 Hz at its maximum resolution, and I noticed the improvement in fluidity.

It is laughable that people say the human eye cannot detect more than 24 or 30 fps. I am not exaggerating about noticing differences on a CRT; it is not a placebo effect. For example, I "overclocked" the CRT because I compared 75 Hz to 85 Hz in games and 85 Hz felt more fluid, but I also wanted to play at the maximum resolution, which ran at 75 Hz, so I pushed it to 80. I did this AFTER I compared the two.

The point is that I did not do it "for the sake of it"; I really noticed how 85 Hz is better than 75 Hz. I did not want to decrease the monitor's lifespan or put up with the various issues the video card drivers had, since my monitor was not supported properly. The fluidity I got from 5 extra Hz was worth the hour of tweaking.

Again, it is not a placebo effect; it is real.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
This is still going on? Wow.

Someone would have to have a serious vision disorder not to see the differences here:

http://boallen.com/fps-compare.html

http://frames-per-second.appspot.com/

(In the second one, be sure to turn off motion blur in the two examples. Also pick the soccer ball as it really shows a difference.)
Click those links while at 30fps. Is it fluid?

Click those links while at 120fps. Is it fluid?

Enough said.

Done beating a dead horse. If you have not comprehended it by now, you never will. Ignorance is bliss. Carry on.
 

Pia

Golden Member
Feb 28, 2008
1,563
0
0
if you do not care about the science behind it, point taken. simply skip those posts. for other - they may want to know the science.
As if you understand the science. :D

Say we have a first-person shooter with a 90 degree FOV on a 1920px wide screen, and we are making a half-turn (180 degrees) in two seconds at a constant rate, i.e. 90 degrees per second. And let's say we have absolutely perfect rendering, including perfect temporal antialiasing. Then, at 120FPS, every one of our perfectly rendered frames will blur the picture horizontally by 16 pixels, and at 60FPS by 32 pixels. Detail smaller than that is destroyed. The 60FPS blur is immediately apparent and clearly unacceptable if we want the result to be indistinguishable from reality. The blur at 120FPS is still quite high and could be observable. And this is with unrealistically optimistic assumptions, which underestimate the frame rate needed.
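Pia's arithmetic can be sketched as a few lines of code (my own illustration, not anything from the thread; it assumes a simplified linear degrees-to-pixels mapping, whereas real perspective projection is nonlinear):

```python
# Horizontal image shift per frame during a constant-rate turn.
def pixels_per_frame(fov_deg, screen_px, turn_deg, turn_time_s, fps):
    px_per_deg = screen_px / fov_deg    # 1920 / 90 ≈ 21.3 px per degree
    deg_per_s = turn_deg / turn_time_s  # 180 / 2 = 90 degrees per second
    return px_per_deg * deg_per_s / fps # pixels the scene moves each frame

print(pixels_per_frame(90, 1920, 180, 2, 120))  # 16.0 px at 120 FPS
print(pixels_per_frame(90, 1920, 180, 2, 60))   # 32.0 px at 60 FPS
```

Detail finer than the per-frame shift is smeared away by the (assumed perfect) temporal antialiasing, which is the point of the post.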
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I notice a difference between 30 and 60 but after around 45, I have trouble deciphering the frame-rate differences.

For example: If I could use SGSSAA instead of multi-sampling, would game in the 40's and 50's over-all with some sustained minimums in the 30's, in a single player game.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
As if you understand the science. :D

Say we have a first person shooter with 90 degree FOV on a 1920px wide screen, and we are making a half-turn in two seconds at a constant rate. And let's say we have absolutely perfect rendering, including perfect temporal antialiasing. Then, at 120FPS every one of our perfectly rendered frames will blur the picture horizontally by 16 pixels, and at 60FPS by 32 pixels. Detail smaller than that is destroyed. The 60FPS blur is immediately apparent and clearly unacceptable if we want the result to be indistinguishable from reality. The blur at 120FPS is still quite high and could be observable. This is with unrealistically optimistic assumptions which underestimate the frame rate needed.
You mention blur. But can you clarify what you mean?

1) Blur can occur because of a limitation of our eyes. Like trying to see the spinning blades of a helicopter: persistence of vision makes them blurred. If we had better eyeball hardware, we could see the individual blades, but we are not in the Matrix yet.
2) Blur also can occur because the software artificially generates a blurring to trick our minds into thinking motion is occurring, when it's actually just a blurred image. The computer intentionally provides an image that is specifically blurred for us. So you could have a perfectly rendered image, at full resolution, that includes a blur applied to it. Like going into photoshop, and applying a blur filter, then displaying that blurred thing perfectly.
3) Blur can also occur because the monitor or antialiasing cannot provide the intended image. For a general example, if you stretch/enlarge an image, the computer can try to make it look better by interpolating/blurring, so it's a compensation/compromise.

When you say blur occurs, I wonder if you are talking about the 3rd situation? Or something else?
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
I can tell apart 120fps@144Hz and 144fps@144Hz on a 144Hz LCD (LightBoost turned off for this test), because of uneven motion effects. You can test this with a GTX 680 and a Source engine game. Turn on VSYNC (for this particular test) and compare fps_max 120 versus fps_max 144. You'll see perfect smoothness at 144fps, but some slight stutter at 120fps (when the monitor is set to 144Hz).

For impulse-driven displays, you can tell if (fps equals Hz) versus (fps not equals Hz) even well beyond 60Hz. Even when people tested medical CRT (2048x1536@85Hz) at 240fps@240Hz (640x480), you could just about tell 120fps@240Hz apart from 240fps@240Hz. (Yes, there were some high-end pricey medical CRT's that could sync at 240Hz at low resolutions!)

For impulse-driven displays, it is the same double-edge effect as 30fps@60Hz, but the distance between double-edges is halved at 60fps@120Hz for the same motion speed, and quarter the distance between edges at 120fps@240Hz. The double-edge effect of half framerate vs refresh rate, only disappears when fps == Hz.

Your eyeballs are continuously moving while tracking a fast moving object. The frame samples need to be at the right location when the frame is displayed, or the image is in a different location.

So the answer is YES --
For information, see Science & References.
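The double-edge arithmetic above can be sketched as follows (my own illustration under a simple assumption: on an impulse display running fps = Hz/2, each frame is flashed twice, and a tracking eye sees the two flashes offset by the distance the eye moved in one refresh period):

```python
# Separation between the two "edges" of a repeated frame on an
# impulse-driven display, for an eye tracking at a given speed.
def double_edge_separation_px(eye_speed_px_s, refresh_hz):
    # One refresh period elapses between the two flashes of the same frame.
    return eye_speed_px_s / refresh_hz

speed = 960  # px/s, the PixPerAn tracking speed mentioned elsewhere in the thread
for hz in (60, 120, 240):
    print(hz, double_edge_separation_px(speed, hz))
# 60 Hz -> 16.0 px apart, 120 Hz -> 8.0 px, 240 Hz -> 4.0 px
```

This matches the post: the separation halves at 60fps@120Hz and quarters at 120fps@240Hz, and vanishes only when fps equals Hz (each frame is flashed once).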
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
You mention blur. But can you clarify what you mean?
He's describing eye-tracking-based motion blur on a sample-and-hold display. This is described scientifically in Science & References. Impulse-driving using flicker (e.g. CRT or LightBoost) shortens the frame samples stroboscopically and sharpens the motion.

With a CRT (or an LCD with LightBoost enabled), I can even count single pixels moving at 960 pixels/second in PixPerAn. With a CRT or a LightBoost LCD, I can also read text at a PixPerAn speed of 30, unlike the 7-8 readable on most LCDs.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
But, I told him. For the human eye, it's impossible to feel any difference while playing over 60 FPS.
This is false.
Give your friend this web address: blurbusters.com/references

That's why there is V-Sync, to limit the game's FPS to the Hz of the monitor. Turning off V-Sync would only waste processing on the video card, with no performance difference.
For many people this may be true, but do not forget "input lag".
Rendering more frames means frames are fresher when finally delivered to the display.
60fps means frames may be 1/60second stale (16ms).
120fps means frames may be 1/120second stale (8ms).
240fps means frames may be 1/240second stale (4ms).

At 240fps instead of 60fps, your display gets a frame that's "12 milliseconds fresher" (16 - 4 = 12)
(frames rendered only 4ms ago, not 16ms ago.)
That means you can see enemies sooner. React sooner.
That's a HUGE advantage in competition gaming!
Even if you can't feel it!

Competition championship online game -- ONE MILLISECOND CAN MATTER
When two people shoot at the same time in a video game,
The person who shoots first wins!
Reacting 1ms sooner means you win the trophy prize in the pro competition championship!
Even if you CAN'T feel the millisecond!
It's like winning the 100 meter Olympics sprint -- tiny amounts matter.

THEREFORE, It's not "necessarily" a waste to have fps > Hz and fps > 60 even if you can't feel it.
(Even though, in fact, one can feel the difference between 60fps@120Hz, and 120fps@120Hz)

Also, AnandTech has a good article about input lag.
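The staleness figures in the post (16/8/4 ms) round off one frame period; here is the arithmetic as a small sketch (my own illustration, not the poster's code):

```python
# Worst-case age of a frame when it reaches the display is one frame period.
def worst_case_staleness_ms(fps):
    return 1000.0 / fps

for fps in (60, 120, 240):
    print(f"{fps} fps -> up to {worst_case_staleness_ms(fps):.1f} ms stale")
# 60 fps -> up to 16.7 ms, 120 fps -> up to 8.3 ms, 240 fps -> up to 4.2 ms
```

So the "12 milliseconds fresher" claim for 240fps versus 60fps is simply 16.7 minus 4.2, rounded.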
 

Idocare

Junior Member
Feb 4, 2013
20
0
0
This is false.
Give your friend this web address: scanningbacklight.com/references

[snip]

Also, AnandTech has a good article about input lag.
Good point, but it's pure theory when we're talking about competition. To win a championship you have to play against others, and "facing others" depends heavily on the netcode. If the netcode doesn't let you benefit, you can have a plasma monitor with a crazy refresh rate driven by an 800fps video card and you still won't notice any improvement in terms of gameplay. If the netcode is synced at 60 updates per second, a system capable of ten times that gives you nothing extra; conversely, if it syncs at 120 and you run at 60, expect many bullets between the eyes, because you're slower than the netcode.

In a private game you can clearly see a big improvement playing on a high-refresh-rate monitor, but for LAN events or online play it's all about squeezing every bit out of the netcode; once your system can cap it, everything else is a waste of resources. IMHO.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
Good point. But if I should tell it all you forgot to say for example that is pure theory speaking of challenges, cause to win a championship you have to play with others and your "facing others" is heavily dependant on the netcode, if that doesn't allow you to have benefits you can have a plasma monitor with a crazy refresh-rate linked to a 800fps VGA and you will not notice any improvement in terms of gameplay. I mean if netcode is biased and synched at 60fps
Yes, netcode can have a big effect; it's very game-dependent.
It can be a Great Equalizer.
However, netcode cannot control all factors:
  • Human brain reaction time lag. Netcode can't dictate this.
  • Display-induced motion blur. Netcode can't dictate this either. Even in situations of equal input lag, a stroboscopic 60Hz display (CRT) can lead to better human performance (for many gameplay styles, such as circle strafing) than a continuous-shine 60Hz sample-and-hold LCD, because of less motion blur during faster motion. 60fps@60Hz is MUCH sharper on strobed displays (CRT, plasma) than on continuous-shine displays (e.g. traditional LCD); likewise for 120fps@120Hz or any fps = Hz situation. Clearer motion means faster human reaction time. Netcode can't dictate your display choice.
  • Netcode running at 60Hz does not always mean a frame limit of 60fps. You can have 60Hz netcode with a 120fps frame rate, for example. In games that don't frame-limit to 60fps, you still have an input lag advantage at 120fps because you see the 60Hz net position updates 1/120sec sooner. This depends on the game and how it ties the frame rate to physics/player-position updates; these can be two independent rates. In some cases, game software can insert a tiny artificially-added input lag for 120fps@120Hz users in an attempt to equalize them with 60fps@60Hz users (in practice, this was rarely done). Even when such equalization is attempted, you still gain reduced human reaction time by not suffering as much motion blur at 120Hz versus 60Hz.
  • Display lag and cable lag (input lag beyond the graphics output). Netcode can't prevent some displays from having better lag than others. For example, HDMI can add a millisecond (or more) of lag compared to VGA.
  • Controller lag (below the API layer). A 1000Hz gaming mouse still has less input lag than a 125Hz mouse, since the gaming mouse electronics deliver position updates to the driver more quickly: at 1000Hz a position update is at most one millisecond old (plus any USB cable or hub lag, where applicable; plug the mouse directly into the PC instead of a hub for minimum lag). Netcode can't modify mouse hardware or mouse drivers. A mouse can still deliver faster/fresher/more accurate X+Y positions to the driver even if the netcode throttles or averages the API mouse readouts down to a lower rate (60Hz), because the fewer samples are individually 'fresher'.
  • Even where netcode runs at 60Hz, reacting 1ms sooner still has an advantage. You have a better chance of your input being captured in an earlier "netcode refresh cycle" (example: the final 1ms of the previous cycle, versus the first 1ms of the next one). Therefore, gaining even a few extra milliseconds from external factors, such as a lower-lag display, still gives you a noticeable advantage in fierce "equal-skills" contests.
Although netcode works to level the playing field,
there are MANY factors that netcode is unable to equalize.
It is impossible for netcode to equalize every single external factor.

See sooner (good display setup), react sooner.
React sooner (good controller setup), see sooner.
...is generally true, regardless of netcode.
 

Idocare

Junior Member
Feb 4, 2013
20
0
0
Yes, netcode can have a big effect, very game-dependant.
It can be a Great Equalizer.
However, netcode cannot control all factors:

[snip]

See sooner (good display setup), react sooner.
React sooner (good controller setup), see sooner.
...is generally always true, regardless of netcode.
All good points again except it seems you didn't get what I meant.

Simply put: just as no hardware can improve the brain's internal reaction time, no hardware can make gameplay run FASTER than the netcode dictates.

To explain myself better: when you appear in the visual field of a virtual opponent, he can shoot at you as fast as his reflexes allow, but he simply cannot shoot at you BEFORE you appear in his visual field, even if he had a 0ms brain reaction time. If your video card and monitor (theoretically) give you, say, 400fps, and you can FEEL the game being smoother than at 60fps, that's all about your personal comfort; you are not any faster on your opponent's monitor, in his CPU, or in the netcode. If the game states that a projectile fired at you travels at X m/s and can reach you in X ms, it's the netcode that decides whether it reaches you, and what matters is the speed of the connection and the LAG between your machine and your opponent's, NOT between your monitor and his. The input lag between you and your machine is a different matter: obviously, the faster you can give commands to your machine, the faster they are processed by the netcode, but your monitor's refresh rate does not make a difference IN A DIRECT WAY, because it CANNOT let you overcome the lag of the netcode layer itself.

Put better: if you can see what happens in the game 400 times a second, i.e. once every 2.5ms, that doesn't mean your actions are transferred at the same speed to your opponent's machine, which could, for example, refresh 100 times a second, i.e. once every 10ms. This could be an advantage if there were no LAG on the line between you and the other machine, so all your arguments are valid IF [netcode speed] + [total LAG] < [(your machine's reaction time) + (your reflexes) + (your LAG)] < [(your opponent's machine reaction time) + (his reflexes) + (his LAG)].

Given a game server whose netcode updates each player's x-y-z coordinates every 10ms, it's not worth having the screen refresh every 2ms, because you are bounded by the netcode refresh rate. In this scenario, an effective 60Hz / 60fps (both layers, monitor and engine), i.e. about 16.5ms, could be a disadvantage, but ONLY if your opponent is running faster; otherwise you are merely running slower than the netcode, and that's all. Once you cap the netcode speed, taking every LAG into account, you are as close as possible to your opponent's actions, leaving your brain to make the difference. In the 10ms example, you gain nothing by pushing either your engine or your monitor beyond 100Hz.

[player(1)]->[LAG-P1]->[monitor/input-P1]->[engine-P1]->[netcode-P1]->[LAG-N1]->[Server/main engine/main netcode]<-[LAG-N2]<-[netcode-P2]<-[engine-P2]<-[monitor/input-P2]<-[LAG-P2]<-[player(2)]

Simply put, your character's speed is bound to code, and no hardware can make you faster than the code allows. However smooth and reactive your character looks on YOUR screen, it is no faster on the server: it runs at a fixed speed, sways at a fixed speed, jumps at a fixed speed, et cetera, even if you can FEEL yourself "speedier" on your screen. This can probably still give a psychological benefit that helps your overall reflexes a bit, but that's about it.

Playing locally there are benefits involved, up to the point where you outrun your CPU. Playing remotely, screen refresh matters only until you reach the update rate between your engine and the server engine; beyond that, going faster is useless. For example, with the 40-50ms LAG of a regular DSL connection, an overkill local rig only lets you respawn a little faster, while your opponent with a mediocre rig BUT a 4ms line shoots a hole in your forehead. For sure, though, a 240Hz monitor will let you see a smoother animation of your death... :biggrin:
 

Whitestar127

Senior member
Dec 2, 2011
397
24
81
Some of you have some serious issues blaming fps for your visual downfalls.

Clearly the culprit is something else, whether that be microstutter, input lag, response time, ping, or just your head.

Just because higher fps tends to mask those visual downfalls does not make fps the culprit. Fps is merely the bandage.

Anyone who claims to tell the difference between 60fps and 120fps: whatever you are on, we all could use a little of it from time to time. Carry on.
Well, from a theoretical viewpoint I have to agree with this (although you could tone down the aggressiveness of your post), based on my own experience.

On my CRT I have NO chance at all of seeing the difference between 60fps @60hz and 100fps @100hz (well, apart from the ever-so-slight flicker at 60hz). I'm talking only about visual smoothness now, not the response that you can feel through the mouse. The motion fluidity is exactly the same.
Because 60 fps is already baby-bottom buttery smooth. Let's just call it 100% smooth. 100fps doesn't add anything to that. So when presented on a platform (CRT) that has no response time issues (at least they are negligible), I cannot see the difference.

On my BenQ LCD, on the other hand, it's easy to see the difference between, say, 60fps @60hz and 120fps @120hz. But the real reason for that is, as you say, response time and the way the LCD presents the fps, not the fps itself.

That's how I see it at least. :)
 

Whitestar127

Senior member
Dec 2, 2011
397
24
81
You don't think? It might differ from game to game, but at least those I remember playing at 60fps were silky smooth. DOOM3 for example. Might have to reconnect my CRT and go back and check others.
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
Doom 3 has always been framerate-capped at 60fps (though the BFG Edition is apparently different).

I remember playing it on my 85Hz CRTs back in 2005. I had to turn the settings way down at the time in order to maintain 60fps, and yes, that was smooth(-ish). Despite Carmack's weird policy of framerate caps, id Software usually makes games that work well on PC, unlike some others that I could mention...

But I remember comparing it to something like 85fps on games like Return to Castle Wolfenstein, and, well, let's just say that all baby bottoms are smooth, but some are smoother than others :)

EDIT: This may be of interest: http://120hz.net/showthread.php?617-120hz.net-Games-Report.
 

Whitestar127

Senior member
Dec 2, 2011
397
24
81
Well I guess I'll have to revisit it on my CRT again then, just to see if my eyes agree with yours. :)
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
On my Benq LCD on the other hand, it's easy to see the difference between say 60fps @60hz and 120fps @120hz. But the real reason for that is as you say response time and the way the LCD presents the fps, not the fps itself.
Response time is actually only part of the story. It's the sample-and-hold effect.

Pixel response on fast TN monitors is only 2ms (for most of the pixel transition). However, the effective response on the human retina is often the full 16 milliseconds, because the image shines continuously for the whole refresh (1/60sec = 16ms). Your eyes are continuously moving while tracking a moving object, so the continuously-shining static frame is blurred across your retinas as you track moving objects. Faster movement, more blurring.

To fix this, you need to either increase the refresh rate or strobe the display.
e.g. 60fps@60Hz using 1ms strobes has half the motion blur of 60fps@60Hz using 2ms strobes

So a LightBoost strobe-backlight monitor using 1.5ms strobes (frame sample of 1.5ms) has about 90% less motion blur than a regular 60 Hz LCD monitor (frame sample of 1/60sec = 16ms, because it shines continuously for the whole refresh). The stroboscopic effect (CRT style) essentially freezes the image with a short flash, preventing it from being blurred across your retinas.
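The persistence argument above reduces to one multiplication; here it is as a sketch (my own numbers plugged in, using the 960 px/s PixPerAn speed from earlier in the thread):

```python
# Perceived blur width ≈ eye-tracking speed × time the frame stays lit.
def blur_px(speed_px_s, persistence_ms):
    return speed_px_s * persistence_ms / 1000.0

speed = 960  # px/s tracking speed
print(blur_px(speed, 1000 / 60))  # sample-and-hold 60 Hz: ~16 px of blur
print(blur_px(speed, 1.5))        # 1.5 ms LightBoost strobe: ~1.4 px
# 1.5 / 16.7 ≈ 0.09, i.e. roughly 90% less motion blur, matching the post.
```

Note that pixel response time never appears in this formula; that is why a 2ms TN panel still blurs as if it were a 16ms display when it holds each frame for the full refresh.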
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
2ms cause you are bounded to the netcode refresh rate, in this scenario having a refresh rate (speaking of both layers, monitor and engine) at effective 60Hz or 60fps
Right, that's true. I understand what you are trying to say. I am a computer programmer, and understand the concepts, even if I'm not in the game industry.

IIRC, Carmack's Doom 3 kept the frame limit the same as the netcode; it had an annoying 60fps frame limit, though that was removed (I believe) for Doom 3 BFG. Nowadays a 60fps frame limit is rarely forced by default. Good riddance, really. Thankfully, Source engine games (e.g. TF2), and games like BF3 and Quake Live, let you play above 60fps online, even 120fps. Some games such as the Crysis series get away with a 60fps limiter, since most graphics cards can't exceed 60fps in them anyway, but even this limiter is configurable. The monitor-side 60fps limit is no longer common in practice, thanks to the introduction of 120 Hz LCDs and the complaints about being limited to 60fps. Yes, netcode still runs at a lower update rate.

Even if the netcode runs at a lower update rate, you can still turn in place at 120fps (even if the axis updates are only transmitted at 60Hz). This still benefits the player hugely, by eliminating motion blur while trying to identify objects during fast turns, and even more so on an impulse display such as a CRT or LightBoost (measured 90 to 94% less motion blur than a 60Hz LCD!). Even if your next shot occurs 1/60sec later, you've still reacted faster thanks to less motion blur during panning.

It's a definite advantage, confirmed by many sources of LightBoost users gaming at 120fps@120Hz. As Elvn(HardForum) said "And OMG i can play scout so much better now in TF2", and Baxter299(OCN) said "replacing my [Sony CRT] fw900 witch is finally taking a rest in my closet". Dozens of quotes. I can give lots of links.

Yes, I understand what you are saying, however, 60fps frame-cap is not being done anymore in most online games. Yes, some servers may enforce an fps_max but this is not done by default. So you still gain the benefit of higher framerates (even with low netcode update rates, and even if it has input-lag-equalizing logic).
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
Just for the record, Crysis doesn't actually have a frame rate limit of 60. You just need to 'force' 120 using things like RadeonPro. I was playing it on Sunday at 120fps, 2560x1440 (on medium with no AA, obviously). Awesomeness.
 

Idocare

Junior Member
Feb 4, 2013
20
0
0
Right, that's true. Carmack's Doom3 often did this; it had an annoying framelimit of 60fps, though that was removed (I believe) for Doom 3 BFG. Nowadays, frame limit of 60fps is rarely done in practice. Good riddance, really. Thankfully, all source engine games (e.g. TF2, BF3), games like Quake Live, Crysis, etc, allow you to play above 60fps online, even 120fps. The monitor-side 60fps limit is no longer being done in practice. Yes, netcode still runs at a lower update rate.

Even if the netcode runs at a lower update rate, you can still turn in place at 120fps (even if the axis updates are only transmitted at 60Hz). This still benefits the player hugely by eliminating motion blur while trying to identify objects during fast turns. Even if your next shot lands 1/60sec later, you've still reacted faster thanks to less motion blur.

Yes, I understand what you are saying, but again, a 60fps frame cap is no longer the norm in most online games.
Let's take this example. Imagine this kind of duel:

Example 1) (Both faster than netcode)
You and someone else spawn on the server at the very same time, at the same position but facing in opposite directions (180° apart), and each of you has to turn 90° clockwise to see who shoots first. The time the server takes to communicate positions, so that your local engine can display the other player, is the same for both; let's say it's the already-used example of 10ms, in a LAN environment with 2ms of lag. Suppose you can turn 90° clockwise, focus and aim in 4ms, while your opponent needs 8ms. Both of you will still see each other only after 10ms, not before: your reaction time starts from when your eyes can see something, and your engine cannot make your GPU display your opponent BEFORE it has the input from the server, nor can you aim bullets at him; the same holds on the other side. Keeping the response-time difference between a powerful machine and a weaker one down to a few fractions of a picosecond, thus negligible, the input peripherals play their usual role. The end of a 10ms timer is the same on a 60Hz, 120Hz or 240Hz monitor, or anything else; time is time. Say your overall response time is 0.2s and you press fire after exactly that amount of time: it doesn't matter whether in that interval you saw your opponent 6 or 12 times, what matters is how long it takes you to realize he's there and to react.
Conclusion: your system is faster and you think you have an advantage, but everything comes down to the players' reflexes.

Example 2) (Both slower than netcode)
Let's take the same example, but with a server that this time communicates positions every 2ms, in the same 2ms-lag environment. Your 4ms in this scenario gives you a small but potentially valuable advantage over your opponent's 8ms: both your engine and your opponent's learn the respective positions at the same time, 2ms after the server output, but you can see him 2ms after that, while he has to wait 6ms more to see you. At the same reaction time, and given the same lag on your input devices, the I/O says you shoot first.
Conclusion: your system is faster and gives you a theoretical advantage, but to win you still need to be at least as fast as your opponent.

Example 3) (asymmetric lag discrepancies > netcode speed)
Let's take the example above, in an environment with the same 2ms server update (netcode), 4ms lag for your moves and 8ms for your opponent's, BUT with 20ms of net lag for you and 10ms for your opponent. You can be ready after only 4ms, but you will see your opponent only after 20ms; he could be ready after 8ms but sees you after only 10ms. Being ready sooner is worthless, since your engine cannot aim a bullet at someone it doesn't know is there. Your opponent takes more time to get ready, but he will be ready by the time you show up on his monitor, and so gets the chance to put a bullet in the virtual "you" before you can.
Conclusion: your system is faster but your connection sucks; you already have a hole in your forehead, no matter how fast your whole system is.

As we can see, being ready sooner only helps up to the netcode update rate; beyond that it's a waste of resources.
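The three scenarios above boil down to one line of arithmetic (a simplification of the examples, with made-up helper names): you see an opponent no sooner than the later of your network delay and your local render-ready time.

```python
def time_to_first_sight(net_delay_ms, render_ready_ms):
    # You can't draw an opponent before the server reports his position,
    # and you can't see him before your own engine has rendered him.
    return max(net_delay_ms, render_ready_ms)

# Example 1: a 10 ms server dominates both players -> a 10 ms tie
ex1 = (time_to_first_sight(10, 4), time_to_first_sight(10, 8))
# Example 2: fast 2 ms server -> local render speed (4 ms vs 8 ms) decides
ex2 = (time_to_first_sight(2, 4), time_to_first_sight(2, 8))
# Example 3: asymmetric net lag -> the better line wins despite slower rendering
ex3 = (time_to_first_sight(20, 4), time_to_first_sight(10, 8))
```

Plugging in the numbers from the three examples reproduces each conclusion: a tie, a render-speed win, and a connection-quality win.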

In these examples I should note that no specific 60Hz or 120Hz range is assumed; they are generic enough to cover every possible scenario. Given that what matters is the engine speed, being aligned to it is good: the more the monitor is synced with the engine, the faster the reactions, true, but going beyond that is not useful, in the same way that it's not useful to be faster than the fastest possible link to the server. If your engine updates your character's position (locally) 120 times a second, then having a GPU that puts out 120fps and a monitor that can display them all is a good thing. But it's useless to have the same card output 240fps, even if the monitor could manage that refresh rate, because we would end up seeing the same frame twice in a row; while this can seem smoother to the eye, in terms of real "action time" it gives no improvement. Furthermore, the more processor time is spent feeding the PCI-E bus, the less is left to run the game engine, so pushing a system beyond its point of maximum usefulness can only earn you a hole in the forehead.
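The "seeing the same frame twice in a row" point is easy to sketch (illustrative only): map each display refresh to the newest engine frame available at that instant.

```python
from collections import Counter

def shown_engine_frames(engine_hz, display_hz):
    """For one second, which engine frame does each display refresh show?"""
    return [int(r * engine_hz / display_hz) for r in range(display_hz)]

# A 240 Hz display fed by a 120 Hz engine: every engine frame appears twice
counts = Counter(shown_engine_frames(engine_hz=120, display_hz=240))
```

Each of the 120 engine frames is scanned out exactly twice, so the extra refreshes carry no new "action time" information.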

Back to your last point, this time I don't think it holds completely: even if there is no 60fps cap, some cap is still required to sync the players' actions (and that is the useful limit, whatever it is), otherwise it would be a mess. In any case, to match actions correctly, a server still has to enforce a lag between players' actions, which leads back to my statement that "it's useless to see sooner a bullet that has already hit you". :biggrin:

Back in the day I used to be a huge LAN-party time-waster. Since then many things have changed; speeds are higher and the code has improved, but I don't think the basic rules have been rewritten. On that basis I have to ask, since I'm not all up to date: is it mandatory today to use the motion-blur feature, or can you simply turn it off when you don't want to use it?
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
I am not disagreeing... It's true netcode works to level the playing field. However, for display-based motion blur (not GPU motion-blur effects, which are a totally different subject), I will point to testimonials that claim a very clear advantage with LightBoost 120Hz:

Inu said:
I can confirm this works on BENQ XL2420TX
EDIT: And OMG i can play scout so much better now in TF2, this is borderline cheating.
(from esreality)
in BF3 my scores definitly are improving. Especially in close combat circling around enemies. I'm able to keep on focus, where it used to blur all. GREAT! Also spotting foes from the corner of your eye when running, flying or driving improved a lot.
(from overclockers.co.uk,Game Stats)
Cat said:
With my Asus VG278HE at 120Hz and Lightboost (the Lightboost registry hack doesn't currently support 144Hz) playing at 1080p I am pretty much brutalizing my competition. Even with its 2-5ms input lag, which is worse than the 1ms of my old 120Hz monitor the difference with Lightboost is so huge the input lag literally becomes a non-issue. The only thing that matters now that I don't experience any motion blur is my true reaction time.
(from quakelive forum)
Cat said:
Try it out and see for yourself. Believe me, the difference with Lightboost and without it are friggin nuts. It's been a long time since I was last called an aimbotter in CA (I blame being slightly famous), but it happened just today. I'm hitting accuracies I used to hit back in 2010-2011 after playing 1 weekend with this thing, and that's after 2 years of inactivity.
(from quakelive forum)
As you can see, eliminating motion blur improves human reaction time significantly (sometimes out-compensating a tiny difference in input lag). Netcode cannot fully control human reaction time.

My gaming has improved too, as I'm able to react faster to images on the computer screen than I normally do. During fast turns I can notice things in the corner of my eye sooner and look at them earlier, even before I slow down to stop turning. The images are not delivered to the monitor any faster; there's just less display-based motion blur. Multiple confirmations always win out.

Yes, it's a game of skillz, but do you prefer to game at 20fps or at 60fps? Do you prefer an old 30Hz PS/2 mouse or a 1000Hz gaming mouse? Do you prefer to use a 60Hz LCD or a LightBoost 120Hz LCD (90% less motion blur than 60Hz LCD)? Does an Olympic runner prefer cheap shoes or expensive custom running shoes? For equally-matched skills, the better equipment will generally have higher win averages. The proof is in the pudding.
 
Last edited:

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
My friend and I were discussing about FPS (frames per second) on games.

I told him he wouldn't see (really) big differences between 30 and 60 FPS, since:

Below 30 he will start getting freezes, and really notice problems while playing.

Between 30 and 60 FPS, only professionals (audio-visual workers) can REALLY see a difference while playing.

But, I told him, for the human eye it's impossible to feel any difference while playing over 60 FPS.

That's why V-Sync exists: to limit the game's FPS to the monitor's refresh rate. Turning off V-Sync would only waste GPU processing, with no performance difference.

Although,

He says that when someone plays at 60+ FPS, his character will "run faster", jump "faster" (higher!?), and shoot "faster" than anyone else in the game.

He says some gamers prefer to play at 350 FPS (for example) just to be "faster" and gain an advantage over those who play at 60, which I disagree with.

Which of us are right, wrong? Both?

** All these examples are from playing in 1080p at 60Hz. **

Thank you !
You're confusing multiple issues.

First of all, the human eye can perceive way more than 30fps, 60fps, even 120fps and higher; the benefits of displaying more unique frames to the user are indisputable.

Vsync exists specifically to stop tearing. Tearing is an artefact of rendering a frame and sending it to a traditional monitor mid-refresh; the cap at your monitor's refresh rate is merely a side effect of vsync, and in fact with triple buffering enabled the GPU still works as fast as it can to produce frames.
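Here's a simplified model of the double-buffered vsync cap (deliberately ignoring triple buffering, which avoids exactly this stall): a finished frame must wait for the next refresh, so effective fps snaps to refresh_hz divided by a whole number.

```python
import math

def vsync_fps(render_ms, refresh_hz=60):
    """Effective frame rate with double-buffered vsync (simplified model)."""
    refresh_interval_ms = 1000.0 / refresh_hz
    # Number of whole refresh intervals each frame occupies before it can flip.
    refreshes_per_frame = math.ceil(render_ms / refresh_interval_ms)
    return refresh_hz / refreshes_per_frame

# A frame that takes slightly longer than one refresh halves your fps:
caps = [vsync_fps(10), vsync_fps(20), vsync_fps(40)]  # 60, 30, 20 fps
```

This is why a GPU that can "only" manage 50fps can feel like 30fps with plain double-buffered vsync on a 60Hz monitor, and why triple buffering exists.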

Separate from both these issues, your friend is partly right: some engines (mostly old ones) exhibit slightly different player behaviour depending on the frame rate running on the client. This was often exploited in old FPS games to get higher jumps, faster-firing weapons, etc. It's really a design flaw in the engine, and it doesn't exist in most modern game engines.
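A toy demonstration of how physics can depend on the client frame rate when an engine integrates motion once per rendered frame. This sketch uses plain explicit Euler integration; the real historical cases (like Quake 3's famous 125fps jump quirk) came from millisecond rounding in the movement code, but the principle that the tick rate changes the outcome is the same.

```python
def jump_apex(fps, v0=5.0, g=9.8):
    """Height reached by a jump simulated with explicit Euler at a fixed tick rate."""
    dt = 1.0 / fps
    y, v, apex = 0.0, v0, 0.0
    while y >= 0.0:
        y += v * dt      # position advances using the *old* velocity
        v -= g * dt
        apex = max(apex, y)
    return apex

# The same jump reaches different heights at different frame rates:
apex_60, apex_125 = jump_apex(60), jump_apex(125)
```

With this integrator the jump height drifts with the frame rate and only converges to the analytic v0²/(2g) as fps grows, which is exactly why engines that tie physics to render frames behave differently at different fps.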
 

ASK THE COMMUNITY