frame latencies.

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Does higher frame latency mean higher input lag or is frame latency something totally different?

I was thinking about getting a piledriver CPU, but I heard that it had significantly higher frame latencies so that's why I started this thread.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
The two shouldn't be directly related, but even so, frame latency leads to apparent input lag.

You can't see the on-screen effect of your mouse movement before the frame gets drawn, right?
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
Frame latency is usually used to refer to the delay between one frame and the next. Averaged over longer periods of time, frame latency is an inverse measure of FPS. Several sites have started graphing the frame latency of different cards and CPUs in different games because they realized that even if two setups deliver the same framerate over the course of a whole second, some setups are more prone to mixing very long frames (stutters) with very short frames (called runt frames), which gives a worse experience than the same number of frames delivered at even intervals.
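To make that concrete, here's a small sketch (my own illustration, not any review site's actual methodology) of how two traces with identical average FPS can have very different frame-time consistency:

```python
# Two made-up frame-time traces covering the same 1-second window at 50 fps average.
even_trace = [20.0] * 50            # 50 frames, all 20 ms apart
uneven_trace = [5.0, 35.0] * 25     # 50 frames alternating 5 ms and 35 ms

def average_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def percentile_99(frame_times_ms):
    ordered = sorted(frame_times_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

for name, trace in (("even", even_trace), ("uneven", uneven_trace)):
    print(name, round(average_fps(trace), 1), "fps,",
          "99th percentile frame time:", percentile_99(trace), "ms")

# Both traces report 50 fps, but half the frames in the uneven trace take 35 ms,
# which is exactly the kind of stutter an FPS average hides.
```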
 

Lavans

Member
Sep 21, 2010
139
0
0
Higher frame rate latency gives you a choppier experience regardless of frame rates. Normally this is something that's handled by your drivers.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Higher frame rate latency gives you a choppier experience regardless of frame rates. Normally this is something that's handled by your drivers.

More variable frame rate latency gives a choppier experience. Not higher.
If you have a consistent 40ms latency, that won't be choppier than latency that varies between 10ms and 40ms.

Evening out frame latency increases input lag, as you "artificially" delay frames from when they would otherwise be displayed in response to your input.
If you display frames as soon as they are ready, the input lag is relatively consistent.

To even out frame latency, frames are spaced more evenly apart than they would be if they were processed ASAP, resulting in frames getting delayed, which increases input lag but decreases frame display inconsistency.
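A toy sketch of that trade-off (my own illustration, not an actual driver implementation; the 60 Hz target and timing numbers are assumptions): pacing presentation to a fixed interval holds early frames back, adding lag but keeping the on-screen cadence even.

```python
import random
import time

TARGET_INTERVAL = 1.0 / 60.0                    # assumed 60 Hz pacing target

def render_frame():
    """Stand-in for game rendering: takes a variable amount of time."""
    time.sleep(random.uniform(0.004, 0.014))    # 4-14 ms of simulated work

def run_paced(num_frames=30):
    deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        now = time.perf_counter()
        if now < deadline:
            time.sleep(deadline - now)          # frame was ready early: hold it back (extra input lag)
        presented_at = time.perf_counter()      # "present" the frame here, at an even spacing
        deadline = presented_at + TARGET_INTERVAL

run_paced()
```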
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
There is a mixing of terms going on. The stuttery traces are all based on frame times, which don't have anything to do with frame latency.

I don't think frame latency is a term we really use; we normally use input latency, as we are interested in the total latency from user input until display. Frame latency, to me, suggests the time measured from the production of a frame (its presentation, or maybe from beginScene) through to when it appears on screen.

Regardless, it's not a term used for talking about stutter, frame times, or FCAT, so it has no bearing on anything I know we can measure today.
 

Lavans

Member
Sep 21, 2010
139
0
0
More variable frame rate latency gives a choppier experience. Not higher.
If you have a consistent 40ms latency, that won't be choppier than latency that varies between 10ms and 40ms.

That makes sense

To even out frame latency, frames are spaced more evenly apart than they would be if they were processed ASAP, resulting in frames getting delayed, which increases input lag but decreases frame display inconsistency.

Isn't this also influenced by pre-rendered frames?
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Pre-rendered frames add input latency, but that's something entirely different from what's being discussed here.
So does that mean that Piledriver doesn't cause more input lag than my 2500k?

The lower the number of pre-rendered frames, the better (I care more about low input lag than high fps) IMO.
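As a rough back-of-the-envelope sketch (my own numbers, not a measurement), the worst-case lag added by the pre-rendered frame queue scales with the frame time, since input sampled for a frame can't show up until everything queued ahead of it has been displayed:

```python
fps = 60.0
frame_time_ms = 1000.0 / fps                    # ~16.7 ms per frame at 60 fps

for queue_depth in (1, 2, 3):
    worst_case_lag = queue_depth * frame_time_ms
    print(f"{queue_depth} pre-rendered frame(s): up to ~{worst_case_lag:.0f} ms of added input lag")
```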
 
Nov 26, 2005
15,189
401
126
In Unreal Tournament III, typing 'stat fps' will bring up the FPS and the frame time in ms. The higher the FPS, the lower the latency.
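That readout is just the reciprocal relationship between FPS and per-frame time; a quick sketch (my own, not UT3 code):

```python
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms
```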
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
So does that mean that Piledriver doesn't cause more input lag than my 2500k?
Changing to a Piledriver from a 2500K is a downgrade. Seriously, it'll perform worse while chewing up more power, plus you'll need a brand new motherboard as well.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Changing to a Piledriver from a 2500K is a downgrade. Seriously, it'll perform worse while chewing up more power, plus you'll need a brand new motherboard as well.
Thank you:)

I decided not to, since I read that it runs at higher temps than I was initially led to believe.

I was going to upgrade to Haswell until I read that it would use TIM instead of solder. Even if the Z87 chipset runs at lower temps than my Z77, the Haswell CPU would run at higher temps than my 2500K, so I'm damned if I upgrade and damned if I keep my current setup.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
So consistent 1000ms frame times are less choppy than frame times alternating between 1ms and 2ms?

There obviously is a maximum frame time beyond which things feel like a slide show; a slide show may be different from being choppy, but neither is good.

There is also a minimum frame time below which further reductions don't really matter, but somewhere between 8ms and 33ms, fluctuations are the primary problem when it comes to choppy gameplay.
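A small sketch of that point (the 8-33ms band is the poster's; the example traces are mine): within the playable band, it's the frame-to-frame swings rather than the absolute frame time that read as choppiness.

```python
def max_frame_to_frame_swing(frame_times_ms):
    return max(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))

steady_30fps = [33.0, 33.0, 33.0, 33.0, 33.0]   # slow but perfectly consistent
swinging = [10.0, 30.0, 12.0, 28.0, 11.0]       # all inside the band, but big swings

print(max_frame_to_frame_swing(steady_30fps))   # 0.0 ms between frames -> even cadence
print(max_frame_to_frame_swing(swinging))       # 20.0 ms swings -> perceived as choppy
```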
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
I think all the recent discussions of microstutter and frametimes have distorted the classical definition of choppy. 5 years ago choppy was synonymous with low frame rates regardless of consistency.

Maybe that is what is happening here, much in the same way that the term "lag" morphed from referring exclusively to network problems into an undefined mess, and slowly back again.
 
Feb 19, 2009
10,457
10
76
It depends. For a lot of gamers, anything <60 fps is choppy; that's why it's STILL the de facto standard to aim for.

Some people claim benefits at 120 fps too. So it varies a lot.
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
It depends. For a lot of gamers, anything <60 fps is choppy; that's why it's STILL the de facto standard to aim for.

Some people claim benefits at 120 fps too. So it varies a lot.

I am one of these people, but the whole point of the recent obsession with frame times is that 'fps' is not an adequate metric for smoothness. You can have high fps but crappy frame times. Perhaps the most extreme case is multi-GPU microstutter. When I was running two 7970s, Skyrim went from beautiful to a stuttering mess: 170fps in Crossfire felt noticeably more 'choppy' than 70fps on a single 7970.