
VoIP Tech question

racolvin

Golden Member
Looking for an explanation, or a link to one, of how a "jitter buffer violation" is calculated. The idea is that a packet arriving outside the jitter buffer window is discarded as no longer relevant because it took too long to arrive. That implies some comparison between a timestamp and a clock source. What I don't know is how the receiving device decides: does it compare against its own internal clock, or does it track the timestamps inside the packets instead?

TIA,
R
 
It looks at the real-time arrival of the serialized packets.

As it sees larger variance in delay (a bunch of packets, then none, then a bunch, then a few, spaced out, etc.), it increases the de-jitter buffer (makes it longer to accommodate the possibility of high latency) until it hits a threshold where buffering the packet would screw things up more than just tossing it and smoothing over the gap.
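A minimal sketch of that behavior (a simplified model, not any particular vendor's algorithm — the frame spacing, cap, and growth rule here are assumptions for illustration): packets are expected every FRAME_MS, the buffer depth grows to cover late arrivals, and anything later than a hard cap is tossed as a jitter-buffer violation.

```python
FRAME_MS = 20        # assumed nominal packet spacing for the voice codec
MAX_BUFFER_MS = 200  # assumed cap: beyond this, buffering hurts more than a gap

def simulate(arrival_times_ms):
    """arrival_times_ms[i] is the actual arrival time of packet i,
    which was sent at i * FRAME_MS on an ideal schedule.
    Returns (buffer depth after each packet, indices of discarded packets)."""
    depth_ms = FRAME_MS          # start with a one-frame buffer
    depths, discarded = [], []
    for i, arrival in enumerate(arrival_times_ms):
        delay = arrival - i * FRAME_MS   # lateness vs. the ideal schedule
        if delay > MAX_BUFFER_MS:
            discarded.append(i)          # violation: toss it, smooth the gap
        elif delay > depth_ms:
            depth_ms = delay             # grow the buffer to cover this lateness
        depths.append(depth_ms)
    return depths, discarded
```

Feeding it a burst-then-gap arrival pattern shows the depth ratcheting up on late packets and only the pathologically late one being dropped.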

It doesn't have to look at timestamps, just the actual arrival rate (and variance of the timing).
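For what it's worth, RTP receivers do keep a running interarrival-jitter statistic defined in RFC 3550 section 6.4.1, computed from the difference in relative transit time between consecutive packets with exponential smoothing (J += (|D| - J)/16). For a constant-rate voice stream the send timestamps advance by a fixed amount per packet, so the same estimate can effectively be driven by arrival spacing alone, which is consistent with the point above. A sketch:

```python
def rtp_jitter(send_ts, recv_ts):
    """Running interarrival-jitter estimate per RFC 3550 sec. 6.4.1.
    send_ts / recv_ts: per-packet send and arrival timestamps, same units."""
    j = 0.0
    prev_transit = None
    for s, r in zip(send_ts, recv_ts):
        transit = r - s                      # relative transit time (offset cancels)
        if prev_transit is not None:
            d = abs(transit - prev_transit)  # |D(i-1, i)|
            j += (d - j) / 16.0              # exponential smoothing, gain 1/16
        prev_transit = transit
    return j
```

Perfectly paced arrivals give a jitter of zero; any variance in spacing pushes the estimate up, and the 1/16 gain keeps it from overreacting to a single late packet.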

It gets a lot deeper and uglier than that, explanation-wise, but that's it in a nutshell (within the limits of my understanding).

Good Luck

Scott
 