VOIP Tech question

racolvin

Golden Member
Jul 26, 2004
Looking for an explanation or a link to an explanation of how a "jitter buffer violation" is calculated. The idea is that a packet arriving outside the jitter buffer window is discarded as no longer relevant because it took too long to arrive. That implies some comparison between a timestamp and a clock source. What I don't know is how the receiving device decides: does it compare against its own internal clock, or does it track the timestamps inside the packets instead?

TIA,
R

ScottMac

Moderator, Networking / Elite Member
Mar 19, 2001
It looks at the real-time arrival of the serialized packets.

As it sees larger variance in delay (a bunch of packets, then none, then a bunch, then a few, spaced out, etc.), it increases the de-jitter buffer (makes it deeper to accommodate the possibility of high latency) until it hits a threshold where buffering the packet would screw things up more than just tossing it and smoothing out the gap.
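
If it helps, here's a rough sketch of that adaptive-sizing idea in Python. This is just my own illustration of the concept, not any vendor's actual algorithm; the names and numbers (MAX_DEPTH_MS, the 2x growth rule) are made up:

```python
# Rough sketch of an adaptive de-jitter buffer -- my own illustration,
# not any vendor's algorithm. MAX_DEPTH_MS and the growth rule are invented.

PACKET_INTERVAL_MS = 20       # nominal send cadence, e.g. 20 ms voice frames
MAX_DEPTH_MS = 120            # past this, buffering hurts the call more than dropping

class DejitterBuffer:
    def __init__(self):
        self.depth_ms = 40    # initial playout delay
        self.violations = 0   # late packets tossed

    def on_packet(self, lateness_ms, jitter_ms):
        """lateness_ms: how far past its scheduled playout slot the packet arrived.
        jitter_ms: current smoothed delay-variance estimate."""
        # Grow the buffer toward a multiple of the observed jitter (capped),
        # so moderate variance gets absorbed instead of dropped.
        self.depth_ms = min(max(self.depth_ms, 2 * jitter_ms + PACKET_INTERVAL_MS),
                            MAX_DEPTH_MS)
        if lateness_ms > self.depth_ms:
            # Too late to be useful: toss it and let concealment smooth the gap.
            self.violations += 1
            return False      # discarded -- a "jitter buffer violation"
        return True           # queued for playout
```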

It doesn't have to look at timestamps, just the actual arrival rate (and variance of the timing).
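
And a minimal sketch of how that variance can be tracked from arrival times alone, assuming the sender transmits on a fixed 20 ms cadence. The 1/16 smoothing gain is borrowed from RFC 3550's jitter estimator (which, strictly speaking, compares arrival times against RTP timestamps; here the fixed cadence stands in for them):

```python
# Sketch: estimating jitter purely from packet arrival times, no RTP timestamps.
# The deviation of each interarrival gap from the nominal cadence feeds an
# exponentially smoothed estimate.

PACKET_INTERVAL_MS = 20.0     # assumed fixed send cadence

class JitterEstimator:
    def __init__(self):
        self.last_arrival_ms = None
        self.jitter_ms = 0.0

    def on_arrival(self, arrival_ms):
        if self.last_arrival_ms is not None:
            gap = arrival_ms - self.last_arrival_ms
            deviation = abs(gap - PACKET_INTERVAL_MS)
            # Exponentially smoothed mean deviation, 1/16 gain as in RFC 3550.
            self.jitter_ms += (deviation - self.jitter_ms) / 16.0
        self.last_arrival_ms = arrival_ms
        return self.jitter_ms
```

Feed each packet's arrival time into on_arrival() and pass the returned estimate to the buffer sketch above as jitter_ms.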

It gets a lot deeper and uglier than that, explanation-wise, but that's it in a nutshell (within the limits of my understanding).

Good Luck

Scott