Looking for an explanation, or a link to one, of how a "jitter buffer violation" is calculated. The idea is that a packet arriving outside the jitter buffer window is discarded as no longer relevant because it took too long to arrive. This implies there is some comparison between a timestamp and a clock source. What I don't know is how the receiving device makes that decision: does it look at its own internal clock and compare, or does it track the timestamps inside the packet instead?
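
To make the question concrete, here is a rough sketch of what I imagine the check looks like for an RTP-style audio stream. All of the names, the fixed buffer depth, and the idea of anchoring the clock mapping on the first packet are my own assumptions for illustration, not taken from any particular implementation:

```c
/* Hypothetical sketch of a late-packet check, assuming an RTP-style
 * receiver with a fixed-depth jitter buffer. Names and constants are
 * made up for illustration only. */
#include <stdbool.h>
#include <stdint.h>

#define CLOCK_RATE_HZ    8000  /* e.g. G.711 audio: 8000 RTP ticks per second */
#define JITTER_BUFFER_MS 60    /* assumed jitter buffer depth */

/* Mapping from RTP timestamp units to the receiver's local clock (ms),
 * anchored on the first packet received in the stream. */
typedef struct {
    uint32_t base_rtp_ts;    /* RTP timestamp of the first packet */
    int64_t  base_local_ms;  /* local clock reading when it arrived */
} stream_clock_map;

/* Returns true if the packet arrived after its playout deadline,
 * i.e. it "violates" the jitter buffer and would be discarded. */
static bool is_jitter_buffer_violation(const stream_clock_map *map,
                                       uint32_t rtp_ts,
                                       int64_t  arrival_local_ms)
{
    /* Convert the RTP timestamp delta into milliseconds of media time. */
    uint32_t ts_delta = rtp_ts - map->base_rtp_ts;  /* unsigned wrap is safe */
    int64_t  media_ms = (int64_t)ts_delta * 1000 / CLOCK_RATE_HZ;

    /* The packet is due for playout at anchor time + media offset + buffer depth. */
    int64_t playout_deadline_ms = map->base_local_ms + media_ms + JITTER_BUFFER_MS;

    /* Arriving after the deadline means it is too late to be played out. */
    return arrival_local_ms > playout_deadline_ms;
}
```

In this sketch the receiver ends up using both: the timestamp carried inside the packet and its own local clock, related through an offset taken from an earlier packet. Whether real devices actually work this way is exactly what I'm trying to find out.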
TIA,
R
