
Audio delay

Qacer

Platinum Member
Good afternoon,

What would be considered the maximum allowable delay time that will not cause any viewing interference?

For instance, if I had a wireless headset for a TV, how much audio processing time is allotted so that a user can still perceive the received sound and the television picture as synchronized?


Thanks!

 
You could determine this by experiment.

Get an anvil and a hammer. Move away from the anvil in ~5 ft increments; once you reach the distance where the sound and the visual become disconnected, you have enough data to determine the time necessary.

Find the humidity and air pressure for the day and you can calculate the speed of sound at that time. Then take your distance and divide it by the speed of sound; that will give you your maximum time.

edit - What is your application? You could just go with a simple flat target of "I want it to be similar to a viewing distance of 15ft" and use that distance divided by the speed of sound to get your time.
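The distance-divided-by-speed-of-sound calculation suggested above is easy to sketch in Python. A minimal example, assuming a linear dry-air approximation for the speed of sound and a 20 C default temperature (the function names and defaults are mine, not from the post):

```python
# Hypothetical sketch of the delay-from-viewing-distance calculation above.

def speed_of_sound_ms(temp_c: float) -> float:
    """Approximate speed of sound in m/s for dry air (linear approximation)."""
    return 331.3 + 0.606 * temp_c

def delay_for_distance(distance_ft: float, temp_c: float = 20.0) -> float:
    """Delay in milliseconds for sound to travel distance_ft feet."""
    distance_m = distance_ft * 0.3048  # feet to meters
    return distance_m / speed_of_sound_ms(temp_c) * 1000.0

# A 15 ft viewing distance at 20 C works out to roughly 13 ms.
print(round(delay_for_distance(15.0), 1))
```

Humidity and pressure have only a small effect compared with temperature, so this simplified formula is usually close enough for a rule-of-thumb target.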
 
My intention was to actually transmit audio over a wireless channel from the TV, so my main concern is how much time I can set aside for all the sound processing by an audio processing chip, microcontroller, etc. My initial calculations for the audio processor, using the time it takes to transfer and process data over, say, an I2C bus, were in the microsecond range, but that does not include the time required to sample, set up, and process data on a microcontroller.

However, you gave me an idea. I can just use a sound program like Cool Edit and compare two tracks with various time delays. I'll just use that to find out when I'll start hearing an echo.

Thanks!
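The Cool Edit experiment described above can also be approximated in code: mix a tone with a delayed copy of itself and write the result to a WAV file for a listening test. This is only a rough stand-in; the tone frequency, delay values, and filenames here are illustrative, not from the thread:

```python
# Generate test files: a tone mixed with a delayed copy of itself,
# so you can listen for the delay at which an audible echo appears.
import math
import struct
import wave

RATE = 44100  # samples per second

def tone(freq_hz, seconds):
    n = int(RATE * seconds)
    return [math.sin(2 * math.pi * freq_hz * i / RATE) for i in range(n)]

def mix_with_delay(samples, delay_ms):
    """Return samples mixed with a copy of themselves delayed by delay_ms."""
    offset = int(RATE * delay_ms / 1000.0)
    out = samples + [0.0] * offset
    for i, s in enumerate(samples):
        out[i + offset] += s
    # Halve the result to avoid clipping after summing two signals.
    return [s * 0.5 for s in out]

def write_wav(path, samples):
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)  # 16-bit
        w.setframerate(RATE)
        frames = b"".join(struct.pack("<h", int(s * 32767)) for s in samples)
        w.writeframes(frames)

for delay in (5, 10, 20, 40):  # milliseconds
    write_wav(f"delay_{delay}ms.wav", mix_with_delay(tone(440, 1.0), delay))
```

A broadband signal such as speech or a click makes the echo threshold easier to hear than a steady sine tone, so swapping in recorded material is worthwhile.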
 
For me, it is above 2 frames. That would be about 1/15 of a second. I can notice it at 2 frames and, depending on the material, at 1 frame.

BUT, I edit video, so I may not be the norm.
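As a quick sanity check on the frame arithmetic above (assuming roughly 30 fps video, which the post implies but does not state):

```python
# Two video frames at ~30 fps is about 1/15 of a second.
fps = 30.0
frame_ms = 1000.0 / fps          # one frame is about 33.3 ms
print(round(2 * frame_ms, 1))    # two frames: about 66.7 ms
```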
 
It's an academic exercise. Sound delay in major network source material is highly variable already. I doubt you could attribute any particular delay to your equipment versus something else in the chain.
 
A footnote: when editing, I slip the video a frame per 100 ft from the audio source to sync them. This is important for video of marching bands.
 
At most about 30 ms to be on the safe side (it varies). Our hearing will "integrate out" anything shorter than that (this is known as the Haas effect). There are plenty of books where you can find information about this, e.g. "The Master Handbook of Acoustics".
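For the original question of a processing budget, that ~30 ms figure can be translated into a sample count; the 44.1 kHz sample rate here is my assumption, not from the post:

```python
# Latency headroom in samples for a ~30 ms budget at 44.1 kHz.
sample_rate = 44100   # samples per second (assumed)
budget_ms = 30
budget_samples = sample_rate * budget_ms // 1000
print(budget_samples)  # 1323 samples of end-to-end processing headroom
```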

dkozlosvki: It can still be very annoying. On good home cinema receivers you can usually change the audio delay for precisely this reason (but that will of course only help if it is the PICTURE processing that is slower than the sound processing).
 
The trouble with that is that you just about have to "ride" the control as the picture changes from scene to scene to commercials and back.
 
Originally posted by: gsellis
For me, it is above 2 frames. That would be about 1/15 of a second. I can notice it at 2 frames and, depending on the material, at 1 frame.

BUT, I edit video, so I may not be the norm.

It will depend on the person. Some people might not be bothered by 100ms. Others might find something slightly amiss with only 10ms.
 
In relation to what oynaz said about 10-30 ms:

Yeah, it's more like 7-35 ms.
It's called the Haas zone:
the range where the human ear cannot separate the delayed signal from the source signal.
It's commonly used to thicken things up in post production.
13-16 ms is the most unnoticeable to my ears.
 
Thanks for the replies! I think 10 ms is plenty. I should be able to do some extra stuff if I use 10 ms as my reference.
 