Digital audio, Jitter and the truth

spidey07

No Lifer
Aug 4, 2000
65,469
5
76
I frequent a lot of audio and home theater boards and hear all the time about differences in digital cables.

I for one think it is a bunch of baloney. A properly designed digital transmitter should produce very little variance in timing. Not to mention that home audio bit rates are pathetically slow (hundreds of kb/s) compared to the multigigabit rates running without error today.

Such a slow bit rate should be even less prone to jitter or clock slips than a high-speed one.
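Quick back-of-the-napkin numbers to put that in perspective (my own ballpark figures, not pulled from any spec sheet):

    # rough consumer digital audio data rates (ballpark figures, not spec-exact)
    cd_pcm        = 44_100 * 16 * 2    # 44.1 kHz, 16-bit, stereo PCM -> ~1.41 Mb/s
    dolby_digital = 448_000            # typical Dolby Digital 5.1 bitstream, ~448 kb/s
    dts           = 1_536_000          # typical full-rate DTS, ~1.5 Mb/s
    fast_ethernet = 100_000_000        # 100 Mb/s Ethernet line rate, for comparison

    for name, rate in [("CD PCM", cd_pcm), ("Dolby Digital 5.1", dolby_digital),
                       ("DTS", dts), ("100 Mb Ethernet", fast_ethernet)]:
        print(f"{name:>17}: {rate / 1e6:7.3f} Mb/s")

Even full-rate PCM is only about 1.4 Mb/s - orders of magnitude below what ordinary network gear pushes all day without a single bad bit.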

So does jitter really matter in home audio? Does the physical medium have anything to do with delivering properly clocked bits, especially over such short cables (less than 10 meters)? I mean, I run 100 meg Ethernet year-round with NO PHYSICAL ERRORS. Ethernet has a clock-syncing mechanism that starts with a pattern of alternating bits, several bytes' worth I believe:

10101010 10101010 10101010 ... - called the preamble. The receiving transceiver uses this to clock the incoming bits. Makes receiving a stream of tens of megabits/sec rather easy.
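Something like this in spirit - a toy sketch of clock recovery from the preamble, nothing like how a real PHY does it (that's a PLL in silicon), just to show the idea:

    # toy clock recovery: estimate the bit period from the alternating preamble
    def recover_bit_period(samples, threshold=0.5):
        # average spacing between level transitions = one bit time
        edges = [i for i in range(1, len(samples))
                 if (samples[i - 1] < threshold) != (samples[i] < threshold)]
        gaps = [b - a for a, b in zip(edges, edges[1:])]
        return sum(gaps) / len(gaps)

    # fake oversampled preamble: alternating 1/0 bits, each held for 8 samples
    preamble = [level for bit in [1, 0] * 16 for level in [bit] * 8]
    print(recover_bit_period(preamble))   # -> 8.0 samples per bit

Once the receiver knows the bit period from the preamble, sampling the rest of the frame in the middle of each bit cell is straightforward.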

Is there no such preamble or clock mechanism in Dolby Digital or DTS or PCM audio? If not, then WHY NOT!!!??? Heck, the phone company figured out all this stuff decades ago. I simply cannot believe that a receiver/transmitter can get so far out of sync as to actually cause a single time slip. If it can, then solve the problem with an external clock feeding all the gear involved. Or are the 500-dollar digital cables and Monster Cable just baloney?

Time to get my bread and mustard.

I'll post this link, but I haven't read any of the resources yet. Found it using Google.
http://www.nanophon.com/audio/
 

miguel

Senior member
Nov 2, 2001
621
0
0
Just some comments:

1. Ethernet will gracefully discard errored frames and retransmits are transparent. Unless you are running some kind of Layer 1 or Layer 2 analyzer year-round, it seems to me that the statement of running 100 Mb Ethernet year-round without errors is a bit exaggerated.

2. Low-quality digital cables can introduce jitter (referred to as line jitter). They can also amplify jitter caused by the interfaces. Probably best to get good cables.
 

Killbat

Diamond Member
Jan 9, 2000
6,641
1
0
I found this page all about S/PDIF; it gets into the raw signal formats. Very interesting.
http://rambl.narod.ru/info/spdif.htm

S/PDIF has similar preamble mechanisms in place; scroll down to "The Coding Format".
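Here's roughly what that coding boils down to, if I'm reading the page right - a toy biphase-mark encoder I threw together (my own sketch, not code from that site). Every bit cell starts with a transition, and a 1 gets an extra transition in the middle, so the signal carries its own clock regardless of the data; the subframe preambles deliberately break that rule so the receiver can find frame boundaries:

    # toy biphase-mark encoder (the line code S/PDIF uses)
    def biphase_mark(bits, level=0):
        out = []
        for b in bits:
            level ^= 1          # transition at the start of every bit cell
            out.append(level)   # first half-cell
            if b:
                level ^= 1      # extra mid-cell transition marks a 1
            out.append(level)   # second half-cell
        return out

    print(biphase_mark([1, 0, 1, 1, 0]))   # -> [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]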

[edit] Oh, I see you're asking about DTS. That's 'newer' than S/PDIF, right? Surely it uses comparable (hopefully even superior) techniques.
 

spidey07

No Lifer
Aug 4, 2000
65,469
5
76
Thanks for the link Killbat.

As a serious audiophile I WANT to believe there are some performance gains. But as an EE, reality sets in. I'll read up on the different encoding methods. Also, I've tested many cables on my favorite tracks and can't tell a difference.

From what I know, these are the types of consumer digital audio:
Dolby Digital 5.1
DTS
Pulse Code Modulation (PCM) - full bit rate

To Superdoopercooper - this thread is actually for you. Let's pick this one apart until there is nothing left.

<edit> -

<< 1. Ethernet will gracefully discard errored frames and retransmits are transparent. Unless you are running some kind of Layer 1 or Layer 2 analyzer year-round, it seems to me that the statement of running 100 Mb Ethernet year-round without errors is a bit exaggerated >>


I do, and still no errors. The layer 2 protocol will discard the frame based on a bad CRC check. The fact that I run ten thousand nodes with no CRC or FCS errors tells me it is error-free. That is what's so perplexing - if I can run 100 or even 1000 Mb/s over twisted-pair cable with no bit-level errors for 100 meters, then why the heck can't digital audio gear do the same at less than 1 MHz, over less than 10 meters?
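Roughly the idea in code (a toy check, obviously - the NIC does this in silicon and just bumps an error counter, which is what I watch on the switches):

    # layer-2 idea: receiver recomputes the CRC over the frame, drops it on mismatch
    import binascii

    def frame_ok(payload: bytes, received_fcs: int) -> bool:
        # binascii.crc32 uses the same CRC-32 polynomial as the Ethernet FCS
        return binascii.crc32(payload) == received_fcs

    payload = b"a chunk of audio samples on the wire"
    fcs = binascii.crc32(payload)
    print(frame_ok(payload, fcs))               # True  -> frame kept
    print(frame_ok(payload[:-1] + b"?", fcs))   # False -> frame dropped, counter bumped

The point is the error counters would show it if bits were getting mangled on the wire, and they don't.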
 

Superdoopercooper

Golden Member
Jan 15, 2001
1,252
0
0
Ok... here's my nickel before I've read any of the links or done any research. This is just a discussion starter until I feel like reading more stuff. ;)

Let's see.... let's assume that the CD player output is a perfect voltage source... and that the receiver input is a perfect infinite-impedance load... this will help for the sake of my argument (but doesn't really detract from it, either). The rising and falling edges of the CD output are 25 ns for argument's sake, and the edge placement accuracy is 0.1% (I think this would be considered jitter). Ok... then the following are properties of a given cable/transmission line, and are a function of its construction.

1) Impedance at a given frequency is always fixed (obtained from a Bode plot of the impedance model of the cable... or a frequency sweep measurement).

2) Phase response (ditto above).

3) Group delay - if I remember right, this is just the derivative of phase with respect to frequency.

4) Propagation delay is a function of frequency.

That being said, Cable 1 has an Impedance of A, Phase of B, Grp Delay of C, and Prop Delay of D. Cable 2 has these same parameters with values W, X, Y, and Z.

Now we hook these two cables up in our system. What can we disregard? In our "ideal system," or even in real life... since we are talking about such a low bit rate and a short run of cable, we can assume the impedance of cable 1 vs. cable 2 is irrelevant and can be ignored. If we were talking about frequencies in the tens of megahertz... and if the CD players/receivers had a spec for 50/75 ohm cables, then we wouldn't be able to. Also, since (3) is derived from (2), all we will focus on is (2) and (4).
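(If anybody wants to sanity-check the group-delay-is-the-derivative-of-phase point numerically, here's a throwaway script with made-up numbers - an ideal 40 ns delay line, nothing measured from a real cable:

    # group delay tau_g = -d(phase)/d(omega); for a pure delay it comes out flat
    import numpy as np

    t_d = 40e-9                               # assumed propagation delay: 40 ns
    f = np.linspace(1e5, 2e7, 1000)           # 100 kHz .. 20 MHz
    omega = 2 * np.pi * f
    phase = -omega * t_d                      # phase response of an ideal delay line

    group_delay = -np.gradient(phase, omega)  # numerical derivative
    print(group_delay.min(), group_delay.max())   # both ~4e-08 s -> flat 40 ns

A real cable's phase won't be a perfectly straight line, but that's the relationship I'm leaning on.)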

Hmmm... well... I'm already tired of typing (it is too early on a Sat morning... I need some food). I'll continue this in my next post. Start thinking about this, scrutinize my argument set-up, and be thinking of your own issues. My goal is to show that one cable will not introduce "harmful" jitter over another one, or amplify jitter as miguel said.

To Be Continued......
 

Killbat

Diamond Member
Jan 9, 2000
6,641
1
0
Oh, I thought we were talking about an optical cable, here. :eek:
Either way, any errors in digital audio should be plainly obvious, correct? As in noticeable pops or dropouts...
So long as the sound is there, it's 100% intact. Unless, of course, modern hardware interpolates when there's an error (parity check fails). MiniDisc players interpolate if bad data comes off the disc. You can't tell the difference unless a lot of data is missing, and those are entire frames of an ATRAC stream. I imagine it would be even harder (impossible?) to detect in the case we've got here, where one single sample is tossed out. It's not very hard to interpolate one sample without anyone noticing.
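Something along these lines is all I mean by interpolating one sample (trivial sketch, made-up sample values):

    # conceal one flagged-bad sample by averaging its neighbours
    def conceal_sample(samples, bad_index):
        samples[bad_index] = (samples[bad_index - 1] + samples[bad_index + 1]) // 2
        return samples

    pcm = [100, 220, 315, 9999, 298, 180]   # 9999 = the sample the parity check flagged
    print(conceal_sample(pcm, 3))           # -> [100, 220, 315, 306, 298, 180]

At 44,100 samples a second, one patched-over point like that isn't something anybody is going to hear.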
 

JesseKnows

Golden Member
Jul 7, 2000
1,980
0
76
Killbat - I believe this is exactly the point made by the proponents of expensive cables. They say you _do_ end up with interpolated values which are very close to the original, and therefore only audible on the equally expensive amplifiers they use.
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Well, why not interface the other end back to a computer, dump the stream to a file, and see if there are any errors?
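Something like this would do it (file names are just placeholders, and you'd have to line the two captures up first):

    # compare the captured stream against the source, count differing bits
    def count_bit_errors(path_a, path_b):
        with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
            a, b = fa.read(), fb.read()
        bit_errors = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
        return bit_errors, abs(len(a) - len(b))

    errors, length_diff = count_bit_errors("original.pcm", "captured.pcm")
    print(errors, "bit errors,", length_diff, "bytes of length difference")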
 

flood

Diamond Member
Oct 17, 1999
4,213
0
76
From my experience, jitter, if it does happen, is inaudible. Using a very nice DAC, amp, and headphone setup (Sennheiser HD600, McCormack), I could not hear the difference between a cheap optical cable with a cheap transport and an expensive optical cable with a fancy transport.