I frequent a lot of audio and home theater boards and hear all the time about differences in digital cables.
I for one think it is a bunch of baloney. A properly designed digital transmitter should produce very little variance in timing. Not to mention that home audio bitrates are pathetically slow (hundreds of kbps up to a few Mbps) compared to the multigigabit rates running without error today.
This rather slow bitrate should be even less prone to jitter or clock slips than a high-speed one.
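To put the scale difference in perspective, here is a back-of-the-envelope comparison using nominal, commonly published rates (the specific figures are typical settings, not guarantees for every disc or device):

```python
# Rough comparison of nominal home-audio bitrates against a 1 Gb/s
# Ethernet link. Figures are typical published values, purely to
# illustrate how slow audio streams are by networking standards.

dolby_digital = 448_000        # b/s, a common Dolby Digital rate on DVDs
dts = 1_536_000                # b/s, full-rate DTS
spdif_pcm = 2 * 32 * 48_000    # S/PDIF line rate: 2 subframes x 32 bits
                               # per frame at 48,000 frames/s = 3.072 Mb/s
gigabit = 1_000_000_000        # 1 Gb/s Ethernet

for name, bps in [("Dolby Digital", dolby_digital),
                  ("DTS", dts),
                  ("S/PDIF 48 kHz PCM", spdif_pcm)]:
    ratio = gigabit / bps
    print(f"{name}: {bps / 1e6:.3f} Mb/s ({ratio:.0f}x slower than 1 GbE)")
```

Even the fastest of these, raw PCM over S/PDIF, is hundreds of times slower than an ordinary gigabit link that runs error-free all day.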
So does jitter really matter in home audio? Does physical media have anything to do with the delivery of properly clocked bits, especially given such short cables (less than 10 meters)? I mean, I use 100 Mb Ethernet year round with NO PHYSICAL ERRORS. Ethernet has a clock-syncing mechanism: every frame starts with a preamble, seven bytes of alternating bits (10101010) followed by a start-frame delimiter (10101011). The receiving transceiver uses this to lock onto the incoming bit clock. Makes receiving a stream of tens of megabits/sec rather easy.
Is there no such preamble or clock mechanism in Dolby Digital or DTS or PCM audio? If not, then WHY NOT!!!??? Heck, the phone company figured out all this stuff decades ago. I simply cannot believe that a receiver/transmitter can get so far out of sync as to actually cause a single time slip. If it can, then solve the problem with an external clock feeding all the gear involved. Or are the $500 digital cables and Monster Cable just baloney?
Time to get my bread and mustard.
I'll post this link, but I haven't read any of the resources. Found it using Google.
http://www.nanophon.com/audio/
