Originally posted by: destrekor
Originally posted by: spidey07
Originally posted by: ethebubbeth
My question is, HOW is the output different? I realize that digital != digital in all cases. Usually that refers to DAC quality and noise within the circuit... how does this apply to transmission of the signal from the source?
If I hooked up my dvd player to my receiver via toslink or an RCA coax cable, what would be different?
EDIT: I am not looking to pick a fight, I am just looking for an explanation.
It's a whole 'nutter thread. Don't want to hijack Tech's thread.
In your example it's packetized and jitter doesn't matter. PCM is a whole other story.
Where are you going with this? If you're sending from source to receiver without decoding first (i.e. the bitstream stays in DTS or Dolby format, not PCM), where is the technical difference? Blips of electricity (simplified) represent the same thing blips of light do. Noise on the line distorts the portions that represent both 1s and 0s, so some 1s may get dropped because they land in between 1 and 0 or spike too high, and some 0s may spike too low or drift into that in-between zone, and the receiver would likely drop them because they're unreadable.
This is what we learned in our CCNA program (back when I was interested in that crap), when discussing fiber versus non-fiber connections.
Maybe I am failing to see your point, but how is the data presented any differently between the two lines (not counting PCM or other audio formats, specifically just the way the bits are transmitted)?
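To make the "noise pushes a bit into the unreadable zone" idea concrete, here's a toy sketch in Python. The voltage levels and decision thresholds are made-up numbers, not anything from the S/PDIF spec; the point is just that a receiver with fixed thresholds can't classify a sample that noise has pushed into the middle region.

```python
import random

HIGH, LOW = 1.0, 0.0    # idealized levels for a '1' and a '0' (arbitrary units)
V_HI, V_LO = 0.7, 0.3   # hypothetical receiver decision thresholds

def recover_bit(sample):
    """Classify a noisy sample; anything between the thresholds is unreadable."""
    if sample >= V_HI:
        return 1
    if sample <= V_LO:
        return 0
    return None  # indeterminate: the receiver can't tell 1 from 0

# Transmit a few bits with Gaussian noise added to each sample.
random.seed(0)
sent = [1, 0, 1, 1, 0]
noisy = [(HIGH if b else LOW) + random.gauss(0, 0.25) for b in sent]
recovered = [recover_bit(s) for s in noisy]
```

With enough noise, some entries of `recovered` come back as `None` instead of the bit that was sent, which is the "dropped 1s and 0s" scenario described above; whether the medium is copper or fiber, the receiver's job is the same threshold decision.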