Originally posted by: BrownTown
If all you are saying is that digital signals are in fact represented by real-valued voltages which can be affected in transmission, then this is certainly not a big revelation to most here. The whole point, though, is that given a decent transmission line the distortion can be completely removed. So long as the noise in the distorted signal is not large enough to completely overshadow the actual data, a PERFECT transmission of data can be achieved. With an analog signal this is not possible, because it is impossible to separate the noise from the actual signal. Of course you can try your best by modeling the channel, by measuring the distortion in known signals and the like, but information is ALWAYS lost given a non-trivial data stream.
I'm out of my element here, but I agree. It seems much easier to distort an analog signal: because the information lives in a continuous waveform, it is more susceptible to degradation than a signal that is quantized (represented by a number system like binary). If the suggestion by the OP is merely that both signals have time as a factor, well, duh.
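To make the point above concrete, here is a quick sketch (not from either poster, just an illustration) of why thresholding lets a digital receiver discard channel noise entirely. The bit pattern, voltage levels, and noise bound are all made-up assumptions; the key one is that the noise never exceeds half the spacing between levels:

```python
import random

random.seed(0)

# Hypothetical data, transmitted as 0 V / 1 V levels.
bits = [1, 0, 1, 1, 0, 0, 1, 0]

# The channel perturbs each voltage, but (by assumption) by less
# than half the 1 V level spacing.
received = [b + random.uniform(-0.4, 0.4) for b in bits]

# The receiver thresholds at the midpoint, throwing the noise away.
recovered = [1 if v > 0.5 else 0 for v in received]

assert recovered == bits  # perfect reconstruction despite the noise
```

An analog receiver has no such threshold to snap to: every noisy voltage is itself the signal, so whatever the channel adds stays in forever.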