Originally posted by: blahblah99
Each channel is allocated a frequency range in which it operates. Your receiver does nothing more than "tune" into the frequency range of interest. Everyone who subscribes to cable gets the same signal... your receiver is the one that does all the work.
Edit: For example, let's say you have 10 audio channels that you want to send out through a coax cable. In order to squeeze all 10 channels through a single wire, you need to modulate each channel onto its own carrier frequency. Because audio is band-limited to about 22 kHz, amplitude-modulating it onto a carrier produces two sidebands and occupies roughly 44 kHz of transmission bandwidth; add guard bands and call it about 100 kHz on each side of the carrier.
Let's say channel 1 gets modulated onto a 1 MHz carrier: its transmission frequency range is 900 kHz - 1.1 MHz. Channel 2 gets a 1.2 MHz carrier and operates on 1.1 MHz - 1.3 MHz... and so on.
They all get superimposed on each other and transmitted. On the receiving end, a bandpass filter removes all frequencies except the band of interest. The signal gets demodulated and the original data recovered.
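The scheme above can be sketched in a few lines of NumPy. The 1.0 MHz and 1.2 MHz carriers come from the example; the 5 kHz / 9 kHz test tones, the 8 MHz simulation sample rate, and the FFT-based low-pass filter are my own choices for the toy demo:

```python
import numpy as np

fs = 8_000_000                  # simulation sample rate (Hz), assumed
n = 8_000                       # 1 ms of signal
t = np.arange(n) / fs

ch1 = np.cos(2 * np.pi * 5_000 * t)   # "audio" on channel 1 (5 kHz tone)
ch2 = np.cos(2 * np.pi * 9_000 * t)   # "audio" on channel 2 (9 kHz tone)

# Modulate each channel onto its own carrier and superimpose on one "wire"
f1, f2 = 1_000_000, 1_200_000
line = ch1 * np.cos(2 * np.pi * f1 * t) + ch2 * np.cos(2 * np.pi * f2 * t)

# Receiver for channel 1: mix back down with the same carrier...
mixed = line * np.cos(2 * np.pi * f1 * t)

# ...then low-pass filter (here: zero every FFT bin above 100 kHz)
spec = np.fft.rfft(mixed)
freqs = np.fft.rfftfreq(n, 1 / fs)
spec[freqs > 100_000] = 0
recovered = np.fft.irfft(spec, n=n)

# The dominant component of the recovered signal is channel 1's 5 kHz
# tone; channel 2's content was filtered out along with the carriers.
peak = freqs[np.argmax(np.abs(np.fft.rfft(recovered)[1:])) + 1]
print(int(peak))   # 5000
```

A real tuner does the mixing and filtering in analog hardware rather than with FFTs, but the principle, shift the band of interest down to baseband and reject everything else, is the same.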
Sheesh, people with Ph.D.s these days...!
To defend the Ph.D.s: that wasn't the question I asked, at all. You're talking about cable TV in general. I'm talking about Digital Cable, specifically On Demand TV. According to
this link, if I'm reading it right, RG-6 coaxial cable can carry 270 Mb/s for a distance of 415 meters. Since digital cable requires 11.6 Mb/s per channel for On Demand TV, I do not see how RG-6 can carry 300 channels, let alone On Demand TV, to every house that has it.
I understand the whole concept of peak time periods and relay stations (called proxies and repeaters by internet junkies), but if you look at the number of relay stations your local cable office has, and the number of people per relay, the numbers don't add up. And you can't say they have super-fast cable now, because the cable that leads to my parents' house and their entire neighborhood was laid down over 20 years ago and hasn't been dug up since. So we're talking about 20-year-old technology delivering all this bandwidth.

Before you start talking about superimposing frequencies, look up the frequency that digital cable runs at, look at the distance the signal travels before noise degrades it, count the relays and repeaters, then run the numbers. The compression I do understand, and I don't doubt they use it, but when I watch movies on the 52" high-definition bigscreen they look like DVD quality, so they aren't using very lossy compression, if any. DVDs use compression too, but they are still 7-14 gigabytes, if not more for movies like Lord of the Rings. Run the numbers and you'll understand: there is something else.
Even with relays, lossy compression (which doesn't look lossy at all on a 52" screen in this case), time sharing, and so forth, it still doesn't make sense by the numbers that I have.
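For what it's worth, the arithmetic the post gestures at is easy to run. Both input figures (270 Mb/s link capacity, 11.6 Mb/s per channel) are the post's own numbers, not verified specs:

```python
# Figures quoted in the post above (taken at face value, not verified):
link_mbps = 270.0          # claimed RG-6 capacity at 415 m
per_channel_mbps = 11.6    # claimed bitrate per digital / On Demand channel

# How many channels fit in one serial 270 Mb/s stream?
simultaneous = link_mbps / per_channel_mbps
print(int(simultaneous))        # 23

# Capacity a 300-channel lineup would need beyond that single stream:
shortfall = 300 * per_channel_mbps - link_mbps
print(round(shortfall))         # 3210 (Mb/s)
```

That ~23-channel figure is the apparent contradiction the post is pointing at; resolving it requires noting that the coax is not one serial bitstream but hundreds of independent frequency slots, each carrying its own modulated payload.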