Originally posted by: Paperdoc
Originally posted by: soccerballtux
When we actually switch to digital I think all the antennae on the market currently will work just fine-- they'll be pumping up the signal to full broadcast power. Currently they're only running at something like 10 or 25% power.
Paper/B2BW, what sort of noise is getting picked up? Would there be any way to selectively filter out everything but the ATSC signal in the up-to-1 GHz range?
As silverpig said, multipath is a significant noise factor - or, more precisely, it was very significant for analog TV because no amp or tuner could distinguish one path from another. I have the impression that digital TV is less affected by this noise source because the digital signal processors can home in on the strongest signal and ignore the much weaker delayed "echo", but I cannot verify that. Anyone here really know about this?
The next two most significant trouble sources are random noise added to the signal in the cables and at the connections between the antenna and the amplifier or tuner, and its flip side: signal strength loss in the cable, which weakens the signal relative to any noise added later. By far the best cure for both is a better antenna that picks up a stronger signal in the first place (hence all the sophisticated antenna designs), but that may not be practical in many installations. Pickup of external noise can be reduced by using good cables with good shielding, and sometimes by surrounding open terminals with a shield. This is one key reason that people using an antenna with a balanced 300-ohm output avoid the old "flat-lead" antenna cable, even the twisted kind. Instead they put a transformer right at the antenna terminals and convert to 50-ohm or 75-ohm coaxial cable, which is well shielded. For most people it makes sense to go directly to 75 ohms, since that is the TV's input impedance.
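To put rough numbers on why loss ahead of the noise hurts (the levels below are assumed for illustration, not measured from any real installation), the arithmetic is just subtraction in dB: every dB the cable eats before the tuner adds its own noise comes straight off the signal-to-noise ratio.

```python
# Illustrative dB arithmetic; all levels are assumed, not measured.
signal_at_antenna = -60.0     # dBm, signal the antenna delivers
cable_loss = 6.0              # dB, total attenuation of the downlead
noise_floor_at_tuner = -90.0  # dBm, noise added at the tuner input

signal_at_tuner = signal_at_antenna - cable_loss
snr_with_loss = signal_at_tuner - noise_floor_at_tuner       # 24 dB
snr_without_loss = signal_at_antenna - noise_floor_at_tuner  # 30 dB
```

With these made-up figures, the 6 dB the cable loses is exactly 6 dB off the SNR at the tuner - which is why a stronger signal at the antenna, or less loss on the way down, is worth so much.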
Signal loss in the cable can be handled three ways. First is the obvious one - keep your cable runs as short as possible. Second, buy a low-loss cable, which usually costs more because it uses more expensive materials and is thicker, so it uses more of them. Those same high-quality cables usually also have better shielding against external noise. Third is one we have not discussed here yet - in-line amplifiers. These are sold in retail outlets and consist of two modules. One is the amplifier itself, a little tubular unit that inserts into the cable run. It is mounted as close to the antenna as possible, before more noise is added. At the antenna the system looks like: antenna screw terminals (300 ohm) -> transformer (75-ohm output) -> in-line amp -> cable down to your house. The second module is the power supply for the amp. It is inserted right at the end of that same cable, likely just where the cable reaches your TV or distribution amp, and it plugs into the wall. Its trick is to recognize that the signals coming down the cable are all AC - in the range of 5 to 1000 MHz - and the signal needs no DC component. So it puts small blocking capacitors in the signal line that pass all the signal but block any DC voltage, and then uses the center and shield conductors of the cable to send DC power up the cable to the antenna-mounted in-line amp.
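The reason the amp belongs at the antenna end rather than at the TV end can be made concrete with the standard Friis cascade noise-figure formula. A minimal sketch - the gain, loss, and noise-figure numbers are assumed for illustration, not taken from any particular product:

```python
import math

def db_to_lin(db):
    return 10 ** (db / 10)

def cascade_nf_db(stages):
    # Friis formula for cascaded stages: each later stage's excess
    # noise is divided by the total gain ahead of it.
    # stages is a list of (gain_dB, noise_figure_dB) tuples.
    total_f = 0.0
    running_gain = 1.0
    for gain_db, nf_db in stages:
        f = db_to_lin(nf_db)
        total_f += (f - 1.0) / running_gain if total_f else f
        running_gain *= db_to_lin(gain_db)
    return 10 * math.log10(total_f)

# Assumed parts: an amp with 20 dB gain and 3 dB noise figure,
# and a cable run with 6 dB loss (gain -6 dB; a passive lossy
# line has a noise figure equal to its loss, so NF = 6 dB).
amp = (20.0, 3.0)
cable = (-6.0, 6.0)

amp_first = cascade_nf_db([amp, cable])    # amp at the antenna
cable_first = cascade_nf_db([cable, amp])  # amp at the TV end
```

With these assumed numbers, amplifying at the antenna gives an overall noise figure near 3 dB, while amplifying after the lossy cable gives about 9 dB - the cable's 6 dB of loss ahead of the amp is charged directly against the signal-to-noise ratio, which is exactly why the little tubular amp is mounted at the mast.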
Maximizing the original signal and minimizing the added noise really are the only ways to get the best signal. Filtering is not a good option. In the analog domain, trying to build a bandpass or notch filter to remove small portions of the band containing noise is impractical, because the signals you want are spread over the entire range from 50 to 1000 MHz, with few gaps at channels you don't plan to use. Although I do not understand digital signal processing and filtering in detail, I expect the same type of limit applies there, too. The advantage of digital processing seems to be that it makes "filters" with much sharper cutoff curves possible, but you still can't pluck noise pulses out of signal pulses in the same frequency range unless there is a substantial difference in pulse magnitude - that is, unless the input already has a better signal-to-noise ratio.
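One way to see the in-band limit is with the simplest possible filter, a moving average (everything below is a contrived illustration, not a real tuner filter): it strongly attenuates a component outside its passband, but a component inside the passband - where the wanted signal also lives - sails through almost untouched.

```python
import math

def moving_average(x, L):
    # Simple FIR low-pass: average of the last L samples.
    return [sum(x[max(0, n - L + 1):n + 1]) / L for n in range(len(x))]

def peak(x):
    # Steady-state peak amplitude (skip the filter's start-up transient).
    return max(abs(v) for v in x[len(x) // 2:])

N, L = 2000, 8
in_band = [math.sin(2 * math.pi * 0.01 * n) for n in range(N)]   # slow: inside passband
out_band = [math.sin(2 * math.pi * 0.25 * n) for n in range(N)]  # fast: outside passband

in_after = peak(moving_average(in_band, L))    # barely attenuated
out_after = peak(moving_average(out_band, L))  # almost completely removed
```

The out-of-band tone is knocked down to essentially nothing while the in-band tone keeps nearly its full amplitude - so any noise sharing the signal's own frequency range would survive filtering just as well as the signal does.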
Regarding the transmitter power levels, it is common to start a transmitter at reduced output power and debug it over the start-up period, raising the output as the system is optimized. Since the cutover to all-digital OTA is done (or nearly so), I would sincerely hope that most broadcasters have finished that phase by now and are already up to maximum output.