Originally posted by: Xavier434
Originally posted by: lupi
From what I recall when others were talking about cable quality, the signal through an HDMI cable is all digital; either the 0s and 1s make it through or they don't. I doubt most people would be running lengths in their house long enough for length/gauge to matter.
I was under the impression that loss can occur in any kind of cable, whether or not the signal is digital. The only difference is the maximum length you can run of each cable type without experiencing any loss. Can anyone confirm this? I will be in the market for some HDMI cables soon myself and I would like to know.
The difference between an analog signal and a digital signal is that an analog signal can take on an infinite number of amplitudes, while a digital signal has only two nominal levels. So while a 35% error in the amplitude of an analog signal shows up as a 35% error in the output, a 35% error in a digital signal would still be recognizable as what it was intended to be (a 1 or a 0), and thus you would see no difference in the output. I assume a digital signal also has parity bits and whatnot so that if you do lose some data it can be recovered.
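To make that concrete, here's a tiny Python sketch. The numbers (a 0.5 decision threshold and a 35% amplitude loss) are made up for illustration; the point is just that thresholding snaps an attenuated digital level back to the intended bit, while the same attenuation changes an analog value directly:

def recover_digital(bit, error=0.35):
    # Transmit 0.0 or 1.0, knock it down by the error, then threshold at 0.5
    received = (1.0 if bit else 0.0) * (1 - error)
    return 1 if received > 0.5 else 0

def recover_analog(level, error=0.35):
    # An analog level just arrives 35% lower; there is nothing to snap back to
    return level * (1 - error)

bits = [1, 0, 1, 1, 0]
print([recover_digital(b) for b in bits])   # [1, 0, 1, 1, 0] -> bits unchanged
print(recover_analog(0.8))                  # 0.52 -> a visibly different output level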
Also, with an analog signal the result of signal loss or noise in the transmission is noise on the resulting image. With a digital signal I think the loss would be much more noticeable, such as the blocking and artifacts that you sometimes see with digital cable or satellite TV. So it'd be very clear if your cable is inadequate. Side note about digital TV - that's a digital signal modulated onto an analog carrier (often from an analog source, no less). It is more prone to signal interference, because it uses multiple amplitudes* to squeeze more data through the tubes. A pure digital signal doesn't work that way:
http://en.wikipedia.org/wiki/Manchester_code
(HDMI uses more complex encoding, but it's still a pure digital signal:
http://en.wikipedia.org/wiki/8B/10B_encoding
http://en.wikipedia.org/wiki/T...ifferential_Signaling)
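If it helps, here's a rough Python sketch of the Manchester scheme from that first link (using the IEEE 802.3 convention, where a 0 is a high-to-low transition and a 1 is low-to-high) - not HDMI's actual encoding, just to show what a "pure digital" line code looks like. The data rides on the direction of the mid-bit transition, not on how tall the pulse is, which is why moderate attenuation doesn't change what the receiver decodes:

def manchester_encode(bits):
    out = []
    for b in bits:
        out += [0, 1] if b else [1, 0]   # (low, high) for a 1; (high, low) for a 0
    return out

def manchester_decode(halves):
    # Read the level pairs back: low->high means 1, high->low means 0
    return [1 if halves[i] < halves[i + 1] else 0
            for i in range(0, len(halves), 2)]

bits = [1, 0, 1, 1, 0, 0, 1]
line = manchester_encode(bits)
assert manchester_decode(line) == bits
print(line)  # [0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1]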
* About those multiple amplitudes - this is all a bit fuzzy in my mind, so it may not all be entirely accurate, but it'll be close. The term "bandwidth" that we often use to mean bitrate actually refers to the width of the frequency band (in Hz) used for a transmission. For example, a POTS modem uses the frequency band of 600-3000 Hz, so it has a bandwidth of 2400 Hz. When you are modulating digital data, you can send roughly one symbol ("baud") for each 1 Hz of bandwidth. Thus the maximum symbol rate (the "baud rate", a term that was also improperly used to mean bitrate up until the 9600/14.4 modem timeframe) of a POTS modem is 2400 symbols per second, or 2400 baud.

2400 bits per second sucks, so they came up with ways of squeezing more than one bit into a baud - by using 4 amplitudes instead of 2, you can send 2 bits per baud, and so on. They used phase or amplitude to squeeze multiple bits into a baud, then they started using phase AND amplitude modulation together, and QAM was born. QAM lets you fit a lot more bits into a baud by using different combinations of amplitude and phase, and of course it is now used for digital cable TV and cable modems. The more combinations they try to fit into one signal, the closer together those combinations get, making them more susceptible to signal loss and interference than a baseband digital signal like Ethernet or USB. That's why you see artifacts more often in digital TV than when you're playing video games.
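Here's a rough Python sketch of that "more bits per baud" idea, using a generic 16-QAM constellation (a 4x4 grid of amplitude/phase combinations) - simplified numbers for illustration, not any particular cable-TV standard:

from itertools import product
from math import log2

levels = [-3, -1, 1, 3]                        # 4 amplitude levels per axis
constellation = list(product(levels, levels))  # 16 (I, Q) points

print(int(log2(len(constellation))))           # 4 bits per baud instead of 1

# The catch: normalize both schemes to a peak level of +/-1 and the 16-QAM
# points sit three times closer together per axis, so a smaller noise spike
# pushes a received symbol onto a neighboring point and corrupts the bits.
spacing_16qam = (levels[1] - levels[0]) / max(levels)   # 2/3
spacing_2level = (1 - (-1)) / 1                         # 2
print(spacing_16qam, spacing_2level)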
So yeah, I think I'd go for the 28 AWG. That's what I bought for my brother's PS3, and he hasn't complained. Honestly, I don't think Monoprice would sell the cables if they weren't good.
FWIW, Wikipedia says this about HDMI:
Cable length
The HDMI specification does not define a maximum cable length. As with all cables, signal attenuation becomes too high at a certain length. Instead, HDMI specifies a minimum performance standard. Any cable meeting that specification is compliant. Different construction quality and materials will enable cables of different lengths. In addition, higher performance requirements must be met to support video formats with higher resolutions and/or frame rates than the standard HDTV formats.
The signal attenuation and intersymbol interference caused by the cables can be compensated by using Adaptive Equalization.
HDMI 1.3 defined two categories of cables: Category 1 (standard or HDTV) and Category 2 (high-speed or greater than HDTV) to reduce the confusion about which cables support which video formats. Using 28 AWG, a cable of about 5 metres (~16 ft) can be manufactured easily and inexpensively to Category 1 specifications. Higher-quality construction (24 AWG, tighter construction tolerances, etc.) can reach lengths of 12 to 15 metres (~39 to 49 ft). In addition, active cables (fiber optic or dual Cat-5 cables instead of standard copper) can be used to extend HDMI to 100 metres or more. Some companies also offer amplifiers, equalizers and repeaters that can string several standard (non-active) HDMI cables together.
So it looks like the 28 AWG cable would not be sufficient for the higher-bandwidth applications of HDMI 1.3 at longer cable lengths - I don't know what those applications are.