Confused About Audio Codecs

Elfear

Diamond Member
May 30, 2004
7,165
824
126
I'm pretty new to the whole audio codec stuff, and after doing some research into the matter I still need some clarification. What I'm trying to do is set up my HTPC to output HD audio and to find an HT receiver that will work with it. My HTPC consists of a 2600Pro, an E2180@2.8GHz, and an LG HD-DVD/Blu-ray combo drive. From what I understand, the 2000 series ATI cards will not pass TrueHD or the DTS equivalent through an HDMI adapter, but they will pass LPCM. Doesn't the "L" in LPCM stand for lossless? If so, wouldn't LPCM be just as good as TrueHD?

If TrueHD is indeed better, what options are there to pass TrueHD from my HTPC? ATI 4000 series cards? The new Asus HDMI sound card?
 

newnameman

Platinum Member
Nov 20, 2002
2,219
0
0
The L in LPCM stands for linear, not lossless. That said, LPCM is uncompressed audio, like a WAV file, so it's just as good as Dolby TrueHD or DTS-HD MA.
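If you want to see what "linear" means concretely, here's a minimal Python sketch (the filename is just a placeholder): a WAV file is literally a small header wrapped around raw LPCM samples.

```python
import struct
import wave

# A WAV file is just a small header wrapped around raw LPCM samples:
# each sample is the signal's amplitude stored directly ("linearly"),
# with no compression applied.
with wave.open("example.wav", "rb") as wav:  # hypothetical filename
    channels = wav.getnchannels()      # e.g. 2 for stereo, 8 for 7.1
    sample_width = wav.getsampwidth()  # bytes per sample, e.g. 2 = 16-bit
    sample_rate = wav.getframerate()   # e.g. 48000 Hz
    print(channels, sample_width * 8, sample_rate)

    # One frame = one sample per channel; 16-bit samples are plain
    # little-endian signed integers, nothing more.
    frame = wav.readframes(1)
    if sample_width == 2:
        print("first frame amplitudes:", struct.unpack("<%dh" % channels, frame))
```

Decoding TrueHD or DTS-HD MA reconstructs exactly that kind of sample stream, bit for bit, which is why there's no quality difference.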
 

Muadib

Lifer
May 30, 2000
18,124
912
126
According to chizow in this thread:

The HD sound formats need to be decoded by whatever software package you use and are output via analog through your sound card.

ATI 4000 series have their own onboard audio codec that does output 8ch LPCM over HDMI. Nvidia's current HDMI solution only allows for s/pdif passthrough of standard AC3/DD/DTS formats.

Around September you'll start seeing new sound cards that will do uncompressed/lossless bitstreams over HDMI as well as 8ch LPCM. Expect them to run ~$200 (Asus Xonar 1.3 and Auzentech Prelude Home Theater).
 

erwos

Diamond Member
Apr 7, 2005
4,778
0
76
Originally posted by: newnameman
The L in LPCM stands for linear...that said, LPCM is uncompressed audio, like a WAV file, so it's just as good as Dolby TrueHD or DTS-HD MA.
But limiting the issue to LPCM, TrueHD, and DTS-HD MA is also misleading, because there are also DTS-HD HR and DD+ to consider. If your receiver does a better job of decoding these than your computer (which is a real possibility), you'd want to be able to bitstream them, not just decode to LPCM.
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Hmm. So would an ATI 4000 series card be the best bet? It will output 8-channel LPCM, but is that necessarily the best option? All the receivers I'm looking at should decode all the newest HD formats, but I'd need a video card that will output the HD audio bitstream, right? Will the ATI 2000 series do that over HDMI?
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Originally posted by: Slick5150
No, nothing currently bitstreams HD audio on the PC.

So what is the difference in sound quality between LPCM and TrueHD/DTS-HD MA?
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Originally posted by: Slick5150
Originally posted by: Elfear
Originally posted by: Slick5150
No, nothing currently bitstreams HD audio on the PC.

So what is the difference in sound quality between LPCM and TrueHD/DTS-HD MA?

None

Well, that's good. So what's all the fuss about waiting for the Asus and Auzentech sound cards if the ATI 4000 series cards will already produce the same quality sound at a lower price?
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
The argument for bitstream over LPCM is that bitstream means decoding at the end of the chain, right before the signal is amplified and sent to your speakers, whereas LPCM has to deal with both magnetic interference from equipment the transfer cable passes by and EMI from the source (and the computer is a powerful source of EMI).

This is why I'm not real big on "high end" analog sound cards. Whether you have a "midrange" or a "high end" card, the lowest common denominator is the PC itself, a massive source of various distortions.

That said, I usually go with LPCM for movies; it's hard to discern minor differences in "quality" when the majority of the output is dialogue and explosions.

For music, I use bitstream any day.
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Originally posted by: Astrallite
The argument for bitstream over LPCM is that bitstream means decoding at the end of the chain, right before the signal is amplified and sent to your speakers, whereas LPCM has to deal with both magnetic interference from equipment the transfer cable passes by and EMI from the source (and the computer is a powerful source of EMI).

This is why I'm not real big on "high end" analog sound cards. Whether you have a "midrange" or a "high end" card, the lowest common denominator is the PC itself, a massive source of various distortions.

That said, I usually go with LPCM for movies; it's hard to discern minor differences in "quality" when the majority of the output is dialogue and explosions.

For music, I use bitstream any day.

Ah, that makes sense. My HT is mostly for movies anyway, so maybe I won't get overly worried about bitstreaming from my HTPC.

Another question: are there different levels of LPCM? I've read about downsampling to a lower sample rate, and that it may affect sound quality. Does that happen with LPCM?
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
No, disregard my entire post, LOL.

I didn't do my homework.

Bitstream just means formats like Dolby Digital, DTS, etc., remain in their "zipped" form, and the receiver at the end of the chain "unzips" them.

Linear PCM means the DTS/Dolby Digital is already pre-"unzipped."

Now, I'm not an expert, but there shouldn't be an audible difference. Since both formats are digital, any effect from electromagnetic interference would affect both of them equally. And digital signals are, for the most part, unaffected by EMI anyway.
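Here's a quick Python sketch of the "zipped"/"unzipped" point, with zlib standing in for the real lossless codecs (it's obviously not the actual TrueHD/DTS-HD MA algorithm, just an analogy):

```python
import zlib

# Pretend these bytes are a chunk of decoded LPCM audio.
pcm = bytes(range(256)) * 64  # 16 KiB of stand-in "audio"

# "Bitstream" path: the compressed data travels over the cable,
# and the receiver does the unzipping at the far end.
unzipped_at_receiver = zlib.decompress(zlib.compress(pcm))

# "LPCM" path: the PC unzips first and sends the raw samples.
unzipped_at_pc = pcm

# Lossless means both paths end in bit-identical samples.
assert unzipped_at_receiver == unzipped_at_pc
print("identical after decode:", unzipped_at_receiver == unzipped_at_pc)
```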

Coaxial and optical digital both have bandwidth limitations, which is why output was limited to either 2-channel PCM or a compressed format that would fit within the PCM bandwidth. With HDMI you don't need to compress anything, given the massive increase in bandwidth.
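If you want to check the math yourself, here's a back-of-the-envelope Python sketch (the S/PDIF figure assumes the common 2-channel/16-bit/48kHz consumer frame):

```python
# Rough bandwidth math behind the S/PDIF vs. HDMI limitation.
def lpcm_mbps(channels, bits, rate_hz):
    return channels * bits * rate_hz / 1e6

# S/PDIF carries a 2-channel PCM frame; compressed DD/DTS must fit inside it.
spdif = lpcm_mbps(2, 16, 48000)    # ~1.5 Mbit/s
# 8 channels of 24-bit/96kHz LPCM, typical of HD movie soundtracks.
hd_lpcm = lpcm_mbps(8, 24, 96000)  # ~18.4 Mbit/s

print("2ch/16-bit/48kHz over S/PDIF: %.2f Mbit/s" % spdif)
print("8ch/24-bit/96kHz LPCM:        %.2f Mbit/s" % hd_lpcm)
# HDMI's audio bandwidth (tens of Mbit/s) covers the HD figure easily,
# which is why multichannel LPCM needs HDMI rather than S/PDIF.
```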

Lol, so from a practical standpoint, it's entirely up to you whether you bitstream or do linear PCM. I usually do linear PCM simply because it's hassle-free: everything has been pre-decoded already, so you don't have to wonder whether you need to do any juggling on the receiver end to make sure you're getting the right output.
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Originally posted by: Astrallite
No, disregard my entire post, LOL.

I didn't do my homework.

Bitstream just means formats like Dolby Digital, DTS, etc., remain in their "zipped" form, and the receiver at the end of the chain "unzips" them.

Linear PCM means the DTS/Dolby Digital is already pre-"unzipped."

Now, I'm not an expert, but there shouldn't be an audible difference. Since both formats are digital, any effect from electromagnetic interference would affect both of them equally. And digital signals are, for the most part, unaffected by EMI anyway.

Coaxial and optical digital both have bandwidth limitations, which is why output was limited to either 2-channel PCM or a compressed format that would fit within the PCM bandwidth. With HDMI you don't need to compress anything, given the massive increase in bandwidth.

Lol, so from a practical standpoint, it's entirely up to you whether you bitstream or do linear PCM. I usually do linear PCM simply because it's hassle-free: everything has been pre-decoded already, so you don't have to wonder whether you need to do any juggling on the receiver end to make sure you're getting the right output.

Thanks for clearing that up. So would the cheapest route to HD audio be to wait for the ATI 4400 or 4600 series cards to come out? Would there be any benefit to forking over the dough for one of the new HDMI sound cards when they come out and pairing it with my 2600Pro?