
What is the highest quality Audio format for HD movies

Using a PC eliminates most of this mess for me. The audio is slightly lower quality (not the full resolution, but still much, much better than anything on a normal DVD).
 
Yes, the Dolby TrueHD is causing an awful lot of confusion for lots of people (including me).

Basically, to get the benefits of TrueHD, you can do one of the following:
1. Have the player decode it and output it directly as analog.
2. Pass the bitstream to a receiver that can decode TrueHD.
3. Decode to PCM and pass via HDMI 1.1 or higher.

For those with older receivers that have HDMI 1.1, you need a player that can convert TrueHD to PCM.
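To put rough numbers on option 3, here's a quick sketch of the raw LPCM bandwidth a receiver sees after the player decodes TrueHD to PCM. The track parameters below are typical examples, not figures from any specific disc:

```python
# Raw LPCM bandwidth after a player decodes TrueHD to PCM over HDMI.
# Channel/sample-rate/bit-depth combos are typical examples; discs vary.

def lpcm_bitrate_mbps(channels, sample_rate_hz, bit_depth):
    """Uncompressed PCM bitrate in megabits per second."""
    return channels * sample_rate_hz * bit_depth / 1_000_000

# A common configuration: 5.1 channels, 48 kHz, 24-bit
print(lpcm_bitrate_mbps(6, 48_000, 24))   # 6.912 Mbps

# A high-end configuration: 7.1 channels, 96 kHz, 24-bit
print(lpcm_bitrate_mbps(8, 96_000, 24))   # 18.432 Mbps
```

Either way, the decoded PCM stream is only a couple of dozen megabits at most, which is why passing it as PCM over an older HDMI link works at all.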
 
Originally posted by: destrekor
Originally posted by: cmdrdredd
Originally posted by: zinfamous
Originally posted by: cmdrdredd
So Uncompressed PCM and TrueHD are the same bitrate etc?

bitrate is FUD. very similar to the marketing FUD that has led many to believe that MP is the most important aspect of quality in a digital camera.

From what I understand, there is very little reason to even look at bitrate. All codecs and cables are capable of achieving any bitrate that will be necessary for the foreseeable future.

The higher the bitrate, the MORE data is being processed at one time, so more of the original audio is preserved. That's why DTS 5.1 IS better than Dolby Digital 5.1: it has more bandwidth.

that's NOT guaranteed to be true. the thing is, DTS and Dolby use two completely different compression techniques, and Dolby has the more efficient compression algorithm. A DTS track was larger partly because its compression was less efficient, but also because it genuinely set out to present superior audio to the listener.
however, while in this particular case the claim happens to be true, a higher bitrate isn't guaranteed to mean better audio: compression algorithms differ between companies, so one codec may actually produce a higher-quality track while having a lower bitrate.

yeah, this is how i understand it. if one codec isn't designed around a higher bitrate, that doesn't matter compared to a codec that does use a higher bitrate, so long as the first codec is far more efficient at what it does, with no appreciable loss of quality--and this is what happens.
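As a rough illustration of the point above, here's a quick comparison of the commonly cited maximum bitrates for the two codecs against the uncompressed 5.1 PCM they encode. The caps are the usual published maximums, not guarantees about any particular disc:

```python
# Compression ratio relative to uncompressed 5.1/48 kHz/24-bit PCM.
# Codec caps are the commonly cited maximums (e.g. 640 kbps for
# Dolby Digital on Blu-ray, ~1509 kbps for full-rate DTS), not
# figures from any specific title.

pcm_kbps = 6 * 48_000 * 24 / 1000  # 6912 kbps of raw 5.1 PCM

codec_caps_kbps = {
    "Dolby Digital 5.1": 640,
    "DTS 5.1 (full-rate core)": 1509,
}

for name, kbps in codec_caps_kbps.items():
    print(f"{name}: {kbps} kbps, ~{pcm_kbps / kbps:.1f}:1 compression")
```

DTS throws away far less data (~4.6:1 vs ~10.8:1), but a more efficient algorithm can close part of that gap, which is exactly why the raw bitrate number alone doesn't settle which track sounds better.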

I trust that most people on these forums don't trust such pallid terms as "DPI" when buying a printer (define a "dot," and tell me how it is significant when one company defines a dot differently than another company, and then uses those numbers to compare against the competition's--completely invalid marketing BS), or MP when buying a digital camera (is it true that every 7.1 MP camera is better than a 4 MP one? No way in hell).

It's already become quite apparent that several HD movie transfers using MPEG-2 are at least as good as, and sometimes better than, some compressed with MPEG-4 or VC-1. You'll find abysmal transfers using every codec, just as you'll find superb ones. I believe the same is true for audio bitrates: the number isn't comparable across codecs when each codec is designed to work efficiently at a specific bitrate. Indeed, the GOAL of a talented A/V compressionist is to achieve the highest possible quality with the smallest footprint. I recently read an article explaining this, but I haven't been able to find it....
 