HDMI and Optical

Chess

Golden Member
Mar 5, 2001
1,452
7
81
How many of you guys run optical versus HDMI for audio to your AVR?

I am about to purchase all my goodies, and am determining which components I should run over optical audio rather than HDMI.

Just curious if some of you guys only do this for your Blu-ray or TV, or what?
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
I do component/optical only for my cable box (U-verse), to avoid compatibility issues. Everything else is all HDMI (Blu-ray & HD-DVD player, PS3, and 360).
 

alfa147x

Lifer
Jul 14, 2005
29,307
106
106
Everything is HDMI. My parents have an older receiver that can't do audio via HDMI, so optical/coaxial is needed. It just adds more to the wire mess.
 

brownstone

Golden Member
Oct 18, 2008
1,340
33
91
I've been running optical for my sound in order to avoid handshake issues when I turn on or off my tv when listening to music. I find that if I'm running sound through hdmi, I turn off my tv and the music will cut out, then come back on, cut out then come back on (or not). By running optical I can turn off/on my tv with music playing and there are no issues. It could be that there is another way to avoid this, but I don't know what it is.
 

alfa147x

Lifer
Jul 14, 2005
29,307
106
106
I've been running optical for my sound in order to avoid handshake issues when I turn on or off my tv when listening to music. I find that if I'm running sound through hdmi, I turn off my tv and the music will cut out, then come back on, cut out then come back on (or not). By running optical I can turn off/on my tv with music playing and there are no issues. It could be that there is another way to avoid this, but I don't know what it is.

Good point. It's always irritated me.
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
I've been running optical for my sound in order to avoid handshake issues when I turn on or off my tv when listening to music. I find that if I'm running sound through hdmi, I turn off my tv and the music will cut out, then come back on, cut out then come back on (or not). By running optical I can turn off/on my tv with music playing and there are no issues. It could be that there is another way to avoid this, but I don't know what it is.

Sounds like an HDMI control (CEC) issue. If you disable HDMI control, the sound shouldn't cut out on you when you turn your TV off.
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
How would you diagnose this?

To see if HDMI control is turned on? I would go through the TV's and receiver's setup menus; you should see an entry about HDMI control. It also depends how old your equipment is. Just because you have an HDMI-capable system doesn't mean it supports CEC.
 

frowertr

Golden Member
Apr 17, 2010
1,372
41
91
HDMI only for me.

Really, if you are playing Blu-rays and you have an AVR that can decode DTS-HD MA/Dolby TrueHD, then all you want to run is HDMI. You can't use the lossless codecs over optical. Why go back to lossy DTS/DD when Blu-ray offers better options?

I, too, disabled HDMI control on my Panasonic Plasma. I was having handshake issues, but disabling it solved all my problems.
 

kornphlake

Golden Member
Dec 30, 2003
1,567
9
81
I use optical because I haven't bought an HDMI receiver; honestly, I don't have any reason to at this point. The difference between DTS and DTS-HD MA is negligible with the setup I have, which is only 2.1 channels.

The only way to get multichannel audio from Netflix or other streaming services is to use HDMI; the optical output either won't work or is downmixed to 2 channels.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
HDMI only for me.

Really, if you are playing Blu-rays and you have an AVR that can decode DTS-HD MA/Dolby TrueHD, then all you want to run is HDMI. You can't use the lossless codecs over optical. Why go back to lossy DTS/DD when Blu-ray offers better options?

I, too, disabled HDMI control on my Panasonic Plasma. I was having problems with handshaking issues but by disabling this all my problems were solved.
Doesn't 2-channel use uncompressed PCM? Asking because I've never watched a Blu-ray movie (other than seeing bits and pieces without the audio, or not caring enough to notice a difference).

Blu-ray sucks anyway, though, because the picture is lossy. Few things in life infuriate me more than seeing higher and higher resolutions while the compression is still lossy.
 

NutBucket

Lifer
Aug 30, 2000
27,181
649
126
I run HDMI, but since my fat PS3 can't bitstream, the audio is still PCM. That said, Toslink can only carry 2-channel PCM.

I'd still like to know where to get lossless 1080p content, as I keep seeing you post about how lossy BR is....
 

frowertr

Golden Member
Apr 17, 2010
1,372
41
91
Doesn't 2 channel use uncompressed PCM? Asking because I've never watched a Bluray movie (other than seeing bits and pieces without the audio or not caring enough to notice a difference).

Bluray sucks anyway though because the picture is lossy. Few things in life infuriate me more than seeing higher and higher resolutions while the compression is still lossy.

Huh? :confused:

I think we are talking apples and oranges. If you are talking about CDs, they are typically sampled at 44.1kHz with a 16-bit depth and encoded as LPCM, which is lossless (you are correct). If you do the math, that breaks down to a bit rate of ~1,400kbps. Toslink and S/PDIF cables can carry this with no problem. But keep in mind this is two channels ONLY. Optical can only carry two channels of lossless audio. If you want to carry multichannel audio (i.e. 5.1+ surround sound), you have to step down to DTS or DD, which are lossy codecs.
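The CD math above checks out; a quick back-of-envelope sketch (assuming the post's figures of 44.1kHz, 16 bits, 2 channels):

```python
# Sanity check of the ~1,400kbps CD bitrate figure
# (44.1kHz sample rate, 16-bit depth, 2 channels, per the post)
sample_rate_hz = 44_100
bit_depth_bits = 16
channels = 2

bitrate_kbps = sample_rate_hz * bit_depth_bits * channels / 1000
print(bitrate_kbps)  # 1411.2 -> "~1,400kbps", well within what Toslink/S/PDIF can carry
```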

But we are talking about movies, after all, and not audio CDs. At least I think that is what we are doing, per the OP's original post. The only type of cable that will carry multichannel LOSSLESS codecs at this point in time is HDMI. The lossless codecs for surround sound/home theater use are DTS-HD MA and Dolby TrueHD.

I am not sure what you are talking about when you say 1080p is lossy. Lossy compared to what? It is the highest we have, if you don't count 4K displays, and those are years away from being mainstream...

You say you have never watched a Blu-ray movie? That is a shame!! You need to find a friend that has a calibrated 1080p TV. If I sat you down in front of my 1080p Panasonic Plasma and put in the new Tron on Blu-ray, I'd have a hard time believing you would say the picture looks "lossy". It is crisp and clear, and worlds better than the 480p CRT dinosaurs we had just 5-10 years ago. If you are judging Blu-ray by looking at TVs in box stores, then you have to stop that. Those TVs are never calibrated, and the lighting in box stores is terrible.
 

slashbinslashbash

Golden Member
Feb 29, 2004
1,945
8
81
Bluray sucks anyway though because the picture is lossy. Few things in life infuriate me more than seeing higher and higher resolutions while the compression is still lossy.

LOL. What consumer-level format provides non-lossy video? If you are rendering Pixar movies in real time on your GPU, OK, maybe that counts. Or if you're buying surplus film stock from studios and running it through your film projector.... but other than that, I can't think of a totally lossless consumer video format. Storing every frame uncompressed would just be.... nuts. A TIFF at 1920x1080x24-bit is about 5.9MB. At 30fps, that's 178MB per second of video. One minute of video is ~10GB. A 2-hour movie would fill up 1.2TB.
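The storage arithmetic above can be reproduced in a few lines (binary units, 24-bit RGB at 30fps, as assumed in the post):

```python
# Back-of-envelope check of the uncompressed 1080p storage numbers above
frame_bytes = 1920 * 1080 * 3          # one 24-bit RGB frame, 3 bytes per pixel
per_sec = frame_bytes * 30             # 30 frames per second
per_min = per_sec * 60
two_hours = per_min * 120

print(round(frame_bytes / 2**20, 1))   # ~5.9 MiB per frame
print(round(per_sec / 2**20))          # ~178 MiB per second
print(round(per_min / 2**30, 1))       # ~10.4 GiB per minute
print(round(two_hours / 2**40, 2))     # ~1.22 TiB for a 2-hour movie
```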
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
LOL. What consumer-level format provides non-lossy video? If you are rendering Pixar movies in real-time on your GPU, ok, maybe that counts. Or if you're buying surplus film stock from studios and running it through your film projector.... but other than that, I can't think of a totally loss-less video format. Running every frame, uncompressed, would just be.... nuts. A TIFF at 1080x1920x24bit is 5.9MB in size. At 30fps, that's 178MB per second of video. 1 minute of video will be ~10GB. A 2-hour movie will fill up 1.2TB.

You can do lossless compression; the codecs exist for editors. But the savings still won't be great, and the decoding power required is about double that of H.264. Thirty minutes of video is around 28GB.
 

thomsbrain

Lifer
Dec 4, 2001
18,148
1
0
Bluray sucks anyway though because the picture is lossy. Few things in life infuriate me more than seeing higher and higher resolutions while the compression is still lossy.

Yes, Blu-ray is total crap compared to all those countless other completely lossless consumer video options we have.

:rolleyes:
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
I'd still like to know where to get loss-less 1080P content as I keep seeing you post about how lossy BR is....
I'm just saying that the resolution doesn't need to be that high. A lossless compression format would still give some size reduction, and it would take away the artifacts.
LOL. What consumer-level format provides non-lossy video? If you are rendering Pixar movies in real-time on your GPU, ok, maybe that counts. Or if you're buying surplus film stock from studios and running it through your film projector.... but other than that, I can't think of a totally loss-less video format. Running every frame, uncompressed, would just be.... nuts. A TIFF at 1080x1920x24bit is 5.9MB in size. At 30fps, that's 178MB per second of video. 1 minute of video will be ~10GB. A 2-hour movie will fill up 1.2TB.
Well then, don't use as high a resolution. 1600x900 provides more than sufficient detail, and lossless video compression exists, so a 2-hour movie doesn't have to be anywhere close to 1.2TB.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
I am not sure what you are talking about when you say 1080p is lossy. Lossy compared to what? It is the highest we have if you don't count 4k level displays and those are years away from being mainstream...

You say you have never watched a Blu-ray movie? You must as that is a shame!! You need to find a friend that has a 1080p TV that is calibrated. If I sit you down in front of my 1080p Panasonic Plasma and put in the new Tron in Blu-ray I have a hard time thinking you would say the picture looks "lossy". It is crisp and clear and worlds better than the CRT 480p dinosaurs we had just 5-10 years ago. If you are judging Blu-ray by looking at TV's in box stores than you have to stop that. Those TV's are never calibrated and the lighting in box stores is terrible.
What I was saying was that the higher the resolution, the more space it takes up. I agree that 1920x1080 RGB8 lossless would be too much space, but the resolution doesn't have to be that high. It's a trade-off between detail and artifacts.

As for the audio part, I knew that. What I was asking was... "do any Blu-ray movies have lossless 2-channel audio in addition to DTS/DD downmixed to 2 channels, or do they only have DD/DTS downmixed to 2 channels?"

For the record, I prefer lossless 2-channel over DD 5.1. I'd even prefer lossless 16-bit/44.1kHz over lossy 24-bit/192kHz. I can see why Blu-rays wouldn't have lossless 2-channel, though, because 2-channel DD would be 256kbps versus a lot more than that for movie audio encoded as lossless 2-channel.
 

chrisheinonen

Junior Member
Feb 6, 2012
22
0
0
Since no one mentioned it, both Dolby TrueHD and DTS-HD Master Audio have provisions in them to downmix a 7.1/5.1 track to 5.1 or stereo, so you can get a lossless, stereo track from a 7.1 or 5.1 mix on a Blu-ray disc. You don't need a separate audio track as it's built into the codec for that reason.

Second, going from 1920x1080 to 1600x900 only loses about 30% of the resolution, and would effectively do nothing to allow a lossless image to be presented. Going with Tree of Life (the best-looking Blu-ray disc I saw last year), it has an average video bitrate of 33.7 Mb/sec, which is effectively 35:1 compression compared to a full, lossless RGB 1080p image. Going down to 1600x900 means you only need a compression ratio of 24:1 to get down to this. Of course, this is for a 2-hour movie with no extras on the disc, and for longer movies it would need to be greater. Now, there are some other things to think about:

- Since Tree of Life is 1.85:1, and other films range from 1.33:1 up to 2.55:1 or anything in between, you often have some sort of letterboxing or windowboxing on the image. This is just black, so it compresses amazingly well, and you save space there. So maybe you only need 20:1 or 15:1 to get this.

- Right now, movies are encoded at 4:2:0, so you have full luma detail and a quarter of the chroma detail. It sucks that you are missing chroma information, but it's also incredibly hard to see when there is any sort of motion, or on anything other than a test pattern designed to show off the missing information. This saves you half the bitrate, at the expense of almost nothing.

- Of course, then you get into MPEG-4 AVC/VC-1 compression and the artifacts those introduce. It is possible to have compression that saves space without any loss in quality by bringing over data from the previous frame, since with 24p content (all this math is based on 24p, as that is what most Blu-ray content is) there is going to be a lot of repeated information between frames. This alone isn't going to get you there, but it's going to get you closer than before.

- Finally, you have compression that, yes, will add noise, or a loss of detail, or block artifacts, or a combination of everything listed. Done well, you'll be very, very hard-pressed to notice this on a Blu-ray disc. For the most part, a lot of this will even be masked by your display, where its inability to fully reproduce motion resolution at 24p is going to cause you to miss details more than the compression on the disc is. I'd argue that your display is holding back the content encoded on the disc more than the disc itself is. Most LCDs are pretty lousy with motion resolution (perhaps 400-600 lines, certainly not 1080), and plasmas are better, but they have their own issues as well. Unless you have something like an OLED, which has far better motion, or a prototype CrystalLED set from Sony, most of those compression artifacts are going to be invisible since the display can't fully render them anyway.

It would be wonderful to have a playback medium that can do full 8-bit or 10-bit per channel color with lossless compression that results in no artifacts, but going to 1600x900 isn't going to make a realistic difference in achieving that, and the difference between uncompressed and compressed is very, very hard to notice without true reference-quality gear.
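The compression ratios quoted earlier in this post can be sanity-checked in a few lines (24fps and 24-bit RGB assumed; the 33.7 Mb/sec average bitrate is the figure given for Tree of Life above):

```python
# Rough check of the ~35:1 and ~24:1 compression ratios quoted above
# Assumes 24 fps and 24 bits per pixel (lossless RGB), as in the post.
def lossless_mbps(width: int, height: int, fps: int = 24, bpp: int = 24) -> float:
    """Bitrate of uncompressed video, in megabits per second."""
    return width * height * bpp * fps / 1e6

avg_disc_mbps = 33.7  # Tree of Life average video bitrate, per the post

print(round(lossless_mbps(1920, 1080) / avg_disc_mbps, 1))  # ~35.4 -> "35:1"
print(round(lossless_mbps(1600, 900) / avg_disc_mbps, 1))   # ~24.6 -> "24:1"
```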
 

NutBucket

Lifer
Aug 30, 2000
27,181
649
126
Wow. Hell of a post!

Since it's late, I'll just ask instead of looking it up myself: is motion resolution mainly an issue with flat-panel technologies, as opposed to DLP?
 

Howard

Lifer
Oct 14, 1999
47,986
11
81
I've been running optical for my sound in order to avoid handshake issues when I turn on or off my tv when listening to music. I find that if I'm running sound through hdmi, I turn off my tv and the music will cut out, then come back on, cut out then come back on (or not). By running optical I can turn off/on my tv with music playing and there are no issues. It could be that there is another way to avoid this, but I don't know what it is.
Well, that's one thing to do. I also suffer from losing audio when the TV is turned off.
 

chrisheinonen

Junior Member
Feb 6, 2012
22
0
0
Wow. Hell of a post!

Since its late I'll just ask instead of looking myself but is motion resolution mainly an issue with flat panel technologies as opposed to DLP technology?

I haven't tested a DLP display for motion resolution, but I would imagine it's more likely to suffer from issues than other display technologies, just due to the nature of how it works (each color rendered sequentially rather than simultaneously). Since it relies on the slight persistence of colors, it wouldn't be able to change as quickly, and you'd likely get overlap in a pixel as it changes color.

CRTs have far better motion resolution, as the colors all decay at the same relative speed and aren't layered as they are on a DLP, and really everything is trying to get back to that. OLED is very good at this, and will make a huge difference once it's available for the home.