Which is better? Using HDMI or Component cables?


YOyoYOhowsDAjello

Moderator, A/V & Home Theater, Elite member
Aug 6, 2001
31,204
45
91
Originally posted by: JasonCoder
Doesn't HDMI carry both the video and audio over the one cable? So if I were in the OP's boat (which I will be soon), would I have to run the audio back out of my TV to my receiver?

Or run a different cable straight from the source to the receiver
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Originally posted by: JasonCoder
Doesn't HDMI carry both the video and audio over the one cable? So if I were in the OP's boat (which I will be soon), would I have to run the audio back out of my TV to my receiver?

Devices usually let you send audio through other means and disable audio transmission over the HDMI port. For example, use HDMI strictly for video and, out of the device in question (DVD player, or what have you), run an optical connection to the receiver for audio.
Inserting a middleman in the audio stream could be worse if that device adds any electronic noise. Sure, it's all optical, but would the TV pass the signal straight through as optical, or would it handle it electrically first? But never mind, since the audio over HDMI isn't as "pure" as digital optical anyway (it's electrical, though still a digital signal), so there would still be a conversion step in the TV to turn it into optical.
 

spidey07

No Lifer
Aug 4, 2000
65,469
5
76
Originally posted by: destrekor
Originally posted by: JasonCoder
Doesn't HDMI carry both the video and audio over the one cable? So if I were in the OP's boat (which I will be soon), would I have to run the audio back out of my TV to my receiver?

Devices usually let you send audio through other means and disable audio transmission over the HDMI port. For example, use HDMI strictly for video and, out of the device in question (DVD player, or what have you), run an optical connection to the receiver for audio.
Inserting a middleman in the audio stream could be worse if that device adds any electronic noise. Sure, it's all optical, but would the TV pass the signal straight through as optical, or would it handle it electrically first? But never mind, since the audio over HDMI isn't as "pure" as digital optical anyway (it's electrical, though still a digital signal), so there would still be a conversion step in the TV to turn it into optical.

don't use optical for audio.

Use coax.

It really does sound a lot better for digital audio.
 

ethebubbeth

Golden Member
May 2, 2003
1,740
5
91
Originally posted by: spidey07
don't use optical for audio.

Use coax.

It really does sound a lot better for digital audio.

sooooooooooo the bits sent by optical are different than the bits sent by coax :confused:?

Unless the output device processes the outputs differently or the target converts them to analog differently, shouldn't there be no difference, provided there's no outside interference?
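For what it's worth, here is a minimal sketch of why "the bits are the bits" on either transport: S/PDIF uses the same biphase-mark line coding on coax and on Toslink, and the optical transmitter just turns that electrical waveform into light. The encoder below is purely illustrative.

```python
# Minimal biphase-mark (BMC) encoder, for illustration only.
# S/PDIF carries the same BMC bitstream on coax and on Toslink; the optical
# transmitter simply modulates an LED with the equivalent waveform, so the
# bits themselves are identical on both transports.
def bmc_encode(bits, level=0):
    """Return the half-bit-cell levels for a biphase-mark-coded bit sequence."""
    out = []
    for bit in bits:
        level ^= 1            # transition at the start of every bit cell
        out.append(level)
        if bit:               # a '1' gets an extra transition mid-cell
            level ^= 1
        out.append(level)
    return out

print(bmc_encode([1, 0, 1, 1, 0]))  # half-cell levels; the receiver decodes by counting transitions
```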
 

spidey07

No Lifer
Aug 4, 2000
65,469
5
76
Originally posted by: ethebubbeth
Originally posted by: spidey07
don't use optical for audio.

Use coax.

It really does sound a lot better for digital audio.

sooooooooooo the bits sent by optical are different than the bits sent by coax :confused:?

Unless the output device processes the outputs differently or the target converts them to analog differently, shouldn't there be no difference, provided there's no outside interference?

GAAAAAAAAAAAAAAAA

Stop this "digital is just digital" nonsense. Start another thread or just listen for yourself.

To answer your question, yes, very different.
 

ethebubbeth

Golden Member
May 2, 2003
1,740
5
91
Originally posted by: spidey07
Originally posted by: ethebubbeth
Originally posted by: spidey07
don't use optical for audio.

Use coax.

It really does sound a lot better for digital audio.

sooooooooooo the bits sent by optical are different than the bits sent by coax :confused:?

Unless the output device processes the outputs differently or the target converts them to analog differently, shouldn't there be no difference, provided there's no outside interference?

GAAAAAAAAAAAAAAAA

Stop this "digital is just digital" nonsense. Start another thread or just listen for yourself.

To answer your question, yes, very different.

My question is, HOW is the output different? I realize that digital != digital in all cases. Usually that refers to DAC quality and noise within the circuit... how does this apply to transmission of the signal from the source?

If I hooked up my dvd player to my receiver via toslink or an RCA coax cable, what would be different?

EDIT: I am not looking to pick a fight, I am just looking for an explanation.
 

spidey07

No Lifer
Aug 4, 2000
65,469
5
76
Originally posted by: ethebubbeth
My question is, HOW is the output different? I realize that digital != digital in all cases. Usually that refers to DAC quality and noise within the circuit... how does this apply to transmission of the signal from the source?

If I hooked up my dvd player to my receiver via toslink or an RCA coax cable, what would be different?

EDIT: I am not looking to pick a fight, I am just looking for an explanation.

It's a whole 'nother thread. Don't want to hijack Tech's thread.

In your example it's packetized and jitter doesn't matter. PCM is a whole other story.
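Roughly, the jitter argument usually runs like this, sketched below with made-up numbers (the 2 ns RMS figure and the "perfect re-clocking" case are illustrative assumptions, not measurements): a DAC slaved directly to the clock recovered from the S/PDIF stream inherits interface jitter as timing error, while a receiver that buffers and re-clocks, as it must when decoding a DD/DTS bitstream, can largely reject it.

```python
# Hedged sketch of the jitter argument (all numbers are illustrative).
import random
import statistics

random.seed(1)
fs = 48_000                                   # sample rate, Hz
ideal = [n / fs for n in range(1000)]         # ideal conversion instants
jitter_rms = 2e-9                             # assume 2 ns RMS interface jitter
recovered = [t + random.gauss(0, jitter_rms) for t in ideal]

slaved = [abs(r - i) for r, i in zip(recovered, ideal)]   # DAC follows the recovered clock
reclocked = [0.0] * len(ideal)                            # buffered + local crystal (idealized)

print(f"slaved DAC timing error   ~{statistics.mean(slaved) * 1e9:.2f} ns")
print(f"re-clocked timing error   ~{statistics.mean(reclocked) * 1e9:.2f} ns")
```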
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
Originally posted by: spidey07

GAAAAAAAAAAAAAAAA

Stop this "digital is just digital" nonsense. Start another thread or just listen for yourself.

To answer your question, yes, very different.

:laugh:

You should've known better than to post that in a nerd forum! :p

Of course anyone "in the know" and thus experienced in audio knows the "Let there NOT be light" rule of avoiding optical. ;)

But they will resist. :p

And yes, Jim, it's NOT dead. Not yet. :laugh:

 

Shawn

Lifer
Apr 20, 2003
32,236
53
91
On a plasma or LCD, HDMI. On a CRT it probably doesn't matter as much.
 

biggestmuff

Diamond Member
Mar 20, 2001
8,201
2
0
Originally posted by: spidey07
Originally posted by: destrekor
Originally posted by: JasonCoder
Doesn't HDMI carry both the video and audio over the one cable? So if I were in the OP's boat (which I will be soon), would I have to run the audio back out of my TV to my receiver?

Devices usually let you send audio through other means and disable audio transmission over the HDMI port. For example, use HDMI strictly for video and, out of the device in question (DVD player, or what have you), run an optical connection to the receiver for audio.
Inserting a middleman in the audio stream could be worse if that device adds any electronic noise. Sure, it's all optical, but would the TV pass the signal straight through as optical, or would it handle it electrically first? But never mind, since the audio over HDMI isn't as "pure" as digital optical anyway (it's electrical, though still a digital signal), so there would still be a conversion step in the TV to turn it into optical.

don't use optical for audio.

Use coax.

It really does sound a lot better for digital audio.

I challenge you to prove that statement.
 

spidey07

No Lifer
Aug 4, 2000
65,469
5
76
Originally posted by: biggestmuff
I challenge you to prove that statement.

Bah, even my girl could tell the difference. In a double blind test. She picked out the coax every single time.
 

arcas

Platinum Member
Apr 10, 2001
2,155
2
0
There's no easy answer these days. Some TVs can accept a 1080p signal over HDMI, others cannot. Some can accept 1080p over component, others cannot. For example, the Sony SXRD 60A2000 can't accept a 1080p signal over its component inputs, while the Samsung HL-S series can. Similarly, some TVs can't accept full 1080p over their HDMI inputs (the 2005 JVC HD-ILA units were like this, I think).

In short, check your TV, figure out what each of its inputs is capable of handling, and go from there. The next generation of HDTVs will probably solve this problem, but these days it's still an issue.
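One way to make that "check each input" advice concrete is a simple capability lookup like the sketch below. It is illustrative only: the table just mirrors the examples in this post, and the 1080i cap on the Sony's component input is an assumption, not a verified spec sheet.

```python
# Illustrative only: values mirror the examples above, not checked spec sheets.
max_signal = {
    ("Sony SXRD 60A2000", "HDMI"):      "1080p",
    ("Sony SXRD 60A2000", "component"): "1080i",   # no 1080p over component (per the post); 1080i assumed
    ("Samsung HL-S",      "HDMI"):      "1080p",
    ("Samsung HL-S",      "component"): "1080p",
}

def inputs_that_accept(tv: str, fmt: str) -> list[str]:
    """List the inputs on the given set that will take the source's output format."""
    return [inp for (model, inp), res in max_signal.items()
            if model == tv and res == fmt]

print(inputs_that_accept("Sony SXRD 60A2000", "1080p"))   # ['HDMI']
print(inputs_that_accept("Samsung HL-S", "1080p"))        # ['HDMI', 'component']
```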

 

biggestmuff

Diamond Member
Mar 20, 2001
8,201
2
0
Originally posted by: spidey07
Originally posted by: biggestmuff
I challenge you to prove that statement.

Bah, even my girl could tell the difference. In a double blind test. She picked out the coax every single time.

Could you explain the test setup? What equipment did you use?

I've heard a case for S/PDIF over fiber, but never for coax. I haven't tested the two. I'm not being a jerk, just inquisitive.
 

Tom

Lifer
Oct 9, 1999
13,293
1
76
Here's another issue, at least on my LCD.

Using the DVI connection, most adjustments to the picture are disabled, which might mean I'm getting a more accurate rendition of the source material, but that doesn't mean it's as pleasing to my taste, or to the lighting in my room, as a less accurate setting might be.

 

spidey07

No Lifer
Aug 4, 2000
65,469
5
76
Originally posted by: biggestmuff
Could you explain the test setup? What equipment did you use?

I've heard a case for S/PDIF over fiber, but never for coax. I haven't tested the two. I'm not being a jerk, just inquisitive.

It surely is not scientific.

Switching between coax/optical, I played test tones from a CD to level match. The player is a Sony 777ES. I was ticked at the quality of the DACs in my receiver for music and wanted to give it a try.

Too many variables to draw a conclusion, but I listened to both methods and made my choice.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Originally posted by: spidey07
Originally posted by: ethebubbeth
My question is, HOW is the output different? I realize that digital != digital in all cases. Usually that refers to DAC quality and noise within the circuit... how does this apply to transmission of the signal from the source?

If I hooked up my dvd player to my receiver via toslink or an RCA coax cable, what would be different?

EDIT: I am not looking to pick a fight, I am just looking for an explanation.

It's a whole 'nother thread. Don't want to hijack Tech's thread.

In your example it's packetized and jitter doesn't matter. PCM is a whole other story.

Where are you going with this? If you're sending from source to receiver without decoding first (i.e., digital audio sent as a DTS or Dolby bitstream, not as PCM), where is the technical difference? Blips of electricity (simplified) represent the same thing blips of light do. Noise on the line can distort the levels that represent the 1s and 0s, so some 1s might get dropped because they end up between 1 and 0 or spike too high, and some 0s might spike too low or into that in-between zone, and the receiver would likely drop them because they're unreadable.
This is what we learned in our CCNA program (back when I was interested in that stuff), when discussing fiber versus non-fiber connections.

Maybe I'm failing to see your point, but how is the data presented any differently between the two lines (not counting PCM or other audio formats, specifically just the way they're transmitted)?
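A toy model of that threshold argument, with made-up levels and noise figures purely for illustration (not taken from any real S/PDIF link): the received level can wander quite a bit and still decode to the same bit, and errors only appear once noise starts pushing symbols across the decision threshold.

```python
# Toy model of the "1s and 0s vs. noise" argument; all figures are illustrative.
import random

random.seed(2)

def bit_errors(bits, noise_sigma, lo=0.0, hi=1.0, threshold=0.5):
    """Count how many bits get misread after additive Gaussian noise."""
    errors = 0
    for b in bits:
        received = (hi if b else lo) + random.gauss(0, noise_sigma)
        errors += (received > threshold) != bool(b)
    return errors

stream = [random.randrange(2) for _ in range(100_000)]
for sigma in (0.05, 0.15, 0.30):
    print(f"noise sigma {sigma:.2f}: {bit_errors(stream, sigma)} errors in {len(stream)} bits")
```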

Originally posted by: YOyoYOhowsDAjello
*slowly backs out of thread*

I SO want to know where you stand on this, yoyo. Get back in here!
 

ethebubbeth

Golden Member
May 2, 2003
1,740
5
91
Originally posted by: spidey07
Originally posted by: biggestmuff
I challenge you to prove that statement.

Bah, even my girl could tell the difference. In a double blind test. She picked out the coax every single time.

I'm actually going to do a double blind test on the issue at hand with my girlfriend on my next day off. This interests me greatly.

Would there be any situation in which optical would prevail over coax? For instance, for an extended cable run or a situation in which there is a lot of power interference?
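If it helps when scoring that blind test, here is a minimal sketch (the trial counts below are just examples, not a recommended protocol): count the correct identifications and ask how likely that score would be under pure guessing.

```python
# Minimal scoring for a blind listening test; trial counts are just examples.
from math import comb

def guess_probability(correct: int, trials: int) -> float:
    """P(getting at least `correct` right out of `trials`) by pure guessing (p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(f"{guess_probability(9, 10):.3f}")   # ~0.011 -> hard to do by luck
print(f"{guess_probability(6, 10):.3f}")   # ~0.377 -> consistent with guessing
```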
 

bob4432

Lifer
Sep 6, 2003
11,726
45
91
Originally posted by: MS Dawn
Originally posted by: spidey07

GAAAAAAAAAAAAAAAA

Stop this "digital is just digital" nonsense. Start another thread or just listen for yourself.

To answer your question, yes, very different.

:laugh:

You should've known better than to post that in a nerd forum! :p

Of course anyone "in the know" and thus experienced in audio knows the "Let there NOT be light" rule of avoiding optical. ;)

But they will resist. :p

And yes, Jim, it's NOT dead. Not yet. :laugh:

Could you please enlighten those "not in the know"? At what point is one going to hear a difference - as in, quality of gear? Where does the difference begin? How is it that optical is inferior to coax if using DD/DTS audio? The reason I'm posting in this thread is that the other thread I found wasn't even being taken seriously....

OP - to be perfectly honest, I can't see a difference feeding my 40" 720p LCD from either. I picked up an HDMI cable from Monoprice since they're cheap there. I sit ~10' from the LCD and am feeding it HD cable from an SA 8300HD box. I also can't see a difference whether I have the TV scale 1080i to 720p or have the box scale it, so maybe my eyes are bad...
 

JasonCoder

Golden Member
Feb 23, 2005
1,893
1
81
Originally posted by: destrekor
Originally posted by: spidey07
Originally posted by: ethebubbeth
My question is, HOW is the output different? I realize that digital != digital in all cases. Usually that refers to DAC quality and noise within the circuit... how does this apply to transmission of the signal from the source?

If I hooked up my dvd player to my receiver via toslink or an RCA coax cable, what would be different?

EDIT: I am not looking to pick a fight, I am just looking for an explanation.

It's a whole 'nother thread. Don't want to hijack Tech's thread.

In your example it's packetized and jitter doesn't matter. PCM is a whole other story.

Where are you going with this? If you're sending from source to receiver without decoding first (i.e., digital audio sent as a DTS or Dolby bitstream, not as PCM), where is the technical difference? Blips of electricity (simplified) represent the same thing blips of light do. Noise on the line can distort the levels that represent the 1s and 0s, so some 1s might get dropped because they end up between 1 and 0 or spike too high, and some 0s might spike too low or into that in-between zone, and the receiver would likely drop them because they're unreadable.
This is what we learned in our CCNA program (back when I was interested in that stuff), when discussing fiber versus non-fiber connections.

Maybe I'm failing to see your point, but how is the data presented any differently between the two lines (not counting PCM or other audio formats, specifically just the way they're transmitted)?

I really don't know about quality differences but I know it's damn hard for an optical cable to create a ground loop. A plus in my book.

 

Slammy1

Platinum Member
Apr 8, 2003
2,112
0
76
Depends on the implementation of the component, the source, and other variables, which leads back to the earlier statement(s) that you should try both. In general, digital display + digital picture = digital cable, but different components will have different strengths in their outputs. I had/have trouble with the concept as applied to PCs, where my experience tells me DVI is better, but better people than me have argued the case that neither is inherently superior.

EDIT: I have optical S/PDIF, coax digital, and RCA connectors running to my receiver. The best playback is from the RCA cables, because the DACs in my X-Fi Platinum sound card are better than the ones in my older (6-7 year old) Yamaha AVR. I even run 44.1kHz for music.
 

biggestmuff

Diamond Member
Mar 20, 2001
8,201
2
0
Originally posted by: spidey07
Originally posted by: biggestmuff
Could you explain the test setup? What equipment did you use?

I've heard a case for S/PDIF over fiber, but never for coax. I haven't tested the two. I'm not being a jerk, just inquisitive.

It surely is not scientific.

Switching between coax/optical, I played test tones from a CD to level match. The player is a Sony 777ES. I was ticked at the quality of the DACs in my receiver for music and wanted to give it a try.

Too many variables to draw a conclusion, but I listened to both methods and made my choice.

If you dislike the DACs in the Sony, why would you use that to A/B test the fiber and coax? I'm not making the connection between the two.