HDTV Resolution

Question: if 1080i and 1080p are pretty much the same, why do EDTVs look so much better than standard-def ones?
 
Why aren't digital shows shot and broadcast in 1080p/30Hz? What's this nonsense about 24Hz, interlacing, etc.? It's like before the HDTV standards were set in stone they held meetings on how to pigeonhole people into the attitude "we can't tell unless we sit close". Where is the sense in interlacing a perfectly good progressive picture and then forcing the TV to deinterlace it, and convert to a non-divisible framerate to boot? I could see it benefiting CRT HDTVs, but can't a CRT interlace a progressive signal anyway?

And why the persistence of 1366x768 TVs? I can understand if the manufacturers were using old, pre HDTV panels, but they are designing brand new screens with that resolution.
 
Originally posted by: spidey07
Originally posted by: Matt2
Which cable company is that?

I've bought digital cable from 2 different companies and all I ever got was 1080i and 720p.

Read the link. 1080i = 1080p for most content (24 frames). Live events and high quality shows are shot in 1080p/30, carried via 1080i/60 and then fully and perfectly de-interlaced back to 1080p/30. That's why discoveryHD and sporting events look so good.

You don't have to transmit 1080p to get full, perfect, identical 1080p frames.
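To make the field-weaving idea above concrete, here is a rough Python/numpy sketch of how a progressive frame can survive a trip through 1080i intact when both fields come from the same instant (film or 30p material). The frame contents and variable names are made up purely for illustration.

import numpy as np

frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)  # one progressive frame

# The broadcaster splits it into two 540-line fields (odd and even scan lines)
top_field = frame[0::2, :]
bottom_field = frame[1::2, :]

# A 1080p set weaves the two fields back together
rebuilt = np.empty_like(frame)
rebuilt[0::2, :] = top_field
rebuilt[1::2, :] = bottom_field

print(np.array_equal(frame, rebuilt))  # True: nothing is lost for film/30p sources

The catch, as later posts point out, is that this only holds when both fields describe the same moment in time; true 60i video breaks that assumption.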

I went to Best Buy and the guy said that 720p is much better than 1080i and that 1080p is just a hoax because there isn't anything available in 1080p.

So I bought an LCD and I couldn't be happier! 1080i FTW!

(kidding!)
 
Originally posted by: mercanucaribe
Why aren't digital shows shot and broadcast in 1080p/30Hz? What's this nonsense about 24Hz, interlacing, etc.? It's like before the HDTV standards were set in stone they held meetings on how to pigeonhole people into the attitude "we can't tell unless we sit close". Where is the sense in interlacing a perfectly good progressive picture and then forcing the TV to deinterlace it, and convert to a non-divisible framerate to boot? I could see it benefiting CRT HDTVs, but can't a CRT interlace a progressive signal anyway?

And why the persistence of 1366x768 TVs? I can understand if the manufacturers were using old, pre HDTV panels, but they are designing brand new screens with that resolution.

HDTV actually came out many years ago. Remember your PC's processing power 8+ years ago? Pretty sad stuff; 1080p was probably seriously pushing your luck back then.
24fps is film
 
Originally posted by: SLCentral
Originally posted by: spidey07
Originally posted by: tenshodo13
Ah i see, so the people who say that 1080P Tv's are useless are wrong?

More than wrong. Completely clueless, as well as deeply misinformed and unknowledgeable.

The posted article shows what I've been trying to say ever since 1080p displays came out.

I understand that 1080p is all great, and plays 1080i content better, but why is it that if I compare a Pioneer Elite 50" plasma side-by-side to the Pioneer Elite 50" 1080p plasma, which costs almost twice as much, I don't see much of a difference? An additional $3-$4K would be much better spent on audio, IMO.

Same idea with LCDs. Comparing the Sony S2010 to the Sony V2500, one being 720p and the other 1080p, while there is a difference in color (the V2500, other than being 1080p, is a better set all around), it doesn't look all that much different, especially not $500 better. Granted, this is all with 1080i satellite, but that's the majority of what people watch today anyway.

And if anyone wants to tell me that HD-DVD on my Fujitsu 63" 720p plasma doesn't look as good as a 1080p set, they haven't seen the power of a Fujitsu.

There's no such thing as a 720p plasma, is there? They are all 1366x768.
 
Originally posted by: DBL

You guys need to read the link posted above. You don't watch anything on your 1080P at 1080i. Your TV converts it and for most 1080i sources (sports and live TV being an exception), it does a pretty good job of giving you all the resolution of a 1080P picture.

The source is still 1080i. You're not going to get increased resolution with 1080p, you'll get a non-interlaced refresh.

In reality, you do lose some detail in the deinterlacing process. I've encoded many movies from my Mini-DV camcorder to play on the computer, and you just can't get the full resolution back when you deinterlace it. You'll either end up with jaggies (because interlaced fields don't match up exactly when you display them on a progressive display) or you'll end up line doubling which cuts the resolution in half.
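A small sketch of the trade-off described here, assuming a true 60i source where each field is a snapshot of a different instant (shapes and names are illustrative only):

import numpy as np

field_t0 = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)  # field captured at t = 0
field_t1 = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)  # field captured 1/60 s later

# Option 1: weave the fields anyway -> full 1080 lines, but combing/jaggies wherever anything moved
weaved = np.empty((1080, 1920), dtype=np.uint8)
weaved[0::2, :] = field_t0
weaved[1::2, :] = field_t1

# Option 2: "bob" (line-double a single field) -> no combing, but only 540 real lines of detail
bobbed = np.repeat(field_t0, 2, axis=0)

Real deinterlacers generally blend or motion-adapt between these two extremes, which is why some detail tends to go missing.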
 
Originally posted by: five40
720p. I don't understand how people can watch anything in 1080i. It's a comb fest at 1080i.

1080i doesn't look like a comb fest on my TV. What kind of TV do you have and what is its native resolution?
 
Originally posted by: TheAdvocate
Originally posted by: waggy
wish direct TV did 1080P. hell i would be happy if they did my local (rockford) channels in HD. sigh

GAAAH.

1080i is the same as 1080p once you deinterlace it. It's the exact same on every modern display.


There is a difference, but you probably can't see the difference yet since the cameras aren't recording it in 1080p.

If it was a true 1080p source, it would be better than simply deinterlacing 1080i content. But since the cameras are usually recording in 1080i and then converting it to 1080p, you're going to lose some detail as compared to a 1080p source.



 
Originally posted by: secretanchitman
720p, no 1080i crap.

720p/1080p FTW.

Another clueless fool who didn't bother to read the link.


Edit: I give up. There are the people who do understand (Spidey) and then the people who have no clue at all but still want to post.
 
This is an impossible question to answer, as it will be different for everyone. It depends on the HD content you are receiving, what you are watching, your TV, and even how far away you sit from your screen. I have seen 720p broadcasts that are better than 1080i (not just sports and fast-moving video). I've also seen upconverted content that was better than the native content, depending on how well the TV does upconversion. Many factors can affect the quality of the picture, including but not limited to: the quality of the HD feed, the resolution, the content itself, the TV, and the TV's ability to upconvert if you are using that function.
 
Originally posted by: spidey07
Originally posted by: Matt2
Which cable company is that?

I've bought digital cable from 2 different companies and all I ever got was 1080i and 720p.

Read the link. 1080i = 1080p for most content (24 frames). Live events and high quality shows are shot in 1080p/30, carried via 1080i/60 and then fully and perfectly de-interlaced back to 1080p/30. That's why discoveryHD and sporting events look so good.

You don't have to transmit 1080p to get full, perfect, identical 1080p frames.

So most 1080p TVs do the job of 1080i60 -> 1080p60 well?
 
Originally posted by: 91TTZ
Originally posted by: TheAdvocate
Originally posted by: waggy
wish direct TV did 1080P. hell i would be happy if they did my local (rockford) channels in HD. sigh

GAAAH.

1080i is the same as 1080p once you deinterlace it. It's the exact same on every modern display.


There is a difference, but you probably can't see the difference yet since the cameras aren't recording it in 1080p.

If it was a true 1080p source, it would be better than simply deinterlacing 1080i content. But since the cameras are usually recording in 1080i and then converting it to 1080p, you're going to lose some detail as compared to a 1080p source.

That article was talking about 1080P 24fps though. Is that only for movies?
 
Originally posted by: mercanucaribe
Originally posted by: 91TTZ
Originally posted by: TheAdvocate
Originally posted by: waggy
wish direct TV did 1080P. hell i would be happy if they did my local (rockford) channels in HD. sigh

GAAAH.

1080i is the same as 1080p once you deinterlace it. It's the exact same on every modern display.


There is a difference, but you probably can't see the difference yet since the cameras aren't recording it in 1080p.

If it was a true 1080p source, it would be better than simply deinterlacing 1080i content. But since the cameras are usually recording in 1080i and then converting it to 1080p, you're going to lose some detail as compared to a 1080p source.

That article was talking about 1080P 24fps though. Is that only for movies?

The 24 fps standard was made that way because of movies. There are actually a few different 1080p standards, such as 1080p 24 fps, 1080p 30 fps, and 1080p 60 fps. Obviously we'd want 60 fps, but that would require more bandwidth than is available right now for broadcasts. So you can either go with 1080p 30 fps or 1080p 24 fps. That has to be encoded in the broadcast format, which is usually 1080i 60 fps (since it's interlaced, that's 60 fields per second instead of 60 frames per second).

Since 1080i is 1080 interlaced lines at 60 Hz, you can evenly fit a program that's encoded at 1080p 30 fps (each frame would just be broadcast as 2 fields). You can also fit 24 fps content on that stream by doing a 3:2 pulldown.
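As a quick back-of-the-envelope check on the cadence described above, here is a tiny Python illustration (the function and frame labels are made up for the example):

def pulldown_fields(frames):
    # 60 fields / 24 frames = 2.5, so frames alternate between contributing 3 and 2 fields
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2   # the 3:2 cadence
        fields.extend([frame] * repeats)
    return fields

one_second_of_film = [f"F{i}" for i in range(24)]   # 24 film frames
print(len(pulldown_fields(one_second_of_film)))     # 60 fields -> fits a 1080i/60 stream

Since the extra fields are exact repeats of existing frames, a good deinterlacer can in principle detect the cadence and reassemble the original 24 progressive frames.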
 
Originally posted by: mercanucaribe
By the way, why are digital movies still shot at 24fps?

I don't know if all of them are. But I'd imagine that if they plan on releasing the film in the theaters, which still use film projectors, they'd want to be able to print each frame of the movie on a frame of film.

That's why movies look different than TV shows. Movies still stick to that legacy of 24 fps, while many TV shows film at a different framerate such as 30 fps, progressive.

Have you ever looked at a soap opera and noticed how the motion is different? I think most of them shoot at 60i (60 fields per second, interlaced)
 
Originally posted by: 91TTZ
Originally posted by: mercanucaribe
By the way, why are digital movies still shot at 24fps?

I don't know if all of them are. But I'd imagine that if they plan on releasing the film in the theaters, which still use film projectors, they'd want to be able to print each frame of the movie on a frame of film.

That's why movies look different than TV shows. Movies still stick to that legacy of 24 fps, while many TV shows film at a different framerate such as 30 fps, progressive.

Have you ever looked at a soap opera and noticed how the motion is different? I think most of them shoot at 60i (60 fields per second, interlaced)

All I notice about soap operas is the shallow depth of field and dramatic pauses.

I thought the film might be a possibility, but then I figured all theatres had digital projectors by now.
 
I've read that the content on most HD discs (HD-DVD and Blu-ray) is encoded at 1080i. Does that mean 1080i60?

If something is encoded on an HD disc in 1080i60 (60 fields per second), can it be perfectly deinterlaced and shown at 1080p24?

Is that what 3:2 pulldown is?

If what I asked is true, why would content makers even bother encoding it at 1080i60 and not just put it on the disc at 1080p24?

 
Originally posted by: BigToque
I've read that the content on most HD discs (HD-DVD and Blu-Ray) are encoded at 1080i. Does that mean 1080i60?

If something is encoded on a HD disc in 1080i60 (frames per second), can it be perfectly deinterlaced and shown at 1080p24?

Is that what 3:2 pulldown is?

If what I asked is true, why would content makers even bother encoding it at 1080i60 and not just put it on the disc at 1080p24?

No deinterlacing is perfect. You lose some resolution when you do it.
 
Originally posted by: 91TTZ
Originally posted by: BigToque
I've read that the content on most HD discs (HD-DVD and Blu-Ray) are encoded at 1080i. Does that mean 1080i60?

If something is encoded on a HD disc in 1080i60 (frames per second), can it be perfectly deinterlaced and shown at 1080p24?

Is that what 3:2 pulldown is?

If what I asked is true, why would content makers even bother encoding it at 1080i60 and not just put it on the disc at 1080p24?

No deinterlacing is perfect. You lose some resolution when you do it.

Why would you lose resolution?
 
Originally posted by: Inspector Jihad
this stuff be mad confusing yo 🙁

so right now i'm watching blu-ray movies on my 1366x768 display...what should my upgrade be?

Which display?

Size?

Do you like how it looks?
 