
no difference between 1080i and 1080p????

The issue here is about films on Blu-ray or something similar.

They are not true interlaced video, but telecined movies.

Therefore a deinterlacer is never even used, so deinterlacing quality doesn't apply to 1080i movies. You would use standard inverse telecine, which reconstructs the 24fps progressive frames.

With a 1080p movie, the TV would just decimate the duplicate frames to get back to 24fps progressive.

So the content would be the same.

Now, with TRUE interlaced 1080i video, not a movie that has been telecined, there would be a difference.
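
For illustration, here's a rough Python sketch of that cadence; the frame/field names are made up for the example, and real inverse telecine also has to detect the cadence and handle field parity, which this skips:

```python
# Sketch of 3:2 pulldown and its inverse. Assumes each film frame is a
# (top_field, bottom_field) pair and a clean, known cadence; real decoders
# must also detect cadence breaks and track field parity.

def telecine_32(film_frames):
    """Spread 24fps film frames across 60 fields/s: 2 fields, then 3."""
    fields = []
    for i, (top, bottom) in enumerate(film_frames):
        repeat = 2 if i % 2 == 0 else 3
        fields.extend([top, bottom, top][:repeat])  # 3rd field is a duplicate
    return fields

def inverse_telecine(fields):
    """Skip duplicate fields and re-pair the rest into progressive frames."""
    frames, i, step = [], 0, 0
    while i + 1 < len(fields):
        frames.append((fields[i], fields[i + 1]))  # weave two fields back
        i += (2, 3)[step % 2]  # jump past the duplicate on 3-field frames
        step += 1
    return frames

film = [("A_top", "A_bot"), ("B_top", "B_bot"),
        ("C_top", "C_bot"), ("D_top", "D_bot")]
fields = telecine_32(film)               # 4 frames -> 10 fields (24fps -> 60i)
assert inverse_telecine(fields) == film  # the original frames come back
```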
 
The other thing is that unless you have a rather large screen, the increased definition isn't terribly noticeable, which is why so many flat-panel televisions have a vertical resolution under 800 pixels and it doesn't matter.
 
Originally posted by: puffpio
The issue here is about films on Blu-ray or something similar.

They are not true interlaced video, but telecined movies.

Therefore a deinterlacer is never even used, so deinterlacing quality doesn't apply to 1080i movies. You would use standard inverse telecine, which reconstructs the 24fps progressive frames.

With a 1080p movie, the TV would just decimate the duplicate frames to get back to 24fps progressive.

So the content would be the same.

Now, with TRUE interlaced 1080i video, not a movie that has been telecined, there would be a difference.

If the chips can truly do it without any loss, then sure, sounds good.

But my audiophile/videophile side tells me to "not muck with your source."

 
Isn't this because most televisions will upconvert to 1080p? From what I've read, most 1080p televisions will take the 1080p signal, convert it to 1080i, and then convert it again to 1080p. That's just messy.
 
I have both 1080i and 1080p HDTV sets. Honestly, they both look amazing, and there is not enough of a noticeable difference between the two. It's marketing hype, plain and simple.
 
Originally posted by: PaulNEPats
I have both 1080i and 1080p HDTV sets. Honestly, they both look amazing, and there is not enough of a noticeable difference between the two. It's marketing hype, plain and simple.

Exactly. The only way I can tell if I'm watching 720p content or 1080i content on my 1080i CRT HDTV is to check. All of you just shut up and enjoy the hi-def goodness.
 
Originally posted by: PaulNEPats
I have both 1080i and 1080p HDTV sets. Honestly, they both look amazing, and there is not enough of a noticeable difference between the two. It's marketing hype, plain and simple.

Meh, with 1080i I can see very pronounced interlacing artifacts and flicker.

1080p video would show you just how big a difference there is.
 
Originally posted by: MrChad
1080p is overrated. Most TVs that even claim to have the resolution have no inputs capable of receiving it.

And to think I got hammered for buying a Philips 1080i plasma because it didn't display in 1080p. I'm convinced that most people don't know squat about the formats.
 
Exactly. The only way I can tell if I'm watching 720p content or 1080i content on my 1080i CRT HDTV is to check. All of you just shut up and enjoy the hi-def goodness.

But your TV is resampling the 720p into 1080i. If there were any discernible difference between the two resolutions, you wouldn't be able to tell, since your set can only display one of them.
 
Originally posted by: UNCjigga
1080p makes a HUGE FRICKIN' DIFFERENCE for anything HTPC-related, and yes, it is noticeable!

BLOODY RIGHT! Thank God someone said it. We JUST got our Sony VPL-VW100 projector and it displays FULL 1080p via DVI and/or HDMI.

it looks a hell of a lot better than 1080i.

1080p >>>>>>>>>>>>>>>>>>>> 1080i.

/thread.
 
Originally posted by: chuckywang
Isn't only half a frame projected at any moment during interlaced playback? How much deinterlacing does a TV actually do?

60 times a second.

It basically takes the two interlaced fields and weaves them together to create a single frame.
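
A minimal sketch of that weave step in Python, assuming a field is just a list of scan lines (even lines in the top field, odd lines in the bottom):

```python
# Weave deinterlacing: interleave two 540-line fields into one 1080-line
# frame. Works perfectly for static content; moving content shows combing.

def weave(top_field, bottom_field):
    frame = []
    for even_line, odd_line in zip(top_field, bottom_field):
        frame.append(even_line)  # lines 0, 2, 4, ... from the top field
        frame.append(odd_line)   # lines 1, 3, 5, ... from the bottom field
    return frame

top = [f"line {n}" for n in range(0, 1080, 2)]     # 540 even scan lines
bottom = [f"line {n}" for n in range(1, 1080, 2)]  # 540 odd scan lines
assert len(weave(top, bottom)) == 1080             # one full-height frame
```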
 
Originally posted by: CPA
Originally posted by: MrChad
1080p is overrated. Most TVs that even claim to have the resolution have no inputs capable of receiving it.

And to think I got hammered for buying a Philips 1080i plasma because it didn't display in 1080p. I'm convinced that most people don't know squat about the formats.

There is no such thing as a 1080i plasma. Your plasma is most likely 720p. (Unless you just bought it, in which case you might have a 1080p plasma.)
 
If you have a bigass TV you might notice the difference (60 inches or larger), but how many people have TVs that big? 😛
 
Originally posted by: spidey07
Originally posted by: chuckywang
Isn't only half a frame projected at any moment during interlaced playback? How much deinterlacing does a TV actually do?

60 times a second.

It basically takes the two interlaced fields and weaves them together to create a single frame.

Then don't 1080i and 1080p require the same bandwidth to transmit? I fail to see the advantage of transmitting 1080i.
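
For 60Hz transmission they don't: each interlaced field carries half the lines of a full frame, so 1080i60 moves the same raw pixels per second as 1080p30, and half as many as 1080p60. A quick back-of-the-envelope check (ignoring blanking intervals and compression):

```python
# Raw pixel rates for 1920x1080 signals, ignoring blanking and compression.
width, height = 1920, 1080

p60 = width * height * 60         # 1080p60: 60 full frames per second
i60 = width * (height // 2) * 60  # 1080i60: 60 half-height fields per second
p30 = width * height * 30         # 1080p30: 30 full frames per second

print(f"1080p60: {p60 / 1e6:.1f} Mpixels/s")  # ~124.4
print(f"1080i60: {i60 / 1e6:.1f} Mpixels/s")  # ~62.2, same as 1080p30
print(f"1080p30: {p30 / 1e6:.1f} Mpixels/s")  # ~62.2
```

That halving is the advantage: 1080i gives 60 motion updates a second at the bandwidth of 30 full frames.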
 
Originally posted by: chuckywang
Isn't only half a frame projected at any moment during interlaced playback? How much deinterlacing does a TV actually do?

There is no "degree" of deinterlacing. Your set either does it or it doesn't.

Excluding CRT-based displays...

All TVs purchased today present images progressively. Therefore, if the incoming signal is interlaced, the set has to deinterlace it.
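
As a rough illustration of the simplest deinterlace a fixed-pixel set can fall back on, here's a hypothetical "bob" (line-doubling) sketch in Python; the signal structure is invented for the example:

```python
# Bob deinterlacing: line-double each field into a full-height frame.
# Crude (it halves vertical resolution) but it never shows combing.

def bob(field):
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)  # repeat each line to fill the missing rows
    return frame

def to_progressive(signal):
    """Hypothetical front end: pass progressive through, bob interlaced."""
    if signal["scan"] == "progressive":
        return signal["frames"]                 # panel can display as-is
    return [bob(f) for f in signal["fields"]]   # must deinterlace somehow

field = [f"line {n}" for n in range(0, 1080, 2)]  # one 540-line field
assert len(bob(field)) == 1080
```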
 