Originally posted by: spidey07
Originally posted by: Matt2
Which cable company is that?
I've bought digital cable from 2 different companies and all I ever got was 1080i and 720p.
Read the link. 1080i = 1080p for most content (24-frame film). Live events and high-quality shows are shot in 1080p/30, carried via 1080i/60, and then fully and perfectly de-interlaced back to 1080p/30. That's why Discovery HD and sporting events look so good.
You don't have to transmit 1080p to get full, perfect, identical 1080p frames.
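A toy sketch (plain Python, with hypothetical scan-line labels) of why that works: a 1080p frame split into two fields and carried as 1080i can be "weave"-deinterlaced back into the exact same progressive frame, because both fields came from the same instant.

```python
def split_into_fields(frame):
    """Split a progressive frame (a list of scan lines) into two fields."""
    top = frame[0::2]     # even-numbered lines
    bottom = frame[1::2]  # odd-numbered lines
    return top, bottom

def weave(top, bottom):
    """Re-interleave a field pair into one progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

# A toy 6-line "frame": the round trip through interlacing is lossless.
frame = ["line0", "line1", "line2", "line3", "line4", "line5"]
top, bottom = split_into_fields(frame)
print(weave(top, bottom) == frame)  # True
```

The catch, as later posts note, is that this only holds when both fields really do come from one progressive frame (film or 1080p/30 material), not from a camera capturing fields 1/60 s apart.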
Originally posted by: mercanucaribe
Why aren't digital shows shot and broadcast in 1080p30? What's this nonsense about 24 Hz, interlacing, etc.? It's like before the HDTV standards were set in stone they held meetings on how to pigeonhole people into the attitude "we can't tell unless we sit close". Where is the sense in interlacing a perfectly good progressive picture and then forcing the TV to deinterlace it, and convert to a non-divisible framerate to boot? I could see it benefiting CRT HDTVs, but can't a CRT interlace a progressive signal anyway?
And why the persistence of 1366x768 TVs? I can understand if the manufacturers were using old, pre-HDTV panels, but they are designing brand-new screens with that resolution.
Originally posted by: tenshodo13
I have a 1080P TV
Should I see shows in 720p or 1080i?
Originally posted by: SLCentral
Originally posted by: spidey07
Originally posted by: tenshodo13
Ah, I see, so the people who say that 1080p TVs are useless are wrong?
More than wrong. Completely clueless, as well as deeply misinformed and unknowledgeable.
The posted article shows what I've been trying to say ever since 1080p displays came out.
I understand that 1080p is all great, and plays 1080i content better, but why is it that if I compare a Pioneer Elite 50" plasma side-by-side to the Pioneer Elite 50" 1080p plasma, which costs almost twice as much, I don't see much of a difference? An additional $3-$4K would be much better spent on audio, IMO.
Same idea with LCDs. Comparing the Sony S2010 to the Sony V2500, one being 720p and the other 1080p, while there is a difference in color (the V2500, other than 1080p, is a better set all around), it doesn't look all that much different, especially not $500 better. Granted, this is all with 1080i satellite, but that's the majority of what people watch today anyway.
And if anyone wants to tell me that HD-DVD on my Fujitsu 63" 720p plasma doesn't look as good as a 1080p set, they haven't seen the power of a Fujitsu.
Originally posted by: DBL
You guys need to read the link posted above. You don't watch anything on your 1080P at 1080i. Your TV converts it and for most 1080i sources (sports and live TV being an exception), it does a pretty good job of giving you all the resolution of a 1080P picture.
Originally posted by: five40
720p. I don't understand how people can watch anything in 1080i. It's a comb fest at 1080i.
Originally posted by: TheAdvocate
Originally posted by: waggy
Wish DirecTV did 1080p. Hell, I would be happy if they did my local (Rockford) channels in HD. Sigh.
GAAAH.
1080i is the same as 1080p once you deinterlace it. It's the exact same on every modern display.
Originally posted by: secretanchitman
720p, no 1080i crap.
720p/1080p FTW.
Originally posted by: 91TTZ
Originally posted by: TheAdvocate
Originally posted by: waggy
Wish DirecTV did 1080p. Hell, I would be happy if they did my local (Rockford) channels in HD. Sigh.
GAAAH.
1080i is the same as 1080p once you deinterlace it. It's the exact same on every modern display.
There is a difference, but you probably can't see the difference yet since the cameras aren't recording it in 1080p.
If it was a true 1080p source, it would be better than simply deinterlacing 1080i content. But since the cameras are usually recording in 1080i and then converting it to 1080p, you're going to lose some detail as compared to a 1080p source.
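A rough illustration (hypothetical, simplified) of the detail loss 91TTZ describes: when a true interlaced camera captures the two fields 1/60 s apart, weaving them produces combing on motion, so a simple "bob" deinterlacer instead line-doubles each field on its own, and each output frame carries only half the vertical detail.

```python
def bob(field):
    """Naive 'bob' deinterlace: line-double one field to full frame height."""
    frame = []
    for line in field:
        frame.extend([line, line])  # each field line fills two output lines
    return frame

# With film content, both fields share one instant, so weaving is exact.
# With interlaced-camera content, a field-at-a-time deinterlacer only has
# half the lines to work from:
field_t0 = ["L0@t0", "L2@t0"]                       # even lines, captured at t0
original = ["L0@t0", "L1@t0", "L2@t0", "L3@t0"]     # what a 1080p camera would see
print(bob(field_t0))  # ['L0@t0', 'L0@t0', 'L2@t0', 'L2@t0'] -- not the original
```

Real deinterlacers interpolate or use motion adaptation rather than crude line-doubling, but the underlying point stands: an interlaced source can't match a true progressive one.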
Originally posted by: mercanucaribe
Originally posted by: 91TTZ
Originally posted by: TheAdvocate
Originally posted by: waggy
Wish DirecTV did 1080p. Hell, I would be happy if they did my local (Rockford) channels in HD. Sigh.
GAAAH.
1080i is the same as 1080p once you deinterlace it. It's the exact same on every modern display.
There is a difference, but you probably can't see the difference yet since the cameras aren't recording it in 1080p.
If it was a true 1080p source, it would be better than simply deinterlacing 1080i content. But since the cameras are usually recording in 1080i and then converting it to 1080p, you're going to lose some detail as compared to a 1080p source.
That article was talking about 1080P 24fps though. Is that only for movies?
Originally posted by: mercanucaribe
By the way, why are digital movies still shot at 24fps?
Originally posted by: 91TTZ
Originally posted by: mercanucaribe
By the way, why are digital movies still shot at 24fps?
I don't know if all of them are. But I'd imagine that if they plan on releasing the film in the theaters, which still use film projectors, they'd want to be able to print each frame of the movie on a frame of film.
That's why movies look different than TV shows. Movies still stick to that legacy of 24 fps, while many TV shows film at a different framerate such as 30 fps, progressive.
Have you ever looked at a soap opera and noticed how the motion is different? I think most of them shoot at 60i (60 fields per second, interlaced)
Originally posted by: BigToque
I've read that the content on most HD discs (HD-DVD and Blu-ray) is encoded at 1080i. Does that mean 1080i60?
If something is encoded on an HD disc in 1080i60 (60 fields per second), can it be perfectly deinterlaced and shown at 1080p24?
Is that what 3:2 pulldown is?
If what I asked is true, why would content makers even bother encoding it at 1080i60 and not just put it on the disc at 1080p24?
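On the 3:2 pulldown question, a small sketch (hypothetical frame labels) of the cadence: each 24 fps film frame is held for alternately 3 and 2 video fields, so 4 film frames fill 10 fields, stretching 24 frames/s into 60 fields/s.

```python
def pulldown_32(frames):
    """Repeat film frames in a 3,2,3,2,... field cadence (24p -> 60i)."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeats)
    return fields

film = ["A", "B", "C", "D"]   # 4 film frames = 1/6 second at 24 fps
fields = pulldown_32(film)
print(fields)  # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] -- 10 fields
# 10 fields per 4 film frames: 24 * 10/4 = 60 fields per second
```

Going the other way, detecting the repeated fields, is how a player or TV reverses pulldown to recover clean 24p frames; storing 1080p24 on the disc directly just skips that round trip.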
Originally posted by: 91TTZ
Originally posted by: BigToque
I've read that the content on most HD discs (HD-DVD and Blu-ray) is encoded at 1080i. Does that mean 1080i60?
If something is encoded on an HD disc in 1080i60 (60 fields per second), can it be perfectly deinterlaced and shown at 1080p24?
Is that what 3:2 pulldown is?
If what I asked is true, why would content makers even bother encoding it at 1080i60 and not just put it on the disc at 1080p24?
No deinterlacing is perfect. You lose some resolution when you do it.
Originally posted by: Inspector Jihad
this stuff be mad confusing yo 🙁
so right now i'm watching blu-ray movies on my 1366x768 display...what should my upgrade be?