1080i vs. 1080p (Quick Question)

chazdraves

Golden Member
May 10, 2002
1,122
0
0
Okay, I've done a bit of research on the subject, and I've come to understand that LCD and plasma TVs aren't actually capable of displaying an interlaced signal. That is to say, even if they don't say 1080p on the box, they will take a 1080i signal, de-interlace it, and display it progressively. Obviously, I have learned that 1080p is up there as far as marketing scams go, since all plasma, LCD, etc. sets already produce that as the end result.

All this aside: is a 1080i television (which will be displaying progressively) going to display at 60 frames per second or 30? That is to ask: will the device (I'm speaking specifically of the PS3/360) output 1080i at 120 fields per second or 60 (thus resulting in either 60 or 30 fps, as I understand it)?

In other words, I'm trying to find any advantage of these 1080p televisions.

Regards,
- Chaz
 

krotchy

Golden Member
Mar 29, 2006
1,942
0
76
OK, 1080i60 = 30 fps. Basically, fields are updated at 60Hz and every 2 fields is considered a frame. When something is recorded at 1080i, it will create slightly jagged edges, because each field is recorded 1/60th of a second after the one before it. So a frozen frame contains information from 2 snapshots in time. With ultra-fast motion, you can clearly see the separation between the fields when you pause the image.

1080p30 = 30 fps, both fields updated simultaneously, so all 1920x1080 pixels are a snapshot of the same moment in time. There is also 1080p60, which is 60 fps and rarely used for video (however, computers running at 1920x1080 are essentially outputting 1080p60, which is why it's mentioned).

1080p LCDs/plasmas do in fact update the fields separately at 1080i, and simultaneously at 1080p, but will always show a whole frame at once. 720p LCDs/plasmas (1280x720 and other similar resolutions) can accept 1080i, but they will actually interpolate the fields across their pixels, so you can never see an entire 1080i frame at any given time.

The biggest advantage of 1080p TVs is the ability to use them as a 1920x1080 computer monitor. Also, when viewing 1080i video, you can actually see both fields at the exact same time, instead of having them interpolated like on 720p TVs.
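To make the field/frame arithmetic above concrete, here's a toy sketch (mine, not from the post; the `weave` helper is hypothetical) of why 60 fields per second works out to 30 full frames per second: a de-interlacer "weaves" each pair of 540-line fields into one 1080-line frame.

```python
# Toy illustration of 1080i60 -> 30 fps: two 540-line fields,
# captured 1/60 s apart, are woven into one 1080-line frame.

def weave(top_field, bottom_field):
    """Interleave two fields into a single progressive frame."""
    frame = [None] * (len(top_field) + len(bottom_field))
    frame[0::2] = top_field      # even display lines come from the top field
    frame[1::2] = bottom_field   # odd display lines come from the bottom field
    return frame

fields_per_second = 60
lines_per_field = 540
frames_per_second = fields_per_second // 2   # every 2 fields = 1 frame

top = [("top", i) for i in range(lines_per_field)]
bot = [("bot", i) for i in range(lines_per_field)]
frame = weave(top, bot)

print(frames_per_second)  # 30
print(len(frame))         # 1080
```

Note that the woven frame alternates lines from two different moments in time, which is exactly why paused fast motion shows the "comb" separation between fields.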
 

Snakexor

Golden Member
Feb 23, 2005
1,316
16
81
Quick question on the Hitachi 42": it is a 1080i panel. If you hook it up to a computer, is the resolution 1280x1080, 1280x720, or 1280x540?
 

krotchy

Golden Member
Mar 29, 2006
1,942
0
76
Originally posted by: Snakexor
Quick question on the Hitachi 42": it is a 1080i panel. If you hook it up to a computer, is the resolution 1280x1080, 1280x720, or 1280x540?

The Hitachi I *think* you are referring to is the 1080p panel, which uses subsampled 1080p, so it is in fact 1280x1080, with pixels that are 1.5x wider than they are tall. So a computer will see 1280x1080, and 1080p video will keep all of its vertical resolution but have every 3 pixels of width sampled down into 2. Things will not appear very distorted at all, although it may give a slight horizontal smear to some edges (you would need a pretty good eye to notice this, as most people can't, but to some it is unforgivable).

Panasonic's HD cameras that use DVCPRO-HD are actually recording 1280x1080 video, and since most HD recording formats sample the video at some reduced rate anyway (HDV, DVCPRO-HD, 4:1:1/4:2:0 MPEG, etc. all lose some information), Hitachi has decided to reverse-sample thin pixels into wide pixels and declare it a 1080p TV. I suspect that when watching movies, 1280x1080 and 1920x1080 will look virtually identical from 6+ feet away. However, using it as a computer monitor will look quite distorted, as the pixels are 1.5x wide.
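A quick sketch of the subsampling math described above (my illustration, not from the post): a 1280x1080 panel whose pixels are 1.5x wider than tall still fills a 16:9 picture, because 1280 × 1.5 = 1920, and downscaling a full-width source maps every 3 source pixels onto 2 panel pixels.

```python
# Pixel-aspect-ratio math for a subsampled 1280x1080 "1080p" panel.
panel_w, panel_h = 1280, 1080
pixel_aspect = 1.5                   # each pixel is 1.5x wider than it is tall

display_w = panel_w * pixel_aspect   # effective picture width in "square pixel" terms
print(display_w)                     # 1920.0
print(display_w / panel_h)           # ~1.778, i.e. 16:9

# A full 1920-wide source squeezed onto the 1280-wide panel:
print(1920 / panel_w)                # 1.5 -> "3 pixels sampled into 2"
```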
 

Snakexor

Golden Member
Feb 23, 2005
1,316
16
81
A Hitachi rep said it is not 1080p and will not display 1080p. He said it is 1080i with the 1.5x-wide pixels.
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Originally posted by: krotchy
Ok, 1080i60 = 30 FPS. Basically fields are updated at 60Hz and every 2 fields is considered a frame. When something is recorded at 1080i, it will create slightly jagged edges because each field is recorded 1/60th of a second away from the one before it. So a frozen frame contains information from 2 snapshots in time. With ultra fast motion frozen, you can clearly see separation between the fields when you pause the image
Maybe on the TV side, as I cannot say otherwise. On the video capture side, 1080i60 is 60 fps, as is 1080p60. It can be changed to 30 fps with a rendering tool (or a camera workflow, if available). SD is 480i at 30 fps, interlaced. 1080p30 is 30 fps progressive. 1080p24 is 24 fps progressive (for those who think that jerky persistence-of-vision "film" look is ideal).

All of this makes a difference, as our editors have to support it and then render and produce output.

Edit - and HDV ends up being 1440x1080. Full HD video (1080) is 1920x1080. Pixel shape and evolving 'standards' make the difference. One is a display size and one is a capture standard, and they get freaked up. 2K is even better capture, and 4K is mind-blowing (RED Cinema 4K 4:4:4 60 fps progressive for the win - but not a consumer product). Dirty little secret: the Canon HV10 has a 1920x1080 chip, but it is stored as a 1440x1080 stream, which is an HDV2 packetized elementary stream.
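The HDV storage-vs-display point above comes down to pixel shape: HDV stores 1440x1080 with pixels in a 4:3 aspect ratio, which stretches back out to a full 1920x1080 picture on playback. A quick sanity check (my sketch, not from the post):

```python
# HDV stores an anamorphic 1440x1080 raster; each pixel has a 4:3
# aspect ratio, so playback stretches it to full-width 1080.
stored_w, stored_h = 1440, 1080

# Pixel aspect ratio 4:3 -- integer math avoids float rounding.
display_w = stored_w * 4 // 3
print(display_w)  # 1920
print(stored_h)   # 1080 (vertical resolution is stored in full)
```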
 

nrb

Member
Feb 22, 2006
75
0
0
Originally posted by: chazdraves
In other words, I'm trying to find any advantage of these 1080p televisions.
I suggest you check out the following thread at AV Forums:

http://www.avforums.com/forums/showthread.php?t=409129

AV Forums is a UK site, so we have 50Hz issues to worry about, and a vastly restricted choice of AV hardware compared to the States, but most of what is talked about in that thread is applicable. The first post is especially relevant.

 

chazdraves

Golden Member
May 10, 2002
1,122
0
0
Wow, heh. That was a good deal more information than I was looking for. So, as I read it: I have an LG 42PC3D plasma hooked up to my 360. According to LG, this set's native resolution is 1024x768. This, I would assume, means that 720p is actually superior to 1080i on this set? This is all very odd, because I bumped my 360 up to 1080i the other day and would swear I saw a drastic improvement, but it doesn't seem like that's possible.

Also, the last article mentions that Blu-ray and HD DVD aren't truly using 1080p. This has changed since July '05, correct?

Comments, ideas? Thanks!
- Chaz
 

krotchy

Golden Member
Mar 29, 2006
1,942
0
76
As I said before, the only benefit of 1080p TVs is if you ever plan to hook a computer up to them (which I do, and it's LOVELY :) ). From 6+ feet, 1080i/p and 720p look nearly identical until you start getting to 65+ inch sets. As a computer resolution, 1920x1080 is 1000x better than 1360x768 or other 720p resolutions. The author of that article never goes into HTPC use, but it does make a big difference in that area at least.