How come most 720P Displays can do 1080i?

BassBomb

Diamond Member
Nov 25, 2005
8,396
1
81
Hello,

I have my NEC monitor, which is HDTV-capable but is normally used as a computer monitor.

I am going to be watching HD movies, and the format I received was 1080i.

Which got me wondering: how can a 720P display do 1080i? Is it downscaled to fit the screen?

Would 720P be a better choice for my display than 1080i? Upscaled vs. downscaled?

Thoughts are appreciated,
Thanks
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Depends, but probably the 1080i is halved to 540 lines and then upconverted to 720p. IIRC there is a mandate that HDTV-ready digital displays must handle all ATSC formats, so you'll be able to see 720p or 1080i broadcasts on one. Implementation varies by manufacturer and display.
 

jdoggg12

Platinum Member
Aug 20, 2005
2,685
11
81
Isn't 1080i the same resolution as 720p? It just alternates the picture every other refresh, sending twice the info but at half the speed?
 

Viper GTS

Lifer
Oct 13, 1999
38,107
433
136
How often does your display precisely match the resolution of your source material?

Displays have to be able to accept a wide range of formats; many modern TVs can handle anywhere from 480i to 1080P. All are scaled to the native resolution of the display.

Originally posted by: jdoggg12
Isn't 1080i the same resolution as 720p? It just alternates the picture every other refresh, sending twice the info but at half the speed?

720P = 1280x720 progressive
1080i = 1920x1080 interlaced
1080P = 1920x1080 progressive

Viper GTS
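For a rough sense of scale, the pixel counts behind that list work out as follows (a quick sketch assuming the common ATSC rates: 60 frames/s for 720p, 60 fields/s for 1080i, 30 frames/s for 1080p; the dictionary layout is just for illustration):

```python
# Rough pixel-throughput comparison of the three formats (assumed
# standard ATSC rates; "update" = one frame for progressive formats,
# one 540-line field for 1080i).
formats = {
    "720p":  {"width": 1280, "height": 720,  "lines_per_update": 720,  "updates_per_sec": 60},
    "1080i": {"width": 1920, "height": 1080, "lines_per_update": 540,  "updates_per_sec": 60},
    "1080p": {"width": 1920, "height": 1080, "lines_per_update": 1080, "updates_per_sec": 30},
}

for name, f in formats.items():
    frame_pixels = f["width"] * f["height"]
    pixels_per_sec = f["width"] * f["lines_per_update"] * f["updates_per_sec"]
    print(f"{name}: {frame_pixels:,} px/frame, {pixels_per_sec:,} px/s")
```

Notably, 1080i at 60 fields/s and 1080p at 30 frames/s move the same number of pixels per second; 720p trades spatial resolution for full progressive frames.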
 

jdoggg12

Platinum Member
Aug 20, 2005
2,685
11
81
http://www.highdefforum.com/archive/index.php/t-2965.html


1080 interlaced... 1080 vertical lines.

interlaced... odd and even line scanning...

60 fields per second:

1080i - one field of the 540 even-numbered lines is scanned,
and then one field of the 540 odd-numbered lines.


1080 progressive... the same 1080 vertical lines.

progressive... whole-frame scanning - all 1080 lines scanned together...

30 frames per second (whole frames in this instance, because the entire frame is scanned):

1080p - all 1080 lines, odd and even, in order top to bottom at once. No even-then-odd thing going on.

1080i: 60 fields/s, 540 lines scanned each 1/60th of a second...
1080p: 30 frames/s, 1080 lines scanned each 1/30th of a second...

hope that helps.
Interlaced vs. Progressive
When you're watching your television, if you go up real close and watch carefully (but not for too long - remember kids, it'll rot your eyes!) you'll notice that the picture sort of "shimmers." Old skool computer users will probably remember how, back in the day, when resolutions higher than 800x600 were the realm of super-high-end workstations, you could sometimes get those higher resolutions if you really tried, but your picture would wind up flickering and shimmering, usually causing groans of disgust and a quick jump back down to a lower resolution. That's interlacing at work.

Progressive video means that every pixel on the screen is refreshed in order (in the case of a computer monitor), or simultaneously (in the case of film). Interlaced video is refreshed to the screen in two passes per frame - first every even scanline is refreshed (the gun at the back of your cathode-ray tube lights the phosphors on the even-numbered rows of pixels), and then every odd scanline. This means that while NTSC has a framerate of 29.97, the screen is actually being partially redrawn 59.94 times a second. In other words, a half-frame is drawn to the screen every 60th of a second. This leads to the notion of fields.
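The even/odd split described above can be sketched in a few lines (a toy Python model; each list entry stands in for one scanline, and the 8-line frame is just for illustration):

```python
# Split a progressive frame into its two interlaced fields: one field
# holds the even-numbered scanlines, the other the odd-numbered ones.
def split_into_fields(frame):
    even_field = frame[0::2]  # scanlines 0, 2, 4, ...
    odd_field = frame[1::2]   # scanlines 1, 3, 5, ...
    return even_field, odd_field

# A toy 8-line "frame"; a real 1080i frame would have 1080 entries.
frame = [f"line{i}" for i in range(8)]
even, odd = split_into_fields(frame)
print(even)  # ['line0', 'line2', 'line4', 'line6']
print(odd)   # ['line1', 'line3', 'line5', 'line7']
```

An interlaced display draws one of these fields per refresh, alternating between them.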
 

BassBomb

Diamond Member
Nov 25, 2005
8,396
1
81
That does not answer my question.

The question is: why do 720P displays accept 1080i? Could they not do 1080p as well, with similar downscaling?

What rbV5 said is very interesting; if that's the case, the 720P versions would be the better buy, since I wouldn't need to pay the premium for 1080.
 

jdoggg12

Platinum Member
Aug 20, 2005
2,685
11
81
Originally posted by: BassBomb
That does not answer my question.

The question is: why do 720P displays accept 1080i? Could they not do 1080p as well, with similar downscaling?

What rbV5 said is very interesting; if that's the case, the 720P versions would be the better buy, since I wouldn't need to pay the premium for 1080.

Originally posted by: jdoggg12
http://www.highdefforum.com/archive/index.php/t-2965.html


1080 interlaced... 1080 vertical lines.

interlaced... odd and even line scanning...

60 fields per second:

1080i - one field of the 540 even-numbered lines is scanned,
and then one field of the 540 odd-numbered lines.


1080 progressive... the same 1080 vertical lines.

progressive... whole-frame scanning - all 1080 lines scanned together...

30 frames per second (whole frames in this instance, because the entire frame is scanned):

1080p - all 1080 lines, odd and even, in order top to bottom at once. No even-then-odd thing going on.

1080i: 60 fields/s, 540 lines scanned each 1/60th of a second...
1080p: 30 frames/s, 1080 lines scanned each 1/30th of a second...

hope that helps.

It doesn't show all 1080 lines at once; it bounces back and forth between the 540 even lines and the 540 odd ones. So at no point in time does it show a full 1080 lines - it alternates 540 and 540 to trick your eyes into seeing 1080.
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,035
429
126
Originally posted by: jdoggg12
Isn't 1080i the same resolution as 720p? It just alternates the picture every other refresh, sending twice the info but at half the speed?

Not in the least! A native 1080i display actually has 1080 lines of vertical resolution (i.e. there are xxxx by 1080 pixels, where xxxx is usually 1920 for a widescreen 16:9 format screen).

The fact that it is 1080i simply means that it can only process and display an interlaced feed. In an interlaced feed, half the lines in each frame are updated. Basically, if you look at the raw data in a 1080i feed, all 1080 lines are there; however, half those lines belong to the current field and the other half to the previous one. This method worked fine in the past, when TVs themselves only drew every other line on the screen each refresh. TVs can now redraw every pixel on the screen at once, but interlaced data streams were the standard for so long that they made their way into HDTV as well.

Some types of HD monitor or TV can display varying amounts of vertical information (CRTs, tube projectors, etc.); others have a fixed amount (plasmas, LCDs, LCoS, LCD projection, LCoS projection, OLED, etc.). A fixed-pixel display with a native resolution of 720p only has 720 lines of vertical information. It may be able to accept a 1080i feed, but it down-converts the feed to 720p. Depending on how good a converter it was given, it can do any number of things - from, horror of horrors, converting to 540p and then up-converting to 720p. The best converters de-interlace to 1080p first and then down-convert to 720p, but this requires a much more powerful processor and is thus more expensive, which is why you will see cheaper models using the "horror of horrors" method. You lose information when it is done that way; de-interlacing to 1080p first and then down-converting to 720p loses the least amount of detail theoretically possible.

Again, "i" just means interlaced: the feed sends a merged frame, with every other line carrying the current field's data and the lines in between carrying the previous field's data. There is a great website which will explain this to you:

http://www.100fps.com/
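The two conversion paths described above can be sketched as a toy model (Python, with strings standing in for scanlines and crude nearest-neighbour scaling; the function names are my own for illustration, not anything a real TV exposes):

```python
# Toy comparison of the cheap and good 1080i -> 720p conversion paths.

def nearest_scale(lines, target_count):
    """Resample a list of scanlines to target_count lines (nearest-neighbour)."""
    n = len(lines)
    return [lines[i * n // target_count] for i in range(target_count)]

def cheap_path(field):
    # "Horror of horrors": treat one 540-line field as a frame (540p),
    # then upscale straight to 720 lines. Half the source detail is gone.
    return nearest_scale(field, 720)

def good_path(even_field, odd_field):
    # Better: weave both fields back into a full 1080-line frame first,
    # then downscale 1080 -> 720. All source lines can contribute.
    frame = [None] * 1080
    frame[0::2] = even_field
    frame[1::2] = odd_field
    return nearest_scale(frame, 720)

even = [f"E{i}" for i in range(540)]  # even scanlines of one frame
odd = [f"O{i}" for i in range(540)]   # odd scanlines of one frame
print(len(cheap_path(even)), len(good_path(even, odd)))  # 720 720
```

Both paths output 720 lines, but the cheap path can only ever draw on the 540 lines of a single field, while the good path samples from the full woven frame.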
 

BassBomb

Diamond Member
Nov 25, 2005
8,396
1
81
So then, effectively, on a 720P display (mine, for example), 1920 is downscaled to 1680, meaning a loss of about 240 pixels per line.

But the 540 lines are upscaled to 1050?
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,035
429
126
Originally posted by: BassBomb
So then, effectively, on a 720P display (mine, for example), 1920 is downscaled to 1680, meaning a loss of about 240 pixels per line.

But the 540 lines are upscaled to 1050?

No... a 1080i feed is not 540 lines. It actually contains 1080 lines of data; however, 540 of those lines are from the current field and 540 are from the previous one. If you have a crappy de-interlacer or a crappy down-scaler, you may wind up with 540 lines before the picture is upscaled to 1050 on your screen. A GOOD scaler, however, would take the 1080i feed, convert it to 1080p, and then scale to 1050 for your screen; or it may even just take the 1080i feed and crop the top 15 and bottom 15 lines. Those areas were not normally displayed anyway (they are the "overscan" area), where other data is encoded on the feed (closed-caption information is stored in the top few lines of the picture; I believe surround sound was also put in the bottom at one point). With the cropping method, however, you lose a lot of the sides of the picture as well, because the aspect ratios are different: a 1080i feed is 16:9, while your 1680x1050 screen is 16:10, meaning your screen is taller relative to its width than the 1080i feed. So while it can show the full height of the feed, it cannot show the full width.
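The aspect-ratio mismatch is easy to check with a little arithmetic (a sketch assuming square pixels on a 1680x1050 panel; the helper names are made up for illustration):

```python
# Fitting a 16:9 (1920x1080) feed on a 16:10 (1680x1050) panel:
# either letterbox (fit the width, bars top and bottom) or crop the
# sides (fit the height, lose horizontal picture).

def fit_letterbox(src_w, src_h, dst_w, dst_h):
    # Scale so the whole image fits inside the panel.
    scale = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale)

def fit_crop(src_w, src_h, dst_w, dst_h):
    # Scale so the image covers the panel; excess is cropped off.
    scale = max(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale)

print(fit_letterbox(1920, 1080, 1680, 1050))  # (1680, 945): bars top/bottom
print(fit_crop(1920, 1080, 1680, 1050))       # (1867, 1050): sides cropped
```

So filling the full 1050-line height of the panel means the scaled image is about 1867 pixels wide, roughly 187 of which fall off the sides.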
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: BassBomb
So then effectively on a 720P display, mine for example, 1920 is downscaled to 1680 meaning about 240 pixel loss

But the 540 lines are upscaled to 1050

A 1680x1050 monitor is not really what I would call a 720P display. It can display a 720p feed 1:1, but it won't fill the whole screen. Of course, then you have "720p" LCDs that are actually 1366x768, and then there are the really whacked-out ones with non-square pixels...

If you want to display something fullscreen on a fixed-pixel display, and the feed does not match the resolution of the display, it must be transformed somehow to match the native resolution.

A common technique for dealing with 1080i content in HDTVs that are natively 720p (or as output from, say, a cable box or external scaler set to output in 720p) is to crunch it down to 540p internally, then scale that up to 720p. This is somewhat more lossy (and cheaper) than going 1080i->1080p and then scaling the 1080p down to 720p.

However, if you're trying to display 1080i content on a computer, usually what it does is to turn it into 1080p internally (since all computer monitors these days are progressive-scan), then scale that to the desired output resolution.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
No... a 1080i feed is not 540 lines. It actually contains 1080 lines of data, however 540 of those lines are from the current frame, and 540 are for the previous frame

Not quite: each new "frame" is made of two new 1920x540 "fields" (neither of them from a previous frame). The two fields each contain every other scanline - odd scanlines in one field, even scanlines in the other. They are displayed one at a time, one field every 1/60th of a second, to make one complete 1920x1080 frame every 1/30th of a second.
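In numbers (assuming the standard rate of 60 fields per second):

```python
# 1080i timing: two 540-line fields weave into one full frame.
fields_per_second = 60
lines_per_field = 540
frames_per_second = fields_per_second / 2   # two fields make one frame
lines_per_frame = lines_per_field * 2       # 1080 lines once both fields land
field_duration_s = 1 / fields_per_second    # one field every 1/60 s
frame_duration_s = 1 / frames_per_second    # one complete frame every 1/30 s
print(frames_per_second, lines_per_frame)   # 30.0 1080
```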
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: BassBomb
That does not answer my question.

The question is: why do 720P displays accept 1080i? Could they not do 1080p as well, with similar downscaling?

What rbV5 said is very interesting; if that's the case, the 720P versions would be the better buy, since I wouldn't need to pay the premium for 1080.

Native 720p material on an LCD looks better than native 1080i, yes, because no LCD can handle interlaced content natively. Some 1080p displays, however, can simply line-double 1080i to 1080p (but not all do... beware).

That said, people with 720p LCD displays have watched 1080i broadcasts for years, since there were no 1080p displays available, and think/thought it looks outstanding - so upscaled 540p isn't so bad :)
 

BassBomb

Diamond Member
Nov 25, 2005
8,396
1
81
Originally posted by: rbV5
Originally posted by: BassBomb
That does not answer my question.

The question is: why do 720P displays accept 1080i? Could they not do 1080p as well, with similar downscaling?

What rbV5 said is very interesting; if that's the case, the 720P versions would be the better buy, since I wouldn't need to pay the premium for 1080.

Native 720p material on an LCD looks better than native 1080i, yes, because no LCD can handle interlaced content natively. Some 1080p displays, however, can simply line-double 1080i to 1080p (but not all do... beware).

That said, people with 720p LCD displays have watched 1080i broadcasts for years, since there were no 1080p displays available, and think/thought it looks outstanding - so upscaled 540p isn't so bad :)

ALALALALLAALLA, this is making me crazy.

Well, I've watched 1080P trailers so far and they look fine on this... these are all Windows Media files I'm talking about, for use with WMP or VLC, not actual broadcast.

I will see how it turns out after I get my media :)... it seems to be easier to find 1080i/p than 720p media.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
seems to be easier to find 1080i/p than 720p media

The DivX HD showcase used to have a bunch of 720p material, since that was the encoder's highest resolution, so you might check the DivX site. 720p is commonly broadcast, but a little big to share.