
720p vs 1080i: Which is the better video quality?

1080i has a resolution of 1920x1080. Since it is interlaced, only half the vertical resolution is displayed at one time. So 1920*540 = 1036800 pixels displayed at any given instance.

720p has a resolution of 1280*720. It is not interlaced, so the number of pixels displayed at any given instance is 1280*720 = 921600.

In other words, 1080i, despite being interlaced, displays a higher resolution picture (by a difference of 115200 pixels) at any given instance. (921600/1036800)*100 = 89%. 720p displays only 89% of the video information that 1080i displays at any given instance.

Despite these facts, there are still people who will argue that 720p "looks" better. I am not one of them.
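The arithmetic above can be sanity-checked in a few lines of Python (a sketch only: these are the raw per-field and per-frame pixel counts, ignoring compression and frame rate):

```python
# Pixels visible "at any given instance", per the post above.
field_1080i = 1920 * 540   # one interlaced field: half the 1080 lines
frame_720p = 1280 * 720    # one full progressive frame

print(field_1080i)                            # 1036800
print(frame_720p)                             # 921600
print(field_1080i - frame_720p)               # 115200
print(round(frame_720p / field_1080i * 100))  # 89
```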
 
Originally posted by: Sasha
I have a dedicated room for HDTV. I am using an analog projector that projects a 92" 16:9 image. The HD source allows me to switch back and forth between 720P & 1080i output, and using the OTA (ATSC) receiver I can setup for:

720P source, 720P output
720P source, 1080i output
1080i source, 720P output
1080i source, 1080i output

I've watched the 1080i Olympics and the 1080i Super Bowl in 1080i, but did compare the output in 720P as well. I stayed with the 1080i output so no additional processing was needed.

Recently, I have been watching football on Fox (ATSC OTA) and ESPN-HD football at their native 720P transmissions, but did compare them to 1080i output. I have a discerning eye for video artifacts. Maybe the trained eye of a video professional, but I can see quite a bit and have yet to notice a problem with interlacing artifacts from watching either 1080i or 720P sourced sporting events on a 1080i output display.

There are a lot of things that can lead to motion artifacts and interlacing artifacts, and these can arise in the video processing well before the display instrument, whether that be DMD, LCD, LCoS, flat panel, cathode-ray, etc.

It comes down to whether or not your display is doing a better job at video processing than your source device. Keep in mind that this is just in the discussion of HDTV and says nothing about what goes on in the 480i world (a la DVD).


What kind of screen and projector are you using?

I'm still agonizing over what kind of display device I should get this year. I'm trying to wait until the end of the holiday season to see if there will be any price drops or new products on the way. Since 1080p will soon be the new de facto standard, I'd like to hold out for one of those screens at the largest size I can afford. Also, with LCoS technology on the horizon, that may help lower prices on all larger RPTV displays.
 
bump. I'm not trying to draw attention to myself, but I think I have the definitive post above, and I have edited it several times since I originally posted it. Please read it if you are interested in the raw facts of comparing the two standards.
 
Originally posted by: NuclearNed
1080i has a resolution of 1920x1080. Since it is interlaced, only half the vertical resolution is displayed at one time. So 1920*540 = 1036800 pixels displayed at any given instance.

720p has a resolution of 1280*720. It is not interlaced, so the number of pixels displayed at any given instance is 1280*720 = 921600.

In other words, 1080i, despite being interlaced, displays a higher resolution picture (by a difference of 115200 pixels) at any given instance. (921600/1036800)*100 = 89%. 720p displays only 89% of the video information that 1080i displays at any given instance.

Despite these facts, there are still people who will argue that 720p "looks" better. I am not one of them.

And the very fact that 720p displays fewer pixels helps it: there is less information to encode into the bitrate. If a 1080i stream and a 720p stream are given the same bitrate, the 720p stream will show less MPEG artifacting, because more data can be put into each pixel when there are fewer of them.

Also, with the market heading to digital displays (plasma, DLP, LCD, OLED), either 1080p or 720p will be the way to go, since interlaced video has to be converted to progressive anyway for it to work on these displays (which negates any extra resolution you get from 1080i; you actually end up with 540p instead).
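The equal-bitrate point can be made concrete with a rough back-of-the-envelope calculation (a sketch only: the ~19 Mbps figure is the approximate ATSC payload mentioned later in the thread, and real MPEG-2 encoders do not budget bits per pixel this uniformly):

```python
# Average bit budget per delivered pixel at the same stream bitrate.
BITRATE = 19_000_000  # bits/sec, approximate ATSC payload (assumption)

px_per_sec_720p = 1280 * 720 * 60   # 60 progressive frames per second
px_per_sec_1080i = 1920 * 540 * 60  # 60 interlaced fields per second

print(BITRATE / px_per_sec_720p)   # ~0.34 bits/pixel
print(BITRATE / px_per_sec_1080i)  # ~0.31 bits/pixel
```

At identical bitrates, the 720p stream has more bits to spend on each pixel it delivers, which is the basis of the less-artifacting claim.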
 
Originally posted by: Falloutboy
Originally posted by: NuclearNed
1080i has a resolution of 1920x1080. Since it is interlaced, only half the vertical resolution is displayed at one time. So 1920*540 = 1036800 pixels displayed at any given instance.

720p has a resolution of 1280*720. It is not interlaced, so the number of pixels displayed at any given instance is 1280*720 = 921600.

In other words, 1080i, despite being interlaced, displays a higher resolution picture (by a difference of 115200 pixels) at any given instance. (921600/1036800)*100 = 89%. 720p displays only 89% of the video information that 1080i displays at any given instance.

Despite these facts, there are still people who will argue that 720p "looks" better. I am not one of them.

And the very fact that 720p displays fewer pixels helps it: there is less information to encode into the bitrate. If a 1080i stream and a 720p stream are given the same bitrate, the 720p stream will show less MPEG artifacting, because more data can be put into each pixel when there are fewer of them.

Also, with the market heading to digital displays (plasma, DLP, LCD, OLED), either 1080p or 720p will be the way to go, since interlaced video has to be converted to progressive anyway for it to work on these displays (which negates any extra resolution you get from 1080i; you actually end up with 540p instead).

No offense, but your post makes no sense. If you have a web site to back this up, please link it. By your logic, we should all set our PC displays at 640x480 or lower.
 
this has already been decided in HT circles and tests.

720p provides a better picture especially for fast moving scenes.

yuck, on 1080i.

And any mention of resolution differences between the two is moot, because no TV can fully resolve 1080i anyway unless it is a fixed-pixel 1080p display. Slap some HD resolution patterns on a TV and prepare to see just how bad they are.
 
Originally posted by: NuclearNed
Originally posted by: Falloutboy
Originally posted by: NuclearNed
1080i has a resolution of 1920x1080. Since it is interlaced, only half the vertical resolution is displayed at one time. So 1920*540 = 1036800 pixels displayed at any given instance.

720p has a resolution of 1280*720. It is not interlaced, so the number of pixels displayed at any given instance is 1280*720 = 921600.

In other words, 1080i, despite being interlaced, displays a higher resolution picture (by a difference of 115200 pixels) at any given instance. (921600/1036800)*100 = 89%. 720p displays only 89% of the video information that 1080i displays at any given instance.

Despite these facts, there are still people who will argue that 720p "looks" better. I am not one of them.

And the very fact that 720p displays fewer pixels helps it: there is less information to encode into the bitrate. If a 1080i stream and a 720p stream are given the same bitrate, the 720p stream will show less MPEG artifacting, because more data can be put into each pixel when there are fewer of them.

Also, with the market heading to digital displays (plasma, DLP, LCD, OLED), either 1080p or 720p will be the way to go, since interlaced video has to be converted to progressive anyway for it to work on these displays (which negates any extra resolution you get from 1080i; you actually end up with 540p instead).

No offense, but your post makes no sense. If you have a web site to back this up, please link it. By your logic, we should all set our PC displays at 640x480 or lower.

OK, yes, in a perfect world 1080i would have more detail. But both 1080i and 720p have the same bitrate of 19.2 Mbps. Because there are fewer pixels in 720p, you will get less artifacting: with fewer pixels to encode, more data can be put into each pixel to make it more accurate.

Now, on my second argument, about why progressive in general is better: first, what type of TV do you have? If you have a CRT-based TV (direct view or projection), 1080i would look correct on that display. But if you have a digital display (LCD, DLP, LCoS, plasma, OLED), then the signal has to be converted to a progressive signal, and since 1080i is interlaced and only really has 540 lines of resolution at any instant, when it is converted to progressive you're going to lose over half your resolution and be stuck with an equivalent resolution of around 1280x540 (some TVs and scalers will try to fake the other 180 lines, but it still won't look as good as a true 720p signal).
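The "only 540 lines after conversion" claim can be illustrated with a toy "bob" deinterlacer (purely a sketch: real deinterlacers interpolate between lines or use motion compensation, and the function name here is made up for illustration):

```python
def bob_deinterlace(field):
    """Line-double a 540-line field into a 1080-line frame."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)  # the duplicate carries no new detail
    return frame

field = [f"scanline {i}" for i in range(540)]
frame = bob_deinterlace(field)
print(len(frame))       # 1080 output lines...
print(len(set(frame)))  # ...but only 540 distinct lines of detail
```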



Originally posted by: spidey07
this has already been decided in HT circles and tests.

720p provides a better picture especially for fast moving scenes.

yuck, on 1080i.

And any mention of resolution differences between the two is moot, because no TV can fully resolve 1080i anyway unless it is a fixed-pixel 1080p display. Slap some HD resolution patterns on a TV and prepare to see just how bad they are.



This also brings up a good point that further shows that, for now, 720p is the way to go, and when 1080p becomes more prevalent, a switch to it for at least whatever replaces DVDs will be good (you'll probably not see 1080p over the air or over satellite for a long time, because it would take double the bandwidth of 720p to be at the same compression level).
 
Originally posted by: NuclearNed
1080i has a resolution of 1920x1080. Since it is interlaced, only half the vertical resolution is displayed at one time. So 1920*540 = 1036800 pixels displayed at any given instance.

720p has a resolution of 1280*720. It is not interlaced, so the number of pixels displayed at any given instance is 1280*720 = 921600.

In other words, 1080i, despite being interlaced, displays a higher resolution picture (by a difference of 115200 pixels) at any given instance. (921600/1036800)*100 = 89%. 720p displays only 89% of the video information that 1080i displays at any given instance.

Despite these facts, there are still people who will argue that 720p "looks" better. I am not one of them.
A higher-resolution image will always look sharper and more detailed. Nobody is arguing this fact. A de-interlaced still frame from 1080i will obviously look better than a native 720p image simply because it has more than double the pixel count.

3840x2160 interlaced at 30fps looks clearer than 1920x1080 interlaced at 60fps. Would you agree with this statement? Even though it might look more detailed, which would you rather watch for anything other than a picture frame? I believe the original argument made was that 720p is better for fast motion due to the higher framerate, not the higher "image quality".

Just a side note, but if you told me you could see a difference in image quality between a 921600-pixel and a 1036800-pixel image without your nose pressed up against the screen, I would call you a liar. Not that it has any relevance to the argument; I just find NuclearNed's reasoning for his opinion flawed.
 
1080i is 540 lines with 1920 pixels displayed for 1/60 second PLUS 540 lines with 1920 pixels displayed for 1/60 second. So, in 1/30 second 1080 lines are presented, each with 1920 pixels, but at any given instance only 540 lines are seen. Conversely, 720P is 720 lines with 1280 pixels shown for 1/30 second. Now, while the bandwidth for either 1080i or 720P is about the same, the amount of information is grossly different.

Another way to look at it, in 1/30 second I get 1080 lines of 1920 pixels available to my eyes with 1080i, or 720 lines of 1280 pixels with 720P.
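Counting pixels delivered per 1/30 of a second, as the comparison above does, gives the following (a quick check; 720p60, raised later in the thread, is included for contrast):

```python
# Pixels reaching the eye in each 1/30-second window.
px_1080i = 2 * (1920 * 540)   # two interlaced fields
px_720p30 = 1280 * 720        # one progressive frame at 30 fps
px_720p60 = 2 * (1280 * 720)  # two frames if the source is 720p60

print(px_1080i)   # 2073600
print(px_720p30)  # 921600
print(px_720p60)  # 1843200
```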

HDTV may use a maximum of 19.1 Mbps for bandwidth, but HD prerecorded content (a la JVC's D-Theater D-VHS) can record and playback up to 28.2 Mbps. Also, the amount of bandwidth made available for delivery to your display for HDTV can be quite low. Case in point is with DirecTV's current policy of placing three (3) HD channels on one transponder when the transponder's capacity is well below the 3x19.1 Mbps maximum broadcast offering.

So, while a content producer may be offering said movie or sporting event in XYZ, the delivery may fall far short, making this completely moot. Over-compressing content for redistribution is not new at DirecTV, and the evidence can be seen by comparing uncompressed SD locals to the overly compressed versions DirecTV delivers.

1080P as a potential new standard for content is a bust until sufficient means of transport can be accommodated. Current deployed broadcast infrastructure using MPEG-2 makes that impossible without deploying new hardware. Having a 1080P display device will also make little difference if the only place allowed to deinterlace 1080i is in the display itself (potential for bad deinterlacing), or if the display disallows 1080P to be directly fed to the display.

720P is quite popular these days with new display products in the electronics market because of 720P LCD and LCoS panels and 720P DMD (DLP) being readily available in large quantities. 1080P LCD and LCoS panels are not plentiful, and I think only one company displayed a 1080P LCD product at CEDIA earlier this month.
 
Originally posted by: Sasha
1080i is 540 lines with 1920 pixels displayed for 1/60 second PLUS 540 lines with 1920 pixels displayed for 1/60 second. So, in 1/30 second 1080 lines are presented, each with 1920 pixels, but at any given instance only 540 lines are seen. Conversely, 720P is 720 lines with 1280 pixels shown for 1/30 second. Now, while the bandwidth for either 1080i or 720P is about the same, the amount of information is grossly different.

Another way to look at it, in 1/30 second I get 1080 lines of 1920 pixels available to my eyes with 1080i, or 720 lines of 1280 pixels with 720P.

HDTV may use a maximum of 19.1 Mbps for bandwidth, but HD prerecorded content (a la JVC's D-Theater D-VHS) can record and playback up to 28.2 Mbps. Also, the amount of bandwidth made available for delivery to your display for HDTV can be quite low. Case in point is with DirecTV's current policy of placing three (3) HD channels on one transponder when the transponder's capacity is well below the 3x19.1 Mbps maximum broadcast offering.

So, while a content producer may be offering said movie or sporting event in XYZ, the delivery may fall far short, making this completely moot. Over-compressing content for redistribution is not new at DirecTV, and the evidence can be seen by comparing uncompressed SD locals to the overly compressed versions DirecTV delivers.

1080P as a potential new standard for content is a bust until sufficient means of transport can be accommodated. Current deployed broadcast infrastructure using MPEG-2 makes that impossible without deploying new hardware. Having a 1080P display device will also make little difference if the only place allowed to deinterlace 1080i is in the display itself (potential for bad deinterlacing), or if the display disallows 1080P to be directly fed to the display.

720P is quite popular these days with new display products in the electronics market because of 720P LCD and LCoS panels and 720P DMD (DLP) being readily available in large quantities. 1080P LCD and LCoS panels are not plentiful, and I think only one company displayed a 1080P LCD product at CEDIA earlier this month.


I agree with all this, except that although 1080i gives you 1080 lines of res every 1/30 of a second, 720p can run as high as 60p, which is what makes it good for sports.
Although from what I've read, most TV programs run at 24p, since the source of the video is film.
 
Falloutboy, yes, IF 720p60 is implemented, but then again, who is implementing it? I believe the discussion, while not originally hard-set to HDTV, seemed to point to that focus of concern. But if we are going to open the discussion up (which is good) beyond what re-broadcasters are offering, then we can also include every offering, not just under the ATSC specification, but also things well-suited to non-HDTV yet still high in definition.

What format the content is created in means little if the distribution, the display, and one's eyes are not all equally up to that well-created content.

Apex, Pepsi is too sweet and interferes with my eating of other sweets. I complement my candies, bakery items, and other sweet foods with something less sweet to drink. Coke seems to be the fix to go with that spirit demanding much of my attention. 🙂
 
Originally posted by: BD2003
720p for sports, anything high motion

1080i is best for film and low motion video.

BD2003 is correct

BTW, at reduced bandwidth (13-15 Mbps), 720p beats 1080i every time
(Broadcaster here...)
720 will beat out 1080i in the long run.

 