
720p vs 1080i

jacktesterson

Diamond Member
My question to you....

My secondary TV is a 720p/1080i set (1366*768). I use it for Xbox 360, HDTV Satellite and an Upconverting DVD Player.

My question is about the 360 and DVD player.... I can choose on both which resolution to use... Should I use 720p or 1080i? I've been using 1080i on both but am wondering if there would be any advantage to going 720p.

In theory, which would be better? I've got some reasons for both, but am wondering about other people's opinions.

Thanks
 
Try both and see what you think.

I'd assume 720p would give you better results, but who knows?
For the DVD player, it might even be better to do 480i/480p, depending. Since the TV is going to have to scale 720p anyway, it might be better to just feed it the original resolution and let it scale just once?
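To put rough numbers on that, here's a quick sketch of the two scaling chains (the resolutions are the standard ones; the chain layout is just my illustration, not anything a real player reports):

```python
dvd = (720, 480)       # native DVD resolution (NTSC)
panel = (1366, 768)    # native panel resolution

# Chain A: the player upconverts to 1080i, then the TV scales down to the panel.
chain_a = [dvd, (1920, 1080), panel]
# Chain B: the player outputs the DVD's native resolution; the TV scales once.
chain_b = [dvd, panel]

for name, chain in (("via 1080i", chain_a), ("direct", chain_b)):
    steps = " -> ".join(f"{w}x{h}" for w, h in chain)
    print(f"{name}: {steps}  ({len(chain) - 1} scaling pass(es))")
```

Every extra pass is another chance for a scaler to soften the image, which is why feeding the native resolution once might win.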
 
I actually thought 1080i would give me a better result.

I am going to play around and see if I can really see any difference with the DVD player. It's a Philips DVP5990/37; it uses a MediaTek chip.

With my 360 I use the standard component cables that came with it. I find 1080i looks better on my set than 720p.

My understanding is that most quality upconverting DVD players do a better job than the scalers built into a TV. Also, a TV shouldn't be scaling a second time as long as it's being fed an HD signal (correct me if I'm wrong). And from the research I've done, most 720/768p TVs actually prefer a 1080i signal for picture quality.

I was just wondering if anyone would know in theory in my situation which one should look best.
 
I was in the same boat as you... as my projector tops out at 1080i.

I think the real question here is the difference between progressive and interlaced... It's not just about resolution.

The earliest known form of video compression was the use of the interlaced format, developed roughly 70 years ago to address early TV technology challenges and broadcast bandwidth constraints.

In interlaced video, each field of a video image displays every other horizontal line of the complete image. For example, in the first interlaced field, the even-numbered lines making up the complete image would be displayed, and then with the second field, the odd-numbered lines of that image would be shown. Repeat this even/odd interlaced sequence frequently enough, say 25 to 30 times per second, and the "persistence of human vision" allows a viewer to see what appears to be complete moving images.
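If it helps, here's a minimal sketch of that even/odd "weave" in Python/NumPy (my own illustration, assuming two clean half-height fields):

```python
import numpy as np

def weave_fields(even_field, odd_field):
    # even_field holds lines 0, 2, 4, ...; odd_field holds lines 1, 3, 5, ...
    height = even_field.shape[0] + odd_field.shape[0]
    frame = np.empty((height, even_field.shape[1]), dtype=even_field.dtype)
    frame[0::2] = even_field
    frame[1::2] = odd_field
    return frame

# Two 540-line fields weave into one 1080-line frame.
even = np.zeros((540, 1920), dtype=np.uint8)
odd = np.ones((540, 1920), dtype=np.uint8)
print(weave_fields(even, odd).shape)  # (1080, 1920)
```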

The main benefit of interlaced video is that it allows more detailed images to be created than would otherwise be possible within a given amount of bandwidth -- in effect, interlacing allows a doubling of image resolution. But interlaced video comes with real-world downsides, including image softening during fast-motion sequences as well as moiré or strobing artifacts that sometimes appear when striped shirts, plaid jackets, bricks in a building, or similar objects are shown.
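A quick back-of-the-envelope calculation shows where that doubling comes from (60 Hz numbers assumed for illustration):

```python
width, height = 1920, 1080
fields_per_second = 60   # 1080i sends 60 half-height fields per second
frames_per_second = 60   # a hypothetical 1080p stream at the same refresh rate

rate_interlaced = width * (height // 2) * fields_per_second
rate_progressive = width * height * frames_per_second
print(rate_progressive / rate_interlaced)  # 2.0 -- interlacing halves the raw pixel rate
```

So for the same raw pixel rate, interlacing buys you twice the nominal line count.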

Progressive video, on the other hand, is made up of consecutively displayed video frames that contain all of the horizontal lines that make up the image being shown. As a result, images appear smoother, fast-motion sequences are sharper and artifacts are much less prevalent.

The primary drawback to progressive video, at least until very recently, was the higher bandwidth requirement. But today, television systems and packaged media such as DVD are moving away from analog transmission and storage to digital variants, allowing considerably more efficient video compression to be applied. This results in even higher resolution images than were possible via interlaced analog video, using the same amount of bandwidth.

While interlaced video will continue to be with us for some time as a result of the 1,080-line interlaced (1080i) HD format used by broadcasters in the US and some other countries, both displays and packaged media are moving exclusively toward progressive video formats, such as 720- and 1080-line progressive (720p and 1080p) formats.

In fact, all digital, non-CRT displays are natively progressive, and any interlaced video signals they receive must be converted, or "de-interlaced", to the progressive format before they can be displayed.
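The simplest deinterlacing method is "bob" (line doubling); real TVs use smarter motion-adaptive approaches, but this toy version shows the basic job the display has to do (my own sketch, not any particular TV's algorithm):

```python
import numpy as np

def bob_deinterlace(field):
    # Stretch one half-height field to full height by repeating every line,
    # producing a complete progressive frame for each incoming field.
    return np.repeat(field, 2, axis=0)

field = np.zeros((540, 1920), dtype=np.uint8)  # one 1080i field
print(bob_deinterlace(field).shape)            # (1080, 1920)
```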

Personally, I like the fluid motion that progressive offers, and I often watch sci-fi movies with action shots... so I choose 720p over 1080i. But if you're just watching The Simpsons or Family Guy, I'd think 1080i would be great for toons in HD.

Here is another take...

Interlaced scanning shows half of an image (on the odd rows of pixels) every sixtieth of a second, and then it shows the other half (on the even rows) the next sixtieth. Therefore, it takes one thirtieth of a second to show a complete frame, giving a framerate of 30 frames per second.

Progressive scanning shows the entire image every sixtieth of a second, so the framerate is twice as high - 60 frames per second.

Therefore, progressive scanning creates a smoother image, and is preferable if you have a choice.
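Or, as plain arithmetic (assuming the usual 60 Hz timing):

```python
refresh_hz = 60                     # one field or frame every sixtieth of a second
interlaced_fps = refresh_hz / 2     # two fields make one complete frame -> 30
progressive_fps = refresh_hz        # every refresh is a complete frame  -> 60
print(interlaced_fps, progressive_fps)  # 30.0 60
```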
 
It's really a personal preference. I like 720p because 1080i has too much of a blurring effect for me.
 
I would guess that 720p would be the best setting for those sources. I'm pretty sure most Xbox 360 games are native 720p. With DVDs you'll be upconverting, and I would think more problems would occur by upconverting to 1080i and then having your TV scale that down to its 1366x768 resolution. I actually set my PS3 not to upconvert DVDs, and I think it made a noticeable improvement in picture quality.
 
720p for me. Sports, action movies, anything with movement looks better in 720p. Cartoons and sweeping panoramas look great in 1080i.
 
I like 720p, and since I don't want to switch back and forth depending on what I'm watching, I stick with 720p and am happy.
 
I have a TV with the same resolution as yours, but it also accepts and downscales 1080p. In the PlayStation XMB the text is a bit blurred in 1080p, while 720p looks a lot better. I have not tried 1080i. I assumed 720p was always better for HD Ready (1366*768) TVs because there's less work for the TV's scaler to do.

The only time you don't get scaling is if you have a full HD (1920*1080) TV taking in a 1080p/i stream. An HD Ready TV is defined as having a resolution of 720p or above; most, like mine and yours, have a slightly odd 1366*768 panel because it was easier to manufacture or something. Use whatever you think works best; for me, like I said, it was 720p by far.
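For what it's worth, the scale factors the TV has to apply to each input work out like this (panel and input resolutions are the standard ones):

```python
panel_w, panel_h = 1366, 768
for name, (w, h) in [("720p", (1280, 720)), ("1080i/p", (1920, 1080))]:
    print(f"{name}: x{panel_w / w:.2f} horizontal, x{panel_h / h:.2f} vertical")
# 720p:    x1.07 horizontal, x1.07 vertical  (slight upscale)
# 1080i/p: x0.71 horizontal, x0.71 vertical  (downscale, after deinterlacing for 1080i)
```

Neither is a clean integer ratio, so the panel scales either way; 720p is just much closer to native.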
 