720P vs. 1080I PS3 Output On A "720P" TV Revisited

Paddington

Senior member
Jun 26, 2006
538
0
0
I've had my PlayStation 3 for a few weeks now. My 32" Sony Bravia TV has a resolution of 1366 x 768, so I set the PS3 to output at 720p, even though the system had detected 1080i. I know the TV can't actually do true 1080i, since it doesn't have 1080 lines of resolution. However, I've been noticing a fair amount of pixelation during gaming, which I don't see while watching the high definition broadcast channels.

Could it be that the "720p" TV isn't actually 720p? The resolution is 1366 x 768. So is the TV taking 720p signals from the PS3 and then scaling them up to its real native resolution, therefore resulting in jaggies? In which case, would I not be better off having the PS3 output at 1080i and letting the TV scale down the higher resolution? :confused:
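For a rough sense of what the set has to do in each case, here is some back-of-the-envelope arithmetic (a Python sketch, purely illustrative, assuming the standard signal resolutions rather than anything measured from this particular Bravia):

    # Scale factors a 1366x768 panel has to apply to each input mode.
    # Standard HD signal sizes assumed; the 1080i path would also need
    # deinterlacing first, which isn't shown here.
    panel_w, panel_h = 1366, 768

    inputs = {
        "720p  (1280x720 progressive)": (1280, 720),
        "1080i (1920x1080 interlaced)": (1920, 1080),
    }

    for name, (w, h) in inputs.items():
        sx, sy = panel_w / w, panel_h / h
        direction = "upscale" if sy > 1 else "downscale"
        print(f"{name}: x{sx:.3f} horizontal, x{sy:.3f} vertical ({direction})")

    # 720p  ...: x1.067 horizontal, x1.067 vertical (upscale)
    # 1080i ...: x0.711 horizontal, x0.711 vertical (downscale)

Neither ratio is a whole number, so the TV has to interpolate either way; the question is only whether it interpolates up from 720 lines or down from 1080.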

Here's also some guy's opinion I ran across:

No, 1080i looks better than 720p on most HD TVs because it comes closer to filling all the pixels on your TV at any given moment.
Most HD TVs have more pixels than 720p; even a "720p" TV has 1366x768 as opposed to 1280x720. 1080i dithers 540 lines at 60fps, and because the human eye can only really see motion at 29.97fps, a 540-line interlaced signal running at 60fps will appear to look closer to a full 1080 signal running at 30fps. 720p shows 720 lines at 30fps without interlacing, but because only 720 lines are filled in any given frame, the picture will appear slightly blurry, since it has to be upscaled to fit the 1366x768 resolution of the TV.
Thus 720p looks worse than 1080i. Don't believe me? Take an HD movie and watch it at 1080i and then watch it at 720p; on most HD TVs 1080i gives you a crisper picture because of the interlaced illusion. Now, if you can see at 60fps, then 1080i will look terrible, but the human eye clocks out at roughly 30fps. If you have superhuman powers and can see full 60fps motion like Superman, then don't use 1080i, as you'll only see half the image at any given frame.
 

EvilComputer92

Golden Member
Aug 25, 2004
1,316
0
0
720p is superior to 1080i. Here's why:

The guy you quoted is a fool for thinking that the human eye can only see motion at 30fps. The eye has no theoretical limit and what it perceives is limited by the brain. This is why some people cannot notice above 60fps, while others can still notice 100fps.

He doesn't take into account that interlacing brings artifacts into the picture and text on screen looks nearly unreadable.

Not to mention that 1080i increases input lag much more than 720p. This is because the internal processor in the TV needs to do a lot more work to deinterlace 1920x1080 interlaced, as opposed to 1280x720 progressive where it only needs to upscale the signal.
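To put very rough numbers on the "more work" part (ballpark pixel counts only, saying nothing about any specific TV's processor, and ignoring the field buffering a motion-adaptive deinterlacer needs, which is where much of the extra lag would actually come from):

    # Per-second pixel workload for the two paths into a 1366x768 panel.
    # Purely illustrative arithmetic.
    panel = 1366 * 768

    # 720p path: upscale 60 progressive frames per second.
    p720_read  = 1280 * 720 * 60          # ~55 Mpix/s in
    p720_write = panel * 60               # ~63 Mpix/s out

    # 1080i path: rebuild full 1920x1080 frames from 60 half-height fields
    # per second, then downscale to the panel.
    i1080_read    = 1920 * 540 * 60       # ~62 Mpix/s in
    i1080_rebuild = 1920 * 1080 * 60      # ~124 Mpix/s of deinterlaced frames
    i1080_write   = panel * 60            # ~63 Mpix/s out

    print(p720_read, p720_write)
    print(i1080_read, i1080_rebuild, i1080_write)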
 

ethebubbeth

Golden Member
May 2, 2003
1,740
5
91
Yes, your tv is upscaling the input from 720p (1280x720, all scanlines rendered every frame) to your display's native resolution (1366x768, all scanlines rendered every frame). When your tv receives a 1080i signal (1920x1080, alternate scanlines rendered each frame, for a complete image every other frame), it will scale it down to 1366x768 (and apply deinterlacing). Neither is an ideal situation. For what it's worth, I prefer sending my tv a 720p signal since its deinterlacer introduces terrible ghosting to the image.

EDIT: Forgot to close a parenthesis. Also, its, not it's.
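To make the two paths above concrete, here is a toy numpy sketch using bob deinterlacing and nearest-neighbour resizing. Real TV scalers and deinterlacers are far more sophisticated, so this only shows the shape of each pipeline, not its quality:

    import numpy as np

    PANEL = (768, 1366)  # (height, width) of the native panel

    def nearest_resize(img, out_h, out_w):
        """Nearest-neighbour resize; a crude stand-in for the TV's scaler."""
        in_h, in_w = img.shape[:2]
        rows = np.arange(out_h) * in_h // out_h
        cols = np.arange(out_w) * in_w // out_w
        return img[rows][:, cols]

    def path_720p(frame_720p):
        """720p input: a single scaling step up to the panel."""
        return nearest_resize(frame_720p, *PANEL)

    def path_1080i(field_540):
        """1080i input: bob-deinterlace a 1920x540 field to 1920x1080 by
        line doubling, then scale the result down to the panel."""
        frame_1080 = np.repeat(field_540, 2, axis=0)
        return nearest_resize(frame_1080, *PANEL)

    frame = np.random.randint(0, 256, (720, 1280), dtype=np.uint8)
    field = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
    print(path_720p(frame).shape, path_1080i(field).shape)  # both (768, 1366)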
 

biggestmuff

Diamond Member
Mar 20, 2001
8,201
2
0
Originally posted by: EvilComputer92
720p is superior to 1080i. Here's why:

The guy you quoted is a fool for thinking that the human eye can only see motion at 30fps. The eye has no theoretical limit and what it perceives is limited by the brain. This is why some people cannot notice above 60fps, while others can still notice 100fps.

He doesn't take into account that interlacing brings artifacts into the picture and text on screen looks nearly unreadable.

Not to mention that 1080i increases input lag much more than 720p. This is because the internal processor in the TV needs to do a lot more work to deinterlace 1920x1080 interlaced, as opposed to 1280x720 progressive where it only needs to upscale the signal.

step away from the crack pipe.
 

EvilComputer92

Golden Member
Aug 25, 2004
1,316
0
0
Originally posted by: biggestmuff
Originally posted by: EvilComputer92
720p is superior to 1080i. Here's why:

The guy you quoted is a fool for thinking that the human eye can only see motion at 30fps. The eye has no theoretical limit and what it perceives is limited by the brain. This is why some people cannot notice above 60fps, while others can still notice 100fps.

He doesn't take into account that interlacing brings artifacts into the picture and text on screen looks nearly unreadable.

Not to mention that 1080i increases input lag much more than 720p. This is because the internal processor in the TV needs to do a lot more work to deinterlace 1920x1080 interlaced, as opposed to 1280x720 progressive where it only needs to upscale the signal.

step away from the crack pipe.

Are you just trolling because you have nothing to say?
 

Paddington

Senior member
Jun 26, 2006
538
0
0
Well, I'm thinking the 1080i output from the PS3 is probably better for a TV with 768 lines, but an interesting point has been raised above about the quality of the TV's processor.

I think another concern is that a lot of PS3 games only do 720p. So if you set it to output at 1080i, I wonder if it would scale the 720p up to 1080i, which would then end up being downscaled to 768 lines, which is pretty bad.

I can't believe how badly this was all set up by the electronics makers.
 

cputeq

Member
Sep 2, 2007
154
0
0
because the human eye can only really see motion at 29.97fps

Yeah, I would have stopped listening to anything that dude had to say at that point, if not sooner.

You might try seeing if you can set your TV to stop scaling input video and run it native. Of course, you'll get a bit of a black border around the image, but if it results in less blockies from the upscaling, it may be worth it.
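If the set has such a 1:1 mode (names like "Dot by Dot" or "Full Pixel" vary by manufacturer, and not every model offers one for HD inputs), the border sizes are easy to work out:

    # Black border left over if a 1280x720 frame is shown pixel-for-pixel
    # on a 1366x768 panel, with no scaling.
    panel_w, panel_h = 1366, 768
    src_w, src_h = 1280, 720

    side_border = (panel_w - src_w) // 2   # 43 px on the left and right
    top_border  = (panel_h - src_h) // 2   # 24 px on the top and bottom
    print(side_border, top_border)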
 

DivideBYZero

Lifer
May 18, 2001
24,117
2
0
I run 1080p and let the TV processor deal with it (Panasonic PX70 42" 1080i panel w/1080p processor). Looks great.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
I run it at 720p just fine even though the PS3 tried to set itself to 1080i. I noticed no difference in switching, but my TV uses true 720p (1280x720) resolution.
 

erwos

Diamond Member
Apr 7, 2005
4,778
0
76
Originally posted by: EvilComputer92
720p is superior to 1080i. Here's why:

The guy you quoted is a fool for thinking that the human eye can only see motion at 30fps. The eye has no theoretical limit and what it perceives is limited by the brain. This is why some people cannot notice above 60fps, while others can still notice 100fps.

He doesn't take into account that interlacing brings artifacts into the picture and text on screen looks nearly unreadable.

Not to mention that 1080i increases input lag much more than 720p. This is because the internal processor in the TV needs to do a lot more work to deinterlace 1920x1080 interlaced, as opposed to 1280x720 progressive where it only needs to upscale the signal.
You're completely ignoring the scenario where there's very little movement (e.g., some movies). In that case, 1920x1080i seems like a much better choice. This isn't as open and shut as the article's author makes it. In fact, the author doesn't address vertical resolution _at all_, which is another place his argument fails.

Plus, a quality deinterlacer will not produce artifacts like you're describing. I watch 1080i broadcasts all the time. The text is perfectly legible.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
81
Originally posted by: erwos
Originally posted by: EvilComputer92
720p is superior to 1080i. Here's why:

The guy you quoted is a fool for thinking that the human eye can only see motion at 30fps. The eye has no theoretical limit and what it perceives is limited by the brain. This is why some people cannot notice above 60fps, while others can still notice 100fps.

He doesn't take into account that interlacing brings artifacts into the picture and text on screen looks nearly unreadable.

Not to mention that 1080i increases input lag much more than 720p. This is because the internal processor in the TV needs to do a lot more work to deinterlace 1920x1080 interlaced, as opposed to 1280x720 progressive where it only needs to upscale the signal.
You're completely ignoring the scenario where there's very little movement (e.g., some movies). In that case, 1920x1080i seems like a much better choice. This isn't as open and shut as the article's author makes it. In fact, the author doesn't address vertical resolution _at all_, which is another place his argument fails.

Plus, a quality deinterlacer will not produce artifacts like you're describing. I watch 1080i broadcasts all the time. The text is perfectly legible.

Right. The guy isn't *completely* wrong. On a 1080p set, there are plenty of good reasons to go with 1080i over 720p.

But on a 720p/768p set, you gain nothing by going with 1080i. Either it's going to throw away half the lines and display it as 540p, or it's going to deinterlace it, and that will probably add lag to the game. Your eye isn't going to notice the difference between 720p and 768p, so just set it to 720p, especially considering the PS3 games are running in 720p natively.
 

mlm

Senior member
Feb 19, 2006
933
0
0
Originally posted by: Paddington
I think another concern is that a lot of PS3 games only do 720p. So if you set it to output at 1080i, I wonder if it would scale the 720p up to 1080i, which would then end up being downscaled to 768 lines, which is pretty bad.

If it's a game that is natively 720p but is enabled by the developer to upscale to 1080p, the PS3 will keep it at 720p as long as you have 720p listed as one of your supported resolutions in the settings menu.

 

Eeezee

Diamond Member
Jul 23, 2005
9,922
0
76
I'm in the same boat as the OP: I switched my PS3 to output in 720p instead of 1080i and noticed no difference...
 

Wuzup101

Platinum Member
Feb 20, 2002
2,334
37
91
Unfortunately, the industry decided to go with 1366x768 screens while marketing them against a "matching" 720p format that is actually 1280x720. To be very honest, one of the biggest advantages of owning a 1080p set is that you don't have to worry about scaling with as many sources. On a "768p" set, everything is scaled, because nothing is native 1366x768 (and I mean nothing commercially - not that you couldn't process a BR/HD-DVD and save it as that... or that you couldn't have a computer output a closer format).

In any case, the easiest way to find out what works best is just to try it. You obviously already have the equipment, so do some leg work. As far as what the eye can see, there are limitations and exceptions. The eye/brain is limited in the amount of detail it can detect from a given distance. It is also limited in the number of "pictures" it can take in over a given amount of time. However, there can be big differences between what the eye sees and what the brain interprets.

Generally, we are much more sensitive to uneven frame rates than we are to low frame rates. That is, if you have a very steady frame rate of 30fps, and a frame rate of 60fps that is dropping a few frames here and there (and thus changing the frame rate by +/- 2fps), you will generally have more problems with the faster frame rate. Higher frame rates can make things seem more fluid, which is great! However, 30fps is plenty adequate for displaying A LOT of media (as the film and TV industries have already proven). The key is consistency, not necessarily how many frames you can get.
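As a toy illustration of that consistency point (the numbers are made up purely to show what "uneven" means in milliseconds):

    import statistics

    # Frame times in ms: a locked 30fps stream vs. a nominally 60fps stream
    # that drops the odd frame, so some frames take twice as long.
    locked_30 = [33.3] * 60
    uneven_60 = ([16.7] * 9 + [33.3]) * 6   # one dropped frame in every ten

    for name, times in [("locked 30fps", locked_30), ("uneven ~60fps", uneven_60)]:
        print(f"{name}: mean {statistics.mean(times):.1f} ms, "
              f"jitter (stdev) {statistics.pstdev(times):.1f} ms, "
              f"worst {max(times):.1f} ms")

The second stream averages faster frames, but the occasional 33 ms hitch is what the eye tends to notice.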

Also... remember that if you are scaling something from 720p up to 768p, you are always going to be "making crap up." Obviously, your TV doesn't just do this at random. However, most will agree that "downscaling" is generally preferred to upscaling in theory (this always depends on the gear in practice).
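A one-scanline sketch of the "making crap up" point, using plain linear interpolation (real scalers use fancier filters, but the invented-values part is the same):

    import numpy as np

    # Map 1280 source pixels from one scanline onto 1366 panel pixels.
    src = np.random.randint(0, 256, 1280).astype(float)
    positions = np.linspace(0, 1279, 1366)       # where each panel pixel samples
    line_768p = np.interp(positions, np.arange(1280), src)

    on_grid = int(np.isclose(positions % 1, 0).sum())
    print(f"{on_grid} of 1366 panel pixels land exactly on a source pixel;")
    print("the rest are blends of two neighbouring source pixels.")

In this 1280-to-1366 case only the first and last pixel line up exactly; everything in between is made up from its neighbours.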

Just try both, and report back! Or sell the 32 and get a 40!