Does my 720p HDTV really do 1080p?

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
I just hooked up my 720p Sanyo HDTV (DP26746, 1366x768) with a DVI->HDMI cable and found that 1920x1080 is available as a resolution. I previously had it hooked up via component and was limited to 1280x720. Naturally I was skeptical when I saw that resolution also offered in The Witcher, which I just bought. I figured, what the heck, it will probably just crash, and picked 1080p.

Whoa... that looks amazing. I can see every brick in the castle wall as far into the distance as I can see. No major aliasing anywhere. I set it back to 720p because I figured it was outputting 720p anyway when set to 1080p. Nope... visible aliasing everywhere, and distant textures look like crap; I can only see the bricks halfway up, etc. Then I went down to 1924 and 800, because I thought it was impossible to do 1080p with a 720p HDTV, right? Each lower resolution was predictably worse. I went back to 1080p and it was just like before. Crysis at 720p didn't look this good.

So I tried COD4, Gears of War, TimeShift, and of course Crysis. All of them offered the resolution and all looked amazing; the Crysis jungle looked almost photorealistic. In Crysis, the fps at 720p is 50-60% higher than at 1080p. RivaTuner also shows 1920x1080 while it is loading.

What is going on here? The scaling type in the Nvidia driver is set to "adapter", but it shows the active vertical lines as 540, not 1080. I tried setting it to 1080 but it won't stick. It also claims the mode is interlaced, which should cap it at 30fps, right? I get higher frame rates than that in all games.

Anyone with more HDTV tech savvy than me able to explain this? I don't have a lot of experience with HDTVs, so I imagine there is a logical explanation, but I will continue to leave it at that resolution because my eyes don't lie. It is much better than 720p, and it can't just be 1366x768 either. I know many HDTVs only do 1080i over component and 1080p over HDMI, but only if they are 1080p capable. Is it upscaling or some other fancy trick going on?
 

Snooper

Senior member
Oct 10, 1999
465
1
76
There ARE 1080 lines in 1080i, but only half of them are refreshed on each pass (odd lines, then even lines, then odds again, etc.), so the data rate is half that of 1080p at the same refresh rate. More than likely, your TV is rescaling the signal back to 720p. You can bet your life on the fact that IF that panel could display 1080i/p, Sanyo would be shouting it from the rooftops.
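
Back-of-the-envelope (my own arithmetic, ignoring blanking overhead):

```python
# Rough comparison of 1080i vs 1080p signal rates at a 60 Hz refresh.
width, height, refresh = 1920, 1080, 60

# 1080i: each refresh carries one 540-line field (odd lines, then even lines).
pixels_per_sec_1080i = width * (height // 2) * refresh   # ~62.2 million/s
# 1080p: each refresh carries a full 1080-line frame.
pixels_per_sec_1080p = width * height * refresh          # ~124.4 million/s

print(pixels_per_sec_1080i / pixels_per_sec_1080p)       # 0.5 -> half the data rate
```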
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Wouldn't 1280x720 and 1920x1080 look the same if it was rescaling 1080 down to 720p? The frame rate should be the same too, but it is actually half the 720p frame rate. I looked into the TV's specs and it uses line doubling, so I would think that means it's refreshing every other frame.

The response time on this thing is a terrible 25ms, so I thought I would be seeing tearing once the frame rate goes over 60. I don't see any at all, which is why I wondered what was going on. I played Crysis for over an hour and GOW for half an hour, and it looks a lot better at 1080i than at 720p with 4x MSAA. I didn't even know the TV could do 1080i; it isn't listed in the manual anywhere, only 720p is. I only noticed it when I switched from component to HDMI.

EDIT: found this in the specs: Digital Scanning Display Format
720p (all signals converted to 720p display)

If everything is converted to 720p, why am I able to set it to 1920x1080, and why does that look better than 1280x720? These HDTVs are confusing. Maybe I should stick with my CRT. LOL
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Rendering at a higher resolution than the one the image is scaled to is essentially a form of anti-aliasing.
You have more detail to work with, which produces a more accurate image when scaled down to the lower resolution than just rendering at the lower resolution in the first place. I believe it's basically equivalent to supersampling.
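
A minimal sketch of the idea (just an illustration with a clean integer 2x factor; the TV's scaler actually does a non-integer resize down to its native grid, but the principle is the same):

```python
import numpy as np

def downsample_box(image, factor=2):
    """Average each factor x factor block into one output pixel.

    This is the core of supersampling AA: render more samples than the
    output grid has pixels, then average them down.
    """
    h, w, c = image.shape
    h, w = h - h % factor, w - w % factor        # trim to a multiple of factor
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# Pretend this is a frame rendered at twice the panel resolution.
hi_res = np.random.rand(1536, 2732, 3)           # 2 x (768 x 1366)
lo_res = downsample_box(hi_res, factor=2)
print(lo_res.shape)                              # (768, 1366, 3)
```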
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
So the video card is rendering at 1920x1080 but I am not seeing it at that resolution? So if I got a 1080p HDTV my frame rates would be the same but it would look much better?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I think that a 720p TV can display 1080i b/c that is a lower res anyway. As mentioned, 1080i is 1/2 of 1080p = 540p. 1080p's benefit is smoother playback of action sequences (sports are a great example). I love 1080i content on my 720p TV b/c it is definitely crisper than regular 720p content, btw. Planet Earth on Discovery HD really sold me on the concept of HD; it looked a LOT better than other shows I had seen on my HD channels previously.
 

Elcs

Diamond Member
Apr 27, 2002
6,278
6
81
Check your TV manual. They are usually evilly difficult to decipher quickly; I know mine was.

I thought I was limited to 720p on my 42" Panasonic plasma; however, it accepts 1080i via HDMI, so I can play at 1080i in any game my 8800GT can handle at that resolution. The best thing about a resolution that high is that you can drop a lot of AA, which can really nerf performance.
 

tshannon92

Member
Nov 28, 2007
41
0
0
I was always curious about this. My first HDTV was a 26-inch hand-me-down from my younger brother (it's now my computer monitor). It was 720p and I loved it, though it was definitely too small for the living room. After I got it I had to have bigger, so I bought a 42-inch Sharp Aquos (the 50s were just too big). I love it, but I notice very little difference between 720p, 1080i, and 1080p. I think the only 1080p I've seen was from .mkv files, and I don't even know if those were actually 1080p, though I was connected over HDMI via a DVI-to-HDMI cable.


Why is it that no matter what the TV is, the SA8300 cable box always says 1080? It says that whether it's connected to the 720p or the 1080p TV.
 

mruffin75

Senior member
May 19, 2007
343
0
0
Originally posted by: bryanW1995
I think that a 720p TV can display 1080i b/c that is a lower res anyway. As mentioned, 1080i is 1/2 of 1080p = 540p. 1080p's benefit is smoother playback of action sequences (sports are a great example). I love 1080i content on my 720p TV b/c it is definitely crisper than regular 720p content, btw. Planet Earth on Discovery HD really sold me on the concept of HD; it looked a LOT better than other shows I had seen on my HD channels previously.

Wrong.

1080i cannot be explained away as "only 540p". That is the mistaken reasoning behind thinking a 720p screen can display 1080i because "it's lower". It's not lower; it's still 1080 lines of information, they just aren't all refreshed at the same time the way they are on a progressive display.

Any way you look at it, a display with only 768 or fewer lines of vertical resolution is scaling the image. Fox5 was correct that it's effectively a form of supersampling used to downsize the rendered image. That is the only reason it would "look better" on a lower-resolution display.

This is also why there are so many cheap LCDs that say they do "1080i" but only have a native resolution of, say, 1366x768, which is nowhere near full 1080i resolution (horizontally or vertically).

For example:

http://www.tigerdirect.com/app...=3164542&Sku=R104-3200

It looks cheap (even though it's a refurb), and it says it's compatible with 1080i signals. But compatible doesn't always mean it displays 1080i natively; it just means that instead of installing an LCD panel that is fully 1080-line capable, it's cheaper to include some scaling circuitry.
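
To put a number on that gap (simple arithmetic, nothing specific to that set):

```python
panel  = 1366 * 768    # native pixels on a typical "720p" LCD: 1,049,088
signal = 1920 * 1080   # pixels in a full 1080-line frame:      2,073,600

print(panel / signal)  # ~0.51 -> the panel has roughly half the pixels of the signal
```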
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
Most newer HDTVs that have native resolutions under 1920x1080 (1024x768, 1366x768, 1680x1050, etc.) still allow you to connect 1080p devices - which the display will scale to its native res. Scalers are becoming more versatile and less expensive, which is nice.

Generally speaking, a high resolution fed to a low resolution display will always look better than the native display resolution when it comes to moving images. As some above have pointed out, it's a method of "free AA". When it comes to static images or text, it usually works against you. That's where scaler quality comes into play.

I've got a new 1080p (1920x1080) plasma and an older 480p (852x480) plasma. They can accept and display 1920x1080 via HDMI and 1920x1200 via DVI/VGA, respectively. Side by side (actually room by room) with the same image going to both with no AA/AF applied, the new 1080p plasma is noticeably sharper, but the penalty comes in the form of visible jaggies. The 480p plasma displaying the same image is completely free from jaggies. From 12 feet away, they almost look the same, but there is definitely a loss of sharpness on the 480p set.

720p sets (1024x768, 1366x768, 1280x800, etc.) have a definite resolution advantage over my old 480p plasma, so I imagine it would be even more difficult to tell the difference. On Black Friday I picked up some toys from Best Buy, namely two of the $400 Dynex 32" HDTV LCDs and two of the $150 22" LCDs. While the overall PQ of the cheap Dynex is far below my plasmas, I'll have to connect my PC to it and see what it's capable of...
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Shaq ...It also claims the mode is interlaced, which should cap it at 30fps, right? I get higher frame rates than that in all games.
Your framerate isn't capped because you aren't using vsync.


Every HDTV accepts 1080i, and most new ones accept 1080p as well. Your display's native resolution is 768p though, so whatever resolution you input gets scaled to that.
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Originally posted by: therealnickdanger
Most newer HDTVs that have native resolutions under 1920x1080 (1024x768, 1366x768, 1680x1050, etc.) still allow you to connect 1080p devices - which the display will scale to its native res. Scalers are becoming more versatile and less expensive, which is nice.

Generally speaking, a high resolution fed to a low resolution display will always look better than the native display resolution when it comes to moving images. As some above have pointed out, it's a method of "free AA". When it comes to static images or text, it usually works against you. That's where scaler quality comes into play.

That would explain why there is no tearing, yet I am still confused that the custom resolution listed there says 1080 interlaced with 540 active vertical lines. That basically means it is accepting a 1080i signal but still scaling it to 1366x768? So it scales to 768p and not 720p? I know the difference between the two would be marginal, but I'll take the extra little bit of resolution.

When I set the desktop resolution to 1920x1080, though, the icons and text are smaller, just like when you set a higher res on a CRT. If it were scaling, would they be smaller? There is definitely some flicker on the menus when they pop up, yet the games look great, which fits what you stated. Getting the AA from the higher resolution seems to have a much lower performance hit than enabling supersampling in the driver, so I can see how it is somewhat "free"; usually enabling supersampling crushes framerates. I tried 8xQ multisampling at 720p and it wasn't even close in IQ to just setting the higher resolution.

I've got a new 1080p (1920x1080) plasma and an older 480p (852x480) plasma. They can accept and display 1920x1080 via HDMI and 1920x1200 via DVI/VGA, respectively. Side by side (actually room by room) with the same image going to both with no AA/AF applied, the new 1080p plasma is noticeably sharper, but the penalty comes in the form of visible jaggies. The 480p plasma displaying the same image is completely free from jaggies. From 12 feet away, they almost look the same, but there is definitely a loss of sharpness on the 480p set.

720p sets (1024x768, 1366x768, 1280x800, etc.) have a definite resolution advantage over my old 480p plasma, so I imagine it would be even more difficult to tell the difference. On Black Friday I picked up some toys from Best Buy, namely two of the $400 Dynex 32" HDTV LCDs and two of the $150 22" LCDs. While the overall PQ of the cheap Dynex is far below my plasmas, I'll have to connect my PC to it and see what it's capable of...

That's interesting... but since my HDTV is 26" and I sit 3 feet away from it, 1080p would look better, so I guess I will still need one down the road. I await the results of your tests. I think I may try some other custom widescreen resolutions below 1080 to see if I can get a little more performance and find the point where the jaggies reappear. It's pretty nice that this level of IQ is obtainable on a cheap 720p TV. I did notice that a couple of games that were fairly sharp to begin with showed less of an improvement at the higher res; NFS ProStreet and TimeShift didn't improve as much as Crysis and The Witcher.

 

Sunrise089

Senior member
Aug 30, 2005
882
0
71
Originally posted by: mruffin75
Wrong.
...

Thank you - the quantity of misinformation early in this thread was shocking, considering this technology is over 5 years old.

How people still confuse what 1080i means, to this day, is beyond me.
 

TheOtherRizzo

Member
Jun 4, 2007
69
0
0
It is hard, though, with so much misinformation all over the web. Almost everywhere it says that with interlaced signals one image consists of two half-images. That simply isn't true for anything except Hollywood movies scanned from a roll of film. For video, TV, and video games, interlaced means you are getting higher temporal resolution in exchange for lower vertical resolution compared to 1080p, i.e. 1920x540 at 60 Hz instead of 1920x1080 at 30 Hz. So 1080p is in no way actually higher than 1080i unless it is 60 Hz 1080p, which is supported by some game consoles but not by HDTV decoders or HD DVD/Blu-ray players.
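
To put numbers on that tradeoff (ignoring blanking intervals):

```python
# 1080i at 60 Hz: 60 fields per second, each field 1920 x 540.
rate_1080i60 = 1920 * 540 * 60
# 1080p at 30 Hz: 30 full frames per second, each 1920 x 1080.
rate_1080p30 = 1920 * 1080 * 30

# Same pixel throughput, just traded between temporal and vertical resolution.
print(rate_1080i60 == rate_1080p30)   # True
```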
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Shaq
That would explain why there is no tearing...
The only way you can avoid tearing is to use vsync; otherwise you just aren't noticing the tearing.

Originally posted by: Shaq
.. yet I am still confused that the custom resolution listed there says 1080 interlaced with 540 active vertical lines.
That is because instead of outputting a full 1080-line progressive frame on every refresh, each 1080-line frame is output as two 540-line interlaced fields, with the odd and even lines coming on alternating refreshes. That is also why you are limited to 30 vsynced frames per second with interlaced resolutions, while progressive resolutions can provide up to 60fps.
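
Roughly, the field split looks like this (just an illustration; the video card does this in hardware, not in Python):

```python
import numpy as np

def split_into_fields(frame):
    """Split one progressive frame into its two interlaced fields."""
    odd_field = frame[0::2]    # lines 1, 3, 5, ... (1-based odd lines)
    even_field = frame[1::2]   # lines 2, 4, 6, ...
    return odd_field, even_field

frame = np.random.rand(1080, 1920, 3)   # one 1920x1080 progressive frame
odd, even = split_into_fields(frame)
print(odd.shape, even.shape)            # (540, 1920, 3) (540, 1920, 3)
# The two 540-line fields go out on consecutive refreshes, so with vsync
# the game can only produce 30 new full frames per second at 60 Hz.
```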

Originally posted by: Shaq
That basically means it is accepting a 1080i signal but still scaling it to 1366x768?
Yes, your LCD is made up of a grid of pixels 1366 wide and 768 tall, and whatever resolution you input to your display is scaled to that.

As for custom 16:9 resolutions between 1920x1080 and 1366x768: I use 1760x990, 1600x900, and 1440x810.
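
Those all come from keeping the height a multiple of 9 so the 16:9 width works out to a whole number of pixels; a quick way to list the candidates (just a throwaway helper, nothing driver-specific):

```python
def widescreen_16x9(min_height=768, max_height=1080):
    """List 16:9 resolutions whose width is a whole number of pixels."""
    modes = []
    for h in range(min_height, max_height + 1):
        w = h * 16 / 9
        if w.is_integer():
            modes.append((int(w), h))
    return modes

print(widescreen_16x9())
# ..., (1440, 810), ..., (1600, 900), ..., (1760, 990), ..., (1920, 1080)
```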

Originally posted by: TheOtherRizzo
It is hard, though, with so much misinformation all over the web. Almost everywhere it says that with interlaced signals one image consists of two half-images. That simply isn't true for anything except Hollywood movies scanned from a roll of film.
It is true for PC output too, which is why Windows lists 1080i as 1920x1080i@30Hz.
 

TheOtherRizzo

Member
Jun 4, 2007
69
0
0
If the game puts out 60 fps and I set Windows to 1080i, doesn't it just output 60 fields per second? Does it really discard half of the frames and then split the 30 frames that are left into two fields each? I wouldn't know, as I've never run a PC interlaced, but that would be really poor. Kind of makes anyone who PC games at 1080i a complete noob. :)
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: TheOtherRizzo
If the game puts out 60 fps and I set Windows to 1080i, doesn't it just output 60 fields per second?
If the game is rendering anything more than 30fps with interlaced output, then you aren't using vsync. With vsync, the framerate will be capped at 30fps. It is still outputting 60 fields per second, though; those fields are simply derived from 30 progressive frames, whether those are the full frames you get when rendering with vsync or the combination of partial ones you get rendering without it.

1080i is fine for the desktop, though, assuming the display has a high enough native resolution, or at least a good enough scaler, to make it usable, and it's fine for any game where you wouldn't be getting much more than 30fps at 1920x1080 regardless. But yeah, one of the main reasons I chose my particular plasma is that its VGA input supports any resolution my video card can output, meaning 16:9 gaming at up to 2560x1440. :p
 

saiga6360

Member
Mar 27, 2007
61
0
0
Originally posted by: TheSnowman
... But yeah, one of the main reasons I chose my particular plasma is that its VGA input supports any resolution my video card can output, meaning 16:9 gaming at up to 2560x1440. :p

Wow, is this for real? I have always wondered why TV manufacturers cap VGA at some crap resolutions instead of letting it go up to 1080p, but for it to go even higher... that's wonderful! What TV do you have? I'm assuming at least a 50-inch screen, right? Is it D-sub or DVI?

EDIT: Never mind, you have one of those 1440p 103" plasma TVs, right? Damn, I wish I had your cash and living room space... and yes, the TV. :)
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: saiga6360
EDIT: Never mind, you have one of those 1440p 103" plasma TVs, right? Damn, I wish I had your cash and living room space... and yes, the TV. :)
Heh, nah, I've got a 1366x768 Panasonic, the TH-50PH9UK. I'm pretty sure all the commercial Pannys' D-sub 15 inputs are the same, though; at least the one on my previous 852x480 TH-42PWD7UY worked just the same. Such high resolutions aren't listed in the manuals, but I assure you they work wonderfully, and games look nearly as sharp as Hollywood CG when rendered at 1440p.

Unfortunately, having just switched to Nvidia, I am stuck at 1600x900 as my max 16:9 resolution in Vista. Unlike ATI, Nvidia's Vista drivers won't exceed the display's EDID, and the one on my plasma is a bog-standard one which claims the max supported resolution is 1600x1200. I'm currently looking into solutions for that, though; hopefully I will be able to simply reprogram the EDID.
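
For what it's worth, the claimed limits live in the EDID block itself. Here's a rough sketch of pulling the preferred mode out of a dumped EDID (the file name is made up, the offsets follow the standard 128-byte EDID 1.3 layout, and a limit like 1600x1200 may actually come from the standard-timing list rather than this descriptor):

```python
def preferred_mode_from_edid(edid: bytes):
    """Read the resolution from the first detailed timing descriptor."""
    dtd = edid[54:72]                          # first 18-byte descriptor
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

# Example: parse an EDID dump made with whatever tool you prefer (path is hypothetical).
with open("plasma_edid.bin", "rb") as f:
    print(preferred_mode_from_edid(f.read()))  # e.g. (1366, 768)
```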

Originally posted by: Throckmorton
The reason 720p looks like crap is that it's interpolating to get 1366x768
I didn't see anyone suggest 720p looks like crap, just that it's not as good as downsampling from a higher rendering resolution. With a decent scaler 720p won't look like crap at all, and most HDTVs these days have decent scalers built in.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: Shaq
So the video card is rendering at 1920x1080 but I am not seeing it at that resolution? So if I got a 1080p HDTV my frame rates would be the same but it would look much better?

Correct.
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Originally posted by: TheSnowman
Originally posted by: Shaq
That would explain why there is no tearing...
The only way you can avoid tearing is to use vsync; otherwise you just aren't noticing the tearing.

Originally posted by: Shaq
.. yet I am still confused that the custom resolution listed there says 1080 interlaced with 540 active vertical lines.
That is because instead of outputting a full 1080-line progressive frame on every refresh, each 1080-line frame is output as two 540-line interlaced fields, with the odd and even lines coming on alternating refreshes. That is also why you are limited to 30 vsynced frames per second with interlaced resolutions, while progressive resolutions can provide up to 60fps.

Originally posted by: Shaq
That basically means it is accepting a 1080i signal but still scaling it to 1366x768?
Yes, your LCD is made up of a grid of pixels 1366 wide and 768 tall, and whatever resolution you input to your display is scaled to that.

As for custom 16:9 resolutions between 1920x1080 and 1366x768: I use 1760x990, 1600x900, and 1440x810.

Thanks for the custom resolutions. 1600x900 is a good balance of performance and IQ; Crysis still stays above 30fps on High. I tried 1366x768 but the TV won't accept it. It keeps setting the 768 back to 540 on auto, and setting the other timing modes doesn't work either. It looks like everything is scaled to 720p and not 768p; anything over 720 lines and the screen gets red flashing squares all over.