I don't understand HDTV...

Josh7289

Senior member
Apr 19, 2005
799
0
76
OK, I don't understand HDTVs... Whenever I look at the specs for them, they say weird resolutions like 1024 x 1024 (a square??), 1366 x 768 (what is this??), 1024 x 768 (isn't that 4:3??), and all kinds of other things.

I thought they were either 1280 x 720 or 1920 x 1080... So what's going on here? And how can all of these weird resolutions display 720p and 1080i when they don't even have enough pixels (1024 and 768 aren't 1080...)? Also, why would a flat panel be both progressive and interlaced? Aren't computer monitors, at least LCDs, progressive?

See how confused I am? Please help. Thank you very much!
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: Josh7289
OK, I don't understand HDTVs... Whenever I look at the specs for them, they say weird resolutions like 1024 x 1024 (a square??), 1366 x 768 (what is this??), 1024 x 768 (isn't that 4:3??), and all kinds of other things.

I thought they were either 1280 x 720 or 1920 x 1080... So what's going on here? And how can all of these weird resolutions display 720p and 1080i when they don't even have enough pixels (1024 and 768 aren't 1080...)? Also, why would a flat panel be both progressive and interlaced? Aren't computer monitors, at least LCDs, progressive?

See how confused I am? Please help. Thank you very much!


1366 x 768 and resolutions like that are there to maintain certain aspect ratios. However, there are exceptions to the rule. For instance, Toshiba plasma TVs are 16:9 widescreen at a resolution of 1024 x 768. The reason is that the pixels are rectangular rather than square; they don't maintain a perfect 1:1 pixel ratio. Basically, it's difficult to explain as a whole. Do some web searches on Google; I remember finding a nice document on the HDTV standards that is easy to understand.
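To put numbers on the rectangular-pixel idea, here's a quick back-of-the-envelope calculation in plain Python (purely illustrative, using the 1024 x 768 / 16:9 example above):

# How much wider than tall each pixel must be for a 1024x768 panel to fill a 16:9 screen
panel_w, panel_h = 1024, 768        # addressable pixels on the panel
screen_ar = 16 / 9                  # physical shape of the screen

storage_ar = panel_w / panel_h      # 1.333... (a 4:3 pixel grid)
pixel_ar = screen_ar / storage_ar   # aspect ratio of each individual pixel

print(f"pixel grid aspect ratio: {storage_ar:.3f}")
print(f"pixel aspect ratio: {pixel_ar:.3f}  (1.0 would be square pixels)")
# prints a pixel aspect ratio of about 1.333

So the panel covers the 16:9 screen by making every pixel roughly a third wider than it is tall, rather than by adding more columns.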
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
OK, I don't understand HDTVs... Whenever I look at the specs for them, they say weird resolutions like 1024 x 1024 (a square??),
Plasma displays have a native resolution (1024x1024 pixels in this example). That is the maximum addressable resolution; the set will downconvert the HDTV stream and map it to those pixels, so even though it is "HDTV ready," there will clearly be interpolated pixels and the full HDTV resolution cannot be displayed.
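Roughly what that downconversion means for the pixel count, sketched in plain Python (resolution numbers taken from this thread, purely illustrative):

# A 1920x1080 frame has to be squeezed onto a 1024x1024 native panel,
# so several source pixels end up sharing (or being interpolated into) one panel pixel.
src_w, src_h = 1920, 1080       # incoming 1080i frame
panel_w, panel_h = 1024, 1024   # native addressable pixels of the plasma

scale_x = panel_w / src_w       # ~0.53: roughly two source columns per panel column
scale_y = panel_h / src_h       # ~0.95: most rows survive, the rest are interpolated

print(f"horizontal scale {scale_x:.2f}, vertical scale {scale_y:.2f}")
print(f"source pixels per frame: {src_w * src_h:,}, panel pixels: {panel_w * panel_h:,}")
# about 2.07 million source pixels mapped onto about 1.05 million panel pixels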

Interlacing was a technique used because of the limits of our technology at the time; it's wholly unnecessary now.

1080i HDTV is interlaced, and it looks great (it's the highest broadcast HD resolution). Interlacing simply describes the way the frames are drawn; it doesn't imply the quality of the output or the state of the technology.

There is no monitor that can be both interlaced and progressive

My HDTV display supports both interlaced and progressive output.

 

pulsedrive

Senior member
Apr 19, 2005
688
0
0
Yeah, going to have to go with rbV5 here: you CAN have progressive and interlaced on the same display. In fact, the VAST majority of HDTVs are both.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: pulsedrive
Yeah, going to have to go with rbV5 here: you CAN have progressive and interlaced on the same display. In fact, the VAST majority of HDTVs are both.

There are a number of display technologies, but if it's based on a CRT (like my RPTV) it can support interlaced and progressive (480p and 1080i for mine). LCDs and DLP-based displays are progressive and generally support 720p modes, and now some awesome 1080p displays are coming out.
 

klah

Diamond Member
Aug 13, 2002
7,070
1
0
Originally posted by: Josh7289
OK, I don't understand HDTVs... Whenever I look at the specs for them, they say weird resolutions like 1024 x 1024 (a square??), 1366 x 768 (what is this??), 1024 x 768 (isn't that 4:3??), and all kinds of other things. I thought they were either 1280 x 720 or 1920 x 1080... So what's going on here?

Read the first two articles starting here:
("What is ALiS?" and "What is the difference between square and non-square pixels?")
http://www.avsforumfaq.com/~plasma/#alis

720p is downconverted to 512 lines on those 1024x1024 plasma displays.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: Continuity28
All computer monitors are progressive, and eventually all TVs will be too. Interlacing was a technique used because of the limits of our technology at the time; it's wholly unnecessary now.

What monitors have you seen with such odd resolutions? I don't think there's any square one out there. :p If you mean resolutions like 1920x1200, those are not that odd; they are 16:10, and only a slight adjustment is needed for 16:9 widescreen.

There is no monitor that can be both interlaced and progressive; it can only be progressive, and the interlaced video sent into it gets de-interlaced. Or it can only be interlaced, and will not support progressive video at all.

You'll have to show me such monitors so I can help you better understand the meanings behind them. :)

There are some old monitors that can be interlaced or progressive; I don't think they were ever really consumer-level stuff, though. It's also possible to simulate interlaced video in software, though the monitor still thinks it's progressive.

BTW, I think most HDTVs deinterlace interlaced video before displaying it; otherwise so many of them wouldn't have trouble properly displaying 60fps interlaced video.

Oh, and 1920x1080 can be supported on lower-res sets because the TV doesn't have to display the whole resolution at once (1080i only delivers half the lines at a time).
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Yeah, some don't have the proper resolution, so they have to squish or stretch the picture. Those aren't the best for image quality.

At the time TV was invented, the limited bandwidth meant they were only able to send 30fps, and not only did it flicker, the motion wasn't the best. For this reason, they interlaced it. Since today all media going to the TV, including DVDs, is interlaced, it would take extra circuitry to de-interlace the media, making the TV more expensive.
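To put rough numbers on why interlacing helped back then, here's a quick sketch in plain Python (standard-definition figures, purely illustrative):

# 60 half-height fields per second cost the same number of scan lines as
# 30 full frames per second, but the screen refreshes 60 times a second.
lines_per_frame = 480       # visible lines in a standard-def picture
progressive_fps = 30        # what the bandwidth could carry as full frames
field_rate = 60             # fields per second with interlacing

lines_progressive = lines_per_frame * progressive_fps    # 14,400 lines/s
lines_interlaced = (lines_per_frame // 2) * field_rate   # 14,400 lines/s, same cost

print(f"lines per second, 30 full frames: {lines_progressive:,}")
print(f"lines per second, 60 interlaced fields: {lines_interlaced:,}")
# same line budget, but the interlaced picture updates 60 times a second, so less flicker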

1080i HDTV is interlaced, and it looks great (it's the highest broadcast HD resolution). Interlacing simply describes the way the frames are drawn; it doesn't imply the quality of the output or the state of the technology.

Progressive looks better than interlaced, at least comparing 480i vs 480p. It's a cleaner picture.

1080i does look amazing, though, and you may not even notice the interlacing.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Actually, check your specs very carefully. Most monitors that say they support 480p and 1080i also have what's called a de-interlacer, and various associated filters. These take the 1080i signal and deinterlace it so it can run on your screen. All you know is that your source is being displayed, but what you don't know is that it's no longer interlaced.

I do check my specs. The deinterlacer (line doubler) on my display is for 480i signals using the composite/S-Video and the low-bandwidth YPbPr connectors. My display has a number of inputs with different capabilities. The high-bandwidth YPbPr and HD-15 connectors display the input directly.

It's only the highest broadcast resolution because 1080p takes up too much bandwidth

Depending on the compression used. My WMV HD 1080p files require far less bandwidth than my 1080i broadcast streams because they are highly compressed.

Interlacing WAS developed because of hardware limitations back in the 1940s... It stayed because, like all things with technology, backwards compatibility is important. People won't update their TV to view the latest broadcast (in the 1940s-1950s, and that's when they made the first of the various NTSC formats).

What backwards compatibility were they supporting when they decided to broadcast 1080i?

Progressive scan will always be better than interlaced

That's not necessarily true either. The quality is totally dependent on the source video, the bitrate and compression used, how it is displayed, and the person viewing it. Whether it's interlaced or not is usually transparent to the viewer.
 

TGS

Golden Member
May 3, 2005
1,849
0
0
Originally posted by: BouZouki
What is better, 1080i or 720p?

I'm wondering which to use.

For fast-moving pictures I believe people tend to lean towards 720p screens, while for low-action content a 1080i picture may look better.

You still have to take into account how fast the source picture is "moving" and whether it will cause visible tearing from the interlaced lines that 1080i has to jump back and forth between.

Originally posted by: rbV5


It's only the highest broadcast resolution because 1080p takes up too much bandwidth

Depending on the compression used. My WMV HD 1080p files require far less bandwidth than my 1080i broadcast streams because they are highly compressed.

Interlacing WAS developed because of hardware limitations back in the 1940s... It stayed because, like all things with technology, backwards compatibility is important. People won't update their TV to view the latest broadcast (in the 1940s-1950s, and that's when they made the first of the various NTSC formats).

What backwards compatibility were they supporting when they decided to broadcast 1080i?

They weren't; they only have to push across a 1920x540-sized picture per field at the 1080i spec, or 1280x720 pixels per frame at the 720p spec.

720p gives (1280x720) 921,600 pixels per 1/60 s (one full frame)
1080i gives (1920x540) 1,036,800 pixels per 1/60 s (one field)
1080p gives (1920x1080) 2,073,600 pixels per 1/60 s (one full frame)

By using 1080i, certain feeds will appear to be of much higher quality, and it doesn't cost the roughly double bandwidth that a 1080p source would need over a 1080i source.
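Here's that math spelled out in plain Python (60 Hz refresh assumed, same numbers as the list above):

# Pixel throughput per refresh (1/60 s) and per second for the three formats above
formats = {
    "720p":  (1280, 720,  "full frame every 1/60 s"),
    "1080i": (1920, 540,  "one field (half the lines) every 1/60 s"),
    "1080p": (1920, 1080, "full frame every 1/60 s"),
}

for name, (w, h, note) in formats.items():
    per_refresh = w * h
    per_second = per_refresh * 60
    print(f"{name}: {per_refresh:>9,} pixels per refresh, {per_second:>11,} per second ({note})")
# 1080i moves slightly more pixels per refresh than 720p, but only half as many as 1080p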
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Progressive looks better than interlaced, at least comparing 480i vs 480p. It's a cleaner picture.

Interlaced TV looks better on an interlaced display if the source is interlaced. The result is smoother pans and smoother motion. 480p generally looks better because it has twice the resolution of 480i... that's simple math; however, video display is far more complex than that. You can't make a blanket "always" claim or describe the technology in a few simple paragraphs.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Depending on the compression used. My WMV HD 1080p files require far less bandwidth than my 1080i broadcast streams because they are highly compressed.

Broadcast. I'm not talking about the stuff you encode, I'm talking about what comes over cable, etc.

Broadcast video uses compression also; I merely used WMV as an example.

What backwards compatibility were they supporting when they decided to broadcast 1080i?

You misread my post. I said they were after backwards compatibility in the 1940s-1950s, when people would never have bought new TVs if they forced any drastic changes; it's different now.

That's my point: why is there even interlaced HD at all if it's so inferior? There wasn't legacy equipment to support.

At the same resolution, progressive will always be better,

That's still not true; there are other factors to consider. It's not all about resolution. The bitrate and compression have a lot to say about the PQ as well as resolution.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
1080p looks better than 1080i, you can't really argue this... Take a 1080p monitor with 1080p native content, do the same with a 1080i native monitor, put them side by side, and you will notice...

Sure I can. Like I already said, resolution is only part of the equation; bitrate and compression have a lot to say about the PQ. If the 1080i file has a higher bitrate and less compression, it will probably look better.

Again, we're doing the best with what we have. All I was saying was, most display devices can't do both natively: either interlaced material is being filtered and deinterlaced, or progressive material isn't played... (or is telecined).

You mean most new technology. CRT technology is far superior for resolution handling. New display devices are typically fixed-resolution panels that are cheaper to build and have a smaller footprint. They are also easier for the end user, but that doesn't mean the technology is superior, just cheaper to produce for the masses.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
When you display interlaced content on your PC monitor, it's always being deinterlaced. That can produce artifacts. Like I said, the only time interlaced material will look good is on an interlaced monitor (most old TVs).

No it's not, unless your software is deinterlacing it. If your software isn't deinterlacing the video and it is interlaced material, the monitor will display each field as a frame.
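For what it's worth, here's a minimal sketch in plain Python of what "each field shown as its own frame" means (just the mechanics of splitting a frame into fields and line-doubling them; not any particular player's code):

# Split an interlaced frame into its two fields and line-double each one,
# so every field can be shown as its own full-height frame.
def split_fields(frame):
    """frame is a list of rows (top to bottom); returns (top_field, bottom_field)."""
    top_field = frame[0::2]      # rows 0, 2, 4, ...
    bottom_field = frame[1::2]   # rows 1, 3, 5, ...
    return top_field, bottom_field

def line_double(field):
    """Repeat each field row so the field fills the full frame height."""
    return [row for r in field for row in (r, r)]

# Tiny 4-row "frame" just to show the mechanics
frame = ["row0", "row1", "row2", "row3"]
top, bottom = split_fields(frame)
print(line_double(top))     # ['row0', 'row0', 'row2', 'row2']
print(line_double(bottom))  # ['row1', 'row1', 'row3', 'row3']

A real player would filter rather than blindly repeat lines, but the field-to-frame relationship is the same.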
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
CRTs have the same problem in the end. The electron beam can't do both progressive and interlaced material natively.

My 22" CRT certainly can (its something you can test for yourself), interlaced resolutions aren't pretty, but they are supported with my Radeon card...how do explain that?
 

Continuity28

Golden Member
Jul 2, 2005
1,653
0
76
Anyways, I'm not helping this thread; I'm probably just adding to the confusion.

Can a mod delete my posts for me?