should i purchase an 720p, 1080i HDTV now or wait for 1080p sets?


rbV5

Lifer
Dec 10, 2000
12,632
0
0
Progressive MPEG-2 is the same idea as progressive JPEGs, right? Why is it more intensive than deinterlacing? Why don't they do away with interlacing and progressive altogether and just give us some full-quality video? If they can do 1080i/60 over 6 MHz (NTSC), then 30 full-quality frames per second shouldn't be an issue.

No, a progressive JPEG begins by showing a lower-quality image and then "updates" to its full resolution. Progressive MPEG-2 simply means that each complete frame is drawn line by line, followed by the next full frame. Interlaced, on the other hand, draws every other line to make a "field": odd lines first, followed by even lines (or the other way around).

It doesn't matter that they are only using 30 fps rather than 60 fps for 1080p broadcasting, because it will still require twice the bandwidth to draw each individual frame, and the only way to do that over 6 MHz is to increase the compression, e.g. by using H.264 as they are planning with Euro Sat.

The problem is there are too many standards and many are too lenient.

Actually it's not bad that there are a number of formats; otherwise you would be stuck with a single resolution/refresh rate for every possible use. Displays need to be flexible and offer more resolution support...not less.
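
To make the field/frame distinction above concrete, here is a minimal Python sketch (dummy scanlines, not real video data): an interlaced source sends alternating half-resolution fields that a weave deinterlacer puts back together, while a progressive source sends each full frame whole.

```python
# Minimal sketch: progressive frames vs. interlaced fields (dummy scanlines, not real video).

def split_into_fields(frame):
    """Interlacing: transmit only every other line per pass -- one odd field, then one even field."""
    odd_field = frame[0::2]   # lines 0, 2, 4, ...
    even_field = frame[1::2]  # lines 1, 3, 5, ...
    return odd_field, even_field

def weave_fields(odd_field, even_field):
    """Weave deinterlacing: interleave the two fields back into one full progressive frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

frame = [f"line {n}" for n in range(1080)]      # one full 1080-line progressive frame
odd, even = split_into_fields(frame)

print(len(odd), len(even))                      # 540 540 -> each field is half the lines
print(weave_fields(odd, even) == frame)         # True   -> two fields rebuild the full frame
```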
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: rbV5
Progressive MPEG-2 is the same idea as progressive JPEGs, right? Why is it more intensive than deinterlacing? Why don't they do away with interlacing and progressive altogether and just give us some full-quality video? If they can do 1080i/60 over 6 MHz (NTSC), then 30 full-quality frames per second shouldn't be an issue.

No, a progressive JPEG begins by showing a lower-quality image and then "updates" to its full resolution. Progressive MPEG-2 simply means that each complete frame is drawn line by line, followed by the next full frame. Interlaced, on the other hand, draws every other line to make a "field": odd lines first, followed by even lines (or the other way around).

It doesn't matter that they are only using 30 fps rather than 60 fps for 1080p broadcasting, because it will still require twice the bandwidth to draw each individual frame, and the only way to do that over 6 MHz is to increase the compression, e.g. by using H.264 as they are planning with Euro Sat.

The problem is there are too many standards and many are too lenient.

Actually it's not bad that there are a number of formats; otherwise you would be stuck with a single resolution/refresh rate for every possible use. Displays need to be flexible and offer more resolution support...not less.

I disagree, unless you plan on using your display as a monitor. There is no need to have several different formats. Stick to 16:9 1920x1080 and leave it there until there is a need to upgrade it. All in all, I guess I have no problem with the *display* being flexible; my problem is with the standard being flexible. I do not think the standard needs to be flexible. It needs to be set in stone for everyone to follow.

At the very least, have three supported resolutions: one for 4:3, one for 16:9, and then a third for whatever other aspect ratio is conceived.

I am certainly no TV guru like some of you here, but I do not see why HDTV needs to be flexible and full of many different formats.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: TheSnowman
Originally posted by: rbV5
You'll still have deinterlacing artifacts with a lot of material you'll be watching, and most of the HDTV broadcasts are 1080i, none will be 1080p, and the rest are 720p.
Heh, yeah, I was talking about things other than broadcast TV, which is why I said "with PS3 and HD disk formats." ;)

And xkight, progressive scan on TVs and progressive compression on JPEGs and such are two very different things. Search Google or such for "interlaced vs progressive" and I am sure you will get plenty of info.

No, I totally agree with you; it's just that "1080p" listed on display specs is going to mean different things, and in a lot of cases a 720p display will do exactly what most people might think they "need" a 1080p display for, and they'll end up paying a premium for some slick marketing...especially initially. More than ever, you're really going to need to know what you want, and know exactly what you are buying.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: rbV5
Actually it's not bad that there are a number of formats; otherwise you would be stuck with a single resolution/refresh rate for every possible use. Displays need to be flexible and offer more resolution support...not less.

Actually I'm all for the different resolutions. I was talking mainly about video connections. Resolution flexibility allows more HDTV subchannels (and so does reducing the bitrate). Plus it allows broadcasters to better scale their migration to HDTV transmission.
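
A back-of-the-envelope illustration of the subchannel point in Python: the ~19.39 Mbps ATSC payload figure is the standard one for a 6 MHz channel, but the per-stream bitrates below are made-up examples.

```python
# Back-of-the-envelope subchannel math. An ATSC 6 MHz channel carries roughly
# 19.39 Mbps of MPEG-2 payload; the per-stream bitrates below are illustrative only.

ATSC_PAYLOAD_MBPS = 19.39

def streams_that_fit(per_stream_mbps):
    """How many streams of a given bitrate fit in one ATSC channel."""
    return int(ATSC_PAYLOAD_MBPS // per_stream_mbps)

print(streams_that_fit(13.0))   # 1 -> roughly one full-bitrate HD feed
print(streams_that_fit(4.0))    # 4 -> or several lower-bitrate subchannels instead
```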
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: rbV5
No, a progressive JPEG begins by showing a lower-quality image and then "updates" to its full resolution. Progressive MPEG-2 simply means that each complete frame is drawn line by line, followed by the next full frame. Interlaced, on the other hand, draws every other line to make a "field": odd lines first, followed by even lines (or the other way around).

I guess that would imply then that progressive always has better quality?
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: xtknight
Originally posted by: rbV5
No, a progressive JPEG begins by showing a lower-quality image and then "updates" to its full resolution. Progressive MPEG-2 simply means that each complete frame is drawn line by line, followed by the next full frame. Interlaced, on the other hand, draws every other line to make a "field": odd lines first, followed by even lines (or the other way around).

I guess that would imply then that progressive always has better quality?

Not in itself, it doesn't. For instance, 1080i/60 and 1080p/30 both display the same pixel counts, both run at 30 frames per second, and both refresh at 60 Hz. The 1080i just takes half the bandwidth to do it. Depending on the source material, either could be better or worse, or look virtually the same.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
I disagree, unless you plan on using your display as a monitor. There is no need to have several different formats.

Well, that's if you want to stick with the old approach of a different display for each different use. Myself, I want a display that is flexible enough to run the content I want to watch, and media that is flexible enough that it won't be pigeonholed into a single format.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: rbV5

Not in itself, it doesn't. For instance, 1080i/60 and 1080p/30 both display the same pixel counts, both run at 30 frames per second, and both refresh at 60 Hz. The 1080i just takes half the bandwidth to do it.
I think you just confused yourself here, but 1080p/30 and 1080i/30 both use exactly the same amount of bandwidth.

 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: TheSnowman
Originally posted by: rbV5

Not in itself, it doesn't. For instance, 1080i/60 and 1080p/30 both display the same pixel counts, both run at 30 frames per second, and both refresh at 60 Hz. The 1080i just takes half the bandwidth to do it.
I think you just confused yourself here, but 1080p/30 and 1080i/30 both use exactly the same amount of bandwidth.

I was about to say...how ignorant have I gotten? :eek:;)
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: TheSnowman
Originally posted by: rbV5

Not in itself, it doesn't. For instance, 1080i/60 and 1080p/30 both display the same pixel counts, both run at 30 frames per second, and both refresh at 60 Hz. The 1080i just takes half the bandwidth to do it.
I think you just confused yourself here, but 1080p/30 and 1080i/30 both use exactly the same amount of bandwidth.

There is no 1080i/30; it's 1080i/60.

With 1080i/60, only a single field of 1920x540 (odd or even lines) is ever broadcast at a time; it just does it 60 times every second, with two fields making each frame. With 1080p/30, each frame must be broadcast at 1920x1080, and it does that 30 times a second.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Yeah I meant to type 1080i/60. 1080p/30 and 1080i/60 both use exactly the same amount of bandwidth.

1920 x 540 x 60 = 62,208,000 = 1920 x 1080 x 30
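
As a quick sanity check of that arithmetic, a few lines of Python (raw uncompressed pixel counts only; real broadcasts are MPEG-2 compressed, so actual bitrates differ):

```python
# Sanity check of the pixel-rate arithmetic above (raw uncompressed pixel counts).

pixels_1080i60 = 1920 * 540 * 60    # one 540-line field sent at a time, 60 fields per second
pixels_1080p30 = 1920 * 1080 * 30   # one full 1080-line frame sent at a time, 30 frames per second

print(pixels_1080i60)                       # 62208000
print(pixels_1080p30)                       # 62208000
print(pixels_1080i60 == pixels_1080p30)     # True: same pixels per second either way
```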
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: TheSnowman
Yeah I meant to type 1080i/60. 1080p/30 and 1080i/60 both use exactly the same amount of bandwidth.

1920 x 540 x 60 = 62,208,000 = 1920 x 1080 x 30

True, but to broadcast 1080i, you only need to transmit 1920x540 of the data at any one time; with 1080p "anything," all 1920x1080 has to be transmitted each time.
 

Velk

Senior member
Jul 29, 2004
734
0
0
Originally posted by: rbV5
Originally posted by: TheSnowman
Yeah I meant to type 1080i/60. 1080p/30 and 1080i/60 both use exactly the same amount of bandwidth.

1920 x 540 x 60 = 62,208,000 = 1920 x 1080 x 30

True, but to broadcast 1080i, you only need to transmit 1920x540 of the data at any one time; with 1080p "anything," all 1920x1080 has to be transmitted each time.

As far as I know, if you only receive one field of a 1080i frame and then display it anyway, it's probably going to look much worse than if you miss one 1080p frame, so I would suspect the same thing happens in both cases: they skip that frame and display the previous one until they get another full frame.

Does anyone know for sure what happens when the two formats are bandwidth-starved?
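
Nobody here seems to know for certain, but a toy Python sketch of the "skip and hold the previous frame" behavior guessed at above might look like this (purely illustrative; the function and data are hypothetical, not how any particular decoder is implemented):

```python
# Toy sketch of the "skip and hold the previous frame" fallback guessed at above.
# Purely hypothetical illustration -- not how any particular MPEG-2/ATSC decoder behaves.

def play(decoded_frames):
    """decoded_frames: frames in display order, with None marking one lost to starvation."""
    last_good = None
    shown = []
    for frame in decoded_frames:
        if frame is None:
            if last_good is not None:
                shown.append(last_good)   # hold the previous complete frame
        else:
            last_good = frame
            shown.append(frame)
    return shown

print(play(["f1", "f2", None, "f4"]))     # ['f1', 'f2', 'f2', 'f4']
```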
 

JBDan

Platinum Member
Dec 7, 2004
2,333
0
0
Originally posted by: rbV5
Originally posted by: JBDan
Originally posted by: rbV5
Games, at their heart, are rendered and vector-based. Once you get to the point where lines are drawn without stair-stepping, added resolution is of marginal benefit. Yes, it's possible that game developers could incorporate high-quality 1080p textures. Will they? It's unlikely.

Of course that's a load of bull, LOL.

He is right about 1080p broadcasts...they are not even on the horizon. 1080 HDTV is 1080p/24, 1080p/30, or 1080i/60, so 1080p/30 and 1080i/60 basically have the same pixels per second, so anybody who says they can see the difference between 1080i and 1080p is probably imagining it. 1080p/24 for film on HD DVD, on the other hand, may very well be the best of all, but it will for sure not be broadcast, and it apparently looks damn good at 720p as well (720p formats are 720p/24, 720p/30, and 720p/60). 1080p is probably more than necessary unless you want to game at 1080p/60. You definitely want HDMI in any event.

Imagining it? Maybe some, but not others such as me. It is as plain as day to me on a high-end DLP when a fast-moving scene or sports game is broadcast. You just have to know what to look for, as some people have a better eye for it. I'll agree with you, though, that "most" people will not notice the difference.

How do you even watch 1080i on a DLP? Isn't it downconverted to 720p? It's not so easy to compare them directly, since DLPs can't really run 1080i, can they? It's either downconverted to 720p or upconverted to 1080p depending on the display...is it not?

I know I've been watching OTA broadcast sports for the past few years. The Olympics were broadcast in 1080i, and so are CBS and NBC sports, and they look great at 1080i on my set with plenty of action. I have a somewhat critical eye when it comes to video PQ.

Could it be that progressive material just looks better on a progressive display?

Your last Q is a good one; IDK, it could be true. As for the DLP lines, everything is either converted up/down to 720p or 1080p, as you stated. It's really the converter chip (Faroudja, etc.) that makes or breaks it. When I watch a 1080i broadcast (albeit DLP/CRT/LCD) of a football game, for instance on CBS, even on my old DLP, fast panning of the camera leaves such a smudged, unfocused, pixelated appearance, and in a split second, while the image becomes "still," it clears up. It is more noticeable on an HD CRT, IMO. Images are sharper in 1080i on a 1080i-native monitor, no doubt. The more you watch a native 1080i broadcast on a 1080i-native monitor, the less you tend to notice this "out of focus, pixelated" image during fast pan shots. I don't gripe about this, as HD content is very pleasing to me. Just my experience with it. I have been out of the HD loop for about a year and look forward to jumping back on the bandwagon. I am also a little obsessed when it comes to PQ/IQ ;) FOX & ABC have opted for 720p, which IMO is a much smoother overall viewing experience. Thanks for all of y'all's insight here at Anand.
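
For the 1080i-on-a-fixed-pixel-display case described above, the scaler chip's job boils down to two steps: weave the two fields into a full frame, then resample to the panel's native line count. A rough Python sketch, line counts only, with nearest-neighbor scaling as a stand-in for what chips like the Faroudja parts actually do:

```python
# Rough sketch of what a fixed-pixel display's scaler does with a 1080i source:
# weave the two fields into a full frame, then resample to the panel's native line count.
# Real chips (Faroudja etc.) do far more -- motion-adaptive deinterlacing, filtering --
# so nearest-neighbor scaling here is just a stand-in for the order of operations.

def weave(odd_field, even_field):
    woven = []
    for odd_line, even_line in zip(odd_field, even_field):
        woven.extend([odd_line, even_line])
    return woven

def scale_vertically(lines, native_count):
    """Nearest-neighbor resample of a list of scanlines to the panel's line count."""
    src = len(lines)
    return [lines[i * src // native_count] for i in range(native_count)]

odd = [f"odd {n}" for n in range(540)]
even = [f"even {n}" for n in range(540)]

frame_1080 = weave(odd, even)                        # 1080 lines from two 540-line fields
print(len(scale_vertically(frame_1080, 720)))        # 720  -> "downconverted" on a 720p DLP
print(len(scale_vertically(frame_1080, 1080)))       # 1080 -> native on a 1080p panel
```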
 

Sonikku

Lifer
Jun 23, 2005
15,886
4,886
136
Originally posted by: Solodays

Most people cannot tell the difference between 1080i and 1080p

Wtf? You must be referring to a 90-year-old grandma then. As I was informed, 720p > 1080i.

Ah, yes. There is but one problem: where in my post did I compare either 1080i or 1080p to 720p? When the resolution is as high as 1920x1080, the only time you'll notice a big difference is when watching material with a great deal of motion.
 

Solodays

Senior member
Jun 26, 2003
853
0
0
So how can you tell if the TV you're buying has HDCP, since most don't list it in their specs? So all you need is a TV with HDMI input?