Why are there no 1080i DVDs?


zinfamous

No Lifer
Jul 12, 2006
111,858
31,346
146
Originally posted by: spidey07
venkman, your argument is old and untrue.

The year is 2009, 1080p is the only way. 720p is dead.

Plasma is still greater than all. Anybody that says otherwise is just trying to justify their crap TV purchase.

well, if you're going by current events, then you forgot to mention that plasma is pretty much dead.

sad but true. they've been relegated to the ignore corner in most big box stores. freaking sucks.
 

zinfamous

No Lifer
Jul 12, 2006
111,858
31,346
146
Originally posted by: Joemonkey
Originally posted by: spidey07
venkman, your argument is old and untrue.

The year is 2009, 1080p is the only way. 720p is dead.

Plasma is still greater than all. Anybody that says otherwise is just trying to justify their crap TV purchase.

and anyone trying to say plasma is greater than all is trying to justify spending too much on a TV

I have a 42" LCD 1080p tv and it looks JUST AS GOOD as a 42" Plasma 1080p TV I saw right next to it in the store yet was something like $300 less a year ago

then again, I may have something to gain from being a bit colorblind, but if that saved me $300 then I'm ok with it

not to mention you save money with an LCD compared to a plasma based on electricity and heat costs

:confused:

...you do know that plasma is much, much cheaper than LCD, right? especially as you go larger.

so ah...why are you trying to justify spending more on a crappier TV? :p
 

zinfamous

No Lifer
Jul 12, 2006
111,858
31,346
146
Originally posted by: theprodigalrebel
Originally posted by: MrMatt
Ok, so do any cable companies actually transmit in 1080p??

Satellite (Dish and DirecTV) offers 1080p Video On Demand. Not sure if cable offers it as well.

nope. it's utter shite.

last I checked, Fox was actually broadcasting in 720p (OTA, at least)


either way, they're all compressed. I can't imagine satellite is much better.
 

CZroe

Lifer
Jun 24, 2001
24,195
857
126
Originally posted by: 0roo0roo
there is no point.
even if, for instance, it was a native 1080i video source, they'd be better off doing the deinterlacing professionally than leaving it to the consumer device to make a mess of it.

You OBVIOUSLY have no idea what reverse pull-down is or how it applies. Very few video sources would use bob & weave or interpolation because very few are natively 60FPS. 24FPS content that has had 3x2 pull-down applied to match 60FPS can be perfectly reconstructed to a progressive frame by any dumb deinterlacer. I've heard about a lot of crap TVs that literally throw out half the lines and line-double the rest, but even those can be fixed with a good AV receiver.
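To make that concrete, here's a rough, hypothetical Python sketch (all names made up, frames modeled as simple lists of rows) of the difference between dumb line-doubling and just weaving two matched fields back together:

Code:
# Hypothetical sketch: a frame is a list of rows; a field holds only
# the odd or the even rows of a frame.

def line_double(field):
    # What the crap TVs do: keep half the lines and repeat each one.
    frame = []
    for row in field:
        frame.append(row)
        frame.append(row)      # duplicated line; half the real detail is gone
    return frame

def weave(top_field, bottom_field):
    # What even a dumb deinterlacer can do when both fields came from the
    # same 24FPS film frame (the 3x2 pull-down case): interleave them and
    # the original progressive frame comes back, pixel for pixel.
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)
        frame.append(bottom_row)
    return frame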

Originally posted by: DLeRium
...only on TV though. 720p is 60fps for ATSC broadcasts. Different story when we talk about movies which the OP is wondering about.

Actually, very often the 720p ATSC program is mastered on theatrical 24FPS 35mm film. There are MANY examples, even long before HD broadcast. From the original Star Trek in the '60s to Stargate SG-1 in the '90s, 35mm has long been common in television and still is today.
 

zinfamous

No Lifer
Jul 12, 2006
111,858
31,346
146
Originally posted by: CZroe
Originally posted by: 0roo0roo
there is no point.
even if, for instance, it was a native 1080i video source, they'd be better off doing the deinterlacing professionally than leaving it to the consumer device to make a mess of it.

You OBVIOUSLY have no idea what reverse pull-down is or how it applies. Very few video sources would use bob & weave or interpolation because very few are natively 60FPS. 24FPS content that has had 3x2 pull-down applied to match 60FPS can be perfectly reconstructed to a progressive frame by any dumb deinterlacer. I've heard about a lot of crap TVs that literally throw out half the lines and line-double the rest, but even those can be fixed with a good AV receiver.

Originally posted by: DLeRium
...only on TV though. 720p is 60fps for ATSC broadcasts. Different story when we talk about movies which the OP is wondering about.

Actually, very often the 720p ATSC program is mastered on theatrical 24FPS 35mm film. There are MANY examples, even long before HD broadcast. From the original Star Trek in the '60s to Stargate SG-1 in the '90s, 35mm has long been common in television and still is today.

But isn't most broadcast material from the 80s on cheap ol' analog magnetic tape?

 

CZroe

Lifer
Jun 24, 2001
24,195
857
126
Originally posted by: MrMatt
Ok, so do any cable companies actually transmit in 1080p??

Oh PLEASE tell me you're joking, will ya?

Originally posted by: theprodigalrebel
Originally posted by: MrMatt
Ok, so do any cable companies actually transmit in 1080p??

Satellite (Dish and DirecTV) offers 1080p Video On Demand. Not sure if cable offers it as well.

And it's total BS. They do what your freakin' TV is supposed to do, lower the bandwidth, and pretend that they are giving you MORE quality and resolution.

Originally posted by: zinfamous
Originally posted by: CZroe
Originally posted by: 0roo0roo
there is no point.
even if, for instance, it was a native 1080i video source, they'd be better off doing the deinterlacing professionally than leaving it to the consumer device to make a mess of it.

You OBVIOUSLY have no idea what reverse pull-down is or how it applies. Very few video sources would use bob & weave or interpolation because very few are natively 60FPS. 24FPS content that has had 3x2 pull-down applied to match 60FPS can be perfectly reconstructed to a progressive frame by any dumb deinterlacer. I've heard about a lot of crap TVs that literally throw out half the lines and line-double the rest, but even those can be fixed with a good AV receiver.

Originally posted by: DLeRium
...only on TV though. 720p is 60fps for ATSC broadcasts. Different story when we talk about movies which the OP is wondering about.

Actually, very often the 720p ATSC program is mastered on theatrical 24FPS 35mm film. There are MANY examples, even long before HD broadcast. From the original Star Trek in the '60s to Stargate SG-1 in the '90s, 35mm has long been common in television and still is today.

But isn't most broadcast material from the 80s on cheap ol' analog magnetic tape?

A lot is, but that has nothing to do with today. For instance, sticking with my Sci-Fi examples, Star Trek: The Next Generation was filmed on 35mm and scanned straight to video before any editing or special effects were applied, so the master copies are on SD video. Knowing the potential for home video quality to exceed broadcast quality, television shows in production today that film on 35mm would NEVER do that; thus, the master copies remain on 35mm film.

Also, the vast majority of even old "filmed" television programs (as opposed to taped ones) are not full of special effects, and the masters are either on 35mm or can easily be recreated from the 35mm stock footage (if available).
 

zinfamous

No Lifer
Jul 12, 2006
111,858
31,346
146
Originally posted by: CZroe
Originally posted by: MrMatt
Ok, so do any cable companies actually transmit in 1080p??

Oh PLEASE tell me you're joking, will ya?

Originally posted by: theprodigalrebel
Originally posted by: MrMatt
Ok, so do any cable companies actually transmit in 1080p??

Satellite (Dish and DirecTV) offers 1080p Video On Demand. Not sure if cable offers it as well.

And it's total BS. They do what your freakin' TV is supposed to do, lower the bandwidth, and pretend that they are giving you MORE quality and resolution.

Originally posted by: zinfamous
Originally posted by: CZroe
Originally posted by: 0roo0roo
there is no point.
even if, for instance, it was a native 1080i video source, they'd be better off doing the deinterlacing professionally than leaving it to the consumer device to make a mess of it.

You OBVIOUSLY have no idea what reverse pull-down is or how it applies. Very few video sources would use bob & weave or interpolation because very few are natively 60FPS. 24FPS content that has had 3x2 pull-down applied to match 60FPS can be perfectly reconstructed to a progressive frame by any dumb deinterlacer. I've heard about a lot of crap TVs that literally throw out half the lines and line-double the rest, but even those can be fixed with a good AV receiver.

Originally posted by: DLeRium
...only on TV though. 720p is 60fps for ATSC broadcasts. Different story when we talk about movies which the OP is wondering about.

Actually, very often the 720p ATSC program is mastered on theatrical 24FPS 35mm film. There are MANY examples, even long before HD broadcast. From the original Star Trek in the '60s to Stargate SG-1 in the '90s, 35mm has long been common in television and still is today.

But isn't most broadcast material from the 80s on cheap ol' analog magnetic tape?

A lot is, but that has nothing to do with today. For instance, sticking with my Sci-Fi examples, Star Trek: The Next Generation was filmed on 35mm and scanned straight to video before any editing or special effects were applied, so the master copies are on SD video. Knowing the potential for home video quality to exceed broadcast quality, television shows in production today that film on 35mm would NEVER do that; thus, the master copies remain on 35mm film.

Also, the vast majority of even old "filmed" television programs (as opposed to taped ones) are not full of special effects, and the masters are either on 35mm or can easily be recreated from the 35mm stock footage (if available).

yeah, that much I knew. Especially that the 80s were an odd dark age for retrogressive technology. very sad. not surprising that Reagan was around.... ;)
 

0roo0roo

No Lifer
Sep 21, 2002
64,795
84
91
yea i remember when indie film makers were touting DV. i was pissed. it looked like sh*t on the big screen back then, and it'll look like sh*t forever more. sometimes progress is not progress at all heh.

and yea, Star Trek: The Original Series is trickling out redone in HD. looks excellent. Seinfeld as well :)
The Next Generation, edited on tape as said above... looks like garbage on DVD :(
 

91TTZ

Lifer
Jan 31, 2005
14,374
1
0
Originally posted by: MIKEMIKE
1080p>720p > 1080i>480p>720i>

i was upset to learn that the Steelers game was only 1080i, it could have looked so much better.

This isn't true.

We lived with 525i for years and years. Just because it's interlaced doesn't mean it's bad. 1080i looks noticeably better than 720p on a TV that can properly display interlaced video.
 

91TTZ

Lifer
Jan 31, 2005
14,374
1
0
Originally posted by: Shawn
Originally posted by: MIKEMIKE
1080p>720p > 1080i>480p>720i>

i was upset to learn that the Steelers game was only 1080i, it could have looked so much better.

WRONG


1080p > 1080i > 720p > 480p

and there is no 720i, but if there was it would be greater than 480p.

Correct.
 

91TTZ

Lifer
Jan 31, 2005
14,374
1
0
Originally posted by: venkman


720p is indistinguishable from the 1080 formats unless you have a massive screen, and even then, blind comparisons on AVS between 1080 and 720 projectors filling a 120"+ screen haven't shown much of a visible difference to trained eyes.

This is incorrect. I have a 30" CRT HDTV (my room is small) and it's easy to tell the difference between stations that broadcast in 720p vs. 1080i. I think a lot of people have 720p LCD TVs and they (obviously) can't tell the difference, since the TV can't display it.
 

Koing

Elite Member / Super Moderator / Health and F
Oct 11, 2000
16,843
2
0
Originally posted by: 91TTZ
Originally posted by: venkman


720p is indistinguishable from the 1080 formats unless you have a massive screen, and even then, blind comparisons on AVS between 1080 and 720 projectors filling a 120"+ screen haven't shown much of a visible difference to trained eyes.

This is incorrect. I have a 30" CRT HDTV (my room is small) and it's easy to tell the difference between stations that broadcast in 720p vs. 1080i. I think a lot of people have 720p LCD TVs and they (obviously) can't tell the difference, since the TV can't display it.

But venkman said they were using projectors that could display 1080 and 720. I also think projectors display resolution differently, in that the image is projected rather than shown directly on pixels, so it looks different.

Koing
 

91TTZ

Lifer
Jan 31, 2005
14,374
1
0
Originally posted by: Koing
Originally posted by: 91TTZ
Originally posted by: venkman


720p is indistinguishable from the 1080 formats unless you have a massive screen, and even then, blind comparisons on AVS between 1080 and 720 projectors filling a 120"+ screen haven't shown much of a visible difference to trained eyes.

This is incorrect. I have a 30" CRT HDTV (my room is small) and it's easy to tell the difference between stations that broadcast in 720p vs. 1080i. I think a lot of people have 720p LCD TVs and they (obviously) can't tell the difference, since the TV can't display it.

But venkman said they were using projectors that could display 1080 and 720. I also think projectors display resolution differently, in that the image is projected rather than shown directly on pixels, so it looks different.

Koing

I'm not so sure he's correct there. I've seen so many people who don't understand the specs of their equipment. For instance, many devices have a low native resolution but "support" a 1080p input, and owners think that means it will display 1080p.
 

smitbret

Diamond Member
Jul 27, 2006
3,382
17
81
Most broadcast HD TV is in 1080i, with the exception of Fox, ABC, ESPN and a couple of others. The reason? The amount of sports programming. Progressive scanning yields a superior picture when fast motion is involved because the hardware isn't forced into "guessing" what to fill in the blanks with. On a static image with little motion, 1080i is going to be superior because there are more pixels and the "guessing" can be very accurate. Since bandwidth requirements are similar between 720p and 1080i, a network that expects a high volume of action/sports programming will benefit from progressive-scan 720p.

Most likely, the Steelers game would have looked better on a 720p than a 1080i broadcast, but since it was probably on ESPN or ABC, it should have been sent to your home at 720p anyway. The only way it would have looked better would have been 1080p. I haven't really checked or researched how a 24fps movie would fare at 720p vs. 1080i, but I would guess the same rules apply.
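If you want to sanity-check the "similar bandwidth" point above, the raw, uncompressed pixel rates are easy to run yourself; this is just back-of-the-envelope arithmetic, not real broadcast bitrates:

Code:
# Back-of-the-envelope pixel rates (uncompressed samples per second)
p720 = 1280 * 720 * 60     # 720p60: 60 full frames/s        -> 55,296,000
i1080 = 1920 * 1080 * 30   # 1080i60: 60 fields/s = 30 frames -> 62,208,000
print(p720, i1080)         # 1080i carries about 12.5% more raw samples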

Incidentally, 1080i isn't even a standardized HD broadcast format. It's just kind of a bastard format to make marketing easier and yield a slightly sharper image from a typical broadcast source. A well-calibrated, high-quality 720p television like the Pioneer Kuro series will look almost indistinguishable from most of today's 1080p sets. That being said:

1080p>720p=1080i>480p

In a controlled environment, plasma is still superior to LCD as far as color, contrast, handling of motion and general picture quality. They do consume about 30-50% more electricity, burn-in can still happen if you leave a static image on screen long enough, and they are prone to problems with glare. If you REALLY care about image quality and aren't willing to compromise, you buy a plasma. If you need an HDTV for general use (80% of the population), you get an LCD.
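On the electricity point, here's a quick way to estimate the difference; the wattages, viewing hours, and rate below are made-up round numbers, so plug in your own:

Code:
# Hypothetical yearly running-cost comparison (all inputs are guesses)
plasma_watts, lcd_watts = 300, 200   # assumed draw for comparable 50" sets
hours_per_day = 5
rate_per_kwh = 0.11                  # dollars per kWh; varies a lot by region

def yearly_cost(watts):
    kwh = watts / 1000 * hours_per_day * 365
    return kwh * rate_per_kwh

print(yearly_cost(plasma_watts) - yearly_cost(lcd_watts))  # roughly $20/year with these guesses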
 

zinfamous

No Lifer
Jul 12, 2006
111,858
31,346
146
Originally posted by: 0roo0roo
yea i remember when indie film makers were touting DV. i was pissed. it looked like sh*t on the big screen back then, and it'll look like sh*t forever more. sometimes progress is not progress at all heh.

and yea, Star Trek: The Original Series is trickling out redone in HD. looks excellent. Seinfeld as well :)
The Next Generation, edited on tape as said above... looks like garbage on DVD :(

<---paid $$$ for his miniDV camcorder back in 2002 :( :laugh:
 

0roo0roo

No Lifer
Sep 21, 2002
64,795
84
91
Originally posted by: venkman
Originally posted by: YOyoYOhowsDAjello
Here's one of the 720p vs 1080p comparisons I know of
http://www.avsforum.com/avs-vb/showthread.php?t=767929

That is the one I was referencing. Thanks YoYo

projection must have had less obvious pixels than a flat panel somehow. perhaps it was blurring them away... but i have serious doubts, because at that projection size and that resolution the pixels are huge. i've seen 2K theaters where i could pick out pixels from the front rows. if they were half that resolution, my eyes would have popped out of my head.
 

YOyoYOhowsDAjello

Moderator / A/V & Home Theater / Elite member
Aug 6, 2001
31,205
45
91
Originally posted by: 91TTZ
Originally posted by: venkman
Originally posted by: YOyoYOhowsDAjello
Here's one of the 720p vs 1080p comparisons I know of
http://www.avsforum.com/avs-vb/showthread.php?t=767929

That is the one I was referencing. Thanks YoYo


It's pretty hard to see which one is better when you are viewing a camera shot of a 1280x720 vs. 1920x1080 image in a picture that's 800x533.

Well, anything less than actually being there isn't going to cut it. I think reading that people who were there in person had a hard time distinguishing between the two (or even got it wrong) is a more valuable thing to take away from it than anyone's conclusions from screenshots.

Even if we did have 1920x1080 shots of the screens, that's a pretty poor way to evaluate whether there's really a difference considering the limitations of the evaluation tool.
(Take a picture of the screen with a digital camera (of some resolution that probably isn't really going to line up with the screen's resolution) in a low light environment trying to focus on a whole wall, then possibly compress the image, then view it on a monitor of unknown resolution and color calibration, etc.)

I mention color since that seems to be something people in that thread mentioned a few times just looking at the first few pages. I didn't read it in depth enough to know what the actual findings were in the live demonstration, but I would imagine that if it showed up in these images, it was probably pretty apparent in person as well.

I believe ISF considers Contrast ratio, Color saturation, and Color accuracy to all be more important than resolution when determining what factors produce a good image.

That said, I am going to be upgrading my projector, and it's certainly going to be a 1080p model. Not that I'd be looking at the bottom end, but they're even available for under $1000 now.

I know I'm sitting too close to my projector
http://forums.anandtech.com/me...id=67&threadid=2306708
http://pics.bbzzdd.com/users/Y...lo/NewExcelBasedv9.png
(I'm that green dot all the way on the right side with my aging 720p projector)

I'll probably end up with an Epson 8500 or Panasonic AE4000 as a replacement, but more than the resolution, I'm looking forward to the increase in contrast and color from the upgrade.
 

CZroe

Lifer
Jun 24, 2001
24,195
857
126
Originally posted by: smitbret
Most broadcast HD TV is in 1080i, with the exception of Fox, ABC, ESPN and a couple of others. The reason? The amount of sports programming. Progressive scanning yields a superior picture when fast motion is involved because the hardware isn't forced into "guessing" what to fill in the blanks with. On a static image with little motion, 1080i is going to be superior because there are more pixels and the "guessing" can be very accurate. Since bandwidth requirements are similar between 720p and 1080i, a network that expects a high volume of action/sports programming will benefit from progressive-scan 720p. Most likely, the Steelers game would have looked better on a 720p than a 1080i broadcast, but since it was probably on ESPN or ABC, it should have been sent to your home at 720p anyway. The only way it would have looked better would have been 1080p. I haven't really checked or researched how a 24fps movie would fare at 720p vs. 1080i, but I would guess the same rules apply.

Incidentally, 1080i isn't even a standardized HD broadcast format. It's just kind of a bastard format to make marketing easier and yield a slightly sharper image from a typical broadcast source. A well-calibrated, high-quality 720p television like the Pioneer Kuro series will look almost indistinguishable from most of today's 1080p sets. That being said:

1080p>720p=1080i>480p

Nope. The key point is in your "I haven't really checked or researched how a 24fps movie would fare at 720p vs. 1080i" statement. Because most pre-recorded HD material is filmed on 35mm at 24fps, those frames have to go through 3-2 pull-down to create 60 fields per second (30 interlaced frames per second). What that means is that every odd frame from the source material is doubled and every even frame is tripled. The FIRST interlaced frame consists of two fields from the same source frame, which is actually a progressive image in an interlaced signal. The only time this isn't true is when that second field is still displayed as the third field is being rendered (from the second interlaced frame), and within 1/60th of a second that, too, will be replaced with an alternate field from the same progressive frame of the source material.

So, 1080i can transmit every detail of every frame from a 1080p/24 source. Every single pixel. 720p can't. Obviously, reverse 3-2 pull-down can detect repeated fields in a 3-2 cadence (or 2-2 in the case of 24FPS to 50Hz) and simply restore and display only the progressive frames, eliminating even that moment between the second and fourth fields where an interlaced image is shown.
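If the cadence is easier to follow in code than in prose, here's a toy Python sketch of it (letters stand in for whole frames; a real deinterlacer matches actual field content, not labels):

Code:
# Toy 3-2 pull-down: 4 film frames -> 10 fields (1/6 of a second of 60i video)
film = ["A", "B", "C", "D"]          # 24FPS progressive source frames
cadence = [3, 2, 3, 2]               # fields emitted per source frame

fields = []
for frame, count in zip(film, cadence):
    fields.extend([frame] * count)   # -> A A A B B C C C D D

# Reverse pull-down: two adjacent fields from the same source frame
# weave back into one clean progressive frame.
recovered = []
i = 0
while i + 1 < len(fields):
    if fields[i] == fields[i + 1]:
        recovered.append(fields[i])  # a full progressive frame, no guessing
        i += 2
    else:
        i += 1                       # leftover field from a 3-field group
print(recovered)                     # ['A', 'B', 'C', 'D'] - every film frame intact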

Oh, and you have a confusing typo in there... "Incidentally, 1080i isn't even a standardized HD broadcast format." should say "1080p." That said, 1080p isn't about broadcast at all; it's about the native resolution of your set. Anything but a CRT is natively progressive, and interlaced signals need to be converted to progressive to be properly displayed. Though many crappy sets fail 3-2 reverse pull-down on 1080i content, they all should do it, and if not, a decent AV receiver will make up for it. It's universally expected that they all do reverse pull-down for 480i to 480p.

Now, LIVE HD broadcasts of sports games are typically done on a 60FPS camera and therefore cannot be deinterlaced to 1080p using inverse 3-2 pull-down, but that's pretty much limited to concerts, sports, and news. Because the VAST majority of content can have proper reverse 3-2 pull-down applied to restore the original 1080p/24 frame from interlaced 1080i/60, you are decidedly wrong with your blanket statement that 720p>1080i. Face it: most people buy home theaters for theatrical content, and most programs on TV are pre-recorded and sourced from 24p. 720p LOSES. Other than your point about live fast-motion events, its main benefit is with 60FPS video games: it's easier to render 720p at 60FPS than 1080p, and it's visually better than 1080i... also assuming 60FPS.

Oh! And since when has ESPN been "broadcast" TV? ;)
 

kalrith

Diamond Member
Aug 22, 2005
6,628
7
81
Originally posted by: YOyoYOhowsDAjello
I believe ISF considers Contrast ratio, Color saturation, and Color accuracy to all be more important than resolution when determining what factors produce a good image.

You are correct.

The fact that contrast ratio, color saturation, and color accuracy are more important to the PQ than resolution seems so lost in this OMG-anything-less-than-1080p-is-total-crap age.

People are posting things like: 'My friend has a 720p plasma. It's a 50". I'm sorry but it looks SO MEDIOCRE it's not even funny. IT bothers me watching it. 720p is not enough for a 50".' This completely overlooks a ton of variables. What was the source? Was the TV calibrated? Was it a $600 Vizio or $2000 Pioneer plasma? And so on.

I always like to point out two things in the 720-vs-1080 debates. One is the ISF viewpoint that YOyoYO already pointed out. The other is the example of my TV. It's the Pioneer 5080, which is a 50" plasma with only 1366x768 resolution (i.e. 720p). When the TV came out a few years ago, more than half of the professional reviewers who reviewed it said that it was THE best flat panel TV ever. "But it's only 720p and this Panasonic plasma and Sony LCD are both 1080p!" Well, there are a lot of things more important than resolution that contribute to the PQ of a TV. However, resolution is something that's much more easily marketed than contrast ratio, color saturation, color accuracy, image processing, etc., and therefore it gets a lot more attention.

I just remembered one more thing. The native resolution of a TV is its static resolution. This would be the resolution when watching a picture or a totally still screenshot. When any motion is added to the mix, the resolution is (usually) lower than the native resolution and is referred to as motion resolution. Here's an article and PDF comparison of 125 HDTVs for motion resolution. This doesn't get mentioned a lot, but it's SO important (unless your TV shows have no motion whatsoever!). The tests show that some TVs (like the Pioneer 5020) do very well with motion and have a resolution of 900 (compared to 1080 static resolution). Whereas other TVs do very poorly with motion (like the Sony KDL-40S4100) and have a resolution of 310 (compared to 1080 static resolution)! I would like to think that the motion resolution on my 720p plasma is more than