1080p worthless?

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Today, I was at 6th Ave Electronics store. I saw a Sony 50" display. Those really fancy ones. It was 5,000 bucks.

The picture was amazing and very detailed, until it started to move. Open Season was playing on it. The bear character looked incredible and you could see every strand of hair on him, but as soon as he moved, everything turned into a blur.

This is because of the crappy framerate of movies. But my thing is: why buy 1080p when you can only see the detail in still pictures? It feels like 1080p is just a gimmick and that 720p is as good as HD can get.
 

Dacalo

Diamond Member
Mar 31, 2000
8,780
3
76
Originally posted by: VIAN
It feels like 1080p is just a gimmick and that 720p is as good as HD can get.

Whatever you say!

*Off to to play blu-ray on my Samsung 1080p TV*
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,037
431
126
Someone hasn't seen Blu-ray or HD-DVD at 1080p then, or a computer connected to the screen. 1080p is the screen to get, plain and simple. Why? Because unlike a 1080i screen, you can upscale 720p to 1080p and not lose detail (other than the normal scaling problems that can occur). You can also watch 1080i material on the screen with no loss of detail.

As for the blur you saw, did you ever consider that it was motion blur from that particular LCD panel? There are others out there which do not suffer as much blur, especially the newer 120Hz refresh models that have been coming out over the last few months (Sharp, Samsung, eventually Sony, and all the other high-end LCD makers). That blur might also be in the source video (I don't know that for a fact, but it is possible).

As I said in my first sentence, go look at what Blu-ray at 1080p looks like before you even remotely consider making the statement you made.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Originally posted by: Fallen Kell
As for the blur you saw, did you ever consider that it was motion blur from that particular LCD panel? There are others out there which do not suffer as much blur, especially the newer 120Hz refresh models that have been coming out over the last few months (Sharp, Samsung, eventually Sony, and all the other high-end LCD makers).
On a 5000 dollar TV, that's BS. This is the TV:

http://www.sonystyle.com/is-bin/INTERSH...video&CategoryName=tv_flatpanel_46to52

An 8ms panel... I find it hard to believe that it's the panel's fault. There was no loss of color, which is usually associated with high-latency blur. Not to mention the fact that I know this effect exists on CRT TVs as well, and in movie theaters. I know it's because of a crappy framerate, which is why I can't wait for the industry to adopt 60fps video.

And why would 120Hz make a difference? First, LCD screens work at 60Hz, which the source doesn't even have the potential to fill, being that it's only 24fps. Second, it's an 8ms panel, which should provide 125fps, substantially more than the 60Hz the monitor supports. 120Hz LCDs are just a gimmick.

Unless you can prove otherwise, I'll stay with my HD DVD on my 720p @ 60Hz.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Here's a nice quote from the link below:

"To make a long story short, there is no difference between 1080i60 and 1080p60 when displaying a 1080p24 source.

This is because 1080i60 is really 1080p30 (when the deinterlacer is working correctly). The problem then becomes whether your player/display will provide the correct 2:3 pulldown (inverse telecine), so until players (like the Pioneer Blu-ray player) start outputting 1080p24, and displays are able to accept a 1080p24 input, the issue of 1080i/p is simply a marketing ploy.

As for the actual resolution... you will not really be able to tell the difference between a 720p and a 1080p image unless you are looking at a "large" image from a reasonably close distance. For example, from 12-15 feet away for a 50" TV, you will not be able to detect much of a difference at all. If you have a larger set (or in your case a larger image due to the throw of your projector), it might begin to be discernible."

http://forums.highdefdigest.com/showthread.php?t=579
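A minimal Python sketch of the 2:3 pulldown (telecine) cadence the quote refers to, purely for illustration:

```python
# Sketch of 2:3 pulldown: mapping 24 film frames per second onto 60 interlaced
# fields per second. Each film frame is held for alternately 2 and 3 video
# fields, so 24 frames become 12 * (2 + 3) = 60 fields.

def telecine_23(film_frames):
    """Expand a list of film frames into the 2:3 field cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 2 if i % 2 == 0 else 3   # alternate 2 fields, then 3 fields
        fields.extend([frame] * repeat)
    return fields

# 4 film frames -> 10 fields, so one second of film (24 frames) -> 60 fields.
print(telecine_23(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```

A deinterlacer doing inverse telecine just detects that cadence and throws away the duplicate fields, which is why a 1080i60 signal can carry a 1080p24 film without losing anything.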
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I'm curious and want to find out why there was so much blur. If it shows up on other 1080p screens too, then what would be the point of 1080p? There could be many reasons, but none have been listed here, except by Fallen Kell, who said it may be from the source video. Or it may have been due to scaling. The CRTs I'm talking about, where I've seen this kind of blur, are Sony CRTs with DRC technology displaying an SD signal.

I'm going to try to go back tomorrow to see how it's hooked up and check whose fault it is. I don't remember seeing this effect on other HD sets before. I was merely asking out of curiosity. I needed some insight as to whether it was a technology worth looking into, not a brush-off like I'm crazy.

The only other problem I have with 1080p is that it's only in TVs 40" and up. I'd like to have a TV that's 1080p, but unfortunately, 40" is just way too big for my room. I plan on being seated about 5 feet from the TV. I've settled on a 32" TV... and none have 1080p. What horse crap!


Here is a good idea of what I'm talking about:

"Back in the era when people first started making silent movies they wanted to expose just enough film to reasonably capture motion without wasting too much film. Initially, that exposure rate was 18 frames per second. However, when they wanted to go to talkies, they discovered that 18 fps was not fast enough to lay down a coherent audio track. So the speed was increased to 24 frames per second to accommodate audio, and the film industry has had 24 fps as a standard since the 1930's. As everyone who has been to the movies knows, 24 fps is not fast enough to resolve rapid motion without some blurring effects.

HD-DVD and Blu-ray will carry this blurring-of-motion tradition forward to some degree since it is a limitation of the original film source. It makes perfect sense that film transfers will be 24 fps on both HD-DVD and Blu-ray since that is what the film source is. However, users should not expect that 1920x1080 super high resolution will make the blur go away."

http://www.projectorcentral.com/hd-dvd.htm


Here's another interesting quote:

"2.) 1080I is smoother watch it, stop saying it's not. Honestly watch something at 24P and 30I and it will look smoother at 30I. If it looks better is a opinion that differs from project to project but it look smoother assuming it's shot using 30I and is displayed with out removing fields. 60 frames or 30I is perfect for sports. Honestly 24P is a motion blur nightmare and can look choppy during pans with out a object to follow with your eyes (It's a odd problem that video cinematographer have when they shoot film.) And quite frankly CSI, House and most Hollywood films look nothing like real life so the 24P add to the cinematic quality and not the realism anymore the CSI's amazing background lightning creates Realism (Wow... how do you people work in this Dark Crime Lab.)

3.) 24P, 720 is not better for games for the same reason it's not better to watch sports in 720P (And why sport are broadcasted in 30I, 1080I) If your watching sports at 1080I technically your seeing sixty frames a second, which is a much smoother realistic picture. If your viewing 24P technically your seeing 2/5 of the frames that you would from a 30I image (For those who don't play computer games, higher frame rates are better.) so that when Fragmaster moves around the corner to get you technically you can see him up to 3 frames sooner and dodge his rocket (Assuming the game is supporting actually 30I and not cheating it with duplicating frames.)"

http://digg.com/hardware/1080p_and_1080i_High_Definition_Resolution_explained
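Rough arithmetic behind the "2/5 of the frames" comparison in that quote (an illustrative sketch, not from the linked page):

```python
# Distinct temporal samples per second: 24p film vs. interlaced video shot at 30i,
# where each of the 60 fields per second captures a different moment in time.
film_24p = 24
fields_1080i60 = 60

print(film_24p / fields_1080i60)   # 0.4, i.e. the "2/5 of the frames" in the quote
```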
 

bobthemongoloid

Junior Member
Sep 4, 2004
9
0
0
To answer the main question:

Is 1080p worthless? That's a HUGE no. 1080p is where it's at. 1080p material looks so much better on my LCD than 720p material. Even on my mediocre TV (32" LCD, 1366x768), 1080p material looks much sharper than 720p. I was a bit skeptical at first for logical reasons, but it just impressed the crap out of me. As far as why that TV blurred, who knows. I just played Open Season at 1080p on my LCD and it looks sweet, so the source is likely fine... the 8800 helps too, but an X1900 will pull it off without choppiness as well.
 

jdoggg12

Platinum Member
Aug 20, 2005
2,685
11
81
I've seen the 720p and 1080p Westinghouses hooked up to computers... there's a HUGE difference in screen quality. My 37" 1080p looks AMAZING when playing Far Cry, HL2, Doom 3, etc. The 720p on my friend's computer makes you sit far away b/c the resolution isn't good enough.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,486
20,572
146
I agree with VIAN that if you already have a 720p and/or 1080i capable set, there isn't any real incentive to buy a 1080p set yet. Those who use it as a PC display and game on it are likely getting the most out of it.
 

SKoprowski

Member
Oct 21, 2003
187
0
0
Exactly BernardP! It depends how close you are to the screen. All HDTV broadcasts are 720p or 1080i anyway. You're only going to use 1080p for HD DVDs and video games. I'm not sure how accurate this is, but I was told there isn't enough bandwidth to ever broadcast 1080p over the air.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
1080p vs. 720p is going to be a pretty big difference on a PC monitor because of 1:1 pixel mapping and built-in scalers. LCD TVs just seem to scale non-native resolutions better than LCD monitors. Not sure why, maybe it's because larger LCD TVs aren't 1:1 anyway? That seemed to be the case with my brother's 1080p 40" Bravia, but the 37" Westinghouse might be 1:1, since some complain about its pixel size and pitch.

Also, as VIAN mentioned, source resolution/quality and refresh rate seem to have a pretty big impact when viewing on a 1080p panel. When watching HD cable, 1080i source material looks incredible, but 480i sources look a lot worse, with much more ghosting.
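A rough sketch of what 1:1 pixel mapping means in practice; the resolutions below are generic examples, not measured specs of those particular sets:

```python
# 1:1 pixel mapping only happens when the source resolution exactly matches the
# panel's native grid; otherwise the TV's scaler has to resample the image.

def scale_factor(source, panel):
    sw, sh = source
    pw, ph = panel
    if (sw, sh) == (pw, ph):
        return "1:1 (no scaling)"
    return f"scaled {pw / sw:.3f}x horizontally, {ph / sh:.3f}x vertically"

print(scale_factor((1920, 1080), (1920, 1080)))  # native 1080p panel: 1:1
print(scale_factor((1280, 720), (1366, 768)))    # typical "720p" LCD: non-integer scaling
```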
 

JBT

Lifer
Nov 28, 2001
12,095
1
81
IMO for gaming there is a huge difference, for HD-DVD/Blu-ray there is a bit of a difference, and for TV broadcasts not really any difference.
It all depends on what you plan on using the TV for.
Also, if you are stupid enough to buy a TV at the list price from the manufacturer's website, you are overpaying big time.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Using an LCD TV as a computer monitor, 1080p will make a huge difference compared to 720p.

However, as strictly a TV, I think 720p looks awesome on a native 720p TV. I don't see much of a difference running an HD-DVD on my 50" 720p DLP TV compared to 1080p on my 24" Gateway screen.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
I have a feeling this thread is only to attack Sony/Blu-ray.

No offense, but I saw the exact same thing you did on a TV at Best Buy, and even in motion it was the crispest image I had EVER seen.

Is there a huge difference between 1080p and 1080i? No...they're both still stunning...Probably no humanly noticeable difference. So is 1080p needed? No. Worthless? Absolutely not, it's an advancement that should be welcomed.

720p is still wonderful to watch, but 1080p can be amazing. I don't know what kind of messed up experience you had...
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,037
431
126
I know I posted a reply earlier, but can't find it now....

Anyway, it basically debunked a lot of the claims you made, so here it is again:

"And why would 120Hz make a difference"
Because at 120Hz you no longer need any 2:3 pulldown because 24Hz, 30Hz, AND 60Hz sources ALL map directly to 120Hz output (120/24 = 5, 120/30 = 4, 120/60 = 2). This completely eliminates all the stuttering that some people detect when watching a 24Hz film being displayed on a 60Hz refreshed screen because you can't time the frames exactly onto that format.
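The divisibility argument, spelled out in a small illustrative sketch:

```python
# Every common source rate divides 120 evenly, so on a 120Hz panel each source
# frame is simply shown a whole number of times, with no pulldown cadence needed.
for source_fps in (24, 30, 60):
    print(f"{source_fps}fps source on 120Hz: each frame shown {120 // source_fps} times")

# On a 60Hz panel, 60 / 24 = 2.5, so 24fps frames must alternate between 2 and 3
# refreshes (2:3 pulldown), which is where the judder comes from.
print(60 / 24)   # 2.5 -> no even mapping
```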

"Second, it's an 8ms panel, which should provide 125fps"
No, it won't provide 125fps. Check what that 8ms is actually measuring. You will find that it measures a best-case gray-to-gray transition time for the LCD crystal. It does NOT show the WORST-case gray-to-gray transition. The worst case is what dictates how the panel actually performs in terms of fps, just like the stuttering you see in video games; it's the worst case that matters. Even the absolute best LCD panels (i.e. the $20,000+ professional limited-production ones) barely get the worst case into the low 20ms range, which means only about 50fps.
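The spec-sheet arithmetic here, sketched out; the 20ms worst-case figure is the example from the paragraph above, not a measured spec:

```python
# Turning an LCD response time into a rough upper bound on distinct frames per second.
def max_fps(response_ms):
    return 1000.0 / response_ms

print(max_fps(8))    # 125.0 -> the best-case gray-to-gray number on the spec sheet
print(max_fps(20))   # 50.0  -> a worst-case transition, which is what limits real motion
```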

Higher resolution means you can be closer while maintaining the same picture quality. This means the TV can occupy a larger portion of your field of view to give you that true cinema big screen effect in your home. It also allows you to use the screen as an effective computer monitor. Text is the single biggest issue on screens that are only 720p or 1080i. At 1080p, text works very well. As for humans being able to see the noticeable difference, well, if you are going to be 15-20 feet away from a 42" screen, then no. If you are going to be 6-10 feet away from a 50" screen, HELL YES you can see the difference. Again, higher resolution lets you get closer to the screen without sacrificing picture quality.
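One rough way to put numbers on the distance argument, assuming the common ~1 arcminute-per-pixel rule of thumb for 20/20 vision and simplified 16:9 geometry (an illustrative sketch only):

```python
import math

# Approximate distance beyond which individual pixels can no longer be resolved,
# assuming the eye resolves about 1 arcminute per pixel.
def max_useful_distance_ft(diagonal_in, horiz_pixels, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_in = width_in / horiz_pixels
    one_arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(one_arcmin) / 12   # inches -> feet

for pixels_wide in (1280, 1920):
    d = max_useful_distance_ft(50, pixels_wide)
    print(f'50" screen, {pixels_wide} pixels wide: full detail visible within ~{d:.1f} ft')
# Roughly ~10 ft for 720p and ~6.5 ft for 1080p on a 50" set, which is consistent
# with the "12-15 feet" figure quoted earlier: at that range neither is fully resolved.
```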

As for someone stating that Blu-ray/HD-DVD is only minimally different at 1080p vs 720p/1080i, you were probably watching some of the movies that were only recorded at 720p. A large portion of the Blu-ray/HD-DVD movies that first came out (and to an extent even ones still coming out) were only created from a 720p source.

It's not only the above issues with the movies themselves, but the players as well. The first released players (the Toshiba HD-XA1, the Samsung BD-P1000, and others) could NOT output 1080p. It was a limitation of the Broadcom decoder chips they used. As a result, they would read the 1080i data, then pass it through a Reon processor downstream for deinterlacing to 1080p and output that.

 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Fallen Kell
"And why would 120Hz make a difference"
Because at 120Hz you no longer need any 2:3 pulldown because 24Hz, 30Hz, AND 60Hz sources ALL map directly to 120Hz output (120/24 = 5, 120/30 = 4, 120/60 = 2). This completely eliminates all the stuttering that some people detect when watching a 24Hz film being displayed on a 60Hz refreshed screen because you can't time the frames exactly onto that format.
That's really cool. I always wondered whether future TVs would address this. :thumbsup:

I guess changing a 24Hz DVD into a 60Hz stream for an LCD TV would require complete re-encoding with a very sophisticated algorithm -- way too much to do on the fly -- right?