Originally posted by: BD2003
Originally posted by: iamaelephant
Originally posted by: 91TTZ
I'm also bothered by the LED taillights on cars, since they strobe.
err....
No, he's right. They drive me crazy. Every time I'm stuck behind a Caddy, the flickering annoys the hell out of me.
I can easily tell the difference between 30fps, 60fps, and 90fps.
But the fact that nearly everyone's eyes are bothered by 60Hz isn't so much a problem with the framerate as with the way CRTs scan - they don't flash whole images at once, they draw them line by line.
Anyway, if you're sitting in front of a moderately large TV at a reasonable distance, there is very, very little difference between 720p and 1080i/p. You have to sit absurdly close to a huge TV (5 feet from a 65") in order to see those 1080 lines. It could very well be appropriate for a projection-based home theater with a huge screen, but for a regular TV, I'd never sit that close all the time.
Unless you have eagle eyes or something.
Me too, and I'm fine with it.
Originally posted by: ariafrost
Meanwhile, I'm stuck using an analog source feeding into my TV tuner to watch TV on my computer monitor. :Q
Originally posted by: spidey07
I don't know the distance off the top of my head, but there are calculators out there for it. The gist is that you can't really see all the detail of HD and 1080 resolution unless you're at a particular distance-to-size ratio.
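For reference, a minimal sketch of the calculation those calculators do, assuming 20/20 acuity (one arcminute per scan line) and a 16:9 screen; the function name and defaults are illustrative, not from any particular calculator:

import math

def max_viewing_distance_ft(diagonal_in, lines=1080, aspect=16 / 9,
                            acuity_arcmin=1.0):
    # Screen height from the diagonal of a 16:9 panel.
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)
    line_in = height_in / lines                 # height of one scan line
    theta = math.radians(acuity_arcmin / 60)    # one arcminute, in radians
    # Farthest distance at which one scan line still subtends a full arcminute.
    return line_in / math.tan(theta) / 12

for size in (42, 50, 65):
    print(f'{size}" 1080-line set: full detail out to ~{max_viewing_distance_ft(size):.1f} ft')

For a 65" set this works out to roughly 8.5 feet, which lines up with the THX range spidey07 cites below; sit farther back and the extra lines of a 1080 display blur together.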
Originally posted by: spidey07
meh, I sit 8 feet away from the 65", which is still within THX and HD limits (7-8.5 feet). Looks great.
Either way, that article is pure BS and very misleading.
Originally posted by: fs5
1920x1080, that's why I bought a 1080p set.
Originally posted by: spidey07
OH - more idiocy. This bozo was switching the output of the player between 1080p and 1080i FOR MOVIES!!! It's well known that you can fully reconstruct 1080p for display from a 1080i source. What an idiot.
I meant over a 720p DLP.
Originally posted by: LikeLinus
Originally posted by: fs5
1920x1080, that's why I bought a 1080p set.
Then you were had. 1080i resolution is 1920x1080. Why can't people understand that?
Originally posted by: spidey07
That article is so full of misinformation and FUD it isn't funny. It needs to be withdrawn.
The year is 2007; 1080p is the way of the future. I'll take flicker-free true HD resolution, please.
Like this golden nugget of idiocy...
"While this isn't the most scientific test, both Katzmaier and I agreed that, after scanning through Mission: Impossible III for an hour, it would be very difficult--practically impossible--for the average consumer to tell the difference between a high-definition image displayed on a 1080p-capable TV and one with lower native resolution at the screen sizes mentioned above. At larger screen sizes, the differences might become somewhat more apparent, especially if you sit close to the screen."
No crap, you dummy. You were probably not sitting at the correct distance for HD viewing. Not to mention the differences become strikingly clear as you move to larger displays. This idiot was watching it on 42" and 50" screens and was most assuredly not sitting close enough for HD viewing.
OH - more idiocy. This bozo was switching the output of the player between 1080p and 1080i FOR MOVIES!!! It's well known that you can fully reconstruct 1080p for display from a 1080i source. What an idiot.
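For what it's worth, what spidey07 is describing is a plain weave: with film content, both fields of a 1080i frame come from the same instant, so re-interleaving them recovers the progressive frame exactly. A minimal numpy sketch of the idea (illustrative, not any particular player's deinterlacer):

import numpy as np

def split_into_fields(frame):
    # A 1080i transport of film sends each frame as two half-height fields.
    return frame[0::2], frame[1::2]   # even lines, odd lines

def weave(top, bottom):
    # Re-interleave the field pair back into one progressive frame.
    frame = np.empty((top.shape[0] + bottom.shape[0],) + top.shape[1:], dtype=top.dtype)
    frame[0::2], frame[1::2] = top, bottom
    return frame

original = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
assert np.array_equal(weave(*split_into_fields(original)), original)  # lossless

Note this only holds when both fields came from the same source frame; with true interlaced video (two different instants per frame), a weave would show combing artifacts.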
Originally posted by: smack Down
Shouldn't a 1080i screen flicker less than a 1080p one, since the 1080i effectively has twice the frame rate?
Exactly. But for some reason, the geeks here have a hard time understanding this.
Originally posted by: jpeyton
Originally posted by: spidey07
meh, I sit 8 feet away from the 65", which is still within THX and HD limits (7-8.5 feet). Looks great.
Either way, that article is pure BS and very misleading.
Nope, the article just confirms what many users have reported over the years on AVSForum.
Ever wonder why, at the proper viewing distance, most people would prefer a Panasonic EDTV plasma over a competitor's HDTV plasma?
Brightness, contrast, color fidelity, response time...there are a ton of things more important than the almighty 'lines of resolution'.
I am not saying there isn't a big difference between 480p and 1080p. But when you're comparing 720p/1080i to 1080p, the gap is substantially smaller.
If someone is looking for a plasma HDTV under $2000, 1080p isn't an option. But with all HDTV broadcasts at 1080i/720p and the longevity of the DVD format, I think they'll be more than happy with a quality 720p set.
I think that's the conclusion CNET is trying to draw. Forget the marketing hype of the PS3/Blu-ray; 1080p is nice, but it's certainly not the second coming.
Originally posted by: spidey07
Originally posted by: smack Down
Shouldn't a 1080i screen flicker less than a 1080p one, since the 1080i effectively has twice the frame rate?
It's the same frame rate: 30fps. I still want 60fps ideally, and when I finally do get my next TV it will have that ability.
You can equate 1080i and 1080p the same way as 480i and 480p: 30 frames per second are displayed either way. Interlaced uses fields - in this case, 60 fields per second to build 30 frames per second.
The problems with interlacing are the same.
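To put numbers on that, a quick sketch of what each format actually delivers per second (the labels and layout are just for illustration):

# lines per picture, pictures per second, and whether each picture is a half-height field
formats = {
    "1080i60": (540, 60, True),
    "1080p30": (1080, 30, False),
    "1080p60": (1080, 60, False),
}
for name, (lines, rate, interlaced) in formats.items():
    frames = rate / 2 if interlaced else rate
    print(f"{name}: {lines * rate:,} lines/s -> {frames:.0f} full frames/s")

1080i60 and 1080p30 both come out to 32,400 lines and 30 full frames per second, which is exactly spidey07's point; only 1080p60 actually doubles the frame rate.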
Originally posted by: iamaelephant
Originally posted by: 91TTZ
I'm also bothered by the LED taillights on cars, since they strobe.
err....
Originally posted by: RaynorWolfcastle
As Spidey said, for a 1080p24 source like a movie you will not see any difference at all, because there is no difference at all. How the hell does a tool like this become executive editor at CNET?
smack Down, you're confusing refresh rate with frame rate. These are LCDs they tested, for the most part, so you won't see any flickering at all regardless of the frame rate. Most modern display devices are progressive and will deinterlace all interlaced sources regardless.
Originally posted by: smack Down
Originally posted by: RaynorWolfcastle
As Spidey said, for a 1080p24 source like a movie you will not see any difference at all, because there is no difference at all. How the hell does a tool like this become executive editor at CNET?
smack Down, you're confusing refresh rate with frame rate. These are LCDs they tested, for the most part, so you won't see any flickering at all regardless of the frame rate. Most modern display devices are progressive and will deinterlace all interlaced sources regardless.
You won't see any flicker on any TV made in the last 20 years at least, but that's beside the point: an interlaced display will have less flicker because it effectively has twice the frame rate.
Originally posted by: RaynorWolfcastle
Originally posted by: smack Down
Originally posted by: RaynorWolfcastle
As Spidey said, for a 1080p24 source like a movie you will not see any difference at all, because there is no difference at all. How the hell does a tool like this become executive editor at CNET?
smack Down, you're confusing refresh rate with frame rate. These are LCDs they tested, for the most part, so you won't see any flickering at all regardless of the frame rate. Most modern display devices are progressive and will deinterlace all interlaced sources regardless.
You won't see any flicker on any TV made in the last 20 years at least, but that's beside the point: an interlaced display will have less flicker because it effectively has twice the frame rate.
You're misunderstanding what I mean. In a CRT, there's an electron beam that has to refresh the screen once every 1/60 of a second. In a progressive display it sweeps all the lines on the screen; on an interlaced display it sweeps only half of them per pass. The only thing that prevents you from seeing flicker is the persistence of the phosphors on your screen.
On LCD, DLP, or LCoS displays (I believe this also goes for plasma), there is no electron beam. The pixels are always driven, so there is no flicker at all. The frame rate (and pixel response time) only changes the fluidity of the motion of what's on screen.
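A toy model of that difference, assuming an impulse-plus-phosphor-decay CRT and a constantly driven ("hold-type") panel; the 2 ms decay constant is made up purely for illustration:

import numpy as np

def crt_luminance(t, refresh_hz=60.0, decay_ms=2.0):
    # Model each beam pass as a flash followed by exponential phosphor decay.
    since_flash = t % (1.0 / refresh_hz)
    return np.exp(-since_flash / (decay_ms / 1000.0))

t = np.linspace(0, 0.05, 5001)   # a 50 ms window
crt = crt_luminance(t)
lcd = np.ones_like(t)            # hold-type: pixels stay driven between frames
print(f"CRT brightness ripple: {crt.min():.3f}..{crt.max():.3f}")
print(f"LCD brightness ripple: {lcd.min():.3f}..{lcd.max():.3f}")

The CRT's brightness swings over its full range 60 times a second (that's the flicker), while the hold-type panel stays flat; on a hold-type panel, frame rate only shows up as motion smoothness.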
Originally posted by: 91TTZ
Originally posted by: Fritzo
Interesting article on HDTV
I don't believe what other people say. I have to see it for myself.
I've had a few people claim that the human eye can't distinguish anything higher than 60Hz and that setting your monitor any higher is useless. They can't see it flicker. I, on the other hand, get sick if the refresh rate is that low on a monitor. It's like looking at a strobe light.
I'm also bothered by the LED taillights on cars, since they strobe.
