CNet: 1080i and 1080p - little to no difference


spidey07

No Lifer
Aug 4, 2000
65,469
5
76
meh, I sit 8 feet away from the 65" which is still within THX and HD limits (7-8.5 feet). Looks great.

Either way, that article is pure BS and very misleading.
 

Midlander

Platinum Member
Dec 21, 2002
2,456
1
0
Originally posted by: BD2003
Originally posted by: iamaelephant
Originally posted by: 91TTZ
I'm also bothered by the LED taillights on cars, since they strobe.

:confused: err....

No, he's right. They drive me crazy. Every time I'm stuck behind a caddy the flickering annoys the hell out of me.

I can easily tell the diff between 30fps, 60fps and 90fps.

But the fact that nearly anyone's eyes are bothered by 60Hz isn't so much a problem with the frame rate as with the way CRTs scan - they don't flash whole images at once, but draw them line by line.

Anyway, if you're sitting at a reasonable distance from a moderately large TV, there is very, very little difference between 720p and 1080i/p. You have to sit absurdly close to a huge TV (5 feet from a 65") in order to see those 1080 lines. It could very well be appropriate for a projection-based home theater with a huge screen, but for a regular TV, I'd never sit that close all the time.

Unless you have eagle eyes or something.

I agree about the LED strobe effect. The most noticeable issue is when your eyes sweep side-to-side, such as looking to the left and then to the right of the car in front.

I can also pick up 60Hz flicker on CRT screens. People at work who have their computers set to that refresh rate are amazed when I tell them it needs to be changed; they can't figure out how I know. I don't know what percentage of the population can see the flicker. It can't be too high, or computers would have a higher minimum rate. :beer:
 

Slammy1

Platinum Member
Apr 8, 2003
2,112
0
76
I wouldn't say complete BS. It's written for the average consumer, but for the group here there are other considerations (gaming). Of course you won't see a difference on a smaller display sitting farther back, but think about a picture rendered at 1080p vs 720p - you would expect to see a difference (I've seen it looking at hi-res pics, for example desktop wallpapers that are the same picture at different resolutions). That would seem to indicate the problem is source-related. I also wonder about the scaler used in the testing, since to me the scaler is the limiting factor in most higher-resolution comparisons.

The calculators are designed to describe how to turn a TV into a home theater experience (distance relative to size). That's important because movies are designed to be seen that way. I've said it before: some people like to sit near the front and others at the back.

Thing is, there was some difference. Consider: why pay for audio equipment that runs 5Hz-20kHz when the range of human hearing is roughly 20Hz-17kHz? Because people report a warmer sound and thuds you can feel.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,607
6,094
136
Meanwhile, I'm stuck using an analog source feeding into my TV tuner to watch TV on my computer monitor. :Q
 

Qacer

Platinum Member
Apr 5, 2001
2,721
1
91
Personally, I have a Toshiba HDTV with the following resolution capabilities: 1080i, 720i, 720p, 480p, and 480i.

In my opinion, I do notice a difference between 1080i and 720p when watching football games. With 1080i the image looks really sharp, but the faster the action gets, the more pixelation I see in certain spots. Switching to 720p fixes the problem; the image isn't quite as sharp, but it still looks good to me.

 

archcommus

Diamond Member
Sep 14, 2003
8,115
0
76
Originally posted by: ariafrost
Meanwhile, I'm stuck using an analog source feeding into my TV tuner to watch TV on my computer monitor. :Q
Me too, and I'm fine with it.

Actually, my roommate has a 55" widescreen TV here, but since we don't get digital cable we're using an analog source with it. Eh...you get used to it.

 

YOyoYOhowsDAjello

Moderator, A/V & Home Theater, Elite member
Aug 6, 2001
31,205
45
91

jpeyton

Moderator in SFF, Notebooks, Pre-Built/Barebones
Moderator
Aug 23, 2003
25,375
142
116
Originally posted by: spidey07
meh, I sit 8 feet away from the 65" which is still within THX and HD limits (7-8.5 feet). Looks great.

Either way, that article is pure BS and very misleading.

Nope, the article just confirms what many users have reported over the years on AVSForum.

Ever wonder why, at the proper viewing distance, most people would prefer a Panasonic EDTV plasma over a competitor's HDTV plasma?

Brightness, contrast, color fidelity, response time...there are a ton of things more important than the almighty 'lines of resolution'.

I am not saying there isn't a big difference between 480p and 1080p. But when you're comparing 720p/1080i to 1080p, the gap is substantially smaller.

If someone is looking for a plasma HDTV under $2000, 1080p isn't an option. But with all HDTV broadcasts at 1080i/720p and the longevity of the DVD format, I think they'll be more than happy with a quality 720p set.

I think that's the conclusion CNet is trying to draw. Forget the marketing hype of the PS3/Blu-ray; 1080p is nice, but it's certainly not the second coming.
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
Originally posted by: spidey07
OH - more idiocy. This bozo was switching the output of the player between 1080p and 1080i FOR MOVIES!!! It's well known that you can fully reconstruct 1080p for display from a 1080i source. What an idiot.
:confused:
They were using a native 1080p Blu-ray source.
 

igowerf

Diamond Member
Jun 27, 2000
7,697
1
76
I completely agree.

I bought the Samsung S4095D (1080p LCD TV) a few months ago and returned it a month later. At 7 feet away, I couldn't really tell the difference between my 40" 1080p TV and my roommate's 42" 1024x768 plasma. To make things worse, even though the Samsung panel was really nice for an LCD (the Sony XBR2s use the same panel, I believe), it still couldn't compare to even a decent plasma. I decided that when I'm sitting 7 feet away, contrast and colors matter more than the barely noticeable difference in resolution.

I got a Samsung S4253 (1024x768 plasma) yesterday and I'm very happy with it so far.
 

smack Down

Diamond Member
Sep 10, 2005
4,507
0
0
Originally posted by: spidey07
That article is so full of misinformation and FUD it isn't funny. That article needs to be withdrawn.

The year is 2007. 1080p is the only way of the future. I'll take flicker-free true HD resolution, please.

Like this golden nugget of idiocy...
"While this isn't the most scientific test, both Katzmaier and I agreed that, after scanning through Mission: Impossible III for an hour, it would be very difficult--practically impossible--for the average consumer to tell the difference between a high-definition image displayed on a 1080p-capable TV and one with lower native resolution at the screen sizes mentioned above. At larger screen sizes, the differences might become somewhat more apparent, especially if you sit close to the screen."

No crap, you dummy. You were probably not sitting at the correct distance for HD viewing. Not to mention the differences become strikingly clear as you move to larger displays. This idiot was watching it on 42" and 50" screens and was most assuredly not sitting close enough for HD viewing.

OH - more idiocy. This bozo was switching the output of the player between 1080p and 1080i FOR MOVIES!!! It's well known that you can fully reconstruct 1080p for display from a 1080i source. What an idiot.

Shouldn't a 1080i screen flicker less than a 1080p one, since the 1080i effectively has twice the frame rate?
 

spidey07

No Lifer
Aug 4, 2000
65,469
5
76
Originally posted by: smack Down
Shouldn't a 1080i screen flicker less than a 1080p one, since the 1080i effectively has twice the frame rate?

It's the same frame rate: 30 fps. I'd still ideally want 60 fps, and when I finally get my next TV it will have that ability.

You can think of 1080i vs 1080p the same way as 480i vs 480p: 30 frames per second are displayed either way. Interlaced video uses fields - in this case 60 fields per second - to build those 30 frames per second.

The problems with interlacing are the same.
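
As an aside, here is a minimal Python sketch (hypothetical NumPy arrays, purely illustrative) of weaving two 540-line fields back into one 1080-line frame. With a film source, both fields come from the same instant in time, which is why a 1080i transmission of a movie can be reconstructed into the original 1080p frames:

import numpy as np

# Weave two 1080i fields (540 lines each) back into one 1920x1080
# progressive frame. Sixty fields per second pair up into thirty
# woven frames per second.
HEIGHT, WIDTH = 1080, 1920

def weave(top_field, bottom_field):
    frame = np.empty((HEIGHT, WIDTH), dtype=top_field.dtype)
    frame[0::2] = top_field      # even-numbered lines come from the top field
    frame[1::2] = bottom_field   # odd-numbered lines come from the bottom field
    return frame

# Two made-up luma fields standing in for one moment of video.
top = np.random.randint(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)
bottom = np.random.randint(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)

print(weave(top, bottom).shape)  # (1080, 1920): a full progressive frame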
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Originally posted by: jpeyton
Originally posted by: spidey07
meh, I sit 8 feet away from the 65" which is still within THX and HD limits (7-8.5 feet). Looks great.

Either way, that article is pure BS and very misleading.

Nope, the article just confirms what many users have reported over the years on AVSForum.

Ever wonder why, at the proper viewing distance, most people would prefer a Panasonic EDTV plasma over a competitor's HDTV plasma?

Brightness, contrast, color fidelity, response time...there are a ton of things more important than the almighty 'lines of resolution'.

I am not saying there isn't a big difference between 480p and 1080p. But when you're comparing 720p/1080i to 1080p, the gap is substantially smaller.

If someone is looking for a plasma HDTV under $2000, 1080p isn't an option. But with all HDTV broadcasts at 1080i/720p and the longevity of the DVD format, I think they'll be more than happy with a quality 720p set.

I think that's the conclusion CNet is trying to draw. Forget the marketing hype of the PS3/Blu-ray; 1080p is nice, but it's certainly not the second coming.
Exactly. But for some reason, the geeks here have a hard time understanding this.
 

smack Down

Diamond Member
Sep 10, 2005
4,507
0
0
Originally posted by: spidey07
Originally posted by: smack Down
Shouldn't a 1080i screen flicker less than a 1080p one, since the 1080i effectively has twice the frame rate?

It's the same frame rate: 30 fps. I'd still ideally want 60 fps, and when I finally get my next TV it will have that ability.

You can think of 1080i vs 1080p the same way as 480i vs 480p: 30 frames per second are displayed either way. Interlaced video uses fields - in this case 60 fields per second - to build those 30 frames per second.

The problems with interlacing are the same.

There is no question interlacing decreases flicker.
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
As Spidey said, for a 1080p24 source like a movie you will not see any difference at all, because there is no difference at all. How the hell does a tool like this become executive editor at CNET?

smack Down, you're confusing refresh rate with frame rate. These were mostly LCDs they tested, so you won't see any flickering at all regardless of the frame rate. Most modern display devices are progressive and will deinterlace any interlaced source anyway.

The limitation on modern display devices isn't displaying 1080p, but rather the amount of information that must be handled by the electronics doing the signal processing. 1080p requires twice the processing and twice the bandwidth of a 1080i signal, and that can be a problem. To give you an idea, at 1080p60 you need to process roughly 124 million pixels per second, and you need about 3 Gbps of bandwidth throughout your system to handle the uncompressed video stream.
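
As a quick back-of-the-envelope check of those numbers, here is a minimal Python sketch, assuming 1080p60 with 24 bits per pixel of uncompressed RGB (illustrative assumptions, not figures from the article):

# Rough sanity check of the pixel-rate and bandwidth figures above,
# assuming 1080p60 and 24 bits per pixel of uncompressed RGB.
WIDTH, HEIGHT, FPS, BITS_PER_PIXEL = 1920, 1080, 60, 24

pixels_per_second = WIDTH * HEIGHT * FPS                 # ~124 million pixels/s
bits_per_second = pixels_per_second * BITS_PER_PIXEL

print(f"{pixels_per_second / 1e6:.0f} Mpixels/s")        # 124 Mpixels/s
print(f"{bits_per_second / 1e9:.2f} Gbps uncompressed")  # 2.99 Gbps, roughly 3 Gbps
# 1080i60 carries half as many lines per pass, so it needs about half
# the pixel rate and half the bandwidth of 1080p60.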
 

smack Down

Diamond Member
Sep 10, 2005
4,507
0
0
Originally posted by: RaynorWolfcastle
As Spidey said, for a 1080p24 source like a movie you will not see any difference at all, because there is no difference at all. How the hell does a tool like this become executive editor at CNET?

smack Down, you're confusing refresh rate with frame rate. These were mostly LCDs they tested, so you won't see any flickering at all regardless of the frame rate. Most modern display devices are progressive and will deinterlace any interlaced source anyway.

You won't see any flicker on any TV made in at least the last 20 years, but that's beside the point; an interlaced display will have less flicker because it effectively has twice the frame rate.
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Originally posted by: smack Down
Originally posted by: RaynorWolfcastle
As Spidey said, for a 1080p24 source like a movie you will not see any difference at all, because there is no difference at all. How the hell does a tool like this become executive editor at CNET?

smack Down, you're confusing refresh rate with frame rate. These were mostly LCDs they tested, so you won't see any flickering at all regardless of the frame rate. Most modern display devices are progressive and will deinterlace any interlaced source anyway.

You won't see any flicker on any TV made in at least the last 20 years, but that's beside the point; an interlaced display will have less flicker because it effectively has twice the frame rate.

You're misunderstanding what I mean. In a CRT, there's an electron beam that has to refresh the screen once every 1/60 of a second. On a progressive display it would sweep all the pixels on the screen; on an interlaced display it would sweep only half of them per pass. The only thing that keeps you from seeing flicker is the persistence (afterglow) of the phosphors on your screen.

On LCD, DLP, or LCoS displays (I believe this also goes for plasma), there is no electron beam. The pixels are always driven, so there is no flicker at all. The frame rate (and pixel response time) only affects the fluidity of the motion on screen.
 

homercles337

Diamond Member
Dec 29, 2004
6,340
3
71
theta = arctan( height / distance )

This gives you the visual angle (theta) in radians; convert to degrees as needed. In the fovea you have roughly 60 receptors per degree, so the highest acuity you can use is reached at the distance where one pixel subtends 1 arcmin (remember, 60 arcmin per degree ;) ). Now go! Test it. At that limit, interlaced will look like crap while progressive still looks great.
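
To make that concrete, here is a minimal Python sketch of the same calculation under assumed example values (a 65-inch 16:9 screen with 1080 vertical lines); the numbers are illustrative, not from the post:

import math

# Farthest viewing distance at which one pixel still subtends ~1 arcminute,
# for an assumed 65-inch 16:9 screen with 1080 vertical lines.
DIAGONAL_IN = 65.0
ASPECT = 16 / 9
LINES = 1080

screen_height_in = DIAGONAL_IN / math.sqrt(1 + ASPECT ** 2)  # ~31.9 in
pixel_height_in = screen_height_in / LINES

one_arcmin = math.radians(1 / 60)  # 1 arcminute in radians
# Small-angle form of theta = arctan(height / distance), solved for distance.
max_distance_in = pixel_height_in / math.tan(one_arcmin)

print(f"{max_distance_in / 12:.1f} feet")  # ~8.5 feet, close to the 7-8.5 ft THX range cited earlier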
 

smack Down

Diamond Member
Sep 10, 2005
4,507
0
0
Originally posted by: RaynorWolfcastle
Originally posted by: smack Down
Originally posted by: RaynorWolfcastle
As Spidey said, for a 1080p24 source like a movie you will not see any difference at all, because there is no difference at all. How the hell does a tool like this become executive editor at CNET?

smack Down, you're confusing refresh rate with frame rate. These were mostly LCDs they tested, so you won't see any flickering at all regardless of the frame rate. Most modern display devices are progressive and will deinterlace any interlaced source anyway.


You won't see any flicker on any TV made in at least the last 20 years, but that's beside the point; an interlaced display will have less flicker because it effectively has twice the frame rate.

You're misunderstanding what I mean. In a CRT, there's an electron beam that has to refresh the screen once every 1/60 of a second. On a progressive display it would sweep all the pixels on the screen; on an interlaced display it would sweep only half of them per pass. The only thing that keeps you from seeing flicker is the persistence (afterglow) of the phosphors on your screen.

On LCD, DLP, or LCoS displays (I believe this also goes for plasma), there is no electron beam. The pixels are always driven, so there is no flicker at all. The frame rate (and pixel response time) only affects the fluidity of the motion on screen.

Yeah, and an interlaced screen would flicker less. Not that either flickers at all, but like I said, that's beside the point.
 

Fritzo

Lifer
Jan 3, 2001
41,920
2,162
126
Originally posted by: 91TTZ
Originally posted by: Fritzo
Interesting article on HDTV

I don't believe what other people say. I have to see it for myself.

I've had a few people claim that the human eye can't distinguish anything higher than 60Hz and that setting your monitor any higher is useless. They can't see it flicker. I, on the other hand, get sick if a monitor's refresh rate is that low. It's like looking at a strobe light.

I'm also bothered by the LED taillights on cars, since they strobe.

People who can't tell the difference between 60Hz and 85Hz on a tube monitor don't know what to look for. They probably go home wondering why they have eye strain every day.

LCDs, on the other hand, work fine at 60Hz because they don't constantly scan the way tube monitors do.

ANYHOOO...back to the subject :)
 

Staples

Diamond Member
Oct 28, 2001
4,953
119
106
Um yeah, you still need a 1080p TV to see the effective resolution of 1080i. The support for 1080p is just a bonus.

This article just states the obvious, but few people know enough about the subject to have drawn this conclusion on their own.