Fool-proof way of ID'ing 1080 progressive

videobruce

Golden Member
Nov 27, 2001
1,069
11
81
This is an issue with the current crop of HDTVs that can accept a computer signal through HDMI.
There doesn't seem to be a positive way to confirm whether the set is accepting an interlaced or progressive signal and displaying it as a progressive image.
I know that isn't an issue with dedicated computer monitors (LCD displays in particular), but with microdisplays (DLP, LCD & LCoS RPTVs) it is. There is one set that has been confirmed, but there are others that no one can say one way or the other.

Resolution isn't a problem, it's just the 'i' vs 'p' issue.

Is there some third-party software program (not from ATI or NVIDIA) that will positively tell whether the display is interlaced or progressive?
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
I'm not really sure what you're asking. Isn't this in the TV's specifications? Do you mean how do you tell if a station is broadcasting 1080i or 1080p? I don't think any stations are broadcasting 1080p now.

Anyhow, can't you just read the EDID info from the TV (with something like MonInfo) and have it list the possible resolution modes? The EDID is received over the I²C (DDC) bus via VGA or DVI and describes the TV's capabilities, supported timings, and gamma.
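If you'd rather poke at the EDID yourself than trust a GUI readout, here's a rough Python sketch (an illustration only, not how MonInfo works) that pulls the detailed timing descriptors, including the interlaced flag, out of a raw EDID dump. It assumes you've already exported the 128/256-byte EDID block to a file; "edid.bin" is just a made-up name for that dump.

def parse_dtds(edid: bytes):
    # Yield (width, height, refresh_hz, interlaced) from the four 18-byte
    # detailed timing descriptors in a base EDID block.
    for offset in (54, 72, 90, 108):
        dtd = edid[offset:offset + 18]
        pixel_clock = int.from_bytes(dtd[0:2], "little") * 10_000  # Hz
        if pixel_clock == 0:               # a text/display descriptor, skip it
            continue
        h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
        h_blank = dtd[3] | ((dtd[4] & 0x0F) << 8)
        v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
        v_blank = dtd[6] | ((dtd[7] & 0x0F) << 8)
        interlaced = bool(dtd[17] & 0x80)  # bit 7 of the DTD flags byte
        # Nominal rate; for interlaced DTDs this comes out as the field rate.
        refresh = pixel_clock / ((h_active + h_blank) * (v_active + v_blank))
        yield h_active, v_active, round(refresh, 1), interlaced

with open("edid.bin", "rb") as f:
    for w, h, hz, interlaced in parse_dtds(f.read()):
        print(f"{w}x{h}{'i' if interlaced else 'p'} @ {hz} Hz")

Keep in mind this only tells you what the set advertises, not what it is doing with a given signal once it accepts it.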
 

videobruce

Golden Member
Nov 27, 2001
1,069
11
81
I am referring to setting the output of the video card to 1080i or 1080p when the choices are available.
I don't know if these sets will show that. I don't have one yet.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
If you send it a 1080p signal from your PC and your TV displays it, then it does accept a 1080p signal; otherwise it obviously doesn't. You don't need anything special to tell you that.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
1080p scan rate ≈ 67.5 kHz at 60 Hz (PC-style timings come out nearer 66 kHz)
720p = 45 kHz
1080i/540p scan rate = 33.75 kHz
(the arithmetic is sketched below)
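Those figures are just the total line count per frame (active plus blanking) multiplied by the frame or field rate. A quick Python check using the nominal broadcast line totals; PC-style GTF/CVT timings use a little less blanking, which is where the numbers nearer 66 kHz come from:

def h_scan_khz(total_lines, rate_hz):
    # Horizontal scan rate = total lines per frame/field x frames or fields per second.
    return total_lines * rate_hz / 1000.0

print(h_scan_khz(1125, 60))    # 1080p60 -> 67.5 kHz (CEA raster)
print(h_scan_khz(750, 60))     # 720p60  -> 45.0 kHz
print(h_scan_khz(562.5, 60))   # 1080i60 -> 33.75 kHz (562.5 lines per field)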

AFAIK, the only display technology that supports interlaced display is CRT.

DLP, LCD & LCoS RPTVs are all progressive-scan displays. They may accept interlaced video, but they won't display it without deinterlacing/line-doubling and interpolation.
 

videobruce

Golden Member
Nov 27, 2001
1,069
11
81
If you send it a 1080p signal from your PC and your TV displays it then it does accept a 1080p signal
Maybe with computer monitors, but not necessarily with HD sets. Many have reported it appeared to work but didn't, hence the question.
AFAIK, the only display technology that supports interlaced display is CRT.
It's what is being fed into the set that I'm interested in.
 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
videobruce, you are wrong; you didn't even read what rbV5 said properly. Yes, new microdisplays "support" interlaced resolutions like 480i and 1080i, but they aren't displaying them natively. Like rbV5 said, there is additional processing that must be done before the image is displayed.
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
What he is alluding to is the new crop of 1920x1080 panels. One would think they would be able to do 1:1 pixel mapping from digital sources like PCs, but the truth is, THEY DON'T.

Take, for instance, the BenQ 37" TV (well, monitor really, since it has no tuner). Although it is advertised everywhere as being 1080p compatible and the specs say 1920x1080 resolution, DO NOT BE MISLED. That's the resolution of the panel itself. The input circuitry can only accept a maximum of 1280x768 (or so), so it is very deceptive. The internal scaler then upsizes the picture to the panel's native resolution. This has caused a big stir in the HTPC community, and people are actively relaying the message back to the manufacturers (Sharp's recent 1080p line, for instance). The only sets that are capable today are Sceptre's 37" and the Westinghouse 37"; both feature 1920x1080p panels AND 1:1 pixel mapping.
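To put rough numbers on why that matters (a trivial sketch using the figures quoted above): a 1280x768-limited input can never land 1:1 on a 1920x1080 panel, because the scaler has to stretch every source pixel across a non-integer number of panel pixels.

panel_w, panel_h = 1920, 1080
input_w, input_h = 1280, 768    # the most the input circuitry accepts, per above

print(panel_w / input_w)   # 1.5     -> each source column smeared over 1.5 panel columns
print(panel_h / input_h)   # 1.40625 -> each source row over ~1.4 panel rows, so text is never sharp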

But to answer the OP's question: NO, there is no reliable way to tell other than to plug it in and test.
 

videobruce

Golden Member
Nov 27, 2001
1,069
11
81
Although it is advertised everywhere as being 1080p compatible and the specs say 1920x1080 resolution, DO NOT BE MISLED.
This is what I am saying.
I edited the first post since it appears it was misleading. I should have said originally "accepting the signal," not just displaying it.
Hope that is better.
NO, there is no reliable way to tell other than to plug it in and test.
That doesn't really work. It isn't obvious what is going on within the set. IOW, you can't tell just by looking at it.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: videobruce
If you send it a 1080p signal from your PC and your TV displays it then it does accept a 1080p signal
Maybe with computer monitors, but not necessarily with HD sets. Many have reported it appeared to work but didn't, hence the question.
Again, if the picture appears at all when you send it a 1080p signal from your PC, then it is displaying a 1080p signal. If you send it a 1080p signal and get no picture, then obviously it doesn't support 1080p signals. There is nothing magical about TVs that makes them any different from PC monitors in that respect.
 

videobruce

Golden Member
Nov 27, 2001
1,069
11
81
I know that sounds logical, but that isn't what everyone is saying. My take is it converts to an interlaced signal and then converts it back again.
This has to do with the HDMI interface and the handshake (or something) it performs.
I don't have a set yet; I had one before that I sent back (for other reasons), and it only accepted an interlaced signal.

Here is one example, there are many other threads on 1080p through HDMI;
http://www.avsforum.com/avs-vb/showthread.php?t=621193&page=1&pp=60
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
I'm pretty sure all LCD TVs transmit EDID over DDC just like every other monitor, for PC compatibility. They list the VESA modes in the specifications of every TV. See if the mode is listed in MonInfo's enumeration. If so, and you are specifying one of the modes listed, it will definitely be in that mode. Most monitors have OSDs, so if your TV has one, the mode it is in should be listed somewhere in the OSD as well.

I do know NVIDIA's resolution handling is funny. It sometimes puts you in a virtual (scaled) resolution when you didn't intend it to. It is weird sometimes, so I know what you mean.

Note, you cannot transmit 1920x1080 at higher than roughly 70 Hz due to single-link DVI limitations, and that also applies to single-link HDMI (is there even a dual-link HDMI?) because HDMI is just DVI with audio, AFAIK, hence why DVI-to-HDMI adapters are available.
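For reference, that ceiling falls out of the 165 MHz single-link TMDS clock limit. A quick Python estimate, assuming the standard CEA 1080p raster of 2200x1125 total pixels including blanking:

TMDS_LIMIT_HZ = 165e6                    # single-link DVI/HDMI pixel clock cap
h_total, v_total = 2200, 1125            # CEA 1080p raster, active + blanking

print(TMDS_LIMIT_HZ / (h_total * v_total))   # ~66.7 Hz max with full blanking
print(h_total * v_total * 60 / 1e6)          # 1080p60 needs 148.5 MHz, which fits
# Reduced-blanking timings shrink the totals and push the ceiling to roughly
# 70 Hz, which is where the higher figures people quote come from.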
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: videobruce
I know that sounds logical, but that isn't what everyone is saying. My take is it converts to an interlaced signal and then converts it back again.
This has to do with the HDMI interface and the handshake (or something) it performs.
I don't have a set yet; I had one before that I sent back (for other reasons), and it only accepted an interlaced signal.

Here is one example, there are many other threads on 1080p through HDMI;
http://www.avsforum.com/avs-vb/showthread.php?p=6854454#post6854454

As I stated earlier, you will not know how a manufacturer implements its scaling / pixel mapping until you connect to the unit itself. I was also looking forward to that JVC unit, only to be disappointed (yet again) by the advertised 1080p nomenclature. There is no "default" each set takes in initially; it is purely dependent on the input device to the monitor itself. It is this way in order to accept multiple sources like standard 480i, present HDTV broadcasts in 720p/1080i, and newer 1080p formats.

 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: videobruce
I know that sounds logical, but that isn't what everyone is saying.
Some people say all sorts of crazy things, but that doesn't change how things actually work.

Originally posted by: videobruce
My take is it converts to an interlaced signal and then converts it back again.
This has to do with the HDMI interface and the handshake (or something) it performs.
How in the world did you come up with that?


Originally posted by: videobruce
I don't have a set yet; I had one before that I sent back (for other reasons), and it only accepted an interlaced signal.

Here is one example, there are many other threads on 1080p through HDMI;
http://www.avsforum.com/avs-vb/showthread.php?p=6854454#post6854454
At least in the part of the thread that you linked to, they are just talking about overscan issues. That is a whole separate issue, and good picture position/size control is handy for using your TV as a monitor no matter what resolution you are trying to run.

 

videobruce

Golden Member
Nov 27, 2001
1,069
11
81
How in the world did you come up with that?
Well, if you send it out as progressive and the HDMI input doesn't accept it as that, it gets changed back to interlaced and then converted back to progressive for the actual display.
I don't know how else to say what is or isn't happening. All I know is that no one is sure what is actually being processed.

I corrected the link to show the whole thread.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: videobruce
How in the world did you come up with that?
Well, if you send it out as progressive and the HDMI input doesn't accept it as that, it gets changed back to interlaced and then converted back to progressive for the actual display.
I don't know how else to say what is or isn't happening. All I know is that no one is sure what is actually being processed.

I corrected the link to show the whole thread.

HDMI allows for two-way communication between the video source component and the display component. With an enabled system, the output from the player and the display settings are adjusted automatically for the content, without user intervention, as long as the component manufacturer(s) have implemented the technology. The "system" consists of the output, display, and remote components.

I imagine it's hard to say exactly which components are doing what without the manufacturer stating how they've implemented the tech.
 

videobruce

Golden Member
Nov 27, 2001
1,069
11
81
The program MonitorInfo just shows what the monitor is capable of, not what it is doing at the time, correct?
 

GhostDoggy

Senior member
Dec 9, 2005
208
0
0
Originally posted by: videobruce
The program MonitorInfo just shows what the monitor is capable of, not what it is doing at the time, correct?

OK, so you are really concerned with how the display is actually handling a given video signal between the time it 'accepts' it and the time it 'displays' it, right? No biggie here. Properly deinterlacing a 1080i (1920x1080 interlaced) signal without throwing away any of the information contained in the fields of each frame takes considerable processing power. Read that comment to mean $$$. For instance, unless you use a $5-10K video processor, or the display comes with that ability internally (like a $25K Sony Qualia 004), the deinterlacing can be terrible.

How terrible? Well, imagine taking the 1080i on the input. It contains two fields in each frame. Throw away half the fields (let's say all of the even-line fields) and you are left with 540 lines per frame. Now double (e.g., with a line doubler) the 540p you just produced from the 1080i and you get "1080p." No, this ain't good, but it's done in a lot of very inexpensive processors, deinterlacers, and displays.
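In code, that throw-away-a-field-and-line-double shortcut is about this simple. A toy NumPy sketch, just to show where the detail goes (not how any particular set implements it):

import numpy as np

def crude_1080i_to_1080p(frame):
    # Toy version of the cheap deinterlace described above: drop one field
    # entirely, then line-double the surviving 540 lines back up to 1080.
    # `frame` is a (1080, 1920) array holding both interleaved fields.
    odd_field = frame[0::2]                 # keep 540 lines, discard the rest
    return np.repeat(odd_field, 2, axis=0)  # 1080 lines again, but only 540
                                            # lines of real vertical detail

frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
print(crude_1080i_to_1080p(frame).shape)    # (1080, 1920)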

Now, some displays will accept a 1080p signal on their input and completely bypass the deinterlacer. And if the native resolution is 1080p, they will not even have to scale the 1080p signal. But not a lot of displays are 1080p (there are some, like the Sony Qualia 006, their new Ruby projector, and even the Westinghouse 37" LCD flat panel, etc.).

Keep in mind, though, that if you are 1.5x the screen width away from your 16:9 displayed image, your 20/20 eyes with good visual acuity actually, if only slightly, out-resolve a 1080p image. But once you start to place your eyes further away from the displayed image (say 2-4 screen widths away), your ability to discern and resolve the 1080p resolution of the displayed image diminishes.
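Rough numbers behind that: 20/20 vision resolves on the order of one arcminute, and the angle one pixel of a 1920-wide image subtends depends on how many screen widths away you sit. A quick Python check:

import math

def pixel_arcmin(screen_widths_away, h_pixels=1920):
    # Angle subtended by one pixel of a 16:9 image, with viewing distance
    # expressed in multiples of the screen width.
    return math.degrees(math.atan((1.0 / h_pixels) / screen_widths_away)) * 60

print(round(pixel_arcmin(1.5), 2))   # ~1.19 arcmin -> still resolvable at 1.5 widths
print(round(pixel_arcmin(3.0), 2))   # ~0.60 arcmin -> below the ~1 arcmin limit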
 

videobruce

Golden Member
Nov 27, 2001
1,069
11
81
Your last paragraph is saying what others are asking: is it really worth it?
It's the 1080i vs. 1080p debate; what's the hype?
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: videobruce
How in the world did you come up with that?
Well, if you send it out as progressive and the HDMI input doesn't accept it as that, it gets changed back to interlaced and then converted back to progressive for the actual display.
I don't know how else to say what is or isn't happening. All I know is that no one is sure what is actually being processed.

I corrected the link to show the whole thread.
But that doesn't make any sense to me, as a progressive-scan display doesn't have any reason to have hardware in it to interlace anything. There is a good reason for deinterlacing hardware in the displays, but absolutely no point in going the other way around.

Anyway, your link still goes to the middle of page 3, so I just read the whole thread, but I still have no clue what you were trying to link me to. I did learn from the thread that some video drivers by default will automatically switch to 1080i when they note that 1080p isn't being displayed, but it seems that is easy enough to monitor in PowerStrip. Is that what you were missing here?
 

videobruce

Golden Member
Nov 27, 2001
1,069
11
81
your link still goes to the middle of page 3
Sorry 'bout that, I was on the top of the 2nd page; I thought it was the first page. Corrected (again).
that some video drivers by default will automatically switch to 1080i when they note that 1080p isn't being displayed
That's the problem, what/where/who do you believe??
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Well, one guy in that thread you linked to mentioned that PowerStrip detects the actual output signal, which shows when the drivers switch. Do you have any reason not to believe that?
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
But input on what? I mean, I set my PC to 1920x1080 at 60 Hz, PowerStrip says 60 Hz vertical and 67 kHz horizontal, and my plasma reports the same. Granted, it is getting downsampled on my TV, but on a native 1080p display the only thing that would get in the way of true 1080p is a little overscan correction. Most TVs don't take 1080p in, or only do so on certain inputs, but with those that claim to, and even some that don't, the one sure test is to try it, and the results should be obvious when you do.
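For what it's worth, on Windows you can also ask the OS directly what mode the card is outputting right now, interlace flag included, without PowerStrip. A small sketch, assuming pywin32 is installed (the DM_INTERLACED bit value is from wingdi.h):

import win32api, win32con

DM_INTERLACED = 0x00000002   # DisplayFlags bit meaning the mode is interlaced

# Query the current mode of the primary display device.
mode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
scan = "interlaced" if mode.DisplayFlags & DM_INTERLACED else "progressive"
print(f"{mode.PelsWidth}x{mode.PelsHeight} @ {mode.DisplayFrequency} Hz, {scan}")

Like the EDID, this only tells you what the video card is sending; what the set's input stage does with it afterwards is still up to the manufacturer.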
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: videobruce
No, just wanted more input from advanced users.


videobruce, what is your main concern here? Again, you cannot tell a set's capability just by looking at its panel specs. Yes, HDMI/HDCP capability allows for handshaking, but presently no manufacturer implements it. Even if it were in place, you could not just go down to your local Best Buy, pick up an advertised "1080p" set, and get a true 1920x1080p display. It is really the manufacturer's scaler/input circuitry that decides that.