DVI resolution

Da Grump

Junior Member
Nov 26, 2007
5
0
0
I'm using an MSI NX8500GT video card in an HTPC setup connected to a Toshiba 1080p LCD TV. My issue is that I can't set the card up to pass 1080p: the Nvidia control panel only offers 480p, 720p, and 1080i. I called MSI tech support and they told me DVI won't pass 1080p resolution. Is this right? If not, does anyone know how to do it? Thanks.
 

saiga6360

Member
Mar 27, 2007
61
0
0
Yes, DVI can go even higher than 1080p on larger LCD monitors. What resolution does your TV report when it's displaying the output from your PC? If you have a Regza, just press the Menu button on the remote and then Exit.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Sounds more like a TV issue. If the TV's DDC info doesn't report 1920x1080@60Hz support, the drivers won't list it as an option. Though if you are running XP, you should be able to force 1080p with PowerStrip.
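
If you're curious what's actually in there: the driver builds its mode list from the EDID block the TV returns over DDC. Just as an illustration (my own sketch, not anything the Nvidia driver exposes), this is roughly how the first 18-byte detailed timing descriptor decodes, per the VESA EDID layout:

# Sketch only: decode the first detailed timing descriptor (DTD) of a
# 128-byte EDID block. 'edid' is assumed to be the raw bytes read over DDC.
def first_detailed_timing(edid):
    d = edid[54:72]                       # first 18-byte descriptor
    pclk_khz = (d[0] | (d[1] << 8)) * 10  # pixel clock stored in 10 kHz units
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    h_blank  = d[3] | ((d[4] & 0x0F) << 8)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    v_blank  = d[6] | ((d[7] & 0x0F) << 8)
    interlaced = bool(d[17] & 0x80)       # for interlaced modes the vertical
                                          # counts are stored per field
    refresh = pclk_khz * 1000 / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, refresh, interlaced

If none of the detailed timings (or the extension block) advertises 1920x1080 progressive at 60Hz, the driver simply won't offer it.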
 

Da Grump

Junior Member
Nov 26, 2007
5
0
0
I am running XP, so I'll download PowerStrip and give it a shot.

Is there a way to determine what DDC info is being reported to the PC?
 

PCTC2

Diamond Member
Feb 18, 2007
3,892
33
91
Originally posted by: saiga6360
Yes, DVI can go even higher than 1080p on larger LCD monitors. What does your TV say when you display the output from your PC? If you have a Regza, just press the Menu button on the remote then Exit.

Yeah, DVI can support 1600p (e.g. the Gateway 30" LCD). Cool stuff.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Da Grump
I am running XP, so I'll download Powerstrip and give it a shot.

Is there a way to determine what DDC info is being reported to the PC?
I think this should do the trick, though I've never actually tried it with a digital connection, just analog.

Originally posted by: PCTC2
yeah. DVI can support 1600p (Gateway 30" LCD). Cool stuff.
That is only with dual-link DVI, which I highly doubt his TV supports. Single-link is fine for 1080p though.
 

themisfit610

Golden Member
Apr 16, 2006
1,352
2
81
Definitely. 1920x1200 is the standard resolution for 24" computer monitors, and those run off single-link DVI (by using reduced vertical blanking).
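
The math is easy enough to check: the required pixel clock is just total pixels (active plus blanking) times refresh rate, and single-link DVI tops out at a 165 MHz pixel clock. A quick sketch using the usual CEA/CVT timing totals (approximate figures):

# Required pixel clock = htotal * vtotal * refresh; single-link DVI caps
# out at 165 MHz, dual-link doubles that.
SINGLE_LINK_MHZ = 165
modes = {
    # name: (htotal, vtotal, refresh) using standard CEA/CVT totals
    "1920x1080@60 (CEA)":    (2200, 1125, 60),  # 148.5 MHz
    "1920x1200@60 (CVT-RB)": (2080, 1235, 60),  # ~154 MHz, fits thanks to
                                                # reduced blanking
    "2560x1600@60 (CVT-RB)": (2720, 1646, 60),  # ~269 MHz, dual-link only
}
for name, (ht, vt, hz) in modes.items():
    mhz = ht * vt * hz / 1e6
    verdict = "single-link OK" if mhz <= SINGLE_LINK_MHZ else "needs dual-link"
    print(f"{name}: {mhz:.1f} MHz -> {verdict}")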
 

saiga6360

Member
Mar 27, 2007
61
0
0
Yeah, it should show as 1920 by 1080 pixels with 32-bit color at 60Hz in Display Properties. On my HTPC, although it only shows as a Plug and Play monitor in Display Properties, Catalyst Control Center sees the TV as TSB-TV. I can also force it to show 1080i (1920 by 1080 at 30Hz) as an option. I doubt it is a TV issue if your TV can display 1080p from other sources (an HD player or upscaling DVD player). Is it a DVI port on the TV, or are you doing DVI-to-HDMI via an adapter?
 

Da Grump

Junior Member
Nov 26, 2007
5
0
0
Originally posted by: saiga6360
I doubt it is a TV issue if your tv can output 1080p from other sources (HD player or upscaling DVD player). Is it a DVI port on the TV or are you doing DVI-HDMI via adapter?
I don't have another source yet, so I can't confirm that the TV accepts 1080p (although the data below suggests it doesn't). I'm connecting to one of the HDMI ports on the TV via an HDMI cable and a DVI-to-HDMI adapter on the card.


Here is an excerpt of the data reported by moninfo.exe (thanks, Snowman):

Timing characteristics
VESA GTF support............ Not supported
Horizontal scan range....... 15-46kHz
Vertical scan range......... 59-61Hz
Video bandwidth............. 80MHz
Extension blocks............ 1
Timing recommendation #1.... 1920x540 at 60Hz
Modeline................ "1920x540" 74.250 1920 2008 2052 2200 540 542 547 562 +hsync +vsync
Timing recommendation #2.... 1920x1080 at 30Hz
Modeline................ "1920x1080" 74.250 1920 2008 2052 2200 1080 1084 1094 1124 interlace +hsync +vsync
Timing recommendation #3.... 720x480 at 60Hz
Modeline................ "720x480" 27.000 720 736 798 858 480 489 495 525 -hsync -vsync

Standard timings supported
720 x 480 at 60Hz - Toshiba
1920 x 540 at 60Hz - Toshiba
1920 x 1080 at 30Hz - Toshiba

If I'm interpreting this correctly, the TV is reporting that it supports only 480p, 540p, and 1080i. Is that right?
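
Working the numbers seems to bear that out: refresh rate is just the pixel clock divided by total pixels per frame. A quick throwaway Python check, with the numbers copied from the modelines above:

# refresh = pclk / (htotal * vtotal), numbers from the modelines above
modelines = {
    "1920x540":  (74.25e6, 2200, 562),   # -> ~60Hz (540p)
    "1920x1080": (74.25e6, 2200, 1124),  # interlaced -> ~30 frames
                                         # (60 fields) per second, i.e. 1080i
    "720x480":   (27.00e6, 858, 525),    # -> ~60Hz (480p)
}
for name, (pclk, htotal, vtotal) in modelines.items():
    print(f"{name}: {pclk / (htotal * vtotal):.2f} Hz")

That works out to about 60Hz, 30Hz, and 60Hz respectively, which lines up with 480p, 540p, and 1080i.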

This is the TV I have:
http://www.tacp.toshiba.com/te...duct.asp?model=42hl196

I would like suggestions on how to proceed from this point. Thanks.
 

saiga6360

Member
Mar 27, 2007
61
0
0
Actually, I am not entirely sure about that model. CNET says it is a 1080i set, but it is being advertised by Toshiba and some resellers as 1080p. I will have to take your observations and a user opinion on CNET as confirmation that this is indeed a 1080i-only TV. I wish I could find the user manual for it.

Honestly though, at 42 inches 1080i is good enough for regular HD programming, and 1080p won't be much of an upgrade. For PC use, though, I imagine 30Hz can easily cause eye strain. Try setting it to 720p.

EDIT: After checking the AVS forums, this set is not 1080p in the true sense of accepting 1080p input; instead it takes a 1080i signal and upconverts it to 1080p. So essentially you have to force a 1080i signal out of your PC and let the TV do its funky conversion. Head over to www.avsforum.com for more details.
 

saiga6360

Member
Mar 27, 2007
61
0
0
As I expected, that manual wasn't so helpful in this case. I had to rely on the AVS forums for an answer to this.

Short answer is, your entire setup is working just fine, and it actually is a 1080p picture that you are seeing on the TV, although it was upconverted by the TV's built-in chip.

Your video card is smart enough to know that your TV can only accept up to 1080i signals, which is why your options are limited in the Nvidia control panel. To add to the confusion, your TV's on-screen display will not tell you that the picture is 1080p, because it can only tell you about the incoming signal, which is 1080i coming from your video card.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Last I checked, you could count the number of TVs that can do 1080p on one hand... I very VERY seriously doubt your TV can do 1080p... most likely only 720p or 1080i.
 

Throckmorton

Lifer
Aug 23, 2007
16,829
3
0
Originally posted by: saiga6360
You will have to get the bigger sets to fully appreciate 1080p, or at the very least notice it.

You can tell the difference between 1080i and 1080p on any screen
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: saiga6360
So essentially you have to force out a 1080i signal from your PC to let the TV do its funky conversion.
The "funky conversion" is just deinterlacing, it's what every progressive scan display has to do for interlaced content. That just means each pair of interlaced fields are combined into the progressive frames they were created from. The only drawback with using an interlaced resolution is that they are limited to 30fps, but unless you were hoping to game with better framerate at 1080p, you aren't missing anything by using 1080i.
 

saiga6360

Member
Mar 27, 2007
61
0
0
Originally posted by: Throckmorton
Originally posted by: saiga6360
You will have to get the bigger sets to fully appreciate 1080p, or at the very least notice it.

You can tell the difference between 1080i and 1080p on any screen

That would be nice if true. I would like to see this for myself, but that is what I have heard from some sites. I am working towards acquiring a dual-format HD drive, so hopefully it will be soon. :)
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
TheSnowman and saiga6360 are right... you can't tell the difference and there is no real issue...

Basically the only real difference between 1080i and 1080p is that i runs at 30fps while p runs at 60fps... there is no noticeable difference unless you are playing a fast-paced computer game. Even then your chances of noticing are slim (because it is constant, not a shifting variable).
 

themisfit610

Golden Member
Apr 16, 2006
1,352
2
81
That is VERY incorrect.

The difference between 1080i and 1080p is quite fundamental. 1080i is an interlaced display mode, which means that the screen is drawn in alternating "fields" - where each field is the odd or even lines of the screen. This is done ~60 times per second.

Interlaced scanning is very common - and is standard on NTSC and PAL TV.

The technical reasons behind it are complex, but basically it lets you get 60 pictures per second (59.94 actually), while only "really" using 30 fps of bandwidth. The sacrifice is that in high motion, you don't get as much spatial resolution.

Progressive scanning (1080p) is what we're all accustomed to on PC monitors. The display is refreshed one line at a time, which gives full spatial resolution at all times. 1080p60 is very intense to render and deliver, but most (video) content is actually 1080p24, since films are shot at 24fps.

Interlaced video loses perceived spatial resolution in high motion because each field is 540 lines, so in a worst-case scenario (LOTS of movement) you only get 540 lines of resolution for each picture you see. In low motion, the fields blend together nicely, and you can "see" the full 1080 lines, since your eyes are tricked by persistence of vision.

It's very important to realize that interlaced scanning only properly works on interlaced displays! PC monitors, LCDs, DLPs, etc. ARE NOT interlaced; they are inherently progressive. The only "true" 1080i displays are CRTs. All other displays must deinterlace 1080i to 1080p. This is a very difficult process to do well, and in the worst-case scenario it can mean literally throwing away half the fields and scaling 540 lines up to 1080. The quality of deinterlacers varies tremendously, and in the best case (with a dedicated hardware processor) you can get great results. Most of the time it's best to just stick with progressive-scan output to a progressive-scan display. That's why 720p > 1080i on most displays.
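
That worst case is the classic "bob" shortcut. Sketched in the same numpy style as the weave example earlier in the thread (illustration only), it amounts to:

import numpy as np

def bob(field):
    # Worst-case deinterlace: keep only one 540-line field and line-double
    # it to 1080 lines, throwing the other field away entirely.
    return np.repeat(field, 2, axis=0)

field = np.zeros((540, 1920), dtype=np.uint8)
assert bob(field).shape == (1080, 1920)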

A lot of TVs are actually 1080p these days. If you see an LCD or other flat-panel TV advertising 1080i, it's likely a 720p panel that takes 1080i, does a questionable job of deinterlacing it (maybe even to 540p, as I said), and then scales it to 720p.

So, long-winded video-nerd technical discussion aside: your best bet when connecting your TV to your PC is to determine the true native resolution of your TV, and then output that resolution, or the next best thing.

If your panel is native 720p (not exactly 1280x720, but something very close to it; I forget the funky standard), then output 720p! If your panel is 1080p, then output 1080p! Don't output interlaced video from your PC unless you're going to an old CRT HDTV or a standard NTSC or PAL CRT TV. In those cases, interlaced is the way to go!

[edit]
After closely reading the thread, I see that this TV probably has a 1080p panel, but doesn't properly take 1080p60 in through HDMI. In this case, it's all up to the deinterlacer in this TV. If it's got good deinterlacing, then 1080i will end up looking better than 720p. If it's a crappy deinterlacer, then 720p will be a better bet.
[/edit]

~MiSfit