True HD from DVI?

TBSN

Senior member
Nov 12, 2006
925
0
76
Simple question: can a high-def signal (720p or 1080p) be sent through a DVI cable (the screen res would have to be high enough, obviously...), or is it necessary to use HDMI connectors?

I know that there are many DVD players on the market that can do SD to HD upconversion, which, although a big step down from a high-definition source, does look a lot better than simply watching the DVD in SD (once again, you need an HD TV to watch it...)

So I was wondering if there was some way to watch DVDs on a computer and do the upconversion on the fly like the expensive DVD players do, and also whether or not DVI can carry an HD signal...

Thanks, and let me know if my question is confusing...
 

40sTheme

Golden Member
Sep 24, 2006
1,607
0
0
Well, if the movie is on HD-DVD or Blu-ray (so it supports 1080p) and your monitor supports 1920x1080, then it can run in high def through DVI as long as it's playing through a computer; otherwise you would need HDMI. I think that's right, someone correct me if I'm wrong.
 

TBSN

Senior member
Nov 12, 2006
925
0
76
OK, thanks for answering that question. So, do DVD-playing programs automatically upconvert the SD video from the DVD to HD when they are playing in full screen, or would you have to do that manually and then play the movie?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
HDMI/DVI, VGA and Component Video can all carry a 1080p signal. Whether you get a 1080p picture or not depends on the output source and the input limitation on the display (some mfgs skimp on input components and only support 1080i).

Like Matt2 said, DVI and HDMI carry the same digital video signal and are both HDCP compliant. You can view HDCP content from an HDMI source to a DVI input using an HDMI > DVI converter.

From my limited experience, I'd say an upconverted 1080p signal would look better than a software DVD player. There's a lot of wasted space on the software DVD players and the only way I've been able to come close to using the full screen size is by using Pan and Scan which just zooms and crops the image. Pretty sure an upscaled DVD player would "blow-up" the image and then fit it to a 1080p mapped area, though I'm not positive. That seems to be the case though with HD Cable, since I doubt some of the older movies have Blu-Ray/HD-DVD versions available.
 

TBSN

Senior member
Nov 12, 2006
925
0
76
Originally posted by: chizow
HDMI/DVI, VGA and Component Video can all carry a 1080p signal. Whether you get a 1080p picture or not depends on the output source and the input limitation on the display (some mfgs skimp on input components and only support 1080i).

Like Matt2 said, DVI and HDMI carry the same digital video signal and are both HDCP compliant. You can view HDCP content from an HDMI source to a DVI input using an HDMI > DVI converter.

From my limited experience, I'd say an upconverted 1080p signal would look better than a software DVD player. There's a lot of wasted space on the software DVD players and the only way I've been able to come close to using the full screen size is by using Pan and Scan which just zooms and crops the image. Pretty sure an upscaled DVD player would "blow-up" the image and then fit it to a 1080p mapped area, though I'm not positive. That seems to be the case though with HD Cable, since I doubt some of the older movies have Blu-Ray/HD-DVD versions available.


The DVD players that upconvert to HD do it by converting the SD signal to HD themselves, rather than having the TV scale the SD signal to cover the full HD resolution of the screen.

As far as networks playing HD movies go, I'm sure they have access to the HD materials before they are available to the public in the form of "Blu-Ray" or "HD-DVD."

I just don't know how a computer handles playing DVDs, or how the software outputs the video to the monitor. Perhaps when you hit "full-screen" on the player, and you have an HD-capable monitor, it will just upconvert automatically, although I think it is more likely that it would just blow up the image (scaling down the resolution of the monitor...)
 

TBSN

Senior member
Nov 12, 2006
925
0
76
so....

Does anyone use their 1600x1200 monitor to watch HD movies/TV/whatever? Does anyone know how a computer normally handles DVD playback?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: TBSN
so....

Does anyone use their 1600x1200 monitor to watch HD movies/TV/whatever? Does anyone know how a computer normally handles DVD playback?

It's totally up to the playback software/codecs. SD DVD sources are 720x480(i); the PC will usually deinterlace them (since PC video outputs are all progressive-scan these days), and may also scale the video to match the desired output resolution.
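
To see the scaling step in concrete terms, here is a rough Python sketch of the math a player has to do to fit a 720x480 DVD frame onto a monitor. It is only an illustration of the idea, not any particular player's actual code; the function name and numbers are made up for the example.

def fit_to_screen(screen_w, screen_h, dar_w, dar_h):
    # Largest rectangle with the source's display aspect ratio
    # (dar_w:dar_h) that fits inside a screen_w x screen_h monitor.
    if screen_w * dar_h >= screen_h * dar_w:
        # Screen is wider than the video: pillarbox (bars left/right).
        return (screen_h * dar_w // dar_h, screen_h)
    else:
        # Screen is narrower than the video: letterbox (bars top/bottom).
        return (screen_w, screen_w * dar_h // dar_w)

# A 16:9 anamorphic DVD (stored as 720x480) on a 1600x1200 monitor:
print(fit_to_screen(1600, 1200, 16, 9))   # -> (1600, 900), bars top/bottom
# The same disc on a 1920x1080 screen fills it exactly:
print(fit_to_screen(1920, 1080, 16, 9))   # -> (1920, 1080)

The deinterlacing and the actual pixel resampling happen before this step, but the point is that the playback software (not the monitor) picks the target rectangle, which is essentially what an upconverting DVD player does in hardware.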

If you use something like ffdshow you can have a tremendous amount of control over the whole process. You can get results that are frequently better than most upconverting DVD players or standalone conversion boxes if you have enough CPU horsepower.
 

TBSN

Senior member
Nov 12, 2006
925
0
76
OK, thanks Matthias99. Now the post linking to the AVSforum makes more sense (I still don't know how to set all that stuff up, but w/e). I have another question, though:

How do computer monitors display progressive frame rates like 24 or 25 fps when they refresh at the standard 60 Hz? For example, when a video game is running at a really low frame rate, like 10-20 fps, the actual refresh rate of the monitor doesn't change to accommodate that, or you would notice the screen begin to flicker like an old projector running at 18 fps.

If anyone could link me to an article or site that can explain this stuff, it would be greatly appreciated!
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
How do computer monitors display progressive frame rates like 24 or 25 fps when they refresh at the standard 60 Hz? For example, when a video game is running at a really low frame rate, like 10-20 fps, the actual refresh rate of the monitor doesn't change to accommodate that, or you would notice the screen begin to flicker like an old projector running at 18 fps.

The video card just outputs whatever is in its framebuffer to the output device 60 times per second (or however many times per second it is set to update).

The game/application/OS can update the contents of the framebuffer more or less frequently if it desires, or can 'sync' to the video card's update cycles (a feature known as 'vsync') to avoid flickering/tearing artifacts. Double or triple buffering is usually required in this case.
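
To make the 24-fps-on-60-Hz case concrete, here is a toy Python sketch of the idea (purely illustrative, not how any actual driver is written): the scanout runs at a fixed 60 Hz and simply shows whichever source frame was last completed.

REFRESH_HZ = 60
SOURCE_FPS = 24   # film content; try 10 or 25 as well

for tick in range(12):                # 12 refreshes = 0.2 seconds
    t = tick / REFRESH_HZ             # wall-clock time of this refresh
    frame = int(t * SOURCE_FPS)       # last source frame that was finished
    print(f"refresh {tick:2d}: shows source frame {frame}")

# With 24 fps input you get the familiar 3:2 cadence (each source frame is
# scanned out two or three times in a row), so the screen never goes dark
# between frames the way a film projector's shutter does. The same thing
# happens when a game only manages 10-20 fps: the last finished frame just
# gets shown again on the next refresh.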

Howstuffworks has some basic articles on computers (the details are somewhat dated now, but the concepts are still valid). I'm not sure if they have one on video cards and monitors.