Is it possible to get High Definition on the monitor?


rjain

Golden Member
May 1, 2003
1,475
0
0
Some video cards don't have a DVI port. VGA connections are still included for backwards compatibility.
 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
Originally posted by: Dug
Is it possible to get High Definition on the monitor?
I don't believe there are many monitors that can display a true pixel-perfect 1920x1080i image. Maybe the 24" Sonys or something like that. In fact, I'm not aware of any monitor being able to display an interlaced image.
Almost all TVs, projectors, and monitors distort the image in some way because of their physical limitations.
Will the image still look good? Of course it will, because the signal is so good to start with.

It's not so much a matter of monitors not being able to display a 1080i image as it is that it's not desirable to do so.
Anyway, just display it as 1080p rather than 1080i and it will look better than any TV.
 

bob4432

Lifer
Sep 6, 2003
11,727
46
91
Originally posted by: Solodays
Ever watch a DVD on your computer? It will beat any HDTV I have seen up to $12,000.

Which monitor are you referring to? My CRT does indeed kick as$. If you're saying my CRT looks better than a $12k HDTV, then I would find that hard to believe.

I've watched a lot of DVDs on my monitor; that's why I want a digital monitor.

just about any new monitor today



 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
what are you talking about?
ALL MONITORS ARE HIGH DEFINITION.
High definition = high resolution, like 1280x720 (720p) or 1920x1080 (1080i)

just about any 19"+ monitor will do 1920x1080...
hell my old 1992 14" monitor will do 1920x1080...

Terminator 2 Extreme DVD on my 21" monitor @ 1920x1080 just rocks :D even though I haven't watched the whole thing, just a short one-minute sample from T2's website :p
 

Cartman2003

Platinum Member
May 7, 2003
2,717
0
0
I think you're confused, dude. DVI has nothing to do with HDTV. See, there are two connections from a monitor to a PC: VGA (analog) and DVI (digital). Right now I don't see that much difference (it really depends on which monitor you use). From what I have seen, DVI is only available on LCD monitors and not CRTs. If you want to watch HDTV on your PC, a digital monitor will do nothing; it won't affect it. In order to get HDTV you need a tuner card (typically 300-500 bucks). Then you can watch HDTV on your monitor (digital or analog).
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
If your monitor will do 1280x1024, then it'll most likely do 1920x1080 as well, provided it supports multiscan, which almost all do.

My 14" monitor is pretty incredible and top of the line for what it is... it will do 75 kHz, which allows it to go up to 1600x1200.
 

Solodays

Senior member
Jun 26, 2003
853
0
0
Originally posted by: Cartman2003
In order to get HDTV you need a tuner card (typically 300-500 bucks). Then you can watch HDTV on your monitor (digital or analog).



$300 video card? I can get a GeForce MX 440 for $50 with a DVI output.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Actually he's right. Most will

Really, I've never used one that would do better than 800x600, let alone back in 1992. Link one that'll do 1920x1080 just for kicks:)
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
click on the link to PICS OF MY COMP in my sig... and you'll see a pic of one that does 1600x1200
 

arod

Diamond Member
Sep 26, 2000
4,236
0
76
Originally posted by: Solodays
Originally posted by: Cartman2003
In order to get HDTV you need a tuner card (typically 300-500 bucks). Then you can watch HDTV on your monitor (digital or analog).



$300 video card? I can get a geforce mx 440 for $50 with a DVI output.


You can't use a regular video card; you need a tuner card like you do with regular cable TV. I have a Fusion II by DVICO (costs 200 bucks). You need a pretty decent computer though, and of course HDTV signals in your area.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
I've got a couple of old 14" and 15" monitors, both newer than 1992. If most 14" monitors back in 1992 could support 1920x1080, show me how it's done. I have an HDTV card and I'd love to give it a shot. I'm not sure what those pics are supposed to show me :)
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
It's supposed to show you a pic of the monitor, which is a 1992 14" monitor... and you're supposed to take my word that it does 1920x1080...

In fact, it can do it at 75Hz...

In order to do 1920x1080, you just need a 14" monitor that can do 64 kHz...
Most 14" monitors will do 64 kHz, most higher-class 14" monitors in 1992 would easily do 64 kHz... and a few very high-end 14" monitors in 1992, like the one I'm using, can do 75 kHz...
My 14" monitor cost $1500 back in 1992. It is actually an aperture grille monitor.

Just try it... if your monitor can do 1280x1024 @ 60Hz, most likely it'll do 1920x1080...
Try 1280x1024 first; if that works, then I almost guarantee you that 1920x1080 will work, given that your monitor is a multiscan.
 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
Originally posted by: rbV5
I've got a couple of old 14" and 15" monitors, both newer than 1992. If most 14" monitors back in 1992 could support 1920x1080, show me how it's done. I have an HDTV card and I'd love to give it a shot. I'm not sure what those pics are supposed to show me :)

Check the specs on your 14" monitor; specifically, look for the horizontal frequency range. If it's above 66 kHz, you can do 1920x1080 non-interlaced @ 60Hz (uncommon on a 14", but even the cheapest 17" will do this).

To do 1920x1080i at 30/60 like a TV, you need to be able to sync to about 33 kHz. ANY monitor will do this, even a 12"; in fact, it's near the low end of most monitors' range. It's just not one of the default modes Windows supports, so you have to add it with a third-party tool like PowerStrip.

Step 1: Download PowerStrip and install it.
Step 2: Right-click the PowerStrip system tray icon and go to Display Properties->Configure on the popup menu.
Step 3: Hit "Advanced Timing Options".
Step 4: Hit "Custom Resolutions".
Step 5: Select 1920x1080i from the predefined list; you don't even need to make a user-defined resolution for this one. However, I'd recommend raising the horizontal refresh rate value to only 1 number lower than your monitor claims to support; the vertical refresh will update automatically when you switch to another input field.
Step 6: Hit "Add new resolution". Chances are, since 1920x1080 is already in most video drivers, just not as interlaced, the driver will accept the new resolution without rebooting and you just need to hit OK to switch to it. If not, hit restart and the option should come up in your display control panel after reboot.

Alternatively, you can add a custom 1920x1080 at a non-interlaced refresh of 45-50Hz if you have a low-end 14" monitor that can't do it at 60Hz.
In general, any CRT that can do 1280x1024 can do 1920x1080 at only a 5% lower refresh.
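To put rough numbers on those claims, here's a quick Python sketch (the ~6% vertical blanking overhead is my own assumption; the exact figures depend on the timings PowerStrip generates):

# Ballpark horizontal scan frequency for a CRT mode:
#   h_freq = lines scanned per field * blanking overhead * field rate
def h_freq_khz(active_lines, refresh_hz, interlaced=False, blank_factor=1.06):
    lines_per_field = active_lines / (2 if interlaced else 1)   # interlaced scans half the lines per field
    return lines_per_field * blank_factor * refresh_hz / 1000.0

print(h_freq_khz(1080, 60, interlaced=True))  # ~34 kHz for 1920x1080i at 60 fields/s
print(h_freq_khz(1080, 60))                   # ~69 kHz for 1920x1080 non-interlaced @ 60Hz
print(h_freq_khz(1024, 60))                   # ~65 kHz for 1280x1024 @ 60Hz
# At a fixed horizontal frequency, going from 1024 to 1080 active lines costs about
# 1 - 1024/1080 ~= 5% of vertical refresh, which is where the "5% lower" figure comes from.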


17" that does 1920x1080@63Hz non-interlaced for $85.59
15" that does the same for $99

Doesn't look like anyplace still sells 14".....
 

Solodays

Senior member
Jun 26, 2003
853
0
0
Originally posted by: arod



You can't use a regular video card; you need a tuner card like you do with regular cable TV. I have a Fusion II by DVICO (costs 200 bucks). You need a pretty decent computer though, and of course HDTV signals in your area.

Oh really? So the output will not be digital when I plug in a DVI output?
So, what exactly do the DVI output on the video card and a compatible DVI input on a monitor really do?
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
Solodays, you should really read your replies... your last question has already been answered, but here it is again...

DVI is for LCDs. LCDs use a digital signal for the color value of each individual pixel. If you hook up an LCD using an analog signal, the monitor then needs to convert it back to digital in order to work. This is why LCDs are better off using DVI: the video card doesn't have to convert digital to analog, send it to your LCD, and have the LCD convert the analog back to digital, which loses quality. It just gets a straight digital signal.

CRTs use an analog signal, so DVI is pretty much useless for them.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Alternatively, you can add a custom 1920x1080 at a non-interlaced refresh of 45-50Hz if you have a low-end 14" monitor that can't do it at 60Hz.
Yeah, I use custom timings with my HTPC using PowerStrip right now. Unfortunately, it looks to me as if 1080i is broken in Cat 3.7, as I'm getting a virtual desktop now via VGA, although 1080p is working. I'm going to drag those old junkers out of the garage and try them out. They are newer than 1992 but are typical for the era... so we'll see.

CRTs use an analog signal, so DVI is pretty much useless for them.
Some newer high-end CRT monitors are using DVI inputs now, and many newer CRT RPTVs also feature HDCP DVI inputs. IIRC, there is no such thing as a purely digital signal anyway; there is always an A/D conversion somewhere in the stream.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
There are no 14" monitors that can properly if at all display 1920x1080. The proper 4:3 resolution of 1920x1440 is beyond the capabilities of 19" monitors today let alone 14" 10 year old monitors. A resolution that high in 32bit color would require a minimum of 12MB of onboard RAM which was well beyond what video cards back then had when 4MB was considered top of the line.

Based on a typical .25 dot pitch (much better than 92 standards), a monitor would have to have a 24" viewable area in order to resolve every pixel. With today's highend standard of .22, 1920x1440 is displayed perfectly on today's 20" viewable monitors (21"/22").
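Rough math behind those figures, in Python (treating dot pitch as the horizontal pixel spacing, which is a simplification; aperture-grille stripe pitch is measured a bit differently):

import math

# Framebuffer for a single 1920x1440 frame at 32-bit color (4 bytes per pixel):
print(1920 * 1440 * 4 / 2**20)                # ~10.5 MB, hence the 12MB+ of video RAM mentioned above

# Smallest 4:3 viewable diagonal (inches) that can resolve w x h pixels at a given dot pitch (mm):
def min_diagonal_inches(w, h, pitch_mm):
    return math.hypot(w * pitch_mm, h * pitch_mm) / 25.4

print(min_diagonal_inches(1920, 1440, 0.25))  # ~23.6", i.e. a 24"-class viewable area
print(min_diagonal_inches(1920, 1440, 0.22))  # ~20.8", i.e. today's 20"-viewable (21"/22") tubes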
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
1920x1440 is way beyond 1920x1080...
Any 19" monitor can do 1920x1080, though you'd have to have two black bars, one on the top and one on the bottom, to get a correct aspect ratio; this can be done by turning the vertical size to minimum.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
On a 4:3 monitor, 1920x1080 with black bars above and below is handled identically to 1920x1440, so all the calculations above apply to 1920x1080 as well. There are no 19" monitors that can properly display every pixel of a resolution that high. You can get the resolution to run on some 19" monitors, but just like running 1080i on almost all HDTVs, you aren't getting every pixel/line of resolution.

A 14" monitor with an optimistic .28 dot pitch is not physically capable of drawing a resolution higher than 1024x768, which means that if you somehow get 1920x1440 to run on it, you would be seeing barely more than 1/4 of the pixels in the original image.
 

mrgoblin

Golden Member
Jul 28, 2003
1,075
0
0
Originally posted by: rbV5
Ever watch a DVD on your computer? It will beat any HDTV I have seen up to $12,000.

If you don't mind sitting 2 feet away with a 4:3 format or black bars top and bottom :) Personally, I prefer my widescreen and the comfort of my couch. It is nice and crisp though.

Well, maybe for you. My monitor is a 42-inch plasma, buddy :D From Gateway :D
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Those displays have a native resolution of 852x480, which is terrible for a display that large, and they are only capable of displaying a 480p picture, which isn't even close to HDTV standards. Why anyone would want a plasma screen other than for space-saving reasons is beyond me. They are far and away the worst cost-per-picture-quality displays available in the HT market.