DVI blurrier than DSUB??

arcas

Platinum Member
Apr 10, 2001
2,155
2
0
Bought a 2005FPW a few months ago, but since both my machines and my KVM were VGA-only, I'd been using the monitor's DSUB input. I've been pretty happy with the output quality. This week I upgraded one of my machines to an eVGA 6800GS, so today I tried DVI for the first time. Big letdown. The eVGA DSUB output (as well as the output of the previous video card and of the Matrox G400 in the other machine) is far sharper than the eVGA DVI output. The DVI's softness is noticeable even on the BIOS config screen.

The only DVI cable I have is the one supplied by Dell with the monitor. Is this softness a symptom of a POS DVI cable?

 

bamacre

Lifer
Jul 1, 2004
21,029
2
61
If this is in regard to on-screen text, look into ClearType. My text looked blurry over DVI with my 2001FP. This wasn't a problem until I upgraded from Win2k to XP.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
You can't have "soft" on DVI. Pixels are transmitted individually and digitally.

You'll HAVE to run the panel's native resolution, else things will get interpolated, and that's where algorithms will differ. On the BIOS screen, you're expanding from 720x400 to 1680x1050 - that's never going to look good either way (just bad in a different way).
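
If you want to see why, here's a quick back-of-the-envelope sketch (Python, using the numbers above):

```python
# Why scaling 720x400 BIOS text to a 1680x1050 panel can't be pin-sharp:
# the scale factors are non-integer, so every source pixel gets smeared
# across a fractional number of panel pixels.
src_w, src_h = 720, 400      # VGA text mode used by BIOS screens
dst_w, dst_h = 1680, 1050    # 2005FPW native resolution

sx = dst_w / src_w           # ~2.333 panel pixels per source pixel
sy = dst_h / src_h           # 2.625

print(f"horizontal scale: {sx:.3f}, vertical scale: {sy:.3f}")
# Neither factor is a whole number, and they don't even match each
# other, so the scaler (GPU or panel) must interpolate -> soft edges
# either way, just soft in different ways.
```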

And then enable ClearType.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
ClearType works only when you have a digital display connected digitally. You know that, right?

And then it works really nicely, essentially tripling the horizontal resolution for text characters. Low-resolution panels expose the color fringing on the edges, though.
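
Here's a toy sketch of the idea (Python; an illustration of subpixel rendering in general, not Microsoft's actual filtering pipeline, which also color-filters to tame the fringing):

```python
# Each LCD pixel is an R,G,B triplet of subpixel columns; subpixel
# rendering treats them as three individually addressable samples.
# A glyph edge rasterized at 3x horizontal resolution...
coverage = [0, 0, 1, 1, 1, 1, 0, 0, 0]   # 9 samples -> 3 pixels

# ...maps one sample to each R, G, B subpixel column:
pixels = []
for i in range(0, len(coverage), 3):
    r, g, b = (255 * c for c in coverage[i:i+3])
    pixels.append((r, g, b))

print(pixels)   # [(0, 0, 255), (255, 255, 255), (0, 0, 0)]
# The edge lands *between* whole pixels: only the blue subpixel of
# pixel 0 carries it. On a low-resolution panel that stray color is
# the fringing mentioned above; at normal pixel sizes it reads as a
# smoother, more precisely placed edge.
```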
 

Steve

Lifer
May 2, 2004
15,945
11
81
ClearType

Display Properties -> Appearance -> Effects.

ClearType is a method for Windows XP to "smooth edges of screen fonts." As Peter said, it's for any LCD monitor: preferably connected via DVI, but it can look okay on analog (look at my screenshot again).
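
If you'd rather script it than click through dialogs, the same switch is reachable through the Win32 SystemParametersInfo call. A minimal sketch (Python ctypes, Windows only; the SPI_* constants are the standard Win32 values):

```python
# Minimal sketch: enable ClearType via the Win32 API instead of
# Display Properties -> Appearance -> Effects.
import ctypes

SPI_SETFONTSMOOTHING      = 0x004B
SPI_SETFONTSMOOTHINGTYPE  = 0x200B
FE_FONTSMOOTHINGCLEARTYPE = 2
SPIF_UPDATEINIFILE = 0x01
SPIF_SENDCHANGE    = 0x02

user32 = ctypes.windll.user32

# Turn font smoothing on, then select the ClearType algorithm.
user32.SystemParametersInfoW(SPI_SETFONTSMOOTHING, 1, None,
                             SPIF_UPDATEINIFILE | SPIF_SENDCHANGE)
user32.SystemParametersInfoW(SPI_SETFONTSMOOTHINGTYPE, 0,
                             ctypes.c_void_p(FE_FONTSMOOTHINGCLEARTYPE),
                             SPIF_UPDATEINIFILE | SPIF_SENDCHANGE)
```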
 

UltraWide

Senior member
May 13, 2000
793
0
76
I have a 2001FP with a 7800GT via DVI and it looks 100x better WITHOUT ClearType. I think it just makes the text look blurry; it's much sharper with it off. I say ClearType is useless if you have an LCD connected via DVI. YMMV.
 

bamacre

Lifer
Jul 1, 2004
21,029
2
61
You know, I went back and checked, and I have ClearType turned off. I remember why: it looked like crap. 7800 GTX owner here with a 2001FP as well.
 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
I use DVI and ClearType at both home and work; I wouldn't be using an LCD without it. The consensus is that this is the way to go, unless of course you have bad vision and somehow think it looks better without ClearType on.
 

Steve

Lifer
May 2, 2004
15,945
11
81
I wonder if your nVidia cards still have those sub-par TMDS transmitters, and if that would explain why ClearType is looking blurry for you two. That or if you have your monitors at something other than 60Hz? Hope not.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
It's also worth pointing out that ClearType will work with any connection to an LCD, just with varying degrees of success. DVI will be the best and VGA second best, just as with image quality in general. It also helps with large text at non-native resolutions. The ClearType Tuner is indispensable.
 

Steve

Lifer
May 2, 2004
15,945
11
81
Originally posted by: Compellor
Using ClearType Tuner will let you customize how good you want your text to look. It's much better than using just the default setting:

ClearType Tuner

That is really neat. I just used it here at work and I'm going to try it at home. Thanks!
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
ClearType makes text look SMOOTH. Leave it off if you want to see your pixels - but with it on, staring at text for prolonged amounts of time is less exhausting. Why? Because the human eye likes smooth (you wouldn't run your games with anti-aliasing off, or would you?), and also because the tripled horizontal resolution also means that the lettering is more evenly spaced - particularly in the fine print.
 

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
My display looks good without ClearType and using the VGA connector. It's a 26" LCD w/ a 1280x768 resolution.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
On a DVI cable, that one would be just brilliant for /studying/ what ClearType does, with those HUGE pixels 8O
 

arcas

Platinum Member
Apr 10, 2001
2,155
2
0
Yeah, I'm aware of ClearType (and I've read the arguments for and against it). But in my case, the 'softness' is visible even before an OS loads (e.g. the BIOS config screen and the initial boot screens where the system lists CPU info, drive info, etc.).

It seems to me that if this were an LCD scaling artifact, I should be seeing the same sort of softness when using the VGA cable, since the video resolution is 720x400 in either case. But on the aforementioned BIOS screens, text is razor sharp using the DSUB VGA output. Switch to DVI and the softness is immediately apparent (the 2005FPW lets you toggle between the DSUB and DVI-D inputs, so comparisons are easy).

(and, actually, since the DSUB/VGA signal goes through a KVM while the DVI cable is connected directly to the monitor, I would have expected DVI to have an "unfair" advantage)

I'm not saying it looks bad. It's just not what I expected after reading so many people rave about how their fuzzy displays became much sharper after they made the switch from VGA to DVI. I'm starting to suspect a lot of those "fuzzy" VGA images were the result of $2 VGA cables.

 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
I told you the boot screens were scaled and interpolated. When on DVI, the video card does it, and when you're on VGA, the panel does it. Different methods, different results.

And it's not like that matters. How long do you stare at boot screens, on average?

What matters is the image reproduction once the machine has arrived at its working state - and there, DVI is superior simply because its pixel transmission is 100% accurate and VGA's is not.
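
If you want to picture the difference, here's a crude simulation (Python; the noise level is made up for illustration, but the 0-700 mV video swing is the VGA standard):

```python
# Crude model of why VGA can lose fidelity and DVI can't:
# DVI ships each 8-bit subpixel value digitally (bit-exact), while
# VGA converts it to an analog voltage that picks up noise in the
# cable (and KVM) before the monitor's ADC re-quantizes it.
import random

random.seed(1)

def vga_roundtrip(value, noise_mv=5.0):
    """Model one subpixel: DAC to 0-700 mV, cable noise, ADC back."""
    volts = value / 255 * 700                  # DAC output
    volts += random.gauss(0, noise_mv)         # cable/KVM noise
    return max(0, min(255, round(volts / 700 * 255)))

src = list(range(0, 256, 32))
dvi = src                                      # bit-exact by design
vga = [vga_roundtrip(v) for v in src]
print("sent:", src)
print("DVI :", dvi)
print("VGA :", vga)   # typically off by a count or two per subpixel
```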
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
I'm pretty sure you got it reversed there. VGA looks blurry to me. DVI is either flawless or not transmitted at all.
 

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
Originally posted by: Peter
What matters is the image reproduction once the machine has arrived at its working state - and there, DVI is superior simply because its pixel transmission is 100% accurate and VGA's is not.

Not if you can't get 1:1 pixel accuracy. If you think every LCD can do this easily through DVI, go over to AVSForum and help the hundreds of people who can't figure it out.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: edplayer
Originally posted by: Peter
What matters is the image reproduction once the machine has arrived at its working state - and there, DVI is superior simply because its pixel transmission is 100% accurate and VGA's is not.

Not if you can't get 1:1 pixel accuracy. If you think every LCD can do this easily through DVI, go over to AVSForum and help the hundreds of people who can't figure it out.

I've never seen an LCD computer monitor that couldn't do this. Trying to use a nonstandard LCD-TV as a computer monitor is another story, but that's not what this poster is trying to do.
 

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
Originally posted by: Matthias99
I've never seen an LCD computer monitor that couldn't do this. Trying to use a nonstandard LCD-TV as a computer monitor is another story, but that's not what this poster is trying to do.

That is correct. I am talking about LCD TVs.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
LCD TVs, when fed their native resolution through DVI (and when they don't screw the received image up again internally), /can/ be just as exact. It's the same technology. Of course, it requires everyone getting things right on both ends: the LCD declaring its native 1:1 mode correctly through EDID, and the graphics card driver reading that EDID data and configuring the output to produce exactly that mode.
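
For what it's worth, the native mode lives in the first detailed timing descriptor of the EDID (bytes 54-71 of the base block). A minimal parsing sketch (Python; the example bytes are hand-made for a 1680x1050 mode, NOT dumped from real hardware):

```python
# Minimal sketch: pull the native resolution out of an EDID detailed
# timing descriptor. Example bytes fabricated for a 1680x1050 mode.
dtd = bytes([0x7C, 0x2E,          # pixel clock, 10 kHz units, LE
             0x90, 0xA0, 0x60,    # h active/blank low bytes + high nibbles
             0x1A, 0x1E, 0x40])   # v active/blank low bytes + high nibbles

pixel_clock_mhz = int.from_bytes(dtd[0:2], "little") / 100
h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)

print(f"{h_active}x{v_active} @ {pixel_clock_mhz} MHz pixel clock")
# -> 1680x1050 @ 119.0 MHz; the driver reads this and, ideally,
#    outputs exactly that mode so the panel maps pixels 1:1.
```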

On the other hand, true, the average LCD TV's display quality is no match for a decent computer monitor. Nonetheless, consumers eagerly pay inflated prices for LCD TVs just because it's the fashion.

Conversely, if you actually use a really good monitor as a TV, you'll notice how awful broadcast TV signal quality actually is.
 

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
Not all LCD TVs are greatly inflated in price.

Mine is 26" w/ a 1280x768 resolution for $600, and SDTV is OK on it. I'm more disappointed by the poor definition in dark scenes. That's something I see in all LCD TVs, though, and it isn't that bad...

And yeah, I'm surprised more LCD TV makers aren't on the ball with getting them to work with PCs.