DVI with LCD screen quality?

kalbear

Junior Member
Jan 23, 2003
5
0
0
Hey guys -

I just recently got a Dell 2000FP 20" LCD, and am quite happy with it. However, it's running through a VGA connection from a (crappy) Gainward 4200.

What I'd like advice on is: how much of a difference does a DVI connection make with LCD screens? I'm thinking of buying another 4200 - like an Abit or Asus - but am unsure whether it'll make all that much of a difference. Right now there's still a bit of ghosting, but the overall clarity and crispness is superb. Pros and cons would be great.

Thanks!
 

bigshooter

Platinum Member
Oct 12, 1999
2,157
0
71
I have a Samsung 191T, and DVI makes a world of difference. I bought a DVI card from Best Buy for a day to try out DVI, and I miss it. Everything is sharper, you don't have to mess with color settings, and it even seemed to reduce ghosting (might have just been my imagination). I am building a new PC soon just so I can get a DVI card. There are some things over analog that look a little shaky on my LCD. I think it's interference from the multitude of electronic gadgetry underneath and around my desk.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Don't even waste your time thinking about it. You have a 20" Dell Flat Panel - get DVI now!!!

Okay, now for the helpful part. The difference is unreal. Once you have DVI, the only settings the monitor will let you change are brightness and color temperature. Everything else is automatically set - to theoretical perfection. That means no more geometry adjustments!

You need to understand what DVI does for you: if your screen resolution is set to the native resolution of your flat panel (1600x1200 in your case), every pixel on your monitor is driven directly with a digital value from the video card, which in turn is told by Windows what value to put up for every pixel on your screen.
This means that you will be seeing on your screen exactly what you should be seeing - pixel for pixel.
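
To put rough numbers on the pixel-for-pixel idea, here's a minimal back-of-the-envelope sketch, assuming the standard published VESA blanking totals for 1600x1200@60Hz (the timing figures are the assumption here, not anything from this thread):

```python
# Arithmetic for driving a 1600x1200 panel digitally, pixel for pixel.
width, height = 1600, 1200      # visible pixels (panel native resolution)
h_total, v_total = 2160, 1250   # totals including blanking (VESA timing)
refresh = 60                    # Hz

visible_pixels = width * height
pixel_clock = h_total * v_total * refresh  # pixels sent per second

print(f"Pixels addressed per frame: {visible_pixels:,}")    # 1,920,000
print(f"Required pixel clock: {pixel_clock / 1e6:.0f} MHz")  # 162 MHz
# Single-link DVI tops out at a 165 MHz pixel clock, so 1600x1200@60
# just fits: every one of those 1.92M pixels gets its digital value
# delivered untouched, every frame.
```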

DVI at the panel's native resolution is as good as it gets - even in theory. Honestly, I am still amazed that we have this kind of technology so soon after fixed-pixel devices began shipping in volume.

The next step is televisions that are fixed-pixel devices fed a digital signal at their native resolution. Unfortunately, today all the HD formats have resolutions that differ from the native resolutions of the fixed-pixel display/projection units currently available (plasma screens, DLP, and LCoS units all currently need to perform scaling to display images).
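
To make the mismatch concrete, here's a toy sketch (the HD format resolutions are the real broadcast figures; the panel natives are illustrative examples, not a survey):

```python
# Scaling mismatch: HD source resolutions vs. example fixed-pixel natives.
hd_formats = {"720p": (1280, 720), "1080i": (1920, 1080)}
panels = {"plasma (example)": (1024, 768), "LCoS (example)": (1366, 768)}

for pname, (pw, ph) in panels.items():
    for fname, (fw, fh) in hd_formats.items():
        sx, sy = pw / fw, ph / fh
        if (sx, sy) == (1.0, 1.0):
            print(f"{fname} on {pname}: 1:1, no scaling needed")
        else:
            print(f"{fname} on {pname}: scale x{sx:.3f} / x{sy:.3f}")
```

Any factor other than 1.0 means the unit has to resample the image, which is exactly the step a native-resolution digital feed avoids.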

 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Good comments.

You guys are making me want a DVI flat panel that I could hook up to my R8500. I'm running a 22" CRT right now though, and I don't think I can find anything comparable in a flat panel in my price range.
 
Mar 8, 2001
115
0
0
Yeah, what sxr7171 said!!! I was using my 2000FP with an old GeForce2 for 6 months; then I got a Ti4200 and started using the DVI connection. Big-time difference in clarity. If you have a quality LCD, you should see a positive difference using your DVI connection.

I can't wait until home theater equipment starts using DVI connections and blue lasers for DVD!!!
 

kalbear

Junior Member
Jan 23, 2003
5
0
0
Thanks, guys! That's exactly the kind of info I was looking for. Special thanks to sxr7171 - it was nice to hear a technical reason as to why this is the case.

Next question: is there an appreciable difference in DVI quality among various vendors? I would guess that it's basically binary - either it's really bad or it's just fine, with nothing like signal integrity or quality loss to worry about along the way - but I figured, hey, might as well ask, right?

Thanks again!
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
Problem is, sxr7171 is not totally correct. Digital (DVI), like BNC connectors on a CRT, will in theory provide a better picture. However, in practice, or in a blind test, I find it very difficult to tell the difference. This also depends on the quality of the video signal provided by the video card. Some cards are notorious for poor analog signal quality, so the analog interface takes the rap when in fact the fault is a poor design on specific video cards.

The myth that DVI (digital) does not need to be converted to analog is just that, a myth. On an analog system, the RAMDAC (the chip that generates the video signal on the video card) has been integrated into the graphics controller chip for years now. Adding DVI means adding a DVI transmitter chip to the video card and a DVI receiver chip in the monitor.

In order to transmit the data from the video card in true parallel digital form, the graphics chip would need dedicated digital outputs and the video cable would need a single wire for each bit. In that case the cable would need to contain more than 27 wires; you can imagine how thick it would be. Instead, DVI converts the parallel data into a number of digital serial channels. Depending on the link type (single- or dual-link), the number of serial channels varies. The serial bit stream is then converted back on the monitor side, and the signal must be sampled using the pixel clock, just as it is with the analog interface.
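
Here's a minimal sketch of that serialization arithmetic, using single-link DVI figures (the 10-bit TMDS symbol size is from the published spec; the parallel wire count is just illustrative):

```python
# Parallel-to-serial conversion as described above, single-link DVI.
bits_per_pixel = 24        # 8 bits each for R, G, B
data_channels = 3          # one TMDS serial channel per color component
bits_per_symbol = 10       # TMDS encodes each 8-bit value as 10 bits
pixel_clock = 162e6        # Hz, 1600x1200@60 (VESA timing)

# Parallel alternative: 24 data wires plus clock/sync lines -- the
# "more than 27 wires" cable imagined above.
parallel_wires = bits_per_pixel + 3  # illustrative count

# Serial reality: each channel ships one 10-bit symbol per pixel clock.
per_channel_rate = pixel_clock * bits_per_symbol
total_rate = per_channel_rate * data_channels

print(f"Parallel wires needed: {parallel_wires}+")                       # 27+
print(f"Per-channel serial rate: {per_channel_rate / 1e9:.2f} Gbit/s")   # 1.62
print(f"Aggregate over 3 channels: {total_rate / 1e9:.2f} Gbit/s")       # 4.86
# The monitor-side receiver deserializes these streams and re-samples
# them against the pixel clock, which is the point being made above.
```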

Even monitors with a DVI interface convert to analog at the LCD driver level. The digital signal must be converted to analog in order to achieve the 16M colors. If LCD were purely digital, only eight colors would be achievable. In order to generate the 16M colors, each red, green, and blue cell must be capable of stepping through 256 shades, and that is an analog function. In fact, most LCDs maintain the video signal in analog form through to the pixel drivers (NEC was the most notable producer of these).
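
The arithmetic behind those two numbers, as a quick sketch:

```python
# Color depth: binary (on/off) subpixels vs. 256-level analog drive.
binary_levels = 2      # purely digital subpixel: on or off
analog_levels = 256    # 2**8 shades per subpixel via analog drive

print(binary_levels ** 3)         # 8 -- the "only eight colors" case
print(f"{analog_levels ** 3:,}")  # 16,777,216 -- the familiar "16M colors"
```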
 

BoomAM

Diamond Member
Sep 25, 2001
4,546
0
0
Correct me if I'm wrong, but aren't DVI-I & DVI-D the same?
When I bought my 9700Pro & Hitachi 17" TFT, the 9700Pro was listed as using DVI-I and the TFT as DVI-D. The sales assistant said the interfaces are the same and that they would work together. And they did.

I'm about 70% sure that I'm wrong about the above, so please supply me with the correct information.
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
Yes and no. DVI-I connectors support both analog and digital signals; DVI-D supports digital only. They are for the most part interchangeable, although I have heard of cases where people have had compatibility problems.
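
A toy sketch of that compatibility logic (the function and names here are hypothetical, just codifying the answer above):

```python
# Which signal types each connector variant carries.
SIGNALS = {"DVI-I": {"digital", "analog"}, "DVI-D": {"digital"}}

def can_connect(card_port: str, monitor_port: str) -> bool:
    """Hypothetical check: a link works if both ends share a signal type."""
    return bool(SIGNALS[card_port] & SIGNALS[monitor_port])

print(can_connect("DVI-I", "DVI-D"))  # True -- the 9700Pro + Hitachi case
```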
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Digital (DVI), like BNC connectors on a CRT, will in theory provide a better picture. However, in practice, or in a blind test, I find it very difficult to tell the difference

I would agree with you about BNC. I use them myself and see no real difference between VGA and BNC on my 22" Mitsu monitor, even at the highest resolutions. However, I strongly disagree about the analog connection vs. the DVI connection. In virtually every setup I've tried (we actually just replaced several CRTs at work with Samsung 191T flat panel displays), I can see a "large" difference in picture quality/sharpness when the displays are hooked up via DVI vs. the same panels connected to the same video cards via VGA - enough difference that I find it shocking that someone could not.
 

kalbear

Junior Member
Jan 23, 2003
5
0
0
Well, I bit the bullet and ordered an ASUS V9280TD. I've had good experiences with Asus, and their image quality has generally been better. While I suspect the vendor doesn't matter much for DVI, it's good to get reliability...which only makes me madder about the Gainward issues. Grrr.

Now I have to think about whether or not I want to do multimon with this beast and the older Iiyama 19" that it replaced...