Should I use the DVI-A or VGA on my CRT monitor? Which is better?

Chad

Platinum Member
Oct 11, 1999
The CRT monitor is an IBM P260 21" with a DVI-A and a VGA input; my Radeon 9700 outputs DVI-I and, of course, VGA. I plan on running 1600x1200+ at high refresh rates. Will a "DVI-I to DVI-A" connection be better than the VGA connection? I am VERY picky about even slight blur.
 

thorin

Diamond Member
Oct 9, 1999
For the 45 seconds it'll take you to try both, you'd have your answer a lot quicker than waiting for answers here ;)

Thorin
 

RaiderJ

Diamond Member
Apr 29, 2001
I don't think it should matter, since the 9700 should have two equal RAMDACs working to put out a signal. I may be wrong on this, but you should be fine either way.
 

Chad

Platinum Member
Oct 11, 1999
thorin, I don't have the DVI cable, I will have to buy that, which is why I ask... if it's not better, I will save the money.
 

thorin

Diamond Member
Oct 9, 1999
Originally posted by: Chad
thorin, I don't have the DVI cable, I will have to buy that, which is why I ask... if it's not better, I will save the money.
Ah ok that makes sense. (My bad...Sorry :( )

I agree with RaiderJ; it shouldn't really make any difference.

Thorin
 

Chad

Platinum Member
Oct 11, 1999
No problem... thanks for responding. So DVI-A is exactly equivalent to VGA?
 

thorin

Diamond Member
Oct 9, 1999
Originally posted by: Chad
No problem... thanks for responding. So DVI-A is exactly equivalent to VGA?
Well, since a DVI-A cable or dongle only mates with the analog pins of the DVI-I connector, it should display the exact same information/quality as the standard VGA connector (IMHO).

Thorin

 

Chad

Platinum Member
Oct 11, 1999
Well, I guess I should tell ya the whole story. Truth is, I'm deciding between the Dell P1110 21" and the IBM P260. The biggest difference is the P260 offers a DVI-A input whereas the Dell has two VGA connectors. The P260 is also $10 cheaper. I've been researching on Google like mad but it doesn't seem anyone has really asked this question before.

My current Hitachi SuperScan 753 supports awesome resolutions and refresh rates, but when I go to 1600x1200x85 the image gets blurry and I get moiré problems. I'm convinced this is because of the low bandwidth of the VGA cable. Also, this article...

http://www.cadenceweb.com/2001/0801/cadlab_3dcards0801.html

states...

All of the cards feature at least one DVI-I connector, which includes a higher-quality analog signal than the old VGA port, to better support the higher resolutions offered on larger CRT monitors. All of the cards also offer a VGA connector except the Fire GL4, which eschews the VGA port for a second DVI-I port (two DVI-I to VGA adapters are included with the card for compatibility). Very few CRT monitors offer a DVI-I or DVI-A (the analog-only version) connector, but if it is available, connecting the monitor and graphics card with a DVI-A cable will offer a higher-quality image.

...which leads me to believe I am right. Note: I am not contradicting you or anyone here; it's just that I want to be sure, since I plan on running 1600x1200x85 and I don't want to be stuck with the same problem I have now (it's unusable at that setting).

I also don't want to confuse the issue more, but what about BNC? This is just an aside, since I'm not getting a monitor that supports BNC, but the consensus seems to be that BNC is better at higher resolutions. Thinking aloud, though: how can that be when the signal still comes out of a VGA connector? Doesn't that bottleneck it?

Anyways, back to DVI-A... can anyone please elaborate on the issue?
 

Eug

Lifer
Mar 11, 2000
VGA -> VGA vs. VGA -> BNC doesn't really make a significant difference on my Samsung 950p @ 1600x1200 x 75. Sometimes I wonder if the BNC is better, but I don't know if I'd be able to consistently pick it out. (I haven't tested it recently though, since I always leave it on BNC anyway.) I did try a not-so-good VGA cable once and the image quality was inferior to my current VGA cable.

Mind you, all of this is moot, since I now run a Samsung 172T LCD. Strangely enough I run VGA though, despite the fact that both the monitor and the card have DVI. The VGA is excellent at 1280x1024 x 60, and I keep the DVI connection free for my Mac laptop for dual-LCD goodness. :) If only I could find a cheap, good-quality DVI-D switch...
 

Chad

Platinum Member
Oct 11, 1999
And see, that makes sense to me. VGA to BNC doesn't seem like it could really be better, since at some point the signal still has to come out of the VGA connector... unless the cable itself has something in it to "upgrade" the signal, it couldn't possibly be a much better connection, I wouldn't think.

Now the DVI-A is a different story. It has massive bandwidth, which at really high resolutions and refresh rates and colors, I think does come into play.

At least this is my thinking. Anyone here want to learn me though? Please. :)
 

thorin

Diamond Member
Oct 9, 1999
"Now the DVI-A is a different story. It has massive bandwidth, which at really high resolutions and refresh rates and colors, I think does come into play."

That may be true for DVI-I but I don't think it is true for DVI-A since it's only using the analog pins.

Thorin
 

Chad

Platinum Member
Oct 11, 1999
I'm not sure I follow. The bandwidth of the cable is what I meant. I see you mentioned pins... from my understanding, DVI-A uses all the pins for DVI-D *AND* a few additional pins for the analog part of it.
 

RaiderJ

Diamond Member
Apr 29, 2001
Originally posted by: Chad
from my understanding, DVI-A uses all the pins for DVI-D *AND* a few additional pins for the analog part of it.

I think when you use a DVI -> VGA converter, it only uses a small number of pins on the DVI port, different ones than an LCD display uses. I want to say it only uses four pins, but that doesn't sound intuitively correct.

Edit: It does use more than four pins
 

Chad

Platinum Member
Oct 11, 1999
In this case, I would not use a converter. I would use a DVI-A to DVI-A cable. One end connecting to the DVI-I of the video card and the other to the DVI-A of the monitor.
 

thorin

Diamond Member
Oct 9, 1999
Originally posted by: Chad
I'm not sure I follow. The bandwidth of the cable is what I meant. I see you mentioned pins... from my understanding, DVI-A uses all the pins for DVI-D *AND* a few additional pins for the analog part of it.

DVI-D is a digital-only connector which supports dual-link operation and contains 24 contacts arranged as 3 rows of 8.
DVI-I is the same as DVI-D with an additional 5 contacts to support analog video (the + sign and 4 pins).
DVI-A is available as a connector and mates to the analog-only pins of a DVI-I connector. "DVI-A is only used in adapter cables, where there is the need to convert to or from a traditional analog VGA signal." ( source )
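If it helps to picture it, here's a rough sketch of the signal groups each connector type carries (just an illustration in Python of how I understand it; a simplification, not an exact pin map):

```python
# Illustrative simplification of what each connector carries
# (not an exact pinout -- just the broad signal groups).
SIGNALS = {
    "VGA":   {"analog RGB", "H/V sync", "DDC"},
    "DVI-D": {"TMDS digital link(s)", "DDC"},
    "DVI-A": {"analog RGB", "H/V sync", "DDC"},
    "DVI-I": {"TMDS digital link(s)", "analog RGB", "H/V sync", "DDC"},
}

# DVI-A carries nothing that the VGA connector doesn't already carry:
print(SIGNALS["DVI-A"] - SIGNALS["VGA"])   # set() -- empty
# DVI-I is a superset: the digital links plus the same analog signals.
print(SIGNALS["DVI-I"] >= SIGNALS["VGA"])  # True
```

In other words, the analog half of DVI-I/DVI-A is the same RGB + sync signal the VGA port puts out, just on different pins.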

Now, perhaps this analog signal is somehow supposed to be better than the standard VGA analog signal, but I don't see how: the RAMDACs are the same and the information the card is transmitting should be the same. (Perhaps I'm missing something... the author of the article you quoted likely knows more about this than I do; I just don't see how it can be better.)

Thorin
 

Chad

Platinum Member
Oct 11, 1999
OK, but why did cadenceweb.com say what they said in that quote above? CAD pros likely know a lot about displays and display quality... especially a reviewer at cadenceweb. He specifically said it was better.

I am so confused.

Thanks for the links, but none of them say anything about DVI-A with respect to VGA specifically (i.e., a comparison).

I am thinking aloud here, thorin, so this isn't a statement; truly, I'm clueless on these matters... I'm only trying to take all this information in. Anyhow, could it be that the bandwidth of a DVI-A cable is greater than that of a VGA cable? Does that make any sense? Ignore me if I just sounded stupid though. :p
 

thorin

Diamond Member
Oct 9, 1999
"I am thinking aloud here thorin, so this isn't a statement as truely, I'm clueless on these matters... I only am trying to take all this information in. Anyhow, could it be the bandwidth of the DVI-A cable is greater than a VGA cable? Does that make any sense? Ignore if I just sounded stupid though. "

I'm just going on what I'm reading as well. The bandwidth could be greater, but since they use the same RAMDAC I can't see how the information being sent through the cable is any different (since it's analog either way). Perhaps you should look for reviews of the two monitors and see if any of them compare the two interfaces.
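For what it's worth, here's a back-of-the-envelope estimate (Python as a calculator; the ~1.32 blanking-overhead factor is an approximation, not an exact GTF timing) of the pixel clock the analog link has to carry either way:

```python
# Rough pixel-clock estimate for an analog CRT mode.  The 1.32
# blanking factor is an approximation (real VESA/GTF timings vary),
# so treat the numbers as ballpark figures only.
def pixel_clock_mhz(h, v, refresh_hz, blanking=1.32):
    """Approximate pixel clock (MHz) for an h x v mode at refresh_hz."""
    return h * v * refresh_hz * blanking / 1e6

for h, v, hz in [(1600, 1200, 60), (1600, 1200, 85), (1600, 1200, 100)]:
    print(f"{h}x{v}@{hz}: ~{pixel_clock_mhz(h, v, hz):.0f} MHz")
# 1600x1200@85 works out to roughly 215 MHz -- and it's the same
# analog signal from the same RAMDAC whether it leaves the card
# through the VGA port or the DVI-I analog pins.
```

So whichever connector is used, the cable is being asked to carry the same ~200 MHz-class analog signal.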

Thorin
 

Chad

Platinum Member
Oct 11, 1999
Found this thread, but I'm not sure anyone actually answers anything there. In fact, I'm almost scared that I will get a WORSE picture now. BTW, I went ahead and got the P260, and they are shipping it with a DVI-A to DVI-A cable *instead of* the VGA cable (which makes me wonder now whether I made the right choice).

All this talk about DACs on the monitor and ADCs and whether they are worse than your video card's (my video card has a RAMDAC, which is like 350 MHz or something).

::sigh::

P.S. thorin, thanks for helping out in here. I appreciate it man.
 

Eug

Lifer
Mar 11, 2000
Don't worry about it. DVI cables are expensive. Good quality VGA cables are relatively inexpensive. You got the better deal.

If you find your DVI-A ain't so great then just go and buy a VGA cable.

By the way, are they sending you a DVI-A only cable? I have never seen one. Everything around here is either DVI-D only, or else DVI-I.
 

Chad

Platinum Member
Oct 11, 1999
Yeah, that's what he said, at least... although he had a very thick Russian-like accent... so you know how that goes. :) But this place is selling this particular monitor, and I'm sure they would know that the cable I need has to be DVI-A, since that is the actual connection on the monitor itself.

Also, for DVI-A to DVI-A cables...


http://www.pacificcable.com/DVI.htm $16.00 - $38.00 (depending on length)
 

Chad

Platinum Member
Oct 11, 1999
Yikes... I missed this in Eug's link...

DVI-A - High-Res Analog

DVI-A format is used to carry a DVI signal to an analog display, such as a CRT monitor or an HDTV. Although some signal quality is lost from the digital to analog conversion, it still transmits a higher quality picture than standard VGA.

Is "standard VGA" what we are talking about? The VGA connection on a monitor... is it "standard" or whatever? I'm starting to think it maybe is better, just would like to know *WHY*? If it is I mean.
 

Chad

Platinum Member
Oct 11, 1999
Sorry, not trying to flood this thread. Just an update: here is some email correspondence with the author of the cadenceweb.com article...


-----End--------


Indeed it should, although it's been a while since I have tested that combination. The pins in a DVI-A connector used for analog signals have much better shielding than those in a VGA cable. At lower resolutions you probably won't notice any difference.

Please let me know what your experience is.
-----Original Message-----
From: Chad Sparks [mailto:chadsparks@cox.net]
Sent: Monday, February 24, 2003 9:56 AM
To: psheerin@cmp.com
Subject: Your DVI article.


Hi,

Thanks for a great article. I was wondering though, I have a Radeon 9700 which has a DVI-I output and I have an IBM P260 21" monitor which has a DVI-A input. I run at really high resolutions and refresh rates.. will connecting my monitor to the video card using a straight DVI-A cable improve picture quality over the standard VGA cable? Thanks!

Best regards,

Chad Sparks
chadsparks@cox.net
 

Chad

Platinum Member
Oct 11, 1999
Gosh... I don't want to get in trouble for posting in my own thread too much (so tell me if this is a no-no and I'll stop). But I found something in Tom's Radeon 9700 article...

http://www6.tomshardware.com/graphic/20021104/r9700pro-cards-02.html

Anyway, it's about time that creaky analog transmission was dumped and that CRT monitors were only controlled digitally. The digital input could be converted to analog within the monitor and optimized for its picture tube. Some monitors already use this design.

Sounds like he thinks this is better (?)
 

Chad

Platinum Member
Oct 11, 1999
Now things are getting frustratingly confusing! This article has been making me wonder...

http://www.siimage.com/documents/SiI-WP-001-A.pdf

In it, it states that using a DVI connection to a CRT monitor is preferred, citing that the DAC in the monitor is used instead of the video card's, and it demonstrates why this is a "good" thing. My problem is that, IMO, this isn't a very impressive DAC. It can only do 1600x1200@60 Hz! The DAC in the Radeon is like 400 MHz and can do insane refresh rates and resolutions. Seems to me that if the Radeon's DAC is better, then it would be better to use VGA so that we bypass the inferior monitor DAC (?).
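Just to put rough numbers on that (a quick Python sanity check; the ~1.32 blanking-overhead factor is an approximation, and the 162 MHz figure is only my guess at what a "1600x1200@60" DAC implies, not a spec from the PDF):

```python
# Approximate highest refresh rate a DAC of a given clock can drive
# at 1600x1200.  Both the 1.32 blanking overhead and the 162 MHz
# "1600x1200@60-class" DAC clock are assumptions, not quoted specs.
def max_refresh_hz(dac_mhz, h=1600, v=1200, blanking=1.32):
    return dac_mhz * 1e6 / (h * v * blanking)

print(f"~162 MHz DAC:    ~{max_refresh_hz(162):.0f} Hz")   # ~64 Hz
print(f"~400 MHz RAMDAC: ~{max_refresh_hz(400):.0f} Hz")   # ~158 Hz
```

So a 1600x1200@60-class DAC in the monitor would be the bottleneck, while the Radeon's ~400 MHz RAMDAC has headroom well past 1600x1200@85.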

I hate myself. I always do this... I have no idea why I go anal over these things... couldn't I be like a normal person and just not care? bleh :(