When a card is dual DVI, does that mean no CRT analog?

SynthDude2001

Mar 19, 2003
18,289
2
71
Nope, they're almost always DVI-I which also carries the analog signal on some dedicated pins. The card will generally come with a DVI-VGA adapter (or two) to allow you to use those analog outputs.
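For reference, here's a rough sketch of what those dedicated pins are; a passive DVI-to-VGA adapter simply routes them to the corresponding VGA pins and leaves the digital TMDS pins unused. Pin names follow the standard DVI-I layout; treat this as a quick reference, not the full pinout:

```python
# The analog lines on a DVI-I connector that a passive DVI-to-VGA adapter
# passes straight through; the digital TMDS pins are not used at all.
DVI_I_ANALOG_PINS = {
    "C1": "analog red",
    "C2": "analog green",
    "C3": "analog blue",
    "C4": "analog horizontal sync",
    "C5": "analog ground (return for R, G, B)",
    "8":  "analog vertical sync",
}

for pin, signal in DVI_I_ANALOG_PINS.items():
    print(f"DVI-I pin {pin}: {signal}")
```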
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
Pretty much all cards come with DVI-to-VGA adapters for those still looking to go with CRTs, like SynthDude2001 said.
 

phatrabt

Senior member
Jan 28, 2004
238
0
0
Correct, if the card is dual DVI there is no VGA connector. However, if the DVI connectors are DVI-I (which carry analog) rather than DVI-D (which are purely digital), you can use a DVI-to-VGA adapter and still use your VGA monitor.

Darn, you guys beat me to it! :)
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
On a side note, is there any benefit (as far as signal quality goes) to using a cable like this instead of the adapter and a normal VGA cable?
 

phatrabt

Senior member
Jan 28, 2004
238
0
0
Originally posted by: CP5670
On a side note, is there any benefit (as far as signal quality goes) to using a cable like this instead of the adapter and a normal VGA cable?

None that I'm aware of. I'd go with whatever is cheaper/easier.
 

Geomagick

Golden Member
Dec 3, 1999
1,265
0
76
The adapters come with the cards anyway so there is no point in spending more to get a different cable. Personally I'm building up quite a collection of these adapters. Can't ever see myself using them though.
 
Jun 14, 2003
10,442
0
0
I'm using one right now. My 7800s have DVI only on the back, but I'm using the supplied VGA converter since I haven't gotten around to buying a DVI cable for my LCD.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
On a side note, is there any benefit (as far as signal quality goes) to using a cable like this instead of the adapter and a normal VGA cable?

Thank you for pointing that cable out- I would expect that if you are running a high-end CRT then there would be a noticeable improvement in signal quality using that cable instead of the converters. I can tell you that there is a sizeable dropoff in signal integrity on the adapters when you start pushing the upper limits of high-end CRTs (16x12@100 - 20x15@85). I have a vid card with DVI+VGA out and a monitor with dual inputs- the supplied adapter clearly degrades signal quality to a level that is noticeable to non-geeks.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
The adapters just transfer from one pinout to the other. Unless you have one that is crappily made out of metal that isn't conductive enough, and/or one whose pins and holes don't connect well with the plugs on your other parts, you won't be losing any image quality by using an adapter.
 

JRW

Senior member
Jun 29, 2005
569
0
76
I recently upgraded to a 7800 GTX video card which only has dual DVI outputs. I have my FW900 CRT connected using the supplied DVI>VGA adapter and can't notice any quality loss between it and my previous 6800GT, which had an actual VGA output. However, my monitor also has a BNC input and I'm curious if buying a DVI-to-BNC cable would help any. I'm told BNC offers higher bandwidth at the extreme resolutions vs. VGA, so it might give a slight increase in quality? I was looking at a cable like this one.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Thank you for pointing that cable out- I would expect that if you are running a high-end CRT then there would be a noticeable improvement in signal quality using that cable instead of the converters. I can tell you that there is a sizeable dropoff in signal integrity on the adapters when you start pushing the upper limits of high-end CRTs (16x12@100 - 20x15@85). I have a vid card with DVI+VGA out and a monitor with dual inputs- the supplied adapter clearly degrades signal quality to a level that is noticeable to non-geeks.

Yeah, that's what I had in mind, as I have heard that the adapters are not always of the best quality and can introduce blurriness and ghosting when approaching the limits of what video cards can support (20x15 at 85Hz seems to be the maximum, in fact). My current card has a VGA output but my next one probably will not. I suppose it won't hurt to get one of these cables, as they are only $13 anyway.
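For what it's worth, here's a back-of-the-envelope sketch of why 20x15@85 works out as the ceiling. It assumes the ~400 MHz RAMDAC these cards ship with and a rough GTF-style blanking overhead of about 45%; both are assumptions on my part, not exact modeline math:

```python
# Approximate the RAMDAC pixel clock each mode needs, including blanking,
# and compare it against an assumed 400 MHz single-link RAMDAC limit.
RAMDAC_LIMIT_MHZ = 400    # assumed limit for cards of this era
BLANKING_FACTOR = 1.45    # rough GTF-style overhead for blanking intervals

def pixel_clock_mhz(width, height, refresh_hz):
    """Rough pixel clock estimate in MHz, blanking included."""
    return width * height * refresh_hz * BLANKING_FACTOR / 1e6

for w, h, hz in [(1600, 1200, 85), (1600, 1200, 100), (2048, 1536, 85)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= RAMDAC_LIMIT_MHZ else "exceeds"
    print(f"{w}x{h}@{hz}Hz -> ~{clk:.0f} MHz ({verdict} the 400 MHz RAMDAC)")
```

2048x1536@85Hz lands just under the cap, which matches it being the top of the supported range.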
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The adapters just transfer from one pinout to the other. Unless you have one that is crappily made out of metal that isn't conductive enough, and/or one whose pins and holes don't connect well with the plugs on your other parts, you won't be losing any image quality by using an adapter.

I've heard that same thing from almost everyone who has never actually tried pushing the settings I'm talking about. I have.
 

JRW

Senior member
Jun 29, 2005
569
0
76
Guess I forgot to mention I'm running a 1920x1200 desktop resolution. I noticed no image quality degradation using the DVI>VGA adapter, and I'm *very* picky about this stuff, so I'm sure I would've noticed.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BenSkywalker
The adapters just transfer from one pinout to the other. Unless you have one that is crappily made out of metal that isn't conductive enough, and/or one whose pins and holes don't connect well with the plugs on your other parts, you won't be losing any image quality by using an adapter.

I've heard that same thing from almost everyone who has never actually tried pushing the settings I'm talking about. I have.
Yeah, Ben, we have been through this a few times. However, if you understood the principles of electron conduction you would understand that the degradation in image quality you saw was either the result of a sub-standard adapter or simply a case of seeing what you want to believe; there is nothing inherent to the use of an adapter that will cause issues any more than using an inch longer cable.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
if you understood the principles of electron conduction

Which principle is it that allows you to avoid the flaws of the physical connectors- you obviously absolutely must have a pure gold connector at each point, and you clearly hooked it up in an oxygen-free environment (which the adapter itself was clearly produced in). Where did you happen to pick this up?

there is nothing inherent to the use of an adapter that will cause issues any more than using an inch longer cable.

Which deity set up your system and built your adapter? Mine was handled by mortal men and it suffers from severe limitations.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Heh, perfection isn't even close to necessary to avoid signal degradation in a low-amperage signal system like VGA; little bits of copper pressed firmly against each other do that just fine.
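For a rough sense of scale on the "low amperage" point, here's a quick sketch. The 0.7 V / 75 ohm figures are the nominal analog video levels; the contact resistance is a ballpark assumption for a decent mated connector, not a measurement of any particular adapter, and this only speaks to resistive loss, not to reflections or bandwidth at high pixel clocks:

```python
# Estimate the current through a VGA video line and the voltage lost across
# one adapter contact, given an assumed contact resistance.
V_SIGNAL = 0.7               # volts, nominal full-scale analog video level
TERMINATION = 75.0           # ohms, standard video termination
CONTACT_RESISTANCE = 0.03    # ohms, assumed for a reasonably made contact

current = V_SIGNAL / (TERMINATION + CONTACT_RESISTANCE)   # amps
drop = current * CONTACT_RESISTANCE                       # volts lost at the contact

print(f"signal current: {current * 1000:.1f} mA")
print(f"drop across the contact: {drop * 1000:.2f} mV "
      f"({drop / V_SIGNAL * 100:.3f}% of full scale)")
```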
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Heh, perfection isn't even close to necessary to avoid signal degradation in a low-amperage signal system like VGA; little bits of copper pressed firmly against each other do that just fine.

You talk as if bandwidth has no impact. Look into how well oxygen-exposed copper connectors transmit signals at the levels we are discussing.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Exactly how much bandwidth do you think we are talking about, and how does that compare to the bandwidth on a gigabit LAN? Yeah.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Exactly how much bandwidth do you think we are talking about, and how does that compare to the bandwidth on a gigabit LAN? Yeah.

How much do I think? 8,556,380,160 bits per second (2048x1536@85Hz)- over eight times more bandwidth than gigabit. 16x12@85Hz is about five times more than gigabit- 1024x768@85Hz is only roughly twice as much as gigabit. 800x600@75Hz is about a gigabit per second (roughly).
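Those figures line up if you treat each pixel as 32 bits with no blanking overhead; that appears to be the assumption behind the 8,556,380,160 number (it's an equivalent-digital-bandwidth estimate for an analog signal, not an actual bit stream). A quick sketch of the same math:

```python
# Reproduce the raw-bandwidth comparison, assuming 32 bits per pixel and
# no blanking overhead, and compare each mode against gigabit Ethernet.
GIGABIT = 1_000_000_000      # bits per second
BITS_PER_PIXEL = 32          # assumption that matches the figures quoted

for w, h, hz in [(2048, 1536, 85), (1600, 1200, 85), (1024, 768, 85), (800, 600, 75)]:
    bps = w * h * hz * BITS_PER_PIXEL
    print(f"{w}x{h}@{hz}Hz: {bps:,} bps (~{bps / GIGABIT:.1f}x gigabit)")
```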
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
I'm not sure what you are doing with your math there, but something is clearly up, as we have had Ethernet video extenders capable of far more than 800x600@75Hz for years.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Note that a good share of the high-end workstation graphics cards are dual DVI, and a good number of them are attached to high-end CRTs using DVI>VGA adapters at high resolutions and refresh rates. If there is a real problem with those adapters, I've sure never seen it or heard about it myself.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I'm not sure what you are doing with your math there

Need another reference? BTW- their figures are @75Hz, which is why mine are slightly higher. It isn't like the math is very complicated either; not sure why anyone would have problems with it-

Video extension over gigabit is covered in this article.

So, assuming a system is designed to utilize low-cost gigabit-Ethernet components operating at 1250Mbps, it could support SXGA (1280×1024) at 75Hz refresh rate over three fibers or only SVGA (800×600) on one fiber.

The limitations of this technology come into play as the pixel rate and distance increase. The Cat 5 copper cabling has a certain amount of capacitance per foot, and that acts as a low-pass filter that reduces the signal-to-noise ratio of the differential analog video signal; electrical interference from motors and fluorescent lights also can be coupled into the line. Eventually the signal at the receiver end deteriorates to the point that random noise ("snow" effect) or bandwidth limitations become visible. Earth ground differences between transmitter and receiver can result in slow-moving hum bars on the display.

Note that a good share of the high-end workstation graphics cards are dual DVI, and a good number of them are attached to high-end CRTs using DVI>VGA adapters at high resolutions and refresh rates.

Are they pushing the levels I'm talking about? The falloff is pretty steep moving up from 1600x1200@85Hz to 100Hz, or to 2048x1536@85Hz. As is typical with analog signals, the integrity remains decent up until a given bandwidth, at which point there is a steep drop.
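As for the fiber counts in the quoted article, they do work out if you assume each color channel is digitized at 8 bits per sample and 8b/10b encoded onto a 1250 Mbps link; the article doesn't spell out its encoding, so treat this as a plausible reading rather than a spec:

```python
# Check whether R, G and B fit together on one 1250 Mbps fiber, or need one
# fiber per channel, under an assumed 8-bit sample / 8b/10b line coding.
LINK_RATE = 1_250_000_000    # bits per second per fiber
LINE_BITS_PER_SAMPLE = 10    # 8 data bits plus 8b/10b overhead

def fibers_needed(width, height, refresh_hz):
    per_channel = width * height * refresh_hz * LINE_BITS_PER_SAMPLE
    if 3 * per_channel <= LINK_RATE:
        return 1                 # all three color channels share one fiber
    return 3 if per_channel <= LINK_RATE else None

for name, (w, h, hz) in {"SXGA": (1280, 1024, 75), "SVGA": (800, 600, 75)}.items():
    print(f"{name} ({w}x{h}@{hz}Hz): {fibers_needed(w, h, hz)} fiber(s)")
```

That gives 1 fiber for SVGA and 3 for SXGA, matching the article.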
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
It's like using a KVM. A KVM degrades the quality as well. High-end setups should use dual-link DVI or BNC anyway. There are CRTs with DVI inputs.

On my 17" LCD, it's easy to tell the difference between VGA (auto-adjusted phase/clock) and DVI at 1280x1024. I could discern it any day if I was 2 feet from it. There's also a very noticeable contrast degradation. I bet you can reproduce that if you have a really long VGA cable and/or a subpar DVI->VGA converter.

On DVI-I, both the digital DVI and analog VGA signals are carried through the connector. On DVI-D, only the digital signal is carried.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BenSkywalker
I'm not sure what you are doing with your math there

Need another reference? BTW- their figures are @75Hz, which is why mine are slightly higher. It isn't like the math is very complicated either; not sure why anyone would have problems with it-

I thought you might be doing 2048x1536x24x85, but that is still over 2Gbps short of your 8,556,380,160 figure, so I wasn't rightly sure what you were throwing in there. But beyond that, I still don't follow the argument, as VGA is an analog signal and doesn't rightly have bits-per-anything, but rather waves. I don't get the article you linked either, as it talks about needing three strands of fiber for SXGA when I know there are video extenders that work well over copper Ethernet cable at least up to QXGA.