Why won't DVI work w/ my monitor?

magreen

Golden Member
Dec 27, 2006
Hi

I just got this new Diamond 2600XT video card and was hoping to hook it up to my Acer 20" LCD through the monitor's DVI-D input. But whenever I hook it up that way, the monitor says no signal. The only thing that works is using the DVI->VGA adapter and a VGA cable to the analog input on the monitor. The card is a Diamond 2600XT with two DVI-I outs. The monitor has a VGA in and a DVI-D in. I thought DVI-I output was supposed to work with a DVI-D monitor? Do I have to somehow configure the card to output digital, since the DVI-I output supposedly carries both digital and analog signals?

I see in the monitor's controls you can switch from analog to digital... but you can only access those controls when the monitor has a signal. The whole control OSD is unavailable when there's no signal. And when it has an analog signal, it won't let me switch to digital!

How do I make it work? Thanks.

UPDATE: Got it working, though I'm not sure why the fix worked. Scroll down for details.
 

magreen

A source button on the monitor? I don't think so. I wouldn't even know where to look for it. I'll try to look in the manual.

It's an Acer AL2016W, btw.
 

magreen

Alright, I RTFM. No help -- there's practically nothing in there about DVI. Any ideas?
 

myocardia

Diamond Member
Jun 21, 2003
Originally posted by: magreen
I see in the monitor's controls you can switch from analog to digital... but you can only access those controls when the monitor has signal. The whole control osd is unavailable when it has no signal. And when it has an analog signal it won't let me switch it to digital!

So change it to digital first, then change cables.
 

magreen

Tried that... It won't let me. I select digital, the screen goes black for a second, and then it returns to analog mode as before.
 

taltamir

Lifer
Mar 21, 2004
This is such a basic thing that the manual or configuration isn't the issue. Either the video card or the monitor is defective. Try plugging in a different monitor to find out which one.
 

will889

Golden Member
Sep 15, 2003
Try the other DVI port on the card, and also try another DVI cable. If it still doesn't work, RMA the card.
 

magreen

That would suck if it's faulty hardware. I just got the card and am about to mail in the rebate. How do you handle an RMA when you only have 30 days or so to mail in the MIR? You can't RMA it once you've cut out the UPC, right?
 

will889

You can RMA with the UPC missing. Most cards have rebates that require sending in the UPC, but you can still RMA within the warranty period. Just tape the UPC back on the box if you want, or not -- up to you.
 

magreen

Good news - I got it working!

I did some searching on Google and found something on Yahoo Answers that helped: connect each of the card's two outputs to an input on the monitor -- one VGA, the other DVI -- and when a second monitor shows up in display settings, click "Extend my desktop onto this monitor."

Well, I tried that, and at first nothing happened. Then the monitor switched to something like a black screen with a mouse pointer and returned to analog, then did it again and showed an empty Windows wallpaper (no mouse pointer) -- and the monitor reported it was receiving digital input! But if I removed either cable, Windows would automatically stop extending the desktop, and DVI stopped working from either output.
Did it again and tried switching the DVI cable to the first port on the card -- no luck.

But I just kept trying -- with the Windows display settings as well as the display settings in Catalyst Control Center (CCC). I also fiddled with the input-mode setting on the monitor -- I think switching from digital back to analog made Windows stop extending onto the second monitor, but I don't remember clearly. At one point, with both cables plugged in, CCC reported two monitors on the first video adapter in clone mode instead of an extended desktop. I checked the monitor, and it reported we were in digital input mode! At that point I was able to remove the VGA cable from output 1 and keep using the computer through DVI alone, and CCC reported we were no longer in clone mode.

So of course I wasn't satisfied... I had to try DVI through output 1. I moved the DVI cable from output 2 to output 1. Still worked! To check that it was permanent, I shut down the computer and turned off the monitor. Powered up again, and everything was perfect through DVI on output 1.

Bizarre, huh? I'm just glad it's working now! But does anybody know what happened and why?
 

will889

It seems you had to force the monitor to "see" DVI manually by alternating between analog/DVI and extended desktop, and to me it sounds like it's actually a driver issue with that particular card. I say that because over the last few weeks I've done more than a few builds, two of them with used ATI cards, and I didn't have that issue. Actually, one was onboard 780G with DVI, and the other an HD3850. Anyway, glad you got it solved with troubleshooting. Perhaps you might email AMD/ATI and see if they've had similar reports about that card.