DVI Glitching

cjmeyer

Junior Member
Jun 6, 2006
I recently replaced my motherboard, graphics card and power supply.

My system specs are as follows:

AMD Athlon 64 3200+ (2.0 GHz)
1GB DDR (Corsair TwinX XMS - 2x512MB)
Abit AN832X motherboard
BFG GeForce 7600GT PCIe
Antec ATX2.0 TruePower power supply
ViewSonic VP201s (20.1" LCD)

I was testing with an old CRT running at 1280x1024 @ 60Hz and everything seemed fine. I then connected my ViewSonic VP201s monitor via DVI (with a Belkin DVI-D cable). When I booted up World of Warcraft, I started to see video glitching.

The glitching consists of 1-pixel-tall, multi-pixel-wide red/green/blue lines jumping around my screen at random locations and times. The first thing I did was try to determine whether this was an overheating problem. I downloaded ATITool (I know it is for ATI, but I was only looking to really stress the video card) and let it peg my video card as best it could. I monitored my video card temp and it was reaching 70+ deg C. This seemed a little high, but we have been having 90+ deg F weather here and my air conditioning was out.

Either way, I decided to return it. I had gotten an XFX 7600GT first and replaced it with the BFG 7600GT listed above. Once it was installed and running WoW, I saw the same glitching. Again, I monitored my video card temps, and they were lower: with ATITool running I was seeing max temps of about 62 deg C. However, the ambient temp was also much lower than when I was testing the XFX card.

I then tried a few different video settings. If I use DVI at a resolution less than 1600x1200, I get no glitching. If I use the DVI-to-SVGA adapter and drive my LCD with the VGA signal, I can run at up to 1600x1200 @ 60Hz with no glitching. I have only been able to produce the glitching using DVI at a resolution of 1600x1200 (the maximum supported by single-link DVI).
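
For anyone wondering why 1600x1200 is exactly the breaking point, here is a rough back-of-the-envelope check. The sketch below uses the standard VESA total timings for 1600x1200 @ 60Hz (2160x1250 including blanking); if your monitor negotiates slightly different blanking the numbers shift a little, but the conclusion is the same:

# Rough check: why 1600x1200 @ 60 Hz sits right at the single-link DVI limit.
# Total timings (active + blanking) are the standard VESA numbers for this mode.

H_TOTAL = 2160   # horizontal total pixels (1600 active + blanking)
V_TOTAL = 1250   # vertical total lines (1200 active + blanking)
REFRESH = 60     # Hz

pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH / 1e6
print(f"Pixel clock: {pixel_clock_mhz:.1f} MHz")        # 162.0 MHz

SINGLE_LINK_LIMIT_MHZ = 165  # maximum TMDS clock for single-link DVI
print(f"Headroom: {SINGLE_LINK_LIMIT_MHZ - pixel_clock_mhz:.1f} MHz")  # 3.0 MHz

So at this mode the transmitter is running within about 2% of the single-link ceiling, which would explain why a marginal link only falls apart here and not at lower resolutions.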

A couple of other notes:

If I run ATITool and just let its rendered 3D image run, I will get glitching anywhere and everywhere on my screen. If I am not doing anything 3D-graphics intensive, I do not get any glitching.

The glitching seems to be less prevalent in lighter areas. When I was playing WoW, I would see the most glitching on very dark textures. For those of you who have played WoW: I got a lot of glitching at the top of my monitor at the login screen... which is very near black.

I am at a complete loss as to what the possible problems could be. Any suggestions or insights are greatly appreciated.
 

kpb

Senior member
Oct 18, 2001
Since you've replaced the video card, it might be a bad cable. Have you tried a different DVI cable?
 

cjmeyer

Junior Member
Jun 6, 2006
This is what I was thinking too. I replaced the DVI cable that came with my monitor with the one from Belkin. No difference.

I have just tried out a friend's monitor... which also happens to be a VP201s. I see SOME glitching, but not NEAR the amount I see on mine during gameplay. Could it be a monitor going bad? That just seems weird to me.
 

akugami

Diamond Member
Feb 14, 2005
Have you tried running 1600x1200 over DVI on a different monitor, one that is not the same brand?
 

cjmeyer

Junior Member
Jun 6, 2006
I have tried running 1600x1200 on my ViewSonic VP201s, I have tried running 1600x1200 on my friend's VP201s (so different physical monitors, same brand/family), and I have tried a 17" Dell CRT at 1280x1024 @ 60 Hz.

The glitching only occurred on the LCDs (VP201s).

EDIT: I have been trying to locate a different brand of LCD to try out, but I need one that can handle 1600x1200 and there just aren't a lot of people who have them... *sigh*.
 

secretanchitman

Diamond Member
Apr 11, 2001
Bad drivers? You've tried a different monitor and DVI cable... how about analog? Does the glitching appear there?
 

cjmeyer

Junior Member
Jun 6, 2006
Yes, I have tried using analog (SVGA). SVGA works fine as far as the glitching is concerned; I was able to run both the CRT at 1280x1024@60Hz and my LCD at 1600x1200@60Hz over SVGA without any glitching. I only see the glitching when I am using DVI at 1600x1200.

I have not used DVI-A; I have only tried DVI-D and SVGA. I am not sure what the difference between DVI-A and SVGA is other than the cable, but I will see if I can try it. I'm not sure my cable will support it.

The nature of the problem doesn't strike me as a driver issue. I would think that if this were a driver issue, I would see more of a pattern to the problem, such as bad textures or polygons, or random splotches of color. Because I am only seeing random horizontal lines that are one pixel tall, in only red, blue, and green (no orange, magenta, etc.), I tend to lean towards frame errors: errors that occur when reading the frame buffer memory on the graphics card to produce the video stream that reproduces the image on the monitor.
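
To illustrate what I mean (just a toy sketch, nothing to do with the actual hardware or drivers): as far as I know, DVI sends red, green, and blue on three separate TMDS channels, so a transmission error on one channel corrupts exactly one color component. That would match the pure red/green/blue lines, and it would also explain why dark areas look worse, since a bit flipping high in a near-black pixel is far more visible than in a bright one:

# Toy illustration (not real driver/hardware code): DVI carries R, G, and B on
# three separate TMDS channels, so a link error hits exactly one color component.

def corrupt_channel(pixel, channel, bit=7):
    """Flip one bit in one color component of an (r, g, b) pixel."""
    values = list(pixel)
    values[channel] ^= (1 << bit)  # simulate a single-bit transmission error
    return tuple(values)

near_black = (8, 8, 8)                          # a dark login-screen pixel
print(corrupt_channel(near_black, channel=0))   # (136, 8, 8): bright pure-red pixel
print(corrupt_channel(near_black, channel=1))   # (8, 136, 8): bright pure-green pixel

A burst of such errors across consecutive pixels on one scanline would look exactly like the one-pixel-tall colored dashes I described.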

Another piece of information: I have also tried enabling and disabling vertical sync while gaming. No effect; I got glitching with and without vertical sync enabled.

How likely do you think it is that this could be a driver problem? I have the latest drivers from nVidia installed, and other than the horizontal line glitches, I don't seem to have any rendering problems... just what look like scan-out problems.
 

cjmeyer

Junior Member
Jun 6, 2006
Correct me if I'm wrong...

My understanding is that DVI-A is a means of providing an SVGA signal on a DVI port. Such a port is DVI-I (it provides both a DVI-A and a DVI-D signal). DVI-A is straight VGA signaling, requiring only a 'gender' change from the DVI plug to a DB-15 plug.

So, if my video card has a DVI-I port, then driving my monitor with an SVGA signal through the DVI-to-SVGA adapter that came with my video card is the same thing as driving it with a DVI-A cable.

Let me clarify what I have done. I have connected my LCD using DVI-D at 1600x1200@60Hz and I get glitching. If I connect the DVI-to-VGA adapter to my video card and then use an SVGA cable to connect my monitor through the adapter, I can run at 1600x1200@60Hz without glitching.

I do not have a DVI-A cable, but I think that using one would be identical to using the DVI-to-SVGA adapter and an SVGA cable.
 

cjmeyer

Junior Member
Jun 6, 2006
I've heard that some ATI cards have an issue with high-resolution LCDs and that the Catalyst drivers have a 'Reduce DVI Frequency for High Resolution LCDs' setting, or something like that.

Does anyone here know more about that? I'm thinking I might be experiencing something similar with my GeForce card. If so, does anyone know of a similar setting for the nVidia ForceWare drivers?
 

Peter

Elite Member
Oct 15, 1999
It's actually the NVidia chips that quite consistently produce signal integrity issues on high-resolution DVI.

Two solutions: (1) Use an ATi card, or (2) use an NVidia card that DOESN'T use the chip's own DVI transmitter, but a discrete transmitter chip to overcome the (quite well known) problem.

One workaround: Enable said "alternate DVI timing" mode to reduce the signal frequency - at the expense of the screen redraw rate dropping from 60 to around 48 Hz.
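
Roughly, using the standard 2160x1250 total timing for 1600x1200 (the exact blanking may differ, but the ballpark holds):

# Rough sketch of what the "alternate DVI timing" workaround buys you.
# Assumes the standard 2160x1250 total timing for 1600x1200.

H_TOTAL, V_TOTAL = 2160, 1250

for refresh_hz in (60, 48):
    clock_mhz = H_TOTAL * V_TOTAL * refresh_hz / 1e6
    print(f"{refresh_hz} Hz -> TMDS clock {clock_mhz:.1f} MHz")

# 60 Hz -> 162.0 MHz (right at the single-link edge)
# 48 Hz -> 129.6 MHz (comfortable margin for a marginal transmitter)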
 

cjmeyer

Junior Member
Jun 6, 2006
Peter,

Thanks man, that was kind of the impression I was getting from some further research... I've been relentlessly searching the web for the last week and a half, lol.

Do you happen to know offhand of any nVidia cards that use discrete DVI (TMDS) transmitters?
 

Peter

Elite Member
Oct 15, 1999
Nope ... you just found the reason why I keep putting ATI cards into home and office computers :D

First of all, try finding that "alternate DVI" operational mode. It's buried in the NVidia control panel /somewhere/.
 

DasFox

Diamond Member
Sep 4, 2003
cjmeyer, or anyone for that matter: can someone get a screenshot of what this looks like?

Since I run Nvidias, I'd be interested in what is going on here; I don't think I have ever experienced anything like this using Nvidia cards.

THANKS