I have an ATI Radeon 9800 (np) with DVI output. The card currently drives a Samsung 955DF (19" flat CRT) through its VGA output, and it came with a DVI -> VGA adapter. Would there be any benefit to using the DVI output with the DVI -> VGA adapter instead for the 955DF? Right now I get slight flickering on the far right of the screen when Explorer or IE windows are up (bright white). I don't see it during games or when the image on screen is darker than the white background of IE/Explorer.
I half suspect the CRT is at fault here. I ran a display testing program a few months ago, and the monitor seems to have a hard time with bright images (the picture shrinks a tiny bit when the displayed image is bright). The program said this symptom indicates a problem with the monitor's power circuit. But I only really see it with Radeon cards; GF4 and GF FX cards didn't cause it. Do Radeon cards send that much stronger a signal to the monitor that I'd notice it with a Radeon and not a GeForce?
Thanks for any and all opinions.