The "downstairs" computer (as I call it) that I built about 18 months ago for family use was hooked to an aging CRT monitor, and I wanted to upgrade to a flat-panel LCD. I wasn't sure the wife and kids would enjoy/appreciate a widescreen monitor (and I only considered one because they actually seem to be cheaper), so I picked up a 19-inch "widescreener" at a local electronics store my friend manages, knowing I could take it back.
Shortly before hooking up that monitor, I went ahead and downloaded and installed the latest drivers and Catalyst version for the unit's X700 PRO card.
Shortly after hooking up that (19-inch widescreen) monitor (via VGA), it began flickering on and off, going in and out. I then noticed that there were TWO ATI icons in the system tray, and I thought, AHA - there are two different Catalyst versions running; I didn't uninstall the old before installing the new. So I uninstalled both, reinstalled the new, and unhooked and rehooked everything. The problem persisted. In fact, it was so bad that the computer would reboot itself, sometimes continuously in a cyclical pattern, usually never getting farther than the XP splash screen.
I decided that widescreen wasn't going to be the best for us and so boxed up the monitor and returned it. I ordered a 19-inch non-WS LCD from Newegg, and in the meantime hooked the old CRT monitor back up (via VGA). There were NO problems.
When the new monitor arrived, I hooked it up via DVI and the problem arose again. This time I uninstalled the drivers, pulled out the video card, hosed down the inside of the box and all the connection slots with canned air, plugged the card back in and reinstalled the latest Catalyst and drivers. That seemed to solve the problem for a while, but now it's come back. It seems to get worse the longer the computer is running, and eventually it will just constantly go in and out, the screen displaying "no signal."
I feel certain that there's nothing wrong with this monitor, given that the same thing happened with the widescreen monitor before it, and started as soon as I hooked this one up. The new monitor in question is an Acer, and it didn't come with any special drivers or setup disc. I feel sure I could solve the problem by wiping the hard drive and reinstalling XP from scratch (using XP w/SP2, BTW), but I just hate to have to do that.
I'm next going to try a driver removal tool that I read about in another thread in this subforum, and then I'll reinstall and see how that goes. But my question is: Could something actually be wrong with the video card? Or is there some setting somewhere that I'm missing or don't know about? Is this sort of problem common?
Sorry so darned long-winded. Thanks!
-abs