jswajsberg
Junior Member
A bit of background:
I have a Gigabyte DS3-965P (rev 1.1) motherboard I was futzing around with. I'd overclocked her in the past and was trying to squeeze more juice out of her for BioShock. Since I leave enabled all the settings that auto-recall factory defaults after a bad overclock, I'd never experienced any permanent consequences from a bad OC before.
However, after this particular overclock (in which the voltages were all upped within stable ranges, except +0.3 V for the RAM), my display refused to work. The computer loaded fine and booted all the way into Windows; I could even type in my login and see the hard drive churning away. Everything was stable - just no monitor.
After trying a few things, I returned my old X1900 AIW (thankfully still under warranty), thinking I might have blown it, and bought a new card, a Radeon X1950 Pro. The new card exhibited the same issue, eliminating the video card as the culprit.
As a precautionary measure, I also swapped in some old RAM and tried each stick one at a time, with no change: the system booted fine, just no display. My power supply, a 700 W GameXStream, is also almost new and has been extremely reliable, so I seriously doubt that's the problem.
SO - I plugged in the D-Sub cable and voila, the monitor works, albeit with a noticeably slower refresh, but overall nothing to write home about. Still, I want my DVI back.
Even though it's not my monitor, I tried the suggestions here, both the plug re-seating and the DVI recovery utility, to no avail.
I've cleared the CMOS, reinstalled drivers, etc., etc., ad nauseam. Now I write to you as a last resort. I'm stumped.
My dead DVI connection strikes me as an indirect, triggered result of the bad overclock rather than a direct consequence of something frying. The monitor itself might be the issue; I don't know.
Hope someone out there can help me. I'm looking forward to suggestions.