
1680x1050 not working

math20

Member
I had 1680x1050 working on my 22" Westinghouse display, but the highest I can set it to now is 1600x1200.

I am running an 8800 GTS and Windows Vista 32-bit. What could be the problem?
 
Is 1680x1050 listed in the available resolutions? If not, have you tried creating a 1680x1050 "custom" resolution in the NVIDIA Control Panel?
 
I had the same problem and used RivaTuner for this: I added that resolution to the drivers with it.
 
I tried reinstalling the drivers, I tried both the beta drivers and the current NVIDIA drivers, and I also tried RivaTuner, and none of it worked! Do I just have to get a DVI cable, or what is the deal?
 
Originally posted by: math20
Do I just have to get a DVI cable or what is the deal?
There you have it. I should have asked before. Get a DVI cable and it will solve your problem: you will be able to add custom resolutions and they will "stick".
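For anyone wondering why DVI makes the difference: over DVI the driver reads the monitor's EDID and can drive any mode whose pixel clock fits under the single-link limit (165 MHz), and 1680x1050 at 60 Hz fits comfortably. A rough sketch of that check, assuming the blanking totals from the standard CVT reduced-blanking modeline for this mode (these numbers are illustrative, not something the driver exposes):

```python
# Hypothetical check: does 1680x1050@60 fit within single-link DVI bandwidth?
# hblank=160 px and vblank=30 lines are assumed from the CVT reduced-blanking
# modeline commonly used for this mode.

SINGLE_LINK_DVI_MAX_MHZ = 165.0

def pixel_clock_mhz(hactive, vactive, refresh_hz, hblank=160, vblank=30):
    """Pixel clock = total pixels per frame (active + blanking) x refresh rate."""
    htotal = hactive + hblank
    vtotal = vactive + vblank
    return htotal * vtotal * refresh_hz / 1e6

clock = pixel_clock_mhz(1680, 1050, 60)
print(f"{clock:.1f} MHz")  # ~119 MHz, well under the 165 MHz single-link limit
```

So bandwidth isn't the constraint here; the analog VGA path just doesn't let the custom mode persist the way the DVI/EDID path does.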

 
I can't add custom resolutions in the first place, though, because I am running Vista. Also, I had 1680x1050 working before, but it does not work now for some reason. What could have changed between then and now to cause this problem?
 