nvidia driver problem, dvi to 20" widescreen

cubby1223

Lifer
May 24, 2004
13,518
42
86
Searching Google, it looks like this is a common problem, but there's not really a solution. I've got an evga 7900gs that had been connected fine through dvi to a 1920x1200 lcd screen, but I set up a new computer for another person and gave him this card. It's a 20" 1680x1050 widescreen monitor. With the newest nvidia drivers, the monitor is detected as a 1680x1050 monitor, but the drivers stretch the screen too wide, perhaps outputting a 16:9 aspect ratio instead of the proper 16:10? I don't know, but there's an inch of the desktop off the side of the screen, and when the mouse is brought over to the side, the desktop scrolls over to show the rest.
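[Editor's note: the 16:9-vs-16:10 guess above is just arithmetic, and a quick sketch can check it. This is not part of the original post, only an illustration: reducing each resolution by its greatest common divisor gives the aspect ratio, and it shows 1680x1050 really is 16:10 (8:5 reduced), while 1600x1200 is 4:3.]

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return (width // d, height // d)

# 1680x1050 reduces to 8:5, i.e. 16:10
print(aspect_ratio(1680, 1050))  # (8, 5)
# 1920x1080 is the common 16:9 HD shape
print(aspect_ratio(1920, 1080))  # (16, 9)
# 1600x1200 (the signal the driver turns out to send, per later posts) is 4:3
print(aspect_ratio(1600, 1200))  # (4, 3)
```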

Not sure what to do short of ordering a dvi > vga adapter and seeing how that works.

I previously had a Radeon 7000 connected to this monitor, through vga, outputting the proper 1680x1050 desktop.

Anyone here seen this and solved it? Seems a common problem from google, but no answers.




Driver bug, I guess, in the available 162.18 drivers. I found the 160.02 drivers and the resolution now works as it should.
 

Deinonych

Senior member
Apr 26, 2003
633
0
76
Go into the monitor's display settings (on the monitor itself) and look for an "auto sync" command. It's possible that it's not detecting the signal properly.
 

cubby1223

Lifer
May 24, 2004
13,518
42
86
Google searching turned up something new that I confirmed with the monitor's OSD. Something's screwy: there's a mixup somewhere where the driver sets the 1680x1050 resolution but actually outputs a 1600x1200 signal. The monitor identifies the incoming signal as 1600x1200. All other resolutions work fine, confirmed by the monitor's OSD, but not 1680x1050. I doubt it's anything wrong with the monitor, since it worked fine with the Radeon 7000.
 

videopho

Diamond Member
Apr 8, 2005
4,185
29
91
Originally posted by: cubby1223
Google searching turned up something new that I confirmed with the monitor's OSD. Something's screwy: there's a mixup somewhere where the driver sets the 1680x1050 resolution but actually outputs a 1600x1200 signal. The monitor identifies the incoming signal as 1600x1200. All other resolutions work fine, confirmed by the monitor's OSD, but not 1680x1050. I doubt it's anything wrong with the monitor, since it worked fine with the Radeon 7000.

It's the freaking driver bug.
I had a similar issue.
I switched back to the Nov '06 release driver for now.
 

metalmania

Platinum Member
May 7, 2002
2,039
0
0
Yes, it's the stupid Nvidia driver. My 8500GT showed the same behavior as the OP's 7900gs. Then I switched back to ForceWare 165.01 or 163.44, and the problem was solved. :) Both 163.44 and 165.01 are beta drivers, but they support HD acceleration in Windows XP.