Hi folks,
I recently upgraded from my older 21" Sony CRT monitor, which saw its last days, to a new Acer AL-2223W 22" LCD monitor on my eVGA 8800GTX, using the DVI cable. I am running XP Pro with NVIDIA's latest drivers, and it drives my widescreen monitor at 1600x1200 instead of its native 1680x1050.
The funny thing is that the blasted display driver information on my computer states it's running at 1680x1050, but the monitor's own info screen (when I click the button on the monitor) shows 1600x1200. It's blurry as all heck, and I KNOW it's not running at 1680x1050. It practically fries my eyes after a while.
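In case anyone wants to check what XP itself is actually driving (ignoring both the NVIDIA panel and the monitor's OSD), here's a tiny Win32 sketch using only the documented EnumDisplaySettings call. I haven't run this on my box yet, so treat it as a sketch:

/* Ask Windows which display mode it is currently driving,
   independent of the NVIDIA control panel and the monitor's OSD.
   Build with any Win32 C compiler, e.g.: cl query_mode.c user32.lib */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* ENUM_CURRENT_SETTINGS returns the mode the OS is actually using */
    if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        printf("Current mode: %lux%lu @ %lu Hz, %lu bpp\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmDisplayFrequency, dm.dmBitsPerPel);
    } else {
        fprintf(stderr, "EnumDisplaySettings failed\n");
        return 1;
    }
    return 0;
}

If that prints 1600x1200 over DVI while the control panel claims 1680x1050, at least I'll know which readout to trust.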
I have read countless posts on this. Some say it's NVIDIA's drivers, but what I don't understand is that I tried my GF's older 22" LCD and it works perfectly. She runs it on her ATI X1900 XT and has never had an issue, and that monitor also works fine on my 8800GTX with the newest drivers.
If I switch to the VGA cable on this new LCD, it looks amazing, and now both readouts (the NVIDIA control panel on the computer and the monitor's own info screen) show 1680x1050. Text is sharp and all is well, but it will never look as good as my GF's does in its full native digital DVI glory.
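One thing I haven't tried yet is forcing 1680x1050 from code instead of picking it in the control panel. A minimal sketch with the standard Win32 ChangeDisplaySettings call (the CDS_TEST probe first is my own precaution, not something from any of those posts):

/* Try to force the panel's native 1680x1050 explicitly.
   CDS_TEST first asks the driver whether it will even accept
   the mode over the current connection, without switching. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = 1680;
    dm.dmPelsHeight = 1050;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;

    /* Validate only; nothing changes on screen */
    if (ChangeDisplaySettings(&dm, CDS_TEST) != DISP_CHANGE_SUCCESSFUL) {
        fprintf(stderr, "Driver rejected 1680x1050 over this connection\n");
        return 1;
    }
    /* dwFlags = 0: apply dynamically, no registry write, reverts on reboot */
    if (ChangeDisplaySettings(&dm, 0) == DISP_CHANGE_SUCCESSFUL)
        printf("Switched to 1680x1050\n");
    return 0;
}

If the driver rejects the mode over DVI but accepts it over VGA, that would at least narrow it down to how the card reads the monitor over the digital connection.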
I have heard I can revert to older NVIDIA drivers, specifically the 96 series, and that it will fix this. Is this true, and if so, why? What changed in the newest drivers that breaks scaling like this? Or is it that I am not running Vista? The new Acer I bought says it's made for Vista, and some posts are saying it won't work properly in XP.
Sigh… any ideas?
Thanks!