OK,
I decided to try to work this issue out with nVidia before bringing it public, but their canned "we don't support our products" response has forced my hand. Here's the deal:
I've been running HDTV on my Sony 21" CRT for over a year now over FireWire from my cable box. I wasn't happy with a) the output always being letterboxed and b) being unable to see the cable box's menu on anything but DVI-D or component. With the Super Bowl and March Madness coming up quick, I decided to buy my very first TV and settled on a Sharp Aquos 32". It's got awesome blacks, a decent refresh rate, and very vivid colors.
Anyway, here's the big deal. I ordered an HDMI-DVI cable, since the Aquos has only one HDMI input for a digital signal. The computer picked it right up as "Sharp HDMI", and the drivers insist on running the display at 1080i; it's actually kind of hard to get 720p out of it, but that's OK. Needless to say, I can't figure out any way to force the TV to accept its native resolution over that input, so it actually runs at around 1100x700 with overscan compensation on, a far cry from the native 1366x768.
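(For anyone wondering where that ~1100x700 number comes from: it's just the driver shrinking the desktop inside the 720p frame so the TV's overscan doesn't crop it. A rough back-of-the-envelope sketch; the 14%/3% margins are my guesses from eyeballing the compensation slider, not anything nVidia publishes:)

# Rough sketch of what overscan compensation does to the usable desktop.
# The 14% horizontal / 3% vertical margins are guesses, not official numbers.

def compensated_desktop(width, height, h_overscan=0.14, v_overscan=0.03):
    """Shrink the signal's frame by the margins the TV is expected to crop."""
    return int(width * (1 - h_overscan)), int(height * (1 - v_overscan))

# 720p signal into the panel after compensation:
print(compensated_desktop(1280, 720))   # roughly (1100, 698)
# versus the panel's actual native grid:
print((1366, 768))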
So the big problem: with SLI enabled, pretty much any game from Quake 3 up through Quake 4, BF2, or FEAR locks the machine right up as soon as the display is initialized, or, in Quake 3's case, it complains that no OpenGL is available. With SLI off, the graphics driver forces the monitor into 1080i and then letterboxes the video.
What's worse is that when I first hooked this LCD up, it was getting a VGA signal during POST and the initial boot screens, and it actually looked pretty good. Now it's back to display-adapter scaling, and the TV shows 1080i no matter what's on screen, even with iTunes in full screen, where the VGA signal really looked much better.
Now, has anyone else been foolish enough to think an LCD TV would act more or less like a regular digital monitor and run into issues like this? Any help would be greatly appreciated.