Dec 1, 2009
Just bought an Acer S243HL 24" LED-backlit monitor. I've hooked it up to my computer with a DVI-to-HDMI cable. The monitor doesn't have any DVI inputs--only a couple of HDMI jacks and an analog VGA port.
The problem is that I'm getting some pretty bad color banding. It's not noticeable all the time; I can only see it during certain transitions. For example, it's there when the Win 7 login screen fades out and the desktop fades in. The banding is also noticeable during movie playback--especially in dark scenes. For some reason, there's no banding on photographs.
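If anyone wants to reproduce it on demand instead of waiting for a fade, a full-screen grayscale ramp makes banding obvious immediately. Here's a rough sketch using Pillow (the filename and the hardcoded 1920x1080 resolution are just my choices for this monitor; adjust as needed):

```python
from PIL import Image

WIDTH, HEIGHT = 1920, 1080  # native resolution of the S243HL

# One row mapping 1920 columns onto 256 gray levels; on a clean link
# this shows smooth, even steps, while a broken one shows wide bands.
row = [x * 256 // WIDTH for x in range(WIDTH)]

img = Image.new("L", (WIDTH, HEIGHT))
img.putdata(row * HEIGHT)  # repeat the ramp for every scanline
img.save("gradient_test.png")
```

Open the saved image full-screen and compare it over HDMI vs. VGA.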
This only happens with HDMI; VGA color is nice and smooth. The thing is, VGA is less than optimal for 1080p.
So I moved the PC to my living room and connected it over HDMI to my plasma TV. Same problem--there was obvious color banding on the plasma screen, so now I know it's a video card problem or setting. I took a look at the 8800 GTX's control panel and didn't find much. RGB is the only HDMI color format it will let me select, and color depth is, of course, set at 32 bits.
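For what it's worth, 32-bit depth is just 8 bits per channel plus alpha, so I doubt that setting is the issue. One guess I keep coming back to: if the card is sending limited-range RGB (16-235, the TV convention) over HDMI and the display stretches it back to full range (0-255), then 220 input codes have to cover 256 output levels and some values simply never occur, which would look exactly like banding in gradients. A quick sketch of the arithmetic (the function name is mine, but 16-235 limited range is the standard video convention):

```python
# Expand a limited-range code (16-235) to full range (0-255).
def expand_limited_to_full(v: int) -> int:
    return round((v - 16) * 255 / 219)

# Count which full-range levels can never be produced.
produced = {expand_limited_to_full(v) for v in range(16, 236)}
missing = sorted(set(range(256)) - produced)
print(f"{len(missing)} of 256 levels never appear, e.g. {missing[:10]}")
```

That comes out to 36 missing levels, which would be plenty to cause visible steps in dark scenes and fades. If that's the cause, forcing full-range output (or telling the display to expect limited range) should fix it--but I don't see any such option in the control panel.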
Anyone know what might be wrong?