Ok, this is a somewhat complicated and ongoing problem but I'll try to keep it brief.
Basically, I have a 17" CRT monitor. After upgrading practically everything BUT the monitor (new mobo, video card, etc.), I noticed an immediate drop in quality. The image is FAR too bright and washed out (to the point where I have to turn the monitor's brightness almost down to zero and heavily adjust the gamma). Not only that, but there is a lack of clarity (almost like wearing glasses that are dusty or slightly smeared). This is very noticeable when watching DVDs. Colour reproduction is poor, and blacks are not dark or rich enough either (again, most obvious in DVDs and gaming).
In the past, image quality is something I've always taken for granted and never had a problem with. My previous video cards, in order, were a GeForce4, a GeForce 5900, and a GeForce 6800 NU. All of these cards used the AGP slot. The problem started once I switched over to my new system, which uses PCI Express. I had to replace the 6800 NU, so I first tried a 6600GT and then a 6800GT, both with the same result. I have since upgraded my motherboard and CPU, but the problem persists.
And yes, I know I'm going to get asked this a million times: I AM using the LATEST Nvidia drivers from their website.
I have absolutely no idea what the problem is. I'm thinking of getting a new video card (an 8600GT perhaps), but will I just be throwing money down the drain again? I am so fussy that I have simply stopped gaming altogether, because I can't stand the bright, washed-out look of the picture.
Once again, I have been using the same monitor for the past 3 years, so that is a constant. I noticed the 'downgrade' in image quality immediately when I upgraded, so it's not a result of the monitor wearing out over time. The ONLY variables I can think of, then, are these: first, to date I have not gotten good quality from any PCI Express card; and second (this might be a long shot) the type of memory is different. My old eVGA 6800 NU, which produced a perfect picture, used 'plain' DDR memory, while the newer cards use GDDR2/3. Could either of these things have any effect on image quality?
If I can't get some help, and soon, I may have to go 'old school' and return to a Pentium 4 system with an AGP motherboard. Oh dear...