What's wrong with the image quality on my PC??

Davez621

Member
Jul 23, 2005
Ok, this is a somewhat complicated and ongoing problem but I'll try to keep it brief.

Basically, I have a 17" CRT monitor. After upgrading practically everything BUT the monitor (new mobo, video card, etc), I've noticed an immediate drop in quality. The image is FAR too bright and washed out (to the point where I have to turn the brightness of the monitor almost down to zero, and majorly adjust the gamma). Not only that, but there is a lack of clarity (almost like wearing glasses that are dusty or with a slight smear). Watching DVDs, this is very noticable. Colour reproduction is poor, and blacks are not dark or rich enough either (again, in DVDs and gaming this is most obvious).

In the past, image quality is something I've always taken for granted and never had a problem with. My previous video cards, in this order, were a GeForce4, a GeForce FX 5900, and a GeForce 6800 NU. All of these cards used the AGP slot. The problem started once I switched over to my new system, which uses PCI Express. I had to replace the 6800 NU, so first I tried a 6600GT, and then a 6800GT, both with the same result. I have since upgraded my motherboard and CPU, but the problem persists.

And yes, before I get asked this a million times: I AM using the LATEST Nvidia drivers from their website.

I have absolutely no idea what the problem is. I'm thinking of getting a new video card (an 8600GT perhaps), but will I just be throwing money down the drain again? I am so fussy that I have simply stopped gaming altogether, because I can't stand the bright, washed-out look of the picture.

Once again, I have been using the same monitor for the past 3 years, so that is a constant. I noticed the 'downgrade' in image quality immediately when I upgraded, so it's not a result of the monitor wearing out over time. The ONLY variables I can think of, then, are these: first, to date I have not experienced good quality using a PCI Express card; and second (this might be a long shot), the type of memory is different. My old eVGA 6800 NU, which produced a perfect picture, used 'plain' DDR memory, while the newer cards use GDDR2/3. Could either of these things have any effect on image quality?

If I can't get some help, and soon, I may have to go 'old school' and return to a Pentium 4 system with an AGP motherboard. Oh dear.. :(
 

Davez621

Member
Jul 23, 2005
32-bit color (sic) quality, 1024x768 resolution. The same as I've been running for years...
 

nitromullet

Diamond Member
Jan 7, 2004
I doubt that this is a hardware issue. My guess is that you had the color/brightness settings set up the way you liked them on your old install, but when you upgraded your system to PCIe (along with a fresh install of Windows) you lost those settings. You might want to try adjusting the desktop color settings in the NVIDIA control panel to see if that helps.
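
If you'd rather rule out a skewed gamma LUT directly, here's a minimal sketch (assuming Windows and Python with ctypes -- neither of which you've said you have) that resets the card's gamma ramp to the identity mapping. If the picture visibly changes the moment you run it, the driver's color settings were the culprit:

```python
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32
user32.GetDC.restype = ctypes.c_void_p
user32.ReleaseDC.argtypes = [ctypes.c_void_p, ctypes.c_void_p]
gdi32.SetDeviceGammaRamp.argtypes = [ctypes.c_void_p, ctypes.c_void_p]

# The ramp is one 256-entry table of 16-bit values per channel (R, G, B).
GammaRamp = (ctypes.c_ushort * 256) * 3

def reset_gamma_to_identity():
    hdc = user32.GetDC(None)  # device context for the whole screen
    try:
        ramp = GammaRamp()
        for channel in range(3):
            for i in range(256):
                ramp[channel][i] = i * 257  # map 0..255 linearly onto 0..65535
        if not gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp)):
            raise OSError("SetDeviceGammaRamp failed")
    finally:
        user32.ReleaseDC(None, hdc)

if __name__ == "__main__":
    reset_gamma_to_identity()
```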

Let us know how it goes.
 

Davez621

Member
Jul 23, 2005
Well, your advice is sound, but you are forgetting that I had my old PC for nearly 3 years, and in that time I reformatted multiple times, so my settings were always reset. I have always had perfect image quality on the default settings with all of my previous PC and video card combinations. It is only now that I am experiencing this problem, so I am convinced it is hardware related. Adjusting the color settings in the control panel does not have the desired effect. It partially solves the brightness problem, but not the other issues, such as the lack of clarity of text / DVD picture quality, and so forth.

I'll admit that to the average person (well, maybe 9/10 people or more), my PC looks fine. But I am very fussy and I can notice the difference. Perhaps it's something to do with the digital-to-analog conversion.. as I am using VGA, and video cards are all digital internally these days, is there any degradation from the conversion?
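
For what it's worth, here's a rough back-of-the-envelope estimate of what the card's DAC actually has to drive (plain Python; the 1.4 blanking-overhead factor is a nominal VESA-style assumption, not an exact figure). The higher the pixel clock, the more the DAC, the output filters, and the cable quality matter to the analog signal:

```python
# Rough pixel-clock estimate for a few VGA modes. The 1.4 factor is a
# nominal allowance for horizontal/vertical blanking (an assumption).
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.4):
    return width * height * refresh_hz * blanking_overhead / 1e6

for w, h, hz in [(1024, 768, 60), (1024, 768, 85), (1600, 1200, 85)]:
    print(f"{w}x{h} @ {hz} Hz ~ {pixel_clock_mhz(w, h, hz):.0f} MHz")
```

At my 1024x768 that's well under 100 MHz, which even a modest DAC should manage cleanly, so maybe the conversion isn't the whole story.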
 

Martimus

Diamond Member
Apr 24, 2007
I have never had a 6000-series nVidia card, but I have read many complaints on this forum about their image quality. It was supposedly fixed in the 8000 series, though. That may be the issue.
 

BassBomb

Diamond Member
Nov 25, 2005
Originally posted by: Martimus
I have never had a 6000-series nVidia card, but I have read many complaints on this forum about their image quality. It was supposedly fixed in the 8000 series, though. That may be the issue.

Did you even read his issues?


Anyway, @OP, I would check the nVidia control panel to see if the brightness/contrast or even the Video Overlay settings somehow changed.
 

nullpointerus

Golden Member
Apr 17, 2003
Yeah, with some nVidia drivers my TV and DVD playback would become purple (by default). They've fixed/broken this multiple times IIRC, so it's worth looking into.

Please look at the settings BassBomb is talking about and let us know what you see (or upload a screenshot to imageshack.us as a JPEG -- Paint can do the file conversion). A screenshot would also let people verify whether the problem is on the video card (i.e. we could see it) or outside it (i.e. the digital-to-analog converter, the cable, or something else we cannot see).
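
If Paint is a hassle, a two-liner will do the grab and the conversion (this assumes Windows and the Python Imaging Library installed, which you may not have):

```python
from PIL import ImageGrab  # Python Imaging Library; screen grab works on Windows

ImageGrab.grab().save("screenshot.jpg", "JPEG", quality=95)
```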

Some older CRTs do have problems with becoming too bright and washed out; however, it's way too early to assume that's the problem. If nothing else resolves your issue and the problem is not present in any screenshots that you take, then I'm confident one of the CRT aficionados here will be able to help you further.
 

Davez621

Member
Jul 23, 2005
Martimus - as I mentioned, the 6800 NU AGP was giving me a perfect picture, so it can't be a fault in the 6000 series.
nitromullet - very good thinking. I had actually considered that very possibility. If that's the case, a current-generation chip could be the fix.
nullpointerus - I don't think a screenshot will be any help, because a screenshot captures the image in digital-only form, straight from the computer's memory. That is, it captures the image before it is output to the display device. Hopefully you can see why this is useless: the image will appear different to every single person, because every single person's setup is different. The only thing I can think of is taking a *photograph* of the screen, but I'm not sure that would come out very well.
 

Davez621

Member
Jul 23, 2005
Unfortunately, I much prefer the superior image quality of a CRT (and its ability to display any resolution) compared to an LCD. I don't plan on swapping over at any point in the future. But I don't want to get into a CRT vs. LCD debate. :p
 

kmmatney

Diamond Member
Jun 19, 2000
Have you tried the monitor on another computer, just to make sure there isn't something funny that happened recently with the monitor? My old CRT became too washed out before I replaced it. In fact, the symptom was that it would be extremely washed out for about 15 minutes, after which it warmed up and the display was better (but still not perfect).
 

Davez621

Member
Jul 23, 2005
Originally posted by: kmmatney
Have you tried the monitor on another computer, just to make sure there isn't something funny that happened recently with the monitor? My old CRT became too washed out before I replaced it. In fact, the symptom was that it would be extremely washed out for about 15 minutes, after which it warmed up and the display was better (but still not perfect).

Yes, I have used it on several of my PCs. Firstly, my old PC (up until 2005): no problems. Then, my 'new' PC (2005-2008): this was the one with PCIe, and where the problem began. Now, I am using it on another, newer, PC. Again, same problem.

The monitor is no piece of junk. It is an LG Flatron F700-series, purchased in early 2005, *before* I bought my new PC. Its tube is one of the best for picture quality and rivals (and IMO exceeds) the Sony Trinitron.

I need to stress the fact that the picture quality did not deteriorate over time, gradually. It was immediate. As in, from the moment I booted up the new PC in '05.
 

Binky

Diamond Member
Oct 9, 1999
Every time I encounter a problem like this, one that seems easy to fix but isn't, I take the easy way out and buy new parts. If I had this issue, I'd buy an ATI card. It may just solve your problem with no further hassles.
 

Slugbait

Elite Member
Oct 9, 1999
Originally posted by: Binky
...I take the easy way out and buy new parts. If I had this issue, I'd buy an ATI card.

I wouldn't.

Davez, you don't mention what your refresh rate is.

But I digress. I once had a similar problem with one of my Trinitrons. After the monitor was fully warmed up, I went into the onboard menu and ran the Image Restoration function, which cleared it up. It's worth checking whether your monitor's menu has something similar.
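
Since you're on Windows, here's a minimal sketch (assuming Python with ctypes is available -- an assumption on my part) that prints the current mode and refresh rate, so we know exactly what the card is driving. A CRT at 60 Hz both flickers and can look worse than the same tube at 85 Hz:

```python
import ctypes

ENUM_CURRENT_SETTINGS = -1  # ask for the mode currently in use

class DEVMODEW(ctypes.Structure):
    """Win32 DEVMODEW structure, display-device union branch flattened."""
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
        ("dmICMMethod", ctypes.c_ulong),
        ("dmICMIntent", ctypes.c_ulong),
        ("dmMediaType", ctypes.c_ulong),
        ("dmDitherType", ctypes.c_ulong),
        ("dmReserved1", ctypes.c_ulong),
        ("dmReserved2", ctypes.c_ulong),
        ("dmPanningWidth", ctypes.c_ulong),
        ("dmPanningHeight", ctypes.c_ulong),
    ]

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                             ctypes.byref(dm)):
    print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight}, {dm.dmBitsPerPel}-bit, "
          f"{dm.dmDisplayFrequency} Hz")
```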
 

bobsmith1492

Diamond Member
Feb 21, 2004
Maybe you bumped the monitor sharply in the process and knocked something out of alignment? Do you have another monitor you could try for a bit?
 

SniperDaws

Senior member
Aug 14, 2007
Originally posted by: Binky
Every time I encounter a problem like this, one that seems easy to fix but isn't, I take the easy way out and buy new parts. If I had this issue, I'd buy an ATI card. It may just solve your problem with no further hassles.

My ATI 1800XL All In Wonder has the worst TV playback quality I've ever seen.

 

Slugbait

Elite Member
Oct 9, 1999
Originally posted by: SniperDaws
Originally posted by: Binky
Every time I encounter a problem like this, one that seems easy to fix but isn't, I take the easy way out and buy new parts. If I had this issue, I'd buy an ATI card. It may just solve your problem with no further hassles.

My ATI 1800XL All In Wonder has the worst TV playback quality I've ever seen.

All ATI cards of that vintage have the worst DVD playback image quality as well; it's been well documented. However, I don't know of anybody who has compared the image quality of the current AMD cards against nVidia's.

As an aside, Sniper, you do know that your AIW TV features are no longer supported, and will probably never work on Vista, right?
 

Binky

Diamond Member
Oct 9, 1999
The image quality on the HD 2000-series ATI cards is extremely good. In my experience, it's better than Nvidia's. The drivers in Vista are also very stable for me (more stable than Nvidia's).
 

SniperDaws

Senior member
Aug 14, 2007
Originally posted by: Slugbait
Originally posted by: SniperDaws
Originally posted by: Binky
Every time I encounter a problem like this, one that seems easy to fix but isn't, I take the easy way out and buy new parts. If I had this issue, I'd buy an ATI card. It may just solve your problem with no further hassles.

My ATI 1800XL All In Wonder has the worst TV playback quality I've ever seen.

All ATI cards of that vintage have the worst DVD playback image quality as well; it's been well documented. However, I don't know of anybody who has compared the image quality of the current AMD cards against nVidia's.

As an aside, Sniper, you do know that your AIW TV features are no longer supported, and will probably never work on Vista, right?


Yeah, I'll be stealing my TV card back when I get a 3870.
 

Davez621

Member
Jul 23, 2005
Tried changing the video card yet again (8600GT) - no improvement. I also acquired another monitor of the same make/model, and got the same result. The way I'd describe the picture is as if it has undergone one 'generation' of quality loss, kind of like using an LCD screen hooked up by VGA: unless you had a basis for comparison, you might not even notice, but once you've compared it to DVI, you immediately do.

**UPDATE** I think I fixed the problem. It's not the monitor, nor the video card; the drivers seem to be the problem. Basically, it's the difference between using the latest Nvidia drivers from their website and the drivers that shipped with the video card. I never thought this would make a difference at all, but it seems to have. The drivers that came with the 6800GT, although dated, appear to produce a better quality image than those available on the Nvidia website.
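
For anyone who wants to check this objectively rather than by eye, here's a minimal sketch (again assuming Windows and Python with ctypes) that dumps the gamma ramp the driver currently has loaded. Run it once under each driver version and diff the two output files; if the defaults really differ, it will show up here:

```python
import ctypes
import sys

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32
user32.GetDC.restype = ctypes.c_void_p
user32.ReleaseDC.argtypes = [ctypes.c_void_p, ctypes.c_void_p]
gdi32.GetDeviceGammaRamp.argtypes = [ctypes.c_void_p, ctypes.c_void_p]

GammaRamp = (ctypes.c_ushort * 256) * 3  # R, G, B lookup tables

def dump_gamma_ramp(path):
    hdc = user32.GetDC(None)  # device context for the whole screen
    try:
        ramp = GammaRamp()
        if not gdi32.GetDeviceGammaRamp(hdc, ctypes.byref(ramp)):
            raise OSError("GetDeviceGammaRamp failed")
        with open(path, "w") as f:
            for i in range(256):
                f.write(f"{i}\t{ramp[0][i]}\t{ramp[1][i]}\t{ramp[2][i]}\n")
    finally:
        user32.ReleaseDC(None, hdc)

if __name__ == "__main__":
    dump_gamma_ramp(sys.argv[1] if len(sys.argv) > 1 else "gamma_ramp.txt")
```

Note that the gamma ramp would only account for the brightness/washed-out part; the DVD clarity differences would live in the driver's overlay and scaling settings instead.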