Post the test image you're using.
Dude, are you joking? I can do my own test images, don't you get it?
I think the problem we're all having is pinning down exactly what the problem is. It has been explained in two ways, and one of those ways cannot be tested in the manner given.
If the lows and highs are clipped, then the test would work; but if the range is compressed, the test image does not give us a way to actually test for the problem.
I can understand that. But then the issue would simply be slightly less vibrant colors and/or blacks and whites.
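For what it's worth, the clipped-versus-compressed distinction being debated here can be spelled out in a few lines. This is only a rough Python sketch assuming 8-bit values and a simple linear 16-235 mapping; nothing in the thread confirms exactly what the driver does:

def clip_to_limited(v):
    # Clipping: anything below 16 or above 235 is simply thrown away.
    return max(16, min(235, v))

def compress_to_limited(v):
    # Compression: the whole 0-255 range is squeezed into 16-235,
    # so shades survive but sit closer together (rounding can still merge a few).
    return round(16 + v * (235 - 16) / 255)

for v in (0, 1, 15, 16, 128, 240, 254, 255):
    print(v, clip_to_limited(v), compress_to_limited(v))

On the clipping reading, the darkest and brightest steps of a test gradient would all come out identical; on the compression reading they would merely be harder to tell apart.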
"Dude, are you joking? I can do my own test images, don't you get it?"
You claim you had a 256 grey scale image and you examined every shade to make sure they were all distinguishable. What's the big deal saving it as a PNG and posting it?
"...and I GUARANTEE there is absolutely no difference in color quality. If anything, I'd say the DVI cable is a bit less vibrant."
I don't understand: you're saying there is no difference, but there is a difference?
Even then I can see the difference between colors just TWO values away from each other, all the way from 0 to 255. I just tried it, and even though it's hard to see, I do definitely notice the difference.
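A test image of the kind being argued about is easy to generate rather than argue over. Below is a minimal sketch using Python with Pillow; the bar width, height and filename are just illustrative choices, not the image anyone in this thread actually used:

from PIL import Image

# One vertical bar per grey level, 0 through 255, each BAR_WIDTH pixels wide.
BAR_WIDTH, HEIGHT = 4, 128
img = Image.new("L", (256 * BAR_WIDTH, HEIGHT))

for level in range(256):
    for x in range(level * BAR_WIDTH, (level + 1) * BAR_WIDTH):
        for y in range(HEIGHT):
            img.putpixel((x, y), level)

# PNG is lossless, so saving will not alter any of the grey levels.
img.save("grey_ramp_0_255.png")

If the output path is clipping to 16-235, the bars at the dark and bright ends should look identical; if it is compressing, adjacent bars should still be (barely) distinguishable across the whole ramp.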
Just wanted to add that the new GeForce Driver 347.09 gives the option now to move from limited range RGB to full range RGB in the 'color settings' options in the control panel. To my eye, there is no difference between full range RGB and YCbCr444, and the YCbCr444 option has always been there...but, for the purist it is a good option to have.
You do not have to install GeForce Experience, as I recall someone saying earlier.
Not sure if this information has already been mentioned or not, but thought I'd share.
As you can see there are some slight differences here and there, but nothing of huge significance. After selecting the YCbCr444 colour signal the resolutions will be listed in exactly the same way by the driver, so they will remain in the Ultra HD, HD, SD list if that's where they were before. Because the Full Range RGB (0-255) signal is used for DVI and most DisplayPort connections, you may prefer to enforce this instead of using YCbCr444, just so you know things are being done the right way. Our preferred method of enforcing the correct Full Range RGB signal over HDMI is to use a nifty little tool mentioned below.
Interesting. When I switched to YCbCr444 I could definitely tell the difference: everything looked grainier and slightly washed out, almost like an old TV. I could definitely see the reduced colour range.
Unlike Nvidia's Limited Range RGB (16-235) signal, AMD's default YCbCr 4:4:4 signal never causes things to look washed out by dramatically altering gamma or contrast. But it does slightly affect colour values, so some shades are presented slightly differently to how they would be over a DVI or DisplayPort connection (i.e. correctly). And as with Nvidia cards, this signal type can cause a minority of monitors to display blurred or fringed text where certain colours are involved. Most users will probably be quite happy to stick with this default signal, but it is actually very simple to change the signal used, via one of two methods.
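The "slightly affects colour values" point above is essentially a rounding effect: a YCbCr signal means converting from RGB and back, and quantising to 8 bits in between can nudge some shades by a value or two. Here is a rough full-range BT.709 round trip in Python; the exact coefficients and range handling the drivers use are not stated in the thread, so treat this purely as an illustration:

KR, KB = 0.2126, 0.0722          # BT.709 luma coefficients (assumed here)
KG = 1.0 - KR - KB

def rgb_to_ycbcr(r, g, b):
    y = KR * r + KG * g + KB * b
    cb = 128 + 0.5 * (b - y) / (1 - KB)
    cr = 128 + 0.5 * (r - y) / (1 - KR)
    # Quantising to 8-bit integers here is where the small errors creep in.
    return round(y), round(cb), round(cr)

def ycbcr_to_rgb(y, cb, cr):
    r = y + (cr - 128) * 2 * (1 - KR)
    b = y + (cb - 128) * 2 * (1 - KB)
    g = (y - KR * r - KB * b) / KG
    return round(r), round(g), round(b)

# Neutral greys tend to survive exactly; some colours come back off by one.
for rgb in [(10, 10, 10), (200, 30, 60), (17, 18, 19)]:
    print(rgb, "->", ycbcr_to_rgb(*rgb_to_ycbcr(*rgb)))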
I stand corrected on my previous statement about Nvidia GeForce Experience.
I just tried YCbCr444 in my control panel and it's worse than RGB. Different shades of black are less distinct, and I can't see the difference between the two black squares in the test image posted above.
"What type of monitor are you using?"
I'm on an IPS monitor as well.
I am using an IPS monitor. Are you using a TN monitor?
I wonder if that is part of the problem.
Full Range RGB with the new driver is definitely better than the default (obviously), but as mentioned, while there may be technical differences between YCbCr444 and Full Range RGB that can be measured, to my eye I can't tell the difference on my Acer H236HL.
"I've been trying so hard for years to bring this to gamers' attention, not because I want to crap on NV or something but because I want all of us to have accurate colours in games, movies and desktop usage after we pay $ for graphics cards. For me the most basic functions of a videocard are presentation of accurate colours and high clarity. If you can't get that right, everything else falls apart no matter the FPS or performance/$ or performance/watt."
It should be an easily changeable setting regardless, and it should never have been a hidden default (i.e. there should have been an easy-to-find checkbox, and preferably some additional detection for non-TV displays). It's a usability issue that should have gotten decent priority, and it would have been very easy for them to implement years ago.
This Full RGB fix for NV should be a sticky at the top of VC&G forum for every new NV user running HDMI.
"Easy solution: use DVI."
Impossible solution. I didn't realize the problem was still around until I hooked it up, but I did know what it was and what to do, which I don't expect most users to.
"It blows my mind that manufacturers have not figured out how to prevent backlight bleeding and uniformity issues this far into the life of LCD technology."
$$$
"This is because the LCD panel will always have a certain amount of translucency, so there is always a certain amount of backlight leakage throughout the entire screen (this is a fundamental limitation of LCD technology)."
To pick nits, it's fundamental to panels that need backlights. Monochrome transflective STN FTW (actually, sunlight readability did lead me to choose a few, but not for PC use)!