Nvidia's new driver enables 4K@60Hz via HDMI


3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Oh, so now I have to own a 4k display before I'm allowed to discuss the ramifications of YUV420 encoding?

GTFO.

I never said that you had to own a 4K display. I took this as literal rather than figurative: "And you're simply incapable of envisioning the use case of 'Hey, I've got this 60" 4K HDTV. It'd be cool to play BF4 on.'" So I thought you meant you had a 4K TV you wanted to drive over HDMI. I looked at your sig and thought that if that's the hardware you're going to use, you might be right. I was just trying to look at it from your point of view.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I'm going to lay it out for you guys in the simplest terms, the broadest strokes. If you look at my post history, you will see many times that I have been referred to as an AMD fanboy, both from the Intel and NV camps. It's more basic than that.

I support any of these companies when they make a pro-consumer decision and complain about it when they don't. This is clearly a pro-consumer decision and frankly one I would've expected them to either hold back for the new generation or segment to the higher tiers.

I suggest that if you're not here to discuss, you should log off. This isn't a college, and you're not a professor. We aren't here for your dictation.

I caught your "exaggeration". It was an ignorant comparison because, again, resolution doesn't impact IQ in this manner. You can have a razor-sharp 1024x768 image.

You are taking this personally and getting emotional. There's nothing personal going on that requires you to call people dishonest or "GTFO" or anything like that.
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
You are taking this personally and getting emotional. There's nothing personal going on that requires you to call people dishonest or "GTFO" or anything like that.
You're kidding, right? We're all talking about how the world is round and we've got the two of you screaming about heresy.
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
I think I got it:
so it degrades the picture to fit the bandwidth of a 4K HDMI TV (otherwise limited to 24p/30Hz) so it can run at 60Hz,
and compared apples-to-apples against a 4K TV with DisplayPort and a video card that has it, the DisplayPort setup does not degrade the picture at 60Hz.
Got it, thanks.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Yes, I do see the difference. No one, except the complainers, has argued this is lossless.

Would you like me to produce a razor-sharp 1024x768 image for you? Because I think you're very confused: you're claiming that 1024x768 reduces quality in the same manner as color-space compression.

That's not how it works.

And yes, I will attack opinions when they are stupid. The two of you are arguing that a free feature is bad simply because it's not AMD.

To paraphrase my comment from the SD card smartphone thread: "No one ever said 'my graphics card has too many features.'"

FTFY
 

Ryan Smith

The New Boss
Staff member
Oct 22, 2005
537
117
116
www.anandtech.com
Boy it's hot in here.

Let's be clear about one thing: TV manufacturers are already doing this. Full stop. NVIDIA (and AMD, and Intel) can choose to follow the TV manufacturers by enabling it on their cards, thereby allowing their products to drive these 4K TVs. But there's nothing to gain from not enabling it, because for the moment there is no alternative when we're talking about HDMI.

Normally I don't comment on forum threads, but this is silly. While I am apprehensive at the fact that TV manufacturers are doing this in the first place due to the fact that it creates some technical confusion, on the video card side of things enabling this functionality is a net win. And for their part NVIDIA isn't calling this HDMI 2.0 and they didn't even put it in their release notes, so they have handled the matter reasonably well.
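
For reference, a back-of-the-envelope sketch of why 4:2:0 is what lets 4K @ 60Hz squeeze onto existing HDMI silicon. The 594 MHz CTA timing and the ~8.16 Gbps usable HDMI 1.4 data rate are commonly cited figures I'm assuming here, not anything NVIDIA has published about this feature:

```python
# Sketch: does 3840x2160 @ 60Hz fit in an HDMI 1.4-class link?
PIXEL_CLOCK_HZ = 594e6   # standard CTA-861 4K60 timing, incl. blanking (assumed)
HDMI_1_4_GBPS = 8.16     # ~10.2 Gbps TMDS minus 8b/10b overhead (assumed)

def required_gbps(bits_per_pixel):
    """Raw data rate needed for the given pixel format."""
    return PIXEL_CLOCK_HZ * bits_per_pixel / 1e9

print(required_gbps(24))  # RGB / 4:4:4, 8-bit: ~14.3 Gbps -> does not fit
print(required_gbps(12))  # YCbCr 4:2:0, 8-bit: ~7.1 Gbps  -> fits under ~8.16
```

Full RGB at that refresh rate simply doesn't fit; halving the effective bits per pixel with 4:2:0 is the only thing that squeezes it through on today's HDMI hardware.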
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
Boy it's hot in here.

Let's be clear about one thing: TV manufacturers are already doing this. Full stop. NVIDIA (and AMD, and Intel) can choose to follow the TV manufacturers by enabling it on their cards, thereby allowing their products to drive these 4K TVs. But there's nothing to gain from not enabling it, because for the moment there is no alternative when we're talking about HDMI.

Normally I don't comment on forum threads, but this is silly. While I am apprehensive at the fact that TV manufacturers are doing this in the first place due to the fact that it creates some technical confusion, on the video card side of things enabling this functionality is a net win. And for their part NVIDIA isn't calling this HDMI 2.0 and they didn't even put it in their release notes, so they have handled the matter reasonably well.
:beer:
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
I think I got it:
so it degrades the picture to fit the bandwidth of a 4K HDMI TV (otherwise limited to 24p/30Hz) so it can run at 60Hz,
and compared apples-to-apples against a 4K TV with DisplayPort and a video card that has it, the DisplayPort setup does not degrade the picture at 60Hz.
Got it, thanks.
Almost. The TV has to be 60Hz compatible; this feature does nothing on a 24p/30Hz-only set.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
ok ;)

 

saeedkunna

Member
Apr 8, 2014
50
0
61
Boy it's hot in here.

Let's be clear about one thing: TV manufacturers are already doing this. Full stop. NVIDIA (and AMD, and Intel) can choose to follow the TV manufacturers by enabling it on their cards, thereby allowing their products to drive these 4K TVs. But there's nothing to gain from not enabling it, because for the moment there is no alternative when we're talking about HDMI.

Normally I don't comment on forum threads, but this is silly. While I am apprehensive at the fact that TV manufacturers are doing this in the first place due to the fact that it creates some technical confusion, on the video card side of things enabling this functionality is a net win. And for their part NVIDIA isn't calling this HDMI 2.0 and they didn't even put it in their release notes, so they have handled the matter reasonably well.
thank you
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I can't understand why any of you think this is a bad thing. Why this is deceitful, or any other negative spin you seem to want to give it.

I'll give you some reasons. It actually has a lot to do with NV's implementation of HDMI and how, to this day, they refuse to acknowledge their Full-RGB problems, despite consumers complaining since at least the GTX 400 generation.

1) It's deceitful without a disclaimer: a person might simply read online, or on a GPU box, that current-gen Videocard A provides full 4K @ 60Hz over HDMI. That is actually incorrect. The GPU can provide 4K @ 60Hz over HDMI at degraded IQ as a result of reduced color data: the chroma information is cut by 75% versus full RGB. If this is not hidden from the user, then I have no problem with it, since NV would be transparent about how they implemented the feature. But telling the average person "you can run 4:2:0" means nothing compared to telling them "75% less color data".
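
For anyone wondering where that 75% figure comes from, here is a minimal sketch of the chroma-sample arithmetic, assuming the usual 2x2-block 4:2:0 layout:

```python
# Chroma samples carried for a 2x2 block of pixels (minimal sketch).
pixels = 4                 # a 2x2 block
chroma_444 = pixels * 2    # 4:4:4 (or RGB): full colour info per pixel -> 8 samples
chroma_420 = 2             # 4:2:0: one shared Cb + Cr pair for the whole block -> 2

print(1 - chroma_420 / chroma_444)   # 0.75 -> 75% of the colour samples are dropped
# Luma (brightness) stays per-pixel, which is why text and edges still look sharp-ish
# while coloured edges smear.
```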

2) What's more disturbing and what many PC gamers have pointed out for years is that NV's Full RGB mode has been broken on HDMI and NV continues to hide this, refusing to fix it.

Last time I checked NV used to have the option for toggling Full RGB:

[Screenshot: the old NVIDIA control panel toggle for Full RGB output]


As of today, as far as I know, nVidia still does not offer the option to use the 0-255 RGB range via HDMI in its control panel, and defaults the RGB range to 16-235 for many people out of the box (when using HDMI). There is an option in the NV control panel under "Adjust video color settings/Dynamic range" that deals with RGB range, but it is effective only for DVI users and does not change anything for HDMI users. Therefore, even without this thread, NV's color range over HDMI was greatly inferior to AMD's. Instead of directly addressing this problem, NV keeps sweeping it under the rug as if it doesn't exist.

For games and 2D desktop usage, you need a registry tweak or a color-toggler app if you are using HDMI with an NV GPU.

Nvidia RGB Full/limited range toggler
http://blog.metaclassofnil.com/?p=83
or
http://review-it.ru/load/system/tun...vidia_rgb_full_limited_range_toggler/8-1-0-11
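
To illustrate what the limited-range default does to the levels, here is a minimal sketch of the 0-255 to 16-235 mapping. This is my own illustration of the arithmetic, not anything from NV's driver:

```python
# What "limited range" means for an 8-bit signal (illustrative sketch).
def full_to_limited(v):
    """Map a full-range (0-255) level to the limited (16-235) range the GPU sends."""
    return round(16 + v * (235 - 16) / 255)

print(full_to_limited(0))     # 16  -> black comes out as dark grey on a 0-255 monitor
print(full_to_limited(255))   # 235 -> white comes out slightly dim
# If the display expects full range, the result is the washed-out, no-true-black look
# people describe over HDMI; the toggler tools above simply force full-range output.
```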

Question #1: So an Nvidia GPU without reg tweaking, connected to a monitor, will display everything in standard RGB?
Answer #1: Over HDMI (and in rare cases DP, especially at 1080p), yes. Because HDMI is getting increasingly popular for monitors, this is a real problem.

The reason many people are not happy is that not only has NV NOT fixed its HDMI color problems at 1080p/1440p/1600p, never mind 4K, but it is now claiming it can do 4K @ 60Hz over HDMI (with an even greater degradation of color!). :biggrin:

Most gamers who actually care about 2D IQ but run HDMI will notice severely degraded IQ on NV when moving from AMD, unless they resort to registry adjustments or custom tools for the fix. If you run DP or DVI, you aren't affected.

You can find hundreds of threads online all confirming the same for years:

http://www.rockpapershotgun.com/forums/showthread.php?9334-Nvidia-RGB-HDMI-issue

http://forums.overclockers.co.uk/showthread.php?t=18576199

http://community.futuremark.com/for...RGB-Range-without-custom-resolution-EASY-tool!

http://forums.guru3d.com/showthread.php?t=373241

http://techreport.com/forums/viewtopic.php?f=3&t=90500

http://www.avsforum.com/forum/26-ho...14-nvidia-rgb-full-limited-range-toggler.html

"I recently updated my GPU from dual HD6950s to a single GTX780 and I have been experiencing issues with the color output over HDMI. Prior to the upgrade, my display looked beautiful, but with the NVIDIA drivers (I have tried the three most recent versions) everything is washed out. Additionally, it appears impossible to achieve "true black" via HDMI. Now where this becomes bizarre is when you also consider that I have a monitor connected via DVI that displays color perfectly and has true black with no adjustments to the NVIDIA driver settings.

To be clear, nothing else changed besides the GPU. I am running the exact same setup, HDMI cable, DVI cable, and so on."

http://www.tomshardware.com/answers/id-1987382/nvidia-hdmi-display-issues.html

NV needs to admit that their HDMI implementation has been broken for years, but instead they try to market 4K @ 60Hz HDMI and pass it off as a viable solution. I have no problem with NV offering gamers options, but if NV is already thinking about 4K @ 60Hz HDMI issues, why haven't they fixed their long-standing HDMI Full-RGB problem?

Quote from PC monitor reviewer:

"Just posting here to let you know that I completely agree with the frustration in this thread angled towards Nvidia on this one. As a monitor reviewer and somebody who recommends monitors to people (some of which have HDMI as the only digital output) I find it extremely frustrating and inexcusable that Nvidia haven't sorted this out yet.

If you plug in a PC monitor to a PC, you really expect the graphics card to treat it properly and feed it the correct 'Full Range RGB (0-255)' signal. I am sick of having to mention this in monitor reviews, time and time again, in the calibration section. Use that tool that edits the registry or set a custom resolution (which not all games will use) - a completely unnecessary step if Nvidia would just do their job and sort this out."



And with TV manufacturers and NV both on board with passing this temporary solution off to the end consumer, you know that neither will market it as a severely compromised solution. Instead, they'll claim full 4K @ 60Hz support so that the TV manufacturers can sell the TVs to unsuspecting consumers. There are many TVs falsely marketed as 4K @ 60/120Hz that in reality can only do 30Hz when connected to a PC via HDMI 1.4.

Just as GPU testing now includes frame times to augment FPS testing, we shouldn't be afraid to criticize AMD/NV/Intel, etc., if the end result is an improved user experience.
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
instead they try to market 4K @ 60Hz HDMI and pass it off as a viable solution.

The entirety of the problem in your post reduced to one sentence. If they market it as some sort of uncompromised solution I will agree with you 100%. So far, they aren't marketing it at all and in fact users were the first to comment on this feature.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The entirety of the problem in your post reduced to one sentence. If they market it as some sort of uncompromised solution I will agree with you 100%. So far, they aren't marketing it at all and in fact users were the first to comment on this feature.

The marketing is gonna follow soon after they release a driver update. The main point of my post that you didn't address is NV's broken HDMI support for Full RGB.

Think about this: someone at NV spent the time to work out a way to provide incomplete 4K @ 60Hz support over HDMI. All of a sudden, tech sites like AT post articles about how NV provided a driver fix for 4K @ 60Hz, but do you see articles that talk about their persistent HDMI color issues? Don't you think it's a disservice to the consumer that only half of the story is mentioned?

It's also very strange to me that NV goes out of their way to spend the time and financial resources on something that affects 0.1% of PC gamers. However, they haven't fixed the core issue for 1000x more users with 720p/1080p/1440p/1600p monitors running HDMI. They wasted time on things most of the market doesn't care about and continue to ignore the core HDMI issue that all of their modern cards have.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I really agree with Russian here; the review sites do a great disservice in how they report the problems with the current crop of cards, and have been doing so for a long time. They have a conflict of interest: if they keep breaking bad news about the cards to customers, both AMD and Nvidia will stop sending them samples. They can report the bad behaviour, but after that they can't release reviews at the same time as everyone else, and it costs them more. In effect, AMD/Nvidia sinks their business. So the companies get protection from the review sites.

If it's not broken HDMI colour, it's broken HDMI passthrough, crappy CrossFire, poor desktop performance, missing implementations of DirectX features and misreporting of the actual support. They have failed us repeatedly at this point. These companies don't pay any attention when we report a bug: Nvidia has not even looked at my bugs for 9 months, didn't even bother to open them; AMD dismissed the lot, and the bugs persist 2.5 years later. Why doesn't the media report on how impossible it is to get these companies to fix clear, repeatable bugs?! That conflict of interest again. They don't just need a readership; they also need the graphics card companies' goodwill.

I don't think adding another feature like this, where users in the future will be complaining about the terrible image, is a good idea. It's bad enough the TV companies did it; it's worse that Nvidia is taking on the support calls this will cause. You can guarantee we'll be fielding questions about terrible image quality on 4K screens for a decade. And for what? A short, temporary period in which 4K monitors were 30Hz because the models were rushed out a little early before they were really production ready. It's not just a choice, it's a problematic one, and the press will cover up its downside and mouthpiece whatever Nvidia tells them to say. You can tell that is what will happen because every site will have it in the news and they will all read almost identically; there will be no critical analysis of it, except maybe on one smaller site that doesn't currently get review samples, and never will while it continues to poke the eyes of the companies that would provide them.

Anandtech is a mouthpiece for the companies it advertises, as is every other review site you read. The Cheetos mess was exposed in the gaming review sites and we all know what happened there... nothing. You won't ever get the truth about the problems with these products from them, and while the enthusiasts constantly bicker and deny the problems, we won't ever get a genuine view of what is wrong with the cards and their drivers. It's high time that as customers and a community we did something about it, because frankly I am sick of being treated badly by companies whose reach extends much further than it should.
 

CakeMonster

Golden Member
Nov 22, 2012
1,621
798
136
As of today, as far as I know, nVidia still does not offer the option to use the 0-255 RGB range via HDMI in its control panel, and defaults the RGB range to 16-235 for many people out of the box (when using HDMI). There is an option in the NV control panel under "Adjust video color settings/Dynamic range" that deals with RGB range, but it is effective only for DVI users and does not change anything for HDMI users. Therefore, even without this thread, NV's color range over HDMI was greatly inferior to AMD's. Instead of directly addressing this problem, NV keeps sweeping it under the rug as if it doesn't exist.

For games and 2D desktop usage, you need a registry tweak or a color-toggler app if you are using HDMI with an NV GPU.

Nvidia RGB Full/limited range toggler
http://blog.metaclassofnil.com/?p=83
or
http://review-it.ru/load/system/tun...vidia_rgb_full_limited_range_toggler/8-1-0-11
Ugh. I have run my 780 with my projector as a 3rd monitor over HDMI since October, but I never knew this. I guess I can't complain that much if I didn't notice, but I feel kind of hoodwinked. It's very possible I would have waited for the 290*, which was right around the corner, had I known this.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
I'll give you some reasons. It actually has a lot to do with NV's implementation of HDMI and how, to this day, they refuse to acknowledge their Full-RGB problems, despite consumers complaining since at least the GTX 400 generation.

1) It's deceitful without a disclaimer: a person might simply read online, or on a GPU box, that current-gen Videocard A provides full 4K @ 60Hz over HDMI. That is actually incorrect. The GPU can provide 4K @ 60Hz over HDMI at degraded IQ as a result of reduced color data: the chroma information is cut by 75% versus full RGB. If this is not hidden from the user, then I have no problem with it, since NV would be transparent about how they implemented the feature. But telling the average person "you can run 4:2:0" means nothing compared to telling them "75% less color data".

2) What's more disturbing and what many PC gamers have pointed out for years is that NV's Full RGB mode has been broken on HDMI and NV continues to hide this, refusing to fix it.

  1. Nvidia ships a new driver feature that merely supports a feature already present in displays (enabling 4K@60Hz via HDMI), and all of a sudden:
    - IT'S DIRTY MARKETING, IT'S DECEITFUL WITHOUT A DISCLAIMER, and similar lulz.

  2. Nvidia (same as Intel and Apple(!)) DOES NOT add an EDID override feature, which could work around the stupid EDID defaults behind the mismatched HDMI/1080p/720p RGB-range handling, but now we have:

    NV's Full RGB mode has been broken on HDMI and NV continues to hide this

Simply phenomenal :D
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
The marketing is gonna follow soon after they release a driver update. The main point of my post that you didn't address is NV's broken HDMI support for Full RGB.

Think about this: someone at NV spent the time to work out a way to provide incomplete 4K @ 60Hz support over HDMI. All of a sudden, tech sites like AT post articles about how NV provided a driver fix for 4K @ 60Hz, but do you see articles that talk about their persistent HDMI color issues? Don't you think it's a disservice to the consumer that only half of the story is mentioned?

It's also very strange to me that NV goes out of their way to spend the time and financial resources on something that affects 0.1% of PC gamers. However, they haven't fixed the core issue for 1000x more users with 720p/1080p/1440p/1600p monitors running HDMI. They wasted time on things most of the market doesn't care about and continue to ignore the core HDMI issue that all of their modern cards have.

Isn't the full RGB "issue" that your monitor is in fact transmitting the wrong color space in its EDID?
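
In other words, the decision being argued about looks roughly like the sketch below. The function and names here are hypothetical, purely to illustrate the logic as I understand it: over HDMI, a display whose EDID advertises CEA/TV timings tends to get defaulted to limited range unless the user overrides it.

```python
# Hypothetical sketch of the default-range decision being discussed (not real driver
# code; the function and names are made up purely to illustrate the logic).
def pick_output_range(connector, edid_reports_cea_timing, user_override=None):
    if user_override is not None:
        return user_override              # e.g. forced via the registry/toggler tools
    if connector == "HDMI" and edid_reports_cea_timing:
        return "limited (16-235)"         # display treated as a TV
    return "full (0-255)"

print(pick_output_range("HDMI", edid_reports_cea_timing=True))   # limited (16-235)
print(pick_output_range("DVI",  edid_reports_cea_timing=True))   # full (0-255)
```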
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
....<large post that I didn't want to take up the page quoting>...

Holy smokes.

Must be pretty hard up for weekly Nvidia trash-talk topics, that much is obvious. Also, I cannot believe an AMD user would go through so much time and effort... digging deep for issues, aren't we?

There is no point in even trying to list how many things you have distorted and skewed... and flat-out fabricated. For starters, Nvidia has not passed this off as anything. I mean, Ryan already addressed all this and yet you still come back without any regard???

Quit making this something it is not. It's been set straight, here:

Boy it's hot in here.

Let's be clear about one thing: TV manufacturers are already doing this. Full stop. NVIDIA (and AMD, and Intel) can choose to follow the TV manufacturers by enabling it on their cards, thereby allowing their products to drive these 4K TVs. But there's nothing to gain from not enabling it, because for the moment there is no alternative when we're talking about HDMI.

Normally I don't comment on forum threads, but this is silly. While I am apprehensive at the fact that TV manufacturers are doing this in the first place due to the fact that it creates some technical confusion, on the video card side of things enabling this functionality is a net win. And for their part NVIDIA isn't calling this HDMI 2.0 and they didn't even put it in their release notes, so they have handled the matter reasonably well.

There is really no point in keeping the thread open, especially since there is this ongoing intent to distort for no apparent reason other than certain posters' dislike for Nvidia, which extends to anything and everything the company does.

If the thread remains open, it will be another trash-Nvidia thread because they have offered a new feature that nobody has to use. There is nothing bad you can say about it in itself, unless you resort to fabricating.

like this:
The marketing is gonna follow soon after they release a driver update. The main point of my post that you didn't address is NV's broken HDMI support for Full RGB.

Think about this: someone at NV spent the time to work out a way to provide incomplete 4K @ 60Hz support over HDMI. All of a sudden, tech sites like AT post articles about how NV provided a driver fix for 4K @ 60Hz, but do you see articles that talk about their persistent HDMI color issues? Don't you think it's a disservice to the consumer that only half of the story is mentioned?

It's also very strange to me that NV goes out of their way to spend the time and financial resources on something that affects 0.1% of PC gamers. However, they haven't fixed the core issue for 1000x more users with 720p/1080p/1440p/1600p monitors running HDMI. They wasted time on things most of the market doesn't care about and continue to ignore the core HDMI issue that all of their modern cards have.

It hasn't happened yet and you're already reacting as if it has.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The marketing is gonna follow soon after they release a driver update. The main point of my post that you didn't address is NV's broken HDMI support for Full RGB.

There is nothing deceitful until they market it as such; therefore there is nothing deceitful at this time.

Go blow your horn all you want when/if they do as you are expecting, but not before.

And HDMI's other problems have nothing to do with this.
 