Nvidia's new driver enables 4K@60Hz via HDMI


KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
So we are just talking HDTVs here?!

Well, I guess for those who want to game on a 4K HDTV with a PC, this is something for the next few months. The next gen of video cards will have HDMI 2.0 anyway.

4K HDTVs are really worthless. There is no content. Blu-ray is 1080p, and cable/satellite TV providers still deliver 720p for the most part; their networks would have to compress the crap out of 4K content to deliver it, if it even existed.

4K HDTVs are not good for much but watching the reference disc. That and upscaling 1080p to 4K. A high-quality 1080p plasma > 1080p upscaled to 4K on an LCD in PQ. I just don't see the point in 4K anywhere but on a PC monitor currently.

Netflix is supposed to have 2160p content. But beyond that, I don't see any point in getting a 4K TV right now, especially if they don't have DisplayPort and our cards don't support HDMI 2.0 yet.
 

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
I agree that it's better to wait re: 4K TVs, but if you need a new TV NOW, it makes sense.

On average the 4K sets are $400-700 more than their 1080p counterparts. I keep my TVs for at least five years, so dropping $2k on a 1080p set now seems silly. I'm sure we'll start seeing more 4K content in the next few years.
 

saeedkunna

Member
Apr 8, 2014
50
0
61
TV manufacturers that sold 4K TVs without HDMI 2.0 are selling hardware upgrades for $300-$400 to enable HDMI 2.0. But that will give you 100% real 4K at 60Hz... not just a fake placebo with reduced color quality, offered for free to boost their marketing strategy.

Those people who paid ($300-$400) will appreciate what Nvidia did, because it didn't cost them anything and now they have a choice.

A high-quality 1080p plasma > 1080p upscaled to 4K on an LCD in PQ

This is not true. I have both a Sony 4K TV (900A) and a Panasonic TC-46PGT24, and the upscaled 1080p looks much better than any other 1080p plasma or LCD.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
This is not true. I have both a Sony 4K TV (900A) and a Panasonic TC-46PGT24, and the upscaled 1080p looks much better than any other 1080p plasma or LCD.

Not even close. 1080p upscaled to 4K can vary from horrid (just a stretched image) to passable with a scaler algorithm. Either solution degrades and/or alters the accuracy of the original source.

I've shopped around the 4K LCDs. First, they're LCD, strike one. Second, the upscaling is terrible. There is just no reason to bother with them due to the lack of native content. Why buy 4K to use 1080p content?
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
Those people who paid ($300-$400) will appreciate what Nvidia did, because it didn't cost them anything and now they have a choice.

Wrong, they could already run 4K @ 30Hz (4:4:4) or 4K @ 60Hz (4:2:0).

They will still be limited; no change at all if you think about it.

(See, you're almost thinking Nvidia enabled true HDMI 2.0 for you.)
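
To put rough numbers on why those are the only two options over HDMI 1.4, here is a quick back-of-the-envelope sketch in Python (my own illustration; it assumes the standard CEA-861 4K timings and HDMI 1.4's ~8.16 Gbit/s video data rate after 8b/10b encoding):

```python
HDMI14_LIMIT_GBPS = 8.16          # HDMI 1.4 video data rate (assumed)
H_TOTAL, V_TOTAL = 4400, 2250     # 3840x2160 active area plus CEA blanking

def data_rate_gbps(refresh_hz, bits_per_pixel):
    """Raw video data rate of a 4K mode, in Gbit/s."""
    return H_TOTAL * V_TOTAL * refresh_hz * bits_per_pixel / 1e9

# 4:4:4 carries 24 bits per pixel (8 per channel); 4:2:0 averages 12,
# because the two chroma channels are shared across each 2x2 pixel block.
for label, hz, bpp in [("4K @ 30Hz 4:4:4", 30, 24),
                       ("4K @ 60Hz 4:4:4", 60, 24),
                       ("4K @ 60Hz 4:2:0", 60, 12)]:
    rate = data_rate_gbps(hz, bpp)
    verdict = "fits" if rate <= HDMI14_LIMIT_GBPS else "exceeds"
    print(f"{label}: {rate:5.2f} Gbit/s -> {verdict} the HDMI 1.4 limit")
```

60Hz 4:4:4 comes out around 14 Gbit/s, roughly double what the link can carry, while both 30Hz 4:4:4 and 60Hz 4:2:0 land near 7 Gbit/s. That's the whole trick: halve the chroma data and 60Hz suddenly fits.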
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
TV manufacturers that sold 4K TVs without HDMI 2.0 are selling hardware upgrades for $300-$400 to enable HDMI 2.0. But that will give you 100% real 4K at 60Hz... not just a fake placebo with reduced color quality, offered for free to boost their marketing strategy.
That still does no good unless you have an HDMI 2.0 video card.

LMK when manufacturers start selling either DP->HDMI 2.0 adapters or upgrades to TVs that add DP.

Until then, this is a needed and innovative solution that costs nothing.
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
Wrong, they could already run 4K @ 30Hz (4:4:4) or 4K @ 60Hz (4:2:0).

They will still be limited; no change at all if you think about it.

(See, you're almost thinking Nvidia enabled true HDMI 2.0 for you.)
Nowhere did he say he was confused about the limitations of Nvidia's solution. I think you're transferring your confusion onto him.
 

saeedkunna

Member
Apr 8, 2014
50
0
61
Wrong, they could already run 4K @ 30Hz (4:4:4) or 4K @ 60Hz (4:2:0).

They will still be limited; no change at all if you think about it.

(See, you're almost thinking Nvidia enabled true HDMI 2.0 for you.)

Limited or not, who made this choice possible?

I think you are confused; nobody is saying Nvidia enabled HDMI 2.0 (in all my posts about this I said it enabled 4K@60Hz).
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
Nowhere did he say he was confused about the limitations of Nvidia's solution. I think you're transferring your confusion onto him.

Lol, stop saying I'm confused, I probably have more experience than you might think in this department.

Limited or not, who made this choice possible?

I think you are confused; nobody is saying Nvidia enabled HDMI 2.0 (in all my posts about this I said it enabled 4K@60Hz).


Just re-read your previous post and realize how stupid it sounds!


Those people who paid ($300-$400) will appreciate what Nvidia did, because it didn't cost them anything and now they have a choice.

Whether or not they paid that $400, they would still be limited to the crappy 4:2:0 color @ 60Hz with Nvidia's unlock. Duh.


____________
______________

Also, I never said what Nvidia did is bad. They just did it for marketing purposes. They screwed the colors just to say out loud: "We can do 4K 60Hz with HDMI 1.4b." Hell no you can't.

What is the point of reducing the color quality so the image looks like 1024x768, just so the Nvidia control panel can say 3840x2160 @ 60Hz?

[Images: 4:2:0 vs 4:4:4 chroma subsampling examples, hair detail comparison, color comparison]
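
For anyone who wants to see mechanically what 4:2:0 throws away, here is a minimal NumPy sketch (purely illustrative; this is not how the driver implements it) that keeps luma at full resolution and averages the two chroma planes over 2x2 blocks:

```python
import numpy as np

def simulate_420(ycbcr):
    """Simulate 4:2:0 subsampling on an HxWx3 YCbCr image (H, W even):
    luma (Y) stays at full resolution, while Cb and Cr are averaged over
    2x2 blocks and replicated back, discarding 75% of the chroma samples."""
    out = ycbcr.astype(float)
    h, w, _ = out.shape
    for ch in (1, 2):  # the two chroma planes
        blocks = out[:, :, ch].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        out[:, :, ch] = np.repeat(np.repeat(blocks, 2, axis=0), 2, axis=1)
    return out
```

Run it on a frame with single-pixel colored detail (red text on gray, for example) and the color edges smear, while a pure brightness edge survives untouched. That is exactly what the hair and text comparisons above are showing.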


_______
________

This is the thing I hate most about AMD and Nvidia: their freakin' lies and marketing gimmicks.
 

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
Saying the image quality will look like 1024x768 is hyperbole; all Blu-rays are mastered at 4:2:0 anyway.

Nvidia probably did it for marketing reasons, but I don't care. As a consumer who has a 4K TV, I'm glad they added the option. At least I can now use my display @ 60Hz while I wait for HDMI 2.0 GPUs. It's not a good solution, but it's the only one we've got right now.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
Saying the image quality will look like 1024x768 is hyperbole; all Blu-rays are mastered at 4:2:0 anyway.

Nvidia probably did it for marketing reasons, but I don't care. As a consumer who has a 4K TV, I'm glad they added the option. At least I can now use my display @ 60Hz while I wait for HDMI 2.0 GPUs. It's not a good solution, but it's the only one we've got right now.

I've got to admit, you are right on this part.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
When I swapped my 6950 for a 7970 over HDMI, the newest drivers didn't enable 4:4:4 full RGB color mode. The difference in color quality was tremendous, to the point where I noticed it immediately. I thought my new 7970 was broken until I set it manually. Sure, it is usable, but the 2D image quality is atrocious to say the least, not even comparable to GPUs from the late '90s.

http://i.imgur.com/RY3YrFn.png

Those comparisons you linked do not accurately show the difference someone will actually see in front of them.

People who run cheap TN panels aren't likely to care. People who run high-quality panels like we find in 4K monitors, especially IPS panels, will. Look at how much people spend to be able to "max out settings". Supposedly IQ is the most important reason to be a PC gamer, yet we have people defending something that achieves FPS purely by reducing rendering quality. Like you said, "not even comparable to GPUs from the late '90s." It takes ~3x GK110/Hawaii to achieve ~60fps fairly consistently @ 4K. Add to that the price of a 4K monitor. Why would anyone want to reduce color quality with that kind of investment? After you've spent the money on hardware that will run 60fps @ 4K, buy a monitor that's capable as well.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
That's neato. I can do 8K at 120Hz with technology that most people don't have too. Your point?

You understand that HDMI and DisplayPort are two completely different things and having one doesn't mean you have access to the other, right?

If you can run 60fps @ 4K, you've already spent a lot on hardware; buy a proper monitor too. You can get them for ~$600.
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
Lol, stop saying I'm confused, I probably have more experience than you might think in this department.




Just re-read your previous post and realize how stupid it sounds!




Whether or not they paid that $400, they would still be limited to the crappy 4:2:0 color @ 60Hz with Nvidia's unlock. Duh.


____________
______________

Also, I never said what Nvidia did is bad. They just did it for marketing purposes. They screwed the colors just to say out loud: "We can do 4K 60Hz with HDMI 1.4b." Hell no you can't.

What is the point of reducing the color quality so the image looks like 1024x768, just so the Nvidia control panel can say 3840x2160 @ 60Hz?

[Images: 4:2:0 vs 4:4:4 chroma subsampling examples, hair detail comparison, color comparison]


_______
________

This is the thing I hate most about AMD and Nvidia: their freakin' lies and marketing gimmicks.
Your post is exactly why you're either confused or intellectually dishonest.

If you really had as much experience as I have, you would realize how ignorant your post is. Resolution has nothing to do with color or color depth.

Technically, you can have a 4K 1-bit display.
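
To illustrate with simple arithmetic (my own example, nothing vendor-specific): resolution and color depth scale a frame's size independently.

```python
# Frame size is just width x height x bits-per-pixel: the resolution
# (3840x2160) is fixed here while only the color depth changes.
for bpp, label in [(1, "1-bit"), (24, "24-bit RGB")]:
    mib = 3840 * 2160 * bpp / 8 / 2**20
    print(f"4K {label}: {mib:.2f} MiB per frame")
```

Both are "4K"; one just carries 24x the color information. Sharpness and color precision are separate axes.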
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
People who run cheap TN panels aren't likely to care. People who run high-quality panels like we find in 4K monitors, especially IPS panels, will. Look at how much people spend to be able to "max out settings". Supposedly IQ is the most important reason to be a PC gamer, yet we have people defending something that achieves FPS purely by reducing rendering quality. Like you said, "not even comparable to GPUs from the late '90s." It takes ~3x GK110/Hawaii to achieve ~60fps fairly consistently @ 4K. Add to that the price of a 4K monitor. Why would anyone want to reduce color quality with that kind of investment? After you've spent the money on hardware that will run 60fps @ 4K, buy a monitor that's capable as well.

And you're simply incapable of envisioning the use case of "Hey, I've got this 60" 4K HDTV; it'd be cool to play BF4 on it." Let the tech mature a bit and you might even see Steam game streaming able to run at this resolution. YUV420 compression would really help with the bandwidth needs.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
And you're simply incapable of envisioning the use case of "Hey, I've got this 60" 4K HDTV; it'd be cool to play BF4 on it." Let the tech mature a bit and you might even see Steam game streaming able to run at this resolution. YUV420 compression would really help with the bandwidth needs.

I'm not sure what hardware you want to drive this TV with, but if it's the 770 SLI in your sig then you are probably right. You would have to reduce IQ settings so much to achieve 60fps that it might not matter anyway. If, on the other hand, you have the GPU horsepower to run 60fps @ 4K with high IQ settings, it seems like a complete waste to me not to connect it to a proper monitor.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
Your post is exactly why you're either confused or intellectually dishonest.

If you really had as much experience as I have, you would realize how ignorant your post is. Resolution has nothing to do with color or color depth.

Technically, you can have a 4K 1-bit display.

Man, are you kidding me?? Can't you see the difference between 4:2:0 and 4:4:4 in the pictures shown? The reduced color quality makes it look like the resolution is lower. You are the one denying it.

You attack me, and of course color has nothing to do with resolution, augh... never mind man, yay Nvidia.

3DVagabond, if I were you I would let it go. You will get personal attacks because someone doesn't share your opinion. Be careful.
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
I'm not sure what hardware you want to drive this TV with, but if it's the 770 SLI in your sig then you are probably right. You would have to reduce IQ settings so much to achieve 60fps that it might not matter anyway. If, on the other hand, you have the GPU horsepower to run 60fps @ 4K with high IQ settings, it seems like a complete waste to me not to connect it to a proper monitor.
Oh, so now I have to own a 4K display before I'm allowed to discuss the ramifications of YUV420 encoding?

GTFO.
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
Man, are you kidding me?? Can't you see the difference between 4:2:0 and 4:4:4 in the pictures shown? The reduced color quality makes it look like the resolution is lower. You are the one denying it.

You attack me, and of course color has nothing to do with resolution, augh... never mind man, yay Nvidia.

3DVagabond, if I were you I would let it go. You will get personal attacks because someone doesn't share your opinion. Be careful.
Yes, I do see the difference. No one, except the complainers, has argued this is lossless.

Would you like me to produce a razor-sharp 1024x768 image for you? Because I think you're very confused: you're claiming that 1024x768 reduces quality in the same manner as color space compression.

That's not how it works.

And yes, I will attack opinions when they are stupid. The two of you are arguing that a free feature is bad simply because you don't agree with it.

To paraphrase my comment from the SD card smartphone thread: "No one ever said 'my graphics card has too many features.'"
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
Yes, I do see the difference. No one, except the complainers, has argued this is lossless.

Would you like me to produce a razor-sharp 1024x768 image for you? Because I think you're very confused: you're claiming that 1024x768 reduces quality in the same manner as color space compression.

That's not how it works.

And yes, I will attack opinions when they are stupid. The two of you are arguing that a free feature is bad simply because you don't agree with it.

To paraphrase my comment from the SD card smartphone thread: "No one ever said 'my graphics card has too many features.'"

Are you trying to change my opinion or what?
My opinion won't change: yes, it's good they did that, but they did it purely for marketing purposes.

And by the way, the 1024x768 was obviously just an exaggeration, but you didn't seem to catch that.

Please don't reply to me anymore.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Well, I keep going back and forth. This usually gets me to the best conclusion, so I like reading the debate even if it can be contentious.

If I had a 4K HDTV this would be great, but I won't pick one up just for this. And I'm working on holstering an itchy trigger finger on upgrading, to have the option to spit out 8,294,400 pixels at 60Hz to something fairly large.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Are you trying to change my opinion or what?
My opinion won't change: yes, it's good they did that, but they did it purely for marketing purposes.

And by the way, the 1024x768 was obviously just an exaggeration, but you didn't seem to catch that.

Please don't reply to me anymore.

Amazing.
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
I'm going to lay it out for you guys in the simplest terms, the broadest strokes. If you look at my post history, you will see that I have been called an AMD fanboy many times, by both the Intel and NV camps. It's more basic than that.

I support any of these companies when they make a pro-consumer decision and complain about it when they don't. This is clearly a pro-consumer decision and frankly one I would've expected them to either hold back for the new generation or segment to the higher tiers.
Are you trying to change my opinion or what?
My opinion won't change: yes, it's good they did that, but they did it purely for marketing purposes.

And by the way, the 1024x768 was obviously just an exaggeration, but you didn't seem to catch that.

Please don't reply to me anymore.
I suggest that if you're not here to discuss, you should log off. This isn't a college, and you're not a professor. We aren't here for your dictation.

I caught your "exaggeration". It was an ignorant comparison because, again, resolution doesn't impact IQ in this manner. You can have a razor-sharp 1024x768 image.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I can't understand why any of you think this is a bad thing, why it's deceitful, or whatever other negative spin you want to give it. It is a feature that can be useful for some people, most likely those with existing HDTVs that have an HDMI 2.0 port.

Most of us won't care or use it, but that does not make it a terrible thing that shouldn't have been given to us. You just won't use it. Someone else will.
 