Nvidia's new driver enables 4K@60Hz via HDMI


Ryan Smith

The New Boss
Staff member
Oct 22, 2005
Looks interesting,...

BUT,

why compromise at 4K?

It looks clear you still need HDMI 2.0 to avoid compromising on quality at 4K 60Hz from a computer.
Because if you're a TV manufacturer, it's this or wait another year before your 4K TVs can do 60Hz. TV manufacturers are (as you'd imagine) bigger fans of the former than the latter.

It's like 4K multi-tile displays. At the end of the day it's a workaround to compensate for the fact that not all parts of the technology are ready, but it's complete enough to ship for early adopters.
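
For those wondering why the trick works at all, the arithmetic is simple enough to sketch out. A rough back-of-the-envelope comparison in Python (active pixel data only, blanking overhead ignored, and the link rates are the commonly cited effective figures, so treat the numbers as approximate):

```python
# Rough link budget for 3840x2160 @ 60 Hz, 8 bits per color component.
# Active pixel data only; a real link also carries blanking intervals.
pixels_per_second = 3840 * 2160 * 60

formats = {
    "4:4:4 (full chroma)": 24,  # 3 components per pixel
    "4:2:0 (subsampled)": 12,   # one Cb/Cr pair shared by each 2x2 pixel block
}

links = {
    "HDMI 1.4": 8.16,   # Gbps effective (10.2 Gbps raw, 8b/10b encoded)
    "DP 1.2": 17.28,    # Gbps effective (HBR2, 4 lanes)
}

for fmt, bpp in formats.items():
    need = pixels_per_second * bpp / 1e9
    verdicts = ", ".join(f"{link}: {'fits' if need <= cap else 'too big'}"
                         for link, cap in links.items())
    print(f"{fmt}: needs ~{need:.1f} Gbps -> {verdicts}")
```

Halving the bits per pixel is the whole trick: 4:2:0 squeezes 4K60 under the HDMI 1.4 ceiling that 4:4:4 blows through, and the TV reconstructs full-resolution color as best it can.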
 

saeedkunna

Member
Apr 8, 2014
Let's be accurate, this is a tech forum.

They are compressing the color space, just like your Blu-ray player. Where do you think this feature originated?

Exactly, and most people won't notice the color difference. For me, it's a great solution from Nvidia for people who already have both the TV and the GPU.
 

TrulyUncouth

Senior member
Jul 16, 2013
I am rarely one to cry fanboy, and I own an AMD card this generation myself, but I think anyone trying to find negatives here must be one. If customers get the additional choice with these cards of pulling off 60Hz over HDMI, which nearly all of us thought impossible until they did it, using nothing but a driver update, then this is a solid value-add from Nvidia.

I am the first to complain when Nvidia tries to push things like G-Sync, because I really don't want to be tied down to a single graphics card company, but in this case they are just doing a great job and providing their customers with a bonus at no additional cost. Props to Nvidia for this one.
 

Attic

Diamond Member
Jan 9, 2010
Because if you're a TV manufacturer, it's this or wait another year before your 4K TVs can do 60Hz. TV manufacturers are (as you'd imagine) bigger fans of the former than the latter.

It's like 4K multi-tile displays. At the end of the day it's a workaround to compensate for the fact that not all parts of the technology are ready, but it's complete enough to ship for early adopters.

Thanks for the response here.

My thinking was that a GPU's DP 1.2 output, through a DP 1.2-to-HDMI 2.0 adapter, into a TV's HDMI 2.0 input could get us 4K 60Hz bliss.

But that's a no-go, because those adapters don't exist yet. It's unclear if or when they will be available.

The more I think about it, the bigger a deal this is for me, because I have an HT receiver with HDMI 1.4a support between my computer and TV. Nvidia's solution could let me keep that receiver and still get 4K 60Hz to a new TV. I hope AMD takes the hint here and delivers something similar through a driver update.
 

Attic

Diamond Member
Jan 9, 2010
Exactly, and most people won't notice the color difference. For me, it's a great solution from Nvidia for people who already have both the TV and the GPU.

We can change the color space in the drivers to mimic the workaround Nvidia has here, so folks can test the color difference of this solution for themselves.
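
If anyone wants to eyeball the trade-off without waiting for the driver, you can also roughly simulate 4:2:0 on a screenshot. A sketch using Python and Pillow (Pillow's JPEG-style YCbCr conversion only approximates what the GPU and TV actually do, and the file names here are placeholders):

```python
# Approximate a 4:2:0 pipeline: convert to YCbCr, discard 3 of every 4
# chroma samples (2x2 downsample), then upsample and convert back to RGB.
from PIL import Image

def simulate_420(path_in: str, path_out: str) -> None:
    img = Image.open(path_in).convert("YCbCr")
    y, cb, cr = img.split()
    half = (img.width // 2, img.height // 2)
    # Downsample each chroma plane 2x2, then stretch it back to full size.
    cb = cb.resize(half).resize(img.size, Image.BILINEAR)
    cr = cr.resize(half).resize(img.size, Image.BILINEAR)
    Image.merge("YCbCr", (y, cb, cr)).convert("RGB").save(path_out)

simulate_420("screenshot.png", "screenshot_420.png")
```

Luma stays at full resolution, so edges stay sharp, but colored text and fine colored detail pick up exactly the kind of fringing people are reporting.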
 

3DVagabond

Lifer
Aug 10, 2009
Most people will think nVidia are awesome for doing this, while ignoring that they are screwing with the colors to reach 60Hz.

That's why I think it's pure marketing strategy. I wouldn't even consider HDMI for 4K even if my R9 290X supported it like nVidia's cards do.

Anyway, my 4K monitor has a DisplayPort input, over which I can easily run 3840x2160 at 60Hz :D .

Can you imagine the outcry if AMD had done this. We saw some shimmering textures in an old obscure game once and the forum exploded with accusations of AMD fudging benchmarks. Now we have nVidia claiming 60Hz @ 4K over HDMI, and it doesn't matter how they're doing it. It's a half-baked implementation; they aren't transmitting the full signal. I wonder if it gives them a performance advantage when benched against a true 4K @ 60Hz DP signal?
 

TerryMathews

Lifer
Oct 9, 1999
Can you imagine the outcry if AMD had done this. We saw some shimmering textures in an old obscure game once and the forum exploded with accusations of AMD fudging benchmarks. Now we have nVidia claiming 60Hz @ 4K over HDMI, and it doesn't matter how they're doing it. It's a half-baked implementation; they aren't transmitting the full signal. I wonder if it gives them a performance advantage when benched against a true 4K @ 60Hz DP signal?
And you can't see the difference between shimmering textures (which is a defect in the primary function of the video card) and enabling color space compression (which is a completely new, optional feature)?

I'd also remind you that at various points in history both companies have been guilty of rigging benchmarks. It is, and will continue to be, a valid concern.
 

Cookie Monster

Diamond Member
May 7, 2005
Can you imagine the outcry if AMD had done this. We saw some shimmering textures in an old obscure game once and the forum exploded with accusations of AMD fudging benchmarks. Now we have nVidia claiming 60Hz @ 4K over HDMI, and it doesn't matter how they're doing it. It's a half-baked implementation; they aren't transmitting the full signal. I wonder if it gives them a performance advantage when benched against a true 4K @ 60Hz DP signal?

Outcry? Maybe it's just me, but it's very hard to understand why some posters are already grabbing their pitchforks.

This "hack" or "patch" is, I think, a valuable and legitimate option (for the time being) for early adopters who can't get 60Hz to work with their 4K displays, because there simply aren't many products that support, let alone physically have, DP 1.2 or HDMI 2.0. If AMD provided the same, it would be just as welcome. And on the plus side, it's free.

Now, what I don't understand is why the topic of performance advantages, benchmarks and such is suddenly cropping up. Just look at the bolded part above. Since when did nVIDIA claim anything here, when it was an AT forum member who discovered this neat option? Also, people do care how they are doing it, because the technique obviously has a trade-off. My feeling is that they would have revealed this option once the R340 driver was WHQL-ready.

On topic: will AT do some investigation into the matter, e.g. image-quality differences in games, video, etc.? It'd be useless for normal PC use, as text would be very hard to read... but I'm curious to see the trade-offs, and it may affect my future HTPC.
 

RussianSensation

Elite Member
Sep 5, 2003
It really doesn't look terrible to me. Some photos here

http://cdn.overclock.net/4/4a/900x900px-LL-4a037243_DSC_0133.jpeg
http://cdn.overclock.net/a/a6/a61aa71c_DSC_0086.jpeg

You can see some color fringing going on with fonts, but aside from that... it looks usable. I would say it is worth the trade-off over 30 Hz, but that's just me.

When I swapped my 6950 for a 7970 over HDMI, the newest drivers didn't enable 4:4:4 full RGB color mode. The difference in color quality was tremendous, to the point where I noticed it immediately. I thought my new 7970 was broken until I manually changed the setting. Sure, it is usable, but the 2D image quality is atrocious to say the least, not even comparable to GPUs from the late '90s.

http://i.imgur.com/RY3YrFn.png

Those comparisons you linked do not accurately show the difference someone will actually see in front of them.
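
For anyone who hasn't hit the limited-range problem first-hand, the damage is easy to show with a quick sketch (my own illustration of the level math, not anything from the drivers):

```python
# PC "full range" RGB uses codes 0-255; video "limited range" uses 16-235.
# A driver defaulting to limited RGB over HDMI effectively applies this squeeze.
def full_to_limited(v: int) -> int:
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v: int) -> int:
    return round((v - 16) * 255 / (235 - 16))

# 256 input shades collapse onto 220 output codes, so tonal steps are lost:
print(len({full_to_limited(v) for v in range(256)}))  # -> 220

# Worse, a display expecting full range shows black (0) as code 16 (dark
# grey) and white (255) as code 235 (dull white):
print(full_to_limited(0), full_to_limited(255))  # -> 16 235
```

That washed-out look is a different problem from 4:2:0 chroma subsampling, but over HDMI the two often show up together, since drivers tend to treat anything on an HDMI port as a TV.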
 

exar333

Diamond Member
Feb 7, 2004
I am rarely one to cry fanboy, and I own an AMD card this generation myself, but I think anyone trying to find negatives here must be one. If customers get the additional choice with these cards of pulling off 60Hz over HDMI, which nearly all of us thought impossible until they did it, using nothing but a driver update, then this is a solid value-add from Nvidia.

I am the first to complain when Nvidia tries to push things like G-Sync, because I really don't want to be tied down to a single graphics card company, but in this case they are just doing a great job and providing their customers with a bonus at no additional cost. Props to Nvidia for this one.

This. It is a feature you can enable or not. No downside here, only potential upside if you decide to enable it. Sure, it has its trade-offs, but it makes sense for some people.
 

BrightCandle

Diamond Member
Mar 15, 2007
This. It is a feature you can enable or not. No downside here, only potential upside if you decide to enable it. Sure, it has its trade-offs, but it makes sense for some people.

It's a choice between bad and bad. You can have 60Hz that looks much worse than YouTube in colour and pixel quality, or 30Hz that looks OK but runs like a console. It's not exactly an excellent trade-off; both are bad for almost any computer use. It's the sort of workaround that will be completely obsolete in six months, because we'll have non-MST, full-60Hz 4K monitors readily available at reasonable prices. Windows will still suck at scaling, but at least the monitors won't be a huge compromise. Early adopters who buy in before the technology is fully baked always lose out; it's just not worth catering to this small market to solve a problem most 4K users should never have. It was a waste of Nvidia dev time better spent on something else.
 

KaRLiToS

Golden Member
Jul 30, 2010
It's a choice between bad and bad. You can have 60Hz that looks much worse than YouTube in colour and pixel quality, or 30Hz that looks OK but runs like a console. It's not exactly an excellent trade-off; both are bad for almost any computer use. It's the sort of workaround that will be completely obsolete in six months, because we'll have non-MST, full-60Hz 4K monitors readily available at reasonable prices. Windows will still suck at scaling, but at least the monitors won't be a huge compromise. Early adopters who buy in before the technology is fully baked always lose out; it's just not worth catering to this small market to solve a problem most 4K users should never have. It was a waste of Nvidia dev time better spent on something else.


It's good for marketing, because nVidia won't shout from the rooftops that they are reducing the color to 4:2:0.

They will simply claim they can do 4K at 60Hz via HDMI 1.4b.

PR is important.
 

Rakehellion

Lifer
Jan 15, 2013
It's the sort of workaround that will be completely obsolete in six months

Hey, guess what? This ain't six months from now, so your argument is complete crap! If you want 4K at 60Hz today, this is your only option, so your futuristic Jetsons technology doesn't do jack for people in the year 2014.
 

KaRLiToS

Golden Member
Jul 30, 2010
Hey, guess what? This ain't six months from now, so your argument is complete crap! If you want 4K at 60Hz today, this is your only option, so your futuristic Jetsons technology doesn't do jack for people in the year 2014.

Stop talking crap. I've been using 4K @ 60Hz with perfect colors for more than two months now over DisplayPort.

You don't have a choice, because the technology doesn't exist yet.

Everything is a compromise.

Maybe you should do your homework before saying the technology doesn't exist.
 

96Firebird

Diamond Member
Nov 8, 2010
It's the sort of workaround that will be completely obsolete in six months...

For the people who have the limited hardware now, it doesn't matter what happens in six months or three years. It's not as if all previously purchased hardware gets updated when HDMI 2.0 comes out...
 

Grooveriding

Diamond Member
Dec 25, 2008
Hey, guess what? This ain't six months from now, so your argument is complete crap! If you want 4K at 60Hz today, this is your only option, so your futuristic Jetsons technology doesn't do jack for people in the year 2014.

???

You can do 4K at 60Hz right now, today, with DisplayPort, and without degrading the colours by 75% (4:2:0 keeps only one chroma sample for every four pixels).
 

KaRLiToS

Golden Member
Jul 30, 2010
Only if your TV has a DisplayPort. It's not exactly common outside PC monitors.

I've never seen such bellyaching over an added feature.

TV manufacturers that sold 4K TVs without HDMI 2.0 are selling hardware upgrades for $300-$400 to enable HDMI 2.0. But that gets you 100% real 4K at 60Hz... not a fake placebo with reduced color quality, given away for free to boost a marketing strategy.
 

Rakehellion

Lifer
Jan 15, 2013
Stop talking crap. I've been using 4K @ 60Hz with perfect colors for more than two months now over DisplayPort.

That's neato. I can do 8K at 120Hz with technology that most people don't have, too. Your point?

You understand that HDMI and DisplayPort are two completely different things and having one doesn't mean you have access to the other, right?
 

KaRLiToS

Golden Member
Jul 30, 2010
That's neato. I can do 8K at 120Hz with technology that most people don't have, too.

And which technology is that?

Your point?

You're making a false claim. I hope I don't need to re-quote the two posts where you said there is no other technology, now that you say you own tech that can do 8K @ 120Hz. lol
 

Grooveriding

Diamond Member
Dec 25, 2008
So we are just talking HDTVs here?!

Well, I guess for those who want to game on a 4K HDTV with a PC, this is something for the next few months. The next gen of video cards will have HDMI 2.0 anyway.

4K HDTVs are really worthless right now. There is no content. Blu-ray is 1080p, and cable/satellite TV providers still deliver 720p for the most part; their networks would have to compress the crap out of 4K content to deliver it, if it even existed.

4K HDTVs aren't good for much besides watching the reference disc, and upscaling 1080p to 4K. A high-quality 1080p plasma beats 1080p upscaled to 4K on an LCD in picture quality. I just don't see the point of 4K anywhere but on a PC monitor currently.
 