[AnandTech] Club3D Releases Their DisplayPort 1.2 to HDMI 2.0 Adapter


boozzer

Golden Member
Jan 12, 2012
1,549
18
81
This is like the niche of an already niche market. Last I checked, 4K gamers were only 0.12%. How many of them game on 4K TVs instead of monitors?
 

TestKing123

Senior member
Sep 9, 2007
204
15
81
I don't imagine most people are trying to run FO4 or TW3 on a 4K TV using their HTPC. Based on my use case, and I imagine many other people's, I use my HTPC for home theater.

Why would you need 4K 60Hz then? 99.99% of media content is either 24fps or 30fps, which only needs HDMI 1.4 for 4K at 30Hz.

The only 48fps release I know of is The Hobbit. Are there others? Even if they exist, I don't think they're 4K 48fps, but 1080p 48fps.

I thought the whole point of HDMI 2.0 converters was to allow >30fps 4K gaming for AMD users. That's what AMD focused on in their marketing for 4K HTPC gamers.
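For reference, a rough back-of-the-envelope sketch of where that 30Hz-vs-60Hz cutoff comes from (assumptions: 8-bit 4:4:4 output, so the TMDS clock equals the pixel clock; the standard CTA-861 4K timing of 4400 x 2250 total pixels including blanking; and the nominal link limits of roughly 340 MHz TMDS for HDMI 1.4 versus 600 MHz for HDMI 2.0):

```python
# Rough sketch: why 4K at 30Hz fits within HDMI 1.4, but 4K60 4:4:4 needs HDMI 2.0.
# Assumes 8-bit 4:4:4 output (TMDS clock == pixel clock) and the standard
# CTA-861 4K timing of 4400 x 2250 total pixels including blanking.

HDMI_1_4_MAX_TMDS_MHZ = 340  # ~10.2 Gbps raw link rate
HDMI_2_0_MAX_TMDS_MHZ = 600  # ~18 Gbps raw link rate

H_TOTAL, V_TOTAL = 4400, 2250  # 3840x2160 active area plus blanking

def pixel_clock_mhz(refresh_hz: int) -> float:
    """Pixel clock required to scan the full 4K timing at a given refresh rate."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (30, 60):
    clk = pixel_clock_mhz(hz)
    print(f"4K @ {hz}Hz 4:4:4: {clk:.0f} MHz pixel clock -> "
          f"HDMI 1.4 {'OK' if clk <= HDMI_1_4_MAX_TMDS_MHZ else 'too slow'}, "
          f"HDMI 2.0 {'OK' if clk <= HDMI_2_0_MAX_TMDS_MHZ else 'too slow'}")
```

This prints roughly 297 MHz for 4K30 (within HDMI 1.4's reach) and 594 MHz for 4K60 (HDMI 2.0 only), which is the gap the adapter is meant to bridge.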
 

96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
Ever tried doing even basic desktop tasks at 30Hz? It is not fun, and the drop in refresh rate is very noticeable...
 

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
This is like the niche of an already niche market. Last I checked, 4K gamers were only 0.12%. How many of them game on 4K TVs instead of monitors?

I am one of the proud few, but I am also very much in the minority. I use my 4K TV for PC gaming and it's been great so far. This adapter is great for the AMD guys, and I wish it had been around while I was still using my 290Xs. I actually switched to Nvidia specifically for HDMI 2.0; I would have gladly stayed with what I had if this adapter had been available.


Ever tried doing even basic desktop tasks at 30Hz? It is not fun, and the drop in refresh rate is very noticeable...

Can confirm that desktop tasks at 30Hz do suck. Moving the mouse cursor felt choppy. Same computer, same display; moving to HDMI 2.0 @ 60Hz made a world of difference.
 
Last edited:

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
It's great that this product has finally materialized, but it's definitely too bad it was even needed.

I was absolutely devastated when I bought my 290X to discover that I couldn't connect it to my FW900 without an active adapter, and even then not at the higher resolutions and refresh rates I was accustomed to. It was my own mistake for believing that it had both DVI-I and DVI-D, but it still sucked in a big way. Then again, I was flabbergasted when the Fury was announced without HDMI 2.0 and only the promise of adapters.

Likewise, Intel dropped the ball on HDMI 2.0 as well. Their NUCs are worthless as 4K HTPCs without it. AMD could have also had a phenomenal NUC-like device with the new HEVC-enabled Carrizo APU, but no HDMI 2.0 to be seen. This adapter is a godsend. It will be interesting to see how many people suddenly recognize/endorse 4K content now that they can actually use it (outside of a tiny 4K DP monitor).

Before anyone responds to say that 4K isn't mature or some BS, keep in mind:

1. 4K gaming has been a reality for years, maybe not AC:Unity on a single card, but nearly any game released before 2013 can be played at 4K on the highest possible settings with minimal effort. You don't even need a very powerful rig to do it. Toss in some ReShade effects and texture packs to modernize and beautify and the experience is wonderful.

2. 4K+ YouTube has been available for years and 4K 60fps videos and gaming streams are everywhere. There's plenty of content out there.

3. 4K GoPro/sporting cameras, cell phones, and all sorts of consumer cameras have been generating homemade 4K content for years. Watching them on the big screen is great!

4. Netflix and Amazon 4K content may not be available officially on PC, but nearly every show is available via less-than-reputable sources, in all their complete 4K HEVC glory.

5. Even if your rig can't game at 4K, using a 4K60 desktop is unrivaled in its awesomeness.

6. The "niche within a niche" argument is a good one, but when your high end GPUs mostly only sell to niches in the first place, skimping on features like this is a big mistake. I can only speak for myself, but AMD lost my dollar by not having HDMI 2.0 in the last round. (They may very well get it back with DP1.3 in the next round.)

7. There are more than 60 4K HDTV models that support 4K60 4:4:4 input over HDMI 2.0 - not including those that can be modified with firmware to do so. There are 38 4K+ DisplayPort monitors available from Newegg, including 30Hz displays (I couldn't filter them out).

All of that said, I'm very happy with my GTX 970 - I can connect any display to it without an adapter. Built in RAMDAC for premium CRT performance, built in HDMI 2.0 for 4K60 4:4:4 on HDTVs, and then DP and DVI for everything else. The only thing it's missing (or not enabled) is VESA AdaptiveSync support and 4GB VRAM (jk lol).
 

TestKing123

Senior member
Sep 9, 2007
204
15
81
Ever tried doing even basic desktop tasks at 30Hz? It is not fun, and the drop in refresh rate is very noticeable...

Well then, the answer would be to set your desktop to a resolution that supports 60Hz.

I don't get the spin here; it seems people are "spinning" the true purpose of a DisplayPort to HDMI 2.0 adapter.

From what I read, it's always been to address the lack of HDMI 2.0 on AMD cards, hence why AMD themselves touted these adapters upon the Fury X's release.

There is no 4K streaming content for PCs, and there won't be for a while. All 4K content is locked to specific TV and other hardware models. You can read the reasoning behind this here:
http://www.techhive.com/article/285...n-a-pc-or-mac-even-though-theyre-capable.html

When UHD Blu-rays are out, you'll need HDCP 2.2 devices. I don't believe you can stick a UHD Blu-ray in a PC and play it (at least not for a while).

Yes, graphics cards (Nvidia's at the moment) are HDMI 2.0 compliant, but that's for gaming. That's what this converter is for too: to allow you to game at >30fps at 4K.

Spinning this as being for 4K PC media content, when none exists, seems a bit absurd. Maybe if you're into making your own home movies, this would make sense.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
It's great that this product has finally materialized, but it's definitely too bad it was even needed.

I was absolutely devastated when I bought my 290X to discover that I couldn't connect it to my FW900 without an active adapter, and even then not at the higher resolutions and refresh rates I was accustomed to. It was my own mistake for believing that it had both DVI-I and DVI-D, but it still sucked in a big way. Then again, I was flabbergasted when the Fury was announced without HDMI 2.0 and only the promise of adapters.

Likewise, Intel dropped the ball on HDMI 2.0 as well. Their NUCs are worthless as 4K HTPCs without it. AMD could have also had a phenomenal NUC-like device with the new HEVC-enabled Carrizo APU, but no HDMI 2.0 to be seen. This adapter is a godsend. It will be interesting to see how many people suddenly recognize/endorse 4K content now that they can actually use it (outside of a tiny 4K DP monitor).

Before anyone responds to say that 4K isn't mature or some BS, keep in mind:

1. 4K gaming has been a reality for years, maybe not AC:Unity on a single card, but nearly any game released before 2013 can be played at 4K on the highest possible settings with minimal effort. You don't even need a very powerful rig to do it. Toss in some ReShade effects and texture packs to modernize and beautify and the experience is wonderful.

2. 4K+ YouTube has been available for years and 4K 60fps videos and gaming streams are everywhere. There's plenty of content out there.

3. 4K GoPro/sporting cameras, cell phones, and all sorts of consumer cameras have been generating homemade 4K content for years. Watching them on the big screen is great!

4. Netflix and Amazon 4K content may not be available officially on PC, but nearly every show is available via less-than-reputable sources, in all their complete 4K HEVC glory.

5. Even if your rig can't game at 4K, using a 4K60 desktop is unrivaled in its awesomeness.

6. The "niche within a niche" argument is a good one, but when your high end GPUs mostly only sell to niches in the first place, skimping on features like this is a big mistake. I can only speak for myself, but AMD lost my dollar by not having HDMI 2.0 in the last round. (They may very well get it back with DP1.3 in the next round.)

7. There are more than 60 4K HDTV models that support 4K60 4:4:4 input over HDMI 2.0 - not including those that can be modified with firmware to do so. There are 38 4K+ DisplayPort monitors available from Newegg, including 30Hz displays (I couldn't filter them out).

All of that said, I'm very happy with my GTX 970 - I can connect any display to it without an adapter. Built in RAMDAC for premium CRT performance, built in HDMI 2.0 for 4K60 4:4:4 on HDTVs, and then DP and DVI for everything else. The only thing it's missing (or not enabled) is VESA AdaptiveSync support and 4GB VRAM (jk lol).
Carrizo has HDMI 2.0...
 

96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
I don't get the spin here; it seems people are "spinning" the true purpose of a DisplayPort to HDMI 2.0 adapter.

The only spinning going on here is in your head. The "answer" is not to drop your resolution to something that supports 60Hz; the answer is to use the soon-to-be-available adapter. I'm not sure what is so hard to understand about that.
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,698
136
Well then, the answer would be to set your desktop to a resolution that supports 60Hz.

I don't get the spin here; it seems people are "spinning" the true purpose of a DisplayPort to HDMI 2.0 adapter.

From what I read, it's always been to address the lack of HDMI 2.0 on AMD cards, hence why AMD themselves touted these adapters upon the Fury X's release.

There is no 4K streaming content for PCs, and there won't be for a while. All 4K content is locked to specific TV and other hardware models. You can read the reasoning behind this here:
http://www.techhive.com/article/285...n-a-pc-or-mac-even-though-theyre-capable.html

When UHD Blu-rays are out, you'll need HDCP 2.2 devices. I don't believe you can stick a UHD Blu-ray in a PC and play it (at least not for a while).

Yes, graphics cards (Nvidia's at the moment) are HDMI 2.0 compliant, but that's for gaming. That's what this converter is for too: to allow you to game at >30fps at 4K.

Spinning this as being for 4K PC media content, when none exists, seems a bit absurd. Maybe if you're into making your own home movies, this would make sense.

I'd imagine the majority of people interested in this would be users of Hawaii, Fiji or Kepler cards who want to game on a 4k TV, though it's hard to say how large that market really is.

These do still make sense for a small HTPC even if you can't run new games on them. Even if most 4K content in the short term isn't Blu-ray, over HDMI 1.x you can't just switch up to 60Hz/4:4:4 when you're not playing that lower refresh rate content. Using a DP to HDMI 2.0 converter lets you just leave your HTPC at 4K60 and get a smooth desktop experience with proper text, without having to switch back to a lower resolution.

The same use case works for people using 4k as a primary desktop monitor as well. I can run DP since my 43" 4k monitor supports it, but if I'd bought a TV instead I would need an adapter to do 4k/60/4:4:4. I can actually game at 4k and >30fps, but you definitely don't need a high power GPU to enjoy 4k on the desktop.
 

TestKing123

Senior member
Sep 9, 2007
204
15
81
The only spinning going on here is in your head. The "answer" is not to drop your resolution to something that supports 60Hz; the answer is to use the soon-to-be-available adapter. I'm not sure what is so hard to understand about that.

So to just run a desktop:

Buy a 4K TV with no DisplayPort input and only HDMI 2.0, buy an AMD GPU with no native HDMI 2.0 support on top of that, and on top of that buy a DisplayPort to HDMI 2.0 adapter (which isn't even available yet).


Or:

Buy an actual monitor and use the DisplayPort input, like this one:
http://www.newegg.com/Product/Product.aspx?Item=N82E16824009726


Yeah, option #1 makes perfect sense over #2 in your world, doesn't it? :rollseyes:
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,698
136
So to just run a desktop:

Buy a 4K TV with no DisplayPort input and only HDMI 2.0, buy an AMD GPU with no native HDMI 2.0 support on top of that, and on top of that buy a DisplayPort to HDMI 2.0 adapter (which isn't even available yet).


Or:

Buy an actual monitor and use the DisplayPort input, like this one:
http://www.newegg.com/Product/Product.aspx?Item=N82E16824009726


Yeah, option #1 makes perfect sense over #2 in your world, doesn't it? :rollseyes:

I'm not sure exactly what you're arguing about. Small monitors have started to appear over the last year, but options for larger screens still seem limited to TVs, the Philips display, and imported Korean displays. Judging by the activity on the HardOCP display forum, there are a lot more people using a 4K TV as a monitor than 40"+ DP monitors.

If you already have a GPU system that works fine, be it Intel, AMD or older nVidia, and want to use a 4k TV as a monitor, these adapters are a perfectly acceptable choice, vs spending $150+ on a GM20X card.
 

TestKing123

Senior member
Sep 9, 2007
204
15
81
I'd imagine the majority of people interested in this would be users of Hawaii, Fiji or Kepler cards who want to game on a 4k TV, though it's hard to say how large that market really is.

These do still make sense for a small HTPC even if you can't run new games on them. Even if most 4K content in the short term isn't Blu-ray, over HDMI 1.x you can't just switch up to 60Hz/4:4:4 when you're not playing that lower refresh rate content. Using a DP to HDMI 2.0 converter lets you just leave your HTPC at 4K60 and get a smooth desktop experience with proper text, without having to switch back to a lower resolution.

The same use case works for people using 4k as a primary desktop monitor as well. I can run DP since my 43" 4k monitor supports it, but if I'd bought a TV instead I would need an adapter to do 4k/60/4:4:4. I can actually game at 4k and >30fps, but you definitely don't need a high power GPU to enjoy 4k on the desktop.

If the goal is just to run a 4K60 desktop, it makes more sense to use a desktop monitor with a native PC input rather than jumping through these hoops. Building an HTPC around this limitation when you cannot even stream official 4K content, or play UHD Blu-rays directly on the PC, seems a bit backward.

The argument isn't about whether 30Hz is good enough for a desktop, because it obviously isn't. The question is what exactly you're going to do with 4K 60Hz on your limited HTPC that you need to go out and buy an adapter specifically for, if not content or gaming. Just running the desktop is absurd, because I can tell you my Windows 10 desktop looks pretty much the same in either 2160p or 1080p because of DPI scaling. The only time I can tell a difference is when I use 4K background images; otherwise fonts and text all look the same.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
and so it begins...you spin me right round, baby right round...
The cable isn't available yet...

So I don't know what you're talking about; if I want HDMI 2.0, I still have to wait for AMD...

On top of that it adds to the cost, bringing the Fury X to the cost of aftermarket 980 Tis, where it gets stomped so badly it's the exact reason I'm waiting for next gen. I refuse to have a poor-performing $680 Fury X joke GPU when the aftermarket 980 Ti stomps all over it for the same price...

Oh, and I don't have to wait for the special cable adapter. I have far more cable choices.

Stop downplaying every single issue the Fury X and the Fiji line in general has.

It was a horrendous release. There really is just no justification for defending AMD for how poorly Fiji performed...
 
Last edited:

TestKing123

Senior member
Sep 9, 2007
204
15
81
I'm not sure exactly what you're arguing about. Small monitors have started to appear over the last year, but options for larger screens still seem limited to TVs, the Philips display, and imported Korean displays. Judging by the activity on the HardOCP display forum, there are a lot more people using a 4K TV as a monitor than 40"+ DP monitors.

If you already have a GPU system that works fine, be it Intel, AMD or older nVidia, and want to use a 4k TV as a monitor, these adapters are a perfectly acceptable choice, vs spending $150+ on a GM20X card.


I suppose that would make sense for someone who already has a 4K TV and an AMD/Intel GPU and wants to use it as a monitor for whatever reason. But brand new? It seems better to just buy a 40"-or-larger monitor outright, especially since prices aren't that far apart.

http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY37G8567
 

tential

Diamond Member
May 13, 2008
7,355
642
121
I suppose that would make sense for someone who already has a 4K TV and an AMD/Intel GPU and wants to use it as a monitor for whatever reason. But brand new? It seems better to just buy a 40"-or-larger monitor outright, especially since prices aren't that far apart.

http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY37G8567
Unless you want bigger than 40 inches. Jesus Christ, stop pushing everyone toward what you consider to be big. I've seen a lot of users with the Vizio P series and GTX 900 series cards.

For me, it was a 980 Ti and a 70-inch Vizio P series... or nothing. And I chose nothing, since I hoped the Fury X would be faster...

I'm happy I chose nothing, since FreeSync came to larger monitors. But I still hope it goes into a 65-inch monitor, because 55 inches is a massive compromise for me.

So next gen, if Pascal is fast enough, I'll get a 4K LightBoost HDTV from Sony and Pascal over AMD FreeSync + a 55-inch monitor.

You can't pigeonhole AMD into the best choice.

Sometimes, it's just a bad choice for flexibility. Get over it.
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,698
136
If the goal is just to run a 4K60 desktop, it makes more sense to use a desktop monitor with a native PC input rather than jumping through these hoops. Building an HTPC around this limitation when you cannot even stream official 4K content, or play UHD Blu-rays directly on the PC, seems a bit backward.

The argument isn't about whether 30Hz is good enough for a desktop, because it obviously isn't. The question is what exactly you're going to do with 4K 60Hz on your limited HTPC that you need to go out and buy an adapter specifically for, if not content or gaming. Just running the desktop is absurd, because I can tell you my Windows 10 desktop looks pretty much the same in either 2160p or 1080p because of DPI scaling. The only time I can tell a difference is when I use 4K background images; otherwise fonts and text all look the same.

I don't plan to run 4K on my HTPC until I can replace my Panasonic plasma with a 4K OLED TV, but I can see the value in doing so if you have a 4K LCD. If you are running HDMI 1.4, you can either output 4K at 30Hz or 60Hz at a lower resolution. I primarily just watch media on mine, so I would probably just put up with 30Hz, but it's not uncommon for people to surf the internet on their HTPCs, and that would be maddening at 30Hz. Even if 4K content isn't commonly 60Hz, not having to switch back and forth between 4K and 60Hz would be worth the cost of buying the adapter.

For my desktop, I was willing to go out and buy a monitor with DP off eBay from a Korean seller, and hope I get one without excessive backlight bleed or dead pixels. Most people aren't willing to do that, so their options are limited. A Samsung 40JU6500 is also cheaper than a Crossover 404k, and you can just walk into a Best Buy and take it home.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I wonder how many people who are here arguing about how awesome or crappy this is actually want or need one? :lol:

It's far more likely that this will be needed some time in the future, when people's current AMD/Kepler gaming card gets relegated to their HTPC and they own a 4K TV. For these people it will be perfect, and it's not so expensive that you'd be better off buying a cheap but similar-performing card instead.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
I always assumed people mostly wanted 4K 60Hz for media consumption. That's what HTPCs are for, for example. Not sure why the argument.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
I don't plan to run 4K on my HTPC until I can replace my Panasonic plasma with a 4K OLED TV, but I can see the value in doing so if you have a 4K LCD. If you are running HDMI 1.4, you can either output 4K at 30Hz or 60Hz at a lower resolution. I primarily just watch media on mine, so I would probably just put up with 30Hz, but it's not uncommon for people to surf the internet on their HTPCs, and that would be maddening at 30Hz. Even if 4K content isn't commonly 60Hz, not having to switch back and forth between 4K and 60Hz would be worth the cost of buying the adapter.

For my desktop, I was willing to go out and buy a monitor with DP off eBay from a Korean seller, and hope I get one without excessive backlight bleed or dead pixels. Most people aren't willing to do that, so their options are limited. A Samsung 40JU6500 is also cheaper than a Crossover 404k, and you can just walk into a Best Buy and take it home.
It's great that you can see the value in using a 4K screen if you have one. Many AMD fans believe we should wait until a third-party cable is available to use a 4K HDTV.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,400
2,437
146
It's great that you can see the value in using a 4K screen if you have one. Many AMD fans believe we should wait until a third-party cable is available to use a 4K HDTV.

Quit with the trolling; we don't need to turn this into an Nvidia vs. AMD argument. This is just going to start fights. Stay on topic or the thread will be closed.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Just a warning. I would like to see whether this is an isolated case or not, and whether compatibility is a common issue.

One person got the dongle, using a 290 with Crimson drivers, and this is the result he gets:

[attached image: sp1d18d.jpg]


Anyone else got it yet?
 
Last edited:

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Just a warning. I would like to see whether this is an isolated case or not, and whether compatibility is a common issue.

One person got the dongle, using a 290 with Crimson drivers, and this is the result he gets:

[attached image: sp1d18d.jpg]


Anyone else got it yet?

ha