and so it begins...you spin me right round, baby right round...
Reading malfunction.....
I don't imagine most people are trying to run FO4 or TW3 on a 4k TV using their HTPC. Based on my use case and I imagine many other people's, I use my HTPC for home theater.
this is like the niche of an already niche market. last I checked, 4k gamers were only 0.12%. how many of them game with 4k tvs instead of monitors?
Ever tried using even desktop tasks at 30Hz? It is not fun, and the drop in refresh rate is very noticeable...
Carrizo has HDMI 2.0...It's great that this product has finally materialized, but it's definitely too bad it was even needed.
I was absolutely devastated when I bought my 290X to discover that I couldn't connect it to my FW900 without an active adapter, and even then not at the higher resolutions and refresh rates I was accustomed to. It was my own mistake for believing that it had both DVI-I and DVI-D, but it still sucked in a big way. Then again, I was flabbergasted when the Fury was announced without HDMI 2.0 and only the promise of adapters.
Likewise, Intel dropped the ball on HDMI 2.0 as well. Their NUCs are worthless as 4K HTPCs without it. AMD could have also had a phenomenal NUC-like device with the new HEVC-enabled Carrizo APU, but no HDMI 2.0 to be seen. This adapter is a godsend. It will be interesting to see how many people suddenly recognize/endorse 4K content now that they can actually use it (outside of a tiny 4K DP monitor).
Before anyone responds to say that 4K isn't mature or some BS, keep in mind:
1. 4K gaming has been a reality for years, maybe not AC:Unity on a single card, but nearly any game released before 2013 can be played at 4K on the highest possible settings with minimal effort. You don't even need a very powerful rig to do it. Toss in some ReShade effects and texture packs to modernize and beautify and the experience is wonderful.
2. 4K+ YouTube has been available for years and 4K 60fps videos and gaming streams are everywhere. There's plenty of content out there.
3. 4K GoPro/sporting cameras, cell phones, and all sorts of consumer cameras have been generating homemade 4K content for years. Watching them on the big screen is great!
4. Netflix and Amazon 4K content may not be available officially on PC, but nearly every show is available via less-than-reputable sources, in all their complete 4K HEVC glory.
5. Even if your rig can't game at 4K, using a 4K60 desktop is unrivaled in its awesomeness.
6. The "niche within a niche" argument is a good one, but when your high end GPUs mostly only sell to niches in the first place, skimping on features like this is a big mistake. I can only speak for myself, but AMD lost my dollar by not having HDMI 2.0 in the last round. (They may very well get it back with DP1.3 in the next round.)
7. There are more than 60 4K HDTV models that support 4K60 4:4:4 input over HDMI 2.0 - not including those that can be modified with firmware to do so. There are 38 4K+ DisplayPort monitors available from Newegg, including 30Hz displays (I couldn't filter them out).
All of that said, I'm very happy with my GTX 970 - I can connect any display to it without an adapter. Built in RAMDAC for premium CRT performance, built in HDMI 2.0 for 4K60 4:4:4 on HDTVs, and then DP and DVI for everything else. The only thing it's missing (or not enabled) is VESA AdaptiveSync support and 4GB VRAM (jk lol).
Carrizo has HDMI 2.0...
I don't get the spin here, it seems here people are "spinning" the true purpose of a displayport to HDMI 2.0 adaptor.
Well then, the answer would be to set your desktop to a 60hz resolution.
I don't get the spin here, it seems here people are "spinning" the true purpose of a displayport to HDMI 2.0 adaptor.
From what I read, it's always been to address the limitations of lack of HDMI 2.0 on AMD cards, hence why AMD themselves touted them upon Fury X's release.
There is no 4k streaming content for PCs, and there won't be for a while. All 4k content is locked to specific TV and other hardware models. You can read the reasoning behind this here:
http://www.techhive.com/article/285...n-a-pc-or-mac-even-though-theyre-capable.html
When UHD Blu Rays are out, you'll need HDCP 2.2 devices. I don't believe you can stick a UHD blu ray on a PC and run it (at least not for a while).
Yes, graphics cards (Nvidia at the moment) are HDMI 2.0 compliant, but it's for gaming. Just like this converter is for, to allow you to game at > 30fps at 4k.
Spinning this as being for 4k PC media content, when none exists, seems a bit absurd. Maybe if you're into making your own home-made movies, this would make sense.
The only spinning going on here is in your head. The "answer" is not to drop your resolution to something that supports 60Hz, the answer is to use the soon-to-be available adapter. I'm not sure what is so hard to understand about that.
So to just run a desktop:
Buy a 4k TV with no display port input and only HDMI 2.0, and buy an AMD gpu on top of that with no native HDMI 2.0 support, and on top of that buy a displayport to HDMI 2.0 adaptor (which isn't even available yet).
Or:
Buy an actual monitor and use the displayport input, like this one:
http://www.newegg.com/Product/Product.aspx?Item=N82E16824009726
Yeah, option #1 makes perfect sense over #2 in your world, doesn't it? :rollseyes:
I'd imagine the majority of people interested in this would be users of Hawaii, Fiji or Kepler cards who want to game on a 4k TV, though it's hard to say how large that market really is.
These do still make sense for a small HTPC even if you can't run new games on it, though. Even if most 4k content isn't Blu-ray in the short term, you can't just switch your refresh rate up to 60Hz/4:4:4 over HDMI 1.x when you're not playing lower-refresh-rate content. Using a DP to HDMI 2.0 converter lets you just leave your HTPC at 4k60 and get a smooth desktop experience with proper text, without having to switch back to a lower resolution.
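The bandwidth arithmetic behind this argument can be sketched quickly. A minimal back-of-the-envelope check, assuming roughly 10% blanking overhead and 8b/10b-derived payload figures (my own approximations, not numbers from this thread):

```python
def data_rate_gbps(h, v, refresh, bpp=24, blanking=1.10):
    """Approximate video data rate in Gbit/s for a given mode,
    assuming ~10% blanking overhead and 24-bit 4:4:4 color."""
    return h * v * refresh * blanking * bpp / 1e9

rate_4k60 = data_rate_gbps(3840, 2160, 60)  # ~13.1 Gbps
rate_4k30 = data_rate_gbps(3840, 2160, 30)  # ~6.6 Gbps

HDMI_14_GBPS = 8.16   # approx. HDMI 1.4 max video payload (10.2 Gbps link)
HDMI_20_GBPS = 14.4   # approx. HDMI 2.0 max video payload (18 Gbps link)

# 4k60 4:4:4 overshoots HDMI 1.4 but fits in HDMI 2.0,
# which is why the adapter (or a native HDMI 2.0 port) is needed.
print(f"4k60 4:4:4 needs ~{rate_4k60:.1f} Gbps")
print(f"4k30 4:4:4 needs ~{rate_4k30:.1f} Gbps")
```

This is why HDMI 1.x tops out at 4k30 (or 4k60 only with chroma subsampling), while the DP 1.2 source feeding this adapter has the headroom for full 4k60 4:4:4.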
The same use case works for people using 4k as a primary desktop monitor as well. I can run DP since my 43" 4k monitor supports it, but if I'd bought a TV instead I would need an adapter to do 4k/60/4:4:4. I can actually game at 4k and >30fps, but you definitely don't need a high power GPU to enjoy 4k on the desktop.
The cable isn't even available yet...and so it begins...you spin me right round, baby right round...
I'm not sure exactly what you're arguing about. Small monitors have started to appear over the last year, but options for larger screens still seem limited to TVs, the Philips display, and imported Korean displays. Judging by the activity on the HardOCP display forum, there are a lot more people using a 4k TV as a monitor than 40"+ DP monitors.
If you already have a GPU system that works fine, be it Intel, AMD or older nVidia, and want to use a 4k TV as a monitor, these adapters are a perfectly acceptable choice, vs spending $150+ on a GM20X card.
Unless you want bigger than 40 inches. Jesus Christ, stop pushing everyone to what you consider to be big. I've seen a lot of users with the Vizio P series and GTX 900 series cards.

I suppose that would make sense for someone who already has a 4k TV and an AMD/Intel GPU, and wants to use that as a monitor for whatever reason. But brand new? It seems better to just buy a 40" or larger monitor outright, especially since prices aren't that far apart.
http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY37G8567
If the goal is just to run a 4k 60 desktop, it just makes sense to use a desktop monitor with native PC input rather than jumping through hoops. Building an HTPC around this limitation when you cannot even stream official 4k content, or play UHD Blu Rays directly on the PC, seems a bit backward.
The argument isn't that "30hz is good for a desktop", because it obviously isn't. The question is: what exactly are you going to do with 4k 60hz on your limited HTPC, that you need to go out and buy an adaptor specifically for, if not content or gaming? Just running the desktop is absurd, because I can tell you my Windows 10 desktop looks pretty much the same in either 2160p or 1080p because of DPI scaling. The only time I can tell a difference is when I use 4k background screens; otherwise font sizes and text all look the same.
I don't plan to run 4k on my HTPC until I can replace my Panasonic plasma with a 4k OLED TV, but I can see the value in doing so if you have a 4k LCD. If you are running HDMI 1.4, you can either output 4k at 30Hz or 60Hz at a lower resolution. I primarily just watch media on mine, so I would probably just deal with the limited time at 30Hz, but it's not uncommon for people to surf the internet on their HTPCs, and that would be maddening at 30Hz. Even if 4k content isn't commonly 60Hz, not having to switch back and forth between 4k and 60Hz would be worth the cost of the adapter.
For my desktop, I was willing to go out and buy a monitor with DP off eBay from a Korean seller, and hope I get one without excessive backlight bleed or dead pixels. Most people aren't willing to do that, so their options are limited. A Samsung 40JU6500 is also cheaper than a Crossover 404k, and you can just walk into a Best Buy and take it home.
That's great that you can see the value in wanting to use a 4k screen if you have one. Many amd fans believe we should wait until a third party cable is available to use a 4k hdtv.
Just a warning. I would like to see if it's an isolated case or not, and whether compatibility is a common issue.
One person got the dongle, using a 290 with Crimson drivers, and this is the result he gets.
Anyone else got it yet?