Problem is, manufacturers are actually not allowed to advertise HDMI version numbers.
http://www.hdmi.org/download/guidelines/2009_11_18_RevisedTradeLogo_Guidelines_FINAL_a.pdf
I'm shocked that not all manufacturers are sticking to HDMI 2.0, which was one of Nvidia's major marketing points for this series.
I'd cross the MSI off the list for that reason alone. The similar Asus Strix has HDMI 2.0: http://www.asus.com/us/Graphics_Cards/STRIXGTX970DC2OC4GD5/specifications/
Also, reviewing the HDMI guidelines linked previously really calls into question the legitimacy of the standard. The prohibition on advertising the version number of an HDMI cable can and does cause mass confusion on the part of consumers due to incompatibilities and poor performance.
I for one spent weeks troubleshooting a problem with a Blu-ray player before I realized I was using a "1.2" cable where a "1.3" cable was needed - and of course the cables are not labeled, making troubleshooting an absolute shot in the dark.
The fact that 1.4 cables, for instance, aren't guaranteed to possess the bandwidth to support 4K 60Hz is going to mean lots of very angry 4K HDTV customers who think their TVs or output devices are broken.
Cables people, cables.
You can specify the type of port. Cables are either High Speed (which have been on the market since well before 1.2 came out) or Standard. There's also with and without Ethernet channel, but that's not something most people are concerned with for this application.
They don't want people specifying version numbers for cables because THEY DON'T MATTER. You'll have uninformed people who had a card with a 1.2 port, who upgrade to a 1.4b or 2.0 port and think they need a "1.4 HDMI cable" or a "2.0 HDMI cable", neither of which EXISTS. You only need a "high speed" cable... which almost every cable sold in the last 4 years probably is.
There's nothing stopping a mfg from listing the type of port. HDMI 2.0 ports don't require a new cable. Just that it's "high speed".
Furthermore, it's as yet unclear whether the tremendous increase in bandwidth required by 4K/60Hz, only supported by HDMI 2.0, can actually be passed on older cables. Because there have been no devices (up until the release of the GTX 980, as a matter of fact) that could output 4K/60Hz, this issue has not yet been put to the test.
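For a sense of scale, here's a rough sketch of the bandwidth involved (my own arithmetic using the commonly cited TMDS clock limits, not official figures from the HDMI spec; the blanking factor assumes the standard 4K60 timing):

```python
# Back-of-the-envelope HDMI bit-rate check. TMDS uses 8b/10b encoding,
# so the raw line rate is pixel data * 10/8. The constants below are
# the commonly cited numbers, not quotes from the HDMI spec itself.

def tmds_gbps(h, v, hz, bpp, blanking=1.194):
    """Approximate total TMDS bit rate in Gbit/s for an active h x v
    mode at the given refresh rate and bits per pixel.
    blanking ~= 594 MHz / 497.7 MHz, the overhead of the standard
    4K60 timing (4400x2250 total vs 3840x2160 active).
    """
    return h * v * hz * blanking * bpp * 10 / 8 / 1e9

HIGH_SPEED_CERT = 10.2   # Gbit/s: "High Speed" cable cert (340 MHz TMDS)
HDMI_2_0_MAX = 18.0      # Gbit/s: HDMI 2.0 ceiling (600 MHz TMDS)

for label, bpp in [("4:4:4, 24 bpp", 24), ("4:2:0, 12 bpp", 12)]:
    rate = tmds_gbps(3840, 2160, 60, bpp)
    print(f"4K60 {label}: {rate:.1f} Gbit/s | "
          f"within High Speed cert: {rate <= HIGH_SPEED_CERT} | "
          f"within HDMI 2.0: {rate <= HDMI_2_0_MAX}")
```

So 4K60 4:4:4 needs roughly 17.8 Gbit/s, well past the 10.2 Gbit/s that High Speed cables are actually certified for. HDMI 2.0 is effectively trusting those cables to run far beyond their tested clock rate, which is exactly why field results are the only real answer here.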
Does HDMI 2.0 require new cables?
No, HDMI 2.0 features will work with existing HDMI cables. Higher bandwidth features, such as 4K@50/60 (2160p) video formats, will require existing High Speed HDMI cables (Category 2 cables).
http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx#144
Even these cheap cables should be fine - http://www.amazon.com/AmazonBasics-H...productDetails
I hope you are correct, but just because the HDMI consortium assures us that the millions of existing high-speed cables on the market and in homes will work for HDMI 2.0 does not mean this is actually true. The bandwidth increase is significant, and therefore additional shielding may be required to avoid crosstalk.
Again, the HDMI 2.0 spec simply has not been tested in the field, as there were no HDMI 2.0 output devices until Nvidia's GTX 970/980 arrived.
The only way to know is to check the manufacturer's website for the exact model you are looking to purchase.
4K@60Hz is working on my HU8550 and GTX 980.
"Make sure it's displaying 1:1 (pixel exact). Browsers may do funny things, so I recommend downloading it and viewing it in an image viewer of some sort where you can click a 1:1 zoom button.
You should see pairs of vertical bars of black, blue, green, red, and then pairs of horizontal bars red, green, blue, black.
Each pair has two halves, offset by one pixel. On a 4:4:4 panel, the lines will come through clear (no blurring between or around the pairs, both halves are the same color, the colored lines are all fully saturated, etc). Very slowly dragging the image around the screen should not result in a "shimmering" effect."
Do you happen to know if it's running at 4:4:4 (no chroma subsampling)? There were some concerns expressed on an Amazon review about the Samsung UN40HU6950 (which does offer HDMI 2.0 support) that it was downsampling the chroma to 4:2:0 even when using HDMI 2.0.
The reviewer (David) suggested testing the following image:
http://s10.postimg.org/wkjtapxll/colors.png
Any chance you could try this on your Samsung TV? I'd be interested to know if the problem exists with HDMI 2.0. (The reviewer tested it with the built-in image viewer on a USB stick, which could have problems of its own.)
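In case the linked image ever disappears, a crude stand-in with the same idea can be generated locally. This is my own construction, not the reviewer's original colors.png: single-pixel-wide colored bars that stay crisp on a true 4:4:4 path but smear into each other under chroma subsampling. It writes a PPM file, so no imaging library is needed:

```python
# Generate a simple chroma-subsampling test pattern as a binary PPM.
# Adjacent 1-px bars of saturated colors: on a 4:4:4 signal each bar
# stays a solid, distinct color; 4:2:2 / 4:2:0 averages the chroma of
# neighboring columns and the bars visibly blur together.

COLORS = [(0, 0, 0), (0, 0, 255), (0, 255, 0), (255, 0, 0)]

def make_pattern(width=256, height=128):
    """Rows of pixels cycling through 1-px-wide colored bars."""
    row = [COLORS[x % len(COLORS)] for x in range(width)]
    return [row] * height

def write_ppm(path, pixels):
    """Write a list-of-rows RGB image as a binary (P6) PPM file."""
    h, w = len(pixels), len(pixels[0])
    with open(path, "wb") as f:
        f.write(f"P6 {w} {h} 255\n".encode())
        for row in pixels:
            f.write(bytes(c for px in row for c in px))

if __name__ == "__main__":
    write_ppm("chroma_test.ppm", make_pattern())
```

Same caveat as with the original image: view it at exactly 1:1 zoom, since any scaling in the viewer will blur the bars regardless of the signal path.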
I have a 4K TV and a 780 (so no HDMI 2.0), and I had trouble telling that it was running 4:2:0 until I used a test pattern. It seems able to output 60Hz, which, unless I'm mistaken, is not possible with HDMI 1.x even with 4:2:0, so I'm not sure what kind of hack it uses for that?
4K60Hz is possible over HDMI 1.4 with 4:2:0 chroma on Nvidia cards...
http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of
Thanks, I didn't check my facts there.
I did try setting it down to 24 and 30Hz, since I wrongly expected that to be the max even for 4:2:0. However, it output the same (reduced) colors at 24Hz. I would presume it should be able to use the full color space at that refresh rate, which needs less than half the bandwidth of 60Hz?
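That presumption looks right on paper. Quick arithmetic (my own figures, not from the HDMI spec: the standard 4K24 timing uses a 297 MHz pixel clock, and TMDS 8b/10b encoding adds 10/8 overhead):

```python
# Does 4K24 with full 4:4:4 color fit within HDMI 1.4's limits?
HIGH_SPEED_LIMIT = 10.2                     # Gbit/s, HDMI 1.3/1.4 ceiling
gbps_4k24_444 = 297e6 * 24 * 10 / 8 / 1e9   # 297 MHz clock, 24 bpp, 8b/10b
print(f"4K24 4:4:4: {gbps_4k24_444:.2f} Gbit/s, "
      f"fits HDMI 1.4: {gbps_4k24_444 <= HIGH_SPEED_LIMIT}")
```

That works out to about 8.9 Gbit/s, comfortably inside the HDMI 1.4 limit, so if the driver still sends reduced color at 24Hz it looks like a driver or EDID limitation rather than a bandwidth one.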
I kind of wish the Nvidia control panel had some more easily available info (or a flat-out warning) about color space.