Fury cards will NOT have HDMI 2.0 outputs


garagisti

Senior member
Aug 7, 2007
592
7
81
I know, I frequent the AVS forums and I couldn't find anything about it.
It is right there in one of the threads I linked. The chap who owns one of the 9xx cards mentioned that it supports 1.3, meaning 1080p Blu-ray will work fine, but the newer standard, not so much. So anyone using it for UHD playback can expect a healthy downgrade of the video signal. Well, it wasn't pretty at all on a 1080p screen, and after buying a new compliant Blu-ray player and receiver, it will fill my heart with so much joy to not have it work. Utter pillocks, the lot of them at the UHD consortium. This standardisation business should have been finalised much earlier.
 

flopper

Senior member
Dec 16, 2005
739
19
76
http://forums.overclockers.co.uk/showthread.php?t=18677121&page=24

Just to confirm.

The AMD Radeon™ Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.

In addition, Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
http://forums.overclockers.co.uk/showthread.php?t=18677121&page=24

Just to confirm.

The AMD Radeon™ Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.

In addition, Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.
Thanks, I'll wait for the adapters to come along, and hopefully we will know about the Nano too in the meantime. It would be great if HDCP 2.2 compliance could somehow be shoehorned in.
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
It is right there in one of the threads I linked. The chap who owns one of the 9xx cards mentioned that it supports 1.3, meaning 1080p Blu-ray will work fine, but the newer standard, not so much. So anyone using it for UHD playback can expect a healthy downgrade of the video signal. Well, it wasn't pretty at all on a 1080p screen, and after buying a new compliant Blu-ray player and receiver, it will fill my heart with so much joy to not have it work. Utter pillocks, the lot of them at the UHD consortium. This standardisation business should have been finalised much earlier.

I searched, and I didn't see any links to the AVS Forum that pertain to this... Am I missing something? Many times you said you read it on the AVS forums, but you never linked to it.

The only thread I found on the AVS forums was from back in 2014, when they were speculating if it would work or not.

I found it!

The guy mentions not having an SIL9777 chipset, but I don't think graphics cards need external chipsets to control HDMI. I believe that is done within the display controller on the GPU...

Edit - Bummer, only GM200 supports HDCP 2.2...

Guru3D said:
With the amount of 4K content expected to explode in the coming years, GM200 also adds native support for HDCP 2.2 content protection over HDMI.

Source

Better than none of them, I guess.
 
Last edited:

Lalilulelo

Member
Jun 1, 2015
34
0
0
http://forums.overclockers.co.uk/showthread.php?t=18677121&page=24

Just to confirm.

The AMD Radeon™ Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.

In addition, Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.


That's pretty nice for the people who need HDMI 2.0, plus it'll be an official working adapter rather than some cheap third-party one :cool: .
 

crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
The list of available FreeSync monitors that I found in a brief search looked a bit thin. I wonder if a separate thread about these monitors, and perhaps any other display device with 4K, 4:4:4, 60Hz, AND DP would be helpful. Some of the other lists I found were about HDMI 2.0 devices.
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
That's pretty nice for the people who need HDMI 2.0, plus it'll be an official working adapter rather than some cheap third-party one :cool: .


I don't believe he means official adapters from AMD, more like "there should be some coming to the market soon".
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
DVI is a dead standard. Several major players including AMD said way back in 2010 that they were dropping support for legacy interfaces. 2015, here we are.

Too bad they also dropped support, or, um, never had it, for the most important TV standard in a fast-growing (though admittedly pretty small) market.

Why are TVs so dead set on HDMI? Or, conversely, why are monitors so set on DP? Don't a lot of companies make both? Why not just get together and make one industry standard for both?

I don't know. With all those comments on this, there seem to be a lot of people with a 4K TV who play PC games on it...

I do have some friends with 4K TVs but none of them have a PC attached to it.

That said, not having the new stuff on a new GPU is kinda *meh*

Naw, there are a FEW people with 4K TVs and a lot of NV shills/rabid fanboys looking to capitalize on AMD's mistake. In AMD's defense, this is pretty close to a best-case scenario, as the Green Team probably had something like the following on their list of "ways to discredit Fury so we don't have to lower prices too much":

1. Not as fast as NV
2. Really power hungry
3. Really inefficient
4. Not enough Ram
5. Huddy is the worst public speaker since Ferris Bueller's teacher
37. no hdmi 2.0

#4 is problematic b/c if they get too aggressive on the RAM attack then they could hurt sales of the 980 vs the 390X, while #5 is a pretty solid way to attack AMD. I mean...come on guys, you can't find somebody with more personality than a piece of wood to present your super-cool™ next-gen monster of a GPU??

https://www.youtube.com/watch?v=KS6f1MKpLGM
 
Last edited:

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Serious question: are there 4K TVs with input lag comparable to a good PC monitor for gaming? Input lag is what has always kept me away from gaming seriously on TVs.

I heard 22-24 ms last night as a best-case scenario. That probably wouldn't bug me one bit in D3 or Civ5, so now I have some interdasting thoughts for my Man Cave 2016™...though it sounds like it would be pretty painful for somebody playing an FPS.

Probably not but for other people it's not a big deal. 30 ms of input lag isn't a problem to me. I've played with over 100 ms before and I'm sure many console gamers have too.

There simply isn't enough time in the day right now to stress how mad I am about zero HDMI 2.0 support.

Ugh, this decision is 10 times harder, when before the Fury X was an easy win. What's really odd is why AMD would just leave money on the table like that...

The few times that I've gamed on a TV have been a serious PITA; you don't want to introduce more potential issues than necessary. Even if you can get an active adapter for a reasonable price on day 1 (highly unlikely), I think you'd be better off going with the 980 Ti this round.
 
Last edited:

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
But, but, that would be common sense. It is not like the lack of HD audio bitstreaming stopped the same lot from buying Nvidia before,
It stopped me. None of my HTPCs touched an NVIDIA GPU until Kepler hit.

but now that it is AMD, things are a bit different. Besides, nothing is DP 1.3 or HDCP 2.2 compliant, but that is irrelevant.

It is irrelevant because nearly all 2014+ 4K HDTVs are HDCP 2.2 compliant. Also, HDCP 2.2 isn't required for games or to get a 4K60 4:4:4 signal. DP 1.3, sadly, doesn't exist in the real world yet.

AMDMatt is probably going to be in trouble...
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
For these people saying there are 4K TVs with DisplayPort on them:

Find me a 4K TV with DisplayPort that also supports 1080p 120Hz for at least a 60-inch display.

I'll wait for the multitude of cheap options to flood in!

Oh come on...how much did your TV cost? I'd wager the Panasonics are similarly priced. In your case it doesn't matter, but for those who have been holding off on a TV purchase, this at least tells them what brand they need to get.

http://forums.overclockers.co.uk/showthread.php?t=18677121&page=24

Just to confirm.

The AMD Radeon™ Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.

In addition, Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.

WHERE can we find active converters "this summer", though? AMDMatt is a marketing guy, it's surprising he can even spell "HDMI", and he took an awfully long time before "clarifying" things. He got that entire post from Lisa Su or somebody very high up in the GPU food chain at AMD at least, and THAT person has no concrete info on the availability of active adapters, either.

Why are we supposed to get indignant at NV when they nefariously steal 0.5 GB of our RAM, but give AMD a pass when they aggressively fudge the numbers on such a hot-button issue?
 
Last edited:

garagisti

Senior member
Aug 7, 2007
592
7
81
I searched, and I didn't see any links to the AVS Forum that pertain to this... Am I missing something? Many times you said you read it on the AVS forums, but you never linked to it.

The only thread I found on the AVS forums was from back in 2014, when they were speculating if it would work or not.

I found it!

The guy mentions not having an SIL9777 chipset, but I don't think graphics cards need external chipsets to control HDMI. I believe that is done within the display controller on the GPU...

Edit - Bummer, only GM200 supports HDCP 2.2...



Source

Better than none of them, I guess.

I did link a thread here (or maybe it was another thread), and with a post from an owner no less. Now that you have mentioned it, I looked it up, and even TechRadar says that the 960 is also HDCP 2.2 compliant. I tried looking this up at Nvidia after seeing that post on AVS, but couldn't find much. This is really helpful, cheers. Checking more on the same. If you have some helpful links detailing what support it has for streaming audio etc., post those as well.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
That's pretty nice for the people who need HDMI 2.0, plus it'll be an official working adapter rather than some cheap third-party one :cool: .

When marketers start acting like that, you have to read everything they wrote literally, and then take a couple of truckloads of salt with it. He didn't say that AMD was producing the part, just that the parts "are set to debut this summer". He's correct, there are some that "are set to" come out before Sept 21, but rumors on the street peg them significantly later than that. Nov-Dec seems more likely, perhaps dragging into next year. I wouldn't count on them if you're planning to buy a 4K TV that requires HDMI 2.0.
 
Last edited:
May 11, 2008
22,220
1,410
126
Hey guys and girls, does this text not contradict some of the posted replies about 4K and DisplayPort?


http://www.techspot.com/news/51519-vesa-updates-displayport-dual-mode-pushes-4k-uhd-over-hdmi.html

Industry standards body VESA has officially updated its DisplayPort Dual-Mode 1.1 standard to provide better performance, higher resolutions and increased interoperability when hooking up to HDMI and DVI ports through a cable or adapter. Under Dual-Mode, a DisplayPort can output an additional HDMI/DVI-compatible signal alongside its expected DisplayPort link. This allows for connectivity to HDMI and DVI-equipped devices without the need for cables or converters equipped with active electronics.

Current DisplayPort Dual-Mode converters/adapters suffer from a limited output of 1080p @ 60Hz with 24-bit color. VESA's update includes support for just about anything HDMI 1.4 can handle -- deep color, 3D 1080p @ 60Hz and 4K UHD (2160p) @ 30Hz -- but all through a single cable. These improvements are achieved by nearly doubling the TMDS (Transition-Minimized Differential Signaling) rate from 165MHz to 300MHz.

It's worth noting the current DisplayPort standard (pdf) itself has no trouble delivering 4K UHD signals @ 60Hz and 24-bit color.


DisplayPort adapters and cables which meet the new specifications will be referred to as "Type 2". Previous-generation devices, cables and adapters will henceforth be labeled as "Type 1". Type 2 (300MHz) adapters will be compatible with existing 165MHz Type 1 source devices, but the performance of such adapters is downgraded to 165MHz for compatibility. However, in some cases, a software update may allow existing Type 1 devices to take full advantage of Type 2 features and performance.

The updated standard comes at an auspicious time when ultra high-def displays are clearly on the way. If CES is any indication (and it likely is), 4K UHD and even 8K UHD televisions and displays are imminent. Japan's government seems to agree with that assessment.

What limitation is not mentioned here?
Which standard do they mean?
I am just a noob.
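
My own rough attempt at the arithmetic, assuming the standard CEA-861 UHD timings of 4400x2250 total (active plus blanking) for 3840x2160, so treat this as a back-of-the-envelope sketch rather than anything authoritative:

Code:
# Rough pixel-clock arithmetic for 3840x2160 over a TMDS link.
# Assumes the standard CEA-861 UHD timing of 4400x2250 total (active + blanking);
# actual displays may use slightly different blanking, so these are estimates.

H_TOTAL = 4400  # 3840 active pixels + horizontal blanking
V_TOTAL = 2250  # 2160 active lines + vertical blanking

def pixel_clock_mhz(refresh_hz: float) -> float:
    """Required TMDS pixel clock in MHz for 4K at the given refresh rate."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (30, 60):
    print(f"4K @ {hz} Hz needs a ~{pixel_clock_mhz(hz):.0f} MHz pixel clock "
          f"(Type 1 adapter: 165 MHz, Type 2: 300 MHz, native HDMI 2.0: 600 MHz)")

# 4K @ 30 Hz -> ~297 MHz, which just fits a 300 MHz Type 2 dual-mode adapter
# 4K @ 60 Hz -> ~594 MHz, which needs HDMI 2.0 clock rates

If I have that right, the limitation the article doesn't spell out is that even the newer Type 2 dual-mode adapters top out around 4K @ 30Hz; 4K @ 60Hz needs roughly double that pixel clock, which is why an active DP-to-HDMI 2.0 converter (or a native HDMI 2.0 output) is required.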
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
AMDMatt is probably going to be in trouble...

No reason for him to be in trouble, it's not like this wasn't going to come out in a few days, anyway. He just needs to learn how to instinctively "talk without saying anything" like most marketing professionals.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
So AMD's official response to buyers of their new $499-$649 flagship card is "go buy a third-party adapter (which doesn't exist yet, but hopefully will soon) if you want HDMI 2.0."

Disappointing.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Not that I need HDMI 2.0 were I to buy a Fury X, but this is very disappointing confirmation of what I thought could not be true.

No HDMI 2.0 is a mistake, a big one. I game on a 1080p 54-inch plasma, and I'm definitely interested in running HDMI 2.0 from the GPU to a possible new screen.

I have had problems with adapters in the past and avoid them whenever possible.


I should not need an adapter to get HDMI 2.0 from the Fury X to an HDMI 2.0-compatible display device. That's the bottom line. Frankly, I'm a lot less excited about the Fury X now.
 
Last edited:

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
No reason for him to be in trouble, it's not like this wasn't going to come out in a few days, anyway. He just needs to learn how to instinctively "talk without saying anything" like most marketing professionals.


Well, or like politicians: tell mere portions of the truth with the intent to deceive about the bottom line.

In other words, tell a carefully constructed lie to keep folks from knowing the truth.

No HDMI 2.0.
 

maddie

Diamond Member
Jul 18, 2010
5,147
5,523
136
Concentrate your attacks on the weak points.

When the 295X2 came out, the lack of HDMI 2.0 was not raised as a reason not to purchase it, because the bigger weak points were power use and dual-GPU CrossFire issues. Now that Fiji can deliver a powerful single-GPU attack, you must find a new weak point.

A classic strategy.

Solution:
If you need HDMI 2.0, then you won't get Fiji. If you don't need it then you can get Fiji. Very simple. All things in life have compromises.

Note:
It would have been better if HDMI 2.0 had been available as a native output.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
If it were a passive adapter, yeah. However, you'll most likely need an active adapter (since the HDMI 2.0 device will be treated as a DP device by Fury), and that will need a new version of a chip like this one (http://www.st.com/web/catalog/mmc/FM128/SC1386/SS1485/LN1694/PF253910) that can handle the new protocol requirements and the increased clock rate of HDMI 2.0.

Edit: fixed for clarity (the ST chip linked can't handle HDMI 2.0).
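
For a rough sense of why new silicon is needed rather than a passive dongle, here's a back-of-the-envelope comparison of the raw link rates involved; the per-lane figures are the published nominal rates and the ~80% coding efficiency is an approximation, so treat it as a sketch:

Code:
# Back-of-the-envelope link-rate comparison: DP 1.2a (HBR2) vs HDMI 1.4b vs HDMI 2.0.
# Both DP and HDMI TMDS use 8b/10b-style character coding, so usable payload is
# roughly 80% of the raw line rate. Illustrative numbers only.

def usable_gbps(lanes: int, gbps_per_lane: float, coding_efficiency: float = 0.8) -> float:
    """Approximate usable video bandwidth in Gbit/s."""
    return lanes * gbps_per_lane * coding_efficiency

dp12_hbr2 = usable_gbps(4, 5.4)  # DisplayPort 1.2a, 4 lanes at HBR2 (5.4 Gbps/lane)
hdmi14    = usable_gbps(3, 3.4)  # HDMI 1.4b, 3 TMDS channels at 340 MHz (3.4 Gbps/ch)
hdmi20    = usable_gbps(3, 6.0)  # HDMI 2.0, 3 TMDS channels at 600 MHz (6.0 Gbps/ch)

# 4K60, 8-bit 4:4:4, with standard CEA blanking (4400x2250 total)
need_4k60 = 4400 * 2250 * 60 * 24 / 1e9

print(f"DP 1.2a HBR2 usable : {dp12_hbr2:5.2f} Gbps")
print(f"HDMI 1.4b usable    : {hdmi14:5.2f} Gbps")
print(f"HDMI 2.0 usable     : {hdmi20:5.2f} Gbps")
print(f"4K60 4:4:4 payload  : {need_4k60:5.2f} Gbps")

If those numbers are roughly right, the DP 1.2a link itself has the bandwidth for 4K60; the catch is that a passive dual-mode dongle only passes through an HDMI 1.4-class TMDS signal, so reaching HDMI 2.0 rates means re-clocking and re-encoding the stream, which is exactly the job of the new active converter chips.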
 
Last edited: