Fury cards will NOT have HDMI 2.0 outputs


crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
Concentrate your attacks on the weak points.

When 295X2 came out the lack of HDMI 2.0 was not raised as a reason not to purchase because the bigger weak points were power use and Dual GPU Xfire issues. Now that Fiji can deliver a powerful single GPU attack, you must find a new weak point.

A classic strategy.

Solution:
If you need HDMI 2.0, then you won't get Fiji. If you don't need it then you can get Fiji. Very simple. All things in life have compromises.

Note:
It would have been better if HDMI 2.0 were available as a native output.

You might try something constructive, like helping list 4K 60+Hz display devices that WILL work with Fury, instead of insinuating that people who have a problem with the lack of HDMI 2.0 have some sort of hidden agenda. I want a Fury and I also wanted a 4K TV; since I'm not going to get what I want, be helpful and suggest alternatives.
 
Feb 6, 2007
16,432
1
81
Concentrate your attacks on the weak points.

When 295X2 came out the lack of HDMI 2.0 was not raised as a reason not to purchase because the bigger weak points were power use and Dual GPU Xfire issues. Now that Fiji can deliver a powerful single GPU attack, you must find a new weak point.

A classic strategy.

Solution:
If you need HDMI 2.0, then you won't get Fiji. If you don't need it then you can get Fiji. Very simple. All things in life have compromises.

Note:
It would have been better if HDMI 2.0 were available as a native output.

I don't think it's necessarily partisanship driving the discussion. I'm brand neutral; I'm in the market for a new card, and developments like this give me pause, because although I tend to buy at the bleeding edge, I also like to keep that card around for a few years. I may not be upgrading to a 4K TV in the next few years, but if I do, I would hope that it wouldn't require me to upgrade my computer (or purchase an additional adapter) simply because of a lack of support for a standard that a competitor supported at the time. Better to have it and not need it than need it and not have it, you know? But it's doubly confounding that, while AMD is touting the 4K capabilities of their product, it can't actually connect to a wide segment of 4K devices. What were they thinking?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Not that I need HDMI 2.0 were I to buy a Fury X, but this is very disappointing confirmation of what I thought could not be true.

No HDMI 2.0 is a mistake, a big one. I game on a 1080p 54-inch plasma, and I'm definitely interested in running HDMI 2.0 from the GPU to a possible new screen.

I have had problems with adapters in the past and avoid them whenever possible.


I should not need an adapter to get HDMI 2.0 from a Fury X to an HDMI 2.0-compatible display device. That's the bottom line. Frankly, I'm a lot less excited about Fury X now.

You're wasting your money with a Fury X for 1080p gaming; I'd snag a 970 or perhaps one of those cheap-o 290s at Newegg (if you can figure out how to connect it to the TV, ofc). You don't need HDMI 2.0 to connect to a 1080p TV, FYI.

Well, or like politicians: telling mere portions of the truth with the intent to deceive about the bottom line.

In other words, tell a carefully constructed lie to keep folks from knowing the truth.

No HDMI 2.0.

No, that's the opposite of what he was doing. He was 100% truthful in his more recent post; he just said it in such a way that a casual observer would read into it what he wanted to read, i.e., NV fans read "no HDMI 2.0!", whereas AMD fans and many casuals read "no HDMI 2.0, no problem!" A good politician or marketer never lies; he just does a really good job of framing the discussion in a way that paints his point of view in the most favorable light possible. NV marketing is good at this, AMD... well, let's be generous and say that they still need practice.
 
Last edited:

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
May 11, 2008
22,220
1,410
126
If it were a passive adapter, yeah. However, you'd most likely need an active adapter (since an HDMI 2.0 device will be treated as a DP device by Fury), and that will need a new chip like this one (http://www.st.com/web/catalog/mmc/FM128/SC1386/SS1485/LN1694/PF253910) that can handle the new protocol requirements and increased clock rate of HDMI 2.0.

But the specs say it has an HDMI 1.4 transmitter. That would not do for an HDMI 2.0 converter (a rough bandwidth check follows the spec excerpt below).

Key Features

- DisplayPort® (DP) receiver
  - DP 1.2a compliant
  - Link rate HBR2/HBR/RBR
  - 1, 2, or 4 lanes
  - AUX CH 1 Mbps
  - Supports eDP operation
- HDMI 1.4 transmitter
  - Max data rate up to 2.97 Gbps/data pair
  - Color depth up to 48 bits
  - 3D video timings
  - CEC
- SPDIF audio output
  - 192 kHz/24 bits
  - Compressed/LPCM
- HDCP repeater with embedded keys
- ASSR -- eDP display authentication option
- AUX to I2C bridge for EDID/MCCS pass-through
- Device configuration options
  - SPI Flash
  - I2C host interface
- Spread spectrum on DisplayPort interface for EMI reduction
- Deep color support
  - RGB/YCC (4:4:4) – 16-bit color
  - YCC (4:2:2) – 16-bit color
  - Color space conversion – YUV to RGB and RGB to YUV
- Bandwidth
  - Video resolution up to 4K x 2K @ 30 Hz; 1920 x 1080 @ 120 Hz
  - Audio 7.1 ch up to 192 kHz sample rate
- Low power operation: active 462 mW, standby 21 mW
- Package: 81 BGA (8 x 8 mm)
- Power supply voltages: 3.3 V I/O; 1.2 V core
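
For anyone wondering why that HDMI 1.4 transmitter rules out 4K60, here's a rough back-of-envelope check. This is my own arithmetic, assuming standard CTA-861 4K timing (with blanking) and 8 bpc RGB with 8b/10b TMDS encoding; treat it as illustrative, not a datasheet figure.

```python
# Back-of-envelope bandwidth check: why a DP-to-HDMI-1.4 bridge capped at
# 2.97 Gbps per data pair (see the spec excerpt above) cannot carry 4K @ 60 Hz.
# Assumes CTA-861 total timings (active + blanking) and 8b/10b TMDS encoding.

def tmds_gbps_per_pair(h_total, v_total, refresh_hz, bits_per_channel=8):
    """TMDS bit rate per data pair: pixel clock x 10 bits (8b/10b) for 8 bpc."""
    pixel_clock_hz = h_total * v_total * refresh_hz
    return pixel_clock_hz * (bits_per_channel + 2) / 1e9

CHIP_LIMIT = 2.97  # Gbps per pair, from the HDMI 1.4 transmitter spec above

# 3840x2160 with CTA-861 blanking is a 4400x2250 total raster
for name, timing in [("4K @ 60 Hz", (4400, 2250, 60)),
                     ("4K @ 30 Hz", (4400, 2250, 30))]:
    rate = tmds_gbps_per_pair(*timing)
    verdict = "fits" if rate <= CHIP_LIMIT else "does not fit"
    print(f"{name}: needs {rate:.2f} Gbps/pair, chip max {CHIP_LIMIT} -> {verdict}")

# Output:
#   4K @ 60 Hz: needs 5.94 Gbps/pair, chip max 2.97 -> does not fit
#   4K @ 30 Hz: needs 2.97 Gbps/pair, chip max 2.97 -> fits
```

Which matches the "4K x 2K @ 30 Hz" limit in the chip's own bandwidth section: an HDMI 2.0-class converter needs roughly double that per-pair rate, hence new silicon.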
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
The bad news is 90% of those who skip Fury cards because of this will likely never even get a 4k display, let alone use it for PC gaming, within the 12 months to two years they actually own and use the card. The good news is most of them would have found another excuse to cross Fury off their list anyway, so it's not like AMD will lose a lot of revenue because of this missing feature.

HDMI 2.0 compatibility is completely irrelevant for me and most gamers who run 1920x1080, 1920x1200, or 1440p displays.
 

crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
The bad news is 90% of those who skip Fury cards because of this will likely never even get a 4k display, let alone use it for PC gaming, within the 12 months to two years they actually own and use the card. The good news is most of them would have found another excuse to cross Fury off their list anyway, so it's not like AMD will lose a lot of revenue because of this missing feature.

HDMI 2.0 compatibility is completely irrelevant for me and most gamers who run 1920x1080, 1920x1200, or 1440p displays.

Virtually the only reason to get a Fury is to game at high res, so I don't see your reasoning.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
You might try something constructive, like helping list 4K 60+Hz display devices that WILL work with Fury, instead of insinuating that people who have a problem with the lack of HDMI 2.0 have some sort of hidden agenda. I want a Fury and I also wanted a 4K TV; since I'm not going to get what I want, be helpful and suggest alternatives.

In his defense, there are quite a few people with a history of AMD-bashing who have gravitated towards this subject, and I don't recall any of them claiming to even own a 4K TV. It's an interesting topic nonetheless, and it gives all of us something to argue about for the next few days until the Fury X reviews come out.

Panasonic has a good line of 4K TVs with DisplayPort as standard. I don't remember the model number, but you should be able to find it on their web site.

Virtually the only reason to get a Fury is to game at high res, so I don't see your reasoning.

4K TVs are not ideal for most gaming scenarios; 4K computer monitors are much better for us (and by "us" I mean you guys, as a 4K TV would actually be pretty good for my older games). For some people 4K might be decent in a few years, but by then the Fury X and 980 Ti will be really chugging in the newest games at that resolution anyway. One of my reasons for holding off on 4K is that I don't want to have to buy one (or two) video cards every single year, and I'm sure I'm not alone there. Heck, even Grooveriding is still on 2560x1600.
 
Last edited:

crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
In his defense, there are quite a few people with a history of AMD-bashing who have gravitated towards this subject, and I don't recall any of them claiming to even own a 4K TV. It's an interesting topic nonetheless, and it gives all of us something to argue about for the next few days until the Fury X reviews come out.

Panasonic has a good line of 4K TVs with DisplayPort as standard. I don't remember the model number, but you should be able to find it on their web site.

I like this one, but it doesn't have DP. I thought I read that it was only last year's models that had it, and some or all of those were 30 Hz? (Not sure about that.) Part of the aggravation is that it's one more spec to check, and not all spec sheets even bother to list it. Just another pain in the *ss to deal with when figuring out if something is going to work.
 

crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
4K TVs are not ideal for most gaming scenarios; 4K computer monitors are much better for us (and by "us" I mean you guys, as a 4K TV would actually be pretty good for my older games). For some people 4K might be decent in a few years, but by then the Fury X and 980 Ti will be really chugging in the newest games at that resolution anyway. One of my reasons for holding off on 4K is that I don't want to have to buy one (or two) video cards every single year, and I'm sure I'm not alone there. Heck, even Grooveriding is still on 2560x1600.
I like slow-paced games in general, so I thought that a TV might be OK for me, but that could be wrong. I also wanted to go straight from 1080p to 4K, but perhaps an intermediate step will be needed. If a 1440p monitor is purchased now, won't it feel outdated pretty quickly at this point?
 

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
Last year's Panasonic isn't a great TV. Lots of complaints about banding on AVS. It also uses DisplayPort MST to achieve 60 Hz, which introduces some scaling issues if you try to use lower resolutions on it. And it's pretty expensive compared to some of Samsung's offerings. The 2015 Samsung line is quite good for gaming, with around 20 ms of input lag in Game mode.

Not saying the Panasonic is bad by any means, but I wouldn't get it if I were in the market for a 4K TV. The DP port doesn't make up for some of the negatives.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
This IC is used for so-called passive converters from DisplayPort++ to HDMI or DVI.
The graphics card has the dual-mode option, and the IC performs the signal-level translations.

The PTN3363.

http://www.nxp.com/products/interfa...layport/high_speed_multiplexer/PTN3363BS.html

http://www.nxp.com/documents/leaflet/75017545.pdf

From what I understand of adapters (from this post: http://www.overclock.net/t/721931/active-vs-passive-displayport-adapters-the-truth), a passive one won't work at all, since Fury can't handle HDMI 2.0 at all (so Fury won't know how to handshake and send video if a passive adapter is used). An active adapter will make the HDMI 2.0 device look like a DP 1.2a device to Fury, which requires the active adapter to handle the protocol and data-signal conversions.
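
If it helps, here's a toy sketch of that passive-vs-active distinction in code. It's purely my own simplification of the linked write-up, not anyone's actual driver logic, and the HDMI 2.0-capable dongle at the end is hypothetical (none existed at the time).

```python
# Illustrative only: how a passive (DP++) adapter is capped by the GPU's own HDMI
# level, while an active adapter hides the HDMI sink behind an ordinary DP sink.

from dataclasses import dataclass

@dataclass
class Source:
    dp_version: str      # native DisplayPort level of the GPU
    native_hdmi: str     # highest HDMI level the GPU can signal itself over DP++

def effective_hdmi_level(source: Source, sink_hdmi: str, adapter: str) -> str:
    if adapter == "passive":
        # Passive adapter = level shifting only; the GPU must generate the TMDS/HDMI
        # signal itself, so the link is capped at the GPU's native HDMI level.
        return min(source.native_hdmi, sink_hdmi)   # string compare is fine for "1.4"/"2.0"
    if adapter == "active":
        # Active adapter terminates DP and re-generates HDMI in the dongle; the GPU
        # just drives a DP sink, so the dongle's transmitter sets the HDMI level.
        dongle_hdmi = "2.0"                          # hypothetical future dongle
        return min(dongle_hdmi, sink_hdmi)
    raise ValueError(f"unknown adapter type: {adapter}")

fiji = Source(dp_version="1.2a", native_hdmi="1.4")
print(effective_hdmi_level(fiji, sink_hdmi="2.0", adapter="passive"))  # 1.4 -> no 4K60
print(effective_hdmi_level(fiji, sink_hdmi="2.0", adapter="active"))   # 2.0, if such a dongle ships
```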
 

crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
What percentage of high end cards are running on a 4K display?
Maybe we should confine the discussion to the Titan X, 980 Ti, and Fury X. I believe a majority of those are (or will be) running well over 1080p; if they aren't, what a ridiculous waste of resources that is!
 

crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
I have to admit I had no clue there was going to be a connectivity issue at this point in the game.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
I have to admit I had no clue there was going to be a connectivity issue at this point in the game.

Yeah, and that's why it's understandable that people are pissed, as HDMI 2.0 should be native at this point. However, now knowing it would require silicon changes, I understand why HDMI 2.0 is not there. Doesn't mean I have to be happy about it.
 
May 11, 2008
22,220
1,410
126
From what I understand of adapters (from this post: http://www.overclock.net/t/721931/active-vs-passive-displayport-adapters-the-truth), a passive one won't work at all, since Fury can't handle HDMI 2.0 at all (so Fury won't know how to handshake and send video if a passive adapter is used). An active adapter will make the HDMI 2.0 device look like a DP 1.2a device to Fury, which requires the active adapter to handle the protocol and data-signal conversions.

Indeed. I agree with you.
I started reading about it tonight because I was worried. I mistakenly understood that the Fury cards only had DisplayPorts, but it seems these cards will have HDMI 1.4a outputs as well, so I am set. But I do understand the disappointment of people who want to buy a 4K television set that has no DisplayPort (1.2) connection, only an HDMI 2.0 connection.

I found this PDF that explains a lot about DisplayPort:

www.vesa.org/wp-content/uploads/2011/01/ICCE-Presentation-on-VESA-DisplayPort.pdf

I do wonder: if I have a monitor with an older HDMI version, will it work with newer HDMI connections? Is HDMI backwards compatible?
It seems to be, from what I have read.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
5,147
5,523
136
You might try something constructive, like helping list 4K 60+Hz display devices that WILL work with Fury, instead of insinuating that people who have a problem with the lack of HDMI 2.0 have some sort of hidden agenda. I want a Fury and I also wanted a 4K TV; since I'm not going to get what I want, be helpful and suggest alternatives.
I just sent a PM to a member here who has experience with a 50" 4K TV, as that is the size I would most like to use as a hi-res general-use screen plus gaming on the side [even at 1080, so that I can use a slower card if necessary]. The small 4K monitors are too difficult for my eyes, and font scaling reduces the amount of data on the screen.
My post showed that I don't see a solution at present. Waiting for an adapter that might not work properly or that introduces more lag leaves too many unknowns at present; thus my solution.
I don't think it's necessarily partisanship driving the discussion. I'm brand neutral; I'm in the market for a new card, and developments like this give me pause, because although I tend to buy at the bleeding edge, I also like to keep that card around for a few years. I may not be upgrading to a 4K TV in the next few years, but if I do, I would hope that it wouldn't require me to upgrade my computer (or purchase an additional adapter) simply because of a lack of support for a standard that a competitor supported at the time. Better to have it and not need it than need it and not have it, you know? But it's doubly confounding that, while AMD is touting the 4K capabilities of their product, it can't actually connect to a wide segment of 4K devices. What were they thinking?

The problem is that two sets of posters will be active: those who are really trying to find solutions, and those jumping on the issue to press an attack. Past postings will give us a clue as to whom to disregard.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
I do wonder: if I have a monitor with an older HDMI version, will it work with newer HDMI connections? Is HDMI backwards compatible?

Yup, HDMI definitely is backwards compatible. The HDMI source and sink will handshake at the highest level shared between the two (in other words, the highest level the older product can support).
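
In code terms, the principle is roughly this: a minimal sketch with made-up mode names (the real negotiation goes through the sink's EDID, but the idea is the same).

```python
# Minimal sketch of HDMI backwards compatibility: the source reads the sink's
# supported modes (via EDID in real hardware) and only drives modes both ends share.

def usable_modes(source_modes, sink_edid_modes):
    """Modes that can actually be driven on this link: the intersection of both ends."""
    return sorted(set(source_modes) & set(sink_edid_modes))

hdmi_14_source = ["720p60", "1080p60", "1080p120", "4K30"]  # newer HDMI 1.4 GPU output
hdmi_13_sink   = ["720p60", "1080p60"]                       # older monitor, per its EDID

print(usable_modes(hdmi_14_source, hdmi_13_sink))
# ['1080p60', '720p60'] -> the older display still works, just at its own maximum
```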
 
May 11, 2008
22,220
1,410
126
Yup, HDMI definitely is backwards compatible. The HDMI source and sink will handshake at the highest level shared between the two (in other words, the highest level the older product can support).

Ah, thank you.
All I have to do now is wait until the reviews are available and see if the Nano is to my satisfaction. It will be interesting. If I decide to buy one (and they do seem promising), I will have to disable my current (A10-6700) iGPU in the BIOS, I guess, since CrossFire or any calculation offloading to the iGPU will not happen. Which is a shame, but there's nothing I can do about it. It does mean that my CPU will have more headroom for auto overclocking with the iGPU disabled.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,348
642
121
I heard 22-24 ms last night as a best-case scenario. That probably wouldn't bug me one bit in D3 or Civ5, so now I have some interdasting thoughts for my Man Cave 2016™... though it sounds like it would be pretty painful for somebody playing an FPS.



The few times that I've gamed on a TV have been a serious PITA; you don't want to introduce more potential issues than necessary. Even if you can get an active adapter for a reasonable price on day 1 (highly unlikely), I think you'd be better off going with the 980 Ti this round.
Huh? I never stated the Fury X was better for me. No HDMI 2.0 support makes it far worse. I'd feel much safer with the 980 Ti.

What really annoys me is that now I'm lumped in with shills because I want HDMI 2.0? I've been talking about big-screen TV gaming on here for 3+ years now. I'm not making a big deal out of this because I like Nvidia. I'm making a big deal out of this because I like HDMI 2.0 and AMD; I've waited for Fury for almost a year now for nothing.

No solution I can think of makes me happy, because I don't want a 980 Ti AIB version for far more than $650. I wanted the Fury card to undercut it so I could spend $650 for water cooling too. Ugh, my decision is screwed, and it's even worse if Fiji isn't sold by Amazon, because at least then I could buy it, try VSR 4K to see if it's enough, and then keep or return it.