Fury cards will NOT have HDMI 2.0 outputs


tential

Diamond Member
May 13, 2008
7,348
642
121
I have it the same way. I got the 7970 vs. the 680 and was like 5 minutes from a 970. Damn. But we don't know if the new gen in ~1.3 years will change the pattern of GCN improving, and neither do we know if HBM2 is very different from a programming and driver perspective than gen 1.
But I bet we will continue to see solid HBM and GCN improvement over the next 2 years.
We have 3 Fiji cards and soon 4. I think there will be plenty of optimization here. I don't even care if Fury X is 5% slower than the 980 Ti. What matters is the potential. I think I'll go for the Pro with air cooling, as I will only game Star Wars Battlefront this year and early next year at 1440p. 80 hours max, I promise. Lol. Then it needs to stay idle and 100% silent the rest of the time.
Late next year is 4K and 14nm. Perhaps OLED.
Before, the high end was a given for me. Fury X was a day-one purchase, and then I was getting everything necessary to get 4K working.

Now though, it's look at the reviews, then decide where I think the cards will end up. Fury Pro and Fury Nano look attractive, but Fury X has that cooling beast, which makes me wonder what the OC potential is when it's not voltage locked.

Besides, my game lineup is BioShock Infinite and the Batman from around 2010 or so in the series. I've got a lot of moving parts now that Fury X isn't a guaranteed purchase (which would have required an all-new build, so I didn't care). Now that I'm reusing old stuff I have, it's harder to pick, lol. Just gotta make it to the next year or two and I'm good.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
The odd thing is that while I expect to see the Fury X be faster in this review, I'm worried that when I sell the Fury X it'll be far harder to sell and will hold less value than a 980 Ti G1 card.
That depends on how long you intend to keep the card... as krumme already replied, performance over a period of time is something to consider. Then there's the fact that it has water cooling and is supposedly whisper quiet while gaming. Another couple of days to go. Sodding milk... watching it doesn't make it boil.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
That is not typical of the past aging of AMD/NV cards. This particular cycle was weird because AMD had nothing new come out for a long time after NV's refresh last year. AMD will certainly focus more energy on the Fury cards now instead of Hawaii/Grenada. If anything, the extra 2 GB on the 980 Ti might help it age a bit more gracefully than Fury X.

The only reason to expect that 980ti will fall off over time is if you expect that AMD will be (again) far behind NV on the next node release. That's a murky picture indeed, and it's tough to see how it will play out over the next year (or 2).

Again? I thought AMD was the one that has been moving to newer nodes before Nvidia?
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
AMD definitely hit 40nm and 28nm first. This is a matter of public record. How silly.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Again, I just don't understand the drama. Why do so many people get worked up over this stuff? I understand the thinking behind supporting the underdog, but many here seem to take things to extremes.

The 980 Ti is clearly the right card for you, so you bought it. Smart move.



Isn't the whole thought process behind curved monitors that it gives you that "surround monitor" feel, i.e., a more immersive experience? It seems like a curved monitor would, if anything, be more immersive than a curved TV.

Agree on the monitor part. I was referring to the majority of the OLED TV options being curved. Strangely, they work better as monitors because of the curve than as TVs, where it hampers viewing angles. For just TV/movie watching on an HTPC, where viewing angle is important, a curved OLED is not a great option. Sitting at your desk, though, it is awesome. :)
 

Rvenger

Elite Member / Super Moderator / Video Cards
Apr 6, 2004
6,283
5
81
The forum title states "Fury cards will NOT have HDMI 2.0 outputs".

The next person who wants to talk about Nvidia and process node changes is getting booted from this subforum. Anything off topic, and you are getting booted from this subforum.



-Rvenger
 

thehotsung8701A

Senior member
May 18, 2015
584
1
0
If you want to display 4K at more than 30 fps, you have two competing standards at the moment: HDMI and DisplayPort. HDMI 2.0 and DisplayPort 1.2a/1.3 both support 4K @ 60 fps. Most 4K TVs, however, only have HDMI 2.0 ports.

Wait, so that means you can't even game in 4K with this card, even though it's marketed for 4K gaming?
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
Wait, so that means you can't even game in 4K with this card, even though it's marketed for 4K gaming?

You can game with this card, up to and including 4K@60Hz; however, you must either have a TV with a DisplayPort input (which sounds like it's only on the more expensive panels) or an active DisplayPort to HDMI 2.0 adapter, which doesn't really exist yet.

So, sure - you CAN use this card to do 4K, just not 4K@60Hz on a cheap TV.

Rather short-sighted...
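
For anyone wondering why the interface matters at all, here is a rough back-of-the-envelope sketch in Python (my own numbers, assuming the standard CTA-861 4K60 timing with blanking and approximate payload limits after 8b/10b encoding; treat it as ballpark, not spec quotes):

```python
# Rough 4K@60 bandwidth check.
# Assumed CTA-861 timing: 4400 x 2250 total pixels (incl. blanking) at 60 Hz,
# i.e. a ~594 MHz pixel clock.
PIXEL_RATE = 4400 * 2250 * 60            # pixels per second, incl. blanking

def needed_gbps(bits_per_pixel):
    """Video payload rate for 4K@60 at the given pixel depth."""
    return PIXEL_RATE * bits_per_pixel / 1e9

# Approximate usable payload after 8b/10b encoding overhead:
links = {
    "HDMI 1.4 (340 MHz TMDS)":   340e6 * 3 * 8 / 1e9,    # ~8.2 Gbps
    "HDMI 2.0 (600 MHz TMDS)":   600e6 * 3 * 8 / 1e9,    # ~14.4 Gbps
    "DisplayPort 1.2 (4x HBR2)": 4 * 5.4e9 * 0.8 / 1e9,  # ~17.3 Gbps
}

need = needed_gbps(24)                   # 8-bit RGB / 4:4:4 = 24 bpp
print(f"4K@60 RGB 8-bit needs ~{need:.1f} Gbps")
for name, cap in links.items():
    verdict = "fits" if need <= cap else "does NOT fit"
    print(f"  {name}: ~{cap:.1f} Gbps -> {verdict}")
```

Full 4K@60 doesn't fit in HDMI 1.4 but does fit in HDMI 2.0 and DisplayPort 1.2, which is why, without HDMI 2.0, Fury is limited to DisplayPort (or a future active adapter) on a TV.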
 
Feb 19, 2009
10,457
10
76
https://www.youtube.com/watch?v=BWbRSHtHI6c

AMD goes into detail on why they went with 3 DP outputs: to offer better flexibility with multi-monitor gaming as well as FreeSync support on all monitors.

It specifically mentions working with partners to bring DP > HDMI 2.0 adapters that will solve the issue.

It also mentions that the Fury X is locked to "no custom variants," but for the regular Fury, AIBs can put whatever they want on the cards.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
https://www.youtube.com/watch?v=BWbRSHtHI6c

AMD goes into detail on why they went with 3 DP outputs: to offer better flexibility with multi-monitor gaming as well as FreeSync support on all monitors.

It specifically mentions working with partners to bring DP > HDMI 2.0 adapters that will solve the issue.

It also mentions that the Fury X is locked to "no custom variants," but for the regular Fury, AIBs can put whatever they want on the cards.

For me, I couldn't care less about HDMI 2.0 support. It's just something for others to complain about, even if they have no intention of using it or even buying the card.

Eagerly awaiting some reviews.

Might get a Fury X :)

I'm using a 1080p 144Hz monitor and craving higher fps than the 970 I'm currently running can produce.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
You can game with this card, up to and including 4K@60Hz; however, you must either have a TV with a DisplayPort input (which sounds like it's only on the more expensive panels) or an active DisplayPort to HDMI 2.0 adapter, which doesn't really exist yet.

So, sure - you CAN use this card to do 4K, just not 4K@60Hz on a cheap TV.

Rather short-sighted...

That, however, is specific to TVs. Most monitors have a DisplayPort input, so anyone with a 4K monitor should/would ideally prefer this card over anything else, for a good couple of reasons.
 

Wag

Diamond Member
Jul 21, 2000
8,288
8
81
The problem is that there are now affordable 4K HDTVs with full HDMI 2.0 4:4:4 compatibility that can sit on a desk. They can be had for $1000 or less.

Since all TV manufacturers have phased out DP in their 2015 4K TVs, your only option is to go with HDMI 2.0.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
The problem is that there are now affordable 4K HDTVs with full HDMI 2.0 4:4:4 compatibility that can sit on a desk. They can be had for $1000 or less.

Since all TV manufacturers have phased out DP in their 2015 4K TVs, your only option is to go with HDMI 2.0.
You'll have to wait for reviews, I guess, but a Hisense 50" 4K 60Hz 4:4:4 HDTV is out at Walmart, $600 new. Not a bad deal. I'm waiting for AVS Forum members to try it out first, though; there wasn't any build-up to this announcement, so we'll see how well it does. But I've seen a bunch of 4K 60Hz TVs for under $1k at Walmart these past couple of weeks. This card has to sell during this year's holiday season. Not having HDMI 2.0 when HDTVs support it is a problem: gaming-related companies are trying to push the PC into the living room to pick up console and more casual gamers, and they will want to build boxes with HDMI 2.0 to sell to people. Windows 10 may even have better support for the big screen at some point (if it doesn't already; I haven't kept up with it). It's just not something you want to ignore.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
2015 model (released Dec 2014) Panasonic AX900 65" 4K LED 4HDMI 1DP
http://shop.panasonic.com/tvs/4k-tvs/TC-65AX900U.html


Careful with that panel. I would search AVS Forum's owners thread and keep an eye out for the banding defects; they have plagued two gens of that panel without a fix.


Btw, has it been mentioned that only the 980 Ti has true HDMI 2.0, as it's the only card with the Sil chip, and that all the other NV cards are false advertising, since they use reduced settings to fit 4K@60 under the bandwidth limit of HDMI 1.4? There are hardly any products at all with the Sil9777 chipset; off the top of my head, the Denon and Pioneer receivers and the 980 Ti. When I first learned that AMD would not include HDMI 2.0 I was as peeved as the next guy, but I understand, as that chip is rare and they didn't lie to us like another brand. The irony for me is that I could never get 4K@60 with 4:4:4 using a Maxwell card I used to have. Now I know why...


http://www.siliconimage.com/Company...ual-Mode_HDMI®_2_0/MHL®_3_0_IC_with_HDCP_2_2/
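
To put rough numbers on the "reduced settings" part (my own arithmetic, not anything from AMD, Nvidia, or the Silicon Image page): 4:2:0 subsampling keeps full-resolution luma but only quarter-resolution chroma, which halves the bits per pixel versus 4:4:4 and is what lets a 4K@60 signal squeeze under the old HDMI 1.4 data-rate ceiling.

```python
# Chroma subsampling arithmetic for 8-bit 4K@60.
# Assumed ~594 MHz pixel clock (incl. blanking) and ~8.2 Gbps usable payload
# on an HDMI 1.4 link after 8b/10b encoding overhead.
PIXEL_RATE = 4400 * 2250 * 60            # pixels per second, incl. blanking
HDMI_14_CAP = 340e6 * 3 * 8 / 1e9        # ~8.2 Gbps

formats = {
    "4:4:4 (full chroma)":    8 + 8 + 8,  # 24 bits per pixel
    "4:2:2 (half chroma)":    8 + 8,      # 16 bits per pixel
    "4:2:0 (quarter chroma)": 8 + 4,      # 12 bits per pixel
}

for name, bpp in formats.items():
    gbps = PIXEL_RATE * bpp / 1e9
    verdict = "fits HDMI 1.4" if gbps <= HDMI_14_CAP else "needs HDMI 2.0 / DP"
    print(f"{name}: {bpp} bpp -> ~{gbps:.1f} Gbps ({verdict})")
```

So a card pushing 4K@60 through an HDMI 1.4-class transmitter is almost certainly doing it at 4:2:0, which is passable for video but noticeably soft for desktop text.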
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Careful with that panel. I would search AVS Forum's owners thread and keep an eye out for the banding defects; they have plagued two gens of that panel without a fix.


Btw, has it been mentioned that only the 980 Ti has true HDMI 2.0, as it's the only card with the Sil chip, and that all the other NV cards are false advertising, since they use reduced settings to fit 4K@60 under the bandwidth limit of HDMI 1.4? There are hardly any products at all with the Sil9777 chipset; off the top of my head, the Denon and Pioneer receivers and the 980 Ti. When I first learned that AMD would not include HDMI 2.0 I was as peeved as the next guy, but I understand, as that chip is rare and they didn't lie to us like another brand. The irony for me is that I could never get 4K@60 with 4:4:4 using a Maxwell card I used to have. Now I know why...


http://www.siliconimage.com/Company...ual-Mode_HDMI®_2_0/MHL®_3_0_IC_with_HDCP_2_2/
Per the mods, we can't talk about anything other than the fact that Fury has no HDMI 2.0. I wish we could keep talking about the great points you brought up; I've wanted to get into them for a while now in this thread and think they're great topics to discuss, but we can't.

How hard would it have been to do 3 DisplayPort and 1 HDMI 2.0? Would have been amazing...
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
I do not believe graphics cards have the SIL chips; instead, they have SIP blocks directly on the GPU die. This is why I don't think they can just slap on an HDMI 2.0 port even if they wanted to; the display controller in the GPU would have to allow it.
 

Wag

Diamond Member
Jul 21, 2000
8,288
8
81
Well, the Hisense sounds decent enough, but they're a cheapie brand, and someone already posted on Walmart's site that it's not actually doing quite what they claim. None of those cheapie 4K displays have been able to do 4K 4:4:4... yet.

I own a 2015 4K Samsung UN48JU6700, which is a 48" curved panel. Nice display. It has around 32 ms of input lag in game mode, which is adequate. I'm not a big FPS gamer anymore, so I don't care, but for RPGs like The Witcher 3 it's perfect.

I was really hoping to get a Fury this time around, but I have no choice but to stick with Nvidia. Dammit, AMD.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Well, the Hisense sounds decent enough, but they're a cheapie brand, and someone already posted on Walmart's site that it's not actually doing quite what they claim. None of those cheapie 4K displays have been able to do 4K 4:4:4... yet.

I own a 2015 4K Samsung UN48JU6700, which is a 48" curved panel. Nice display. It has around 32 ms of input lag in game mode, which is adequate. I'm not a big FPS gamer anymore, so I don't care, but for RPGs like The Witcher 3 it's perfect.

I was really hoping to get a Fury this time around, but I have no choice but to stick with Nvidia. Dammit, AMD.
Yeah, I feel 30 ms is more than acceptable. I'm a person who has gamed with lag all the time, so I really don't care at all anyway. Even 100 ms wouldn't bother me if I had to give that up to hit a screen size, IQ, and price point.
 

Steel Rain

Junior Member
Jun 27, 2015
2
0
0
Yeah, I feel 30 ms is more than acceptable. I'm a person who has gamed with lag all the time, so I really don't care at all anyway. Even 100 ms wouldn't bother me if I had to give that up to hit a screen size, IQ, and price point.

I'm a little confused about all this talk of 30+ ms input lag with TVs. As it stands, a good high-end/mid-range 4K TV like the Samsung HU9000, F9000, etc. can achieve around 30 ms response times; I've recorded a response time of about 29 ms using the HU9000. Of course that's nowhere near, say, a 1 ms 144Hz monitor, but monitors with input lag that low normally use a TN panel, which, while snappy and smooth-feeling, looks like complete trash when it comes to PQ. A high-end S-IPS panel is normally 10-ish ±2 ms, for example.

While most people may game on a monitor, monitors just don't do it for some of us. They are too tiny, and frankly, the colors, black levels, and black uniformity of all but the most expensive S-PLS/S-IPS monitors are awful! Even the $1K 48" JU6700 TV blows away any sub-$1K monitor. If I'm gonna drop $2-3K on a high-end 32" 4K monitor, why not just drop $1K more and get a 65" 4K TV, which has comparable PQ with slightly higher input lag?

A lot of people are affected by this silly decision, as the number of people playing at a desk is declining. And the people who aren't affected either don't have/use 4K, in which case it's completely irrelevant, or are likely using a very high-end 4K monitor, in which case they are using FirePro/Tesla GPUs anyway.

Really dumb call, AMD, really dumb.
 