Fury cards will NOT have HDMI 2.0 outputs

Page 8

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
So I was away for the weekend, and not going to read seven pages of this article. But, why is everybody so angry over no HDMI? I can understand this if you are connecting to a TV. But every display I have bought in the last several years that has HDMI also has DP. So why is there an issue? It's not like DP is some new technology; it's many years old.
 

crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
So I was away for the weekend, and not going to read seven pages of this article. But, why is everybody so angry over no HDMI? I can understand this if you are connecting to a TV. But every display I have bought in the last several years that has HDMI also has DP. So why is there an issue? It's not like DP is some new technology; it's many years old.

I guess for me it's like when you have your heart set on a certain dish, and the waitress comes back and says it not on the menu anymore. Sometimes it takes time to adjust to something that most would consider trivial. I had the display device I wanted pretty much picked out and wanted one of the Fury versions to go with it. Now I have to re-think it all, and it is annoying to have to do so because of (stupid) connectivity issues that I and many others just kind of got blindsided by.

Oh, well, moving right along. Let's see the reviews!
 

SithSolo1

Diamond Member
Mar 19, 2001
7,740
11
81
But, why is everybody so angry over no HDMI? I can understand this if you are connecting to a TV.


The TV connection is the issue, as most of the readily available larger 4K displays are TVs that only have HDMI 2.0 ports for 4K@60Hz. To the best of my knowledge, only a few flagship TVs from Panasonic have DisplayPort inputs. Without native HDMI 2.0 support you will need to buy active adapters to convert DP to HDMI, or you will be stuck at 4K@30Hz. At the moment these adapters don't really exist on the market, may introduce additional input lag, and will add (at a very optimistic minimum) $50+ to an already not-cheap card.

Even though I have no need for HDMI 2.0 in the foreseeable future, I'm still surprised it wasn't included in the GPU.
 

Xcobra

Diamond Member
Oct 19, 2004
3,675
423
126
Bought 2 x EVGA 980 Ti SC+ / ACX+ cards today, due to the lack of HDMI 2.0 on Fury X.

Hated doing it, paid $690 each, which is quite a price premium over the air Fury, and still more expensive than even the wonderful AIO hybrid Fury X, but had to obviously as my desktop display is a Samsung 48JU7500 4k LCD that I wouldn't give up for any silly GPU. IMAX on the desktop, nothing like it unless you're an early VR owner.

Damnit AMD...always one step forward, two steps back. So close, hopefully next time.

What a waste... I would have waited until after Fury X reviews in hopes for a price drop :D
 

SithSolo1

Diamond Member
Mar 19, 2001
7,740
11
81
What does having or not having HDMI 2.0 mean exactly?

If you want to display 4K at more than 30fps you have two competing standards atm, HDMI and DisplayPort. HDMI 2.0 and DisplayPort 1.2a/1.3 all support 4K @ 60fps. Most 4K TVs however only have HDMI 2.0 ports.
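
Rough numbers, if anyone wants to see why HDMI 1.4 can't do it. This is just a back-of-the-envelope sketch using the standard 4K@60 timing and the commonly published link-rate maximums:

Code:
# Why 4K@60 needs HDMI 2.0 or DP 1.2, while HDMI 1.4 tops out at 4K@30.
# CTA timing for 3840x2160@60 uses a 4400x2250 total raster -> 594 MHz pixel clock.
pixel_clock = 4400 * 2250 * 60               # 594,000,000 Hz
pixel_data_gbps = pixel_clock * 24 / 1e9     # 8-bit RGB -> ~14.3 Gbps of pixel data
hdmi_wire_gbps = pixel_clock * 10 * 3 / 1e9  # TMDS: 8b/10b over 3 channels -> ~17.8 Gbps

print(f"4K@60 pixel data: {pixel_data_gbps:.1f} Gbps, HDMI wire rate: {hdmi_wire_gbps:.1f} Gbps")
print("HDMI 1.4: 10.2 Gbps max -> not enough, hence 4K@30")
print("HDMI 2.0: 18.0 Gbps max -> fits")
print("DP 1.2:   21.6 Gbps raw (~17.3 Gbps effective) -> fits")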
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
What a waste... I would have waited until after Fury X reviews in hopes for a price drop :D

That's a highly unlikely scenario. You can pick almost any GPU generation in the last 5 years and you'll see that NV has no problems charging more for less performance and especially more for similar performance. Even if Fury X is 10-15% faster than a stock 980Ti, NV likely won't lower prices. They had no problem charging $150 more than an R9 280X for a 770 4GB and $100 more for a 780 vs. R9 290. That alone tells you NV can price its cards $100-150 more for similar or worse performance.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
That's a highly unlikely scenario. You can pick almost any GPU generation in the last 5 years and you'll see that NV has no problems charging more for less performance and especially more for similar performance. Even if Fury X is 10-15% faster than a stock 980Ti, NV likely won't lower prices. They had no problem charging $150 more than an R9 280X for a 770 4GB and $100 more for a 780 vs. R9 290. That alone tells you NV can price its cards $100-150 more for similar or worse performance.
The reason is that the price elasticity of NV GPUs is lower than AMD's. That's extremely valuable for profit.
For a huge part of the potential consumers, a GPU = NV and NV = GPU. Same thing.
It's just a question of which NV model it is.
Go look at our own forums - it's no different from other solid brands.
Such a brand requires good products and excellent marketing. Then you can sell at a higher price without changing demand.
(Price elasticity of demand is the percentage change in demand divided by the percentage change in price.)
We all know the number one brand in our tech world, so to speak. Talk about value (for shareholders lol)
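
To put toy numbers on that definition (completely made up, just to show why the less elastic brand can charge more):

Code:
# Price elasticity of demand: E = (% change in quantity) / (% change in price).
# Both brands raise price 10%; the figures below are invented purely for illustration.
def elasticity(q0, q1, p0, p1):
    return ((q1 - q0) / q0) / ((p1 - p0) / p0)

strong_brand = elasticity(q0=100, q1=95, p0=500, p1=550)  # loses 5% of buyers -> E = -0.5
other_brand  = elasticity(q0=100, q1=80, p0=500, p1=550)  # loses 20% of buyers -> E = -2.0

# Revenue after the hike:
#   strong brand: 95 * 550 = 52,250 (up from 50,000 -> the price increase pays off)
#   other brand:  80 * 550 = 44,000 (down -> it can't follow the price increase)
print(strong_brand, other_brand)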
 
Last edited:

MagickMan

Diamond Member
Aug 11, 2008
7,460
3
76
This is a bummer, but I switched to DP long ago, as have most other PC gamers (especially those who have dropped some serious coin on a nice 4K or 21:9 monitor). Once you've gone 32" 4K, there's no going back.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
The reason is that the price elasticity of NV GPUs is lower than AMD's. That's extremely valuable for profit.
For a huge part of the potential consumers, a GPU = NV and NV = GPU. Same thing.
It's just a question of which NV model it is.
Go look at our own forums - it's no different from other solid brands.
Such a brand requires good products and excellent marketing. Then you can sell at a higher price without changing demand.
(Price elasticity of demand is the percentage change in demand divided by the percentage change in price.)
We all know the number one brand in our tech world, so to speak. Talk about value (for shareholders lol)
The odd thing is that while I expect the Fury X to be faster in this review, I'm worried that when I sell the Fury X it'll be far harder to sell and will fetch less than a 980 Ti G1 card would.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Bought 2 x EVGA 980 Ti SC+ / ACX+ cards today, due to the lack of HDMI 2.0 on Fury X.

Hated doing it, paid $690 each, which is quite a price premium over the air Fury, and still more expensive than even the wonderful AIO hybrid Fury X, but had to obviously as my desktop display is a Samsung 48JU7500 4k LCD that I wouldn't give up for any silly GPU. IMAX on the desktop, nothing like it unless you're an early VR owner.

Damnit AMD...always one step forward, two steps back. So close, hopefully next time.

Great buy! http://www.techpowerup.com/reviews/EVGA/GTX_980_Ti_SC_Plus/

More efficient than the reference 980 Ti, 14% faster at 4K out of the box, and only 35 dB of noise. IMO, this card is better rounded than the Gigabyte G1.


There will be no more off topic discussion. You are in the wrong subforum for Nvidia talk.


-Rvenger
 
Last edited by a moderator:

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
If you want to display 4K at more than 30fps you have two competing standards atm, HDMI and DisplayPort. HDMI 2.0 and DisplayPort 1.2a/1.3 all support 4K @ 60fps. Most 4K TVs however only have HDMI 2.0 ports.

Are they just leaving DP out since it's less common and would be added cost that would not give them more sales? Is the DP hardware that expensive?
 

SithSolo1

Diamond Member
Mar 19, 2001
7,740
11
81
Are they just leaving DP out since it's less common and would be added cost that would not give them more sales? Is the DP hardware that expensive?

My guess is that since most consumer video hardware has been using HDMI for years they see no reason to add additional functionality on mainstream products. This may change as 4K becomes the new norm but I wouldn't hold my breath.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I like slow paced games in general, so I thought that a TV might be OK for me. But that could be wrong. Also wanted to go straight from 1080 to 4K, but perhaps an intermediate step will be needed. If a 1440 monitor is purchased now, won't it feel outdated pretty quick at this point?

Sorry to just now pick up the discussion thread, I spent all weekend on ONE game of Civ4 with RomAnd mod. :)

I went from a 2209wa to 2713hm about 18 mos ago, and it was a HUGE jump (22" 1680x1050 to 27" 2560x1440). It was killing me that my wife's 2213hm was so fantastic, so I just took the plunge. I have not regretted it one bit. My only hassle was having to buy a $4.99 dp cable from newegg after the monitor arrived so that I could game above 1920x1200. I've had a high quality 1080p tv for years, my latest one is a 65" plasma with a ridiculously crisp picture, and my monitor blows that away. If anything, the new monitor seemed too big for me. However, I wouldn't want a 27" 4k monitor as the dpi would be atrocious, I just read that it takes 40-41" on a 4k monitor to achieve the same dpi...which would probably cost an arm and a leg.

So, I would say that based upon my personal experience, you'd be pretty happy to go to 2560x1440 if you have budgetary concerns. And, even if you don't care about money at all, you might want to look at a 30" 2560x1600 monitor, lots of dedicated gamers are still using those and I'd expect that prices are much better these days.

fyi, I'm not trash talking the 2209wa, I'm actually using that as we speak b/c it's my monitor at the office. It just feels tiny compared to the 2713 hm.

I don't think it's necessarily partisanship driving the discussion. I'm brand neutral, I'm in the market for a new card, and developments like this give me pause because although I tend to buy at the bleeding edge, I also like to keep that card around for a few years. I may not be upgrading to a 4K TV in the next few years, but if I do, I would hope that it wouldn't require me to upgrade my computer (or purchase an additional adapter) simply because of a lack of support in a standard that a competitor supported at the time. Better to have it and not need it than need it and not have it, you know? But it's doubly confounding when AMD is touting the 4K capabilities of their product that it can't actually connect to a wide segment of 4K devices. What were they thinking?

Wait...you buy at the bleeding edge, yet you like to keep cards around for a few years? You'd be FAR better off buying scaled back cards like fury pro or a theoretical future 980+ every year than buying top end cards for $650 every 2 or 3 years, yet you'd have the same outlay.

If you don't plan to game on a 4k tv atm, I'd look at the things that you DO plan to game on to make your decision. What games do you play? What resolution are you going to use for the next year or so? What monitor do you currently own? Those things are most relevant in your situation.

Huh? I've never stated the Fury X was better for me. No HDMI 2.0 support makes it far worse. I'd feel much safer with the 980 Ti.

What really annoys me is that now I'm lumped in with shills because I want HDMI 2.0? I've been talking about big-screen TV gaming on here for 3+ years now. I'm not making a big deal out of this because I like Nvidia. I make a big deal out of this because I like HDMI 2.0 and AMD; I waited for Fury for almost a year now, for nothing.

No solution I can think of makes me happy, because I don't want a 980 Ti AIB version for far more than $650. I wanted the Fury card to undercut it so I could spend $650 for water cooling too. Ugh, my decision is screwed, and it's even worse if Fiji isn't sold by Amazon, because at least then I could buy it, try VSR 4K, see if it's enough, and then keep or return it.

What are you talking about? I realize, based upon your more recent posts in other threads, that you're not planning to go 4K now, but if you WERE going to go 4K on a TV then a 980 Ti or one of those DP-enabled Panasonics for Fury X are really the only options. People telling you to go with the 980 Ti earlier in this thread were doing so 100% based upon your stated desires; there's no need to read an ulterior motive into other people's posts.
 
Last edited:

crashtech

Lifer
Jan 4, 2013
10,681
2,277
146
Sorry to just now pick up the discussion thread, I spent all weekend on ONE game of Civ4 with RomAnd mod. :)

I went from a 2209wa to 2713hm about 18 mos ago, and it was a HUGE jump (22" 1680x1050 to 27" 2560x1440). It was killing me that my wife's 2213hm was so fantastic, so I just took the plunge. I have not regretted it one bit. My only hassle was having to buy a $4.99 dp cable from newegg after the monitor arrived so that I could game above 1920x1200. I've had a high quality 1080p tv for years, my latest one is a 65" plasma with a ridiculously crisp picture, and my monitor blows that away. If anything, the new monitor seemed too big for me. However, I wouldn't want a 27" 4k monitor as the dpi would be atrocious, I just read that it takes 40-41" on a 4k monitor to achieve the same dpi...which would probably cost an arm and a leg.

So, I would say that based upon my personal experience, you'd be pretty happy to go to 2560x1440 if you have budgetary concerns. And, even if you don't care about money at all, you might want to look at a 30" 2560x1600 monitor, lots of dedicated gamers are still using those and I'd expect that prices are much better these days.

fyi, I'm not trash talking the 2209wa, I'm actually using that as we speak b/c it's my monitor at the office. It just feels tiny compared to the 2713 hm.

Thanks for the reply. There are several different ways to go, and I've definitely taken a step back to reconsider. I don't suppose there is a rush, but this release got me pretty jazzed about upgrading.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I've been watching OLED and it looks to be the future for a good 4K screen if you want to preserve the PQ coming from a plasma. Still not sold on long term reliability yet. Not that plasmas are great, they fade over time. Another great thing is the power consumption/heat output of OLED. A plasma literally can heat your room :D

HA, my plasma is in a 15' tall room...thank God! My 2713hm seems to run pretty cool, though I couldn't tell you whether it's led, oled, plasma, chalkboard, or papier mache technology.

If you are in the US and have a Fry's or Best Buy (with Magnolia) nearby, then go take a look. LG 1080p OLEDs are there in most Fry's, and the 4K OLEDs are also there at a few locations.

Do they have such things in Santa Clara? Maybe they'll start selling technical equipment at Whole Foods though...

I would have loved to have given my money to AMD, but they don't want it, so now I have to buy the inferior product from the competition and further their market lead. And then I have to recommend the same to all of my friends and clients interested in 4k gaming. You guys fanboying and telling everyone to get over it don't realize most of us are upset with AMD because they've done what nvidia couldn't and gotten us to buy nvidia, not because we hate AMD anyway.

I seriously don't get this kind of post. Why is NV "inferior"? The 980 Ti looks to be roughly as fast as Fury X, and it will have an extra 2 GB just in case that ends up being important. Even if Fury X ends up being 10% faster (which would surprise the heck out of many of us), the 980 Ti is a great overclocker, so your real-world gaming experience would likely be identical or nearly so.

Just buy the card that has the best price/performance for your gaming needs. And, for this generation, if your gaming needs include "4K/60" and you don't like Panasonic TVs, then you shouldn't even consider anything other than the 980 Ti.
 
Last edited:

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
The odd thing is that while I expect the Fury X to be faster in this review, I'm worried that when I sell the Fury X it'll be far harder to sell and will fetch less than a 980 Ti G1 card would.
That's surely going to be the case. But as history shows, in a year it's also probably far faster than the 980 Ti, so you can keep it for far longer. Go look at e.g. the 680 vs the 7970 today. It's staggering, and the 7970 launched what, 3 months before?
 

maddie

Diamond Member
Jul 18, 2010
5,147
5,523
136
Sorry to just now pick up the discussion thread, I spent all weekend on ONE game of Civ4 with RomAnd mod. :)

I went from a 2209wa to 2713hm about 18 mos ago, and it was a HUGE jump (22" 1680x1050 to 27" 2560x1440). It was killing me that my wife's 2213hm was so fantastic, so I just took the plunge. I have not regretted it one bit. My only hassle was having to buy a $4.99 dp cable from newegg after the monitor arrived so that I could game above 1920x1200. I've had a high quality 1080p tv for years, my latest one is a 65" plasma with a ridiculously crisp picture, and my monitor blows that away. If anything, the new monitor seemed too big for me. However, I wouldn't want a 27" 4k monitor as the dpi would be atrocious, I just read that it takes 40-41" on a 4k monitor to achieve the same dpi...which would probably cost an arm and a leg.

So, I would say that based upon my personal experience, you'd be pretty happy to go to 2560x1440 if you have budgetary concerns. And, even if you don't care about money at all, you might want to look at a 30" 2560x1600 monitor, lots of dedicated gamers are still using those and I'd expect that prices are much better these days.

fyi, I'm not trash talking the 2209wa, I'm actually using that as we speak b/c it's my monitor at the office. It just feels tiny compared to the 2713 hm.



Wait...you buy at the bleeding edge, yet you like to keep cards around for a few years? You'd be FAR better off buying scaled back cards like fury pro or a theoretical future 980+ every year than buying top end cards for $650 every 2 or 3 years, yet you'd have the same outlay.

If you don't plan to game on a 4k tv atm, I'd look at the things that you DO plan to game on to make your decision. What games do you play? What resolution are you going to use for the next year or so? What monitor do you currently own? Those things are most relevant in your situation.



What are you talking about? I realize, based upon your more recent posts in other threads, that you're not planning to go 4K now, but if you WERE going to go 4K on a TV then a 980 Ti or one of those DP-enabled Panasonics for Fury X are really the only options. People telling you to go with the 980 Ti earlier in this thread were doing so 100% based upon your stated desires; there's no need to read an ulterior motive into other people's posts.

Agreed. DPI scaling is not an option if you want to fit a lot of info for work use on the screen. I have a Korean 2560x1440 now and don't have a problem buying that way.

Arm and Leg?

Perfect Pixel WASABI MANGO UHD420 REAL 4K 42" LG AH-IPS UHD 3840x2160 Monitor for $900 shipped.
http://www.ebay.com/itm/Perfect-Pix...714?pt=LH_DefaultDomain_0&hash=item1e9e31a5ca

Equals 28" 2560x1440 DPI. Can live with this.
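
For anyone who wants to check that, pixel density is just the diagonal resolution over the diagonal size (rough sketch, ignoring bezels and exact panel dimensions):

Code:
# PPI = diagonal resolution in pixels / diagonal size in inches.
import math

def ppi(w_px, h_px, diag_in):
    return math.hypot(w_px, h_px) / diag_in

print(round(ppi(3840, 2160, 42)))  # 42" 4K        -> ~105 ppi
print(round(ppi(2560, 1440, 28)))  # 28" 2560x1440 -> ~105 ppi, same density as claimed
print(round(ppi(2560, 1440, 27)))  # 27" 2560x1440 -> ~109 ppi; a 4K panel needs ~40" to match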
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Bought 2 x EVGA 980 Ti SC+ / ACX+ cards today, due to the lack of HDMI 2.0 on Fury X.

Hated doing it, paid $690 each, which is quite a price premium over the air Fury, and still more expensive than even the wonderful AIO hybrid Fury X, but had to obviously as my desktop display is a Samsung 48JU7500 4k LCD that I wouldn't give up for any silly GPU. IMAX on the desktop, nothing like it unless you're an early VR owner.

Damnit AMD...always one step forward, two steps back. So close, hopefully next time.

Again, I just don't understand the drama. Why do so many people get worked up over this stuff? I understand the thinking behind supporting the underdog, but many here seem to take things to extremes.

980 ti is clearly the right card for you, so you bought it. Smart move.

Edit: Also hoping for more flat OLEDs next year. I don't mind curved, but it seems more OLEDs are curved vs flat right now and I don't want or need a curved display for my use.

Isn't the whole thought process behind curved monitors that it gives you that "surround monitor" feel, i.e. a more immersive experience? It seems like a curved monitor would, if anything, be more immersive than a curved TV.
 
Last edited:

bryanW1995

Lifer
May 22, 2007
11,144
32
91
My guess is that since most consumer video hardware has been using HDMI for years they see no reason to add additional functionality on mainstream products. This may change as 4K becomes the new norm but I wouldn't hold my breath.

Seems very odd that a new TV technology with only one good use atm (video games) would be so tough to use for...video games. I'm not up to speed on this sort of thing, but at a guess I'd say that it's cheaper for a TV to add DP than it would be for NV or AMD to design hdmi 2.0 into their cards. Perhaps if AMD gets the market back to 40/60, then more of the TV makers will start putting DP connections on their 4k offerings.

OFC, by then AMD will doubtless have hdmi 2.0 on their cards...dual standards, indeed.

Agreed. DPI scaling is not an option if you want to fit a lot of info for work use on the screen. I have a Korean 2560x1440 now and don't have a problem buying that way.

Arm and Leg?

Perfect Pixel WASABI MANGO UHD420 REAL 4K 42" LG AH-IPS UHD 3840x2160 Monitor for $900 shipped.
http://www.ebay.com/itm/Perfect-Pix...714?pt=LH_DefaultDomain_0&hash=item1e9e31a5ca

Equals 28" 2560x1440 DPI. Can live with this.

That price is not bad, but it's still 3x the cost of a 27" monitor. Might just cost you the arm OR leg...

At least it has DP.
 
Last edited:

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
It just sucks because the form factor, noise, and temps made it the perfect HTPC card. Not a problem for most users, as the living room TV time is most likely monopolized by normal TV/kids/whatever, but for me it would be nice to play a console port with a wireless controller every now and then. Really hoping that the HDMI 2.0 adapters pan out, or that AIBs can add the active adapter IC with a comparable cooling solution.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Bought 2 x EVGA 980 Ti SC+ / ACX+ cards today, due to the lack of HDMI 2.0 on Fury X.

Loved doing it because nvidia, paid $690 each, which is quite a price premium over the air Fury, and still more expensive than even the wonderful AIO hybrid Fury X, but had to obviously as my desktop display is a Samsung 48JU7500 4k LCD that I wouldn't give up for any silly GPU and because nvidia. IMAX on the desktop, nothing like it unless you're an early VR owner.

Damnit AMD...always one step forward, two steps back. So close, hopefully but probably not because nvidia, next time.

LOL, what is this crap?
 

tential

Diamond Member
May 13, 2008
7,348
642
121
That's surely going to be the case. But as history shows, in a year it's also probably far faster than the 980 Ti, so you can keep it for far longer. Go look at e.g. the 680 vs the 7970 today. It's staggering, and the 7970 launched what, 3 months before?
As a person who purchased the 7950 over the 680 and now sees where they ended up, yeah, I really am worried about picking up a 980 Ti and having its performance fall off a cliff later.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
As a person who purchased the 7950 over the 680 and now sees where they ended up, yeah, I really am worried about picking up a 980 Ti and having its performance fall off a cliff later.

That is not typical of past aging of AMD/NV cards. This particular cycle was weird b/c AMD had nothing new come out for a long time after NV's refresh last year. AMD will certainly be focusing more energy on Fury cards now instead of Hawaii/Grenada. If anything, the extra 2 gb on 980ti might help it age a bit more gracefully than Fury X.

The only reason to expect that 980ti will fall off over time is if you expect that AMD will be (again) far behind NV on the next node release. That's a murky picture indeed, and it's tough to see how it will play out over the next year (or 2).
 
Last edited:

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
As a person who purchased the 7950 over the 680 and now sees where they ended up, yeah, I really am worried about picking up a 980 Ti and having its performance fall off a cliff later.
I have it the same way. Got the 7970 over the 680 and was like 5 minutes from a 970. Damn. But we don't know if the new gen in ~1.3 years will change the pattern of GCN improving. And neither do we know if HBM2 is very different from a programming and driver perspective than gen 1.
But I bet we will continue to see solid HBM and GCN improvement over the next 2 years.
We have 3 Fiji cards and soon 4. I think there will be plenty of optimization here. I don't even care if Fury X is 5% slower than the 980 Ti. What matters is the potential. I think I'll go for the Pro with air, as I will only game Star Wars Battlefront this year and early next year at 1440p. 80 hours max. I promise. Lol. Then it needs to stay idle and 100% silent the rest of the time.
Late next year is 4K and 14nm. Perhaps OLED.