There goes my Freesync single GPU 4K Dream

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
[Attached image: AMD-Radeon-Fury-X-Gaming-Benchmarks.jpg]


Was thinking of going Fury X ($650) + Samsung UE850 32" 4K IPS FreeSync monitor ($1399) = $2050. But the card would need to hit 40fps minimum to stay synced with the monitor in FreeSync.

Alternative is SLI 980 Ti ($1300) + Acer B326HK 32" 4K IPS monitor ($700-750) = $2000-2050, which costs about the same, but needs multi-GPU game support and much more power to hit the 60fps needed to sync with the monitor.

But according to the official slides, the card is barely managing high-30s fps in Witcher 3. And those are average framerates; minimums will dip well below the FreeSync threshold. 40fps at 4K on a single card + FreeSync still looks like a bit of a pipe dream at this point, especially as games get more demanding.
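To put that floor in frame-time terms, here's a quick sketch (the 40-60 Hz window is an assumption based on the 40fps floor above, not something I've confirmed from the UE850 spec sheet):

```python
# Rough frame-time check against an assumed FreeSync window (40-60 Hz).
def in_freesync_window(fps, low_hz=40, high_hz=60):
    """True if the instantaneous framerate falls inside the variable refresh window."""
    return low_hz <= fps <= high_hz

for fps in (60, 45, 40, 37, 30):
    frame_time_ms = 1000.0 / fps
    status = "inside VRR window" if in_freesync_window(fps) else "outside (tearing or v-sync judder)"
    print(f"{fps:>2} fps = {frame_time_ms:5.1f} ms/frame -> {status}")
```

With a high-30s average, the minimums spend a lot of time outside that window.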
 
Last edited:

Annisman*

Golden Member
Aug 20, 2010
1,931
95
91
Personally, I don't see single-GPU 4K gaming being a viable option for at least another two years. Get a 1440p monitor and, whatever GPU you pick, you'll be covered.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
The high minimum refresh rate for FreeSync is a problem. AMD should have mandated that, in order to claim FreeSync compatibility, monitors had to support refresh rates down to a specific floor: 30 Hz at most, and preferably 20 Hz.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Turn down AA/settings a little bit. Hit the framerates. Profit.

Or just wait a bit for the monitors to get a bit better. Everything's kinda marginal right now for that sort of thing.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Turn down AA/settings a little bit. Hit the framerates. Profit.

Or just wait a bit for the monitors to get a bit better. Everything's kinda marginal right now for that sort of thing.

Honestly, games look better with AA and max settings at 1440p than at 4K on medium.

Personally, I don't see single-GPU 4K gaming being a viable option for at least another two years. Get a 1440p monitor and, whatever GPU you pick, you'll be covered.

I feel the same way. Running medium to get good framerates isn't worth the trade-off to me. 1440p is plenty with a bit of AA and max settings.
 

Irenicus

Member
Jul 10, 2008
94
0
0
I've said it before, I'll say it again, this is the worst POSSIBLE time to dive deep into 4k ANYTHING.


hdmi 2.0 is bandwidth starved to the point where to get 4k @60Hz you have to drop color quality with hdcp.


next year each gpu maker goes from 28nm to 14nm. That is a massive leap. THAT is when you will be able to run 4k on a single card. Even better, by then hopefully we can start seeing 4k tvs with supermhl built in and gpus with supermhl to replace hdmi and send it to the ash heap of history. Let that standard die.


As if 4k is where things are supposed to stop.

The next MAJOR 4k tv purchase ought to be able to handle :

-4k@ 60Hz AT LEAST if not 120Hz
-rec 2020
-hfr video
-high dynamic range video content
-cheaper oled tvs from lg with even greater yields in 2016 as their production ramps up more.


Today nothing has supermhl, almost nothing has displayport, and hdmi 2.0 is already obsolete. Very few 4k tvs support hdr content, and the only oled displays with perfect blacks are still too expensive. And you want to jump into 4k right now? Why? Why on earth would anyone piss away time and money like that right now? Wait. Wait another year and be so much better off in both performance AND tv options.
 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
I've said it before, I'll say it again, this is the worst POSSIBLE time to dive deep into 4k ANYTHING.


hdmi 2.0 is bandwidth starved to the point where to get 4k @60Hz you have to drop color quality with hdcp.

That has nothing to do with hdmi 2.0 but rather TVs using an old Silicon Image chipset that only has 4:2:0 support. 4k TVs starting Q4 2015 will all be coming out with Silicon Image SiL 9777 HDCP 2.2 4:4:4 chips.
 
Feb 19, 2009
10,457
10
76
Those aren't even maxed settings. It's like FXAA or 2x MSAA. 2x Fury X isn't enough to max 4K with 4x MSAA.

Neither would 2x Titan X.

4K is about trade-offs, higher resolution vs maxing IQ in games.

Now compare how games tend to look on High/Ultra vs All Ultra. Much difference? Hardly any.
 

Irenicus

Member
Jul 10, 2008
94
0
0
That has nothing to do with hdmi 2.0 but rather TVs using an old Silicon Image chipset that only has 4:2:0 support. 4k TVs starting Q4 2015 will all be coming out with Silicon Image SiL 9777 HDCP 2.2 4:4:4 chips.

I was under the impression that the full hdmi 2.0 18Gbps spec was enough for 4k @ 60Hz and 4:4:4 chroma subsampling, but NOT with the additional overhead of hdcp 2.2 on top of that as well. So that if you needed hdcp 2.2 to get 4k content to display @ 60Hz, you would have to drop the color to 4:2:0.

Is that not right?


I know there are plenty of tvs on the market right now that can handle 4k content, but most of it gets there via streaming. The tv connectors are what's lagging behind, which I find astonishing. What is more difficult to design and build? An entire television set, or the connector?

If you want to use a 4k tv for a gaming display, don't you want MORE than 60Hz? And does that not suggest that both hdmi 2.0 and displayport 1.2 are not up to the task there?


How much bandwidth is needed to deliver rec 2020 color at 4k at 60Hz? 120Hz? Is the 18Gbps of hdmi 2.0 enough for that? Because I'm pretty sure supermhl is. And that is what we do not have.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I was under the impression that the full hdmi 2.0 18Gbps spec was enough for 4k @ 60Hz and 4:4:4 chroma subsampling, but NOT with the additional overhead of hdcp 2.2 on top of that as well. So that if you needed hdcp 2.2 to get 4k content to display @ 60Hz, you would have to drop the color to 4:2:0.

Is that not right?
HDMI 2.0 is enough for 4k@60Hz, 8bpc, with 4:4:4 chroma subsampling and HDCP 2.2. They didn't go through all of that effort to develop HDMI 2.0 and HDCP 2.2 and not think through the bandwidth requirements.:p

The issue is that HDCP 2.2 capable controllers that can also handle the full HDMI 2.0 specification are still very new and will take time to become common in 4K TVs.
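Rough numbers for anyone who wants them (my own back-of-envelope, assuming the standard CTA 4K60 timing of 4400x2250 total pixels at a 594 MHz pixel clock, plus TMDS 8b/10b overhead; HDCP is encryption, not extra data, so it doesn't change the math):

```python
# Back-of-envelope HDMI 2.0 bandwidth check at 4K60, 4:4:4/RGB.
# Assumes the standard CTA 4K60 timing (4400x2250 total pixels, 594 MHz pixel clock).
total_tmds_gbps = 18.0                      # 3 lanes x 6 Gbps
effective_gbps = total_tmds_gbps * 8 / 10   # ~14.4 Gbps of pixel data after 8b/10b encoding
pixel_clock_hz = 594e6

for bpc in (8, 10, 12):
    needed_gbps = pixel_clock_hz * bpc * 3 / 1e9    # 3 color components per pixel at 4:4:4/RGB
    verdict = "fits" if needed_gbps <= effective_gbps else "does NOT fit"
    print(f"4K60 4:4:4 @ {bpc}bpc: {needed_gbps:.2f} Gbps needed vs {effective_gbps:.1f} Gbps available -> {verdict}")
```

8bpc at full chroma just squeezes in; anything deeper at 4:4:4 does not.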
 
Last edited:

Irenicus

Member
Jul 10, 2008
94
0
0
HDMI 2.0 is enough for 4k@60Hz, 8bpc, with 4:4:4 chroma subsampling and HDCP 2.2. They didn't go through all of that effort to develop HDMI 2.0 and HDCP 2.2 and not think through the bandwidth requirements.:p

The issue is that HDCP 2.2 capable controllers that can also handle the full HDMI 2.0 specification are still very new and will take time to become common in 4K TVs.

Well then hdmi 2.0 is slightly less terrible than I thought it was. Even so, I'd still wait until we had supermhl tvs on the market before I made a jump to 4k.


http://www.hdtvexpert.com/look-out-hdmi-here-comes-super-mhl/


Maxing out @ 60Hz sounds terrible to me, and so does sticking to 8-bit color when HDR is on the horizon.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I know there are plenty of tvs on the market right now that can handle 4k content, but most of it gets there via streaming. The tv connectors are what's lagging behind, which I find astonishing. What is more difficult to design and build? An entire television set, or the connector?
In all seriousness, the connector. With the TV everything is on-chip, so you don't have to worry about data routing and more importantly you don't have to worry about heavy-handed HDCP security.

How much bandwidth is needed to deliver rec 2020 color at 4k at 60Hz? 120Hz? Is the 18Gbps of hdmi 2.0 enough for that? Because I'm pretty sure supermhl is. And that is what we do not have.
Technically you can do Rec 2020 with 4K@60Hz over HDMI 2.0. However, that's with 8bpc color, which is suboptimal. Really, when doing Rec 2020 you want 12bpc color, in which case HDMI 2.0 is not enough.

But! The vast majority of the time if you want HDR you're talking about pre-recorded content already stored with 4:2:0 chroma subsampling (i.e. Ultra HD Blu-Ray), in which case we're back to fitting on an HDMI 2.0 connection if we just use 4:2:0 over the entire connection.

Arguably the only real problem with HDMI 2.0 is when you want 4K, HDR, and 4:4:4/RGB content, which is pretty much limited to gaming and is why it's not a serious concern for TV manufacturers or the CE industry at large.
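For the curious, the same rough math with chroma subsampling in the mix (again my own back-of-envelope, assuming a 594 MHz 4K60 pixel clock and ~14.4 Gbps of effective HDMI 2.0 bandwidth after 8b/10b):

```python
# Rough 4K60 bandwidth needs by chroma subsampling and bit depth vs HDMI 2.0's effective rate.
effective_gbps = 14.4
pixel_clock_hz = 594e6
components_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

for chroma, comps in components_per_pixel.items():
    for bpc in (8, 10, 12):
        needed_gbps = pixel_clock_hz * bpc * comps / 1e9
        verdict = "fits" if needed_gbps <= effective_gbps else "no"
        print(f"{chroma} @ {bpc}bpc: {needed_gbps:5.2f} Gbps -> {verdict}")
```

Which is why 10/12bpc HDR is fine over HDMI 2.0 as 4:2:0 (or 4:2:2) content, while 4K60 + HDR + 4:4:4/RGB for gaming is where it runs out of headroom.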
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
Personally, I don't see single-GPU 4K gaming being a viable option for at least another two years. Get a 1440p monitor and, whatever GPU you pick, you'll be covered.

It entirely depends on what you play.

It's not possible with absolutely max-ultra-8xAA-modded-customtextures-shineyballs settings in the top few most demanding games, but if you drop the settings from ultra down to high or very high then you've got a very good experience at 4K, and 99% of the rest of any decent-sized gaming library will run maxed out.

I love the really high fidelity that some titles can crank out, the Crysis-like games which are so pretty it makes your eyes bleed, but I probably put in close to 1000 hours of gaming a year and the latest Crysis-like titles probably account for maybe a total of 50h of that.

Some people literally only play CS and TF2, or MOBAs, which all easily run maxed out. It's easy to lose track of reality when you're looking at benchmarks of the most demanding games at the highest settings; these are used to show off the power of the cards, but for most people that's not representative of their actual gaming habits.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I've said it before, I'll say it again, this is the worst POSSIBLE time to dive deep into 4k ANYTHING.


hdmi 2.0 is bandwidth starved to the point where to get 4k @60Hz you have to drop color quality with hdcp.


next year each gpu maker goes from 28nm to 14nm. That is a massive leap. THAT is when you will be able to run 4k on a single card. Even better, by then hopefully we can start seeing 4k tvs with supermhl built in and gpus with supermhl to replace hdmi and send it to the ash heap of history. Let that standard die.


As if 4k is where things are supposed to stop.

The next MAJOR 4k tv purchase ought to be able to handle :

-4k@ 60Hz AT LEAST if not 120Hz
-rec 2020
-hfr video
-high dynamic range video content
-cheaper oled tvs from lg with even greater yields in 2016 as their production ramps up more.


Today nothing has supermhl, almost nothing has displayport, and hdmi 2.0 is already obsolete. Very few 4k tvs support hdr content, and the only oled displays with perfect blacks are still too expensive. And you want to jump into 4k right now? Why? Why on earth would anyone piss away time and money like that right now? Wait. Wait another year and be so much better off in both performance AND tv options.

If your other 50 posts in the last 7 years are as insightful as this one, I await eagerly your next words of wisdom. ;)

Great post!
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Those aren't even maxed settings. It's like FXAA or 2x MSAA. 2x Fury X isn't enough to max 4K with 4x MSAA.

Neither would 2x Titan X.

4K is about trade-offs, higher resolution vs maxing IQ in games.

Now compare how games tend to look on High/Ultra vs All Ultra. Much difference? Hardly any.


Hardly any is still a difference and I'd rather have all the quality I can before I move up in resolution personally.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
This thread is stating the obvious. 4K at very high / maxed settings with no AA in the latest titles is for SLI/CF setups with Fury X or 980 Ti / Titan X. Anyone talking about 4K performance on single GPUs right now is either sticking with 3+ year-old games or simply needs to STFU, because it just doesn't matter yet. 1440p with maxed settings is where the battle still is right now.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It entirely depends on what you play.

It's not possible with absolutely max-ultra-8xAA-modded-customtextures-shineyballs settings in the top few most demanding games, but if you drop the settings from ultra down to high or very high then you've got a very good experience at 4K, and 99% of the rest of any decent-sized gaming library will run maxed out.

I love the really high fidelity that some titles can crank out, the Crysis-like games which are so pretty it makes your eyes bleed, but I probably put in close to 1000 hours of gaming a year and the latest Crysis-like titles probably account for maybe a total of 50h of that.

Some people literally only play CS and TF2, or MOBAs, which all easily run maxed out. It's easy to lose track of reality when you're looking at benchmarks of the most demanding games at the highest settings; these are used to show off the power of the cards, but for most people that's not representative of their actual gaming habits.

Those aren't even maxed settings. It's like FXAA or 2x MSAA. 2x Fury X isn't enough to max 4K with 4x MSAA.

Neither would 2x Titan X.

4K is about trade-offs, higher resolution vs maxing IQ in games.

Now compare how games tend to look on High/Ultra vs All Ultra. Much difference? Hardly any.

Lots of truth here on both counts.

Not only do we not always play these most demanding games, but as IQ improves, we start to see diminishing returns on how much better things look at higher settings. The same thing can be said about resolutions. Resolution also affects IQ.

I'm not sure why in-game settings have to be maxed to be considered playable, but resolutions don't. Is Ultra at a lower resolution really better than High at a higher resolution? If that were always true, why do we spend money on top-end GPUs when we could just use 720p displays? Why don't devs extend the sliders in their games to expose the higher IQ settings they currently hide, so we could do this now?

Am I the only one who realizes that IQ settings and resolution are one and the same, and devs simply give us settings based on the resolutions most of us use?

What would you guys do if the devs removed the Ultra settings they have now (relabeling the current medium or high settings as Ultra)? Would you complain about a drop in IQ, or be excited that 4K can be played on a single GPU?

What if devs relabeled the current Ultra settings as medium and gave us access to settings way beyond our current Ultra (they exist within the games, they just aren't exposed to us)? Would you lower the resolution, or lower the settings?
 
Last edited:
Feb 19, 2009
10,457
10
76
It's a preference thing; a mix of High with Ultra Textures (most important) looks, to me, very close to Ultra in all the games I've played. The difference is the performance hit, or rather the performance saved, which is often very significant.

Also, the thing about MSAA and 4K is very simple: take the average 24-27 inch 1080p monitor. Now give it 4x the pixels but only make the screen 27-32 inches. The viewing area goes up a bit, but the pixel count has quadrupled, so the pixel density goes way up. It behaves like you are running games with native SSAA. Seeing it in person, I didn't see the need for MSAA added on top, especially since it would cripple our current GPUs badly.
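Quick PPI math for anyone who wants actual numbers, using an assumed 27" 1080p panel vs a 32" 4K one (my own rough figures, nothing official):

```python
# Pixel-density comparison for assumed panel sizes: 27" 1080p vs 32" 4K.
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the diagonal for a given resolution and panel size."""
    return math.hypot(width_px, height_px) / diagonal_in

ppi_27_1080p = ppi(1920, 1080, 27)   # ~81.6 PPI
ppi_32_4k = ppi(3840, 2160, 32)      # ~137.7 PPI
print(f'27" 1080p: {ppi_27_1080p:.1f} PPI')
print(f'32" 4K:    {ppi_32_4k:.1f} PPI ({ppi_32_4k / ppi_27_1080p:.2f}x linear density)')
```

Roughly 1.7x the linear pixel density, so about 2.8x the pixels per unit of screen area, which is why it reads like built-in supersampling.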

The only "Ultra" option you can't skimp out imo, is Texture Quality.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91

Is that 23ms the response time?

And why the talk about TVs? They aren't meant for computer gaming and don't have FreeSync or G-Sync.

Going 4K is asking for trouble. You can max out 1440p if you get over the 4K hype and bragging rights.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I've said it before, I'll say it again, this is the worst POSSIBLE time to dive deep into 4k ANYTHING.


hdmi 2.0 is bandwidth starved to the point where to get 4k @60Hz you have to drop color quality with hdcp.


next year each gpu maker goes from 28nm to 14nm. That is a massive leap. THAT is when you will be able to run 4k on a single card. Even better, by then hopefully we can start seeing 4k tvs with supermhl built in and gpus with supermhl to replace hdmi and send it to the ash heap of history. Let that standard die.


As if 4k is where things are supposed to stop.

The next MAJOR 4k tv purchase ought to be able to handle :

-4k@ 60Hz AT LEAST if not 120Hz
-rec 2020
-hfr video
-high dynamic range video content
-cheaper oled tvs from lg with even greater yields in 2016 as their production ramps up more.


Today nothing has supermhl, almost nothing has displayport, and hdmi 2.0 is already obsolete. Very few 4k tvs support hdr content, and the only oled displays with perfect blacks are still too expensive. And you want to jump into 4k right now? Why? Why on earth would anyone piss away time and money like that right now? Wait. Wait another year and be so much better off in both performance AND tv options.
We'll say it again: we don't care about hdcp... We want to game; we don't care about bluray discs. I don't even own a bluray player.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Lots of truth here on both counts.

Not only do we not always play these most demanding games, but as IQ improves, we start to see diminishing returns on how much better things look at higher settings. The same thing can be said about resolutions. Resolution also affects IQ.

I'm not sure why in-game settings have to be maxed to be considered playable, but resolutions don't. Is Ultra at a lower resolution really better than High at a higher resolution? If that were always true, why do we spend money on top-end GPUs when we could just use 720p displays? Why don't devs extend the sliders in their games to expose the higher IQ settings they currently hide, so we could do this now?

Am I the only one who realizes that IQ settings and resolution are one and the same, and devs simply give us settings based on the resolutions most of us use?

What would you guys do if the devs removed the Ultra settings they have now (relabeling the current medium or high settings as Ultra)? Would you complain about a drop in IQ, or be excited that 4K can be played on a single GPU?

What if devs relabeled the current Ultra settings as medium and gave us access to settings way beyond our current Ultra (they exist within the games, they just aren't exposed to us)? Would you lower the resolution, or lower the settings?


I always want my games on ultra or whatever the highest setting is. Post process effects not included. That is more important than resolution to me. I'll run 1080p before I turn stuff on medium to get frame rates I consider playable at resolutions above 1440p. I don't have a 4k native display but have used DSR to play sometimes at 4k and I'll be frank, I didn't notice the difference by just turning up the resolution. The game looked the same because it was on max settings both ways. Everyone else can do whatever they want, I prefer not to trade graphics settings for resolution. I want the textures and shadow effects to be at their highest first. I don't stare at the edges of objects zoomed way in to point out aliasing and I don't count pixels. Maybe I'm in the minority.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I always want my games on ultra or whatever the highest setting is. Post process effects not included. That is more important than resolution to me. I'll run 1080p before I turn stuff on medium to get frame rates I consider playable at resolutions above 1440p. I don't have a 4k native display but have used DSR to play sometimes at 4k and I'll be frank, I didn't notice the difference by just turning up the resolution. The game looked the same because it was on max settings both ways. Everyone else can do whatever they want, I prefer not to trade graphics settings for resolution. I want the textures and shadow effects to be at their highest first. I don't stare at the edges of objects zoomed way in to point out aliasing and I don't count pixels. Maybe I'm in the minority.

What you might not realize is that you are already making a compromise. It's just one the developer made for you. He hid the max settings so that you could set the slider to Ultra at your 1080p resolution and feel good.

It seems more of a psychological thing for many around here. The word "Ultra" carries all sorts of psychological power over people.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
If your other 50 posts in the last 7 years are as insightful as this one, I await eagerly your next words of wisdom. ;)

Great post!

The next MAJOR 4k tv purchase ought to be able to handle :

-4k@ 60Hz AT LEAST if not 120Hz
-rec 2020
-hfr video
-high dynamic range video content

And yet everything he asks for in a 4K TV with HDMI 2.0 has been out for 3 or 4 months for less than $1,500.
Or were you being sarcastic?