4gb fury x 4k gaming, what games have issues?


Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
So suggest a 1070 8GB then, not a 4GB FuryX for 4K. The 1080 not being recommended is understandable, but not mentioning the 1070 as an option and only comparing FuryX prices to 1080 prices is steering. There is also the 980 Ti 6GB option, but no mention? Heck, wouldn't even the 6GB 1060 be a better option at 4K over a 4GB FuryX? What about 8GB AMD cards? 390X? Are those 8GB?

I suppose settings could be turned down if need be if memory runs out on the FuryX. If the price is undeniably irresistible, then I can see it.

Did you bother to read the post I was quoting and replying to? He was the one that suggested a 1080 and not a 1070.

The 1080 scales much better @ 4k vs the 1070 anyway, and the 1070 is still at least $100 more than a Fury.

1070 is ~ 5% faster @ 4k on average over Fury, while the 1080 is ~20% faster than the 1070.

https://www.computerbase.de/2016-06/nvidia-geforce-gtx-1070-test/4/

As far as 980 Ti goes, they haven't seemed to drop in price new like Fury has. They seem to go for more than 1070s. Cheapest is $580+ at newegg.

Warning issued for inappropriate content.
-- stahlhart
 

ControlD

Diamond Member
Apr 25, 2005
5,440
44
91
To me it seems like 2K (** edit ** 1440p) is the current sweet spot for a 28"/29" monitor size. You get a decent step up in quality and desktop space without needing a monster GPU to run everything. Still, I am interested in seeing where the OP ends up going with this, as I have been mulling over the same upgrades for a while now myself.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
By "2K" do you mean 2560x1440? Not an accurate label, but just want to make sure...

If so, I agree 1440p 27-inch is the sweet spot. The setting reductions required to push 4K are pretty steep, plus you lose the high refresh rate.
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
I didn't think about that. I'm really pessimistic about Vega now. It's super late and looks to be expensive. We'll see when it comes out, but it doesn't look good.



Well, that's what I'm worried about. I see the 1060/480 cases where 4GB of VRAM or less is terrible, but yet the Fury X still seems to do well in games according to benches and users.



The 1070 doesn't support FreeSync, and G-Sync takes me so far out of the price range that I could get CrossFire Fury Xs and a FreeSync panel and still be under the cost of the G-Sync panel alone.

It is a little disingenuous to not mention the 1070. That's my major gripe.

If I could get the 1070 and a 4K 30+ inch G-Sync monitor for a decent price, then I would be much more inclined to go Nvidia. Right now you don't see many 30+ inch 4K panels. I saw some 24-inch panels or so for $600, but that's tiny for 4K.

I have a FreeSync monitor (XL2730Z) and used to have a 390X. Loved FreeSync. Upgraded around 5 months ago to a 1070, using the same monitor. I haven't noticed losing FreeSync at all, due to the massively increased FPS.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Did you bother to read the post I was quoting and replying to? He was the one that suggested a 1080 and not a 1070.

The 1080 scales much better @ 4k vs the 1070 anyway, and the 1070 is still at least $100 more than a Fury.

1070 is ~ 5% faster @ 4k on average over Fury, while the 1080 is ~20% faster than the 1070.

https://www.computerbase.de/2016-06/nvidia-geforce-gtx-1070-test/4/

As far as 980 Ti goes, they haven't seemed to drop in price new like Fury has. They seem to go for more than 1070s. Cheapest is $580+ at newegg.

Yes, I did actually. I UNDERSTAND that the person you quoted didn't mention the 1070. My point was that YOU should have. Instead, you opted to show the stark difference in pricing between the FuryX and the 1080 to make it seem like a no-brainer. Well, I'm sorry, but brains need to be involved here.

So, you said a FuryX is 300 dollars, and you said the 1070 is 100 dollars more. Tential says $320 for a FuryX, and I see 1070s going for $379.00 on the egg. So, maybe not so much 100 bucks, right?

And a 1070 is almost always faster than a 980 Ti, except when both are fully overclocked; then the 980 Ti just edges out the 1070. Either way, the FuryX isn't the greatest overclocker anyhow. There are a lot of things stacked against a Fury X compared to a 1070; it hasn't got a prayer when all are overclocked. And I'd say the gap is a fair bit more than the 5% over the FuryX that you alluded to. I'm willing to bet you were very conservative with that number.

I understand Tential says he will use Freesync, so it kind of renders this all moot, but still, FuryX at 4K? nah brah.
 

ControlD

Diamond Member
Apr 25, 2005
5,440
44
91
By "2K" do you mean 2560x1440? Not an accurate label, but just want to make sure...

If so, I agree 1440p 27-inch is the sweet spot. The setting reductions required to push 4K are pretty steep, plus you lose the high refresh rate.

Yes, 2560x1440 was what I meant. I keep seeing it referred to as "2K" when I should have said 1440p.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
Err, I don't see how it's bad for me to do my due diligence and know which current games I'll run into issues with when using ultra textures.

I don't get the replies in this thread at all. A 4GB Fury X will NEVER saturate in a "real gaming scenario" ???
What does that even mean? As opposed to a fake gaming scenario?
4GB FuryX will be fine at 4K but a 3GB 1060 is the worst buy you can ever make even in some games at 1080p as I've seen spoken in other threads?


Same explanation for both.
Another VRAM thread with the same folks regurgitating the same false information like gospel. For example: "a 1060 3GB is not enough VRAM for 1080p." smh.



First, stop confusing an FPS limitation with a VRAM limitation.
In today's gaming (not a benchmark scenario) at 4K resolution, a single FuryX will choke on a playable-FPS limitation before any VRAM limitation is ever reached. FuryX x2, same end result. FuryX x3 is where the tide starts to change.
So unless you are planning on running 4K with either FuryX x3 or FuryX x4, there is no need to lose sleep.



Lastly:
VRAM is a cheap upgrade when available. It's your money; spend it as you see fit. Just don't claim that more VRAM is needed unless your specific situation warrants it. As for the future-proof folks: you are better off buying Nvidia stock.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Aren't the AMD Fury cards all limited to HDMI 1.4? Won't that always limit you to 30FPS regardless of vram use or GPU grunt needed? I'd wait for or buy an HDMI 2.0 card if that is the case.
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
Aren't the AMD Fury cards all limited to HDMI 1.4? Won't that always limit you to 30FPS regardless of vram use or GPU grunt needed? I'd wait for or buy an HDMI 2.0 card if that is the case.

All modern AMD GPUs have DisplayPort, which allows 4K 60 FPS.
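For anyone curious about the math behind that, a back-of-the-envelope check is below. It is only a sketch: the link rates are the published effective rates after 8b/10b encoding, and the ~533 MHz reduced-blanking pixel clock for 4K60 is approximate.

```python
# Rough bandwidth check for 4K @ 60 Hz over HDMI 1.4 vs DisplayPort 1.2.
BITS_PER_PIXEL = 24          # 8 bits per channel, RGB
PIXEL_CLOCK_4K60 = 533e6     # ~533 MHz with reduced blanking (approximate)

required_gbps = PIXEL_CLOCK_4K60 * BITS_PER_PIXEL / 1e9   # ~12.8 Gbps

links = {
    "HDMI 1.4 (8.16 Gbps effective)": 8.16,
    "DisplayPort 1.2 HBR2 (17.28 Gbps effective)": 17.28,
}

print(f"4K @ 60 Hz, 24 bpp needs roughly {required_gbps:.1f} Gbps")
for name, capacity in links.items():
    verdict = "fits" if capacity >= required_gbps else "does not fit -> drop to 30 Hz"
    print(f"  {name}: {verdict}")
```

That is why Fury cards with only HDMI 1.4 are stuck at 4K30 over HDMI but handle 4K60 fine over DisplayPort 1.2.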
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Same explanation for both.
Another VRAM thread with the same folks regurgitating the same false information like gospel. For example: "a 1060 3GB is not enough VRAM for 1080p." smh.



First, stop confusing an FPS limitation with a VRAM limitation.
In today's gaming (not a benchmark scenario) at 4K resolution, a single FuryX will choke on a playable-FPS limitation before any VRAM limitation is ever reached. FuryX x2, same end result. FuryX x3 is where the tide starts to change.
So unless you are planning on running 4K with either FuryX x3 or FuryX x4, there is no need to lose sleep.



Lastly:
VRAM is a cheap upgrade when available. It's your money; spend it as you see fit. Just don't claim that more VRAM is needed unless your specific situation warrants it. As for the future-proof folks: you are better off buying Nvidia stock.

I missed the boat on Nvidia stock. I literally posted here that I'd rather buy Nvidia stock than Pascal... If I had done that, I'd be able to afford a GTX 1080 Ti. Might still buy, though, since I expect them to continue to blow past earnings estimates while there is no competition. Putting my money into Nvidia stock was the best option back then, but I missed out on my own recommendation. Sucks to suck.

I'm not sure I agree with you on the vram front though.

I have a FreeSync monitor (XL2730Z) and used to have a 390X. Loved FreeSync. Upgraded around 5 months ago to a 1070, using the same monitor. I haven't noticed losing FreeSync at all, due to the massively increased FPS.

This is my biggest fear.

FreeSync vs. GPU horsepower and VRAM is basically what it boils down to when it's Fury X vs. 1070.

To me it seems like 2K (** edit ** 1440p) is the current sweet spot for a 28"/29" monitor size. You get a decent step up in quality and desktop space without needing a monster GPU to run everything. Still, I am interested in seeing where the OP ends up going with this, as I have been mulling over the same upgrades for a while now myself.

I'm most likely going to go with a Fury X and wait for the Qnix deal to return, since I missed that last night.

I'll still need to cry myself to sleep though after a fury x purchase.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Aren't the AMD Fury cards all limited to HDMI 1.4? Won't that always limit you to 30FPS regardless of vram use or GPU grunt needed? I'd wait for or buy an HDMI 2.0 card if that is the case.
The freesync displays are all using DisplayPort, except like 1 or 2 that just came out that support HDMI 2.0 Freesync.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
I'm not sure I agree with you on the vram front though.

Will give it another poke; then you are on your own after this.



Also, stop confusing VRAM required with VRAM cache.

As for VRAM required: rest assured that once you take playable, enjoyable gameplay (aka FPS) into consideration, you will not exceed 4GB of required VRAM at 4K on any "single" GPU, Titan X Pascal included. As for VRAM cache: games will use as much VRAM as is available, and that cache does help improve frame times because textures are already in GPU memory. Given that, do buy the bigger-VRAM version if available, but claiming it is required is a fallacy unless your specific application warrants it.

By the way, when you hit the VRAM limit, regardless of resolution, even if you are running quad Titan X Pascal, you will know: frames will dip from 100 FPS to low single digits.



Anyway, I had your exact same question a year or so ago. I tried to ask a forum member here who had this exact configuration, but he/she was too busy promoting himself/herself to take time out of his/her busy schedule to help out another forum member. So I have personally spent my own coin to find out that 4GB of VRAM is not a limiting factor in an actual playable, enjoyable gaming scenario. Clearly not regurgitating VRAM gospel.
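If anyone wants to check the required-versus-cached distinction on their own system instead of taking either side's word for it, a rough logging sketch is below. It assumes an NVIDIA card, since it just shells out to nvidia-smi; on an AMD card you would use GPU-Z or Afterburner logging instead. Keep in mind it reports allocated VRAM, which includes cached textures, so a high reading by itself does not prove the game requires that much.

```python
# Rough VRAM-usage logger (NVIDIA only): polls nvidia-smi once a second while a game runs.
# The reading is *allocated* VRAM (required data plus cache), not the true requirement.
import subprocess
import time

def vram_used_mib() -> int:
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used",
        "--format=csv,noheader,nounits",
    ])
    # First line = first GPU; value is reported in MiB.
    return int(out.decode().strip().splitlines()[0])

peak = 0
try:
    while True:
        used = vram_used_mib()
        peak = max(peak, used)
        print(f"VRAM used: {used} MiB (peak {peak} MiB)")
        time.sleep(1)
except KeyboardInterrupt:
    print(f"Peak allocated VRAM this session: {peak} MiB")
```

The telltale sign of a real VRAM shortfall is the one described above: frame times collapse when the limit is hit, not just a high allocation number.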
 

Mr Evil

Senior member
Jul 24, 2015
464
187
116
mrevil.asvachin.com
...you will not exceed 4GB of required VRAM at 4K on any "single" GPU...
That doesn't appear to be true. There's the one example I gave earlier in this thread, and I've played other games that also suffered from hitching when turning textures up too high. The big performance hit from AA is probably due to VRAM limits too.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
All modern AMD GPUs have DisplayPort, which allows 4K 60 FPS.


The freesync displays are all using DisplayPort, except like 1 or 2 that just came out that support HDMI 2.0 Freesync.


Thanks, I thought DP was limited to 30FPS too. Good to know!

Now that I think of it, it wouldn't make sense for them to sell a high end video card that cannot output 4K 60FPS in some way... not sure why I thought DP couldn't do that. Anyway... moving on. :)
 

SlickR12345

Senior member
Jan 9, 2010
542
44
91
www.clubvalenciacf.com
I suspect that for 2016-2017 era games you will find you will have to use the High texture setting rather than Ultra.
I don't get the replies in this thread at all. A 4GB Fury X will NEVER saturate in a "real gaming scenario" ???
What does that even mean? As opposed to a fake gaming scenario?
4GB FuryX will be fine at 4K but a 3GB 1060 is the worst buy you can ever make even in some games at 1080p as I've seen spoken in other threads?

FuryX 4GB will be fine for most games at 1080 and even 1440 for the most part, but 4K is pushing it. I'd go for a 1070 for 4K. Not much more money than the FuryX with more horsepower and twice the memory.
It's not the same comparing a slow 3GB GDDR5 buffer to 4GB of ultra-high-speed HBM. The 4GB of HBM can process data quicker and avoid being overflowed. Of course there is no replacement for a bigger buffer; if a game needs 6-7GB it will bog down the 4GB card. But still, a 4GB HBM card should be considered more like a 5GB or 5.5GB card.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
By "2K" do you mean 2560x1440? Not an accurate label, but just want to make sure...

If so, I agree 1440p 27-inch is the sweet spot. The setting reductions required to push 4K are pretty steep, plus you lose the high refresh rate.

Yes, 2560x1440 was what I meant. I keep seeing it referred to as "2K" when I should have said 1440p.

Actually, the resources required to push 1440p are almost in the same ballpark as 4K...

Wait: by "1440p" do you mean QHD? 1440p is not an accurate label either, since UWQHD (3440x1440) can also be called 1440p. And I don't even know why you all write "p" after the vertical component of your resolution.

Just my 2 cents. There is already an accurate scheme for describing a monitor's resolution and aspect ratio, and it doesn't require you to write a useless "p" after everything.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
It's not the same comparing a slow 3GB GDDR5 buffer to 4GB of ultra-high-speed HBM. The 4GB of HBM can process data quicker and avoid being overflowed. Of course there is no replacement for a bigger buffer; if a game needs 6-7GB it will bog down the 4GB card. But still, a 4GB HBM card should be considered more like a 5GB or 5.5GB card.

The major bottleneck when a GPU runs out of VRAM will be accessing virtual memory (system RAM or HDD/SSD) over PCIe. Since both GDDR5 and HBM operate orders of magnitude faster than data can be refilled from virtual memory, I think you're overestimating the difference between HBM and GDDR5 in a VRAM-limited situation.

That being said, I've heard AMD is putting a lot of effort into optimising Fury's drivers and memory management for just such a situation.

Personally I think 4GB should be enough for a solid 4K experience for the next while with a little tweaking of settings.
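To put rough numbers on that, here is a quick sketch using spec-sheet peak figures (the 980 Ti is included purely as a GDDR5 reference point, and real-world PCIe throughput is lower than the theoretical value used here):

```python
# Peak bandwidth comparison: once a game spills out of VRAM, every refill has to
# cross PCIe, which is the narrow pipe regardless of whether local memory is HBM or GDDR5.
PEAK_BANDWIDTH_GB_S = {
    "Fury X HBM (4096-bit @ 500 MHz)": 512.0,
    "980 Ti GDDR5 (384-bit @ 7 Gbps)": 336.5,
    "PCIe 3.0 x16 (per direction)": 15.75,
}

pcie = PEAK_BANDWIDTH_GB_S["PCIe 3.0 x16 (per direction)"]
for name, gb_s in PEAK_BANDWIDTH_GB_S.items():
    print(f"{name}: {gb_s:6.1f} GB/s ({gb_s / pcie:4.1f}x PCIe)")
```

Either way, the gap between 512 GB/s and 336 GB/s is dwarfed by the drop to ~16 GB/s once data has to come over the bus, which is the point being made here.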
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I don't even know anymore, bacon. I just want to know what games have issues so I can get an idea of what I can't play, and see if any of them are ones where I can just say, "hey, that's a game I'd like to play, I'll shelve it for 2018."
I dunno, this really sucks. I didn't expect Vega to be this far away, to the point that I'd have to seriously consider a Fury X at the current $320 sale it's on.

GTA 5 with mods. I really want that..
Skyrim remastered with mods
DmC
Batman: Arkham Origins
XCOM 2
Final Fantasy games after 10
Infinite
Assassin's Creed Unity
Metal Gear Solid 5
Inquisition
Anno 2070
Doom


I get that there are some games I'll have to hold off on. I just want to play some 4K games now.

I don't think I'll see a flagship AMD GPU that I'm actually impressed enough by to buy, so I guess I'll just enjoy the bargain close-out deals instead.


What card do you currently have?

Most of the above games have CF support; if you are on an R9 290, I would get a cheap used 290 to CrossFire until Vega releases.
 

ControlD

Diamond Member
Apr 25, 2005
5,440
44
91
Actually, the resources required to push 1440p are almost in the same ballpark as 4K...

Wait: by "1440p" do you mean QHD? 1440p is not an accurate label either, since UWQHD (3440x1440) can also be called 1440p. And I don't even know why you all write "p" after the vertical component of your resolution.

Just my 2 cents. There is already an accurate scheme for describing a monitor's resolution and aspect ratio, and it doesn't require you to write a useless "p" after everything.

Let's just go with numbers then. I was referring to 2560x1440 as I stated in my post.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
That doesn't appear to be true. There's the one example I gave earlier in this thread, and I've played other games that also suffered from hitching when turning textures up too high. The big performance hit from AA is probably due to VRAM limits too.

In your post #19 you stated your single Fury at 4K could barely maintain an average of 60 FPS (forget whatever the minimum FPS is) at high settings (forget ultra settings) without any AA.

No point in further jacking up VRAM use by cranking textures up to ultra, or cranking up AA, or both. smh.
 

Jackie60

Member
Aug 11, 2006
118
46
101
I would wait if I were you, Tential. Also, I wouldn't bother with a 28" 4K monitor. I've had a 4K 28-inch and I'm now using a 4K 40-inch monitor, and the 28" one was very meh; it's not big enough to really notice the 4K.
If you're going 4K, go big or go home, imho. Also, I wouldn't recommend a single 1070/Fury X for 4K either. I ran 980 Ti SLI and that was good, and I could play at decent settings, but a single 980 Ti was a pretty poor experience.
I really noticed having to dial down settings and still not getting a decent gaming experience in GTA5 (without any mods!). I'd really recommend a minimum of 980 Ti SLI for 4K, otherwise I think you'll be disappointed. Waiting a few months may be a pain, but spending now for mediocre performance will be annoying after having spent all that money. I genuinely think 4K needs Titan XP SLI levels of performance to really shine, but a minimum of 1070/980 Ti SLI.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I'd personally get the cheapest Fury X I could find used and ride out this wave. Vega should be here soon. Or you could get a PS4 Pro for $400 and rent the games you are interested in.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
This assertion that somehow HBM counts as more memory than any other memory type needs to die. No, it doesn't. 4GB of HBM holds exactly 4GB, no more, no less. The Fury does undoubtedly perform better in VRAM-limited scenarios than other 4GB cards, but that's due to specific driver tuning. It goes to show the power of software in all of this.
 

[DHT]Osiris

Lifer
Dec 15, 2015
17,565
16,931
146
you will not exceed 4GB of required VRAM at 4K on any "single" GPU.

False. I can break 4GB right now @ 1440p on a lightly modded SkyrimSE. That's a refresh of a 5-year-old game, using a 2(?)-year-old engine.

You might, MIGHT, be able to caveat that statement with "without modding", but even then I'm sure there are a few suspect games that break 4GB @ 4K.

It's not the same comparing a slow 3GB GDDR5 buffer to 4GB of ultra-high-speed HBM. The 4GB of HBM can process data quicker and avoid being overflowed. Of course there is no replacement for a bigger buffer; if a game needs 6-7GB it will bog down the 4GB card. But still, a 4GB HBM card should be considered more like a 5GB or 5.5GB card.

False, memory doesn't work that way. It's either in-memory or not in-memory. Speed just makes the transition faster.

Imho, 4GB is 1080p land, 6GB is 1440p land, and 8GB is 4K land. Allow some fudge room for certain titles, mods, and release date (newer games are going to be squeezing up toward 6GB @ 1080p before long).
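As a footnote on why texture settings, rather than resolution itself, dominate the VRAM budget, here is a ballpark framebuffer calculation. It assumes 4-byte RGBA8 render targets and an arbitrary dozen full-resolution targets, so treat it as an illustration, not a measurement of any particular engine.

```python
# Ballpark estimate of the resolution-dependent part of VRAM use (render targets only).
def target_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one full-resolution RGBA8 render target in MiB."""
    return width * height * bytes_per_pixel / 2**20

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in RESOLUTIONS.items():
    one = target_mib(w, h)
    dozen = 12 * one  # assume a deferred renderer with ~a dozen full-res targets
    print(f"{name}: {one:5.1f} MiB per target, ~{dozen:4.0f} MiB for 12 targets")
```

Even at 4K the resolution-proportional buffers come to a few hundred MiB; the rest of a 4GB or 8GB card goes to textures, geometry, and whatever the engine chooses to cache, which is why texture quality and mods move the needle far more than the jump from 1440p to 4K.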