master_shake_ (Diamond Member, joined May 22, 2012)
You just used a test from August 2015?
One where the game wasn't using all the ram but still had issues?
Look how much better Far Cry Primal runs compared to Far Cry 4. That's No GW vs GW while providing the same details.
The only time it has issues with the memory limit is when it's already failing to push playable frame rates anyway.
You missed the whole point of why I posted those graphs... My point was to show that frame rate graphs are insufficient when it comes to showing discrepancies with VRAM performance. Only frame time graphs will suffice.
The same "it is too slow anyway" argument that is inevitably dismissed when an nVidia card shows poor frametimes. Seems it is valid for AMD though. And at the limit settings Fury X definitely shows vram limitations, but I dont hear the same arguments that in a year or two a user will regret buying it.You just used a test from August 2015?
One where the game wasn't using all the ram but still had issues?
How about a more recent review, from the 1080 launch:
Far Cry Primal:
[FCAT frame time graph]
BF Hardline:
[FCAT frame time graph]
GTA V:
[FCAT frame time graph]
Looks like Fury is doing fine.
http://www.guru3d.com/articles_pages/fcat_geforce_gtx_1080_framepacing_review,15.html
Since you didn't link your source, here it was:
http://www.extremetech.com/gaming/2...x-faces-off-with-nvidias-gtx-980-ti-titan-x/2
You also left out some noticeable quotes from that article:
And for GTA V:
And their conclusion:
Look how much better Far Cry Primal runs compared to Far Cry 4. That's No GW vs GW while providing the same details.
The only time it has issues with the memory limit is when it's already failing to push playable frame rates anyway.
You missed the whole point of why I posted those graphs... My point was to show that frame rate graphs are insufficient when it comes to showing discrepancies with VRAM performance. Only frame time graphs will suffice.
Far Cry Primal running better on AMD has nothing to do with GW, or the lack thereof. GW can be disabled. GTA V has both GameWorks and Gaming Evolved enhancements, yet it runs fine on AMD and NVidia.
Again, GW effects can usually be toggled. If an effect can't be toggled, then it's running on the CPU, which shouldn't matter.
Tech Report did a similar test in Far Cry 4 at 4K, but they used the stock Ultra setting, which doesn't enable GW effects. Fury X pumps out playable frames per second when you look at the frame rate graphs, even beating the GTX 980 Ti and the Maxwell Titan X.
[frame rate graph - Far Cry 4 at 4K]
But when you look at the frame time graphs, you see a different picture. Notice the much lower frame times for the 390X? Curiously, the GTX 980, which also has 4GB, doesn't seem to suffer from the massive frame time spikes that the AMD 4GB cards have.
So it goes back to what I said in my last post. Driver optimizations (among other things) have a lot to do with how well GPUs with smaller frame buffers can circumvent VRAM capacity issues.
[frame time graph - Far Cry 4 at 4K]
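To make the frame rate vs frame time point concrete, here's a minimal sketch (not taken from any of the reviews above; it assumes a hypothetical "frametimes.txt" log with one per-frame render time in milliseconds per line, roughly what FRAPS/FCAT-style tools export) showing why an average FPS figure can look fine while the frame time plot exposes VRAM-related spikes:

```python
# Rough sketch: average FPS vs frame time spikes.
# Assumes a hypothetical "frametimes.txt" with one frame time in
# milliseconds per line (the kind of log FRAPS/FCAT-style tools produce).
import statistics

def summarize(path):
    with open(path) as f:
        frame_times = [float(line) for line in f if line.strip()]

    avg_fps = 1000.0 / statistics.mean(frame_times)
    median = statistics.median(frame_times)
    p99 = statistics.quantiles(frame_times, n=100)[98]  # 99th percentile frame time
    # "Spikes": frames that took more than twice the median time to render.
    spikes = sum(1 for t in frame_times if t > 2 * median)

    print(f"Average FPS:          {avg_fps:.1f}")
    print(f"Median frame time:    {median:.2f} ms")
    print(f"99th pct frame time:  {p99:.2f} ms")
    print(f"Frames > 2x median:   {spikes} of {len(frame_times)}")

summarize("frametimes.txt")
```

Two cards can report nearly identical average FPS while one of them has a fat tail of 50+ ms frames - which is exactly the difference the frame time graphs are exposing.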
The same "it is too slow anyway" argument that is inevitably dismissed when an nVidia card shows poor frametimes. Seems it is valid for AMD though. .
And at the limit settings Fury X definitely shows vram limitations, but I dont hear the same arguments that in a year or two a user will regret buying it
I'm really weirded out by the basis of your argument. Are you arguing semantics, or is this some kind of misunderstanding about what hypothetical statements mean?
Granted it is an old card, but it is still top of the line, so what will happen in 2 years? Will 4GB, even of HBM, be enough? That is the main argument being used to mercilessly bash the 3GB 1060, but I don't see AMD supporters casting the same doubts on Fury X. To be honest, I would much rather buy a 3GB lower/mid-range card than a top-of-the-line card like Fury X with 4GB, even if it is HBM.

Are you honestly comparing running at 4K max settings with low FPS for all cards vs running 1080p at mostly ~60fps with stuttering dips due to lack of RAM?
I mean, one is "unplayable settings" - settings that no one would run. The other is common settings that everyone would normally be running, except those cards can't due to lack of VRAM. Otherwise the FPS is fine, as the card has the compute power; it just lacks VRAM.
Because it's balanced. The VRAM issues don't occur until after the point where the card is already failing due to lack of compute and raw power. The 1060 3GB has the raw power but is limited by its VRAM. Fury is more balanced, so by the time the VRAM is a limit, you'll have to be turning down settings for playable FPS anyway.
The same "it is too slow anyway" argument that is inevitably dismissed when an nVidia card shows poor frame times. Seems it is valid for AMD, though. And at the limit settings, Fury X definitely shows VRAM limitations, but I don't hear the same arguments that in a year or two a user will regret buying it.
Yeah, I always used to get angry when people, including supposed forum experts, used the phrase "it's too slow anyway" for cards like the GTX 680, 770 and 960, even though I don't need to show you any proof that these cards can easily utilize more than 2GB. Those who bought the 4GB versions despite everyone advising against it are now able to use high textures, while those with 2GB have to make do with lower-quality textures in new games or suffer frame rates tanking.
Same thing now. People are downplaying the capability of the RX 470 to utilize more than 4GB without taking into account future titles. If I had to choose between an RX 470 8GB or an RX 480 4GB at the same price, I would wisely pick the RX 470: 10% less performance now, but the ability to use higher-quality textures two years from now, while the RX 480 will have to make do with lower-quality textures.
Most people in the real world keep their cards for 4-5 years, unlike these forums where people change cards every year. So for 95% of people out there, more VRAM is better than slightly more performance at the same price.
So, long story short, is 6GB or 8GB the new 1080p standard if you want to use 4K textures etc.? (Rough numbers sketched below.)
But on the other hand, I can't remember the test I linked in another thread, but a test done in early 2016 showed that, despite a lot of feeling otherwise, it was very debatable whether the 4GB 960 was in fact significantly faster than the 2GB version (except maybe in one or two games).
Granted it is an old card, but it is still top of the line, so what will happen in 2 years?
Also, regarding the Fury: even though the excessive bandwidth might help, AMD said they were optimising on a game-by-game basis, so wait until Vega drops and it's EOL - I personally think we will see Fury also hitting issues then too. Either way, at least 6GB of VRAM is what people need to be looking at if they are spending around £200 and above.
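As a rough back-of-envelope for the "4K textures" question above (my own numbers, not from any review linked in this thread): a single 4096x4096 texture is 64 MiB uncompressed in RGBA8, roughly 85 MiB with a full mip chain, and about 21 MiB with a typical 1-byte-per-texel block compression format (BC3/BC7) plus mips - so a couple of hundred unique 4K textures resident at once already eats several gigabytes before frame buffers, geometry and render targets are counted.

```python
# Back-of-envelope VRAM cost of a single square texture.
# Assumptions (mine, for illustration): full mip chain adds ~1/3 extra,
# RGBA8 = 4 bytes/texel, BC3/BC7 block compression = 1 byte/texel.
def texture_mib(size, bytes_per_texel, mips=True):
    base = size * size * bytes_per_texel
    total = base * 4 / 3 if mips else base   # mip chain ~ +33%
    return total / (1024 ** 2)

for size in (2048, 4096):
    print(f"{size}x{size}  RGBA8: {texture_mib(size, 4):6.1f} MiB   "
          f"BC-compressed: {texture_mib(size, 1):5.1f} MiB")
```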
Yeah, I always used to get angry when people, including supposed forum experts, used the phrase "it's too slow anyway" for cards like the GTX 680, 770 and 960, even though I don't need to show you any proof that these cards can easily utilize more than 2GB. Those who bought the 4GB versions despite everyone advising against it are now able to use high textures, while those with 2GB have to make do with lower-quality textures in new games or suffer frame rates tanking.
In 2 years it won't have the compute power to run those games with maxed settings, so you'll be decreasing the IQ before the memory limits occur anyway. Also, HBM works much better than GDDR5, so you can't compare them as equals.
I am not sure the data backs up the fact that the 960 can really utilize 4GB of VRAM.
Anyone still rocking a 2GB 960? How are you guys doing these days?

Sold my EVGA FTW. The first time I knew it was not worth keeping anymore was Gears Ultimate. Some of the textures were terrible or missing with 2GB. I thought at first they forgot to update the assets, but I quickly got over my derp and realized it was the VRAM holding it back. I do not need the best settings to be happy, but when something looks worse than console to me, that is where I draw the line.
The same "it is too slow anyway" argument that is inevitably dismissed when an nVidia card shows poor frame times. Seems it is valid for AMD, though. And at the limit settings, Fury X definitely shows VRAM limitations, but I don't hear the same arguments that in a year or two a user will regret buying it.

Let me go on record and say I think a person buying Fury today will regret it in a year or two versus the 1070.
Please elaborate on how you think the bold has anything to do with memory capacity.
