ComputerBase - 3GB vs 4GB vs 6GB vs 8GB GDDR5 VRAM Frametime Testing


Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You just used a test from August 2015?

One where the game wasn't using all the RAM but still had issues?

You missed the whole point of why I posted those graphs... My point was to show that frame rate graphs are insufficient when it comes to showing discrepancies in VRAM performance. Only frame time graphs will suffice.

Look how much better Far Cry Primal runs compared to Far Cry 4. That's no GameWorks vs. GameWorks at the same detail settings.

Far Cry Primal running better on AMD has nothing to do with GW, or the lack thereof. GW can be disabled. GTA V has both GameWorks and Gaming Evolved enhancements, yet it runs fine on both AMD and Nvidia.

Again, GW effects can usually be toggled. If they can't be toggled, then they're running on the CPU, which shouldn't matter.

The only time it has issues with the memory limit is when it's already failing to push playable frame rates anyway.

Tech Report did a similar test in Far Cry 4 at 4K, but they used the stock Ultra preset, which doesn't enable GW effects. The Fury X pumps out playable frames per second when you look at the frame rate graphs, even beating the GTX 980 Ti and the Maxwell Titan X.

[Chart: fc4-fps.gif - Far Cry 4 4K frame rate results]


But when you look at the frame time graphs, you see a different picture. Notice the much lower frame times for the 390X? Curiously, the GTX 980, which has 4GB, doesn't seem to suffer from the massive frame time spikes that the AMD 4GB cards have.

So it goes back to what I said in my last post. Driver optimizations (among other things) have a lot to do with how well GPUs with smaller frame buffers can circumvent VRAM capacity issues.
[Chart: fc4-33ms.gif - Far Cry 4 time spent beyond 33 ms]
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
You missed the whole point of why I posted those graphs... My point was to show that frame rate graphs are insufficient when it comes to showing discrepancies in VRAM performance. Only frame time graphs will suffice.

No, I fully agree with you there. I wish all sites used frame-time-over-time graphs instead of just simple min/max/avg (sometimes not even all three).
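If anyone wants to roll their own over-time numbers, here's a rough Python sketch (my own, not from ComputerBase or any of the linked reviews) of how the usual metrics fall out of a FRAPS/PresentMon-style frame time log. The file name and the one-value-per-line layout are just assumptions:

```python
# Rough sketch: summarize a frame time log the way review sites do.
# Assumes a plain text file with one frame time per line, in milliseconds
# (e.g. a FRAPS "frametimes" export or PresentMon's msBetweenPresents column).
import statistics

def summarize(path="frametimes.txt"):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]

    total_s = sum(frame_ms) / 1000.0
    avg_fps = len(frame_ms) / total_s

    # Percentile frame times: sort and look at the slow tail.
    ordered = sorted(frame_ms)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]

    # "Time spent beyond X ms": how much of the run fell below 60 / 30 FPS pacing.
    beyond_16 = sum(t - 16.7 for t in frame_ms if t > 16.7)
    beyond_33 = sum(t - 33.3 for t in frame_ms if t > 33.3)

    print(f"Frames: {len(frame_ms)}  Avg FPS: {avg_fps:.1f}")
    print(f"Median frame time: {statistics.median(frame_ms):.2f} ms")
    print(f"99th percentile frame time: {p99:.2f} ms")
    print(f"Time spent beyond 16.7 ms: {beyond_16:.0f} ms")
    print(f"Time spent beyond 33.3 ms: {beyond_33:.0f} ms")

if __name__ == "__main__":
    summarize()
```

Two cards can post the same average FPS while one of them racks up whole seconds beyond 33.3 ms, and that tail is exactly the VRAM stutter the plain FPS bars hide.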
 
Aug 11, 2008
10,451
642
126
You just used a test from August 2015?

One where the game wasn't using all the RAM but still had issues?

[Image: FC4-RAM.png - Far Cry 4 VRAM usage]


How about a more recent review, from the 1080 launch:

Far Cry Primal:
[FCAT frame time chart]


BF Hardline:

[FCAT frame time chart]


GTA V:

[FCAT frame time chart]


Looks like Fury is doing fine.

http://www.guru3d.com/articles_pages/fcat_geforce_gtx_1080_framepacing_review,15.html

Since you didn't link your source, here it is:

http://www.extremetech.com/gaming/2...x-faces-off-with-nvidias-gtx-980-ti-titan-x/2

You also left out some noticeable quotes from that article:



And for GTA V:






And their conclusion:



Look how much better Far Cry Primal runs compared to Far Cry 4. That's no GameWorks vs. GameWorks at the same detail settings.

The only time it has issues with the memory limit is when it's already failing to push playable frame rates anyway.
The same "it is too slow anyway" argument that is inevitably dismissed when an Nvidia card shows poor frame times. Seems it is valid for AMD, though. And at the limit settings the Fury X definitely shows VRAM limitations, but I don't hear the same arguments that in a year or two a user will regret buying it.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Do I really have to wait for Vega for this topic to die? I pray Vega comes with no less than 8GB of VRAM.

While I'm sure the discussion here is purely focused on performance, there is still a great portion of buyers/gamers who normally tone down a setting or two, and some people (in my experience) put up with terrible frame rates. They do this to themselves, and outside of using Raptr or NV Experience, I don't think they really care.

GF and I sit across from each other, and every time I look at her stuttering low FPS, I get nauseous. But she doesn't seem to care. (Since she's reading over my shoulder: "I get 40-50 FPS on everything I do ON ULTRA!!! ON EVERYTHING I DO") But she doesn't have adaptive sync... so stutter!!! ("No, of course I don't have adaptive sync, I didn't drop $1500 on a monitor")

Pray for me.
 

master_shake_

Diamond Member
May 22, 2012
6,425
292
121
You missed the whole point of why I posted those graphs... My point was to show that frame rate graphs are insufficient when it comes to showing discrepancies in VRAM performance. Only frame time graphs will suffice.



Far Cry Primal running better on AMD has nothing to do with GW, or the lack thereof. GW can be disabled. GTA V has both GameWorks and Gaming Evolved enhancements, yet it runs fine on both AMD and Nvidia.

Again, GW effects can usually be toggled. If they can't be toggled, then they're running on the CPU, which shouldn't matter.



Tech Report did a similar test in Far Cry 4 at 4K, but they used the stock Ultra preset, which doesn't enable GW effects. The Fury X pumps out playable frames per second when you look at the frame rate graphs, even beating the GTX 980 Ti and the Maxwell Titan X.

[Chart: fc4-fps.gif - Far Cry 4 4K frame rate results]


But when you look at the frame time graphs, you see a different picture. Notice the much lower frame times for the 390X? Curiously, the GTX 980, which has 4GB, doesn't seem to suffer from the massive frame time spikes that the AMD 4GB cards have.

So it goes back to what I said in my last post. Driver optimizations (among other things) have a lot to do with how well GPUs with smaller frame buffers can circumvent VRAM capacity issues.
[Chart: fc4-33ms.gif - Far Cry 4 time spent beyond 33 ms]

It's a real shame they didn't include the 290X's frame time results.

Also, lol at the 970... I think we all know what happened there.

Also, I wonder if they used a stock 290X with the crippling cooler...
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
The same "it is too slow anyway" argument that is inevitably dismissed when an nVidia card shows poor frametimes. Seems it is valid for AMD though. .

Are you honestly comparing running at 4K max settings with low FPS for all cards vs. running 1080p at mostly ~60 FPS with stuttering dips due to lack of VRAM?

I mean, one is "unplayable settings" - settings that no one would run. The other is common settings that everyone would normally be running, except those cards can't due to lack of VRAM. Otherwise the FPS is fine, as the card has the compute power; it's just lacking in VRAM.

And at the limit settings the Fury X definitely shows VRAM limitations, but I don't hear the same arguments that in a year or two a user will regret buying it.

Because it's balanced. The VRAM issues don't occur until after the point where the card is failing due to lack of compute and raw power. The 1060 3GB has the raw power, but is limited by its VRAM. Fury is more balanced, so by the time the VRAM is a limit, you'll have to be turning down settings for playable FPS anyway.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
I'm really weirded out by the basis of your argumentation. Are you riding semantics, or is this some kind of misunderstanding of what hypothetical statements mean?

Apparently your confusion stems from the fact that you made an argument and then immediately forgot what your own argument was.

First you said that even when the low VRAM card wasn't any slower than the high VRAM card, it was still worse off. I said that this was bunk, since the low VRAM card would be cheaper and thus better off. You then apparently forgot your original argument and started arguing that we can't know whether or not the low VRAM card will be slower (even though the premise of your original argument assumed this).

There isn't any semantics about this: you made a hypothetical statement (low VRAM cards that aren't slower than high VRAM cards are still worse), which I considered incorrect, and then when I pointed this out you started arguing something completely different (that we can't know ahead of time whether or not the low VRAM card will match the high VRAM card).

In other words, your first hypothetical statement assumed that we already knew that the low VRAM card wasn't any slower (i.e. it assumed that we had won the "VRAM roulette"), but all your subsequent arguments were based on us not knowing this. So you were arguing two completely different scenarios.

I really don't think I can make it any clearer than this.
 
  • Like
Reactions: zentan

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
So, long story short, is 6GB or 8GB the new 1080p standard if you want to use 4K textures, etc.?
 

kondziowy

Senior member
Feb 19, 2016
212
188
116
I have not seen slowdowns caused by VRAM with maxed textures and maxed everything else on 6GB cards, so I guess this is the new standard for 1080p now?

8GB cards are available new starting at $250+ today, so it will be interesting to see how long that will last :sweat:
 
Aug 11, 2008
10,451
642
126
Are you honestly comparing running at 4K max settings with low FPS for all cards vs. running 1080p at mostly ~60 FPS with stuttering dips due to lack of VRAM?

I mean, one is "unplayable settings" - settings that no one would run. The other is common settings that everyone would normally be running, except those cards can't due to lack of VRAM. Otherwise the FPS is fine, as the card has the compute power; it's just lacking in VRAM.



Because it's balanced. The VRAM issues don't occur until after the point where the card is failing due to lack of compute and raw power. The 1060 3GB has the raw power, but is limited by its VRAM. Fury is more balanced, so by the time the VRAM is a limit, you'll have to be turning down settings for playable FPS anyway.
Granted it is an old card, but it is still top of the line, so what will happen in 2 years? Will 4GB, even of HBM, be enough? That is the main argument being used to mercilessly bash the 3GB 1060, but I don't see AMD supporters casting the same doubts on the Fury X. To be honest, I would much rather buy a 3GB lower/mid-range card than a top of the line card like the Fury X with 4GB, even if it is HBM.

Also, how many games have been shown to run at 60 FPS on the 3GB 1060 (Edit: with serious frame time issues)? In the 3 games tested here, the only game that showed severe frame time issues was ME, and it was only running at about 30 FPS, the exact same situation you are using to justify the Fury X.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
The same "it is too slow anyway" argument that is inevitably dismissed when an nVidia card shows poor frametimes. Seems it is valid for AMD though. And at the limit settings Fury X definitely shows vram limitations, but I dont hear the same arguments that in a year or two a user will regret buying it.
Yeah, I always used to get angry when people, including supposed forum experts, used the phrase "it's too slow anyway" for cards like the GTX 680, 770 and 960, even though I don't need to show you any proof that these cards can easily utilize more than 2GB. Those who bought the 4GB versions despite everyone advising against it are now able to use high textures, while those with 2GB have to make do with lower quality textures in new games or suffer their frame rates tanking.
Same thing now. People are downplaying the capability of the RX 470 to utilize more than 4GB without taking into account future titles. If I had to choose between an RX 470 8GB or an RX 480 4GB at the same price, I would wisely pick the RX 470 for 10% less performance now but the ability to use higher quality textures two years from now, while the RX 480 will have to make do with lower quality textures.
Most people in the real world keep their cards for 4-5 years, unlike these forums where people change cards every year. So for 95% of people out there, more VRAM is better than slightly more performance at the same price.
 
Aug 11, 2008
10,451
642
126
Yeah, I always used to get angry when people, including supposed forum experts, used the phrase "it's too slow anyway" for cards like the GTX 680, 770 and 960, even though I don't need to show you any proof that these cards can easily utilize more than 2GB. Those who bought the 4GB versions despite everyone advising against it are now able to use high textures, while those with 2GB have to make do with lower quality textures in new games or suffer their frame rates tanking.
Same thing now. People are downplaying the capability of the RX 470 to utilize more than 4GB without taking into account future titles. If I had to choose between an RX 470 8GB or an RX 480 4GB at the same price, I would wisely pick the RX 470 for 10% less performance now but the ability to use higher quality textures two years from now, while the RX 480 will have to make do with lower quality textures.
Most people in the real world keep their cards for 4-5 years, unlike these forums where people change cards every year. So for 95% of people out there, more VRAM is better than slightly more performance at the same price.

But on the other hand, I can't remember the test I linked in another thread, but a test done in early 2016 showed that, despite a lot of people feeling otherwise, it was very debatable whether the 4GB 960 was in fact significantly faster than the 2GB version (except maybe in one or two games).

Edit: This is not directed at you personally; I basically agree with what you said, with the caveat that I am not sure the data backs up the fact that the 960 can really utilize 4GB of VRAM. It is just the double standard and all-or-nothing thinking that permeates any thread regarding the 1060 3GB that makes my head want to explode. Obviously more VRAM is better, but there is not some magical switch that makes a 3GB card "trash" or a "non-starter" and a 4GB card wonderful and future-proof. Every card is a compromise: with the 1060 3GB, you compromise settings in a few current games, and an unknowable amount of future-proofing, for a cheaper price and overall better performance per dollar in current games. Is that a good compromise? I don't really know, but if posters really want to post the most useful information in the forums, they would do well to post the facts, possible future implications and compromises, and let the reader make his own decision, instead of unconditionally condemning a card and insulting other members who dare to even give a second thought to whether the card might actually be a viable choice.
 

zinfamous

No Lifer
Jul 12, 2006
111,864
31,359
146
So, long story short, is 6GB or 8GB the new 1080p standard if you want to use 4K textures, etc.?

6GB is probably the "safe minimum" for that right now. I think you want to look at the 1060 6GB and 470/480 8GB cards as the lowest-end cards today if you want to push 4K textures at 1080p for the next 2 or 3 years.

Some of the current DX12 and Vulkan games are pushing past 5GB right now with ultra/4K textures.
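As a rough back-of-envelope (my own arithmetic, not from the ComputerBase test) on why "4K textures" chew through VRAM so quickly - the bytes-per-texel figures are the standard uncompressed and block-compressed GPU formats, and the material count at the end is just an illustrative guess:

```python
# Rough VRAM cost of a single square texture; the 4/3 factor approximates a full mip chain.
def texture_mib(size_px, bytes_per_texel, mips=True):
    base = size_px * size_px * bytes_per_texel
    if mips:
        base *= 4 / 3  # each mip level is a quarter the size of the one above it
    return base / (1024 ** 2)

for fmt, bpt in [("RGBA8, uncompressed", 4), ("BC7/DXT5, 1 B/texel", 1), ("BC1/DXT1, 0.5 B/texel", 0.5)]:
    print(f"4096x4096 {fmt}: {texture_mib(4096, bpt):.1f} MiB with mips")
# -> roughly 85.3, 21.3 and 10.7 MiB per texture

# Illustrative scene: ~80 unique materials, each with a 4K albedo + normal + roughness map in BC7.
print(f"80 materials x 3 maps: {80 * 3 * texture_mib(4096, 1) / 1024:.1f} GiB")  # ~5 GiB
```

And that ~5 GiB is before geometry, render targets and other buffers, which is why the ultra texture pool alone can push a game past those readings.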
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
But on the other hand, I can't remember the test I linked in another thread, but a test done in early 2016 showed that, despite a lot of people feeling otherwise, it was very debatable whether the 4GB 960 was in fact significantly faster than the 2GB version (except maybe in one or two games).

You also no doubt saw this in a few discussions about the matter:

https://www.computerbase.de/2015-12/2gb-4gb-gtx-960-r9-380-vram-test/2/#abschnitt_frametimemessungen

Assassin's Creed Unity

[frame time charts]

Dragon Age: Inquisition

[frame time charts]

Far Cry 4

[frame time chart]

GTA V

[frame time charts]

Middle-earth: Shadow of Mordor

[frame time charts]


That is from late 2015, and it's been posted by a few people here (including me), so I'm pretty happy I got the GTX 960 4GB over the 2GB version. Interestingly enough, Nvidia stopped production of the 2GB version and replaced it with the 4GB one.

That all happened in under a year from the GTX 960 launch:

http://wccftech.com/nvidia-discontinue-2gb-versions-geforce-gtx-960/

I actually have used the 2GB card too (a mate's), and the GTX 960 4GB was noticeably smoother in some games when I did a quick comparison a while back.

Looks like we are seeing the same with the GTX 1060 3GB too - I still don't understand the fascination with saving a few pounds with this card. The GTX 1060 6GB will last longer and have better resale and re-use value.

Like I said before, it's going to be fun to see where the GTX 1060 3GB will be in 18 to 24 months' time.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Also, regarding the Fury: even though the excessive bandwidth might help, AMD said they were optimising on a game-by-game basis, so wait until Vega drops and it's EOL - I personally think we will see the Fury also hitting issues then. Either way, at least 6GB of VRAM is what people need to be looking at if they are spending around £200 and above.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Granted it is an old card, but it is still top of the line, so what will happen in 2 years?

In 2 years it won't have the compute power to run those games with maxed settings, so you'll be decreasing the IQ before the memory limits occur anyway. Also, HBM works much better than GDDR5, so you can't compare them as equals.
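The bandwidth side of that is easy to put numbers on: peak bandwidth is just bus width over 8 times the effective per-pin rate. A quick sketch using the published specs (worth remembering that bandwidth doesn't change how much fits in 4GB, only how fast data can be shuffled in and out):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective per-pin rate in Gbps.
# Figures are the published specs for these cards; sustained real-world throughput is lower.
def peak_bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

cards = {
    "R9 Fury X (4GB HBM1)":   (4096, 1.0),  # 4 stacks x 1024-bit, 1 Gbps per pin
    "R9 390X (8GB GDDR5)":    (512, 6.0),
    "GTX 980 Ti (6GB GDDR5)": (384, 7.0),
    "GTX 980 (4GB GDDR5)":    (256, 7.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(bus, rate):.0f} GB/s")
# -> 512, 384, 336 and 224 GB/s respectively
```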
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Also, regarding the Fury: even though the excessive bandwidth might help, AMD said they were optimising on a game-by-game basis, so wait until Vega drops and it's EOL - I personally think we will see the Fury also hitting issues then. Either way, at least 6GB of VRAM is what people need to be looking at if they are spending around £200 and above.

I remember reading Ryan (our Ryan) basically saying that he thinks 6GB of VRAM will be enough for the foreseeable future, until the next generation. It's part of the reason why I went for a 980 Ti, as the used market saw significant price drops in my area. At 1440p, there are already some games that I play which are pushing over 4GB of VRAM.
 

Ranulf

Platinum Member
Jul 18, 2001
2,868
2,520
136
Yeah, I always used to get angry when people, including supposed forum experts, used the phrase "it's too slow anyway" for cards like the GTX 680, 770 and 960, even though I don't need to show you any proof that these cards can easily utilize more than 2GB. Those who bought the 4GB versions despite everyone advising against it are now able to use high textures, while those with 2GB have to make do with lower quality textures in new games or suffer their frame rates tanking.

People bashed the 4GB 960s because they were competing with cheap 290/290Xs that were about $50 more (often with free games) for far better performance. The 4GB 960s started eating into the 2GB after showing up a few weeks later (IIRC) and took over the $230-250 spot. The 960 was a raw deal all around (one that Nvidia took 3+ months to release after the 970/980), and the only bright side is that the 4GB version at least has held its ground, perf-wise, for high textures (yay, I guess). I only picked up a 2GB 960 a year ago because with a free game and a rebate the card was around $150.

The 6GB and 8GB cards this gen at this price level are all overpriced because of greed and supply/demand issues. The 4GB cards barely merit their $199 price tags, let alone a 3GB card with gimped shaders.
 
  • Like
Reactions: RussianSensation

96Firebird

Diamond Member
Nov 8, 2010
5,743
340
126
In 2 years it won't have the compute power to run those games with maxed settings, so you'll be decreasing the IQ before the memory limits occur anyway. Also, HBM works much better than GDDR5, so you can't compare them as equals.

Please elaborate on how you think the bolded part has anything to do with memory capacity.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
I am not sure the data backs up the fact that the 960 can really utilize 4GB of VRAM.

It isn't about using the entire 4GB. Even if the game only needs 2.5GB, the 2GB model is toast.

Personally, the only issue I have with a 3GB card in 2016 is that it has less VRAM than the 3.5GB 970. The 970 is the most popular card on Steam now, so you have to figure developers feel safe making the 970 the baseline for the top settings. If the game is using 3.25GB (because it's optimized for a 970), it doesn't matter that the 4GB card has "too much", because what it really has is "enough" compared to the 3GB card. But I can't prove that example right now on an FPS chart, because we don't have enough data (basically enough FRAPS charts) to find a great example; I can only logically reason that such a problem COULD exist. It is kinda like how they knew the Higgs boson should have been there before they actually found it.

There are basically two types of data points: those we can prove in games today and those we can extrapolate based on what games did before. In games today it's clear that the 3GB 1060 has limits the 2GB 960 didn't have at its launch; that is the first data point. The second data point is how poorly the 2GB 960 aged, and the assumption that those forces will be the same going forward (which would mean that the already limited 3GB model MIGHT age worse than the 2GB 960 did). We can also look at how Nvidia cards age vs. AMD, but I feel that is a separate argument.

If winning the "GPU Debate" was a serious thing, like it had a cash prize for doing so, then the second data point wouldn't be valid as only what we can prove today matters (and a certain Russian would be very rich). But the "winner" of the GPU debate doesn't win much- basically their GPU buying advice will help more people- so I personally think it is ok to put forth the second data point because that might help someone make a decision that is MAYBE more optimal for themselves long-term. Someone else might think "no, I only care what can be proven with today's games" and to that I say that is fine as long as it's their money they are deciding with. You can always ignore a data point you don't feel is valid, but you can't consider a data point you never were exposed to.

What rubs me the wrong way is when people with $500+ GPUs don't consider the quality of the advice they are giving to "budget" GPU buyers, or whether they are giving people valid data points to consider. They say "yeah, the 3GB model is fine, because if they didn't want to turn down settings they should have just bought my card anyway", when in actuality THEIR decision process for buying a flagship card was light years away from the decision process of someone buying a sub-$300 card they are keeping for three or four years. A person buying $500+ flagships who is tuned into this world can't really make a bad decision, because either their card ages well or it doesn't, and they dump it on some poor sucker before the next $500+ flagship comes out 15 months later, while their previous flagship still has some resale value. It is the person buying a $250 card once every three or four years who HAS TO make the best decision possible, because they will most likely be stuck with that card for the longest time. In that way, those who buy the cheapest video cards the least often need to make the best decisions they can possibly make, which to me means they need to consider every possible valid data point.

That is just my opinion though.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,056
32,579
146
Anyone still rocking a 2GB 960?

How are you guys doing these days?
Sold my EVGA FTW. The first time I knew it was not worth keeping anymore was Gears Ultimate. Some of the textures were terrible or missing with 2GB. I thought at first they forgot to update assets, but I quickly got over my derp and realized it was the VRAM holding it back. I do not need the best settings to be happy, but when something looks worse than console to me, that is where I draw the line.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
All my previous GPUs before the 1070 had 2 VRAM configurations, and I always chose the higher variant. It also always ended up being a great decision. So needless to say where I stand on the VRAM debate.

However, there is nevertheless a difference in the 3GB vs 4GB argument. Previously we were mostly looking at a 100% increase in VRAM, but here it is only a 33% increase. So the maximum theoretical difference between 3GB and 4GB is only so much, and past examples are not an exact fit for this situation.

While I personally wouldn't buy either card, if I really had to I would choose the 3GB 1060 over the 4GB 470. Of course, one would have to be incredibly stupid to buy the 3GB 1060 over a 4GB 480, but they are hardly ever the same price.

Sent from my HTC One M9
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
The same "it is too slow anyway" argument that is inevitably dismissed when an nVidia card shows poor frametimes. Seems it is valid for AMD though. And at the limit settings Fury X definitely shows vram limitations, but I dont hear the same arguments that in a year or two a user will regret buying it.
Let me go on record and say I think a person buying Fury today will regret it in a year or two versus 1070.