
Computerbase - 3gb vs 4gb vs 6gb vs 8gb GDDR5 VRAM Frametime Testing

Aug 11, 2008
10,451
642
126
No matter what the data shows, the nVIDIA stockholders and defense force will be out defending these gimped offerings, because there will always be a scenario where a 3GB GeForce Gimp Edition can be shown to be viable for today and yesterday, but what about tomorrow? I won't name anyone, but if you frequent these forums enough, you probably know who these people are by now… Suffice it to say, take their "advice" with a grain of salt, and for heaven's sake don't pay $200+ in 2016 for a 3GB video card.
The review favours 6GB as the minimum, and gives an example showing that 4GB isn't enough at the performance level tested.
(frametime charts from the Computerbase review)
No matter what the data shows, a few posters have a pre-conceived idea that 3 gb is a nonstarter and trash while 4 gb will be future proof. Even worse they impugn the integrity of anyone who disagrees with them and tries to see both sides of the issue. Current data shows the 3gb card to actually be a better performer per dollar than the 6gb card. (10% or so slower overall, while it is about 20 to 25% cheaper.) So like most everything in life, it is a compromise. Cheaper and very close to the same performance overall now, with possibly (or maybe even likely) more compromises in the future but no one knows to what extent. And even if it loses another 15 to 20% relative to the 6gb, it would still be very close in performance per dollar.

As for the test the OP linked: there is one game where the 4GB is clearly superior to 3GB, one game where both the 4GB and 3GB show very similar issues at the highest settings, and one game where the 3GB shows moderate spikes but the 4GB shows similar issues to a lesser extent. Hardly conclusive enough, IMO, to totally condemn a 3GB card and give a 4GB card a free pass.
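The back-of-envelope math behind the perf/$ claim above is easy to check. A hedged sketch, using the thread's own rough figures (assumed prices and percentage deltas, not measured benchmarks):

```python
# Rough perf/$ check using the thread's own figures (illustrative, not benchmarks).
price_6gb = 250.0             # assumed street price in USD
perf_6gb = 100.0              # normalized performance

price_3gb = price_6gb * 0.78  # ~22% cheaper (midpoint of "20 to 25%")
perf_3gb = perf_6gb * 0.90    # ~10% slower

print(f"6GB: {perf_6gb / price_6gb:.3f} perf/$")
print(f"3GB: {perf_3gb / price_3gb:.3f} perf/$")

# Even if the 3GB card later loses another ~17.5% relative to the 6GB:
perf_3gb_aged = perf_3gb * 0.825
print(f"aged 3GB: {perf_3gb_aged / price_3gb:.3f} perf/$")
```

Under these assumptions the 3GB card leads today (~0.46 vs ~0.40 perf/$) and stays within roughly 5% of the 6GB card even after the hypothetical aging penalty, which is the compromise the post describes.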
 
  • Like
Reactions: ShintaiDK

nurturedhate

Golden Member
Aug 27, 2011
1,767
773
136
As for the test the OP linked: there is one game where the 4GB is clearly superior to 3GB, one game where both the 4GB and 3GB show very similar issues at the highest settings, and one game where the 3GB shows moderate spikes but the 4GB shows similar issues to a lesser extent. Hardly conclusive enough, IMO, to totally condemn a 3GB card and give a 4GB card a free pass.
By your own statement we have 3 games:

1 game where the 4gb wins
1 game where they tie
1 game where both suffer but 3gb suffers more (that's called a win btw, one better than the other and all...)

None of that is conclusive though? Really? In 2 out of 3 the 4GB card wins. That's called winning. That's called the majority. These are your statements. It's blatantly obvious based solely on what you posted.
 
  • Like
Reactions: rpsgc

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Not really. Try using a GPU 2 years after launch with the latest and most demanding games. I am speaking from personal experience, where an HD 4850 512MB and an HD 4850 1GB showed a clear difference, with the lower-VRAM card often running into VRAM limits in newer games 2+ years from launch.

I owned a few (ok, like four) HD4850 cards, reference VisionTek models, which only had 512MB. In fact, I still have a couple of them.

I have a friend that I sold one of my Gigabyte GTX460 1GB OC WindForce cards to, with one bad fan. Well, it eventually died, so I sold him a nearly-new Palit GTX460, only this one was a semi-rare / late-model 2GB model.

He still uses it to this day to game. The GTX460 is getting a little long in the tooth, performance-wise, but with 2GB of VRAM, it's still (mostly, at least up to 2014 games) viable.
 
Aug 11, 2008
10,451
642
126
By your own statement we have 3 games:

1 game where the 4gb wins
1 game where they tie
1 game where both suffer but 3gb suffers more (that's called a win btw, one better than the other and all...)

None of that is conclusive though? Really? In 2 out of 3 the 4GB card wins. That's called winning. That's called the majority. These are your statements. It's blatantly obvious based solely on what you posted.
It is blatantly obvious you either did not read or deliberately misquoted my post. I never said either one "won". What I said was that the 3GB card is viable in the vast majority of games, with possible limitations in the future, while the 4GB could easily have issues as well. Honestly, the all-or-nothing thinking in this thread is totally over the top.

BTW, how many games are on steam? These games in the OP's post were clearly highlighted to be games with vram issues. They are a minuscule portion of the possible games that one can play. So one game in which 4 gb is clearly superior (assuming the test was done properly) is hardly a "majority" in the overall scheme of things.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
I just checked my R7 250X 2GB GDDR5 video card's Radeon Crimson control panel, and it says "Memory size: 2048MB" and "Memory Type:HyperMemory".

I did a search, and I found this reference:
http://www.amd.com/Documents/HyperMemory_Whitepaper.pdf

So, the size of the card's local VRAM store is not the end-all and be-all of video card memory management. That kind of puts a different perspective on this discussion, does it not?
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Not really. Try using a GPU 2 years after launch with the latest and most demanding games. I am speaking from personal experience, where an HD 4850 512MB and an HD 4850 1GB showed a clear difference, with the lower-VRAM card often running into VRAM limits in newer games 2+ years from launch.

I'm not saying that it doesn't happen, nor am I even saying that it doesn't happen often, I'm simply saying that your claim that it always happens is untrue.

Wasn't the 2GB/4GB GTX 770 almost at price parity? I remember the 960 and the 380 as two cards that turned out to be very close in price between the two versions (less than $20 when I bought my 4GB 380). I'm only making an exception for the 1060 3GB version because there's currently almost $60 of price difference in Germany (210€ vs. 260€). My opinion would be different if those cards were only $30 apart.

I certainly don't remember the 2GB/4GB 770's being at parity (you may be thinking of certain high end 2GB models matching some low end 4GB models, but that's really not comparable).

But even if the cards were only $1 apart you would still be better off getting the low VRAM version in the scenario where you win the "VRAM roulette". Of course if the price difference is that low you would be crazy to take the chance on the "VRAM roulette" in the first place, but that's a different issue.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
I certainly don't remember the 2GB/4GB 770's being at parity (you may be thinking of certain high end 2GB models matching some low end 4GB models, but that's really not comparable).
Possible. Though I'm generally looking at prices after the launch price surge has settled down and often times this moves SKUs way closer.

But even if the cards were only $1 apart you would still be better off getting the low VRAM version in the scenario where you win the "VRAM roulette".
Not this time: with the 1060 you still get a card with 10% fewer shaders, and with the 480 you get a card with 12% lower bandwidth. And even disregarding that, $1 is a mighty small price for the chance to get a better product, especially with a historical success chance of >>20%, to turn that original argument around.

It's not hard to translate our argument about gaming performance over to Intel quad cores (HT or no HT), and that's a price difference of $100. Think about it: would you ever recommend a 4670K over a 4770K if they were priced at $248 and $249 respectively? It's only a couple of games that profit from HT, but those that do tend to also be the ones that can use the extra performance for more consistency during gameplay.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
The thing is, if you're uncomfortable buying the 1060 3GB, don't go out and buy a 470/480 4GB, since in situations where 3GB doesn't cut it, 4GB won't do much better, save for the Fury's HBM-based 4GB, which seems to be doing well in some 4K titles.

NV messed up but AMD did not escape either. The 1060 6GB and 470/480 8GB are what you want to get for cards with that much horsepower.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
The thing is, if you're uncomfortable buying the 1060 3GB, don't go out and buy a 470/480 4GB, since in situations where 3GB doesn't cut it, 4GB won't do much better, save for the Fury's HBM-based 4GB, which seems to be doing well in some 4K titles.

NV messed up but AMD did not escape either. The 1060 6GB and 470/480 8GB are what you want to get for cards with that much horsepower.

Fury isn't going to perform any different in these titles with the settings used. You simply need 6GB or more.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Not this time: with the 1060 you still get a card with 10% fewer shaders, and with the 480 you get a card with 12% lower bandwidth. And even disregarding that, $1 is a mighty small price for the chance to get a better product, especially with a historical success chance of >>20%, to turn that original argument around.

True, but you originally excluded the 1060 from the Vram roulette argument due to the price difference, so I assumed that we were only talking about the more traditional low/high VRAM cards out there, which with a few exceptions all had identical specs outside of VRAM.

And while $1 is indeed a very small price for the chance of a better product, that is irrelevant in this case, since the original argument assumed that you had already won the VRAM roulette, so no chance is involved. Of course, as mentioned, you can never know beforehand whether you will win or lose the VRAM roulette, so this is a highly artificial scenario.

It's not hard to translate our argument about gaming performance over to Intel quad cores (HT or no HT), and that's a price difference of $100. Think about it: would you ever recommend a 4670K over a 4770K if they were priced at $248 and $249 respectively? It's only a couple of games that profit from HT, but those that do tend to also be the ones that can use the extra performance for more consistency during gameplay.

If I knew with absolute certainty that HT would provide zero performance improvement in any of the games the person I was advising actually played, then I would recommend the 4670K, since $1 is still better than nothing*. If on the other hand I had no idea whether HT would come in handy, then I would recommend the 4770K.

In other words, both this scenario and the VRAM roulette scenario will have different answers depending on whether or not you know the performance end result. Obviously in real life we never know the end result beforehand, but in your original argument we did ("even winning the jackpot means you come off worse than not playing this game of VRAM roulette at all").

*Of course this doesn't account for resale value, which potentially changes the situation considerably.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
games picked due to their VRAM usage.

I'd rather refer to my link, since it covers several games and is therefore more representative... unlike your "picked" little sample.

Fact: Fury X trades blows with GTX 980TI and 1070 at 4K resolutions
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,051
32,572
146
If I were scraping up money for a budget gamer I would pull the trigger on that $16X 3GB 1060 RS linked on Jet. That savings could double my system RAM from 8 to 16GB without having to wait till later. Even the need to turn down a setting or 3 is not that big a deal. Trust me, having spent 7 yrs console gaming before getting back to PC as my primary platform, it will still look amazeballs by comparison. I prefer the RX series if the price is the same, but the operative word is prefer. I am not going to inundate the thread with histograms to make my case. It is more an educated guess, nothing more, about where the gaming industry is going and which features are likely to see better usage.
 
  • Like
Reactions: Magee_MC and Phynaz

Piroko

Senior member
Jan 10, 2013
905
79
91
In other words, both this scenario and the VRAM roulette scenario will have different answers depending on whether or not you know the performance end result. Obviously in real life we never know the end result beforehand, but in your original argument we did ("even winning the jackpot means you come off worse than not playing this game of VRAM roulette at all").
That last sentence of mine was actually directed at this generation specifically. Also, I didn't assume that it's reasonable to expect to win the jackpot. In fact, this thread already proves that the jackpot cannot be won with this round of cards.

And, to be honest, I'm still with raghu78 here: the odds of ever winning a round of that game are rather terrible (with gaming-related products; I'm fully aware of 4GB GT 630 cards), hyperbole or not.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
These threads are hilarious. One group is saying 3gb bad, 4gb totally ok. Another group is saying WELL 3GB IS BAD SO 4GB IS BAD TOO!

Why is it so difficult to be level-headed about it? 3GB is pretty clearly not a great choice for the average two-generation lifespan of a midrange card (current-gen purchase + 1 gen, e.g. 2-3 years), given the all-but-certain trend, which has never reversed itself, of VRAM requirements growing slowly over time. And 4GB is exactly 1GB better: it will make no difference in games that need more than 4GB, a smaller set than the games that need more than 3GB. How much smaller? That's the actual question, which people are steadfastly ignoring in favor of rhetoric. To some degree we can't know what VRAM limitations will look like in 2-3 years, except that we can count on them to grow from their current level per historical trends. But in all likelihood 4GB isn't going to be all that much better than 3GB at the end of its useful lifespan. At some point in the lifespan of these cards the 4GB will be on average better than the 3GB (all else being equal). But how quickly after that inflection point is 4GB bottlenecking so badly you should have gotten 6GB?

The real bottom line question is how many more months of useful lifespan does that extra 1 or 3GB buy me?
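That bottom-line question can at least be framed with a toy model. Everything below (the starting VRAM demand and the growth rate) is an illustrative assumption, not a measured trend; the point is only how the arithmetic of "months of headroom" works:

```python
# Toy model: if typical VRAM demand at a given settings tier grows roughly
# linearly, when does each capacity become the bottleneck?
# demand_now_gb and growth_gb_per_year are made-up illustrative numbers.
def months_until_limit(capacity_gb, demand_now_gb=2.5, growth_gb_per_year=0.5):
    """Months until projected demand exceeds capacity (toy linear model)."""
    headroom = capacity_gb - demand_now_gb
    if headroom <= 0:
        return 0.0
    return headroom / growth_gb_per_year * 12.0

for cap in (3, 4, 6):
    print(f"{cap}GB card: ~{months_until_limit(cap):.0f} months of headroom")
```

Under these particular assumptions the extra 1GB buys about two more years before the limit bites; change the growth rate and the answer shifts accordingly, which is exactly why the question is hard to settle from today's data.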
 
  • Like
Reactions: Magee_MC and rpsgc

zinfamous

No Lifer
Jul 12, 2006
111,864
31,359
146
Not compression, I've learned. But management? Seems so. 3GB overcoming 4GB? I don't believe so, but through management it can certainly close the gap. I understand your need for sarcasm to try and drive your point home, but the funny thing is, you were probably more correct than you realize.

I don't doubt that there is some truth to this, or at least an argument that explains what seems to be happening. Personally, I don't get the "memory compression" argument--all of the memory modules are coming from Samsung anyway, right? It's not like nVidia and AMD are doing anything different with identical modules. But they do design their "bus" and all of the fancy memory-handling parts of the cards, right? So that is where the same memory can end up being used more efficiently in one design than another--as I understand it, anyway.

This is why HBM at lower physical amounts can handle as much, if not more, than greater amounts of GDDR memory in some use cases (of course, in this case we're talking about an entirely different memory design, but the real benefit is the bit rate, which manages to overcome the physical size limitation compared to larger amounts of a different type of memory). Or not--it's often mumbo jumbo to me. :D
 

zinfamous

No Lifer
Jul 12, 2006
111,864
31,359
146
No matter what the data shows, a few posters have a pre-conceived idea that 3 gb is a nonstarter and trash while 4 gb will be future proof. Even worse they impugn the integrity of anyone who disagrees with them and tries to see both sides of the issue. Current data shows the 3gb card to actually be a better performer per dollar than the 6gb card. (10% or so slower overall, while it is about 20 to 25% cheaper.) So like most everything in life, it is a compromise. Cheaper and very close to the same performance overall now, with possibly (or maybe even likely) more compromises in the future but no one knows to what extent. And even if it loses another 15 to 20% relative to the 6gb, it would still be very close in performance per dollar.

As for the test the op linked. There is one game where the 4gb is clearly superior to 3gb, one game where both the 4gb and 3gb show very similar issues at highest settings, and one game where a the 3gb shows moderate spikes, but the 4gb is showing similar issues, but to a lesser extent. Hardly conclusive in IMO to totally condemn a 3gb card and give a 4gb card a free pass.

And those very same people that made that very same argument about today's $/performance benefit of the lower VRAM offering throughout the previous several generations were quickly burned when, repeatedly and predictably, those then-laudable cards quickly dropped in relevance compared to their contemporary higher VRAM cards which aged far better through newer and newer games.

At some point, you and the others will stop making this same roundly disproven argument again and again and again. Or, one day, maybe the earth really will be flat if you just continue to tell us it is.
 
  • Like
Reactions: rpsgc

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
That last sentence of mine was actually directed at this generation specifically. Also, I didn't assume that it's reasonable to expect to win the jackpot. In fact, this thread already proves that the jackpot cannot be won with this round of cards.

And, to be honest, I'm still with raghu78 here: the odds of ever winning a round of that game are rather terrible (with gaming-related products; I'm fully aware of 4GB GT 630 cards), hyperbole or not.

Your original argument was that "even winning the jackpot means you come off worse", so in that scenario we don't have to guess whether or not you will win the jackpot, since it is already assumed that you did.

And this thread does indeed prove that 3/4GB will suffer relative to their 6/8GB brethren, but it doesn't prove that this automatically makes you worse off, since those cards are still a fair bit cheaper. Whether or not the price difference is big enough to compensate is another question, the answer to which will differ from person to person, based on their financial means.

And those very same people that made that very same argument about today's $/performance benefit of the lower VRAM offering throughout the previous several generations were quickly burned when, repeatedly and predictably, those then-laudable cards quickly dropped in relevance compared to their contemporary higher VRAM cards which aged far better through newer and newer games.

At some point, you and the others will stop making this same roundly disproven argument again and again and again. Or, one day, maybe the earth really will be flat if you just continue to tell us it is.

And maybe one day you will realise that it isn't that black or white. Yes lower VRAM cards can and will age worse, but that doesn't automatically make them worse purchases. Whether or not they are worse purchases depends entirely upon the price.

So the question then is how much performance the 1060 3GB and the RX 470/480 4GB can afford to lose relative to the 1060 6GB and the RX 480 8GB, given the $40-50 price difference (or upwards of $80 with the recent Jet offer on the 1060 3GB), before the value proposition flips.

The simplistic answer is that the 6/8GB cards are about 25% more expensive (50% more expensive if the Jet offer is included), so that is roughly the performance gap at which the value proposition flips, since that is where the 6/8GB cards overtake the 3/4GB cards in perf/$. The more complex answer would take into account things like resale value, the individual consumer's marginal utility preferences, and so on.
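The flip point described above follows from equating perf/$ for the two cards. A small sketch, with assumed prices ($200 vs $250, i.e. a ~25% premium for the bigger buffer):

```python
# Break-even sketch for the "value proposition flip": the cheaper card stays
# the better value until its performance deficit exceeds the price gap in
# relative terms. Prices are assumed examples, not quotes.
def cheaper_card_still_wins(price_cheap, price_premium, perf_loss_pct):
    """True while perf/$ of the cheap card >= perf/$ of the premium card."""
    perf_cheap = 100.0 * (1 - perf_loss_pct / 100.0)
    return perf_cheap / price_cheap >= 100.0 / price_premium

print(cheaper_card_still_wins(200, 250, 15))  # deficit below the flip point
print(cheaper_card_still_wins(200, 250, 25))  # deficit past the flip point
```

One nuance: with a 25% price premium the exact flip is a 20% deficit (1 - 1/1.25), a little earlier than the eyeballed "roughly 25%", though the post's qualitative point stands.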
 

Piroko

Senior member
Jan 10, 2013
905
79
91
Your original argument was that "even winning the jackpot means you come off worse", so in that scenario we don't have to guess whether or not you will win the jackpot, since it is already assumed that you did.
I'm really weirded out by your basis of argument. Are you riding semantics, or is this some kind of misunderstanding about what hypothetical statements mean?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
This VRAM issue really is a lot more complicated than most people think. There are several factors involved in addition to the physical amount of VRAM, that will affect how well the game runs:

1) The programming of the engine. Some games sip VRAM, whilst others drink it and this depends on how efficiently the engine handles memory and how much detail is presented at any given time. Witcher 3 is a good example of the former, and Deus Ex Mankind Divided is a good example of the latter.

2) The drivers. Ultimately it's the drivers that have the final say. Whenever I look at VRAM comparison charts, it seems as though NVidia always uses more than AMD. I think this is because NVidia's drivers are programmed to cache as much data as possible, whereas AMD's drivers seem to be much more conservative in how they manage VRAM. That said, AMD has had to allocate more engineering resources to the memory management side of their drivers for the Fury X's relatively smaller frame buffer.

3) Type of game. Depending on the type of game, asset streaming can be aggressive, normal or slow. Fast-moving, highly detailed, open-world games with long draw distances are more likely to consume more RAM and VRAM, since they have to stream assets faster.

So looking at all of these should help explain why some games run well on GPUs with smaller frame buffers, whilst other games run really badly.
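Point 1 above (engines that sip VRAM vs. engines that drink it) largely comes down to texture math. A back-of-envelope sketch; the texture size and format below are illustrative assumptions, and real engines use compressed formats that cut these numbers considerably:

```python
# Back-of-envelope VRAM cost of a texture: width * height * bytes per pixel,
# plus roughly one third extra for the full mipmap chain.
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

# A single uncompressed 4096x4096 RGBA8 texture with mips:
mb = texture_bytes(4096, 4096) / (1024 * 1024)
print(f"{mb:.0f} MB")
```

That single texture is about 85 MB, so an engine that keeps a few dozen such assets resident behaves very differently from one that streams and compresses aggressively, which is why two games at the same settings can sit on opposite sides of a 3-4GB limit.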

How come the Fury X beats the GTX 980 Ti and matches the 12GB Titan X at 4K? Your bias knows no limits, unfortunately.
https://www.techpowerup.com/reviews/NVIDIA/Titan_X_Pascal/24.html

Those benchmarks are completely useless because they don't examine frame times. Case in point, here is a benchmark of Far Cry 4 between the Fury X, GTX 980 Ti and Titan X:

(Far Cry 4 average frame rate chart)


As you can see, the Fury X isn't that far behind when you ignore 1080p, which is probably CPU-bound for the Radeon. But when you look at the frame time graphs, it's a whole other picture. And the weird thing is that Far Cry 4 doesn't use much VRAM or RAM.

What you're seeing is likely due to bad programming in terms of memory management more than anything, or it may be AMD's drivers that are to blame.

(Far Cry 4 frame time charts: 1080p, 1440p, 4K)
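The average-FPS-vs-frame-time distinction this post leans on is worth making concrete. A minimal sketch of how "1% low" figures are typically derived from raw frame times (the numbers are synthetic, not taken from these charts):

```python
# Compute average FPS and the "1% low" FPS from per-frame times (ms).
# Synthetic frame times for illustration only.
def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)
    # "1% low" = FPS equivalent of the mean of the worst 1% of frames.
    one_pct = worst[:max(1, n // 100)]
    low_1pct_fps = 1000.0 / (sum(one_pct) / len(one_pct))
    return avg_fps, low_1pct_fps

smooth = [16.7] * 1000                 # steady ~60 FPS
spiky = [15.0] * 990 + [120.0] * 10    # higher average, but big spikes
print(fps_stats(smooth))
print(fps_stats(spiky))
```

The spiky run actually has the higher average FPS, yet its 1% low collapses to single digits, which is exactly the failure mode an average-only benchmark chart hides.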
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
This VRAM issue really is a lot more complicated than most people think. There are several factors involved in addition to the physical amount of VRAM, that will affect how well the game runs

You just used a test from August 2015?

One where the game wasn't using all the ram but still had issues?

(Far Cry 4 RAM/VRAM usage chart)


How about a more recent review, from the 1080 launch:

Far Cry Primal:
(Far Cry Primal frame pacing chart)


BF Hardline:

(Battlefield Hardline frame pacing chart)


GTA V:

(GTA V frame pacing chart)


Looks like Fury is doing fine.

http://www.guru3d.com/articles_pages/fcat_geforce_gtx_1080_framepacing_review,15.html

Since you didn't link your source, here it was:

http://www.extremetech.com/gaming/2...x-faces-off-with-nvidias-gtx-980-ti-titan-x/2

You also left out some notable quotes from that article:

AMD’s frame timing isn’t as good as Nvidia’s, but we see that issue even below the 4GB limit. The situation gets noticeably worse at 4K, which does imply that Fury X’s memory buffer isn’t large enough to handle the detail settings we chose, but the GTX 980 Ti and Titan X aren’t returning high enough frame rates to qualify as great alternatives. The frame pacing may be better, but all three GPUs are again below the 30 FPS mark. Let’s move on to our final test case and more neutral ground.

And for GTA V:

In a word, “No.” The Fury X isn’t as fast as its GeForce counterparts in this test, but the 0.1% frame rate ratio on Team Red is the same as Team Green in 1080p. At 1440p, AMD’s 0.1% frame rate is actually better than Nvidia’s, while the 4K frame rate ratio still matches that of the GTX Titan X.

Our frame time results bear this out as well. The 1080p and 1440p results show AMD tying up nearly evenly with Nvidia — both companies have GPU frame time spikes, and while AMD's times are a fraction worse, we don't see anything like the spikes that characterized Far Cry 4 and Assassin's Creed Unity. At 4K, there's evidence of a repetitive pattern in AMD's results that doesn't appear in Nvidia's, and that may well be evidence of a 4GB RAM hit — but once again, we have to come back to the fact that none of the GPUs in this comparison are delivering playable frame rates at the usage levels that make it an issue in the first place.


And their conclusion:

First, there’s the fact that out of the fifteen games we tested, only four of could be forced to consume more than the 4GB of RAM. In every case, we had to use high-end settings at 4K to accomplish this. Of the three games profiled in this article, two of them were GameWorks titles with heavy, Nvidia-specific optimizations and did not run well on AMD hardware, even at resolutions where RAM was no issue. The third, Grand Theft Auto V, includes optimization code from both AMD and Nvidia, and it’s overall performance is much improved as a result.

Look how much better Far Cry Primal runs compared to Far Cry 4. That's no GameWorks vs. GameWorks while providing the same detail.

The only time it has issues with the memory limit is when it's already failing to push playable frame rates anyway.
 
  • Like
Reactions: RussianSensation
Aug 11, 2008
10,451
642
126
And those very same people that made that very same argument about today's $/performance benefit of the lower VRAM offering throughout the previous several generations were quickly burned when, repeatedly and predictably, those then-laudable cards quickly dropped in relevance compared to their contemporary higher VRAM cards which aged far better through newer and newer games.

At some point, you and the others will stop making this same roundly disproven argument again and again and again. Or, one day, maybe the earth really will be flat if you just continue to tell us it is.
Or maybe some posters will actually look at the data that exists today and stop claiming to know the future with absolute certainty.