Question: GeForce RTX 2060 VRAM Usage, Is 6GB Enough for 1440p?

Page 2
Jun 8, 2003
#26
I had a GTX 960 4GB overclocked to 1550 MHz core.
Performance was about GTX 770 level in some titles, maybe closer to a GTX 780.
The card itself didn't have the GPU grunt to play titles that saturated 4GB of memory. I remember seeing 2.2 to 2.3GB of memory usage before I was running under 40fps in most games.

A GTX 1060/GTX 980 is about 2x faster than a 960 and comes with 6GB of memory.
I don't think a GTX 1060-level card will use much more than 4GB of memory before games are running too slow for most people. It just doesn't have the GPU grunt to do it.

Now, the 1660 Ti will be about 17% faster than the 1060; I can see it using 4.8GB of memory before frame rates in games become too slow.

The 2060 will be about 16% faster than the 1660 Ti, so it should have the grunt to push games fast enough to actually need the full 6GB of VRAM, and that's only in a few choice games that aggressively allocate VRAM. I don't see 6GB of VRAM as a limiting factor with a 2060.
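That estimate is just proportional scaling. As a rough sketch (assuming usable VRAM scales linearly with GPU throughput, and taking the percentages above at face value, not as benchmarks):

```python
# Back-of-envelope estimate: assume the VRAM a card can usefully fill
# scales linearly with its GPU throughput. Baseline from the post above:
# a GTX 1060-class GPU tops out around 4 GB before frame rates tank.
baseline_vram_gb = 4.0

# Relative speedups quoted in the thread (rough figures, not benchmarks).
speedup_1660ti = 1.17  # 1660 Ti vs. 1060
speedup_2060 = 1.16    # 2060 vs. 1660 Ti

vram_1660ti = baseline_vram_gb * speedup_1660ti
vram_2060 = vram_1660ti * speedup_2060

print(f"1660 Ti usable VRAM: ~{vram_1660ti:.1f} GB")  # ~4.7 GB
print(f"2060 usable VRAM:    ~{vram_2060:.1f} GB")    # ~5.4 GB
```

Linear scaling lands just under the 4.8GB figure above for the 1660 Ti, and puts the 2060 a little under its 6GB buffer, which is consistent with the argument that the buffer itself is unlikely to be the bottleneck.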
 
Oct 27, 2006
#27
While I respect your perspective, look up the video comparing the GTX 770 2GB vs. 4GB. It's from 2015 and tested 10 titles with high textures. The 4GB model not only performed well, I would say it was excellent overall throughout. The 2GB 770, on the other hand, was a mixed bag: perfectly adequate in some titles, but with distracting hitching due to VRAM limits in others.

A similar issue exists, more subtly, with the GTX 970: in very VRAM-intensive titles one has to keep usage under 3.5GB or take a substantial penalty.

That's not to say there aren't cases where cards have silly/nonsensical amounts of VRAM that the GPU itself cannot adequately make use of. Things like 2GB GT 530 come to mind lol.

But in point of fact, textures themselves generally aren't very intensive on GPU performance. Poly count, lighting, shadows, and AA play a far more noticeable role on average, even if you are well within your VRAM limit. In this way textures are like the anisotropic filtering setting: not massively demanding on the GPU per se.
 

mohit9206

Golden Member
Jul 2, 2013
#28
Yes, I disagree with the video. OK, 6GB is enough for now, but it won't be in the future. Yes, you can just drop the resolution to 1080p or textures to medium/high, or both, but it still means you are paying more than 400 dollars for a 6GB card when a card half the price has 8GB. It's bad in the same way the 3GB 780 and 780 Ti were bad when the competition had 8GB even back then. The video is very narrowly focused and doesn't want to look at the past or future. Maybe that is not the purpose of the video, but it does not paint the whole picture.
 
Jun 8, 2003
#29
The video is very narrowly focused and doesn't want to look at the past or future.
Quoting the video:
"At some point games are absolutely going to require more than 6GB of VRAM for best visuals. The question is, by the time that happens, will the RTX 2060 be powerful enough to provide playable performance using those settings? It's almost certainly not going to be an issue this year, and I doubt it will be a real problem next year. Maybe in 3 years you might have to start managing some quality settings; 4 years, probably; and I would say certainly in 5 years' time."
 

PeterScott

Platinum Member
Jul 7, 2017
#30
Yes, I disagree with the video. OK, 6GB is enough for now, but it won't be in the future. Yes, you can just drop the resolution to 1080p or textures to medium/high, or both, but it still means you are paying more than 400 dollars for a 6GB card when a card half the price has 8GB.

What really matters is the whole package. Not one spec.

IMO neither a 1070 Ti nor a Vega 56 is ever going to end up better than a 2060, and realistically those are its competitors today, not cards costing half the price.
 

mohit9206

Golden Member
Jul 2, 2013
#31
OK, you are right, but a 400-dollar card having 6GB just doesn't sit well with me. I do agree, though, that the 2060 will easily play games for the next 4 years with no issues.
 

PeterScott

Platinum Member
Jul 7, 2017
#32
OK, you are right, but a 400-dollar card having 6GB just doesn't sit well with me. I do agree, though, that the 2060 will easily play games for the next 4 years with no issues.
And what's with the $400? You can buy Nvidia's dual-fan reference card straight from them for $350, there are several AIB models for $350, and EVGA even has a single-fan model for $329, which looks like its regular price (no sale or rebates mentioned on Newegg).
 

tviceman

Diamond Member
Mar 25, 2008
#33
I would like to see 2019 games tested at 1440p with and without RTX: games like Metro Exodus, The Division 2, Rage 2, Devil May Cry 5, Gears 5, Star Wars Jedi: Fallen Order, and Cyberpunk 2077. Then we can talk again about whether 6GB is enough for 1440p.
I'm shocked that you actually said something I agree with.
 

pauldun170

Diamond Member
Sep 26, 2011
#34
OK, you are right, but a 400-dollar card having 6GB just doesn't sit well with me. I do agree, though, that the 2060 will easily play games for the next 4 years with no issues.
It's not a $400 card.
It's a $350 card.
 

coercitiv

Diamond Member
Jan 24, 2014
#36
I'll only consider the price in my country, so around 440 dollars.
We use US prices for reference on the forum. If we all started comparing prices based on our own countries, we'd be comparing tax models more than product prices.
 

Guru

Senior member
May 5, 2017
#37
I think 6GB of GDDR6 is more than enough for 1440p gaming today and in the near future. If it were 6GB of GDDR5 paired with the compute power of a 1070 Ti, the buffer size would be worrisome, but considering how fast GDDR6 is, it should be okay for the rest of 2019.

The biggest VRAM-guzzling games I've ever seen are Deus Ex and ROTTR, using up to 6GB even at 1080p and up to 8GB at 1440p.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
#38
did we really need a second thread about the 2060?
 

Mopetar

Diamond Member
Jan 31, 2011
#39
did we really need a second thread about the 2060?
Second? I think there are more than just two, but maybe I'm getting them all confused now.

I did notice that we never really had a Turing megathread, as has historically been the case; instead everything got its own little thread, and there have been a lot related to Turing or the various Turing cards.

I'm not sure which is better. Multiple threads do result in more focused discussion within each, but they can spam the front page. Megathreads cut down on this, but get unwieldy when three different discussions are happening at once.

I don't know if the mods have a preference, but it seems like mini-threads are en vogue at the moment. I'm sure we'll go back to megathreads eventually and the cycle will begin anew.
 
Oct 27, 2006
#40
It will be fascinating to see how the new consoles affect ports. With no RTX-class or otherwise reasonable ray-tracing implementation from RTG in the upcoming PS5/Scarlett (XBXX) APUs, combined with the push for 4K and FauxK (1440p-1800p checkerboard or other upscaling to 4K), the most likely results are more aggressive texture upgrades and perhaps improved console AA/AF levels.

16GB is almost certainly the minimum goal for a unified memory pool, though it's certainly possible we will also see a 2-8GB bank for an ARM SoC that handles streaming and other tertiary OS features. IIRC the PS4, way back in 2013, launched with such a configuration; it wasn't well publicized, but it proved valuable for the share features and even game install speeds.

In any case, the PS4 and X1 gave devs roughly 4-5GB to work with for game and video memory combined, which at its most aggressive use effectively equaled a 3-4GB GPU in PC terms. A hypothetical 16GB 9th-gen console should equal 8-10GB by contrast. We can already see where this would be highly useful: I was playing Battlefield V last night, all ultra, on PC at 3440x1440 ultrawide. Many textures were excellent, but you could still find plenty that looked hugely worse, presumably due to memory constraints.

As long as you have enough VRAM to hold everything, texture resolution almost always has a minimal effect on framerate, while lighting, AA, and shadows can have a relatively extreme impact. It's often noted that AF is nearly 'free' on any decent GPU; textures aren't far behind (unless you run out of VRAM, and then you get hitching).
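The console-to-PC mapping above can be sketched numerically. This is a toy model under assumptions I'm adding (a ~3GB OS reservation and roughly three quarters of game-available memory going to graphics), not published platform specs:

```python
def effective_vram_gb(unified_pool_gb: float,
                      os_reserved_gb: float = 3.0,
                      graphics_share: float = 0.75) -> float:
    """Estimate the PC-equivalent VRAM of a console's unified memory pool.

    os_reserved_gb and graphics_share are illustrative guesses,
    not published platform specs.
    """
    game_available_gb = unified_pool_gb - os_reserved_gb
    return game_available_gb * graphics_share

# 8th gen (PS4/X1): 8 GB pool -> ~5 GB for games -> ~3.75 GB "VRAM",
# in line with the 3-4 GB PC-GPU equivalence above.
print(effective_vram_gb(8.0))   # 3.75

# Hypothetical 16 GB 9th-gen console -> ~9.75 GB "VRAM",
# inside the 8-10 GB range suggested above.
print(effective_vram_gb(16.0))  # 9.75
```

Under these guesses, doubling the unified pool more than doubles the effective VRAM budget, since the OS reservation doesn't grow with it, which is why a 16GB console would pressure 6GB and 8GB PC cards.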

We will know so much more when the first big AAA 9th gen ports drop. I don't anticipate issues before then. However, next fall is only around 20 months out.
 

