Question: GeForce RTX 2060 VRAM Usage, Is 6GB Enough for 1440p?

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Please watch the entire video.
I think this settles this misconception.
Remember, "memory allocation" is still not memory needed!


Written article
https://www.techspot.com/article/1785-nvidia-geforce-rtx-2060-vram-enough/

Quote"
"
Bottom Line
It's clear that right now, even for 4K gaming, 6GB of VRAM really is enough. Of course, the RTX 2060 isn’t powerful enough to game at 4K, at least using maximum quality settings, but that’s not really the point. I can hear the roars already, this isn't about gaming today, it’s about gaming tomorrow. Like a much later tomorrow…
The argument is something like, yeah the RTX 2060 is okay now, but for future games it just won’t have enough VRAM. And while we don’t have a functioning crystal ball, we know this is going to be both true, and not so true. At some point games are absolutely going to require more than 6GB of VRAM for best visuals.
The question is, by the time that happens will the RTX 2060 be powerful enough to provide playable performance using those settings? It’s almost certainly not going to be an issue this year and I doubt it will be a real problem next year. Maybe in 3 years, you might have to start managing some quality settings then, 4 years probably, and I would say certainly in 5 years time.
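One rough way to check whether a game is actually bumping into the 6GB ceiling, rather than just allocating it, is to log VRAM usage while you play and see whether stutter lines up with the card sitting at its limit. A minimal sketch, assuming Python with the pynvml (nvidia-ml-py) package installed and the GPU at index 0:

Code:
# Poll VRAM usage roughly once per second while a game runs.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values in bytes
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        # Usage pinned near 100% whenever the game hitches suggests the memory
        # is actually needed, not merely allocated ahead of time.
        print(f"{time.strftime('%H:%M:%S')}  VRAM {used_gb:.2f} / {total_gb:.2f} GB "
              f"({100 * mem.used / mem.total:.0f}%)")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()

Note that this reports the same driver-side number that overlays like Afterburner show, so on its own it only proves allocation; the texture-slider test is still the better tell for whether the memory is really needed.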
 
Last edited:
  • Like
Reactions: Morlean

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I'm not so sure I agree for two reasons:

1) With the better-optimised (by Nvidia drivers) DX11 graphics engines that's true, but we are looking at DX12/Vulkan taking over (as new features will be DX12/Vulkan exclusive), and so far they are less efficient and require more GPU memory. At least that's what BF5 seems to be showing: even without using RTX, it is much more likely to stutter due to lack of VRAM in DX12.

2) Then we can add RTX to the mix, and it looks like you need even more memory. The 2060 is an RTX card, so it may be the lack of VRAM that's the limiting factor to using RTX, not GPU performance.

For RTX cards I think it would have helped quite a bit if they all had 50% more memory, but I can see that would have made them even more expensive...

That said, you can always just lower the quality settings a notch, which generally has a negligible hit to quality (unless you love to study static screenshots).
 
  • Like
Reactions: Morlean

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
I think this settles this misconception.
It's only a misconception if you agree 2060 isn't really an RTX card. Otherwise you've simply shown benchmarks which conveniently ignore RTX because it now suits you.

Remember" memory allocation" is still not memory needed!
Sure, except when it allocates more than the card has, which means it obviously needs it. Likewise, if dropping the texture slider reduces frame spikes and stuttering, it clearly needs more.

The reality is that at every Turding bracket, nVidia has created the equivalent of a 3GB 1060 because they're all starved for VRAM relative to their performance. The 2080 Ti should've had at least 16GB like AMD's slower Radeon VII.
 
Last edited:
  • Like
Reactions: beginner99

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
It's only a misconception if you agree 2060 isn't really an RTX card. Otherwise you've simply shown benchmarks which conveniently ignore RTX because it now suits you.


Sure, except when it allocates more than the card has, which means it obviously needs it. Likewise, if dropping the texture slider reduces frame spikes and stuttering, it clearly needs more.

The reality is that at every Turding bracket, nVidia has created the equivalent of a 3GB 1060 because they're all starved for VRAM relative to their performance. The 2080 Ti should've had at least 16GB like AMD's slower Radeon VII.
16GB likely wouldn't make it any faster at RT though.
 
  • Like
Reactions: happy medium

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136

No, [H] doesn't say that at all: https://www.hardocp.com/article/2019/01/20/battlefield_v_nvidia_ray_tracing_rtx_2060_performance/7
For BF5 they actually say:
The only non-limiting VRAM scenarios are DX11 with No DXR at 1440p and 1080p. Otherwise, every setting we tested is hitting the 6GB limit on the video card.

Turning on DX12 alone brings us up to the maximum VRAM capacity of the video card. Enabling any level of DXR on top of that just creates a bottleneck. You can really see here why "Ultra" and "High" DXR had very close performance at 1080p, it is reaching the absolute maximum VRAM capacity there is, like hitting a brick wall and it just can’t go further.

This is a big problem for the GeForce RTX 2060 in general, 6GB of VRAM is limiting for NVIDIA Ray Tracing in Battlefield V. It’s like trying to drive with the parking brake enabled.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
I would like to see 2019 games tested at 1440p with and without RTX: games like Metro Exodus, The Division 2, Rage 2, Devil May Cry 5, Gears 5, Star Wars Jedi: Fallen Order and Cyberpunk 2077. Then we can talk again about whether 6GB is enough for 1440p or not.
 

mopardude87

Diamond Member
Oct 22, 2018
3,348
1,575
96
Yeah, completely ignore the fact that with everything turned on, BF5 is already having issues with that 6GB on day one of this card's release, and that is at 1080p. Kind of amazed so many are either bypassing this example or simply nodding along at just dropping textures. Not like it's gonna get better when other RT titles come out. Too many live in the here and now.

Planned obsolescence?
 

linkgoron

Platinum Member
Mar 9, 2005
2,298
818
136
Looking at RTX at least, it appears as though it has a large effect. The minimum framerates at high and ultra are 6 FPS and 2 FPS respectively (1080p).

https://www.hardocp.com/article/2019/01/20/battlefield_v_nvidia_ray_tracing_rtx_2060_performance/4
https://www.hardocp.com/article/2018/12/17/battlefield_v_nvidia_ray_tracing_rtx_2070_performance/4

Note how the average frame rate for the 2070 goes from 59 FPS to 50.4 FPS (15%) when switching from low RTX to Ultra RTX. However, for the 2060 it goes from 47.6 to 35.6 (25%). Sadly [H] did not list the minimums for the 2070 like they did for 2060.

https://www.hardocp.com/article/2019/01/20/battlefield_v_nvidia_ray_tracing_rtx_2060_performance/6

At 1080p with DX12 enabled and "Ultra" DXR we can see how performance scales from the ASUS ROG STRIX RTX 2080 Ti, to 2080 to 2070 to MSI GeForce RTX 2060 GAMING Z. The 2070 is 42% faster than the 2060 and the 2080 is 11% faster than the 2070 and the 2080 Ti is 11% faster than the 2080. A lot of the 2060’s performance bottlenecks with "Ultra" DXR here may be due to VRAM limitations.

EDIT:
Hardware Unboxed said:
It's clear that right now, even for 4K gaming, 6GB of VRAM really is enough.
I'd say that TPU's benchmarks do not agree with TechSpot's benchmarks. In TPU's benchmarks the 2070 is 17% faster than the 2060 at 1080p, but 24.1% faster at 1440p and 31.2% faster at 4K in Shadow of War. We might also be seeing it in Hellblade, where the 2070 is 23.6% faster at 1440p.
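For reference, a quick arithmetic check of the relative drops quoted above, a minimal sketch using just the average FPS figures from the linked [H] charts:

Code:
# Percentage performance lost going from low to ultra DXR, per the [H] averages.
def pct_drop(before_fps, after_fps):
    return (before_fps - after_fps) / before_fps * 100

print(f"RTX 2070, low -> ultra DXR: {pct_drop(59.0, 50.4):.1f}% slower")  # ~14.6%
print(f"RTX 2060, low -> ultra DXR: {pct_drop(47.6, 35.6):.1f}% slower")  # ~25.2%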
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
No, [H] doesn't say that at all: https://www.hardocp.com/article/2019/01/20/battlefield_v_nvidia_ray_tracing_rtx_2060_performance/7
For BF5 they actually say:
The only non-limiting VRAM scenarios are DX11 with No DXR at 1440p and 1080p. Otherwise, every setting we tested is hitting the 6GB limit on the video card.

Turning on DX12 alone brings us up to the maximum VRAM capacity of the video card. Enabling any level of DXR on top of that just creates a bottleneck. You can really see here why "Ultra" and "High" DXR had very close performance at 1080p, it is reaching the absolute maximum VRAM capacity there is, like hitting a brick wall and it just can’t go further.

This is a big problem for the GeForce RTX 2060 in general, 6GB of VRAM is limiting for NVIDIA Ray Tracing in Battlefield V. It’s like trying to drive with the parking brake enabled.

I think we can all sensibly agree that BF V DX12 is broken. Merely enabling it incurs a 25% performance penalty on ANY card for absolutely no gain whatsoever in rendering output. Citing a game's broken API path to make a point is not making a point at all; it's wasting everyone's time.
 

maddie

Diamond Member
Jul 18, 2010
4,740
4,674
136
No, [H] doesn't say that at all: https://www.hardocp.com/article/2019/01/20/battlefield_v_nvidia_ray_tracing_rtx_2060_performance/7
For BF5 they actually say:
The only non-limiting VRAM scenarios are DX11 with No DXR at 1440p and 1080p. Otherwise, every setting we tested is hitting the 6GB limit on the video card.

Turning on DX12 alone brings us up to the maximum VRAM capacity of the video card. Enabling any level of DXR on top of that just creates a bottleneck. You can really see here why "Ultra" and "High" DXR had very close performance at 1080p, it is reaching the absolute maximum VRAM capacity there is, like hitting a brick wall and it just can’t go further.

This is a big problem for the GeForce RTX 2060 in general, 6GB of VRAM is limiting for NVIDIA Ray Tracing in Battlefield V. It’s like trying to drive with the parking brake enabled.
Very interesting that you get a downvote merely for quoting [H].
 

amenx

Diamond Member
Dec 17, 2004
3,902
2,121
136
Looking at RTX at least, it appears as though it has a large effect. The minimum framerates at high and ultra are 6 FPS and 2 FPS respectively (1080p).

https://www.hardocp.com/article/2019/01/20/battlefield_v_nvidia_ray_tracing_rtx_2060_performance/4
https://www.hardocp.com/article/2018/12/17/battlefield_v_nvidia_ray_tracing_rtx_2070_performance/4

Note how the average frame rate for the 2070 goes from 59 FPS to 50.4 FPS (15%) when switching from low RTX to Ultra RTX. However, for the 2060 it goes from 47.6 to 35.6 (25%). Sadly [H] did not list the minimums for the 2070 like they did for 2060.

https://www.hardocp.com/article/2019/01/20/battlefield_v_nvidia_ray_tracing_rtx_2060_performance/6



EDIT:

I'd say that TPU's benchmarks do not agree with TechSpot's benchmarks. In TPU's benchmarks the 2070 is 17% faster than the 2060 at 1080p, but 24.1% faster at 1440p and 31.2% faster at 4K in Shadow of War. We might also be seeing it in Hellblade, where the 2070 is 23.6% faster at 1440p.
Not sure why all the focus in the thread is only on memory capacity; the other factor is the narrower memory bus of the 2060 (192-bit). Which has the greater impact on higher-res gaming, the smaller VRAM pool, the narrower bus, or a combination of both? What if the 2060 had a 384-bit bus? Would the 6GB of GDDR6 still be a limiting factor? Or if it stayed 192-bit but VRAM doubled to 12GB? Where or what would its weak point be at 1440p or 4K, say vs the 2070?
 
  • Like
Reactions: happy medium

Gt403cyl

Member
Jun 12, 2018
126
21
51
Not sure why all the focus in the thread is only on memory capacity; the other factor is the narrower memory bus of the 2060 (192-bit). Which has the greater impact on higher-res gaming, the smaller VRAM pool, the narrower bus, or a combination of both? What if the 2060 had a 384-bit bus? Would the 6GB of GDDR6 still be a limiting factor? Or if it stayed 192-bit but VRAM doubled to 12GB? Where or what would its weak point be at 1440p or 4K, say vs the 2070?

That's what I was thinking; bandwidth plays a factor as well...

If you have a slow 6GB vs a fast 3GB...

There is a point where moving data in and out fast enough matters more than simply having more VRAM on the card.

Not saying this is the case here, but it's something to look at and think about.
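To put rough numbers on the bandwidth side of this, here's a quick sketch of theoretical peak bandwidth from bus width and per-pin data rate, assuming the 14 Gbps GDDR6 the 2060 and 2070 both ship with (the 384-bit case is purely hypothetical):

Code:
# Theoretical peak memory bandwidth = (bus width in bytes) * (effective data rate per pin).
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(f"RTX 2060, 192-bit @ 14 Gbps:    {bandwidth_gb_s(192, 14):.0f} GB/s")  # 336 GB/s
print(f"RTX 2070, 256-bit @ 14 Gbps:    {bandwidth_gb_s(256, 14):.0f} GB/s")  # 448 GB/s
print(f"Hypothetical 384-bit @ 14 Gbps: {bandwidth_gb_s(384, 14):.0f} GB/s")  # 672 GB/s

So on paper the 2060 gives up 25% of the 2070's bandwidth on top of the 2GB of capacity, which is part of why it's hard to untangle the two in benchmarks.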
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
I feel that this is not only somewhat marginal at present, but that it's extremely ill-founded to claim 6GB will be OK for three freaking years.

PC ports have thankfully improved dramatically during the 8th-gen consoles vs the 7th gen (thanks, AMD and x86!), but that also means you are seeing more of a tie to console optimizations than ever before. Flagship titles quickly pushed into PS4/X1 memory and texture territory, causing 2GB and even 3GB GPUs to take a hit during the 8th gen.

Fall 2020 will see consoles with 16GB or more of unified memory, and upwards of 8GB for texture and GPU pool memory usage. Flagship AAA ports will suddenly be looking for similar numbers for 'Ultra' settings on PC.

That's not to say that a 2060 won't run Doom or Assassin's Creed 2020+ ports at medium settings, but I do expect the 1080, Vega, 2070, and 2080 to hold up better, and 1080ti/Vega7/2080ti to hold up far better.

RTX also won't likely play any part in 9th-gen console priorities, because AMD/RTG doesn't appear to have any corresponding/compatible variation of the same tech on their near-term roadmap, certainly not in time to be in the PS5/XBXX. Ideally, Nvidia and partners can make some compelling use of it to enhance ports in the future, but realities being what they are, developers will continue to prioritize development for 4K/ultra textures/console hardware, with PC being capable of more of the same at higher framerates, but not necessarily more features.

Time will tell. We've been down similar roads many times before though, and seldom does a mid-range or above GPU seem like it didn't need as much memory as reasonably possible to maintain good results going forward.

I've seen a succession of GPUs that were respectable at their time show more age from memory limitations.

320MB 8800GTS
1GB 7850
1GB 750ti
2GB 770 FTW
3GB 780
3.5GB 970 Gaming+

Etc

Pretty easy to see 6GB cards being on that list in 24 months. Not 'obsolete', but distinctly less than ideal. And each of those past examples had alternatives that aged better.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
6GB vs the 8GB on the higher-end cards (like the 2080) is what, an added 33%?
1GB 7850 to 2GB 7850 was 100%, even worse if you compare with the 7950...
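As a quick sketch of those relative capacity jumps:

Code:
# Relative VRAM increase between the cards mentioned above.
def pct_increase(small_gb, large_gb):
    return (large_gb - small_gb) / small_gb * 100

print(f"6GB -> 8GB (2060 vs 2080 class): +{pct_increase(6, 8):.0f}%")   # +33%
print(f"1GB -> 2GB (7850 variants):      +{pct_increase(1, 2):.0f}%")   # +100%
print(f"1GB -> 3GB (7850 vs 7950):       +{pct_increase(1, 3):.0f}%")   # +200%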

As far as I know, the 1060 3GB is still doing fine in games that are more optimized for 4GB cards.

One thing that could have a big impact is if the PS5 has like 24GB of RAM or something, affecting more of the core design of games and making it hard to scale back properly,
but I don't see them pushing too hard on the RAM front for the next consoles...
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I'd say that TPU's benchmarks do not agree with TechSpot's benchmarks. In TPU's benchmarks the 2070 is 17% faster than the 2060 at 1080p, but 24.1% faster at 1440p and 31.2% faster at 4K in Shadow of War. We might also be seeing it in Hellblade, where the 2070 is 23.6% faster at 1440p.

Maybe we need to look at more reviews and see which is the anomaly.

https://www.guru3d.com/articles-pages/geforce-rtx-2060-review-(founder),23.html

That one is more in line with HWUB. I wonder if there was some specific extra feature turned on for TPU...

Also note that even in the TPU SoW benchmark, the 6GB 2060 still beats the 8GB 1070 Ti, so it would be quite misguided to go for the older card on the assumption that 8GB gives it more longevity, based even on this test.
 
Last edited:

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Maybe we need to look at more reviews and see which is the anomaly.

https://www.guru3d.com/articles-pages/geforce-rtx-2060-review-(founder),23.html

That one is more in line with HWUB. I wonder if there was some specific extra feature turned on for TPU...

Also note that even in the TPU SoW benchmark, the 6GB 2060 still beats the 8GB 1070 Ti, so it would be quite misguided to go for the older card on the assumption that 8GB gives it more longevity, based even on this test.

Yeah, one thing is for sure: the GPU portion of the 2060 is stronger than the 1070. The memory question remains to be seen, though of course even if you eventually have to dial back to medium/medium-high vs ultra, it's not the end of the world by any means.

It does remind me of past weird situations where you could get a slower GPU, yet with tons of memory, vs some faster GPUs that had models with limited memory.

E.g., 1GB 6950 models were in the same world as the 2GB 6870. The 6950 absolutely was the stronger GPU, while the 6870 had more memory. I wonder what they'd look like at 1080p/low in current games. They're both super old now, but I'm always kind of fascinated by trying to push really old hardware; sometimes you get surprisingly good results.
 
  • Like
Reactions: happy medium

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Yeah, one thing is for sure: the GPU portion of the 2060 is stronger than the 1070. The memory question remains to be seen, though of course even if you eventually have to dial back to medium/medium-high vs ultra, it's not the end of the world by any means.

I think we've been down this rabbit hole multiple times. And just about every time I've seen it, outside of the exact same GPU core (i.e. same everything, the only difference being VRAM), the GPU core matters more than the VRAM in future gaming. VRAM is definitely important, but you'll most likely hit a wall on processing power long before you hit the wall for VRAM.

This is also assuming you do apples-to-apples comparisons. Often I come across posts of someone saying "I still game at max/ultra settings on my GTX 970", but pressing them often shows they lower a setting or two, one that isn't affected by VRAM.
 
  • Like
Reactions: happy medium

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
I think we've been down this rabbit hole multiple times. And just about every time I've seen it, outside of the exact same GPU core (i.e. same everything, the only difference being VRAM), the GPU core matters more than the VRAM in future gaming. VRAM is definitely important, but you'll most likely hit a wall on processing power long before you hit the wall for VRAM.

This is also assuming you do apples-to-apples comparisons. Often I come across posts of someone saying "I still game at max/ultra settings on my GTX 970", but pressing them often shows they lower a setting or two, one that isn't affected by VRAM.

I think both scenarios are decidedly real, as we've seen with the 4GB Fury X vs approximately equal-performing GPUs with 8GB of memory, and testing would likely show an OC 2GB 6870 ahead of a 1GB 6950 (to approximate raw core performance for as close to apples-to-apples as possible).

The big wildcard, and the most unsettling idea to me, is that 6GB in 2019 is definitely OK with good settings in current games, just as 1GB and 2GB were fine in 2013. But we are on the precipice of new consoles with dramatically more memory available for textures and details. It's reasonable to postulate that 6GB and below will suffer a bit with AAA ports that target these new units.

OTOH, it's a tough spot for consumers right now, particularly if they are only choosing from new product. I don't think the 2070 is worth the huge price premium over the 2060 despite the RAM gap. Hope remains that the 1660 Ti (or whatever it's called) turns out to deliver relatively close performance without being nearly $400.
 

nurturedhate

Golden Member
Aug 27, 2011
1,743
674
136
680 2GB cards, 580 1.5GB cards. Those weren't that long ago. The GTX 580 was completely hamstrung by its lack of VRAM in the second half of its life. Both were affected by the release of new consoles. We'll be in the same situation next year. This doesn't make the 2060 a bad buy; it just requires more thought put into the purchase.
 
  • Like
Reactions: ZGR

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
680 2GB cards, 580 1.5GB cards. Those weren't that long ago. The GTX 580 was completely hamstrung by its lack of VRAM in the second half of its life. Both were affected by the release of new consoles. We'll be in the same situation next year. This doesn't make the 2060 a bad buy; it just requires more thought put into the purchase.

I am willing to bet the 580 3GB card would also be crushed by the 680 2GB.

The 680 was a big architecture shift and had something like double the shader/texture performance of the 580.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
I am willing to bet the 580 3GB card would also be crushed by the 680 2GB.

The 680 was a big architecture shift and had something like double the shader/texture performance of the 580.

I believe you may be slightly misreading what he's implying. He's not saying that the 680 2GB is good and the 1.5GB 580 is bad, but rather that both suffered fairly substantially in the 8th-gen console port era. Like for like, the 3GB 580 or 4GB 680 models likely fared far better over time.

I saw a 2015 head-to-head with the 770 2GB and 770 4GB, and the 4GB card encountered far fewer hitches in certain titles (but not all; some were less demanding).

He can correct me if I'm not understanding him correctly. Cheers.
 
  • Like
Reactions: mohit9206