8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
[chart: Horizon Forbidden West benchmark]
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT when ray tracing:
[chart: Resident Evil Village 4K ray tracing benchmark]
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti:
[chart: Company of Heroes benchmark]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault, and objective evidence "doesn't count" because of reasons(tm). If you have others, please let me know and I'll add them to the list. Cheers!
 

fleshconsumed

Diamond Member
Nvidia was deliberate about it. The lack of memory on the 3080 is intended to drive buyers worried about future-proofing into the 3090 and its much higher profit margins.
Yep, very deliberate. nVidia buyers either have to buy an overpriced 3090 or upgrade to the 4xxx series as soon as it becomes available.
 

repoman0

Diamond Member
I would be pissed seeing something like that Far Cry 6 VRAM suggestion if I bought a 3080 for $700, let alone the $1800 most people pay for it these days.

Hard to be pissed at a $700-800 3080 when anyone who paid that has probably mined $3-4k worth of Ethereum since.

I personally didn't try the texture pack, but the game looks and runs really well in 4K HDR. Still, I only played for an hour before getting bored with it and going back to the Mass Effect remasters on a peasant console.
 

mohit9206

Golden Member
Nvidia uses memory compression technology, but I have no idea whether it helps cards use their frame buffer more efficiently. If that magic technology actually works, 8GB should be enough for the time being, up to 1440p.
 

Hitman928

Diamond Member
Nvidia uses memory compression technology, but I have no idea whether it helps cards use their frame buffer more efficiently. If that magic technology actually works, 8GB should be enough for the time being, up to 1440p.

Everyone uses memory compression now, some more effectively than others, but the purpose is really to help with bandwidth needs as far as I know. By and large the RAM space used is the same with or without.
 

Golgatha

Lifer
Everyone uses memory compression now, some more effectively than others, but the purpose is really to help with bandwidth needs as far as I know. By and large the RAM space used is the same with or without.

As I understand it, the encoding for data transfer is more efficient, so transfers run faster, but as you said it has very little (if any) effect on the total VRAM needed for a frame buffer.
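
To put rough numbers on why compression helps bandwidth but not capacity, here's a minimal back-of-the-envelope sketch in Python. The formats and the 2:1 compression ratio are illustrative assumptions, not measured values:

```python
# Sketch: lossless framebuffer compression saves bandwidth, not capacity.
# All figures below are illustrative assumptions.

GIB = 1024 ** 3

def target_bytes(width, height, bytes_per_pixel):
    # VRAM reserved for one render target. Lossless compression can't
    # guarantee any ratio, so the full uncompressed size is allocated.
    return width * height * bytes_per_pixel

# Hypothetical 4K targets: RGBA16F color (8 B/px) + 32-bit depth (4 B/px).
color = target_bytes(3840, 2160, 8)
depth = target_bytes(3840, 2160, 4)
print(f"allocated: {(color + depth) / GIB:.3f} GiB")  # same with or without compression

# Bandwidth is where compression pays off: assume an average 2:1 delta
# color compression ratio and 60 full read+write passes per second.
uncompressed_traffic = 2 * 60 * color
compressed_traffic = uncompressed_traffic / 2
print(f"traffic: {uncompressed_traffic / GIB:.1f} GiB/s -> {compressed_traffic / GIB:.1f} GiB/s")
```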
 

BFG10K

Lifer
A more recent example where 8GB cards fall to the bottom of the trash heap:
[chart: Far Cry 6 ray tracing benchmark, 3840x2160]
That's right folks, even Arc is faster than the 3070 Ti with ray tracing in this test, even more embarrassing given that ReBAR is off. 8GB is enough, yo. GG.

 

Stuka87

Diamond Member
A more recent example where 8GB cards fall to the bottom of the trash heap:

That's right folks, even Arc is faster than the 3070 Ti with ray tracing in this test, even more embarrassing given that ReBAR is off. 8GB is enough, yo. GG.


Well, I think it still depends on the use case. It's certainly not enough for 4K, but the vast majority of users have 1080p displays, and 1440p has become more popular. 8GB still does fine for any game at 1080p; 1440p might still run short with lots of RT and extras enabled.

However, this does show us that nVidia 3K series cards are going to age REALLY poorly.
 

blckgrffn

Diamond Member
Well, I think it still depends on the use case. It's certainly not enough for 4K, but the vast majority of users have 1080p displays, and 1440p has become more popular. 8GB still does fine for any game at 1080p; 1440p might still run short with lots of RT and extras enabled.

However, this does show us that nVidia 3K series cards are going to age REALLY poorly.

NVidia is counting on it. It's been my theory all along that memory increases are how nVidia planned to move people off Ampere as an incremental step, along with those still holding out on Turing and Pascal.
 

Zeze

Lifer
Well, I think it still depends on the use case. It's certainly not enough for 4K, but the vast majority of users have 1080p displays, and 1440p has become more popular. 8GB still does fine for any game at 1080p; 1440p might still run short with lots of RT and extras enabled.

However, this does show us that nVidia 3K series cards are going to age REALLY poorly.

I'm a casual observer of the GPU market. I'm still on 1080p, and I'd wager lots of gamers still are. The latest AAA titles still struggle at 1080p at max settings.

OP was very educational. It looks like PC 4K 144Hz gaming is still a good decade away (if not more). GPU performance increases have been pretty stagnant, and I feel like we've all been on 1920x1080 monitors for over a decade. My next GPU upgrade, in about 3-4 years, should have no less than 16GB of VRAM.

I just upgraded from a 1060 6GB to a used 2070. lmao, I'm loving it for today's games.
 

Leeea

Diamond Member
I'm a casual observer of the GPU market. I'm still on 1080p, and I'd wager lots of gamers still are. The latest AAA titles still struggle at 1080p at max settings.
1. max settings = pointless, visual is the same as high settings
2. ray tracing is stupid
3. Current high end cards ( rx6800xt, rtx3080 ) can easily do 1440p on high settings with good frame rates
4. monitors are so much better now that if you are staying at 1080p, you probably should upgrade to a decent monitor anyway
 

Zeze

Lifer
1. max settings = pointless, visual is the same as high settings
2. ray tracing is stupid
3. Current high end cards ( rx6800xt, rtx3080 ) can easily do 1440p on high settings with good frame rates
4. monitors are so much better now that if you are staying at 1080p, you probably should upgrade to a decent monitor anyway
Stop it. I can no longer upgrade.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
1. max settings = pointless, visual is the same as high settings
2. ray tracing is stupid
3. Current high end cards ( rx6800xt, rtx3080 ) can easily do 1440p on high settings with good frame rates
4. monitors are so much better now that if you are staying at 1080p, you probably should upgrade to a decent monitor anyway
I am in complete agreement, aside from RT reflections in Cyberpunk'd and Spiderman; those are worthwhile.

A good monitor will last through several builds usually.

A look at the most played games, most used GPUs, and most used resolutions says where most gamers' priorities are.

OP is right though. What Nvidia charges for 8GB cards is highway robbery. Talk about planned obsolescence.
 

Zeze

Lifer
OP is right though. What Nvidia charges for 8GB cards is highway robbery. Talk about planned obsolescence.

Jesus, I didn't think about that. Why are the 3xxx and even 4xxx series so low on memory... why are they flirting with 8GB-12GB across entire generations?

Is memory actually that expensive to make and slap on the cards? I thought most of the R&D went into architecture, small die size, and speed.
 

Stuka87

Diamond Member
Jesus, I didn't think about that. Why are the 3xxx and even 4xxx series so low on memory... why are they flirting with 8GB-12GB across entire generations?

Is memory actually that expensive to make and slap on the cards? I thought most of the R&D went into architecture, small die size, and speed.

Part of it was that nVidia chose to go with GDDR6X, which did not have larger-capacity chips at the time the 3K series launched. That's why the 3090 had memory on the backside of the board, while the later 3090 Ti did not. And then there's the fact that they DID make 20GB 3080s, but every single one went to miners; none were available at retail.

AMD, on the other hand, stuck with standard GDDR6 and had no issues putting 16GB on any card fast enough to use it.
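
The capacity arithmetic behind that is straightforward: each GDDR6/GDDR6X chip has a 32-bit interface, so the bus width fixes the chip count, and chip density (doubled in clamshell mode) fixes the total. A quick sketch of that math in Python, using the launch-era densities mentioned above:

```python
# VRAM = (bus width / 32 bits per chip) x GB per chip, x2 in clamshell mode.
# At Ampere's launch GDDR6X topped out at 1 GB per chip; GDDR6 had 2 GB chips.

def vram_gb(bus_width_bits, gb_per_chip, clamshell=False):
    chips = bus_width_bits // 32          # one chip per 32-bit channel
    if clamshell:
        chips *= 2                        # two chips share a channel (front + back of PCB)
    return chips * gb_per_chip

print(vram_gb(320, 1))                    # RTX 3080: 10 GB
print(vram_gb(384, 1, clamshell=True))    # RTX 3090: 24 GB, hence the back-side memory
print(vram_gb(384, 2))                    # RTX 3090 Ti: 24 GB once 2 GB GDDR6X chips existed
print(vram_gb(256, 2))                    # RX 6800 XT: 16 GB of plain GDDR6
```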
 

kondziowy

Senior member
People were fighting to justify the GTX 960 2GB, GTX 1060 3GB, and RTX 2060 6GB, and they will defend the 3070 Ti 8GB too. When you see it tanking to 5fps on a graph, you know it was going on for years, just in a smaller, less spectacular fashion: some stutter here and there and so on. I've seen people report fewer stutters after switching from an RTX 2060. I myself had to drop texture settings on an 8GB card.
 

yottabit

Golden Member
A more recent example where 8GB cards fall to the bottom of the trash heap:
[chart: Far Cry 6 ray tracing benchmark, 3840x2160]
That's right folks, even Arc is faster than the 3070 Ti with ray tracing in this test, even more embarrassing given that ReBAR is off. 8GB is enough, yo. GG.


That's interesting. I experienced the "6 FPS" firsthand the first time I tried Cyberpunk 4K + RT on my 3070. I wasn't expecting playable performance, so I just figured the card didn't have enough GPU horsepower, not that there was a VRAM bottleneck. Thanks for sharing :)

I’m still quite happy with the card overall. I feel like high refresh rate monitors must be a creation of some secret GPU lobbyist group. Right around the time the average 1060 could run any console port maxed out, they popped up for people to find new things to complain about :D

I honestly don't really have any sympathy for people who have 4K / VR, want to run everything maxed, and then complain about the GPU landscape. I don't think NV ever marketed the 3070 as a "4K" card.

Back with my 1070, I was happy to run most games at medium settings in 4K, or maxed out with the "render resolution" reduced to get 1080p or 1440p 3D visuals and crisp 4K UI elements.
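
For the curious, the render-resolution trick is just per-axis scaling: the 3D scene renders at the output resolution times the scale factor while the UI composites at native. A tiny sketch (Python; the 0.5 and 2/3 factors are common presets, not anything official from a specific game):

```python
# Render scale: 3D renders at (output x scale) per axis, UI stays native.

def internal_res(out_w, out_h, scale):
    return round(out_w * scale), round(out_h * scale)

print(internal_res(3840, 2160, 0.5))    # (1920, 1080): 1080p visuals under a crisp 4K UI
print(internal_res(3840, 2160, 2 / 3))  # (2560, 1440): what DLSS "Quality" renders at 4K output
```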

DLSS 2 was a godsend for the 3070. I even use it in VR with No Man's Sky.

All that being said, my next card will be at least 16GB.

 

GodisanAtheist

Diamond Member
Just solidifies my decision to only pursue cards with 10GB of VRAM and up for my latest upgrade.

8GB feels like entry level nowadays; we've been on it so long, and the modern console generation comes with more built in.
 

Golgatha

Lifer
Just solidifies my decision to only pursue cards with 10GB of VRAM and up for my latest upgrade.

8GB feels like entry level nowadays; we've been on it so long, and the modern console generation comes with more built in.

My thought on it is that at 1440p, I want 10GB or more. My reasoning: the current generation of gaming consoles has about 10GB available, so a lot of games going forward will be developed with that in mind, and most 4K console games are actually rendered at something close to 1440p and upscaled.

In my mind, 8GB is ok for all 1080p and most 1440p gaming, and 12GB or more is ok for 4K gaming.
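
One way to sanity-check that rule of thumb: render targets scale with pixel count while textures and geometry are mostly resolution-independent, so going from 1080p to 4K only adds a GB or so on top of a fixed asset budget, and it's the console-sized asset pool that dominates. A rough model in Python; the 8 GB asset pool and 150 B per pixel of render targets are made-up illustrative figures:

```python
# Crude VRAM budget model: fixed asset pool + render targets that scale
# with pixel count. Both constants are illustrative assumptions.

ASSETS_GB = 8.0           # textures, geometry, misc buffers (resolution-independent)
RT_BYTES_PER_PIXEL = 150  # all render targets combined (G-buffer, HDR, post, ...)

def budget_gb(w, h):
    return ASSETS_GB + w * h * RT_BYTES_PER_PIXEL / 1024**3

for name, (w, h) in [("1080p", (1920, 1080)), ("1440p", (2560, 1440)), ("4K", (3840, 2160))]:
    print(f"{name}: ~{budget_gb(w, h):.1f} GB")   # ~8.3, ~8.5, ~9.2
```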
 

Golgatha

Lifer
Just solidifies my decision to only pursue cards with 10GB of VRAM and up for my latest upgrade.

8GB feels like entry level nowadays; we've been on it so long, and the modern console generation comes with more built in.

Another comment: we've been on 8GB of VRAM since the GTX 1080, and that was released on 5/27/2016. That's way too long in my book.
 

adamge

Member
Mid-gen refreshes are coming sooner or later and will probably bump up VRAM use if you expect to keep a card for a while.

I don't follow your comment. The RTX 3000 series had a number of 'mid-gen refreshes' quite a long time ago, as did the Radeon 6000 series. I do not think there will be more mid-gen refreshes of the same generation.
 

blckgrffn

Diamond Member
I don't follow your comment. The RTX 3000 series had a number of 'mid-gen refreshes' quite a long time ago, as did the Radeon 6000 series. I do not think there will be more mid-gen refreshes of the same generation.

Console refreshes. Expecting the "Pro" SKUs to carry what, 24GB of RAM? I don't know how far out they are, given that PS5s are only just starting to become actually obtainable.

I agree, however, that you want at least as much memory as the latest-gen console uses for graphics, and this gen that's 10GB. It's likely a well-optimized target for many cross-platform games.