More examples of 8GB VRAM not enough


BFG10K

Lifer
Aug 14, 2000
People aren't really talking about this given non-existent availability, but I came across two new examples where 8GB absolutely cripples performance.

Videos are time-stamped to the right place:

[embedded videos]

No such problems with the 6700XT 12GB/VII 16GB used in the same tests.
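For anyone who wants to sanity-check this on their own system, a simple way to watch for the 8GB ceiling is to poll nvidia-smi while the game runs. A minimal sketch, assuming an NVIDIA card with nvidia-smi on the PATH:

```python
# Poll VRAM allocation once per second via nvidia-smi (NVIDIA only).
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used, total = (int(x) for x in out.split(", "))
    print(f"VRAM: {used} / {total} MiB ({100 * used / total:.0f}%)")
    time.sleep(1)
```

Keep in mind this reports what the driver has allocated, not what the game actually touches each frame, so a pegged meter is a hint rather than proof; the stuttering in the videos above is the real tell.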
 

Mopetar

Diamond Member
Jan 31, 2011
Yeah, that would be useful to compare with. Some VR games can go way beyond 12GB though, even over 20GB occasionally. I think that earlier comment about 12GB not being good enough for 4K RT is even more true for VR. It's enough for most non-RT 4K games though.

I've got no experience with VR so I wouldn't know what the requirements generally are, but I'll take your word for it. I don't care about RT either, which I understand requires additional memory as well, so really I'm just referencing standard rasterized gaming.

I think 12 GB is the minimum you likely want long term for 4K, assuming you don't want to lower settings. 10 GB will probably be okay for most titles, but if you want to keep such a card for five years it will probably hit a wall by the end of that time. 8 GB will struggle with the first wave of games developed with the new consoles in mind.

Of course 4K gaming is still a small part of the market. If you're gaming at 1440p or 1080p, that much memory will probably be fine.
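For a rough sense of how resolution itself eats memory: the render targets an engine keeps alive scale linearly with pixel count, though it's usually the texture pool on top of that which actually blows past 8GB. A back-of-envelope sketch, where the 20 bytes/pixel G-buffer figure is my assumption for illustration:

```python
# Approximate VRAM used by resolution-dependent render targets.
# 20 bytes/pixel stands in for a few G-buffer layers + depth + HDR color;
# real engines vary, and this excludes textures, geometry, and driver overhead.
BYTES_PER_PIXEL = 20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    mib = w * h * BYTES_PER_PIXEL / 2**20
    print(f"{name}: ~{mib:.0f} MiB of render targets")
```

So 4K costs about 4x the 1080p figure (~160 MiB vs ~40 MiB here), which is real but modest on its own; it's ultra texture pools and streaming headroom at 4K settings that push a card past 8GB.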
 

CP5670

Diamond Member
Jun 24, 2004
VR has both a much higher resolution and wider FOV than regular gaming, and I believe both eyes' frames are rendered separately. Many games run much worse in VR and need settings turned down even when the same game is great in regular 4K. Once you see it in VR though, you don't want to play it in regular flatscreen. :)

For RT 12GB should actually be enough, since none of the current cards do that well at native 4K anyway and you have to use DLSS.
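To put rough numbers on the DLSS point, the commonly cited per-axis scale factors mean the card shades only a fraction of the native 4K pixels. A quick sketch (the factors are approximate):

```python
# Internal render resolution for DLSS modes at 4K output.
# Per-axis scale factors are the commonly cited ones; treat as approximate.
out_w, out_h = 3840, 2160
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

native = out_w * out_h
for name, s in modes.items():
    w, h = round(out_w * s), round(out_h * s)
    print(f"{name}: {w}x{h} ({w * h / native:.0%} of native 4K)")
```

Quality mode at 4K renders 2560x1440 internally, about 44% of the native pixel count, which is what makes 4K RT playable on these cards at all.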
 

Magic Carpet

Diamond Member
Oct 2, 2011
What is really unknown at this point is how next-gen games powered by UE5 are going to behave. I know that the engine itself was heavily optimized for the PlayStation hardware, though. We know the specs, but if PC gaming doesn't receive tailored optimizations like the consoles do, we'll probably need to double the specs to be on equal ground with them. Well, time will tell. Maybe we'll see something by the end of this year?
 

moonbogg

Lifer
Jan 8, 2011
VR has both a much higher resolution and wider FOV than regular gaming, and I believe both eyes' frames are rendered separately. Many games run much worse in VR and need settings turned down even when the same game is great in regular 4K. Once you see it in VR though, you don't want to play it in regular flatscreen. :)

For RT 12GB should actually be enough, since none of the current cards do that well at native 4K anyway and you have to use DLSS.

I've wondered about this, since performance can be terrible and doesn't seem to match what you'd expect from simply looking at the total resolution of the headset. Rendering the game twice could explain a lot.
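It likely does. On top of the two renders, VR runtimes target a render resolution noticeably above the panel resolution to compensate for lens distortion, and the refresh floor is 90Hz rather than 60. A rough throughput sketch, using Valve Index panel specs and a ~1.4x per-axis distortion headroom as my assumptions:

```python
# Rough pixel-throughput comparison: flat 4K/60 vs a typical PC VR target.
flat = 3840 * 2160 * 60                 # native 4K at 60 Hz

eye_w, eye_h = 1440, 1600               # Valve Index panels, per eye
scale = 1.4                             # per-axis headroom for lens distortion
per_eye = round(eye_w * scale) * round(eye_h * scale)
vr = per_eye * 2 * 90                   # both eyes, 90 Hz refresh

print(f"flat 4K/60: {flat / 1e9:.2f} Gpix/s")
print(f"VR target:  {vr / 1e9:.2f} Gpix/s ({vr / flat:.1f}x flat 4K)")
```

That comes out around 1.6x the pixel throughput of flat 4K/60, and it still undercounts the cost: draw calls are typically submitted twice (unless the engine uses single-pass stereo), and missing 90Hz gets you punished with reprojection.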
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
Lulz. There are always sticks in the mud who go through the Kübler-Ross stages whenever hardware requirements start to change. It never happens all at once, so why they get salty about it, deny it, and debate it seems silly to me. Just accept it and be prepared to act when you cannot avoid it any longer. ;)
 
Feb 4, 2009
Lulz. There are always sticks in the mud who go through the Kübler-Ross stages whenever hardware requirements start to change. It never happens all at once, so why they get salty about it, deny it, and debate it seems silly to me. Just accept it and be prepared to act when you cannot avoid it any longer. ;)

Yeah, I am so glad I have learned to judge whether a game looks good by actually looking at it, not by hyper-evaluating benchmarks or FPS or whatever.
So much easier and so much less of a pain in the ass.
If I am satisfied with how it looks and runs, that's good enough for me; if I am not satisfied, I either don't play it or buy a new card.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
Yeah, I am so glad I have learned to judge whether a game looks good by actually looking at it, not by hyper-evaluating benchmarks or FPS or whatever.
So much easier and so much less of a pain in the ass.
If I am satisfied with how it looks and runs, that's good enough for me; if I am not satisfied, I either don't play it or buy a new card.
To preface my remarks: I don't want to derail the thread, so I will make a very on-topic point too.

First, in response to you: I was on consoles while my son was growing up. Until he was about 12, when he and his friends discovered how much better Minecraft was on PC. The point being? You can get used to almost anything. And when you don't know what you are missing, because it has been so long since you had a compare and contrast, it is all good.

On topic: One of the YouTubers I follow, who actually plays a bunch of games on every card he tests, has been beating the drum about VRAM limits lately too. He kept saying "Why, Nvidia?!" about the GTX 1650 Super, because it has the horsepower to use more RAM (looking at you, 1060). But I guess it made more sense to give it 4GB, because of price tiering and planned obsolescence?
 

CP5670

Diamond Member
Jun 24, 2004
Yes, there is often no perceptible difference between medium/high and ultra settings, but the latter tanks the framerate. It's useful to keep in mind for VR, where ultra is usually too demanding even for top cards.
 

Dribble

Platinum Member
Aug 9, 2005
I came to the conclusion years ago that people who really enjoy video games are not the same people who obsess over GPU settings. The former just want a fun gaming experience, which for most means medium settings at 1080p and 60 fps or so; the latter don't really play games, they just have fun discussing the merits of 8GB vs 16GB of GPU memory for games they barely ever play or perhaps don't even own.
 
Feb 4, 2009
Yes, there is often no perceptible difference between medium/high and ultra settings, but the latter tanks the framerate. It's useful to keep in mind for VR where ultra is usually too demanding even for top cards.

Screenshot-wise, I want to say I can tell a difference: maybe not between high and ultra, but usually, yes, I can see one.
While playing the game, with 30, 40 or 60 frames passing per SECOND, I do not see a difference.
 

Mopetar

Diamond Member
Jan 31, 2011
I came to the conclusion years ago that people who really enjoy video games are not the same people who obsess over GPU settings.

While that's a very sensible statement, the GPUs that run games on ultra today are the ones that will eventually be able to run games on medium in five or so years. For some games the fun is in the gameplay and the graphics really don't matter; something like The Binding of Isaac is going to run fine even on low-end cards from previous generations. But for other games a lot of the fun comes from the atmosphere and the visual presentation, which try to create an experience that would be compromised if the card couldn't manage that level of visuals, or if the frame rate suffered and created noticeable hitches that drag you out of the immersion. Neither of these approaches is any more "right" than the other, and they obviously have different requirements.
 

CP5670

Diamond Member
Jun 24, 2004
Screenshot-wise, I want to say I can tell a difference: maybe not between high and ultra, but usually, yes, I can see one.
While playing the game, with 30, 40 or 60 frames passing per SECOND, I do not see a difference.

Sometimes high and ultra are different in a screenshot, but it's hard to say that one is actually better than the other (or even which one is which). I like turning textures/detail settings up to ultra but often turn down things like RT, HBAO or shadows. There is some slight improvement, but it's not worth dropping the framerate by 50%.
 

BFG10K

Lifer
Aug 14, 2000
I came to the conclusion years ago that people who really enjoy video games are not the same people who obsess over GPU settings. The former just want a fun gaming experience, which for most means medium settings at 1080p and 60 fps or so; the latter don't really play games, they just have fun discussing the merits of 8GB vs 16GB of GPU memory for games they barely ever play or perhaps don't even own.
Your conclusion is wrong, I'm afraid: https://forums.anandtech.com/threads/games-you-finished-in-2020.2589014/
 

Jaskalas

Lifer
Jun 23, 2004
People aren't really talking about this given non-existent availability, but I came across two new examples where 8GB absolutely cripples performance.

So, sitting here at 1080p, I'll just snap up a fast 8GB card and call it a day, eh?
If it takes 4K to break 8GB, I am feeling pretty comfy.
 

dr1337

Senior member
May 25, 2020
And this is exactly why a new monitor is off the table for me
As someone who dropped money on a monitor instead of a GPU due to the shortages, I have no regrets and wish I had moved to 4K sooner. Granted, the few pancake games I play are mostly sims and Civ, and my 580 still does well enough to hang with only minimal settings turned down. And boy does surfing in CS:GO feel nice at 4K 144Hz.
 

Furious_Styles

Senior member
Jan 17, 2019
As someone who dropped money on a monitor instead of a GPU due to the shortages, I have no regrets and wish I had moved to 4K sooner. Granted, the few pancake games I play are mostly sims and Civ, and my 580 still does well enough to hang with only minimal settings turned down. And boy does surfing in CS:GO feel nice at 4K 144Hz.

The 580 just won't cut it at 4K/144Hz for the vast majority of people; 1440p is much more sensible for that. And I do understand that in your case it works, because nothing you play is graphically demanding.
 

Mopetar

Diamond Member
Jan 31, 2011
As someone who dropped money on a monitor instead of a GPU due to the shortages, I have no regrets and wish I had moved to 4K sooner. Granted, the few pancake games I play are mostly sims and Civ, and my 580 still does well enough to hang with only minimal settings turned down. And boy does surfing in CS:GO feel nice at 4K 144Hz.

It's probably hard to answer without a direct comparison, but do you think it's a better experience at 4K with lower settings, or would 1440p at higher settings be better? I only ask because I'm in the same boat, where component availability and cost are making me look at other avenues of upgrade in the meanwhile, and a new monitor is on the list regardless.
 

dr1337

Senior member
May 25, 2020
It's probably hard to answer without a direct comparison, but do you think it's a better experience at 4K with lower settings, or would 1440p at higher settings be better? I only ask because I'm in the same boat, where component availability and cost are making me look at other avenues of upgrade in the meanwhile, and a new monitor is on the list regardless.
Well, my main panel was a 27" 1440p IPS from back in the early days of cheap Korean monitors. I jumped to it from 23" 1080p, and while it was a nice upgrade it never really had a wow factor. Now, granted, I went from 27" 1440p to 43" 4K, so lol, the form factor is quite different and the PPI is about the same. Even so, I immediately noticed an increase in visual fidelity; with the big 4K monitor it's really easy for me to see anything that isn't being rendered at native.

The first step with Civ and other games was installing 4K texture packs, and boy did it make a difference. At 1440p I never felt the need to do such things, and felt like the diminishing returns on textures were essentially the same as at 1080p.

In terms of games with lots of eye candy, it can be a tradeoff depending on the game. With MGS:TPP the render distance causes the biggest frame drops on my 580, and turning that down isn't really an option in that game, so I definitely lost visual fidelity turning down settings compared to playing at 1440p. But rendering at native still looks really nice, and 4K really lets texture detail shine. Other games, like R6S, show pretty much no visual difference that I can actively notice.

So really it boils down to how you want to do things: sticking with 1440p and going high refresh can be cheaper than jumping to 4K. But if you're someone like me who buys a monitor maybe once a decade, then future-proofing is a must. I know that whenever I finally get a new graphics card, I'll be able to turn all of my settings up and get good frame rates, while always looking better than 1440p. Till that day comes I'm stuck making trade-offs in demanding games, and I simply have better visuals literally everywhere else; for me the choice was obvious.
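For what it's worth, the "PPI is about the same" point above checks out. A quick calculation with the stated panel sizes:

```python
# Pixel density of the two monitors mentioned above.
import math

def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109
print(f'43" 4K:    {ppi(3840, 2160, 43):.0f} PPI')  # ~102
```

Roughly 109 vs 102 PPI, so the density really is close; the upgrade is in total resolution and screen real estate, not sharpness per inch.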