8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
[Horizon Forbidden West benchmark chart]
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT when ray tracing:
[Resident Evil Village benchmark chart]
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti:
[Company of Heroes benchmark chart]

10GB / 12GB

Reasons why still shipping 8GB since 2014 isn't NV's fault.
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
Strix Point has up to 16 CUs, and while there's no guarantee that AMD increases to 24 CUs with their next generation, if they don't, the one after definitely will.

I don't think it will be more than 3 years before we see some mainstream part that hits those targets.
If AMD doesn't push the limits, Intel will, since it's no longer a one-horse race. Hopefully the competition gets us faster AND cheaper. System RAM bottlenecking performance is still preferable to not having enough fast GDDR on a dGPU.

One example: I messed around with the settings in Shadow of War with the hi-res texture pack on a 5700G. Even back then it required an 8GB card, when most people were on 2-6GB cards. No matter what I did, the game looked like PS2 and PS3 assets mixed together, and pop-in was so bad it wasn't worth playing until I used max textures. That setting alone made a night-and-day difference.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
Strix Halo should also have a cut-down version; I am just not so sure it will be 24 CU. 32 CU looks more likely to me.

Not sure how it will compare against dGPUs; I think somewhere between a 4060 and a 4070 mobile.
The problem is that the next generation will likely be out by then, but if those cards are still limited to 8GB of VRAM, this could be an interesting option depending on price.
 

marees

Golden Member
Apr 28, 2024
Strix Halo should also have a cut-down version; I am just not so sure it will be 24 CU. 32 CU looks more likely to me.

Not sure how it will compare against dGPUs; I think somewhere between a 4060 and a 4070 mobile.
The problem is that the next generation will likely be out by then, but if those cards are still limited to 8GB of VRAM, this could be an interesting option depending on price.
I don't think any AMD APU (other than one ordered by a console maker such as Steam, PlayStation, Surface or Apple) will make a difference against a dGPU.

Simply because, from a differentiation point of view, AMD would cut down GPU CUs if they scaled down CPU cores.

As Cary Golomb said, AMD needs to go wide and low with both memory and GPU (while simultaneously reducing the power/area budget for the CPU), like Apple.
Microsoft could make it happen if they wanted, but it seems they have gone the Xbox OS route and outsourced hardware to Asus, Lenovo, etc., which will be generic Kraken Point-like APUs.
 

marees

Strix Halo could be the Kaby Lake-G of its time, but strictly in niche products. It will never see wide adoption.
 

Mopetar

Diamond Member
Jan 31, 2011
Another game that exceeds 8GB even with the short test runs W1z uses.

https://www.techpowerup.com/review/the-first-descendant-fps-performance-benchmark/5.html

I would add the caveat that that's only when you start turning on RT and other bells and whistles. Otherwise it can handle 1080p at Ultra settings (at least within the test parameters, which may not show the full picture).

The 4060 is already below 60 FPS even before turning on any of the other settings like RT, so I'm not sure how many people would want to use it even assuming it wasn't going to require more than 8 GB.

Meanwhile, the 8 GB and 16 GB models of the 4060 Ti are within margin of error of each other in all tests, which could suggest either that the runs aren't long enough to expose any VRAM issues, or that the game drops texture quality on the fly when it can't load textures in time. Further analysis would be necessary to determine which case we're looking at.
 

DAPUNISHER

Meanwhile, the 8 GB and 16 GB models of the 4060 Ti are within margin of error of each other in all tests, which could suggest either that the runs aren't long enough to expose any VRAM issues, or that the game drops texture quality on the fly when it can't load textures in time. Further analysis would be necessary to determine which case we're looking at.
Yup. But it's all academic, as none of the cards with the bigger bars and better fps have FG. It uses over 8GB at 900p, for flying spaghetti monster's sake. And it will only get worse when you actually play the game instead of testing it for a hot second.

[The First Descendant VRAM usage chart]
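For anyone who'd rather log this on their own machine than trust a reviewer's hot second, here's a minimal sketch (assuming an NVIDIA card with nvidia-smi on the PATH; the one-second polling interval and five-minute default are arbitrary) that records peak VRAM use while you play:

```python
# Log peak VRAM use while you play, by polling nvidia-smi once a second.
# Assumes an NVIDIA GPU with nvidia-smi on the PATH; AMD users could
# adapt the command to rocm-smi.
import subprocess
import time

def parse_mib(smi_output: str) -> int:
    """Parse `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits`
    output (one number per line, per GPU, in MiB); returns the first GPU's value."""
    return int(smi_output.strip().splitlines()[0])

def sample_vram_mib() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_mib(out)

def log_peak(duration_s: float = 300.0, interval_s: float = 1.0) -> int:
    """Poll for duration_s seconds and return the peak VRAM use in MiB."""
    deadline = time.monotonic() + duration_s
    peak = 0
    while time.monotonic() < deadline:
        peak = max(peak, sample_vram_mib())
        time.sleep(interval_s)
    return peak
```

Run `log_peak()` from a second terminal mid-session; a peak sitting at your card's capacity usually means the game is already spilling to system RAM or quietly downgrading assets.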
 

DAPUNISHER

Best Buy Canada had a Zotac 4060 for $310 Canadian pesos yesterday, which is US$228. I think in that price range it's perfectly acceptable to pick an 8GB card.
US$390 for an 8GB 4060 Ti is just out to lunch though.
LOLZ I know why Zotac is dropping prices, they just had a major customer privacy leak. GN should have a vid on it soon. They evidently fixed it, but it is still in the Google cache as of now. Don't ask me for links or anything, you have an internet connection, use it.
 

MrTeal

Diamond Member
Dec 7, 2003
3,916
2,700
136
LOLZ I know why Zotac is dropping prices, they just had a major customer privacy leak. GN should have a vid on it soon. They evidently fixed it, but it is still in the Google cache as of now. Don't ask me for links or anything, you have an internet connection, use it.
Just saw that; was it confirmed to be Zotac? I don't think it would really affect most end users unless they actually created accounts to register their products, but it's still an almost laughable security breach.
 

DAPUNISHER

I am going to follow up on Zotac, even though it is off topic. Why? Because it appears we have finally been ghosted. Consequently there is no more dissension, and all we can do, content-wise, is wait for each new game that shows $300-$400 8GB cards aging like warm milk.

 

DAPUNISHER

Hah, Zotac fixed it within 4 hours of being notified of it.
Not exactly. They fixed it within 4 hours of business clients complaining, some of which came at Steve's prompting after he made them aware of the issue. Zotac helped the person who gave Steve the heads-up, but failed to solve the larger problem. When I posted about it, significantly more than 4 hours had already passed since the initial report.
 

psolord

Platinum Member
Sep 16, 2009
This has been pointed out to you so many times, but for whatever reason it just doesn't click for you.

If you use some e-sports title or another game that doesn't even use 2 GB of VRAM, then it's obvious that 8 GB matters no more than 6 GB, 12 GB, 4 GB, or 16 GB. It proves nothing.
Once more, go to gamegpu, pick ALL the 1080p results, and see how badly 8GB cards are doing. These are NOT esports games. See also The First Descendant, discussed above, which once again people try to pass off as problematic even though it shows no indication of problems for 8GB cards within their performance targets. The 3060 12GB is in the gutter once again, for example.

For your premise to be correct it would have to be impossible to show even a single instance in which 8 GB is a limiting factor. The thread contains dozens of examples of this very thing. You're arguing that something is not possible after it's been shown to have occurred.
No, far from it. My premise is all about the range within which a video card operates. What YOU (not you) are doing is looking at the extremes. I have said it before: *SOME* problems will arise for 8GB cards, but they are very, very few, and most of the time those settings also exceed the GPU processing power of the card, so what's the point?

It's hardly surprising that everyone thinks you're a troll, because that's what it looks like. The only other explanation is that you'd need someone to help you get dressed in the morning lest you strangle yourself putting on a shirt, and nothing outside your dogheaded defense of a position everyone else can clearly see is wrong suggests you could possibly be that stupid. It's just too improbable for someone to have a fully functioning brain that's completely broken in that singular regard. The odds are too long for it to be real.
I have explained my position VERY clearly. This is far from trolling. It's not my fault you chose to ignore it.

Please refrain from personal attacks.

You also have to realize that even if you genuinely do believe what you're saying right now, it will not hold true at some point in the future. If 8GB were evergreen, why wasn't 4GB, or 2GB, or 512MB? So, supposing you don't believe 8 GB is going to be good for all time moving forward, what evidence would allow you to determine that 8 GB is no longer enough?
Yes, increased video RAM demands go with the territory, and that territory is better visual quality. What you are leaving out, however, is the threefold demand in GPU power. All current 8GB cards are mostly fine, WITHIN their GPU power budgets. Seriously, have you seen what's going on with next-gen engines, or are you looking at the ceiling like the rest of the chaps here?

Really, that's the only thing you need to state for this thread to move forward instead of staying stuck in the hellish version of Groundhog Day it's been in for something like a year now. If you can't, or don't believe you can, state any such criteria because none exists, then I think it's fair to say you believe 8 GB ought to be enough for any game for all time moving forward. At that point it's just an article of faith for you, and there's nothing anyone here could possibly say to convince you otherwise, any more than you could budge a devout man's belief in god.
It is not I who has been stuck on PS games, TLOU and whatever, with their yesteryear glorified PS4 engines. I am talking about true future engines here. I am way past Groundhog Day, friend.
 

psolord

Well, he convinced me! 8 GB cards? Nah, I am now running my old GTX 1060 3GB card for 4K gaming at the "correct settings".

Never mind that the "correct settings" are 1024x768 upscaled with everything set to low, but hey, getting 6 FPS is obviously so worth it.
Please don't take my words out of context, sir, and don't push them to extremes. You are not giving an example for a serious conversation.
 

Iron Woode

Elite Member
Super Moderator
Oct 10, 1999
Please, don't take my words out of context sir and don't push them to the extremes. You are not giving an example for a serious conversation.
I am pretty sure it is in context. You just don't like being shown how ridiculous your posts are.

There is a use case for 8GB cards, but modern games, with loads of features that people use, now require more VRAM than before. Playing a new game at reduced settings and/or with textures popping in is not the best experience. Playing older games at higher settings is fine on 8GB cards. You seem to be obsessed with VRAM alone and not architecture. A 4060 Ti 8GB is not comparable to a 7600 XT 16GB or a 3060 12GB. Some 8GB cards perform well, like the 3070, but eventually people want to play a game with all the image quality turned on, and many 8GB cards have issues. If someone wants to experience RT and high-fidelity graphics, an 8GB card is useless.

The running joke here is "correct settings". Playing AAA games at low settings at 1080p is not what most gamers want. Competitive shooters, where FPS matters most, are fine at low settings. Most others want to experience a beautiful game the way it was meant to be played, and they need the GPU horsepower to do it. That includes plenty of VRAM. They also deserve better pricing on new cards. Expensive 8GB cards are insulting these days.

So, to summarize:

8GB cards are good for older games and reduced settings in newer titles
12GB and higher-VRAM cards with new architectures are good for new games with all the features turned on
if people are stuck on 8GB cards, then they need to look to the used market for something better suited to what they expect out of a game: not just more VRAM but better architecture as well.
 

Mopetar

Once more, go to gamegpu, pick ALL 1080p results and see how bad 8GB cards are doing.

With many titles simply not loading textures, or downgrading their quality, when they hit their VRAM limits, the bar charts aren't particularly useful. It's likely some cards would post the exact same frame rate with 6 GB or even 4 GB of VRAM because of that. Once the bars aren't comparing the same output anymore, they become useless.

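Some back-of-envelope math shows why games hit those limits in the first place. This is a toy sketch, not a measurement of any real game: it assumes BC7 compression at 1 byte per texel and a full mip chain costing roughly 4/3 of the base level.

```python
# Back-of-envelope VRAM math for compressed textures.
# Assumptions (for illustration only): BC7 compression at 1 byte per
# texel, and a full mip chain adding ~1/3 on top of the base level.

def texture_bytes(width: int, height: int, bytes_per_texel: float = 1.0,
                  mips: bool = True) -> int:
    base = int(width * height * bytes_per_texel)
    return base * 4 // 3 if mips else base

MIB = 1024 ** 2
GIB = 1024 ** 3

one_4k = texture_bytes(4096, 4096)   # ~21.3 MiB with mips
budget = 8 * GIB                     # an "8 GB" card, ignoring geometry,
                                     # render targets, and any RT BVH
print(f"one 4K BC7 texture: {one_4k / MIB:.1f} MiB")
print(f"unique 4K textures fitting in 8 GiB: {budget // one_4k}")
```

Under those assumptions, fewer than ~400 unique 4K textures exhaust the whole 8 GiB before geometry, render targets, or a ray-tracing BVH claim a single byte, which is exactly when engines start swapping or silently dropping mip levels.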
Also, you still can't seem to wrap your head around the fact that the claim "8 GB is no longer enough for all games" is not the same as saying "No game can possibly function without 8 GB of VRAM", and that showing a game which runs on 8 GB does not disprove the first statement.

All current 8GB cards, are mostly fine, WITHING their gpu power budgets.

If 8 GB were enough, wouldn't they be completely fine? "GPU power budget" is a meaningless and subjective term you've invented to rationalize why the 8 GB card is failing compared to a less powerful card with more VRAM.

Can you provide any objective measure by which this power budget can be determined ahead of time, or is it only something you can tell us after the fact? When your argument is that all of the games that fall off in performance at 8 GB only did so because they're out of their power budget, it ceases to be useful as an explanation. You may as well replace it with "invisible gremlins", which might at least explain why no one else seems able to see a problem that supposedly can't be too little VRAM.
 

jpiniero

Lifer
Oct 1, 2010
What's funny is that the 4060 Ti 16 GB doesn't seem to be selling all that well versus the 8 GB model. To me that suggests NV won't do it again and that the GB206 models will all have 8 GB unless/until 3 GB chips arrive.

Obvs the higher price is a part of it.
 

psolord

A 4060 Ti 8GB is not comparable to a 7600 XT 16GB or a 3060 12GB.
Sure it's not. I would advise you to look again at The First Descendant, discussed above, one more UE5 game.


The 4060 Ti 8GB is beating the 7600 XT easily, by a very serious margin. It's above 60 fps while the 7600 XT is below. It is also destroying the 3060.

[The First Descendant 1080p benchmark chart]

If someone wants to experience RT and high fidelity graphics an 8GB card is useless.
An 8GB card is mostly useless for RT due to GPU processing power first and foremost. Even so, again in The First Descendant:


[The First Descendant 1080p ray-tracing benchmark chart]

The 4060 Ti 8GB is as fast as the 7900 GRE.
It is beating the 6900 XT, 7800 XT, 6800 XT, 7700 XT and 7600 XT, and destroying the 3060 as usual.
Playing AAA games at low settings @ 1080P is not what most gamers want.
Never said anything about LOW settings, sir. You are again taking my words out of context and pushing them to the extreme. I only said to use common sense and not shoot for over-9000 settings.

Most others want to experience a beautiful game the way it was meant to be played and they need the GPU horsepower to do it. That includes plenty of vram.
Good luck trying to play The First Descendant at 36 fps 1080p RT on a 3060, or at less than 30 fps on a 7600 XT, just because they have 12GB/16GB of "horsepower".

So, to summarize:

8GB cards are good for older games and reduced settings in newer titles
12GB and higher-VRAM cards with new architectures are good for new games with all the features turned on
So to summarize:

Clearly people here only see what they WANT to see. They also have no clue how this stuff works. The future is here. VRAM ain't gonna save you; you need GPU power first and foremost.
 

psolord

With many titles simply not loading textures, or downgrading their quality, when they hit their VRAM limits, the bar charts aren't particularly useful. It's likely some cards would post the exact same frame rate with 6 GB or even 4 GB of VRAM because of that. Once the bars aren't comparing the same output anymore, they become useless.
I am using my 3060 Ti more than my 4070 Ti, and it has zero problems performing within its intended parameters. Not a single texture has gone missing, aside from in Forspoken, which is now fixed. Why? Because I'm not an idiot who slides every setting over 9000.

Also, you still can't seem to wrap your head around the fact that the claim "8 GB is no longer enough for all games" is not the same as saying "No game can possibly function without 8 GB of VRAM", and that showing a game which runs on 8 GB does not disprove the first statement.
8GB IS enough, though, for all games, if you use correct settings that absolutely will NOT make your game ugly. All 8GB cards have GPU power problems first and foremost. You will need to choose settings that drive VRAM requirements within their 8GB budget ANYWAY. The very few exceptions favor the 4060 Ti 16GB, which happens to be the strongest 1080p card and can actually punch above its weight in a handful of titles.

If 8 GB were enough, wouldn't they be completely fine? "GPU power budget" is a meaningless and subjective term you've invented to rationalize why the 8 GB card is failing compared to a less powerful card with more VRAM.
By GPU power I mean GPU processing power; I hope that's obvious.

It's not a subjective term. How many times have I said that I have THREE freagin' 8GB cards and that they are nothing alike? They all have the same size framebuffer, with very different GPU power and features. The GTX 1070 and RX 6600 can never come close to the 3060 Ti, whether you like it or not.

Also, you can see above how the 7600 XT and the 3060 "DO NOT FAIL" in The First Descendant, and again, pay a visit to gamegpu and see where the 3060 especially stands in 1080p gaming today.

Can you provide any objective measure by which this power budget can be determined ahead of time, or is it only something you can tell us after the fact? When your argument is that all of the games that fall off in performance at 8 GB only did so because they're out of their power budget, it ceases to be useful as an explanation. You may as well replace it with "invisible gremlins", which might at least explain why no one else seems able to see a problem that supposedly can't be too little VRAM.
Well, that's why benchmarks are published. Again, see how well the 3060 is performing and how its 12GB is helping it, if you don't understand what GPU power means.

The situation right now is this: all 8GB cards below the 4060 Ti (and sometimes even the 4060 Ti) have performance problems in many games, with VRAM being the least of those problems. I am talking about 1080p/ultra, no RT, not even as a joke. All these cards will have to use settings that lower VRAM requirements, which drives the VRAM budget below 8GB and within their GPU power budget anyway.

For me, there is absozerolutely no point in studying cases where the card cannot even run the game properly and then going SEE SEE SEE, there's a VRAM limit, it would run at 37 fps instead of 12 if it had moar VRAM. Yeah, I don't care about that.
 

psolord

What's funny is that the 4060 Ti 16 GB doesn't seem to be selling all that well versus the 8 GB model. To me that suggests NV won't do it again and that the GB206 models will all have 8 GB unless/until 3 GB chips arrive.

Obvs the higher price is a part of it.
Anyone with some sense would absolutely go for the 4070 instead. The 4070 is a very good 1080p card. See The First Descendant above: it can do 1080p RT quite convincingly, while neither 4060 Ti can.
 

poke01

Diamond Member
Mar 8, 2022
It would be interesting to see how these cards perform in Black Myth. It's a far more demanding game and looks great.
 

Iron Woode

So, to summarize:

Clearly people here only see what they WANT to see. They also have no clue how this stuff works. The future is here. VRAM ain't gonna save you; you need GPU power first and foremost.
I can't even....

VRAM isn't horsepower. Architecture determines GPU "horsepower", especially when combined with plenty of VRAM.

The only words I can use to describe your post are "cognitive dissonance".
 
Maybe this analogy will help the 8GB dude?

A person's only job is building large squares out of different colored blocks. He has a table to do it on, but the table is only so big. So he builds a square as large as the table allows, then carries it somewhere a bit far away with more storage space. One day he has an idea: what if he got a table twice as large? He does, and lo and behold, he can now build two squares and carry both back to storage in one go. His output doubled and his trips to storage halved!

Now I'm just itching to see how the dude invents a new table with "settings" :p
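The analogy even fits in a few lines of Python (a toy model, obviously, with table capacity standing in for VRAM and trips to storage standing in for transfers over the bus):

```python
# Toy model of the table analogy: a bigger table (VRAM) lets you build
# more squares per trip, so fewer trips to storage (bus transfers).
import math

def trips_to_storage(squares_needed: int, table_capacity: int) -> int:
    """Walks to storage needed to deliver all the squares, carrying
    everything that fits on the table each trip."""
    return math.ceil(squares_needed / table_capacity)

print(trips_to_storage(8, 1))  # small table
print(trips_to_storage(8, 2))  # doubled table: half the trips
```

Doubling the table halves the trips; no amount of walking faster (GPU clocks) changes how many squares fit on the table at once.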
 

mikeymikec

Lifer
May 19, 2011
Maybe this analogy will help the 8GB dude?


No, it won't help. As per:

So to summarize:

Clearly people here only see what they WANT to see. They also have no clue how this stuff works. The future is here. VRAM ain't gonna save you; you need GPU power first and foremost.

You're only seeing the problems you want to see. Ergo, if you think the table isn't big enough for your job, you just ignore the parts of the job that make it "too large" until you have a small enough job to fit on the table.

So really, the Amiga 500 was perfectly capable of playing Doom and is therefore still the best gaming computer ever. It's just a case of correct settings. 640KB? Pah. The stock A500 could easily have done it. Ignore anyone who suggests otherwise, especially the incompetent devs.