8GB VRAM not enough (and 10 / 12)

Page 71 - AnandTech Forums

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
[Horizon Forbidden West benchmark chart]
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT when ray tracing:
[Resident Evil Village benchmark chart]
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti:
[Company of Heroes benchmark chart]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault, and objective evidence "doesn't count" because of reasons(tm). If you have others, please let me know and I'll add them to the list. Cheers!
 

psolord

Golden Member
Sep 16, 2009
1,913
1,192
136
I am rolling my eyes so hard, that at this point, they are measuring gallons.

How long have I been telling you that all current GPUs will cry with UE5, VRAM being completely irrelevant?

Oh you don't like how Nightingale runs on your 16GB card? You WILL take 1080p at 60fps and you WILL like it.

And no, UE5 ain't bad. It just has the best fidelity out there, and this comes at a cost. This is the cost. You will have to upgrade, and the VRAM on your current card will not save you. Correct settings will, though. Luckily, UE5 at medium looks better than other engines at ultra.
 
Jul 27, 2020
16,161
10,240
106
Oh you don't like how Nightingale runs on your 16GB card? You WILL take 1080p at 60fps and you WILL like it.
RX 6800 still does quite well compared to the GeForce cards. The owners of these cards may regret the fact that they shunned the RX 6800 even when they could find it cheaper than the GeForce cards unable to match it in this game. But who am I kidding? They will give more of their money to Jensen to get a more powerful card to overcome that disappointment. That's what these gamers do to cope. Just spend as much as possible to keep disappointment away for longer.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,746
740
136
RX 6800 still does quite well compared to the GeForce cards. The owners of these cards may regret the fact that they shunned the RX 6800 even when they could find it cheaper than the GeForce cards unable to match it in this game. But who am I kidding? They will give more of their money to Jensen to get a more powerful card to overcome that disappointment. That's what these gamers do to cope. Just spend as much as possible to keep disappointment away for longer.
Since I have both the 6800 and a 3080 10GB, I will probably buy Nightingale when I get back from holiday. It does look like 4K would be way out of reach, but 1440p might be squeakable.
 

psolord

Golden Member
Sep 16, 2009
1,913
1,192
136
RX 6800 still does quite well compared to the GeForce cards. The owners of these cards may regret the fact that they shunned the RX 6800 even when they could find it cheaper than the GeForce cards unable to match it in this game. But who am I kidding? They will give more of their money to Jensen to get a more powerful card to overcome that disappointment. That's what these gamers do to cope. Just spend as much as possible to keep disappointment away for longer.
You are referring to the 8GB GeForce cards, I take it? Well, the 6800 has always been faster than these cards anyway, and that was a GPU power characteristic, first and foremost.

I don't think anyone in their right mind would prefer an 8GB GeForce over a 16GB Radeon back then, if the price was close. I know I wouldn't.

This is beside the matter at hand, though, which is that both these users will need to upgrade for true high-fidelity next-gen UE5 and other engines. VRAM ain't gonna save you, as I have been saying in this thread forever.

This thread has been an anti-Nvidia meme and nothing else.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,674
3,796
136
You are referring to the 8GB GeForce cards, I take it? Well, the 6800 has always been faster than these cards anyway, and that was a GPU power characteristic, first and foremost.

I don't think anyone in their right mind would prefer an 8GB GeForce over a 16GB Radeon back then, if the price was close. I know I wouldn't.

This is beside the matter at hand, though, which is that both these users will need to upgrade for true high-fidelity next-gen UE5 and other engines. VRAM ain't gonna save you, as I have been saying in this thread forever.

This thread has been an anti-Nvidia meme and nothing else.

Says the guy still running a 12GB 4070 Ti. You probably could have gotten a 7900 XT for a similar price. Screw it, though, who needs that extra 8GB of VRAM? Just use the "right settings"!
 
  • Like
Reactions: Magic Carpet

psolord

Golden Member
Sep 16, 2009
1,913
1,192
136
Says the guy still running a 12GB 4070 Ti. You probably could have gotten a 7900 XT for a similar price. Screw it, though, who needs that extra 8GB of VRAM? Just use the "right settings"!
For +100 euros for the 7900 XT, plus a new case (since I could only fit a dual-slot GPU), plus a new PSU: no, I couldn't have gotten a 7900 XT.

The 12GB is absolutely fine for the 4070 Ti. The 4070 Ti already has GPU power problems, while it has zero VRAM problems. Please stop perceiving VRAM like an elementary school child.

As for the correct settings, yes, they are mandatory for all video cards, even the 4090, with UE5 on our doorstep.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,674
3,796
136
For +100 euros for the 7900 XT, plus a new case (since I could only fit a dual-slot GPU), plus a new PSU: no, I couldn't have gotten a 7900 XT.

The 12GB is absolutely fine for the 4070 Ti. The 4070 Ti already has GPU power problems, while it has zero VRAM problems. Please stop perceiving VRAM like an elementary school child.

As for the correct settings, yes, they are mandatory for all video cards, even the 4090, with UE5 on our doorstep.

Sorry, I don't magically know what is capable of fitting in your case. I won't make that mistake again. And if you compare launch prices, yes, the 7900 XT was stupidly overpriced. By the time you got your 4070 Ti, though, I'd bet they were close. But I guess that depends on your country.

Also, classy.
 

poke01

Senior member
Mar 8, 2022
725
697
106
Wait... When was it not? I'm confused. :confused:
It always was; it's very strongly anti-Nvidia here. See below:
Only Nvidia worshipers.
It's funny how the mods allow these insults; it just shows they give a pass when the insults are aimed at Nvidia users. Also, as I last understood it, "worshiper" is akin to "fanboy", so I've learnt we can use "worshipers" to bypass the word ban on "fanboy" now.
 
  • Like
Reactions: psolord

Saylick

Diamond Member
Sep 10, 2012
3,125
6,296
136
It always was; it's very strongly anti-Nvidia here. See below:

It's funny how the mods allow these insults; it just shows they give a pass when the insults are aimed at Nvidia users. Also, as I last understood it, "worshiper" is akin to "fanboy", so I've learnt we can use "worshipers" to bypass the word ban on "fanboy" now.
I'm not sure if the word "fanboy" is completely off limits, but I think there's a difference between directly calling someone a fanboy and saying the only people who would tolerate 8GB are Nvidia fanboys.
 
  • Like
Reactions: Tlh97 and poke01

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,451
20,461
146
It always was; it's very strongly anti-Nvidia here. See below:

It's funny how the mods allow these insults; it just shows they give a pass when the insults are aimed at Nvidia users. Also, as I last understood it, "worshiper" is akin to "fanboy", so I've learnt we can use "worshipers" to bypass the word ban on "fanboy" now.
You just broke multiple forum rules. I am not going to infract you for them because this is a good opportunity to get everyone reading this on the same page. Any further violations of the rules moving forward will be infracted and can result in loss of posting privileges.

Personal insults are not permitted in the tech forums. It is that simple. Fanboy, fan, zealots, worshipers, the list goes on and on. If you see someone using insults towards other members in a post? Report the post. The mod staff does not always see the offending content. Report it and action will be taken.

You can never, ever, call out moderators, nor question moderation anywhere other than the moderator discussions forum. So there won't be any more warnings about it.

The rule is attack the post not the poster. Learn it, know it, live it.

This is a Moderator response, not a member response, so do not reply to it.

I'm not sure if the word "fanboy" is completely off limits, but I think there's a difference between directly calling someone a fanboy and saying the only people who would tolerate 8GB are Nvidia fanboys.
As stated above, none of the usages are permitted.

FYI: It is also a violation to quote the offending posts.

Repeat: if you see someone insulting others or any other form of trolling, report it.

If that feels like snitching? Then you (I am using the collective you each time I type it) are relying on a moderator to see the offending content, which is not guaranteed by any means. We are a small crew and can't read every post in every thread.

We do our best to make the tech forums a welcoming place for user-to-user help, discussion, and exchange of ideas. We also try to keep it fun and light. However, there is of course the expectation that it will be done in a civil manner. Violators will fill in spaces on their ban hammer bingo cards.

Again: do not respond to this as it's an official moderation response.

- Moderator DAPUNISHER
 
  • Like
Reactions: Mopetar

coercitiv

Diamond Member
Jan 24, 2014
6,187
11,856
136
and nothing else.
I still remember your initial claim in this thread, that your interest was purely academic:
I don't mean to antagonize this thread. I have a purely academic interest as a person that follows gpu performances for decades.
Since then you have slowly but surely moved away from discussing facts and figures to labeling settings and comparisons as "stupid", shifting goalposts to avoid admitting anything during debates, accusing folks of behaving like children / sheep / cultists, introducing sarcastic remarks about this topic in other threads, and finally framing the entire issue as monochrome brand bias. In the past six months you have progressively revealed a fading interest in honest dialogue and a growing appetite for labeling and blaming others.

You probably saw yourself as the hero in this debate, and yet have played dirty enough to become the villain. Whether you meant what you said or not, you did end up antagonizing this thread. That pure academic interest is lost in a swamp of snark and deflection.
 

Ranulf

Platinum Member
Jul 18, 2001
2,348
1,165
136
Maybe it's just all of us ranting at the moon made of VRAM, but looking back, the thread has grown 60 pages in the last year. Maybe it is just reality catching up with the theory.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,451
20,461
146
Maybe it's just all of us ranting at the moon made of VRAM, but looking back, the thread has grown 60 pages in the last year. Maybe it is just reality catching up with the theory.
Could you elaborate on that further?

Observing how things have shaken out, I don't think 8GB is going to hold back games anymore. I do think owners will increasingly miss out on many of the day-one experiences. Example: developers will release unoptimized system-stranglers like The Last of Us; then, as the months go by and they get their ROI, the games will get the attention necessary to run on just about anything within reason.
 

psolord

Golden Member
Sep 16, 2009
1,913
1,192
136
I still remember your initial claim in this thread, that your interest was purely academic:
It was purely academic, but friction causes heat, as they say.

Since then you have slowly but surely moved away from discussing facts & figures to labeling settings and comparisons as "stupid", shifting goalposts to avoid admitting anything during debates,
Whenever I tried to post benchmark results, people accused me of posting a splurge of screenshots, too much data, blah blah, and an "I will keep believing what I want" mentality ensued.

I am not shifting anything. I am just not accepting stupid results. If something runs at 25fps on a 12GB card and 15fps on an 8GB card, these results are useless to me.

accuse folks of behaving like children / sheep /cultists, introduce sarcastic remarks in other threads about this topic,
Like children, I said, yes. The sheep and cultists belong to the other side, mate, not me.

and finally frame the entire issue as monochrome brand bias.
It is mostly that, for the "8GB not enough" camp. The Nvidia haters of this thread.

You probably saw yourself as the hero in this debate, and yet have played dirty enough to become the villain. Whether you meant what you said or not, you did end up antagonizing this thread. That pure academic interest is lost in a swamp of snark and deflection.
Saying that the most advanced engines and games have shown very few VRAM problems is not snark and deflection. Starfield, UE5 games, Avatar, and Alan Wake 2 are all mostly VRAM-agnostic. GPU power is what matters most. I am not deflecting anything. You are.