8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
[chart]
Resident Evil Village: the 3060 Ti and 3070 tank at 4K and fall behind the 3060/6700XT when ray tracing is enabled:
[chart]
Company of Heroes: the 3060 has a higher minimum framerate than the 3070 Ti:
[chart]

10GB / 12GB

Reasons why shipping 8GB cards since 2014 still isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons™. If you have others, please let me know and I'll add them to the list. Cheers!
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
Well, there is Intel and the 770.


Damned right there is, and I am glad for it. I know what you meant, but you should've specified 16GB since there is an 8GB model. It does nothing to undermine the value of the 6800, since it is far less performant. However, it is developing into a viable option. Needs to be no more than $275 IMO. The 6700XT is too good to recommend the A770 16GB when the price is close.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
Ray Tracing - I am the future!

Textures - And I took that personally.

This is an active thread so I will mention it here. The forum migration will be happening shortly, which means the forums will be down for a good part of the day. Just a heads-up.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
Perhaps not just AMD. I am hopeful the A770 16GB is doing well in The Last of Us. Seeing it throw a beating to the 3060 Ti and 3070/Ti because they lack the VRAM would be quite satisfying. Hell, it may outperform the 3080 10GB at high enough settings. FPS means squat when frame pacing is terrible.

I see responses to HUB on Twitter saying "can't you just turn the textures down?" LOL, of course you can. But you sacrifice probably the most important visual element. Not like they have any choice though.
I don't think anything is going to do well with this trash port. For a minute I thought maybe Iron Galaxy included a cryptocurrency miner built into the game but then I realized that was impossible since I would have had full gpu usage in that case.
 

jpiniero

Lifer
Oct 1, 2010
I don't think anything is going to do well with this trash port. For a minute I thought maybe Iron Galaxy included a cryptocurrency miner built into the game but then I realized that was impossible since I would have had full gpu usage in that case.

Are you able to play without letting it completely finish the shader compiling? Maybe that's what's going on.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
The game is getting review bombed on Steam, crashes are very common apparently.
I wouldn't call it review bombing. Review bombing is when you have a coordinated effort to try to kill scores for a game. I think this is a lot of people who bought it angry that the game runs terribly at best and is unplayable in many cases. Even after more than an hour of compiling shaders in the menu, waiting until the compiling-shaders message disappears at 100%, the game seems to compile shaders while playing too. That is likely why I had a huge drop in framerate and a jump in CPU load just from changing the game from 1440p to 1080p and reloading a save. I haven't had any crashes, but I lose control of my character for a second or two every 30-45 seconds in the opening stage of the game, which makes it absolutely unplayable.
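Runtime shader compilation would explain that symptom. Here is a toy sketch of a compile-on-miss shader cache (purely illustrative — nothing to do with the game's actual code, and all names are made up): any pipeline state the cache hasn't seen yet stalls the frame while it compiles, producing exactly this kind of hitch.

```python
import time

def get_shader(cache, key, compile_ms=40):
    """Fetch a shader variant; a cache miss triggers an expensive
    just-in-time compile that stalls the frame (the hitch)."""
    if key not in cache:
        time.sleep(compile_ms / 1000.0)  # simulated driver compile stall
        cache[key] = f"compiled:{key}"
    return cache[key]

def render_frame(cache, shaders_needed, frame_ms=8):
    """Simulate one frame: ordinary frame work plus any compiles.
    Returns the frame time in milliseconds."""
    start = time.perf_counter()
    for key in shaders_needed:
        get_shader(cache, key)
    time.sleep(frame_ms / 1000.0)  # the frame's normal CPU/GPU work
    return (time.perf_counter() - start) * 1000.0

cache = {}
first = render_frame(cache, ["terrain", "skin"])   # cold cache: compiles, big hitch
second = render_frame(cache, ["terrain", "skin"])  # warm cache: smooth
print(f"first frame {first:.0f} ms, second frame {second:.0f} ms")
```

In this model, anything that invalidates cached pipeline state (such as a resolution change) forces those compiles to happen again mid-game, which would match the framerate drop and CPU load spike described above.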
 

coercitiv

Diamond Member
Jan 24, 2014
Review bombing is when you have a coordinated effort to try to kill scores for a game. I think this is a lot of people who bought it angry the game runs terribly at best and unplayable in many cases.
I would argue that sometimes it can be both. In this case, based on the reviews I shuffled through, people are making a clear distinction between the original game and the port, and are review bombing to signal the unacceptable state of the PC version. They could easily choose the more rational and much more effective way of getting a refund.
 

Jaskalas

Lifer
Jun 23, 2004
Gotta say, given the new requirements, I picked perhaps the absolute worst time to try and move on from 1080p to 1440p.
Though the move did free me from GSync exclusivity and will allow AMD to be an option. Perhaps the now sorely needed upgrade from my 3060Ti will be something from team red.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
Gotta say, given the new requirements, I picked perhaps the absolute worst time to try and move on from 1080p to 1440p.
Though the move did free me from GSync exclusivity and will allow AMD to be an option. Perhaps the now sorely needed upgrade from my 3060Ti will be something from team red.
I can't really understand getting any Nvidia GPU other than the 4090 right now if it's strictly for gaming. God do they love starving their cards of VRAM, and I can't believe how much they seem to be doubling down on it for the 4000 series. It seems like it has been like this most of the time since the GTX 680, with a few exceptions like the 1080 Ti and 1070 that aged brilliantly thanks to having forward-looking amounts of VRAM. The 2080 Ti has aged well too, though it should have for $1200.
 

Timorous

Golden Member
Oct 27, 2008
4070 non-Ti 12GB, if it gets released, might be acceptable as a decent 1440p card.

A 4070 or 4070 Ti with 12GB of VRAM is not bad per se. They are bad at $700-800, which is the current price, but if they were coming in at even 3070-tier MSRPs, so $500 for the 4070 and $600 for the 4070 Ti, then for 1440p they would be nowhere near as bad. The 4070 Ti would still be on the edge though, because in that $600 region you would expect 16GB of VRAM IMO.

Really, if you think about it:
1080p cards should be $300 or less, and 8GB at the low end here is fine.
1440p cards should be $550 or less, and 12GB is okay here.
4K cards should start at around $600 and ideally have 16GB or more.
4K ultra is basically the realm of the $1,000+ tier with 24GB of VRAM, and it replaces the SLI-on-a-stick setups some went with. Even $2K for a 4K Ultra + RT 4090 Ti, for example, would be cheaper than going quad SLI, and the performance is going to be a lot more reliable, so I don't really see an issue with that ultra-high-end halo part. But the parts below it need to make sense, and they just don't.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001

YT'er buys RTX 3080 10GB used for $550, over an RX 6800XT for $500, now regrets it due to VRAM limitations.
As I have written before, Nvidia was firing on all cylinders while the zoomers were growing up. That resulted in a generation equating Nvidia with PC gaming. My son and his friends were all indoctrinated.

I do think it is cool the mindset is starting to change in that age group now. He talks about how ray tracing propaganda worked on him. He was convinced that it was an important factor in his purchasing decision. Once he tried it, he found out he didn't care for it, or about it. Welcome aboard kid. :D

Plus owning an AMD APU laptop, he discovered the AMD user experience is a good one. Not the terrible time the viral marketers and shills tirelessly claim you'll have.

He is salty over the 3080 not accelerating his productivity the way he thought it would, over paying more for less VRAM, and over ray tracing being a vastly overhyped feature. With how Nvidia is apparently going to price 6GB and 8GB cards this gen, they are going to help sell more AMD cards than AMD's horrible marketing and sales dept. ever could.
 

KompuKare

Golden Member
Jul 28, 2009
Jensen shouting at his engineers: "Why didn't any of you peabrains THINK about the consequences of giving the cheaper 3060 more VRAM than the Ti version???"
But to be fair, planned obsolescence doesn't work quite like that:
we can make a card which will run out of VRAM before it runs out of grunt (3060 Ti, etc.),
or we can make a card which will run out of grunt before it runs out of VRAM.

For planned obsolescence, the important thing is to make unbalanced cards!