8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB (how these 1% lows are derived is sketched after the charts).
[Horizon Forbidden West benchmark chart]
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing enabled:
[Resident Evil Village benchmark chart]
Company of Heroes: the 3060 has a higher minimum framerate than the 3070 Ti:
[Company of Heroes benchmark chart]
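
For reference, the 1% lows in charts like these are derived from a frametime capture: take the slowest 1% of frames and convert their average frametime back to FPS (methodology varies slightly between outlets; some report the 99th-percentile frametime instead). A minimal sketch of the common approach; the capture itself is assumed, and frametimes_ms below is made-up example data:

```cpp
// one_percent_low.cpp -- sketch of deriving a "1% low" figure from a
// frametime capture. The numbers below are made-up example data; a real
// capture would come from a tool such as PresentMon or CapFrameX.
#include <algorithm>
#include <functional>
#include <iostream>
#include <vector>

int main() {
    // Per-frame render times in milliseconds (hypothetical capture).
    std::vector<double> frametimes_ms = {16.6, 16.8, 17.0, 16.5, 42.0,
                                         16.7, 16.9, 35.5, 16.6, 16.8};

    // Sort descending so the slowest frames come first.
    std::sort(frametimes_ms.begin(), frametimes_ms.end(),
              std::greater<double>());

    // The slowest 1% of frames (at least one frame for short captures).
    const std::size_t count =
        std::max<std::size_t>(1, frametimes_ms.size() / 100);

    double sum_ms = 0.0;
    for (std::size_t i = 0; i < count; ++i) sum_ms += frametimes_ms[i];
    const double avg_ms = sum_ms / static_cast<double>(count);

    // Convert the average of the slowest frames back to frames per second.
    std::cout << "1% low: " << 1000.0 / avg_ms << " FPS\n";
}
```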

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others, please let me know and I'll add them to the list. Cheers!
 

BFG10K

Lifer
Aug 14, 2000
Guys, guys, I finally figured it out!

A 1GB card is just fine when using the "proper setting" of 320x240. And it totally doesn't even matter if it costs more than a 12GB card that does 1440p!

Who decides what the "proper setting" is, you ask? Why it's meeeeee of course! Don't be silly, you don't get to decide, and neither does the person who paid for the GPU.

OMG guys, my pants are totally on fire right now!
 

psolord

Platinum Member
Sep 16, 2009
The 2080 Super is also an 8GB card, and at the time that slide was posted it may have been true. But marketing will do marketing; that's a given, and they're not famous for their straightforwardness and honesty. They show performance factors, though, not framerates per se.

Speaking for myself, I got the 3060 Ti for 1080p very high/ultra with no RT, and the 4070 Ti for 1440p (4K with DLSS + RT). As time moves on, graphics cards tend to move down a tier: if the 2080 Super was a 1440p card before, it's a 1080p card now. Cards lose their tier due to processing power first and foremost, not VRAM. That's why I say my three 8GB cards are nothing alike, which you mocked a couple of posts above.
 

psolord

Platinum Member
Sep 16, 2009
(quoting BFG10K's "Guys, guys, I finally figured it out!" post above)
Although I mostly disagree with you, you have raised some good arguments in the past. Try to stick with that and not be childish, please?
 

psolord

Platinum Member
Sep 16, 2009
It runs without mesh shaders, but the 5700 XT in particular is considered unplayable. Tim says performance is anywhere from 20 to 50 FPS depending on the area; he has a chart halfway through the video showing the 5700 XT struggling even with FSR2 enabled.
Honestly, after seeing the requirements charts I expected it to do worse, but it is what it is. Maybe the devs will fine-tune the code, and/or AMD and Nvidia could replace these mesh shaders at the driver level with their own? I don't know if that's possible. Just a thought.
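
(For what it's worth, whether a GPU/driver combo exposes mesh shaders is something an app can query directly. A minimal Vulkan sketch, assuming a recent SDK with the VK_EXT_mesh_shader headers; error handling omitted:)

```cpp
// mesh_shader_check.cpp -- minimal sketch: ask the Vulkan driver whether
// VK_EXT_mesh_shader is exposed on the first enumerated GPU. Assumes a
// recent Vulkan SDK; error handling omitted for brevity.
#include <vulkan/vulkan.h>
#include <cstdio>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_3;

    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    vkCreateInstance(&ici, nullptr, &instance);

    uint32_t count = 1;
    VkPhysicalDevice gpu = VK_NULL_HANDLE;
    vkEnumeratePhysicalDevices(instance, &count, &gpu);

    // Chain the mesh shader feature struct into the features2 query;
    // an unsupported feature simply comes back as VK_FALSE.
    VkPhysicalDeviceMeshShaderFeaturesEXT mesh{
        VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_MESH_SHADER_FEATURES_EXT};
    VkPhysicalDeviceFeatures2 features{
        VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_FEATURES_2};
    features.pNext = &mesh;
    vkGetPhysicalDeviceFeatures2(gpu, &features);

    std::printf("mesh shaders: %s\n", mesh.meshShader ? "yes" : "no");
    vkDestroyInstance(instance, nullptr);
}
```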

Anyhoo, for the matter at hand (the 8GB of VRAM): we see that VRAM usage is below 6GB and the cards are still struggling. Which brings us where? Processing power matters more than VRAM going forward. The problem shows up on the consoles too; their VRAM isn't helping. They dropped to 850p for 60fps and still have frame drops (Digital Foundry's preliminary video). FSR2 is also very bad in places, again per Digital Foundry. Tim at HWUB outright called it cheating to compare 1080p FSR2 against 1080p DLSS2, because of how bad FSR2 looks.
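
(Side note on where those VRAM usage readings come from: on Windows, one way tools read them out is DXGI's QueryVideoMemoryInfo, which reports the calling process's usage and budget on an adapter. A minimal sketch, assuming the Windows SDK; error handling and COM cleanup omitted:)

```cpp
// vram_usage.cpp -- minimal Windows sketch: read the current VRAM usage and
// budget for this process on the primary adapter via DXGI. Note this is
// per-process usage, not total VRAM consumed by everything on the system.
// Build with MSVC and link dxgi.lib; error handling/cleanup omitted.
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory4* factory = nullptr;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    IDXGIAdapter1* adapter = nullptr;
    factory->EnumAdapters1(0, &adapter);  // adapter 0 = primary GPU

    IDXGIAdapter3* adapter3 = nullptr;
    adapter->QueryInterface(IID_PPV_ARGS(&adapter3));

    // "Local" segment group = dedicated video memory on a discrete GPU.
    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::printf("VRAM usage: %.2f GB of a %.2f GB budget\n",
                info.CurrentUsage / 1e9, info.Budget / 1e9);
}
```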

The same will be shown next week with Robocop, mark my words.
 

psolord

Platinum Member
Sep 16, 2009
PS: if anyone cares, I'm glad to report that the issue in version 1.835 of Ratchet and Clank, which caused the game to crash with an out-of-VRAM error on the 3060 Ti, is now resolved in 1.922. I do have the feeling it runs worse than before, though. With no RT it's fine. I'll keep this in the borderline yellow zone of problematic 1080p games.
 

nurturedhate

Golden Member
Aug 27, 2011
I'll just upgrade to a DLSS 4.0 capable card. It's fantastic tech. It came with a single piece of paper and some tape. The directions say to tape it over my monitor, or I can even cut it to fit glasses (Nvidia CTX scissors sold separately). The box says it comes with every game ever made, even future unreleased games! Here's what it looks like. I was playing a sailboat game yesterday with my Canadian girlfriend. You've never heard of this game before. Also, no more VRAM requirements!

[attached image]
 

Ranulf

Platinum Member
Jul 18, 2001
If the 2080 Super was a 1440p card before, it's a 1080p card now. Cards lose their tier due to processing power first and foremost, not VRAM.

The GTX 970 was a 1440p card too. For $330. With 4GB* of RAM. We've had 6-8GB of RAM on 1080p/1440p cards for seven years now. You're doing a fantastic job of selling not going beyond 1080p if you don't want to be stuck spending $400, then $500, then $600 for a midrange card (x70 or x80)... that still has 8GB of RAM and is already limping along the very next generation.
 

fralexandr

Platinum Member
Apr 26, 2007
(quoting psolord's "The 2080 Super is also an 8GB card" post above)
You got a 3060 Ti instead of an RX 6800/XT, to NOT use RT? Was it a power supply thing? Because the RX usually had better performance/price unless it was sold out.
 

psolord

Platinum Member
Sep 16, 2009
You got a 3060 Ti instead of an RX 6800/XT, to NOT use RT? Was it a power supply thing? Because the RX usually had better performance/price unless it was sold out.
I keep my expectations in check; the 3060 Ti was too weak for that. I did play a couple of games with some light RT, though. The Ascent was impressive, to say the least. I did try to get a 6800, but it was nowhere to be found.

In other news,

I tried Alan Wake II on the 4070 Ti. Yeah right... lol. The game is barely playable at 4K high with DLSS Quality (1440p internal resolution). The card screamed at 260W even with DLSS. The rendering is out of this world.

[Alan Wake II screenshot]

It IS heavy on VRAM, but we've got bigger problems here, folks.

Also, it doesn't even launch on my 2500K system. My day is ruined! :(

I bet they pulled an AVX2 blunder once again. The game is very light on the CPU: 120+ fps on a Ryzen 3100. The 2500K should be able to run it with no problem.
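
(That failure mode fits a hard AVX2 requirement: Sandy Bridge chips like the 2500K support AVX but not AVX2. A minimal sketch of the kind of launch-time check involved, using the GCC/Clang builtin; MSVC would need __cpuidex instead:)

```cpp
// avx2_check.cpp -- minimal sketch of a launch-time CPU feature gate.
// Sandy Bridge parts like the i5-2500K report AVX but not AVX2, so a game
// built with a hard AVX2 requirement refuses to start on them.
#include <cstdio>
#include <cstdlib>

int main() {
    // GCC/Clang builtin; reads CPUID under the hood (x86 only).
    if (!__builtin_cpu_supports("avx2")) {
        std::fprintf(stderr, "This CPU lacks AVX2 -- cannot launch.\n");
        return EXIT_FAILURE;
    }
    std::printf("AVX2 present, continuing startup.\n");
    return EXIT_SUCCESS;
}
```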
 

psolord

Platinum Member
Sep 16, 2009
Digital Foundry's piece on Alan Wake II is out. A must-see.


It shows the 3070 being about 50% faster than the PS5 at PS5 settings. Yeah, VRAM ain't helping.

It also shows how the post-processing quality setting tanks performance with no real visual gains. That's a stupid setting right there, and this time it's not a case of the user picking a stupid setting; the setting itself is just stupid.

It also shows the mesh shader differences between the 1080 Ti, 5700 XT and RTX 2080. Interesting stuff.

Oh wow, turning post-processing to low really does make the game playable.

RT is also a no-go, even for the 4070 Ti. From what I see on GameGPU, it's a no-go for everything. The only way I managed to get acceptable performance with RT is 4K/DLSS/low + low RT + frame gen, and it's quite a bit uglier than high settings with no RT. I'll post some shots tomorrow. Positive side effect: it's the first time I've seen frame gen actually working well. And with vsync on! I thought it needed vsync disabled on Nvidia cards?
 

Thunder 57

Diamond Member
Aug 19, 2007
(quoting psolord's Alan Wake II post above)

2500k is a joke these days. Time to let it go.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
2500k is a joke these days. Time to let it go.
I get that if you live in a part of the world where the yearly income is really low compared to the U.S., Canukistan, or another "rich" country, you do what you have to.

When Alex is the content narrator, the channel becomes Digital Founder's Edition and is most assuredly not must-see TV. It's a thinly veiled pro-Nvidia fluff piece. There's a reason John was the one doing the interview with the Insomniac dev: Alex would have to say nice things about AMD hardware, and that ain't happening.

And for the nth time: 8GB is acceptable on a budget card. For $300-$400 it's not. Everything else is obfuscation.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
It also shows how the post-processing quality setting tanks performance with no real visual gains. That's a stupid setting right there, and this time it's not a case of the user picking a stupid setting; the setting itself is just stupid.
??? Around the 9:55 mark they showed a huge difference in image quality with post-processing on high vs. low: tons of flickering on low, at least in combination with depth of field. Hopefully that's not the case without DOF, since I hate the look and always disable it in games. I don't know why anyone would want a photo filter blurring the background in a game they're playing.
 

psolord

Platinum Member
Sep 16, 2009
(quoting SteveGrabowski's post-processing reply above)
There seems to be some shimmering in the background lights inside the Mind Place (the Mind Place is pretty cool, tbh), but overall the image quality is virtually the same. The performance cost, though, can affect the gaming experience in a negative way.

In the woods, 60fps is not possible on the 4070 Ti at 4K/high/DLSS Quality with PP high. It's OK with PP low, though.

PP low: 60fps, very smooth
[screenshot]

PP high: game gets stuttery, well below 60fps
[screenshot]
 

psolord

Platinum Member
Sep 16, 2009
In town the framerate is more stable, but the GPU load changes.

PP low:
[screenshot]

PP high:
[screenshot]

Hey! No jokes about my pp ok? It's doing what it can! xD
 

coercitiv

Diamond Member
Jan 24, 2014
So AW2 has improved reflections (better than SSR) and global illumination even when ray tracing is turned off. I guess it follows in the footsteps of Lumen. This explains why the game is so compute intensive and why the visuals look nice overall.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
4C/4T has been weaksauce for gaming for at least 7 years. It might have been serviceable for a bit longer, but these days 4C/4T for gaming is not good.

- Built my buddy a gaming rig with my hand-me-down 2200G (1660S GPU), and while I wouldn't say it's a great gaming experience, it handles things much better than I would have expected.

I'll upgrade him to a Zen 2/3 processor with at least 6C/12T next time he has a birthday, or when he's unhappy with the performance, but he's been happily playing Dying Light and even Hogwarts Legacy for about a month now without any complaints.