8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
[benchmark chart]
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT when ray tracing:
[benchmark chart]
Company of Heroes: the 3060 has a higher minimum framerate than the 3070 Ti:
[benchmark chart]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault, and objective evidence "doesn't count" because of reasons(tm). If you have others, please let me know and I'll add them to the list. Cheers!
 
Last edited:

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
4C/4T has been weaksauce for gaming for at least 7 years. It might have been serviceable for a bit longer, but these days 4C/4T for gaming is not good.
Come on, I know that. I have three systems better than that. I just love my 2500K, because I had the most fun with it, more than with any other system ever. That's why I want to keep testing it. Sometimes I even game on it for a couple of hours, for old times' sake. Paired with an RX 6600, it is doing quite well even in modern games, despite the extreme connectivity handicap (PCIe 2.0 with 8b/10b encoding; a quick calculation of what that costs is below). Not in all games, but in quite a few.
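For reference, a back-of-the-envelope sketch of that handicap, using the standard PCIe line rates and encodings (and the fact that the RX 6600 only wires up an x8 link):

```python
# Theoretical one-way PCIe bandwidth: line rate x encoding efficiency x lanes.
def pcie_bandwidth_gb_s(gt_per_s: float, encoding_efficiency: float, lanes: int) -> float:
    usable_gbit_per_lane = gt_per_s * encoding_efficiency  # Gbit/s after encoding
    return usable_gbit_per_lane / 8 * lanes                # GB/s across all lanes

# The RX 6600 (Navi 23) only has an x8 link.
# On the 2500K: PCIe 2.0, 5 GT/s per lane, 8b/10b encoding (80% efficient).
print(pcie_bandwidth_gb_s(5.0, 8 / 10, 8))       # -> 4.0 GB/s
# On a modern board: PCIe 4.0, 16 GT/s per lane, 128b/130b (~98.5% efficient).
print(pcie_bandwidth_gb_s(16.0, 128 / 130, 8))   # -> ~15.8 GB/s
```

So the card has roughly a quarter of the bus bandwidth it was designed for, which is the handicap I mean.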

For example, and to stay on topic with the thread: paired with the 8GB 6600, it can run current UE5 games very decently. (Non-monetized channel; the RoboCop game has a copyright claim due to the soundtrack.)



I mean, this system has used like 5-7 cards in its life. It was running UE3 when it came out, and now it runs UE5. How can I not be impressed? And how can Remedy kick legendary CPUs out, just like that?


Naughty Dog did the same thing with Uncharted 4. People were trying to run it with Intel's emulator and they did, only it was a slideshow. All it took was for the AVX2 requirement to be patched out.




And after that, it ran at 60fps Ultra, no problem. I showed the run a few posts back.
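For anyone who wants to verify this on their own machine, a minimal sketch (assuming Linux, where /proc/cpuinfo exposes the CPU feature flags); the 2500K is Sandy Bridge, which reports avx but not avx2:

```python
# Check CPU feature flags on Linux; an unpatched AVX2 requirement refuses
# to run on CPUs that only report "avx", like the 2500K (Sandy Bridge).
def cpu_flags() -> set[str]:
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
print("AVX: ", "avx" in flags)
print("AVX2:", "avx2" in flags)
```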

The on-topic part is that the above runs are not VRAM limited but GPU power limited, like all UE5 games so far. The guy above can post 1440p videos of an 8GB card all he wants.
 
  • Like
Reactions: Elfear

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
Can you please start another thread if you want to post your game test results with various rigs?

This thread is so far off the rails that it's become a joke.
People here post whatever they find, while I am backing up my point of view with my own work. How are other people's tests on topic while mine aren't?
 

MrPickins

Diamond Member
May 24, 2003
9,119
767
126
People here post whatever they find, while I am backing up my point of view with my own work. How are other people's tests on topic while mine aren't?
Your last few posts, at least, have been about your 2500k. Those don't belong here.

I really find most of the rest of your posts in this thread inane and completely missing the point, but at least they are on-topic...
 

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
Your last few posts, at least, have been about your 2500k. Those don't belong here.

I really find most of the rest of your posts in this thread inane and completely missing the point, but at least they are on-topic...
The 2500K came into the discussion briefly because I said I cannot test the RX 6600 in Alan Wake 2 to see whether it has an 8GB VRAM problem or a GPU processing power problem. The guy above said to just let it go, it's too old, and I simply answered that. The focus is the RX 6600 with its 8GB of VRAM. I use all my 8GB VRAM systems to show how different setups can accommodate the same game, BUT with different settings, because not all 8GB cards are the same.

These very settings have been demeaned in the OP, while they are a crucial part of PC gaming as a whole. People posting 1440p vids at max settings + RT to show how bad 8GB is seems just as inane to me. I am more of a "use settings to solve problems rather than create them" guy. Sorry about that.
 
  • Like
Reactions: GodisanAtheist

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
In other news, since the OP decided to include 12GB cards as well, allow me to post my Alan Wake II run. (Non-monetized channel.)


No need to watch it really. I will post my findings in text.

- Overall, at 4K / DLSS Quality (1440p internal resolution) / High preset / Low post-processing, it looks great and runs fine.
- 12GB of VRAM is not an issue; GPU processing power is. According to GameGPU, the 4070 Ti is faster than the 6950 XT at 1440p. The game IS heavy on VRAM, but it's not spilling over. (A rough way to log actual VRAM allocation yourself is sketched at the end of this post.)
- The game streams like crazy. The above run was done from an external Patriot USB dongle with an ADATA SX8200 2TB NVMe inside. Max speed with the connection it has on this system is less than 500MB/sec (max spec speed is 1GB/sec, but my cable is not up to it). The data streaming is monitored in MSI Afterburner's OSD; it casually jumps to 300MB/sec. It reads A LOT of data. I am not seeing a directstorage.dll in the folder, but it may have been implemented differently. I am not sure I like this trend where devs opt to stream data from storage like crazy instead of using the gobs of PC system RAM. (If you want to log the read rate yourself, see the sketch right after this list.)
- I started the run with post-processing at High initially, but quickly changed it to Low. The transition is shown at the second minute and the visuals do not change by much; the GPU load saved is more important for the fluidity of the game. There is some shimmering in the distance when the camera focuses on the foreground, due to that. At 8:47, for example, there is shimmering on the windows.
- It is very light on the CPU (with no RT at least); that's why I wanted to test it on the 2500K. I fully believe it's doable. I opened a ticket for that. Let's see what happens.
- RT is a no-go for the 4070 Ti. I swear to god, this game will be playable at native 4K maxed only on the 6090, mark my words.
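For the streaming rate mentioned in the list above, here is a rough sketch (assuming the psutil package; pip install psutil) that logs system-wide disk read throughput once per second, similar in spirit to watching the read rate in Afterburner's OSD:

```python
# Print system-wide disk read throughput once per second (Ctrl+C to stop).
import time
import psutil  # pip install psutil

prev = psutil.disk_io_counters().read_bytes
while True:
    time.sleep(1)
    cur = psutil.disk_io_counters().read_bytes
    print(f"disk reads: {(cur - prev) / 1e6:6.0f} MB/s")
    prev = cur
```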


Side note: according to GameGPU's findings, 8GB cards don't have a problem either. The 4060 Ti (I hear pitchforks being sharpened) is quite a bit faster than the 6700 XT, for example, and the 3060 Ti is quite a bit faster than the 3060. I will be testing this on the 3060 Ti, of course, and if I see irrecoverable problems, I WILL post them.

So from where I am standing, the most advanced PC game to date does not have an immediate VRAM problem, although I will still need to do my own 8GB testing.
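As promised above, a minimal sketch of the VRAM logging (assuming an NVIDIA card and the official NVML Python bindings; pip install nvidia-ml-py). Note that NVML reports allocation across all processes, not the game's working set alone, so read it as an upper bound:

```python
# Log VRAM allocation once per second while a game runs (Ctrl+C to stop).
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

If the used number sits well below the card's total for the whole run, the game is GPU limited rather than spilling out of VRAM.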
 
  • Like
Reactions: GodisanAtheist

NTMBK

Lifer
Nov 14, 2011
10,404
5,645
136
I tried so hard to get into Control and never could. It was amazing graphically, and it's one of the few games where I thought RT reflections were worth the performance hit, even on my RX 6700 XT, but I never got much out of the story and eventually got bored with it. I figured I'd love it since I was a big fan of Quantum Break, but Control never clicked for me, even after putting about 10-15 hours into it.

I really enjoyed it right up until I realised they were going to reuse the same video clip over and over for the hallucination bits. That spoiled the magic for me.
 

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
Digital Foundry's latest piece regarding the first batch of Unreal Engine 5 titles, timestamped where he speaks about VRAM usage. He talks about a little over 8GB of VRAM usage... for 4K.


He did notice delayed loading of some textures, even on a 4090, but only delayed loading, not missing textures, and this is not VRAM related; rather, it's a developer choice of how aggressive the asset streaming will be. We've been seeing this in UE4 too; nothing new here.

What he did not mention, however, is how the GPU power to VRAM ratio changes, and how GPUs already have performance problems at higher settings, VRAM quantity notwithstanding.

Northlight Engine and Creation Engine 2 also went down this path. No, please, show us Deliver Us Mars at 1440p + RT. :rolleyes:
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,354
30,419
146
Nvidia marketing human caterpillar segment Alex is the last person most of us will pay any attention to concerning VRAM. Or any other issue that paints Nvidia in a negative light for that matter. He is 100% in their pocket.

And to cut through the rest of the horse crap that keeps getting peddled here: a $300 4060 and a $400 4060 Ti not being able to use the higher texture quality settings that the $300 6700 XT or A770 can is unacceptable. Cherry-picking and moving the goalposts to GPU performance will not change that simple fact. Texture quality gives the best visual return for its performance cost; nothing else even comes close by that metric.
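For a sense of scale, here is an illustrative calculation (all numbers are assumptions for the sketch: BC7 compression at 1 byte per texel, full mip chains, and a made-up count of 500 textures) of why texture quality is mostly a VRAM cost rather than a GPU cost; sampling a 4K texture is roughly as cheap for the shader as sampling a 2K one:

```python
# Rough VRAM footprint of a texture set, assuming BC7 block compression
# (1 byte per texel) and a full mip chain (~4/3 overhead). Illustrative
# numbers only; real games vary per asset.
def texture_mib(width: int, height: int, bytes_per_texel: float = 1.0) -> float:
    mip_overhead = 4 / 3  # a full mip chain adds ~33%
    return width * height * bytes_per_texel * mip_overhead / 2**20

# Quality tiers often just swap texture resolutions:
print(f"500 x 2K textures: {500 * texture_mib(2048, 2048) / 1024:.1f} GiB")  # ~2.6 GiB
print(f"500 x 4K textures: {500 * texture_mib(4096, 4096) / 1024:.1f} GiB")  # ~10.4 GiB
```

That is why the texture slider is the first thing an 8GB card has to give up, while costing almost nothing in framerate on cards that have the memory.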

I will conclude this post by reminding everyone that you can put others on ignore anytime you'd like. If you are tired of the thread being spammed up with dreck, hit that ignore button and watch as the thread magically turns from💩 to readable again. :beercheers:
 

Panino Manino

Golden Member
Jan 28, 2017
1,088
1,324
136
Nvidia marketing human caterpillar segment Alex is the last person most of us will pay any attention to concerning VRAM. Or any other issue that paints Nvidia in a negative light for that matter. He is 100% in their pocket.

And to cut through the rest of the horse crap that keeps getting peddled here: a $300 4060 and a $400 4060 Ti not being able to use the higher texture quality settings that the $300 6700 XT or A770 can is unacceptable. Cherry-picking and moving the goalposts to GPU performance will not change that simple fact. Texture quality gives the best visual return for its performance cost; nothing else even comes close by that metric.

I will conclude this post by reminding everyone that you can put others on ignore anytime you'd like. If you are tired of the thread being spammed up with dreck, hit that ignore button and watch as the thread magically turns from💩 to readable again. :beercheers:

Isn't just "Alex", normal average consumers have the same thinking.
I was looking the AW II performance graphics on TechPowerUp and it happened again, I think this is the third time we get to a point where half the Nvidia cards just die for lack of VRAM.
Consumers didn't learned the first time, didn't learned the second time, and will not learn the third time their cards die and will continue buying Nvidia and Jensen will continue using the same tricks.

It's hopeless.
 

Jaskalas

Lifer
Jun 23, 2004
35,143
9,284
136
Don't see any data pertaining to VRAM limitations in ARK.
Just trash performance in general, with no GPU hitting 100 fps at 1080p, and no GPU hitting 60 fps at 1440p.
 

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
Yeah, Huang make 4070ti 12gb, and then 4070ti super with 16 gb. Soon™

Ark Survival Ascended from GameGPU

[GameGPU benchmark charts]
Interesting test. It's surely heavy on VRAM, but we see 8GB cards being faster than 12GB ones, 12GB ones being faster than 16GB ones, and 16GB ones being faster than 20GB ones.

Also, it is completely unplayable on almost everything.

I'd say it's more sensitive to GPU processing power than VRAM. I'll give it a look.
 

yepp

Senior member
Jul 30, 2006
403
38
91
Not impressed with AW2. It might be pushing some of the latest heavy graphical tech, but the cartoony trees along with its ugly LOD pop-in break the presentation for me; running through the forest with constant LOD changes on trees a few feet away is very noticeable. It's not even an open-world game, so that excuse goes out the window; it's probably one of the sacrifices they made to run well on 8GB cards.


Dare I say it, The Order: 1886 from 2015 looks better than this game; booting that game up on my PS4 in 2023, I'm still wowed by the visuals. This game has global illumination and RT reflections, OK? Can't say I'm wowed.
 
Last edited:

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,354
30,419
146
Not impressed with AW2. It might be pushing some of the latest heavy graphical tech, but the cartoony trees along with its ugly LOD pop-in break the presentation for me; running through the forest with constant LOD changes on trees a few feet away is very noticeable. It's not even an open-world game, so that excuse goes out the window; it's probably one of the sacrifices they made to run well on 8GB cards.


Dare I say it, The Order: 1886 from 2015 looks better than this game; booting that game up on my PS4 in 2023, I'm still wowed by the visuals. This game has global illumination and RT reflections, OK? Can't say I'm wowed.
This is a prime example of why bar charts are so inadequate. That pop-in is something you have to see for yourself. "Bigger bar better" doesn't mean much when you are forced to deal with that kind of jarring graphical issue.
 

yepp

Senior member
Jul 30, 2006
403
38
91
This is a prime example of why bar charts are so inadequate. That pop-in is something you have to see for yourself. "Bigger bar better" doesn't mean much when you are forced to deal with that kind of jarring graphical issue.
I'm running a 16GB card here; they obviously compromised on the graphics engine to accommodate 8GB, though I'm not sure if the pop-in is worse on 8GB cards. It's very disappointing that this game is touted as the latest graphics marvel by the likes of Digital Foundry yet suffers from such noticeable LOD pop-in. And the shadows are horrible in rasterization; it makes you wonder whether GTA V was doing RT shadows.
 

Hitman928

Diamond Member
Apr 15, 2012
6,626
12,168
136
I'm running a 16GB card here; they obviously compromised on the graphics engine to accommodate 8GB, though I'm not sure if the pop-in is worse on 8GB cards. It's very disappointing that this game is touted as the latest graphics marvel by the likes of Digital Foundry yet suffers from such noticeable LOD pop-in. And the shadows are horrible in rasterization; it makes you wonder whether GTA V was doing RT shadows.

Yeah, AW2 seems to be a mixed bag in terms of graphics. Some scenes look pretty stunning, but there are at least a few glaring issues with the graphics at the same time. The LOD pop-in was something I had noticed in other videos before you even posted, but I was wondering if it was due to it being run on lower-VRAM cards. The fact that it happens on a 16GB card is no good and does point to compromises made to lower VRAM requirements in general. Maybe they'll continue to work on it, and after a few patches the graphics will get more polished.
 

Aapje

Golden Member
Mar 21, 2022
1,530
2,106
106
The 2500K came into the discussion briefly because I said I cannot test the RX 6600 in Alan Wake 2 to see whether it has an 8GB VRAM problem or a GPU processing power problem. The guy above said to just let it go, it's too old, and I simply answered that. The focus is the RX 6600 with its 8GB of VRAM. I use all my 8GB VRAM systems to show how different setups can accommodate the same game, BUT with different settings, because not all 8GB cards are the same.

These very settings have been demeaned in the OP, while they are a crucial part of PC gaming as a whole. People posting 1440p vids at max settings + RT to show how bad 8GB is seems just as inane to me. I am more of a "use settings to solve problems rather than create them" guy. Sorry about that.

VRAM is relative to GPU power as well, and to the price. A 6600 is too weak for a lot of higher settings and is relatively cheap, so you can't expect it to last as long.

But I disagree that for a new GPU you should accept having to work around a major imbalance in the card, especially if that means that the features they promise you don't work, like RT not working due to lack of VRAM.
 
Last edited: