8GB VRAM not enough (and 10 / 12)

Page 27 - AnandTech Forums

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
8GB
In Horizon Forbidden West, the 3060 is faster than the 2080 Super, despite usually competing with the 2070. The 3060 also has a better 1% low than the 4060 and the 4060 Ti 8GB.
[chart: Horizon Forbidden West GPU benchmarks]
In Resident Evil Village, the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing:
RE.jpg
In Company of Heroes, the 3060 has a higher minimum framerate than the 3070 Ti:
CH.jpg

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others, please let me know and I'll add them to the list. Cheers!
 

Tup3x

Golden Member
Dec 31, 2016
1,277
1,410
136
I don't know guys, the chaps at gamegpu.com benchmarked it already and it seems like a healthy game.


No need for me to upload pics. Give them the views. They are worth it.

Quick sum up.

This is the section they benchmarked.


OK, it's probably not the heaviest part of the game, but I remember it wasn't that easy in the previous game.

So for THIS section, even the 6600 is hitting 100 fps at 1080p Epic and 60 fps at 1440p.

The 3060 Ti seems to be able to do 4K with DLSS.


The 4070 Ti is destroying it at 90 fps at 4K Epic.

I don't know if you noticed, but I mentioned the cards I have and care about, lol.

Also, the 3070 and 3070 Ti seem to be doing OK compared to the 3060 Ti, and they are where you'd expect them to be, so 8GB seems to be enough. Maybe it will have problems in other parts. We'll see.

The reported VRAM usage is up to 6GB on Nvidia cards and up to 8GB on AMD cards. That's at 4K.

From the CPU side, even the 3100 can do 78 fps.

Seems very healthy to me. I'll test it tomorrow, since I have preordered.

I swear to God, if I see the 2500K hitting 60 fps again, I'll pile up all my other systems and blow them up (j/k, I don't believe in God and I'm not crazy xD).
Yeah, well, that's not Jedi Survivor. It did run pretty well - a few typical UE loading stutters here and there. It kinda amazes me how they can launch a sequel in this state... Did they learn nothing from working with UE4? Or did the switch to DX12 cause this? Anyway, interesting to see how badly it runs on my rig...
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
I love this AMD critter that keeps gnawing away at the CPU and GPU behemoths Intel and Nvidia. If they weren't such fierce competitors, they could conceivably join forces for a hostile takeover of AMD, extinguishing their most annoying threat.
AMD's graphics division has been slowly gnawing away at the Nvidia behemoth since 8 years before Nvidia was founded. A few more years and I'm sure they'll get there.
 

maddie

Diamond Member
Jul 18, 2010
5,156
5,545
136
AMD's graphics division has been slowly gnawing away at the Nvidia behemoth since 8 years before Nvidia was founded. A few more years and I'm sure they'll get there.
But Nvidia is the Hydra: when one head falls, another emerges. Witness the rise of AI demand after the crash of crypto.
 

Captante

Lifer
Oct 20, 2003
30,353
10,876
136
Hardware Unboxed is not an objective source when comparing AMD and Nvidia... they're only "okay" for monitors too, since they make recommendations with ZERO mention of straight-up garbage-level customer service and/or warranty support.

(looking at YOU MSI and ESPECIALLY Gigabyte!) :oops: :rolleyes:

Find a more trustworthy reviewer for GPUs in particular.
 

psolord

Platinum Member
Sep 16, 2009
2,142
1,265
136
OK, playing a little Jedi Survivor here. At 1080p, MSI Afterburner shows up to 10GB on the 4070 Ti. That by itself doesn't say much until I test on my 8GB cards. Hmmm, I don't have a 4K screen here, but tomorrow the 4070 Ti will go to its final place and I'll do some testing there. Native 4K seems difficult even for the 4070 Ti. It has FSR though, so it will be OK I guess. No DLSS for now.

RT is the problem, however. It brings a HUGE performance hit that doesn't seem to be on the GPU side. The 4070 Ti was showing 40% GPU usage with low clocks (so effectively even lower), yet it was still running at around 35 fps. The CPU is getting hammered with RT, and I'm using a 5GHz 8600K, which also isn't at 100%.
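The diagnosis in that paragraph - low GPU utilization combined with a low framerate means the bottleneck is somewhere other than the GPU - can be sketched as a simple heuristic. The 90%/70% thresholds below are illustrative assumptions on my part, not an established standard:

```python
# Sketch: classify a bottleneck from utilization samples. If the GPU sits
# well below full load while fps is under target, the limit is elsewhere
# (CPU, RT BVH builds, shader compilation, asset streaming).
# Thresholds (90/70) are assumptions for illustration only.

def likely_bottleneck(gpu_util_pct: float, fps: float, target_fps: float) -> str:
    if fps >= target_fps:
        return "none"            # hitting target, nothing to diagnose
    if gpu_util_pct >= 90:
        return "gpu-bound"       # GPU saturated: lower resolution/settings
    if gpu_util_pct <= 70:
        return "cpu-or-engine-bound"  # GPU starved: faster CPU won't hurt,
                                      # but it may also be an engine issue
    return "mixed"
```

For the numbers reported above, `likely_bottleneck(40, 35, 60)` returns `"cpu-or-engine-bound"`, which matches the observation that neither the 4070 Ti nor the 8600K was fully loaded.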

Without RT it seems quite playable, though. I believe the 3060 Ti will also be able to do 1080p/Epic, and the GTX 1070 maybe 1080p/High or Medium. I'll test the 970 too, lol.

It looks WAY better than Fallen Order. Not even close. The heaviness is justified. If this is UE4, son, I'm impressed.

Great gameplay mechanics too. It will be a great game for sure. Some patches will help.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,033
32,510
146
I was watching a 7800X3D + 7900XTX in the first levels, and GPU usage often hit 100% and stayed over 95% the vast majority of the time in the first half hour of the game. That was 4K Epic, FSR2 Quality, RT on, captured with AMD ReLive AV1 game capture.

It seems broken on Nvidia - as if it isn't using any hardware for RT, and texture streaming isn't working correctly.
 
  • Wow
Reactions: Tlh97 and psolord

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,600
6,084
136
Judging by Daniel Owen's video on the 7800X3D and RTX 4090, it seems broken on nV, since GPU utilization tanked pretty hard with RT - whether due to higher driver overhead, poor optimization, a broken code path, Denuvo DRM, or all of the above. Being CPU-limited at 4K Epic settings with RT on, using the fastest gaming CPU available, is nuts.

I'm downloading the game via EA App so should be able to test this evening and see how a 7800X3D/7900XTX combo runs the game.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,033
32,510
146
Judging by Daniel Owen's video on the 7800X3D and RTX 4090, it seems broken on nV, since GPU utilization tanked pretty hard with RT - whether due to higher driver overhead, poor optimization, a broken code path, Denuvo DRM, or all of the above. Being CPU-limited at 4K Epic settings with RT on, using the fastest gaming CPU available, is nuts.

I'm downloading the game via EA App so should be able to test this evening and see how a 7800X3D/7900XTX combo runs the game.
Here is the video I was watching, using your hardware.

 
  • Like
Reactions: Tlh97 and IEC

coercitiv

Diamond Member
Jan 24, 2014
7,365
17,461
136
RT is the problem, however. It brings a HUGE performance hit that doesn't seem to be on the GPU side. The 4070 Ti was showing 40% GPU usage with low clocks (so effectively even lower), yet it was still running at around 35 fps. The CPU is getting hammered with RT, and I'm using a 5GHz 8600K, which also isn't at 100%.
Yup, Daniel Owen has a video up showing serious CPU bottlenecks even on a 7800X3D w/ RTX 4090. Enabling RT makes the CPU situation even worse, and lowering settings to LOW does not help the CPU utilization issue.

By comparison, here's how it runs on a 7900XTX w/ 7800X3D in the same section of the game. The 4090 would run considerably faster than that were it not for the odd CPU usage behavior.

In terms of VRAM, there are already a couple of videos up with gameplay on the 3070, and it looks to be under control with RT disabled, at least at 1080p.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,600
6,084
136
8GB is definitely not enough, at least not at 1440p Epic + RT on (FSR disabled):

Forgive the custom overlay; I still need to figure out how to get the average/1%/0.1% lows to display.

[screenshot: performance overlay at 1440p Epic + RT]
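The "8GB is not enough" failure mode described in this thread can be sketched numerically. The 0.8GB reserve below is an assumption for illustration (the OS, compositor, and driver hold back some VRAM); the point is that a game allocating ~10GB overflows an 8GB card and spills into system RAM over PCIe, which shows up as stutter:

```python
# Sketch (numbers are assumptions, not measurements): why a card "runs out"
# of VRAM before its nominal capacity. Negative headroom means the game's
# working set spills into system RAM, causing stutter.

def vram_headroom_gb(card_gb: float, game_alloc_gb: float,
                     reserved_gb: float = 0.8) -> float:
    """Headroom left after the game's allocation; negative means spillover."""
    return card_gb - reserved_gb - game_alloc_gb

def will_stutter(card_gb: float, game_alloc_gb: float) -> bool:
    """True when the allocation no longer fits on the card."""
    return vram_headroom_gb(card_gb, game_alloc_gb) < 0
```

Under these assumed numbers, `will_stutter(8, 10)` is `True` while `will_stutter(12, 10)` is `False` - consistent with the 3060/6700 XT holding up where the 8GB cards tank.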
 

SteveGrabowski

Diamond Member
Oct 20, 2014
8,949
7,663
136
I beat you to posting that 7900XTX vid. :D I just checked a 6700XT + 5800X3D and it's running really well at 1080p Epic. It keeps grabbing more and more VRAM as play goes on.


Is Epic the equivalent of ultra settings in this game? I wonder how much performance you could claw back at 1440p High. I always pick settings that keep my framerate between 50 and 57 fps: I play on a 60 Hz FreeSync monitor, and I get tearing in most games if I set the frame limiter any closer to 60, and also whenever it drops below 40 fps, since that's outside the monitor's FreeSync range.
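That frame-limiter reasoning generalizes to any VRR display. A minimal sketch, assuming a 40-60 Hz FreeSync window like the monitor described above and a few fps of headroom below the refresh rate (both numbers are the poster's, the 3 fps default is my assumption):

```python
# Sketch: pick a frame cap that stays inside a FreeSync/VRR window.
# Outside [vrr_min, refresh] adaptive sync disengages and tearing returns,
# so the cap must sit below the refresh rate but above the window floor.

def pick_frame_cap(vrr_min: int, refresh: int, headroom: int = 3) -> int:
    """Return a cap safely inside the VRR window, headroom fps under refresh."""
    cap = refresh - headroom
    if cap < vrr_min:
        raise ValueError("VRR window too narrow for the requested headroom")
    return cap

def inside_vrr_window(fps: float, vrr_min: int, refresh: int) -> bool:
    """True when adaptive sync can match this framerate (no tearing)."""
    return vrr_min <= fps <= refresh
```

For a 40-60 Hz window, `pick_frame_cap(40, 60)` gives 57 - the same cap the poster arrived at by trial and error - and `inside_vrr_window(35, 40, 60)` is `False`, matching the sub-40 fps tearing.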
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,067
3,574
126

RIP...
May the force be with you if you have less than 24GB... literally.

8GB is definitely not enough, at least not at 1440p Epic + RT on (FSR disabled):


waaah... your game is playing better than mine.
Wait, I was playing at 3840x1600.
I was barely getting 60 fps.
I have a 4090.

I finally found something to give my 4090 a good exercise, although you'd think I was playing this game on a 3070 Ti instead.

Judging by Daniel Owen's video on the 7800X3D and RTX 4090, it seems broken on nV, since GPU utilization tanked pretty hard with RT.

LOL, no, it's actually sort of accurate.
Well, different CPU in my case.
Had to disable RT.
Wish the game had DLSS.
 
  • Wow
Reactions: Captante

Captante

Lifer
Oct 20, 2003
30,353
10,876
136
Thing is, from what I've seen this isn't a case of "high-tech mind-blowing graphics"; it's a case of some TRULY pathetic coding and optimisation by the developers.

I played the first game and enjoyed it (even with keyboard & mouse!) but I won't be buying this until it's fixed and I suggest everyone else do the same.

Further, while my 3080fe/5800x is no longer a "cutting edge" gaming system, it should be PLENTY fast enough to run any PROPERLY WRITTEN game at max-detail.

Heck, MOST games (even new ones!) can be made to work "okay" even on my ancient FX-8350/GTX-980 backup desktop, provided they'll load in the first place... there's zero excuse for this nonsense.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Hardware Unboxed is not an objective source when comparing AMD and Nvidia...

Utter nonsense. Their benchmarks objectively prove 8GB is a problem and their findings have been confirmed by at least 2 other sources.

But it's pretty amusing to see you using #2, haven't seen that one for a while.

Further, while my 3080fe/5800x is no longer a "cutting edge" gaming system, it should be PLENTY fast enough to run any PROPERLY WRITTEN game at max-detail.
Ah yes, #13, "8GB is plenty because a random forum user said so". We get that one a lot around here.

So tell me, who gets to decide when 8GB isn't enough? Is it you? Or someone else?

8GB VRAM has been around for nine years in consumer space. Would you equally argue for the same stagnation of CPU core count and DRAM capacity from the year 2014?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
It seems broken on Nvidia - as if it isn't using any hardware for RT, and texture streaming isn't working correctly.
Remember when this forum was full of people telling us DX12 would provide automatic performance gains for developers without any effort (including all Indie titles), as long as the engine (e.g. Unreal) already supported it?

Also remember when we were told Ray Tracing automatically works everywhere without any effort, you just need to flip a simple switch?

Yeah, those people are pretty quiet these days. The fact that they thought game developers would be able to program GPUs better than the hardware engineers who actually built them was a comical delusion.
 
  • Like
Reactions: DAPUNISHER

Captante

Lifer
Oct 20, 2003
30,353
10,876
136
Utter nonsense. Their benchmarks objectively prove 8GB is a problem and their findings have been confirmed by at least 2 other sources.

But it's pretty amusing to see you using #2, haven't seen that one for a while.


Ah yes, #13, "8GB is plenty because a random forum user said so". We get that one a lot around here.

So tell me, who gets to decide when 8GB isn't enough? Is it you? Or someone else?

LOL... first of all, I've been watching Hardware Unboxed since they first started. The fact that they continually and wholeheartedly suggest buying expensive, failure-prone products from companies with crap support is too much for me to stomach.

But maybe it's "good enough" for you? :D


Second, a 3080 FE has 10GB, not 8 (not that it matters a whole lot), and I've yet to have a REAL issue 100% related to lack of VRAM on a 4GB GTX-980.

And third, thanks (I guess?) for reminding me why I avoid this forum, with your [HAPPY AND PLEASANT] tone despite my being a hard-core gamer... nice work.

;)
 

Mopetar

Diamond Member
Jan 31, 2011
8,491
7,747
136
A lot of games have problems because they licensed an engine, and the code monkeys that are trying to get it to do something it wasn't originally designed for aren't as skilled as the people who built it.

Look at how awful Bethesda games tend to run (especially considering the graphics you get) because it's all being run on an ancient engine being held together with duct tape and bubblegum.

Studios that build and develop their own engines and understand their capabilities and limitations aren't having these same sort of problems.

DX12 is great for the skilled practitioners of their craft who can take that greater control and create something amazing. Give a master painter a better brush, more varied pigments, and a quality canvas and they'll make something that wasn't previously possible. Give the same to a monkey and we already know what the results will be.
 
  • Like
Reactions: Tlh97
Jul 27, 2020
28,175
19,192
146
Second, a 3080 FE has 10GB
Still, it's a good idea to sell your card while it still has life and use that money towards getting something current. Otherwise, it will become a paperweight. That's the obligatory ongoing PC gamer tax. Unless you plan to hand the card down to someone and pay full price for a new card down the road.
 
  • Like
Reactions: Captante

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,067
3,574
126
Guys, let's calm down a bit.
We should be pissed at the programmers, not at each other.

Typically I think 8GB would be enough, unless it's a very large world and you need to render every last air molecule that passes by the frame.

All the extra stuff loading is obviously textures and other stupid stuff.
There's also probably uber memory trash piling up on our GPUs since it's not optimized.

Let's give it some time to see.
But this is not looking good for PC gaming.
First it was The Last of Us, and now this.

Of course, having more memory is always better.
And I still think that if a card costs over 600 dollars, it had better have at least 16GB of GDDR6X, not some desktop RAM like ASUS pulled on us a long time ago on the budget cards.

But this is not helping.
I think they were writing code for the game on a 4090 and went "oh, it runs at 60 fps... OK, we're done."

The delusion is strong with this one.

Relax one must... to see and fight another day always wins me says. - in Yoda's voice...