8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super despite usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and 4060 Ti 8GB.
[attached benchmark chart]
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing enabled:
[attached benchmark chart]
Company Of Heroes: the 3060 has a higher minimum than the 3070 Ti:
[attached benchmark chart]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

poke01

Diamond Member
Mar 8, 2022
Yet the 4070 decimated the GRE with its RT prowess at lower resolutions. I suspect lack of VRAM at 4K is to blame though yes, more data would be helpful.
The 4070 Super and the 70 Ti have a 192-bit bus as well. The only major difference between those cards and the 4070 is that its RT and shader core counts are heavily cut down: it has ~1,200 fewer shader cores and 10/14 fewer RT cores than the 4070 S/Ti. Lack of VRAM may be a contributing factor, but the performance is lacking relative to the other AD104 cards.
 
Jul 27, 2020
Lack of VRAM may be a contributing factor, but the performance is lacking relative to the other AD104 cards.
Yes, but it kinda throws shade on the belief of Nvidia users that RTX RT cores are the holy grail of ray tracing. These cores are getting beaten here by AMD's poorer RT implementation.
 

poke01

Diamond Member
Mar 8, 2022
Yes, but it kinda throws shade on the belief of Nvidia users that RTX RT cores are the holy grail of ray tracing. These cores are getting beaten here by AMD's poorer RT implementation.
They are the holy grail. That's why the GRE is getting beaten at 4K by the 4070 Super, which only has a 192-bit bus while the GRE has a 256-bit bus.

The 4070 has way fewer cores; of course you can't win, even with better RT tech, when you compare against an AMD card that has more grunt. What I'm saying is the 4070 cannot handle 4K RT because its 5,888 shader cores/46 RT cores aren't enough for 4K, and the fact that it wins at 1080p and 1440p shows they're only sufficient for those resolutions, which proves my point. The 192-bit bus width doesn't help at 4K either, which is why Nvidia had to drastically improve the shader count with the 4070 Super.

What's holding all these AD104 cards back at 4K RT is the 192-bit bus as well. Just look at how much better the 4070 Ti Super performs with its 256-bit bus at 4K. The difference isn't as drastic at 1080p and 1440p.
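For reference, here's a minimal sketch of the spec deltas being argued about, using spec-sheet figures quoted from memory (shader/RT core counts, bus widths, rated memory speeds), so treat the exact numbers as assumptions to verify rather than gospel:

```python
# Rough spec comparison of the cards discussed above. Figures are from public
# spec sheets, quoted from memory -- verify before relying on them.
cards = {
    #                  shaders  rt   bus_bits  mem_gbps (all GDDR6X)
    "RTX 4070":        (5888,   46,  192,      21),
    "RTX 4070 Super":  (7168,   56,  192,      21),
    "RTX 4070 Ti":     (7680,   60,  192,      21),
    "RTX 4070 Ti Sup": (8448,   66,  256,      21),  # AD103, not AD104
}

base_shaders, base_rt, _, _ = cards["RTX 4070"]
for name, (shaders, rt, bus, gbps) in cards.items():
    bandwidth = bus * gbps / 8  # GB/s = bus width (bits) * per-pin rate (Gbps) / 8
    print(f"{name:16s} +{shaders - base_shaders:4d} shaders, +{rt - base_rt:2d} RT cores vs 4070, "
          f"{bus}-bit, {bandwidth:.0f} GB/s")
```

The +1,280/+10 and +1,792/+14 deltas against the 4070 Super and 4070 Ti are where the "~1,200 fewer shader cores and 10/14 fewer RT cores" figures above come from.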
 

poke01

Diamond Member
Mar 8, 2022
That's 504GB/s vs. 576GB/s, less than half the difference implied by the bus size.
The GRE gets beaten by 25% by the 4070 Super at both 1080p and 1440p. At 4K the lead shrinks to just 4%. Why? Because the 4070 Super chokes at 4K because of its lower bandwidth and less VRAM, despite having the core count.

This has nothing to do with the implementation of the RT cores; those are great.

When Nvidia has the bandwidth and VRAM needed for 4K RT, as on the 4070 Ti Super, the difference at 4K is massive: the Ti Super wins by 32% vs the 7900 XT.
 

coercitiv

Diamond Member
Jan 24, 2014
This has nothing to do with the implementation of the RT cores; those are great.
I did not dispute your point overall, just the argument of bandwidth in this instance, specifically the optics of comparing 256 bits vs. 192 bits when the latter is using GDDR6X.
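To put numbers on that, here's a back-of-the-envelope check of the 504 vs. 576 GB/s figures quoted above, assuming the usual spec-sheet memory speeds (21 Gbps GDDR6X on the 4070 Super's 192-bit bus, 18 Gbps GDDR6 on the GRE's 256-bit bus):

```python
# Back-of-the-envelope check of the 504 vs. 576 GB/s figures quoted above.
# Assumes 21 Gbps GDDR6X (4070 Super) and 18 Gbps GDDR6 (7900 GRE).
def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bits * per-pin data rate / 8."""
    return bus_bits * data_rate_gbps / 8

super_bw = bandwidth_gbs(192, 21)  # -> 504 GB/s
gre_bw = bandwidth_gbs(256, 18)    # -> 576 GB/s

print(f"4070 Super: {super_bw:.0f} GB/s, 7900 GRE: {gre_bw:.0f} GB/s")
print(f"bus width gap: {256 / 192 - 1:.0%}, bandwidth gap: {gre_bw / super_bw - 1:.0%}")
# ~33% wider bus on the GRE, but only ~14% more bandwidth once GDDR6X is factored in.
```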
 
Jul 27, 2020
When Nvidia has the bandwidth and VRAM needed for 4K RT, as on the 4070 Ti Super, the difference at 4K is massive: the Ti Super wins by 32% vs the 7900 XT.
Yeah, I kinda wish the 4070 Super were just an identical 4070 Ti Super with only 10GB VRAM (yes, I will punish the peasants for paying less! :D) so we could see how much VRAM matters.

Maybe there's hope. Maybe I can beam my thoughts of a 4070 LE 10GB with the same GPU as 4070 Ti Super into Jensen's brain :p

Hopefully the dude will then trade in his 12GB 4070 Ti for that coz VRAM doesn't matter...
 

poke01

Diamond Member
Mar 8, 2022
Hopefully the dude will then trade in his 12GB 4070 Ti for that coz VRAM doesn't matter...
Oh, it does matter; that just wasn't my initial point. 12GB is the minimum these days for 4K, along with a 256-bit bus.

And a >$500 card should come with 16GB minimum, because that's what the market values.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
Steve wrote a great script for that vid. The conclusion was indisputable: 8GB is holding back game development and costing more time = money. And reviewers need to do a much better job of informing their audiences about the limitations of VRAM, and provide better information for shoppers' edification. To use my favorite malaphor: it ain't rocket surgery.

This is yet another vid that liver-kick KOs the bigger-bar-is-better arguments that are incessantly spammed here, the ones showing everything is just peachy with the 8GB card vs. its 16GB twin. Right up until you look at the game being tested and see the visual problems the graphs don't show.

Makes sense that low bandwidth plus low vram = bad time.

It was also gratifying to see that our small group's consensus scales to a much larger group.
 

Ranulf

Platinum Member
Jul 18, 2001
I'm shocked at these results, just shocked. It's like crazy that bandwidth can limit performance or that VRAM can compensate for that. Who knew that buying a $400 GPU with a 128-bit memory bus, 8GB of VRAM and an x8 PCIe link could be, well, overpriced.

Also, just look at that frame time:

[attached frame time graph from the "Breaking Nvidia's GeF[...]" video]
 
Jul 27, 2020
I can almost predict what the dude will say:

PCIe 2.0? Are you insane???? Who runs PCIe 2.0 in real life? And that too at x8!!???!!

Verdict: This is a very convoluted, special case that has no connection with real world gaming and thus, it's an invalid case and absolutely has no basis in reality for proving the hypothesis that 8GB VRAM isn't enough.

:p
 

Ranulf

Platinum Member
Jul 18, 2001
I can almost predict what the dude will say:

PCIe 2.0? Are you insane???? Who runs PCIe 2.0 in real life? And that too at x8!!???!!

Verdict: This is a very convoluted, special case that has no connection with real world gaming and thus, it's an invalid case and absolutely has no basis in reality for proving the hypothesis that 8GB VRAM isn't enough.

:p

Indeed, and if one looks at the specs in a signature, well, that 2500K that was tested uses a P67 mobo, which is PCIe 2.0. My old ASRock P67 Extreme mobo was, anyway.
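For anyone wondering how much that actually costs, here's a quick sketch of approximate theoretical x8 link bandwidth per PCIe generation (accounting only for line encoding, not protocol overhead, so treat the figures as ballpark):

```python
# Approximate theoretical PCIe x8 link bandwidth, to show why an 8GB card that
# spills into system RAM hurts so much more on an old PCIe 2.0 board.
# Per-lane rates account for line encoding (8b/10b for 2.0, 128b/130b for 3.0/4.0).
pcie_lane_gbs = {
    "PCIe 2.0": 5.0 * (8 / 10) / 8,      # ~0.5 GB/s per lane
    "PCIe 3.0": 8.0 * (128 / 130) / 8,   # ~1.0 GB/s per lane
    "PCIe 4.0": 16.0 * (128 / 130) / 8,  # ~2.0 GB/s per lane
}

for gen, per_lane in pcie_lane_gbs.items():
    print(f"{gen} x8: ~{per_lane * 8:.1f} GB/s")
# PCIe 2.0 x8 is roughly 4 GB/s, about a quarter of 4.0 x8, so every texture that
# overflows the 8GB of VRAM has to cross a much slower link.
```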
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
I still use a 5700 XT in my smaller, second desktop, which I use for traveling to Quakecon. It has 8GB of memory, but it does pretty well for the games we play there. Obviously it doesn't perform as well as my 7900 XTX, but I don't turn on RT for either (it isn't an option on the 5700 XT anyway). I usually adjust settings for optimal FPS regardless.

Also, when traveling I usually borrow a smaller 1080p monitor from someone, so my resolution isn't as high, which helps.
 

DaaQ

Golden Member
Dec 8, 2018
I can almost predict what the dude will say:

PCIe 2.0? Are you insane???? Who runs PCIe 2.0 in real life? And that too at x8!!???!!

Verdict: This is a very convoluted, special case that has no connection with real world gaming and thus, it's an invalid case and absolutely has no basis in reality for proving the hypothesis that 8GB VRAM isn't enough.

:p
Sorry, I just have to, because I was able to take #1 from a 4080S.

[attached screenshots]