8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
8GB
In Horizon Forbidden West the 3060 is faster than the 2080 Super, despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
[Horizon Forbidden West benchmark chart]
In Resident Evil Village the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing enabled:
[Resident Evil Village benchmark chart]
In Company of Heroes the 3060 has a higher minimum framerate than the 3070 Ti:
[Company of Heroes benchmark chart]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

Aapje

Golden Member
Mar 21, 2022
Our intrepid Brazilian modders proving 8GB is forced obsolescence:
Yeah. We know that Nvidia was pretty close to releasing a 20 GB 3080 (some of which were actually sold to miners by MSI, who secretly offloaded cards from a test production run, breaking Nvidia's rules) and a 16 GB 3070 (which Steve from Hardware Unboxed was told about by AIBs).
 

blckgrffn

Diamond Member
May 1, 2003
lmao, I see this constantly :D

[attached image]

These Ampere cards would have put a crapload of pressure on Ada. The 20GB 3080 would have really made the 12GB 4070 seem like an even larger compromise.

It would have added years to the utility of the 3080 though, so it really sucks that it never happened.

I'll remain steadfast in my view that the 3070 ti should have been 16GB for sure. Smh.
 

Aapje

Golden Member
Mar 21, 2022
At this point reviewers are taking notes on the game selection to prepare for the 4060Ti review. AMD will also feel the heat with their N33-based cards, unless they do something to make the 7600XT more viable.
The 7600 XT could be sold for as cheap as $250, so at that price it will probably still sell to entry-level gamers (the kind of people who bought 1660 Supers in the past).

Nvidia is most likely going to ask for at least $399 for the 4060 Ti and at least $330 for the 4060, which makes them a far harder sell.
 

BFG10K

Lifer
Aug 14, 2000
At this point reviewers are taking notes on the game selection to prepare for the 4060Ti review.
Remember when nVidia tried to blacklist HWUB because they didn't focus on "wray traycing" enough? Wouldn't it be funny if for the 4060TI reviews, nVidia told reviewers to focus on rasterization again? o_O

AMD will also feel the heat with their N33-based cards, unless they do something to make the 7600XT more viable.
I'm 90% certain AMD's new parts will be 12GB, with the possible exception of a cheap entry-level 8GB.

nVidia OTOH, I can just see them plow ahead with a $400-$450 128-bit 8GB 4060TI because of their arrogance. They'll push TV motion interpolation DLSS 3.0 like it's the second coming of the Messiah.

But if it blows up too much we may see a "Super" refresh in six months, just like Turing.
 

Aapje

Golden Member
Mar 21, 2022
A bit worried for my 6700 XT too after the way they talk about 12GB in the conclusion. Though I'd hope it would be enough to at least run at console equivalent textures. No hope whatsoever for scrubby 8GB card though. :tearsofjoy:
N22 is not that strong, so 12 GB should be a good match. Supposedly, consoles have around 10-12 GB to work with as VRAM, so 12 GB should be sufficient to get console equivalent graphics with no RT.

I think that 12 GB is a bigger issue for the 4070 and especially the 4070 Ti, as they are more powerful and can be used with RT or other techniques that make them perform quite a bit better than consoles, but then the VRAM may hold them back. And the higher price also makes it a bigger issue if they become obsolete relatively soon.
 

jpiniero

Lifer
Oct 1, 2010
I'll remain steadfast in my view that the 3070 ti should have been 16GB for sure. Smh.

Again, when Ampere first launched only 1 GB GDDR6X chips were available. They could have refreshed when the 2 GB chips became available but obviously chose not to.

I believe the 3090 Ti does use the 2 GB chips but that is the only one.

It would be interesting to see how the costs would have looked for that theoretical 58 CU GA103 with 16 GB of GDDR6X versus the 4070 and 4070 Ti.
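For context on why chip density pins these capacities: each GDDR6/GDDR6X package sits on a 32-bit channel, so the bus width fixes the chip count and the per-chip density (1 GB or 2 GB) fixes the total. A minimal back-of-the-envelope sketch, assuming the commonly cited bus widths for these SKUs:

```python
# Rough sketch: VRAM capacity = (bus width / 32 bits per chip) * per-chip density.
# Bus widths below are the commonly cited figures for these SKUs (assumptions).
CHANNEL_BITS = 32  # each GDDR6/GDDR6X package uses a 32-bit interface

def vram_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    chips = bus_width_bits // CHANNEL_BITS
    return chips * chip_density_gb

print(vram_gb(256, 1))  # 3070/3070 Ti as shipped: 8 chips x 1 GB -> 8 GB
print(vram_gb(256, 2))  # same 256-bit bus with 2 GB chips        -> 16 GB
print(vram_gb(320, 1))  # 3080 as shipped: 10 chips x 1 GB        -> 10 GB
print(vram_gb(320, 2))  # the rumored 20 GB 3080                  -> 20 GB
print(vram_gb(192, 2))  # 4070/4070 Ti: 6 chips x 2 GB            -> 12 GB
```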
 

blckgrffn

Diamond Member
May 1, 2003
Again, when Ampere first launched only 1 GB GDDR6X chips were available. They could have refreshed when the 2 GB chips became available but obviously chose not to.

I believe the 3090 Ti does use the 2 GB chips but that is the only one.

It would be interesting to see how the costs would have looked for that theoretical 58 CU GA103 with 16 GB of GDDR6X versus the 4070 and 4070 Ti.

Again?

I didn't say GDDR6X. I said 16GB of memory. Given the current state of things, the extra bandwidth is worthless when you are out of memory.

That GDDR6X power budget should have gone to powering 16GB of non-X memory for a much longer-lasting card.

A full die and 16GB of RAM would have sold plenty of folks on it as the card to hold onto for a long time.

Which is why it didn't get made, and they used the Ti to dump GDDR6X instead.
 

jpiniero

Lifer
Oct 1, 2010
Again?

I didn't say GDDR6X. I said 16GB of memory. Given the current state of things, the extra bandwidth is worthless when you are out of memory.

That GDDR6X power budget should have gone to powering 16GB of non-X memory for a much longer-lasting card.

But "logically" you would have had to refresh the 3080/Ti too with 2 GB versions if they were going to use 2 GB GDDR6 on the 3070/Ti instead of 1 GB GDDR6X. It's not entirely clear when the 2 GB chips became available.
 

blckgrffn

Diamond Member
May 1, 2003
But "logically" you would have had to refresh the 3080/Ti too with 2 GB versions if they were going to use 2 GB GDDR6 on the 3070/Ti instead of 1 GB GDDR6X. It's not entirely clear when the 2 GB chips became available.

I don't dispute that!

That would have been the best possible outcome, right? 16GB 3070Ti and a 20 GB 3080 Ti with the fastest possible memory? TBP is already pretty high so what the heck, what's a few more watts between friends? :)

But from where I am standing - 16GB of whatever memory would have been preferable to the current state of things.

But couldn't the Ti have maybe even used faster 16 Gbps GDDR6 as a middle ground (the 3070 used 14 Gbps memory, per Google)? I think by the time the refresh arrived that wasn't even top tier anymore.
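For reference, peak memory bandwidth is just bus width times per-pin data rate, so the 14 vs 16 Gbps question is easy to put numbers on. A minimal sketch, assuming a 256-bit bus for all three configurations and the commonly quoted 19 Gbps for the 3070 Ti's GDDR6X:

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Peak bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbs(256, 14))  # 3070, 14 Gbps GDDR6        -> 448 GB/s
print(bandwidth_gbs(256, 16))  # hypothetical 16 Gbps GDDR6 -> 512 GB/s
print(bandwidth_gbs(256, 19))  # 3070 Ti, 19 Gbps GDDR6X    -> 608 GB/s
```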
 

CP5670

Diamond Member
Jun 24, 2004
Jedi Survivor looks like a real system hog. Seems like a bad port more than anything else; the graphics don't look all that different from Fallen Order to me, and that game ran fine on a 1080 Ti.
 

psolord

Platinum Member
Sep 16, 2009
Wasn't the original horrible on all platforms at launch too?

I had done a run of Jedi Fallen Order on a freakin' 2500K and a 7950, both from the PS3 era, and the game ran fine at EPIC settings, fml...

[embedded gameplay video, non-monetized channel]


How much heavier can this one be? I mean, it's the same engine essentially, isn't it?
 

coercitiv

Diamond Member
Jan 24, 2014
Survivor is a technical mess; we already have DO NOT BUY reviews on YT telling people not to touch the PC version just yet. People can't play it properly even on a 4090. The 40-50% GPU utilization screams CPU performance issues, and considering the consoles are running fine... my bet would go towards DRM software killing this game with a hatchet while laughing hysterically.

The VRAM allocation numbers are also scary, but we can't judge them until CPU utilization is fixed.
 

Timorous

Golden Member
Oct 27, 2008
Survivor is a technical mess; we already have DO NOT BUY reviews on YT telling people not to touch the PC version just yet. People can't play it properly even on a 4090. The 40-50% GPU utilization screams CPU performance issues, and considering the consoles are running fine... my bet would go towards DRM software killing this game with a hatchet while laughing hysterically.

The VRAM allocation numbers are also scary, but we can't judge them until CPU utilization is fixed.
Yet others are saying it works fine on the 7900XTX. Must be the magic always perfect Nvidia drivers...
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
Yet others are saying it works fine on the 7900XTX. Must be the magic always perfect Nvidia drivers...
To be fair, it is an AMD bundled game. It is also likely that once issues get addressed, it will run better on Nvidia GPUs, since it is a UE game.

Would not surprise me if the Nvidia driver overhead combined with CPU-heavy asset decompression is partially responsible for the issues. DRM is the prime accomplice. Something is nerfed for certain. No way a 5950X or 9900KS should be holding a GPU back like that.
 

coercitiv

Diamond Member
Jan 24, 2014
Would not surprise me if the Nvidia driver overhead combined with CPU-heavy asset decompression is partially responsible for the issues. DRM is the prime accomplice. Something is nerfed for certain. No way a 5950X or 9900KS should be holding a GPU back like that.
Daniel Owen already discusses the topic of driver overhead and it looks like this may partially explain the issues:

Asset decompression and VRAM pressure might explain the big difference in performance between closed and open areas of the game, as some of the early reviews point out.

The DRM hunch is based solely on the massive gap between official system requirements and real-world results.
 
Jul 27, 2020
Nvidia driver overhead
I can see why Nvidia struggles with that. They don't have the luxury of going into the adjacent building or the cafeteria and sitting down with the actual CPU engineers to discuss performance issues. People say that AMD made the best decision when it bought ATI. I say the ATI/RTG folks are damn lucky to have been acquired by a CPU company.