8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
8GB
In Horizon Forbidden West, the 3060 is faster than the 2080 Super despite usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
pFJi8XrGZfYuvhvk4952je-970-80.png.webp
In Resident Evil Village, the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing enabled:
RE.jpg
In Company of Heroes, the 3060 has a higher minimum framerate than the 3070 Ti:
CH.jpg

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

marees

Golden Member
Apr 28, 2024
1,061
1,448
96

Indiana Jones & the Great Circle Settings for Low-end PC: RTX 3060, RTX 3060 Ti & RTX 4060

Indiana Jones & the Great Circle: Low-end PC Benchmarks​

Indiana Jones and the Great Circle runs remarkably well on low-end hardware…as long as you keep path tracing disabled. The GeForce RTX 3060 12 GB averaged 70 FPS at 1080p “Supreme” quality graphics settings with DLSS set to quality mode. The gameplay was quite smooth with 1% lows of 55 FPS. This sub-$300 card produced an average of 61 FPS at 1440p with DLSS set to “Performance” mode and lows of 50 FPS.

The GeForce RTX 4060 laptop GPU on the Alienware x14 produced similar numbers at 1080p, but texture quality had to be reduced to "High" and DLSS set to Balanced mode. To stay above 60 FPS at 1440p, the shadows, reflections, and water quality had to be reduced to "High" and DLSS to "Performance" mode.
The RTX 4060 mobile came very close to running out of graphics memory. If you have an 8 GB card and keep seeing artifacts or stutters, reduce shadow quality to "Medium."

Indiana Jones & the Great Circle: Best Settings for Low-end PC​

Graphics Settings | RTX 3060 12 GB | RTX 3060 Ti 8 GB | RTX 4060 8 GB | RTX 4060 Laptop GPU 8 GB
Resolution | 1080p / 1440p | 1080p / 1440p | 1080p / 1440p | 1080p / 1440p
FPS Target | 60 FPS | 60 FPS | 60 FPS | 60 FPS
Texture Quality | Supreme | High | High | High
Shadow Quality | Ultra | Ultra | Ultra | High
Decal Rendering Distance | Ultra | Ultra | Ultra | Ultra
Global Illumination (RTXGI) | High | High | High | High
Reflection Quality | Ultra | Ultra | Ultra | High
Motion Blur | Ultra | Ultra | Ultra | Low
Water Quality | Ultra | Ultra | Ultra | High
Volumetrics Quality | Medium | Medium | Medium | Medium
Hair Quality | High | High | High | High
Texture Filtering | Very Ultra | Very Ultra | Very Ultra | Very Ultra
Path Tracing | Off | Off | Off | Off
Upscaling | DLSS Quality / Balanced | DLSS Quality | DLSS Quality | DLSS Balanced / Performance
Frame Generation | Off | Off | Off | Off
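As a back-of-envelope aside (my own rough math, not from the source article) on why Texture Quality is the one row where the 8 GB cards step down from "Supreme": assuming BC7-style block compression at roughly 1 byte per texel plus a full mip chain, each 4K-resolution texture costs on the order of 22 MiB, so a few hundred of them already crowd an 8 GB card before geometry, render targets, and ray-tracing structures are counted.

```python
# Back-of-envelope only: assumes BC7 block compression (~1 byte per texel)
# and a full mip chain adding roughly 1/3 on top of the base level.
# Real games mix formats and use streaming pools, so treat this as a sketch.
def texture_mib(width: int, height: int, bytes_per_texel: float = 1.0) -> float:
    base_bytes = width * height * bytes_per_texel
    return base_bytes * 4 / 3 / 2**20  # mip chain ~= 4/3 of the base level

per_texture = texture_mib(4096, 4096)   # ~22.4 MiB each
budget_mib = 8 * 1024                   # 8 GiB card
print(f"{per_texture:.1f} MiB per 4K BC7 texture")
print(f"~{budget_mib / per_texture:.0f} such textures would fill 8 GiB on their own")
```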

 

marees

Golden Member
Apr 28, 2024
1,061
1,448
96

Screenshot of the settings:

Screenshot_20241214-124147_Opera.jpg
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,354
30,411
146
Steve and Tim discuss VRAM on their latest podcast, since the B580 12GB is here for $250. Steve agrees with most of us: he'd rather have the max textures and higher-settings visuals it offers for that performance tier than the heavily compromised level of ray tracing and other visual settings needed to use those features on a 4060. No upscaler looks good starting at 1080p, and the extra 4GB means the B580 can use frame generation and play at 1440p when the 4060 cannot.

For anyone that wants to listen -

 

blckgrffn

Diamond Member
May 1, 2003
9,650
4,226
136
www.teamjuchems.com
Screenshot of the settings:

So we've learned that nvidia's trump card on this front is to just run the game at a lower and lower internal resolution and pick it back up off the floor with DLSS. It took me a while to figure out that's what was going on in that sea of "High" and "Ultra" settings.
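To put rough numbers on that, here's a minimal sketch (my illustration, not from the post) using the commonly cited per-axis DLSS render scales; the exact factors can vary by game and DLSS version:

```python
# Rough illustration only: approximate per-axis render scales commonly
# cited for DLSS modes (actual values can vary by title and version).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate resolution the game renders at before DLSS upscales it."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(f"1440p {mode}: {internal_resolution(2560, 1440, mode)}")
# 1440p Performance comes out to roughly 1280x720 internally.
```

So a 1440p "Performance" result is really a ~720p render with reconstruction on top, which is exactly the "pick it back up off the floor" part.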

I guess most of us came to the logical conclusion that this would be "the way" going forward to stretch lower end cards.

And it further allows for "use the right settings" to continue being a valid argument long after a card would otherwise have stuttered to a stop. Instead of telling folks to game at 720p on their 1080p (or even 1440p!) screens, magic tensor magic makes it happen when you hit the "optimize" button in the new and admittedly shiny (or at least table stakes) nvidia app. And you can still use High/Ultra in game settings to feel like you are in the PCMR!

It's a "good" thing I guess but feels gross to me.

I also can hear this as a great justification for 8GB on the next gen of nvidia gaming cards below $450. I mean, justification from the sales and marketing required to move a lot of entry level product, anyway.
 

MrTeal

Diamond Member
Dec 7, 2003
3,907
2,664
136
So we've learned that nvidia's trump card on this front is to just run the game at a lower and lower internal resolution and pick it back up off the floor with DLSS.
Next we can get the RAMDACs put back so we can get some sweet free smoothing running our games on our 21" CRTs. It's the future of gaming.
 

blckgrffn

Diamond Member
May 1, 2003
9,650
4,226
136
www.teamjuchems.com
Is anyone surprised though?
We shouldn’t be at all. Running a card out of vram is one of the best ways to make sure a GPU is going to need to be replaced before long.

We can be reasonably sure they will only add RAM when they absolutely have to. GTX 960 says hello :) That card definitely lived the "2GB is the right value" era into the teeth of "2GB is a wasted investment" while it was still available. And a lot of people saw it coming, but judging from the memory-lane threads I just revisited, they were the outliers at launch :) 2016, what a time.

In all fairness, a base 5070 at 12GB doesn't offend me much; the 4070 was the first 70-series card to jump up, and a couple-generation run of that seems realistic. And 12GB seems to largely keep you in the clear across the board for the moment.
 

Ranulf

Platinum Member
Jul 18, 2001
2,774
2,287
136
And a lot of people saw it coming, but judging from the memory-lane threads I just revisited, they were the outliers at launch :) 2016, what a time.

Heh, I just went and looked at an old thread via search. Fun times: frametimes, power efficiency, and price/perf; 960 vs. GTX 760, or 770 vs. R9 290/280X.

From the release month in early 2015:

Unless you are using Maxwell optimized AA, 960 takes quite a performance hit (in pr slides nvidia conveniently compared 960 to 660 with maxwell aa). Also, I suspect it performs worse in older games, compared to 760. I need to verify the latter, though. Compare Far Cry 4 versus Crysis 3 TPU benches.

FC4 v1.7 is using close to 1.8gb of Vram (max, smaa, simulated fur, enhanced god rays, hbao+, soft shadows) even at 1280x1024 res (played last night). Some food for thought, what it's going to be like in 2016?

This card wasn't meant to last long. And failure to mention this, is a sin.

If you must buy Maxwell, get 970/980 instead. At least you won't have to replace those cards in a year (unless you play wow/LoL or other undemanding games). With nVidia you have to pay more to get good products, I am afraid.

And yes, the 960 lasted all of 1.5 years before the GTX 1060 replaced it. Well, you could get the 4GB 960 for $230. Yay, not as bad as the 4060 Ti 8GB vs. 16GB.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,354
30,411
146
The reaction to the leak of the 5060 being 8GB is encouraging. PCMR -

nvidia-greed-v0-bbh7jg7ycf7e1.jpeg


Some shill trying to counter it with BS about Nvidia neural texture compression. I've seen some funny Nvidia propaganda, but good guy Nvidia coming to the rescue is the funniest one yet. We won't have to worry about ram and vram anymore guyz! 🤣
 
Jul 27, 2020
24,472
17,031
146
Even if Nvidia does it, they wouldn't do it out of the goodness of their heart. They would tie it somehow to their 5000 series GPUs so only they can decode the proprietary texture format and it will mostly be available in Nvidia sponsored titles. Then certain reviewers will use that bullet point to get users to feel bad about their 8GB cards, including Nvidia's 3000/4000 series cards.
 

Thunder 57

Diamond Member
Aug 19, 2007
3,581
5,984
136
The reaction to the leak of the 5060 being 8GB is encouraging.

That has to be Jarred Walton, what a cuck. He says how AMD 10GB is equal to Nvidia 8GB. And he justifies it by saying they have better compression.
 

mikeymikec

Lifer
May 19, 2011
20,041
14,414
136
Who thinks the 1070 was "mid range"? TPU's review in 2016 has it listed as between $379-$449!

In 2016, videocardz.com was talking about the 1060 being mid-range for about $250:
 

marees

Golden Member
Apr 28, 2024
1,061
1,448
96
Who thinks the 1070 was "mid range"? TPU's review in 2016 has it listed as between $379-$449!

In 2016, videocardz.com was talking about the 1060 being mid-range for about $250:
I believe there was an RX 480 8GB that could be considered mid-range
 

Thunder 57

Diamond Member
Aug 19, 2007
3,581
5,984
136
Who thinks the 1070 was "mid range"? TPU's review in 2016 has it listed as between $379-$449!

In 2016, videocardz.com was talking about the 1060 being mid-range for about $250:

The 8800 GT was near the top performance-wise and was price friendly. Maybe $200?

I believe there was an RX 480 8GB that could be considered mid-range

The RX 480 8GB was $239. I waited weeks to buy one. It was upper mid range and one of my favorite video cards.
 

MrTeal

Diamond Member
Dec 7, 2003
3,907
2,664
136
Who thinks the 1070 was "mid range"? TPU's review in 2016 has it listed as between $379-$449!

In 2016, videocardz.com was talking about the 1060 being mid-range for about $250:
IIRC it was almost impossible to find a card at the base MSRP and pretty much everything was closer to the $450 FE pricing, though I can't be bothered to look it up. It was a well-reviewed card, but there were complaints at launch about what was effectively a 50% price increase over the by-then $300 GTX 970.

In retrospect it turned out to be a great card, but for the time it was definitely more enthusiast than midrange. The $240 RX 480 was definitely midrange, though, and has also held up amazingly well, better than the 1060 6GB has.
 

Mopetar

Diamond Member
Jan 31, 2011
8,393
7,513
136
The RX 480 8GB was $239. I waited weeks to buy one. It was upper mid range and one of my favorite video cards.

The 480 was a 1060 competitor and both were midrange products. The 1070 is best described as upper-midrange with the 1080, 1080 Ti, and Pascal Titan being the top-end cards.

That generation Nvidia made five dies: GP102, GP104, GP106, GP107, and GP108. The 1060 was on the GP106 die, smack dab in the middle of their product stack. The 1070 was a binned GP104 die with 75% of the shaders of the full die and slightly lower boost clocks.

If the 480 seems like more than midrange it's because AMD had nothing above Polaris until Vega finally launched with largely anemic performance. Technically it was upper range for AMD, but not the market as a whole. It did have an extra 2 GB of VRAM over the 1060 (or 1 GB if comparing the 4 GB model against the 3 GB 1060) which did give it longer legs.

I'm sure there's a thread somewhere on these forums where this has been referenced numerous times given the relevance to current similar situations. :p
 

Thunder 57

Diamond Member
Aug 19, 2007
3,581
5,984
136
The 480 was a 1060 competitor and both were midrange products.

Vega was garbage. So was the 1060 3GB version, which was cut down without much mention. I was back and forth between the 1060 6GB and the RX 480 8GB but went with the 480 because it did better in the most demanding games I was playing at the time. If I had been in the market for more performance I would've gone with the 1070, but the RX 480 did just fine at 1440p at the time.
 

MrTeal

Diamond Member
Dec 7, 2003
3,907
2,664
136
It was $250, but offered like 90% of the flagship performance at like 40% of the price.

Inflation calculator says that $250 in 2007 is roughly $480 today. Could you imagine having almost 4090 performance for $500?
Leading up to the day the 8800 GT NDA lifted, you could actually purchase the 8800 GT for as little as $220 from a variety of online vendors. Once the embargo was lifted, the story changed considerably. Prices went from the expected $199 - $249 to a completely unexpected $250 - $300 range. Looking at our own price search engine we see that only Amazon is listing a card available at $249, but it's not in stock, nor are any of the other more expensive 8800 GTs listed.

The cheapest 8800 GT we can find at Newegg.com is $269 for either a XFX or PNY card, but neither are in stock, not to mention that the listed price is still $20 over what NVIDIA told us the maximum would be.
The more things change, the more they stay the same.
 


DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,354
30,411
146
My all-time favorite card is usually whichever one I bought last. :p The "damn!" started with the Red Devil Vega 56 though. First absolute unit of a card I'd owned, back when 8GB was alright for a card like that. My last three are the 6800 16GB, 7800 XT 16GB, and the 7900 XTX 24GB. Better RT and DLSS wasn't going to save my 2060 Super, 2070 Super, or 3060 Ti from aging out, but more VRAM would have.
 