Question: Is 10GB of VRAM enough for 4K gaming for the next 3 years?


moonbogg (Lifer, joined Jan 8, 2011)
The question is simple: will 10GB be enough for the next 3 years? We all know what happens when VRAM gets breached: skips, stutters and chugging.
 

StinkyPinky (Diamond Member, joined Jul 6, 2002)
That 20GB version is the one to get. I wouldn't ever have to think about VRAM again with a 20GB card. I've enjoyed that carefree use with the 1080 Ti for almost 4 years, and I'd hate to be worrying about VRAM from day 1 of owning a new card. The 10GB version is a no-go and should be renamed the buyer's remorse version.

Exactly. Feels like the 3GB version of the 1060. I feel 12-16GB should be the sweet spot for high-end cards, but obviously they can't do that with the 3080's 320-bit bus (it's either 10GB or 20GB), so they took the cheaper option.
 

sze5003 (Lifer, joined Aug 18, 2012)
the 3080 w/ 20GB won't cost above $850 for the FE version.
Provided you can bot it.
That's what I'm thinking. Speaking of which, I need to start working on a bot script to try and get myself a 3090 next week lol.

Who am I kidding, I don't have any experience in that sort of thing despite being a developer. But it seems like something to look into for the future.
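
(Just to spell out what the lazy version of such a "bot" would even look like, here's a rough sketch. The product URL, the out-of-stock string and the polling interval are all made-up placeholders, and real retailers run bot protection that a simple loop like this won't get past, so treat it as a starting point at best.)

```python
import time
import requests

# Hypothetical product page; swap in the real retailer listing you care about.
PRODUCT_URL = "https://www.example.com/rtx-3090-fe"
CHECK_EVERY_SECONDS = 60

def looks_in_stock(html: str) -> bool:
    # Naive check: many retailer pages show "out of stock" text when unavailable.
    return "out of stock" not in html.lower()

def main():
    while True:
        try:
            page = requests.get(PRODUCT_URL, timeout=10)
            if page.ok and looks_in_stock(page.text):
                print("Possible stock! Go check it manually:", PRODUCT_URL)
                break
        except requests.RequestException as exc:
            print("Check failed, will retry:", exc)
        time.sleep(CHECK_EVERY_SECONDS)

if __name__ == "__main__":
    main()
```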
 

Kenmitch (Diamond Member, joined Oct 10, 1999)

StinkyPinky (Diamond Member, joined Jul 6, 2002)
I wonder if a 12GB 3090 will be released. It could make the card $150 cheaper, and I'd suggest it would be a popular one, since I think 12GB of VRAM is enough for the next 5 years (unlike, possibly, 10GB).
 

TheF34RChannel (Senior member, joined May 18, 2017)
That RTX 3070 is basically obsolete right out of the box IMO. 8GB should be for 3060-class cards for 1080p-1440p gaming. Is 8GB even enough for 1440p moving forward?

Heck no!

I'll have to bring to your attention the fact that people don't buy brand new "flagship" cards just to have to lower settings to get playable performance on day one! If the cards simply came with a couple more gigs of RAM, this wouldn't even be worth talking about. However, it is worth talking about, because games are already pushing past 8GB, and are often getting close to that at 1440p. That's today. There is a Marvel comics (or whatever) game that already blows past 10.5GB at 4K using HD textures. You telling me people spending $700+ on a new "flagship" GPU won't be pissed when they get hitching because they had the audacity to use an HD texture pack in a new game?
Forget 3 years; what about 1 year from now? This isn't going to get any better, and no amount of Nvidia IO magical BS (which needs developer support, am I right?) is going to solve the 10GB problem. The 2080 Ti was their last-gen flagship. It had 11GB of RAM. If the 3080 is their new flagship, why did they think it was a good idea to cut that down to 10? When in the history of PC hardware has a new flagship ever shipped with less of a component as critical as RAM? The answer is *never*. The 3080 is not the flagship. It's a 2080 replacement, suitable for 1440p gaming over the next few years. It's not a legit 4K card moving forward.
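
(If you want to see how close your own games get to that line, the quick-and-dirty way is to log VRAM use while you play. A rough sketch below, assuming an NVIDIA driver with nvidia-smi on the PATH; keep in mind the number it reports is what's allocated, which can be higher than what a game strictly needs.)

```python
import subprocess
import time

# Print VRAM usage every 5 seconds while a game is running.
# Assumes an NVIDIA driver with nvidia-smi available on the PATH.
while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())  # e.g. "9470 MiB, 10240 MiB"
    time.sleep(5)
```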

8800GTS to 8800GTS G92: 640MB to 512MB. Had both, the former being my first high end card :D

But yeah, well said in your post!!!
 

TheF34RChannel (Senior member, joined May 18, 2017)
The 8800 I think was one of my first Nvidia cards. I came across the retail box just last year when I was moving. Good memories.

I went from an Nvidia FX5200 to that 8800GTS 640. It was like going from crawling straight into a rocketship.

But yeah, that 20GB model: with the price gap it'll surely be $1,000-1,100, a bit steep for my liking.
 

ozzy702 (Golden Member, joined Nov 1, 2011)
I'm gonna game at 1440p.
Based on what I read here, I'm gonna get the 3080 10GB and save the difference for a CPU upgrade in a year.

I'm in the same boat. 1440p 144Hz, don't plan on moving to 4K anytime soon, and I'm not worried about 10GB @ 1440p. I figure I'll be using a 4080 or an RDNA3 GPU by the time I make the jump past 1440p anyway.
 

sze5003 (Lifer, joined Aug 18, 2012)
I went from an Nvidia FX5200 to that 8800GTS 640. It was like going from crawling straight into a rocketship.

But yeah, that 20GB model: with the price gap it'll surely be $1,000-1,100, a bit steep for my liking.
I'm currently at 3440x1440 @ 120Hz, so unless they come out with a 4K ultrawide I'm not changing resolution anytime soon. The ultrawide has been super productive for me working from home anyway.

While $1,000-1,100 sucks to pay for the 20GB model, I'm not planning on getting rid of it until 3 years down the line, and if it means I can crank all settings to max and enjoy the latest games with no issues, it's most likely worth it for that time frame.
 

TheF34RChannel (Senior member, joined May 18, 2017)
I'm not liking the reports of stability issues beyond 2GHz. It just feels like the chip is being pushed to its absolute limits and is on the ragged edge of failure. Combined with the RAM capacity issues, this is less and less attractive by the day.

I look at it this way: every chip has its limits, and it appears 2GHz is this one's. So we shouldn't go beyond that (must we, even?). To me it's not an issue but merely a property of that chip family.
 

Heartbreaker (Diamond Member, joined Apr 3, 2006)
I look at it this way: every chip has its limits, and it appears 2GHz is this one's. So we shouldn't go beyond that (must we, even?). To me it's not an issue but merely a property of that chip family.

Igor's Lab posted an investigation that implicated the capacitor setup. No idea if it's reality or a red herring.
 

TheF34RChannel (Senior member, joined May 18, 2017)
Igor's Lab posted an investigation that implicated the capacitor setup. No idea if it's reality or a red herring.

Excellent reply, I'll check it out. Maybe Samsung's lackluster 8nm plays a role too? Either way, this may change my earlier opinion.
 

moonbogg (Lifer, joined Jan 8, 2011)
Apparently, AIBs cheaping out with junk capacitors to save a few bucks is limiting the cards' ability to fully boost to their higher bins. I'm throwing this right out there now: they discovered the issue early, but not early enough to fix it before launch, which explains the extremely limited quantity of cards at release. Less RMA, damage control, yada yada. They knew it was going to be discovered, and NO ONE spending $800 wants to get stuck with a card that won't give its best performance because the card was built BELOW reference spec to save a buck. What a pile of crap.

Actually, does that explain the limited quantity? I don't think it does, because some vendors built good cards and the FE cards are good, and those are also in short supply. Oh well, whatevs. I ain't buying one until I'm convinced they aren't all broken, lol.
 

MrTeal (Diamond Member, joined Dec 7, 2003)
Apparently, AIBs cheaping out with junk capacitors to save a few bucks is limiting the cards' ability to fully boost to their higher bins. I'm throwing this right out there now: they discovered the issue early, but not early enough to fix it before launch, which explains the extremely limited quantity of cards at release. Less RMA, damage control, yada yada. They knew it was going to be discovered, and NO ONE spending $800 wants to get stuck with a card that won't give its best performance because the card was built BELOW reference spec to save a buck. What a pile of crap.

Actually, does that explain the limited quantity? I don't think it does, because some vendors built good cards and the FE cards are good, and those are also in short supply. Oh well, whatevs. I ain't buying one until I'm convinced they aren't all broken, lol.
Panasonic SP-Caps are hardly junk or cheap caps. Polymer caps often serve a different purpose than ceramic caps, but a single SP-Cap is often more expensive than a bunch of ceramics. Unfortunately I can't understand the video, but his explanation is pretty garbage; saying that any board designer would know ceramics work better than low-ESR electrolytics is like saying any car buff knows that Corvettes are better than Peterbilts.
 

Heartbreaker (Diamond Member, joined Apr 3, 2006)
Excellent reply, I'll check it out. Maybe Samsung's lackluster 8nm plays a role too? Either way, this may change my earlier opinion.

Looks like confirmation from EVGA on the capacitor issue:

https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx
...
During our mass production QC testing we discovered a full 6 POSCAPs solution cannot pass the real world applications testing. It took almost a week of R&D effort to find the cause and reduce the POSCAPs to 4 and add 20 MLCC caps prior to shipping production boards, this is why the EVGA GeForce RTX 3080 FTW3 series was delayed at launch. There were no 6 POSCAP production EVGA GeForce RTX 3080 FTW3 boards shipped.

But, due to the time crunch, some of the reviewers were sent a pre-production version with 6 POSCAP’s, we are working with those reviewers directly to replace their boards with production versions.
EVGA GeForce RTX 3080 XC3 series with 5 POSCAPs + 10 MLCC solution is matched with the XC3 spec without issues.

Also note that we have updated the product pictures at EVGA.com to reflect the production components that shipped to gamers and enthusiasts since day 1 of product launch.
Once you receive the card you can compare for yourself, EVGA stands behind its products!

Thanks
EVGA