
Question: Is 10GB of VRAM enough for 4K gaming for the next 3 years?


Is 10GB of VRAM enough for 4K gaming for the next 3 years?

  • Yes

    Votes: 39 36.1%
  • No

    Votes: 69 63.9%

  • Total voters
    108

sze5003

Lifer
Aug 18, 2012
13,025
251
126
The 3080 w/ 20GB won't cost above $850 for the FE version.
Provided you can bot it.
That's what I'm thinking. Speaking of which, I need to start working on a bot script to try and get myself a 3090 next week lol.

Who am I kidding, I don't have any experience in that sort of thing despite being a developer. But it seems like something to look into for the future; the idea is roughly the sketch below.
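In broad strokes I'm picturing something like this minimal Python sketch: poll a product page and shout when the out-of-stock marker disappears. The URL and marker text are placeholders, and real retailers obviously have their own page layouts and anti-bot measures.

Code:
# Minimal stock-check poller. The URL and marker text are hypothetical,
# just to sketch the idea; real sites need per-retailer handling.
import time
import requests

PRODUCT_URL = "https://example.com/rtx-3090"  # placeholder, not a real listing
OUT_OF_STOCK_TEXT = "Out of Stock"            # assumed marker shown when there's no stock

def in_stock() -> bool:
    resp = requests.get(PRODUCT_URL, timeout=10)
    resp.raise_for_status()
    return OUT_OF_STOCK_TEXT not in resp.text

while True:
    if in_stock():
        print("Possible stock, go check the page!")
        break
    time.sleep(60)  # poll once a minute; hammering the site will just get you blocked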
 
  • Like
Reactions: Tlh97 and pallab

Kenmitch

Diamond Member
Oct 10, 1999
8,152
1,591
126

StinkyPinky

Diamond Member
Jul 6, 2002
6,495
413
126
I wonder if a 12GB 3090 will be released. That could make it $150 cheaper, and I'd suggest it would be a popular card since I think 12GB of VRAM is enough for the next 5 years (unlike, possibly, 10GB).
 

guidryp

Senior member
Apr 3, 2006
444
292
136
I wonder if a 12GB 3090 will be released. That could make it $150 cheaper, and I'd suggest it would be a popular card since I think 12GB of VRAM is enough for the next 5 years (unlike, possibly, 10GB).
More likely a 12GB version would be the 3080 Ti, a little later once yields improve, so a cheaper die and less VRAM get the price down to $999.
 
  • Like
Reactions: Ranulf and ozzy702

TheF34RChannel

Senior member
May 18, 2017
782
301
106
That RTX 3070 is basically obsolete right out of the box IMO. 8GB should be for 3060 class cards for 1080p-1440p gaming. Is 8GB even enough for 1440p moving forward?
Heck no!

I'll have to bring to your attention the fact that people don't buy brand new "flagship" cards just to have to lower settings to get playable performance on day one! If the cards simply came with a couple more gigs of RAM, this wouldn't even be worth talking about. However, it is worth talking about, because games are already pushing past 8 gigs, often getting close to that at 1440p. That's today. There is a Marvel comics (or whatever) game that already blows past 10.5GB at 4K using HD textures. You're telling me people spending $700+ on a new "flagship" GPU won't be pissed when they get hitching because they had the audacity to use an HD texture pack in a new game?
Forget 3 years; what about 1 year from now? This isn't going to get any better, and no amount of Nvidia IO magical BS (that needs developer support, am I right?) is going to solve the 10GB problem. The 2080 Ti was their last-gen flagship. It had 11GB of RAM. If the 3080 is their new flagship, why did they think it was a good idea to reduce that to 10? When in the history of PC hardware has a new flagship release ever had less of a component as critical as RAM? The answer is *never*. The 3080 is not the flagship. It's a 2080 replacement, suitable for 1440p gaming over the next few years. It's not a legit 4K card moving forward.
8800GTS to 8800GTS G92: 640MB to 512MB. Had both, the former being my first high-end card :D

But yeah, well said with your post!!! Some rough numbers on the VRAM point are below.
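To put some very rough numbers on the "games are already pushing past 8 gigs" point, here is a back-of-the-envelope sketch. Every figure is an assumption picked just to show the scale, not a measurement from any particular game.

Code:
# Back-of-the-envelope VRAM math at 4K. All numbers are illustrative assumptions.
width, height = 3840, 2160
bytes_per_pixel = 8                # e.g. an RGBA16F HDR render target
render_targets = 6                 # color, depth, G-buffer layers, post-processing, etc.

framebuffers_mb = width * height * bytes_per_pixel * render_targets / 1e6

# An "HD texture pack": say ~300 resident 4096x4096 textures, BC7-compressed
# (~1 byte per texel), plus roughly a third extra for mip chains.
textures_mb = 4096 * 4096 * 1 * 300 * 1.33 / 1e6

print(f"Render targets:  ~{framebuffers_mb:,.0f} MB")
print(f"HD texture pool: ~{textures_mb:,.0f} MB")
print(f"Subtotal, before geometry, shadow maps and streaming buffers: "
      f"~{(framebuffers_mb + textures_mb) / 1000:.1f} GB")

Under those assumptions you're already around 7GB before geometry, shadow maps or streaming buffers get counted, which is why 10GB at 4K with HD textures makes people nervous.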
 

elhefegaming

Member
Aug 23, 2017
157
70
71
I'm gonna game at 1440p.
Based on what I read here I'm gonna get the 3080 10GB and save the difference for a CPU upgrade in a year.
 

TheF34RChannel

Senior member
May 18, 2017
782
301
106
I think the 8800 was one of my first Nvidia cards. I came across the retail box last year when I was moving. Good memories.
I went from an Nvidia FX5200 to that 8800GTS 640. It was like going from crawling straight into a rocketship.

But yeah, that 20GB model: with the price gap it'll surely be 1,000-1,100 USD, a bit steep for my liking.
 

ozzy702

Golden Member
Nov 1, 2011
1,037
427
136
I'm gonna game at 1440p.
Based on what I read here I'm gonna get the 3080 10GB and save the difference for a CPU upgrade in a year.
I'm in the same boat. 1440p 144Hz, I don't plan on moving to 4K anytime soon, and I'm not worried about 10GB @ 1440p. I figure I'll be using a 4080 or an RDNA3 GPU by the time I make the jump past 1440p anyway.
 

sze5003

Lifer
Aug 18, 2012
13,025
251
126
I went from an Nvidia FX5200 to that 8800GTS 640. It was like going from crawling straight into a rocketship.

But yeah, that 20GB model: with the price gap it'll surely be 1,000-1,100 USD, a bit steep for my liking.
I'm currently at 3440x1440 @ 120Hz, so unless they come out with a 4K ultrawide I'm not changing resolution anytime soon. Ultrawide has been super productive for me working from home anyway.

While $1,000-1,100 sucks to pay for the 20GB model, I'm not planning on getting rid of it until 3 years down the line, and if it means I can crank every setting to max and enjoy the latest games with no issues, it's most likely worth it for that time frame.
 

elhefegaming

Member
Aug 23, 2017
157
70
71
I'm not liking the reports of stability issues beyond 2GHz. It just feels like the chip is being pushed to its absolute limits and is on the ragged edge of failure. Combined with the RAM capacity issues, this is less and less attractive by the day.
Isn't the clock like 1.8GHz? I couldn't care less about OCing this.
 
  • Like
Reactions: brianmanahan

TheF34RChannel

Senior member
May 18, 2017
782
301
106
I'm not liking the reports of stability issues beyond 2GHz. It just feels like the chip is being pushed to its absolute limits and is on the ragged edge of failure. Combined with the RAM capacity issues, this is less and less attractive by the day.
I look at it this way: every chip has its limits, and it appears 2GHz is this one's. So we shouldn't go beyond that (must we, even?). To me it's not an issue but merely a property of that chip family.
 

guidryp

Senior member
Apr 3, 2006
444
292
136
I look at it this way: every chip has its limits, and it appears 2GHz is this one's. So we shouldn't go beyond that (must we, even?). To me it's not an issue but merely a property of that chip family.
Igor's Lab posted an investigation that implicated the capacitor setup. No idea if this is reality or a red herring:
 

TheF34RChannel

Senior member
May 18, 2017
782
301
106
Igor's Lab posted an investigation that implicated the capacitor setup. No idea if this is reality or a red herring:
Excellent reply, I'll check it out. Maybe Samsung's lackluster 8nm plays a role too? Either way, this may change my earlier opinion.
 

CakeMonster

Senior member
Nov 22, 2012
955
68
91
Do we know the Samsung 8nm is bad? I see that being repeated, but do we have any chips that are remotely similar to compare performance against?
 

moonbogg

Diamond Member
Jan 8, 2011
9,878
1,483
126
Apparently, AIBs cheaping out with junk capacitors to save a few bucks is limiting the cards' ability to fully boost to their higher bins. I'm throwing this right out there now: they discovered the issue early, but not early enough to fix it before launch, thus explaining the extremely limited quantity of cards on release. Less RMA, damage control, yada yada. They knew it was going to be discovered, and NO ONE spending $800 wants to get stuck with a card that won't give its best performance because it was built BELOW reference spec to save a buck. Whada pile of crap.

Actually, does that explain the limited quantity? I don't think it does, because some vendors built good cards and the FE cards are good, and those are also in short supply. Oh well, whatevs. I ain't buying one until I'm convinced they aren't all broken, lol.
 

MrTeal

Platinum Member
Dec 7, 2003
2,863
463
126
Apparently, AIBs cheaping out with junk capacitors to save a few bucks is limiting the cards' ability to fully boost to their higher bins. I'm throwing this right out there now: they discovered the issue early, but not early enough to fix it before launch, thus explaining the extremely limited quantity of cards on release. Less RMA, damage control, yada yada. They knew it was going to be discovered, and NO ONE spending $800 wants to get stuck with a card that won't give its best performance because it was built BELOW reference spec to save a buck. Whada pile of crap.

Actually, does that explain the limited quantity? I don't think it does, because some vendors built good cards and the FE cards are good, and those are also in short supply. Oh well, whatevs. I ain't buying one until I'm convinced they aren't all broken, lol.
Panasonic SP-Caps are hardly junk or cheap caps. Polymer caps often serve a different purpose than ceramics, but a single SP-Cap is often more expensive than a whole bunch of ceramics. Unfortunately I can't understand the video, but his explanation is pretty garbage; saying that any board designer would know ceramics work better than low-ESR electrolytics is like saying any car buff knows that Corvettes are better than Peterbilts.
 

guidryp

Senior member
Apr 3, 2006
444
292
136
Excellent reply, I'll check it out. Maybe Samsung's lackluster 8nm plays a role too? Either way, this may change my earlier opinion.
Looks like confirmation from EVGA on the Capacitor issue:

https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx
...
During our mass production QC testing we discovered a full 6 POSCAPs solution cannot pass the real world applications testing. It took almost a week of R&D effort to find the cause and reduce the POSCAPs to 4 and add 20 MLCC caps prior to shipping production boards, this is why the EVGA GeForce RTX 3080 FTW3 series was delayed at launch. There were no 6 POSCAP production EVGA GeForce RTX 3080 FTW3 boards shipped.

But, due to the time crunch, some of the reviewers were sent a pre-production version with 6 POSCAP’s, we are working with those reviewers directly to replace their boards with production versions.
EVGA GeForce RTX 3080 XC3 series with 5 POSCAPs + 10 MLCC solution is matched with the XC3 spec without issues.

Also note that we have updated the product pictures at EVGA.com to reflect the production components that shipped to gamers and enthusiasts since day 1 of product launch.
Once you receive the card you can compare for yourself, EVGA stands behind its products!

Thanks
EVGA
 

MrTeal

Platinum Member
Dec 7, 2003
2,863
463
126
EVGA confirmed it right there. They used POS CAPS, just like I said.
Yeah, again, not cheap caps. The SP-Caps are basically the gold standard in low-voltage power supply filtering; that's why, if you flip the "overbuilt" FE board, you'll find them used everywhere under the VRMs.
[Attached image of the rear of the FE board]

If you look up unit pricing on the SP-Caps, you'll find they're generally not far off 10x the cost of a good X7R/X8R 0603 ceramic cap, which appears to be the size used there. Typically, if you're buying a reel at Digikey, it's about $1/piece for one of the 220uF SP-Caps and $0.05-$0.10 for a ceramic cap.

They're built for different purposes. The ceramic caps in the 2x5 grid are probably in the ballpark of 1uF each, maybe up to a few microfarads, so one grid is on the order of 10-40uF total. Each of those SP-Caps is 470uF. The SP-Caps are there for the bulk capacitance of the VRMs, though, which typically switch in the ballpark of 1MHz with harmonics above that.

tl;dr, the polymer electrolytics or sintered tantalums aren't cheap-o capacitors. Without the additional higher-frequency ceramics the board might be unstable at the highest frequencies and power draws, but that would mean they underestimated the decoupling needs at high frequency. Probably a rushed launch and poor guidance by Nvidia on what the actual needs were; it's not fair to dump this at the board partners' feet and say they cheaped out. A rough impedance comparison below shows why you want both types.
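As a sanity check on the "different purposes" point, here is a quick series R-L-C sketch comparing one bulk SP-Cap to one small MLCC. The ESR/ESL figures are typical datasheet-style guesses, not measurements from any of these boards.

Code:
# Impedance magnitude of a real capacitor modeled as a series R-L-C.
# All component values are assumed "typical" figures, not taken from any 3080 PCB.
import math

def cap_impedance(f_hz, c_farads, esr_ohms, esl_henries):
    x_c = 1.0 / (2 * math.pi * f_hz * c_farads)    # capacitive reactance
    x_l = 2 * math.pi * f_hz * esl_henries         # inductive reactance from package ESL
    return math.sqrt(esr_ohms ** 2 + (x_l - x_c) ** 2)

sp_cap = dict(c_farads=470e-6, esr_ohms=0.006, esl_henries=2e-9)    # one bulk polymer cap
mlcc   = dict(c_farads=1e-6,   esr_ohms=0.005, esl_henries=0.5e-9)  # one 0603 ceramic

for f in (1e6, 10e6, 100e6):
    z_sp = cap_impedance(f, **sp_cap) * 1000
    z_ml = cap_impedance(f, **mlcc) * 1000
    print(f"{f / 1e6:5.0f} MHz:  SP-Cap {z_sp:7.1f} mOhm   single MLCC {z_ml:7.1f} mOhm")

# A 2x5 grid of MLCCs in parallel lands at roughly 1/10th of the single-MLCC figure,
# so the ceramics win at tens to hundreds of MHz while the SP-Cap wins near the
# ~1 MHz switching frequency. Different jobs; ideally the design uses both.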
 
