
Question: Is 10GB of VRAM enough for 4K gaming for the next 3 years?


Is 10GB of VRAM enough for 4K gaming for the next 3 years?

  • Yes: 34 votes (34.3%)
  • No: 65 votes (65.7%)
  • Total voters: 99

sze5003

Lifer
Aug 18, 2012
12,989
246
106
I'll be getting a PS5 too, but I don't think I'll get an Xbox this time around. I barely used my One X except to play Red Dead 2 and a couple of other titles.

I'm more interested in the prices of the 3080 and 3090 cards when they go on sale in the upcoming weeks. From what I remember, during each cycle prices were always slightly higher and usually nowhere near the MSRP stated at the reveal. This time could be different, at least for the 3090, since there is no FE version.

A good friend of mine and I are going back and forth between cards, both trying to justify the 3080 because it's cheaper. He doesn't mind upgrading to the next model in a year, whereas I haven't done that in a while, so I would prefer to just keep what I get now.

I'm not cheap in general and I have the money to spend, but for me it's a personal matter of logic and necessity in the long run: why spend that much when you don't have to? I suppose at 3440x1440 I shouldn't have to worry, because I'm not planning on going to 4K or 8K unless they make an ultrawide 4K monitor I can upgrade to.
 

repoman0

Platinum Member
Jun 17, 2010
2,937
1,128
136
If you are willing to turn settings down to get FPS up, the 3080 is an outstanding step up from your 1070, I'd say. It's huuuuuuge :)
Yeah, looks like a solid 3-4x faster if I'm not mistaken. $700 is about the most I'll spend anyway, so there's no point waiting for a more expensive 20GB card. And I like my LG G-Sync monitor, so I guess I'm hitting refresh on release day.
 

BFG10K

Lifer
Aug 14, 2000
21,694
524
126
10GB should be ok but 8GB definitely won't be. 8GB is already being squeezed now in some situations where the card has enough performance to handle the settings.
 
  • Like
Reactions: Tlh97 and moonbogg
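For a rough sense of where VRAM goes at 4K, here is a back-of-envelope sketch in Python; every buffer count, format, and size in it is an illustrative assumption rather than a measurement from any particular game:

```python
# Back-of-envelope VRAM budget for 4K rendering. Every count and size
# here is an illustrative assumption, not data from any specific game.

WIDTH, HEIGHT = 3840, 2160
PIXELS = WIDTH * HEIGHT  # ~8.3 million pixels at 4K

def mib(nbytes: int) -> float:
    return nbytes / 2**20

# Assume ~8 full-resolution buffers (G-buffer layers, depth, the
# post-process chain) at an average of 8 bytes per pixel
# (a mix of RGBA16F and RGBA8 formats).
render_targets = 8 * PIXELS * 8

# Triple-buffered swap chain at 4 bytes per pixel.
swap_chain = 3 * PIXELS * 4

print(f"render targets: {mib(render_targets):.0f} MiB")  # ~506 MiB
print(f"swap chain:     {mib(swap_chain):.0f} MiB")      # ~95 MiB

# Frame data alone is only ~0.6 GiB; the bulk of an 8 GiB card is eaten
# by textures, geometry, shadow maps, and streaming pools, which is why
# "HD texture pack" settings are what push cards over the edge.
```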

MrTeal

Platinum Member
Dec 7, 2003
2,837
404
126
10GB should be ok but 8GB definitely won't be. 8GB is already being squeezed now in some situations where the card has enough performance to handle the settings.
I don't know, the fastest thing available now with 8GB is the 2080 Super. If that card is running into situations where 8GB isn't enough, it's not hard to imagine 10GB being a limit in a year or two for a card that's almost twice as fast.
 

samboy

Member
Aug 17, 2002
183
20
81
I'm coming from a GTX 970 and plan to get the 3080, so it's a big jump from 3.5GB to 10GB.
Like everyone else, I would have liked 16GB better.

However, to be honest, if it were a choice of another $100 to go from 10GB to 16GB, I'm not sure I would pull the trigger. At the end of the day, Nvidia has a pretty balanced offering with the 10GB card, and developers have an incentive not to exceed the capabilities of the flagship; games also need to run well on the 8GB 3070.

If you have a PCIe 4.0 system, then VRAM paging will also be 2x as fast; well-written software should have ample resources to work with. Needing more than 10GB probably only comes into play for less optimized software (OK, console ports may fall into this category!)
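To put that 2x in perspective, here is a minimal sketch of the paging arithmetic; the ~16 GB/s and ~32 GB/s x16 link rates are the commonly quoted theoretical figures, treated as assumptions rather than measured throughput:

```python
# Rough paging-time math for PCIe 3.0 x16 vs 4.0 x16. Link rates are
# the commonly quoted theoretical numbers, not measured throughput.

PCIE3_X16_GBS = 16.0  # assumed usable bandwidth, GB/s
PCIE4_X16_GBS = 32.0  # assumed usable bandwidth, GB/s

def paging_time_ms(megabytes: float, link_gbs: float) -> float:
    """Milliseconds to stream `megabytes` of assets across the bus."""
    return megabytes / 1024.0 / link_gbs * 1000.0

# Example: a hypothetical 512 MB texture spill that no longer fits in VRAM.
for name, bw in [("PCIe 3.0 x16", PCIE3_X16_GBS), ("PCIe 4.0 x16", PCIE4_X16_GBS)]:
    print(f"{name}: {paging_time_ms(512, bw):.1f} ms for 512 MB")

# A frame at 60 fps is ~16.7 ms, so even the faster bus cannot hide a
# large spill within a single frame; it just stutters half as long.
```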
 

DooKey

Golden Member
Nov 9, 2005
1,618
267
126
I'm in no hurry. I can easily wait all next year for a proper Ti replacement if necessary. 10GB is just baffling to be honest. They aren't fooling anyone when they call it their flagship GPU. It's the 2080 replacement, not the Ti replacement. Ti buyers should just wait IMO if you want the real deal or go for the 3090.
IMO they said flagship because they expect to sell a ton of them. IMO they will.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,070
1,351
126
To quote Nvidia themselves (and not GPU-Z or EVGA precision with their dodgy VRAM consumption figures):
I would expect it to be counterproductive to sales to list any of the games they've seen exceed 10GB of memory. If he were being completely honest, he would have at least thrown out one example of a game that exceeded the 10GB barrier.

Wait for official reviews and then pass judgement.

10's just a weird number when it comes to GPU memory.
 

moonbogg

Diamond Member
Jan 8, 2011
9,859
1,454
126
That RTX 3070 is basically obsolete right out of the box IMO. 8GB should be for 3060 class cards for 1080p-1440p gaming. Is 8GB even enough for 1440p moving forward?
 
  • Like
Reactions: psolord and Tlh97

Kenmitch

Diamond Member
Oct 10, 1999
8,070
1,351
126
If you have a PCIe 4.0 system, then VRAM paging will also be 2x as fast; well-written software should have ample resources to work with. Needing more than 10GB probably only comes into play for less optimized software (OK, console ports may fall into this category!)
That's a big if. Most gamers have been brainwashed into Intel's offerings, as they produced the highest frame rates with the top-end cards. That'll probably change with Zen 3? I guess time will tell in the end.

I'm interested in seeing the effects of PCIe 4.0 on the newest GPUs. If you watch the video, you can see some scaling depending on the game(s).

 
  • Like
Reactions: Tlh97

mohit9206

Golden Member
Jul 2, 2013
1,285
402
136
That RTX 3070 is basically obsolete right out of the box IMO. 8GB should be for 3060 class cards for 1080p-1440p gaming. Is 8GB even enough for 1440p moving forward?
Yes, it is enough for 1440p and even for 4K. Let me remind you that PC games have various graphics presets: Low, Medium, High, Very High, and Ultra.
Ultra is useless; it's just for showing off. 8GB is enough for 4K at Medium or High settings, unless you can prove otherwise.
 

moonbogg

Diamond Member
Jan 8, 2011
9,859
1,454
126
Yes, it is enough for 1440p and even for 4K. Let me remind you that PC games have various graphics presets: Low, Medium, High, Very High, and Ultra.
Ultra is useless; it's just for showing off. 8GB is enough for 4K at Medium or High settings, unless you can prove otherwise.
I'll have to bring to your attention the fact that people don't buy brand new "flagship" cards just to have to lower settings to get playable performance on day one! If the cards simply came with a couple more gigs of RAM, this wouldn't even be worth talking about. However, it is worth talking about, because games are already pushing past 8 gigs, often getting close to it at 1440p. That's today. There is a Marvel comics (or whatever) game that already blows past 10.5GB at 4K using HD textures. Are you telling me people spending $700+ on a new "flagship" GPU won't be pissed when they get hitching because they had the audacity to use an HD texture pack in a new game?
Forget 3 years; what about 1 year from now? This isn't going to get any better, and no amount of Nvidia IO magical BS (which needs developer support, am I right?) is going to solve the 10GB problem. The 2080 Ti was their last-gen flagship. It had 11GB of RAM. If the 3080 is their new flagship, why did they think it was a good idea to reduce that to 10? When in the history of PC hardware has a new flagship release ever had less of a component as critical as RAM? The answer is *never*. The 3080 is not the flagship. It's a 2080 replacement, suitable for 1440p gaming over the next few years. It's not a legit 4K card moving forward.
 

CP5670

Diamond Member
Jun 24, 2004
4,459
44
91
I'll have to bring to your attention the fact that people don't buy brand new "flagship" cards just to have to lower settings to get playable performance on day one! If the cards simply came with a couple more gigs of RAM, this wouldn't even be worth talking about.
Yeah, if you don't need to play at 4K with max settings, there is no reason to get an $800+ video card to begin with. I don't even mind dropping settings in games (many ultra-quality effects cause big performance hits for barely noticeable changes), but high-res textures are the one thing you can actually notice. If the 20GB version of the 3080 is $200 or so more, I would rather buy that.
 
  • Like
Reactions: Tlh97 and moonbogg

mohit9206

Golden Member
Jul 2, 2013
1,285
402
136
When in the history of PC hardware has a new flagship release ever had less of a component as critical as RAM? The answer is *never*. The 3080 is not the flagship. It's a 2080 replacement, suitable for 1440p gaming over the next few years. It's not a legit 4K card moving forward.
R9 Fury X is one that immediately comes to mind.
 

blckgrffn

Diamond Member
May 1, 2003
7,087
367
126
www.teamjuchems.com
R9 Fury X is one that immediately comes to mind.
I feel that was met with even more rage fury consternation than this has been.

That card was such a huge compromise all the way around... the only Fury I ever really considered was the Nano, which was briefly widely available on eBay about a year ago for $100 shipped. At the time, they offered a lot of value.

This will only be a Fury X moment if a 16GB GDDR6 Navi clearly outperforms the 3080, and at a lower TDP. Chances seem... not great. I am optimistic for a shoot-out though :)
 

brianmanahan

Lifer
Sep 2, 2006
19,841
2,035
126
I've decided I'm going to go for the 3090. I'd love to spend only as much as I did for the 1080 Ti this year, but I also don't want to go backwards on VRAM. I use my VR headset quite a bit and I've seen my card's memory get used up.
i'm *this* close to deciding the same, along with a nice 3440x1440 monitor

i could just use the money saved from the vacations i haven't been able to go on for the past 2 years (and probably next year as well)
 

ondma

Golden Member
Mar 18, 2018
1,488
412
106
R9 Fury X is one that immediately comes to mind.
That is somewhat apples to oranges though, as the Fury X had HBM. It's kind of odd that everybody was saying AMD was going to dominate the market with HBM, and we still aren't seeing it.
 

moonbogg

Diamond Member
Jan 8, 2011
9,859
1,454
126
Yeah, the first HBM implementation was criticized pretty severely, if I recall, because it had something like half the RAM of other cards. So I stand corrected; this has happened before. It was absolutely terrible both times.
 

DamZe

Member
May 18, 2016
181
71
101
I don't know, but 10GB of VRAM for a next-gen enthusiast-level card in 2020/21 seems low to me.
I don't see why Nvidia couldn't give the 3080 12GB and a 384-bit bus. They may just release a 3080 Ti in the future with slightly cut-down shaders compared to the 3090, and with 12GB.
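The capacity follows directly from the bus width, since each GDDR6X chip occupies its own 32-bit channel. A quick sketch of that arithmetic, assuming 1GB (8Gb) chips at a 19 Gbps per-pin rate, roughly the 3080's published memory spec:

```python
# GDDR6X capacity and bandwidth as a function of bus width: one chip per
# 32-bit channel. Chip density and data rate are assumptions based on
# the parts Nvidia is shipping (1 GB chips at 19 Gbps).

CHIP_GB = 1          # 8Gb GDDR6X chip
PIN_RATE_GBPS = 19   # per-pin data rate

def memory_config(bus_bits: int):
    chips = bus_bits // 32                        # one chip per 32-bit channel
    capacity_gb = chips * CHIP_GB                 # total VRAM
    bandwidth_gbs = bus_bits * PIN_RATE_GBPS / 8  # GB/s
    return chips, capacity_gb, bandwidth_gbs

for bus in (256, 320, 384):
    chips, cap, bw = memory_config(bus)
    print(f"{bus}-bit: {chips} chips, {cap} GB, {bw:.0f} GB/s")

# 320-bit is exactly why the 3080 lands on 10 GB; a 384-bit bus with the
# same chips gives 12 GB, which is what a hypothetical 3080 Ti could use.
```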
 

Jaskalas

Lifer
Jun 23, 2004
29,150
2,753
126
I don't know, the fastest thing available now with 8GB is the 2080 Super. If that card is running into situations where 8GB isn't enough, it's not hard to imagine 10GB being a limit in a year or two for a card that's almost twice as fast.
Because consoles.

What amount of VRAM do they have access to? You'll never "need" more than that.
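For reference, a short summary of the publicly stated console memory configurations; note that both machines share one pool between CPU and GPU, so a game never gets the full 16GB as VRAM:

```python
# Publicly stated memory layouts of the 2020 consoles. Both machines
# share one pool between CPU and GPU, so effective "VRAM" is less than
# the headline 16 GB.

consoles = {
    "Xbox Series X": [(10, 560), (6, 336)],  # (GB, GB/s): fast + slow pools
    "PlayStation 5": [(16, 448)],            # single unified pool
}

for name, pools in consoles.items():
    total = sum(gb for gb, _ in pools)
    detail = " + ".join(f"{gb} GB @ {bw} GB/s" for gb, bw in pools)
    print(f"{name}: {total} GB total ({detail})")

# A console port tuned around the Series X's 10 GB "GPU-optimal" pool
# maps fairly directly onto a 10 GB 3080, which may be Nvidia's bet here.
```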
 

Kenmitch

Diamond Member
Oct 10, 1999
8,070
1,351
126
10GB is enough for the 360Hz 1080p guys. Not sure if the 3080 can hit that goal or not. Beyond that, what the future holds is an unknown.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,190
126
106
I have yet to see anyone actually show 10GB+ of VRAM being used and not just reserved; unlike us lot, Nvidia has the tools to see actual VRAM usage rather than the reservation that user-side tools report (see the sketch below).

360Hz is just nuts, makes my 3x 1080p 120Hz displays seem slow as hell.
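A minimal sketch of that used-versus-reserved distinction, querying the driver through NVML via the pynvml bindings; the caveat is that even this number is driver-side allocation, the same class of figure GPU-Z reports, not a per-frame working set:

```python
# Query VRAM through NVML (pip install nvidia-ml-py). What this returns
# is driver-side allocation (the "reserved" figure user tools report),
# not how much of that memory a game is actively touching each frame.
# Only Nvidia's internal tooling can see the true working set.

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)   # bytes: total/free/used

print(f"total:     {info.total / 2**30:.1f} GiB")
print(f"allocated: {info.used / 2**30:.1f} GiB")  # reservation, not need
print(f"free:      {info.free / 2**30:.1f} GiB")

pynvml.nvmlShutdown()
```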
 

Kenmitch

Diamond Member
Oct 10, 1999
8,070
1,351
126
Are you actually using more RAM at 1080 360Hz than 1080 144Hz at the same settings? 1080 isn’t really the issue, I run 4K and that’s where the concern is.
The comment was geared more or less towards the 3080's performance. Depending on reviews they might be worthy of one another.

I'm a 144Hz FreeSync RX 5700 part-time peasant gamer who balks at cards over $400. I only paid $300 at launch for my 5700 thanks to Microcenter's double $50 bundle deal.
 
  • Like
Reactions: Tlh97 and blckgrffn
