
Question Is 10GB of Vram enough for 4K gaming for the next 3 years?


Is 10GB of Vram enough for 4K gaming for the next 3 years?

  • Yes

    Votes: 44 33.3%
  • No

    Votes: 88 66.7%

  • Total voters
    132

Stuka87

Diamond Member
Dec 10, 2010
5,216
985
126
Freesync and gsync are overrated.

Example: take your current monitor, flip gsync off, flip vsync on, and push your fps over 90. It is unlikely you will notice the difference.

Some monitors* have additional low-latency settings that can be toggled on when gsync/freesync is turned off. If yours is one of these, you could get improved results by flipping gsync off. The ufo** test results get interesting.

*looking at you samsung with your pwm strobing
** https://www.testufo.com/
The purpose of variable refresh rate is for when your FPS drops BELOW your refresh rate. If your refresh rate is 60Hz and you drop to 55, GSync/FreeSync are there to make it so you don't notice. Or, let's say you are gaming at 120Hz and you drop to 110: it will be there to make it so you don't notice.
 
  • Love
Reactions: spursindonesia
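The arithmetic behind this is easy to sketch (illustrative code, not any vendor's actual algorithm): with fixed-refresh vsync, a frame that finishes late has to wait for the next refresh boundary, while with VRR the panel simply refreshes when the frame is ready.

```python
# Sketch (not any vendor's algorithm): how fixed-refresh vsync vs. VRR
# present a frame that took longer than one refresh interval to render.

def vsync_display_time_ms(render_ms, refresh_hz):
    """With vsync, a finished frame waits for the next refresh boundary."""
    interval = 1000.0 / refresh_hz
    ticks = -(-render_ms // interval)  # ceiling division: boundaries until presentable
    return ticks * interval

def vrr_display_time_ms(render_ms, min_hz, max_hz):
    """With VRR, the panel refreshes when the frame is ready, clamped to its range."""
    return min(max(render_ms, 1000.0 / max_hz), 1000.0 / min_hz)

# A 55 fps frame (~18.2 ms) on a 60 Hz panel:
render = 1000.0 / 55
print(round(vsync_display_time_ms(render, 60), 1))   # held over two intervals -> 33.3
print(round(vrr_display_time_ms(render, 48, 60), 1)) # shown as soon as ready -> 18.2
```

That one-interval-to-two-intervals jump is exactly the stutter vsync shows on a drop from 60 to 55 fps, and exactly what VRR hides.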

sze5003

Lifer
Aug 18, 2012
13,139
277
126
Yup, it's supposed to make it so you don't see it. I can tell you now, I booted up Watch Dogs 2 and at 1440p highest settings you can definitely tell when it goes from 70 fps to 50. It's very subtle because of the gsync, but it's there. It just handles it smoothly despite the drop.
 

blckgrffn

Diamond Member
May 1, 2003
7,256
562
126
www.teamjuchems.com
Hmm... Freesync 2 premium lets me control the actual configuration of my monitor on a per-game/app basis and ensures HDR is good to go.

Honestly, it's a huge deal and I wouldn't want to switch away from it now. It's more than just the frame rates, but that is so noticeable as well.

I am still firmly in the camp that you need to be playing with an HDR enabled monitor if you truly care about the visuals of your games. It might be a small camp. IDK.
 

beginner99

Diamond Member
Jun 2, 2009
4,667
1,078
136
It might be a small camp. IDK.
Just a different camp. I'm in the anti-blur camp. I like this test: the testufo moving-photo test with a street map. With a strobing display, you can read the street names plain and clear with zero eye stress. That really is night and day. Even at 120Hz you have little chance to read the names; you need 120Hz + strobing.
 

Stuka87

Diamond Member
Dec 10, 2010
5,216
985
126
Hmm... Freesync 2 premium lets me control the actual configuration of my monitor on per game/app setting and ensures HDR is good to go.

Honestly, it's a huge deal and I wouldn't want to switch away from it now. It's more than just the frame rates, but that is so noticeable as well.

I am still firmly in the camp that you need to be playing with an HDR enabled monitor if you truly care about the visuals of your games. It might be a small camp. IDK.
I have a FreeSync Premium Pro display with HDR, and it is hands down the best display I have owned. It's sharp, smooth, and looks great. And if FPS drops, no big deal.
 

Leeea

Member
Apr 3, 2020
128
170
76
The purpose of variable rate refresh is for when you FPS drop BELOW your refresh rate. If your refresh rate is 60Hz, and you drop to 55, GSync/FreeSync are there to make it so you don't notice. Or, lets say you are gaming at 120Hz, and you drop to 110, it will be there to make it so you don't notice.
My refresh rate is 144 Hz. Anyone using an Nvidia 3000 series or AMD 6000 series card will be able to peg the fps well above 60 all the time, and will likely be running a 144 Hz or better monitor. 144 Hz monitors are less than $200; buying a $600+ graphics card and using it with a 60 Hz monitor seems odd.

Gsync and freesync are all but unheard of on TVs, and many TVs are still stuck at 60 Hz. Flipping vsync on allows any of the above cards to peg at 60 Hz. Radeon Anti-Lag or Nvidia's equivalent is more valuable in those scenarios, allowing the system to extract everything it can out of a weak display.

I have settled on freesync off, vsync on, and strobe on. Strobe is way nicer than FreeSync when things start moving.

Yup, it's supposed to make it so you don't see it. I can tell you now, I booted up Watch Dogs 2 and at 1440p highest settings you can definitely tell when it goes from 70 fps to 50. It's very subtle because of the gsync, but it's there. It just handles it smoothly despite the drop.
Maybe gsync is way better, but on my freesync monitor I can definitely see it. FreeSync automatically disables when it drops below 48 fps; GSync disables at 30 fps. Either way, in my experience, just push the fps past 90 and embrace strobe.


Hmm... Freesync 2 premium lets me control the actual configuration of my monitor on per game/app setting and ensures HDR is good to go.

Honestly, it's a huge deal and I wouldn't want to switch away from it now. It's more than just the frame rates, but that is so noticeable as well.

I am still firmly in the camp that you need to be playing with an HDR enabled monitor if you truly care about the visuals of your games. It might be a small camp. IDK.
HDR on Windows is a mixed bag. I run SDR on the desktop even with my HDR monitor.

In games that will output HDR directly, it is a real step up. However, most games still do not output HDR.

On my monitor HDR and Freesync are not linked. I can run HDR with strobe and it looks great.

I have found that a display calibration instrument can really step up the experience as displays age. It hangs in front of your monitor and samples the colors, allowing software like https://displaycal.net/ to develop color curves that correct for errors/drift in your display's output. Most HDR TVs also have a submenu for corrections, so no need for DisplayCAL profiles with those.

I have found even ancient monitors can look surprisingly good after running the color calibrator on them.


Just a different camp. I'm in the anti-blur camp. I like this test: the testufo moving-photo test with a street map. With a strobing display, you can read the street names plain and clear with zero eye stress. That really is night and day. Even at 120Hz you have little chance to read the names; you need 120Hz + strobing.
I can read the street names :). I am at 144 Hz with strobe on a C32HG70. Rumor has it OLEDs are even better now, but I worry about burn-in.
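The correction-curve idea above can be sketched in a few lines (a deliberately simplified illustration; DisplayCAL's real pipeline does far more than single-channel gamma): if the probe measures the panel's gamma as, say, 2.4 against a 2.2 target, a 1D lookup table remaps the levels you send so the panel's output lands on the target curve.

```python
# Minimal sketch of a correction curve (illustrative; real calibration
# software builds full 3-channel curves and an ICC profile): the panel
# shows display(v) = v**measured_gamma, so to make it land on the target
# curve x**target_gamma we send x_sent = (x**target) ** (1/measured).

def correction_lut(measured_gamma, target_gamma, size=256):
    """Per-channel LUT: output level to send so the display shows the target."""
    lut = []
    for i in range(size):
        x = i / (size - 1)
        lut.append(round((x ** target_gamma) ** (1 / measured_gamma) * (size - 1)))
    return lut

lut = correction_lut(2.4, 2.2)
print(lut[0], lut[255])  # endpoints are preserved: 0 255
print(lut[128] > 128)    # mid-tones are lifted to compensate: True
```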
 

Leeea

Member
Apr 3, 2020
128
170
76
Quick notes about strobing:

typically incompatible with Freesync or GSync

The monitor strobes the backlight: the backlight is dark while pixels transition between colors, then strobes on to show the final result. This greatly reduces blurring, making each frame much crisper, cleaner, and nicer to look at.

This makes identifying, reacting, tracking, and targeting fast moving objects much easier.

Sometimes this is called black frame insertion; however, black frame insertion can include other techniques not related to this.
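The reason strobing crispens motion comes down to simple arithmetic: for an eye tracking a moving object, perceived smear is roughly pixel speed times the time each frame stays lit (persistence). The numbers below are illustrative, not measurements.

```python
# Illustrative arithmetic (not from the post): perceived motion blur for a
# tracked object is roughly pixel speed x the time each frame stays lit.

def blur_px(speed_px_per_s, persistence_ms):
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960                             # px/s, a typical testufo scroll speed
full_persistence_144hz = 1000.0 / 144   # backlight always on: ~6.9 ms per frame
strobe = 1.0                            # hypothetical 1 ms backlight flash

print(round(blur_px(speed, full_persistence_144hz), 1))  # ~6.7 px of smear
print(round(blur_px(speed, strobe), 1))                  # ~1.0 px
```

That several-fold drop in smear is why street names that are unreadable at full persistence become legible with strobing.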
 

Stuka87

Diamond Member
Dec 10, 2010
5,216
985
126
My refresh rate is 144 Hz. Anyone using an Nvidia 3000 series or AMD 6000 series card will be able to peg the fps well above 60 all the time, and will likely be running a 144 Hz or better monitor. 144 Hz monitors are less than $200; buying a $600+ graphics card and using it with a 60 Hz monitor seems odd.

Gsync and freesync are all but unheard of on TVs, and many TVs are still stuck at 60 Hz. Flipping vsync on allows any of the above cards to peg at 60 Hz. Radeon Anti-Lag or Nvidia's equivalent is more valuable in those scenarios, allowing the system to extract everything it can out of a weak display.

I have settled on freesync off, vsync on, and strobe on. Strobe is way nicer than FreeSync when things start moving.



Maybe gsync is way better, but on my freesync monitor I can definitely see it. FreeSync automatically disables when it drops below 48 fps; GSync disables at 30 fps. Either way, in my experience, just push the fps past 90 and embrace strobe.
My comment on 60Hz was just an example. My display supports 165Hz, although I do not run it there currently.

There is no fixed bottom bound for GSync or FreeSync; however, the vast majority of displays cannot go below 48Hz. This applies to both FreeSync and GSync modes. The displays that support 30Hz with GSync are very high-end displays and require the hardware GSync module.

Besides, since monitors with modules are going the way of the dodo, GSync and FreeSync are the same thing now as nVidia gave in and started to support the open standard instead of their proprietary setup.
 
  • Like
Reactions: Tlh97 and Leeea
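A side note on that 48Hz floor: adaptive-sync drivers work around it with frame multiplication (AMD calls it Low Framerate Compensation), repeating each frame enough times that the effective refresh lands back inside the panel's supported range. The sketch below shows only the arithmetic; the actual behavior varies by vendor and driver.

```python
# Sketch of the arithmetic behind Low Framerate Compensation (LFC):
# below the panel's minimum refresh, repeat each frame until the
# effective refresh is back inside the supported range.

def lfc_refresh_hz(fps, panel_min_hz, panel_max_hz):
    if fps >= panel_min_hz:
        return fps                        # inside the range: refresh tracks fps
    multiplier = 2
    while fps * multiplier < panel_min_hz:
        multiplier += 1                   # show each frame one more time
    return min(fps * multiplier, panel_max_hz)

print(lfc_refresh_hz(55, 48, 144))  # in range, no multiplication: 55
print(lfc_refresh_hz(40, 48, 144))  # each frame shown twice: 80
print(lfc_refresh_hz(20, 48, 144))  # each frame shown three times: 60
```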

sze5003

Lifer
Aug 18, 2012
13,139
277
126
It would seem the proper thing to do would be to try to sell my current GSync-only ultrawide and get an ultrawide that has the newer module which supports both freesync and gsync. This is, of course, if I decide to rebuild and go all AMD.

Holding out for independent reviews on the AMD GPUs, and then a little longer to see the timeframe of that supposed 3080 Ti.
 

Stuka87

Diamond Member
Dec 10, 2010
5,216
985
126
It would seem the proper thing to do would be to try to sell my current GSync-only ultrawide and get an ultrawide that has the newer module which supports both freesync and gsync. This is, of course, if I decide to rebuild and go all AMD.

Holding out for independent reviews on the AMD GPUs, and then a little longer to see the timeframe of that supposed 3080 Ti.
The GSync Compatible/FreeSync displays don't have a module. It's a VESA standard that's built into the display controller. It's why they are so much cheaper than the older displays that used a module.
 
  • Like
Reactions: Tlh97 and Leeea

blckgrffn

Diamond Member
May 1, 2003
7,256
562
126
www.teamjuchems.com
The GSync Compatible/FreeSync displays don't have a module. Its a VESA standard thats built into the display controller. Its why they are so much cheaper than the older displays that used a module.
Yeah man, the monitors that Stuka and I have are in the $280-$350 range.

My monitor has been ~$330 at Best Buy lately. 32" 1440p Curved VA Freesync Premium Pro with Ultrasharp build quality and similar panel warranty. It's just a tremendous value IMHO. I know it isn't an ultrawide (I also wanted an ultrawide) but it checks a ton of boxes.
 

Golgatha

Lifer
Jul 18, 2003
11,863
124
106
I am such a weakling. I figure 10GB probably isn't enough for the next few years, but I guess I'll find out the hard way if it isn't. My local Microcenter had some 3080 stock come in yesterday while I was in the store, and I couldn't resist the new shiny. Also, the size of these cards doesn't hit you until you hold one. Holy cow, they are huge and chonky!
 

blckgrffn

Diamond Member
May 1, 2003
7,256
562
126
www.teamjuchems.com
I am such a weakling. I figure 10GB probably isn't enough for the next few years, but I guess I'll find out the hard way if it isn't. My local Microcenter had some 3080 stock come in yesterday while I was in the store, and I couldn't resist the new shiny. Also, the size of these cards doesn't hit you until you hold one. Holy cow, they are huge and chonky!
Hey, if you are paying MSRP for one at least, no judgement at all. They're new, shiny, and top tier, even if they do consume a lot of power. Enjoy! :D
 
  • Like
Reactions: Tlh97 and Leeea

sze5003

Lifer
Aug 18, 2012
13,139
277
126
Yeah man, the monitors that Stuka and I have are in the $280-$350 range.

My monitor has been ~$330 at Best Buy lately. 32" 1440p Curved VA Freesync Premium Pro with Ultrasharp build quality and similar panel warranty. It's just a tremendous value IMHO. I know it isn't an ultrawide (I also wanted an ultrawide) but it checks a ton of boxes.
That does sound like a lot less than what I paid for this 34" Alienware monitor. I do want at least 34 or 36 inches, and an ultrawide would be nice. I think LG has some nice ultrawides if I ever do get rid of my current monitor.

Of course this will all depend on what stock is available in the next 3 months for AMD or Nvidia cards or other components.
 
  • Like
Reactions: Tlh97 and blckgrffn

undertaker101

Member
Apr 9, 2006
66
43
91
I am such a weakling. I figure 10GB probably isn't enough for the next few years, but I guess I'll find out the hard way if it isn't. My local Microcenter had some 3080 stock come in yesterday while I was in the store, and I couldn't resist the new shiny. Also, the size of these cards doesn't hit you until you hold one. Holy cow, they are huge and chonky!
Yep, my Zotac 3080 only just fit in my Antec E-ATX case. I hope they don't increase card length much, or I will have to get a new case next time around.
 
  • Wow
  • Like
Reactions: Leeea and blckgrffn

Stuka87

Diamond Member
Dec 10, 2010
5,216
985
126
Bruh... Tarkov at 3440x1440 fills 12GB on my Vega with HBCC on (it stutters if off) and eats all 16GB of my RAM on the newest Reserve map. Textures maxed, but other settings on med/low/off. 10GB is a low-end amount. I expect SUPER cards soon.
That game is also infamous for being terribly optimized. But I also know people that play it on 8GB cards without stuttering.
 

dr1337

Member
May 25, 2020
62
131
66
But I also know people that play it on 8GB cards without stuttering.
At what settings, though? Turn down texture quality and suddenly VRAM isn't an issue anymore. But for people who want to run at the absolute highest quality settings, it's a very important consideration. As an anecdote, I had a 780 Ti for a while, and it could run R6 Siege just fine, but I absolutely could not run it with the high-resolution textures enabled. That was no problem for my buddy with an 8GB 290.
 
  • Like
Reactions: Tlh97
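A back-of-envelope calculation (illustrative numbers, not from any specific game) shows why one texture-quality step frees so much VRAM: an uncompressed RGBA texture costs width × height × 4 bytes, a full mip chain adds roughly a third on top, and halving resolution cuts the total to a quarter.

```python
# Back-of-envelope VRAM estimate for a single uncompressed RGBA texture
# (illustrative; real games use block compression, which divides these
# numbers by 4-8, but the ratio between quality steps is the same).

def texture_mib(width, height, bytes_per_pixel=4, mip_chain=True):
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mip_chain else base  # mip chain adds ~1/3
    return total / (1024 * 1024)

print(round(texture_mib(4096, 4096)))  # one 4K texture with mips: ~85 MiB
print(round(texture_mib(2048, 2048)))  # one quality step down: ~21 MiB
```

With hundreds of textures resident at once, that 4x ratio per step is the whole story of "turn down textures and the stutter goes away."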

kondziowy

Member
Feb 19, 2016
168
72
101
That game is also infamous for being terribly optimized. But I also know people that play it on 8GB cards without stuttering.
It should still be taken into consideration even if it's not optimized. This game is 4 years old, and there are people who want to play it, especially on the newest map. And one step down on textures in this game makes it look like a 2005 game.

I called it:
It's happening boys :) Ugly medium textures on a $700 card :) He didn't make any video on the Reserve map with High textures (only on other, smaller maps, where he puts High).
 

Leeea

Member
Apr 3, 2020
128
170
76
Well, that makes at least 3 games where 10 GB is not enough.

If these were midrange cards, midrange settings would be reasonable.
 
  • Like
Reactions: lightmanek

Mopetar

Diamond Member
Jan 31, 2011
5,049
1,552
136
I see Tarkov on this page mentioned. What are the other 2 please? Just curious.
One is a game that isn't even out yet (I think the name is Godfall) and hasn't been tested, so who knows if it's a big deal or not.

I think the third was a CoD title, but I may be mixing that up with discussions about 8 GB on the 3070 being sufficient.
 
