Question Will a 3060 Ti/3070 with 8GB Vram be enough for 1080p 75fps Ultra with 4k texture pack?

VforV

Member
Oct 19, 2019
60
9
16
Hey guys, I'm currently running an almost-potato 1060 6gb, so obviously I want to upgrade (if I can find stock and decent prices).

I'm looking for a $500 GPU at max and not a cent more (at current prices that rules out a 3070, since it's going for well over MSRP unfortunately). But let's assume I find one, or at least a 3060 Ti. Both have 8GB of VRAM and a lot more power than my 1060, so it would be a nice upgrade, but here comes the dilemma...

I want to game at 1080p Ultra (for at least 4 years before I upgrade again) and, if possible, with the 4k texture pack in every game that offers one. Now 1080p with 4k textures is easier on the GPU than 1440p/4k with 4k textures, so I think I'll be OK on horsepower, but what about VRAM?

So basically the question is: Will a 3060 Ti/3070 with 8GB Vram be enough for 1080p 75fps Ultra with 4k texture pack?

Has anyone tried this scenario, or is playing it now? I know 6GB is not enough for this combo, but maybe 8GB* is too small a step up as well? *(strictly talking about VRAM size here).

Or should I wait more for the RX 6700/XT cards which will have 12 GB? (That can mean waiting 2-3 months...)
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
I'd like to establish a baseline before we try to get an answer: when it comes to VRAM capacity, the '75 fps' figure is somewhat irrelevant. If the VRAM is enough (not just right now but also for the foreseeable future), then cool, and the fps number will not depend on the amount of VRAM present on the card.
If it's not enough, then it means that occasionally texture fetches will reach out to DRAM, and though average fps might not suffer a lot, your lows will be insufferable, making the title at hand unplayable.
It basically comes down to what goals the developers have set in the past 1-2 years regarding 4K textures. I'd bet some money that if you looked at the XBX and the PS5, you'd know the minimum amount of VRAM you were gonna need in the next 2-3 years, should you want to play at 4K resolution :)
There are outliers, for example Cyberpunk, which is tailored to NVIDIA's product lineup only. But in almost all other cases, games will mostly be tailored around console capabilities in the near future. My 2 cents :)
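To illustrate the lows-vs-averages point with a quick sketch (all frametime numbers here are invented for illustration, not measured):

```python
# Sketch: a handful of long "texture fetch over PCIe" hitches barely
# move the average fps, but wreck the 1% lows. Numbers are invented.

def fps_stats(frametimes_ms):
    """Return (average fps, 1%-low fps) from a list of frametimes in ms."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)[: max(1, n // 100)]
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

# 990 smooth ~13 ms frames plus 10 spill-to-DRAM hitches at 100 ms:
frames = [13.0] * 990 + [100.0] * 10
avg, low = fps_stats(frames)
print(f"avg: {avg:.0f} fps, 1% low: {low:.0f} fps")  # avg ~72, 1% low 10
```

The average stays above 70 fps, so an average-only benchmark would call this card fast enough; it's the 1% low that makes the game feel unplayable.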
 

VforV

Member
Oct 19, 2019
60
9
16
This topic disappeared and is now back, so maybe a mod can merge it with the identical other one... :/

On topic: I agree with what you said.

I just think that a demanding game at 1080p Ultra with 4k textures today and a demanding game at 1080p Ultra with 4k textures 3 years from now can be very different things altogether in terms of performance.

With the new consoles and the standardization of SSDs as a baseline requirement in new games, game worlds will get denser and denser as we move forward, so 4k textures for, let's say, 100 assets in a demanding game might turn into 4k textures for 500 or 1000 assets in future demanding games...

So you can see how much the VRAM demand would increase with a change like that.
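As a rough back-of-envelope sketch of that scaling (the compression format, mip overhead, and asset counts are all assumptions, not measurements):

```python
# Rough VRAM cost of 4k textures: 4096x4096 texels, block-compressed
# at ~1 byte per texel (e.g. BC7), plus ~1/3 extra for the mip chain.

def texture_vram_mib(side=4096, bytes_per_texel=1.0, mip_overhead=4/3):
    """Approximate VRAM footprint of one texture, in MiB."""
    return side * side * bytes_per_texel * mip_overhead / 2**20

per_tex = texture_vram_mib()  # ~21.3 MiB per compressed 4k texture
for assets in (100, 500, 1000):
    print(f"{assets} assets -> ~{assets * per_tex / 1024:.1f} GiB")
```

By this estimate, ~100 unique 4k textures fit comfortably in 8 GB, but 500-1000 resident at once would not; the exact numbers depend heavily on compression and streaming.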
 
  • Like
Reactions: Tlh97 and lobz

Leeea

Diamond Member
Apr 3, 2020
3,617
5,363
136
for at least 4 years before I upgrade again

Will a 3060 Ti/3070 with 8GB Vram be enough for 1080p 75fps Ultra with 4k texture pack?

No.

. . .

Four years is a long time.

Another way of looking at it:
Console* ports to PC typically treat console quality as the "medium" preset on PC.
Console games are going to expect 10 GB of fast RAM for graphics.

If 10 GB = medium settings on PC, then high and ultra settings > 10 GB.


*The Xbox Series X may be a "4k" console, but it seems at least some games are targeting 1080p 60 fps.
 
Last edited:
  • Like
Reactions: KompuKare

VforV

Member
Oct 19, 2019
60
9
16
Yeah, that's what I'm afraid will be the case too...

I don't want to buy a GPU that only lasts 2 years, so I think I need more VRAM for Ultra settings, even at 1080p... unless I'm willing to run it 3-4 years from now at, as you say, "medium" settings or even lower.

At least with the current prices and stock shortages I won't be buying one soon, so I have time to wait and think about my purchase.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
@senttoschool if you wanna vote me down, it's fine, but at least try to be decent and vote down my explanation, not the statement itself. Also try to contradict it, so your 'vote' actually means something other than a childish display of frustration :)
Yeah, that's what I'm afraid will be the case too...

I don't want to buy a GPU that only lasts 2 years, so I think I need more VRAM for Ultra settings, even at 1080p... unless I'm willing to run it 3-4 years from now at, as you say, "medium" settings or even lower.

At least with the current prices and stock shortages I won't be buying one soon, so I have time to wait and think about my purchase.
Yes, and in all fairness you should just look at how the 980 (4 GB) fares against the R9 390 (8 GB) today, despite the 390 being only a 970 competitor at the time of its launch. I know it launched more than 5 years ago, but the shift didn't happen yesterday; it happened ca. 2 years ago.
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
I think a better comparison might be the RX 480 and GTX 1060, both of which launched just a little over four years ago and come in different memory configurations: the RX 480 has 4 GB and 8 GB variants, and the 1060 comes in between at 6 GB. If you can find enough data over time, you can probably get a better picture of how far into a card's life its memory will hold out.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I think a better comparison might be the RX 480 and GTX 1060, both of which launched just a little over four years ago and come in different memory configurations: the RX 480 has 4 GB and 8 GB variants, and the 1060 comes in between at 6 GB. If you can find enough data over time, you can probably get a better picture of how far into a card's life its memory will hold out.

The 1060 comes in 3GB and 6GB. The 3GB one has MAJOR issues with games these days. 6GB barely cuts it, and only because the card is not fast enough to run settings high enough to use more than that.

The 2060, though, is really short on memory with 6GB.
 
  • Like
Reactions: Tlh97 and VforV

VforV

Member
Oct 19, 2019
60
9
16
Indeed the rx 580 8gb vs gtx 1060 6gb (which I have) is a great example. I've seen the rx 580 8gb outperforming my gpu for some time now, whereas at first the gtx 1060 was slightly better or on par.
Yeah, this is the issue...
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
I think a better comparison might be the RX 480 and GTX 1060, both of which launched just a little over four years ago and come in different memory configurations: the RX 480 has 4 GB and 8 GB variants, and the 1060 comes in between at 6 GB. If you can find enough data over time, you can probably get a better picture of how far into a card's life its memory will hold out.
The cards I mentioned had proportionally the exact same difference in VRAM, that's why I went with that example.
 
  • Like
Reactions: Tlh97 and Mopetar

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
Indeed the rx 580 8gb vs gtx 1060 6gb (which I have) is a great example. I've seen the rx 580 8gb outperforming my gpu for some time now, whereas at first the gtx 1060 was slightly better or on par.
Yeah, this is the issue...
That might still not be a VRAM thing, especially if you look at the 580 vs 980ti with its 6 GB of VRAM.
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
1060 comes in 3GB and 6GB. The 3GB one has MAJOR issues with games these days. 6GB barely cuts it only because the card is not fast enough to run settings that are high enough to use more than that.

The 2060 though with 6GB is really short on memory.

The 3 GB 1060 is a cut-down die (in some cases massively so, since there's a GP-104 variant that has over half of its hardware disabled), so you don't get quite the same comparison. The RX480 has both a 4 GB and an 8 GB variant with all 36 CUs enabled, which offers a better comparison, but I think the 4 GB variant may have had lower memory clocks, so it isn't perfect. I believe Polaris also scaled better with memory overclocks than it did from pushing the cores, so that may make a bigger difference than you might suspect.

I couldn't find many benchmarks comparing the 4 GB and 8 GB variants, as most sites only have results for the 8 GB version. I did manage to find two TPU reviews from about a year ago of two different systems, one with an 8 GB 580 and the other with a 4 GB 580. The CPUs are different along with other system components, so it isn't a perfect comparison, but it's probably about as good as one can get. Unfortunately they don't use all of the same games, but I've added those that appear in both reviews. I've also included the higher resolutions just to see if there's any drop-off.

                     580 4 GB Avg (Min) FPS   580 8 GB Avg (Min) FPS
Witcher 3 (1080p)    42.4 (34.0)              43.9 (35.0)
Witcher 3 (1440p)    33.2 (26.9)              33.9 (27.2)
Witcher 3 (4K)       20.5 (17.1)              20.9 (17.0)
GTA V (1080p)        88.6 (81.0)              94.1 (85.3)
GTA V (1440p)        67.1 (62.6)              70.6 (64.8)
GTA V (4K)           33.7 (31.6)              35.3 (32.7)

The 8 GB card does perform a little better, but without a more thorough analysis it's hard to say that's down to just the memory. However, given that both see similar scaling as the resolution increases, it doesn't appear that VRAM size is responsible for any bottlenecks the card may have. I suspect you could probably get by with 8 GB, but it might start showing its age in 4 years. Neither of the head-to-head games here is a recent title, so I wouldn't draw too many conclusions from just these data points.
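A quick sketch quantifying the gap, using the average fps figures from the table above:

```python
# Relative lead of the 8 GB 580 over the 4 GB 580, per game/resolution,
# computed from the average fps figures quoted above.

results = {  # name: (4 GB avg fps, 8 GB avg fps)
    "Witcher 3 1080p": (42.4, 43.9),
    "Witcher 3 1440p": (33.2, 33.9),
    "Witcher 3 4K":    (20.5, 20.9),
    "GTA V 1080p":     (88.6, 94.1),
    "GTA V 1440p":     (67.1, 70.6),
    "GTA V 4K":        (33.7, 35.3),
}

for name, (fps4, fps8) in results.items():
    print(f"{name}: +{100 * (fps8 / fps4 - 1):.1f}%")
```

The lead is roughly 2-6% and, if anything, shrinks as resolution rises within each game, which is the opposite of what a VRAM capacity bottleneck would produce.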

It may also be worth waiting for the Navi 22 cards to drop. It's pretty clear that the inclusion of a massive cache is what gives the 6800/6900 cards such impressive 1080p results. There's no reason to think that won't carry over to the 6700/6600 cards as well, and we know that the 6700 XT will have at least 12 GB of RAM, which should give you a little extra breathing room. It'll probably come in somewhere between $400 and $450, which is a little below your budget, so you can save a bit or look for an AIB card closer to $500 that offers better cooling or a higher overclock.
 

VforV

Member
Oct 19, 2019
60
9
16
Yes, I agree. The only issue is that it's a little hard to wait 2-3 more months for the rx 6700/xt IF I see a good offer on an rtx 3060ti or even an rtx 3070.

So if there are no good offers for those cards it won't be an issue waiting, but if there are, I hope I don't succumb to an impulse buy... :)
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
The 3060 Ti isn't out yet, though rumors put it about a week away. If past Ampere supply is anything to go by, you may be waiting 2-3 more months regardless of which card you'd ultimately prefer to get.
 
  • Like
Reactions: Tlh97 and maddie

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
The 3 GB 1060 is a cut-down die (in some cases massively so, since there's a GP-104 variant that has over half of its hardware disabled), so you don't get quite the same comparison. The RX480 has both a 4 GB and an 8 GB variant with all 36 CUs enabled, which offers a better comparison, but I think the 4 GB variant may have had lower memory clocks, so it isn't perfect. I believe Polaris also scaled better with memory overclocks than it did from pushing the cores, so that may make a bigger difference than you might suspect.

I couldn't find many benchmarks comparing the 4 GB and 8 GB variants, as most sites only have results for the 8 GB version. I did manage to find two TPU reviews from about a year ago of two different systems, one with an 8 GB 580 and the other with a 4 GB 580. The CPUs are different along with other system components, so it isn't a perfect comparison, but it's probably about as good as one can get. Unfortunately they don't use all of the same games, but I've added those that appear in both reviews. I've also included the higher resolutions just to see if there's any drop-off.

                     580 4 GB Avg (Min) FPS   580 8 GB Avg (Min) FPS
Witcher 3 (1080p)    42.4 (34.0)              43.9 (35.0)
Witcher 3 (1440p)    33.2 (26.9)              33.9 (27.2)
Witcher 3 (4K)       20.5 (17.1)              20.9 (17.0)
GTA V (1080p)        88.6 (81.0)              94.1 (85.3)
GTA V (1440p)        67.1 (62.6)              70.6 (64.8)
GTA V (4K)           33.7 (31.6)              35.3 (32.7)

The 8 GB card does perform a little better, but without a more thorough analysis it's hard to say that's down to just the memory. However, given that both see similar scaling as the resolution increases, it doesn't appear that VRAM size is responsible for any bottlenecks the card may have. I suspect you could probably get by with 8 GB, but it might start showing its age in 4 years. Neither of the head-to-head games here is a recent title, so I wouldn't draw too many conclusions from just these data points.

It may also be worth waiting for the Navi 22 cards to drop. It's pretty clear that the inclusion of a massive cache is what gives the 6800/6900 cards such impressive 1080p results. There's no reason to think that won't carry over to the 6700/6600 cards as well, and we know that the 6700 XT will have at least 12 GB of RAM, which should give you a little extra breathing room. It'll probably come in somewhere between $400 and $450, which is a little below your budget, so you can save a bit or look for an AIB card closer to $500 that offers better cooling or a higher overclock.
The 8GB version has faster memory, 8 Gbps vs 7 Gbps on the 4GB version. That's probably why it is slightly faster, not because it has more memory.
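A quick sketch of what that data-rate gap means in bandwidth terms, assuming the usual 256-bit GDDR5 bus on these cards:

```python
# GDDR5 bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8.

def bandwidth_gbs(data_rate_gbps, bus_width_bits=256):
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

bw_8gb = bandwidth_gbs(8.0)  # 256 GB/s for the 8 GB card
bw_4gb = bandwidth_gbs(7.0)  # 224 GB/s for the 4 GB card
print(f"{bw_8gb:.0f} vs {bw_4gb:.0f} GB/s: +{100 * (bw_8gb / bw_4gb - 1):.0f}%")
```

The ~14% bandwidth advantage lines up better with the few-percent fps lead seen in the benchmarks than a capacity effect would.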
 
  • Like
Reactions: Tlh97 and Mopetar

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
The 8GB version has faster memory, 8 Gbps vs 7 Gbps on the 4GB version. That's probably why it is slightly faster, not because it has more memory.

The other possibility is that there's a difference in clock speed or OC between the cards that accounts for the difference, but I didn't look into it much.

Even if the core clocks were the same and the difference is purely memory speed, that doesn't seem to make as much of a difference as one would think it should. Maybe we just need results from newer titles, or some specific testing in older titles with high-res texture packs, because it's really odd that even the 4K results are so similar.
 

Ranulf

Platinum Member
Jul 18, 2001
2,348
1,165
136
Having run systems with an rx 480 8GB and a 570 4gb, I wish the 570 had 8gb of ram. It does depend on the game, but any game pushing 2.5-3GB+ of VRAM usage (RDR2) at 1080p leaves the rx 570 with 500mb or less of spare video ram.

If you want to keep a card for 4 years at roughly 60fps+, I'd get the card with the most ram and the most power your budget can handle.
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
Having run systems with an rx 480 8GB and a 570 4gb, I wish the 570 had 8gb of ram. It does depend on the game, but any game pushing 2.5-3GB+ of VRAM usage (RDR2) at 1080p leaves the rx 570 with 500mb or less of spare video ram.

If you want to keep a card for 4 years at roughly 60fps+, I'd get the card with the most ram and the most power your budget can handle.

I suppose that's good news for an 8 GB 3060 Ti / 3070 being used for 1080p because if 4 GB is still enough in most games, 8 GB should probably last for a good while.
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
I suppose that's good news for an 8 GB 3060 Ti / 3070 being used for 1080p because if 4 GB is still enough in most games, 8 GB should probably last for a good while.

What if I tell you I'm using an RX570 4GB for 1440p?

Funny enough, you know what kind of games I'm having the most issues in? Unity3D ones... 7 Days to Die and Empyrion Galactic Survival would be two HUGE examples. Unity3D implementing dynamic resolution really saved me from using fullscreen. I had no issues playing RDR2, for example.

But yeah, at high/ultra 1080p, 4GB is a little short but still works.

The 580 to 1060 is not really a valid comparison; the 1060 has to be compared to the 480, since the 580 is an overclocked 480. Everyone always forgets this.
 
  • Like
Reactions: Ranulf

Leeea

Diamond Member
Apr 3, 2020
3,617
5,363
136
I think a better comparison might be the RX 480 and GTX 1060, both of which launched just a little over four years ago and come in different memory configurations: the RX 480 has 4 GB and 8 GB variants, and the 1060 comes in between at 6 GB. If you can find enough data over time, you can probably get a better picture of how far into a card's life its memory will hold out.

My last card was a 4g Rx 580. The 4g version runs both lower memory clocks and a lower GPU clock. I tried to clock it up, and it was having none of that.

I upgraded from it about 18 months ago. For me it was a Skyrim texture pack I really wanted to run that did me in. Otherwise, turning the textures down is sometimes doable.


The 8 GB card does perform a little better, but without a more thorough analysis it's hard to say that's down to just the memory.

I strongly disagree. I owned an RX 580 4g, and the test results are flawed. Average FPS does not show the jarring SNAP that occurs when the entire game pauses for a split second. The benchmarks do not show what happens when the action gets intense, too many enemies get on screen, and the entire game starts stuttering as you spin the camera quickly to keep track of everything.

Yes, turning the textures down fixes it. Going from ultra to high does not really hurt. But high to medium? I see it for the rest of the game.

Another example is Total War mods. In my opinion Total War mods are way cooler than the vanilla games: you get much cooler uniforms, more units on the battlefield, and all sorts of neat changes. However, as my campaigns went on I would get battles where my 4g RX 580 would chug along at <4 FPS in some parts. It is just painful. Usually there is no texture slider for mods.

What if I tell you I'm using an RX570 4GB for 1440p?

Funny enough, you know what kind of games I'm having the most issues in? Unity3D ones... 7 Days to Die and Empyrion Galactic Survival would be two HUGE examples. Unity3D implementing dynamic resolution really saved me from using fullscreen. I had no issues playing RDR2, for example.

But yeah, at high/ultra 1080p, 4GB is a little short but still works.

I gave a grandkid the 4 GB Rx 580 and he was delighted. He was using some sort of crappy iGPU before that. It helps to keep perspective :).
 
  • Like
Reactions: Tlh97 and Ranulf

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
My last card was a 4g Rx 580. The 4g version runs both lower memory clocks and a lower GPU clock. I tried to clock it up, and it was having none of that.

The GPU clocks are not related to the memory. That's just based on what the AIB wanted to sell. I had an RX480 4GB Nitro+ and it had the same GPU clock as the 8GB version. I actually think the memory was the same speed as well.
 


DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,446
20,438
146
With VRAM dominating current discourse, I am bumping this old thread to highlight a good prognostication.


I'd like to establish a baseline before we try to get an answer: when it comes to VRAM capacity, the '75 fps' figure is somewhat irrelevant. If the VRAM is enough (not just right now but also for the foreseeable future), then cool, and the fps number will not depend on the amount of VRAM present on the card.
If it's not enough, then it means that occasionally texture fetches will reach out to DRAM, and though average fps might not suffer a lot, your lows will be insufferable, making the title at hand unplayable.
It basically comes down to what goals the developers have set in the past 1-2 years regarding 4K textures. I'd bet some money that if you looked at the XBX and the PS5, you'd know the minimum amount of VRAM you were gonna need in the next 2-3 years, should you want to play at 4K resolution :)
There are outliers, for example Cyberpunk, which is tailored to NVIDIA's product lineup only. But in almost all other cases, games will mostly be tailored around console capabilities in the near future. My 2 cents :)
Nailed it, with the one exception that assuming the requirement would only apply to gaming at 4K was too optimistic.
 
  • Love
Reactions: IEC