8/10/12 GB VRAM not enough

Page 63 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

BFG10K

Lifer
Aug 14, 2000
22,637
2,725
126
Videos of multiple games having problems




Deathloop

Resident Evil Village
The 3060TI/3070 tank at 4K and are slower than the 3060/6700XT when ray tracing:

RE.jpg


Ghost Recon Breakpoint

Resident Evil 2

Portal RTX
Without DLSS the 3060 is faster than the 3070/3080TI/2080TI. Even with DLSS the 3060 is still faster than the 3070.

performance-3840-2160.png


Company of Heroes
3060 has a higher minimum than the 3070TI, even at just 1080p.

CH.jpg


10GB / 12GB
A Plague Tale: the 3080 10GB tanks if you enable ray tracing. It would be fast enough if it had more VRAM, because if you stop moving, the framerate stabilizes to >60FPS.


Hogwarts Legacy 4K + RT: the 4060Ti 16GB is faster than the 4070 12GB.


Reasons why 8GB, nine years after its first release, isn't NV's fault:

  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.

According to some people here, 8GB is neeeevaaaaah NV's fault and the objective evidence above "doesn't count" because of the reasons(tm) listed. If you have others, please let me know and I'll add them to the list. Cheers!
 
Last edited:

psolord

Golden Member
Sep 16, 2009
1,832
1,178
136
You bought a 1440p card to play 1080p, sounds pretty much the same as what 4060Ti owners will get: a 1080p card to play 720p.
Video cards especially, being the most important part of gaming, get quickly relegated as graphics advance. This has been the norm since the dawn of GPUs. The 4060ti, despite the constant bashing, seems to be consistently a little faster than the 3060ti.

Here are gamegpu's latest tests from the end of 2023.
Screenshot 2023-11-24 at 10-55-04 Starfield v. 1.8.86 DLSS Test and PC Performance Benchmarks ...png

Screenshot 2023-11-24 at 10-55-23 Flashback 2 PC Performance Benchmarks for Graphics Cards and...png

Screenshot 2023-11-24 at 10-55-47 Call of Duty Modern Warfare 3 PC Performance Benchmarks for ...png

Screenshot 2023-11-24 at 10-56-11 The Invincible PC Performance Benchmarks for Graphics Cards ...png

Screenshot 2023-11-24 at 10-56-54 Alan Wake 2 PC Performance Benchmarks for Graphics Cards and...png

Screenshot 2023-11-24 at 10-57-25 Ark Survival Ascended PC Performance Benchmarks for Graphics...png

It is beating the 3060ti easily and even more easily the 12GB 3060.

The superb Alan Wake II exposes the true weaknesses of the cards. RTX 3060 44fps, 3060ti 55fps, 4060ti 67fps.

In Ark it's 50% faster than the 3060. In Starfield's latest patch, the same thing. Yeah... VRAM ain't gonna save you, friends.

It's really sad people in this forum cannot appreciate good tech and are only focusing on forced artificial problems. Yeah, it would be much, much better if it were cheaper, no arguments there. Two years later, small improvements over the 3060ti, same framebuffer; yeah, I get it, it does not look great, but it is still a good card. At $350 it would be much better accepted. Sad choice by Nvidia there.
 
Last edited:

Aapje

Golden Member
Mar 21, 2022
1,267
1,705
96
This has been the norm since the dawn of GPUs. The 4060ti, despite the constant bashing, seems to be consistently a little faster than the 3060ti.
[...]
It is beating the 3060ti easily and even more easily the 12GB 3060.

HUB found a 5% improvement over the 3060 Ti at 1440p, which is margin-of-error territory, certainly not "beating the 3060ti easily." That is the same kind of gap you can find between different 3060 Ti models depending on their tuning, or one you can close with a small overclock. Not that it matters; that 5% isn't going to be noticeable outside of benchmarks anyway.

Video cards especially, being the most important part of gaming, get quickly relegated as graphics advance.

Yes, which is exactly why the 4060 Ti is such a bad card, as you would realize if you took your own statements seriously. According to your own words, video cards that were acceptable in the past are no longer acceptable at the same price point today, as they are relegated as graphics advance. Since graphics advanced much more than the 5% speed boost and 0% memory size boost the 4060 Ti offers at the same price point as the 3060 Ti, the 4060 Ti is a much worse card in 2023 than the 3060 Ti was in 2020.

That's certainly true if you compare it to the alternatives: in an even shorter period, the 6700 XT, which is margin-of-error slower than the 4060 Ti (4% at 1440p according to HUB), went from $479 to $300. Even if you compare against the $399 3060 Ti rather than the mining-inflated initial price of the 6700 XT, the current price of the 6700 XT provides a significant improvement over the last 3 years, with a 25% lower price and 4GB extra memory at pretty much the same performance level.
 
Last edited:

mikeymikec

Lifer
May 19, 2011
17,467
8,988
136
You can call a firm stance on an issue whatever you like, mate. I am just posting my point of view, with graphs, my own tests and whatnot. You don't like it, fine.

The point of this thread is to inform potential buyers that 8GB is increasingly problematic for new games. There's plenty of evidence to support that, and you've posted no counterpoint to that despite basically arguing that the premise of the thread is wrong. It really doesn't matter that it might be possible to work around an obvious limitation in what are still expensive, brand-new graphics cards, that not all games need that much VRAM, etc.

There are two truths to tell here: one is the premise of this thread, that spending hundreds of pounds/dollars on a graphics card can still fail to deliver well on many modern games without workarounds; the other is how to make PC gaming work better on lower budgets. They're two different topics.

I think your energies would be better channelled into creating a thread that describes workarounds for cards with less VRAM, complete with benchmarks and screenshots/videos showing the difference that, say, a single settings change can make, rather than arguing that 8GB is fine*. I think you'll get a far more positive response by showing how to make a ~£250 graphics card bring as much bang per buck as possible, especially in the problematic games on 8GB.

* - provided that, adjust expectations here, terms and conditions apply.
 
Last edited:

coercitiv

Diamond Member
Jan 24, 2014
6,117
11,543
136
It's really sad people in this forum cannot appreciate good tech and are only focusing on forced artificial problems.
That's just it, it's really sad Nvidia had to artificially gimp their mainstream cards in two otherwise powerful generations to maximize their margins. Either the chips themselves were meant for smaller shoes, or they were gimped by design.

Keep in mind they almost gave us 4080 12GB. Forced artificial problems indeed.

Yeah it would be much much better if it was cheaper, no arguments there. Two years later, small improvements over the 3060ti, same framebuffer, yeah I get it, it does not look great, but it is a good card still. At 350 it would be much better accepted. Sad choice by Nvidia there.
Exactly, that's what many here have been trying to tell you for weeks now! The lower it goes in price, the less we care about those 8GB. At $400+ people are upset, at $350 some will still be grumpy, and there's even a price point where they'll be content or happy with the purchase. The same people, the same card.

[Later edit] The same thing happened to the RX 7600. At the initially intended MSRP of $300 it was deemed too expensive, and while I saw many voices asking for $200, I'm in the camp that considers $250 a fair asking price for the performance class (VRAM included).
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,637
2,725
126
I see people are skipping the 1080p part of that, including yourself.
Nah, it's elementary that a 1440p card also runs 1080p and below. If playing DOSBox @ 320x240 on a 4090 gives you a stiffy, go right ahead. Nobody will pop into your thread telling you to "run proper settings".

As for its usefulness at 1080p or 1440p, well, that (as always) depends on the game, settings etc.
It also depends on the VRAM. Like... the entire point of the thread. Consider this proven scenario:

Two identical GPUs, one 8GB and the other 16GB, run a game @ 1440p. 8GB splutters at ~30FPS with wild random framespikes, while 16GB maintains a consistent ~60FPS.

You have two choices to explain this:
  1. 8GB clearly cripples the GPU as it's very playable with 16GB.
  2. 16GB is using "incorrect settings". The "proper setting" is dropping both cards down to 1080p, thereby "proving" 8GB is enough. A random anonymous internet person decides what the "proper setting" is. The person who paid for 16GB (and the fact 16GB performs well) have no say in the matter.
Which do you think is a sane response, and which you think is utter lunacy?
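
For anyone wondering how a game overflows 8GB in the first place, here's a rough back-of-envelope sketch in Python. Every pool size below is an illustrative assumption for the sake of the sketch, not a measurement from any real game:

```python
# Rough, illustrative VRAM budget for a 1440p frame with ray tracing.
# All pool sizes below are assumptions, not measurements from any game.

w, h = 2560, 1440
bpp = 4  # bytes per pixel for an RGBA8 render target

framebuffer = w * h * bpp * 3        # triple-buffered swap chain
gbuffer     = w * h * bpp * 5        # ~5 G-buffer render targets
rt_structs  = 2.0 * 2**30            # BVH / acceleration structures (assumed)
textures    = 5.0 * 2**30            # streamed texture pool (assumed)
geometry    = 1.0 * 2**30            # vertex / index buffers (assumed)

total = framebuffer + gbuffer + rt_structs + textures + geometry
print(f"~{total / 2**30:.1f} GiB")   # already past an 8 GiB card's budget
```

Note that the render targets themselves are tiny; under these assumptions it's the streamed texture pool plus the RT acceleration structures that blow the budget, which is consistent with the overflow showing up as stutter while moving (streaming) and easing off when you stand still.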
 
Last edited:

psolord

Golden Member
Sep 16, 2009
1,832
1,178
136
That's just it, it's really sad Nvidia had to artificially gimp their mainstream cards in two otherwise powerful generations to maximize their margins. Either the chips themselves were meant for smaller shoes, or they were gimped by design.

Keep in mind they almost gave us 4080 12GB. Forced artificial problems indeed.
Nvidia is a tech company and, whether we like it or not, they sell the prime cuts of GPU technology. The AD104 is a full chip, and a great chip at that. It was designed with a 192-bit bus in mind. As a 4070ti owner, I can assure you that it already has more problems with its GPU processing power than with its framebuffer capacity.
Exactly, that's what many here have been trying to tell you for weeks now! The lower it goes in price, the less we care about those 8GB. At $400+ people are upset, at $350 some will still be grumpy, and there's even a price point where they'll be content or happy with the purchase. The same people, the same card.

The whole Ada Lovelace family is overpriced. We know that. It's good, but overpriced. It's not only the 8GB cards. I'd like the 4070ti to be $200 cheaper. Can I do anything about it? Don't buy it, you say? Yeah, good luck driving that 4K panel (with DLSS) on my previous 3060ti.

As it is now, I need 50 euros per week for gas. I wish the extra $200 that Ngreedia, as it is called in these parts, charged me over my ideal $600 price (an extra cost of $8.33 per month over the next two years) were my only monetary annoyance.

This is a different discussion, however, and not related to how much VRAM is appropriate per Ada teraflop level. Even with 24GB, the 4070ti would be the same card for me. Nothing would change. I can only do 4K/60/DLSS anyway, which is enough. A 3060ti 16GB would be just as useless for that panel.

The 4060ti only has 288GB/sec of bandwidth, while being a little faster than the 3060ti with its 448GB/sec. This is the kind of wizardry that impresses me, and I find the whole 8GB fixation sad. But there is a $100 more expensive 16GB model for those who have really weighed their needs and deemed it a worthy purchase. Speaking for myself, if I were considering spending 500 on a card, I'd say what the hell, add another 100 and buy a 4070 with no second thought.
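
The "wizardry" is mostly Ada's much larger L2 cache: requests served from cache never touch the 128-bit bus, so only misses consume DRAM bandwidth. A toy model of that effect (the hit rates here are assumed values for illustration, not Nvidia figures):

```python
# Toy model: effective DRAM-equivalent bandwidth when an L2 cache
# absorbs a fraction of memory traffic. Hit rates are assumed values.

dram_bw = 288  # GB/s, the 4060 Ti's 128-bit bus

for hit_rate in (0.0, 0.20, 0.35):
    # Only misses reach DRAM, so the bus behaves as if it were wider.
    effective = dram_bw / (1 - hit_rate)
    print(f"L2 hit rate {hit_rate:.0%}: ~{effective:.0f} GB/s effective")
```

At an assumed ~35% hit rate the effective figure lands near the 3060 Ti's 448GB/sec. The caveat relevant to this thread: a cache amplifies bandwidth for working sets, not capacity, so it does nothing for a texture pool that simply doesn't fit in 8GB.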

It's funny that I am trying to see the bright side here, while I was called a socialist gamer and to go find a commie country, when I complained about gpu scalped prices a while back.


Still, as an RX 6600 owner, I am very happy with its VFM. Not going to argue about that. However, as I usually say, not all 8GB cards are the same. You can see the Ark benchmarks I posted above:
4060ti 8GB = 30fps
RX 6600 8GB = 6fps
And the 4070ti is faster than three 16GB cards. That's at 1080p, alright. They are still useless in my book, but still faster if you take it as a metric.

So if the 4060ti 8GB is 5X faster than the 6600, how much would be a fair price for it, hmmm? You could argue that the 6700XT is also many times faster than the 6600 in Ark, so we could be facing a driver/optimization issue and this is an outlier result.

But it is also 72% faster in Alan Wake 2 and 63% faster in Starfield 1.8.86. In my country the 4060ti is twice the price of the 6600, and on your American Newegg it's not that far off. So I think the price is not THAT bad. Nvidia priced the card a little higher than its raw performance due to its better features (Nvidia's software combined with Ada's excellent power draw).
 

psolord

Golden Member
Sep 16, 2009
1,832
1,178
136
It also depends on the VRAM. Like... the entire point of the thread. Consider this proven scenario:

Two identical GPUs, one 8GB and the other 16GB, run a game @ 1440p. 8GB splutters at ~30FPS with wild random framespikes, while 16GB maintains a consistent ~60FPS.

You have two choices to explain this:
  1. 8GB clearly cripples the GPU as it's very playable with 16GB.
  2. 16GB is using "incorrect settings". The "proper setting" is dropping both cards down to 1080p, thereby "proving" 8GB is enough. A random anonymous internet person decides what the "proper setting" is. The person who paid for 16GB (and the fact 16GB performs well) have no say in the matter.
Which do you think is a sane response, and which you think is utter lunacy?


The 4060ti and the 4060ti 16GB are priced differently and are aimed at different resolutions. At 1440p you are only forcing the former to render in the latter's territory. Even so, could you or anyone show us exactly what percentage of games run better at 1080p and 1440p?

The funny thing is that Nvidia is selling you software features that you are choosing not to use. For example, what would the result be if the 4060ti 8GB used DLSS in the above comparison? Is that not a setting that you paid for, one meant to provide solutions, that you willfully ignore?

Here is a 4070 12GB vs 4060ti 16GB comparison at 1440p, which they are aimed at.



Cyberpunk 2077: 64 vs 50 fps
Horizon Zero Dawn: 113 vs 86 fps
God of War: 104 vs 75 fps
Resident Evil 4 Remake: 93 vs 77 fps
Hitman 3: 188 vs 144 fps
The Last of Us: 63 vs 46 fps
Forza Horizon 5: 108 vs 93 fps
A Plague Tale: Requiem: 80 vs 65 fps
MS Flight Simulator: 77 vs 60 fps
Hogwarts Legacy: 42 vs 35 fps

Also, since you seem to like HWUB's tests in this forum (I like them too), here are 4060ti 8GB vs 4060ti 16GB results:

4060ti 1080p hwub.jpg

4060ti 1440p hwub.jpg

For some strange reason, the 16GB model is 3.5% faster at 1080p and 2.6% faster at 1440p. Congratulations, you bought yourself 2.6% more performance on average, for the 1440p you are touting, for 25% more money. Yeah, of course there are those "let's force the 8GB card to do things it ain't meant to do" corner cases you post from time to time.

But lookie here, the 4070 is 31% faster than the 4060ti 16GB, for an extra $100.

4070 vs 4060ti 16gb  hwub.jpg

You are buying 31% more performance here, for 20% more money. But no, more vram is better, just because... Yeah that is true lunacy indeed.
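
The cost-vs-performance arithmetic in this post can be sanity-checked in a few lines. Prices are the US MSRPs cited in the thread, and the perf deltas are the HWUB averages quoted above; averages, of course, hide the VRAM-limited outliers the whole thread is about:

```python
# Sanity-check the cost-vs-performance deltas cited above.
# Prices are MSRPs from the thread; perf is relative to the 4060 Ti 8GB.

cards = {
    "4060 Ti 8GB":  {"price": 400, "perf": 1.000},
    "4060 Ti 16GB": {"price": 500, "perf": 1.026},        # +2.6% at 1440p
    "4070 12GB":    {"price": 600, "perf": 1.026 * 1.31}, # +31% over the 16GB
}

base = cards["4060 Ti 8GB"]
for name, card in cards.items():
    extra_cost = 100 * (card["price"] - base["price"]) / base["price"]
    extra_perf = 100 * (card["perf"] - base["perf"]) / base["perf"]
    print(f"{name}: +{extra_cost:.0f}% cost, +{extra_perf:.1f}% average perf")
```

Relative to the 16GB model, the 4070 is the +20% cost / +31% perf step described above; the open question in the thread is how the 8GB column degrades once a game's working set exceeds its VRAM.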
 

Thunder 57

Platinum Member
Aug 19, 2007
2,640
3,695
136
Nvidia is a tech company and, whether we like it or not, they sell the prime cuts of GPU technology. The AD104 is a full chip, and a great chip at that. It was designed with a 192-bit bus in mind. As a 4070ti owner, I can assure you that it already has more problems with its GPU processing power than with its framebuffer capacity.


The whole Ada Lovelace family is overpriced. We know that. It's good, but overpriced. It's not only the 8GB cards. I'd like the 4070ti to be $200 cheaper. Can I do anything about it? Don't buy it, you say? Yeah, good luck driving that 4K panel (with DLSS) on my previous 3060ti.

As it is now, I need 50 euros per week for gas. I wish the extra $200 that Ngreedia, as it is called in these parts, charged me over my ideal $600 price (an extra cost of $8.33 per month over the next two years) were my only monetary annoyance.

This is a different discussion, however, and not related to how much VRAM is appropriate per Ada teraflop level. Even with 24GB, the 4070ti would be the same card for me. Nothing would change. I can only do 4K/60/DLSS anyway, which is enough. A 3060ti 16GB would be just as useless for that panel.

The 4060ti only has 288GB/sec of bandwidth, while being a little faster than the 3060ti with its 448GB/sec. This is the kind of wizardry that impresses me, and I find the whole 8GB fixation sad. But there is a $100 more expensive 16GB model for those who have really weighed their needs and deemed it a worthy purchase. Speaking for myself, if I were considering spending 500 on a card, I'd say what the hell, add another 100 and buy a 4070 with no second thought.

It's funny that I am trying to see the bright side here, while I was called a socialist gamer and to go find a commie country, when I complained about gpu scalped prices a while back.


Still, as an RX 6600 owner, I am very happy with its VFM. Not going to argue about that. However, as I usually say, not all 8GB cards are the same. You can see the Ark benchmarks I posted above:
4060ti 8GB = 30fps
RX 6600 8GB = 6fps

And the 4070ti is faster than three 16GB cards. That's at 1080p, alright. They are still useless in my book, but still faster if you take it as a metric.

So if the 4060ti 8GB is 5X faster than the 6600, how much would be a fair price for it, hmmm? You could argue that the 6700XT is also many times faster than the 6600 in Ark, so we could be facing a driver/optimization issue and this is an outlier result.

But it is also 72% faster in Alan Wake 2 and 63% faster in Starfield 1.8.86. In my country the 4060ti is twice the price of the 6600, and on your American Newegg it's not that far off. So I think the price is not THAT bad. Nvidia priced the card a little higher than its raw performance due to its better features (Nvidia's software combined with Ada's excellent power draw).

LOL you bought probably the worst GPU available in this generation. A 4070 Ti? With 12GB?

Yeah, don't buy it. Give AMD or Intel a chance. And are you really referencing UserBenchmark? That's pathetic, if true. But it would explain your Nvidia bias.

And hell, you've been saying "right settings" forever, and now we see an RX 6600 is far too slow? I guess you were using the "wrong settings".


All I can say is: eyeroll
 

psolord

Golden Member
Sep 16, 2009
1,832
1,178
136
LOL you bought probably the worst GPU available in this generation. A 4070 Ti? With 12GB?
For me it was the best for its price. If you have anything contrary to show, let me know.

I've been having a blast for the past almost a year, thank you very much.

Yeah, don't buy it. Give AMD or Intel a chance. And are you really referencing UserBenchmark? That's pathetic, if true. But it would explain your Nvidia bias.
I am referencing the gamegpu tests which I posted above.

There was literally no GPU that would fit my dual-slot space at the power budget and with the features I wanted. I will gladly give AMD and Intel a chance for my primary rig when they match these. Why is it bias if what you want is not satisfied?

And hell, you've been saying "right settings" forever, and now we see an RX 6600 is far too slow? I guess you were using the "wrong settings".
I was talking about the ARK results posted in the above link.

Yes, with the right settings it could be playable, but that was not what the site was testing. They always crank everything up to over 9000, because that's how they roll. They are a testing site. Personally I would like them to test lower-end cards at different settings, but that's just me. I am just exposing what stupid settings can do to a test, is all. And that is for cards with the same framebuffer...


All I can say is: eyeroll
Maybe grow up please?
 

Mopetar

Diamond Member
Jan 31, 2011
7,790
5,887
136
But no, more vram is better, just because... Yeah that is true lunacy indeed.

This comment is already foolhardy for reasons that have been demonstrated in this thread, but it's going to age incredibly poorly. Anyone buying an 8 GB card today is going to have a bad time down the road.

Anyone who's paid attention to the past is already aware of this from when we had the same discussion about the 3 GB and 6 GB 1060 or the 4 GB and 8 GB Polaris cards about five years ago.

In two years sites like HUB, GN, and others will test the different cards, and there's going to be a much larger gap between the 8 GB and 16 GB cards than there is today. Acting as though the only thing that matters is the here and now is incredibly shortsighted.
 

Ranulf

Platinum Member
Jul 18, 2001
2,303
1,097
136
At least the 4GB cards had 2-4+ years of good value aka price per performance. I don't see these $250-400 8GB cards getting even 2 years. I guess if you count DLSS as a useful feature, sure. But that just makes Turing or RDNA 1 cards look even better.

You've got new or updated engines coming out that need more power, or that are still in the early optimization phases and thus need more performance to do the same work. The game Satisfactory just released Update 8; it now uses Unreal 5, and the complaints are rolling in that it performs worse.
 

psolord

Golden Member
Sep 16, 2009
1,832
1,178
136
In two years sites like HUB, GN, and others will test the different cards, and there's going to be a much larger gap between the 8 GB and 16 GB cards than there is today. Acting as though the only thing that matters is the here and now is incredibly shortsighted.

In two years my 3060ti will be 5 years old. It's already 3. Who gives a damn? It will have much greater problems than VRAM until then. But still, with correct settings (that is, NOT UGLY settings) it will be fine.

I am not worried about these cards and I will tell you why. Let's see Alan Wake II again. These are the settings used on the PS5, according to DF.

alan wake 2 settings pc high vs ps5.jpg

I have drawn matching lines with the settings I used in this run for comparison with the performance mode (60fps).


Compared to the PS5's performance mode (60fps), the 3060ti is:
-using higher res with higher quality: 1080p DLAA, vs 847p internal upscaled to 1440p with FSR2 on the PS5
-also using higher settings. Even higher than the PS5's quality mode.

Same thing happened with A Plague Tale: Requiem. In Baldur's Gate 3 a 6600 is enough for PS5 parity, etc. VRAM ain't helping.

Therefore, since these systems are the devs' baseline, these cards will be fine for the foreseeable future.

Also I came upon this.


Yeah, the 4060ti did 30fps in gamegpu's tests at 1080p, and I bet you this is not 4K rendering on the Xbox. It's surely upscaled. In newer patches they have added DLSS and framegen, so yeah, you can play much better on these cards. VRAM ain't helping here either.

And since you mentioned Gamer's Nexus, listen to what Steve had to say about the 16GB 4060ti.


"The 4060ti 16GB is just not fast enough to benefit from the memory capacity increase, so you might as well ignore that it exists" ;)
 

coercitiv

Diamond Member
Jan 24, 2014
6,117
11,543
136
Even with 24GB, the 4070ti would be the same card for me. Nothing would change. I can only do 4K/60/DLSS anyway, which is enough. A 3060ti 16GB would be just as useless for that panel.
Always deflecting and moving the subject towards another, more convenient focus. Remember when we compared the 4060Ti 16GB with the 8GB version and had examples of games where the 16GB card was able to drive 1440p with higher IQ settings at playable FPS? Your response was that the 16GB card is another class, another price point, and thus people who pay $500 can expect a different experience.

Wanna play Far Cry @ 1440p with the HD texture pack? You must be looking for the $500 experience, this way please!

These two cards are not at the same price and therefore have a different target group. Some people will get the 8GB for their needs and some others the 16GB for theirs. These needs are NOT the same.
A month ago you were arguing that the 4060Ti 16GB is a card aimed at a different group, with different NEEDS. What needs might those be? Ultra textures? Enabled RT? Incorrect settings? Maybe the needs of people who want to enjoy their games without spending hours hunting for the "correct settings" in every game they're about to play.

If you value your time, pay the Nvidia comfort tax. Except when you enjoy going on epic posting rants on forums, then buy the 8GB cards and waste amazing amounts of hours debating the correct way to play games and hold phones.

1701164384491.png
 

Aapje

Golden Member
Mar 21, 2022
1,267
1,705
96
In two years my 3060ti will be 5 years old. It's already 3. Who gives a damn? It will have much greater problems than VRAM until then. But still, with correct settings (that is, NOT UGLY settings) it will be fine.

I am not worried about these cards and I will tell you why.

I agree with coercitiv that you are deflecting and carefully framing your argument so as not to have to respond to the arguments that don't work in your favor.

That you may get 5 years of use out of that 3060 Ti is utterly irrelevant to a buyer today, who can't go back in time to buy a GPU in 2020. Fact is that the 4060 Ti has nearly the same performance and exactly the same memory as the 3060 Ti, so unless FSR 3 significantly extends the life of the card, which is very doubtful, it will last to about the same date as the 3060 Ti. Which means that the 4060 Ti is way worse value, since you get way less life out of it than the 3060 Ti buyers got.
 
Jul 27, 2020
15,025
9,254
106
Which means that the 4060 Ti is way worse value, since you get way less life out of it than the 3060 Ti buyers got.
It was DOA for me.

At the time these were launched, my thoughts on them:

1060 3GB: Better than nothing.
1060 6GB: Yes, please.
2060 6GB: Umm..maybe useful for a bit of RTX?

3060 12GB: Ehhh, the extra VRAM is nice. Looks more attractive than the 3060 Ti 8GB.

4060 Ti 8GB: Uggghhh. No thank you to this turd in 2023.
4060 Ti 16GB: May consider it at $380.
 

Mopetar

Diamond Member
Jan 31, 2011
7,790
5,887
136
In two years my 3060ti will be 5 years old. It's already 3.

What does that matter? I have an 8 GB Polaris card that's even older, around 6 or 7 years by now. That doesn't change the fact that anything with only 8 GB that wasn't having VRAM issues when it first came out would be a bad purchase now.

The 4060 Ti was only released a little over six months ago. We've already seen numerous cases where a 3060 will put up better numbers due to not hitting the wall with only 8 GB of VRAM.

"The 4060ti 16GB is just not fast enough to benefit from the memory capacity increase, so you might as well ignore that it exists" ;)

Unless it's any of the cases where the 3060 beats the 8 GB model. Those cases will only grow more numerous over time.

Frankly, it's just a badly designed product. Nvidia should have given it a wider bus and 12 GB of VRAM. It'd be a much more compelling product and considerably more future-proof.
 

Ranulf

Platinum Member
Jul 18, 2001
2,303
1,097
136
More grist for the mill: Homeworld 3's hardware requirements have been released. A 1060 6GB is the absolute minimum GPU, and a 3060 the minimum for RT, though I wonder if that's a 30-series-or-newer tech issue or just a safeguard by the devs. I can't imagine a 2070 or better card would have trouble with it otherwise.

Edit: It may be due to DLSS 3.0 and Frame Generation.

"Homeworld 3 players can enable Ray Tracing for soft shadows with contact hardening, as well as HDR (High-Dynamic Range) for unlocking even higher fidelity. DLSS3 and Frame Generation are featured in Homeworld 3 for compatible NVIDIA graphics card, as well as AMD FidelityFX™ Super Resolution 2. Finally, Homeworld 3 officially supports the Intel ARC series of graphics cards."

Via https://www.homeworlduniverse.com/homeworld-3-system-requirements/

Homeworld-3_PC-Requirements_1920x1080-40gb.jpg
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,626
6,869
136
More grist for the mill: Homeworld 3's hardware requirements have been released. A 1060 6GB is the absolute minimum GPU, and a 3060 the minimum for RT, though I wonder if that's a 30-series-or-newer tech issue or just a safeguard by the devs. I can't imagine a 2070 or better card would have trouble with it otherwise.

Edit: It may be due to DLSS 3.0 and Frame Generation.

"Homeworld 3 players can enable Ray Tracing for soft shadows with contact hardening, as well as HDR (High-Dynamic Range) for unlocking even higher fidelity. DLSS3 and Frame Generation are featured in Homeworld 3 for compatible NVIDIA graphics card, as well as AMD FidelityFX™ Super Resolution 2. Finally, Homeworld 3 officially supports the Intel ARC series of graphics cards."

Via https://www.homeworlduniverse.com/homeworld-3-system-requirements/

Homeworld-3_PC-Requirements_1920x1080-40gb.jpg

-Relic will save us the trouble by making it suck.