8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
This thread was started in mid-2021 and is being retired/locked, as the OP is no longer active and no longer updating or maintaining it.

Mod DAPUNISHER


8GB

Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB:
[benchmark chart]
Resident Evil Village: the 3060 Ti/3070 tanks at 4K and is slower than the 3060/6700 XT with ray tracing enabled:
[benchmark chart]
Company of Heroes: the 3060 has a higher minimum framerate than the 3070 Ti:
[benchmark chart]

10GB / 12GB

Reasons why still shipping 8GB since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others, please let me know and I'll add them to the list. Cheers!

 
Last edited by a moderator:

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
$249 wouldn't be terrible for 6650 XT performance with ~15W lower power consumption.
You really think, of all people, AMD GPU users on desktop PCs care at all about 15W??? Enough to drop coin on a not-really-good-enough-anymore card? WITHOUT all of the premiums on the NV side of things?

I'll be polite, and not call you names. But really???

NO MORE than $200, for this 8GB VRAM, 2048 SP, Polaris replacement.
 
  • Love
Reactions: GodisanAtheist

Thunder 57

Diamond Member
Aug 19, 2007
4,130
6,886
136
Not like the 4070 is selling well. Besides, I was talking about defensible prices that would excite people into buying, not prices to keep the 4060 Ti in line with the rest of the price-gouged lineup.

What makes you think they are going to change their pricing strategy now? If they were charging defensible prices that would excite people into buying, the 4070 would be $400, $500 tops.

You really think, of all people, AMD GPU users on desktop PCs care at all about 15W??? Enough to drop coin on a not-really-good-enough-anymore card? WITHOUT all of the premiums on the NV side of things?

I'll be polite, and not call you names. But really???

NO MORE than $200, for this 8GB VRAM, 2048 SP, Polaris replacement.

So inflation is only a valid excuse for NVIDIA? The 4GB RX 480 came out in 2016 at $200, which looks especially bad if the 4060 Ti launches at $400. I don't buy the $250 rumor; I already guessed $300. I'd love to be wrong though.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
9,046
7,776
136
You really think, of all people, AMD GPU users on desktop PCs care at all about 15W??? Enough to drop coin on a not-really-good-enough-anymore card? WITHOUT all of the premiums on the NV side of things?

I'll be polite, and not call you names. But really???

NO MORE than $200, for this 8GB VRAM, 2048 SP, Polaris replacement.
Was bracing for $330, I guess. Never said $250 would be good though, just not terrible, at least compared to the $330 rumors.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
9,046
7,776
136
What makes you think they are going to change their pricing strategy now? If they were charging defensible prices that would excite people into buying, the 4070 would be $400, $500 tops.
They got so much bad press about 8GB with games struggling on it this year. They kind of have to if they don't want to piss off gamers who are already avoiding the 4070. Though I'm fully expecting $400.
 

jpiniero

Lifer
Oct 1, 2010
16,939
7,355
136
They got so much bad press about 8GB with games struggling on it this year.

That's what the 4060 Ti 16 GB is about. Now the price OTOH...

(Also, the 4070 Ti is selling okayish. The issue with the 4070 might be that it's not faster than the 3080, and people want faster.)
 

Aapje

Golden Member
Mar 21, 2022
1,530
2,106
106
(Also the 4070 Ti is selling okayish. The issue with the 4070 might be that it's not faster than the 3080. And people want faster)
The 4070 would be fine if it were cheaper. Right now it offers fairly little over the 3080 for not much less money, so most people who would otherwise buy it already got a 3080.
 
  • Like
Reactions: Tlh97 and blckgrffn

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
Compensation for nVidia's top execs.


Jensen alone got ~$21 million in 2023. Yet some individuals on this forum tell us nVidia's margins are razor thin and they'd be destitute if they added more VRAM to GPUs.

Poor, poor nVidia, gg.
 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
8,506
7,760
136
You really think, of all people, AMD GPU users on desktop PCs care at all about 15W??? Enough to drop coin on a not-really-good-enough-anymore card? WITHOUT all of the premiums on the NV side of things?

If they made a low-profile fanless card for that price that only drew 15W, they'd probably get a lot of people who build their own HTPC media-box rigs very interested.

Compensation for nVidia's top execs.


Jensen alone got ~$21 million in 2023. Yet some individuals on this forum tell us nVidia's margins are razor thin and they'd be destitute if they added more VRAM to GPUs.

Poor, poor nVidia, gg.

Who here actually claims NVidia's margins are thin? I don't even think die hard fanboys would claim such a thing. The only person who might think they're too low has to have a ton of NVidia shares.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,985
1,281
126
A 16GB 4070 Ti for $899 or even $949 would sell very well. But I guess they'd need to tweak the bus to get that RAM amount.
 

Mopetar

Diamond Member
Jan 31, 2011
8,506
7,760
136
AD104 has a 192-bit bus, so they'd have to go with 24 GB, or they'd wind up selling a card with more VRAM but less bandwidth, which would probably hurt performance.

I don't even think they'd try to sell a 20 GB 4070 Ti, because there are probably enough titles more sensitive to bandwidth than to the extra VRAM that it would lose on the performance charts to the less expensive card.
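As a rough illustration of that bus math (my own sketch, not from the post above): each GDDR6 chip has a 32-bit interface, so a 192-bit bus means six chips, and capacity is six times the per-chip density, or twelve times in a clamshell layout.

```python
# Back-of-the-envelope GDDR capacity options for a given bus width.
# Assumes the standard 32-bit interface per GDDR6 chip; the densities
# are the module sizes discussed in this thread (3/4 GB hypothetical).

CHIP_BUS_BITS = 32

def capacities(bus_width_bits, densities_gb=(1, 1.5, 2, 3, 4)):
    chips = bus_width_bits // CHIP_BUS_BITS
    for d in densities_gb:
        normal = chips * d
        clamshell = 2 * chips * d  # two chips sharing each 32-bit channel
        print(f"{d:g} GB chips: {normal:g} GB normal, {clamshell:g} GB clamshell")

capacities(192)  # AD104: 6/9/12/18/24 GB normal, 12/18/24/36/48 GB clamshell
```

On this math a 24 GB card is just 2 GB chips run in clamshell, while 16 GB or 20 GB would force a narrower (128-bit or 160-bit) bus and therefore less bandwidth.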
 

A///

Diamond Member
Feb 24, 2017
4,351
3,160
136
Who here actually claims NVidia's margins are thin? I don't even think die hard fanboys would claim such a thing. The only person who might think they're too low has to have a ton of NVidia shares.
Dunno, there's not a member here I'd suspect of smoking crack. With LLM AI being the new NFT/crypto/whatever, we'll be back to a high sooner or later, and then the dump when it occurs, after people realize the services are only as useful as the material fed to them, which isn't high quality to begin with.
 

tajoh111

Senior member
Mar 28, 2005
348
389
136
Compensation for nVidia's top execs.


Jensen alone got ~$21 million in 2023. Yet some individuals on this forum tell us nVidia's margins are razor thin and they'd be destitute if they added more VRAM to GPUs.

Poor, poor nVidia, gg.


The same can be said of Lisa Su. She makes quite a bit more than Jensen.
 

tajoh111

Senior member
Mar 28, 2005
348
389
136
She deserves every penny of it. She's not hostile to consumers, unlike Jensen.
How much a CEO deserves is based on the financial performance of the company and, to some extent, the company's resources.

In addition, he grew the company while keeping his domestic workforce and keeping layoffs to a minimum.

AMD's best trick was convincing consumers that it is consumer friendly.

What AMD has been doing with CPUs is almost exactly what Nvidia has done with the RTX 4080.

That is, charging the same price for the same core count.

The cost to make a 7950X is $69, but AMD tried to sell it for $699. With these shrinking dies we should be getting more cores, but we are not. We are paying the same amount if not more, and if you want more cores, you have to pay through the nose, just like with GPUs.

AMD's largest consumer CPU chips cost $5,500 and, with AMD's chiplet strategy plus the lack of a bundled cooler and memory, likely cost less to manufacture than an RTX 4090. But AMD gets no criticism, thanks to Saintess Su. In addition, look at the release pricing of AMD's GPUs: are they really any better than Nvidia's once you take into account AMD's reduced R&D spending and reduced driver support?

Also, remove the consoles from AMD's revenue (they represent 25% of it but have margins in the low teens) and AMD's margins are in the 53-60% range, the same as Nvidia's.
 
  • Like
Reactions: psolord

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
Jensen's salary, while huge, is basically peanuts compared to his ownership stake in Nvidia, which is worth 1000x more.

I honestly really can't get that upset about him making that kind of cash. You can legitimately argue that Nvidia wouldn't be the company it is today without him at the helm for the last 30 years.
 
  • Like
Reactions: Tlh97 and Mopetar

fleshconsumed

Diamond Member
Feb 21, 2002
6,486
2,363
136
The cost to make a 7950X is $69, but AMD tried to sell it for $699. With these shrinking dies we should be getting more cores, but we are not. We are paying the same amount if not more, and if you want more cores, you have to pay through the nose, just like with GPUs.
To be fair, before Ryzen came about you had to pay $339 for a 4-core 7700K. That's $84.75 per core. Even at its full MSRP of $699, the 7950X is $43.68 per core, half of what we used to pay. So we did get more cores for our money.
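As a quick sanity check on those figures (my own arithmetic, using the launch prices quoted above):

```python
# Cost per core for the two chips quoted above, at launch MSRP.
for name, msrp_usd, cores in [("7700K", 339, 4), ("7950X", 699, 16)]:
    print(f"{name}: ${msrp_usd} / {cores} cores = ${msrp_usd / cores:.2f} per core")
# 7700K: $84.75 per core; 7950X: about $43.69 per core, roughly half.
```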
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
Who here actually claims NVidia's margins are thin?
Anyone who defends 8GB by claiming that, in order to increase it, already overpriced GPUs would have to rise significantly in price.

Same can be said of Lisa Su. She makes quite a bit more then Jensen.
Just so we're clear, I don't give a crap how much Jensen or any other CEO is paid. I'm just pointing out nVidia isn't a pauper.

The fact is, we're still getting 8GB because:
  1. Maintain miner profit margins.
  2. Planned obsolescence.
  3. Force you to buy professional GPUs which are even more overpriced.
  4. GPU monopoly. NV's doing exactly what Intel did with 14nm++++ due to no competition. 8GB is NV's 14nm++++.
It has nothing to do with foundry space, supply chain issues, inflation, component shortages, or any other claptrap apologists claim.
 
Last edited:

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
The fact is, we're still getting 8GB because:
  1. Maintain miner profit margins.
  2. Planned obsolescence.
The thing was, really, that during the most recent mining "boom" (bull run), GPUs (the new 3000 series) had 8GB, mostly, because that was just barely "good enough" to mine crypto in Windows... no other real reason.

IOW, cards of that era (3000 series) had their VRAM capacities designed purely with the "ETH DAG size" in mind.

Edit: At least, IMHO. After all, the Polaris cards had an 8GB VRAM variant, and they mined well; there was no need to go past that. Well, until the spectre of AAA gaming in 2023 reared its head, with several major console ports finding out that "8GB really wasn't enough". That should be a reason for video cards released in 2023 to have MORE VRAM; we'll see about the 4060 Ti 16GB, I guess.
 
Last edited:
  • Like
Reactions: Tlh97 and Mopetar

Mopetar

Diamond Member
Jan 31, 2011
8,506
7,760
136
Anyone who defends 8GB by claiming that in order to increase it, already overpriced GPUs would have to significantly rise in cost.

Is anyone actually claiming 8 GB is enough because they think 16 GB would cost so much more? I'm more inclined to believe it's people who don't play AAA titles or anything that uses a lot of VRAM.

There are a lot of rumors swirling that NVidia will offer cards with more VRAM but they'll just pass the cost on to consumers. If anything the margins on those cards will be higher as tends to be the case for consumer products with better specs.

The inclusion of a card with more memory doesn't preclude offering a card with less. If there are people who think 8 GB is suitable for their needs, who are you or I to tell them otherwise? If naive consumers buy something bad to save a buck, there are few better teachers than experience.

NVidia can have all manner of reasons (good or otherwise) for selling the products they do, but none of that actually goes to show that there are posters here who think their margins are too thin. If you can find one, I'll gladly lampoon them with you, but otherwise I think you're twisting words.
 

VRAMdemon

Diamond Member
Aug 16, 2012
7,926
10,432
136
AD104 has a 192-bit bus, so they'd have to go with 24 GB, or they'd wind up selling a card with more VRAM but less bandwidth, which would probably hurt performance.

I don't even think they'd try to sell a 20 GB 4070 Ti, because there are probably enough titles more sensitive to bandwidth than to the extra VRAM that it would lose on the performance charts to the less expensive card.

Is there any way Nvidia could have gone with 18 GB of VRAM on the 192-bit bus instead of 24 GB? Or is that theoretically impossible? I assume it's because 3GB GDDR6 modules don't exist?
 

jpiniero

Lifer
Oct 1, 2010
16,939
7,355
136
Is there any way Nvidia could have gone with 18 GB of VRAM on the 192-bit bus instead of 24 GB? Or is that theoretically impossible? I assume it's because 3GB GDDR6 modules don't exist?

I expect nVidia to refresh Ada with GDDR7, and I would say 3 or 4 GB chips are theoretically possible then. But that's just a guess.

So you could say that the bus width decisions may have been made with the idea that the refresh would have 50% or 100% more memory than the initial cards.
 

Mopetar

Diamond Member
Jan 31, 2011
8,506
7,760
136
Is there any way Nvidia could have gone with 18 GB of VRAM on the 192-bit bus instead of 24 GB? Or is that theoretically impossible? I assume it's because 3GB GDDR6 modules don't exist?

I don't believe 3 GB modules exist. Almost anything computer-related tends to be a power of 2 due to the binary nature of the hardware. If the memory controller supported such an odd arrangement there would be no issue, but I don't think anyone is manufacturing 3 GB memory chips.
 

Aapje

Golden Member
Mar 21, 2022
1,530
2,106
106
I don't believe 3 GB modules exist. Almost anything computer-related tends to be a power of 2 due to the binary nature of the hardware. If the memory controller supported such an odd arrangement there would be no issue, but I don't think anyone is manufacturing 3 GB memory chips.

It tends to be a power of 2, but it definitely doesn't have to be. It's usually just a bit inefficient, because processors are built to manipulate numbers of certain sizes, which are always a power of 2 (which is very helpful when you want to combine two 16-bit numbers into a single 32-bit number, for example).

So a processing unit in a CPU might be built to operate on 32 bits at a time. If the address denoting a certain part of the RAM is also 32 bits long, you use that processing unit optimally. If you actually only need a 24-bit number to address the memory, you still have to use a 32-bit operation, and it is no faster than manipulating a 32-bit number. So usually the choice is made to add enough memory that the entire 32-bit number is used, and then the next step up is an entire 64-bit number, so that the overhead of addressing the RAM is minimized.

However, it probably doesn't matter that much: you are dealing with huge amounts of memory anyway, and retrieval is far slower than the processor. The speed at which the processor manipulates the address is not going to be significant.
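To put rough numbers on the addressing point (my own illustration, not from the post): a byte-addressable memory of N bytes needs ceil(log2(N)) address bits, but the CPU still handles the address in a full-width register.

```python
import math

# Address bits needed for a byte-addressable memory of a given size,
# versus the register width that would actually hold the address.
for label, size_bytes in [("16 MB", 16 * 2**20), ("4 GB", 4 * 2**30), ("16 GB", 16 * 2**30)]:
    bits_needed = math.ceil(math.log2(size_bytes))
    register_bits = 32 if bits_needed <= 32 else 64
    print(f"{label}: {bits_needed} address bits, handled in a {register_bits}-bit register")
# 16 MB needs only 24 bits but still occupies a full 32-bit register;
# 4 GB uses all 32 bits; anything bigger jumps to 64-bit addressing.
```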

GDDR6 actually has a standard for 1, 1.5 and 2 GB modules, with reservations being made for 3 and 4 GB modules. The spec for GDDR7 isn't done, so we don't know what it will support. Ultimately, these specs are mostly written by the companies themselves, based on what they think they might want to and are able to produce. They haven't said that they will make 3 and 4 GB GDDR7 modules.

---
As a bit of an aside:

GDDR6X is not binary when sending data. A binary signal can send one of two values (0 or 1) every clock cycle; PAM4 in GDDR6X sends one of four values. However, since the rest of the system is binary, this effectively means 2 bits (each 0 or 1) are sent every cycle instead of the typical one.

GDDR7 uses PAM3, which sends one of three values every clock cycle (-1, 0, +1). Since a bit has two values, this works out to 1.5 bits per cycle. However, since the rest of the system again uses binary, the results of two clock cycles have to be combined into 3 bits.
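To make that bookkeeping concrete, here's a toy mapping (my own sketch, not the actual GDDR7 line code) that packs 3 bits into two PAM3 symbols: two ternary symbols give 3^2 = 9 states, enough to cover the 2^3 = 8 bit patterns.

```python
from itertools import product

# Toy PAM3 packing: two ternary symbols (9 states) carry 3 bits (8 patterns).
# The mapping is made up for illustration; it is not the real GDDR7 encoding.
PAM3_LEVELS = (-1, 0, 1)
symbol_pairs = list(product(PAM3_LEVELS, repeat=2))   # 9 possible symbol pairs
bit_patterns = list(product((0, 1), repeat=3))        # 8 possible 3-bit values

encode = dict(zip(bit_patterns, symbol_pairs))        # one pair stays unused
decode = {pair: bits for bits, pair in encode.items()}

for bits, pair in encode.items():
    assert decode[pair] == bits                       # round-trips cleanly
    print(f"bits {bits} -> PAM3 symbols {pair}")
# 8 of the 9 pairs are used, i.e. 1.5 bits per symbol, versus PAM4's
# exact 2 bits per symbol in GDDR6X.
```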
 