So what do you base that on? How many GPUs are you expecting to be sold per year for AI use, and how much HBM memory per GPU on average?
Mostly articles I've been reading on both the tech and investment sides.
Data centers are sprouting up like shanty towns around the US. And, for instance, AMD's upcoming MI450/MI500 AI modules require at least 432GB of HBM4 memory per unit, and the server running them will need several times that in main system memory. I'm not as familiar with the Nvidia side, but current products use smaller amounts of HBM (HBM3/HBM3E, I believe, with HBM4 coming in next-gen products) and more DRAM.
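To put a rough number on the question above (units per year times HBM per unit), here's a back-of-envelope sketch in Python; the shipment volume is a purely hypothetical placeholder, not a forecast:

    # Rough HBM demand: accelerators per year x HBM per accelerator.
    # ASSUMPTION: units_per_year is a hypothetical placeholder, not real data.
    hbm_per_unit_gb = 432                # MI450-class spec cited above
    units_per_year = 1_000_000           # hypothetical annual shipments
    total_pb = hbm_per_unit_gb * units_per_year / 1_000_000  # GB -> PB
    print(f"~{total_pb:,.0f} PB of HBM per year at that volume")  # ~432 PB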
Consumer GPUs are going to be hit because GDDR6/GDDR7 (and later) production lines will be repurposed to produce HBM. It is much, much cheaper for a memory manufacturer to retool an existing line for a different product than to build a new one.
There’s a flip side to that coin. After the current insane price bumps of around 3x, their margins must be huge. If they could previously sell at a profit at one-third of the current price, their current margin should be at least two-thirds of the sale price.
So there’s a lot of money to be made by increasing capacity. They can of course hope that holding capacity flat will keep margins at this level, but when margins are this huge, eventually someone will jump in and expand.
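A minimal sketch of that arithmetic, assuming the pre-bump price already covered cost plus some profit (prices are arbitrary illustrative units):

    # Gross-margin floor after a ~3x price bump.
    # ASSUMPTION: prices below are illustrative units, not real figures.
    old_price = 100.0                  # pre-bump selling price
    new_price = 3 * old_price          # after the ~3x bump
    unit_cost = old_price              # worst case: cost ate the whole old price
    margin = (new_price - unit_cost) / new_price
    print(f"gross margin >= {margin:.0%}")  # -> 67%, i.e. at least 2/3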
DRAM margins are presently huge, but HBM margins are even higher. And the major memory manufacturers have all stated on the record that they have no intention of expanding production capacity. Plenty of companies can assemble DRAM modules, but far fewer can fabricate the actual DRAM chips that go into them.
I really hope that I am wrong, and I hope you get to come back in a year or two and rub my face in it.
I'm not holding my breath for it, though, unless the AI bubble bursts. And if it does, we'll all have far more serious economic problems to worry about than GPU/memory availability...