Question: How much VRAM is too much VRAM?



coercitiv

Diamond Member
Jan 24, 2014
Assuming the AI growth continues.
OpenAI spent ~$11B against ~$4B of revenue in H1 2025. Meanwhile Zuckerberg argues Meta can happily spend hundreds of billions because its revenue stream is secured. The only catch is that ChatGPT is currently being transitioned and positioned as a replacement for Facebook and other forms of social media, and it will happily replace search engines too.

They all sold us this dream of AGI and AI coming for our jobs, yet currently their most obvious use of AI is to create engagement-farming companions that sell us stuff. IMHO they are just building parasites now (commercially, not research). If something truly productive comes out of the race, great; otherwise the plan is to exploit human weakness as a revenue stream.

My guess is the growth will continue.
 

maddie

Diamond Member
Jul 18, 2010
coercitiv said:
OpenAI spent ~$11B against ~$4B of revenue in H1 2025. [...] My guess is the growth will continue.
Capital destruction?
 

marees

Golden Member
Apr 28, 2024
Hmmm

Server memory prices to double year-over-year in 2026, LPDDR5X prices could follow — 'seismic shift' means even smartphone-class memory isn't safe from AI-induced crunch

By Anton Shilov
Shortage and high DRAM prices can undermine this.

As Nvidia uses LPDDR5X in its Grace Blackwell and Vera Rubin platforms and as demand for such servers is skyrocketing, prices of smartphone memory will also rise in the coming quarters. Indeed, each Grace CPU in today's platform is equipped with 480 GB of LPDDR5X memory (a premium smartphone uses 16 GB of LPDDR5X), but this is going to at least double with Vera CPUs, possibly straining LPDDR5X supply.
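For scale, a rough back-of-envelope sketch using the article's figures (480 GB of LPDDR5X per Grace CPU, 16 GB per premium phone), and assuming the Vera jump is exactly 2x since the article only says "at least double":

```python
# Back-of-envelope: how many premium-smartphone LPDDR5X allotments
# a single server-class CPU consumes, using the figures quoted above.
GRACE_LPDDR5X_GB = 480   # per Grace CPU (article figure)
PHONE_LPDDR5X_GB = 16    # per premium smartphone (article figure)
VERA_MULTIPLIER = 2      # assumption: "at least double" taken as exactly 2x

grace_phone_equiv = GRACE_LPDDR5X_GB / PHONE_LPDDR5X_GB
vera_phone_equiv = grace_phone_equiv * VERA_MULTIPLIER

print(f"One Grace CPU uses as much LPDDR5X as {grace_phone_equiv:.0f} phones")
print(f"One Vera CPU would use roughly {vera_phone_equiv:.0f} phones' worth")
```

In other words, one Grace CPU already eats the LPDDR5X allotment of about 30 premium phones, and a Vera CPU would take roughly 60, which is the supply-side squeeze the article is pointing at.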