Discussion: An idea on how Intel could disrupt the GPU space

igor_kavinski

Jul 27, 2020
25,998
17,944
146
Upgradable GPU socket.

Upgradable RAM using LPCAMM2.

LPCAMM2 can be on the backside of the PCB, isolating it from the heat output of the GPU and making it easier to stay cool while also allowing users with AIO or custom cooling to overclock it.

Let the board be upgradable for at least two generations of GPUs.

Just imagine the amount of e-waste that could be reduced.

AMD could do it, but I suspect their consumer GPU people have their hands tied behind their backs and can't really do anything disruptive for fear of Lisa Su's backhanded smack if they fail in the slightest.

This only leaves Intel, who really have nothing to lose at this point. They can go out with a spectacular bang, with a concept that fills the hearts of techies everywhere with that amazingly soothing warm feeling of glee mixed with immense satisfaction at seeing a part of their imaginations come true.

Come on, Intel!

Come on, LBT!

@511, please tweet @ LBT with this post or idea.
 
  • Like
Reactions: Quintessa

511

Platinum Member
Jul 12, 2024
2,865
2,870
106
Upgradable GPU socket.

Upgradable RAM using LPCAMM2.

...

@511, please tweet @ LBT with this post or idea.
This is already in planning, if LBT doesn't cancel it.
 
  • Love
Reactions: igor_kavinski
igor_kavinski

Jul 27, 2020
25,998
17,944
146
Haven't seen a patent, but unlikely. BTW, the Battlematrix they released was based on one of their patents as well.
Wish they had done this at the Intel Arc launch. Intel is almost always one step behind. They need to hurry up!
 

DavidC1

Golden Member
Dec 29, 2023
1,615
2,668
96
They can't get anywhere near a high-performance GPU with an upgradeable socket and LPCAMM2. Did you not see what AMD said regarding LPCAMM and Strix Halo?

LPCAMM and its variants cannot reach the performance of a BGA solution; it merely narrows the gap, and doesn't close it fully. A socketable GPU puts you even farther away from memory.

Future Arc variants have to target multiples of Strix Halo's performance.
 
  • Like
Reactions: yottabit
igor_kavinski

Jul 27, 2020
25,998
17,944
146
LPCAMM and its variants cannot reach the performance of a BGA solution.
16 channels of LPDDR6 could give around 300 GB/s or more bandwidth, putting it on par with the bandwidth available to a 9060 XT. After two years, pop in a faster GPU and maybe double the RAM and that's great longevity and more than acceptable performance for most.
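Quick back-of-the-envelope on that, as a sketch rather than gospel: the 24-bit channel width and ~10.7 GT/s entry rate are my assumptions from early LPDDR6 coverage, and I'm assuming the 9060 XT is a 128-bit bus with 20 GT/s GDDR6.

```python
# Peak memory bandwidth, GB/s = channels * bits_per_channel * GT/s / 8.
# Channel widths and data rates below are assumptions, not final specs.

def bandwidth_gbs(channels: int, bits_per_channel: int, gtps: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return channels * bits_per_channel * gtps / 8

# 16 LPDDR6 channels (assumed 24-bit wide) at an assumed ~10.7 GT/s entry rate
print(f"LPDDR6 x16ch: {bandwidth_gbs(16, 24, 10.667):.0f} GB/s")  # ~512 GB/s

# Assumed RX 9060 XT config: 128-bit GDDR6 at 20 GT/s
print(f"9060 XT:      {bandwidth_gbs(1, 128, 20.0):.0f} GB/s")    # ~320 GB/s
```

On those assumptions, ~300 GB/s is actually conservative; even narrower 16-bit channels at ~9.6 GT/s would land around 307 GB/s.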
 

poke01

Diamond Member
Mar 8, 2022
3,727
5,057
106
16 channels of LPDDR6 could give around 300 GB/s or more bandwidth, putting it on par with the bandwidth available to a 9060 XT. After two years, pop in a faster GPU and maybe double the RAM and that's great longevity and more than acceptable performance for most.

Hopefully someday for the lower end, but not this decade.
 
  • Like
Reactions: DavidC1

Quintessa

Member
Jun 23, 2025
27
19
36
Upgradable GPU socket.

Upgradable RAM using LPCAMM2.
GPU socketing: PCIe x16 already does modularity. What you're imagining is socketed GPU cores, which introduces wild challenges: signal integrity, VRM tuning, thermal headroom per SKU, etc. Even NVIDIA ditched MXM years ago, for good reasons.
Conceptually? Love it. Practically? It's a massive lift.

LPCAMM2 can be on the backside of the PCB, ...
LPCAMM2 is still premium and niche, and placing it behind the GPU die (for thermal isolation) would make PCB routing a nightmare and further inflate the BOM. Plus, GPU memory is tightly coupled to memory controller timing, which is hard to generalize across generations.
 
  • Like
Reactions: 511

ToTTenTranz

Senior member
Feb 4, 2021
451
836
136
I would skip LPCAMM; in fact, I would skip VRAM completely. Push PCIe 7 forward, use quad-module DDR6 to get up to 512 GB/s of access to system RAM, and invest in a massive last-level cache instead, like 256 MB+ on an N6 chiplet.

The GPU PCB only gets the PCIe 7 controller, a socket, and VRMs. Quad-module DDR6 lets the GPU access crazy amounts of VRAM, like 512 GB or 1 TB, and everyone gets to run big-boy AI models at home on cheap-ish hardware.
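The usual objection to "just use system RAM" is the link bandwidth, which is exactly what the big LLC is for. A rough sketch of the effective-bandwidth math (all figures are my assumptions for illustration: ~242 GB/s per direction for a PCIe 7.0 x16 link, and an on-package LLC an order of magnitude faster):

```python
# Rough model: misses in the last-level cache (LLC) go over the PCIe
# link to system RAM, so the link only carries (1 - hit_rate) of the
# traffic. Effective bandwidth ~= link_bw / (1 - hit_rate), capped by
# what the LLC itself can stream. All figures are assumptions.

LLC_BW_GBS  = 2000.0  # assumed bandwidth of a large on-package LLC
LINK_BW_GBS = 242.0   # assumed PCIe 7.0 x16, one direction

def effective_bw(hit_rate: float) -> float:
    """Deliverable bandwidth when the miss stream is link-bound."""
    return min(LLC_BW_GBS, LINK_BW_GBS / (1.0 - hit_rate))

for hr in (0.50, 0.75, 0.90):
    print(f"hit rate {hr:.0%}: ~{effective_bw(hr):,.0f} GB/s effective")
# 50%: ~484 GB/s, 75%: ~968 GB/s, 90%: capped at the assumed ~2,000 GB/s
```

This is the same amplification argument AMD made for Infinity Cache: the cache multiplies what the slow path can deliver, as long as the working set mostly fits.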
 
  • Like
Reactions: igor_kavinski

511

Platinum Member
Jul 12, 2024
2,865
2,870
106
They are cooking Xe3P/Xe4 for next gen. Their only saving grace is if they cook them properly, on their own process.

The next-gen dGPU, if not cancelled, is Xe3P; Jaguar Shores is Xe4.
 

DavidC1

Golden Member
Dec 29, 2023
1,615
2,668
96
GPU socketing= PCIe x16 already does modularity. What you're imagining is socketed GPU cores, which introduces wild challenges: signal integrity, VRM tuning, thermal headroom per SKU, etc. Even NVIDIA ditched MXM years ago for good reasons.
Conceptually? Love it. Practically? It's a massive lift.
They abandoned MXM for different reasons, though, such as nobody wanting it. MXM is just a board, which is doable.

Socketable GPUs are much harder.
 

DavidC1

Golden Member
Dec 29, 2023
1,615
2,668
96
I do, however, like @igor_kavinski's OOB thinking.

I'm thinking maybe Intel should open up Arc. Make the PCB designs and the documentation for the firmware and GPU readily available. Nvidia locked their firmware down; AMD didn't, so you could modify theirs. Well, open it up completely: be able to go into the vBIOS and modify memory timings as easily as going into the BIOS.
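To make "open" concrete, something like this is what I mean. The file name, table offset, layout, and field names below are purely hypothetical, just to illustrate; this is not any real Arc vBIOS format.

```python
# Hypothetical sketch: with a documented vBIOS layout, editing memory
# timings becomes a simple, documented poke. Offsets/fields are invented.
import struct

vbios = bytearray(open("arc_vbios.rom", "rb").read())  # hypothetical dump
TIMING_TABLE_OFFSET = 0x1A40                           # invented offset

# Invented layout: four little-endian 16-bit timing values
fields = ("tCL", "tRCD", "tRP", "tRAS")
values = list(struct.unpack_from("<4H", vbios, TIMING_TABLE_OFFSET))
print(dict(zip(fields, values)))

values[0] -= 2  # tighten tCL by 2, like you would in a motherboard BIOS
struct.pack_into("<4H", vbios, TIMING_TABLE_OFFSET, *values)
open("arc_vbios_modded.rom", "wb").write(bytes(vbios))
```

A real flash would also need checksum fixes and signing, which is exactly the part Intel would have to document and open up.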

Maybe some crazy enthusiasts would bother making their own! It certainly would attract attention and carve out a niche. Maybe that's what "Intel branded" should really mean: a totally open reference design that others can build upon. Have the GPU core available on Digikey/Mouser.

One thing that 3dfx was criticized for was making their own boards and cutting out partners. So Intel should go back to having reference designs that make it easier for board partners.