Just wondering about the feasibility of fitting a dGPU (a really tiny and power-efficient one) into an M.2 PCI-E slot?
Ok, I'll admit the motivation behind this idea is my original DeskMini H110W rig, which has an M.2 PCI-E slot wired to the first four of the CPU's own PCI-E lanes, and that would make it perfect for a mini-GPU.
Plus, the Intel HD 630 iGPU in my G4600 basically sucks, so I would love an upgrade, but I don't want to get one of the larger STX rigs that take an MXM card.
I don't know what voltages are available on the M.2 connector, how much power it can supply, or how much heat it can dissipate. (Look at the M.2 PCI-E "Cardea" SSDs for an idea of how big an M.2 heatsink can get; something similar might work for a dGPU implementation.)
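To put rough numbers on the power question, here's a sketch. The 3.3 V rail is the one thing I'm sure M.2 provides (there's no 12 V pin); the current limit below is purely an assumed figure for illustration:

```python
# Back-of-the-envelope power budget for an M.2 dGPU.
# M.2 sockets supply only 3.3 V; the current limit below is an assumption.
rail_voltage_v = 3.3
assumed_current_a = 2.5        # assumed sustained per-socket limit

slot_power_w = rail_voltage_v * assumed_current_a
print(f"Slot power budget: ~{slot_power_w:.2f} W")    # ~8.25 W

entry_dgpu_tdp_w = 25          # ballpark TDP of an entry-level mobile dGPU
print(f"Shortfall vs. a ~25 W mobile dGPU: {entry_dgpu_tdp_w - slot_power_w:.2f} W")
```

Even with a generous current assumption, the budget lands in high-end-SSD territory, nowhere near even an entry-level mobile dGPU, so any serious chip would likely need power from somewhere else.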
For video output, it would still use the iGPU in tandem, like Lucid Virtu MVP and the like: basically, it would render on the M.2 dGPU, then copy the finished frame to the iGPU's framebuffer for display.
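To check whether that copy-back step would eat the x4 link, here's a back-of-the-envelope calculation (the 1080p60 RGBA8 target is my guess at the use case):

```python
# Rough cost of copying each rendered frame back to the iGPU over PCIe 3.0 x4.
width, height = 1920, 1080      # assuming a 1080p60 target
bytes_per_pixel = 4             # RGBA8 framebuffer
fps = 60

copy_rate_gbs = width * height * bytes_per_pixel * fps / 1e9   # ~0.50 GB/s

# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding, 4 lanes, 8 bits per byte
link_gbs = 8e9 * (128 / 130) * 4 / 8 / 1e9                     # ~3.94 GB/s

print(f"Framebuffer copy-back:     {copy_rate_gbs:.2f} GB/s")
print(f"PCIe 3.0 x4 raw bandwidth: {link_gbs:.2f} GB/s")
print(f"Share of the link:         {copy_rate_gbs / link_gbs:.0%}")   # ~13%
```

So the copy-back itself is cheap, roughly an eighth of the link; the bigger cost is that commands and texture uploads have to share those same four lanes.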
And hey, NVMe supports a Host Memory Buffer (HMB); maybe a dGPU could leverage that same sort of technology, or just implement "HyperMemory" (that's what ATI/AMD called it; NVIDIA's equivalent was TurboCache) to use system RAM for textures over the PCI-E 3.0 x4 bus.
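One caveat with borrowing system RAM: the dGPU only reaches it through that same x4 link, so the link, not the DRAM, sets the ceiling. A quick comparison (assuming DDR4-2400 dual channel, which is what a G4600 DeskMini build would typically run):

```python
# System RAM as texture memory: the PCIe link, not the DRAM, is the ceiling.
ddr4_gbs = 2400e6 * 8 * 2 / 1e9                  # DDR4-2400, 8 B/channel, dual channel: ~38.4 GB/s
pcie_x4_gbs = 8e9 * (128 / 130) * 4 / 8 / 1e9    # PCIe 3.0 x4: ~3.94 GB/s

print(f"Host DRAM bandwidth:    ~{ddr4_gbs:.1f} GB/s")
print(f"PCIe 3.0 x4 to the GPU: ~{pcie_x4_gbs:.2f} GB/s")
# Textures would stream at roughly a tenth of local DRAM speed: fine as
# spillover, painful as primary VRAM.
```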
It could be made to work, I think. Very niche, but possible.
It could also be positioned as an upgrade for modern laptops whose owners might want to do some (light) gaming.