Possibility of M.2 PCI-E 3.0 x4 dGPU? GT1030, or 1020, or something?

VirtualLarry

No Lifer
Aug 25, 2001
56,581
10,220
126
Just wondering: how feasible would it be to implement a dGPU (a really tiny and power-efficient one) in an M.2 PCI-E slot?

OK, I'll admit the motivation behind this idea is my original DeskMini H110W rig, which has an M.2 PCI-E slot wired up to the CPU's PCI-E lanes (four of them), which would also make it perfect for a mini-GPU.

Plus, the Intel HD 630 iGPU in my G4600 basically sucks, so I would love an upgrade, but I don't want to get another STX rig with an MXM card in it, since those are larger.

I don't know what voltages are available on the M.2 connector, how much power it can supply, or how much heat it can dissipate. (Look at the "Cardea" M.2 PCI-E SSDs for an idea of how big an M.2 heatsink can be; something similar might apply to a dGPU implementation.)


For video output, it would still use the iGPU in tandem, like Lucid's Virtu MVP: basically, it would render on the M.2 dGPU and then copy the output framebuffer to the iGPU's framebuffer.
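A quick sanity check on what that copy-back would cost in bus bandwidth (a rough sketch; 1080p at 60 Hz with a 32-bit framebuffer is just an example assumption):

```python
# Rough cost of copying the dGPU's finished frames back to the iGPU
# over the PCIe 3.0 x4 link (illustrative numbers, not a spec).
width, height = 1920, 1080     # assumed 1080p output
bytes_per_pixel = 4            # assumed 32-bit RGBA framebuffer
refresh_hz = 60

copy_gbs = width * height * bytes_per_pixel * refresh_hz / 1e9
link_gbs = 4 * 8 * (128 / 130) / 8   # 4 lanes @ 8 GT/s, 128b/130b encoding

print(f"Copy-back: {copy_gbs:.2f} GB/s "
      f"({copy_gbs / link_gbs:.0%} of the ~{link_gbs:.1f} GB/s link)")
```

So roughly 0.5 GB/s, about an eighth of the x4 link at 1080p60: not free, but it leaves most of the bus for actual rendering traffic.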

And hey, NVMe supports a Host Memory Buffer; maybe they could leverage that same sort of technology, or just implement HyperMemory (that's what AMD called it; NVIDIA's version was TurboCache) to use system RAM for textures over the PCI-E 3.0 x4 bus.
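For scale, here's how that x4 link compares to even a low-end card's local VRAM (a sketch; the 48 GB/s GT 1030 GDDR5 figure is from its public specs):

```python
# PCIe 3.0 x4 vs. a GT 1030's local GDDR5 bandwidth, for the
# "textures in system RAM" (HyperMemory/TurboCache-style) idea.
link_gbs = 4 * 8 * (128 / 130) / 8   # 4 lanes @ 8 GT/s, 128b/130b encoding
vram_gbs = 48.0                      # GT 1030 GDDR5: 64-bit bus @ 6 Gbps

print(f"PCIe 3.0 x4 : {link_gbs:.2f} GB/s")
print(f"GT 1030 VRAM: {vram_gbs:.0f} GB/s (~{vram_gbs / link_gbs:.0f}x the link)")
```

So system RAM over the link could hold overflow assets, but the card would still want at least some local memory, just like the old TurboCache parts had.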

It could be made to work, I think. Very niche, but possible.

It could also be positioned as an upgrade for modern laptops that want to do (light) gaming.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
I've daydreamed about this myself. IIRC M.2 and mPCIe lack a 12 V supply (and probably can't deliver enough wattage anyway), but I assume the data channels are just generic PCIe and would work with any device.

It would be hard to fit everything into the form factor, obviously, even with HBM.

Edit: Seems I was thinking of mPCIe; it appears M.2 can do higher voltages. Though I can't find the max power or the specifics of the voltages. From the Wikipedia page:
"M.2 modules are rectangular, with an edge connector on one side (75 positions with up to 67 pins, 0.5 mm pitch, pins overlap on different sides of the PCB), and a semicircular mounting hole at the center of the opposite edge. Each pin on the connector is rated for up to 50 V and 0.5 A..."
^ But I doubt you can actually get the theoretical 25 W (50 V × 0.5 A) out of every one of those 67 pins.
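Putting rough numbers on that (treat these as assumptions: M.2 supplies only 3.3 V, and I've seen about five 3.3 V power pins quoted for Socket 3 Key M):

```python
# Back-of-the-envelope M.2 slot power budget.
# Assumptions: 3.3 V supply only, ~5 power pins (Socket 3 Key M),
# 0.5 A per-pin rating from the spec quoted above.
supply_v = 3.3
amps_per_pin = 0.5
power_pins = 5          # assumed count of 3.3 V pins

budget_w = supply_v * amps_per_pin * power_pins
print(f"~{budget_w:.2f} W slot budget")   # ~8.25 W
```

That lands right around the ~7-8 W usually quoted for M.2 SSDs, and nowhere near the 30 W a GT 1030 draws.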
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,730
136
Where will you put the display outputs, and more importantly, how are you going to access them, if your plan is to drive a display with it?
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
Where will you put the display outputs, and more importantly, how are you going to access them, if your plan is to drive a display with it?
Run them over PCIe, like with an MXM card? Those don't have display outputs either ... nor do on-board GPUs in laptops, for that matter.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,730
136
Run them over PCIe, like with an MXM card? Those don't have display outputs either ... nor do on-board GPUs in laptops, for that matter.
It is the job of the laptop manufacturer to make the necessary connections between the hardware (the panel and the PCB housing the dGPU). How are you going to connect your monitor to the damn thing? Especially on X299/X99, which don't have any display outputs on the motherboard?
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
It is the job of the laptop manufacturer to make the necessary connections between the hardware (the panel and the PCB housing the dGPU). How are you going to connect your monitor to the damn thing? Especially on X299/X99, which don't have any display outputs on the motherboard?
I don't see this as meant for desktops (other than perhaps ultra-SFF desktops), but rather laptops. And for your question: how do you connect a monitor to an MXM-based dGPU in any PC that has one? With outputs run over PCIe. The Gigabyte Brix UHD does this, as does the new ASRock micro-STX standard. I don't think it's a problem at all.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
With Nvidia's GP108 being PCIe 3.0 x4, that is an interesting idea.

Although (for laptops) I'll bet the heatsink would have to be really small.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
With Nvidia's GP108 being PCIe 3.0 x4, that is an interesting idea.

Although (for laptops) I'll bet the heatsink would have to be really small.
Since when do laptop heatsinks have any physical relation to the parts they cool outside of connecting to them with heatpipes? As for UCFF PCs, I'd imagine them tying this into the main CPU HSF with some added size and heatpipes.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
With ASRock's new mSTX small form factor board with MXM GPUs, I think we're more likely to see that become reality than M.2 GPUs.
I think you have a point here, but the markets are different. Or, in other words, this would allow for a reasonably powerful, replaceable dGPU in roughly a NUC form factor. That's a lot smaller than micro-STX. You can't fit an MXM card in a 13-14" laptop without sacrificing a lot of battery capacity either. Of course, I really hope mSTX can get retail sales of MXM cards going. It's about time.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
With Nvidia's GP108 being PCIe 3.0 x4, that is an interesting idea.

Although (for laptops) I'll bet the heatsink would have to be really small.

Since when do laptop heatsinks have any physical relation to the parts they cool outside of connecting to them with heatpipes? As for UCFF PCs, I'd imagine them tying this into the main CPU HSF with some added size and heatpipes.

M.2 SSDs in laptops aren't normally cooled by the heatpipe system, right?

...So it would need its own (independent) heatsink.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
I think you have a point here, but the markets are different. Or, in other words, this would allow for a reasonably powerful, replaceable dGPU in roughly a NUC form factor. That's a lot smaller than micro-STX. You can't fit an MXM card in a 13-14" laptop without sacrificing a lot of battery capacity either. Of course, I really hope mSTX can get retail sales of MXM cards going. It's about time.
I think the NUC (the specific Intel box) is too small and also somewhat nonsensical for this type of application. Like you said, the heat/power constraints within that form factor are the limiting factor. An M.2 GT 1030 or a further-stripped 1020 would still add another 30-50 W to the overall TDP of the system, perhaps less with undervolted/downclocked/golden samples. At what point does the GPU itself become so costly and low-performance that the Intel Iris IGP (or someday an AMD Vega APU) is the better and more cost-effective choice? I think the BRIX answered that question for us already - those are effectively a NUC with an MXM riser card. How many people have upgraded the GPU in their BRIX...?

Micro-STX appears to be its own market for sure, but realistically it overlaps both Mini-ITX and BRIX by mashing up desktop CPU parts with mobile RAM and native MXM. The hard sell will be paying nearly 90% more for the GPU ($720 for an MXM GTX 1070 on eBay vs. $380 for a stubby PCIe GTX 1070). Even a low-end, used 940M MXM is $200, and Intel's top mobile Iris Pro 580 can trade blows with it for less than that. That's the very same hard sell that keeps most people from upgrading their laptops and BRIX units. The only advantage I see with mSTX is greater heat dissipation and the prospect of smaller (or at least differently-shaped) cases than mITX.
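(For reference, that premium works out as follows; eBay spot prices obviously move around:)

```python
# MXM price premium over a regular PCIe card, from the prices above.
mxm_price, pcie_price = 720, 380   # GTX 1070: MXM (eBay) vs. stubby PCIe
premium = (mxm_price - pcie_price) / pcie_price
print(f"{premium:.0%} more for the MXM version")   # ~89%
```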

I'd love to see more variation in how GPUs are built and offered, but the cost seems too damn high!
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
M.2 SSDs in laptops aren't normally cooled by the heatpipe system, right?

...So it would need its own (independent) heatsink.
I don't envision this as a slot-in replacement in laptop designs that currently have m.2 SSDs - that would be pretty much impossible in terms of cooling. Heck, most laptops seem to stick the m.2 drives in the most tucked-away corner of the chassis possible. But it would hardly be a challenge at all for laptop manufacturers to incorporate strategically placed m.2 slots for graphics where needed - in the vicinity of the CPU and/or an HSF, where one or two heatpipes could easily pass over it (in the case of a unified cooling solution) or terminate there (in the case of a dedicated one for the dGPU).
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
I think the NUC (the specific Intel box) is too small and also somewhat nonsensical for this type of application. Like you said, the heat/power constraints within that form factor are the limiting factor. An M.2 GT 1030 or a further-stripped 1020 would still add another 30-50 W to the overall TDP of the system, perhaps less with undervolted/downclocked/golden samples. At what point does the GPU itself become so costly and low-performance that the Intel Iris IGP (or someday an AMD Vega APU) is the better and more cost-effective choice? I think the BRIX answered that question for us already - those are effectively a NUC with an MXM riser card. How many people have upgraded the GPU in their BRIX...?

Micro-STX appears to be its own market for sure, but realistically it overlaps both Mini-ITX and BRIX by mashing up desktop CPU parts with mobile RAM and native MXM. The hard sell will be paying nearly 90% more for the GPU ($720 for an MXM GTX 1070 on eBay vs. $380 for a stubby PCIe GTX 1070). Even a low-end, used 940M MXM is $200, and Intel's top mobile Iris Pro 580 can trade blows with it for less than that. That's the very same hard sell that keeps most people from upgrading their laptops and BRIX units. The only advantage I see with mSTX is greater heat dissipation and the prospect of smaller (or at least differently-shaped) cases than mITX.

I'd love to see more variation in how GPUs are built and offered, but the cost seems too damn high!
The cost is high because there is no retail market for them, and there is no retail market because the cost is too high and there are barely any PCs out there with MXM slots that aren't BIOS-locked to the card they came with. That's why MXM replacements never happen. The idea (both behind mSTX and in general) is that removing one of these obstacles will also remove the other. There's no actual reason for MXM cards to be any more expensive than retail PCIe GPUs. While they might need higher-efficiency power delivery, they don't need large PCBs, display outputs, or even a bundled cooler - which should at the very least make them comparably priced, if not cheaper, once the market is there. And if there were a large enough potential market of PCs with upgradeable MXM slots, some manufacturer would start making cards for retail. I believe all it would take is a serious push from Nvidia or some other big actor - which seems to be what ASRock is trying to start.

As for the BRIX, I don't know if you're talking about the Brix UHD series (mini-towers) or the more NUC-like Brix Gaming (which didn't use MXM), but the UHD series at least seems to be trying something similar to ASRock. As for the NUC form factor, have you seen how utterly tiny the 15 W heatsinks in those are? Even relative to the minuscule chassis, they're small. If they added half an inch or so of thickness and used most of it for GPU cooling, handling a 30 W GPU in a NUC form factor alongside a 15 W CPU would be no issue at all. Remember, the Dell XPS 15 cools a 45 W CPU and a ~50 W dGPU in a very, very thin laptop chassis, and still fits a huge battery and a lot of other stuff in there. Dissipating ~50-75 W in a tiny chassis with a single decent-sized blower fan and a well-designed heatsink really wouldn't be an issue. It wouldn't be as small as a NUC, but it doesn't need to be. It would be very attractive even at 2x the thickness of the (non-2.5") NUC.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
I see where you're going, and yes, a well-designed HSF setup could do the trick in a NUC, but then that's another issue on its own. ATX/ITX has an advantage in that HSF designs can vary significantly from one another because of the internal space to work with. "WindForce", "StriX", etc. offer branding and competitive marketing. In a NUC or similar-sized case, manufacturers/vendors would have to reduce their normal flair/bling and conform to a one-size-fits-all solution... or each GPU would just come with its own HSF that wouldn't be changeable. If mSTX is going to attempt to be something competitive with mITX, then it's gonna take a lot more than just getting cheaper MXM chips to market.

Don't get me wrong, I WANT IT, but not enough to be an early adopter at these prices. LOL

As for the BRIX, I don't know if you're talking about the Brix UHD series (mini-towers) or the more NUC-like Brix Gaming (which didn't use MXM), but the UHD series at least seems to be trying something similar to ASRock.
I'm talking about the original BRIX units that were slightly taller than the NUC. They did use an MXM board connected via a bridge:
[attached image: IMG_9133.jpg]
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I don't envision this as a slot-in replacement in laptop designs that currently have m.2 SSDs - that would be pretty much impossible in terms of cooling.

Maybe it could work in a laptop that was made of aluminum and had a thermal pad in between the M.2 GPU and the aluminum chassis?

See posts 3, 4, 5, and 6 in the thread below for an example of how a thermal pad can conduct heat from an SSD to an aluminum laptop chassis:

http://www.overclock.net/t/1597862/diy-cooling-for-an-m-2-ssd-in-a-laptop
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
Isn't that how lots of MacBook customers burned their legs a few years ago? :D

There have been a couple of passive, aluminum-finned "heat sink cases" sold that do just that - conduct heat to the chassis. Problem is that you still need good ventilation around the case; otherwise the temps just keep rising and rising.

@VirtualLarry - do you think this topic will be of any concern to you once we get AMD Raven Ridge (Ryzen+Vega APU)? If they can get close to this price with a 4C/8T model... OMG
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
Maybe it could work in a laptop that was made of aluminum and had a thermal pad in between the M.2 GPU and the aluminum chassis?

See posts 3, 4, 5, and 6 in the thread below for an example of how a thermal pad can conduct heat from an SSD to an aluminum laptop chassis:

http://www.overclock.net/t/1597862/diy-cooling-for-an-m-2-ssd-in-a-laptop
Well, that might work for a <10W GPU, but what would be the point of that? It would hardly outperform the iGPU of any modern CPU (except for power throttling, of course). Might work, but I don't see this as much of an opportunity. My vision of this is for new designs of Ultrabooks and similar thin-and-light laptops with specially designed m.2 slots for GPUs, complete with heatpipes. Especially since I very much doubt current laptops with m.2 SSDs would recognize a GPU in there at all, with all the hardware whitelist shenanigans manufacturers love.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
There have been a couple of passive, aluminum-finned "heat sink cases" sold that do just that - conduct heat to the chassis. Problem is that you still need good ventilation around the case; otherwise the temps just keep rising and rising.

Use a laptop pad with fan(s)?
 

wilds

Platinum Member
Oct 26, 2012
2,059
674
136
Isn't that how lots of MacBook customers burned their legs a few years ago? :D

The main problem with hot Macs is Apple's insistence on keeping the fans at idle speed until the CPU hits 100 °C! This, coupled with Intel Turbo Boost always being on, means MacBooks run insanely hot out of the box.

Only 3rd-party software can make them run cool and quiet... MacsFanControl + Turbo Boost Switcher = a silent MacBook below 70 °C.

No idea why Apple does this... I wouldn't want to buy a used Mac from someone who has been running their CPU at 90 °C for years and years, though!

Use a laptop pad with fan(s)?

Or just disable Intel Turbo Boost, which fixes all the heat problems in the first place :)

Disabling Turbo leaves so much more thermal headroom for the dGPU. Most laptops share heatpipes between the CPU and GPU, so if the CPU gets hot, so does the GPU. In most scenarios Turbo isn't needed at all, which makes it just a detriment in these situations.