Using an M.2 slot for a GPU

ericlp

Diamond Member
Dec 24, 2000
6,133
219
106
Just don't burn your house down. If you are tripping breakers, don't put a bigger breaker on that yellow #14 wire. Add more circuits instead, like a few #12 runs with dedicated outlets.

I ran into that problem running DC equipment in the bedroom furthest from the breaker panel. Remember, kids, to also calculate the voltage drop over distance. I'd laugh about my mistakes, but house fires aren't really a laughing matter. Lesson learned: get thicker cable like #10 wire if you have to go the distance, or figure out how to get your rigs as close to that breaker box as possible.
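For a rough sanity check, here's a quick sketch of that voltage-drop math. The copper resistances are standard AWG values; the 60 ft run and 12 A load are just example numbers I made up:

Code:
# Round-trip voltage drop on a copper branch circuit.
# Standard copper resistance per AWG (ohms per 1000 ft).
OHMS_PER_KFT = {"#14": 2.525, "#12": 1.588, "#10": 0.999}

def voltage_drop(gauge, one_way_ft, amps, volts=120.0):
    # Current flows out and back, so the wire length doubles.
    r = OHMS_PER_KFT[gauge] / 1000.0 * (2 * one_way_ft)
    drop = amps * r
    return drop, 100.0 * drop / volts

# Example: a 60 ft run pulling 12 A (made-up numbers).
for gauge in ("#14", "#12", "#10"):
    drop, pct = voltage_drop(gauge, one_way_ft=60, amps=12)
    print(f"{gauge}: {drop:.2f} V drop ({pct:.1f}% of 120 V)")

On a run that long, #14 loses about 3% at the outlet while #10 keeps it near 1%, which is exactly why thicker wire (or a shorter run) matters.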
 

UsandThem

Elite Member
May 4, 2000
16,068
7,380
146
[image of the rig]


Wowsa, now that's a machine!

And I thought my toy hobby was expensive.
 

TennesseeTony

Elite Member
Aug 2, 2003
4,209
3,634
136
www.google.com
Pretty cool little adapter.

[image: M.2-to-PCIe adapter]


If you want to remove the "Question" prefix, edit your post; in the title bar there is a drop-down menu on the left, where you choose the last option, "(no prefix)". No prefix is selected by default, but it still shows up as "Question" thanks to the flawless rollout of the upgraded forum software. ;)
 

StefanR5R

Elite Member
Dec 10, 2016
5,498
7,786
136
The box pictured in post #5 has the 1080 Tis connected with the usual single-lane extenders. These will noticeably bottleneck the GPUs in some (but not all) GPGPU applications. Folding@Home will be especially affected, and I suppose SETI@Home will take a bit of a hit too. (Even more so if operated under Windows; Linux would lessen the blow somewhat because its driver architecture requires less bus traffic.)

The OP of the S@H thread states:
Ian&Steve C. said:
From what others have commented in other threads here, other projects like Einstein@home do see a performance hit on low PCIe bandwidth connections, so keep your own personal goals in mind when setting up your machine. But for this thread, I'll focus on SETI, which sees little or no performance impact all the way down to a PCIe x1 interface *as long as you have at least PCIe gen2, and ideally gen3*. You WILL see a performance hit if you try this method on very old PCIe gen1 stuff.
I am skeptical. I admit I haven't yet looked through the S@H forum for comparative measurements. But I had watched S@H's bus utilization in wide enough slots (PCIe v3 x8...x16) and found that a 1080 Ti uses about the equivalent of PCIe v3 x2 or PCIe v2 x4 if it can get it:
Here are measurements of SETI@Home's PCIe bus utilization, taken from 3x GTX 1080 Ti in PCIe v3 x16/x16/x8 slots, obtained with nvidia-smi at a 1-second reporting period, running for over an hour on Windows and on Linux, respectively.

The numbers are: average / 90%-quantile / peak.
RX = PCIe reception, TX = PCIe transmission

Windows 7 Pro 64-bit, Nvidia driver 387.92, SETI@home v8 8.22 opencl_nvidia_SoG
RX: 920 / 1,400 / 1,800 MB/s
TX: 160 / 220 / 520 MB/s

Linux Mint 64-bit, Nvidia driver 384.111, setiathome_v8 8.01 cuda90
RX: 340 / 520 / 1,400 MB/s
TX: 80 / 110 / 2,100 MB/s
Well, going by the 90%-quantiles, maybe the Linux cuda90 application can get by with PCIe v2 x1 without a considerable performance hit.
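If anyone wants to reproduce those average / 90%-quantile / peak numbers, here is a minimal sketch. It assumes a log captured with something like nvidia-smi dmon -s t -d 1 (PCIe Rx/Tx throughput once per second), and that the two numeric columns after the GPU index are rxpci and txpci in MB/s:

Code:
# Parse a log from:  nvidia-smi dmon -s t -d 1 > pcie.log
# and print average / 90%-quantile / peak for RX and TX.
import statistics

rx, tx = [], []
with open("pcie.log") as f:
    for line in f:
        if line.startswith("#"):          # skip the header lines
            continue
        fields = line.split()
        if len(fields) >= 3:              # gpu index, rxpci, txpci
            rx.append(float(fields[1]))
            tx.append(float(fields[2]))

for name, data in (("RX", rx), ("TX", tx)):
    p90 = statistics.quantiles(data, n=10)[8]   # 90%-quantile
    print(f"{name}: {statistics.mean(data):,.0f} / {p90:,.0f} / {max(data):,.0f} MB/s")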
Data rates of PCIe versions:
Code:
lanes      x1     x2     x4     x8     x16
------------------------------------------------
PCIe v3   985  1,970  3,940  7,880  15,760  MB/s
PCIe v2   500  1,000  2,000  4,000   8,000  MB/s
PCIe v1   250    500  1,000  2,000   4,000  MB/s
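For reference, those per-lane figures fall straight out of the line rate and the encoding overhead: PCIe v1/v2 use 8b/10b encoding, v3 uses 128b/130b. A quick sketch:

Code:
# Per-lane PCIe data rates from line rate and encoding overhead.
GT_PER_S = {"v1": 2.5e9, "v2": 5.0e9, "v3": 8.0e9}   # transfers per second
ENCODING = {"v1": 8 / 10, "v2": 8 / 10, "v3": 128 / 130}

for gen in ("v3", "v2", "v1"):
    per_lane = GT_PER_S[gen] * ENCODING[gen] / 8 / 1e6   # MB/s per lane
    row = "  ".join(f"{per_lane * n:>7,.0f}" for n in (1, 2, 4, 8, 16))
    print(f"PCIe {gen}  {row}  MB/s")
# (the table above rounds per-lane v3 to 985 MB/s before multiplying)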

Regarding the CPU: I wonder if the 4 cores (but 8 threads) of the i7-7700K in this build are enough to drive six big GPUs in Folding@Home and some other CPU-intensive GPGPU applications. Maybe so on Linux, and a little less so on Windows again. BTW, the 7700K (with Intel's thick application of toothpaste between die and heat spreader, combined with the small "reference" cooler, all of it sitting in the exhaust stream of six 250 Watt GPUs) is bound to get cooked pretty well in this build.

A general note on builds as dense as this: To operate such a thing, you need a separate room where the noise doesn't pose a problem. But if you have such a room, why aim for such density then?
 

lane42

Diamond Member
Sep 3, 2000
5,721
624
126
I like to call these types of rigs KEEP THE WIFE HAPPY computers.
"One computer, dear" is what I hear. So stuff 'em with as many GPUs as you can
and call it a day. I sometimes envy you single guys :)