Why aren't we seeing more PCI-E 3.0 x1 dGPUs?

VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126
I mean, the GT 1030 is only PCI-E 3.0 x4, right? Why don't they offer two PCBs, preferably of the low-profile, single-slot variety, one of them with a PCI-E x1 edge connector?

I have an actual use-case for these, too.

I've got some Haswell H81 Biostar flex-ATX mobos. They're like ITX, but the first slot is a PCI-E x1, and the PCI-E x16 is the second slot over.

But they're in Winsys WT-02 cases, which are ITX SFF cases about micro-ATX size. They only (barely!) accept a dual-slot flex-ATX mobo, and they have only a single low-profile expansion slot (slot #1, which is the PCI-E x1 on my boards).

This appears to be the newest tech available in PCI-E x1: a Zotac GT 710.

https://www.newegg.com/Product/Prod..._re=PCIe_x1_video_card-_-14-500-395-_-Product

I would really like a GT 1030 in PCI-E x1, for both 4K60 UHD output and VP9 4K60 decoding.
 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
When people plug it into PCI-E 1.x and 2.0 slots, it's going to be seriously bottlenecked.

I agree with the previous post that this is probably a very low-demand product, but it might happen at some point.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
VirtualLarry, another option would be to "open end" a PCIe x1 slot.

Here is a video discussing this (note the dangers of doing this):


Alternatively, you could always look for a motherboard with an open-ended PCIe x1 slot from the factory. (I have seen them, but they are rare.)
 
  • Like
Reactions: psolord

VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126
I'll give some thought to modifying the slot, although I don't know if a PCI-E x1 slot will supply enough current to power an x16 card. I think x1 slots max out at 25W, whereas we know PCI-E x16 slots are supposed to go to 75W.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I think x1 slots max out at 25W,

So probably we would see a PCIe x1 card with a "GT 1020" or "GT 1010" then.

And this could very well happen if Nvidia decides to use harvested GP108 dies for desktop.
 

Dizzious

Junior Member
Sep 17, 2009
3
2
66
I'll give some thought to modifying the slot, although I don't know if a PCI-E x1 slot will supply enough current to power an x16 card. I think x1 slots max out at 25W, whereas we know PCI-E x16 slots are supposed to go to 75W.
All of the speeds can provide the same power. With PCI-E, the notch in the connector separates the "power" part of the connector from the "data" part.
[Image: PCI-E edge connectors of different lane widths, showing the keying notch]
Note that the part of the connector to the left of the notch is the same size, regardless of the speed. All the speeds are supposed to support the same power output.
That was one of the cool things about PCI-E when they first released it - the power and data lanes are all the same, but card manufacturers can choose to put various numbers of data lanes on their cards. Since the power spec for all of them is the same, it provides the option of installing any smaller card into a larger slot, and usually allows installing a larger card into an "open ended" slot. When PCI-E killed off the old AGP slots, the idea was that there would be no hard incompatibility between "graphics card slots" and "other peripheral" slots.

I'm working on a Mini-ITX build right now using an old ASRock Q1900-ITX board, and will be putting a GeForce GT 1030 GDDR5 card into it. I was also trying to find one in PCIe x1, but alas there are none, and I don't want to go all the way down to GeForce GT 710 performance. Originally I was going to just take a Dremel and cut off the other 15 data lanes from the card itself, but during the course of writing this post I realized I actually have one of these things lying around:
A PCIe x16-to-x1 adapter. So I'll probably just use that.
These adapters are pretty easy to get now - the demand for them due to crypto mining has increased their availability. I think I got mine for like $6 on Newegg a year ago.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
All PCI express cards may consume up to 3 A at +3.3 V (9.9 W). The amount of +12 V and total power they may consume depends on the type of card:[15]:35–36[16]
  • ×1 cards are limited to 0.5 A at +12 V (6 W) and 10 W combined.
  • ×4 and wider cards are limited to 2.1 A at +12 V (25 W) and 25 W combined.
  • A full-sized ×1 card may draw up to the 25 W limits after initialization and software configuration as a "high power device".
  • A full-sized ×16 graphics card[12] may draw up to 5.5 A at +12 V (66 W) and 75 W combined after initialization and software configuration as a "high power device".

All cards below x16 are supposed to be limited to 25 watts.
That does not mean that manufacturers comply.

Wiki PCIE article and pages 19, 20 here:
http://e2e.ti.com/cfs-file/__key/co...nts-files/639/7851.PCIe_5F00_designGuides.pdf
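
To put those limits next to the cards being discussed, here's a quick sanity check (a rough sketch in Python; the slot limits come from the figures quoted above, while the TDP numbers are the commonly quoted board TDPs and are assumptions that vary by vendor):

Code:
# Rough power-budget check: does a card's TDP fit a given slot limit?
# Slot limits taken from the PCIe figures quoted above (watts, after
# "high power device" configuration where applicable).
SLOT_LIMITS_W = {
    "x1 (standard)": 10,
    "x1 (full-size, high-power)": 25,
    "x4 and wider (non-graphics)": 25,
    "x16 graphics (high-power)": 75,
}

# Commonly quoted board TDPs in watts (approximate; varies by vendor).
CARD_TDP_W = {
    "GT 710": 19,
    "GT 1030 DDR4": 20,
    "GT 1030 GDDR5": 30,
    "RX 550": 50,
}

for card, tdp in CARD_TDP_W.items():
    for slot, limit in SLOT_LIMITS_W.items():
        verdict = "fits" if tdp <= limit else "needs extra power"
        print(f"{card:14s} ({tdp:2d} W) in {slot:28s}: {verdict}")

By those figures, only the GT 710 and the DDR4 GT 1030 would squeak under a 25W budget without an external power connector.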
 
  • Like
Reactions: VirtualLarry

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
All cards below x16 are supposed to be limited to 25 watts.

While this is somewhat of a necrothread, that's the main issue.

Which limits choices to ultra-low-end GT 710 class GPUs. You -might- be able to fit a GT 1030/RX 550 (at reduced frequencies etc.) into a 25W TDP, but I doubt there'd be much of a market for it. Probably why no one has bothered.

Besides, if all you need is basic functionality, a GT 710 is plenty. And widely supported too.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
I think the TDP of the GT 1030 DDR4 is 20W, so it should be OK in that regard!?
 

PingSpike

Lifer
Feb 25, 2004
21,732
561
126
There are different types of adapters like the one Dizzious posted. Some of them provide the option to attach a Molex connector for additional power to the slot. This seems to be a frequently used setup for mining rigs, and apparently works for much more powerful cards than things like the GT 1030 or GT 710.

My main complaint about these adapters is that it seems hard to find a high-quality one that you aren't worried about starting a fire with, even if you're willing to pay.
 

thecoolnessrune

Diamond Member
Jun 8, 2005
9,672
578
126
I think it's like @SPBHM stated: the people most likely to get a bargain-basement card are also the people most likely to have PCI-E 1.x or 2.x boards. While an x1 PCI-E 3.0 slot wouldn't be bad for a 1030, an x1 PCI-E 1.0 slot would put a serious hurt on even some of the lowest GPUs' performance numbers. An x8 slot as the minimum provides a bargain-basement guarantee of performance even on the lowest-speed boards.
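
For a rough sense of the link bandwidth involved, here's a small sketch using the approximate usable throughput per lane after line-coding overhead (real-world numbers are a bit lower still, so treat this as a ballpark):

Code:
# Approximate usable bandwidth per PCIe lane, in GB/s, after 8b/10b
# (gen 1/2) or 128b/130b (gen 3) encoding overhead.
PER_LANE_GBPS = {"PCIe 1.x": 0.25, "PCIe 2.0": 0.5, "PCIe 3.0": 0.985}

for gen, per_lane in PER_LANE_GBPS.items():
    for lanes in (1, 4, 8, 16):
        print(f"{gen} x{lanes:<2d}: ~{per_lane * lanes:5.2f} GB/s")

An x1 link on a PCIe 1.x board is only ~0.25 GB/s, which is why an x1-only card would be crippled there, while an x8 connector on that same old board still gets ~2 GB/s.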
 

mirddes

Junior Member
Oct 22, 2020
4
2
41
Let's make a GT 1010 for AGP...

Honestly though, a modern GPU on a single-slot, low-profile PCIe 3.0 x1 card would be really, really nice. It doesn't need to be amazing. It just needs to exist.

It would be great for multiseat deployments in rural villages.
 
  • Haha
Reactions: aigomorla

Jimminy

Senior member
May 19, 2020
344
127
86
Why don't we have two types:
Video cards.
Mining cards.

Each could be optimized for its exact application ... no compromises.

No more having to use a pipe wrench as a torque wrench. Two different types of wheels for us cave men.

Ogabooga ... Coins ... cha ching. Me spend.

Ogabooga ... video picture. Me see.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,841
3,189
126
Well, mining cards have very little resale value, while gaming cards that can also be used for mining can be resold quite easily.

If we disable mining abilities on a video card, I believe people would easily find a way to bypass that with a customized BIOS, so it wouldn't really help.

The only way is to make the mining cards not have a video output, like some Quadros did, and sell them at a discount. But even then, the typical miner can recoup about 50% of a video card's price by reselling it after the mining fad is gone.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126
Why don't we have two types:
Video cards.
Mining cards.
I guess that MIGHT make some sense, if the two different cards didn't depend on mostly the SAME manufacturing resources and components. As well, after the last mining boom went bust, cards without ANY display output were largely worthless to miners (those that aren't renting a warehouse), because they couldn't mine any more either; the DAG size had grown past the size of their onboard frame-buffer. If they had only put AT LEAST ONE video output onboard, they could have been thrown into some sort of entry-level gaming PC, or simply used as a display-output card. They would at least have been useful in SOME way, had they had a display output port.
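
On the DAG-size point: the Ethash dataset started around 1 GiB and grows by roughly 8 MiB per epoch (30,000 blocks), which is why 4 GB cards eventually aged out of ETH mining. A back-of-the-envelope sketch (this is the linear approximation of the spec constants, not the exact prime-adjusted formula):

Code:
# Approximate Ethash DAG size by block number (linear approximation;
# the real spec rounds the dataset down to a prime-sized value).
DATASET_BYTES_INIT = 2**30    # ~1 GiB at epoch 0
DATASET_BYTES_GROWTH = 2**23  # ~8 MiB added per epoch
EPOCH_LENGTH = 30_000         # blocks per epoch

def approx_dag_gib(block_number: int) -> float:
    epoch = block_number // EPOCH_LENGTH
    return (DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch) / 2**30

for block in (0, 11_500_000, 13_000_000):
    print(f"block {block:>10,}: ~{approx_dag_gib(block):.2f} GiB")

Around block ~11.5M (late 2020) the DAG approached 4 GiB, and 4 GB cards dropped off the Ethereum network shortly after.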

I am 100% AGAINST any sort of card WITHOUT any sort of display output. Call it a "mining card" or (eventually) a "useless hunk of junk"; it still makes NO sense from a longer-term as well as an e-waste perspective. I think they should be legislatively banned in all major countries as an anti-e-waste initiative against "intentionally useless technology"; that's how much I loathe the idea.

What I WOULD agree with is a "mining edition" card with only ONE display output, rather than four, and NO PACKAGING. Only an anti-static bubble-sleeve, bulk-packed in some strong boxes, intended for "bulk orders". No need for fancy printed retail boxes. Sure, maybe that lowers their resale value as well, but cards sold that way should be cheaper in bulk, too.

Edit: Someone made the argument that mining cards could utilize GPU silicon that might otherwise go to waste, if the ROPs/texture units were BAD and only the compute units were OK. Those GPUs might make sense to use in mining cards without display outputs, if the display path was going to be "bad" anyways. I guess that might make some sense. But what percentage of the silicon yields end up as those types of chips? A minute fraction, I expect.

Another possibility: as much finished-GPU volume as mining users buy, it still pales in comparison with the volume purchased by gamers. But if it were enough of a percentage, perhaps NV would think about spinning up some (smaller) custom silicon variants that ELIMINATE unnecessary portions of the GPU, such as ROPs/texture units, and make an actual COMPUTE-SPECIFIC GPU at a lower-end level (purchasable by mere mortals, not just big-$$$ corporations buying the likes of the GA100).

Edit: If GPU-based mining continued to be a "thing" (and EIP-1559 wasn't hanging over the necks of ETH miners), then I could see NV and AMD naturally going in this direction and actually making mining-specific GPU silicon for a lower price: less capable overall (couldn't play games), but available in bulk, not taking away from gamer finished-GPU stocks, and still saleable to miners for that purpose, and that purpose only.

With such mining-only GPU silicon available, NV and AMD could also "dump" (blend in) gamer-card GPU silicon that was produced with defects in the portions the mining GPUs don't use, thus reducing e-waste overall.
 
Last edited:

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Why don't we have two types:
Video cards.
Mining cards.
The only way is to make the mining cards not have a video output, like some Quadros did, and sell them at a discount. But even then, the typical miner can recoup about 50% of a video card's price by reselling it after the mining fad is gone.

Already been tried. Miners just buy BOTH variants, and continue like nothing happened.

honestly though a modern GPU on a single slot low profile pcie3 1x card would be really really nice. it doesn't need to be amazing. it just needs to exist.

If you just need video outputs in a small form factor, this might be what you're looking for:

https://www.asus.com/Motherboards-Components/Graphics-Cards/All-series/GT710-4H-SL-2GD5/
 

PingSpike

Lifer
Feb 25, 2004
21,732
561
126
I don't see how mining cards alleviate pressure anywhere. There is a shortage of the chips and other materials at the end of the day. You make some mining cards; if they're cheaper, miners might buy those out first, but it still took chips away that could have been gaming cards of some type. Then they show up on the used market... but they're useless for gaming, so they don't alleviate any pressure there either. If anything, all making mining cards does is kill supply on the used gaming market, which is probably the real reason they make them.

It's not like there is a shortage of DVI ports or something. The only parts they disable just aren't big cost savers.

If you wanted to make a gaming-ONLY card, it is actually possible: make the card refuse to run, or throttle down to nothing, unless it negotiates at least x4 PCIe lanes. The vast majority of mining cards are run in rigs that use splitters and x1 ports on a motherboard. This would force miners to blow money on more mining rigs just to use lots of the gaming cards. Then RAISE the price on mining-only cards that didn't require x4 lanes, or just do away with them altogether.
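
As a sketch of what that lane-count check could look like on Linux, the negotiated link width is exposed in sysfs (the device address below is just a placeholder example; a real implementation would live in the card's firmware or driver, where it would also be harder to patch out):

Code:
# Minimal sketch: read the negotiated PCIe link width/speed for a GPU
# from Linux sysfs and decide whether to allow full performance.
from pathlib import Path

def negotiated_link(bdf: str = "0000:01:00.0"):  # placeholder address
    dev = Path("/sys/bus/pci/devices") / bdf
    width = int((dev / "current_link_width").read_text())
    speed = (dev / "current_link_speed").read_text().strip()
    return width, speed

width, speed = negotiated_link()
if width < 4:
    print(f"Only x{width} negotiated at {speed} -- throttling.")
else:
    print(f"x{width} negotiated at {speed} -- full speed allowed.")

Of course, as aigomorla noted above, a determined miner would just mod the VBIOS or driver around any such check.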
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,699
136
If that were the case, they could just use HEDT or surplus server hardware to build mining rigs with seven physical x16 slots. Even if you're buying a new seven-slot X299 board with CPU, that adds <$1000 to the price of your rig, which is about a 10% increase in the total rig price if you're mining on 3080s. Smaller-scale miners that don't need dozens of the same boards to ease deployment could just buy old eight-x16-slot server hardware for a couple hundred bucks per platform.
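
The back-of-the-envelope math behind that ~10% figure, as a sketch with illustrative, assumed 2021-era street prices (none of these exact prices come from the post above):

Code:
# Rough check of the "<$1000 is ~10% of the rig" claim above.
# All prices are illustrative assumptions (2021-era street prices).
GPU_PRICE = 1_400        # assumed price per RTX 3080 (USD)
GPUS_PER_RIG = 7

riser_platform = 400     # assumed cheap board + CPU + x1 risers
hedt_platform = 1_400    # assumed 7-slot X299 board + CPU

base = GPU_PRICE * GPUS_PER_RIG + riser_platform
delta = hedt_platform - riser_platform
print(f"Base rig: ${base:,}; switching to HEDT adds ${delta:,} "
      f"(~{delta / base:.0%} of the total).")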
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,991
136
So what brought on the sudden interest in the dark arts of thread necromancy that dug this out of the ground?
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,808
7,162
136
I'm honestly a little surprised AMD/NV have not gotten in on the FPGA game.

Seems a little like small potatoes for these companies to design an FPGA geared for mining, then mass-produce it on an older, cheaper node. I have to figure the gains of the FPGA over the GPU would offset the older node. Must not be as lucrative as I assume it is.
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,699
136
I'm honestly a little surprised AMD/NV have not gotten in on the FPGA game.

Seems a little like small potatoes for these companies to design an FPGA geared for mining, then mass-produce it on an older, cheaper node. I have to figure the gains of the FPGA over the GPU would offset the older node. Must not be as lucrative as I assume it is.
FPGAs did well to bridge the gap between GPUs and ASICs for Bitcoin, because the algorithm was very compute-heavy and required very little in terms of memory bandwidth. Even then, the majority of FPGA miners were made using secondhand FPGAs that wouldn't have been able to scale in massive quantities; the ones Butterfly Labs was using cost several thousand dollars new. Ethereum mining is quite memory-intensive, and that's not something most FPGAs excel at.
 
  • Like
Reactions: GodisanAtheist

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,991
136
I'm honestly a little surprised AMD/NV have not gotten in on the FPGA game.

Seems a little like small potatoes for these companies to design an FPGA geared for mining, then mass-produce it on an older, cheaper node. I have to figure the gains of the FPGA over the GPU would offset the older node. Must not be as lucrative as I assume it is.

AMD just bought Xilinx, who are a major player in the FPGA area. Hell, they're one of the companies (along with Altera) that pretty much pioneered the technology.

However, most Bitcoin mining is done on ASICs, which are dedicated hardware and will always have an advantage over an FPGA for a dedicated task like that. An FPGA would be better in that it could be reconfigured to mine different types of cryptocurrencies depending on shifts in valuation, but I don't know if there's enough valuation shift in the currencies to make up for the performance uplift that an ASIC grants, especially when process improvements obsolete hardware faster than just about anything else.

Bitcoin is big enough that someone can probably afford the prices of cutting-edge wafers, so even though a company like Xilinx could use a cutting-edge process just by virtue of having so much volume and flexibility with an FPGA, they still won't be able to achieve an economic edge over an ASIC.
 
  • Like
Reactions: GodisanAtheist