How much PCI-e bandwidth do you really need for basic tasks? Not gaming.

PingSpike

Lifer
Feb 25, 2004
21,765
615
126
I've seen plenty of PCI-e scaling benchmarks for gaming. The short answer there is that it's not that bad until you drop below x4, though it depends on the application and the card.

Obviously, x1 PCI-e cards exist. They're not great gaming cards, given their limited PCI-e bandwidth and the tiny power envelope they have to maintain; in my estimation they're really meant for other things. That said, I can't find any reviews saying how much worse, if at all, they are at, say, decoding YouTube video or just displaying your desktop. Every review is all about gaming.
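
For a sense of scale, the theoretical per-direction link bandwidth falls straight out of the transfer rate and line encoding, so here's a quick back-of-the-envelope sketch (just spec math, not benchmark data):

```python
# Rough per-direction PCIe bandwidth from the spec numbers:
# gen 1/2 use 8b/10b line encoding, gen 3/4 use 128b/130b.
PCIE_GENS = {
    1: (2.5, 8 / 10),      # (GT/s per lane, encoding efficiency)
    2: (5.0, 8 / 10),
    3: (8.0, 128 / 130),
    4: (16.0, 128 / 130),
}

def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a given link."""
    rate_gt, efficiency = PCIE_GENS[gen]
    return rate_gt * efficiency * lanes / 8  # bits per transfer -> bytes

if __name__ == "__main__":
    for gen in PCIE_GENS:
        for lanes in (1, 4, 16):
            print(f"PCIe {gen}.0 x{lanes}: {pcie_bandwidth_gbs(gen, lanes):.2f} GB/s")
```

A PCIe 2.0 x1 link works out to about 0.5 GB/s each way and a 3.0 x1 link to about 0.98 GB/s, so there isn't much headroom for anything beyond light work.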
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,486
2,363
136
Not much. I run a x1 video card in my server at 1080p with no problem. That said, if you want hardware decoding for the latest video codecs such as HEVC or VP9, I don't think any current x1 video card supports that, which just means your CPU will have to decode them.
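
If you want to check what hardware decode a given card actually exposes on a Linux box, something like the sketch below works with the VA-API stack (libva-utils installed); it's only an illustration, and NVIDIA's proprietary driver won't show up here since it uses NVDEC instead:

```python
# Minimal sketch: ask VA-API (via the vainfo tool from libva-utils) whether the
# GPU advertises fixed-function HEVC/VP9 decode. Assumes Linux with a VA-API driver.
import subprocess

def vainfo_output() -> str:
    try:
        return subprocess.run(["vainfo"], capture_output=True, text=True).stdout
    except FileNotFoundError:
        return ""  # libva-utils not installed

def has_decode(profile: str, output: str) -> bool:
    # A profile listed with the VLD entrypoint means the driver offers hardware decode.
    return any(profile in line and "VAEntrypointVLD" in line
               for line in output.splitlines())

if __name__ == "__main__":
    out = vainfo_output()
    print("HEVC (H.265) decode:", has_decode("VAProfileHEVCMain", out))
    print("VP9 decode:", has_decode("VAProfileVP9Profile0", out))
```

If neither profile shows up, the player simply falls back to software decode on the CPU, which is exactly the situation with these older x1 cards.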
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,696
136
Obviously, x1 PCI-e cards exist. They're not great gaming cards, given their limited PCI-e bandwidth and the tiny power envelope they have to maintain; in my estimation they're really meant for other things. That said, I can't find any reviews saying how much worse, if at all, they are at, say, decoding YouTube video or just displaying your desktop. Every review is all about gaming.

There is no difference in their ability to handle basic desktop usage (<=1080p) on your monitor. However, the newest PCIe x1 card I've seen is a humble GT 710, which means no hardware H.265 or VP9 decoding, so you'll have to rely on CPU grunt for those.

If only somebody would do a GT 1030-class card (PCIe 3.0 x1, ~985 MB/s), that'd be perfect. But I doubt the potential market is large enough to warrant the investment.
 

PingSpike

Lifer
Feb 25, 2004
21,765
615
126
Thanks for your thoughts. I know mining isn't really affected by running at 1x.

I think the reason there might not be any x1 GT 1030 is that it has a max power of 30 W, versus 19 W for the GT 710/HD 5450. The x1 slot is only supposed to supply 25 W max, IIRC. I actually run a GT 710 on a x16-to-x1 adapter in my server. I found that native x1 versions of cards carry a nasty price premium, so the adapters are the way to go.

That said, the much-maligned DDR4 GT 1030 seems to have a 20 W TDP. Maybe this thing has a reason to exist after all.

One reason I was thinking about this is that I was looking at VM-sharable GPUs. They cost a fortune, require special, expensive licensing, don't work with most hypervisors, and generally seem difficult to set up and use. As stupid as it sounds, it seems like buying a bunch of regular cards and plugging them into a PCI-e splitter would be more versatile and cost-effective. The only real downsides are higher idle power consumption and the mess of cables.
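
For what it's worth, plain passthrough of one cheap card per VM mostly comes down to handing each GPU to vfio-pci. Below is a rough sketch of the sysfs dance, with placeholder PCI addresses; in practice you'd also pass the card's HDMI audio function, check IOMMU groups, and most people just set vfio-pci.ids= on the kernel command line or let libvirt handle it rather than scripting it:

```python
# Rough sketch: rebind a couple of GPUs to vfio-pci so each VM can own one outright.
# The addresses are placeholders for illustration; run as root on the hypervisor.
from pathlib import Path

GPUS_FOR_VMS = ["0000:01:00.0", "0000:02:00.0"]  # adjust to your actual slots

def rebind_to_vfio(addr: str) -> None:
    dev = Path("/sys/bus/pci/devices") / addr
    unbind = dev / "driver" / "unbind"
    if unbind.exists():
        unbind.write_text(addr)  # release from the current driver (nouveau, amdgpu, ...)
    (dev / "driver_override").write_text("vfio-pci")  # force vfio-pci on the next probe
    Path("/sys/bus/pci/drivers_probe").write_text(addr)

if __name__ == "__main__":
    for addr in GPUS_FOR_VMS:
        rebind_to_vfio(addr)
        print(f"{addr} handed to vfio-pci")
```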
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,696
136
I found that native x1 versions of cards carry a nasty price premium, so the adapters are the way to go.

You'd have to include the adapter cost, of course, but other than that, that'd be the way to go. Unless you need the card to be low-profile; any decent non-OEM case should have plenty of space.

One reason I was thinking about this is that I was looking at VM-sharable GPUs. They cost a fortune, require special, expensive licensing, don't work with most hypervisors, and generally seem difficult to set up and use. As stupid as it sounds, it seems like buying a bunch of regular cards and plugging them into a PCI-e splitter would be more versatile and cost-effective. The only real downsides are higher idle power consumption and the mess of cables.

Interesting idea, actually. Some GT 710 / HD (5/6)450 / 230 cards could do, depending on how many VMs you plan to run.

I'd look into getting a disused mining case, without the mining cards themselves. That would solve the cabling issue somewhat. It's not perfect, but it could work well for a budget setup.