Question: So I bought 5x Zotac 2GB GDDR5 GT 730 cards (single-fan, double-width, full-height, with DVI/VGA/HDMI ports). What to do with them?

VirtualLarry

No Lifer
Aug 25, 2001
Bought one used off a user here, and then bought four more off of eBay as "refurb," factory-direct from Zotac USA. ($35 ea., pretty inexpensive for a GPU that can, barely, play GTA V at the right settings, if the YT vids are accurate. At least it's the GDDR5 version, so multi-monitor output shouldn't glitch out when driving 4K screens.)

What can I do with them? They're not half-height, so I can't refurb some OEM SFF with them like I could a GT 1030 2GB GDDR5 card.

I mostly bought them for 4K UHD desktop output duty in Ryzen builds, leaving the GPU upgrade as a sort of "fixer-upper opportunity" for the purchaser, or possibly some return business when they want to upgrade later to play something more than basic e-sports games. Which this card would probably handle. Although it's GDDR5, it's on a 64-bit bus. :|

Honestly, I'm not even sure that they're more capable than a Ryzen 3 3200G APU's graphics. Might be interesting to bench them against each other. (I'm sure that there are some YT vids doing exactly that.)

These are probably slower than a 2GB GTX 1050 GDDR5 card, which has a 128-bit memory bus and 640 CUDA cores. (These GT 730s have 384 Kepler CUDA cores.)

Just wondering if maybe someone has some better ideas for them. (That are "fit for purpose" - NOT skeet targets, although they are probably the right size for them...)
 

VirtualLarry

No Lifer
Aug 25, 2001
Truly amazing.
At the sheer level of...?

Anyways, they were cheapo, and being factory refurbs from the actual company, hopefully very functional. I don't expect much out of them, but... honestly, the 2GB GDDR5 version of the GT 730 can't be all that far behind the GT 1030 for gaming, don't they both have 384 CUDA cores? One's Kepler and one's Pascal, granted. But I doubt that the generation of card matters THAT much when you're talking such a low number of CUDA cores in both, and both have basically the same VRAM config (64-bit 2GB GDDR5). Granted, the GT 1030 cards are also low-profile ready, and these aren't. But no biggie.

I expect that they will play Fortnite to some extent, maybe Apex Legends and whatnot, Roblox, Minecraft (not sure about the VRAM requirements for Minecraft), possibly GTA V, etc.

For $35 I don't expect that much, but I at least got "a deal" on them, considering the cheapest "real" gaming GPU is around $100-150 these days. Strangely enough, though, the price of GT 1030 cards doesn't appear to be inflated at all. Kind of like the world just forgot about them. ("Don't forget about DRE!")
 

Insert_Nickname

Diamond Member
May 6, 2012
honestly, the 2GB GDDR5 version of the GT 730 can't be all that far behind the GT 1030 for gaming, don't they both have 384 CUDA cores?

CUDA cores are only half of the equation for gaming performance; TMUs and ROPs matter quite a lot. The 730 GDDR5 (GK208) has eight fewer TMUs and half the ROPs of the GT 1030 (16/8 vs 24/16). The 1030 also has a major frequency advantage (902 vs 1468 MHz official; in reality many 1030s will hit their max boost of 1771 MHz).

For the main use case here, the GT 730 is further disadvantaged by poor video decode support (H.264 only, no VP9).
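
To put rough numbers on that, here's a back-of-the-envelope sketch from the figures above: theoretical texture rate is TMUs × clock and pixel fill rate is ROPs × clock. The clocks are just the official numbers quoted in the post, so treat the output as ballpark paper specs only.

```c
/* Ballpark fill-rate comparison using the TMU/ROP counts and official clocks
 * quoted above; real cards boost differently, so these are paper numbers. */
#include <stdio.h>

int main(void) {
    /* GT 730 GDDR5 (GK208): 16 TMUs, 8 ROPs, ~0.902 GHz */
    double gt730_tex  = 16 * 0.902;   /* ~14.4 GTexel/s */
    double gt730_pix  =  8 * 0.902;   /*  ~7.2 GPixel/s */

    /* GT 1030 (GP108): 24 TMUs, 16 ROPs, ~1.468 GHz official boost */
    double gt1030_tex = 24 * 1.468;   /* ~35.2 GTexel/s */
    double gt1030_pix = 16 * 1.468;   /* ~23.5 GPixel/s */

    printf("GT 730 : %4.1f GTexel/s, %4.1f GPixel/s\n", gt730_tex,  gt730_pix);
    printf("GT 1030: %4.1f GTexel/s, %4.1f GPixel/s\n", gt1030_tex, gt1030_pix);
    return 0;
}
```

On paper that's roughly 2.5x the texture rate and over 3x the pixel fill rate for the 1030, before even counting the Kepler-vs-Pascal efficiency gap.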
 

Leeea

Diamond Member
Apr 3, 2020
Just wondering if maybe someone has some better ideas for them. (That are "fit for purpose" - NOT skeet targets, although they are probably the right size for them...)

I am going with the skeet target option.

That is a pathetic GPU. You will be lucky to get 20 fps in Fortnite at 1080p. No VP9 support.

Ryzen 3 3200G APU's

The Vega 8 on the 3200G will be over 2x faster than those.
 

NTMBK

Lifer
Nov 14, 2011
These are probably slower than a 2GB GTX 1050 GDDR5 card, which has a 128-bit memory bus and 640 CUDA cores. (These GT 730s have 384 Kepler CUDA cores.)

As a serious answer... The Kepler will be far slower than a Pascal or Maxwell part with the same core count. Kepler needs code to be written in a particular way to allow it to dual-issue instructions at least 50% of the time, or those cores are going to go underutilised. It's a known Achilles heel of the Kepler architecture. They had more backend execution hardware than they had front end to feed it instructions, which is why they dropped the core count per SM from 192 to 128 in Maxwell.

If you carefully unroll the loops in your shader code then you can see some good performance, but on typical code its IPC is going to be nowhere near that of a Maxwell/Pascal GPU.
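
For anyone curious what that unrolling looks like, here's a rough illustration in plain C (not actual shader code, just a sketch of the idea): a single dependent chain gives a dual-issue scheduler nothing extra to pair up, while unrolling with independent accumulators exposes the instruction-level parallelism Kepler needs.

```c
/* Illustrative sketch only, in plain C rather than real shader code. */

/* Naive: every iteration waits on the previous add (one dependent chain). */
float dot_naive(const float *a, const float *b, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; i++)
        sum += a[i] * b[i];
    return sum;
}

/* Unrolled x4: four independent chains the scheduler can issue in parallel. */
float dot_unrolled(const float *a, const float *b, int n) {
    float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
    int i = 0;
    for (; i + 3 < n; i += 4) {
        s0 += a[i]     * b[i];
        s1 += a[i + 1] * b[i + 1];
        s2 += a[i + 2] * b[i + 2];
        s3 += a[i + 3] * b[i + 3];
    }
    for (; i < n; i++)          /* remainder */
        s0 += a[i] * b[i];
    return (s0 + s1) + (s2 + s3);
}
```

Maxwell and Pascal sized the front end so the cores can be kept fed without relying on that kind of ILP, which is why typical, un-tuned code fares so much better on them per core.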

If you look at a review of the 1030, it's closer to a 750 Ti in performance: https://www.tomshardware.com/uk/reviews/nvidia-geforce-gt-1030-2gb,5110.html

I would not recommend these cards for anyone who wants to game. Their best use is as a simple video out for Ryzen machines with no IGP that don't need graphics power. Put them on eBay and sell them as fast as you can before they lose any more value. You don't need them cluttering up your house.
 

SPBHM

Diamond Member
Sep 12, 2012
They are good upgrades for people running old IGPs; can't think of many other things to say. But even then, the 730s are not that new, and they lack things like VP9 decoding, which is important nowadays.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
Speaking of shooting, I used mine for troubleshooting on the test bench. I think it uses 35W? So it was great for OEM systems, since you don't have to worry about power draw issues during the shoot. BTW, Zotac on eBay is where I bought mine too. :D

@Leeea is right on, the 3200G is much faster.

The only good use for it, besides my edge case, is retro gaming. It played FO3 & FO: New Vegas, 2006's Ghost Recon: AW, Thief 2: The Metal Age, Delta Force: Black Hawk Down, and some other old titles perfectly fine, on both Win 7 and 10. Some were played off of optical because I still own the physical media; others were from GOG. I was using it in a Phenom II quad core w/ 4GB RAM. There are also XP drivers for it, if someone wants to try using it with a period-correct OS for a retro build.
 

MalVeauX

Senior member
Dec 19, 2008
Heya,

They may be useful to someone who's into low-power Folding@Home or wants to build a low-cost folding array, especially during a time like this, when GPUs are overpriced and scarce. The low power draw is good, and they do about 31k PPD with Folding@Home. A couple of those doing a work unit each could get into 100k+ PPD territory, so every 10 days that person could be folding a million points or so, for cheap.
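
For anyone who wants to sanity-check that, the math is just cards × PPD × days. A tiny sketch (the 31k PPD figure is from above; the card count is just an example input):

```c
/* Quick sanity check on the PPD math above. 31k PPD per card is the figure
 * quoted in the post; the card count and day count are example inputs. */
#include <stdio.h>

int main(void) {
    double ppd_per_card = 31000.0;
    int    cards        = 4;                  /* a few cards folding at once */
    int    days         = 10;

    double total_ppd = ppd_per_card * cards;  /* ~124,000 PPD */
    double points    = total_ppd * days;      /* ~1.24 million points */

    printf("%d cards -> %.0f PPD -> ~%.2fM points in %d days\n",
           cards, total_ppd, points / 1e6, days);
    return 0;
}
```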

Very best,
 

Magic Carpet

Diamond Member
Oct 2, 2011
You could use one as a dedicated PhysX card if you run AMD as your main GPU. I used one for Borderlands 2 and the performance was fine. You can also hook up extra/spare VGA monitors, if needed, since the power draw is extremely low. And what the heck, it’s still supported in Windows 10 (unlike mobile Kepler), let alone the older systems starting from Windows XP.

They're also a cost-effective replacement for older IGP-based systems, like Ivy Bridge or older. They are pretty much useless for anything above 1080p, however. Still, much better than what the GeForce 520 was at the time.
 

MalVeauX

Senior member
Dec 19, 2008
You could use one as a dedicated PhysX card if you run AMD as your main GPU. I used one for Borderlands 2 and the performance was fine. You can also hook up extra/spare VGA monitors, if needed, since the power draw is extremely low. And what the heck, it’s still supported in Windows 10 (unlike mobile Kepler), let alone the older systems starting from Windows XP.

Is PhysX still a thing, like at all? I haven't seen any benches showing PhysX helping a benchmark in years and years... (genuinely curious)

Very best,
 

Magic Carpet

Diamond Member
Oct 2, 2011
Is PhysX still a thing, like at all? I haven't seen any benches showing PhysX helping a benchmark in years and years... (genuinely curious)
Here's an old article on the subject. Personally, Borderlands 2 is the only PhysX-enabled game I've played, and it saw massive improvements from running a dedicated card for its PhysX work. I played maybe 500 hours of that game, and a GeForce GT 640 (384 cores) was a great help to my Radeon 7870 at the time.

I don't know how the older code deals with newer cards, however. I only briefly played BL2 on my now-sold 2070 Super card. I am getting an RTX 3090 in a week and a half and can let you know then (I still have the GT 640 lol).

 

MalVeauX

Senior member
Dec 19, 2008
Thanks,

I figured it was gone. I know it had its moments on a few titles back in the day, but with today's crazy core/thread counts and monstrous GPUs, I figured PhysX with a dedicated second GPU in the rig was a total thing of the past. Looks like multi-GPU in general is going the way of the dodo... fine by me, though; it requires too much effort for less than double the gain despite more than double the cost.

Very best,
 

VirtualLarry

No Lifer
Aug 25, 2001
Ok, so this Ryzen 5 1600 build that I threw together: B450 mATX board, SATA 240GB SSD, stock 1600 cooler, brand-new Rosewill case plus a 400W (450W?) PSU with PCI-E cable(s). I put in the GT 730 2GB GDDR5 version, which I realize is extremely weak for gaming. I was going to throw in a free BNIB VGA 1600x900 19" LCD monitor (picked up two years ago for $39.99).

1) Will it play Fortnite with a GT 730 2GB?
2) Should I instead put in a GTX 1650 4GB GDDR5 card? I've got an MSI Ventus OC model (GDDR5) that I picked up refurbished from Newegg last year for $130 or so plus tax.

Those cards are going for like $300 these days.

I already offered the PC, with the GT 730 2GB card, for free to someone. I was planning on offering them a better GPU when funds allow for it. But I'm wondering if I should just man up and throw in the GTX 1650 4GB. (I've got two more in a pair of PCs; those will probably come out of those PCs eventually, or possibly, if I build a second mining server, those GTX 1650 cards would serve as its primary GPUs.)
 

MalVeauX

Senior member
Dec 19, 2008
While Fortnite will run on a GT 730 between 720p and 1080p, it will chug in congested areas, and in wide-open areas with activity, on anything but basically the lowest settings. It'll work, as long as the recipient doesn't expect console performance everywhere. Ultimately, if they are looking to really play Fortnite online with lots of people and have a smooth experience, the 1650 would make a lot more sense.

Very best,