Question GT 710 ... "Gaming Edition"? Is THAT what we've come to? (2GB GDDR5 version)


VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126

ASUS GeForce GT 710 2GB GDDR5 HDMI VGA DVI Graphics Card (GT710-SL-2GD5-CSM)


Don't get me wrong, it looks pretty slick, but why on earth would they stick 2GB of GDDR5 on this card, except to serve as an (EXTREMELY!) low-end gaming card?

Seems kind of crazy if true, but... that's the most direct answer that I could come up with when I ask myself "why?".

Seeing as how prices on the GT 1030 2GB GDDR5 cards have risen, up to $150 now at NON-scalped prices, elevating them to the top of the pack as newly bona-fide "Gaming" cards, these GT 710 cards seem to fill out the ranks as well, at only $69.99.

Granted, that's actually not a bad price overall for ANY 2GB GDDR5 card. The 2GB GDDR5 single-slot half-height GT 730 cards from MSI, rare but kinda awesome in their own way, were often $70-$100 on their own, even BEFORE the "Great GPU Shortage of 2020-2021".

(Just waiting for the 4GB GDDR5 "Gaming Edition" Radeon 5450 cards... LOL.)
 
Last edited:
  • Like
Reactions: bkiserx7

aleader

Senior member
Oct 28, 2013
502
150
116
Showed her today the cheapest you can pick one up for is about ~$2100.

Did that actually work? I didn't pay much above MSRP for either of my newer cards ($629 USD for an EVGA 3070 Ultra XC3, $492 USD for an ASUS TUF 3060 Ti), but she had a huge issue at even those prices. She couldn't care less when I showed her what they were going for on eBay, or the fact that I sold my 5700 XT last week for $1,000 USD, and my son sold his 1660 Super for almost $400. Still a huge waste of cash in her world. In the end, she isn't going to divorce me for it, so I'll keep buying :D

I have noticed in the last few weeks that I'm not seeing ANY new cards show up in my bookmarked list (Memory Express). I was pretty regularly seeing 1660 Supers and the odd 6800, 6800 XT, and 3070. They'd only last for maybe 30 minutes or so, but lately, nothing.
 
  • Like
Reactions: The red spirit

aleader

Senior member
Oct 28, 2013
502
150
116
A 1060 is still a decent card. It can still play Fortnite at 1080p at high settings at 60 fps, so it's a great add-in card for kids who got cheap Walmart systems for Christmas with integrated graphics.

For sure, there are lots of games in my Steam library that ran at over 100 fps at 1440p with it. It's just surprising, as it's really no good for mining with only 3GB, and I basically got exactly what I paid for it when I bought it new in 2016. I probably could have gotten another $50 or more if I'd tried.
 

YBS1

Golden Member
May 14, 2000
1,945
129
106
Did that actually work?......................Still a huge waste of cash in her world. In the end, she isn't going to divorce me for it, so I'll keep buying :D
Yeah, for the most part. It helps that she knows I sold my 2080 Ti for what it cost me, and his 1070 for more than it originally cost, as an offset. I buy what I want; I just get a small amount of flak for it, as she doesn't seem to care that "faster=better" since the previous parts "work just fine". lol

She's still having trouble with my need for the 5950X since the 3950X is only about a year and a half old.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,482
20,569
146
You guys are hitting on the most important economic principle of the hobby for married hetero men - WAF (wife acceptance factor). Mine only gets spicy when I have too many parts and PCs at once. I am currently doing a spring cleaning to keep the peace.
 
  • Like
Reactions: Insert_Nickname

ultimatebob

Lifer
Jul 1, 2001
25,135
2,445
126
Honestly, my main concern with this hobby is e-waste. If this shortage means that people get more use out of old parts instead of sending them to landfill, that's fine by me.

In this regard, PC gaming is a lot better than it used to be. Back in the late 90's, you were lucky if you could get 2 years out of a graphics card and 4 years out of a new PC. Nowadays, you can get 5 years of life out of a graphics card and 8 years out of a PC, with just some memory and storage upgrades along the way.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
In this regard, PC gaming is a lot better than it used to be. Back in the late 90's, you were lucky if you could get 2 years out of a graphics card and 4 years out of a new PC. Nowadays, you can get 5 years of life out of a graphics card and 8 years out of a PC, with just some memory and storage upgrades along the way.

With potential for even longer in the future. Consoles have stabilized on 8 Zen2 cores for this generation, so anything on that level (6+ core Zen+/2/3, Coffee/Comet/Rocket Lake) should last a very long time indeed.

Now, if we could just get some new graphics cards.
 

killster1

Banned
Mar 15, 2007
6,208
475
126
It's funny how the cards that everyone thought were ripoffs 6 months ago now seem like great deals at their release prices, like the 3090 and 6900XT. I definitely agree with the "buy whatever makes you happy" sentiment, and it's also better to buy a card when you actually have time to enjoy it. I had a lot of gaming free time over the last 6 months that I'm unlikely to have in the future, and got a lot more use out of my card than I normally would.
I dunno who thought they were a rip-off; they're faster than the 2080 Ti and about the same price as it was. I have some X1900 XTX cards on the wall in the garage, and a lot of others like the 270X that I liked looking at on display, but I wonder how much they would go for and how well they would game. Hell, even a few 8800 GTS cards would be better...

But as far as 2GB on the card goes... uhh, you didn't know people use these cards for CAD, CAE, etc.? :)
VirtualLarry said:
Don't get me wrong, it looks pretty slick, but why on earth would they stick 2GB of GDDR5 on this card? [...]
 

CP5670

Diamond Member
Jun 24, 2004
5,511
588
126
Honestly, my main concern with this hobby is e-waste. If this shortage means that people get more use out of old parts instead of sending them to landfill, that's fine by me.

It's not really e-waste if you sell all your old stuff. Someone else gets use out of it, especially since the overall PC gaming market is growing. Many components can still remain useful for 10+ years these days.
 

killster1

Banned
Mar 15, 2007
6,208
475
126
It's not really e-waste if you sell all your old stuff. Someone else gets use out of it, especially since the overall PC gaming market is growing. Many components can still remain useful for 10+ years these days.
100% agree - these things shouldn't be sent to the landfill to begin with; there are precious metals and hazardous materials in them. It would be like throwing car batteries into the dump... wth!
 
  • Like
Reactions: The red spirit

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
I think many people tend to keep the same monitor for a very long time. I have old LCDs from 15 years ago that are still working quite well, for example. Also, I'm not sure these ports are completely gone from the market; some lower-end monitors might still be using them.

This card even offers a VGA port. I do think the digital-only DVI port becomes a little irrelevant when it's so easy to use a passive adapter from HDMI to DVI, and it works perfectly fine (it's how I'm using a DVI monitor right now).

The DVI port with the additional pins for a passive VGA connector is more interesting to have.

It's amazing, the build quality and reliability of electronics from before the global financial crisis. I have Dell monitors from 2003-2005 that still work. My last four $1000+ 4K monitors each lasted between 1 and 2 years before dying. Build quality and reliability are so bad now; no wonder manufacturers only offer 1-year warranties.
 
  • Like
Reactions: The red spirit

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
If I stop using Ferengi math and calculate hours of gaming enjoyment per dollar, even a massively overpriced $1500 RTX 3070 is suddenly a bargain. And though you can spend less for less card, if you are forced to play at low res, low settings, low fps, the enjoyment factor suffers. It is all a value judgement for me, one that exceeds the limits of Ferengi math.

The problem is that being willing to buy a scalped card just gives scalpers more incentive to continue to scalp. My biggest problem with scalping is that it's low effort, low risk, and high reward. It's really the "low effort" part that bugs me. It's not like these people are sitting out in line for 12 hours, in which case you could argue the mark-up is equivalent to the scalper's time. So it makes me ask, "What benefit do these people provide?" None. They create extra, artificial demand, which actually strengthens their own service. In other words, if they make stock worse, then there's more push toward purchasing cards for exorbitant amounts.

Bot makers are just as bad. When someone decides that they need a bot to make a personal purchase (i.e. not intended for scalping), they've essentially made it harder for anyone else to get the item through manual means. So, by promoting automated purchasing methods, bot makers are essentially selling the solution (automated purchasing tools) to the problem (inability to purchase due to inhuman purchasing speeds) that they create. I don't look at this any differently than Ubisoft selling experience boosters in their own games.

Of course, none of this is to suggest that bots/scalpers are the only cause of our woes. There is pretty good demand going on right now, and that wouldn't simply disappear.
 

aleader

Senior member
Oct 28, 2013
502
150
116
Of course, none of this is to suggest that bots/scalpers are the only cause of our woes. There is pretty good demand going on right now, and that wouldn't simply disappear.

The real issue is miners using bots right now to buy ALL of the cards that come available.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
The real issue is miners using bots right now to buy ALL of the cards that come available.

I've been wondering a bit about this... will it naturally go away soon? This thought is based upon how mining booms have a tendency to fizzle out rather quickly. The problem is that as miners add more and more hardware to the mining pool, the difficulty begins to ramp up to match the available hashing capability. So, as an example, rather than a new card making you $8 a day, it may now only make $6 a day. Last time, I believe we saw increased difficulty combined with plateaued (or dropped) prices, which led to a drop in demand.

Now, there is one other factor that I think is important to consider now... the heavy prevalence of Ethereum-based alt-coins. Ethereum-based alt-coins existed during the last boom too, but I'd argue that they're far more prevalent now. In that case, if one coin becomes too difficult (not monetarily worth it), a miner could simply switch to an algorithm/coin with reduced difficulty. So, would this mentality keep people mining for longer?
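The dilution argument above can be sketched with a toy model (my own simplification, not from the thread, with made-up numbers): assuming a roughly constant total daily reward pool, a card's revenue scales with its share of the network hashrate, so revenue per card shrinks as more miners pile in.

```python
# Toy model: revenue per card is proportional to its share of network hashrate.
# All figures below are hypothetical, chosen to mirror the $8/day -> $6/day example.
def daily_revenue(card_hashrate_mhs, network_hashrate_mhs, daily_pool_usd):
    """USD/day a single card earns for its proportional share of rewards."""
    return daily_pool_usd * card_hashrate_mhs / network_hashrate_mhs

CARD_MHS = 60.0        # hypothetical card at 60 MH/s
POOL_USD = 20_000_000  # hypothetical constant daily network-wide payout

before = daily_revenue(CARD_MHS, 150e6, POOL_USD)  # -> 8.0 USD/day
after = daily_revenue(CARD_MHS, 200e6, POOL_USD)   # more hashrate joins -> 6.0 USD/day
print(f"${before:.2f}/day -> ${after:.2f}/day as network hashrate grows")
```

Same card, same coin price, one-third more network hashrate: a quarter less revenue. That's the self-correcting mechanism the last bust followed.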
 
  • Like
Reactions: blckgrffn

aleader

Senior member
Oct 28, 2013
502
150
116
I've been wondering a bit about this... will it naturally go away soon? [...] So, would this mentality keep people mining for longer?

The biggest factor right now, I think, is the price of Ethereum (currently at $2,304 USD and rising). I've seen difficulty increase off and on, but it hasn't been steady. Basically, when I was cashing out over the last few months, as the price rose, I got more cash, pretty consistently. I'm going to hold it now since my cards are paid for, thinking it may go much higher in the next few months.

Here's an interesting article I saw today about bots:

https://www.pcgamer.com/graphics-ca...h-rise-in-bad-bot-activity-might-be-to-blame/
 

taisingera

Golden Member
Dec 27, 2005
1,140
35
91
I blame the pandemic (both increased demand and a cut in production and shipping), mining, rising inflation, and the lack of an iGPU on most Ryzen CPUs for these insane prices.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126
I knew you'd come around in the end ;) Hope it does what you need!
LOL, I hope so too. I don't want to have to use one of my GT 1030 cards; those are too expensive now for non-gaming duty. :p

I just needed a card to do 4K60 over HDMI, since Intel's iGPU (even 10th-Gen) can't do 4K60 over HDMI 1.4 (but Nvidia's Kepler can, for some reason).
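The usual explanation for the "for some reason" is chroma subsampling: Nvidia's drivers let Kepler output 4K60 over HDMI 1.4 as YCbCr 4:2:0, which halves the bits per pixel. A back-of-the-envelope check (my own ~20% blanking estimate; the HDMI 1.4 figures are from its spec sheet: ~10.2 Gbit/s TMDS, ~8.16 Gbit/s of video payload after 8b/10b encoding):

```python
# Rough bandwidth check: does 4K60 fit in HDMI 1.4's video payload?
def data_rate_gbps(width, height, fps, bits_per_pixel, blanking=1.2):
    """Approximate video data rate in Gbit/s, padded ~20% for blanking intervals."""
    return width * height * fps * bits_per_pixel * blanking / 1e9

HDMI_14_PAYLOAD_GBPS = 8.16  # ~10.2 Gbit/s TMDS minus 8b/10b encoding overhead

rgb444 = data_rate_gbps(3840, 2160, 60, 24)  # 8-bit RGB 4:4:4, 24 bits/pixel
ycc420 = data_rate_gbps(3840, 2160, 60, 12)  # 8-bit YCbCr 4:2:0, 12 bits/pixel

print(f"RGB 4:4:4:   {rgb444:.1f} Gbit/s (fits: {rgb444 <= HDMI_14_PAYLOAD_GBPS})")
print(f"YCbCr 4:2:0: {ycc420:.1f} Gbit/s (fits: {ycc420 <= HDMI_14_PAYLOAD_GBPS})")
```

Full 8-bit RGB needs roughly 14 Gbit/s and doesn't fit; 4:2:0 comes in around 7 Gbit/s and does, at the cost of chroma resolution (fine for desktop video, rough for small text).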
 

lyonwonder

Member
Dec 29, 2018
31
23
81
VirtualLarry said:
Don't get me wrong, it looks pretty slick, but why on earth would they stick 2GB of GDDR5 on this card? [...]
According to Wikipedia the options are DDR3 or GDDR5. Just be glad it's not the DDR3 version?

My guess is the GT 710 was switched to GDDR5 since DDR3 has gotten harder to come by. And I doubt Kepler supports DDR4 as used by some GT 1030s.
 
  • Like
Reactions: The red spirit

VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126
Well, THAT was disappointing.

I had the chance to work on an older Sandy Bridge Pentium G630 (?) PC, 2x4GB DDR3, Gateway SX-something (285x?) slim PC.

The previous config that I sold it in, had a PNY GT430 128-bit DDR3 card, with LP brackets.

I swapped out the HDD for a 480GB Acer SATA SSD, and then removed the PNY GT430, and put the LP bracket(s) onto the GT710 GDDR5 version card, and then slotted it in.

I can't get any video output?!? I tried the HDMI: no BIOS screen. I tried the DVI-D with a DVI-to-HDMI cable and a 7" mini-LCD with HDMI inputs that I previously had working with my server PC (with onboard Intel DVI-D output). Nothing!

I tried removing the VGA header. Nothing.

Not sure what's going on here. The card initializes, and makes proper boot beeps, I can hit the keys to enter BIOS, hit ESC and ENTER, and it beeps not too long after that as it reboots out of the BIOS screen. But it displays nothing to the monitor.

I have yet to try the card in a more modern PC. I don't know if the card's outputs are defective, or what. (Or if the +5V line on the PSU in the slim desktop isn't working? Surely, it wouldn't boot if that were the case?)

Is there a chance that "modern" low-end Nvidia-chipset cards are no longer compatible with "Legacy" mainboards? This is frustrating. I don't want to put the PNY card back in, as it's Fermi, unsupported by current drivers, and I don't know how much life the fan has left in it. (These parts are all fairly old.)

The Kepler GT710 GDDR5 card was passively-cooled, which would have lasted basically forever, and the SSD should last a long time, which basically leaves the PSU and PSU fan as the most likely parts to fail in time.

The other possibility is that it's not making full PCI-E slot contact somehow, but it seems like it's plugged in OK. I took it out, adjusted the rear bracket, slid it as far up as I could, and tightened it back down, so that it could plug as far into the socket as possible. I don't see any gold fingers or "tilt" to the card when it's plugged in.

I also can't get a display out of the onboard, when the card is plugged in. So it does seem to be detecting it. Just no visual output. (I do get a display and POST with the onboard, without the GT710 plugged in. Also, I believe, from the beeps and keyboard, that it is indeed POSTing with it plugged in as well, but just no video output.)


Could this be a PCI-E slot power spec issue? I know that PCI-E x16 is supposed to provide 75W, but in this PC it may only be 25W. Or I might be wrong about that - how would a GT430 function on 25W?
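For reference (a sketch using approximate public TDP figures, not measurements from this thread): per the PCI-E CEM spec, an x16 add-in card is limited to 25 W at power-up and may only draw the full 75 W after being configured as a high-power card. At POST a card idles well below its TDP, which is likely why even a ~49 W GT 430 boots fine in a slot that's stingy with power.

```python
# Slot power sanity check. Limits are from the PCI-E CEM spec; the TDPs
# are approximate public spec-sheet figures, not measured values.
SLOT_CONFIG_W = 25   # pre-configuration limit for an x16 card
SLOT_MAX_W = 75      # after the card is configured as high-power

card_tdp_w = {
    "GT 430":  49,
    "GT 710":  19,   # passively cooled, well under even the 25 W boot limit
    "GT 1030": 30,
}

for name, tdp in card_tdp_w.items():
    verdict = "within slot budget" if tdp <= SLOT_MAX_W else "needs aux power"
    print(f"{name}: ~{tdp} W TDP -> {verdict}")
```

All three cards fit comfortably inside 75 W, so a plain power-budget shortfall seems unlikely to explain the no-display symptom here.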
 
Last edited:

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,482
20,569
146
Confirm the 710 works in another system first. If the card checks out, try manually selecting PCIE to initialize first in the BIOS, using the iGPU. Turn it off, put the card in, try it again. If you still can't get a display, it's probably a firmware issue, as you suspected. And in case you have not read it yet, Nvidia is about to end support for the 600 & 700 series. I don't think that includes Maxwell-based cards, but I am not certain.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126
Confirm the 710 works in another system first.
That's next on my list. I may have to swap brackets again, in order to do that.

I've never gotten an out-of-the-box failure on a video card before, that I can remember offhand. But "they say" the overall industry component shortage has caused suppliers to use lower-grade components in products.

One other thing to note: I installed 21H1 off a USB 3.0 drive, but it wouldn't recognize or boot from the USB port cluster at the end of the I/O area near the video card.
Maybe there's a +5V trace that got scratched and cut somewhere near there on the board? I'd be more willing to believe BIOS incompatibilities than that, though.