Modern, but ultra-low-end GPU. For mostly video watching. Does it exist?


NTMBK

Lifer
Nov 14, 2011
10,400
5,635
136
And now there's a half-height, single-slot 1030 too!

[Image: EVGA GT 1030 half-height, single-slot card]


http://techreport.com/news/31922/evga-gt-1030-cards-arrive-in-three-flavors
 
  • Like
Reactions: nathanddrews

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,202
126
How does the GT 1030 stack up against the GTX 750 Ti in performance? The 750 Ti obviously takes more wattage and, I think, has a 128-bit memory bus. So I'm guessing the 750 Ti is faster?
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
That half-height, low-profile card looks like a winner. Perfectly capable of going into any system.

Performance looks to be between a 750 and a 750 Ti, but with much lower power usage, modern I/O, and modern API and codec support.
 

nvgpu

Senior member
Sep 12, 2014
629
202
81

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,202
126
Other vendors only offer cards with a legacy DVI-D port, and since Nvidia only integrated single-link DVI in the GP108 GPU to save on die size and cost, it maxes out at 1920x1080/1200.
Given the popularity of those 2560x1440 "Korean monitors" in the recent past, I think it was a mistake to put only single-link DVI-D on those cards. If you're going to go DVI-D only, with no analog, you might as well go all the way and make it dual-link.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
Not a super thorough analysis, but he promises to do more comparisons against the RX 550 soon. Seems to be what one would expect - somewhere between 750 and 750Ti performance, priced a bit too high for what it is, but with solid esports performance. NVIDIA's DX11 drivers have probably never been more important:
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
I was so sure that I saw someone launch a single-slot passive HHHL GT 1030, but either I was mistaken or that news post disappeared in a puff of smoke. Can't find it now, that's for sure.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
How does the GT 1030 stack up against the GTX 750 Ti in performance? The 750 Ti obviously takes more wattage and, I think, has a 128-bit memory bus. So I'm guessing the 750 Ti is faster?


^^^^ GT 1030 looks to be somewhere around 80% to 85% of a GTX 750 Ti.

Incidentally, a GTX 750 (non-Ti) has 80% of the GPU core of the GTX 750 Ti (512sp vs. 640sp, both at 1020 MHz/1085 MHz), but only 1GB of VRAM rather than 2GB.
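For what it's worth, a rough back-of-the-envelope check of where that 80% comes from, using only the shader counts and boost clock quoted above (theoretical FP32 throughput; actual game scaling will of course differ):

```python
# Rough theoretical FP32 throughput: shaders x 2 FLOPs/clock x boost clock.
# Numbers are the ones quoted in the post above; purely illustrative.

def fp32_gflops(shaders, boost_mhz):
    return shaders * 2 * boost_mhz / 1000

gtx_750    = fp32_gflops(512, 1085)   # ~1111 GFLOPS
gtx_750_ti = fp32_gflops(640, 1085)   # ~1389 GFLOPS

print(f"GTX 750 / GTX 750 Ti = {gtx_750 / gtx_750_ti:.0%}")  # 80%
```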
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
Edit: And if NV makes one, will it support VGA at all? There are lots of lower-end rigs out there with VGA-only monitors that could use an upgrade for video codec support, to keep their PCs relevant in the "Rich Media Age" of pervasive online video. Think of all of the Core2 rigs out there, some using GMA3100/4500.

Now that we know GP108 only supports two display outputs, I could imagine some company coming out with a GT 1030 supporting at least four display outputs.

So maybe one HDMI, two DisplayPort, and one dual-link DVI-I? (This on a low-profile, dual-slot bracket.)
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,695
136
Now that we know GP108 only supports two display outputs, I could imagine some company coming out with a GT 1030 supporting at least four display outputs.

This is what you want for maximising display outputs on a low-profile bracket. Sure, it is a bit more expensive, but you get 4(!) DisplayPort outputs. There is also the P400 which "only" has 3 DisplayPorts.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,202
126
That is the biggest GT 1030 made so far. (Full-size, dual-slot, and dual fans... all this for a 30W GPU.)
Those two fans probably account for a not-insignificant portion of the wattage that GPU draws. Or at least, I wouldn't be too surprised if that were the case.

I think that card exists mostly for marketing reasons; bigger == better, some people won't buy anything less than a dual-fan card, etc.

Strangely, that card probably has better cooling than my 225W max-TDP HIS HD 7950 3GB card(s), which I sold to a fellow AT'er for cheap. The HIS cards only had a single fan.

Edit: Still, if that dual-fan card was cheap enough, and I had enough room in the PC, I'd probably buy it. I had good luck with my dual-fan Gigabyte Windforce GTX 460 1GB cards; when one fan started failing, the other one took up the slack. Then again, it would be nice if fans lasted longer than three years on a video card.

My current gaming-class card is an XFX RX 470 4GB model, with "hard-swap fans", which should make it easier to replace the fans in the future, should they fail.
 
  • Like
Reactions: cbn

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
That's why I prefer passively cooled cards in the non-performance segment. Fewer things to go wrong.
Don't beefy coolers with "zero RPM" modes make that point moot? If the fans only turn on while gaming, they're far less likely to fail, after all.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,695
136
Don't beefy coolers with "zero RPM" modes make that point moot? If the fans only turn on while gaming, they're far less likely to fail, after all.

Actually, it's the other way round. Frequent start-stop cycles can cause bearings to wear out more quickly. Fans are best run continuously, with a low-RPM setting if necessary.
 
  • Like
Reactions: cbn

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
Actually, it's the other way round. Frequent start-stop cycles can cause bearings to wear out more quickly. Fans are best run continuously, with a low-RPM setting if necessary.
But will you see frequent start-stop cycles if the heatsink is sufficient to keep the GPU beneath the fan turn-on temperature when idle, watching video, or during general desktop usage? Isn't that pretty much the whole point of those modes? As in: gaming = fan on continuously, everything else = fan off all the time (outside of extreme edge cases).
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,695
136
But will you see frequent start-stop cycles if the heatsink is sufficient to keep the GPU beneath the fan turn-on temperature when idle, watching video, or during general desktop usage? Isn't that pretty much the whole point of those modes? As in: gaming = fan on continuously, everything else = fan off all the time (outside of extreme edge cases).

I suppose so. If you have sufficient airflow/cooling capacity in your case to not trigger the GFX fan all the time, it should be good. You could also play with the fan control, but you might lose a bit of turbo boost. It's not like modern GPUs are power hogs in desktop mode, with the exception of Radeons* when decoding video.

It's just a rule of thumb, after all; some fans are designed to handle this. But they're usually the more expensive kind, and you can't always predict wear and tear anyway.

*For some inexplicable reason, AMD has always chosen to run their memory at full speed when decoding. This is fixed with the 500-series by adding a low-power memory mode, BTW.
 
  • Like
Reactions: cbn

cbn

Lifer
Mar 27, 2009
12,968
221
106
This is what you want for maximising display outputs on a low-profile bracket.

I agree (for someone who doesn't want or need DVI-I or native HDMI) that the Quadro P600 (GP107 with 384sp, 128-bit, 40W) is pretty nice.

Interestingly, I noticed the Quadro P1000 (GP107 with 640sp, 128-bit, 47W) also has the same true single-slot, low-profile cooler:

[Image: Quadro P1000 single-slot, low-profile card]




P.S. I found it interesting that none of the low-end Quadros (including the P400) have DDR4 --> http://www.anandtech.com/show/11103...p600-p400-finishing-the-quadro-pascal-refresh

(So for GP107 it's likely GDDR5 is the only choice. I wonder if the same is true for GP108?)
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
Assuming GP108 does have a DDR4 controller, here is what I think the specs would be for a GT 1010:

256sp
64 bit DDR4 2400 (would be nice if it was faster though)
25W

As a side note, 64-bit DDR4-2400 would be like having 3830 MHz memory on a 64-bit bus if comparing to Kepler:

(2400 MHz x 1.2 x 1.33 = 3830 MHz)
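Written out as a quick sketch, with the 1.2x and 1.33x taken as the assumed Kepler->Maxwell and Maxwell->Pascal compression gains from the slides below (marketing averages, not guarantees):

```python
# "Kepler-equivalent" memory data rate estimate from the post above.
# The compression factors are assumptions taken from NVIDIA's slides,
# not measured values, and vary by workload.

ddr4_rate_mts = 2400      # DDR4-2400 data rate, MT/s
maxwell_gain  = 1.20      # assumed Kepler -> Maxwell compression gain
pascal_gain   = 1.33      # assumed Maxwell -> Pascal compression gain

kepler_equivalent = ddr4_rate_mts * maxwell_gain * pascal_gain
print(f"{kepler_equivalent:.0f} MT/s")   # ~3830
```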


[Image: NVIDIA Pascal memory compression slide]

[Image: Pascal bandwidth savings slide]
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
Another thing to consider would be whether Polaris 12 has a DDR4 controller.

A harvested P12 with 128 bit DDR4 2400 should be faster than a GP108 with 64 bit DDR4 2400.
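Raw numbers behind that (a sketch; both DDR4 configurations are hypothetical):

```python
# Peak memory bandwidth = bus width (bytes) x data rate (GT/s).
# Both configs are the hypothetical DDR4-2400 parts from the post above.

def bandwidth_gbs(bus_bits, rate_mts):
    return bus_bits / 8 * rate_mts / 1000

print(bandwidth_gbs(128, 2400))   # harvested P12, 128-bit: 38.4 GB/s
print(bandwidth_gbs(64, 2400))    # GP108, 64-bit: 19.2 GB/s
```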

(The nice thing about P12 is that we do know it supports at least three display outputs, including dual-link DVI-D, in contrast to the two display outputs of GP108.)
 
Last edited:

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
P.S. I found it interesting that none of the low-end Quadros (including the P400) have DDR4 --> http://www.anandtech.com/show/11103...p600-p400-finishing-the-quadro-pascal-refresh

(So for GP107 it's likely GDDR5 is the only choice. I wonder if the same is true for GP108?)
The main reason for using DDR3 previously was cost, right? Is slow GDDR5 that much more expensive than DDR4 at all (especially today)? Is the difference enough to make up for the added cost (R&D + die size) of adding a second memory controller? Makes me wonder. I for one would not mourn the death of non-G DDR RAM on graphics cards.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
The main reason for using DDR3 previously was cost, right? Is slow GDDR5 that much more expensive than DDR4 at all (especially today)? Is the difference enough to make up for the added cost (R&D + die size) of adding a second memory controller? Makes me wonder. I for one would not mourn the death of non-G DDR RAM on graphics cards.

In the past I have seen folks in this forum claim that GDDR5 was twice as expensive as DDR3.

Today, wouldn't the same price difference exist with DDR4? (re: DDR4 is about the same price as DDR3).

Now regarding slow GDDR5: seeing as the RX 550 uses the same 7 Gbps GDDR5 as the RX 560 despite having half the GPU core (512sp vs. 1024sp) on the same 128-bit memory bus, I'll bet slow GDDR5 isn't that much cheaper (or even cheaper at all); otherwise the RX 550 would surely have used slow GDDR5, right? In fact, I'm even wondering if 6 Gbps GDDR5 costs the same as 7 Gbps GDDR5 (but uses less power).

SIDE NOTE: A harvested Polaris 12 with 384sp and 128-bit DDR4 at 2625 MHz would have the same GPU-to-bandwidth ratio as a fully enabled Polaris 11 (i.e., 1024sp) with 128-bit GDDR5 at 7000 MHz.

So DDR4 (at a speed of 2400 MHz) could definitely work for Polaris 12. In fact, that is the only reason I can think of for AMD giving it a 128-bit bus. (Otherwise, wouldn't it have made more sense to use a 64-bit bus like Nvidia did?)
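A quick sanity check of that ratio claim (remember the harvested 384sp P12 and the DDR4-2625 speed are hypothetical):

```python
# Shader-to-bandwidth ratio: hypothetical harvested Polaris 12 vs. a fully
# enabled Polaris 11, using the figures from the SIDE NOTE above.

def gbs(bus_bits, rate_mts):
    """Memory bandwidth in GB/s from bus width (bits) and data rate (MT/s)."""
    return bus_bits / 8 * rate_mts / 1000

p11_ratio = 1024 / gbs(128, 7000)   # ~9.14 sp per GB/s
p12_ratio = 384  / gbs(128, 2625)   # ~9.14 sp per GB/s

print(f"Polaris 11: {p11_ratio:.2f} sp/(GB/s), Polaris 12: {p12_ratio:.2f} sp/(GB/s)")
```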
 
Last edited:

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
You might be on to something, but remember that AMD traditionally uses wider memory buses than Nvidia. Also, GDDR5 has been on the market far longer than DDR4, so I assume it would have dropped significantly in price compared to, say, the Kepler (GT 730/710) days. I don't have any numbers to back that up, though, and Dramexchange unfortunately doesn't list GDDR prices that I can see.