Modern, but ultra-low-end GPU. For mostly video watching. Does it exist?


cbn

Lifer
Mar 27, 2009
12,968
221
106
Huh? I was referencing the more recent offerings, such as the Kaby Lake series (or the Iris Pro series). Yeah, the older ones were terrible, and the newer ones are better. Future ones will be better than these, and will creep closer to the low-end dGPUs. That's all I was saying.

Yes, I did realize that.

I just brought that up because I see a huge gain for older iGPU platforms that still have a relevant CPU, like the Core i5-2400. If you think about it, those machines are still really common. In fact, I wonder if there are more of them around than there are Haswell or Skylake machines?

With that noted, how do you feel about "light gaming" being a moving target?

So while the iGPUs (and the newest lowest-end dGPUs) get better....so do the graphics on "light games".
 
Last edited:

[DHT]Osiris

Lifer
Dec 15, 2015
17,165
16,308
146
I brought that up because I see a huge gain for older iGPU platforms that still have a relevant CPU, like the Core i5-2400. If you think about it, those machines are still really common. In fact, I wonder if there are more of them around than there are Haswell or Skylake machines?

With that noted, how do you feel about "light gaming" being a moving target?

So while the iGPUs (and the newest lowest-end dGPUs) get better....so do the graphics on "light games".

Well, the original topic was concerning tablet/surface/laptop gaming, where you're basically throwing the old one away when it can't perform, so a dGPU offering for an older CPU really doesn't help you, only the dGPU that comes in the unit from the factory.

Yeah, light gaming is definitely a moving target, and screen resolution sizes tend to increase as opposed to staying static (it's easier to run a 4k desktop with an iGPU than to game at that). iGPU has vastly outpaced the gaming minimum increases though, especially for games that don't focus specifically on graphics (see Overwatch, like 95% of the games on Steam). I know the upward trajectory of iGPU advances can't continue forever but I'm predicting it'll devour the 'NV 950/1050 & AMD 460' level of dGPUs before it plateaus.
 

MarkizSchnitzel

Senior member
Nov 10, 2013
466
106
116
Sure, but the gap is still very wide. If I could get something like Iris to replace a low-end dGPU for the same price, sure. But there isn't one; OEMs are just not using those :(

Sent from my ONEPLUS A3003 using Tapatalk
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Well, the original topic was concerning tablet/surface/laptop gaming, where you're basically throwing the old one away when it can't perform, so a dGPU offering for an older CPU really doesn't help you, only the dGPU that comes in the unit from the factory.

Yep, that is true (based on current designs).

P.S. Laptops used to use a mobile dGPU form factor called MXM which fixed that issue. However, currently the only product that I know of that uses that dGPU form factor is an upcoming ASRock desktop (see link below).

http://www.anandtech.com/show/11052...x-using-microstx-motherboard-with-mxm-support

[Image: ASRock Z270M-STX MXM motherboard, angled view]

[Image: ASRock Z270M-STX MXM specifications]


Yeah, light gaming is definitely a moving target, and screen resolution sizes tend to increase as opposed to staying static (it's easier to run a 4k desktop with an iGPU than to game at that). iGPU has vastly outpaced the gaming minimum increases though, especially for games that don't focus specifically on graphics (see Overwatch, like 95% of the games on Steam). I know the upward trajectory of iGPU advances can't continue forever but I'm predicting it'll devour the 'NV 950/1050 & AMD 460' level of dGPUs before it plateaus.

On the Intel side (for mainstream desktop) it might be a while, as we should have 14nm (which would include Coffee Lake's 14nm++) for at least another 2.5 years.......and by that time who knows what GPU tech and process node AMD and Nvidia will be using.
 
Last edited:
  • Like
Reactions: [DHT]Osiris

walk2k

Member
Feb 11, 2006
157
2
81
I lol'd. But it's true. VGA is so smeary and gross
I challenge anyone to tell the difference between analog (VGA) and digital on a ~19" ~1280x1024 monitor.
Hell I bet anyone here $1000 they can't consistently tell the difference, like significantly higher than 50/50 coin tossing.
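For a rough sense of what "significantly higher than 50/50" would actually take, here is a minimal sketch of the binomial math; the 20-trial count and 5% significance cutoff are assumptions for illustration, not terms of the bet:

```python
# Rough sketch: how many correct calls out of N blind A/B trials would beat
# "50/50 coin tossing" at a 5% significance level. N = 20 and alpha = 0.05
# are illustrative assumptions, not terms of the actual bet.
from math import comb

def binomial_tail(n, k):
    """P(X >= k) for X ~ Binomial(n, 0.5): the chance of k or more lucky guesses."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

n, alpha = 20, 0.05
for k in range(n + 1):
    if binomial_tail(n, k) <= alpha:
        print(f"Need at least {k}/{n} correct to beat pure guessing (p <= {alpha})")
        break
```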

It's not until you start going much larger/higher resolution that analog gives out.

Plenty of cards on eBay will fit your requirements, at least at lower resolutions; 4K60 may be tough...

Several posters above listed cards with only DVI-D (digital only, no analog/VGA possible).
 
Last edited:

richaron

Golden Member
Mar 27, 2012
1,357
329
136
I challenge anyone to tell the difference between analog (VGA) and digital on a ~19" ~1280x1024 monitor.
Hell I bet anyone here $1000 they can't consistently tell the difference, like significantly higher than 50/50 coin tossing.

It's not until you start going much larger/higher resolution that analog gives out.

That said, 4K60 video over VGA? Not likely, though technically a card with a 400 MHz RAMDAC should support QXGA (2048×1536).

Plenty of cards on eBay will fit your other requirements though, at 1080p or so...

Several posters above listed cards with only DVI-D (digital only, no analog/VGA possible). You can tell the difference, the hybrid analog/digital DVI-I connector has 4 extra pins around the large dash/slot piece.

Dude, maybe you live in a world of premium cables, because your assumptions are just wrong.

For the vast majority of VGA cables there is significant electromagnetic interference, and resolution has little to do with it. Power from the wall comes in at roughly the same frequency most monitors refresh at, so all of your electronic devices contribute to the interference. And there are many (most) cables out there with inadequate shielding. I've noticed and fixed analog interference for well over a decade with cables of varying quality, and it's obvious even on monitors smaller than your "challenge" 19" 1280x1024. It still happens today; just this Christmas, which I spent with my cousins, I traced serious image-quality issues on a monitor someone got as a present back to a very long, poorly shielded VGA cable.

Even with short, good-quality shielded VGA cables you can assume there will be some image degradation. And image quality goes downhill fast as the cable gets longer or cheaper.
 

walk2k

Member
Feb 11, 2006
157
2
81
Use decent cables with ferrite cores on each end, and don't put your computer next to a power transformer? You'll be fine with analog, up to a certain point like I said... VGA is going to be hard-pressed to get you 4K. Tops is probably around 2048x1536.
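As a rough sanity check on that ceiling, here is a back-of-envelope pixel-clock sketch; the 400 MHz RAMDAC figure is the one mentioned earlier in the thread, while the ~25% blanking overhead is an assumed, typical value for classic analog timings:

```python
# Back-of-envelope pixel clock needed for a given mode vs. a 400 MHz RAMDAC.
# The 25% blanking overhead is an assumed, typical figure for classic
# GTF/CVT analog timings; reduced-blanking modes need less.
RAMDAC_MHZ = 400
BLANKING_OVERHEAD = 1.25  # total (visible + blanking) pixels ~= 1.25x visible

def pixel_clock_mhz(width, height, refresh_hz, overhead=BLANKING_OVERHEAD):
    """Approximate pixel clock in MHz for an analog (VGA) display mode."""
    return width * height * refresh_hz * overhead / 1e6

for w, h in [(1280, 1024), (2048, 1536), (3840, 2160)]:
    clk = pixel_clock_mhz(w, h, 60)
    verdict = "OK" if clk <= RAMDAC_MHZ else "too fast"
    print(f"{w}x{h}@60: ~{clk:.0f} MHz vs {RAMDAC_MHZ} MHz RAMDAC -> {verdict}")
```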
 
Last edited:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I challenge anyone to tell the difference between analog (VGA) and digital on a ~19" ~1280x1024 monitor.
Hell I bet anyone here $1000 they can't consistently tell the difference, like significantly higher than 50/50 coin tossing.

It's not until you start going much larger/higher resolution that analog gives out.

Plenty of cards on eBay will fit your requirements, at least at lower resolutions; 4K60 may be tough...

Several posters above listed cards with only DVI-D (digital only, no analog/VGA possible).

I had a 1280x1024 monitor that looked like garbage and I couldn't figure out why; I changed it from VGA to DVI and it looked a thousand times better.
 

bigboxes

Lifer
Apr 6, 2002
41,829
12,341
146
If you have a quality cable and a decent graphics chip then VGA will be just fine. DVI is an improvement though. There's no reason for VGA in this day and age.
 

daxzy

Senior member
Dec 22, 2013
393
77
101
If you have a quality cable and a decent graphics chip then VGA will be just fine. DVI is an improvement though. There's no reason for VGA in this day and age.

VGA is analog. It is quite useful for servers or as a debug mechanism.

Personally, I think it's easy to tell the difference between VGA and DVI at around 1280x1024 and above.
 

PingSpike

Lifer
Feb 25, 2004
21,756
600
126
I have two 17" Dell LCD 1280 x 1024 monitors.....one has DVI + VGA and the other has VGA. I haven't noticed a difference between the two in terms of picture when I use DVI in the first one and VGA in the second one.

However, I think I remember something about LCD manufacturers skimping on the chip used for VGA in some DVI monitors. So if using VGA on a DVI monitor it might be the picture is worse than VGA on a dedicated VGA LCD monitor.

I use the in monitor switch on mine, basically because I couldn't get any DVI switches to work without artifacting (WTF on that) and despite the mantra being how crummy VGA is, I can't actually tell the different between it and DVI. It probably depends upon the cable, video card and monitor.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
(snip)

EDIT: I will also point out that late Core 2 machines onward typically have DisplayPort and VGA, whereas early Core 2 machines usually had only VGA. So if the desktop is late enough, there is always the possibility of going dual monitor via the iGPU*. That's provided the second monitor has DisplayPort, or the user has a passive DisplayPort-to-HDMI or DVI adapter. Fortunately these passive adapters aren't too expensive.

(As a correction to what I wrote above)

I just noticed the Dell OptiPlex 380 (a late Core 2 machine that supports DDR3) has only a single VGA output (no DisplayPort). So, in this case, a late Core 2 machine has the same video output as the earlier ones.

This is in contrast to the Dell OptiPlex 780 (a higher-end Core 2 machine that also supports DDR3), which has HDMI and DisplayPort in addition to VGA.
 

Blue_Max

Diamond Member
Jul 7, 2011
4,223
153
106
Yep, that is true (based on current designs).

P.S. Laptops used to use a mobile dGPU form factor called MXM which fixed that issue. However, currently the only product that I know of that uses that dGPU form factor is an upcoming ASRock desktop (see link below).

http://www.anandtech.com/show/11052...x-using-microstx-motherboard-with-mxm-support

On the Intel side (for mainstream desktop) it might be a while, as we should have 14nm (which would include Coffee Lake's 14nm++) for at least another 2.5 years.......and by that time who knows what GPU tech and process node AMD and Nvidia will be using.

Very cool! Reminds me of the HP 8x00 "Elite" USFF systems that also used MXM modules, but anything that requires active cooling would take some extra engineering. ;)
[Image: HP 8200 Elite USDT PC]
 

piasabird

Lifer
Feb 6, 2002
17,168
60
91
There are a lot of motherboards which still have a VGA out as an option along with DVI and HDMI. My GA-H110N motherboard is like that. Not exactly a video card or a gaming-oriented piece of hardware. I use an i3-6100 on mine.
 

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
Yep, that is true (based on current designs).

P.S. Laptops used to use a mobile dGPU form factor called MXM which fixed that issue. However, currently the only product that I know of that uses that dGPU form factor is an upcoming ASRock desktop (see link below).

http://www.anandtech.com/show/11052...x-using-microstx-motherboard-with-mxm-support

[Image: ASRock Z270M-STX MXM motherboard, angled view]

[Image: ASRock Z270M-STX MXM specifications]

The thing is that using MXM will add cost to a laptop design vs just soldering it down to the motherboard. It's bigger, you need to pay extra for the socket, the wiring is more difficult, the cooling is more difficult... the only good reason to do it is if you are intending to offer the same laptop with multiple different models of graphics card.

It's also phenomenally difficult (and expensive) to find an MXM GPU as a consumer, and upgrading one is a lot more complex than dropping in a new PCIe card. You have to dismantle practically your entire laptop to get it out. I think that most people aren't interested in that kind of hassle!
 

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
VGA is analog. It is quite useful for servers or as a debug mechanism.

Wait, how on earth does that make sense? Your PC is generating a digital image, that then needs to be converted to analogue to pass over the VGA cable, and then needs to be converted back from analogue to digital so that the LCD monitor can display it. You're adding more complexity into the display chain, and hence more potential points of failure. How does that make it better for debugging?
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
It's also phenomenally difficult (and expensive) to find an MXM GPU as a consumer, and upgrading one is a lot more complex than dropping in a new PCIe card. You have to dismantle practically your entire laptop to get it out. I think that most people aren't interested in that kind of hassle!

Here is a link to dismantling a Dell Precision M6600 Workstation:

https://www.youtube.com/watch?v=ZAyPtXDJ4BI

It doesn't look that bad really.....and the CPU in them is still quite good for modern AAA gaming.

Core i7-2920XM (According to CPU world this 4C/8T Sandy Bridge CPU has a 3.2 GHz turbo on 3 or 4 cores, 3.4 GHz turbo on 2 cores and 3.5 GHz turbo on 1 core)

Core i7-2960XM (According to CPU world this 4C/8T Sandy Bridge CPU has a 3.5 GHz turbo on 3 or 4 cores, 3.6 GHz turbo on 2 cores and 3.7 GHz turbo on 1 core)

With that mentioned, the CPU in the M6700 is even better:

Core i7-3940XM (According to CPU world this 4C/8T Ivy bridge CPU has a 3.7 GHz turbo on 3 or 4 cores, 3.8 GHz turbo on 2 cores and 3.9 GHz turbo on 1 core)
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
But where is the GTX 1040, in a single-slot, passive, low-profile-ready card, 25W, capable of 4K60 desktop display and video-decoding for all major current codecs? Where is it? Why is my only video-card choice around $30 for display-only the GT710?

Two of them coming in the form of GT 1030s:

[Images: two low-profile, passively cooled GT 1030 cards]


These cards are basically half of a GTX 1050 Ti and the first low-profile passive cards I have seen with GDDR5. The XFX R7 240 low-profile passive card had 128-bit DDR3-1600, while the various low-profile passive Kepler GK208 cards (GT 630, GT 720, GT 710) had either 64-bit DDR3-1600 or 64-bit DDR3-1800.
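For perspective on those memory configurations, here is a quick peak-bandwidth calculation; the DDR3 speeds are the ones listed above, while the 6000 MT/s GDDR5 rate for the GT 1030 is an assumption for illustration:

```python
# Peak memory bandwidth = (bus width in bits / 8) bytes per transfer * effective rate.
# The 6000 MT/s GDDR5 rate for the GT 1030 is an assumed figure for
# illustration; the DDR3 configurations are the ones quoted above.
def bandwidth_gb_s(bus_bits, effective_mt_s):
    """Peak bandwidth in GB/s from bus width (bits) and data rate (MT/s)."""
    return bus_bits / 8 * effective_mt_s / 1000

cards = [
    ("GT 1030, 64-bit GDDR5 @ 6000 MT/s (assumed)", 64, 6000),
    ("XFX R7 240, 128-bit DDR3-1600",               128, 1600),
    ("GK208 cards, 64-bit DDR3-1600",               64, 1600),
    ("GK208 cards, 64-bit DDR3-1800",               64, 1800),
]
for name, bits, rate in cards:
    print(f"{name}: ~{bandwidth_gb_s(bits, rate):.1f} GB/s")
```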
 

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
Two of them coming in the form of GT 1030s:

[Images: two low-profile, passively cooled GT 1030 cards]


These cards are basically half of a GTX 1050 Ti and the first low-profile passive cards I have seen with GDDR5. The XFX R7 240 low-profile passive card had 128-bit DDR3-1600, while the various low-profile passive Kepler GK208 cards (GT 630, GT 720, GT 710) had either 64-bit DDR3-1600 or 64-bit DDR3-1800.

With those enormous heatsinks, I don't think that counts as single slot any more ;)
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
With those enormous heatsinks, I don't think that counts as single slot any more ;)

That is a good point, but it's been a long time since I have seen a true single-slot (i.e., single-slot-height cooler) low-profile passive card.

The last one was the PowerColor HD 5450, but even on that one they switched to a taller cooler:

https://forums.anandtech.com/threads/the-entry-level-video-card-hot-deals-thread.2403091/

Today (10/1/2015) I noticed the cooler in the above Newegg listing has changed...

....from this smaller one:

[Image: smaller single-slot cooler]


....to this bigger one:

[Image: bigger, taller cooler]
 

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
That is a good point, but it's been a long time since I have seen a true single-slot (i.e., single-slot-height cooler) low-profile passive card.

The last one was the PowerColor HD 5450, but even on that one they switched to a taller cooler:

https://forums.anandtech.com/threads/the-entry-level-video-card-hot-deals-thread.2403091/

The new single-slot 1050 Ti cards look like "true single slot":

[Image: Inno3D single-slot GTX 1050 Ti]


http://www.inno3d.com/products_detail.php?refid=297

Though given the design of the cooler, you'll still need to put it next to either an empty slot, or vents in your case. Sadly not low profile, but still useful for some applications.
 

shortylickens

No Lifer
Jul 15, 2003
80,287
17,081
136
VGA is not analog.

And I think if you need something really discrete you should get one of those new AMD chips with the onboard graphics.
 

woozle64

Junior Member
Aug 13, 2016
13
3
36
VGA is not analog.
VGA is analog for all intents and purposes. It goes through a DAC and is transmitted as an analog signal, which is then sampled at the monitor.

Sure, the signal is digital at some points along the way (and some forms carry a digital hsync/vsync signal, though not always), but it's silly to say it isn't analog.
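To make that DAC-to-ADC round trip concrete, here is a toy sketch of the chain; the 8-bit quantization and noise amplitude are illustrative assumptions, not measurements of any real cable or monitor:

```python
# Toy model of the VGA chain: digital framebuffer -> DAC -> analog cable
# (additive noise standing in for interference / poor shielding) -> re-sampling
# back to digital at the monitor. The 8-bit depth and noise amplitude are
# illustrative assumptions, not measurements of any real cable or monitor.
import random

def vga_round_trip(pixels, noise_amplitude=0.0, levels=256):
    """Quantize each pixel to a voltage, add cable noise, re-sample at the monitor."""
    out = []
    for code in pixels:
        analog = code / (levels - 1)                                  # DAC: code -> 0..1 "voltage"
        analog += random.uniform(-noise_amplitude, noise_amplitude)   # interference on the cable
        analog = min(max(analog, 0.0), 1.0)
        out.append(round(analog * (levels - 1)))                      # monitor ADC: voltage -> code
    return out

random.seed(1)
ramp = list(range(0, 256, 32))                        # a simple grey ramp
print("clean cable:", vga_round_trip(ramp, 0.0))      # survives the round trip intact
print("noisy cable:", vga_round_trip(ramp, 0.02))     # codes drift by a few steps
```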