Save power by using low-power PCIe instead of IGP?

sheh

Senior member
Jul 25, 2005
247
8
81
I have a Geforce 6100 + nForce 410 mobo (MSI K9NGM). While far fetched, I wonder if I could save power by using a modern low-power PCIe graphics card instead of the IGP. Does it seem likely?

Can BIOSes fully disable the IGP at all? (This one only has an option called "OnChip and PCIe VGA selection" which can be either "Both" or "Auto".)
 

cyrusfox

Member
Jun 12, 2010
91
0
66
It's possible; check your BIOS settings. If you have a card and a Kill A Watt you can try it out yourself. You're talking pennies a year though, so I don't really see the point. At least for me it's $0.12 per kWh.

You could always just build a new platform to play with; the AM1 boards and CPUs look like fun, or try the new Atom for a spin. Either would probably outperform your current platform.
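To put a worked number on the "pennies a year" point, here's a rough sketch of what one continuous watt costs at the $0.12/kWh rate mentioned above (an illustration, not a measurement):

```python
# Annual cost of drawing one extra watt continuously at $0.12/kWh.
rate_usd_per_kwh = 0.12
hours_per_year = 24 * 365                # 8760 hours

cost_per_watt_year = 1 * hours_per_year / 1000 * rate_usd_per_kwh
print(f"1 W around the clock costs about ${cost_per_watt_year:.2f}/year")
# so saving a few watts is on the order of a dollar or two per year
```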
 
Last edited:

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
How much power do you think the 6100 will be pulling with its dual shader pipeline?
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
I have a Geforce 6100 + nForce 410 mobo (MSI K9NGM). While far fetched, I wonder if I could save power by using a modern low-power PCIe graphics card instead of the IGP. Does it seem likely?

No. Compare the size of the heatsink (and fan) on a low-power PCI-E video card, to the size of the heatsink (and fan) on your chipset's northbridge.
 

lyssword

Diamond Member
Dec 15, 2005
5,630
25
91
If you actually want to save money on energy (assuming 24/7 operation), move away from the old 90nm, 90W single-core P4/Athlon to something like a cheap Haswell Celeron. Those will use half the power at load, even less at idle, and be something like 5x faster.
 

coffeejunkee

Golden Member
Jul 31, 2010
1,153
0
0
I doubt it, but even if a discrete card uses less, the difference will be so small that it will take a very long time to recoup even the cheapest card's cost.
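A quick payback sketch makes the point concrete. All the inputs here are assumptions (a $20 used card, an optimistic 3W of round-the-clock savings, $0.12/kWh):

```python
# Hypothetical payback time for a cheap used low-power card.
card_cost_usd = 20.0   # assumed used-card price
watts_saved = 3        # assumed 24/7 savings, optimistic
rate_usd_per_kwh = 0.12

saved_per_year = watts_saved * 8760 / 1000 * rate_usd_per_kwh
payback_years = card_cost_usd / saved_per_year
print(f"${saved_per_year:.2f}/year -> payback in {payback_years:.1f} years")
```

Even under these generous assumptions the card takes over six years to pay for itself.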
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,695
136
If you actually want to save money on energy (assuming 24/7 operation), move away from the old 90nm, 90W single-core P4/Athlon to something like a cheap Haswell Celeron. Those will use half the power at load, even less at idle, and be something like 5x faster.

This^^

You could probably fit a discrete card in there and still use less power than your current system. A Haswell Celeron/Pentium + mATX board should only use 20-25W idle, and perhaps 35-40W fully loaded. Compare that with the 89W TDP of an old 90nm Athlon (or 125W for an X2)...

The improvements in efficiency over the last 8 years are impressive, to put it mildly. :)
 

sheh

Senior member
Jul 25, 2005
247
8
81
cyrusfox, I don't have a card to try it with. But if it's possible I'll look for a suitably low-power one that's most likely to help.

Re HSF of a standalone card, etc., I'm not looking to improve power usage under load, just idle. Power efficiency's gotten more attention in recent years, so if the 6100 happens to waste 5W at idle I might save a few, assuming a standalone card with all its memory and support circuitry can use less.

Yeah, a modern low-power mobo and CPU will be better, but I'm looking to spend just a few dollars on a used graphics card.

The CPU, BTW, is not a 90W TDP part but a 59W Sempron 3000+ (AM2). I have a 65nm 45W part on the way, but I'm a bit doubtful it will help. This CPU seems to use surprisingly little power; I didn't test extensively, but after a short run at 100% it barely reaches 10°C above room temperature, which is maybe 2-3°C above its idle temperature. That's with the stock cooler slowed to 2000 RPM. The temperature reading looked suspicious, but the heatsink wasn't even warm to the touch. I also had the impression AMD CPUs of that era ran hot, but this one definitely doesn't.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
cyrusfox, I don't have a card to try it with. But if it's possible I'll look for a suitably low-power one that's most likely to help.

Re HSF of a standalone card, etc., I'm not looking to improve power usage under load, just idle. Power efficiency's gotten more attention in recent years, so if the 6100 happens to waste 5W at idle I might save a few, assuming a standalone card with all its memory and support circuitry can use less.
Unless you are paying $1 USD per kWh or something insane like that, or live off the grid, I don't see how you could ever recoup your investment in a discrete card.

AMD's newest cards have ZeroCore, but that only applies when the monitor is off (monitor sleep mode).

In short, I think that this idea of yours is a futile exercise. I can't see a discrete card ever taking less power than an IGP, especially at idle.
 
Last edited:

lyssword

Diamond Member
Dec 15, 2005
5,630
25
91
The thing is, the 6100 IGP has so few transistors and runs at such a low frequency (so it's slow as crap, without any modern acceleration features), and, like you mentioned, doesn't need additional RAM chips, that it will be impossible to find any standalone card that uses less.
Here's an old P4 machine with a standalone 6200, which probably uses 2x what the 6100 IGP does. As you can see, the lowly Intel IGP uses the least power. For reference, the X300 uses 7-8W idle while the 6200 uses 10W, and those are probably 2x faster, so it's safe to assume the 6100 must be using below 5W at idle:
http://forums.atomicmpc.com.au/index.php?showtopic=264
And here's another site saying the Intel 900 IGP from the link above uses 2W idle:
http://www.anandtech.com/show/1506/4

According to this, a modern card with super idle power management like the Radeon 7750/6450 has a nice low 6W idle, but that's exactly the same as (or 1W more than) your IGP.
 

tortillasoup

Golden Member
Jan 12, 2011
1,977
4
81
It's really hard to imagine a scenario with the IGP using more power than an add-on card. The only way I can imagine it is if the IGP is basically useless and the add-on card is a low-powered card with lots of video acceleration features that offload work from the CPU. Conceivably that can happen. However, like others have pointed out, you'd probably be better served just getting a newer system off Craigslist, as it will likely have a more power-efficient processor, spend less time at full load, and therefore use less electricity. If you really want to save power, try getting a laptop as a replacement; they're always power efficient. You could even buy a laptop with a broken screen and use it like a desktop computer.
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
Honestly, I think it would take longer than your lifespan to actually save money by doing this, if it saves power at all.

As others have said, buy a more modern rig :)
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
You would be a perfect candidate for a Kabini quad-core, methinks.
http://forums.anandtech.com/showthread.php?t=2377340

If you have access to Newegg.com, check that thread out.

I'm sure that a certain someone will be along to slander the Kabini, and say you should only buy an Intel Haswell Celeron.

Either way, you will save power, and have a competent IGP. (The Kabini has a better IGP, but is slower in single-threaded tasks than the Haswell Celeron.)

Edit: Here's a couple of Haswell combos, if you prefer Intel:
http://www.newegg.com/Product/ComboD...=Combo.1596228
http://www.newegg.com/Product/ComboD...=Combo.1596229

The Kabini boards, with the exception of the two ASRock ITX boards, all have only two SATA ports. If you need more, I recommend the ASRock AM1 boards, or the Haswell board combos above.
 
Last edited:

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,695
136
You would be a perfect candidate for a Kabini quad-core, methinks

That's actually a very good idea. A Kabini + AM1 mainboard is quite cheap, you get a major CPU upgrade thrown in, and the IGP is very competent for everyday usage: YouTube, HD video decoding, Flash (games), etc. If you stick to older titles and generally don't expect too much, you can even game on it.

The Jaguar core has IPC comparable to the older K8 core, and you go from having a single core at 1600MHz to having four cores at 2.05GHz... :)

All while perhaps halving your power consumption.
 

sheh

Senior member
Jul 25, 2005
247
8
81
The monitor IS off most of the time. If a modern graphics card can practically turn off, that's promising. And those tables at AtomicMPC are interesting. :)

A notebook as a low-power PC is something I thought of. But there's not much expandability or repairability, and I'm not sure how well it'd take being on all the time. I suppose they're engineered to be small and light rather than ideally cooled and long lived.
 

tortillasoup

Golden Member
Jan 12, 2011
1,977
4
81
I've had a laptop that was on for years at a time doing Folding@home. The worst part was having to replace the CPU fan, which burned out. Otherwise, if it's on a hard surface, the computer shouldn't have issues cooling itself. If you can set the fan to run minimally and just hard-limit the CPU, it can easily run pretty close to fanless.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,695
136
I suppose they're engineered to be small and light rather than ideally cooled and long lived.

Laptops are NOT designed for 24/7 operation at full load. Period. In fact, a lot of laptops develop severe thermal issues if they run at full load for more than a few hours.
 

tortillasoup

Golden Member
Jan 12, 2011
1,977
4
81
Laptops are NOT designed for 24/7 operation at full load. Period. In fact, a lot of laptops develop severe thermal issues if they run at full load for more than a few hours.

Well, he can get a cheap cooling pad that plugs into USB if it comes to that. It really does depend on the laptop, though.
 

sheh

Senior member
Jul 25, 2005
247
8
81
But there's a difference between 24/7 at load and mostly idle. Though the limited storage flexibility is another negative. Modern low-power/low-cost CPUs and mobos look more attractive overall, and prices are getting quite good as well.

Anyway, if I happen to run across a modern low-power graphics card I'll check what it does to power draw. And I should soon have another CPU to test with (and with CnQ).

BTW, checking the power usage of my notebook, and assuming my power meter is accurate enough at relatively low power, the AC adapter seems quite wasteful. The notebook's nominal rating is 5 hours on a 50Wh battery, which seems close enough to real life, i.e. about 10W average. The AC adapter, on the other hand, draws 16-18W at the wall (at idle, with the screen at about 50% brightness). But this is getting more offtopic. :)
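For what it's worth, here's the sanity check behind that "wasteful" impression. It assumes the idle DC draw matches the battery-runtime average and ignores any battery charging, so it's only a rough bound:

```python
# Figures from the post: 5 h runtime on a 50 Wh battery, 16-18 W at the wall.
battery_wh = 50
runtime_h = 5
dc_draw_w = battery_wh / runtime_h      # ~10 W average drawn from the battery

wall_draw_w = 17                        # midpoint of the 16-18 W meter reading
efficiency = dc_draw_w / wall_draw_w    # implied efficiency, if assumptions hold

print(f"DC draw ~{dc_draw_w:.0f} W, implied adapter-chain efficiency ~{efficiency:.0%}")
```

~60% would indeed be poor for a modern switching adapter, which is why controlling the variables (battery removed, same screen brightness) matters before blaming the adapter.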
 

tortillasoup

Golden Member
Jan 12, 2011
1,977
4
81
Did you check the power consumption with the battery removed? I think the computer, including the power adapter, is a bit more efficient than you think. It's also possible the screen dims when you unplug the power adapter, so the effective power consumption is less. You need to control for a lot of variables before you start blaming things such as the power adapter.

Here's a hint: if your power adapter is about the size of two C-cell batteries stacked on top of each other, or a bit bigger/smaller, yet is rated at 60W, then you've already got an efficient switching power adapter. If it's rated at 60W but its form factor is like 2-3 stacked Galaxy S4s, then it's probably an older, less efficient transformer-based power adapter.