Which consumes less power at idle?

AmongthechosenX

Senior member
Aug 10, 2008
My mom's new rig is a Gigabyte AMD 740G motherboard (SB700 southbridge, 55nm manufacturing process, lower power) and an Athlon X2 3600+, which I intend to undervolt. She's also getting 2 x 2GB of Corsair 5-5-5-15 1.8V RAM installed.

My goal is to get the lowest possible power consumption at idle.

I originally bought her a 3850 for the system, but at the last minute I decided to swap it out for my sister's X1650 Pro and give my sister the 3850 (the other sister; both sisters have a 3850 now, lol).

Which draws less idle power: the Radeon X1650 Pro or the Radeon HD 3850?

On a side note, I've disconnected the fan from the PCB on the X1650 Pro because it is so loud. I'm trying to see if I can strap a silent 80mm fan underneath it, where the shroud is right now, in order to cut back on the noise.

Everything else in the system is almost dead silent. I'm changing out the stock cooler for a Hyper TX2, which is virtually silent to the human ear (the stock cooler has a tiny vibration in it, and I want that gone). All the 80mm fans and the one 120mm intake fan are split off the "smart fan" header on the motherboard, so they're all running at the lowest possible speed according to the motherboard.

I'm hoping this doesn't pull more than 50-58 watts at idle after I undervolt it.

Oh, and it's a 65-watt TDP Athlon X2, not a 95-watt.

We use the PC for COD4 on occasion because we have five COD4-capable computers in the house, so sometimes friends come over and we LAN it up. Otherwise I would use the onboard video of the 740G.
 

dguy6789

Diamond Member
Dec 9, 2002
This is only conjecture, but I would imagine the X1650 uses less power simply because there is a very large performance gap between it and the 3850. Assuming the 3850 doesn't have some really awesome power-saving features, I don't see how a card that is so much more complex could use less power.
 

nosfe

Senior member
Aug 8, 2007
The X1650 Pro, because it draws all its power from the PCIe slot, while the 3850 needs additional power from a PCIe power connector.
 

SlowSpyder

Lifer
Jan 12, 2005
Originally posted by: nosfe
The X1650 Pro, because it draws all its power from the PCIe slot, while the 3850 needs additional power from a PCIe power connector.

I would imagine it only needs that additional power under load. The 3850 is made on a 55nm process; I'm willing to bet it runs pretty cool and low-power at idle. I think the X1650 is 80-90nm. Not sure which one uses less at idle, though...
 

AmongthechosenX

Senior member
Aug 10, 2008
Originally posted by: nosfe
Even so, the 1650 Pro has a die size of 130mm² compared to the 190mm² of the 3850.
Why do I bother? Here:
http://www.techpowerup.com/rev...e/HD_3850_1_GB/21.html
The 512MB version is virtually identical.

That helps me out a lot!

The only gripe I have is when we play COD4: my mom's computer has a Samsung 22-inch LCD, while my sister, who just got the 3850 that was meant for my mom, only has a 15-inch LCD (limited to 1024 x 768). That's a lot of wasted horsepower there, but then again her computer isn't on as much, so I think my mom might be better off with the X1650 Pro.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
Don't forget that power consumption can change depending on the manufacturer and clock speeds. Considering that plus the information from the TechPowerUp link, it looks like the only sure way to know which card consumes less at idle would be to install each card in the system and measure the power draw with a meter. Of course, we're probably only talking a few watts.
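(Quick back-of-the-envelope, assuming roughly a 5W idle difference and the PC running eight hours a day: 5 W x 8 h = 40 Wh per day, or about 15 kWh per year, which works out to only a dollar or two of electricity at typical rates.)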

Originally posted by: AmongthechosenX
The only gripe I have is when we play COD4: my mom's computer has a Samsung 22-inch LCD, while my sister, who just got the 3850 that was meant for my mom, only has a 15-inch LCD (limited to 1024 x 768). That's a lot of wasted horsepower there, but then again her computer isn't on as much, so I think my mom might be better off with the X1650 Pro.

Eh, it's not that much of a waste. At that resolution a 3850 would be able to max out just about every game, especially COD4, while at higher resolutions you would have to drop the IQ to get playable framerates in some games.