According to Samsung's own power-saving projections, their 1.35V modules will save you 5 cents a year worth of electricity per 2GB of RAM... I did the math.
Let's re-do that math :sneaky:
If you save 0.5W per DIMM, that adds up to 2W for four lower-power memory modules.
Depending on where you live, 2W of around-the-clock usage over 365 days costs up to about 4 euro. So you save roughly 4 euro per year; not very significant. The memory may run cooler, though.
For servers it is a different story: with twelve 2U systems in a cabinet and 24 memory modules per server, you have 12 * 24 = 288 memory modules. Assuming the same 0.5W savings per module, that means 144W of power savings.
Those 144W cost about 288 euro per year. Now factor in air-conditioning efficiency: each watt dissipated costs at least another watt to cool away, so the actual draw is about 288W once air conditioning is included, bringing the total cost to roughly 576 euro.
So eventually we pay about 576 euro per year, just for a difference in power consumption of 0.5W per module.
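The estimate above is easy to redo for your own setup. A minimal sketch, assuming a tariff of about 0.23 euro/kWh (the exact price is an assumption; plug in your local rate):

```python
# Rough yearly-cost estimate for a constant power draw, as in the post above.
HOURS_PER_YEAR = 24 * 365        # 8760 hours
PRICE_EUR_PER_KWH = 0.23         # assumed electricity tariff, adjust locally
AC_FACTOR = 2.0                  # each watt costs ~1 extra watt to cool away

def yearly_cost_eur(watts, cooling=1.0):
    """Cost of a constant load running around the clock for one year."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh * PRICE_EUR_PER_KWH * cooling

# Home desktop: 4 DIMMs * 0.5W = 2W saved
print(round(yearly_cost_eur(2), 2))               # ~4 euro per year

# Server cabinet: 288 DIMMs * 0.5W = 144W saved, air conditioning included
print(round(yearly_cost_eur(144, AC_FACTOR), 2))  # ~580 euro per year
```

The small differences from the figures in the post come from rounding the tariff; the shape of the calculation is the same.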
by such a negligible amount that it's not even worth bothering with.
Though home users would find the difference insignificant, it is significant enough to be an important consideration for server systems.
If you are about to build a low-power server that will run all day, and low power consumption/noise is a consideration, then reducing idle power consumption just by choosing different components can be worthwhile. Not so much to save costs, but to create a solution that can be passively cooled without letting temperatures rise too much. Once you get into sub-30W idle power consumption, every watt counts!
By the way, this link shows power savings well over 0.5W, though take it with a grain of salt:
http://www.xbitlabs.com/articles/memory/display/ddr3_13.html