
Cost of distributed computing: over $150 per year per computer!

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Thought I would point out some math to back up my claims that distributed computing is really expensive as far as electricity goes.

Typical dual-core computer power consumption from the wall: 200 watts.

Consumption at max CPU (when doing DC): 250 watts.

Say you use the computer for 5 hours a day of gaming/maxed-CPU scenarios, and 5 hours of general tasks (surfing the web, etc., minimal CPU).
The 5 hours a day of maxed CPU don't count toward anything; that's what you would spend anyway. The 5 hours of general tasks would have taken 200 watts, but instead take 250.
So 50 watts x 5 hours/day = 0.25 kWh/day.
The other 14 hours the computer could have been off, but you are leaving it on to crunch.
250 watts x 14 hours/day = 3.5 kWh/day.
Total kWh/day expended beyond normal usage due to distributed computing = 3.75 kWh/day (the amount is higher the less time you would have spent using the computer...).
3.75 kWh/day x 365 days/year = 1,368.75 kWh/year.
Assuming the absolute lowest-priced electricity in Texas, 12 cents per kWh (the average is 14 plus tax):
1,368.75 kWh/year x $0.12/kWh = $164.25.

If you have a beefier computer, or spend LESS time than I described using the computer yourself every day, then the cost goes UP. If your computer takes less, I would like to hear what it has in it.
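
If you want to rerun the numbers with your own wattages and rate, here's the same math as a quick Python sketch (a rough estimate only; the 200W/250W draws, the hours, and the $0.12/kWh price are just my assumptions above, not measurements):

    # Extra electricity cost of leaving a box on to crunch DC work.
    # All inputs are the assumptions from this post, not measured values.
    IDLE_W = 200        # draw at the wall during light use (web surfing, etc.)
    LOAD_W = 250        # draw at the wall while crunching
    LIGHT_HOURS = 5     # hours/day of light use that now run at full load
    OFF_HOURS = 14      # hours/day the machine would otherwise have been off
    RATE = 0.12         # electricity price in $/kWh

    extra_kwh_per_day = ((LOAD_W - IDLE_W) * LIGHT_HOURS + LOAD_W * OFF_HOURS) / 1000
    extra_cost_per_year = extra_kwh_per_day * 365 * RATE

    print(f"Extra energy: {extra_kwh_per_day:.2f} kWh/day")   # 3.75 kWh/day
    print(f"Extra cost:   ${extra_cost_per_year:.2f}/year")   # $164.25/year

Plug in your own meter readings and local rate and you'll see how fast the number moves.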
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,073
3,576
126
Most DC programs like WCG utilize idle clock cycles.

Meaning they're all loaded, and when you need some clock cycles it temporarily takes them off WCG to do your work.


So your comp really doesn't get any break unless you pause.


And yeah, DC can get costly; you're forgetting the cost of hardware wearing out. I've lost a ton of hardware because it just couldn't take the stress. I'm sure Mark has also.

But this is what we pay for, to see how many points we can rack up per day!

Oh, and for a good cause as well.


Also, F@H can be run on an ATI card. Some F@H freaks will have their GPUs folding, so your figure is probably higher for them.
 

BrownTown

Diamond Member
Dec 1, 2005
5,314
1
0
Yes, distributed computing is a waste of money so far as I am concerned, and if I were a dirty hippie I would probably also note its harm on the environment and contributions to global warming.

Nice threadcrap. If you don't have anything positive to add, stay out of the thread.

esquared
Anandtech Senior Moderator
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,275
16,120
136
Originally posted by: BrownTown
Yes, distributed computing is a waste of money so far as I am concerned, and if I were a dirty hippie I would probably also note its harm on the environment and contributions to global warming.

Folding@home is Stanford University's project to cure cancer (among other diseases). So to stop global warming, you hope we all die from cancer? Who cares if we are all dead.

Great attitude. I hope you are proud of it as you think about this on your deathbed....
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: BrownTown
Yes, distributed computing is a waste of money so far as I am concerned, and if I were a dirty hippie I would probably also note its harm on the environment and contributions to global warming.

While at the same time donating your computer's "spare cycles" (that's like recycling, right? :p) to calculate the negative effects of global warming (the Earth climate project and the BBC Earth climate project)...
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,073
3,576
126
Originally posted by: BrownTown
Yes, distributed computing is a waste of money so far as I am concerned, and if I were a dirty hippie I would probably also note its harm on the environment and contributions to global warming.

I highly recommend you get rid of this post.

I and many others know people who have cancer. This type of project helps work toward a cure, and it does help A LOT. You're spitting in their faces, saying our computers' time and our power money isn't worth jack to them.

Some of my friends have as many as 10-15 machines dedicated to projects. Some people will even take offense at comments like this.

I ask you politely to remove it, and I will remove this post as well.


My "mentor" in quad-core overclocking has around 50-60 rigs and produces almost three-quarters of a million points per day on WCG. If you said something like this to him... :T


I lost 3 boards this year due to stress. I lost 4 RAM sticks due to stress. I ended up water-cooling all my rigs because of stress.

You don't know jack about OCing and hardcore stability until you join these projects and truly stress the living crap out of your hardware. :]
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,275
16,120
136
Originally posted by: aigomorla
Originally posted by: BrownTown
Yes, distributed computing is a waste of money so far as I am concerned, and if I were a dirty hippie I would probably also note its harm on the environment and contributions to global warming.

I highly recommend you get rid of this post.

I and many others know people who have cancer. This type of project helps work toward a cure, and it does help A LOT. You're spitting in their faces, saying our computers' time and our power money isn't worth jack to them.

Some of my friends have as many as 10-15 machines dedicated to projects. Some people will even take offense at comments like this.

I ask you politely to remove it, and I will remove this post as well.


My "mentor" in quad-core overclocking has around 50-60 rigs and produces almost three-quarters of a million points per day on WCG. If you said something like this to him... :T


I lost 3 boards this year due to stress. I lost 4 RAM sticks due to stress. I ended up water-cooling all my rigs because of stress.

You don't know jack about OCing and hardcore stability until you join these projects and truly stress the living crap out of your hardware. :]

And see my post above... This borders on trolling, if nothing else. I hope you get what's coming to you, BrownTown.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,275
16,120
136
To taltamir: in Portland, OR, I think it's less than 6 cents/kWh (where I live; a Google search showed 2007 rates).
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
If I had a solar power system for my house, I would be running it.
I am just pointing out that the costs are MUCH higher than people imagine... I ran it for years assuming it was only a few bucks a year, if anything. I probably cost my parents thousands of dollars looking for little green men.

Those are some sweet power rates in Portland.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Interesting. But things aren't so bad, considering some plasma TV models consume almost 700 watts. And hundreds of thousands of these are used to watch Oprah every day.
 

faheyd

Junior Member
Jan 24, 2008
1
0
0
Originally posted by: taltamir
Thought I would point out some math to back up my claims that distributed computing is really expensive as far as electricity goes.

Typical dual-core computer power consumption from the wall: 200 watts.

Consumption at max CPU (when doing DC): 250 watts.

Say you use the computer for 5 hours a day of gaming/maxed-CPU scenarios, and 5 hours of general tasks (surfing the web, etc., minimal CPU).
The 5 hours a day of maxed CPU don't count toward anything; that's what you would spend anyway. The 5 hours of general tasks would have taken 200 watts, but instead take 250.
So 50 watts x 5 hours/day = 0.25 kWh/day.
The other 14 hours the computer could have been off, but you are leaving it on to crunch.
250 watts x 14 hours/day = 3.5 kWh/day.
Total kWh/day expended beyond normal usage due to distributed computing = 3.75 kWh/day (the amount is higher the less time you would have spent using the computer...).
3.75 kWh/day x 365 days/year = 1,368.75 kWh/year.
Assuming the absolute lowest-priced electricity in Texas, 12 cents per kWh (the average is 14 plus tax):
1,368.75 kWh/year x $0.12/kWh = $164.25.

If you have a beefier computer, or spend LESS time than I described using the computer yourself every day, then the cost goes UP. If your computer takes less, I would like to hear what it has in it.

I just spent 5 minutes of my life registering here, just so I could say, "NO ONE CARES!"


I hope your future posts have more substance than this threadcrap.

esquared
Anandtech Senior Moderator
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: RussianSensation
Interesting. But things aren't so bad, considering some plasma TV models consume almost 700 watts. And hundreds of thousands of these are used to watch Oprah every day.

HOLY SHAZAM SON OF SHAQ!

700 watts for a TV?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: taltamir
Originally posted by: RussianSensation
Interesting. But things aren't so bad, considering some plasma TV models consume almost 700 watts. And hundreds of thousands of these are used to watch Oprah every day.

HOLY SHAZAM SON OF SHAQ!

700 watts for a TV?

Dude, do you have any idea what a hair dryer pulls from the wall? How about the clothes dryer you like to use (or your mom uses for you) to dry your clothes?

Computers are nothing; a 200W computer is exactly the same power drain as two 100W light bulbs. Think about that when you look around and see how many lights you have turned on at any given moment.

Also, I would like to make a comment regarding power consumption estimates. Unless you have a ridiculous GPU that sucks up the juice while idling, no Core 2-based DC computer is going to suck up 250W of power.

I say this because I am looking at 5 quad-core computers sitting right here running 100% load on all cores, and the power consumption (at the wall, what I pay for, per Kill A Watt meter) is 240W per quad-core computer, and that is at 3.3GHz and 4GB of RAM.

If I had to guess, I'd bet most people's dual-core (Core 2-based estimate here) systems are using about 120W at the wall when idling, and when doing CPU-intensive stuff like F@H (but not gaming, i.e. the GPU is idling) the power consumption might get close to 200W.
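
For what it's worth, here is the same back-of-the-envelope math for a box crunching around the clock (a sketch only; the 240W figure is the wall draw I measured above, and the 12 and 6 cents/kWh rates are the ones quoted earlier in the thread):

    # Rough annual electricity cost of a machine crunching 24/7.
    # Wattage and rates are figures mentioned in this thread, not universal numbers.
    def annual_cost(wall_watts, rate_per_kwh, hours_per_day=24.0):
        kwh_per_year = wall_watts / 1000 * hours_per_day * 365
        return kwh_per_year * rate_per_kwh

    print(f"${annual_cost(240, 0.12):.2f}/year")  # one 240W quad at 12 cents/kWh: ~$252
    print(f"${annual_cost(240, 0.06):.2f}/year")  # same box at 6 cents/kWh: ~$126

So a dedicated cruncher runs on the order of $100-$250 a year in electricity alone, depending on your local rate.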
 

sharad

Member
Apr 25, 2004
123
0
0
Originally posted by: Idontcare
Originally posted by: taltamir
Originally posted by: RussianSensation
Interesting. But things aren't so bad, considering some plasma TV models consume almost 700 watts. And hundreds of thousands of these are used to watch Oprah every day.

HOLY SHAZAM SON OF SHAQ!

700 watts for a TV?

Dude, do you have any idea what a hair dryer pulls from the wall? How about the clothes dryer you like to use (or your mom uses for you) to dry your clothes?

Computers are nothing; a 200W computer is exactly the same power drain as two 100W light bulbs. Think about that when you look around and see how many lights you have turned on at any given moment.

Also, I would like to make a comment regarding power consumption estimates. Unless you have a ridiculous GPU that sucks up the juice while idling, no Core 2-based DC computer is going to suck up 250W of power.

I say this because I am looking at 5 quad-core computers sitting right here running 100% load on all cores, and the power consumption (at the wall, what I pay for, per Kill A Watt meter) is 240W per quad-core computer, and that is at 3.3GHz and 4GB of RAM.

If I had to guess, I'd bet most people's dual-core (Core 2-based estimate here) systems are using about 120W at the wall when idling, and when doing CPU-intensive stuff like F@H (but not gaming, i.e. the GPU is idling) the power consumption might get close to 200W.
I think the previous poster was trying to say that 700 watts for a TV is a lot. Yes, dryers use a lot of power (watts), but in most houses TVs are on for a longer period than those appliances.

I definitely wouldn't buy a TV that pulled 700 watts, no matter how good it was. 200-300W is OK.
 

hans007

Lifer
Feb 1, 2000
20,212
18
81
Originally posted by: sharad
Originally posted by: Idontcare
Originally posted by: taltamir
Originally posted by: RussianSensation
Interesting. But things aren't so bad, considering some plasma TV models consume almost 700 watts. And hundreds of thousands of these are used to watch Oprah every day.

HOLY SHAZAM SON OF SHAQ!

700 watts for a TV?

Dude, do you have any idea what a hair dryer pulls from the wall? How about the clothes dryer you like to use (or your mom uses for you) to dry your clothes?

Computers are nothing; a 200W computer is exactly the same power drain as two 100W light bulbs. Think about that when you look around and see how many lights you have turned on at any given moment.

Also, I would like to make a comment regarding power consumption estimates. Unless you have a ridiculous GPU that sucks up the juice while idling, no Core 2-based DC computer is going to suck up 250W of power.

I say this because I am looking at 5 quad-core computers sitting right here running 100% load on all cores, and the power consumption (at the wall, what I pay for, per Kill A Watt meter) is 240W per quad-core computer, and that is at 3.3GHz and 4GB of RAM.

If I had to guess, I'd bet most people's dual-core (Core 2-based estimate here) systems are using about 120W at the wall when idling, and when doing CPU-intensive stuff like F@H (but not gaming, i.e. the GPU is idling) the power consumption might get close to 200W.
I think the previous poster was trying to say that 700 watts for a TV is a lot. Yes, dryers use a lot of power (watts), but in most houses TVs are on for a longer period than those appliances.

I definitely wouldn't buy a TV that pulled 700 watts, no matter how good it was. 200-300W is OK.

I have no doubt that some of the larger 60" or so plasma TVs pull 700 watts.

A 21" CRT monitor pulls something like 200 watts, if I remember correctly.

Then again, you do not leave your TV on 24/7/365 like Folding@home computers.


The one thing I know I was stupid about in the past is wasting CPU cycles on crap like RC5, or even SETI@home.

Maybe Folding@home for cancer is worthwhile, but RC5 cracking is just plain stupid. Or even prime number testing, since... really, who cares what the largest prime is?
 

PCTC2

Diamond Member
Feb 18, 2007
3,892
33
91
My computer runs at around 257W idle, 346W under DC F@H load, and 440W at full load. I'd rather give to charity in the form of computing cycles and electricity bills than give money to charities that use 50% of their income to pay the people who run them.
 

Insidious

Diamond Member
Oct 25, 2001
7,649
0
0
I like using my computer :shocked:

Of course, I also like driving my car... riding my motorcycle... eating the food I buy...

If you don't want to use any electricity or "waste" your computer's life by... God forbid, computing... I would suggest there are much cheaper paperweights than a personal computer.

I've participated in DC for years and enjoy knowing my computer is contributing to the world around the clock.

I know there are those who disagree, but I think the nice constant temperatures of my components and the non-varying load have a lot to do with my equipment's longevity.

-Sid

 

biodoc

Diamond Member
Dec 29, 2005
6,339
2,243
136
Originally posted by: hans007
Originally posted by: sharad
Originally posted by: Idontcare
Originally posted by: taltamir
Originally posted by: RussianSensation
Interesting. But things aren't so bad, considering some plasma TV models consume almost 700 watts. And hundreds of thousands of these are used to watch Oprah every day.

HOLY SHAZAM SON OF SHAQ!

700 watts for a TV?

Dude, do you have any idea what a hair dryer pulls from the wall? How about the clothes dryer you like to use (or your mom uses for you) to dry your clothes?

Computers are nothing; a 200W computer is exactly the same power drain as two 100W light bulbs. Think about that when you look around and see how many lights you have turned on at any given moment.

Also, I would like to make a comment regarding power consumption estimates. Unless you have a ridiculous GPU that sucks up the juice while idling, no Core 2-based DC computer is going to suck up 250W of power.

I say this because I am looking at 5 quad-core computers sitting right here running 100% load on all cores, and the power consumption (at the wall, what I pay for, per Kill A Watt meter) is 240W per quad-core computer, and that is at 3.3GHz and 4GB of RAM.

If I had to guess, I'd bet most people's dual-core (Core 2-based estimate here) systems are using about 120W at the wall when idling, and when doing CPU-intensive stuff like F@H (but not gaming, i.e. the GPU is idling) the power consumption might get close to 200W.
I think the previous poster was trying to say that 700 watts for a TV is a lot. Yes, dryers use a lot of power (watts), but in most houses TVs are on for a longer period than those appliances.

I definitely wouldn't buy a TV that pulled 700 watts, no matter how good it was. 200-300W is OK.

I have no doubt that some of the larger 60" or so plasma TVs pull 700 watts.

A 21" CRT monitor pulls something like 200 watts, if I remember correctly.

Then again, you do not leave your TV on 24/7/365 like Folding@home computers.


The one thing I know I was stupid about in the past is wasting CPU cycles on crap like RC5, or even SETI@home.

Maybe Folding@home for cancer is worthwhile, but RC5 cracking is just plain stupid. Or even prime number testing, since... really, who cares what the largest prime is?

IMHO, DC support of basic research of any kind is not a waste of power/electricity. The amount of power consumed by DC is trivial compared to society in general. Take, for example, the "entertainment industry":

Home entertainment (TV, PC gaming, etc.)

Sporting events (football, baseball at all levels; stadium lighting, etc.)
Cultural events (concerts, etc.)
Production of American Idol, TV sitcoms, etc.
The amount of power used by the TV industry to produce, cover, and broadcast all of the above.

By watching TV, you are supporting the massive use of power to stage and broadcast all of the above.

Am I going to stop watching sports or other entertainment on TV? NO WAY. (However, I do miss "day game" baseball. ;))

Am I going to stop using power to support basic research? Absolutely NOT.

Take a look at the whole picture of power use in this country and then pick another forum to stir up a cloud of dust.

You need to take your trolling elsewhere rather than pick on DCers.

 

AmberClad

Diamond Member
Jul 23, 2005
4,914
0
0
$150, eh? There are people that spend far more per year on MMO subscriptions/expenses, and without any benefit to science. So from that perspective, it doesn't seem particularly bad.
 

biodoc

Diamond Member
Dec 29, 2005
6,339
2,243
136
I have 3 quads going 24/7 and my electricity bill is $230 per month total.

I also live in a cold climate, so the heat generated does not go to waste. Of course, summertime is another matter ;)

Home heating oil is $3.50 per gallon in my area, and the electricity comes from Canada (hydroelectric). I'm spending more than $500 per month on fuel oil this winter :(

I have replaced all lighting with fluorescent and plan on spending the weekends this summer plugging/caulking/insulating so "this old house" is more energy efficient.

Cheers all & you're all welcome to join TeAm Anandtech DC!!!

:beer:
 

wwswimming

Banned
Jan 21, 2006
3,695
1
0
Folding@home is Stanford University's project to cure cancer (among other diseases). So to stop global warming, you hope we all die from cancer? Who cares if we are all dead.

Great attitude. I hope you are proud of it as you think about this on your deathbed....

One very effective way to reduce cancer rates is to manage the industrial effluents that cause many of the cancers.

Two of countless examples:
* AMD's pollution of Sunnyvale groundwater. Band-aid solution: approx. $15K for homeowners in Sunnyvale.
* The incident immortalized in the movie Erin Brockovich.

Incidentally, one of the people who posts on the website LASIK-Flap.com is a molecular biologist who is working as a lab manager to fight cancer caused by industrial effluents, aka pollution.

Her specialty is designing automated/robotic DNA sequencing lab setups while managing the young techies who do some of the work.

Since the time of her LASIK surgery in 2004, she has been doing that job with constant eye pain.

BrownTown

Diamond Member
Dec 1, 2005
5,314
1
0
Well, since my initial response in this thread received several negative reactions, including one from the neighborhood moderator, let me take some time to readdress this issue. First off, clearly my initial response was not the most diplomatic of posts, but it was also meant sarcastically, in that I really DON'T care what the heck you do with your computers and electricity, and if you want to spend some of it on DC then that's great and your own business. However, that being said, I do NOT consider it an indefensible or vicious position to state that distributed computing is in fact a waste of resources. As the OP stated and many others (even supporters) have confirmed here, DC greatly increases power consumption and reduces product life. If perhaps 1 million "computer years" (and the number may be far more or less, I don't know) are spent on DC at a cost of $100 in electricity and $50 in asset depreciation each year, then that's $150,000,000, so I have to ask: what is all this money producing? Certainly with something like SETI@home my answer would be "not much". Then comes the issue of the folding programs, and on this point I can simply say that I follow mainline news and technology news on a daily basis, and I can personally never remember hearing of any major breakthroughs being contributed by these programs.

To some people this opinion seems very negative, or is perhaps viewed as an attack, but I am simply looking at the facts as I have seen them. I do not consider this argument to be invalid, nor would I consider an argument made in support of DC to be invalid; both sides, I am sure, can bring up good points in their own defense. Nor is this opinion in some way set in stone; if someone can produce evidence showing great breakthroughs caused by DC, then that would be great too. But acting like I am a bad person for going against a program intended to cure cancer is meaningless without proof that said program actually DOES help find a cure for cancer. Personally I am a utilitarian, and when I see hundreds of millions of dollars going into something I have to ask myself "what are we getting back?", and so far as I have seen with DC the answer is not much, certainly not $100,000,000+ worth of meaningful research. That's just my opinion; if you want to argue against it then great, but I don't see where my having a differing opinion from many people here is grounds for moderator involvement or attacks on my character.
 

ch33zw1z

Lifer
Nov 4, 2004
39,756
20,331
146
Originally posted by: BrownTown
I really DON'T care what the heck you do with your computers and electricity, and if you want to spend some of it on DC then that's great and your own business.

So don't post stupid comments... it wastes CPU cycles.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Only $150/year extra? Cool. Time to fire up more crunching rigs!

Btw, as a single male, I currently pay $50/mo for electric, and I have two computers running Seventeen or Bust 24/7: a C2D E2140 @ 3.2GHz and a (non-OC, for shame) AMD64 3800+ @ 2.4GHz. I have another C2D @ 3.2GHz waiting in the wings to be fired up instead of the AMD, when I bother to switch these machines around.

I plan on adding some more crunch-only rigs to the mix soon, at least until my electric bill becomes unmanageable.