Do you use or do Distributed Computing? Why or Why not?


narzy

Elite Member
Feb 26, 2000
7,006
1
81
Originally posted by: elconejito
I've got a kill-a-watt and some time to burn. I'd install a DC client for testing purposes to see what idle and load are while using the client. Which one should I choose?

I'd probably say Folding@home just because of its popularity. DC projects can be weird in the sense that each project does different types of calculations and thus uses system resources differently.
 

WildW

Senior member
Oct 3, 2008
984
20
81
evilpicard.com
I used to do Folding@Home a few years ago, and stopped after a while. Partly electricity costs, partly to control my own habits (like building PCs just to fold on), and partly to simplify the management (for example, having to check, before I played anything, which machines were about to upload a work unit and totally kill my online gaming ping in the process).

These days I have traded in my desktops for a single laptop. I could run F@H again, but I don't like the idea of running a high CPU/GPU level for any period of time in a laptop with the cooling issues. That and most of the time my laptop is off and in my backpack, so it wouldn't get much runtime.
 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
Why would I need it? Most distributed computing apps are for massive calculations, like folding applications. On a personal level I do some Photoshop/H.264 encoding/web browsing/programming, none of which needs that type of power.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I just bought a couple of those LED replacements for canister lighting; that'll be my first foray into replacing CFLs. They were at Sam's Club.
What are those?

It's true, CFLs aren't very green once the total carbon footprint, including production costs, is factored in.
Well, yes, but I meant that they say to throw away your WORKING brand-new light bulbs and buy "more efficient" ones, which completely counters any cost benefit of the more efficient ones. They are only economical if you buy them after your last one burned out. And yes, they should be collected with the toxic waste.

Can you actually fire up a copy of the F@H GPU client on your GTX 260 long enough to get a reading? I'm curious how much difference it will make versus idle.
Sure, I'll do it later and tell you my findings. I expect it will be similar to the OCCT GPU benchmark, but we will see.
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Originally posted by: taltamir
Can you actually fire up a copy of the F@H GPU client on your GTX 260 long enough to get a reading? I'm curious how much difference it will make versus idle.
Sure, I'll do it later and tell you my findings. I expect it will be similar to the OCCT GPU benchmark, but we will see.

This is exactly what I want to see. I'm curious how well any of the benchmarking software can simulate folding on the GPU from a load perspective. And the raw power consumption numbers.

I've seen gaming load/peak numbers for the GTX 260 which indicate that the card alone can consume 150-170W under gaming loads (comparing SLI vs. single-card results). Your testing with OCCT shows about 130W under that load. So I want to know how much load F@H crunching puts on the card and how much power is going to be consumed as a result.

TIA!!
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: Idontcare
(data point: converting my entire house from incandescent to CFL's resulted in a $100/month electricity bill reduction...it cost me ~$700 to buy all those CFL's but I recovered the investment in less than 8 months)
It's surprising how quickly they pay for themselves. A lot of us just see the cost in the store and say no.

Assuming a 100W regular and 23W CFL are equivalent light sources, how long does it take for the energy savings to equal the $4.66 price difference ($5 each compared to 3 for $1) if power costs 11 cents per kwh?
466 cents / 11 cent per kwh = 42.36kwh
42.36kwh / 77W energy difference = 550 hours
That's about 22.9 days of constant use. Depending on where the bulb is used, CFLs can save a lot of money.
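The breakeven arithmetic above can be sketched in a few lines, using the prices and electricity rate quoted in the post (note this computes the exact 3-for-$1 price, so the result differs from the post's rounded 466 cents by a fraction of a percent):

```python
# Breakeven time for a 23 W CFL vs. a 100 W incandescent,
# using the prices and rate from the post above.
price_diff_dollars = 5.00 - (1.00 / 3)   # $5 CFL vs. 3-for-$1 incandescents
rate_per_kwh = 0.11                      # $/kWh
watt_savings = 100 - 23                  # W saved while the bulb is on

kwh_to_recover = price_diff_dollars / rate_per_kwh    # ~42.4 kWh
hours = kwh_to_recover / (watt_savings / 1000)        # ~551 hours
days = hours / 24                                     # ~23 days
print(f"{hours:.0f} hours (~{days:.1f} days of constant use)")
```

Of course, a bulb that only runs a couple of hours a day takes proportionally longer in calendar time to pay for itself, which is why placement matters.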

(like building PCs just to fold on)
This is something I don't understand. I want to help science research as much as I can, but I can't picture myself building an entire computer just for folding. It's like making a $500+ donation for the computer, then continuing to donate the power of the entire computer (over 200W) rather than just the difference between idle and folding (~20W). Powering an entire 200W computer costs almost $200 per year where I live. I think it's great that some people are that charitable, but damn, that's a lot of money.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: ShawnD1
(like building PCs just to fold on)
This is something I don't understand. I want to help science research as much as I can, but I can't picture myself building an entire computer just for folding. It's like making a $500+ donation for the computer, then continuing to donate the power of the entire computer (over 200W) rather than just the difference between idle and folding (~20W). Powering an entire 200W computer costs almost $200 per year where I live. I think it's great that some people are that charitable, but damn, that's a lot of money.

At that point it's really a hobby, no different than spending money souping up your car, getting a tattoo, remodeling the kitchen, buying fishing gear, buying a computer just for gaming, etc.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,113
614
136
Originally posted by: Idontcare
Originally posted by: ShawnD1
(like building PCs just to fold on)
This is something I don't understand. I want to help science research as much as I can, but I can't picture myself building an entire computer just for folding. It's like making a $500+ donation for the computer, then continuing to donate the power of the entire computer (over 200W) rather than just the difference between idle and folding (~20W). Powering an entire 200W computer costs almost $200 per year where I live. I think it's great that some people are that charitable, but damn, that's a lot of money.

At that point it's really a hobby, no different than spending money souping up your car, getting a tattoo, remodeling the kitchen, buying fishing gear, buying a computer just for gaming, etc.

Exactly. It's all about the stats and the competition. We have races and competitions over in the DC forum.
 

MagickMan

Diamond Member
Aug 11, 2008
7,460
3
76
No, I don't do DC. For the PCs in my home, the difference in my electric bill was $75 more /mo ($220 w/o DC, $295 w/ DC). That's too rich for my blood.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Idontcare
Watching TV on a standard 27" CRT is going to use about 2-3x more electricity versus setting your otherwise idle computer to run DC.
Interesting. I know someone that stopped doing DC (crunching for me) on his desktop, and yet, he leaves a 30" older CRT on 24/7. Some power savings. :(

(His rig is an E5200 @ 3.75GHz, 1.425v (BIOS), no C1E/EIST. I hooked him up with a nice 26" LCD, and he disabled power-save mode on that too, so the screen is on all the time.)

 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: ShawnD1
My UPS says the power difference between full load and no load on an E6600 is only about 10-20W while the whole system takes over 200W, so it's really a non issue. The case fans and hard drives take more power than the CPU does.
That's in line with what I believe too, comparing an overclocked CPU without EIST/C1E at idle and at load.


 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Idontcare
(data point: converting my entire house from incandescent to CFL's resulted in a $100/month electricity bill reduction...it cost me ~$700 to buy all those CFL's but I recovered the investment in less than 8 months)
Dude, if you spend $100/mo on lighting alone, you must have a HUGE house.

I have a small apt, and with two computers and two ACs running 24x7, my total electric bill is only around $100-110/mo.

 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: VirtualLarry
Originally posted by: Idontcare
(data point: converting my entire house from incandescent to CFL's resulted in a $100/month electricity bill reduction...it cost me ~$700 to buy all those CFL's but I recovered the investment in less than 8 months)
Dude, if you spend $100/mo on lighting alone, you must have a HUGE house.

I have a small apt, and with two computers and two ACs running 24x7, my total electric bill is only around $100-110/mo.

6k sq-ft, monthly electric bills in the summer were $800 and that was after the conversion to CFL's for lighting.

But we don't own it anymore; we completely accidentally timed the market perfectly and sold it something like 15 days before the housing collapse that started in summer 2007. Of course you could say the buyers had some of the worst timing too, but we sold it for basically what we paid a few years earlier (the Dallas housing market didn't bubble like the east and west coasts did).

Originally posted by: VirtualLarry
Originally posted by: Idontcare
Watching TV on a standard 27" CRT is going to use about 2-3x more electricity versus setting your otherwise idle computer to run DC.
Interesting. I know someone that stopped doing DC (crunching for me) on his desktop, and yet, he leaves a 30" older CRT on 24/7. Some power savings. :(

(His rig is an E5200 @ 3.75GHz, 1.425v (BIOS), no C1E/EIST. I hooked him up with a nice 26" LCD, and he disabled power-save mode on that too, so the screen is on all the time.)

My, now that is ironic.
 

elconejito

Senior member
Dec 19, 2007
607
0
76
www.harvsworld.com
OK, I ran a quick check using the Folding@Home client. I'm assuming this is the CPU, not GPU, version; I just grabbed the first one I came across using Google.

Using a kill-a-watt, with the system currently in my sig (Q9650 @ 4GHz, 3 HDDs, 8GB RAM, 9600GT): I turned on the computer and let it settle down for a few minutes before I did anything to it. I had forgotten to turn off the screensaver and display suspend, so the first reading is with the monitor asleep. It's a few watts lower, I guess because the GPU actually sleeps from idle? That's a guess... Anyways...

Idle (w/monitors asleep) = 147-148w
Idle (w/monitors "on") = 150-151w
F@H (nothing else running) = 177-178w
OCCT = 263w

NOTES:
-All readings were pretty constant. I checked back every 5 or 10 mins except for OCCT which I only left on for a total of about 10mins and checked only once.

-F@H used up only about 25% CPU. This was the case from the moment it launched, and every time I checked later it was still at 25%. The slider to control max CPU usage in the configuration was already at its highest. I don't know if any other adjustments could be made to make it use more CPU.

-Increase from idle wattage to F@H wattage was almost 30W, and CPU temp went up about 5C. Not a show-stopper I guess, but I probably wouldn't do this long-term, at least not on this machine. The primary reason is that electricity cost is already too high here; that covers both the actual wattage used by the computer and the extra AC runtime it causes (really poor insulation, etc.). I already fret about the fileserver (~60W) and HTPC (~90W) being on pretty much 24/7, and even those I will occasionally turn off. My main desktop (which I did this test on) I only run when needed, due to the electric bill and heat. If my income or living space were different, I might change my tune.

-I will let the work units I got finish out and then stop there.
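For a rough sense of scale, the ~30 W idle-to-folding delta measured above can be turned into a yearly cost. This sketch assumes 24/7 operation and borrows the illustrative 11 cents/kWh rate from ShawnD1's post earlier in the thread (the actual rate for this system isn't stated, and the extra AC load mentioned above is ignored):

```python
# Rough yearly cost of the ~30 W folding delta measured above.
# Assumptions: 24/7 operation, 11 cents/kWh (illustrative rate, not
# the poster's actual rate), and no extra air-conditioning load.
delta_watts = 178 - 148          # F@H load minus idle, from the kill-a-watt
hours_per_year = 24 * 365
rate_per_kwh = 0.11              # $/kWh, assumed; substitute your own

kwh_per_year = delta_watts / 1000 * hours_per_year   # ~263 kWh
cost = kwh_per_year * rate_per_kwh                   # ~$29/year
print(f"~{kwh_per_year:.0f} kWh/year, about ${cost:.0f}/year")
```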
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
-F@H used up only about 25% CPU. This was the case from the moment it launched, and every time I checked later it was still at 25%. The slider to control max CPU usage in the configuration was already at its highest. I don't know if any other adjustments could be made to make it use more CPU.
They have a hard-to-find "high performance clients" page. On it you need to get the SMP client, which uses more than one core. (You can also get the GPU client there.)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: taltamir
-F@H used up only about 25% CPU. This was the case from the moment it launched, and every time I checked later it was still at 25%. The slider to control max CPU usage in the configuration was already at its highest. I don't know if any other adjustments could be made to make it use more CPU.
They have a hard-to-find "high performance clients" page. On it you need to get the SMP client, which uses more than one core. (You can also get the GPU client there.)

I'm in the same pickle: I have tried three different downloads now (including the high-performance one) and followed the forum postings, and I just cannot get F@H to use more than one core here. What a PITA.

I will say, after having heard about F@H so much over the past years, I really expected a much easier-to-install-and-use interface than what I have downloaded and tried to play with so far.

Even Prime95 is thread-aware and installs click-install-done easy... this console stuff with F@H, plus the "follow these 20 steps and you should be golden" instructions (reminds me of the OCZ SSD workarounds, side-laugh), is kind of surprising. I wonder how many people get totally deterred from bothering with F@H because of this elevated barrier to use.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I think BOINC has some protein folding projects, and everything in BOINC is thread-aware. (It can actually run CPU and GPU work within the same program.)

I tried the GPU client... Last time I ran it, it loaded up my GPU and the fan sounded like a jet. Now (same video card) it shows activity/inactivity in the GUI and goes from 120W (my idle) to 180W. Every time it hits 180W, my UPS starts buzzing, the video card starts squeaking, and it drops back down to 120W and stops calculating according to the GUI (and the command line). It is really weird; games, benchmarking software, and BOINC don't cause such a... reaction?, but there is something about the latest version of their GPU client that just doesn't work right with my system. Maybe it is because I am using Win 7. (Last time I checked, in Vista, it was MUCH faster calculating a WU, and it loaded my GPU to a full 100% stable load with no weird noises.)
 

nerp

Diamond Member
Dec 31, 2005
9,865
105
106
I could never figure out how to get the damn multicore F@H client working, and that's part of the reason I gave up and walked away. Personally, I'd rather cut the cancer society a $250 check every year instead of adding the power company as a middleman. Less cancer, less coal. Win-win.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
lol... no, really, I am laughing out loud. ;p
I do wonder what kind of bizarre workload it is putting on my GPU to cause that, though... my guess is rapid alternation between full load and idle on the GPU, causing power fluctuations.
 

elconejito

Senior member
Dec 19, 2007
607
0
76
www.harvsworld.com
So I've officially given up on the SMP client. I've burned a few hours googling, installing, etc. that I could have spent playing video games or working on my blog (I'm on VACATION!!!!!!!). I'm joining the "I can't figure this out" group. I downloaded the SMP client and tried various things that seemed to work for others on the forums... no luck. Literally nothing happens. I can get past the installation, then past the install batch file (by running the commands manually from a cmd window I opened separately with admin rights), and then I run the client both from a command line and from a shortcut with the proper command-line switches... and nothing happens.

Moving on...

The regular client maxed out at 25% CPU usage and brought my power consumption up to almost 180W from 150W at idle. I'm currently encoding a video with HandBrake, restoring a database, and posting here, putting my CPU usage just over 50%, and the kill-a-watt says I'm at 205-215W. I think it's a fair guess that the SMP client, if it does put your CPU at 100%, would burn well over 200W the entire time it's running, probably close to the OCCT max of about 260W on my system.
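A crude way to sanity-check that guess is a linear extrapolation from the two kill-a-watt readings above (25% CPU at ~178 W, ~50% CPU at ~210 W). This ignores GPU load, background tasks, and non-linear CPU power scaling, so treat it as a sketch, not a measurement:

```python
# Linear extrapolation of system wattage vs. CPU usage from the two
# data points in the post: 25% -> ~178 W and ~50% -> ~210 W.
# (Ignores GPU load and non-linear power scaling; a sanity check only.)
usage1, watts1 = 0.25, 178
usage2, watts2 = 0.50, 210

slope = (watts2 - watts1) / (usage2 - usage1)    # W per unit of CPU usage
estimate_full = watts1 + slope * (1.0 - usage1)  # projected W at 100% CPU
print(f"~{estimate_full:.0f} W at full CPU load")
```

The projection lands in the same neighborhood as the OCCT reading, which supports the "well over 200W" guess.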
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
If you wanna do folding... or any DC, just use BOINC: the Berkeley Open Infrastructure for Network Computing.
It works like a charm, and they actually have two separate protein folding projects:
http://boinc.bakerlab.org/rosetta/
http://boinc.fzk.de/poem/

It will handle however many cores you have, and can even use multiple GPUs, all from within the same interface. (You can select how many cores and GPUs it is allowed to use on each computer.)

I highly recommend bam.boincstats.com as your project account manager:
install BOINC, attach to BAM, and then add projects, remove projects, change settings, change the percentage of allocated resources, etc. through BAM for all your machines.
You could of course do it all manually and directly, but I prefer BAM.
Be aware that your client will only connect to BAM every X hours to a day, so if you want to change settings NOW, you should tell your client to connect to the server now to update settings and status.
 

GLeeM

Elite Member
Apr 2, 2004
7,199
128
106
Originally posted by: Idontcare
I'm in the same pickle: I have tried three different downloads now (including the high-performance one) and followed the forum postings, and I just cannot get F@H to use more than one core here. What a PITA.

I agree. Try helping someone who wants to run F@H troubleshoot an installation :roll:

I found the GPU client fairly easy to install.

Also, I found notFred's VM Player F@H Linux SMP appliance easier than the high-performance Windows SMP client.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: taltamir
If you wanna do folding... or any DC, just use BOINC: the Berkeley Open Infrastructure for Network Computing.
It works like a charm, and they actually have two separate protein folding projects:
http://boinc.bakerlab.org/rosetta/
http://boinc.fzk.de/poem/

It will handle however many cores you have, and can even use multiple GPUs, all from within the same interface. (You can select how many cores and GPUs it is allowed to use on each computer.)
Cool. I'll check this out.

What constantly blows my mind is how the Russians are always near the top of every project. In BOINC teams, TSC Russia is #4. They were also near the top of UD when I followed that, then they were either #1 or #2 for FAD, and they're #6 at F@H. Isn't Russia virtually a third-world nation? The CIA says their GDP per capita is only $15,800 (roughly $8/h based on 40-hour weeks). How do they pay for all of this computing power?
 

Fullmetal Chocobo

Moderator, Distributed Computing
Moderator
May 13, 2003
13,704
7
81
I do. It's fun, and is considered an activity that my wife and I participate in together. :)