Weekly DC Stats - 31OCT2021

StefanR5R

Elite Member
Welcome back,

yep, indeed we won the PrimeGrid challenge last week. Later this past week, @cellarnoise managed to scare a) most of us away from, and b) a few of us into, participating in the brand-new FHCHC event (SiDock@home, started 10/29 10:00 UTC, ends 10/31 after some Whiskey). After some Whiskey? Whoops, will somebody like me have to run this perpetually?

Next up:
  • anytime after 11/09 (00:00 UTC): bunkering MCM, for those who don't mind bunkering for special occasions :-)
  • 11/16 – 11/23 (00:00 UTC): World Community Grid Birthday Challenge, MCM = CPUs
  • 11/23 – 11/26 (05:00 UTC): PrimeGrid Euler's Constant Challenge, AP27 = mostly GPUs
    (recent average GPU time: 3.5 h, recent average CPU time: 26 h)

But back to the past week. The stats of the Ramanujan Machine and of Folding@Home are still hand-crafted, and RC5-72 is still left out in the cold; I haven't found time yet to automate the stats gathering for any of them. Here you go:

**********************************************
In the event we have any non-crunching AnandTech readers who happen to wander into this thread: Distributed Computing is where you allow your computing device (smartphones/tablets included) to work on things like medical research, mathematical stuff, sifting through telescope data to further the field of Astronomy, and many other 'citizen science' projects. It allows networked computers to band together to act as a supercomputer. And you should join us. Thanks go, as always, to the folks responsible for Free-DC, who make this possible by keeping score for us.
**********************************************



Amicable Numbers overall position - 6
TeAm total for the week - 314,464
TeAm rank for weekly production - 47

__Credit/week _ UserName
1_______246,102______Fardringle
2_______68,361_______geecee


Climate Prediction overall position - 24
TeAm total for the week - 399,621
TeAm rank for weekly production - 4

__Credit/week _ UserName
1_______358,553______xii5ku
2_______27,590_______far
3_______13,478_______Endgame124


Collatz Conjecture overall position - 43
TeAm total for the week - 7,079,122
TeAm rank for weekly production - 71

__Credit/week _ UserName
1_______6,815,615____Fardringle
2_______263,507______Ken_g6


Cosmology@Home overall position - 15
TeAm total for the week - 97,681
TeAm rank for weekly production - 30

__Credit/week _ UserName
1_______94,800_______Bremen
2_______2,880________geecee


Einstein@Home overall position - 24
TeAm total for the week - 2,900,105
TeAm rank for weekly production - 104

__Credit/week _ UserName
1_______959,805______Fardringle
2_______273,708______Alan J. Simpson
3_______83,000_______Jesper S. Rytke
4_______72,000_______geecee
5_______65,835_______biodoc
6_______17,325_______xii5ku
7_______4,000________elawyn
8_______2,062________Endgame124
9_______693__________Skivelitis2


Folding@Home overall position - 16
TeAm total for the week - 859,864,262
TeAm rank for weekly production - 19

__Credit/week _ UserName
1_______186,810,946__Mark_F_Williams
2_______96,384,890___LANMANTA
3_______49,341,292___King_of_Efland
4_______45,777,319___Bullseye
5_______39,949,707___Thaddeus_C.
6_______35,180,439___Ryanrhino
7_______29,746,598___Ron_Michener
8_______20,406,876___cellarnoise2
9_______18,063,290___Dragonfly
10______17,797,003___DVS1
11______16,913,358___PaulZebo
12______16,254,826___TAS-petrusbroder
13______16,221,695___13arogom
14______14,681,765___EagleRG
15______14,348,411___Bluto241
16______13,906,319___Endgame124
17______13,829,326___FoldingSolutions
18______12,754,637___BenjaminMcBride
19______10,185,147___voodoo5_6k
20______9,931,411____David
21______9,507,960____PLS2725
22______8,600,816____Dmitry_Belozerov
23______8,173,312____Cliff_Taylor
24______7,473,291____j_hest
25______7,453,404____Alan_Z
26______7,081,431____slowbones
27______7,013,790____Mihryazd
28______6,783,581____ElectronicCyberDragon
29______5,549,607____Paul_McAvinney
30______5,301,352____markam67
31______4,728,731____jiwa
32______4,336,535____Spungo
33______4,273,059____GSmith
34______4,159,687____SpeedyTheTurtle
35______3,698,373____Hai_Vo-Ba
36______3,691,219____Dexter
37______3,603,040____Bri79
38______3,406,417____mazzmond
39______3,403,981____Datoth
40______3,355,618____Wayne_Chan
41______2,735,222____Fruitn
42______2,679,614____Nerdsinc
43______2,671,984____pkick
44______2,570,365____catavalon21
45______2,424,730____Pete_Nyholm
46______2,404,675____katelyns
47______2,362,549____HappyCracker
48______2,350,866____underdeliver
49______2,313,444____Paul_Linton
50______2,269,407____FrenziedEngi
51______2,165,359____TA_JC
52______2,113,664____MPC98-Ryzen5
53______2,110,542____KapitalYou
54______1,959,062____ilnot1
55______1,951,740____SlangNRox
56______1,779,456____undebutant
57______1,755,383____pjt90
58______1,694,861____saah
59______1,642,606____Dr.N0
60______1,573,740____creed3020
61______1,529,444____MrCommunistGen
62______1,500,590____BigBunny
63______1,471,120____rderickson
64______1,447,047____videticj
65______1,362,998____All_Might
66______1,349,803____fhonyotski
67______1,332,016____Daniel_Trudeau
68______1,223,838____goober1986
69______1,117,690____lplatypus
70______982,632______Anonymous
71______857,472______deninc66
72______838,285______2StepBlack
73______798,112______Pato
74______775,614______RHL
75______749,196______Dano312
76______727,034______KPas
77______625,177______bjsjr
78______557,067______davidm103
79______541,205______Karetaker
80______536,508______ensignlee
81______505,155______woochowski
82______473,312______RK
83______449,540______Sergey_Nikiforov
84______337,134______emptysky
85______301,721______ChasR
86______284,291______Swishington
87______281,337______monkeyspooge
88______278,747______jkresh
89______264,663______nitrous9200
90______236,691______Cong16
91______231,168______mykchin8
92______225,057______Dale_Martin
93______213,192______Intellectual_Mastermind
94______210,406______Gary_Balliet
95______202,972______JDAMe1hLojVd
96______200,963______Bobr
97______138,376______QuietDad
98______132,174______DCstewieG
99______127,771______hreysenbach
100_____118,286______StefanLenz
101_____107,268______supes81
102_____106,981______Brazos
103_____97,846_______dgenx210
104_____94,465_______RadiclDreamer
105_____74,326_______zshift
106_____74,207_______mmaresko
107_____67,767_______LarryWild
108_____66,822_______styrmis
109_____66,549_______shabs42
110_____55,937_______jlinfitt
111_____45,612_______fusionTi
112_____43,337_______WarrenFitch
113_____37,394_______MaSQuE
114_____32,130_______Greg
115_____30,065_______TA_Smoke
116_____23,562_______Greeba
117_____22,232_______ISSie-Vee
118_____21,140_______Travis_G
119_____20,202_______CCHomebrew
120_____14,861_______Courjoe
121_____9,877________DieselDad
122_____8,365________eselqueso
123_____8,078________Snowless
124_____7,000________78Pickup
125_____3,402________PhilMattson01
126_____3,402________avelanarius


iThena overall position - 1
TeAm total for the week - 2,357,408
TeAm rank for weekly production - 1

__Credit/week _ UserName
1_______2,332,502____Fardringle
2_______21,934_______Skivelitis2
3_______2,324________emoga
4_______612__________Skillz


LHC@Home 1.0 overall position - 14
TeAm total for the week - 683
TeAm rank for weekly production - 175

__Credit/week _ UserName


MilkyWay@Home overall position - 13
TeAm total for the week - 5,565,346
TeAm rank for weekly production - 27

__Credit/week _ UserName
1_______2,652,629____Skillz
2_______764,881______Walrus of Apathy
3_______685,790______waffleironhead
4_______537,939______Stig
5_______329,694______Brent
6_______174,384______Fardringle
7_______143,055______geecee
8_______91,454_______SuperTerra
9_______68,648_______uallas5
10______50,679_______crashtech
11______30,563_______elfenix
12______17,251_______Christopher
13______13,591_______idlorj
14______1,842________salvorhardin
15______1,482________burninator34
16______304__________12800


Minecraft@Home overall position - 14
TeAm total for the week - 3,297,494
TeAm rank for weekly production - 22

__Credit/week _ UserName
1_______3,297,494____Fardringle


MLC@Home overall position - 10
TeAm total for the week - 184,600
TeAm rank for weekly production - 30

__Credit/week _ UserName


Moo! Wrapper overall position - 24
TeAm total for the week - 341,224
TeAm rank for weekly production - 48

__Credit/week _ UserName
1_______339,208______geecee
2_______2,016________Skivelitis2


NFS@Home overall position - 7
TeAm total for the week - 235,426
TeAm rank for weekly production - 19

__Credit/week _ UserName
1_______233,640______Rudy Toody
2_______10,060_______geecee
3_______1,000________zzuupp


NumberFields@Home overall position - 11
TeAm total for the week - 43,262
TeAm rank for weekly production - 42

__Credit/week _ UserName


ODLK overall position - 4
TeAm total for the week - 219
TeAm rank for weekly production - 37

__Credit/week _ UserName
1_______219__________Skivelitis2


PrimeGrid overall position - 26
TeAm total for the week - 72,787,266
TeAm rank for weekly production - 11

__Credit/week _ UserName
1_______14,973,646___Skivelitis2
2_______14,459,156___xii5ku
3_______11,311,295___parsnip soup in a clay bowl
4_______10,990,540___Icecold
5_______6,789,417____crashtech
6_______3,488,985____far
7_______2,797,692____biodoc
8_______1,539,627____waffleironhead
9_______1,329,297____emoga
10______1,139,145____Skillz
11______1,110,492____Orange Kid
12______1,060,704____markfw
13______948,453______HutchinsonJC
14______776,822______Lane42
15______733,645______geecee
16______449,744______VirtualLarry
17______371,498______Ken_g6
18______183,843______SlangNRox
19______51,959_______kiska
20______39___________RussianSensation


QuChemPedIA@home overall position - 2
TeAm total for the week - 200
TeAm rank for weekly production - 42

__Credit/week _ UserName
1_______200__________Skillz


Ralph@Home overall position - 49
TeAm total for the week - 640
TeAm rank for weekly production - 4

__Credit/week _ UserName
1_______640__________Fardringle


The Ramanujan Machine overall position - 2
TeAm total for the week - 48,593
TeAm rank for weekly production - not yet tracked

__Credit/week _ UserName
1_______48,293_______Fardringle
2_______300__________Skivelitis2


Rosetta@Home overall position - 14
TeAm total for the week - 3,707,278
TeAm rank for weekly production - 7

__Credit/week _ UserName
1_______2,662,997____markfw
2_______230,223______BadThad
3_______204,226______Endgame124
4_______87,398_______salvorhardin
5_______71,543_______Closius
6_______68,967_______TeeDeeCloud
7_______65,942_______Skivelitis2
8_______53,624_______TA_JC
9_______50,340_______Bremen
10______35,927_______[TA]Assimilator1
11______29,197_______Lighthappy
12______28,926_______far
13______21,416_______cyost91
14______17,744_______Cassio Hui
15______14,143_______Bundyman
16______12,890_______cb0159
17______12,447_______slowbones
18______11,076_______geecee
19______8,186________voodoo5_6k
20______6,332________uallas5
21______5,375________GLeeM
22______4,573________TeAm Enterprise
23______3,199________Dave
24______2,676________Lim Rern Jern
25______1,765________Mort
26______1,299________Jesper S. Rytke
27______419__________Titansfury


SiDock@home overall position - 5
TeAm total for the week - 1,142,513
TeAm rank for weekly production - 9

__Credit/week _ UserName
1_______288,028______xii5ku
2_______277,696______crashtech
3_______217,735______Icecold
4_______157,456______cellarnoise2
5_______89,455_______biodoc
6_______84,258_______Sesson
7_______27,599_______parsnip soup in a clay bowl


SRBase overall position - 5
TeAm total for the week - 3,139,280
TeAm rank for weekly production - 14

__Credit/week _ UserName
1_______3,115,000____Fardringle
2_______5,280________Skivelitis2


TN-Grid overall position - 15
TeAm total for the week - 6,795
TeAm rank for weekly production - 40

__Credit/week _ UserName
1_______4,954________geecee
2_______1,840________waffleironhead


Universe@Home overall position - 15
TeAm total for the week - 1,162,666
TeAm rank for weekly production - 35

__Credit/week _ UserName
1_______699,333______[H]Skillz
2_______287,333______Icecold
3_______158,000______Rudy Toody
4_______18,000_______Fardringle


World Community Grid overall position - 42
TeAm total for the week - 7,566,934
TeAm rank for weekly production - 28

__Credit/week _ UserName
1_______2,362,769____markfw
2_______1,290,029____xii5ku
3_______874,356______biodoc
4_______667,604______ZipSpeed
5_______607,650______cellarnoise2
6_______281,310______10esseeTony
7_______216,757______voodoo5_6k
8_______203,559______crashtech
9_______160,127______Ingleside
10______137,767______Kaebu
11______132,250______Aegeon
12______100,935______sukhoi_584th
13______92,764_______farse
14______70,213_______Tejas II
15______56,334_______uallas5
16______53,630_______Rebel Alliance
17______38,959_______[TA]Assimilator1
18______35,082_______wayliff
19______21,019_______petrusbroder
20______16,200_______asimperson
21______15,954_______geecee
22______13,222_______Skivelitis2
23______10,358_______Endgame124
24______4,458________EagleKeeper
25______3,336________[H]Skillz
26______2,817________TACommittee
27______2,124________emoga
28______1,501________Daishi
29______223__________waffleironhead


WEP M2 overall position - 19
TeAm total for the week - 43,148
TeAm rank for weekly production - 12

__Credit/week _ UserName
1_______43,148_______Skivelitis2


Yoyo@Home overall position - 22
TeAm total for the week - 221,497
TeAm rank for weekly production - 17

__Credit/week _ UserName
1_______115,343______Skivelitis2
2_______83,350_______waffleironhead
3_______15,640_______geecee
4_______7,439________Keerthim


 

StefanR5R

Elite Member
World Community Grid overall position - 42
__Credit/week _ UserName
2_______1,290,029____xii5ku
BTW, the 1.3 M are from ≈50 hours of OPNG on GTX 1080Ti (>600 kPPD per card, if it can be fed with enough work).

Edit: May actually be more than 600k after complete validation.

Edit on Monday morning: Running for a full day on Sunday brought 1.1 M boinc credits per card.
 

voodoo5_6k

Senior member
Thanks for another round of stats, Stefan :)

BTW, the 1.3 M are from ≈50 hours of OPNG on GTX 1080Ti (>600 kPPD per card, if it can be fed with enough work).

Edit: May actually be more than 600k after complete validation.

Very interesting. Ever since they rolled out the GPU WUs, I've been kind of hesitant to give this a try (because of the WU supply). With F@H it's basically fire-and-forget, so I keep the GPU on that. It's too bad F@H does not use BOINC. If it did, it would be relatively easy to set it up like "run OPNG, and if you run out of tasks, run F@H". But with the proprietary F@H client, I assume there is no straightforward way of doing this, is there?
 

StefanR5R

Elite Member
It's too bad F@H does not use BOINC. If it did, it would be relatively easy to set it up like "run OPNG, and if you run out of tasks, run F@H".
The result would be that the host would end up running F@H all the time.

You either take measures in order to receive the OPNG work supply that you need, or you will receive almost none.

The extent to which you have to go to maintain an almost always full OPNG work queue depends on a) how fast your GPU is, b) whether or not there is a Formula Boinc sprint going on at WCG when you do that. :->

The WCG server is not simply giving the scarce OPNG work randomly to anybody. Instead, it is giving the work to those who beg for it most persistently and obtrusively.

But with the proprietary F@H client, I assume there is no straightforward way of doing this, is there?
It's possible, and easy, to do the other way around and configure boinc to auto-suspend while a FahCore is running. But you are right, FahClient does not have the same feature. At least as far as I am aware.

You would also lose the Quick Return Bonus (and the scientists wouldn't get the quick return they want) if FahClient were suspended arbitrarily. So the better way would be to pull work into the boinc client but not start it, send a command to FahClient to finish the current work unit without requesting more, and then resume the boinc client once the F@H work is done. This can be implemented as a script in bash or Python or a similar language; the necessary interfaces exist in both clients.
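
For illustration only, here is a rough, untested Python sketch of that hand-off. Everything specific in it is an assumption about a typical Linux setup: boinccmd talking to the local client, FahClient listening on its default command port 36330 without a password, the "finish"/"unpause" command names, and the crude sleep-based waiting. Treat it as the shape of the idea, not a recipe.

```python
#!/usr/bin/env python3
# Rough, untested sketch of the boinc/F@H hand-off described above.
# Paths, URLs, ports, FahClient command names and sleep intervals are
# assumptions about a typical Linux install.

import socket
import subprocess
import time

WCG_URL = "https://www.worldcommunitygrid.org/"  # project URL as the boinc client knows it
FAH_ADDR = ("127.0.0.1", 36330)                  # FahClient command socket (default port, no password assumed)


def boinccmd(*args) -> str:
    """Talk to the local boinc client via the boinccmd tool."""
    return subprocess.run(["boinccmd", *args],
                          capture_output=True, text=True, check=True).stdout


def fah_send(command: str) -> None:
    """Send a single command ('finish', 'pause', 'unpause', ...) to FahClient."""
    with socket.create_connection(FAH_ADDR, timeout=10) as sock:
        sock.sendall((command + "\n").encode())


def boinc_has_work() -> bool:
    """Crude check: does the client hold any downloaded, unfinished tasks?"""
    return "state: downloaded" in boinccmd("--get_tasks")


while True:
    # 1) Keep asking WCG for OPNG work while the GPU still belongs to F@H.
    #    (Keeping the buffered tasks from *starting* is the fiddly part,
    #    e.g. by suspending them individually with 'boinccmd --task ... suspend';
    #    glossed over here.)
    boinccmd("--project", WCG_URL, "update")

    if boinc_has_work():
        # 2) Let F@H finish its current WU cleanly (preserving the QRB)
        #    and stop it from requesting more work.
        fah_send("finish")
        time.sleep(30 * 60)          # crude: wait for the F@H slot to drain

        # 3) Hand the GPU to boinc until it runs out of OPNG tasks.
        boinccmd("--set_gpu_mode", "auto", "0")
        while boinc_has_work():
            time.sleep(5 * 60)

        # 4) Give the GPU back to F@H.
        boinccmd("--set_gpu_mode", "never", "0")
        fah_send("unpause")

    time.sleep(120)                  # stay within the server's request rate limit
```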

Edit:
Ever since they rolled out the GPU WUs, I've been kind of hesitant to give this a try (because of the WU supply).
Same here. It doesn't make sense to me to donate computer time to a project which does not need more computer time. I am running OPNG just now for the first time, for a limited time, out of curiosity how it works.

Edit 2:
For the same reason, it doesn't make sense to me to run the OPN1 CPU work, ever since OPNG came out of beta. This project is getting far more computer capacity donated via OPNG than they can actually use.
 

voodoo5_6k

Senior member
The WCG server is not simply giving the scarce OPNG work randomly to anybody. Instead, it is giving the work to those who beg for it most persistently and obtrusively.
A de facto "begging war" :D Very nicely put, as always :)

It's possible, and easy, to do the other way around and configure boinc to auto-suspend while a FahCore is running. But you are right, FahClient does not have the same feature. At least as far as I am aware.
[...]
Thanks for your assessment! However, having to dig into these interfaces and learn everything required to implement such a script is way beyond my current spare-time resources :D

Same here. It doesn't make sense to me to donate computer time to a project which does not need more computer time. I am running OPNG just now for the first time, for a limited time, out of curiosity how it works.
Couldn't agree more. Maybe I'll give it a shot when I have a vacation day (that'll give me some spare time to set it up and check on it while it's running). Until then, I'll just stick with F@H for the GPU. Also, it's old... and when you're old, you usually don't like change that much anymore... Don't want to upset it :D

Edit 2:
For the same reason, it doesn't make sense to me to run the OPN1 CPU work, ever since OPNG came out of beta. This project is getting far more computer capacity donated via OPNG than they can actually use.
I agree too, but I could still imagine that keeping the CPU work flowing lets them reach a few more participants (even though, for a given host, the throughput of that queue is greatly diminished by the turnaround times of the GPU work queue). In the grand scheme of things, however, the CPU work queue might still have a significant output. I don't know their mid-term plans, but if they manage to move more and more of the WU generation to OPNG, then the CPU work should become dispensable in the process (i.e. not worth the effort/maintenance on their end while facing an oversupply of computational capacity for OPNG). But maybe my recent observation of the shift to MCM WUs is already an indicator of that development, the shift happening because less OPN1 CPU work is available as well.

Anyhow, thanks again for your response :)
 

StefanR5R

Elite Member
Here are the complete stats of running OPNG on GTX 1080Ti, partly on Saturday, fully on Sunday, partly on Monday:

Statistics date   Runtime          Points earned   Results returned   active hours × GPUs   boinc PPD per GPU   results per day per GPU
2021-10-30        0:015:11:56:38   8,791,650       2,156              50                    602,856             1,035
2021-10-31        0:044:22:59:56   22,602,219      5,365              72                    1,076,296           1,788
2021-11-01        0:033:10:11:19   17,743,379      4,266              59                    1,031,092           1,735

Notes:
  • The low PPD and result rate on Saturday are most likely caused by validation lag.
  • Given that Sunday and Monday PPD and result rates are about the same, they should both be representative of the steady state.
  • These figures are only valid for setups which keep the work buffer fed the entire time.
  • The test was performed on a computer with 3 GTX 1080Ti GPUs and a Broadwell-EP CPU. The computer pulled 900…910 W at the wall, for >3 MPPD total.
  • My setup was perhaps not at peak efficiency. Most of the time, quite many tasks were running in parallel on the same GPU, maybe too many for best efficiency.
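
For anyone who wants to re-derive the two rightmost columns from the raw ones, here is a small sketch of the arithmetic as it appears to work out, assuming the usual 7:1 ratio between WCG points and boinc credit (an assumption on my part, though the numbers line up):

```python
# Sketch of how the derived columns appear to be computed
# (assumption: WCG awards 7 points per boinc credit).
rows = [
    # date,        points,     results, active GPU-hours
    ("2021-10-30",  8_791_650, 2_156, 50),
    ("2021-10-31", 22_602_219, 5_365, 72),
    ("2021-11-01", 17_743_379, 4_266, 59),
]
for date, points, results, gpu_hours in rows:
    gpu_days = gpu_hours / 24
    ppd_per_gpu = points / 7 / gpu_days          # boinc PPD per GPU
    results_per_gpu_day = results / gpu_days
    print(f"{date}: {ppd_per_gpu:,.0f} PPD/GPU, {results_per_gpu_day:,.0f} results/GPU/day")
# -> roughly 602,856 / 1,076,296 / 1,031,092 PPD per GPU, matching the table above.
```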
 

voodoo5_6k

Senior member
Most of the time, quite many tasks were running in parallel on the same GPU, maybe too many for best efficiency.
Are you using an app_config.xml for that, or are you relying solely on the device profile (or do you use/need both)? During the last Pentathlon I used both, if I recall correctly (for Einstein@Home).
 

StefanR5R

Elite Member
I had multiple tasks running per GPU simply because I needed to run multiple¹ boinc client instances in parallel in order to keep the GPUs busy.



________
¹) And by 'multiple', I really mean multiple.

(I needed especially many during the first part of the experiment, which happened to overlap with a Formula Boinc sprint at WCG, something I didn't realize at first. That means there were a few other users with many more and better GPUs than mine who did pretty much the same thing I did in this experiment, only even more extensively: vacuuming up the rare OPNG tasks. Even after the end of that sprint, though, I still needed a considerable number of client instances for sustained GPU utilization.)
 

biodoc

Diamond Member
I had multiple tasks running per GPU simply because I needed to run multiple¹ boinc client instances in parallel in order to keep the GPUs busy.
That's how you did it! :) I have 5 computers, each with at least one GPU, and I can only produce 500K points per day with one instance per computer.
 

StefanR5R

Elite Member
Elsewhere I saw an estimate that one GPU can be kept busy at WCG-OPNG with a single boinc client instance if all of the following conditions are met:
  • The GPU isn't very fast and takes maybe 10 minutes per task.
  • An "update-spam script" is running.
  • There isn't something extraordinary going on, like some Scots trying to win some contest.

Multiple boinc instances have the following effects:
  • The execution units of wider GPUs are not very well utilized by a single OPNG application instance. On a card like GTX 1080Ti, two concurrent tasks carve out more performance. This could be accomplished with an app_config.xml, or with concurrently running clients.
  • The spamming frequency which a script can achieve with a single client instance does not let you beg hard enough for as much work as a faster GPU can take. The spamming rate per client is limited to 1 work request / (~2 minutes) by the project server. Concurrently spamming clients on the same host therefore achieve a higher spamming rate per physical host. (A minimal sketch of such an update-spam loop follows after this list.)
  • Optionally, if you deploy a sufficient number of clients, a spam script is no longer needed because at some point the built-in work request retry policy of the clients becomes sufficient for continuous work supply.
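
To make the "begging" concrete, here is a minimal illustrative Python sketch of such an update-spam loop. The GUI RPC ports, project URL, and interval are placeholder assumptions (each concurrently running boinc client instance would listen on its own port, and a GUI RPC password may be required via --passwd, omitted here); it is not a recommendation, just the shape of the thing:

```python
#!/usr/bin/env python3
# Minimal "update-spam" loop (illustration only): ask the WCG project for
# work on a fixed interval, once per boinc client instance.  Ports and the
# interval are placeholders; the ~2-minute per-client server-side limit
# mentioned above is why multiple instances help.

import subprocess
import time

WCG_URL = "https://www.worldcommunitygrid.org/"
GUI_RPC_PORTS = [31416, 31417, 31418]   # one port per boinc client instance (example values)

while True:
    for port in GUI_RPC_PORTS:
        # --host lets one boinccmd talk to any of the concurrently running clients.
        subprocess.run(["boinccmd", "--host", f"localhost:{port}",
                        "--project", WCG_URL, "update"],
                       check=False)
    time.sleep(130)   # stay just above the server's per-client request limit
```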

PS: As I mentioned, I looked into this out of curiosity. In general, I don't want to run projects which already have enough contributors.
 

Assimilator1

Elite Member
Cheers for the stats, Stefan :), and for the info about WCG and GPU WUs!

Rosetta
10______35,927_______[TA]Assimilator1

WCG
17______38,959_______[TA]Assimilator1