
14th WCG Birthday Challenge

StefanR5R



Sign-up for the WCG Birthday Challenge is open and can be found here: https://www.worldcommunitygrid.org/team/challenge/viewTeamChallenge.do?challengeId=9145

TeAm AnandTech has signed up.

SETI.Germany's event page:
https://www.seti-germany.de/wcg/1_en_Welcome.html


The event starts less than 10 days from now. Also, most of WCG's sub-projects have a task deadline of 10 days. In other words: the race already started today at 00:00 UTC! 🙂

Our 2017 WCG Birthday Challenge thread:
https://forums.anandtech.com/threads/13th-wcg-birthday-challenge.2522851/

edit Nov 8: header image added
 
Has TeAm AnandTech joined this race yet?
In addition to the 100k ppd that I have going (not bunkered), I just started to bunker my 2990WX: 64 threads with 9 days of bunkering...
 
This thread is eerily quiet...

Hence, some stats from the previous events:
  • By the end of the 2017 and 2016 events, our respective forum threads had grown by 117 and 120 posts. :-)
  • Our points and ranks, copied from last year's thread:
9th challenge, 2013:
  • 76 teams produced 484 M points together.
  • The TeAm was 21st with 6.1 M (1.3 %).
10th challenge, 2014:
  • 94 teams produced 885 M points.
  • 484 M points were reached after 5d11h.
  • The TeAm was 26th with 10.1 M (1.1 %).
11th challenge, 2015:
  • 95 teams produced 756 M points.
  • 1d5h more would have been needed to exceed the previous year's total.
  • The TeAm was 25th with 6.6 M (0.9 %).
12th challenge, 2016:
  • 83 teams produced 873 M points.
  • 756 M points were reached after 5d23h.
  • The TeAm was 15th with 12.7 M (1.5 %).
13th challenge, 2017:
  • 95 teams produced 1,348 M points.
  • 873 M points were reached after 4d6h.
  • The TeAm was 8th with 73 M (5.4 %).
  • 35 TeAm mates contributed to this challenge.
2013:
19 _ MyOnlineTeam
20 _ AMD Users
21 _ TeAm AnandTech
22 _ BRAZIL - BRAZIL@GRID
23 _ SNURK and friends

2014:
24 _ Team MoonLiteShadow
25 _ SETI.USA
26 _ TeAm AnandTech
27 _ The Knights Who Say Ni!
28 _ SETIKAH@KOREA

2015:
23 _ MyOnlineTeam
24 _ BRAZIL - BRAZIL@GRID
25 _ TeAm AnandTech
26 _ BOINCstats
27 _ Cruncher Society

2016:
13 _ Crunching@EVGA
14 _ AMD Users
15 _ TeAm AnandTech
16 _ HardOCP
17 _ UK

2017:
6 _ TechPowerUp!
7 _ L'Alliance Francophone
8 _ TeAm AnandTech
9 _ Overclock.net
10 _ BOINC.Italy


And remember,
start is at November 16, 00:00:00 UTC,
finish is at November 22, 23:59:59 UTC.
 
Welcome cellarnoise!

I think this will be the first time I've participated in this event.

@StefanR5R, are the historical data you posted boinc or WCG points?
 
@StefanR5R, are the historical data you posted boinc or WCG points?
I believe these are WCG points (= 7 × BOINC credits, a.k.a. cobblestones).

We made 4.0 M BOINC credits in the April 2018 FB sprint at WCG. Assuming 3.5 days' worth of production, that would be 1.1 M BOINC-ppd, or 8 M WCG-ppd.

The 73 M WCG points from the 2017 birthday challenge, produced during ~2+7 = 9 days, would likewise mean 8 M WCG-ppd.

--------
PS,
at the WCG Christmas race on December 1-25, 2017, we made 105 M WCG points. That was 4.2 M WCG-ppd on average. (This was not much more than 25 days' production, since we practically did not bunker before that race.)
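The conversions above can be sketched in a few lines. This is just back-of-the-envelope math, assuming (as stated above) 1 BOINC credit = 7 WCG points; all figures are the ones quoted in this post, not fresh stats:

```python
# Rough ppd arithmetic from the posts above.
# Assumption: 1 BOINC credit (cobblestone) = 7 WCG points.
WCG_PER_BOINC = 7

def wcg_ppd(points_wcg: float, days: float) -> float:
    """Average WCG points per day over a race."""
    return points_wcg / days

# April 2018 FB sprint: 4.0 M BOINC credits over ~3.5 days of production
boinc_ppd = 4.0e6 / 3.5                       # ~1.1 M BOINC-ppd
print(boinc_ppd * WCG_PER_BOINC)              # ~8.0 M WCG-ppd

# 2017 birthday challenge: 73 M WCG points over ~2+7 = 9 days
print(wcg_ppd(73e6, 9))                       # ~8.1 M WCG-ppd

# December 2017 Christmas race: 105 M WCG points over 25 days
print(wcg_ppd(105e6, 25))                     # ~4.2 M WCG-ppd
```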
 
I have discovered that the Microbiome Immunity Project has generated large numbers of computation errors on a few of my systems for an as yet unknown reason. For now, I have discontinued participation in that project.
 
It's interesting that MIP is using Rosetta. I'll have to try to dig a bit and see how and why they are using it.
 
Well, I did some quick checking. One of my 1950Xs is doing 4:40 CPU time on Mapping Cancer Markers. The 2990WX is doing 3:80.
Well, that means WCG isn't hampered by the NUMA issue, assuming it occupies all 64 logical cores. Thanks for the info.
 
So, what's our target this year? Top 5 would be great.
Considering we were 8th last year, 10th this year would already be a big improvement. That's because, in contrast to 2017, three "financial services" ventures are registered as "teams" in this challenge (Byteball.org with a 14 M boinc-PPD daily average, BiblePay with 6.6 M, Gridcoin with 6.4 M; I took these averages from Free-DC. For comparison, IBM's average is 8 M boinc-PPD. Multiply by 7 to arrive at WCG-PPD; the challenge stats are counted in WCG points.)
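Applying the by-7 conversion to those Free-DC daily averages (the figures quoted above, not freshly fetched):

```python
# Convert Free-DC daily averages (BOINC credits/day) to WCG points/day.
# Assumption, per the post above: 1 BOINC credit = 7 WCG points.
avg_boinc_ppd = {
    "Byteball.org": 14e6,
    "BiblePay": 6.6e6,
    "Gridcoin": 6.4e6,
    "IBM": 8e6,
}
for team, ppd in avg_boinc_ppd.items():
    print(f"{team}: {ppd * 7 / 1e6:.1f} M WCG-ppd")
```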

Of the 7 teams which finished in front of us in 2017, the following ones are now (14 hours after the start) still behind us: HardOCP, Planet 3DNow!, TechPowerUp!, L'Alliance Francophone.

I do wonder whether SETI.USA have set goals for themselves in this year's birthday challenge, unlike in 2017. They were the only real team that succeeded in getting into the 2018 Thor Challenge final round.

And to name just one more strong team of those still behind us: Decrypton. They did not participate in the 2017 birthday challenge, but they outpaced us by far in a month-long challenge in December 2017.

Erm, @Markfw, I'm interested in a result comparison between your 1950X and 2990WX. Do you have any numbers?
Well, I did some quick checking. One of my 1950Xs is doing 4:40 CPU time on Mapping Cancer Markers. The 2990WX is doing 3:80.
When I measured MCM performance on my own hosts in November 2017, I found considerable variations between tasks on one and the same host:
  • Run times varied between 3 h and 5 h.
  • Granted credit per hour run time varied between 20 and 50 BOINC credits/hour.
  • Samples of more than 50 tasks still had coefficients of variation in excess of 10 %, both for run time and for credits per run time.
If this variation is still going on in current MCM batches, then an average of points-per-day from a considerable number of tasks needs to be taken, before comparisons between different hardware can be made.

(Furthermore, to compare different hardware, use the same OS. Some WCG subprojects have a considerable performance difference between their Linux and Windows applications. I suspect MCM doesn't differ a lot per OS, but I haven't actually checked.)
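For reference, the variation metric mentioned above, the coefficient of variation (CV = standard deviation / mean), is easy to compute from a sample of task run times. The run times below are made-up illustrative values, not real MCM results:

```python
# Coefficient of variation of per-task run times (CV = stddev / mean).
# The sample here is hypothetical; real numbers would come from your
# WCG results page. A CV above ~10 % means single-task comparisons
# between hosts are unreliable.
import statistics

run_times_h = [3.1, 4.8, 3.6, 4.2, 5.0, 3.3, 4.5, 3.9]  # hours, made up

mean = statistics.mean(run_times_h)
cv = statistics.stdev(run_times_h) / mean
print(f"mean {mean:.2f} h, CV {cv:.1%}")
```

With a spread like this, averaging points-per-day over many tasks (as suggested above) is the only way to get a fair cross-hardware comparison.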
 
When I measured MCM performance on my own hosts in November 2017, I found considerable variations between tasks on one and the same host:
  • Run times varied between 3 h and 5 h.
  • Granted credit per hour run time varied between 20 and 50 BOINC credits/hour.
  • Samples of more than 50 tasks still had coefficients of variation in excess of 10 %, both for run time and for credits per run time.
If this variation is still going on in current MCM batches, then an average of points-per-day from a considerable number of tasks needs to be taken, before comparisons between different hardware can be made.

(Furthermore, to compare different hardware, use the same OS. Some WCG subprojects have a considerable performance difference between their Linux and Windows applications. I suspect MCM doesn't differ a lot per OS, but I haven't actually checked.)
Thanks for reminding me. I believe you'd told me about this last year, but I totally forgot about that.
 