
The 12th Annual Folding@Home Holiday Season Race


The proposal for a name: The Cancer Terminators (Terminators) vs. The Cancer Eradicators (Eradicators)

  • Yes

    Votes: 14 93.3%
  • No

    Votes: 1 6.7%
  • Try something else ....

    Votes: 0 0.0%

  • Total voters
    15
  • Poll closed.
NOT liking that uptick in Brony's production. We will have to keep an eye on them, maybe check the straps on the saddles as we've been riding them pretty hard. 😀

Upgraded a bit today, having some issues.
  • 1st, tired of Win10 begging me to install the Fall Update, so I allowed that. Got an "OpenCL not installed" error for Folding upon completion.
  • Installed the latest Nvidia Driver, restarted, OpenCL error gone.
  • Swapped out a 1080 for a 1080Ti, Folding tasks are failing immediately.
  • Removed GPU slot. Added GPU slot. Folding tasks are failing immediately.
  • Currently reinstalling latest Nvidia Driver. If that doesn't work...
  • Will try older driver.
  • Will try Folding reinstallation.
 
The 1080Ti is putting out an extra 225-300k ppd, as expected. Linux is having some issues with the 'old' 1080 being added to the other system, though...
 
"Cinnamon" desktop crashes at startup with the 2nd 1080, but it still BOINCs and Folds, so... who cares!? 🙂 So that's an additional 800k ppd on Thunder-Strike v2.
 
[attached graph: 1125vdt.jpg]


Now, this is not looking too good. We are losing ground. If the pace keeps up, the race will be too tight for my taste. Please try to motivate team members to take part in the race.
We need more folders!
We could still use more participants to help turn that Brony smirk that's forming in the graph above upside down! People like, say, @brownstone, @GLeeM, @VirtualLarry, and @vsteel, whatever your forum name is.
 
I haven't had the nerve to take any of my Nvidia boxen out of production long enough to switch to Linux.
On the only Windows host of mine that I converted, which has fast Pascal GPUs and a slow Xeon CPU, the gain in PPD was above 20 %.* I don't recall how long the conversion took me, but it went fast because (1.) I used a separate, new disk for the installation (hence didn't have to shrink and repartition the Windows disk, and had the peace of mind that the Windows installation was at no risk of corruption), and more importantly, (2.) I undertook this shortly after I performed fresh Linux + NV-driver + F@H installations on completely new PCs, hence knew the drill, thanks to the tremendous help that I found here in the forum plus general familiarity with Linux. Anyway, a 20 % gain means that if this caused 2 hours of downtime,** I would have been even after just 10 hours of folding under Linux.
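That break-even arithmetic generalizes easily; here is a minimal sketch (hypothetical helper name, and it assumes the box earns zero PPD during the downtime):

```python
def break_even_hours(ppd_gain, downtime_h):
    """Hours of Linux folding needed to recoup conversion downtime.

    ppd_gain:   fractional PPD gain over Windows (e.g. 0.20 for 20 %)
    downtime_h: hours of lost production during the conversion
    """
    # During the downtime the box falls behind the Windows baseline by
    # `downtime_h` hours of output. Afterwards it produces `ppd_gain`
    # extra output per hour, so it catches up after:
    return downtime_h / ppd_gain

print(break_even_hours(0.20, 2.0))   # about 10 hours, as in the post
print(break_even_hours(0.20, 0.75))  # under 4 hours for the footnoted estimate
```

So even the pessimistic 2-hour downtime estimate pays for itself well within half a day of folding.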

Given that this is a 2 months long race, I think the question is not so much how much downtime such a conversion will cause vs. what the percentage of gained PPD will be for the particular hardware, but rather the question is how high are the odds that the conversion succeeds in the first place. (In my case, success was almost guaranteed, due to the factors that I mentioned.)

One thing which I miss for GPU DC hosts with Linux, is some of the sensors support, and fan control outside the BIOS (including pump speed control, as this is a watercooled PC). But I can do without the latter because I have dialed in the radiator sizes, fan speeds, and pump speed already under Windows.***

*) There was also a gain in power efficiency, but not as much as the gain in PPD. I haven't measured it properly.
**) must have been far less, perhaps 3/4 hour, with all downloads and a bit of trial-and-error on the GRUB command line
***) although Linux uses more power, due to considerably higher GPU utilization
 
@StefanR5R, I may try this weekend, starting with one of the lesser producers with a 1060. Then perhaps disk images can be made and edited for deployment to other machines. I may be proficient enough to get a Linux setup going (I've done it numerous times, just not for DC), but may not have the required knowledge to fix problems as they occur. That's probably my biggest concern: extended downtime caused by the inability to fix things quickly.
 
I haven't had the nerve to take any of my Nvidia boxen out of production long enough to switch to Linux.
It only seems to work for one boot, but the entire install is 45 minutes once you get good at it! I have it documented.

Edit: yes, always use a new disk for Linux, to make it easy to go back to Windows if required. And the installer will set up dual-boot across the two disks automatically.
 
It only seems to work for one boot, but the entire install is 45 minutes once you get good at it! I have it documented.

Edit: yes, always use a new disk for Linux, to make it easy to go back to Windows if required. And the installer will set up dual-boot across the two disks automatically.
Well, that's what I mean about knowing how to fix it, something tells me that the failure to boot is probably something fairly simple, IF there is the knowledge to know where to look, which I know I don't have.
 
I haven't rebooted my Linux F@H boxes often (didn't need to). The few reboots went OK. Some properties of my gear: So far, I only have ASRock Z270 and Asus X99 boards in GPU hosts, and only Pascal GPUs. I disabled automatic start of both boinc-client and FAHClient. Dunno if any of that matters.
 
Well, that's what I mean about knowing how to fix it, something tells me that the failure to boot is probably something fairly simple, IF there is the knowledge to know where to look, which I know I don't have.
Oh, it will boot the second time, with a black screen. And without being able to log in, you can't start things. And are they running? Not sure how Tony knows.
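For what it's worth, the GUI isn't needed to check whether the clients are alive; any process listing over SSH will do. A minimal sketch (Linux-only, reads /proc directly; the helper name is illustrative):

```python
import os

def is_running(proc_name):
    """Return True if a process with this comm name exists (Linux /proc)."""
    if not os.path.isdir('/proc'):
        return False  # not a Linux box, or /proc not mounted
    for pid in filter(str.isdigit, os.listdir('/proc')):
        try:
            # /proc/<pid>/comm holds the process name (truncated to 15 chars)
            with open(f'/proc/{pid}/comm') as f:
                if f.read().strip() == proc_name:
                    return True
        except OSError:
            continue  # process exited while we were scanning
    return False

for name in ('FAHClient', 'boinc'):
    print(name, 'running' if is_running(name) else 'not found')
```

In practice, `pgrep FAHClient` over SSH, or tailing the client's log file, answers the same question without any GUI login.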
 
So it's a failure to load the GUI, or perhaps wrong display mode selected? I remember having to work on those things learning about Linux back in the day, and they aren't terribly hard, just time invested in the learning curve. These days I am trying to keep notes, since my memory is not what it used to be.
 
So it's a failure to load the GUI, or perhaps wrong display mode selected? I remember having to work on those things learning about Linux back in the day, and they aren't terribly hard, just time invested in the learning curve. These days I am trying to keep notes, since my memory is not what it used to be.
Well, that's for my AMD TR hosts. The only Intel one crashes and will not boot at all. But everything works the first time. I have a thread about it and tried a lot of fixes, so far no luck. After this contest is over I may work on it again.
 
Well, the tracking data on my GPU orders seems to have gone into limbo, not good. No current estimated delivery date. OK, I get that we just had a blizzard-like storm here on the coast, but it would be nice to know when things are going to arrive so I can be here to pick them up. Don't want to leave an expensive card out in the snow.
 
Welcome back to the F@H race, @Howdy2u2! We missed you.

Thanks, I never really went anywhere though. I just need to keep my costs down, outside temps are in the negatives lately. I can handle a higher electric bill for a bit but not that and a high gas bill to boot!!
 
Thanks, I never really went anywhere though. I just need to keep my costs down, outside temps are in the negatives lately. I can handle a higher electric bill for a bit but not that and a high gas bill to boot!!

I just heat my place with GPUs 😀
 
I just heat my place with GPUs 😀

Yes, I was too. Up until the temperatures went negative. Upstairs was a toasty 70°F, the basement was 45-50°F since the furnace wasn't running at all. Those temperatures are too close for me; the power has flickered in the last few days (enough to have to reset clocks) and I just don't want to take a chance on freezing pipes if the power happened to go out for a while.
 
Yes, I was too. Up until the temperatures went negative. Upstairs was a toasty 70°F, the basement was 45-50°F since the furnace wasn't running at all. Those temperatures are too close for me; the power has flickered in the last few days (enough to have to reset clocks) and I just don't want to take a chance on freezing pipes if the power happened to go out for a while.

That makes sense. If I had a basement I would certainly be putting a rig or two down there to more evenly heat the place.

As it is, I'm consistently around 80°F inside, +/- 3 degrees, unless it's in the single digits or below.
 