Returning to F@H

Discussion in 'Distributed Computing' started by mojothehut, Apr 22, 2017.

  1. mojothehut

    mojothehut Senior member

    Joined:
    Feb 26, 2012
    Messages:
    312
    Likes Received:
    2
    Hey all. I've been out of the Folding scene for a number of years.
    But now I want to return and build a $600-ish rig for 24/7 Folding.

    I'm wondering about a few things.
    Is CPU folding still relevant? I've heard GPU is the way to go now.
    I was thinking about a Ryzen 1600 for the 6-core goodness. But if
    a GPU beats even a 6-core, should I buy a cheap CPU like an i3
    and pair it with a midrange GPU?

    Original idea: Ryzen 1600 + RX 550
    Second idea: i3 Skylake/Kaby + Nvidia 1050/1050 Ti/1060
    I'm a little hesitant about GPU folding because of heat issues; I've
    heard some horror stories. I plan on having a very well-ventilated
    case, though. Also, I've never GPU folded =)
    Thanks!
    p.s I plan on contributing to team Anandtech =)
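
    For reference, once the v7 client is installed, joining a team is just a matter of a few lines in the client's config.xml. A minimal sketch (the passkey is a placeholder, and double-check TeAm AnandTech's team number on the team stats page before committing the rig to 24/7):

    ```xml
    <!-- Minimal config.xml sketch for the v7 FAHClient.
         The passkey is a placeholder; request your own from the
         Folding@home website. Verify the team number against the
         team stats page. -->
    <config>
      <user value='mojothehut'/>
      <team value='198'/>
      <passkey value='0123456789abcdef0123456789abcdef'/>
      <power value='full'/>

      <!-- One folding slot per device: CPU and/or GPU. -->
      <slot id='0' type='CPU'/>
      <slot id='1' type='GPU'/>
    </config>
    ```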
     
    Smoke likes this.

  3. StefanR5R

    StefanR5R Senior member

    Joined:
    Dec 10, 2016
    Messages:
    658
    Likes Received:
    273
    Under favorable conditions (the right type of work units available), a dual-socket server's CPU folding throughput equals that of a single midrange/upper-midrange consumer GPU.

    As for the choice of GPU, there is a performance table at overclock.net, for example. Note that entries in that database with sample sizes below about 30 are unreliable and best ignored, because Folding@Home sends out work units with a wide range of points per WU.

    I'd say at least as important as the amount of airflow through the case is how the air is guided through it. You don't want the GPU heating up other components too much. Leaving the side panel off mostly solves this problem, even with dual GPUs.
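
    To put a rough number on the small-sample caveat: when per-WU credit varies a lot, an average taken over a handful of WUs swings wildly. A quick simulation with made-up numbers (the 100k-point mean and 40k spread are invented, purely to illustrate):

    ```python
    import random
    import statistics

    random.seed(42)

    # Hypothetical per-WU credit: mean 100k points with a large spread,
    # mimicking the wide range of points per WU that F@H hands out.
    def sample_wu_points():
        return random.gauss(100_000, 40_000)

    def mean_of_n(n):
        return statistics.mean(sample_wu_points() for _ in range(n))

    # Spread of the estimated average across many independent benchmark entries.
    small = [mean_of_n(5) for _ in range(1000)]    # entries based on 5 WUs
    large = [mean_of_n(100) for _ in range(1000)]  # entries based on 100 WUs

    print(f"stdev of 5-WU averages:   {statistics.stdev(small):,.0f}")
    print(f"stdev of 100-WU averages: {statistics.stdev(large):,.0f}")
    ```

    The 5-WU averages scatter several times more than the 100-WU ones, which is why the low-sample rows in that table jump around so much.
    
    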
     
  4. StefanR5R

    StefanR5R Senior member

    Joined:
    Dec 10, 2016
    Messages:
    658
    Likes Received:
    273
    PS:
    there are of course lots of other distributed computing projects that rely solely on the CPU, because they haven't yet been, or simply can't be, ported to GPU.
     
  5. crashtech

    crashtech Diamond Member

    Joined:
    Jan 4, 2013
    Messages:
    7,061
    Likes Received:
    380
    From a budget standpoint, I think your second option is one of the better ways to go. From a power efficiency and space saving standpoint, a quad core and a pair of 1070s are even better.
     
  6. TennesseeTony

    TennesseeTony Elite Member

    Joined:
    Aug 2, 2003
    Messages:
    2,060
    Likes Received:
    424
    Welcome back @mojothehut !

    As for your $600-ish budget: scrape together whatever computer you can find and toss in a used EVGA GTX 1070 from an auction site. It should run you less than $350 (way less if you're savvy and patient). They are all a year old or less, and EVGA transfers the remainder of the 3-year warranty to the next owner. Keep the card at stock clocks and your power usage is only 150 watts, far better than the 250-watt cards all the heat-issue horror stories are based on. Just last December my pair of 1070s was getting 1,300,000 to 1,400,000 points per day, so up to 700k PPD each.

    I had a third 1070 in a separate system with a slower CPU clock speed (both systems the same architecture, Haswell-E), and its PPD was diminished, as the GPU requires one full thread or core to feed it. So when scraping together your basic system, lean towards more clock speed and fewer cores if you have to compromise to stay within budget. And try to keep in mind that once you are totally addicted again, that system will need room for a second GTX 1070. ;)

    Now take that with a grain of salt, because the current tasks are NOT the same as the ones running in December. I picked up some 1080s when the Ti came out, and I'm barely getting 650k PPD each from them in the fast system; I can only assume the current generation of work units just doesn't pay as much for some reason. But the bottom line stands: you really are just looking to buy a GPU, with enough budget left over to piece together a system whose sole function is to support it.

    Money saving tips:
    • You can buy scrap OS keys for very little on auction sites.
    • To save on Keyboard/Mouse/Monitor costs, just use TeamViewer to monitor this dedicated system.
    • To save energy, don't bother overclocking; power use goes through the roof in no time. Did I mention the 1070 has outstanding performance for only 150 watts?
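
    If you'd rather skip TeamViewer, the v7 client also has its own remote interface: allow your LAN in config.xml and point FAHControl on your main PC at the rig over port 36330. A sketch (the subnet and password below are placeholders for your own network):

    ```xml
    <!-- Sketch: let FAHControl on another LAN machine monitor this rig.
         Subnet and password are placeholders. -->
    <config>
      <!-- Hosts allowed to connect to the client's command port (36330). -->
      <allow value='127.0.0.1 192.168.1.0/24'/>
      <!-- Password FAHControl must supply when connecting. -->
      <password value='changeme'/>
    </config>
    ```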
     
    #5 TennesseeTony, Apr 22, 2017
    Last edited: Apr 22, 2017
    petrusbroder and crashtech like this.
  7. bds71

    bds71 Member

    Joined:
    Nov 29, 2016
    Messages:
    60
    Likes Received:
    31
    What he said (Tony)!

    For a budget system like you want, go with GPU power. A used dual-CPU server setup can be found at a reasonable price, but is often more expensive than brand-new mainstream gear. And the 1070 (mentioned above) is a great choice both for initial cost and for energy costs. Tip: go as new as possible with your components, since newer tech gives better efficiency and lower energy costs.

    Dumb question: AMD 1600 + Nvidia 1070 — is this no longer possible? I used Nvidia with AMD before (back in the day), but that was before they bought ATI...
     
  8. TennesseeTony

    TennesseeTony Elite Member

    Joined:
    Aug 2, 2003
    Messages:
    2,060
    Likes Received:
    424
    Oh, yes, Nvidia certainly runs on AMD CPU systems.
     
  9. bds71

    bds71 Member

    Joined:
    Nov 29, 2016
    Messages:
    60
    Likes Received:
    31
    I thought it did/would, but since they bought ATI I only ever seem to see folks recommending AMD GPUs with AMD processors. Thanks for the confirmation, though :)