Thinking out loud - render farm

mazeroth

Golden Member
Jan 31, 2006
1,821
2
81
Back around 10 years ago I was pretty good at 3D modeling, but what killed it was the time it took to render a scene. I would set the resolution to something pathetic like 320x240, give it a day, and maybe have a 10-second clip to show for it.

I was browsing through the hot deals and thought of something. You can purchase an AMD X4 840 (3.1 GHz stock) with a motherboard for $99. I have the 630 version of this chip and can hit 3.5 GHz without even trying. Throw in $50 for cheap RAM, a used hard drive, and a used or cheap power supply, and you have a functioning quad core for $150. No need for a case for the render farm; a sheet of 4x8 ply will do! Now, buy 12 of these or so, and for under $2,000 you literally have 48 cores at 3.5 GHz for network rendering or any other number crunching you may desire. That's 168 GHz in aggregate.
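For a sanity check, here's the math as a quick Python sketch. The node count, cost, and clocks are from this post; the clip length, frame rate, and per-frame render time are made-up assumptions, just to show the scaling:

```python
nodes = 12
cores_per_node = 4            # AMD X4 840 is a quad core
clock_ghz = 3.5               # assumed overclock, per the post
cost_per_node = 150           # USD: CPU+board $99, RAM ~$50, used HD/PSU

total_cores = nodes * cores_per_node      # 48
aggregate_ghz = total_cores * clock_ghz   # 168 "GHz"
total_cost = nodes * cost_per_node        # $1800, under $2000

# Network rendering is embarrassingly parallel: frames are independent,
# so 48 cores approach a 48x wall-clock speedup.
frames = 10 * 30              # hypothetical 10-second clip at 30 fps
hours_per_frame = 0.5         # hypothetical single-core render time

single_core_hours = frames * hours_per_frame   # 150 h, almost a week
farm_hours = single_core_hours / total_cores   # ~3.1 h

print(f"{total_cores} cores, {aggregate_ghz:.0f} GHz aggregate, ${total_cost}")
print(f"clip: {single_core_hours:.0f} h on one core vs {farm_hours:.1f} h on the farm")
```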

I think I drained about $1,500 into my PC 10 years ago (a 1.2 GHz Athlon, if I remember right), and what you can build for that kind of money now is just staggering, especially for rendering 3D.

That is all. I will now go to bed!
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,740
156
106
I did something similar to what you describe when the Core 2 first came out.
My reasons were distributed computing on Linux.
It was about the same price you list, too.

Built two:
E4300
512MB RAM
80GB drives
a table for a case

They overclocked a solid GHz above stock, which was exciting.

If you do go the route you mention, I highly suggest cases; dust build-up is not a joking matter.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
If you do go the route you mention, I highly suggest cases; dust build-up is not a joking matter.

Nah, you just hang it on a wall...

3rdBeowulfPrototype.jpg


or hang them upside down:

2ndBeowulfPrototype.jpg
 

Drsignguy

Platinum Member
Mar 24, 2002
2,264
0
76
Well, isn't this thinking outside the box, so to speak. From a guy's perspective, at least it isn't like the chick clutter they always buy to fill up blank wall space! :)
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
I don't know if this applies to rendering, but at work we do some physics modeling, and a single GTX 580 is about as fast as 12 high-end Intel cores. We have a larger cluster with many cores, but for individual engineer workstations, a single computer with 1 or 2 of these is significantly easier to manage than a small cluster. If there is software available for GPUs, you may want to check out how viable it is.
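As a rough plausibility check of that claim, here's a back-of-the-envelope in Python. The peaks are spec-sheet figures, and peak FLOPS is only a crude proxy for real physics workloads, so treat it as order-of-magnitude only:

```python
# Spec-sheet peaks only; real codes land well below peak on both sides,
# and memory bandwidth often dominates, so this is order-of-magnitude.

# GTX 580: 512 CUDA cores at ~1.544 GHz shader clock, 2 FLOPs/cycle (FMA)
gpu_gflops = 512 * 1.544 * 2               # ~1581 GFLOPS single precision

# A ~3 GHz core with 128-bit SSE: 4 SP lanes x 2 (mul+add) = 8 FLOPs/cycle
cpu_core_gflops = 3.0 * 8                  # 24 GFLOPS per core
cpu_12core_gflops = 12 * cpu_core_gflops   # 288 GFLOPS

print(f"GPU peak:     {gpu_gflops:.0f} GFLOPS")
print(f"12 CPU cores: {cpu_12core_gflops:.0f} GFLOPS")
print(f"peak ratio:   {gpu_gflops / cpu_12core_gflops:.1f}x")
# ~5.5x on paper; parity with ~12 cores in practice is plausible once
# real-world GPU efficiency and host-device transfer overhead kick in.
```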
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
While that certainly looks cool, it's got to be quite loud, and dust should still be a problem, shouldn't it?

It wasn't loud (that photo is of computers I bolted to my office wall), and dust was not an issue for the 3 years those computers were there (operating at full load 24x7).

Now if you want loud: when I scaled up to twelve computers in "one case" that used box fans to form a pass-through wind tunnel for cooling... now that had to sit in an unoccupied room elsewhere in the building :eek:

Here's 4 computers to "one shelf":
FinalBeowulf4-packtray.jpg


Which slid into a box-case that housed three such shelves:
FinalBeowulfassembly4-packtray.jpg


Of course one is never enough, so I built two of them :awe:
FinalBeowulfs-WalmartSpecial.jpg
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
@IDC: I'm in awe :D Looks stylish, though I'd have a hard time finding a use for that much computing power (although with the right SSD array I could probably stop taking a short break whenever I've got to recompile a large chunk of the program I'm currently working on... on second thought, I think I like how it is now better ;) )
 

paperwastage

Golden Member
May 25, 2010
1,848
2
76
I don't know if this applies to rendering, but at work we do some physics modeling, and a single GTX 580 is about as fast as 12 high-end Intel cores. We have a larger cluster with many cores, but for individual engineer workstations, a single computer with 1 or 2 of these is significantly easier to manage than a small cluster. If there is software available for GPUs, you may want to check out how viable it is.
Depends on what kind of stuff you want to do.

parallel processing != Intel cores

And parallel programming on GPUs sucks :(
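To illustrate the distinction (a minimal sketch, not anyone's actual code here): CPU cores suit coarse, independent tasks, while a GPU wants the same operation applied across a big array in lockstep. Code in the second style ports to a GPU far more readily than the first:

```python
import multiprocessing as mp
import numpy as np

def simulate(seed):
    # Coarse-grained, independent task: fine on a CPU core, awkward on a GPU
    rng = np.random.default_rng(seed)
    return rng.normal(size=1000).sum()

if __name__ == "__main__":
    # Task parallelism: one independent job per CPU core
    with mp.Pool(4) as pool:
        per_task = pool.map(simulate, range(8))

    # Data parallelism: one operation over a big array, the GPU-friendly shape
    x = np.arange(1_000_000, dtype=np.float32)
    y = np.sqrt(x) * 0.5 + 1.0    # same op on every element, no branching

    print(sum(per_task), y[:3])
```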
 
Apr 20, 2008
10,067
990
126
It wasn't loud (that photo is of computers I bolted to my office wall), and dust was not an issue for the 3 years those computers were there (operating at full load 24x7).

Now if you want loud: when I scaled up to twelve computers in "one case" that used box fans to form a pass-through wind tunnel for cooling... now that had to sit in an unoccupied room elsewhere in the building :eek:

Here's 4 computers to "one shelf":
FinalBeowulf4-packtray.jpg


Which slid into a box-case that housed three such shelves:
FinalBeowulfassembly4-packtray.jpg


Of course one is never enough, so I built two of them :awe:
FinalBeowulfs-WalmartSpecial.jpg

That's intense.

Let me guess, an i7-980X blows away all of those machines combined?
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
Of course one is never enough, so I built two of them :awe:
FinalBeowulfs-WalmartSpecial.jpg
IDC, what was the tower of wrapped wire on top of those for?

That is a pretty awesome setup you had, though. Would you consider doing it again with more modern equipment?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
IDC, what was the tower of wrapped wire on top of those for?

Just Ethernet cables. At the time the whole cluster was wired together with standard 100 Mbit Ethernet.

That is a pretty awesome setup you had, though. Would you consider doing it again with more modern equipment?

Yep. Actually, I've done it twice since then: the next iteration of that cluster was 24 3 GHz P4s, done up more expensively in a server rack and upgraded to a 1 Gbit network fabric. (No pics.)

I retired that system and built a six-computer 3.3 GHz quad-core (OC'ed Q6600) system:
IMG_5458_small.jpg

(only four nodes are shown here)

This system is nearing its EOL and is going to be replaced with either a cluster of 2600Ks or Zambezis (I won't know which until Zambezi is released and prices are established).
 

cantholdanymore

Senior member
Mar 20, 2011
447
0
76
It wasn't loud (that photo is of computers I bolted to my office wall), and dust was not an issue for the 3 years those computers were there (operating at full load 24x7).

Now if you want loud: when I scaled up to twelve computers in "one case" that used box fans to form a pass-through wind tunnel for cooling... now that had to sit in an unoccupied room elsewhere in the building :eek:

Here's 4 computers to "one shelf":


Which slid into a box-case that housed three such shelves:


Of course one is never enough, so I built two of them :awe:

Let me guess, this was Frankenstein's brain. Now if you could just slip in a picture of the creature :biggrin:
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
Just Ethernet cables. At the time the whole cluster was wired together with standard 100 Mbit Ethernet.



Yep. Actually, I've done it twice since then: the next iteration of that cluster was 24 3 GHz P4s, done up more expensively in a server rack and upgraded to a 1 Gbit network fabric. (No pics.)

I retired that system and built a six-computer 3.3 GHz quad-core (OC'ed Q6600) system:
IMG_5458_small.jpg

(only four nodes are shown here)

This system is nearing its EOL and is going to be replaced with either a cluster of 2600Ks or Zambezis (I won't know which until Zambezi is released and prices are established).

So you have that in your living room? The baby gate and toys are kind of a giveaway that it is at least an extra room that the kids play in. My living room is in a constant state of mess from the toys my daughter doesn't seem to know how to put away.

I just thought the idea of building your own server rack complete with box fans was really cool, and it was awesome to see it. It is so much more gratifying to build your own solution than to buy a prebuilt one, but that requires time and effort that not many parents have. (I know I haven't taken on any of those types of projects since Grace was born. Being a single dad doesn't give you much time to do anything for yourself.)

Is there a reason you have always stayed at 24 cores on your systems? Is that the point where there are no longer any positive returns for your particular software?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
So you have that in your living room? The baby gate and toys are kind of a giveaway that it is at least an extra room that the kids play in. My living room is in a constant state of mess from the toys my daughter doesn't seem to know how to put away.

Nah, that was the basement (previous house).

It's a passion/hobby of mine, and I've managed to incorporate the need for such compute resources into my various walks of life.

The first ones (Frankenstein's electronic brain :p) were for academia; they were actually a critical piece of hardware that I needed in order to complete the research for my doctorate.

The P4 cluster was for my job at TI (hence no pics).

The current Q6600 cluster is for my business (algo trading of foreign currency), and hence it is located in my house. This also happens to make me acutely aware of the power consumption: the cluster adds $100 a month to my electric bill.
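That figure roughly checks out; here's a quick sketch, where the per-node wattage and electricity rate are assumptions rather than numbers from the post:

```python
nodes = 6
watts_per_node = 250    # assumed full-load draw for an OC'd quad-core box
rate_per_kwh = 0.09     # assumed $/kWh

kwh_per_month = nodes * watts_per_node / 1000 * 24 * 30   # ~1080 kWh
print(f"~${kwh_per_month * rate_per_kwh:.0f}/month")      # ~$97
```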

I just thought the idea of building your own server rack complete with box fans was really cool, and it was awesome to see it. It is so much more gratifying to build your own solution than to buy a prebuilt one, but that requires time and effort that not many parents have. (I know I haven't taken on any of those types of projects since Grace was born. Being a single dad doesn't give you much time to do anything for yourself.)

Is there a reason you have always stayed at 24 cores on your systems? Is that the point where there are no longer any positive returns for your particular software?

It was a lot of fun designing and building it, and even more enjoyable putting it to work :)

24 cores has just kinda been a coincidence.

The software scales well for small clusters like these. But the most recent build was designed entirely with forex in mind, and my broker at the time had 19 currency pairs, so I built to handle each currency pair on its own core (at 100% load), plus the overhead needed for assembling results and managing live trading on multiple accounts.

Nowadays brokers have upwards of 70 currency pairs, so I doubt 24 cores is going to be sufficient for my next build.
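As a sketch of that sizing logic (the pair counts are from this post; the five-core overhead is inferred from 24 minus 19, not stated explicitly):

```python
def cores_needed(pairs, overhead_cores=5):
    # One pinned core per currency pair at 100% load, plus cores for
    # assembling results and managing live trading across accounts.
    return pairs + overhead_cores

print(cores_needed(19))   # 24: matches the Q6600 cluster (6 nodes x 4 cores)
print(cores_needed(70))   # 75: why 24 cores won't cut it for the next build
```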

IDC, what on earth are you using that much computing power for?

Gaussian, an ab initio computational chemistry package. And MetaTrader4, an automated foreign exchange (forex) trading platform.
 

mnewsham

Lifer
Oct 2, 2010
14,539
428
136
@IDC

Sounds like the next cluster you have planned for this summer(?) will be a very neat project! I would love to see some pics and any plans you have! :awe:
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,131
3,666
126
Yep, and a 980X probably uses about 1/10 the power at the same time.
990XAir.jpg


:biggrin:

Mmmm... well, before I stopped DCing, I actually had a rack planned.

Basically full-blown racks, where I would have 1-2 4U's be LC-only, feeding the other racks.

I actually stopped DCing... parts buying became too addictive... and the electricity bill would only increase, since I would have my entire farm OC'd and on water.

But at its peak, I had 8 machines, all LC'd inside cases, which I would remote into to check their progress.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
@IDC
Any chance the next one will have 4x 16-core Bulldozer CPUs? :)

64 "cores" in 1 pc, almost kills the need for building "Frankenstein's eletronic brain" types of projects (id guess), unless the need for cpu power keeps increaseing with time.