Folding@Home with GTX 275 in SLI

Golgatha

Lifer
Jul 18, 2003
12,407
1,085
126
Update 10/2 - Holy shiznat!!

19,286 points in the last 24 hours!!?? Me likey the GTX 275 SLI folding!!

Another interesting thing is that the NotFreds client, which I started using in earnest in July, got me from an average of 100k to around 160k points per month on the exact same CPU hardware. The VMware/Linux-based virtual machine delivers at least 1.5x the ppd of the Windows client in my estimation.

The GPU2 client on nVidia hardware simply can't be touched though. When the GT300 comes out, I think I might retire one of the GTX 275 cards and keep the other for PhysX and Folding@Home.

With all this running full bore though, my CPU gets up to around 70°C with a Zalman CNPS9900LED HSF on it. The GPUs hover around 90°C each with the fans ramping up to 100% when set to auto. I can set them to 85% fan speed with much less perceptible noise and they hit around 92°C each. I'm pretty sure I'll have to be less aggressive with Folding@Home on this rig in the summertime, but with 45°F mornings, an open window in the basement currently does wonderful things.




Tried searching on Google and didn't come up with anything conclusive. There are instructions for ATI cards to run two instances of the GPU2 client, but I was wondering if the GPU2 client is SLI-enabled? If so, should I just load up one instance on my main rig and let it run, or do I need to use some command switches to get it up and running?

Basically, please LMK if I'm doing it right when it comes to running the GPU2 client, or let me know how I'm doing it wrong.


Right now I have...

1) 2x NotFreds clients in VMware maxing out my Core i7 920, with 2 CPUs assigned to each NotFreds client in the configuration (see the .vmx note after this list).

2) 1x GPU2 console client running.
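
For reference, the "2 CPUs per client" bit is just the virtual CPU count assigned to each VM. You can set it through the VMware GUI, or by editing each VM's .vmx file directly; the memory figure below is only an example, not something NotFreds specifically calls for.

numvcpus = "2"
memsize = "256"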

Edit: Dang this thing is fast even if it's just using 1 GPU to fold right now. Looks like I'm on track to complete around 3 WUs on the GPU client today!! :thumbsup:
 

GLeeM

Elite Member
Apr 2, 2004
7,199
128
106
I can't answer all your questions but I am pretty sure SLI will not help one GPU client. And you would have to turn off SLI in order to run two.

You must be getting bigger-point WUs than I have recently. I have been getting mostly 353-point ones, to the tune of 26+ of them per day! I am getting 9200-9500 ppd with these on my GTX 275.
What kind of OC are you getting on yours? I have the shaders up to 1700. Do you know if this is too high, or can it go higher safely? The temp is about 61°C with these WUs.
Sorry about going off-topic here, but I am thinking of getting another GTX 275. Let us know how hard it is to run two :D

How much are you able to undervolt your i7 with HT off? With what temps?

Those notFreds give sweet ppd :shocked:
 

Golgatha

Lifer
Jul 18, 2003
12,407
1,085
126
Originally posted by: GLeeM
How much are you able to undervolt your i7 with HT off? With what temps?

Those notFreds give sweet ppd :shocked:

I'm also running two other NotFreds 24/7 on my HTPC with a Q9400 at stock speed (2.66GHz quad core). I run my PS3 on occasion to contribute as well. Switching from console clients to NotFreds is like night and day as far as PPD is concerned though.

I want to see what my current setup can do for a few days, and then mess with tweaking to get both GPUs in the game.
 

Golgatha

Lifer
Jul 18, 2003
12,407
1,085
126
Who am I kidding?!

I turned off SLI, set the fans to 90%, and am going for the gold.

I set the process affinity of the GPU2 clients to CPUs 4, 5, 6, and 7, which are basically the hyperthreading virtual CPUs on my i7 920, and then used the following command-line switches.

-gpu 0 -forcegpu nvidia_g80

-gpu 1 -forcegpu nvidia_g80

You just set the GPU ID number to 2 or 3 if you have a three-way or quad-SLI setup. Not sure if this would work with something like the GTX 295 though. I figure the grunt work on the GPU2 clients is done by the GPUs for the most part, so I just gave them the spare hyperthreading threads to handle the light CPU computations the clients need.
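
If you'd rather script it than set the affinity by hand in Task Manager every time, a batch file along these lines should do the same thing. The folder names and exe name are just examples (point it at wherever your console clients actually live), and F0 is the hex affinity mask for CPUs 4-7:

start "FAH GPU0" /D C:\FAH\gpu0 /affinity F0 Folding@home-Win32-GPU.exe -gpu 0 -forcegpu nvidia_g80
start "FAH GPU1" /D C:\FAH\gpu1 /affinity F0 Folding@home-Win32-GPU.exe -gpu 1 -forcegpu nvidia_g80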

Amazingly, the fans at 90% really aren't super loud in my Antec Solo case. BTW, you have to cut out some hard drive rails to get a GTX 275 to fit in this case. These are some long mofos!!

I used eVGA's Precision software to monitor the load on the GPUs. You can see each graphics card peg out and jump to its 3D clocks when its GPU2 client starts up. I am hoping for amazing results!! I'm running the graphics cards at stock speeds for now to see how much heat builds up over a day.
 

Golgatha

Lifer
Jul 18, 2003
12,407
1,085
126
Well everything ran perfectly last night.

I did have one problem this morning though. My UPS unit went into brownout or battery mode when my son tried to power on my XBox 360 this morning. Turn off the 360 and the UPS quits beeping. My computer and 360 together are pulling some serious juice. Time to redistribute the wiring to avoid this conflict. Good thing the outlet is on a 20A breaker and good thing I don't have a CRT anymore.
 

dajeepster

Golden Member
Apr 15, 2001
1,974
16
81
Originally posted by: Golgatha
Well everything ran perfectly last night.

I did have one problem this morning though. My UPS unit went into brownout or battery mode when my son tried to power on my XBox 360 this morning. Turn off the 360 and the UPS quits beeping. My computer and 360 together are pulling some serious juice. Time to redistribute the wiring to avoid this conflict. Good thing the outlet is on a 20A breaker and good thing I don't have a CRT anymore.

just plug the 360 directly into the wall.. it don need no freeeakin preteeectiioonn. :D
 

Golgatha

Lifer
Jul 18, 2003
12,407
1,085
126
Originally posted by: dajeepster
Originally posted by: Golgatha
Well everything ran perfectly last night.

I did have one problem this morning though. My UPS unit went into brownout or battery mode when my son tried to power on my XBox 360 this morning. Turn off the 360 and the UPS quits beeping. My computer and 360 together are pulling some serious juice. Time to redistribute the wiring to avoid this conflict. Good thing the outlet is on a 20A breaker and good thing I don't have a CRT anymore.

just plug the 360 directly into the wall.. it don need no freeeakin preteeectiioonn. :D

I plan to plug it into the smaller UPS I have, which is there for all my networking equipment. Should be fine. I just hope I'm not browning out the wall outlet. I'm loving the cold weather though, as I can open the basement window to cool my PC and just leave the heat off in the house.
 

Golgatha

Lifer
Jul 18, 2003
12,407
1,085
126
Originally posted by: GLeeM
I can't answer all your questions but I am pretty sure SLI will not help one GPU client. And you would have to turn off SLI in order to run two.

Actually, I've found leaving SLI on or off makes no difference.

-gpu 0 -forcegpu nvidia_g80

-gpu 1 -forcegpu nvidia_g80

Those command-line switches make all the difference though. They basically tell each client which graphics card (by GPU ID) to use. The client couldn't care less whether the nVidia drivers say to use SLI or not, from what I can tell.
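
One other thing worth mentioning for anyone copying this setup (this was the standard advice for running multiple Folding@Home clients on one box, as far as I remember): give each instance its own folder and its own machine ID in its client config so they don't stomp on each other's work files. The folder names and ID numbers here are just examples:

C:\FAH\gpu0 <- instance started with -gpu 0, machine ID 2
C:\FAH\gpu1 <- instance started with -gpu 1, machine ID 3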
 

FaaR

Golden Member
Dec 28, 2007
1,056
412
136
Originally posted by: Golgatha
Actually, I've found leaving SLI on or off makes no difference.
The most recent sets of Nvidia drivers allow folding while SLI is still enabled; in the past you had to turn it off or the folding client would only be able to identify one GPU.

It won't make anything go faster of course. It's merely an improvement in convenience, not performance...

Btw, my dual 8800GTXes 'only' do about 10500 PPD. Makes me a little sad to see how much faster newer cards are. I do have two current-gen GPUs in my main rig tho, but those are Radeon 4890 ATI boards and folding performance is almost pitiful even compared to my old 8800GTX workhorses... One single 4890 in-game is nearly as fast as my 8800s in SLI, but when folding the 8800s are twice as fast. :(
 

Golgatha

Lifer
Jul 18, 2003
12,407
1,085
126
Originally posted by: FaaR
Originally posted by: Golgatha
Actually, I've found leaving SLI on or off makes no difference.
The most recent sets of Nvidia drivers allow folding while SLI is still enabled; in the past you had to turn it off or the folding client would only be able to identify one GPU.

It won't make anything go faster of course. It's merely an improvement in convenience, not performance...

Btw, my dual 8800GTXes 'only' do about 10500 PPD. Makes me a little sad to see how much faster newer cards are. I do have two current-gen GPUs in my main rig tho, but those are Radeon 4890 ATI boards and folding performance is almost pitiful even compared to my old 8800GTX workhorses... One single 4890 in-game is nearly as fast as my 8800s in SLI, but when folding the 8800s are twice as fast. :(

I hear you. I had dual 4870 1GB cards. One of them was passively cooled and got too hot for my liking when trying to fold with it (100°C+) and the other hardly seemed worth the effort for the ppd output.

For games, switching from the dual 4870s would be deemed foolish by most, but I really wanted to try Folding@Home on nVidia hardware, so I bought one card and then promptly bought the second one two days later :). I also wanted to try out the PhysX stuff and lately (for better or worse) nVidia has been paying for better support in games (e.g. Batman AA with antialiasing), which adds some value IMO. The stock cooling on the GTX 275 cards is really good and really quiet too. I seriously considered the new 5870 from ATI as well, but dual GTX 275s work just as well in most of the games I play (or a single card runs them fine in some cases), and are as capable as the 5870 for the most part.
 

FaaR

Golden Member
Dec 28, 2007
1,056
412
136
Personally I don't consider Nvidia paying off developers to lock antialiasing to their own hardware as 'added value'... I think it's shite behavior, to put it mildly.

Nvidia is also trying to grab the market with their own proprietary offerings (CUDA and PhysX), which, if one considers history, is almost hilarious. 3dfx tried to do the same thing, and Nvidia was the one that beat the crap out of them. Now Nvidia apparently consider themselves the new masters of the universe, and I can only hope another company comes along to serve them a dose of poetic justice... :p

Their hardware is fantastic, but their company ethics are the worst, have been for ages now, and aren't getting any better as time goes on. Quite the opposite: their latest move of disabling PhysX if the user renders 3D to another device raises their company aceholedness to an entirely new level.

I'm not going to support their disgusting antics with my money until they show marked improvements in their general behavior, that's for sure, even though the new Fermi chip is looking mindbogglingly awesome. I just hate it when evil companies execute really, really well, lol. It's much easier to despise them when they're evil and screw up instead! :)

One can only hope that the OpenCL version of the folding client comes out soon, so that ATI cards can become viable once more. I've no idea how fast a 4890 could potentially fold if it were able to put all of its shader processors to the task... Hopefully it would not jeopardize hardware stability, since many 4890 boards have sub-standard power regulation hardware (including both of mine, feh!)

Right now though I push out about 12.5k PPD on my old 8800GTX rig with 4x CPU clients on an older Core 2 quad CPU, and roughly another 7.5k PPD on my main rig from a set of 4 more CPU clients on a Core i7 920 that I run at 3.5GHz (just bumping the bclk, not touching any volts). Not bothering with the multicore client. Too much hassle to set up and manage...
 

Golgatha

Lifer
Jul 18, 2003
12,407
1,085
126
Originally posted by: FaaR
Personally I don't consider Nvidia paying off developers to lock antialiasing to their own hardware as 'added value'... I think it's shite behavior, to put it mildly.

Nvidia is also trying to grab the market with their own proprietary offerings (CUDA and PhysX), which, if one considers history, is almost hilarious. 3dfx tried to do the same thing, and Nvidia was the one that beat the crap out of them. Now Nvidia apparently consider themselves the new masters of the universe, and I can only hope another company comes along to serve them a dose of poetic justice... :p

With Batman AA, they worked with the developer to add AA to the UT3 engine, which doesn't natively support it. Should they be forced to share with ATI for free?

CUDA is their own coding work, so I don't begrudge them that. I'm sure ATI won't be sharing Stream computing IP with nVidia anytime soon either.

Now the PhysX stuff does leave an awful taste in my mouth. If I have an ATI card and an nVidia card in the same system, I ought to be able to run PhysX in a game regardless of which other video card is present.

3dfx died off because they moved all the board manufacturing in-house, alienated their hardware partners, and made some bad decisions with their hardware; no hardware T&L being a biggie. Concerning proprietary APIs, 3dfx Glide was proprietary, but last I checked D3D and OpenGL aren't.
 

FaaR

Golden Member
Dec 28, 2007
1,056
412
136
UT3 already does antialiasing on ATI with some .exe renaming magic, so it's not as if Nvidia performed a black magic rite over the engine source code or anything...

Well, if they DID, it was to make sure that .exe renaming would NOT work to get AA working on ATI boards (it doesn't).

As for 3dfx and Glide: they resisted implementing a full OpenGL driver for the longest time. Remember, D3D sucked giant donkey schlong back then (and that's describing it kindly), so it was not really a serious competitor. 3dfx was betting that by delaying OpenGL support they could force game developers to move toward Glide instead, through the power of the 3dfx brand name and market share.

Unfortunately for them, it didn't work. Nvidia rolled over them with more interesting, stronger-performing, and ultimately cheaper products, where you didn't need to buy two boards to get full performance...

Fortunately, we can always rely on Microsoft these days to clean up the marketplace when it gets too messy with different APIs and whatnot. THEY set a standard, and everybody is forced to follow it. Eventually, any proprietary stuff gets marginalized and dropped. It's happened before, many times. :)
 

GLeeM

Elite Member
Apr 2, 2004
7,199
128
106
Originally posted by: Golgatha

When the GT300 comes out, I think I might retire one of the GTX 275 cards and keep the other for PhysX and Folding@Home.

If the 300 has a different number of shaders from the 275, it will only run as fast as the 275. Well, at least it used to be that way; I don't know if it still is. Something about two or more cards in the same system needing to have the same number of shaders.

Your GPU temps are higher than mine, but that is probably because you have two of them? And/or because I have the side of my case off?
 

Golgatha

Lifer
Jul 18, 2003
12,407
1,085
126
Originally posted by: GLeeM
Originally posted by: Golgatha

When the GT300 comes out, I think I might retire one of the GTX 275 cards and keep the other for PhysX and Folding@Home.

If the 300 has a different number of shaders from the 275, it will only run as fast as the 275. Well, at least it used to be that way; I don't know if it still is. Something about two or more cards in the same system needing to have the same number of shaders.

Your GPU temps are higher than mine, but that is probably because you have two of them? And/or because I have the side of my case off?

Yes, I have two of them and my side panel is in place. It's in a somewhat cramped Antec Solo case with just enough airflow to keep things running in spec, temperature-wise, at 100% load. Running something trivial like a game instead of Folding@Home knocks off something like 20°C for every component in there.