F@H: nVidia GPU Client Released!

Foxery

Golden Member
Jan 24, 2008
1,709
0
0
As long as you keep the heat under control, it shouldn't hurt anything. 9000-series GPUs with just about any CPU scream with Windows Vista.

XP systems tend to need a dedicated CPU core (no matter the speed) to feed data. There's some kind of driver difference with Vista that drops CPU usage to <10%.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Extelleron
With my setup, I am getting 5644 PPD according to FahMon.

Can't wait to see what my GTX 280 does.

BTW I'm using FW 177.35.

I'm using that driver as well. My 280 is reading 6513 PPD according to FahMon v2.3.2b.

FAHMon Pic

CPU usage: Only one core is reading about 9%. The other three cores are idle.

GPU temp: hangs around 70C. Fan speed kicked up from an idle 30% to loaded 37%.

EDIT: PPD seems to be going up. 7056 PPD. Does it depend on the WU?

FAHMon Pic2
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
54
91
Originally posted by: Philippart
It's in the FAQ:

Hey, where did all of the GPU client data files go?

The new GPU client is a SysTray client, similar to the new v6.x CPU SysTray clients, and follows a standardized Windows installation procedure. This new client type is similar in nature to the previous GUI style client, but with notable changes, and a separate visualizations module (to come later).

The client executable can be installed to any directory you select.

The default locations are:
In Windows XP:
Executable: C:\Program Files\Folding@home\Folding@home-gpu\Folding@home.exe
Data Files: C:\Documents and Settings\<your_windows_username>\Application Data\Folding@home-gpu


In Windows Vista:
Executable: C:\Program Files (x86)\Folding@home\Folding@home-gpu\Folding@home.exe
Data Files: C:\Users\<your_windows_username>\AppData\Roaming\Folding@home-gpu

Note: The client installer creates a shortcut to the data files in the Folding@home Programs folder. The installer also creates a program shortcut in the Startup folder that launches the GPU client. The Startup shortcut points to the specific file locations in the Target: and Start In: fields. That shortcut cannot be edited to add client switches. To add a client switch, follow the instructions for creating a new shortcut above.

Is this the folder I'm supposed to be adding to FahMon? I can't seem to find an "Application Data" folder on my C: drive.

Everything during install was default.
 

Foxery

Golden Member
Jan 24, 2008
1,709
0
0
Originally posted by: keysplayr2003
EDIT: PPD seems to be going up. 7056 PPD. Does it depend on the WU?

FAHMon Pic2

The graphical viewer slows down the processing, so if you were ogling the pretty pictures earlier, it would be faster now :)

Note that FahMon can be set to calculate based on the last 3 frames only, instead of all-time. This tends to be more accurate, since the other setting makes you appear to slow down if the client was simply turned off for a while.
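
For the curious, the PPD math itself is simple: points per WU, times however many WUs your current frame rate implies per day. Here's a little Python sketch of the idea (the 480 points and 100 frames per WU are made-up placeholders, not real project data):

    SECONDS_PER_DAY = 86400

    def estimate_ppd(points_per_wu, frames_per_wu, avg_seconds_per_frame):
        # One WU takes frames x seconds-per-frame; scale its points to a day.
        seconds_per_wu = frames_per_wu * avg_seconds_per_frame
        return points_per_wu * (SECONDS_PER_DAY / float(seconds_per_wu))

    # "Last 3 frames" vs. "all-time" is just a different average going in.
    last_three = [60.0, 59.5, 60.5]  # recent frame times in seconds
    print(estimate_ppd(480, 100, sum(last_three) / len(last_three)))

Averaging only the last three frames tracks your actual current pace, which is why it recovers quickly after the client has been paused for a while.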

Originally posted by: LOUISSSSS
Is this the folder I'm supposed to be adding to FahMon? I can't seem to find an "Application Data" folder on my C: drive.

Everything during install was default.

C:\Documents and Settings\<your_windows_username>\Application Data\Folding@home-gpu

If you can't see Application Data when you are in the directory for your Windows logon, it's because it's a hidden folder and you haven't told Windows to show those to you. I haven't used Vista, but in WinXP there's a menu bar: Tools -> Folder Options -> View tab -> select "Show hidden files and folders".
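
If you'd rather not fiddle with Explorer at all, here's a convenience sketch in Python that checks for the folder directly; the APPDATA environment variable already resolves to Application Data on XP and AppData\Roaming on Vista, hidden or not:

    import os

    # %APPDATA% points at ...\Application Data on XP and at
    # ...\AppData\Roaming on Vista, regardless of Explorer's settings.
    path = os.path.join(os.environ["APPDATA"], "Folding@home-gpu")
    print("%s -> %s" % (path, "found" if os.path.isdir(path) else "not found"))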
 

Denithor

Diamond Member
Apr 11, 2004
6,300
23
81
Now that we have a single F@H client for both AMD + nV cards, can we see how they stack up directly against each other in terms of processing power?

I.e., which card should I purchase next: 4850/70 or GTX260/280? It should also give a good idea of the relative value of each card.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I think that you should angle more towards higher-end solutions if you're looking to upgrade, simply b/c the 9600GT is so nice already. That's my problem: my 3870 is fine for my needs, but, um, I just...need...more...folding...power...

While the 4850 is certainly a kick-ass solution, to me it just doesn't seem to be enough of an upgrade for you. Obviously, the same goes for the 9800GTX, leaving you with the 4870 vs. 260/280. I would wait and see what other users get for PPD on a 4870 vs. a 260, check out heat/noise, check out relative price, etc.


Edit: By the way, we are going to see an EXPLOSION in PPD now, IMHO. The video card forum is all riled up about this, and we have tons of nVidia users who were champing at the bit for this GPU client!
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Denithor
Now that we have a single F@H client for both AMD + nV cards, can we see how they stack up directly against each other in terms of processing power?

I.e., which card should I purchase next: 4850/70 or GTX260/280? It should also give a good idea of the relative value of each card.

Unless there is a big change in PPD with the 4850/4870, it will probably be more worthwhile to go with an nVidia card. AMD 3800 series cards seem to get around 2000 PPD max, while 8800/9800 cards with the full 128 SPs can get ~5000-6000 PPD. The GTX 280 can get ~8000 PPD with the latest drivers.

If you are picking a card with folding as your largest priority, it would probably be best to buy a 9800GTX or a cheap 8800GTS 512MB (such as mine :p). They OC well and will easily be capable of ~5500 PPD. The GTX 280 is capable of ~8000 PPD or 8500-9000 PPD overclocked, but it will consume significantly more power and it costs 3x as much.
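
To put the value question in rough numbers (the PPD figures are the ballpark ones from this thread; the prices are my own guesses, so treat it as illustrative only):

    # Back-of-envelope PPD-per-dollar. PPD values are this thread's
    # ballpark figures; the prices are guesses, not real quotes.
    cards = {
        "8800GTS 512MB": (5500, 180),
        "9800GTX": (5500, 230),
        "GTX 280": (8000, 650),
    }

    for name, (ppd, price) in cards.items():
        print("%-14s %4d PPD / $%3d = %4.1f PPD per dollar"
              % (name, ppd, price, ppd / float(price)))

By that yardstick the midrange cards win easily; the GTX 280 only makes sense if you want maximum PPD out of a single slot.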
 

Foxery

Golden Member
Jan 24, 2008
1,709
0
0
We'll also need to see more variety of work units before the dust settles. All of the work units for nVidia clients right now are essentially the same molecule, and it's a very small one. ATI performance changed drastically when they tested some larger proteins (544-576 atoms vs. 1200+ atoms). Newer cards with more shaders should show dramatically better scaling.

I don't think the 4850/4870 will disappoint. And when you're buying a card with the intention of running 24/7, you need to consider heat, power consumption, and price, and so far, it looks like ATI fares better on all three counts.

Cliffs: Need more information. We'll see how this all shakes out in a few weeks.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I would definitely avoid buying a single-slot 4850 anyway, b/c it would be throwing all that heat into your case. If you do go AMD, either get a 4870 or at least wait for a dual-slot 4850. Well, unless you have an S1 lying around...
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Since about 10 AM EST this morning, 32 WUs have been completed. It is now 10:44 PM EST.
PPD remains at 7056.
I was just thinking: I have this 790i Ultra SLI mobo, and two unused PCI-e slots at the moment. If I place those two 9800GTXs in those slots, and NOT enable SLI of course, would this F@H client utilize all three cards? I'm going to try one 9800GTX and the GTX 280 first. If that works, I'll add the 2nd 9800GTX. So I will have 368 stream processors total at first. Adding the second 9800GTX will bring that up to 496.

I hate when I get experimental like this sometimes. I always end up staying awake long after I should be sleeping. LOL.
Gonna try this out and get back to you guys. Should take about 15 minutes.
brb.
 

Denithor

Diamond Member
Apr 11, 2004
6,300
23
81
So what's the expected PPD for a 9600GT?

I will probably wait for the 4870; I don't want to spend $400+ on a video card. The 9600GT will move to an HTPC I plan to build as soon as I pick up a new video card. Then the points should roll in...
 

Cutthroat

Golden Member
Apr 13, 2002
1,104
0
0
Originally posted by: keysplayr2003
Since about 10 AM EST this morning, 32 WUs have been completed. It is now 10:44 PM EST.
PPD remains at 7056.
I was just thinking: I have this 790i Ultra SLI mobo, and two unused PCI-e slots at the moment. If I place those two 9800GTXs in those slots, and NOT enable SLI of course, would this F@H client utilize all three cards? I'm going to try one 9800GTX and the GTX 280 first. If that works, I'll add the 2nd 9800GTX. So I will have 368 stream processors total at first. Adding the second 9800GTX will bring that up to 496.

I hate when I get experimental like this sometimes. I always end up staying awake long after I should be sleeping. LOL.
Gonna try this out and get back to you guys. Should take about 15 minutes.
brb.

It should work, but the tricky part will be getting all the clients to use a different "machine ID". I found when trying to get it to run with 2 SMP clients that I could not change the machine ID on the GPU client, and it seems to default to "2". I had to change one of my SMP clients to machine ID#3.

I tried just editing the config file, but it didn't work. After saving, the client seems to recreate the file without most of the info, so I had to run the config app again.
 

GLeeM

Elite Member
Apr 2, 2004
7,199
128
106
Each client on one computer must have a different MachineID, at least it has always been this way in the past.

Usually, only console type clients can change MachineID, I don't know about this new one. Someone who knows will be along soon.

The client.cfg file is not a plain text file and should not be edited with Notepad or other text editors.

When you run the config app does it ask if you want to change the MachineID?

Looking forward to more information in this thread. And thanks Foxery for all your help :D
 

Cutthroat

Golden Member
Apr 13, 2002
1,104
0
0
I couldn't find a way to change the machine ID of the GPU2 client and it defaulted to 2, so I just changed my SMP clients to 1 & 3. I didn't see it ask anywhere in the config app, but there must be a way.

EDIT: Maybe it can be done with a switch on the shortcut. Does such a switch exist?

I noticed in the FAQ that you can't edit the shortcuts that the client creates during installation, but you can create a new shortcut and add the switch to it.

So it's right in the FAQ after all.

What about multi-gpu support and the -gpu switch?

Running multiple GPU2 clients, one client each on multiple GPU cards, is supported through the -gpu x command line switch. The setup is similar to running multiple SysTray CPU clients.

* Copy your \Application Data\Folding@home-gpu folder to a new folder \Folding@home-gpu2 (\AppData\Roaming\Folding@home-gpu in Vista)
* Create a new shortcut for the first client, and be sure to use the correct Target: and Start In: information. Note that one has to be very careful with shortcuts; in particular, make sure that the "Start in:" field is set correctly. If you are having problems with automatic core upgrades, it is likely that your shortcut is not set up correctly.
* Edit the shortcut properties to add the -gpu 0 switch to the end of the Target: field.
* Create a new shortcut for the second client, and be sure to use the correct Target: and Start In: information.
* Edit the shortcut properties to add the -gpu 1 switch to the end of the Target: field.

Except for the different -gpu x switch, the Target: field in both shortcuts will point to the same FAH executable. The Start In: field for each client will point to the two different \Apps Data\FAH folders. The Target: and Start In: fields for a SysTray client are explained in more detail below.

The display must be active on the GPU card you plan to use, and -gpu 0 will select the first board, -gpu 1 the second board, -gpu 2 the third board, and so on. You will need to disable CrossFire for multiple boards to be detected. You will also need to use different Machine IDs for each client. Currently, only one client is supported on a 3850X2 or 3870X2.
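
If hand-editing shortcuts gets tedious, the same two-shortcut setup could be scripted. This is just a sketch I haven't tested, mirroring the FAQ: one executable, a different working folder per client, and a different -gpu index (paths assume the default Vista install quoted above; adjust for XP):

    import os
    import subprocess

    # Default Vista install location from the FAQ above.
    EXE = r"C:\Program Files (x86)\Folding@home\Folding@home-gpu\Folding@home.exe"
    DATA = os.path.join(os.environ["APPDATA"], "Folding@home-gpu")

    for gpu_index in (0, 1):
        # Client 0 keeps the original data folder; client 1 uses the
        # Folding@home-gpu2 copy the FAQ says to make.
        workdir = DATA if gpu_index == 0 else DATA + "2"
        subprocess.Popen([EXE, "-gpu", str(gpu_index)], cwd=workdir)

Each client would still need its own Machine ID, per the FAQ.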
 

Denithor

Diamond Member
Apr 11, 2004
6,300
23
81
First, +1 to Foxery for the x64 workaround instructions. Very helpful and got my client up and running yesterday afternoon.

My last 12 hours have yielded 1960 points, so I should see nearly 4K PPD going forward. Compare that to the measly 300-500 PPD I was seeing from my e8400 previously (which now runs at 7-8% load while GPU folding).

I think it's safe to say that the days of cpu folding farms are over.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Denithor
First, +1 to Foxery for the x64 workaround instructions. Very helpful and got my client up and running yesterday afternoon.

My last 12 hours have yielded 1960 points, so I should see nearly 4K PPD going forward. Compare that to the measly 300-500 PPD I was seeing from my e8400 previously (which now runs at 7-8% load while GPU folding).

I think it's safe to say that the days of cpu folding farms are over.

Still maintaining 7056 PPD.
Question. Do the GPU and CPU clients calculate the same proteins in the same way?
In other words, can the GPU client run the exact same calculation as the CPU?
 

Foxery

Golden Member
Jan 24, 2008
1,709
0
0
Originally posted by: keysplayr2003
Originally posted by: Denithor
I think it's safe to say that the days of cpu folding farms are over.

Still maintaining 7056 PPD.
Question. Do the GPU and CPU clients calculate the same proteins in the same way?
In other words, can the GPU client run the exact same calculation as the CPU?

No, the GPU has some limitations due to being a specialized processor. There are some calculations which can only be run on the more flexible CPU client, so both will be around indefinitely.

For the technically minded, GPUs shine when running algorithms with O(n^2) complexity. These fit in well given that video rendering is similarly repetitive - and the hundreds of shaders in a GPU run in parallel to handle just such tasks. CPUs with 1-4 cores don't have the same parallelism as a chip with 300, so... :)
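
To make that concrete, here's the shape of the computation as toy Python (nothing like the real cores, just the structure): every atom interacts with every other atom, which is the O(n^2) part, and every pair is independent, which is what lets hundreds of shaders chew on it at once.

    def pairwise_energy(positions):
        # n atoms, each against every other: n*(n-1)/2 pairs, i.e. O(n^2).
        # Every pair is independent, so a GPU can hand each shader its own
        # slice of pairs, while a CPU core grinds through them one by one.
        total = 0.0
        n = len(positions)
        for i in range(n):
            for j in range(i + 1, n):
                dx = positions[i][0] - positions[j][0]
                dy = positions[i][1] - positions[j][1]
                dz = positions[i][2] - positions[j][2]
                r2 = dx * dx + dy * dy + dz * dz
                total += 1.0 / r2  # placeholder potential, not real physics
        return total

It's also why WU size matters so much: doubling the atom count quadruples the pair count, so the bigger proteins are where all those extra shaders really pay off.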
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
54
91
Wait, I'm using driver 174.16 (WinXP 32). Do I need to change?


Is the correct driver to use the

CUDA 2.0 BETA2
CUDA driver: NVIDIA Driver for Microsoft Windows XP with CUDA Support (177.35)

??
 

Denithor

Diamond Member
Apr 11, 2004
6,300
23
81
That might be the problem; I'm using 177.35 with the CUDA driver installed.

A quick question for the brains out there. My system only uses 7-8% of cpu resources (e8400) while GPU folding. What kind of points could be expected on a cheapo build (e1200 + mobo + 9600GT)? Would there be any difference versus the faster CPU?

EDIT: Regarding the above concept, are the 8800GS and/or the 8800GT 256MB supported by the new client? Thinking of a $200 makeover for my mom's computer to crank out some real points.
 

Cutthroat

Golden Member
Apr 13, 2002
1,104
0
0
Originally posted by: Denithor
That might be the problem; I'm using 177.35 with the CUDA driver installed.

A quick question for the brains out there. My system only uses 7-8% of cpu resources (e8400) while GPU folding. What kind of points could be expected on a cheapo build (e1200 + mobo + 9600GT)? Would there be any difference versus the faster CPU?

I would guess it would produce just as many points.

You should run an SMP client on that E8400.
 

Diogenes2

Platinum Member
Jul 26, 2001
2,151
0
0
Originally posted by: Denithor.......
EDIT: Regarding the above concept, are the 8800GS and/or the 8800GT 256MB supported by the new client? Thinking of a $200 makeover for my mom's computer to crank out some real points.
I have an 8600GT, and it's cranking out ~1200 PPD.

 

Foxery

Golden Member
Jan 24, 2008
1,709
0
0
Originally posted by: LOUISSSSS
Wait, I'm using driver 174.16 (WinXP 32). Do I need to change?

From the OP, the announcement page, and the good ol' FAQ... You MUST use version 174.55 or newer for proper driver support. 177.xx may require an INF hack for some systems.

Originally posted by: Denithor
My system only uses 7-8% of cpu resources (e8400) while GPU folding. What kind of points could be expected on a cheapo build (e1200 + mobo + 9600GT)? Would there be any difference versus the faster CPU?

EDIT: Regarding the above concept, is the 8800GS and/or the 8800GT 256MB supported for the new client? Thinking of a $200 makeover for my mom's computer to crank out some real points.

Only if she runs Vista, I think. XP uses more CPU power. (At least for ATI owners - not positive about nVidia.)
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: MechEng
6,000+ PPD :D

http://img145.imageshack.us/im...51/fahnvidiagpuuj4.gif

Specs:
CPU: Intel Core 2 Duo E8500
RAM: Corsair XMS2 Dominator 2 x 2 GB
GFX: Gigabyte GV NX88T512HP (512 MB 8800GT)
MB: Gigabyte GA-X48-DQ6
PSU: Corsair HX620W

OS: Windows XP 32bit
Driver: 177.35 with modded INF file.

Nothing OC'ed.

EDIT:
Passed 200k points today.... Yay!

That does seem a little odd, that an 8800GT would get more than double my 3870. From a GTX 280 I expect to see 8k+, but 2/3 of that for an 8800GT just seems...wrong.
 

Insidious

Diamond Member
Oct 25, 2001
7,649
0
0
OK, I need help......

My quad presently runs dual SMP clients. Do I run the GPU client along with both of them? One of them? By itself?

My X2 presently runs a single SMP client. Do I run the GPU client with the SMP or by itself?

Both machines are WindowsXP SP3

TIA :beer:

-Sid