Would it be possible for DC to take advantage of GPU power too?

superkdogg

Senior member
Jul 9, 2004
640
0
0
DISCLAIMER
Admittedly, I don't know a heck of a lot about what goes into developing a project and I know even less about programming.

That being said, since most of the DC projects have a graphical representation of what is being calculated, I was wondering if maybe those renderings could actually be used to speed up the calculations. (Did that make any sense?)

Instead of just showing what the CPU is working on, maybe the raw data could be fed to the graphics card, with a software layer telling the GPU what it needs to render, and then the image the GPU comes up with would actually become part of the data set being analyzed.

Granted, like I said, I don't know my @$$ from a hole in the ground when it comes to this, so I completely defer to anybody with more knowledge. But I was just thinking about the massive power increase a system like this could give DC, if it were at all possible, given the power of today's GPUs and their incredible memory bandwidth.

edit: I hate when I spell stuff wrong--

 

BlackMountainCow

Diamond Member
May 28, 2003
5,759
0
0
There are already attempts to use the GPU's computing power, and considering that a current gfx chip has more transistors than any current CPU, that makes perfect sense. I have no clue how far along these attempts are, though. It'll be interesting to see what happens once the gfx manufacturers include specialized physics chips on their boards. I assume those chips would rock in any DC project.
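
Just to make the idea concrete: offloading a chunk of work-unit data to the GPU might look something like the sketch below. This is purely hypothetical (it uses the JAX library, and "score" is a made-up stand-in for real project math), not how any actual DC client works.

# Hypothetical sketch: hand a "work unit" of data to the GPU, run the
# same math on every element in parallel, and pull the results back.
# Uses the JAX library; runs on the GPU if one is available, otherwise
# it silently falls back to the CPU.
import jax
import jax.numpy as jnp

@jax.jit  # compiled once, then executed on the accelerator
def score(samples):
    # Placeholder math -- a real DC client would do the project's
    # actual science here (FFTs, force calculations, etc.).
    return jnp.sqrt(samples * samples + 1.0)

work_unit = jnp.linspace(0.0, 1.0, 1_000_000)  # fake input data
results = score(work_unit)
print(float(results[0]), float(results[-1]))

The point is just that the GPU chews through all million samples at once instead of the CPU looping over them one at a time.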

:beer::D:beer:
 
May 31, 2001
15,326
1
0
I think FaH is seriously working on a client to run on video card GPUs. Unfortunately, the forum where I read it got nuked, so I cannot confirm.
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
There's a big thread over at Folding-Community, which can basically be summarized as "they're experimenting with it, but no promises or dates for release have been made yet." Before somebody asks, there's also a thread on the recently announced Physics Processing Units (Cliff's Notes version: "we don't know enough to say anything yet, but if it's powerful enough, we'll look at it").
 

ecvs85

Member
Mar 4, 2005
145
0
0
Originally posted by: mrwizer
That would be cool to put my 6800 to work while not gaming. :)
It would be cool to put my 9800pro to work since I don't game anymore :p
 

networkman

Lifer
Apr 23, 2000
10,436
1
0
The following is from the Einstein@Home FAQ:

When showing the graphics, the computation slows down.

Einstein@Home makes heavy use of OpenGL. The part of this work that the graphics card can't do on its own hardware has to be done in software by the CPU, preventing it from crunching. If you have a slow or "dumb" (not OpenGL-accelerated) graphics card, you may be better off not showing the graphics if you want to get your results finished quickly.

Here are a couple of ways to tell if Einstein@Home is being slowed down by your graphics hardware.

Windows users: soon after an Einstein@Home workunit has started, bring up the Task Manager with CONTROL-ALT-DELETE. Find the running einstein process and compare how much CPU time is shown with the time reported by the BOINC client. The BOINC client only reports the CPU time used by the 'science' part of the code, whereas the Task Manager shows the CPU time used by both the science code and the screensaver code. If you watch for thirty seconds, and the BOINC client reports that the science code has done only an additional five seconds of computation while the Task Manager reports that the einstein application has used an additional thirty seconds of CPU time, then the graphics code is consuming most of the CPU time. In this case, you should probably set your screensaver preferences to blank the screen after a few minutes. Alternatively, go to the web site of your computer or graphics card manufacturer and download the latest graphics drivers. If you can switch your graphics to a mode that supports accelerated 3D OpenGL, the screensaver should become very efficient.

Linux users: it's important to make sure that your graphics hardware is using 3D OpenGL hardware acceleration. A good test is to run the 'glxinfo' program. If it reports "direct rendering: No", then you are NOT using graphics acceleration, and the graphics will be slow and time-consuming. If it reports "direct rendering: Yes", then all is well. In many cases where direct rendering is *not* enabled, you can turn it on. See http://dri.freedesktop.org/wiki/ for more information and a useful troubleshooting section. Running 'glxgears' is a simple way to test changes to your XF86Config file.

Mac users: all recent versions of Mac OSX include wonderful hardware acceleration. You don't need to do anything!


---

It's quite possible the same info holds true for other projects as well, but E@H is the only one I'm sure of. I realize it's not exactly the answer you were looking for, but you can reap real benefits from a decent video card with OpenGL support when running the screen-saver version of the E@H client. ;)
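
If you want a quick way to run that glxinfo check (say, across a pile of boxes), here's a rough sketch in Python. This is my own illustration, not something from the FAQ, and it assumes glxinfo is installed:

# Run glxinfo and report the "direct rendering" line, per the E@H FAQ.
import subprocess

out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if line.lower().startswith("direct rendering:"):
        print(line)  # "direct rendering: Yes" means acceleration is on
        break
else:
    print("glxinfo did not report a direct rendering line")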

PS: I noticed a significant improvement in performance on one of my Compaq ProLiant DL360 (dual P3-800) servers when I disabled the onboard video and dropped an ATI Radeon 7000 into the open PCI slot. :D

 

BlackMountainCow

Diamond Member
May 28, 2003
5,759
0
0
Originally posted by: mrwizer
This is the very reason that I never run the screen savers. A waste of cycles.

Yip! But it's a good way to hook people up with DC projects. I mean, who hasn't seen the SETI screensaver for the first time and just sat there drooling, looking for a spike in the pattern? I myself did that for hours. And I still use it to get other people acquainted with DC in general, be it with folding proteins in F@H/FaD or spinning globes in CPDN. After that intro phase, naturally the CLI becomes the top choice.

:beer::D:beer:
 

mrwizer

Senior member
Nov 7, 2004
671
0
0
Agreed. It is important to be able to draw people to it. I have used the SETI screen saver to get people into DC myself, although for me, the verbose mode on the command line was much more exciting once I got into it heavily :)
 

networkman

Lifer
Apr 23, 2000
10,436
1
0
And that's why I leave the screen-saver version running on the server. Of course, most people are surprised by my basement anyway, but they usually ask about "that cool screen-saver" too. :)
 

Assimilator1

Elite Member
Nov 4, 1999
24,093
493
126
Originally posted by: mrwizer
This is the very reason that I never run the screen savers. A waste of cycles.

True if the graphics are software-rendered, but if the graphics are handled by the gfx card, then it won't hit processing time much :)
Which I think is more or less what the E@H info NWM posted is saying :)

 

superkdogg

Senior member
Jul 9, 2004
640
0
0
What I was referring to was tapping into the power of modern graphics cards to do some of the computations for DC. It seems like there should be a way to utilize more than just the CPU, and it would be an incredible boost for the whole community if a DC project could harness GPU computing power as well.
 

Wiz

Diamond Member
Feb 5, 2000
6,459
16
81
Well, I followed the links above and did some reading. Sounds like the F@H people are at work on the project, though they're not saying much about it. It would be nice to add another processor to each machine on a DC project, but it will be limited to modern GPUs.
Who can tell when it will come out?
 
