
A little wacky, but get your graphics card to crack?

TheApe

Member
OK, I've been up for 20 hours straight about now, so excuse me if I'm talking absolute b*llocks! I've been thinking, whilst cracking the various projects we do: what are the possibilities of being able (hypothetically at least) to get the T&L engine on your graphics card to do RC5 keys? I mean, the current nVidia cards' T&L engine is pretty useful at throwing numbers about. If a client could be written, PCs would be able to do SETI in the main CPU and RC5 in the GPU. Am I being stupid, or is this at least technically feasible? And if so, what sort of keyrate would the GPU get? (EDIT: In theory you could hook into DX8's T&L setup and use this kind of cracker on any card, even one without hardware T&L, although that would probably be of no benefit. Can anyone clarify any of this?)
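For anyone wondering what the card would actually have to compute: the work per candidate key is just RC5 key setup, one block encryption, and a compare against the known ciphertext. Below is a rough CPU-side sketch in C of RC5-32/12 with a 64-bit key (the RC5-64 contest parameters), written from the published RC5 algorithm description. This is my own illustration, not distributed.net's client code, and all the function names are made up.

```c
/* Minimal RC5-32/12 sketch: the inner loop a hypothetical GPU client
 * would have to evaluate once per candidate key. Illustrative only. */
#include <stdint.h>

#define ROUNDS 12
#define TSIZE  (2 * (ROUNDS + 1))          /* 26 expanded subkeys */
#define ROTL(x, y) (((x) << ((y) & 31)) | ((x) >> ((32 - ((y) & 31)) & 31)))
#define ROTR(x, y) (((x) >> ((y) & 31)) | ((x) << ((32 - ((y) & 31)) & 31)))

/* Expand an 8-byte (64-bit) key into the subkey table S. */
static void rc5_setup(const uint8_t key[8], uint32_t S[TSIZE]) {
    uint32_t L[2] = {0, 0};
    for (int i = 7; i >= 0; i--)            /* load key bytes into words */
        L[i / 4] = (L[i / 4] << 8) + key[i];

    S[0] = 0xB7E15163u;                     /* RC5 magic constants P, Q */
    for (int i = 1; i < TSIZE; i++)
        S[i] = S[i - 1] + 0x9E3779B9u;

    uint32_t A = 0, B = 0;
    for (int k = 0, i = 0, j = 0; k < 3 * TSIZE; k++) {
        A = S[i] = ROTL(S[i] + A + B, 3);
        B = L[j] = ROTL(L[j] + A + B, A + B);
        i = (i + 1) % TSIZE;
        j = (j + 1) % 2;
    }
}

/* Encrypt one 64-bit block in place: v[0] = A half, v[1] = B half. */
static void rc5_encrypt(const uint32_t S[TSIZE], uint32_t v[2]) {
    uint32_t A = v[0] + S[0], B = v[1] + S[1];
    for (int i = 1; i <= ROUNDS; i++) {
        A = ROTL(A ^ B, B) + S[2 * i];
        B = ROTL(B ^ A, A) + S[2 * i + 1];
    }
    v[0] = A; v[1] = B;
}

/* Inverse, handy for sanity-checking the implementation. */
static void rc5_decrypt(const uint32_t S[TSIZE], uint32_t v[2]) {
    uint32_t A = v[0], B = v[1];
    for (int i = ROUNDS; i >= 1; i--) {
        B = ROTR(B - S[2 * i + 1], A) ^ A;
        A = ROTR(A - S[2 * i], B) ^ B;
    }
    v[1] = B - S[1]; v[0] = A - S[0];
}

/* The brute-force test itself: does this key map the known plaintext
 * to the known ciphertext? A cracking client runs exactly this check
 * millions of times per second across the keyspace. */
static int rc5_try_key(const uint8_t key[8],
                       const uint32_t pt[2], const uint32_t ct[2]) {
    uint32_t S[TSIZE], v[2] = {pt[0], pt[1]};
    rc5_setup(key, S);
    rc5_encrypt(S, v);
    return v[0] == ct[0] && v[1] == ct[1];
}
```

Note the data-dependent rotate amounts (`ROTL(A ^ B, B)` rotates by the low bits of `B`): whether a fixed-function T&L unit could express that kind of operation at all is exactly the sticking point the rest of the thread lands on.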
 
I think the gammaflux team wanted to make a GeForce core or something. But I may just be imagining this too, for I too have been awake for 20 hours.
 
The consensus is that, while it's a noble and creative thought, the keyrate wouldn't justify the trouble, or the risk of overheating an expensive video card processor. I for one would be a little chagrined to have to replace my GeForce DDR.

On the other hand, Durons are cheap -

I'm not criticizing you - the idea is on the right track: "more blocks".
In practice, you'd be better off to add a crack node. 🙂
 
I've heard there's a tool out there, available for Win NT4 and W95 (I believe), which can make use of your GPU. It then checks whether you have a multiprocessor system. It's out there 🙂
I don't know if it can work together with the D.net client
 
Yeah, 'spose so. Seemed like a good idea at the time; I've had 4 hours' sleep now, so it doesn't look all that clever an idea right now... but hey, it would be great to see it work, even if just from a technology side of things 🙂

 
I don't see how running RC5 on a graphics processor would overheat it. At best the T&L engine would be used, leaving the 4 graphics pipelines (in the case of the GeForce) unused, so it would run much cooler than if it were rendering pixels.
 
I think it was more centered around the cost of effort vs. output.
A crack-rack node is cheaper, and the software (Klinux) is free and readily available.
There are some who would agree it's a tremendous R&D learning experience, but not likely to produce a lot of blocks.
 
If I had a nickel for every time someone asked this, I'd buy a 900 MHz Duron node and crank that baby up to a great keyrate.
 