Originally posted by: JackyP
How can they use GPGPUs (and even desktop cards at that?!) for rendering? I thought GPUs were not there yet when it comes to data integrity/security/RAS features. On the other hand, maybe rendering does not require 100% data integrity?
But the applications for GPGPUs should still be pretty limited without RAS features, right?
Presumably the render farm itself is well isolated from security threat vectors coming in through front-end hardware.
As for reliability: since everything runs in real time, and at least some minimal error-checking is no doubt occurring, a failure just means a lost frame that can readily be re-rendered.
These aren't bank accounts that need every cent accounted for 100% of the time; these are pre-production frames whose pixels can be in error, provided the frequency and quantity of those errors don't degrade the overall value of having a real-time rendered frame in hand for fast, iterative development.
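To make that concrete, here is a minimal sketch of that "just re-render it" policy in Python; every number in it (failure rate, pixel-error budget, retry count) is invented purely for illustration:

```python
import random

# Hypothetical numbers, purely for illustration; nothing here reflects the
# actual farm's failure characteristics.
HARD_FAILURE_RATE = 0.01       # assumed chance a frame is lost outright
BAD_PIXEL_BUDGET = 1e-6        # assumed tolerable fraction of bad pixels

def render_frame(frame_id: int) -> tuple[bool, float]:
    """Stand-in for a GPU render call: returns (completed, bad_pixel_fraction)."""
    if random.random() < HARD_FAILURE_RATE:
        return False, 1.0                    # frame lost; no usable output
    return True, random.random() * 1e-7      # a few soft pixel errors

def deliver_frame(frame_id: int, max_attempts: int = 3) -> bool:
    """Re-render a lost or over-budget frame instead of demanding ECC-grade integrity."""
    for _ in range(max_attempts):
        completed, bad_fraction = render_frame(frame_id)
        if completed and bad_fraction <= BAD_PIXEL_BUDGET:
            return True
        # A failure only costs this one frame: queue it again and move on.
    return False  # escalate only after repeated failures

if __name__ == "__main__":
    delivered = sum(deliver_frame(i) for i in range(1000))
    print(f"{delivered}/1000 frames delivered within the error budget")
```

The acceptance test is statistical rather than bit-exact, which is exactly the property that lets ECC-less desktop cards get away with it here.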
And the final leg of RAS, availability, is presumably a matter of scaling the system's total node count to keep peak loads under 100% utilization. I.e., it's a question of cost and budget, not really one of technology, since they've already figured out how to gang 1,000 of these 4870s together; what's another 500 at that point beyond the cost of doing it?
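In back-of-the-envelope terms, the provisioning question is a single ceiling division. All figures below are made up, chosen only so the arithmetic lands near the 1,000 + 500 mentioned above:

```python
import math

# All numbers invented for illustration; only the arithmetic matters.
peak_frames_per_sec = 30        # assumed peak real-time demand
node_frames_per_sec = 0.025     # assumed throughput of a single 4870 node
target_utilization = 0.80       # headroom so peak load stays under 100%

nodes = math.ceil(peak_frames_per_sec / (node_frames_per_sec * target_utilization))
print(f"nodes needed with headroom: {nodes}")   # -> 1500 with these numbers
```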
Now, if they intended to release this hardware for engineering purposes (FEA for bridges, etc.), then yeah, they'd need to beef up the RAS so that the liability of miscalculated or compromised data is something less than human fatalities or missing 401(k) accounts.