AMD announces petaflop supercomputer

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
"when frames take a day or more to render, that puts a big crimp on the usefulness of any technology. With GPGPU acceleration, companies are now able to manipulate that data in nearly real time instead of days"

Keep in mind that while this is the same GPGPU tech that AMD released for desktop (which, as IDC pointed out last month, still has terrible visual quality at this point), it's most certainly not the same application of it...

The Inq from Charlie D
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Viditor
Keep in mind that while this is the same GPGPU tech that AMD released for desktop (which, as IDC pointed out last month, still has terrible visual quality at this point), it's most certainly not the same application of it...

The Inq from Charlie D

AMD is building a system with more than 1,000 R4870 class GPUs, each pumping out just over a teraflop, and the resulting complex will consume about 1/10th of the power of other petaflop-scale supercomputer installations.
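
Back-of-envelope, the quoted figures hang together. A rough Python sketch (the per-card numbers are my assumptions, not AMD's official specs):

gpus = 1000
tflops_per_gpu = 1.2      # assumed single-precision peak for a 4870-class card
watts_per_gpu = 160       # assumed board power under load

aggregate_pflops = gpus * tflops_per_gpu / 1000
gpu_power_mw = gpus * watts_per_gpu / 1e6

print(f"Aggregate peak: ~{aggregate_pflops:.1f} PFLOPS")
print(f"GPU power draw: ~{gpu_power_mw:.2f} MW (before hosts, network, cooling)")
# A CPU-based petaflop machine of the same era (Roadrunner) drew roughly 2-3 MW,
# which is where a "roughly 1/10th the power" figure would come from.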

The software that AMD runs on it is from Otoy, and the system is called the System Fusion Render Node.

Yeah this is great news because a professional installation like this will demand/drive robust application development which invariably will bleed down into the desktop toy apps.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Idontcare

Yeah this is great news because a professional installation like this will demand/drive robust application development which invariably will bleed down into the desktop toy apps.

From a purely selfish viewpoint, it's gonna make my own job/decisions a LOT easier!
Realtime rendering has been rumoured about in the industry for over a year now...can't wait to play with it!
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Viditor
Originally posted by: Idontcare

Yeah this is great news because a professional installation like this will demand/drive robust application development which invariably will bleed down into the desktop toy apps.

From a purely selfish viewpoint, it's gonna make my own job/decisions a LOT easier!
Realtime rendering has been rumoured about in the industry for over a year now...can't wait to play with it!

From what I saw of 3D projection technology at TI, it's a necessary milestone (realtime 2D rendering) in processing capability before we can make 3D technology a practical household consumer item.

I thought I wasn't going to like realtime rendering in movies until I realized that what I didn't like about CGI to date is the lack of realness (ET sucks in the re-re-release) compared to solid material models. (Re: the comments about Hancock in the link - I loved that movie, and the rendering was quite tasteful IMO.)

Hopefully this new technology also brings the ability to make the distinction between rendered and physically existing objects in a scene transparent to the viewer.
 

JackyP

Member
Nov 2, 2008
66
0
0
How can they use GPGPUs (and even desktop cards at that?!) for rendering? I thought GPUs aren't there yet when it comes to data integrity/security/RAS features. On the other hand, maybe rendering doesn't require 100% data integrity?
But the applications for GPGPUs should still be pretty limited without RAS features, right?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: JackyP
How can they use GPGPUs (and even desktop cards at that?!) for rendering? I thought GPUs aren't there yet when it comes to data integrity/security/RAS features. On the other hand, maybe rendering doesn't require 100% data integrity?
But the applications for GPGPUs should still be pretty limited without RAS features, right?

Presumably the render farm itself is well isolated from security threat vectors in the form of front-end hardware.

As for reliability...since things are done in real time, and no doubt error-checking at a minimum is occurring, a failure results in a lost frame that can readily be re-rendered.

These aren't bank accounts that need every cent accounted for 100% of the time; these are pre-production frames whose pixels can be in error, provided the frequency and quantity of those errors don't degrade the overall value of having a real-time rendered frame in hand for fast, iterative development.
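
In code terms the fault model is about that simple. A minimal Python sketch (the function names are hypothetical, not anything from Otoy's actual stack):

def render_frame(scene, frame_no):
    """Dispatch one frame to a GPU node and return its pixel buffer (stub)."""
    return b"\x80" * (1920 * 1080 * 3)   # placeholder pixels

def looks_corrupt(pixels):
    """Cheap sanity check: wrong size or an all-zero (black) buffer."""
    return len(pixels) != 1920 * 1080 * 3 or not any(pixels)

def render_with_retry(scene, frame_no, max_attempts=3):
    # Without ECC a transient GPU error just costs one frame:
    # detect it cheaply and re-render rather than trusting every bit.
    for _ in range(max_attempts):
        pixels = render_frame(scene, frame_no)
        if not looks_corrupt(pixels):
            return pixels
    raise RuntimeError(f"frame {frame_no} failed {max_attempts} times")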

And the final leg of RAS, availability, is presumably a matter of scaling the system's total node count to keep peak loads under 100% utilization. I.e. it's a question of cost and budget, not really one of technology, as they have already figured out how to gang 1,000 of these 4870s together. What's another 500 at that point beyond the cost of doing it?
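
A rough Python capacity sketch of that availability argument (all numbers made up for illustration):

import math

node_tflops = 1.2            # assumed per-node throughput
peak_demand_tflops = 1000    # assumed peak workload the farm must absorb
target_utilization = 0.80    # keep headroom for spikes and failed nodes

nodes_needed = math.ceil(peak_demand_tflops / (node_tflops * target_utilization))
print(f"Nodes to keep peak load under {target_utilization:.0%}: {nodes_needed}")
# ~1042 nodes instead of ~834 at full utilization -- once you can gang 1,000
# cards together, the extra headroom is mostly a line item on the budget.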

Now if they intended to release this hardware for engineering purposes (FEA for bridges, etc.), then yeah, they'd need to beef up the RAS so that the liability of miscalculated or compromised data is something less than human fatalities or missing 401k accounts.