Originally posted by: BenSkywalker
The latter is pretty much dead of natural causes, while the first one is fine, thanks for asking, except that a lot of them switched to building giant clusters from commodity hardware, thanks to free Linux clustering solutions.
Cell and soon nVidia GPUs top that segment.
Yeah... except they are not even a blip on the radar right now.
Your point is...?
The market that used to be there, high-end CPUs, is all but dead. The largest reason for that is that Intel's own x86 platform evolved to the point where it was good enough (even RoadRunner uses x86 CPUs to feed the Cells their workload).
Nonsense. You have no clue about high-end CPUs if you think Intel (or that joke called Cell) had much to do with it.
BTW, before commodity x64 you couldn't even use a decent amount of memory - a necessity for volume rendering, as you should know.
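To put rough numbers on that claim (a back-of-the-envelope sketch with assumed voxel sizes, not figures from this thread): an uncompressed volume grows cubically with resolution, so a realistic dataset blows past the ~4 GiB 32-bit address space once you keep the volume plus working buffers resident.

```python
# Sketch: raw memory footprint of a cubic voxel grid vs. the 32-bit limit.
# The 2-bytes-per-voxel figure (e.g. 16-bit scalar densities) is an assumption.

def volume_bytes(dim: int, bytes_per_voxel: int = 2) -> int:
    """Raw memory for a dim x dim x dim voxel grid."""
    return dim ** 3 * bytes_per_voxel

LIMIT_32BIT = 4 * 1024 ** 3  # theoretical 4 GiB 32-bit address space

for dim in (256, 512, 1024, 2048):
    size = volume_bytes(dim)
    verdict = "fits" if size < LIMIT_32BIT else "exceeds"
    print(f"{dim}^3 grid: {size / 1024 ** 3:.2f} GiB ({verdict} 32-bit space)")
```

And that is just the raw grid: gradients, acceleration structures, and intermediate buffers multiply it further, which is why 16-64 GB machines mattered for this work long before desktops offered them.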
What precisely is that supposed to show? I was using ray tracing and radiosity ~20 years ago - it is very far removed from being anything new for me. Actually, I was using ray-traced renders for several years before the Voodoo1 hit (I had access to radiosity, but it was just far too slow to use until the Celeron 300A came around and made clusters a decent option for rendering). I worked with 3D for several years before doing analysis work.
Boohoo. Actually, I linked that just to point out that nobody gives a flying frog about claimed experience in this field - it is commercialized and has long been available. There's zero reason to link some thesis doc with all the math; it comes across as a rather corny attempt to impress the amateurs.
When someone beats his chest about having worked in 3D using voxels on Celerons, that's a rather funny statement...
FYI, I happen to have worked in the 3D visualization field for about a decade, and yes, we use volumetric rendering (we usually develop our own plugins), and no, we never even considered Celerons or regular desktop chips - it didn't make any sense. As a matter of fact, the only reason we started with regular WS CPUs (Xeon/Opteron) was the sky-high prices of the HPC market back then. And even though Itaniums were so bad they couldn't even match Xeons, we needed them until AMD64 arrived, because there wasn't any other affordable option for 16-32-64 GB of memory. But this does not mean IA64 had anything to do with high-end work or scaling - maybe in certain cases, I don't really know; I've never seen any good results from real life. (One of ours is still serving under our main MySQL server; the other one is the spare.) Heck, the #1 reason SGI died was their utterly idiotic bet on Itanium...