Another Ray Tracing demo from Intel


MODEL3

Senior member
Jul 22, 2009
Essentially, the way he (the Intel guy) described it (the barbie doll example), it is much closer to ray casting than ray tracing.
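The distinction being made here can be sketched in a few lines. This is a minimal, invented 1D toy (not anything from the thread or from Intel's demo): a ray *caster* stops at the first surface each ray hits, while a ray *tracer* recursively follows reflection rays. The scene layout, `mirror` blending factor, and all values are illustrative assumptions.

```python
def first_hit(scene, origin, direction):
    """Nearest surface lying ahead of the ray on a 1D number line."""
    ahead = [s for s in scene if (s["pos"] - origin) * direction > 1e-9]
    return min(ahead, key=lambda s: abs(s["pos"] - origin), default=None)

def shade(scene, origin, direction, depth, max_depth):
    """Color seen along a ray; max_depth=0 degenerates to pure ray casting."""
    hit = first_hit(scene, origin, direction)
    if hit is None:
        return 0.0                                   # background
    color = hit["color"]
    if depth < max_depth and hit["mirror"] > 0:
        # This recursive step is what ray casting lacks: follow the
        # reflected ray and blend its color in by the mirror factor.
        bounced = shade(scene, hit["pos"], -direction, depth + 1, max_depth)
        color = (1 - hit["mirror"]) * color + hit["mirror"] * bounced
    return color

scene = [{"pos": 1.0, "color": 0.8, "mirror": 0.5},   # shiny surface ahead
         {"pos": -1.0, "color": 0.2, "mirror": 0.0}]  # matte surface behind
ray_cast = shade(scene, 0.0, 1.0, 0, 0)    # visibility only: first hit's color
ray_traced = shade(scene, 0.0, 1.0, 0, 1)  # one reflection bounce blended in
```

With zero bounces you only ever see the first surface's own color; allowing even one bounce mixes in what the mirror "sees" behind the camera, which is the part a pure caster can never produce.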

BTW, the video sucks big time, lol.


 

extra

Golden Member
Dec 18, 1999
Man, I live in Oregon and I'm embarrassed for my state right now (Intel being here and all, hehe). :(
 

T2k

Golden Member
Feb 24, 2004
Originally posted by: BenSkywalker

Is this the same "enterprise/big iron" aka high-end market where Intel is literally a no-show, even after sinking zillions into its utterly dead IA64 project?

Intel destroyed the 'big iron' market altogether. It used to consist of billions a year going to makers of enterprise class solutions. Now, most of those setups run x86 based hardware. There is still a very small market segment above x86, but it is a small fraction of what it was years ago.

Oh, PLEAHHSE...

First, it wasn't destroyed at all; second, it shrank due to several factors, most of which had nothing to do with Intel per se; and third, Intel TRIED and is still TRYING TO CONQUER IT with its craptastic Itaniums.

FYI, I think you are confusing the massively scalable enterprise/HPC market with mainframes (Big Iron) - the latter is pretty much dead, due to natural causes, while the former is fine, thanks for asking, except a lot of them switched to building giant clusters from commodity hardware, thanks to free Linux clustering solutions.

 

BenSkywalker

Diamond Member
Oct 9, 1999
First, it wasn't destroyed at all

Spoken like someone who doesn't have a clue about what that market was not that long ago. Until fairly recently, even simple web servers avoided x86 like the plague. Itanium's biggest competitor isn't POWER, MIPS, Alpha, or any of the other chips it was supposed to compete with; it's the Xeon/Opteron. The market that the high-end CPUs once dominated suffered a complete implosion; it is minuscule now and quickly collapsing.

the latter is pretty much dead, due to natural causes while the first one is fine, thanks for asking, except lot of them switched to building giant clusters from commodity hardware, thanks to free linux clustering solutions.

Cell and soon nVidia GPUs top that segment. The market that used to be there, high-end CPUs, is all but dead. The largest reason for that is that Intel's own x86 platform evolved to the point where it was good enough (even RoadRunner uses x86 CPUs to give the Cells their workload).


What precisely is that supposed to show? I was using ray tracing and radiosity ~20 years ago - it is very far removed from being anything new for me. Actually, I was using ray-traced renders for several years prior to the Voodoo1 hitting (I had access to radiosity, but it was just far too slow to use until the Celery 300a came around and gave a decent option for rendering clusters). I worked with 3D for several years prior to doing analysis work.
 

T2k

Golden Member
Feb 24, 2004
Originally posted by: BenSkywalker

the latter is pretty much dead, due to natural causes, while the former is fine, thanks for asking, except a lot of them switched to building giant clusters from commodity hardware, thanks to free Linux clustering solutions.

Cell and soon nVidia GPUs top that segment.

Yeah... except they are not even a blip on the radar right now.
Your point is...?

The market that used to be there, high end CPUs, is all but dead. The largest reason for that is Intel's own x86 platform evolved to the point where it was good enough(even RoadRunner uses x86 CPUs to give the Cells their workload).

Nonsense. You have no clue about high-end CPUs if you think Intel (or that joke called Cell) had much to do with it.

BTW, before commodity x64 you couldn't even use a decent amount of memory - a necessity for volume rendering, as you should know.
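The memory point above checks out with back-of-the-envelope arithmetic; the grid size and bytes-per-voxel below are illustrative assumptions, not figures from the thread:

```python
# Even a modest volumetric dataset outgrows a 32-bit address space,
# which is why commodity x64 mattered for volume rendering.

def volume_bytes(dim, bytes_per_voxel):
    """Memory needed to hold a dim^3 voxel grid entirely in RAM."""
    return dim ** 3 * bytes_per_voxel

gib = 1024 ** 3
addressable_32bit = 4 * gib          # 4 GiB hard ceiling, less in practice

# 1024^3 voxels at 4 bytes each (e.g. one float32 scalar per voxel)
needed = volume_bytes(1024, 4)       # exactly 4 GiB: already at the ceiling
```

Doubling each dimension multiplies the footprint by eight (2048^3 at 4 bytes is 32 GiB), which is why the 16-32-64GB configurations mentioned below were out of reach for any 32-bit platform.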



What precisely is that supposed to show? I was using ray tracing and radiosity ~20 years ago - it is very far removed from being anything new for me. Actually, I was using ray-traced renders for several years prior to the Voodoo1 hitting (I had access to radiosity, but it was just far too slow to use until the Celery 300a came around and gave a decent option for rendering clusters). I worked with 3D for several years prior to doing analysis work.

Boohoo. Actually, I linked that to point out that nobody gives a flying frog about claimed experience in this field - it is commercialized and long available; there's zero reason to link some thesis doc with all the math. It comes across as a rather corny attempt to impress the amateurs.

When someone is beating his chest about having worked in 3D using voxels on Celerons, that's a rather funny statement...

FYI, I happen to have worked in the 3D visualization field for ~a decade, and yes, we use volumetric rendering (we usually develop our own plugins), and no, we never even considered Celerons or regular desktop chips - it didn't make any sense. As a matter of fact, the only reason we started with regular workstation CPUs (Xeon/Opteron) was the sky-high prices of the HPC market back then. And even though Itaniums were so bad they couldn't match Xeons, we needed them until AMD64 arrived, because there wasn't any other affordable option for 16-32-64GB of memory. But this does not mean IA64 had anything to do with high-end or scaling - maybe in certain cases, I don't really know; I've never seen any good results in real life (one of ours is still utilized under our main MySQL server, the other one is the spare. :)) Heck, the #1 reason SGI died was their utterly idiotic bet on Itanium...
 

BenSkywalker

Diamond Member
Oct 9, 1999
Boohoo. Actually, I linked that to point out that nobody gives a flying frog about claimed experience in this field - it is commercialized and long available; there's zero reason to link some thesis doc with all the math. It comes across as a rather corny attempt to impress the amateurs.

It is an explanation for those who are interested. The math involved should be trivial for anyone with a basic level of education.

You seem to have a profound issue with reading comprehension. I was going to reply point by point, but instead I'll just point a few things out. Never did I say anything about working with voxels; they certainly haven't been viable to work with for long, and they still don't have much worth. Computing radiosity in the Celery 300a era couldn't be done more cost-effectively on anything else.

For all your talk of massive memory being required for working with 3D, the Onyx had an absolute maximum RAM capacity of 8GB, and even that couldn't handle full scene data for production-level work. That ignores the reality that most workstations were more along the lines of an Indy, which couldn't come remotely close to the 32-bit limits for RAM, let alone Onyx levels. Nothing in the workstation field in the mid-90s was anything at all like what you are talking about (the Octane with the second-revision motherboards was the first legit workstation breaking the 32-bit limit for memory addressing).

I mentioned Cell because the former 'big iron' class has been so utterly crushed that it can't keep pace with a commodity part from a console in its own standard (RoadRunner is a bunch of commodity chips). The HPC market is rapidly being taken over by the commodity market; the high-end dedicated processors are vanishing quickly, along with most of the people in the IT industry who recommend them. Go ahead and try to link major new developments in HPC or enterprise-class machines. Or, if you would like, we can do a market breakdown of the utter implosion that type of machine has had over the last ~12 years.

If you want to get into a serious discussion, be my guest, but I won't play your strawman game.
 

SRoode

Senior member
Dec 9, 2004
Originally posted by: BenSkywalker
It is an explanation for those who are interested. The math involved should be trivial for anyone with a basic level of education.

Basic level in math, yes. Verbal, not so much...

The demo looked like garbage. It was slow and choppy.