ATI to render LOTR in real time

Czar

Lifer
Oct 9, 1999
28,510
0
0
http://www.ati.com/companyinfo/press/2002/4520.html

ATI and Massive render The Lord of the Rings: The Fellowship of the Ring in real time in Linux

Massive and ATI join forces at SIGGRAPH 2002 to demonstrate real-time cinematic rendering

Tuesday, July 23, 2002

MARKHAM, Ontario - ATI Technologies Inc. (TSX:ATY, NASDAQ:ATYT) and Massive today announced that they will be rendering the Academy Award® winner "The Lord of the Rings: The Fellowship of the Ring" in real time at SIGGRAPH. This demonstration will be taking place in the ATI booth (#13097) and Massive booth (#5112) using ATI's FIRE GL™ workstation graphics technology.

:Q
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: mchammer187
very impressive

err all this pressure to buy a 9700

must resist, must resist ahhh!


Resistance is futile... however, if you WAIT a while after it is released, you won't have to pay $400. ;)

:D
 

gf4200isdabest

Senior member
Jul 1, 2002
565
0
0
I knew this was gonna happen. A couple of months ago I had 3 friends rail on me for claiming this would happen anytime soon, but I merely pointed out a striking resemblance between the Nature scene in 3DMark and the "Shire".

One question though: "using ATI's FIRE GL™ workstation graphics technology." WTF!!?? FireGL isn't the same thing as the 9700. Are they using a graphics chip currently available on the market?
 

kami

Lifer
Oct 9, 1999
17,627
5
81
Pretty cool, but I'd like some technical details :p Since it takes fleets of workstations and processor racks to render movie sequences in nowhere close to real time, this is probably very low-res and lacking some of the detail the real thing would have. If what they're saying were possible, games would already look a lot nicer. Just like nVidia saying they can render Final Fantasy in real time.

I think what they mean is that it'll render a low-quality preview in real time, to aid the animators themselves, not the final product, which would probably be 3000-4000 pixels wide and rendered at the highest quality possible. Hell, we don't even know if this real-time render includes lighting.
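
To put rough numbers on that skepticism (the two-hours-per-frame farm figure below is an assumed ballpark for film CG, not anything from the press release):

$$\text{required speedup} = \frac{7200\ \text{s/frame on the farm}}{1/24\ \text{s/frame in real time}} = 7200 \times 24 \approx 1.7 \times 10^{5}$$

A speedup on the order of 100,000x out of a single card isn't happening, so resolution, geometry, or lighting has to give somewhere.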

Damn marketing! At least they chose a worthy movie :p
 

manko

Golden Member
May 27, 2001
1,846
1
0
Last year at SIGGRAPH, Nvidia did the same thing: "rendering" the 'Final Fantasy' movie in real time. But it wasn't really anything close to film quality.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
You beat me to it, Manko. I was about to say the same thing. This announcement reeks of the NVidia demo claiming real-time FF rendering. Technically it was rendering in real time, but it wasn't rendering the Final Fantasy movie. What it rendered was a reasonable facsimile at much lower resolution and at 2.5 fps, which is not what one thinks of when reading that FF was rendered in real time.
 

CrazySaint

Platinum Member
May 3, 2002
2,441
0
0
Originally posted by: gf4200isdabest
I knew this was gonna happen. A couple of months ago I had 3 friends rail on me for claiming this would happen anytime soon, but I merely pointed out a striking resemblance between the Nature scene in 3DMark and the "Shire".

One question though: "using ATI's FIRE GL™ workstation graphics technology." WTF!!?? FireGL isn't the same thing as the 9700. Are they using a graphics chip currently available on the market?

FireGL is ATI's high-end professional graphics card line for people who create and render 3D scenes for a living, not a desktop gaming card.
 

jbond04

Senior member
Oct 18, 2000
505
0
71
Actually, this time nVIDIA (and ATi for that matter) aren't lying. With highly programmable floating-point vertex and pixel shaders, they can accelerate 3D rendering far beyond what a CPU can do. 3D rendering benefits greatly from having multiple processors work on one job, and a GPU is effectively many parallel processors on a single chip, so it fits the bill better than a CPU. What remains to be seen, however, is exactly what they can't replicate because of programming limitations. Now, ATi probably had to play it at 640x480, but keep in mind that this card (and all other DX9 cards, for that matter) could render it faster than any CPU even at 4000x3000. Before long, 3D artists will be using graphics cards instead of render farms to accelerate 3D rendering.
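
Here's a minimal sketch of that parallelism argument (a hypothetical illustration in modern CUDA, which didn't exist yet in 2002 and is not what the demo used; the shader pipeline of the day worked differently, but the one-thread-per-pixel idea is the same):

```cuda
// Hypothetical sketch, NOT the ATI/Massive demo code: one GPU thread
// computes the lighting for one pixel, so thousands of pixels are
// shaded simultaneously instead of one after another on a CPU.
#include <cuda_runtime.h>

__global__ void shade(const float3 *normals, float3 lightDir,
                      float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    // Simple Lambertian (N . L) diffuse term, clamped at zero.
    float3 nv = normals[i];
    float d = nv.x * lightDir.x + nv.y * lightDir.y + nv.z * lightDir.z;
    out[i] = d > 0.0f ? d : 0.0f;
}

int main(void)
{
    const int n = 640 * 480;  // one shading result per pixel of the demo window
    float3 *normals;
    float  *out;
    cudaMallocManaged(&normals, n * sizeof(float3));
    cudaMallocManaged(&out, n * sizeof(float));
    // ... normals would be filled in from the scene geometry here ...
    float3 light = make_float3(0.577f, 0.577f, 0.577f);  // unit light direction
    shade<<<(n + 255) / 256, 256>>>(normals, light, out, n);
    cudaDeviceSynchronize();
    cudaFree(normals);
    cudaFree(out);
    return 0;
}
```

A CPU would walk those 307,200 pixels in a loop; the GPU launches them as parallel threads, which is exactly why it wins at this kind of embarrassingly parallel work.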

So before you accuse nVIDIA or ATi of "faking" it, get the facts. :disgust:


/me is just waiting to get a hold of the NV30. I can already hear Santa Claus coming down the chimney. :)
 
Aug 10, 2001
10,420
2
0
The press release is extremely vague: it doesn't say whether this will be a FireGL 2, 4, 8800, or some new FireGL card.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
All this means nothing to me... ATI can render WWII and I still won't care. I want to see the drivers at work first... their FireGL cards don't have the same trouble as their consumer cards.
 

JellyBaby

Diamond Member
Apr 21, 2000
9,159
1
81
So what's meant by "real time"? Twenty-four FPS or better? And at what resolution will this scene be rendered?
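
For a sense of scale (assuming 24 fps as the film standard, a typical ~2K film frame of 2048x1556, and a 640x480 demo window; the resolutions are guesses, not from the press release):

$$2048 \times 1556 \times 24 \approx 7.6 \times 10^{7}\ \text{pixels/s}, \qquad 640 \times 480 \times 24 \approx 7.4 \times 10^{6}\ \text{pixels/s}$$

That's roughly a factor of ten in raw pixel throughput alone, before counting any difference in per-pixel shading quality.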