Soccerman06
Do you really want the person you shoot in the face in a game to look like a real person?
The actual content. Say you want a realistic city, 100 square blocks full of lifelike stuff; somebody has to model and skin all of it.
I think what he's trying to say is: how much will costs ramp up in order to have games that truly replicate real life?
Considering the development costs of games nowadays, the amount of dollars and human capital needed to develop a game with a sprawling metropolis of 100,000+ people must be insane.
Well, "not practical" in the sense that it is not practical to brute force a strong AES key (an average of billions of years on the world fastest super computer) type impractical. As in, so impractical it might be impossible.I didn't say it wasn't practical. There's a dead guy who made a proof about it before practical computers ever existed.
It is a very fair argument, but I thought the statement you were making is that we will need thousands of times the CPU power and hundreds of thousands of times the RAM to render today's movie CG in real time.

Even so, if 128GB can do what we have now...
And how much will be needed to store a scene that doesn't look like CG to a discerning viewer who is not attempting to suspend disbelief? Today's CG looks much better than robots, masks and make-up. I'm far from against it. But I have yet to see anything but high-atmosphere terrain renders that could be mistaken for being shot with a camera.
And would it be possible to render it in real time using only the same amount of memory as when taking a great deal of time to render it? Every time we've had such promises in the past with respect to games, it has typically taken far more resources to actually do it.
Surely within our lifetime; see below.
128 times faster = 2^7 faster.
18 months * 7 = 126 months = 10.5 years.
Moore's law (and all historical data) says we will have 128x the performance we have today in 10.5 years.
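The same doubling arithmetic as a quick Python sketch (the 18-month doubling period is just the usual Moore's-law assumption, not a measured figure):

```python
import math

def years_until_speedup(factor, months_per_doubling=18):
    # 128x = 2^7, i.e. 7 doublings at 18 months each
    doublings = math.log2(factor)
    return doublings * months_per_doubling / 12

print(years_until_speedup(128))  # -> 10.5
```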
Yeah but the servers are terrible and the admins are bastards.

the best video game out right now is .... real life ! you get pure HD, physics without physx processing, antialiasing + shadows without any performance problems.
I think we need to take a step back and start pumping up anti-aliasing and texture filtering.
3dfx advertised the idea way back in 2000 that for photo-realistic graphics we need tons of anti-aliasing and filtering. However, at the time there was still quite a bit of improvement needed for 3D graphics. At this point, though, I think we need to take a step back and eliminate jaggies and shimmering from computer graphics; everything else looks so good right now that I'd rather see the obviously wrong things eliminated from images before we move on to even more complex details.
Oh yeah, and animations need to get much, much better.
Yes and no. Today's real-time CG will need more power than is being used for today's movie CG, because AFAIK movies aren't being rendered at 60 FPS today. But the processing need is a linear function, and even with power concerns keeping Moore's Law from translating directly into performance, performance improvements are occurring much faster than linearly, and include consistent gains in performance per transistor on top of cramming more transistors together. RAM would not likely need to be excessively higher, but I'd bet it would take more (if nothing else, the entire scene has to be created, rather than just some parts). I'm not sure how long a frame typically takes to render. If it took 4 hours, our future box would need to be 864,000x as fast, nearly 2^20 ((4h*60m*60s*1000ms)/(1000ms/60Hz) is how I came up with that); 1 hour, nearly 2^18; 1/2 hour, nearly 2^17; etc.

It is a very fair argument, but I thought the statement you were making is that we will need thousands of times the CPU power and hundreds of thousands of times the RAM to render today's movie CG in real time.
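Here is the per-frame arithmetic from the "Yes and no" post above as a small Python sketch (the 4 h, 1 h, and 0.5 h per-frame render times are just the assumed figures, not measured ones):

```python
import math

def realtime_speedup(hours_per_frame, fps=60):
    # how many times faster a real-time renderer at `fps` must be than one
    # that spends `hours_per_frame` on a single frame
    return (hours_per_frame * 3600) / (1 / fps)

for h in (4, 1, 0.5):
    factor = realtime_speedup(h)
    print(f"{h} h/frame -> {factor:,.0f}x (~2^{math.log2(factor):.1f})")
# 4 h/frame   -> 864,000x (~2^19.7)
# 1 h/frame   -> 216,000x (~2^17.7)
# 0.5 h/frame -> 108,000x (~2^16.7)
```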
Agreed. IMO, we hit a milestone when Doom 3 and HL2 came out (Doom 3 looking more immersive, HL2 being the better game). Both had immense work put into making the environments appear and act in a cohesive manner. Crysis then took it to a refined pinnacle. It's not that more detail will be a bad thing, but the little artifacts here and there, and unintuitive physics, detract from the experience far more than added detail manages to enhance it. In addition, games that rely on high detail tend to look much worse when settings are lowered for performance than games made with lower detail to begin with.

I think we need to take a step back and start pumping up anti-aliasing and texture filtering.
That, the last I read a few years ago, was talking days/weeks for short scenes, and this is not information that Google is exceptionally helpful with. If they can use dozens of standard CPUs and get it real-time or near it now, then sure, 10-15 years and we're there.

I don't know what makes you think that 2 to 4 hours per FRAME is in any way realistic.
Save games would be really nice to have, too.

Yes, but the lack of respawn really sucks. :awe:
I don't know what makes you think that 2 to 4 hours per FRAME is in any way realistic.
According to an article online on Information Management, each frame in Avatar took hours to render and required a huge computing facility:

The computing core - 34 racks, each with four chassis of 32 machines each - adds up to some 40,000 processors and 104 terabytes of RAM.
...
Each frame of the 24 frame-per-second movie saw multiple iterations of back and forth between directors and artists and took multiple hours to render.
OK, so that is 34 * 4 * 32 = 4352 individual machines. Quick division gives 9.2 processors per machine; that rounding doesn't seem right. Neither 8 nor 10 processors per machine adds up to anything that should round to 40,000 easily: 8 gives us 34,816, and 10 gives us 43,520. Maybe they are heterogeneous and contain CPUs and GPUs?

The computing core - 34 racks, each with four chassis of 32 machines each - adds up to some 40,000 processors and 104 terabytes of RAM. The blades read and write against 3 petabytes of fast fiber channel disk network area storage from BluArc and NetApp.
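A quick sanity check of those numbers in Python (the rack, chassis, machine, and processor counts are taken straight from the quoted article):

```python
racks, chassis_per_rack, machines_per_chassis = 34, 4, 32
machines = racks * chassis_per_rack * machines_per_chassis

print(machines)            # 4352 individual machines
print(40_000 / machines)   # ~9.19 processors per machine -- an odd number
print(machines * 8)        # 34816 -- doesn't round to 40,000
print(machines * 10)       # 43520 -- doesn't either
```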
AMD is already producing the first full-length ray-traced anime feature film that is also a video game, so the movie and the game can have exactly the same graphics. Other games are coming out that use ray tracing for things like picking out which car you want to drive, or for cut scenes. However, real-time ray-traced games that look lifelike are at least twenty years in the future, so I wouldn't hold my breath.
4 hours/frame * 240,000 frames = 960,000 hours = 40,000 days ≈ 109.6 YEARS!
Avatar did NOT take 110 years to render; this fails the most basic of sanity checks. I must simply conclude that the article is FALSE.
No, the author of the article made that mistake. I suggested that he misunderstood, and that they were telling the author it is hours of work for a single node within the farm.

You're making the extremely incorrect assumption that they only render one frame at a time using the entire facility. The whole point of a render farm is to produce many frames at once, in parallel.
taltamir said:
I had an idea: maybe the article writers garbled up a claim of "several work hours", as in several hours on a single machine out of the 4352 in the cluster. This would mean the entire cluster could render it in 9.2 days (110 single-machine work years for the entire movie / 4352 machines).
Still a ridiculously long time. Then again, this is Avatar we are talking about.
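For reference, the two calculations from the posts above as one Python sketch (240,000 frames, 4 hours per frame, and 4352 machines are the figures assumed in this thread, not confirmed numbers):

```python
frames = 240_000
hours_per_frame = 4
machines = 4352

serial_hours = frames * hours_per_frame      # 960,000 machine-hours of work
print(serial_hours / 24 / 365)               # ~109.6 "years" on a single machine
print(serial_hours / machines / 24)          # ~9.2 days with the whole farm working in parallel
```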