How far can graphics in video games go?


Ross Ridge

Senior member
Dec 21, 2009
2. The article CLAIMS that each frame "took multiple hours to render" and that the movie is 240,000 frames long. This is because the article is FULL OF SHIT!
...

As I already said, multiple frames would've been rendered at the same time.

You simply CAN NOT develop a movie "blind"; you have to make something, render, examine, make changes, render, examine, render, etc.

Again, as I already explained, they had access to a much faster low-quality renderer capable of rendering scenes in realtime. The article also explains that an iterative process like the one you described was used during the final rendering process.

The fully rendered movie would've taken months to completely render, in addition to months of rendering frames that were ultimately thrown away, either to be redone or cut from the final movie. Avatar took a couple of years to make and cost between $200M and $300M, depending on who you believe. The motion-capture filming took a month, the live-action filming another month. Most of the time and money was spent creating the computer-animated parts of the movie. It wasn't and couldn't have been done in a month.

The fact is that PCs are many orders of magnitude away from rendering movie-quality visuals in realtime. Even the massive render farms used to make these movies are several orders of magnitude away from rendering them at full quality in realtime.
 

taltamir

Lifer
Mar 21, 2004
As I already said, multiple frames would've been rendered at the same time.
And as I have already said, in the same post you quoted from, the article has it wrong.
You cannot "render multiple frames in parallel" if each frame is rendered on the entirety of the array (which doesn't actually make sense). You can render multiple portions at once if each is rendered on its own station in the farm.

The fully rendered movie would've taken months to completely render
No, it wouldn't. If it was 4 hours per entire farm, it would have taken 110 years. If it was 4 hours per station, it would have taken 9 days. And in both cases I don't think it was hours of rendering but hours of human work. If a team of 55 worked on it for 2 years, they could simply have extrapolated from there that it took hours per frame. Hours of human work per frame, not hours to render.
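A quick back-of-the-envelope check of those figures, as a minimal Python sketch. The 240,000-frame count and the 4-hours-per-frame figure come from the thread; the station count is an assumption, a round number picked to reproduce the roughly-9-days result:

Code:
HOURS_PER_YEAR = 24 * 365          # ~8,760 hours

frames = 240_000                   # frame count quoted in the thread
hours_per_frame = 4                # "4 hours per frame" from the thread

# Case 1: each frame ties up the entire farm for 4 hours (fully serial).
serial_hours = frames * hours_per_frame
print(serial_hours / HOURS_PER_YEAR)   # ~109.6 years -> the "110 years"

# Case 2: 4 hours per station, with many frames rendered in parallel.
stations = 4_500                   # assumed station count (not from the thread)
print(serial_hours / stations / 24)    # ~8.9 days -> the "9 days"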

Even the massive render farms used to make these movies are several orders of magnitude away from rendering them at full quality in realtime.
I have done the math on it. If it takes a single station in the cluster 4 hours to render a frame, we would need machines 800,000 times faster to do it in real time. Luckily, exponential growth means that in 30 years we will have machines over a million times faster.
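The same kind of sketch reproduces the real-time and exponential-growth numbers. The 60 fps target and the 18-month doubling period are my assumptions; they are simply the values that make the quoted figures come out:

Code:
seconds_per_frame = 4 * 3600       # 14,400 s to render one frame today
target_fps = 60                    # assumed real-time frame rate

print(seconds_per_frame * target_fps)  # 864,000 -> roughly the "800,000x faster"

# Assumed Moore's-law rate: one doubling every 18 months.
doublings = 30 / 1.5               # 20 doublings in 30 years
print(2 ** doublings)              # ~1,048,576 -> "over a million times faster"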
 

Merad

Platinum Member
May 31, 2010
No, the author of the article made that mistake. I suggested that he misunderstood and that they were telling the author it is hours of work for a single node within the farm.
I even said so in the SAME POST

The author of the article never said anything other than "several hours per frame". Not sure where you gathered that that was using the entire power of the server farm.



And to be honest, reading further into it I now believe it is much less than that, because they mentioned humans were involved in a "back and forth": "send to render", "review by humans", "make changes", "repeat". 110 work-years makes sense. A team of 55 spending 2 years, or a team of 110 spending one year on it, will get you 110 work-years' worth of HUMAN work (not rendering).

Why is this so hard for you to believe? When they're still tweaking things like animations and effects, they'll do lower-quality renders or tests of those specific things, all of which run significantly faster than the full-quality scene (which I'd imagine only gets rendered once or twice in the whole production).

A couple of hours per frame is really nothing horrific when you're talking about cinema-quality CG. I've seen several sources saying that Cars took 12+ hours per frame to render.
 

taltamir

Lifer
Mar 21, 2004
Why is this so hard for you to believe?
I am a scientist. I don't do belief, but fact and evidence. I have suggested a variety of possible interpretations of the data and done the math for all of them. I have drawn conclusions about which are more and less likely and plausible, and which are completely impossible.
What is YOUR problem?
 

Con111112

Junior Member
Jun 19, 2019
Yes. It is already almost doable with ray tracing. I think as improvements occur there will be a time when realistic looking games are plausible. The real question is will they want to go that far? Do you really want the person you shoot in the face in a game to look like a real person?
ARE YOU redacted EVERYONE WANTS THAT!!

Welcome to the forums! We do not allow personal insults in the tech forums, so please read the posting guidelines before posting again.


https://forums.anandtech.com/threads/anandtech-forum-guidelines.60552/

AT Mod Usandthem
 

GodisanAtheist

Diamond Member
Nov 16, 2006
Well, since this topic is a thing again:

We can already render stills that are practically indistinguishable from real life, especially if the still image does not have any animals or people in it. It's not unreasonable to assume we will get to a point where we can do that in real time at high resolution.

The real missing ingredient is that real life doesn't just look real, it behaves real, and that's where the trick really is. AI, Physics, Animation: all these things need to keep pace with the "quality" of the raw graphics on display, and the level of computing power and effort to resolve these likely isn't worth the investment.

If the real world is a massive simulation (from a metaphysical perspective), no simulation within the real world is ever going to be able to reproduce the real-world simulation it's running in (no virtual machine is more capable than the host machine it's running on). So we have a hard limit there as well.

Gaming is also a sort of escapist form of entertainment. I enjoy the artistic elements of a game's aesthetic as part of the overall experience of gaming. Borderlands' comical violence is enhanced by its cel-shaded graphics, and would be diminished if it just looked like a Mad Max movie. Bioshock is a visual feast because it exaggerates elements of reality, not because it reproduces them.

To paraphrase everyone's favorite fake scientist, Dr. Ian Malcolm: let's not be so preoccupied with whether or not we could, but with whether or not we should...
 

SMOGZINN

Lifer
Jun 17, 2005
The real missing ingredient is that real life doesn't just look real, it behaves real, and that's where the trick really is. AI, Physics, Animation: all these things need to keep pace with the "quality" of the raw graphics on display, and the level of computing power and effort to resolve these likely isn't worth the investment.

I agree, but it is worth remembering that that is only the case right now. If we have learned anything, it is that computing power continues to grow despite all expectations of us hitting either a physical or practical limit. The new Raspberry Pi 4B is more powerful than the so-called supercomputers I used for AI research in university years ago. I remember enthusiastically chatting on these forums about buying my first 1 GHz processor.
What is unreasonably expensive today will literally be a children's toy in a decade.

If the real world is a massive simulation (from a metaphysical perspective), no simulation within the real world is ever going to be able to reproduce the real-world simulation it's running in (no virtual machine is more capable than the host machine it's running on). So we have a hard limit there as well.

(Note: I'm just arguing this as a thought experiment; I don't believe in the simulated-reality hypothesis.)

That is not necessarily true. If, for example, reality was created and run on something like a cosmic Commodore 64, but over the trillions of years has been upgraded many times while keeping the programming the same (for the sake of continuity), then it might have plenty of overhead left to run a virtual machine with even more power than the host OS is using.

I could, for example, run a Windows 10 VM inside Windows 95.

Gaming is also a sort of escapist form of entertainment. I enjoy the artistic elements of a game's aesthetic as part of the overall experience of gaming.

And there will always be room for that, but even that could become ultra-realistic inside its own framework: a living 'Roger Rabbit' cartoon world.