How far can graphics in video games go on?

Feb 4, 2009
34,497
15,729
136
I think it would be smarter to make them more fun. For example, I had countless hours of fun with this when I was a kid:


[screenshot]

My friends & I also had tons of fun with Empire


[screenshot]

So I'd rather see games do something unique than look better.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
The actual content. Say you want a real city, 100 square blocks full of lifelike stuff; somebody has to model and skin all of that stuff.

That is a fair argument. An increase in capability could mean an increase in design costs.
Many games are coming out today from indie developers with very dated graphics.

I think what he's trying to say is: how much will costs ramp up in order to have games that truly replicate real life?

Considering the development costs of games nowadays, the amount of dollars and human capital needed to develop a game with a sprawling metropolis of 100,000+ people must be insane.

True, but you could procedurally generate those people and the metropolis.
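To show what I mean by that, here's a toy sketch (nothing from a real engine; the generate_city helper, the grid size, and the attributes are all made up for illustration): the hand-authored part shrinks to a few rules plus a seed, and the machine expands that into a city and a crowd.

Code:
import random

def generate_city(blocks_x, blocks_y, people, seed=42):
    """Toy procedural generator: lay out a grid of city blocks and
    scatter a population across them, all from a single random seed."""
    rng = random.Random(seed)
    city = []
    for x in range(blocks_x):
        for y in range(blocks_y):
            city.append({
                "block": (x, y),
                "building_height": rng.randint(2, 60),  # storeys
                "style": rng.choice(["brick", "glass", "concrete"]),
            })
    population = [
        {"home_block": (rng.randrange(blocks_x), rng.randrange(blocks_y)),
         "daily_route_seed": rng.getrandbits(32)}
        for _ in range(people)
    ]
    return city, population

# 10x10 blocks and 100,000 inhabitants, reproducible from one seed
city, population = generate_city(10, 10, 100_000)
print(len(city), "blocks,", len(population), "people")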

I didn't say it wasn't practical. There's a dead guy who made a proof about it before practical computers ever existed.
Well, "not practical" in the sense that it is not practical to brute force a strong AES key (an average of billions of years on the world fastest super computer) type impractical. As in, so impractical it might be impossible.

Even so, if 128GB can do what we have now...
And how much will be needed to store a scene that doesn't look like CG to a discerning viewer who is not attempting to suspend disbelief? Today's CG looks much better than robots, masks, and make-up. I'm far from against it. But I have yet to see anything but high-atmosphere terrain renders that could be mistaken for being shot with a camera.

And would it be possible to render it in real time using only the same amount of memory as rendering it over a great deal of time? Every time we've had such promises in the past with regard to games, it has typically taken far more resources to actually do it.
It is a very fair argument, but I thought the statement you were making was that we will need thousands of times the CPU power and hundreds of thousands of times the RAM to render today's movie CG in real time.
I disagree with such a statement: to render today's movie CG in real time we need exactly the same amount of RAM they use to render that CG offline; we just need a lot more computational power.

To render them in such a photo-realistic manner as to be indistinguishable from reality? For that, I do not know how much RAM and computational power we might need. It could indeed be very much as you say in such a case.
 
Last edited:

BladeVenom

Lifer
Jun 2, 2005
13,540
16
0
Surely within our lifetime, see below.

128 times faster = 2^7 faster.
18 months * 7 = 126 months = 10.5 years.
Moore's law (and all historical data) says we will have 128x the performance we have today in 10.5 years.

Exactly. It's just a matter of time. Besides the hardware improvements, the software and techniques will also get better in time.
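Just to make the doubling arithmetic in the quote explicit, here's a small sketch (the 1.5-year doubling period is the assumption from the quote, not a guarantee):

Code:
import math

DOUBLING_PERIOD_YEARS = 1.5  # assumed Moore's-law-style doubling period

def years_for_speedup(factor, doubling_period=DOUBLING_PERIOD_YEARS):
    """Years of exponential growth needed to reach a given speedup factor."""
    return math.log2(factor) * doubling_period

print(years_for_speedup(128))      # 2^7  -> 10.5 years, matching the quote
print(years_for_speedup(864_000))  # ~2^19.7 -> ~29.6 years, for a much bigger jump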
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
I think we need to take a step back and start pumping up anti-aliasing and texture filtering.

3dfx advertised the idea way back in 2000: for photorealistic graphics we need tons of anti-aliasing and filtering. However, at the time there was still quite a bit of improvement needed in 3D graphics. At this point, though, I think we need to take a step back and eliminate jaggies and shimmering from computer graphics. Everything else looks so good right now that I'd rather see the obviously wrong things eliminated from images before we move on to even more complex details.
Oh yeah, and animations need to get much, much better.
 

gorcorps

aka Brandon
Jul 18, 2004
30,736
447
126
I think we need to take a step back and start pumping up anti-aliasing and texture filtering.

3dfx advertised the idea way back in 2000: for photorealistic graphics we need tons of anti-aliasing and filtering. However, at the time there was still quite a bit of improvement needed in 3D graphics. At this point, though, I think we need to take a step back and eliminate jaggies and shimmering from computer graphics. Everything else looks so good right now that I'd rather see the obviously wrong things eliminated from images before we move on to even more complex details.
Oh yeah, and animations need to get much, much better.

Agreed. Most larger surfaces look pretty good but to make thin things like cables, leaves, grass and such look believable the jaggies have to be pretty much zero.
 

Pr0d1gy

Diamond Member
Jan 30, 2005
7,775
0
76
With the steady and exponential growth of technology, there is no reason to think we won't have photorealistic 3-dimensional holographic projected images.

Everyone should read this article:

http://www.time.com/time/health/article/0,8599,2048138,00.html

I could see us using 3D holographic projectors with true depth and accurate sizes within 10-20 years. You'll basically be playing in the real world, visually speaking.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
It is a very fair argument, but I thought the statement you were making was that we will need thousands of times the CPU power and hundreds of thousands of times the RAM to render today's movie CG in real time.
Yes and no. Today's CG will need more power than is being used for today's CG, because AFAIK, movies aren't being rendered today at 60FPS. But the extra processing needed is just a linear factor. Even with power concerns making Moore's Law not apply as direct performance improvements, performance improvements are occurring much faster than linear, and include consistent improvements in performance per xtor, on top of cramming more xtors together. RAM would not likely need to be excessively higher, but I'd bet it would take more (if nothing else, the entire scene has to be created, rather than just some parts). I'm not sure how long a frame typically takes. If it took 4 hours, our future box would need to be 864,000x as fast, nearly 2^20 ((4h*60m*60s*1000ms)/(1000ms/60Hz) is how I came up with that). 1 hour, nearly 2^18; 1/2 hour, nearly 2^17; etc.
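Here is the same per-frame arithmetic spelled out, if anyone wants to plug in other numbers (the per-frame times are the hypothetical ones above, not measurements):

Code:
import math

TARGET_FPS = 60
FRAME_BUDGET_S = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 FPS

# How much faster a renderer must get to turn an offline per-frame time
# into a real-time frame budget.
for hours_per_frame in (4, 1, 0.5):
    offline_s = hours_per_frame * 3600
    speedup = offline_s / FRAME_BUDGET_S
    print(f"{hours_per_frame:>4} h/frame -> {speedup:,.0f}x (~2^{math.log2(speedup):.1f})")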

I'd bet within 5-8 years, we'll have PC video cards capable of something like Shrek or Madagascar (lighting quality is where current and near-future high-end cards are lacking, compared to such CG), and within 20 years, current CG mixed into real shots (Avatar, Pirates of the Caribbean sequels, etc.).

Detail approaching reality, however, will take either insane paradigm changes, or more FLOPS and RAM than we can conceive a use for today. Even if we just go with processing power and RAM, I don't think we would be able to manage the man-hours necessary to create the content, so those changes in how we approach creating a 3D environment to be rendered will still be needed, just to handle that aspect of it, if not the processing needs of rendering.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I think we need to take a step back and start pumping up anti-aliasing and texture filtering.

3dfx advertised the idea way back in 2000: for photorealistic graphics we need tons of anti-aliasing and filtering. However, at the time there was still quite a bit of improvement needed in 3D graphics. At this point, though, I think we need to take a step back and eliminate jaggies and shimmering from computer graphics. Everything else looks so good right now that I'd rather see the obviously wrong things eliminated from images before we move on to even more complex details.
Oh yeah, and animations need to get much, much better.
Agreed. IMO, we hit a milestone when Doom 3 and HL2 came out (Doom 3 looking more immersive, HL2 being a better game, IMO :)). Both had immense work put into making the environments appear and act in a cohesive manner. Crysis then took it to a refined pinnacle. It's not that more detail will be a bad thing, but that the little artifacts here and there, and unintuitive physics, detract from the experience far more than added detail manages to enhance it. In addition, games that rely on high detail tend to look much worse when settings are lowered for performance, than games made with lower detail to begin with.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Yes and no. Today's CG will need more power than is being used for today's CG, because AFAIK, movies aren't being rendered today at 60FPS. But the extra processing needed is just a linear factor. Even with power concerns making Moore's Law not apply as direct performance improvements, performance improvements are occurring much faster than linear, and include consistent improvements in performance per xtor, on top of cramming more xtors together. RAM would not likely need to be excessively higher, but I'd bet it would take more (if nothing else, the entire scene has to be created, rather than just some parts). I'm not sure how long a frame typically takes. If it took 4 hours, our future box would need to be 864,000x as fast, nearly 2^20 ((4h*60m*60s*1000ms)/(1000ms/60Hz) is how I came up with that). 1 hour, nearly 2^18; 1/2 hour, nearly 2^17; etc.

1. My point is that it will not take any more RAM: the renderer already has to load every asset in the scene into RAM.
2. Four hours per FRAME is ridiculous! A couple of years ago I read a magazine article about one of the real render farms used to make real movies. As I remember, it cost about a million dollars: a dozen computers with four quad-core CPUs and 128GB of RAM each, and it rendered in real time. The whole point of the article was that such an expensive million-dollar farm is worth it for the real-time rendering, which allows for smoother work while developing the movie.
A budget render setup would be as little as one lone computer shoved full of RAM. It still costs a bundle due to the excessive amount of RAM, and takes a day or two to render.
http://www.tyan.com/product_board.aspx
The real budget computers use much cheaper hardware with as much off-the-shelf RAM as you can shove in (64GB back then), and they simply CANNOT render images as impressive, because the amount of RAM you have acts as a hard limit on the amount of assets you can have in each scene; it's as simple as that.

I don't know what makes you think that 2 to 4 hours per FRAME is in any way realistic.
 
Last edited:

Pheran

Diamond Member
Apr 26, 2001
5,849
48
91
The best video game out right now is... real life! You get pure HD, physics without PhysX processing, and anti-aliasing + shadows without any performance problems.

Yes, but the lack of respawn really sucks. :awe:
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I don't know what makes you think that 2 to 4 hours per FRAME is in any way realistic.
That the last thing I read, a few years ago, was talking about days or weeks for short scenes, and this is not information that Google is exceptionally helpful with. If they can use dozens of standard CPUs and get it real-time or near it now, then sure, 10-15 years and we're there.
 

Ross Ridge

Senior member
Dec 21, 2009
830
0
0
I don't know what makes you think that 2 to 4 hours per FRAME is in any way realistic.

According to an article on Information Management, each frame in Avatar took hours to render and required a huge computing facility:

The computing core - 34 racks, each with four chassis of 32 machines each - adds up to some 40,000 processors and 104 terabytes of RAM.
...
Each frame of the 24 frame-per-second movie saw multiple iterations of back and forth between directors and artists and took multiple hours to render.

I assume more than one frame was rendered at the same time, but it wasn't capable of rendering the movie at anywhere near realtime.

Note that some sort of real-time rendering system was used to preview what scenes would look like in Avatar, but this was nowhere near the quality of the final product.
 

Soccerman06

Diamond Member
Jul 29, 2004
5,830
5
81
I think the first step is going to be developing a monitor with 300+ dpi so that you don't get jaggies in the first place, so a 30" screen would be 36.7 MP. Damn, that would be... 7650x4800, which is kinda doable with current GPUs.



That's halfway there at 8400x2100, so next-gen doable?



The same dude is building a 7x5 monitor setup; I wonder how it'll look (61.7MP). So now some manufacturer needs to put half that many pixels in one monitor and we're halfway there!
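Quick sketch of where those pixel counts come from, for anyone who wants to plug in other sizes (I'm assuming a 16:10 panel, since that isn't stated):

Code:
def panel_resolution(diagonal_in, dpi, aspect=(16, 10)):
    """Pixel dimensions for a panel of a given diagonal, pixel density, and aspect ratio."""
    ax, ay = aspect
    diag_units = (ax ** 2 + ay ** 2) ** 0.5
    return round(diagonal_in * ax / diag_units * dpi), round(diagonal_in * ay / diag_units * dpi)

w, h = panel_resolution(30, 300)             # 30" 16:10 panel at 300 dpi
print(w, "x", h, f"= {w * h / 1e6:.1f} MP")  # roughly 7632 x 4770, ~36.4 MP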
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76
According to an article on Information Management, each frame in Avatar took hours to render and required a huge computing facility:

Interesting. It naturally varies from movie to movie, but that is quite a lot of variance. The articles I was reading were from 2-3 years before Avatar, about different movies. Seems Avatar dialed it up to eleven, though, what with taking that much computing power. Speaking of which:
The computing core - 34 racks, each with four chassis of 32 machines each - adds up to some 40,000 processors and 104 terabytes of RAM. The blades read and write against 3 petabytes of fast fiber channel disk network area storage from BluArc and NetApp.
OK, so that is 34 * 4 * 32 = 4,352 individual machines. Quick division gives 9.2 processors per machine; that rounding doesn't seem right. Ten processors or eight processors per machine don't add up to anything that should round to 40,000 easily: 8 gives us 34,816, 10 gives us 43,520. Maybe they are heterogeneous and contain CPUs and GPUs?

Mmm, OK, I got it. The article explains that:
1. This is the main render farm for the entirety of Pixar. Impressive, yes.
2. The article CLAIMS that each frame "took multiple hours to render" and that the movie is 240,000 frames long. This is because the article is FULL OF SHIT!
I keep seeing badly written articles that claim the most stupid things, especially if the topic is at all technical. From articles claiming the development of working shields, to invisibility, to ridiculous diet claims (dark chocolate healthier than fruit was the last one).

4 hours/frame * 240,000 frames = 960,000 hours = 40,000 days ≈ 109.6 YEARS!

Avatar did NOT take 110 years to render; this fails the most basic of sanity checks. I must simply conclude that the article is FALSE.

Let's do something more reasonable and assume that the final product took a whole whopping month to render. That is 30 days * 24 hours/day * 60 min/hour = 43,200 minutes.
240,000 frames divided by 43,200 minutes comes out to about 5.56 frames/minute.

Still unreasonably long; with such a super-cluster it would not surprise me if it took only hours to render the whole thing (and it had to be re-rendered again and again and again after every minute change).

You simply CANNOT develop a movie "blind"; you have to make something, render, examine, make changes, render, examine, render, etc.
PS. As far as the RAM is concerned, while each machine needs a full duplicate of all required assets in its RAM, it is not necessary on a per-PROCESSOR basis. Dividing the RAM amount by the processor count they give comes out to a paltry 2.6GB per processor. But these things work on a per-computer basis, so a cluster of 4,352 individual machines with 104 terabytes of RAM (104 * 1024 = 106,496GB) has about 24.5GB per machine... which still seems really, really low for a render farm. I am going to chalk it up to further inaccuracies in the article. Maybe some of those machines have the hardware and software in place to share their RAM; I am really curious now to know more specifics about Pixar's render engine.

BTW, I had an idea: maybe the article writers garbled a claim of "several work-hours", as in, several hours on a single machine out of the 4,352 in the cluster. That would mean the entire cluster could render it in 9.2 days (110 single-machine work-years for the entire movie / 4,352 machines).
Still a ridiculously long time. Then again, this is Avatar we are talking about.

Still, if we are saying that it takes 4 work-hours on one of their machines to render a single frame, then a single machine would need a speedup of 4 (hours/frame) * 60 (minutes/hour) * 60 (seconds/minute) * 60 (frames/second) = 864,000x to do it in real time. 2^20 = 1,048,576, so call it 20 doublings; 20 * 1.5 years = 30 years. Exponential growth is that awesome.
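Putting the whole sanity check in one place so the assumptions are obvious (the frame count and hours-per-frame are the disputed figures from the article, and the machine count comes from its rack numbers):

Code:
FRAMES = 240_000          # quoted movie length in frames
HOURS_PER_FRAME = 4       # the disputed "multiple hours" figure
MACHINES = 34 * 4 * 32    # 4,352 machines, from the article's rack/chassis counts

total_machine_hours = FRAMES * HOURS_PER_FRAME       # 960,000 machine-hours
sequential_years = total_machine_hours / 24 / 365    # ~109.6 years if rendered one frame at a time
parallel_days = total_machine_hours / MACHINES / 24  # ~9.2 days if every machine renders in parallel

print(f"sequential: {sequential_years:.1f} years, fully parallel: {parallel_days:.1f} days")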
 
Last edited:

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
AMD is already producing the first full-length ray-traced anime feature film that is also a video game. That way the movie and the game can have exactly the same graphics. Other games are coming out that use ray tracing for things like picking out which car you want to drive, or for cut scenes. However, real-time ray-traced games that look lifelike are at least twenty years in the future, so I wouldn't hold my breath.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
AMD is already producing the first full-length ray-traced anime feature film that is also a video game. That way the movie and the game can have exactly the same graphics. Other games are coming out that use ray tracing for things like picking out which car you want to drive, or for cut scenes. However, real-time ray-traced games that look lifelike are at least twenty years in the future, so I wouldn't hold my breath.

I honestly think that is silly. I have seen this push to "gamize movies" and it's beyond stupid. Games look so imperfect compared to movies because they have to be rendered in real time; games make concessions, similar but not the same. To cripple your movie by intentionally making it look worse than it could is just a silly gimmick.
Last I heard, the people proposing this were unable to get any studios to back them. Seems that while the movie studios wouldn't back it, a GPU company (AMD) and an anime studio were willing to.
 

Merad

Platinum Member
May 31, 2010
2,586
19
81
4 hours/frame * 240,000 frames = 960,000 hours = 40,000 days ≈ 109.6 YEARS!

Avatar did NOT take 110 years to render; this fails the most basic of sanity checks. I must simply conclude that the article is FALSE.

You're making the extremely incorrect assumption that they only render one frame at a time using the entire facility. The whole point of a render farm is to produce many frames at once, in parallel.

I'm sure a farm that size can produce at least a couple thousand frames during each 4-hour period. Assuming that figure (2,000 frames per 4 hours) is exactly correct, the full movie would take ((240,000 / 2,000) * 4) / 24 = 20 days to render.
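Parameterized, in case anyone wants to plug in a different guess for the farm's throughput (my 2,000-frames-per-4-hours figure is only an assumption, not a measured number):

Code:
def render_days(total_frames=240_000, frames_per_batch=2_000, hours_per_batch=4):
    """Days to render the movie if the farm finishes frames_per_batch frames every hours_per_batch hours."""
    return total_frames / frames_per_batch * hours_per_batch / 24

print(render_days())  # 20.0 days at 2,000 frames every 4 hours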
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
You're making the extremely incorrect assumption that they only render one frame at a time using the entire facility. The whole point of a render farm is to produce many frames at once, in parallel.
No, the author of the article made that mistake. I suggested that he misunderstood and that they were telling the author it is hours of work for a single node within the farm.
I even said so in the SAME POST

taltamir said:
I had an idea: maybe the article writers garbled a claim of "several work-hours", as in, several hours on a single machine out of the 4,352 in the cluster. That would mean the entire cluster could render it in 9.2 days (110 single-machine work-years for the entire movie / 4,352 machines).
Still a ridiculously long time. Then again, this is Avatar we are talking about.

And to be honest, reading further into it, I now believe it is much less than that. They mentioned humans were involved in a "back and forth", so it's "send to render", "review by humans", "make changes", repeat. 110 work-years makes sense: a team of 55 spending two years, or a team of 110 spending one year, gets you 110 work-years' worth of HUMAN work (not rendering).
 
Last edited: