How far can video game graphics go?

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
Yes. It is already almost doable with ray tracing. I think as improvements occur, there will be a time when realistic-looking games are plausible. The real question is: will developers want to go that far? Do you really want the person you shoot in the face in a game to look like a real person?
 

maniacalpha1-1

Diamond Member
Feb 7, 2010
3,562
14
81
Single-player games can continue pushing the graphical envelope, but multiplayer games like Battlefield need to slow down on graphics and focus on important things like increasing player counts. Or, if they're going to keep advancing the destruction stuff, model a dam that you can destroy to flood a multiplayer map.
 
Last edited:

chalmers

Platinum Member
Mar 14, 2008
2,565
0
76
Yes. It is already almost doable with ray tracing. I think as improvements occur, there will be a time when realistic-looking games are plausible. The real question is: will developers want to go that far? Do you really want the person you shoot in the face in a game to look like a real person?

Yes.
 

Powermoloch

Lifer
Jul 5, 2005
10,085
4
76
The best video game out right now is... real life! You get pure HD, physics without PhysX processing, and antialiasing + shadows without any performance problems.


JK

But seriously, it is possible. I'm just amazed at the progress the graphics industry has made so far. I'll keep my eyes open, lol.
 

Wyndru

Diamond Member
Apr 9, 2009
7,318
4
76
Wasn't there an experiment done a while back where they made some lifelike models for a demo and had people maneuver them like they would in a game?

IIRC the general consensus was that lifelike models creep people out, something about their eyes looking lifeless.

I'll see if I can find the experiment and post the link.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Do you think video games will look like real people, worlds, etc. in the future?

Look at movies with high-end CG. They look real.
The only thing preventing your video game from looking like a movie is that a video game has to be rendered in real time at a playable framerate. As the performance of video cards increases, games will get closer and closer to movie CG.
 

Merad

Platinum Member
May 31, 2010
2,586
19
81
Look at movies with high-end CG. They look real.
The only thing preventing your video game from looking like a movie is that a video game has to be rendered in real time at a playable framerate. As the performance of video cards increases, games will get closer and closer to movie CG.

Probably not in the very near future. It's pretty common for movie-quality CGI to take 5-10 minutes or more to render a single frame. Games need to render tens of thousands of times faster than that to be playable.
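A quick sanity check on that ratio (a rough sketch, assuming 5 minutes per movie frame and a 60 FPS target):

```python
# How much faster a game must render than offline movie CGI.
movie_frame_seconds = 5 * 60   # assume 5 minutes per movie frame
game_frame_seconds = 1 / 60    # 60 FPS target, ~16.7 ms per frame

speedup_needed = movie_frame_seconds / game_frame_seconds
print(round(speedup_needed))   # 18000, i.e. tens of thousands of times faster
```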

Hopefully what's going to start improving more in games are things like view distances. Even now, most games start chopping off detail pretty sharply after 20-30m.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,766
784
126
I'd be more interested in having games with real cities and such, not "Video Game Cities" with about 100 people in them, or towns with about 10. Assassin's Creed was pretty good at that.
 

Sureshot324

Diamond Member
Feb 4, 2003
3,370
0
71
I don't know. Graphics advancement seems to be slowing down. It's been a long time since I've really been amazed at the graphics in a game.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I don't know. Graphics advancement seems to be slowing down. It's been a long time since I've really been amazed at the graphics in a game.

It's not that advancement slowed down; it's diminishing returns on that advancement. Going from 8-bit color to 16-bit was huge; going from 16-bit to 24-bit was a lot less impressive. Going from 24-bit to 32-bit? Meh.

Same for resolutions, same for polygon fill rates, same for shadow quality, same for every other aspect. We got to a point where games already look "rather good." There is still room for improvement, lots of it, but it isn't as breathtaking an improvement as we've had before, because before, things really, really sucked: they were ugly, low-res, dull-colored, and blocky.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Do you think video games will look like real people, worlds, etc. in the future?
No. Despite that having been said over and over again, unless we can get raster renderers hundreds of times faster at a given resolution, with hundreds of thousands of times more memory available to them, there is not even a remote chance. By the time we can do that, we will be using hundreds, if not thousands, of times the pixels we do now, so... no. Even if the first part could come true, we would still need better ways to handle models and textures, because there is a hard-to-surmount unrealism in textures stretching over polygons as the vertices change position, and shader effects can only do so much to mask it.

Likewise, I haven't seen a realistic ray trace, but the unreal look tends to be different.

Look at it this way: CG done in movies takes days or weeks to render. You would need enough power to do that kind of work in <15ms, and to make it look good in real time, every last detail in the scene would have to be rendered the same way, so you'd likely need 5-10x more on top of that, before even getting into increased resolutions or artifact-detection issues that arise from not dealing with static scenery. In a movie, the background and motions have all been planned beforehand, and were likely even shot in reality; a game has to handle all of that from scratch, as a unique situation for every frame.

On top of that, as this goes on, we get used to the effects and become better at detecting where the faults are. Movies whose effects used to look realistic no longer do, because we humans are excellent at detecting patterns, and breaks in patterns; the better we make the technology, the more likely we are to find the very minor details that keep it from seeming real.
Going from 24-bit to 32-bit? Meh.
More like 0: not even worthy of a meh. I'm 99% sure that 32-bit color in all consumer/gaming video cards is really 24-bit color (16,777,216 colors) with 8 bits of padding, for simpler addressing.
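To illustrate the padding point, here is a minimal sketch (the function name is made up for illustration) of how a 32-bit pixel carries only 24 bits of actual color:

```python
# Pack 8-bit R, G, B channels into a 32-bit word; the top byte is unused padding.
def pack_xrgb(r: int, g: int, b: int) -> int:
    return (r << 16) | (g << 8) | b  # bits 24-31 stay zero

print(hex(pack_xrgb(255, 128, 0)))  # 0xff8000 -- top byte unused
print(2 ** 24)                      # 16777216 distinct colors in "32-bit" mode
```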
 
Last edited:

dpodblood

Diamond Member
May 20, 2010
4,020
1
81
Look at movies with high-end CG. They look real.
The only thing preventing your video game from looking like a movie is that a video game has to be rendered in real time at a playable framerate. As the performance of video cards increases, games will get closer and closer to movie CG.

This exactly. The question is how long until we see the technology that will allow us to render high-end CG-quality graphics in real time. Probably not in our lifetimes, but who knows.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
This exactly. The question is how long until we see the technology that will allow us to render high-end CG-quality graphics in real time. Probably not in our lifetimes, but who knows.

Surely within our lifetime, see below.

No. Despite that having been said over and over again, unless we can get raster renderers hundreds of times faster at a given resolution, with hundreds of thousands of times more memory available to them

128 times faster = 2^7 faster.
18 months × 7 = 126 months = 10.5 years.
Moore's law (and all historical data) says we will have 128x the performance we have today in 10.5 years.

As for hundreds of thousands of times more memory? Why? Movies are rendered using at most 128GB of RAM, and that will be available to the home user much sooner.
But if you do insist on it: 2^17 = 131,072, which is more than one hundred thousand.
In 25.5 years we are expected to have 131,072x the amount of RAM we have today.
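The doubling arithmetic above can be sketched in a few lines (a minimal sketch, assuming a fixed 18-month doubling period):

```python
import math

DOUBLING_MONTHS = 18  # assumed Moore's-law doubling period

def years_until(factor: float) -> float:
    """Years until capacity grows by `factor`, at one doubling per 18 months."""
    doublings = math.log2(factor)  # e.g. log2(128) = 7 doublings
    return doublings * DOUBLING_MONTHS / 12

print(years_until(128))     # 128x performance: 10.5 years
print(years_until(131072))  # 131,072x memory: 25.5 years
```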

I still remember my first computer... 22MHz CPU, 60MB HDD, 5-inch floppy, 1MB of RAM.
 
Last edited:

HeXen

Diamond Member
Dec 13, 2009
7,831
37
91
A.I. has always advanced sooo slowly. Think about it: in Turok 2 on the N64, enemies could hear you, had vision ranges, ran to hide behind objects, and chased you. Most games out today do no more than that. I mean, they're better at it now, but little has changed.
Some games like Oblivion have taken it quite a bit further, but it's still rough, and such games are few and far between.

Audio is a dead end, imo. Graphics have slowed due to consoles, and I doubt how we play games will change much anytime soon.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
As for hundreds of thousands of times more memory? That is untrue. Movies are rendered using at most 128GB of RAM. This will be available to the home user much sooner.
Are movie frames rendered in several milliseconds, creating an entire scene that could be mistaken for having been shot with a camera, with no CG involved? As long as you have enough total space to store the data, you can lower the memory use all you want and increase the time needed. I'm thinking we would need hundreds of times the polys (even with tessellation), textures easily many hundreds of times their current size (each doubling of detail takes about 4x the space), and if textures are replaced with other things, that space will get used up by buffers for those other things. We will also need the means to create accurate motion blur in real time.

What render farms have now can't do the job. You would need what render farms may have in 10 years, and then multiply that by the ratio between the render times for each use (hours or days vs. milliseconds). If less memory is used, the difference will need to be made up in processing power, so with less memory it may take tens or hundreds of thousands of times more FLOPS instead of GBs. All of this will need to fit in a small box and use <200W (preferably <100W). So, if a frame were to take four hours, you would need to increase total processing capability by 864,000 times for 60FPS... and I don't know of any finished product, completed in any time frame, that has managed to look truly realistic. Good, yes; realistic, no.
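The 864,000x figure is straightforward arithmetic (assuming a four-hour offline frame and a 60 FPS target, as above):

```python
# Speedup needed to turn a 4-hour offline render into a 60 FPS real-time frame.
offline_frame_seconds = 4 * 60 * 60  # 14,400 seconds per frame
target_fps = 60

speedup = offline_frame_seconds * target_fps
print(speedup)  # 864000
```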

Maybe we'll get there, but we won't get there without some serious paradigm changes.
 
Last edited:

FalseChristian

Diamond Member
Jan 7, 2002
3,322
0
71
When graphics can't get any better looking, what's gonna happen to GPU development? Will AMD and nVidia keep bringing out cards to the point where you're getting 5,000 fps in Crysis 5?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Are movie frames rendered in several milliseconds, creating an entire scene that could be mistaken for having been shot with a camera, with no CG involved? As long as you have enough total space to store the data, you can lower the memory use all you want and increase the time needed.

No, you can't. RAM is many orders of magnitude faster than an HDD at random access, and thousands of times faster sequentially.
Performing computation out of an HDD is completely impractical and simply isn't done. This is why movie studios must pay through the nose for machines with ridiculous amounts of RAM: 32GB and 64GB at the low end, 128GB+ at the high end.

What render farms have now can't do the job. You would need what render farms may have in 10 years, and then multiply that by the ratio between the render times for each use (hours or days vs. milliseconds).
Render farms must store the data in duplicate in each node's memory. That is, each node's RAM must be sufficient to hold the entire scene.

128GB of RAM is enough to fully store the data needed to create today's CG.

If less memory is used, the difference will need to be made up in processing power, so with less memory it may take tens or hundreds of thousands of times more FLOPS instead of GBs.
None of this is true. This is simply not how computers work.
 
Last edited:

JSt0rm

Lifer
Sep 5, 2000
27,399
3,947
126
Who's gonna make all this shit you want rendered in real time? Ask yourselves that.
 

Blurry

Senior member
Mar 19, 2002
932
0
0
I think what he's trying to say is: how much will costs ramp up in order to have games that truly replicate real life?

Considering the development costs of games nowadays, the amount of dollars and human capital needed to develop a game with a sprawling metropolis of 100,000+ people must be insane.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
No, you can't. RAM is many orders of magnitude faster than an HDD at random access, and thousands of times faster sequentially.
I didn't say it was practical. There's a dead guy who proved it possible before practical computers ever existed. Even so, if 128GB can do what we have now...
(snip) 128GB of RAM is enough to fully store the data needed to create today's CG.
And how much will be needed to store a scene that doesn't look like CG to a discerning viewer who is not attempting to suspend disbelief? Today's CG looks much better than robots, masks, and make-up, and I'm far from against it. But I have yet to see anything except high-atmosphere terrain renders that could be mistaken for having been shot with a camera.

And would it be possible to render it in real time using only the same amount of memory as when taking a great deal of time to render it? Every time we've had such promises in the past with respect to games, it has typically taken far more resources to actually do it.
None of this is true. This is simply not how computers work.
The existence and common use of prefab meshes and textures [and related maps] contradicts that statement. I have seen demos of dynamic creation of such assets, yet they necessarily add to the processing needed to create a scene, vs. using up memory and memory bandwidth to apply premade data.
 
Last edited: