Diminishing Returns of real time 3D

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
This thread doesn't deal with issues pertaining to one company over another, so those only interested in such please look elsewhere.

A trend that I have noticed building over the last several years is people's growing lack of enthusiasm for the progress being made in game engines in terms of the evolution of real time 3D. The recent release of FEAR has really driven this home, although it has been evident for quite a few years now. People are very displeased with the small gains we are seeing for the enormous levels of performance it is taking to drive them- this is only going to get more and more drastic.

Starting back at the beginnings of real time 3D, we had software-rasterized, super low resolution titles with point filtering and 8-bit color. Hardware acceleration came along and we were able to quadruple the resolutions we were using, move to 16-bit color and add the wonders of bilinear filtering. Going from 320x240 w/point filtering using 256 colors to 640x480 w/bilinear filtering and 65,536 colors was a quantum leap type improvement in visual quality. We could actually make out what those texture maps were supposed to look like- and we were all thrilled. We went from a fairly nondescript collection of blocks to smoothly blended images.

No shift from that point forward is going to be as large- there is nothing that will arrive anytime in the future that will have that kind of impact on real time 3D.

Moving forward we saw continued evolution of the basic principles established with the first 3D hardware until we hit a shifting point and moved from rasterizers to GPUs. The move to GPUs allowed graphics functions outside of basic rasterization to benefit from the same greater-than-linear performance increases that come from adding dedicated hardware for features previously approximated in software. This move started us on our current evolutionary path.

For early 3D titles we were dealing with low bit depth, low resolution, low quality textures placed on top of low poly, low animation, low interactive objects. The evolution of rasterizers has taken care of the first set of functions while the overlapping evolution of GPUs is taking care of many elements of the second set of issues. We never saw the slap in the face transition to GPUs as we did with hardware rasterizers, as the market was still split, but its impact is easily seen today with the enormously increased model complexity, animations and shader effects being applied to games.

For model complexity we started off seeing models in the dozens or hundreds of polys at the maximum. Moving from those levels to tens of thousands of polys is a staggering difference; going from that level to millions of polys is a much smaller step in terms of end visuals, while moving to hundreds of millions is a much smaller step still. Each of these steps demands exponential growth in computational power- but each one yields increasingly small returns.
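The diminishing returns on geometry can be sketched with back-of-the-envelope arithmetic: at a fixed resolution, the polygon budget divides a fixed pool of pixels, so each extra order of magnitude of geometry has less and less screen area in which to show itself. The resolution and screen-coverage figures below are illustrative assumptions, not measurements from any game.

```python
# Back-of-the-envelope sketch of diminishing returns from polygon counts:
# at a fixed resolution, the pixels available per triangle shrink as the
# polygon budget grows, so past a point extra geometry can't be seen.
def pixels_per_triangle(poly_count, width=1024, height=768, coverage=0.5):
    """Average screen pixels per on-screen triangle, assuming the
    geometry covers `coverage` of the frame (illustrative numbers)."""
    return (width * height * coverage) / poly_count

for polys in [100, 10_000, 1_000_000, 100_000_000]:
    print(f"{polys:>11,} polys -> {pixels_per_triangle(polys):12.6f} px/triangle")
```

Past roughly a million on-screen polygons, triangles average well under a pixel each on these numbers, which is one way to see why the step to hundreds of millions reads as a much smaller visual change than the step to tens of thousands did.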

Animations are the same. When you move from a crude bounding-box model bouncing off air and clipping through walls to some of the slick movements we are seeing out of characters today, the transition seems huge. But adding in the finer nuances such as muscular deformation and cloth simulation, while very cool to the geek set when looking for it, is a much smaller transition. Going from that level up to per-vertex accurate animations on a multi-million-vertex model will yield a significantly smaller improvement in terms of end visual impact also.

Shaders are a major function of GPUs as of now, and they have currently taken center stage as the focal point of performance in real time 3D. This is a very good thing as on a realistic basis shader hardware is still very weak compared to what we need to see that second level progress that the other elements have already reached. That said, the move from no shaders at all to what we are currently seeing in titles like DooM3, Quake4 and FEAR has certainly been an enormous one. Objects have been given a depth and luster that were never there before- we have some primitive lighting working its way into titles and are starting to see one of the last major hurdles cleared.

So what do you do when you have 10K-100K poly models with high res textures, running at high resolutions with great animation and pixels shaded all over the screen? Where do you go from there?

For anything in the near future the answer is more of the same. The problem is that the power requirements to offer even a modest bump under the current circumstances are exponential in nature in terms of end visual impact. That isn't to say that certain engines aren't going to prove they have the ability to stand out running the same hardware- Carmack, Sweeney and the other top tier developers have always excelled at figuring out which aspect they need to push the hardest to give them the best return per cycle.

This is not to say that other developers are creating sloppy code by any means; in fact it is possible that a lackluster-appearing engine may have code worthy of worship written for it, but the proper choices were not made in where to use computational resources, and so the end product doesn't appear to look all that good compared to how it performs. This particular aspect will likely require a very close working relationship between the artists and the coders from very early on in engine development (more so than it is currently).

There are two fairly large areas I see where a relatively considerable leap can be made that will really stand out- physics and lighting. With physics we know that the answer is on the horizon- PPUs or perhaps GPUs will end up providing us with the power we need to handle significantly higher levels of interactivity than what we are used to. This development should not be underestimated; it is the last major shift we are going to see in real time interactive 3D for some time. Likely the transition will be much as it was with GPUs, the full impact not readily apparent for a while as the haves and have-nots split developer attention until sizeable market penetration is a reality.

The final issue, lighting, is already being approximated by shaders to some extent, but the real holy grail- and what will almost certainly be the final major hurdle real time 3D will face- is radiosity. Radiosity, for those that don't know, is in simplified terms accurately modeled light. Not just "I turn on a flashlight and the circle on the wall lights up", but the actual calculations are done out for the interaction of light as it would happen in the real world. This is an enormously complex issue, and one that is still a long way off on the horizon (perhaps decades, hopefully less), but outside of physics it is really all we have left in terms of major pushes in real time 3D.
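For the curious, classic radiosity boils down to solving a linear system: each surface patch's brightness B equals its own emission E plus its reflectivity times the light gathered from every other patch, weighted by form factors F. A toy sketch of that iteration follows; the three patches, reflectivities and form factors are made-up illustrative numbers, not a real scene.

```python
import numpy as np

# Toy sketch of the classic radiosity equation:
#     B = E + rho * (F @ B)
# where B is patch radiosity, E is emission, rho is diffuse reflectivity,
# and F[i][j] is the fraction of light leaving patch i that reaches patch j.
# All values below are made up for illustration.
E = np.array([10.0, 0.0, 0.0])      # patch 0 is the only light emitter
rho = np.array([0.5, 0.7, 0.7])     # diffuse reflectivity of each patch
F = np.array([[0.0, 0.4, 0.4],
              [0.4, 0.0, 0.4],
              [0.4, 0.4, 0.0]])

B = E.copy()
for _ in range(50):                 # Jacobi-style "gathering" iterations
    B = E + rho * (F @ B)           # keep bouncing light until it settles

print(B)                            # patches 1 and 2 end up lit indirectly
```

The point is that every patch's brightness depends on every other patch's, which is why solving this accurately, per frame, for a full scene is so far beyond the flashlight-circle approximations shaders give us today.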

I don't want any of this to be misleading- a slow continued evolution of exactly what we have now is not a bad thing by any means, but it could very well end up over the next few years that you will need to wait for multiple generations of video cards to pass before you see a major improvement in end results. Your high end Q1 '08 board may look nigh identical to a Q4 '10 board playing a particular game- even though the '10 part allows you to set the detail levels higher- it may not be worth all that much or even noticeable to the majority of people.

This is not some trend I am trying to claim I am looking into the future and seeing- it is already happening and the situation has been escalating for some time now.
 

Beiruty

Senior member
Jun 14, 2005
282
0
0
Very long reading... thanks for taking the time to author it.

In short, companies invest billions of dollars each year in GPUs and real time 3D graphics. The market is so huge, and demand is the true driver of innovation.

Just look back some 15 years ago: we were at the age of Win 3.0/3.1 and the birth of the 2D GUI. In 15 years, we do not know where we will be. For a 4-year time frame, just look at the design specs of the Xbox (a base PC of its time) and the Xbox 360 (surpassing the PC by a few generations). The Xenos GPU (R500) is as powerful or even more powerful than the R520, plus a 6-thread 3.2GHz 3-core PPC CPU that should be as fast as the fastest CPU available today, plus 512MB of memory, plus all the broadband connectivity and true HDTV support, for just $300! That is a killer for the PC 3D gaming scene.

I am worried more about Moore's law. We are now at 65nm and soon the 45nm level. We may scale to a certain limit, then we will hit the true wall where we cannot expect better performance at the same price! Any future, higher-spec product will cost more, and the demand for such a product will wane. Now what to do?
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
There are little quirks we need to work out too which could drastically affect performance but deliver better visuals:
- Perfecting the AA/AF algorithms. Wouldn't it look good to run at 2x2 SS + 4x RGMS AA? No 'shimmering'? Fast angle-independent AA?
- Kind of unrelated, but: what about real-time in-game level editors? Maps are still a pain to make. There is a LOT that would make this easier.
- Converting a real-life environment to a 3D game: the environment in general isn't that real. I'm talking Lost Coast-like realism. Never seen that before anywhere else.
- Environmental effects: better rain and snow.
- Atmospheric effects: better skies.
- More realistic modeling: we need the environments to look more damaged and used/abused.
- Terrain
- Projection: stained glass effects (like in Lost Coast)? Shadows on water? lol, they're just a blob now in BF2. I bet they can make water a LOT better.

After we're done perfecting the visuals, we need vendors to spend their R&D on making good GAMES.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: BenSkywalker

This is not to say that other developers are creating sloppy code by any means; in fact it is possible that a lackluster-appearing engine may have code worthy of worship written for it, but the proper choices were not made in where to use computational resources, and so the end product doesn't appear to look all that good compared to how it performs. This particular aspect will likely require a very close working relationship between the artists and the coders from very early on in engine development (more so than it is currently).

I think this is one of the most important points in your post. In the past, developers seemed to design games to be more and more complex, meaning more polygons per model, etc. New techniques of rendering, and especially shaders, have made it a new ballgame - games can be made to look better and better with more shaders, but there is a threshold where it's more than modern GPUs can handle, and performance turns to crap. We're already seeing this in games like FEAR - a higher shader load is causing performance to drag even on otherwise extremely potent video hardware (e.g. 7800 series hardware and 7800 SLI). People are often forced to run it at a lower resolution and consequently think that the graphics "aren't that good". Meanwhile, if they were able to run it at 16x12 or 19x12, they'd see just how impressive the engine looks, but since they can't they conclude it doesn't look any better than what preceded it.


There are two fairly large areas I see where a relatively considerable leap can be made that will really stand out- physics and lighting. With physics we know that the answer is on the horizon- PPUs or perhaps GPUs will end up providing us with the power we need to handle significantly higher levels of interactivity than what we are used to. This development should not be underestimated; it is the last major shift we are going to see in real time interactive 3D for some time. Likely the transition will be much as it was with GPUs, the full impact not readily apparent for a while as the haves and have-nots split developer attention until sizeable market penetration is a reality.

Another possibility for handling physics, lighting and even perhaps some of the video load may actually be the CPU in the future. With the talk of 128-core CPUs by 2015 (which personally I can't see), CPUs will nonetheless be much more capable of parallel tasks in the future. Already we are entering a transition to dual core, with quad core a possibility in a few years. New CPU designs are already coming up - the Playstation 3's "Cell" CPU, with essentially 9 wide and shallow pipelines, is one such new design with a heavy emphasis on parallel processing.


The future only holds more realistic graphics. Just a quick search shows some interesting new rendering techniques on the horizon (check out the depth that 'steep parallax mapping' is able to provide to textures!). Growing pains are no doubt inevitable as we shift away from conventional rendering and towards a pixel-shader dominated landscape, but it's giving us better, more realistic looking graphics than thought possible back in the Voodoo 1 days.

The old target for graphics used to be "as real as possible," while the new target seems to be "real life graphics." I've already seen pictures done with hundreds of pixel shaders that look like real life; it's only a matter of time before they become feasible to render on video hardware.

It is tough staying on the bleeding edge though, I'll give you that...
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
IMO, improved AI would do much much more for video games than just eye candy.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
I personally would like to see detail texturing used once again. The old Unreal-based games all had it and it made a huge difference in image quality, making even low resolution textures look quite good, but it's absent from pretty much all modern games/engines. The textures in modern games haven't gotten any larger for a while now, so this would make a big difference.

we need vendors to spend their R&D on making good GAMES.

:thumbsup:
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Updated my first response. Where are the good skies? Weather? Is there any weather in games at all nowadays? Not the static bit you set when you want your map to rain - I'm talking real weather management here. Tornadoes? Not to mention three-dimensional stuff? Grass? There is still a LOT of work to do on shadows. Water effects? The list is endless! That should keep Carmack, Gabe, and Sweeney busy for a while.

We need better environments. Things that now take months just to make a small map, like Lost Coast, should be creatable by a 3-year-old in a couple of days.

What I wonder is: will the basic fundamentals of graphics change? Will there be no more textures, only 'surfaces' comprised of shaders automatically? Forget geometric faces? Also, how you can use the GPU to get the exact same output in a much faster way - that really interests me (I'll call it GPUology).

Personally I don't want graphics to be given to the CPU, even if I had the power. I'd rather that CPU be used for a process to which the CPU is better tailored than graphics, like physics processing. The architecture of a GPU is worlds different from a CPU's. It's a specialized processor that has many graphics-specific stages. Physics is just a bunch of calculation. Even with 128-core CPUs, there is no way you are going to get a decent frame rate. Quake 3 runs at 1 FPS when it's rendered on my 2.2GHz A64. On 128 of them, maybe 128 FPS. But that's Quake 3. What happens with Quake 4? You'll still get probably 20 FPS. I do not want graphics handling to be the CPU's duty. It's not tailored for that purpose, and 128 of them aren't even fast enough anyway. Imagine the difficulty of having 128 threads. Man, wouldn't that be a pain in the ass? They can barely handle 2 threads now; imagine 64 times that. I don't think there is any remote possibility that will ever, ever happen. Not to mention, how would you upgrade your graphics? I like simply sliding in a new graphics card and enjoying faster graphics. And 128 cores in general? Yeah right. Intel can go make a nuclear reactor with 128 of their Prescotts, but that thing's not going in my PC. Anyway, not to veer off course...
/rant
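The intuition in the rant above is essentially Amdahl's law: whatever fraction of a frame's work is serial caps the speedup, no matter how many cores you add. A quick sketch - the 75% parallel fraction below is an assumption for illustration, not a measured figure for any game engine:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the
# parallelizable fraction of the work and n is the core count.
# p = 0.75 here is a made-up illustrative value.
def amdahl_speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for cores in [2, 8, 32, 128]:
    print(f"{cores:3d} cores -> {amdahl_speedup(0.75, cores):.2f}x")
# With a 25% serial portion, even infinite cores cap out at 4x -
# nowhere near the naive "128 cores = 128x the frame rate".
```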

We need more threads like this.

:) :beer:
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
A jump in physics would be neat, but the main problem I'm having with modern games is not diminishing returns in graphics, but a lack of good gameplay. I'm finding that a lot of games in the past couple of years try to push themselves along on their eye candy alone, and wind up being boring to play. Very rarely do I see an original title anymore, as opposed to a sequel of a sequel, or a has-been.

Don't get me wrong, I do enjoy some sequels thoroughly (HL2, DoD Source, Civ 2/3, Heroes series), but I like it when something like Savage comes out and offers relatively original gameplay and is fun as hell to boot. I'm hoping the sequel will be a blast. The other game I'm waiting on to stop being delayed time and time again is STALKER. That's a game to look out for.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
The other game I'm waiting on to stop being delayed time and time again is STALKER. That's a game to look out for.

When is that supposed to come out, anyway? I have been hearing about it for quite some time now.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: CP5670
The other game I'm waiting on to stop being delayed time and time again is STALKER. That's a game to look out for.

When is that supposed to come out, anyway? I have been hearing about it for quite some time now.

Not sure, I'm REALLY looking forward to that too.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: Beiruty
Very long reading... thanks for taking the time to author it.

In short, companies invest billions of dollars each year in GPUs and real time 3D graphics. The market is so huge and the demand is the true driver to innovate.

Just look back some 15 years ago: we were at the age of Win 3.0/3.1 and the birth of the 2D GUI. In 15 years, we do not know where we will be. For a 4-year time frame, just look at the design specs of the Xbox (a base PC of its time) and the Xbox 360 (surpassing the PC by a few generations). The Xenos GPU (R500) is as powerful or even more powerful than the R520, plus a 6-thread 3.2GHz 3-core PPC CPU that should be as fast as the fastest CPU available today, plus 512MB of memory, plus all the broadband connectivity and true HDTV support, for just $300! That is a killer for the PC 3D gaming scene.

I am worried more about Moore's law. We are now at 65nm and soon the 45nm level. We may scale to a certain limit, then we will hit the true wall where we cannot expect better performance at the same price! Any future, higher-spec product will cost more, and the demand for such a product will wane. Now what to do?

The Xbox was a low-end CPU with a high-end GPU at the time (and a low-end hard drive and a low amount of memory, but high memory speed and high-end audio).
The Xbox 360 is just different, not better. The 3.2GHz 3-core PPC is slower in many things than current single-core CPUs, but should be much faster in others; the thing is, the areas it's faster in aren't currently heavily utilized in games. (98% of the computational time in games is still spent on branching - is it worth pumping up floating point ability for the 2% of the time it is the limit?)
And the Xenos GPU is faster than the R520 in shader ability only (which is typically the current limit in games); it has much lower fillrate. Oh, it does have a cache, so it's a beast in memory bandwidth, but it has less available video memory. Not sure how vertex transformations compare, but that's not a limit in current games either.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
He's talking about me in this post with my Lost Coast bit in the Software Forums. I've also noticed the trend. And it sucks.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
I disagree that lighting will play a major role in the future. While it will be there and it will advance with every other feature, I believe physics, parallax mapping, advanced shader functions and high polygon counts are the key to making games more realistic and life-like in the future. PCs have been behind consoles for the longest time in poly pushing power, and that has resulted in less detail being possible in PC games. Just look at what happens with current cards when you have a large number of polygons on screen: they start dropping frames really quickly. Of course developers like Carmack have used workarounds to give the impression of high-poly characters, but it's still not good enough. Furthermore, we need better multiplayer code that can accommodate larger numbers of people in persistent FPS gaming and is capable of precise collision detection and projectile calculation. However, I agree that PC graphics have been evolving quite slowly the past few years and we are seeing diminishing returns on graphics rendered vs. video card price. In part I'd attribute this to sloppy coding, and this is quite evident in games like Battlefield 2, where the programmers still cannot fix simple browser problems and they seem to be having trouble with collision detection in their netcode as well.
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
Well, Lost Coast does show some good improvements... with smaller performance drops than other games. Valve seems to be on the ball here, but companies like Ubi with Splinter Cell and Far Cry are suffering larger performance drops to do full HDR. It does suck. BF2 seems to be a VERY inefficient game... I've got a 7800GT running at my LCD's max res, 1280x1024, and I can't hit 4xAA.

There will be a point when we finally hit the theoretical limit, and there will be a gap between the new technology and the end of the old. This is when companies will learn to utilize hardware better.

What companies need to do is get rid of this damned performance difference between nVidia and ATI... I'm sick of this card doing well at this but sucking at that. Make it work just as well on both cards, dammit.

It's sad; even with a 7800GT, I have to worry about getting good settings in new games. And the card cost me $350 USD... to get anywhere I'd have to spend an extra $100, or even worse, an extra $1000 for SLI.

SLI/Crossfire may be nice and dandy, but the cost is absurd. What about us gamers who aren't "serious" and just want fun with very good graphics? I like HDR in Lost Coast; I want to see more. But I don't want to have to buy a new card every bloody year just to get it. There are those out there, in the majority, who don't upgrade hardware every year or less. We just buy a computer and update occasionally, and when we do update, it's major, since technology changes so often.
 

Rage187

Lifer
Dec 30, 2000
14,276
4
81
Originally posted by: CP5670
The other game I'm waiting on to stop being delayed time and time again is STALKER. That's a game to look out for.

When is that supposed to come out, anyway? I have been hearing about it for quite some time now.



My THQ crystal ball said 2007, and I heard that in May.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
I have to echo xtknight's sentiments: while graphics may be reaching a point of diminishing returns, a lot of less graphics-intensive things have fallen by the wayside along the way in game productions and need vast improvement. Realistic-looking environments only go so far, whereas realistic layouts and effects like real-time weather are needed. For instance, as much fun as Morrowind is, the ground at times feels like a random generation of hills (and it's not), and the same goes for some levels of Doom3/HL2-style games that just don't feel immersive because the map making is just not intuitive. We've come a long way towards graphics that feel quite immersive (for the displays they run on), but we still haven't done much to make the game environment immersive, which I think is because you can't really screenshot these kinds of design implementations, so they don't get the attention they need. Frankly, I think some map makers need to just go out into the real world every now and then and emulate that terrain.

Score one for HDR though, that's a really big step towards truly light/dark engines, not either/or.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Valve's HDR is good for the time being, but I don't think it's good enough until you don't have to simulate iris sizing within game code. Make the user's iris do it without any code (i.e. better displays). Not only do graphics need to evolve, but monitors do too. I think we'll see some damn good stuff in SED displays. Companies don't bother advancing CRTs anymore. I'm sure they could make already-excellent CRTs better; they just don't bother. SEDs (closely related to CRTs) give them a very good excuse to advance (in order to sell more). That can only mean good things for us. The Canon SED display shown off the other day had a range from 0.004 cd/m² to 400 cd/m² (a 100,000:1 contrast ratio). Not sure if that's better or worse than CRTs, to be honest, but it sounds damn good. Wait, that's the ratio of the BrightSide LCD, isn't it (hope the numbers are actual)? Anyone know how many candelas per square meter the sun emits in direct view on a sunny day?
 
Mar 19, 2003
18,289
2
71
Originally posted by: CP5670
I personally would like to see detail texturing used once again. The old Unreal-based games all had it and it made a huge difference in image quality, making even low resolution textures look quite good, but it's absent from pretty much all modern games/engines. The textures in modern games haven't gotten any larger for a while now, so this would make a big difference.

I agree 100% - along with just higher resolution textures to begin with. I have yet to see a recent game with textures that don't look blurry when you get close to them. I think just increasing the texture resolution would help a lot - and I don't see why this isn't done already. I honestly think that many of the high-resolution S3TC textures in Unreal/Unreal Tournament look a lot better than what we're seeing today, just because they're high enough resolution not to really get blurry when you approach them. The detail texturing helps too at close distances. Sometimes it's glaringly obvious that even in newer games, the texture detail is lacking compared to these 6-8 year old games (which is not to say that they're really good everywhere in those two games since not all of the textures were replaced, but in general the difference is dramatic). Why is that? Are we trying to use too many textures for the amount of video RAM we have these days? Hopefully as 256MB and even 512MB cards become more mainstream, we can see a return to really amazing texture quality.
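A rough sketch of the VRAM math behind this question, assuming DXT1/S3TC compression at 0.5 bytes per texel versus 4 bytes for uncompressed RGBA8 (real engines vary in format and mipmap policy, so these are ballpark figures only):

```python
# Ballpark VRAM cost of a single texture, compressed vs uncompressed.
# DXT1 stores 4x4 texel blocks in 8 bytes (0.5 bytes/texel); a full
# mipmap chain adds roughly a third on top of the base level.
def texture_bytes(size, compressed=True, mipmaps=True):
    bytes_per_texel = 0.5 if compressed else 4.0
    total = size * size * bytes_per_texel
    return total * 4 / 3 if mipmaps else total   # mip chain ~= +1/3

for size in [256, 512, 1024, 2048]:
    mb = texture_bytes(size) / (1024 * 1024)
    print(f"{size}x{size} DXT1 + mips: {mb:5.2f} MB")
```

On those numbers, a hundred unique 1024x1024 DXT1 textures comes to roughly 67MB, which fits comfortably on a 128-256MB card - which suggests texture resolution today is as much a content-pipeline and cross-platform choice as a hard memory wall.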

A few examples (from Unreal):

http://pics.bbzzdd.com/users/SynthDude2001/Shot0063.jpg
http://pics.bbzzdd.com/users/SynthDude2001/Shot0064.jpg
http://pics.bbzzdd.com/users/SynthDude2001/Shot0081.jpg

Not all of these textures are high-resolution (most noticeably the weapons aren't), but some of the ground/wall textures are really impressive... I almost never see this level of texture detail today. Sure, it's good to have realistic models and shaders and such - but I don't think they should totally forget about using high-resolution textures.

Other than that, I really agree with what others have said - mainly that there are still huge advances to be made in environment realism and physics. Some of the stuff shown in that CryEngine2 video looked really impressive! (If you haven't seen it, the high points were: volumetric clouds, realistic interaction with foliage, destructible environments [to some extent], and dynamic day/night cycles.) I'm also looking forward very much to the arrival of the PPU (or physics processing on the GPU, if that can actually be done without a major performance hit...). I won't necessarily buy immediately, but if it can do the kind of things for games that they're claiming it can do - and more importantly, game developers actually take advantage of those capabilities - then I'd gladly buy one. Physics is still very much lacking in games overall today. Half-Life 2 was pretty good, but everything else is very unrealistic - like running into most objects in FarCry and them not even moving. Still a lot to be done in this area.

That's my two cents anyway - great thread, Ben!
 

PingSpike

Lifer
Feb 25, 2004
21,756
600
126
I just want better AI and more interactive envirnoments. I've found graphics acceptable in pretty much every game out there for a few years now, I could care less if they make any drastic improvements in that area.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
A few examples (from Unreal):

http://pics.bbzzdd.com/users/SynthDude2001/Shot0063.jpg
http://pics.bbzzdd.com/users/SynthDude2001/Shot0064.jpg
http://pics.bbzzdd.com/users/SynthDude2001/Shot0081.jpg

Not all of these textures are high-resolution (most noticeably the weapons aren't), but some of the ground/wall textures are really impressive... I almost never see this level of texture detail today. Sure, it's good to have realistic models and shaders and such - but I don't think they should totally forget about using high-resolution textures.

Ah, the memories. :) A few more modern games like UT2004 and Far Cry supposedly use the detail texturing effect (it's one of the graphical options), but it doesn't look anything like it did in the Unreal engine games. I don't think the texture resolution in games has really changed much at all in the last three or four years. The model/level polygon counts and the lighting quality have been steadily increasing, but texture size seems to have taken a backseat somewhere along the way. Even the games with so-called high resolution textures these days are a far cry (no pun intended) from what UT with the S3TC packages and detail texturing offered. I'm wondering if it has something to do with cross-platform development with consoles, which generally have less memory available for this sort of stuff. I know that at least one fairly recent game (Deus Ex Invisible War) had crap texture quality on the PC version because of this.
 

biostud

Lifer
Feb 27, 2003
19,766
6,850
136
Originally posted by: xtknight

- Environmental effects: better rain and snow.
- Atmospheric effects: better skies.
- More realistic modeling: we need the environments to look more damaged and used/abused.
- Terrain
- Projection: stained glass effects (like in Lost Coast)? Shadows on water? lol, they're just a blob now in BF2. I bet they can make water a LOT better.

AFAIK lots of these will be physics-dependent too, if we want them not only to look like the real stuff but also to act like the real stuff.
 

BespinReactorShaft

Diamond Member
Jun 9, 2004
3,190
0
0
For me at least, the biggest "quantum leap" was when Unreal first came along. From then on there was hardly anything with as much "wow factor" until perhaps FarCry, which, though magnificent-looking, wasn't that great a leap compared to Unreal vs. its predecessors. Although AI, physics and lighting are the new frontiers of realism, there's an inescapable feeling that they're little more than add-ons to the heaps of whatever's already been accomplished visually. The more we push to perfect a technology that's already as "perfect" as most people care about, the shorter the time cutting-edge hardware remains cutting-edge, and the less attractive the prospect of upgrading at all. Assuming the 3D gaming "boom period" is (soon to be) over and the "bubble" has shrunk/burst, where do we go from here?
 

videopho

Diamond Member
Apr 8, 2005
4,185
29
91
Where are the good skies? Weather? Is there any weather in games at all nowadays? Not the static bit you set when you want your map to rain - I'm talking real weather management here. Tornadoes? Not to mention three-dimensional stuff?
______________________________________________________________________

Not in FPS games, but you will if you play FS2004... I get all that either in real time or through customization. Yes, I could include tornadoes in my game as well, and all in 3D.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
A lot of good points brought up. In particular, weather. Imagine an MMORPG with dynamic weather... and seasons.

By the way videopho... FS2004's weather sucks. I mean... it's one of the best weather systems out there... but it still sucks.
 

videopho

Diamond Member
Apr 8, 2005
4,185
29
91
By the way videopho... FS2004's weather sucks. I mean... it's one of the best weather systems out there... but it still sucks.
______________________________________________________________________

Can you elaborate on your disgust? I mean, what did you dislike about the game's weather?