Perfecting the AA/AF algorithms. Wouldn't it look good to run at 2x2 SS + 4x RGMS AA? No 'shimmering'? Fast angle-independent AA?
In broader terms these types of improvements won't net the kind of major leaps that I'm talking about. Moving from 4xS AA @ 2048x1536 to 16xS AA isn't going to make a major impact- it will look better, without a doubt- but would your mother notice? Highly unlikely (maybe she would- but most people's wouldn't
😛 ).
Environmental effects: Better rain and snow.
This is a platform issue- weather effects have been done extremely well, they just haven't shown up on the PC quite yet. They tend to chew up a monstrous amount of fillrate, although it is very 'easy' fill (it doesn't require even a TMU, let alone shaders).
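A rough sketch of why full-screen weather chews fillrate- all of the numbers here are made up for illustration (resolution, overdraw depth, and framerate target are assumptions, not figures from any actual game):

```python
# Back-of-envelope fillrate cost of a rain/snow particle overlay.
# Translucent particle quads stack up, so each screen pixel gets
# touched several times per frame ("overdraw").
width, height = 1600, 1200   # assumed display resolution
overdraw = 8                 # assumed average particle overdraw per pixel
fps = 60                     # assumed framerate target

pixels_per_frame = width * height * overdraw
pixels_per_sec = pixels_per_frame * fps
print(f"{pixels_per_sec / 1e9:.2f} Gpixels/s of raw fill")
```

Even with those modest assumptions the overlay alone wants nearly a gigapixel per second- but it's simple alpha-blended fill, no texture sampling or shading required per se.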
Atmospheric effects: Better skies.
This is actually fairly simple to do now, although if you wanted to get really great skies you could start moving into 3D textures for clouds. Possible now, but not terribly viable given the memory requirements.
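The memory argument against 3D cloud textures is easy to see with some quick arithmetic- the resolutions below are illustrative, not from any particular title:

```python
# Memory cost of an uncompressed RGBA8 volumetric (3D) cloud texture.
# A 3D texture stores n**3 texels instead of n**2.
bytes_per_texel = 4  # RGBA, 8 bits per channel

for n in (64, 128, 256):
    size_mb = n ** 3 * bytes_per_texel / 2 ** 20
    print(f"{n}^3 volume: {size_mb:.0f} MB")
```

A 256^3 volume eats 64 MB on its own- a huge slice of a current card's memory for one sky asset, which is why this stays on the shelf for now.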
More realistic modeling: We need the environments to look more damaged and used/abused.
This is a physics limitation.
Projection: stained glass effects (like in Lost Coast)? Shadows on water? lol, they're just a blob now in BF2. I bet they can make water a LOT better.
To do those properly we need radiosity. We can see some decent approximations with significantly more shader power, but they won't be right until we are simulating light (instead of simply reproducing it- a huge difference).
New techniques of rendering and especially shaders have made it a new ballgame - games can be made to look better and better with more shaders, but there is a threshold where it's more than modern GPU's can handle, and performance turns to crap.
Absolutely- and when looking at titles built on UE3, which can run on a system comparable to what it takes to run FEAR, you start to realize that FEAR's resources likely could have been spent much better than they were. This isn't to say it is poorly coded at all- it may be absolutely brilliant in terms of implementation- but what they chose to implement was poor in terms of end visual impact.
Another possibility for handling physics, lighting and perhaps even some of the video load may actually be the CPU in the future. With the talk of 128-core CPUs by 2015 (which personally I can't see), CPUs will nonetheless be much more capable of parallel tasks in the future.
Dedicated hardware will still scale better. I'm not disagreeing with what you are saying- obviously once everyone has moved over to Cell's approach, CPUs will be much better suited for handling physics loads- but general purpose always loses to dedicated hardware.
I personally would like to see detail texturing used once again.
Detail textures are quite inferior to shaders, and right now developers are focusing on shaders instead. Now you can easily make the argument that with today's level of shader hardware there isn't enough power to make the effort they are pouring into them worthwhile- but they are trying to work towards the next big thing, and shaders are it. When we do start to see some parts with real shader power, they will easily prove vastly superior to 'detail textures' (an extra layer of noise applied on top of textures- very simple to do).
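To make "an extra layer of noise on top of textures" concrete, here's the whole technique in a few lines- NumPy arrays standing in for textures, with arbitrary sizes and noise range, just one multiply per texel:

```python
import numpy as np

# Detail texturing in a nutshell: modulate a base texture with a small,
# tiled high-frequency noise layer so surfaces don't go blurry up close.
rng = np.random.default_rng(0)
base   = np.full((256, 256), 0.5)               # flat mid-grey "base texture"
detail = rng.uniform(0.8, 1.2, size=(32, 32))   # small noise tile

tiled  = np.tile(detail, (8, 8))                # repeat tile across surface
result = np.clip(base * tiled, 0.0, 1.0)        # modulate: one mul per texel
```

That really is all there is to it- which is both why it's cheap and why it can't compete with shaders, which can compute materials that actually respond to light and viewpoint.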
What I wonder is will the basic fundamentals of graphics change? Will there be no more textures, only 'surfaces' comprised of shaders automatically?
This is certainly where we are headed right now. It has huge advantages, as you can have a lot of interesting interactions in the physical environment that aren't viable with texture maps. Of course, we need a lot more shader power to start to see exactly what this will bring, and the transition is going to be quite slow.
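A miniature of what "surfaces comprised of shaders" means: color computed from position alone, with no stored texture anywhere. The marble-ish function below is a toy of my own (real shaders would layer noise octaves on top):

```python
import math

# Procedural surface sketch: a sine-perturbed stripe pattern, computed
# per point from (x, y) position -- no texture fetch at all.
def marble(x, y):
    v = math.sin(8.0 * x + 4.0 * math.sin(6.0 * y))
    return 0.5 + 0.5 * v   # remap [-1, 1] -> [0, 1] intensity

# Evaluate one scanline of the "surface":
row = [marble(i / 64.0, 0.25) for i in range(64)]
```

Because the pattern is a function rather than stored data, it never pixelates up close and can be warped, scratched, or animated by the same math- the kind of interaction texture maps can't give you.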
After we're done perfecting the visuals, we need vendors to spend their R&D on making good GAMES.
This one is worthy of its own thread- so much to cover there
🙂
I disagree that lighting will play a major role in the future.
Radiosity is the holy grail of real-time 3D- it is hard to explain, but I used to work with 3D viz for several years, and the difference between radiosity and no radiosity is staggering no matter what else you are doing.
Just look at what happens with current cards when you have a large number of polygons on screen- framerates start dropping really quickly.
This is partly a problem with DirectX, partly a problem with the PC's architecture.
I can go from 16-bit color to 32-bit color and not notice a huge IQ differential.
Do you recall the flak nVidia was rightly taking over the shader substitution that was causing visual problems in games like FarCry? They were running FP16 (64-bit color) instead of FP24 (96-bit color). I think you would be shocked at how bad 16-bit color would look in today's games.
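The banding from 16-bit color is easy to quantify: a 16-bit R5G6B5 framebuffer gives the red and blue channels only 32 levels each, versus 256 per channel at 32-bit. A quick sketch of the worst-case quantization step on a smooth gradient (the ramp resolution is arbitrary):

```python
# Worst-case error when snapping a smooth 0..1 ramp onto an n-bit channel.
def quantize(v, bits):
    levels = (1 << bits) - 1       # e.g. 31 levels for a 5-bit channel
    return round(v * levels) / levels

ramp = [i / 1000 for i in range(1001)]   # a smooth gradient to quantize
err5 = max(abs(v - quantize(v, 5)) for v in ramp)  # 5-bit channel (R5G6B5)
err8 = max(abs(v - quantize(v, 8)) for v in ramp)  # 8-bit channel (32-bit color)
print(err5, err8)
```

The 5-bit channel's steps are roughly eight times coarser- visible as banding on any smooth gradient, which is exactly what today's shader-heavy lighting is full of. (And note FP16/FP24 in the quote are *floating-point* formats per channel- already well beyond integer 16-bit color, and artifacts still showed.)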