I think he means image quality, not Intelligence Quotient.
Ah, it all makes sense now!
I was just playing the new Killzone 3 demo (which is a snow/ice level) and it looks vastly better than Crysis' equivalent levels.
I guess you didn't see the *hidden* levels in Crysis hehe :awe:
http://img114.imageshack.us/f/crysis2ja7.jpg/
http://www.incrysis.com/index.php?option=com_content&task=view&id=935&Itemid=1
You can decide on gameplay yourselves.
I must admit, even though that was an Xbox 360 video, it made me want to play it.
I get what you're saying, but it was really more of a side note. That's not even the point. We aren't arguing whether or not Crysis is the best-looking game (vs. Killzone 3, God of War 3, Uncharted 2, etc.).
If I sat you down right now (and let's say you had never heard of the Crysis franchise before) and in a blind test asked you to tell a 2007 game from a 2011 game, would you be able to do so convincingly?
What's with all the new posters showing up in this thread to bash Crysis? It's like we rattled a console fanboi nest somewhere on the interwebs.
I still feel that it has more to do with diminishing returns than anything else. What made Crysis so hard on hardware wasn't the (then) advanced rendering techniques but rather the sheer amount of content. By far the biggest framerate killers were shadow resolution (which needs tons of fillrate and video RAM) and 'object detail' (which increased the number of light sources, among other things); just setting those two down to medium increased performance by a full 50% and allowed you to run the game otherwise maxed out on then-current hardware. The simple act of taking Crysis out of the jungle and into the city was probably enough to improve performance twofold.
Not at first glance but, as above, I think it's a matter of recognizing the effects. The wall that developers hit wasn't so much the limited console hardware as R&D. Recent hardware has immense horsepower, but the problem is that nobody knows how to use it. It took over three years, for instance, for developers to realize that they could use the SPUs in the PS3 for DirectCompute-type effects like MLAA, DoF and motion blur. And that's not to mention new stuff like global illumination, which is a subtle effect but a significant leap in graphics technology.
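For anyone curious what a post-process effect like MLAA actually does: its first stage is just scanning the finished frame for colour discontinuities (edges) between neighbouring pixels, before later stages estimate edge shapes and blend across them. A minimal, purely illustrative sketch of that edge-detection idea in Python (threshold value and function names are my own, not the PS3 SPU implementation):

```python
# Illustrative sketch of the edge-detection pass that MLAA-style
# post-process AA starts with. Pixels are luminance values in [0, 1];
# an "edge" is flagged wherever adjacent pixels differ by more than a
# threshold. THRESHOLD is a made-up tuning value for this example.

THRESHOLD = 0.1

def detect_edges(image):
    """Return (horiz, vert) boolean edge masks for a 2D luminance grid.

    horiz[y][x] marks an edge between pixel (y, x) and its right neighbour;
    vert[y][x] marks an edge between pixel (y, x) and the pixel below it.
    """
    h, w = len(image), len(image[0])
    horiz = [[False] * w for _ in range(h)]
    vert = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if x + 1 < w and abs(image[y][x] - image[y][x + 1]) > THRESHOLD:
                horiz[y][x] = True
            if y + 1 < h and abs(image[y][x] - image[y + 1][x]) > THRESHOLD:
                vert[y][x] = True
    return horiz, vert

# A hard diagonal "staircase": bright pixels on the left, dark on the right.
frame = [
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0, 0.0],
]
horiz, vert = detect_edges(frame)
```

The real thing runs on the GPU (or, on the PS3, spread across SPUs) and follows this with pattern classification and blending, but the staircase edges flagged here are exactly the jaggies it smooths.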
I don't see why PC developers shouldn't be faced with the same dilemma. Just because shader model 5 allows for 'infinite'-length instructions, it doesn't follow that there are any meaningful known ways to use it.
^ Hard to argue against that. But I don't really think it's the engines per se that are the problem, since most of them are modular (UE3 has since integrated both AO and GI), but rather the knowledge of how to implement new rendering techniques without killing performance.
The engine version names are mostly arbitrary anyway. UE3 could probably run on the original Xbox to a certain extent and id tech 5 is available for the iPhone, for two examples.
I also read the news about those developers but I think that they are full of it. Content is already being created at a much higher level than actually ends up on our screens.
Crysis is horrific-looking on low settings. The visual jump from low to medium is massive. Crysis could run on some very low-end hardware on low or medium settings and still look decent enough compared to what else is out there.
Part of what makes the engine exemplary is its scalability and the sort of results that can be achieved when it is pushed. Four years later, there is no game that can match the level of visuals produced by CryEngine 2 at its highest settings and high resolutions.
I think there will be plenty of poo-pooing if, when we get a look at the Crysis 2 demo on March 1st, we see DX11 bringing only a framerate hit and requiring still screenshots to show any difference from DX9.
Basically, we want to see the bar raised from Crysis!