Crysis 2 Benchmarks - Demo is up now.


Mr. President

Member
Feb 6, 2011
124
2
81
I must agree with the sentiment that the original wasn't that much better looking. I'm replaying it now on my new GPU (maxed out with 2xSSAA at 720p) and, while still a great looking game, there is a world of difference between how it actually looks in motion and the carefully staged bullshots that typically pop up in these threads.

At the risk of starting a penis war, it's worth noting that my gaming PC is also my HTPC and so is connected to the same TV as my PS3 and 360. I was just playing the new Killzone 3 demo (which is a snow/ice level) and it looks vastly better than Crysis' equivalent levels.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I was just playing the new Killzone 3 demo (which is a snow/ice level) and it looks vastly better than Crysis' equivalent levels.

I guess you didn't see the *hidden* levels in Crysis hehe :awe:
http://img114.imageshack.us/f/crysis2ja7.jpg/

That's not even the point. We aren't arguing whether or not Crysis is the best looking game (vs. Killzone 3, God of War 3, Uncharted 2, etc.). If I sat you down right now (and let's say you have never heard of the Crysis franchise before) and in a blind test asked you to identify a 2007 game vs. a 2011 game, would you be able to do so convincingly?

Crysis 2
1.jpg
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
What's with all the new posters showing up in this thread to bash Crysis? It's like we rattled a console fanboi nest somewhere on the interwebs.

The graphics in Crysis 2 are nice, but they seem to have taken a step back from the original Crysis as far as "realism" goes. Not that I'm a guru at this or anything, but I think it's because the shaders are awfully simple, and that's probably where the performance boost comes from.
 

Mr. President

Member
Feb 6, 2011
124
2
81
That's not even the point. We aren't arguing whether or not Crysis is the best looking game (vs. Killzone 3, God of War 3, Uncharted 2, etc.).
I get what you're saying but it was really more of a side note.

I still feel that it has more to do with diminishing returns than anything else. What made Crysis so hard on hardware wasn't the (then) advanced rendering techniques but rather the sheer amount of content. By far the biggest framerate killers were shadow resolution (which needs tons of fillrate and video RAM) and 'object detail' (which increased the amount of light sources, among other things); just setting those two down to medium increased performance by a full 50% and allowed you to run the game otherwise maxed out on then-current hardware. The simple act of taking Crysis out of the jungle and into the city was probably enough to improve performance twofold.
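For reference, that kind of mixed-settings tweak was usually done through the sys_spec_* console variables, either from the in-game console or an autoexec.cfg in the game folder. A minimal sketch from memory, assuming the usual 1 = Low through 4 = Very High scale, so treat the exact names and values as approximate rather than gospel:

    sys_spec_ObjectDetail = 2
    sys_spec_Shadows = 2

with everything else left at its Very High (4) value in the launcher or config.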

If I sat you down right now (and let's say you have never heard of the Crysis franchise before) and in a blind test asked you to identify a 2007 game vs. a 2011 game, would you be able to do so convincingly?

Not at first glance but, as above, I think it's a matter of recognizing the effects. The wall that developers hit wasn't as much the limited console hardware as R&D. Recent hardware has immense horsepower but the problem is that nobody knows how to use it. It took over three years, for instance, for developers to realize that they could use the SPUs in the PS3 for DirectCompute-type effects like MLAA, DoF and motion blur. And that's not to mention new stuff like global illumination, which is a subtle effect but a significant leap in graphics technology.

I don't see why PC developers shouldn't be faced with the same dilemma. Just because shader model 5 allows for 'infinite' length instructions, it doesn't follow that there are any meaningful known ways to use it.
 

Mr. President

Member
Feb 6, 2011
124
2
81
What's with all the new posters showing up in this thread to bash Crysis? It's like we rattled a console fanboi nest somewhere on the interwebs.

I don't know if that was pointed at me but that's not the case at all. I just think that people are underestimating how far graphics technology has evolved and how much of it is now dependent on R&D.

Just consider how much of recent development originates in 2-3 year old Siggraph papers. The only things that DirectCompute is being used for nowadays, for instance, are just new takes on old concepts. It's easy to say 'make it look better' but it's just not that simple.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
I get what you're saying but it was really more of a side note.

I still feel that it has more to do with diminishing returns than anything else. What made Crysis so hard on hardware wasn't the (then) advanced rendering techniques but rather the sheer amount of content. By far the biggest framerate killers were shadow resolution (which needs tons of fillrate and video RAM) and 'object detail' (which increased the amount of light sources, among other things); just setting those two down to medium increased performance by a full 50% and allowed you to run the game otherwise maxed out on then-current hardware. The simple act of taking Crysis out of the jungle and into the city was probably enough to improve performance twofold.



Not at first glance but, as above, I think it's a matter of recognizing the effects. The wall that developers hit wasn't as much the limited console hardware as R&D. Recent hardware has immense horsepower but the problem is that nobody knows how to use it. It took over three years, for instance, for developers to realize that they could use the SPUs in the PS3 for DirectCompute-type effects like MLAA, DoF and motion blur. And that's not to mention new stuff like global illumination, which is a subtle effect but a significant leap in graphics technology.

I don't see why PC developers shouldn't be faced with the same dilemma. Just because shader model 5 allows for 'infinite' length instructions, it doesn't follow that there are any meaningful known ways to use it.

TBH, we still haven't seen the DX10/DX11 version of it yet. Or even the proper DX9 version. The fact that we are comparing Crysis 2 at gimped settings to Crysis 1 at max settings is encouraging, I think.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Not at first glance but, as above, I think it's a matter of recognizing the effects. The wall that developers hit wasn't as much the limited console hardware as R&D.

UT2004 (2004 game)

Shibo_UT_ingame2.jpg


UT3 (2007 game)

ut3.jpg


I agree that it's expensive to develop new engines, but that was never an excuse in the past. R&D for a new engine is hard to justify now because consoles won't be able to run it, which means your development costs can only be recouped through sales to PC gamers. That makes it too expensive to invest in newer technology, since you are automatically giving up a large chunk of the gaming market. So imo, it is in fact consoles which have held back R&D spending on new game engines. The developers simply don't have any incentive to do so.

If you look at some of the comments of *certain* developers, they have in fact voiced they don't want new consoles at all because the artistic development costs would increase exponentially. But that's another way of saying: "If gamers are OK buying ugly looking current games, why spend a penny on developing something better? Let's just boost our profit margins". This is why many respect Crytek as they went out of their way to create something extraordinary in 2007.
 
Last edited:

Mr. President

Member
Feb 6, 2011
124
2
81
^ Hard to argue against that. But I don't really think it's the engines per se that are the problem, since most of them are modular (UE3 has since integrated both AO and GI), but rather the knowledge of how to implement new rendering techniques without killing performance.

The engine version names are mostly arbitrary anyway. UE3 could probably run on the original Xbox to a certain extent and id tech 5 is available for the iPhone, for two examples.

I also read the news about those developers but I think that they are full of it. Content is already being created at a much higher level than actually ends up on our screens.
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
^ Hard to argue against that. But I don't really think it's the engines per se that are the problem, since most of them are modular (UE3 has since integrated both AO and GI), but rather the knowledge of how to implement new rendering techniques without killing performance.

The engine version names are mostly arbitrary anyway. UE3 could probably run on the original Xbox to a certain extent and id tech 5 is available for the iPhone, for two examples.

I also read the news about those developers but I think that they are full of it. Content is already being created at a much higher level than actually ends up on our screens.


Crysis could run on some very low end hardware on low or medium settings and look decent enough in comparison to what else is out there.

Part of what makes the engine exemplary is its scalability and just what sort of results can be achieved when it is pushed. Four years later there is no game that can match the level of visuals produced by CryEngine 2 at its highest settings and high resolutions.

I think there will be plenty of poo-pooing if, when we get a look at the Crysis 2 demo on March 1st, we see DX11 just bringing a framerate hit and requiring still screenshots to show a difference from DX9.

Basically, we want to see the bar raised from Crysis!
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Crysis could run on some very low end hardware on low or medium settings and look decent enough in comparison to what else is out there.

Part of what makes the engine exemplary is its scalability and just what sort of results can be achieved when it is pushed. Four years later there is no game that can match the level of visuals produced by CryEngine 2 at its highest settings and high resolutions.

I think there will be plenty of poo-pooing if, when we get a look at the Crysis 2 demo on March 1st, we see DX11 just bringing a framerate hit and requiring still screenshots to show a difference from DX9.

Basically, we want to see the bar raised from Crysis!
Crysis is horrific looking on low settings. The visual jump from low to medium is massive.
 

endlessmike133

Senior member
Jan 2, 2011
444
0
0
To those saying Killzone 3 looks better than Crysis 1/2:

TBH, the graphics in Killzone are not better, but the art is much, much better than Crysis' art, which, combined with the wonderful graphics, DOES make it look better than Crysis.
 

DrBoss

Senior member
Feb 23, 2011
415
1
81
To start, I am a new user, love Anandtech. I’ve never really felt a need to post on these forums until I started reading this thread.

I align myself with those of you who find it unfortunate that the console market has derailed most (all) companies from pushing the boundaries of PC hardware. Crysis 1 is, without a doubt, the best looking game ever made. Sure, the story has its flaws and the gameplay is not anything unique… but I am talking about graphics. If you think otherwise, you’ve never seen Crysis running at 1920x1200, tweaked to very/ultra high settings.

The spat about the game having some ugly points is moot… all games have certain areas where textures are not as great as they could be, or walls that are sterile, whatever. Crysis 1 is consistently jaw-dropping. Consoles cannot touch the texture quality, the clarity of high resolutions, the anti-aliasing, the anisotropic filtering… forget about it.

I'd quickly like to talk about resolution, which for the most part is limited to 720p on consoles...

720p (console), standard high def, is 1280x720 = 921,600 pixels.
A good PC gaming rig will run 1920x1200 = 2,304,000 pixels, more than double what a console pushes.
A great PC gaming rig will run 2560x1600 = 4,096,000 pixels.
Add AA and AF to those resolutions and you've got an image that is enormously clearer than anything rendered by a console (quick math on the ratios below).
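Worked out as simple ratios of those pixel counts:

1920x1200 vs. 720p: 2,304,000 / 921,600 = 2.5x the pixels
2560x1600 vs. 720p: 4,096,000 / 921,600 ≈ 4.4x the pixels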

I simply don't understand anyone's defense of console graphics in comparison to PC graphics.

Crysis 2
This may get me instantly banned, but I’ve seen the leaked version (in person) of Crysis 2, and I seriously doubt it represents the final quality of the PC version. Textures do not stand up to Crysis 1, nor do particle effects. That’s not to say they are bad, but I think this version of the game has surely been neutered to conform to the constraints of console hardware (lack of power). That said, I HOPE that the final build gets us back to Crysis 1 quality or better. Let’s not forget that even now, no gaming rig can run Crysis at maxed ultra settings. GPU manufacturers are still trying to best CryEngine 2, and I hope CryEngine 3 (PC) raises the bar even further.

As far as what I saw of the gameplay goes, Crysis 2 is awesome. Fast paced (like Warhead), amazing set pieces, killer arsenal. It looks as though the movement and suit controls have been streamlined for consoles. Only cloak and armor are selectable, speed is activated using shift, and power seems instantly activated when you grab something or jump. The slide and ledge-grab maneuvers look pretty awesome.

Anyway... I love my PS3, but I also hate it… and consoles in general, because they’ve shifted the focus of game development away from the cutting edge of performance – which is obviously PC hardware. This is unfortunate considering the power afforded by modern PC hardware extends beyond simply graphics... A.I. - Sound - Load Times - Mod'ability'.

Sigh

That said, Killzone 3 is pretty killer. Check it out as we all await the release of Crysis 2.
 
Last edited:

Xonipher

Member
Feb 21, 2011
25
0
0
I'm a new member, and I'm not trying to redeem myself with this post. BUT, I like anandtech, and I love crysis. Crysis is the reason I got hooked on PC gaming. BTW, has anyone here played the Beta?
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Well the demo is out on March 1st. We can discuss the DX11 visuals then hopefully. It's most likely going to have a benchmark too. So we can see some DX11 performance benches.
 

SHAQ

Senior member
Aug 5, 2002
738
0
76
I heard from someone that knows someone (I know, I know) at Crytek that DX11 won't be added until a post-launch patch. So we might be waiting a bit.