
Warning: Crysis 3 Will Melt Your PC, Says Crytek

"I have Quadfire HD GTX 680's in SLI and this game doesn't run smoothly? Unoptimized! I don't care that it looks like I just had diamond-encrusted rainbows enter my eyeballs! I may know nothing about the engine or graphics in general, but I know it's unoptimized!"
 
Crysis really looks spectacular outdoors, though some of the indoor textures are quite crappy. The Witcher 2, on the other hand, looks gorgeous throughout. I don't want to start a debate here, but to me The Witcher 2 is more visually appealing. A five-year-old game that can compete with modern games visually shows its real strength, though.
The Witcher 2 is more visually appealing because it doesn't have to render nearly as much onscreen at a time, due to its much, much shorter draw distance.

That was the only point I was trying to make; it's not that The Witcher 2 looks bad compared to Crysis, because it doesn't.
It's just a bad comparison, because one chooses to render as little as possible onscreen at a time and add as much detail/effects as possible, while the other does the opposite.

People keep saying Crysis 1 was so unoptimized, yet very few modern games have Crysis 1's draw distance and render as much onscreen at a time with Crysis's level of detail/performance.

Obviously some of it (a lot of it) has to do with most games nowadays being console ports with their limited memory; just look at Crysis 2's draw distance compared to Crysis 1's. It's not even close.
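As a toy illustration of the draw-distance point (the density figure and the uniform-density assumption are mine, not from any benchmark): with roughly even scene density, the number of objects inside the visible area grows with the square of the draw distance, so quadrupling draw distance means roughly sixteen times as many objects to consider.

```python
import math

def objects_in_view(draw_distance, density=0.001):
    """Toy model: with uniform object density, the objects a renderer
    must consider grow with the area of the visible disc, i.e. with
    the square of the draw distance."""
    return math.pi * draw_distance ** 2 * density

near = objects_in_view(500)    # short draw distance (Witcher 2 style)
far = objects_in_view(2000)    # long draw distance (Crysis style)
print(far / near)              # 16.0: 4x the distance, 16x the objects
```

This ignores occlusion culling and LOD, which real engines lean on heavily, but it shows why a long draw distance is a fundamentally different cost than piling more detail onto fewer objects.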
 

I think ARMA II has a higher draw distance, but Crysis 1 looks better. I agree with your other points, though.
 

If you set ARMA 2 to max details/draw distance, it will force ANY PC to its knees.
For the same reason Crysis is demanding: it renders a hell of a lot of content!
 
Most people don't get it.
Just like they don't get the engine...and start with FUD like "unoptimized"...due to a hurt e-peen....

You keep implying you know something we don't, and yet you won't say what it is.

Until then, Crysis engine is just poorly programmed.
 

You keep implying that Lonbjerg is wrong, but you haven't yourself named a better-looking open-world/sandbox shooter that can get almost 60 fps with MSAA turned on at 1080p on a 7970 GE. The Witcher 2, BF3 and Metro 2033 cannot give you 60 fps maxed out on the same video card. Now, if you said something like "Far Cry 2 looks very good given its performance demands," most people probably wouldn't disagree with you.

Sleeping Dogs, which looks worse than Crysis, gets just 47 fps on a 1320 MHz GPU-Boosted 680 and 56 fps on a 1250 MHz 7970. I would agree with you that on the multi-core CPU side, Crysis/Warhead is poorly optimized, but the game seems mostly GPU-limited with modern dual-core i3 CPUs anyway. I imagine the next level of graphical fidelity, the one that puts Crysis 1 to shame, will run at 30 fps on a GTX 780. That next level of vastly superior graphics is exponentially more demanding.
 
Crysis 2 was certainly not very well optimized at the Very High DX10 settings. I have used mods that looked far better yet also ran better than the standard Very High DX10 config.
 
It's impossible to judge an engine just based on sight. I've played through the Sleeping Dogs demo and I think it looks better than Crysis, but it seems RS thinks otherwise.

When you get up close to the leaves in Crysis, you can see how they fit entire forests of them on your screen. They only look good when you're looking out over the big draw distance they set.

I think Crysis looks good because it has the big forests. Cities cannot compare to that even with great detail in them.

Crysis also looks like it has a focus on the textures you'd notice instead of ones you don't.
 
If Crysis didn't have so many trees, it would run at 100 fps all day. Also, how would you explain Far Cry 2's awesome graphics and excellent fps compared to Crysis?
 
You keep implying that Lonbjerg is wrong, but you haven't yourself named a better-looking open-world/sandbox shooter that can get almost 60 fps with MSAA turned on at 1080p on a 7970 GE. The Witcher 2, BF3 and Metro 2033 cannot give you 60 fps maxed out on the same video card. Now, if you said something like "Far Cry 2 looks very good given its performance demands," most people probably wouldn't disagree with you.

Sleeping Dogs, which looks worse than Crysis, gets just 47 fps on a 1320 MHz GPU-Boosted 680 and 56 fps on a 1250 MHz 7970. I would agree with you that on the multi-core CPU side, Crysis/Warhead is poorly optimized, but the game seems mostly GPU-limited with modern dual-core i3 CPUs anyway. I imagine the next level of graphical fidelity, the one that puts Crysis 1 to shame, will run at 30 fps on a GTX 780. That next level of vastly superior graphics is exponentially more demanding.

Great post.

Did a little digging on the AT benches and Wikipedia...

Look at the history before Crysis was released. For example, the 7900GT was ~2x faster than the 6800GT it replaced as the high-end single-GPU offering.

Then the 8800GTX was released at the end of 2006/beginning of 2007, which improved things EVEN more (something like 2-3x the performance). That means in the roughly 2.5 years leading up to Crysis's release in 2007, we saw graphics speeds increase 4-6x. That's pretty impressive!

Then look after it was released, and you see that in the last ~5 years we have seen graphics 'only' increase ~8x. http://www.anandtech.com/bench/Product/514?vs=508

Crysis was WAY ahead of its time, and I truly think the developers assumed the huge gains we had seen in the 3-4 years before its release would continue. Unfortunately, thermodynamics and 'console-itis' got in the way, and we became accustomed to much slower GPU growth. If the performance improvements of 2005-2007 had continued through today, Crysis would be MUCH more playable on 'high-end' machines with mods and eye candy maxed out.
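Taking the post's own rough multipliers at face value (4-6x over the ~2.5 years before Crysis, ~8x over the ~5 years after; these are the post's estimates, not measured data), the implied annualized growth rate can be sanity-checked with quick arithmetic:

```python
# Annualized growth implied by a total speedup over a number of years.
# The multipliers below are the post's rough estimates, not benchmarks.
def annual_growth(total_multiplier, years):
    return total_multiplier ** (1 / years)

before = annual_growth(5, 2.5)  # midpoint of the 4-6x pre-Crysis figure
after = annual_growth(8, 5)     # the ~8x post-Crysis figure
print(round(before, 2), round(after, 2))  # 1.9 1.52
```

So by these numbers GPUs went from roughly 1.9x per year to roughly 1.5x per year, which is exactly the slowdown being described.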

To say this game looks like crap is VERY naive. The fact that only 3-4 games in the last 5 years even fall in the same discussion is proof enough for me (TW2, BF3, Metro, maybe a few others?). The game still looks great completely 'stock' and IMHO blows away anything else around when highly modded.
 
You keep implying you know something we don't, and yet you won't say what it is.

Until then, Crysis engine is just poorly programmed.

You're the one making the accusation, the burden falls on you to prove it IS poorly optimized.
 
If Crysis didn't have so many trees, it would run at 100 fps all day. Also, how would you explain Far Cry 2's awesome graphics and excellent fps compared to Crysis?

You must be related to tweakboy. That's the only way to explain the nonsense you've posted in this thread.
 

http://www.anandtech.com/show/6202/...tafson-as-chief-graphics-product-architecture

This stagnation in the GPU market might be ready to change.
 
When you get up close to the leaves in Crysis, you can see how they fit entire forests of them on your screen. They only look good when you're looking out over the big draw distance they set.

Of course it's subjective for each user, but while there are many places where the textures look poor in Crysis, especially indoors/inside houses, the entire Sleeping Dogs game looks like a console game (and I really, really hate the washed-out PS3/360 console texture look). I'm not knocking its gameplay or storyline, but the graphics are nothing special, especially since it brings a 1300 MHz GPU-Boosted $580 GTX 680 Lightning to its knees. 39 fps on a stock GTX 680 at 1080p... holy moly.

This looks like a PS3 game to me with a PC-level lighting model, not much else. I'm not seeing why it should only do 40 fps on a 680.
(Sleeping Dogs screenshots)


There are certain areas in Sleeping Dogs where the game looks like a 2007 PC game, like here, where the sidewalk is just one giant texture with not even any bump/displacement mapping:
(screenshot of the sidewalk area)
 
You keep implying that Lonbjerg is wrong, but you haven't yourself named a better-looking open-world/sandbox shooter that can get almost 60 fps with MSAA turned on at 1080p on a 7970 GE.

Because I haven't played one. Sure Crysis was nice, I especially liked the effects when the bullets hit the water. I just don't think the visuals add up to the lack of performance.
 

Sleeping Dogs does use higher resolution textures (offered as a free download, rather than use space on the disc) and other higher precision effects on PC over the console versions, like HDAO. Eurogamer's analysis makes this clear.

The beatdown it gives graphics cards comes from its native supersample antialiasing implementation, which renders the framebuffer at two or four times the display resolution and then downscales it, similar to The Witcher 2's ubersampling. This is a feature Crysis distinctly lacks (though you can force it in the graphics drivers), so Sleeping Dogs will undoubtedly have less geometry and texture aliasing than Crysis. The multisample antialiasing used in Crysis can't even attempt to address texture aliasing. The supersampling method Sleeping Dogs implements also uses DirectCompute, which is why AMD cards hold a lead over Nvidia here (Kepler's compute performance is worse than GCN's).
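For anyone unclear on what that downscale step does, here is a minimal sketch of the resolve in 2x2 ordered-grid supersampling (illustrative only; Sleeping Dogs' real implementation runs on the GPU via DirectCompute): the scene is rendered at twice the width and height, then each 2x2 block of samples is averaged into one display pixel.

```python
def downsample_2x(image):
    """Resolve step of 2x2 supersampling: average each 2x2 block of a
    supersampled grayscale image into one output pixel. `image` is a
    list of rows with even dimensions."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[y]), 2):
            total = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(total / 4)
        out.append(row)
    return out

# A hard 0/1 edge rendered at 4x4 resolves to a softened 2x2 image.
supersampled = [[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 1, 1, 1],
                [0, 1, 1, 1]]
print(downsample_2x(supersampled))  # [[0.0, 1.0], [0.5, 1.0]]
```

Because the average is taken over fully shaded samples, texture detail gets filtered too, which is what MSAA (extra coverage samples, shared shading per pixel) cannot do.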
 
That's the only way to explain the nonsense.

Not to mention anyone specifically, but I still think it's funny that some are using 2011-2012 games in their arguments against Crysis (a 2007 game) and still aren't proving their case.

Crysis is still a great game, graphically and gameplay-wise; I hope Crysis 3 follows that vein (and not Crysis 2's). While Crysis 1 has/had great graphics, the gameplay was what made me dig it. I still enjoy it to this day.
 
Because I haven't played one. Sure Crysis was nice, I especially liked the effects when the bullets hit the water. I just don't think the visuals add up to the lack of performance.

No, you are just ignorant and try with the fallacy of "shifting the burden of proof" to bolster your FUBAR "argument"

Last moron I saw trying to use that fallacy was a creationist...he didn't have any valid arguments either...just like you.

Personal attacks are not permitted here. -Admin DrPizza
 
😵
 

That's, um, a little off topic here.
 
Crysis is still a great game, graphically and gameplay-wise; I hope Crysis 3 follows that vein (and not Crysis 2's). While Crysis 1 has/had great graphics, the gameplay was what made me dig it. I still enjoy it to this day.

:thumbsup:

While I certainly enjoy the benefits of PC gaming, in the end the gameplay still matters. Even with the same graphics assets as a console version, I'll take the PC version, just because I get a clearer and cleaner picture thanks to 1080p, 60+ fps, and higher AF & AA, as well as a mouse and keyboard for precision.

Crysis 2 was a step back from Crysis in terms of fun and enjoyment.
 
All I know is I can play Crysis 2 at 85 fps average @ 1080p, but I can barely get through the opening animation of the Crysis 1 demo @ 1440x900. It's like 25 fps max, and that's on a 2500K and 560 Ti, both overclocked.
 
You don't average 85 fps in Crysis 2 on max DX11 settings at 1080p with a GTX 560 Ti. Not even close.

And even my GTX 560 SE can play through Crysis 1 on Very High DX10 settings with 4x AA at just 1440x900, so it must be the demo if you are having trouble at that low a resolution.
 
A GTX 560 Ti 448 gets about 40 fps average in Crysis 2 maxed out. Not sure how you are getting 85 fps "maxed out". Interesting to note that a stock 7850 beats the GTX 560 Ti 448 in Crysis 2. I didn't even know that.

(Crysis 2 benchmark chart)
 