Crysis 2 Retail Benchmarked

Page 7

toyota

Lifer
Apr 15, 2001
12,957
1
0
What do you mean? Crysis 2 supports 4 cores. So it's 4 x 2.4GHz vs. 2 x 3.16GHz. You have 51% more CPU processing power in the Q6600 if the game is fully multi-threaded.
And yet it is performing 80% better. Please show me one other game where a 4-core CPU is well over twice as fast, clock for clock, as a 2-core CPU of the same architecture.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
And yet it is performing 80% better. Please show me one other game where a 4-core CPU is well over twice as fast, clock for clock, as a 2-core CPU of the same architecture.

Straight from the review:

"While it's become increasingly common to see dual-core processors struggling with modern games, we can't recall the last time dual-core chips suffered as badly as they do in Crysis 2"

Since Crysis 2 was built for consoles from the ground up, and both the Xbox 360 and PS3 have multi-core CPUs, this likely explains why this particular game scales especially well with a quad-core CPU.

I can think of 2 other games, RE5 and BFBC2, where the performance scaling from dual-core to quad-core is no less impressive.

Resident Evil 5
C2D @ 2.4GHz = 38.5 fps
C2Q @ 2.4GHz = 65.3 fps (+70%)

Battlefield: Bad Company 2
C2D E6600 @ 2.4GHz = 42.4 fps
C2Q Q6600 @ 2.4GHz = 73.1 fps (+72%)
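Those scaling figures are easy to sanity-check; a few lines of Python (not part of the original post) reproduce the quoted percentages from the raw fps numbers:

```python
# Dual-core -> quad-core speedup, as a rounded percentage.
def scaling(dual_fps, quad_fps):
    return round((quad_fps / dual_fps - 1) * 100)

print(scaling(38.5, 65.3))  # Resident Evil 5: 70
print(scaling(42.4, 73.1))  # Battlefield: Bad Company 2: 72
```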

I know you have followed the forums for a while now based on your join date (2001). In the past I always maintained that it was better to get an Athlon 64 X2 3800+ over a single-core Athlon 64 4000+. Single-core CPUs have been EOL for gaming at least since Call of Duty 4: Modern Warfare. History repeated itself when gamers faced the choice between a Q6600/Q6700/Q9400 and the faster dual-core E8400/E8500 processors.

I have always said that just as the A64 4000+ became obsolete one day, so will dual-core processors (including the C2D). It looks like that time is finally here in 2011, and it should be, considering Intel released the E6400 ($224) and E6600 ($316) back in 2006. Either of these CPUs easily overclocked to 3.2-3.6GHz. Compared to these five-year-old processors, the E8500 (Wolfdale) is less than 10% faster per clock in videogames. I am frankly disappointed it took five years for just a handful of games to take advantage of more than 2 cores.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
Straight from the review:

"While it's become increasingly common to see dual-core processors struggling with modern games, we can't recall the last time dual-core chips suffered as badly as they do in Crysis 2"

Since Crysis 2 was built for consoles from the ground up, and both the Xbox 360 and PS3 have multi-core CPUs, this likely explains why this particular game scales especially well with a quad-core CPU.

I can think of 2 other games, RE5 and BFBC2, where the performance scaling from dual-core to quad-core is no less impressive.

Resident Evil 5
C2D @ 2.4GHz = 38.5 fps
C2Q @ 2.4GHz = 65.3 fps (+70%)

Battlefield: Bad Company 2
C2D E6600 @ 2.4GHz = 42.4 fps
C2Q Q6600 @ 2.4GHz = 73.1 fps (+72%)
I don't think you're actually paying attention to what I am saying. Even your numbers for BC2 and RE5 prove MY point: they are only 70% faster clock for clock going from 2 to 4 cores on the same architecture. In Crysis 2 it would be over a 130% difference clock for clock, which makes no sense. Not to mention the E8500 has more cache and a faster FSB than those older CPUs.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I don't think you're actually paying attention to what I am saying. Even your numbers for BC2 and RE5 prove MY point: they are only 70% faster clock for clock going from 2 to 4 cores on the same architecture. In Crysis 2 it would be over a 130% difference clock for clock, which makes no sense. Not to mention the E8500 has more cache and a faster FSB than those older CPUs.

I know exactly what you are saying. But you are assuming that the game scales linearly with clock speed.

Take a look here: a 2600K @ 2.5GHz is only 10% slower than a 2600K @ 4.0GHz in this game. More likely than not, the game doesn't care about clock speed beyond a certain point. In other words, you can't just assume this game even benefits from an E8500 clocked at 3.16GHz. It could be that beyond 2.8-3.0GHz additional clock speed is simply irrelevant: past a certain speed the 2 threads don't run any faster, while the other 2 threads sit idle. What you need is more cores.

So you have 4 x 2.4GHz cores vs. 2 x 3.16GHz cores (where there may be hardly any benefit beyond 2.8-3.0GHz). This is why the quad is easily 60-70% faster. Another possibility is frequent dips into single-digit minimum framerates on the dual-core, which would severely drag down the average.
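If the game really does stop benefiting from clock speed at some point, the arithmetic works out. Here is a toy model of that idea (the 2.9GHz cap is a hypothetical number picked purely for illustration, not something from the review):

```python
# Toy model: per-core throughput stops scaling past a clock-speed cap,
# so core count dominates once a CPU runs at or above the cap.
def throughput(cores, clock_ghz, cap_ghz=2.9):
    return cores * min(clock_ghz, cap_ghz)

q6600 = throughput(4, 2.4)   # 4 x 2.4 = 9.6
e8500 = throughput(2, 3.16)  # 2 x min(3.16, 2.9) = 5.8
print(f"quad is {q6600 / e8500 - 1:.0%} faster")  # 66%, inside the 60-70% claim
```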
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
x6bg3t.jpg



Look at the pizza.

Maximum Texture Quality Engaged!

That texture reminds me of the original Doom :D


In related news, Crytek announces six months later that DX11 is coming in the form of a $14.99 DLC that includes 2 new multiplayer maps...
 

TerabyteX

Banned
Mar 14, 2011
92
1
0
There's a thread on Steam searching for the worst Crysis 2 texture. When you walk through the flooded streets you see some debris floating on the water: it's a single flat texture of bottles, cups and other debris, and it looks horrible!!!
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Yeah, that thread shows some really terrible places... And in your screenshot, Groove, the car looks horrible! So do the leaves... and everything is so blurry! The road doesn't look too hot either... And that's "extreme"? D:
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
We've always known CE2 had horrible indoor graphics; since a lot of Crysis 2 takes place indoors, I'm not surprised it looks poor in some places.
 

dualsmp

Golden Member
Aug 16, 2003
1,627
45
91
You can create your own Crysis 2 benchmarks here:

http://www.gamefaqs.com/boards/960489-/58578630

I've tried his examples and they all work. He made a correction at the bottom saying the .bat files need to be in the Benchmarks folder.

1: Create a new text file in Crysis 2/Benchmarks. Name it CPU Benchmark 1.bat.
2: Paste the following and then make sure you save it.

@echo Running CPU Benchmark 1
@echo Type "quit" in console to end Benchmark
@echo Results will depend on current system settings
@pause
@echo Running...
@..\Bin32\Crysis2.exe -DEVMODE +exec AutoTestTimeDemo.cfg +demo_StartDemoLevel AlienVessel
@call "..\TestResults\autotest.log"
@del "..\TestResults\*.xml"

3: Open CPU Benchmark 1.bat and watch the benchmark run. It loops twice and then exits; results are shown once the benchmark completes. There are more benchmarks below if you want to create them. I had to modify the GPU benchmarks a little to get them to work.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
So when is this DX11 patch coming out?

Doesn't matter though, I'm not paying $60 for ANY game. I'll wait until it comes down in price.
 

dust

Golden Member
Oct 13, 2008
1,328
2
71
So when is this DX11 patch coming out?

Doesn't matter though, I'm not paying $60 for ANY game. I'll wait until it comes down in price.

Well, I read above that DX11 would mean another $15 added to the retail price, and for that you get two useless multiplayer maps. That would be $75 for a highly anticipated turd of a game.
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
Now that I've bought the game I wonder how people have the time to sit and check out the graphics in such detail.

I play at 2560x1440 at extreme settings and things look fine to me (I mean, some of the garbage boxes and cars don't look that great), but it doesn't detract from the game at all for me because I'm vaulting over buses chasing down calamari.

I don't get why people think the game is such a turd - I find the single player campaign a lot of fun, and the game looks quite good in some areas. Is the inference at play here simply "this isn't the best looking game, and I was expecting Crytek to produce such a game if it has the Crysis name, therefore this game sucks"? Seems a bit unfair to me.
 

Fleshgod

Junior Member
Apr 3, 2011
1
0
0
Crysis 2 is better than Crysis 1 in every single way. Gameplay is intense yet still requires a tactical approach to every situation, and the graphics are better than anything we've seen to date, better even than the Heaven benchmark, the new AvP and Metro 2033. With the exception of things like the pizza, etc., but it would be a total waste to spend hours detailing a pizza that has nothing to do with the game; imagine your poor GPU trying to render 1000 tiny high-texture objects, and for what?! Crytek spent their time making an amazingly detailed world rather than an amazing-looking pizza. Some of you have supercomputers and that's good for you; the rest of us have better things to do than spend our money on gaming rigs, and we like the fact that our budget PCs can play the game on the highest graphics settings. Those of you complaining about the lack of DX11: who cares, honestly. When you play the game, take a moment to look at the level of detail in the city, done without the help of tessellation. You are dissing a game that you haven't even played. It would be of no value to have highly detailed suits as you rarely get to see the suit. Crytek have achieved something even more amazing here than they did with Crysis: they made a game so efficient it can run on most PCs and consoles, with graphics unmatched by any other developer. NONE OF YOU WILL EVER ACHIEVE WHAT CRYTEK HAVE ACHIEVED HERE.



Welcome to our forums.

What you posted and the way you posted is not allowed.

Before continuing, please read the guidelines here
http://forums.anandtech.com/showthread.php?t=60552

This is a tech forum and a modicum of civility is required here.

No insulting members and no profanity.

If you have questions about our forum, post it here:
http://forums.anandtech.com/forumdisplay.php?f=56



esquared
Anandtech Forum Director.
 
Last edited:
May 13, 2009
12,333
612
126
Crytek fails. Have fun, guys. $75 for a DX11 version of Crysis. Lol. I have no interest in Crysis 2. I'm sure the single-player is fun, but what happens when you beat the single-player mode in a couple of days? From what I hear it only allows 16 players in multiplayer, and it very rarely even works. No thanks. I'll be sticking to my 32-player, DX11, just-as-good-graphics Battlefield BC2.

Also forgot to mention: no one plays Crysis multiplayer. You'll be one of 3 guys battling it out on all three of their maps. :thumbsdown:
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I know exactly what you are saying. But you are assuming that the game scales linearly with clock speed.

Take a look here: a 2600K @ 2.5GHz is only 10% slower than a 2600K @ 4.0GHz in this game. More likely than not, the game doesn't care about clock speed beyond a certain point. In other words, you can't just assume this game even benefits from an E8500 clocked at 3.16GHz. It could be that beyond 2.8-3.0GHz additional clock speed is simply irrelevant: past a certain speed the 2 threads don't run any faster, while the other 2 threads sit idle. What you need is more cores.

So you have 4 x 2.4GHz cores vs. 2 x 3.16GHz cores (where there may be hardly any benefit beyond 2.8-3.0GHz). This is why the quad is easily 60-70% faster. Another possibility is frequent dips into single-digit minimum framerates on the dual-core, which would severely drag down the average.

Just to add some merit to this theory: the PS3's and 360's CPUs are clocked at 3.2GHz, with the 360 having 3 x 3.2GHz PPEs and the PS3 having 1 x 3.2GHz PPE + 6 SPUs.

Could be that since it was developed for consoles, there is a cut-off for clocks, but more cores help the game out.

Are there any scores to compare against a Gulftown or Thuban, to see if those extra two cores add performance?
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Now that I've bought the game I wonder how people have the time to sit and check out the graphics in such detail.

I play at 2560x1440 at extreme settings and things look fine to me (I mean, some of the garbage boxes and cars don't look that great), but it doesn't detract from the game at all for me because I'm vaulting over buses chasing down calamari.

I don't get why people think the game is such a turd - I find the single player campaign a lot of fun, and the game looks quite good in some areas. Is the inference at play here simply "this isn't the best looking game, and I was expecting Crytek to produce such a game if it has the Crysis name, therefore this game sucks"? Seems a bit unfair to me.

PC gaming enthusiasts love the idea of games that come out and can't be played properly on current hardware. It's a weird fetish to worship such games, which is why Crysis 1 was loved like a firstborn for so long. Crytek tried to expand their market with Crysis 2 by making the game multiplatform and playable on a larger install base of PCs. For this, they are being slammed by the vocal enthusiast minority. In reality, Crytek made the right choice, one that will keep the studio doors open longer.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
The game is a lot of fun, and I think there are some great graphics. Join some of the different multiplayer maps. I heard this warning voice, 'Gamma burst incoming!', and these colored bolts of lightning struck the ground and flashed the screen. I caught a screenshot.
Crysis2_2011_03_30_21_57_19_313.png

Crysis2_2011_03_30_21_57_21_748.png
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
PC gaming enthusiasts love the idea of games that come out and can't be played properly on current hardware. It's a weird fetish to worship such games, which is why Crysis 1 was loved like a firstborn for so long. Crytek tried to expand their market with Crysis 2 by making the game multiplatform and playable on a larger install base of PCs. For this, they are being slammed by the vocal enthusiast minority. In reality, Crytek made the right choice, one that will keep the studio doors open longer.

Nobody is saying Crytek shouldn't be praised for making the game playable on a wider system selection.

What people are upset about is the fact that the game looks worse maxed out than the previous one! And people with very strong systems don't get to use the newest technologies their hardware supports. Did you even bother to read what the complaints are about? I guess not.

The game looks good and runs on mid-range systems = good.
High-end systems aren't utilized, there's no OPTION to make your PC sweat = bad.

Did you see the Cryengine 3 tech demos? The game Crytek released doesn't even have the OPTION to make it look similar to that. It's the same DX9 the consoles are getting, with higher-resolution textures (and hell, not even everywhere!), anti-aliasing and higher frame rates.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
The Q6600 results seem odd. The E8500 is the same basic architecture, yet at 3.16GHz it can only get 23 fps, which means at the same 2.4GHz speed as the Q6600 it would get only 17 fps. It just doesn't make sense that the Q6600 is well over twice as fast clock for clock.

You could have simply looked at the Phenom II X2 3.3 GHz vs. the Phenom II X4 3.5 GHz: 24 fps vs. 61 fps. Scaling the X2 up to 3.5 GHz is a 6% increase in clockspeed. Assuming perfectly linear scaling, at the very best that would improve the X2 @ 3.5 GHz to 26 fps, and that would still make the quad core well over twice as fast (234% as fast). That is very odd considering the X4 only has two extra cores and the chips are otherwise identical, including the amount of L3 cache.
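The arithmetic above can be checked in a couple of lines (a reader's sanity check, not part of the original benchmark data):

```python
# Phenom II X2 @ 3.3 GHz vs. X4 @ 3.5 GHz, from the review numbers.
x2_fps, x4_fps = 24, 61
x2_scaled = x2_fps * (3.5 / 3.3)  # linear best case for the X2 at 3.5 GHz
print(f"X2 scaled to 3.5 GHz: ~{x2_scaled:.1f} fps")  # ~25.5, call it 26
print(f"X4 is {x4_fps / 26:.2f}x as fast")            # well over twice as fast
```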

Take a look here: a 2600K @ 2.5GHz is only 10% slower than a 2600K @ 4.0GHz in this game. More likely than not, the game doesn't care about clock speed beyond a certain point. In other words, you can't just assume this game even benefits from an E8500 clocked at 3.16GHz. It could be that beyond 2.8-3.0GHz additional clock speed is simply irrelevant: past a certain speed the 2 threads don't run any faster, while the other 2 threads sit idle. What you need is more cores.

Maybe your point is valid, but the data you provided does not really back up your statement. Obviously the game is completely GPU-bottlenecked with the 2600K @ 4.0 GHz, which makes your clockspeed-scaling argument inconclusive at best.

It's really not possible to draw clockspeed-scaling conclusions when you're running into a GPU bottleneck. A game, and particularly Crysis, depends on both the CPU and the video card for performance, and different scenes in a benchmark put different loads on each. You can be both CPU- and GPU-limited in the same benchmark depending on what is being drawn on screen, and how often that happens, and what percentage of the time one is the bottleneck rather than the other, is very context-specific.

The benchmarks you used both discredit your point and illustrate the one I just made: it is clear as day that a 2600K @ 3.5 GHz is feeding their video card as much as it can handle at all times, since the same processor @ 4.0 GHz offers no performance improvement at all. So I really do not like how you used the 4.0 GHz result to back up your argument, because that is simply and purposefully ignoring context to make your argument seem better than it is. It's a pretty sneaky tactic, and IIRC you have used it before. Maybe it's simply unawareness, but I think it's a bit on purpose.

But then drop the speed down to 3.0 GHz, 2.5 GHz, and 2.0 GHz. Since we know the 2600K is capable of keeping their video card completely fed with data, the non-linear trend of performance vs. clockspeed makes sense. What's happening is what I previously described: as the clockspeed increases, there are more moments where the GPU is completely fed, until the clockspeed gets so high that it keeps the GPU fed all the time. Since we're looking at averages, you can think of the reported framerates as showing that a faster processor spends less time being the bottleneck than the GPU does.

Here's the breakdown:

3.5 to 3.0 GHz: 14% less clockspeed, 3% less performance
3.0 to 2.5 GHz: 17% less clockspeed, 8% less performance
2.5 to 2.0 GHz: 20% less clockspeed, 15% less performance

Look at the ratio of performance loss to clockspeed loss: the lower you go, the closer the ratio gets to 1.
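Computing that ratio directly makes the trend obvious (the percentages are taken from the breakdown above):

```python
# (high clock, low clock, measured performance loss) for each step down.
steps = [(3.5, 3.0, 0.03), (3.0, 2.5, 0.08), (2.5, 2.0, 0.15)]
for hi, lo, perf_loss in steps:
    clock_loss = (hi - lo) / hi
    print(f"{hi} -> {lo} GHz: clock -{clock_loss:.0%}, perf -{perf_loss:.0%}, "
          f"ratio {perf_loss / clock_loss:.2f}")
# The ratio climbs 0.21 -> 0.48 -> 0.75: the lower the clock, the closer
# the CPU is to being the bottleneck all of the time.
```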

Now back to the Q6600. The reason toyota had a hard time believing the Q6600 vs. E8500 results is that the Q6600 is nowhere near able to keep the video card they used fed enough of the time, since the Q6600 gives you half the performance of a 2500K, and the E8500 in turn gives you half the performance of a Q6600. That's a pretty huge discrepancy. I suppose it's possible, but I find it hard to believe.

I hope you've been following me so far, as most of my discussion has been about how the data you cited neither corroborates your statement nor defeats toyota's. Something interesting is indeed going on with Crysis 2 here. Maybe your explanation about idle threads is correct, but I have a hard time believing it. Actually, I don't buy your statement about clockspeed not making a difference above a certain threshold. It might be true, but I have a very difficult time believing that is the core of the issue.

If a CPU is the bottleneck 100% of the time, then increasing the clockspeed will help alleviate that bottleneck. So even if you have threads sitting idle, they shouldn't sit idle as long, because the higher clockspeed should let the busy threads finish quicker and hand work to the idle ones. I can only see this not happening if something else is the bottleneck, like the memory subsystem.

I think there are probably multiple things contributing to the unusual results we are seeing, since Crysis 2 is definitely the first game I've seen with a greater-than-100% (and significantly so) performance increase from core scaling. One factor is probably the use of FRAPS to benchmark. Crysis on its own is probably pushing a dual core at 100% the entire time; FRAPS comes in and steals more performance than it should, while a quad core has a much easier time coping. So the insertion of processes and threads besides the ones the game itself produces could be a major factor in what we're seeing.
 
Last edited:

Veliko

Diamond Member
Feb 16, 2011
3,597
127
106
Nobody is saying Crytek shouldn't be praised for making the game playable on a wider system selection.

What people are upset about is the fact that the game looks worse maxed than the previous one!

It doesn't look worse than the first Crysis at all; the sequel is considerably more interesting to look at.

And people with very strong systems don't have the chance to use the newest technologies their hardware supports. Did you even bother to read what the complaints are about? I guess not.

The game looks good and runs on mid-range systems = good.
High-end systems aren't utilized, there's no OPTION to make your PC sweat = bad.

Crytek had always said that Crysis 2 wouldn't be pushing systems like the original one did. If you went out and bought expensive hardware in anticipation of Crysis 2 being a system eater then that is your problem and not Crytek's.
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
High-end systems aren't utilized, there's no OPTION to make your PC sweat = bad.

With overclocked 580s in SLI and an overclocked 2600K, there were parts where my rig sweated at 2560x1440 (extreme settings). It's a first-person shooter, and I'm particularly sensitive to FPS dips in games like these. There were 2 or 3 parts in the game, for example on the rooftop with the greenhouses, where the lighting effects, all the foliage, the reflections and so on (with all the movement/explosions etc.) took their toll. I don't mean that it became a slideshow, but there was a definite reduction in the smoothness that I want in a first-person shooter.

High-end systems are utilized, just probably not at 1080p. Should I be happier, on your view, if I couldn't max out this game? If so, that seems counterintuitive to me. Who buys expensive hardware today hoping they'll only be able to max out a game 3 years from now?
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
How is having the option for the game to look even better a bad thing? Can the game look better? Based on the tech demos, it can. So why not give the option? It's not my problem if you get upset that you can't "max" the game on your rig; nobody would force you not to run the game at, say, DX9 "extreme". The game would look as it does now (and it's fine for you), but people with better hardware, or playing at lower resolutions, could run the game at DX11 "ultra extreme" or something...

I don't understand how people can approve of Crytek not making the game look as good as it could have. Nobody is forcing anyone to max the game, so why not offer higher settings if Cryengine 3 allows it? Your "extreme" mode, what you get right now, would still look the same...
 
Last edited:

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
It's 'bad for them' if it takes an inordinate amount of time, and therefore money, to develop options that aren't usable today by the vast, vast majority of gamers (enthusiasts included). Let's say they had put in some 'omg' mode, but maintaining 30 fps at 1080p required 6990s in CrossFire or 590s in SLI.

What's the incentive for Crytek to invest in options that almost nobody in their market could use? To use your own language: "Not my problem that you're upset there aren't better graphics options available." I'm not the one who is upset; I enjoyed the game. It's people like you (the angry ones) who are rampaging across forums complaining that there aren't better graphics options for their 5850 at 1080p. I mean... seriously.