2 years on, an overclocked 980 can't get to a 60 fps average in Crysis 3


WhoBeDaPlaya

Diamond Member
Sep 15, 2000
7,415
404
126
That's why I'm sticking with 1920x1080/1200 - can't afford the GPU bill otherwise :eek:

That, and I don't particularly need anything physically larger than 24". To get a similar dot pitch on higher-resolution monitors, I would need (rough math below):
1) 2560x1440 - 30-32"
2) 3840x2160 - 45-47"
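A minimal sketch of that dot-pitch math, for anyone who wants to check it (the 24"/1080p baseline works out to roughly 92 PPI; a straight match at 4K actually lands nearer 48"):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

target = ppi(1920, 1080, 24)  # ~91.8 PPI on a 24" 1080p panel

# Solve for the diagonal that gives the same pitch at higher resolutions.
for w, h in [(2560, 1440), (3840, 2160)]:
    print(f'{w}x{h}: ~{math.hypot(w, h) / target:.0f}"')
# 2560x1440: ~32"
# 3840x2160: ~48"
```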
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Can it even? The 4790K drops as low as 33 fps at stock 4 GHz, so that means you'd need a CPU nearly twice as fast as the 4790K (60/33 ≈ 1.8x)?

If you are referring to the last area of the game, even my 920 at 3.8 GHz can do much better than that. You just have to reload the game every few minutes to deal with the massive memory leak.

That or you are GPU limited.
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
So Crysis 3 can get 60 FPS at 1080p. Just not at the settings they chose in the review...


A GTX 670 can also get 60 fps on Crysis at 1080p, if you're willing to go low on everything. I think you haven't quite grasped the issue.


Related to the MSAA discussion: 4x MSAA is quite a common setting for high-performance GPU benchmarks. It's neither shocking nor strange, whatever opinion anyone may hold of it.

The bigger story is that in the 2 years since the release of Crysis 3 we haven't raised the bar in available graphics horsepower.


Bingo! We have a winner!
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Blame AMD for not releasing anything compelling enough to make Nvidia release GM200. They've been sitting on the Titan II/GTX 980 Ti for a while now; it taped out a month after GM204, so provided 28nm yields are fine, they could've launched a chip 50% faster than the 780 Ti by October. Forget 60 fps at 1080p, we could be getting single-card-viable 4K performance right now if AMD were considerably more competitive against Nvidia, instead of just cutting into their margins with price cuts.

Same reason Intel is crawling along at an ant's pace, improving performance 2-3% a year. AMD is not making Intel push themselves to maintain their position.

Yeah, that's kind of where we don't need to go. The GPU space has been slowing down in the last two years, but not nearly to the same degree as the CPUs, where there is essentially a total standstill right now.

I would imagine 50%+ of that 1.5% that have 1440p+ are sporting 27" iMacs.

I had a 2011 27"/1440p iMac with a 6770M / 512MB GPU, and I can tell you that screen looked better running most games with lower detail settings than the majority of 1080p PC monitors with full details. Some of that is quality of the screen, some is the resolution. You don't see that until you actually crank up a game on the iMac right next to a PC (which I had the ability to do).

Crysis isn't the only game out there, and many look spectacular at 1440p. There is also the point many posters have already made that 4xMSAA doesn't add much for the extra HP required.

If I didn't play a lot of fast-paced FPS games, I'd have gone for 1440p a long time ago. The ROG Swift looked interesting, but people have been having so many hardware problems with it. I'll be taking a second look at the FreeSync ones next year. I've got a 290 CrossFire setup now, so I figure I might get 100+ fps at 1440p in the more demanding titles.
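A rough sanity check of that 100+ fps figure; the single-card 1080p number below is a hypothetical input, and both scaling factors are rules of thumb rather than measurements:

```python
# Estimate 1440p CrossFire throughput from a 1080p single-card number.
fps_1080p_single = 100                        # hypothetical single-290 result
pixel_ratio = (2560 * 1440) / (1920 * 1080)   # ~1.78x the pixels
crossfire_scaling = 1.8                       # optimistic dual-card scaling

fps_1440p = fps_1080p_single / pixel_ratio * crossfire_scaling
print(f"~{fps_1440p:.0f} fps at 1440p")       # ~101 fps
```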


I don't think anyone was able to reach 60 fps on Crysis 1 until some 5 years after release (8800 GTX > GTX 580). So 2 years into C3 is not really unique.


Sure, but as I pointed out in the opening post, C1 was such a huge leap compared to everything else. C3 is beautiful, but it just isn't in the same league as C1 was, which even Crytek itself acknowledges (or at least one of their GPU gurus does).

Secondly, the rate of progress in the GPU space during that time was pretty nuts, as was CPU progress. That probably, if inadvertently, had an impact on the level of graphical fidelity developers are now willing to go for, since you don't get 40-60% gains per year anymore. Just think of how huge a leap the 8800 GTX was in its day compared to what the 980 is now.
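To put those percentages in perspective, here is what they compound to; the rates are illustrative figures taken from this thread, not benchmark data:

```python
# Compounding annual gains: ~50%/yr GPU growth of the 8800 GTX era versus
# the ~2-3%/yr CPU gains mentioned above (illustrative rates only).
years = 5
print(f"GPU at 50%/yr over {years} years: {1.50 ** years:.1f}x")  # ~7.6x
print(f"CPU at 3%/yr over {years} years:  {1.03 ** years:.2f}x")  # ~1.16x
```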
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
No, there are worse CPU areas than the last map, and I never had any problems with memory leaks there either.

There is only one other spot I can recall that hit the CPU hard, but it was only for a few seconds; once you moved out of the camp to actually play, the FPS went back to normal. It was in some bay with an island in the middle: the FPS dropped when tons of explosions were going off at the start, and once you moved, everything was normal.
 

SPBHM

Diamond Member
Sep 12, 2012
5,076
440
126
I wonder if Crysis 1 can hold a full 60 FPS at 1080p ultra (including 8x MSAA).
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
People really need to specify what EXACT settings they're playing at. Otherwise, comparisons (and even generic discussion) are near-useless.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Your recollection of the game sounds vague, and I find this to be the most common reason why people think that Crysis performs much better than it actually does. Either that, or they didn't actually have the game at Very High.
I know the map, I just don't know its name. I've played through it many times over the years. And I did not play with everything at very high. That is just silly, and isn't what was being discussed. We are talking about being CPU bound, not GPU bound. Those settings affect GPU performance, not the CPU. Though most recently I was able to play with everything at very high, with AA at 2x MSAA. Patching the game is required, too.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
People really need to specify what EXACT settings they're playing at. Otherwise, comparisons (and even generic discussion) are near-useless.

I thought we were talking about being CPU bound, most recently. Settings have little to no effect on CPU usage.

And as said before, it is not like the game isn't playable at these FPS. It only isn't if you can't handle lowering a few settings.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
On that grass level (Welcome to the Jungle?) I could easily feel the difference between an i5 and a 3930K: way smoother, fewer dips. A pretty tech demo, yes, but it does utilise whatever you chuck at it.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
From this statement alone, it's clear that you don't really know what you are talking about. Of course Med vs High vs Very High etc. affects the CPU load. Where do you think things like physics are being calculated - on the GPU?

And of course if we are talking about what system is required to conquer Crysis at 60fps, we are talking about Very High.

Because medium vs. high vs. very high doesn't affect physics; it affects graphical fidelity. Have you ever noticed that people who are extremely CPU bottlenecked see zero change when changing their graphics settings? That is standard fare, with very few exceptions. If the CPU were affected by graphical settings, then lowering settings while CPU bound would improve performance.

Clearly you are the one with no idea. That said, there are exceptions, but they are extremely rare.

EDIT: BTW, do you mean Crysis 3? Crysis 3 could be one of those exceptions, but you wrote Crysis, not Crysis 3.
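The test described above boils down to a simple comparison; here is a sketch with placeholder fps numbers (the 5% tolerance is an arbitrary threshold, not a standard):

```python
# Run the same scene at two quality presets and compare average fps.
def likely_bottleneck(fps_high_preset, fps_low_preset, tolerance=0.05):
    """If fps barely moves when settings drop, the CPU is the limiter."""
    if fps_low_preset <= fps_high_preset * (1 + tolerance):
        return "CPU-bound: settings change almost nothing"
    return "GPU-bound: lower settings raise fps"

print(likely_bottleneck(33, 34))  # CPU-bound
print(likely_bottleneck(33, 52))  # GPU-bound
```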
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Well I guess the dedicated drop down menu labelled PHYSICS QUALITY is just for show then. Do yourself a favor and try actually opening the game.

If you select a purely CPU bottlenecked scenario in Crysis and change the master setting from High to Very High, you will see a difference in framerate. The in game detail settings affect CPU load and that's a fact.

Sure, there are some games like BF3 where the in-game detail settings may have minimal effect on CPU load, but this is not one of them.

You could be right about Crysis 1, but just because the game has a physics quality setting does not mean it affects what is going on with the CPU. In many games those physics settings are offloaded to the GPU, though at that time it may have been different. That said, I'm pretty sure I had been playing the game with that setting at very high; I only turned down AA and motion blur, if we are talking about the "5 years after the game was released" quote. At the time of release I'm not certain at all, as I had a mix of high/very high settings. I'm not sure if that was one of them, but again, you quoted me on 5 years after release.
 

Omar F1

Senior member
Sep 29, 2009
491
8
76
Exaggeration aside, I'm inclined to agree. The original Crysis was always falsely accused of being nothing more than a 'tech demo', but that accusation would not be unjustified if directed at Crysis 3. I find Crysis 3 to have no replay value whatsoever. I only ever open it to test hardware and take screenshots.

Well, here is a contrary opinion: in my book, Crysis is simply one of the best FPS games ever made for PC.
I've even replayed it three times, and that rarely happens.
Graphics aside, the game engine felt so dynamic and fluid, and the story was decent enough.

I just like the series, and hope that Crytek survives :)
 

james1701

Golden Member
Sep 14, 2007
1,791
34
91
Well, here is a contrary opinion: in my book, Crysis is simply one of the best FPS games ever made for PC.
I've even replayed it three times, and that rarely happens.
Graphics aside, the game engine felt so dynamic and fluid, and the story was decent enough.

I just like the series, and hope that Crytek survives :)

+1

Between this and the first Bioshock, I don't think I have ever replayed any games as much. At last count I had replayed it 14 times.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Segue into this thread: an individual on Reddit told me about an hour ago that he ran Crysis 1 on a Celeron D with 512MB of RAM. :p
 

Whitestar127

Senior member
Dec 2, 2011
397
24
81
We're setting the bar too low by treating 60 fps as the gold standard. I have a 120 Hz monitor. Give me 85+ fps.
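For reference, the frame-time arithmetic behind that: on a 120 Hz panel every refresh is 8.3 ms, so at 60 fps half the refreshes repeat the previous frame.

```python
# Milliseconds per frame at each target (simple arithmetic).
for fps in (60, 85, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps -> 16.7 ms per frame
# 85 fps -> 11.8 ms per frame
# 120 fps -> 8.3 ms per frame
```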