Crysis 3: Xbox 360 vs PC compared side by side

Page 4 - AnandTech community forums

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
If they were just like "yeah, we're focusing on a great campaign experience," then I would have reserved judgment until the campaign/reviews. The only reason I make a point of it is because they kept going on and on about how Crysis 3 was "going to melt your PC," and they failed to deliver IMO. They really haven't improved the graphics over Crysis 1 (over 5 years later, mind you); they just changed the art direction/style a bit. That's not impressive, especially from a company that seems to think it pushes graphics boundaries.

You can tell this from a beta? Where MP is intentionally downgraded for gameplay performance?
 

Greenlepricon

Senior member
Aug 1, 2012
468
0
0
Which is it? You seem to flip-flop from C3 not melting cards to the performance not being good enough, ignoring that it's a beta. Or a beta-beta, as an AMD rep wrote. I can't wait to read your opinion on Tomb Raider 2013 and Bioshock.

Maybe you are spoiled/affected from watching Bitcoin #'s scroll across your screen and/or the heat in the room :)

He's saying that it's not revolutionary for the amount of power it takes to run it. Crysis 1 took a ton, but you could tell it definitely used some resources. Crysis 3 looks fine but it's not next generation, and it doesn't look like it should need two 7970's to run it max with playable frames. I think the problem is that it does melt cards, but it's doing that for the sake of melting them and not with the intention of sending out jaw-dropping graphics.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I really think sometimes the problem is for what the current cards are doing, they are a bit underpowered. So when you start using lots of tessellation and special effects, you have a card that gets low FPS and you almost need two to keep fps up.

I also think there are some effects and details that you aren't even really able to see. Stuff is done in the fine details, but upon playing, you don't notice. Maybe if you zoomed way in on a specific spot. Maybe we have just gotten to the point where developers need to find new ways to use the resources available.

I don't know...just tossing out random thoughts.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Like someone mentioned earlier, I just want a better game. No more of this console-gameplay-kind-of-chasing-call-of-duty BS that Crysis 2 was. Granted it wasn't a bad game, but the first game was just so much more compelling.

As for the graphics, yeah, they are nice. However, I'm not a huge proponent of having PC titles needlessly crap all over top-end graphics cards due to super-high-res textures that require a 2560 x 1600 monitor to see in the first place. I just really like having a high frame rate and resolution, and clean graphics with minimal pop/fade-in that effectively visualize the space I'm playing in.
 

0___________0

Senior member
May 5, 2012
284
0
0
Then it looks like I hit the nail on the head about you being too young to remember, especially with your above outburst. Everything from reviews to user experiences to screenshots, as I and others have posted, clearly proves the opposite of what you're trying to sell.

Speaking of deflection, you can't answer my question. Calling me out when you've previously posted stuff that isn't true.

How about we see what the release game actually delivers? Although, judging by how you've already declared the game to have "failed", I'm guessing nothing will satisfy you.

If I wanted to read childish remarks I'd scroll down to P&N.
 

Spjut

Senior member
Apr 9, 2011
931
160
106
The alpha version worked relatively well on my 5750, even without using the lowest possible settings.

I think something slower, like an 8800GT, could run this at 720p over 30 fps (if the game supported DX10 or DX9).

DX10 users get a message saying the game requires DX11
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Which is it? You seem to flip-flop from C3 not melting cards to the performance not being good enough, ignoring that it's a beta. Or a beta-beta, as an AMD rep wrote. I can't wait to read your opinion on Tomb Raider 2013 and Bioshock.

Maybe you are spoiled/affected from watching Bitcoin #'s scroll across your screen and/or the heat in the room :)
I'm not sure how any reasonable and intelligent person could think "not melting cards" and "performance is not good enough" are the same problem after Russian's detailed explanation. Those two are clearly different. On one side we were promised visual fidelity that would "melt your PC," which hasn't happened. As numerous people have mentioned, the graphics aren't anything jaw dropping and they actually run fine on modern hardware. Secondly, for what is being displayed, the performance is poor, especially with some aspects that shouldn't be a problem in this day and age (like AA implementation that drastically cuts performance or greatly decreases IQ ala FXAA). The overall product fails to deliver on what was promised and comes out as sloppy. You can play the "this is beta" nonsense all you want, but the game has already gone into production and packaging and is being shipped. Some of you clearly live in la-la land if you think they're hiding something for release.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I'm not sure how any reasonable and intelligent person could think "not melting cards" and "performance is not good enough" are the same problem after Russian's detailed explanation. Those two are clearly different. On one side we were promised visual fidelity that would "melt your PC," which hasn't happened. As numerous people have mentioned, the graphics aren't anything jaw dropping and they actually run fine on modern hardware. Secondly, for what is being displayed, the performance is poor, especially with some aspects that shouldn't be a problem in this day and age (like AA implementation that drastically cuts performance or greatly decreases IQ ala FXAA). The overall product fails to deliver on what was promised and comes out as sloppy. You can play the "this is beta" nonsense all you want, but the game has already gone into production and packaging and is being shipped. Some of you clearly live in la-la land if you think they're hiding something for release.

You seem to think that the beta is recently produced and isn't using months-old code.

I'd put money on that NOT being the case at all.

So yeah, you should expect people to be called out for bias against Crytek.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Which is it? You seem to flip-flop from C3 not melting cards to the performance not being good enough, ignoring that it's a beta.

How am I flip-flopping? I said repeatedly that I am talking about the context of C1's contribution to PC gaming graphics and how it compared to the games of its time, versus how C3 today compares against existing good-looking PC games.

It seems to me that the most punishing setting in C3 at VHQ is MSAA/SMAA. I posted several screenshots of C3 with everything on VHQ, excluding MSAA/SMAA. The game looks like a slightly enhanced C2. This wasn't the case with C1. There was no game that even came close, even if you ran C1 without MSAA. Even at just 1280x1024, no AA, and High settings, C1 looked better than any game in 2007. And even if you crank the settings to the highest quality, C3 is nowhere near as revolutionary graphically as C1 was for its time.

I am not ignoring the beta; I specifically pointed out already that my view is coming from the beta, and if the final game brings better graphics in the SP campaign, I will change my view. If Crytek hadn't hyped up C3's graphics for a year, I wouldn't even care if C3 was only a slightly improved C2 with high-res textures and 8 AA modes. Neither the developers of BioShock Infinite nor Tomb Raider have said anything about those games "melting PCs," being "the best looking games on the PC for at least 2 years," or "revolutionizing PC graphics." Therefore, I have no grand expectations from those games on the graphical side. If they look great, good; but if they don't, it's not as if I expected a graphical wow factor that was never promised.

And no, bitcoins haven't affected my viewpoint. If anything, I have more $ to upgrade to faster GPUs, but I won't waste it on some 4xMSAA setting. I want actual next-generation graphics; even if that means 30 fps on an HD8970 in CFX, I am OK with that. I want the next Crysis 1. Crysis 3 doesn't look like a major leap in PC graphics and yet can barely hit 40-45 fps on an HD7970GE at 1080p with 4xMSAA/SMAA. In other words, it doesn't appear to deliver on the WOW factor of C1, and yet it's not optimized well either, taking a 20-30 fps hit with 4xMSAA/SMAA. Considering there is minimal difference in IQ going from Low to High, and it's almost indistinguishable going from High to VHQ, what is bringing the performance down so much? It sure isn't Unreal Engine 4-level graphics.

He's saying that it's not revolutionary for the amount of power it takes to run it. Crysis 1 took a ton, but you could tell it definitely used some resources. Crysis 3 looks fine but it's not next generation, and it doesn't look like it should need two 7970's to run it max with playable frames. I think the problem is that it does melt cards, but it's doing that for the sake of melting them and not with the intention of sending out jaw-dropping graphics.

Bingo. If you can hit 50-60 fps at 1080p with 0AA on a GTX680 OC/HD7970 OC, Crytek isn't pushing the limits of PC gaming. A 20-30 fps performance hit from MSAA in a deferred lighting engine does not qualify as "melting PCs due to next generation graphics." That only proves how inefficient deferred lighting engines are with MSAA/SMAA filters. We already knew that. Delivering on the promises would have been 30 fps on a GTX690 at 1080p with jaw-dropping graphics like the UE4 or Square Enix demos. That would have been going back to Crysis 1's roots. If Crytek hadn't made any promises of delivering a next-generation PC game with revolutionary visual fidelity, I wouldn't have said a word.
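For context on why MSAA is so punishing in a deferred renderer: every G-buffer render target has to store one value per sample, so memory and bandwidth scale roughly with the sample count. A back-of-the-envelope sketch (the target count and bytes-per-pixel figures here are hypothetical, not CryENGINE's actual layout):

```python
# Rough, illustrative estimate of deferred-rendering MSAA cost.
# Each G-buffer target stores one value per sample, so storage and
# bandwidth grow linearly with the MSAA sample count.

def gbuffer_mb(width, height, targets, bytes_per_pixel_per_target, msaa_samples):
    """Approximate G-buffer size in MiB for a deferred renderer."""
    pixels = width * height
    total_bytes = pixels * targets * bytes_per_pixel_per_target * msaa_samples
    return total_bytes / (1024 * 1024)

# Hypothetical 1080p G-buffer: 4 render targets at 8 bytes per pixel each
no_aa = gbuffer_mb(1920, 1080, 4, 8, 1)
msaa4 = gbuffer_mb(1920, 1080, 4, 8, 4)
print(f"no AA : {no_aa:.0f} MiB")   # ~63 MiB
print(f"4xMSAA: {msaa4:.0f} MiB")   # ~253 MiB -- 4x the storage and bandwidth
```

That 4x multiplier on G-buffer traffic, before the lighting pass even runs, is consistent with the large fps hit being discussed here without the scene itself getting any prettier.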
 
Last edited:
Aug 11, 2008
10,451
642
126
DX10 users get a message saying the game requires DX11

Anyone surprised that it requires DX11? Seems like they are limiting their user base considerably, and I don't know if DX11 is that much better. I have been playing Metro 2033 and, to be honest, I can tell very little difference between DX10 and DX11, except the framerate goes to hell with DX11, when I thought it was supposed to be more efficient.
 

Haserath

Senior member
Sep 12, 2010
793
1
81
DX11 introduces tessellation.

Tessellation isn't even in the beta, and they showed off their secret tessellated toad tech in the demo.

Not to mention tessellation on foliage and practically anything else that can use it.

That should add a bit to it.
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
Bingo. If you can hit 50-60 fps at 1080p with 0AA on a GTX680 OC/HD7970 OC, Crytek isn't pushing the limits of PC gaming. A 20-30 fps performance hit from MSAA in a deferred lighting engine does not qualify as "melting PCs due to next generation graphics." That only proves how inefficient deferred lighting engines are with MSAA/SMAA filters. We already knew that. Delivering on the promises would have been 30 fps on a GTX690 at 1080p with jaw-dropping graphics like the UE4 or Square Enix demos. That would have been going back to Crysis 1's roots. If Crytek hadn't made any promises of delivering a next-generation PC game with revolutionary visual fidelity, I wouldn't have said a word.

Thank you! I want the game to look amazing and be punishing for reasons other than AA, which may not even be noticeable during actual gameplay. Taxing 2x680GTX's should be able to provide real next gen graphics, nothing an 8 year old console can even think about running.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Thank you! I want the game to look amazing and be punishing for reasons other than AA, which may not even be noticeable during actual gameplay. Taxing 2x680GTX's should be able to provide real next gen graphics, nothing an 8 year old console can even think about running.

We have a winner !!!!
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Anyone surprised that it requires DX11? Seems like they are limiting their user base considerably, and I dont know if DX11 is that much better. I have been playing Metro 2033 and to be honest, I can tell very little difference between DX10 and DX11 except the framerate goes to hell with DX11, when I thought it was supposed to be more efficient.

I say, good for Crytek. How old is DX9 again? The first DX9 GPUs were released in 2002.

11 years ago.

DX9: 11 years ago.

Good on Crytek. I can't believe it has taken THIS LONG for developers to stop clinging to compatibility.
 

Spjut

Senior member
Apr 9, 2011
931
160
106
I say, good for Crytek. How old is DX9 again? The first DX9 GPUs were released in 2002.

11 years ago.

DX9: 11 years ago.

Good on Crytek. I can't believe it has taken THIS LONG for developers to stop clinging to compatibility.

He mentioned DX10, not DX9... And the DX11 API can target DX10/10.1 hardware as well, as BF3, Civ5, Black Ops 2, Hitman Absolution, and Assassin's Creed 3 do, for example.

Can you imagine how long DX11 will stick around if the PS4/720 are based around DX11 GPUs? 6-7 more years. :oops:

That's what makes me most disappointed with all that we've heard about them.
I don't think the consoles are wholly to blame, though; The Witcher 2, for example, was DX9-only despite being a PC exclusive for one year, and it officially required an 8800GT/HD3850 to run.
I'd bet the next-gen consoles are the primary reason we see more PC games ditching DX9 in favor of DX11 now, letting the studios get more experience with the latest architectures.

DX11/11.1 games 5 years from now will probably look beautiful, but if DX12 performs better, I'd want it to get support quickly.
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
I have more $ to upgrade to faster GPUs but I won't waste it on some 4xMSAA setting. I want actual next generation graphics, even if means 30 fps on an HD8970 in CFX, I am OK with that. I want the next Crysis 1. Crysis 3 doesn't look like a major leap in PC graphics...

+1 -- in other words, it's looking like Crysis 2 all over again. However...

...and yet can barely hit 40-45 fps on an HD7970GE at 1080P with 4xMSAA/SMAA. In other words, it doesn't appear to deliver on the WOW factor of C1, and yet it's not optimized well either, taking a 20-30 fps hit with 4xMSAA/SMAA.

I imagine the optimisation will improve once we move from Beta to full release (though the original Crysis was pretty poorly optimised even after the patches).

Still, what really bugs me is the graphics. Crysis 1 was sooooooooooo good. Granted, I have to play it in DX9 32-bit mode in order to get a reliable 120fps on a 7970 at 2560x1440, but even then it looks orders of magnitude better than most games that have come out since 2009. Crytek can do better
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
Yes, the PC version looks quite a bit better, but nowhere near enough to warrant the huge difference in price. An Xbox 360 looks like incredible value compared to two 7970's. Lazy developers and a complete lack of ambition mean we get something looking quite a bit better than the Xbox 360 version, eight years later, for eight times the price.
Pretty sure I won't be buying Crysis 3 if it looks like that.:thumbsdown:
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
DX11 introduces tessellation.

Tessellation isn't even in the beta, and they showed off their secret tessellated toad tech in the demo.

Not to mention tessellation on foliage and practically anything else that can use it.

That should add a bit to it.

Is there a source somewhere saying there is no tessellation in the beta? This can be easily checked; there are tools that break down what is being rendered and show whether tessellation is active. Alternatively, someone with an AMD card could check by controlling tessellation via the control panel and seeing if there is a performance impact.

I haven't seen anything about there being no tessellation in the beta. Where did you see Crytek saying that?
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Yes, the PC version looks quite a bit better, but nowhere near enough to warrant the huge difference in price. An Xbox 360 looks like incredible value compared to two 7970's. Lazy developers and a complete lack of ambition mean we get something looking quite a bit better than the Xbox 360 version, eight years later, for eight times the price.
Pretty sure I won't be buying Crysis 3 if it looks like that.:thumbsdown:

But it means not having to look through the piss vision that is probably sub-720p slathered with FXAA.

I'm often perfectly happy with console level geometry and textures as long as I can get 1080p, 60 FPS, better AF and hopefully better LOD.

Just doubling the render resolution from 720p to 1080p requires roughly double the total graphics horsepower (1920x1080 is actually 2.25x the pixels of 1280x720). Expecting an average of 60 FPS instead of an average of 30 FPS usually doubles the required graphics performance again. So meeting these requirements while sticking to Xbox 360-level models and textures means needing something around GeForce GTS 450 or Radeon 5750 level. Beyond this, higher-quality textures, AF, AA modes, LOD, shaders, shadows, and geometry put constant increases in load on the GPU at ever-diminishing returns. Yes, I know this isn't the complete rundown of how it all works, but it's an idea to get your minds thinking about it.
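The scaling argument above can be sanity-checked with simple arithmetic: to a first approximation, GPU load tracks pixels drawn per second (resolution times framerate). The ratios below are idealized; real games rarely scale perfectly linearly.

```python
# Back-of-the-envelope check of the resolution/framerate scaling argument.
# GPU load is approximated as pixels drawn per second; real scaling is
# rarely this clean, but it gives the right order of magnitude.

def pixels_per_second(width, height, fps):
    return width * height * fps

console = pixels_per_second(1280, 720, 30)   # typical 720p/30 console target
pc      = pixels_per_second(1920, 1080, 60)  # 1080p/60 PC target

print(pc / console)  # 4.5 -> 2.25x the pixels times 2x the framerate
```

So even with identical assets, a 1080p/60 target asks for roughly 4.5x the fill work of a 720p/30 console target, which is the gap the GTS 450 / HD 5750 estimate is gesturing at.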

/Not surprised at console optimization versus high end PC hardware.

Console developers, especially at just 720p, have been able to figure out the tricks of getting the most out of fixed hardware. It should be noted, though, that still screenshots and low-resolution or compressed video hide a lot of what we'd otherwise see; only watching a console game in action reveals things like the tremendous amounts of geometry pop-in and LOD management that are nowhere near as prevalent in the PC versions of many titles.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Is there a source somewhere saying there is no tessellation in the beta? This can be easily checked; there are tools that break down what is being rendered and show whether tessellation is active. Alternatively, someone with an AMD card could check by controlling tessellation via the control panel and seeing if there is a performance impact.

I haven't seen anything about there being no tessellation in the beta. Where did you see Crytek saying that?

I think there's no tessellation in the multiplayer portion; I remember reading that somewhere but can't find it. I do believe SP will have tessellation.
 

0___________0

Senior member
May 5, 2012
284
0
0
I think there's no tessellation in the multiplayer portion; I remember reading that somewhere but can't find it. I do believe SP will have tessellation.

This is partially correct; this is how it was in Crysis 2 as well. They tone down and strip out some stuff from the single player to produce the MP maps.
 
Last edited:

Haserath

Senior member
Sep 12, 2010
793
1
81
Is there a source somewhere saying there is no tessellation in the beta? This can be easily checked; there are tools that break down what is being rendered and show whether tessellation is active. Alternatively, someone with an AMD card could check by controlling tessellation via the control panel and seeing if there is a performance impact.

I haven't seen anything about there being no tessellation in the beta. Where did you see Crytek saying that?

I've had the control panel optimization on and off with no difference.

There also isn't a setting for it. I guess that might make sense to keep the geometry the same for everybody.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
I've had the control panel optimization on and off with no difference.

There also isn't a setting for it. I guess that might make sense to keep the geometry the same for everybody.

Crysis 2 was the same though, there is no setting for tessellation, it's just on.

I haven't seen anything anywhere saying tessellation is disabled in multiplayer. It works in Crysis 2 multiplayer. I'm just curious where this idea that there is no tessellation is coming from.
 

Haserath

Senior member
Sep 12, 2010
793
1
81
Crysis 2 was the same though, there is no setting for tessellation, it's just on.

I haven't seen anything anywhere saying tessellation is disabled in multiplayer. It works in Crysis 2 multiplayer. Just curious where this idea that there is no tessellation on is coming from.

Then there isn't much anyway.