Crysis 2 Tessellation Article


apoppin

Lifer
Mar 9, 2000
So, who do you think really benefits from this patch?
All of us PC gamers. Next year, Crysis 2 DX11 will probably run great on upper-midrange AMD HD 7000 and Nvidia GTX 600 cards. Probably work great in S3D also.

And the CryTek boys did say that they weren't done patching Crysis 2; perhaps they will consider optimizing their tessellation in a future patch. There will probably be an expansion and a Crysis 3.
:\
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
All of us PC gamers. Next year, Crysis 2 DX11 will probably run great on upper-midrange AMD HD 7000 and Nvidia GTX 600 cards. Probably work great in S3D also.

And the CryTek boys did say that they weren't done patching Crysis 2; perhaps they will consider optimizing their tessellation in a future patch. There will probably be an expansion and a Crysis 3.
:\

No, not who will someday benefit from this patch, or who will likely benefit. Who is actually benefiting from it right now? It's nVidia. You could stretch it to include the extremely small percentage of gamers who own the hardware to run it, but even they are using their high-end, minimal-market-share gear very inefficiently.
 

996GT2

Diamond Member
Jun 23, 2005
Who said anything about mid-range cards? There are three current cards that give you playable framerates (~40 fps) @ 1080p: the 590, 580, and 6990. What percentage of gamers use those cards? That's why I said useless for almost everyone.

I don't know where you're getting that from, but my GTX 470 overclocked to 800/1800 (stock=607/1674) actually plays Crysis 2 DX11 perfectly fine at ultra settings with the high-res texture pack enabled. I'm getting around 40-50 FPS in FRAPS at 1920x1080.

Anything around a GTX 470 or above should be fine for Crysis 2 DX11/Ultra settings @ 1080p. The following numbers were taken at 1200p, which is a bit more demanding than 1080p. Also, the cards in the bench were all running @ stock clocks, so a slight overclock will also help performance.

[attached image: crysis2bench.jpg — Crysis 2 DX11 benchmark chart]
 

apoppin

Lifer
Mar 9, 2000
Who is actually benefiting the most from the DX11 patch? Those with the highest-end cards: GTX 590, HD 6990, GTX 580; probably in that order. And I would also say that the HD 6970/6950 run it pretty well at 1080p, along with the GTX 560 Ti, as long as you don't want absolutely everything. As I recall, my GTX 560 Ti ran DX11 pretty well.

That is the way it usually goes, and always has: with Far Cry, with Crysis, and now with Crysis 2. It will probably repeat in the future with Crysis 3. The highest-end cards usually get the most fluid framerate at the most demanding settings and resolutions.

I still think we have a further Crysis 2 patch due from CryTek, and plenty more driver optimizations, before the final word is said on this game and DX11.
 

apoppin

Lifer
Mar 9, 2000
Those are old drivers :p
- pre-patch for AMD, if I am not mistaken - a terribly unfair comparison

11.6 and 275 .... Really old (in HW/dog years)

We are on 11.8 and 280 now; 280 took S3D from fair to excellent for 3D Vision.
 

3DVagabond

Lifer
Aug 10, 2009
Those are old drivers :p
- pre-patch for AMD, if I am not mistaken - a terribly unfair comparison

11.6 and 275 .... Really old (in HW/dog years)

We are on 11.8 and 280 now; 280 took S3D from fair to excellent for 3D Vision.

So, going by 996GT2's chart, we can add one more card to the mix: the 570. We could also add some dual-GPU cards and SLI/CrossFire setups, but those are pretty niche. I also consider the 590 and 6990 niche, but they are the top of the line for each supplier, so I would be more inclined to include them. All told, these cards don't really give us a very large user base, though. We still see a huge advantage for nVidia pretty much across the board.

There's not much I can do if you guys just keep ignoring the obvious... that it gives nVidia a big advantage in benching even though most nVidia users still can't play the patch.
 

Lonbjerg

Diamond Member
Dec 6, 2009
So, going by 996GT2's chart, we can add one more card to the mix: the 570. We could also add some dual-GPU cards and SLI/CrossFire setups, but those are pretty niche. I also consider the 590 and 6990 niche, but they are the top of the line for each supplier, so I would be more inclined to include them. All told, these cards don't really give us a very large user base, though. We still see a huge advantage for nVidia pretty much across the board.

There's not much I can do if you guys just keep ignoring the obvious... that it gives nVidia a big advantage in benching even though most nVidia users still can't play the patch.

Give up.
You have no source code to present, only still images that mean squat, and no word from AMD that there is something afoul... and yet you talk like it is a proven fact that NVIDIA did something evil?

Look...it's Elvis -------------->>

:rolleyes:
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Give up.
You have no source code to present, only still images that mean squat, and no word from AMD that there is something afoul... and yet you talk like it is a proven fact that NVIDIA did something evil?

Look...it's Elvis -------------->>

:rolleyes:

Well, there's a logical retort. And now we need source code? I'll remember that next time you think someone is wrong or biased. :rolleyes:
 

996GT2

Diamond Member
Jun 23, 2005
So, going by 996GT2's chart, we can add one more card to the mix: the 570.

Not really.

On the AMD side, HD6990, 6970, 6950, 5870, or lesser cards in Crossfire can all play Crysis 2 DX11/Ultra fine at 1920x1080. The borderline cards like the 6950 may require a slight OC, but nothing extreme.

On the NVidia side, the GTX 590, 580, 570, 560 Ti, and 470 will all be fine as well, or lesser cards such as the GTX 460 in SLI. The 560 Ti and 470 are borderline at stock speeds, but a slight (~10-15%) overclock will bring them into the ~40 FPS range at 1080p.

It makes a lot of sense if you think about it. Crysis 2 DX11/ultra is very demanding, and so you need at least a high-midrange card like a GTX 560 Ti or GTX 470 to play at those settings.

But then again, a GTX 470 @ $150 or a GTX 560 Ti @ $200 aren't really that expensive. I don't understand what you mean when you say "most users can't play at those settings" because a GTX 470 or 560 Ti are well within the reach of most people's budgets.
 

Lonbjerg

Diamond Member
Dec 6, 2009
Well, there's a logical retort. And now we need source code? I'll remember that next time you think someone is wrong or biased. :rolleyes:

No, it's a biased opinion, and I look forward to your... well, nothing.

Because this is the amount of "evidence" in this thread.

Now, if you insist on presenting it as a valid fact, you need to get confirmation from CryTek.

Or hell... where is Fuddy?!

The fact that he is silent = there is no foul play.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Not really.

On the AMD side, HD6990, 6970, 6950, 5870, or lesser cards in Crossfire can all play Crysis 2 DX11/Ultra fine at 1920x1080. The borderline cards like the 6950 may require a slight OC, but nothing extreme.

On the NVidia side, the GTX 590, 580, 570, 560 Ti, and 470 will all be fine as well, or lesser cards such as the GTX 460 in SLI. The 560 Ti and 470 are borderline at stock speeds, but a slight (~10-15%) overclock will bring them into the ~40 FPS range at 1080p.

It makes a lot of sense if you think about it. Crysis 2 DX11/ultra is very demanding, and so you need at least a high-midrange card like a GTX 560 Ti or GTX 470 to play at those settings.

But then again, a GTX 470 @ $150 or a GTX 560 Ti @ $200 aren't really that expensive. I don't understand what you mean when you say "most users can't play at those settings" because a GTX 470 or 560 Ti are well within the reach of most people's budgets.

I realize it's a bit of different strokes for different folks, but I was comparing stock cards at a minimum average of 40 fps. Nothing on your chart beneath the 570 achieves that.
 

Jacky60

Golden Member
Jan 3, 2010
Rather annoyingly, I'm now quite tempted to buy the game. No amount of Nvidia/Crytek sabotage is going to hold my 6990s back!
I still strongly suspect foul play, but to be fair, the game does seem to look pretty good on the highest settings, on mid- to high-range hardware, at playable frame rates from both camps. The concrete barriers do look slightly better with tessellation, BUT I still think it's an unnecessary performance impediment for a marginal image quality improvement.
Thanks Sontin for shedding a bit more light on proceedings.
As for people who require a written confession from Crytek/Nvidia before believing the worst, I find that an incredible position for a grown adult to take. You simply have to ask yourself WHO benefits from it being implemented this way. Improving Nvidia's mid-range cards' relative performance in benchmarks is why I still believe it was done.
 

Lonbjerg

Diamond Member
Dec 6, 2009
Rather annoyingly, I'm now quite tempted to buy the game. No amount of Nvidia/Crytek sabotage is going to hold my 6990s back!
I still strongly suspect foul play, but to be fair, the game does seem to look pretty good on the highest settings, on mid- to high-range hardware, at playable frame rates from both camps. The concrete barriers do look slightly better with tessellation, BUT I still think it's an unnecessary performance impediment for a marginal image quality improvement.
Thanks Sontin for shedding a bit more light on proceedings.
As for people who require a written confession from Crytek/Nvidia before believing the worst, I find that an incredible position for a grown adult to take. You simply have to ask yourself WHO benefits from it being implemented this way. Improving Nvidia's mid-range cards' relative performance in benchmarks is why I still believe it was done.

We can't all believe Elvis lives, we didn't land on the moon, Oswald didn't shoot Kennedy, and Nessie is real... not without hard evidence...

Something this thread is sorely lacking... all I see is FUD... to cover up for AMD's poor fixed tessellation engine... again, again, again... zzzZZZZzzzZZZZzzz...
 

WMD

Senior member
Apr 13, 2011
Not really.

On the AMD side, HD6990, 6970, 6950, 5870, or lesser cards in Crossfire can all play Crysis 2 DX11/Ultra fine at 1920x1080. The borderline cards like the 6950 may require a slight OC, but nothing extreme.

On the NVidia side, the GTX 590, 580, 570, 560 Ti, and 470 will all be fine as well, or lesser cards such as the GTX 460 in SLI. The 560 Ti and 470 are borderline at stock speeds, but a slight (~10-15%) overclock will bring them into the ~40 FPS range at 1080p.

It makes a lot of sense if you think about it. Crysis 2 DX11/ultra is very demanding, and so you need at least a high-midrange card like a GTX 560 Ti or GTX 470 to play at those settings.

But then again, a GTX 470 @ $150 or a GTX 560 Ti @ $200 aren't really that expensive. I don't understand what you mean when you say "most users can't play at those settings" because a GTX 470 or 560 Ti are well within the reach of most people's budgets.

http://www.xbitlabs.com/articles/graphics/display/crysis-2-directx11_3.html#sect2

Pretty brutal framerates on all graphics cards.

The GTX 570 averages 41 fps but drops as low as 12 fps. The 6950 can't even hit a 30 fps average and drops to a ridiculous 6 fps. That is not playable in my book.

quote from xbitlabs:
" Now we’ve got a game that suggests that the current generation of performance-mainstream solutions calls for replacement. You can’t get smooth gameplay with them."

It's not all that bad, though: AMD users can still turn off tessellation from the control panel and enjoy the game in Ultra with most other graphical features intact. I personally would rather play in Very High at 70-90 fps instead of Ultra. Yes, I can see some nice protruding bricks if I bump my head against the walls, but playing at 30 fps isn't very fun.
 

Mistwalker

Senior member
Feb 9, 2007
We can't all believe Elvis lives, we didn't land on the moon, Oswald didn't shoot Kennedy, and Nessie is real... not without hard evidence...

Something this thread is sorely lacking... all I see is FUD... to cover up for AMD's poor fixed tessellation engine... again, again, again...
This thread does not exist to meet your burden of proof, despite your numerous reminders that it fails to do so.

Fact 1. Crysis 2 has, in certain cases, blatantly excessive tessellation with little or no regard for optimization or actual visual effect. With the unrendered ocean it is egregious and indefensible.

Fact 2. Nvidia was involved in the patch that ultimately resulted in the above.

Fact 3. Nvidia's cards end up looking better in benchmarks with the release of the patch.

Some would say the third point is obvious--Nvidia's cards were designed to perform better under heavy tessellation!--and it is a win for Nvidia's architecture. No disagreement here.

However, taken all together, you clearly have motive, opportunity, and result. Of course the details are stuck in speculation, but it's not a discussion based on nothing or FUD, and saying so is dismissive and disrespectful.

zzzZZZZzzzZZZZzzz...
I will kindly ask that if you personally think this entire thread is pointless and/or a waste of your time, that you not bother reminding us with sarcastic or derogatory posts stating so.

My personal opinion is that even if Nvidia pressured for massive tessellation everywhere, to play to the strengths of their own cards, it didn't need to be the unoptimized mess that ended up in the patch. It would be silly of Nvidia not to play to their strengths. But whether Crytek didn't know what it was doing, or allowed itself to be pressured into what we all got, it's their name on the box (and the patch). Tessellate away! Just do a better job of it!
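To make "do a better job of it" concrete: below is a rough sketch, purely my own illustration and not CryEngine/Crysis 2 code, of the standard distance-based falloff a DX11 hull shader can apply to its tessellation factors so far-away patches aren't subdivided at full density (nearDist, farDist, and maxFactor are made-up example values):

Code:
#include <algorithm>
#include <cstdio>

// Illustrative only (not CryEngine source). A DX11 hull shader would
// compute a per-patch or per-edge factor like this, so that distant
// geometry gets few triangles and nearby geometry gets many.
float TessFactorForDistance(float distToCamera,
                            float nearDist  = 5.0f,   // full detail inside this
                            float farDist   = 100.0f, // minimal detail past this
                            float maxFactor = 64.0f)  // DX11 hardware maximum
{
    // t runs from 0 at nearDist to 1 at farDist, clamped to [0, 1]
    float t = (distToCamera - nearDist) / (farDist - nearDist);
    t = std::min(std::max(t, 0.0f), 1.0f);
    // Blend from maxFactor (heavy subdivision) down to 1 (none at all)
    return 1.0f + (maxFactor - 1.0f) * (1.0f - t);
}

int main()
{
    const float distances[] = {2.0f, 10.0f, 50.0f, 200.0f};
    for (float d : distances)
        std::printf("distance %6.1f -> tess factor %5.1f\n",
                    d, TessFactorForDistance(d));
    return 0;
}

With a falloff like that, a concrete slab fifty metres down the street never carries anything near the maximum factor, which is all most of us are asking for.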
 

Dribble

Platinum Member
Aug 9, 2005
Witcher 2. Battlefield: Bad Company 2 (and probably Battlefield 3). Deus Ex: Human Revolution.

Now, in some respects Crysis 2 may be visually superior to some of these games. However, each maintains a nice level of IQ at a better performance level -- without pointless tessellation.

You haven't picked a single game there that equals Crysis 2 graphically, which is what I asked for. With DE:HR you've picked a game that a lot of the early DX9 renderers outdo. If you had redone any of those games using the Crysis 2 engine, they'd look prettier (DE:HR would have been amazing).

The only game I own that's as good is Metro 2033, and that, if anything, runs worse than Crysis 2. BF3 might outdo it, but that's not out yet, and not due for a while.

Hence the point still stands: if it is so bad and so inefficient, how come no one else has produced a better renderer?


FPS is pretty steady in the 30s & 40s for me. Doesn't feel jerky.
 

3DVagabond

Lifer
Aug 10, 2009
This thread does not exist to meet your burden of proof, despite your numerous reminders that it fails to do so.

Fact 1. Crysis 2 has, in certain cases, blatantly excessive tessellation with little or no regard for optimization or actual visual effect. With the unrendered ocean it is egregious and indefensible.

Fact 2. Nvidia was involved in the patch that ultimately resulted in the above.

Fact 3. Nvidia's cards end up looking better in benchmarks with the release of the patch.

Some would say the third point is obvious--Nvidia's cards were designed to perform better under heavy tessellation!--and it is a win for Nvidia's architecture. No disagreement here.

However, taken all together, you clearly have motive, opportunity, and result. Of course the details are stuck in speculation, but it's not a discussion based on nothing or FUD, and saying so is dismissive and disrespectful.


I will kindly ask that if you personally think this entire thread is pointless and/or a waste of your time, that you not bother reminding us with sarcastic or derogatory posts stating so.

My personal opinion is that even if Nvidia pressured for massive tessellation everywhere, to play to the strengths of their own cards, it didn't need to be the unoptimized mess that ended up in the patch. It would be silly of Nvidia not to play to their strengths. But whether Crytek didn't know what it was doing, or allowed itself to be pressured into what we all got, it's their name on the box (and the patch). Tessellate away! Just do a better job of it!

I believe Crytek knew exactly what they were doing. They could have done so much more. They could have done it right. I believe they simply sold out.

The rest of your post +1, spot on, :thumbsup:.
 

Jacky60

Golden Member
Jan 3, 2010
This thread does not exist to meet your burden of proof, despite your numerous reminders that it fails to do so.

Fact 1. Crysis 2 has, in certain cases, blatantly excessive tessellation with little or no regard for optimization or actual visual effect. With the unrendered ocean it is egregious and indefensible.

Fact 2. Nvidia was involved in the patch that ultimately resulted in the above.

Fact 3. Nvidia's cards end up looking better in benchmarks with the release of the patch.

Some would say the third point is obvious--Nvidia's cards were designed to perform better under heavy tessellation!--and it is a win for Nvidia's architecture. No disagreement here.

However, taken all together, you clearly have motive, opportunity, and result. Of course the details are stuck in speculation, but it's not a discussion based on nothing or FUD, and saying so is dismissive and disrespectful.


I will kindly ask that if you personally think this entire thread is pointless and/or a waste of your time, that you not bother reminding us with sarcastic or derogatory posts stating so.

My personal opinion is that even if Nvidia pressured for massive tessellation everywhere, to play to the strengths of their own cards, it didn't need to be the unoptimized mess that ended up in the patch. It would be silly of Nvidia not to play to their strengths. But whether Crytek didn't know what it was doing, or allowed itself to be pressured into what we all got, it's their name on the box (and the patch). Tessellate away! Just do a better job of it!

+2:thumbsup:
 

Keysplayr

Elite Member
Jan 16, 2003
Rather annoyingly, I'm now quite tempted to buy the game. No amount of Nvidia/Crytek sabotage is going to hold my 6990s back!
I still strongly suspect foul play, but to be fair, the game does seem to look pretty good on the highest settings, on mid- to high-range hardware, at playable frame rates from both camps. The concrete barriers do look slightly better with tessellation, BUT I still think it's an unnecessary performance impediment for a marginal image quality improvement.
Thanks Sontin for shedding a bit more light on proceedings.
As for people who require a written confession from Crytek/Nvidia before believing the worst, I find that an incredible position for a grown adult to take. You simply have to ask yourself WHO benefits from it being implemented this way. Improving Nvidia's mid-range cards' relative performance in benchmarks is why I still believe it was done.

Ok jacky. This is one for the books and for all to remember. Have it your way then, but the next time a dev does something in a game that takes more advantage of something AMD's own hardware runs better, I'd like to think we will hear from you again. Of course, we'll have to wait for the day when AMD actually does outperform Nvidia given the same parameters, so who knows how long that will be. Until then, I suppose we'll have to deal with hearing whines of foul play and debauchery, and how everyone who doesn't show AMD in a good light is paid off, etc. etc.
Whoever benefits from a feature being implemented, and only if it's Nvidia, must be up to something down and dirty.

I find that an incredible position for a grown adult to take, don't you?
 

3DVagabond

Lifer
Aug 10, 2009
Ok jacky. This is one for the books and for all to remember. Have it your way then, but the next time a dev does something in a game that takes more advantage of something AMD's own hardware runs better, I'd like to think we will hear from you again. Of course, we'll have to wait for the day when AMD actually does outperform Nvidia given the same parameters, so who knows how long that will be. Until then, I suppose we'll have to deal with hearing whines of foul play and debauchery, and how everyone who doesn't show AMD in a good light is paid off, etc. etc.
Whoever benefits from a feature being implemented, and only if it's Nvidia, must be up to something down and dirty.

I find that an incredible position for a grown adult to take, don't you?

Condescend much?

You might try actually reading what people are complaining about. Nobody's complaining about the feature. Nobody's complaining about tessellation. Everyone's complaining about the shoddy implementation.

AMD often beats nVidia given the same parameters; I have no idea why you say they don't. How about perf/mm^2? Perf/watt? There are other areas where AMD typically leads, too: first to use GDDR5, first out with DX11, heck, even first with tessellation. They're even encroaching on areas where nVidia has typically held the lead, like SLI/CrossFire scaling. We'll see how GCN does with compute; it's supposed to be much improved over the VLIW architecture. It looks like they're going to have the lead at 28nm too. I'm sure you know this, though.
 

sontin

Diamond Member
Sep 12, 2011
Condescend much?

You might try actually reading what people are complaining about. Nobody's complaining about the feature. Nobody's complaining about tessellation. Everyone's complaining about the shoddy implementation.

Can you show a better implementation of tessellation in a game? In Dragon Age 2, nVidia cards take a 50% performance hit when you activate tessellation + POM. In Crysis 2 it's only 20%. Why is nobody saying anything about Dragon Age 2?

I saw the same behaviour with HAWX 2. In the end, the target size of the triangles was somewhere around 12-18 pixels, which is right in the range AMD is promoting.
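For anyone who wants to sanity-check triangle sizes like that, here is a back-of-the-envelope sketch (my own arithmetic, not AMD's or anyone's published methodology) of roughly how many pixels a tessellated edge covers on screen for a simple pinhole camera:

Code:
#include <cmath>
#include <cstdio>

// Back-of-the-envelope illustration (my own, not AMD's method):
// approximate the on-screen size of a triangle edge after tessellation,
// to compare against a 12-18 pixel target size.
float ProjectedEdgePixels(float edgeWorldLen,   // patch edge length in metres
                          float distToCamera,   // metres
                          float tessFactor,     // segments the edge is split into
                          float screenHeightPx = 1080.0f,
                          float vFovDegrees    = 60.0f)
{
    // Pixels per metre at this distance for a pinhole projection
    float vFovRad = vFovDegrees * 3.14159265f / 180.0f;
    float pxPerMetre = screenHeightPx / (2.0f * distToCamera * std::tan(0.5f * vFovRad));
    // Tessellation splits the edge into tessFactor segments
    return (edgeWorldLen / tessFactor) * pxPerMetre;
}

int main()
{
    // A 1 m patch edge, 10 m from the camera, at a few factors
    const float factors[] = {4.0f, 16.0f, 64.0f};
    for (float f : factors)
        std::printf("factor %4.0f -> %5.1f px per segment\n",
                    f, ProjectedEdgePixels(1.0f, 10.0f, f));
    return 0;
}

By that rough math, a one-metre edge ten metres away sits in the 12-18 px window at factors around 5-8, while the hardware maximum of 64 would push it under 2 px per segment.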
 

3DVagabond

Lifer
Aug 10, 2009
Can you show a better implementation of tessellation in a game? In Dragon Age 2, nVidia cards take a 50% performance hit when you activate tessellation + POM. In Crysis 2 it's only 20%. Why is nobody saying anything about Dragon Age 2?

I saw the same behaviour with HAWX 2. In the end, the target size of the triangles was somewhere around 12-18 pixels, which is right in the range AMD is promoting.

I'll mention it again: massive tessellation of flat surfaces, invisible tessellated meshes, a waste of GPU resources, and far more than a 20% performance hit for little or no IQ improvement.
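And both of those wastes have cheap, standard fixes. In DX11, a hull shader that outputs an edge tessellation factor of 0 culls the patch entirely, and a factor of 1 passes it through unsubdivided. A sketch of that decision logic (purely illustrative, not Crysis 2's actual code; the thresholds are made up):

Code:
#include <cstdio>

// Illustrative patch-level logic (not Crysis 2 source). In DX11, a hull
// shader returning an edge tessellation factor of 0 culls the patch
// outright; 1 means "pass through with no subdivision".
float PatchTessFactor(bool visible,             // e.g. frustum/occlusion test
                      float displacementRange,  // how bumpy the height map is here
                      float baseFactor)         // the LOD-chosen factor
{
    if (!visible)
        return 0.0f;                // hidden ocean: culled, costs next to nothing
    if (displacementRange < 0.001f)
        return 1.0f;                // flat slab: subdividing adds zero detail
    return baseFactor;              // only genuinely displaced surfaces pay
}

int main()
{
    std::printf("hidden ocean -> %.0f\n", PatchTessFactor(false, 0.5f, 32.0f));
    std::printf("flat slab    -> %.0f\n", PatchTessFactor(true,  0.0f, 32.0f));
    std::printf("bumpy bricks -> %.0f\n", PatchTessFactor(true,  0.2f, 32.0f));
    return 0;
}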
 

Pantalaimon

Senior member
Feb 6, 2006
Condescend much?

You might try actually reading what people are complaining about. Nobody's complaining about the feature. Nobody's complaining about tessellation. Everyone's complaining about the shoddy implementation.

AMD often beats nVidia given the same parameters; I have no idea why you say they don't. How about perf/mm^2? Perf/watt? There are other areas where AMD typically leads, too: first to use GDDR5, first out with DX11, heck, even first with tessellation. They're even encroaching on areas where nVidia has typically held the lead, like SLI/CrossFire scaling. We'll see how GCN does with compute; it's supposed to be much improved over the VLIW architecture. It looks like they're going to have the lead at 28nm too. I'm sure you know this, though.

Add high-resolution multi-monitor gaming on a single card, too.
 

WMD

Senior member
Apr 13, 2011
Can you show a better implementation of tessellation in a game? In Dragon Age 2, nVidia cards take a 50% performance hit when you activate tessellation + POM. In Crysis 2 it's only 20%. Why is nobody saying anything about Dragon Age 2?

I did say something about DA2 running horribly for an ugly-looking game a few pages back. However, nobody was too concerned about DA2; AMD performed better than NV there.