Crysis 2 being redesigned for GTX580?


Teizo

Golden Member
Oct 28, 2010
1,271
31
91
You completely misrepresent the argument of those against nVidia shaping the way games are developed.

I'll agree that if you believe in what you have posted (though incorrect), your argument is appealing.
I just don't buy into conspiracy theories much. ATI folk have been accusing nVidia of this for a long time. It's all part of the game.
 

Vdubchaos

Lifer
Nov 11, 2009
10,408
10
0
I'm somewhat worried about Crysis 2 at this point. I keep hearing about "3D" crap... god, REALLY, 3D? Come on now.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
As long as the game doesn't get delayed from its March 2011 release date, NV can throw in 100x more tessellation in its Extreme Tessellation setting to help sell 3x GTX580 setups. This doesn't matter though since Crytek will likely have low/moderate/extreme level available for toggle in the game control panel. Gamers get more options if anything. I don't see how this makes gamers victims (unless the game gets delayed)??


/agree

But where have you gotten the information that a tessellation adjustment will be allowed as anything other than on/off? Does nVidia's marketing push (there are lessons learned from other games) mean the settings, if implemented, are likely to be solid choices for gamers on AMD hardware, or should we expect tessellation to be set up in a way that polarizes performance, as has been done in HAWX 2, Stone Giant, and the updated Heaven benchmark that nVidia sponsored?
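For context on what such a setting would amount to under the hood: a low/moderate/extreme toggle is essentially a mapping from a user-facing quality level to the maximum tessellation factor the engine requests from the GPU. A minimal sketch in C++, with purely illustrative factor values (these are assumptions, not Crytek's actual numbers):

```cpp
#include <cassert>

// Hypothetical quality levels matching the low/moderate/extreme
// toggle discussed above.
enum class TessLevel { Off, Low, Moderate, Extreme };

// Map a quality level to the maximum tessellation factor the engine
// would request. The per-level values are illustrative assumptions;
// 64 is the Direct3D 11 hardware maximum.
int maxTessFactor(TessLevel level) {
    switch (level) {
        case TessLevel::Low:      return 4;   // light subdivision, cheap on any DX11 GPU
        case TessLevel::Moderate: return 16;  // the "normal" factor cited in benchmarks
        case TessLevel::Extreme:  return 64;  // D3D11 cap; triangles approach pixel size
        default:                  return 1;   // factor 1 = effectively untessellated
    }
}
```

A per-level cap like this is what would let AMD owners pick a factor their hardware handles well while GTX580 owners crank it up, which is the crux of the on/off vs. slider question.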
 
Last edited:

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
I just don't buy into conspiracy theories much. ATI folk have been accusing nVidia of this for a long time. It's all part of the game.


It's not a conspiracy theory. Gamers are tired of nVidia's attempts to sabotage a game's experience based on hardware.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
It's a good thing that the games of the future will necessarily have more tessellation that magically works on Nvidia's current cards without requiring users to upgrade :rolleyes: and only AMD users will have to upgrade for better future gaming performance, right? :rolleyes:

This will probably be true.
There will be many 58xx users needing/wanting to upgrade before GTX owners, IMHO.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I don't see how Nvidia has "sabotaged" a game's experience in any game.

He means sabotaged AMD gamers' experience, because their hardware can't run PhysX at all, chokes on tessellation beyond low factors, and lacks AA in DX9 games because AMD refuses to submit code to the devs for it. Like I said before, poor poor AMD. Always a victim.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
/agree

But where have you gotten the information that a tessellation adjustment will be allowed as anything other than on/off? Does nVidia's marketing push (there are lessons learned from other games) mean the settings, if implemented, are likely to be solid choices for gamers on AMD hardware, or should we expect tessellation to be set up in a way that polarizes performance, as has been done in HAWX 2, Stone Giant, and the updated Heaven benchmark that nVidia sponsored?

Where have you gotten information that they won't provide a slider?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
But where have you gotten the information that a tessellation adjustment will be allowed as anything other than on/off?

The issue with Hawx 2 is actually related to the benchmark itself. The game actually runs just as fast on AMD with tessellation on as Hawx 1 does with no tessellation :)

Games like Metro 2033 have varying degrees of geometry detail, so why don't you think Crysis 2 will have those options? The fact is tessellation just runs faster on NV hardware right now -- NV is exploiting this advantage for marketing purposes.

What about when Intel worked closely with Capcom to provide significant performance advantages for its 8-threaded i7 processors in Resident Evil 5?
ftp://download.intel.com/products/processor/corei7/ResidentEvil.pdf

This resulted in massively improved performance for gamers with multi-core i7 CPUs. Is this practice also unfair because AMD Phenoms can't do 8 threads?

AMD can also work with gaming developers to optimize for its own tessellation depth, but they aren't doing that. Similarly, Intel also works closely with developers to make sure its CPUs run faster too (like Starcraft 2 and RE5).
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
It's a good thing that the games of the future will necessarily have more tessellation that magically works on Nvidia's current cards without requiring users to upgrade :rolleyes: and only AMD users will have to upgrade for better future gaming performance, right? :rolleyes:


If they want the same performance, it looks like AMD users will have to either lower their settings... or upgrade, compared to NVIDIA users:
http://www.geeks3d.com/20101028/test-asus-eah6870-1gb-review-direct3d-performances-part-4/


SubD11 shows where the problem lies... look how AMD's performance tanks at 16x (normal) tessellation.

And the Lost Planet 2 benchmark mirrors the performance gap of HAWX 2, confirming AMD's lower performance.

And as a fun side fact.
Look at HAWX 2.
98 FPS on a 5770.
That is a well-implemented use of adaptive tessellation.
It just doesn't look impressive compared to NVIDIA's performance.
Hence why Fuddy had to spin some PR lies that far too many people took at face value... despite the performance clearly showing how hollow his lies were.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
The issue with Hawx 2 is actually related to the benchmark itself. The game actually runs just as fast on AMD with tessellation on as Hawx 1 does with no tessellation :)

Games like Metro 2033 have varying degrees of geometry detail, so why don't you think Crysis 2 will have those options? The fact is tessellation just runs faster on NV hardware right now -- NV is exploiting this advantage for marketing purposes.

What about when Intel worked closely with Capcom to provide significant performance advantages for its 8-threaded i7 processors in Resident Evil 5?
ftp://download.intel.com/products/processor/corei7/ResidentEvil.pdf

This resulted in massively improved performance for gamers with multi-core i7 CPUs. Is this practice also unfair because AMD Phenoms can't do 8 threads?

AMD can also work with gaming developers to optimize for its own tessellation depth, but they aren't doing that. Similarly, Intel also works closely with developers to make sure its CPUs run faster too (like Starcraft 2 and RE5).

To be frank, there really shouldn't be a "slider" for tessellation, as it should be implemented adaptively ;)
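Adaptive tessellation, as mentioned here and in the HAWX 2 discussion above, usually means scaling the tessellation factor with distance from the camera: nearby patches get dense subdivision, distant ones stay cheap. A minimal sketch of one common distance-based scheme; all ranges and bounds below are illustrative assumptions, not values from any shipping engine:

```cpp
#include <algorithm>  // std::clamp (C++17)

// Compute a per-patch tessellation factor that falls off linearly with
// distance from the camera. In a real engine this logic would run per
// patch in the hull shader; here it is plain CPU-side C++ for illustration.
float adaptiveTessFactor(float patchDistance,
                         float nearRange = 10.0f,   // full detail inside this distance
                         float farRange  = 500.0f,  // minimum detail beyond this
                         float minFactor = 1.0f,    // factor 1 = no subdivision
                         float maxFactor = 64.0f)   // Direct3D 11 hardware cap
{
    // Normalize distance into [0, 1] across the falloff range.
    float t = (patchDistance - nearRange) / (farRange - nearRange);
    t = std::clamp(t, 0.0f, 1.0f);
    // Interpolate from maximum detail (near) down to minimum (far).
    return maxFactor + t * (minFactor - maxFactor);
}
```

This is why a well-tuned adaptive scheme sidesteps the slider debate: distant geometry never gets factor-64 subdivision regardless of the user's setting, so the cost concentrates where the detail is actually visible.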
 
Sep 19, 2009
85
0
0
The problem is the delay; making Extreme the only tessellation option would also be a problem, although it is improbable that will happen.

If nVidia is indeed paying them to make an extreme tessellation level, but with other options available, everyone wins:
Crytek gets 2m richer; people with ATI cards can play with reasonable amounts of tessellation; and people with Fermi in 3-way SLI can brag that they have triangles smaller than pixels.

:)
 

shangshang

Senior member
May 17, 2008
830
0
0
AMD needs to STFU and dole out some of those awesome profits made from the 4xxx/5xxx/6xxx cards to the developers!!! NV is willing to pay developers to develop. Well, AMD should stop being EL CHEAPO and start putting some of that profit money to good use! Hey, the way I see it, anyone supporting the DEVELOPERS is a GOOD thing for gaming!
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
This will probably be true.
There will be many 58xx users needing/wanting to upgrade before GTX owners, IMHO.

That isn't crystal clear, though. We have to see what games adopt high levels of tessellation, what impact the higher levels of tessellation have on IQ and FPS, and also features about those games that we are not talking about (the future games). Tessellation is not the only variable that will change in the sense of increased requirements on the cards, right? So it's not clear to me that just because 4XXGTX card users are better off in tessellation (for the moment, and likely after 69XX release but we'll see) that the 4XXGTX series is thus future-proof any more than the 6XXX series. Do you see what I mean?

If they want the same performance, it looks like AMD users will have to either lower their settings... or upgrade, compared to NVIDIA users:
http://www.geeks3d.com/20101028/test-asus-eah6870-1gb-review-direct3d-performances-part-4/


SubD11 shows where the problem lies... look how AMD's performance tanks at 16x (normal) tessellation.

And the Lost Planet 2 benchmark mirrors the performance gap of HAWX 2, confirming AMD's lower performance.

And as a fun side fact.
Look at HAWX 2.
98 FPS on a 5770.
That is a well-implemented use of adaptive tessellation.
It just doesn't look impressive compared to NVIDIA's performance.
Hence why Fuddy had to spin some PR lies that far too many people took at face value... despite the performance clearly showing how hollow his lies were.

There is no doubt in anyone's mind that currently AMD's cards aren't as good at tessellation as the Nvidia counterparts. So I'm not sure why you posted that. What I'm saying is, AMD's cards, dollar for dollar, play all current games pretty well. Same with Nvidia. But the argument you seem to be making is that, necessarily, future games will require only more tessellation. You are isolating the variable that is 'tessellation' and using that to conclude that because the 4XX series is better at tessellation (now), that for sure the 4XX will be better in the future (if there are games that come out that make use of this advanced tessellation). But tessellation isn't the only fish to fry here, that's why I remain sceptical. I'm not arguing that AMD's tessellation is as good as Nvidia's, I'm merely arguing against the claim (with scepticism) that necessarily, Nvidia's cards are more future proof than AMD's, because of better tessellation performance. You may turn out to be right (along with Scali), or I might.

It seems odd to me that a company like AMD would be involved with a number of developers (I mean, they have so much of the market you'd think gaming companies have an ongoing discourse with them) and thus know what future games are going to need to run, and then deliberately not deliver on the tessellator as Nvidia has. Do you catch my drift? If this wasn't the case, then gaming developers would be happy to have their games only playable on a much more limited number of computers (thus shrinking the potential market). A gaming company should want their game to run on as many computers as is possible, so that they will sell more copies, and thus make more money. If lots of gaming companies have told AMD "look, tessellation is the way forward, improve your performance if you want your card holders and enthusiasts to play these games at high settings" and AMD chose to ignore that - I mean, it's just absurd. It defies belief. So I prefer to think that there's something amiss about this 'great leap forward' that tessellation is made out to be, at least in the short term.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
It's not a conspiracy theory. Gamers are tired of nVidia's attempts to sabotage a games experience based on hardware.

There is no conspiracy. Nvidia has always, and especially in the last few years, done more for its customers.

And to properly fix your post as to what you really meant above, here goes:

Gamers using AMD/ATI hardware are tired of nVidia's attempts to improve gaming experiences on nVidia hardware and NOT take AMD/ATI with them.

There is no truer statement that can be said here. Spin it as you like, but this is what is on the tip of every AMD fan's tongue. Hey, if you're so tired of it, then I'm sorry, but you need to go out and buy an Nvidia product. If you feel you're missing out on so much, then that is what you need to do. Don't count on AMD to do these things for you. They may, or may not, but their track record CLEARLY shows them sitting back and letting everyone else do things for them. Even their 3D setup is 3rd-party and has to be supported as such. C'mon guys, can't you see the writing on the wall yet?

/wow.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
That isn't crystal clear, though. We have to see what games adopt high levels of tessellation, what impact the higher levels of tessellation have on IQ and FPS, and also features about those games that we are not talking about (the future games). Tessellation is not the only variable that will change in the sense of increased requirements on the cards, right? So it's not clear to me that just because 4XXGTX card users are better off in tessellation (for the moment, and likely after 69XX release but we'll see) that the 4XXGTX series is thus future-proof any more than the 6XXX series. Do you see what I mean?

No.



There is no doubt in anyone's mind that currently AMD's cards aren't as good at tessellation as the Nvidia counterparts. So I'm not sure why you posted that. What I'm saying is, AMD's cards, dollar for dollar, play all current games pretty well. Same with Nvidia.

I see upcoming games using more tessellation, now that developers don't have to limit it to AMD's current tessellation performance.

But the argument you seem to be making is that, necessarily, future games will require only more tessellation. You are isolating the variable that is 'tessellation' and using that to conclude that because the 4XX series is better at tessellation (now), that for sure the 4XX will be better in the future (if there are games that come out that make use of this advanced tessellation). But tessellation isn't the only fish to fry here, that's why I remain sceptical. I'm not arguing that AMD's tessellation is as good as Nvidia's, I'm merely arguing against the claim (with scepticism) that necessarily, Nvidia's cards are more future proof than AMD's, because of better tessellation performance. You may turn out to be right (along with Scali), or I might.

You might want to read up on what DX11 is and what's new/different from previous DirectX versions:
http://www.anandtech.com/show/2716/6

It seems odd to me that a company like AMD would be involved with a number of developers (I mean, they have so much of the market you'd think gaming companies have an ongoing discourse with them) and thus know what future games are going to need to run, and then deliberately not deliver on the tessellator as Nvidia has. Do you catch my drift? If this wasn't the case, then gaming developers would be happy to have their games only playable on a much more limited number of computers (thus shrinking the potential market). A gaming company should want their game to run on as many computers as is possible, so that they will sell more copies, and thus make more money. If lots of gaming companies have told AMD "look, tessellation is the way forward, improve your performance if you want your card holders and enthusiasts to play these games at high settings" and AMD chose to ignore that - I mean, it's just absurd. It defies belief. So I prefer to think that there's something amiss about this 'great leap forward' that tessellation is made out to be, at least in the short term.

I have a feeling that your posts are based more on... well, "feelings" and "hope" than on facts and insight into the topic at hand :confused:
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
So it's not clear to me that just because 4XXGTX card users are better off in tessellation (for the moment, and likely after 69XX release but we'll see) that the 4XXGTX series is thus future-proof any more than the 6XXX series. Do you see what I mean?

That's why I used the 58xx series as an example. 5850s and 5870s, to be exact.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Crysis 2 isn't being "redesigned for the GTX580", because that would imply that the redesign will allow the GTX580 to play it with everything on high.

The GTX480 can't play Crysis Warhead with everything on high at 1920x1200 with 4xAA at high frame rates (30fps IIRC from one review).
With most shader settings at the 3rd-highest of 4, AT shows the GTX480 managing 50fps.

You really think that a +25% GTX580 (assuming that's what it does) will manage to run decent frame rates when you also add in extreme tessellation?
Sure, maybe it might... just.

If tessellation is being added, it's not going to be for the current cards, but only really feasible at high levels when we see 28nm cards from both ATI and NV IMO. It's not being re-architected for now, but for the future, just like Crysis was ahead of its time, and Far Cry before that.
Crytek push the boundaries, and hopefully any tessellation they add will do that too. I would love another engine that no one can run at max on a single card for another 3+ years.

I don't think AMD's tessellation will be what holds it back in Crysis 2, assuming high levels are added (certainly not at the HD5870/HD6870 level). They can't manage Crysis Warhead at playable frame rates with everything set to max, so adding tessellation won't fix or break that. They will go from being unable to run it at max to... being unable to run it at max. Same goes for basically all but one NV card (the GTX480).

This all assumes that Crysis 2 will be at minimum as demanding as Crysis/Crysis Warhead, with tessellation being added on top of that, if not more demands from other elements.
 
Last edited:

thilanliyan

Lifer
Jun 21, 2005
12,055
2,271
126
Hey, if you're so tired of it, then I'm sorry, but you need to go out and buy an Nvidia product. If you feel you're missing out on so much, then that is what you need to do. Don't count on AMD to do these things for you. They may, or may not, but their track record CLEARLY shows them sitting back and letting everyone else do things for them. Even their 3D setup is 3rd-party and has to be supported as such. C'mon guys, can't you see the writing on the wall yet?

/wow.

I'd rather have visual features supported on both types of cards (i.e. AA/AF, tessellation). For example with tessellation, I don't mind that ATI cards would be slower at it... at least it is supported. I don't want to have to choose cards based on specific games. Who knows what features (if any) will be nV-specific in this game...
 
Last edited:

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
That's why I used the 58xx series as an example. 5850s and 5870s, to be exact.

Right. How long has the 5XXX series been out? Where are the games that stump those cards? How long will it be until those games come out? How much longevity should a card have? The 4XX came later, so obviously they (assuming equal longevity) should last longer than the 5XXX series or else they are certainly the worst buy overall. The original point I was referring to was one of card longevity/upgrading. I'm not sure that the longevity of the 4XX series is greater simply because of better tessellation than the 5XXX series. There is no logical way, currently, to deny this. Time will tell, and then we will see.

No.





I see upcoming games using more tessellation, now that developers don't have to limit it to AMD's current tessellation performance.



You might want to read up on what DX11 is and what's new/different from previous DirectX versions:
http://www.anandtech.com/show/2716/6



I have a feeling that your posts are based more on... well, "feelings" and "hope" than on facts and insight into the topic at hand :confused:

No, they aren't. And that's why you shied away from BFG10K's posts in another thread. You don't have the games, he does. He knows what he's talking about, and you link to synthetic benchmarks as if one can extrapolate from a synthetic tessellation benchmark to hypothetical games in the future (that don't exist yet) and show that the 4XX series is better in the long run.

Your posts are based on speculation about tessellation adoption in the future, and mine are based on what data is currently available today. That means, mine are based on actuality and yours are based on the hypothetical, and I think that you should read up on the difference between the two before pointing a finger at someone else and saying that his posts are based on hope. Yours are based on hope (for the future, hoping that games require tessellation more and more) and mine are based on fact (that AMD's cards perform quite well now, and there's no basis for anyone to say that Nvidia's are a better buy). Also, you can't refute a well argued point by myself with a 'no', that's not an argument, that's simply a negation based on wishful thinking. I find it hilarious that your very implication towards me is easily turned around and more accurately applied to yourself (and your post history).
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Right. How long has the 5XXX series been out? Where are the games that stump those cards? How long will it be until those games come out? How much longevity should a card have? The 4XX came later, so obviously they (assuming equal longevity) should last longer than the 5XXX series or else they are certainly the worst buy overall. The original point I was referring to was one of card longevity/upgrading. I'm not sure that the longevity of the 4XX series is greater simply because of better tessellation than the 5XXX series. There is no logical way, currently, to deny this. Time will tell, and then we will see.



No, they aren't. And that's why you shied away from BFG10K's posts in another thread. You don't have the games, he does. He knows what he's talking about, and you link to synthetic benchmarks as if one can extrapolate from a synthetic tessellation benchmark to hypothetical games in the future (that don't exist yet) and show that the 4XX series is better in the long run.

Your posts are based on speculation about tessellation adoption in the future, and mine are based on what data is currently available today. That means, mine are based on actuality and yours are based on the hypothetical, and I think that you should read up on the difference between the two before pointing a finger at someone else and saying that his posts are based on hope. Yours are based on hope (for the future, hoping that games require tessellation more and more) and mine are based on fact (that AMD's cards perform quite well now, and there's no basis for anyone to say that Nvidia's are a better buy). Also, you can't refute a well argued point by myself with a 'no', that's not an argument, that's simply a negation based on wishful thinking. I find it hilarious that your very implication towards me is easily turned around and more accurately applied to yourself (and your post history).

Look at HAWX 2 and Lost Planet 2.
The performance gap mirrors the performance gap in, e.g., SubD11.
Hence why Fuddy started the whole "too much tessellation" crap.

I highlighted something... NVIDIA's cards perform better, and that alone is reason enough... unless you buy based on brand loyalty rather than performance.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106

That is interesting, but how do you feel about the comments under this Fudzilla article?

(Nvidia GPUs will not be in next-generation consoles? Couldn't that affect Nvidia's performance numbers in most future console ports? If so, is that why Nvidia is partnering with PC developers like Crytek, to help offset this to a degree?)
 
Last edited: