Crysis 2 being redesigned for GTX580?


Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
So you don't have any sources and you're just lying, again? Cool.

No, I just dislike repeating myself over and over and over to people.
It's like you only remember 2 posts back...not helpful.

Instead of me waisting my time on you I gave you the relevant links...ball in your court.
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Unless you want people to start calling you Lonejerk it's the very minimum to write someone else's name properly...

Try calling me that...I dare you :)

...except if you're living in one of those remote places with little culture and no decency where everybody is a jerk and keeps making fun of others' names, of course.
But you don't, do you?

Fuddy suits him, with all the FUD he spews.



No, he did not.

Lie:
http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/5

Disproven right here:
http://www.geeks3d.com/20100711/cpu-physx-x87-sse-and-physx-sdk-3-0/

No, he did not, it's a well-known fact: if you choose to use PhysX you get an awesome marketing and development budget from NV.
It's a fact, go and ask around or check dev interviews, they admit it without any hesitation.

So NVIDIA paid all these companies?:
http://physxinfo.com/

NVIDIA should be loosing money like AMD if that were true...
(And before you start about GPU PhysX...there is NO difference between "CPU PhysX" and "GPU PhysX"...besides the performance advantage of the GPU)

Jesus, you are more ignorant than I thought... :awe:

1) Keep the plagiarism of the much older myth of Horus/Osiris out of this.
2) a game pulling 98 FPS on previous generation midrange hardware is not borked:
http://www.geeks3d.com/20101028/test-asus-eah6870-1gb-review-direct3d-performances-part-4/
Huddy is an industry veteran: he worked on the development of what became Direct3D (his company was bought by Microsoft), then worked for Nvidia for years, and ended up heading ATI's developer relations.
Calling Huddy a liar only shows how utterly clueless you are about the topic, ROFL. ^_^

To bad for you I disproved some of his lies in this thread...just because you like Fuddy doesn't alter the fact that he is a lying AMD PR puppet...
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76
waisting my time

it's "wasting my time"...
waist = the part of your body where your legs and torso meet.
waste = trash, refuse, etc.

With one console port / DX9 release after another, should we not welcome games that push the envelope? Not a fanboy, just tired of seeing game after game that doesn't support the latest visual technologies. Worse, technologies that have been around a while (AA).

Amen brother!
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Seeing as we know what the GTX580 is now, I don't see the game being redesigned for it, as it's basically the same as a 480.
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
it's "wasting my time"...
waist = the part of your body where your legs and torso meet.
waste = trash, refuse, etc.

Thanks, taltamir. And while we're on the case, please can I add:

"loose (loosing)" = "slack", the opposite of "tight".
"lose (losing)" = "fail", the opposite of "win" [or "find"].
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
No, I just dislike repeating myself over and over and over to people.
It's like you only remember 2 posts back...not helpful.

Instead of me waisting my time on you I gave you the relevant links...ball in your court.
You keep repeating things that have no backing and that you have yet to prove, though. You're spreading worse FUD than the person you're complaining about. I wonder if they have a forum label "Hypocrite"?
In none of those articles are there any quotations of Huddy even discussing x87, never mind "lying" about it. The most he discusses is PhysX's poor multi-core CPU optimization, which is already well-known and documented.
So NVIDIA paid all these companies?:
http://physxinfo.com/

NVIDIA should be loosing money like AMD if that were true...
Do you have any proof that they don't? Or are you just saying things in the hope that people won't challenge them? Tens of thousands of dollars are a drop in the bucket in these companies' revenues; there's no saying NVIDIA doesn't offer financial support along with its PhysX support.
To bad for you I disproved some of his lies in this thread...just because you like Fuddy doesn't alter the fact that he is a lying AMD PR puppet...
First, it's "too." Second, you didn't disprove anything, as I just showed. You just spout nonsense in the hope that no one will call you on it. If you're going to post links, you'd better include a relevant quotation first.
 

at80eighty

Senior member
Jun 28, 2004
458
5
81
These threads have become like watching a battle between whacko fundies and militant atheists - the vocal posters on both sides posting nonstop are absolutely permaban-worthy nuts, while thinking they themselves are blameless. C'est la vie or something.
 

apokalipse

Junior Member
Jan 12, 2005
18
3
81
He means sabotaged AMD gamers' experience because their hardware can't run PhysX at all
They can run open standards like OpenCL and DirectCompute. There's no reason to use PhysX except to favor Nvidia.

Tessellation at low factors or it chokes
The proportion (not level) of tessellation to regular rendering is the key.

Since Nvidia cards use their shaders to tessellate, they can assign whatever proportion of shaders they want to tessellation.
A non-typical proportion (not level) of tessellation (Unigine Heaven) will cause more shaders than usual to be used for tessellation.
Given that Unigine Heaven tries to tessellate pretty much everything it possibly can (that's its whole purpose), I think it's safe to say the proportion of tessellation to non-tessellation work is highly imbalanced.

At a typical level of tessellation (i.e. what you'd expect to see in a game), with the proportion of tessellation to non-tessellation work being much smaller than in Unigine Heaven, a smaller proportion of shaders on the Nvidia cards will also be used for tessellation.

Of course, I don't blame the Unigine developers for purposely favoring Nvidia, because they didn't. They obviously just thought that in a tessellation benchmark, putting in as much tessellation as possible would really push cards.
They just didn't think about the fact that the proportion of tessellation to non-tessellation work will affect the performance difference between Evergreen and GF1xx cards, due to the different models used (fixed unit vs shader-based).
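To put rough numbers on that proportion argument, here's a tiny back-of-the-envelope sketch (all the ratios and work figures are invented purely for illustration; nothing here is measured from any card or benchmark):

```cpp
// Toy model of the "proportion, not level" point: the larger the share of
// tessellation work relative to regular shading work, the larger the slice
// of shader time a shader-based tessellation scheme has to give up.
// Every number below is made up for illustration only.
#include <cstdio>

int main() {
    const double shading_work = 1000.0;          // arbitrary units of regular work per frame
    const double ratios[] = {0.05, 0.25, 1.00};  // game-like ... Heaven-like tess:shading ratios

    for (double r : ratios) {
        double tess_work  = shading_work * r;
        double tess_share = tess_work / (shading_work + tess_work);
        std::printf("tess:shading ratio %.2f -> %.1f%% of shader time spent tessellating\n",
                    r, tess_share * 100.0);
    }
    return 0;
}
```

With a Heaven-like ratio, the shader-based scheme gives up a far bigger slice of its shader time than with a game-like ratio, which is the point being made above.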

and AA in DX 9 games because AMD refuses to submit code to the devs to do that.
If you're referring to Batman: AA, AMD did not "refuse to submit code to the devs"
In fact, Eidos went to both AMD and Nvidia and asked how they should implement AA. AMD and Nvidia suggested the same, standard method. The same method they did in fact end up implementing, and which works on AMD hardware if it's not deliberately disabled.

That is evidenced by the fact that using an Nvidia vendor and device ID on AMD hardware allows you to run AA without problems. Which, if you believe Nvidia, shouldn't even be possible.
So no, that is not some conspiracy. It is what the evidence tells us.
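For anyone wondering what such a vendor check even looks like, here's a minimal D3D9 sketch. This is not Batman's actual code, just the general shape of an adapter-ID gate, which is exactly the sort of thing an ID spoofer walks straight past:

```cpp
// Sketch of a vendor-ID gate in a D3D9 title. NOT code from Batman: AA or
// any shipped game; it only shows how an engine *could* branch on the
// adapter's PCI vendor ID, which is why spoofing the ID changes behaviour.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DADAPTER_IDENTIFIER9 id = {};
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id))) {
        const bool isNvidia = (id.VendorId == 0x10DE);  // NVIDIA PCI vendor ID
        const bool isAmd    = (id.VendorId == 0x1002);  // AMD/ATI PCI vendor ID
        std::printf("Adapter: %s (vendor 0x%04lX)\n", id.Description, id.VendorId);
        // A title gating its in-game AA path on the vendor would branch here;
        // a spoofed ID sails straight through the check.
        std::printf("In-game AA path: %s\n",
                    isNvidia ? "enabled" : (isAmd ? "disabled" : "unknown"));
    }
    d3d->Release();
    return 0;
}
```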
 

WaTaGuMp

Lifer
May 10, 2001
21,207
2,506
126
I have done zero reading about Crysis 2. Is the game going to follow suit with the 1st and be a hardware killer?
 

golem

Senior member
Oct 6, 2000
838
3
76
They can run open standards like OpenCL and DirectCompute. There's no reason to use PhysX except to favor Nvidia.


If you're referring to Batman: AA, AMD did not "refuse to submit code to the devs"
In fact, Eidos went to both AMD and Nvidia and asked how they should implement AA. AMD and Nvidia suggested the same, standard method. The same method they did in fact end up implementing, and which works on AMD hardware if it's not deliberately disabled.

That is evidenced by the fact that using an Nvidia vendor and device ID on AMD hardware allows you to run AA without problems. Which, if you believe Nvidia, shouldn't even be possible.
So no, that is not some conspiracy. It is what the evidence tells us.

Do OpenCL and DirectCompute work on the Xbox 360, PS3 and Wii? I may be wrong, but I believe PhysX works on all 3 while OpenCL and DirectCompute don't. That would be a huge reason to use PhysX. Not to mention that the tools for PhysX are supposed to be much better than the tools for DirectCompute and OpenCL.

Also, doesn't AMD's support for OpenCL and DirectCompute sort of suck (in comparison to Nvidia's, at least)? If the above 2 are correct and you were a developer, why wouldn't you use PhysX instead of OpenCL or DirectCompute?

Also, on what you said about Batman: AA. Nvidia actually PROVIDED code to make AA work; AMD SUGGESTED that the devs use Nvidia's code.
 

tyl998

Senior member
Aug 30, 2010
236
0
0
I have done zero reading about Crysis 2. Is the game going to follow suit with the 1st and be a hardware killer?

I heard that Crysis 2 will require a machine from a *less* distant future compared to the original.
 

extra

Golden Member
Dec 18, 1999
1,947
7
81
I heard that Crysis 2 will require a machine from a *less* distant future compared to the original.

UNLESS YOU HAVE AN AMD CARD!!!! Then!! *gasp* I HEARD **shhhh** that the cd actually comes with bacteria on it that attack and devour AMD graphics cards (and turn them into a pile of "green" slime!)--thus the phrase "but can it play crysis 2" will be echoed across the lands for ever moar!
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
GTX 480 and GTX 580 are basically the same thing so how is Crysis being redesigned?

/thread.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
They can run open standards like OpenCL and DirectCompute. There's no reason to use PhysX except to favor Nvidia.

Who is "they"? You realize until very recently AMD drivers didn't support OpenCL out of the box? AMD required a separate SDK download.

The proportion (not level) of tessellation to regular rendering is the key.

Since Nvidia cards use their shaders to tessellate, they can assign whatever proportion of shaders they want to tessellation.
A non-typical proportion (not level) of tessellation (Unigine Heaven) will cause more shaders than usual to be used for tessellation.
Given that Unigine Heaven tries to tessellate pretty much everything it possibly can (that's its whole purpose), I think it's safe to say the proportion of tessellation to non-tessellation work is highly imbalanced.

At a typical level of tessellation (i.e. what you'd expect to see in a game), with the proportion of tessellation to non-tessellation work being much smaller than in Unigine Heaven, a smaller proportion of shaders on the Nvidia cards will also be used for tessellation.

Of course, I don't blame the Unigine developers for purposely favoring Nvidia, because they didn't. They obviously just thought that in a tessellation benchmark, putting in as much tessellation as possible would really push cards.
They just didn't think about the fact that the proportion of tessellation to non-tessellation work will affect the performance difference between Evergreen and GF1xx cards, due to the different models used (fixed unit vs shader-based).

Nvidia's tessellator is in the PolyMorph engine, not the shaders. Each shader cluster gets one PolyMorph engine. Nvidia's solution scales as more clusters are added to the chip.

If you're referring to Batman: AA, AMD did not "refuse to submit code to the devs"
In fact, Eidos went to both AMD and Nvidia and asked how they should implement AA. AMD and Nvidia suggested the same, standard method. The same method they did in fact end up implementing, and which works on AMD hardware if it's not deliberately disabled.

That is evidenced by the fact that using an Nvidia vendor and device ID on AMD hardware allows you to run AA without problems. Which, if you believe Nvidia, shouldn't even be possible.
So no, that is not some conspiracy. It is what the evidence tells us.

Yes they did. The developers wanted code from the vendors. Nvidia provided the code, AMD didn't. There is no "standard" method. If there were, why would the devs need code from the hardware vendors??????
 

golem

Senior member
Oct 6, 2000
838
3
76
Then riddle me this: why does the GOTY version of B:AA have AMD AA?

I may be wrong on this, but I think Ryan Smith mentioned in a forum post that Nvidia gave the AA code to the developers of the Unreal engine so it could be incorporated into the engine itself.

Found the post, here's a cut and paste.

Ryan Smith
AnandTech GPU Editor

Join Date: Oct 2005
Posts: 64


I saw this and thought I'd drop by and make a quick comment on the issue. NVIDIA went over this back at CES 2010, so all of this I have heard directly from them.

In-game AA in Batman is de-facto vendor locked. The solution NVIDIA initially devised for the game works on AMD cards too, however it required support for a texture format that AMD did not support at the time the technique was developed, and the texture format was not finally added until a couple of weeks before the game went gold. That didn't leave enough time to fully qualify the solution on AMD cards, so the game went without.

The fact of the matter is that - technically speaking - it could be enabled in the game for AMD cards at any time after the game shipped. As it stands the game has SecuROM, so the only way to enable it without faking device IDs is for the developer to roll out a patch with the solution enabled for AMD cards. Your guess is as good as mine why this hasn't happened.

In any case, along with making this fix for Batman, NVIDIA also submitted it back to Epic. It was integrated into the UE3 development tree, and all future games using that version (or later) of the tree will have working AA, assuming of course they aren't using a different rendering method.


Here's a link to the post

http://forums.anandtech.com/showpost.php?p=30206585&postcount=209
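To illustrate the kind of capability check Ryan Smith describes (probing whether the driver exposes a texture format before turning a rendering path on), here's a minimal D3D9 sketch. The 'INTZ' FOURCC is only a stand-in example of a vendor-extension format; his post doesn't say which format the Batman AA solution actually needed:

```cpp
// Hypothetical capability probe: ask D3D9 whether a given texture format is
// supported before enabling a path that depends on it. The INTZ FOURCC is an
// assumed example; the real format behind the Batman AA story isn't named.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    const D3DFORMAT kProbeFormat =
        static_cast<D3DFORMAT>(MAKEFOURCC('I', 'N', 'T', 'Z'));

    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT,
                                        D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8,        // current display format
                                        D3DUSAGE_DEPTHSTENCIL,  // intended usage
                                        D3DRTYPE_TEXTURE,
                                        kProbeFormat);

    // A renderer would pick its AA path based on this result rather than
    // (or in addition to) looking at the vendor ID.
    std::printf("Probe format supported: %s\n", SUCCEEDED(hr) ? "yes" : "no");

    d3d->Release();
    return 0;
}
```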
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So all those posts for the last 9 months about NV intentionally locking AMD out of AA are BS, because it was really the case that AMD cards didn't have enough time to pass validation before launch, and the game developer didn't release the patch later on? This "NV has ruined gaming" bashing just keeps getting better and better. :rolleyes:
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Anyone thinking that GPU PhysX (which runs under CUDA) can run on OpenCL or DirectCompute without CUDA makes me laugh.

Anyone thinking OpenCL or DirectCompute is the same as PhysX makes me roll on the floor...the ignorance in this thread is strong.
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
Do OpenCL and DirectCompute work on the Xbox 360, PS3 and Wii? I may be wrong, but I believe PhysX works on all 3 while OpenCL and DirectCompute don't. That would be a huge reason to use PhysX. Not to mention that the tools for PhysX are supposed to be much better than the tools for DirectCompute and OpenCL.

I think you mean physics and not PhysX... 'cos the Wii and Xbox 360 have GPUs based on ATI's architecture.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
I think you mean physics and not PhysX... 'cos the Wii and Xbox 360 have GPUs based on ATI's architecture.

He may be broadening the definition of 'PhysX' to include the PhysX middleware run on the CPU (not GPU-accelerated), in which case it is supported on all the consoles.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Right...CryEngine 3 was designed specifically to be multi-platform. Every demo that has been shown is running on the Xbox 360. Crytek themselves have said that all the versions are identical. Wake up. If this game is successful, it will be the next Call of Duty or Halo, and every release for the PC will be progressively worse. This is how it works on the PC: release your first game on the PC, then once you have enough money you focus more on consoles.

Exactly, I would assume virtually all games are designed on a PC to run well on a console. This is why PC gamers worry about 32x AA rather than something a little more special. If Nvidia or AMD are paying cash to get a couple of effects, they have lost the war. They should use their cash to develop a game that the Xbox can't play.