Nvidia Fermi is recommended for Metro 2033 game

Page 6 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Can any card use it? Does it work in practically every game? Does it provide a noticeable difference? Can any programmer program it in?

People keep referencing Batman AA as a game that uses physics well, but we've all said it's stuff that could have been done at any time, and has been done in tons of other games before it without PhysX. Saying you need PhysX for the Batman stuff is laughable.

I'd rather have "something" proprietary than "nothing" at all.
(AMD has been talking since 2006...and they are still talking...keyword: talking.)

Saying that a CPU can keep up with a GPU when doing physics calculations is a joke.
But I guess it's only PhysX that is useless...when AMD (if ever) launches their Bullet OpenCL GPU physics, I bet you will praise it to the skies...n'est-ce pas?

I mean, NVIDIA (PhysX) are the only ones who have realized that GPU physics in games is worth pursuing.
It's not like Intel (Havok) wants to do it...or AMD (Bullet) wants to do it too...oh wait.

GPU physics is here to stay...the reason the "debate" is so one-sided today is because there is only one player on the market right now...when that changes, we can finally move past this "PhysX is useless!!!" crap...and look at the implementations...and decide which implementation is really crap.

But until then:
Something > nothing
 
Last edited:

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
I'd rather have "something" proprietary than "nothing" at all.
(AMD has been talking since 2006...and they are still talking...keyword: talking.)

Saying that a CPU can keep up with a GPU when doing physics calculations is a joke.
But I guess it's only PhysX that is useless...when AMD (if ever) launches their Bullet OpenCL GPU physics, I bet you will praise it to the skies...n'est-ce pas?

I mean, NVIDIA (PhysX) are the only ones who have realized that GPU physics in games is worth pursuing.
It's not like Intel (Havok) wants to do it...or AMD (Bullet) wants to do it too...oh wait.

Ok well you're mistaking me for an ATI fanboi. This is my first ATI card ever, but obviously with a comment like that you're the fanboi. I'm just pointing out how big of a joke PhysX is and how it adds nothing meaningful to games. Once it does, it'll be great, but PhysX is not a reason to buy into Nvidia at all. I'm playing a Havok physics game and it seems to do meaningful physics well while allowing everyone access to it. The reason why it's so one-sided is cus the fanbois defend it but there's nothing to defend yet.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Ok well you're mistaking me for an ATI fanboi. This is my first ATI card ever. I'm just pointing out how big of a joke PhysX is and how it adds nothing meaningful to games. Once it does, it'll be great, but PhysX is not a reason to buy into Nvidia at all. I'm playing a Havok physics game and it seems to do meaningful physics well while allowing everyone access to it. The reason why it's so one-sided is cus the fanbois defend it but there's nothing to defend yet.

Wrong.
PhysX is the most used physics middleware...so it's a success, like it or not.
PhysX is on the market as the sole GPU-physics middleware...more than "nothing"...like it or not.
So you don't have to be a fanboy to be very wrong.

And if you insist on comparing Havok CPU physics to PhysX GPU physics, then start by showing me Havok CPU physics besting PhysX CPU physics.
If you cannot do that...you have just proven how hollow (and misguided) your arguments really are.

.oO("Meaningful physics"...good one...what physics isn't "meaningful"?...I will tell you: none :rolleyes:)
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
Wrong.
PhysX is the most used physics middleware...so it's a success, like it or not.
PhysX is on the market as the sole GPU-physics middleware...more than "nothing"...like it or not.
So you don't have to be a fanboy to be very wrong.

And if you insist on comparing Havok CPU physics to PhysX GPU physics, then start by showing me Havok CPU physics besting PhysX CPU physics.
If you cannot do that...you have just proven how hollow (and misguided) your arguments really are.

.oO("Meaningful physics"...good one...what physics isn't "meaningful"?...I will tell you: none :rolleyes:)

Show me a PhysX game that without PhysX would change the gameplay??? Oh yea, there isn't one.

Havok physics already beats it cus everyone can use it and it's already in games that make good use of it and enhance gameplay. That's already a win right there.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Can any card use it? Does it work in practically every game? Does it provide a noticeable difference? Can any programmer program it in?

Every time someone counters your arguments you just take everything you're saying in a new and completely different direction. You're incoherent, hard to follow, and sound very bitter about the whole situation. But to answer the above questions: any nvidia 8000 series or better can use it, it works in every game it's coded for, it provides a noticeable difference when used, and yes, any programmer can program it in.

People keep referencing Batman AA as a game that uses physics well, but we've all said it's stuff that could have been done at any time, and has been done in tons of other games before it without PhysX. Saying you need PhysX for the Batman stuff is laughable.

Name one other game that has had volumetric fog interaction with the player. Name a game that has on-the-fly dynamic and deformable cloth animation. Name one.

Without nvidia's engineers coding and helping developers add GPU-accelerated physx effects into games, this argument wouldn't even exist. Developers aren't going out of their way to add in these effects. ATI has done nothing, save for maybe Dirt 2's DX11 support, to enhance any game visually like nvidia is doing.

You don't like hardware-accelerated PhysX - awesome for you. Since you'd rather not have it, turn it off and pretend it isn't there. Play the game how it was "intended" to be played before Nvidia came along and screwed it up for you. Keep missing out.

Show me a PhysX game that without PhysX would change the gameplay??? Oh yea, there isn't one.

And once again you're on this "change the gameplay" argument. Like I said before (to which you did not answer, and instead replied with unrelated questions), anti-aliasing hasn't changed gameplay one iota. Anisotropic filtering hasn't changed gameplay at all. Bloom never changed gameplay. Since none of those change gameplay, you don't need them any more than you need PhysX, so make sure to disable all those features every time you play a game.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Show me a PhysX game that without PhysX would change the gameplay??? Oh yea, there isn't one.

Show me where AF alters the gameplay?
Show me where AA alters the gameplay?
Show me where HDR alters the gameplay?
None of them alters the gameplay...but they greatly add to the immersion factor.

And you dodged the question...shame on you.

Havok physics already beats it cus everyone can use it and it's already in games that make good use of it and enhance gameplay. That's already a win right there.

Give an example...or do you prefer to dodge with empty words and fuzzy warm feelings?!

PhysX is more used than Havok now...it's a fact, like it or not.
PhysX isn't going anywhere.

Physics in game-altering ways won't come until one of two things happens.

A) A developer grows a pair, doesn't care about consoles or people with older hardware, and makes a game that REQUIRES GPU physics.

B) AMD and Intel get off their sorry asses, stop talking, and MAKE a solution themselves.

It will come either way...like it or not...but your FUD isn't contributing...or even keeping its eyes on the ball.

Sitting on the bench, screaming foul at the other players...when your team is not even playing, always struck me as moronic.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
Show me where AF alters the gameplay?
Show me where AA alters the gameplay?
Show me where HDR alters the gameplay?
None of them alters the gameplay...but they greatly add to the immersion factor.

And you dodged the question...shame on you.



Give an example...or do you prefer to dodge with empty words and fuzzy warm feelings?!

PhysX is more used than Havok now...it's a fact, like it or not.
PhysX isn't going anywhere.

Physics in game-altering ways won't come until one of two things happens.

A) A developer grows a pair, doesn't care about consoles or people with older hardware, and makes a game that REQUIRES GPU physics.

B) AMD and Intel get off their sorry asses, stop talking, and MAKE a solution themselves.

It will come either way...like it or not...but your FUD isn't contributing...or even keeping its eyes on the ball.

Sitting on the bench, screaming foul at the other players...when your team is not even playing, always struck me as moronic.

Like I said, how many games change the way they play cus of PhysX??? None.

You keep on referencing how good PhysX is on paper, but when there's nothing that takes advantage of it in a meaningful way, how good is it then?
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Wrong.
PhysX is the most used physics middleware...so it's a success, like it or not.
PhysX is on the market as the sole GPU-physics middleware...more than "nothing"...like it or not.
So you don't have to be a fanboy to be very wrong.

And if you insist on comparing Havok CPU physics to PhysX GPU physics, then start by showing me Havok CPU physics besting PhysX CPU physics.
If you cannot do that...you have just proven how hollow (and misguided) your arguments really are.

.oO("Meaningful physics"...good one...what physics isn't "meaningful"?...I will tell you: none :rolleyes:)

The PhysX SDK is offered free to developers as a base for the physics system they apply to their games and is, as you'd expect, the most widely used physics engine. The SDK comes standard with the PhysX library running on the CPU, and it's up to the developer to implement GPU acceleration. Therefore, just because a game uses 'PhysX' and Nvidia lists it on their site doesn't mean much: only a literal handful of games actually implement GPU-accelerated PhysX. Perhaps a fair comparison would be the current iteration of Havok physics (CPU) and standalone PhysX (CPU). Both offer the opportunity to implement the same effects on the CPU and would probably perform the same; however, a direct comparison would be difficult.
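The CPU-default/GPU-opt-in split described above can be sketched as a simple dispatch pattern. This is a hypothetical illustration in Python; the class and function names are invented for clarity and are not the actual PhysX SDK API:

```python
# Hypothetical sketch of physics middleware that ships a CPU solver by
# default and uses a GPU backend only when the developer opts in AND a
# supported GPU is present. Names are illustrative, not the real PhysX API.

class CpuSolver:
    name = "cpu"

    def step(self, bodies, dt):
        # Naive per-body integration on the CPU: (position, velocity) pairs.
        return [(x + vx * dt, vx) for (x, vx) in bodies]

class GpuSolver:
    name = "gpu"

    def step(self, bodies, dt):
        # Stand-in for a batched GPU kernel computing the same result.
        return [(x + vx * dt, vx) for (x, vx) in bodies]

def create_scene(request_gpu=False, gpu_available=False):
    # The GPU path needs both developer opt-in and supported hardware;
    # every other combination silently falls back to the CPU solver.
    if request_gpu and gpu_available:
        return GpuSolver()
    return CpuSolver()

# A title that never opts in runs its physics on the CPU even on a
# GPU-equipped machine.
scene = create_scene(request_gpu=False, gpu_available=True)
print(scene.name)  # → cpu
```

The point of the sketch: a long list of "PhysX games" only tells you which titles link the library, not which ones ever take the GPU branch.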
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Wrong.
PhysX is the most used physics middleware...so it's a success, like it or not.
PhysX is on the market as the sole GPU-physics middleware...more than "nothing"...like it or not.
So you don't have to be a fanboy to be very wrong.

And if you insist on comparing Havok CPU physics to PhysX GPU physics, then start by showing me Havok CPU physics besting PhysX CPU physics.
If you cannot do that...you have just proven how hollow (and misguided) your arguments really are.

.oO("Meaningful physics"...good one...what physics isn't "meaningful"?...I will tell you: none :rolleyes:)

What?

Havok is used in at least 211 games:
http://www.havok.com/index.php?page=available-games

PhysX in at least 208 games:
http://physxinfo.com/

Havok is used in Star Wars: The Force Unleashed, where we saw physics actually doing something in-game. PhysX can do this as well; I'm not saying one is better than the other. But GPU-accelerated PhysX has only been used for flavor (leaves, smoke, etc.) that has been emulated with scripted effects nearly as well in the past. Dynamic leaves are cool, I guess, but they're not the full potential.

The reason people in the thread say it's a joke is because the effects in Batman could have been done with scripted effects, but they were done as excessively over-calculated PPU effects or not at all, which was just a bad design decision - not an achievement of GPU physics.

Also, last thing I'll say: quad-core, 6-core, and 8-core CPUs, even though they aren't great at physics like a heavily parallel GPU is, are going to have nothing else to do but physics.
 
Last edited by a moderator:

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
Why does every fanboi come out of the woodwork to get all uppity when someone doesn't like PhysX...

What on earth is wrong with seeing the potential in something, yet not wanting to really care about it until that potential is realized?

Right now it is a graphics feature. As awesome as AA is, I know many folks, myself included at times, who don't care about it. I can enjoy Mass Effect 2 on my laptop almost as much as I can on my desktop, sans AA.

What we want is gameplay physics... The physics of the universe is not limited to smoke effects; it is the science of everything (yes, yes, I majored in applied physics...insert lame plug on its awesomeness here). Where is the issue with desiring it to be used as such in a game? Think of the puzzles, extra weapons, etc. we could be playing with if it was used to alter the gameplay instead of adding fancy smoke and drapes.

I am of the opinion that we have had the technology to implement fundamentally game-changing physics in games for years now. The reason we don't is often "lowest common denominator" syndrome...but I'd be surprised if it was not also slowed by the prevalence of closed standards.

I mean, who has not played Crayon Physics? Think what could be done with creativity like that applied to Crysis-level graphics and an SLI PhysX setup... (or even a multi-threaded CPU something).

If I could play a game like that, I'd give up all the AA and even the high resolution for the physics... As it stands, I remain "meh" on the current uses of it.
 

SmCaudata

Senior member
Oct 8, 2006
969
1,532
136
PhysX will NEVER be integral to actual gameplay in big budget games; it will always be eye candy.

If you were making a game and knew you could use PhysX for some actual gameplay mechanics, you would instantly make the game less playable for all ATI users. You can, however, sell it with different graphics capabilities.

The minute Nvidia made PhysX proprietary and broke it whenever an ATI card is installed in the system, they killed their own technology. We saw it with Betamax a long time ago.

NVIDIA is the reason that PhysX will never be anything more than eye candy for big budget games.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
PhysX will NEVER be integral to actual gameplay in big budget games; it will always be eye candy.

If you were making a game and knew you could use PhysX for some actual gameplay mechanics, you would instantly make the game less playable for all ATI users. You can, however, sell it with different graphics capabilities.

The minute Nvidia made PhysX proprietary and broke it whenever an ATI card is installed in the system, they killed their own technology. We saw it with Betamax a long time ago.

NVIDIA is the reason that PhysX will never be anything more than eye candy for big budget games.

You are likely right. As it stands, physx is almost all eye candy. Unfortunately, some people in this thread don't think nvidia should be helping developers add/code these things into games. They'd rather these additional special effects just not show up at all, EVER.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
You are likely right. As it stands, physx is almost all eye candy. Unfortunately, some people in this thread don't think nvidia should be helping developers add/code these things into games. They'd rather these additional special effects just not show up at all, EVER.

Did I say that??? I said it sucks until it adds something meaningful. You obviously don't understand that. You're defending it as if it's the next greatest thing when it's obviously not, cus if there's nothing to take advantage of it in a meaningful way, it sucks. And what's funny, not once did I say that I never wanted it to show up at all, if you actually read my posts. I said that if it does show up in a meaningful way then great, but right now it's a dud.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Right now it is a graphics feature. As awesome as AA is, I know many folks, myself included at times, who don't care about it. I can enjoy Mass Effect 2 on my laptop almost as much as I can on my desktop, sans AA.

Yep, I feel the same way.

What we want is gameplay physics... The physics of the universe is not limited to smoke effects; it is the science of everything (yes, yes, I majored in applied physics...insert lame plug on its awesomeness here). Where is the issue with desiring it to be used as such in a game? Think of the puzzles, extra weapons, etc. we could be playing with if it was used to alter the gameplay instead of adding fancy smoke and drapes.

I am of the opinion that we have had the technology to implement fundamentally game-changing physics in games for years now. The reason we don't is often "lowest common denominator" syndrome...but I'd be surprised if it was not also slowed by the prevalence of closed standards.

I mean, who has not played Crayon Physics? Think what could be done with creativity like that applied to Crysis-level graphics and an SLI PhysX setup... (or even a multi-threaded CPU something).

I like the idea of physics being involved with the strategy of the game also.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Maybe the dev was trying to be nice and spared AMD this sort of embarrassment?
I doubt it, but maybe they're just covering their butts.

Doubtful. I think the listed video cards all being nVidia, with nothing from AMD/ATI (CPU- or GPU-wise), is due to the TWIMTBP program more than anything.

There are two minimum video cards listed, and both are nVidia cards. Both will take a noticeable performance dip with PhysX enabled - enough that if you really thought PhysX (or DX11 and 3D Vision) was important, you'd either have a multi-GPU system or at least something more powerful. And why was there no mention of any CPUs from AMD that would meet the minimum? Technological superiority aside, we all know that while the C2Ds are great CPUs (I have a Core 2 Quad), there are comparable CPUs from AMD from a cost and performance standpoint.

On the recommended front, the CPU situation is the same as in the minimum section. The game's recommended video card is a DX10-only card, and 3D Vision is such a niche product that it is negligible at this point in time. Note that I'm not disparaging 3D Vision as a technology; it's just such a small part of the market that it could be said to barely exist. I haven't seen the PhysX effects used in Metro 2033, so I can't comment on that part of it. However, the recommended GPU from nVidia has no DX11, and it is arguable that even in this tier PhysX might be too much of a performance hit for the recommended card. Note that I am judging the performance hit from previous PhysX games, since I haven't seen benchmarks (preliminary or not) for this game.

On the optimal configuration front, there is no question the recommended CPU should be an Intel i7; it is the best consumer CPU in the x86 market. It should also be unquestioned that the recommended GPU is a GTX 470 or 480. This is the tier where consumers want every single bell and whistle shown, price be damned, and from that standpoint the GTX 470 and GTX 480 offer everything under the sun.

There is nothing wrong with any developer listing nVidia- and Intel-only parts; it is their right to do so. Just don't try to pass it off as sparing some company embarrassment unless said company's products are utter crap. And no one except those who are biased can argue that AMD/ATI products are crap at this time. Perhaps not the best, but certainly nothing to be ashamed of.

I hate to sound like a broken record here, but physics simulation is middleware, it's not an API construct. The solution won't be from MS, it will be from someone like Havok offering a physics simulation package that runs on DirectCompute/OpenCL.

I understand where you're coming from, but it's probably a case of quibbling over semantics. It's also possible a solution could come from MS, and that would probably be preferable, as it would be vendor-agnostic. A solution from MS also does not mean there can't be competing standards. It would certainly move physics acceleration in games forward and bring game-changing additions with a standard all developers can use, instead of a few bits of visual fluff.

Incidentally, for all you PhysX backers, Battlefield: Bad Company 2 and Red Faction: Guerrilla utilize physics in a way that changes and enhances how the games are played. No PhysX game has utilized physics to this degree. These two games run on Havok and use the CPU for physics. This is not a wholesale endorsement of Havok; I'm merely pointing out to the "PhysX is teh pwnz" folks that other physics engines can be just as capable as PhysX.

Here you go a game where gameplay changed because of physx
http://physxinfo.com/data/vreview_cm2.html

The water effects (where PhysX is employed) looked like crap, but we're worried about physics, not visuals. On that front, the game seems to be a single-threaded app, so it was already hampered when using the CPU to process physics; more likely than not, the effects could have been handled purely by a multi-core CPU if properly coded to do so. This is the only info I could find on whether Crazy Machines 2 is multi-threaded or single-threaded. If you have info to the contrary, please link.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
And if you insist on comparing Havok CPU physics to PhysX GPU physics, then start by showing me Havok CPU physics besting PhysX CPU physics.

Play Battlefield: Bad Company 2 - awesome physics, and it's done with Havok on the CPU.
It affects GAMEPLAY and isn't just there for eye candy (like PhysX smoke and paper).
You can destroy objectives by leveling buildings.

I ran this game on my triple-core CPU with the FPS in the 50s at 1920x1080:
http://www.youtube.com/watch?v=rCe8fBn58Fo
http://www.youtube.com/watch?v=KvZMsVxphFU&feature=related
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Like I said, how many games change the way they play cus of PhysX??? None.

You keep on referencing how good PhysX is on paper, but when there's nothing that takes advantage of it in a meaningful way, how good is it then?


Keep dodging the question.
You have now gone from "PhysX is useless" to "Havok is better than PhysX"...to "I have seen NO good physics".

Again, I'd rather have something...than nothing.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
What?

Havok is used in at least 211 games:
http://www.havok.com/index.php?page=available-games

Physx at least 208 games:
http://physxinfo.com/

Yea, Havok is quite well represented...in older games.
Last year, the most used physics API was PhysX.
In 3 years, PhysX has overtaken Havok.


Havok is used in Star Wars: The Force Unleashed, where we saw physics actually doing something in-game. PhysX can do this as well; I'm not saying one is better than the other. But GPU-accelerated PhysX has only been used for flavor (leaves, smoke, etc.) that has been emulated with scripted effects nearly as well in the past. Dynamic leaves are cool, I guess, but they're not the full potential.

You are joking, right?
Simple rigid-body physics that disappears after 10 seconds is "doing something useful"?
I guess you think your CPU will beat your GPU at folding too?

The reason people in the thread say it's a joke is because the effects in Batman could have been done with scripted effects, but they were done as excessively over-calculated PPU effects or not at all, which was just a bad design decision - not an achievement of GPU physics.

Bollocks; by scripting it, you take out the part that makes it believable...the dynamic collisions.
I'm sure we could script an entire game.
What would we call such a thing?
Script-o-pictures?
Or just about any console cutscene.
"Fun" to watch...not as fun to "play".

Also, last thing I'll say: quad-core, 6-core, and 8-core CPUs, even though they aren't great at physics like a heavily parallel GPU is, are going to have nothing else to do but physics.

You do realize that CPU physics is far from scaling 100% with each added core?
Jack of all trades, master of none makes the CPU a slouch for physics...unlike, let's say, a GPU...which shines at SIMD.
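The scaling point above can be made concrete with Amdahl's law: if a fraction p of a physics step parallelizes and the rest stays serial, n cores give a speedup of 1 / ((1 - p) + p/n). A quick sketch (the 70% parallel fraction is an assumption for illustration, not a measured figure for any real engine):

```python
def amdahl_speedup(p, n):
    """Speedup on n cores when fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

# Assume 70% of a physics step parallelizes (illustrative only).
for cores in (2, 4, 8):
    print(cores, round(amdahl_speedup(0.7, cores), 2))
# → 2 1.54
# → 4 2.11
# → 8 2.58
```

Even with no serial fraction at all (p = 1), the speedup is capped at n, which is why a handful of general-purpose cores can't close the gap to a GPU running thousands of SIMD lanes over the same inner loop.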
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
Keep dodging the question.
You have now gone from "PhysX is useless" to "Havok is better than PhysX"...to "I have seen NO good physics".

Again, I'd rather have something...than nothing.

Umm, what question did you ask? I don't see any. You just keep saying I didn't answer something.

And how do I keep going from one thing to the other? PhysX is useless right now cus it adds nothing to any game that couldn't be done otherwise. Batman AA and Mirror's Edge did things that you can do without PhysX.

I'll support something that does something now, that everyone can use, rather than something that does nothing.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Why does every fanboi come out of the woodwork to get all uppity when someone doesn't like PhysX...

I wouldn't have a problem...if it wasn't for the sheer AMOUNT of babble being posted.
FUD is being used as fact...let me give you a list of the most used (and most ignorant, too!) reasons for not liking PhysX:

  • It's proprietary!!!/owned by NVIDIA
Well, I hope you turn off Windows (and its DirectX parts)...and only run Linux...and OpenGL games.

  • It's no better than my CPU/(artificially limited).
Prove it!
Show me where Havok can match GPU PhysX.
Or did you just get confused by all the SIMD, SP, DP, MHz, GHz, and bus width...and now also think your CPU can render all your graphics better than (or the same as) your GPU?

  • It doesn't alter the gameplay!
Show me what does.
Does AA change the gameplay?
Does AF?
Texturing?
Shading?
No?
Then they must be useless too...oh wait.


Some of these have already been posted in this very thread.
The problem is a lot of ignorant wannabe geeks posting FUD, which in turn makes other people believe their FUD.

I wouldn't have a problem if it were educated people debating the facts, based on data/benches...but it's not...it's frakking far from it.

So as long as people with no clue post FUD...I will counter it.
If you see that as a bad thing...I guess I can put you in the category of "people with no clue".

If not...then we are on the same page.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
I wouldn't have a problem...if it wasn't for the sheer AMOUNT of babble being posted.
FUD is being used as fact...let me give you a list of the most used (and most ignorant, too!) reasons for not liking PhysX:

  • It's proprietary!!!/owned by NVIDIA
Well, I hope you turn off Windows (and its DirectX parts)...and only run Linux...and OpenGL games.

  • It's no better than my CPU/(artificially limited).
Prove it!
Show me where Havok can match GPU PhysX.
Or did you just get confused by all the SIMD, SP, DP, MHz, GHz, and bus width...and now also think your CPU can render all your graphics better than (or the same as) your GPU?

  • It doesn't alter the gameplay!
Show me what does.
Does AA change the gameplay?
Does AF?
Texturing?
Shading?
No?
Then they must be useless too...oh wait.


Some of these have already been posted in this very thread.
The problem is a lot of ignorant wannabe geeks posting FUD, which in turn makes other people believe their FUD.

I wouldn't have a problem if it were educated people debating the facts, based on data/benches...but it's not...it's frakking far from it.

So as long as people with no clue post FUD...I will counter it.
If you see that as a bad thing...I guess I can put you in the category of "people with no clue".

If not...then we are on the same page.

Most here aren't debating any numbers or benches or anything. I'm saying that PhysX has nothing that uses it in a meaningful way, and there are other APIs that do. Numbers mean nothing when there's nothing good that uses it.
 