Nvidia Fermi is recommended for Metro 2033


zerocool84

Lifer
Nov 11, 2004
The moronic "my CPU is as good as a GPU for physics" FUD might soon be a thing of the past:

http://www.geeks3d.com/20100304/preview-physx-simulation-on-a-16-core-cpu/

I wonder what that will leave the naysayers with?
Oh wait..."meaningful"...yeah :thumbsdown:

I hope so, because GPU physics has nothing decent to show for it right now. Go ahead and try all you want with numbers and stuff, but numbers mean nothing when there are no good games to use it, and use it well.
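What the linked preview boils down to is that per-object physics updates are independent of one another, so a simulation step spreads almost linearly across CPU cores. A minimal sketch of the idea, assuming a plain Euler integrator; the names are illustrative and this is not the actual PhysX API:

```cpp
// One simulated frame of "embarrassingly parallel" particle physics,
// split across hardware threads the way a multi-core physics step would be.
#include <cstdio>
#include <thread>
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

// Integrate the half-open range [lo, hi); each particle is independent.
void integrate(std::vector<Particle>& p, size_t lo, size_t hi, float dt) {
    for (size_t i = lo; i < hi; ++i) {
        p[i].vy -= 9.81f * dt;    // gravity
        p[i].px += p[i].vx * dt;  // Euler position update
        p[i].py += p[i].vy * dt;
        p[i].pz += p[i].vz * dt;
    }
}

int main() {
    std::vector<Particle> particles(200000, Particle{0, 0, 0, 1, 0, 0});
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 4;  // fallback when the core count is unknown
    std::vector<std::thread> pool;
    const size_t chunk = particles.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        size_t lo = c * chunk;
        size_t hi = (c + 1 == cores) ? particles.size() : lo + chunk;
        pool.emplace_back(integrate, std::ref(particles), lo, hi, 1.0f / 60.0f);
    }
    for (auto& t : pool) t.join();  // one frame's worth of physics done
    std::printf("p0.x after one step: %f\n", particles[0].px);
}
```

Whether 16 such cores can match a GPU is exactly the question the linked preview's benchmarks are trying to answer.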
 

akugami

Diamond Member
Feb 14, 2005
The problem is that while MS would give you a vendor-agnostic setup, they won't be platform-agnostic, which PhysX already is. Right now PhysX runs on far more systems used for gaming than DirectX. I'm not saying an open solution would be bad in any way at all, just pointing out that MS certainly isn't going to provide us with anything close to an open standard. PhysX is already more open in terms of developers' needs than DirectX.

Personally I would much rather see something like Bullet take off, which could run on any of the available gaming platforms. That would be a far better solution than switching between a bunch of entirely proprietary standards (PhysX, DirectCompute, Havok), all with hardware companies behind them having vested interests in making things not work on another platform.

I agree; I'd rather see something like Bullet physics (or other completely independent middleware) take off rather than a solution from MS. I was just pointing out that just because MS doesn't have physics middleware out there right now doesn't mean they won't put one out. In fact, from MS's perspective, it actually makes a lot of sense...you ease the development of games between MS's game consoles and Windows as well as tighten the ties between the game and MS.

The only thing a solution from MS would provide is physics middleware for the PC that would not be tied to ATI or nVidia. From that point, it would be up to others to compete with MS, and up to ATI and nVidia to provide the best physics acceleration possible. For PC gamers, that would be the best possible solution.

And as an aside, software-based PhysX is prevalent, but hardware-accelerated PhysX is still rather small at this point in time.
 

Lonbjerg

Diamond Member
Dec 6, 2009
I hope so, because GPU physics has nothing decent to show for it right now. Go ahead and try all you want with numbers and stuff, but numbers mean nothing when there are no good games to use it, and use it well.
Could you define the difference between GPU and CPU physics?

You can't...can you?
Let me tell you:
The only difference (but also the key factor) is....PERFORMANCE.

Still waiting for a logically valid argument from you...argument from ignorance is a bad position to have.
 

Lonyo

Lifer
Aug 10, 2002
Could you define the difference between GPU and CPU physics?

You can't...can you?
Let me tell you:
The only difference (but also the key factor) is....PERFORMANCE.

Still waiting for a logically valid argument from you...argument from ignorance is a bad position to have.

Funny you should mention performance.
Most of the added PhysX effects in fact REDUCE performance. So while having a graphics card to offload PhysX stuff onto means that more physics can be run, the added physics actually reduces overall game performance in the majority of cases, while only adding nice graphics.

Maybe in games without added PhysX effects there is the opportunity for improved performance, but for the most part the touted added effects in games like Mirror's Edge result in reduced performance.
 

Larries

Member
Mar 3, 2008
Does taking away AA do anything to the gameplay?
Does taking away AF do anything to the gameplay?
Does going over 640x480 do anything to the gameplay?

You need to define and specify "meaningful" before your argument makes any sense.

"Meaningful" seems to be your last straw.
No link to in-game Havok physics to back up your "stance".
No definition of "meaningful" to back up your "stance".

Care to surprise me...or am I correct in assuming you have nothing logical to add?

Actually, going over 640x480 does have a significant and meaningful impact on gameplay. Most of the time, higher resolution means you can view more things at once. For strategy games, the map is larger, making decisions faster. For MMOs, you can put many more chat windows around without blocking your view, etc.
 

Lonbjerg

Diamond Member
Dec 6, 2009
Funny you should mention performance.
Most of the added PhysX effects in fact REDUCE performance. So while having a graphics card to offload PhysX stuff onto means that more physics can be run, the added physics actually reduces overall game performance in the majority of cases, while only adding nice graphics.


OMG :eek:
STOP THE PRESS!!!

More features mean more calculations...who would have THOUGHT!!!
(But try calculating all that on the CPU...*hint-hint*)


So do AA, AF and higher resolution.
But since you are not against those things, I can only presume we are dealing with sour grapes syndrome.
A has something.
B doesn't have something.
B wants to have something.
Fan of B disses product from A...because of sour grapes.
Until B gets something.
Then all is fine and dandy.

Maybe in games without added PhysX effects there is the opportunity for improved performance, but for the most part the touted added effects in games like Mirror's Edge result in reduced performance.

No matter how you slice and dice it, this is the bottom line:
The world is ruled by physics.
Games tend to mimic the real world.
We have gotten quite good at making pretty dead worlds.
The next step is to get more physics involved.

And not the Battlefield: Bad Company fake physics that seems to impress console players...but more REALISTIC (aka not scripted) "destruction" physics.

NVIDIA realizes this...and acquired AGEIA.
Intel realizes this...and thus got Havok.
AMD realizes this...and (well...they are pretty broke and couldn't buy them...) are working with Bullet Physics.

You are telling me that PhysX is useless...I am telling you that as soon as AMD gets on the playing field, people will change their arguments..and suddenly GPU-physics are the "COOOOOLZ!!!".

I might even go through several posts, just to show how fanboys change their stance on this topic.

Thank you for inspiring me ;)

GPU-physics is not only here to stay....it hasn't even taken off yet.
(Even if people predicted it dead in 2006, 2007, 2008, 2009 and now in 2010..some people learn slow.)
 

Lonbjerg

Diamond Member
Dec 6, 2009
Actually, going over 640x480 does have a significant and meaningful impact on gameplay. Most of the time, higher resolution means you can view more things at once. For strategy games, the map is larger, making decisions faster. For MMOs, you can put many more chat windows around without blocking your view, etc.

So the resolution changes the FOV in an FPS? :hmm:
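For reference, most modern shooters use a "Hor+" scheme: the vertical FOV is fixed and the horizontal FOV is derived from the aspect ratio. Under that scheme a higher resolution at the same aspect shows exactly the same scene, while a wider aspect genuinely shows more. A quick sketch, assuming an illustrative 60° vertical FOV:

```cpp
// Hor+ FOV: horizontal FOV follows from vertical FOV and aspect ratio.
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

double horizontalFovDeg(double verticalFovDeg, double aspect) {
    double v = verticalFovDeg * kPi / 180.0;  // degrees -> radians
    return 2.0 * std::atan(std::tan(v / 2.0) * aspect) * 180.0 / kPi;
}

int main() {
    std::printf("640x480   (4:3):  %.1f deg\n", horizontalFovDeg(60.0, 4.0 / 3.0));
    std::printf("1280x960  (4:3):  %.1f deg\n", horizontalFovDeg(60.0, 4.0 / 3.0));
    std::printf("1920x1080 (16:9): %.1f deg\n", horizontalFovDeg(60.0, 16.0 / 9.0));
}
```

At the same aspect ratio, 640x480 and 1280x960 give identical FOVs; what changes with resolution alone is UI real estate (chat windows, map detail), which is the kind of gain Larries is describing.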
 

zerocool84

Lifer
Nov 11, 2004
Could you define the difference between GPU and CPU physics?

You can't...can you?
Let me tell you:
The only difference (but also the key factor) is....PERFORMANCE.

Still waiting for a logically valid argument from you...argument from ignorance is a bad position to have.

I don't care what the difference is. The only difference is that no good games use GPU physics well at all. Like I said, I don't care about numbers or anything. Good games are all I care about.
 

shaynoa

Member
Feb 14, 2010
How can Nvidia Fermi be recommended for any game when nobody knows a thing about Fermi? Are you really going to trust a recommendation backed by no benchmarks?
What a joke
 
Jan 24, 2009
I make no distinctions when judging physics, regardless of whether they are run on a GPU or a CPU, regardless of the vendor.

Extra visual effects are always nice, and I would never refuse them unless they reduce performance to an unsatisfactory level, but they are not something that compels me overly much.

I am more interested in applications of physics that enhance the gameplay of whichever title it is that I am playing. (as an aside, to date, I think Red Faction Guerrilla has displayed this best)
 

Lonyo

Lifer
Aug 10, 2002
PhysX is not particularly compelling, but that's not the fault of developers or NV per se; it's just the nature of the beast. No one person can make it compelling, and it's difficult to force the issue. It's almost one of those things that will have its time eventually.
Maybe it's a tenuous comparison, but I would liken it somewhat to tablet PCs. They have been around for years, but they have been mostly expensive, limited and without much mass-market appeal. With the development of technology they have become more affordable, more useful, and have more appeal. But that's taken time, because the technology has needed to come together, costs have had to drop, other things like Wi-Fi have become mainstream, and the internet has taken off. Such things have all helped make tablets the upcoming thing, even though tablets have been around for ages.
Maybe when everything comes together for hardware accelerated physics we will see it hit the mainstream, but it might take time.


You are telling me that PhysX is useless...I am telling you that as soon as AMD gets on the playing field, people will change their arguments..and suddenly GPU-physics are the "COOOOOLZ!!!".

I might even go through several posts, just to show how fanboys change their stance on this topic.

Thank you for inspiring me ;)

GPU-physics is not only here to stay....it hasn't even taken off yet.
(Even if people predicted it dead in 2006, 2007, 2008, 2009 and now in 2010..some people learn slow.)

I posted the top quote before the post you quoted originally.
Thanks for putting words in my mouth that I did not say, and then saying the same thing that I had already said in this thread.

We seem to agree with each other. GPU physics is nothing special, it's just like AA/AF (I run without AA/AF for the most part because my computer can't really handle them at the resolution I play in newer games, and I just don't care that much).
I'm sure they can be nice features to have for those who pay that much attention, but to me my games mostly look fine (Bioshock/2 excluded) with some details turned down, and I don't miss AA or AF (I have used them before), same as I don't think I miss PhysX (even though I haven't had a chance to use it since my previous card was only a 7800GT which didn't support hardware PhysX and currently I have ATI).

It has nothing to do with a level playing field, it has to do with worth and value. You seemed to be claiming that PhysX is awesome now, and then you just changed to say that it's not taken off yet... so presumably the additions we have currently from PhysX aren't in fact that compelling, since they aren't particularly widespread? Or perhaps there are barriers to use outside NV's control?
Wait, I said both of those things already and you seem to be saying the same.

There is a world outside the high end/enthusiast market, and when it comes to playing games I would consider myself mostly part of that market. I play games, I have a decent computer, but I don't care about having every single little bell and whistle enabled, partly because I don't care, partly because my computer isn't up to it. The same can be said of the majority of game players, and consequently hardware-accelerated PhysX remains a niche product in a niche market, because it isn't compelling enough for people to really want it, or because their hardware can't really handle it.
 

Soltis

Member
Mar 2, 2010
Hello to all!~

Been reading through posts in this thread (and this forum for a short while now) and finally decided to post (although it's a bad sign that my first post is in what appears to be a battlefield, I suppose).

Everyone in the PC business has a #1 goal of making money, I would presume, and part of that means giving the most thought to the largest part of the market (mainstream). Game developers who take this into mind respond by putting the most effort into the largest selling points (console platforms, and games that can run on most setups to some degree). This means that tying things that run the game into tech that is available to only part of the market (aka using PhysX to its full potential) will usually result in reduced sales, which is counterproductive for developers. For those who are saying people who didn't care for PhysX are going to suddenly turn around when ATI jumps in: of course, simply because when that happens PhysX will probably be considered mainstream by developers and therefore good for business.

I personally can only see two ways for PhysX to truly be utilized:
1) PhysX becomes "mainstream" (i.e. not Nvidia-only)
2) Nvidia starts developing games as well as video cards
Although I could always be wrong.


In regard to "meaningful" not being a valid argument.. lol, "meaningful" isn't a "valid" argument, but at the end of the day it's really the only argument that matters, even though it varies from person to person, considering these video cards often cost as much as or more than a console system just to play most of the same games (at the moment).


But take what you will from this post. For those who would label me a "fanboy" or any such thing, just know I haven't built my system yet, so I'm actually still on the fence about which companies to go with for now. Just consider this the opinion of a relatively informed, relatively unbiased observer who has never always been right. :D
 

yepp

Senior member
Jul 30, 2006
You need to reach down your pants and grab a double handful of balls.

No, that accolade goes to 4A Games/THQ for dropping the balls by not promoting 3D Vision Surround in their system requirements for Metro 2033.
 

Tempered81

Diamond Member
Jan 29, 2007
PhysX = Nvidia asks the dev to take explosions + particles, remove them from the final code, and include them in the PhysX package to be off-loaded from the CPU and calculated by a PhysX coprocessor, or an Nvidia GPU, or an NV GPU acting as a coprocessor.

They know that as the effects & simulations become more complex, the CPU will struggle to process the calculations, so proprietary APIs allowing the calculations to be done on more capable hardware were developed: Havok, Ageia, Bullet, PhysX.

Nvidia has become the master at removing things like fog, sparks, explosions, and various other post-effects from a TWIMTBP game and including them in their physics package. At this point in time, I can't think of any video game physics calculations that cannot be done on a Deneb or Bloomfield. You would have to intensify the situation, like FluidMark, the rocket sled demo, or particle simulations, to really tax the CPU and make it worthwhile on the GPU. If particle simulations are your game of choice, then you NEED PhysX. If you want to enjoy the special effects in Batman: Arkham Asylum, you NEED PhysX (but not technically).

I hate PhysX, but a compute standard is going to be needed once in-game effects become too complex for a CPU. This is starting to take place now, but it isn't fog or sparks that are too complicated for a CPU.
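To put rough numbers on "too complex for a CPU": independent particles scale linearly, but once they interact (fluid, colliding debris) a naive pass approaches n²/2 pair evaluations per frame. A back-of-the-envelope sketch; the per-frame budget here is an assumed, purely illustrative figure:

```cpp
// Pairwise-interaction cost grows ~n^2/2, which is what eventually
// pushes heavy effects off the CPU and onto massively parallel hardware.
#include <cstdio>

int main() {
    const double frameBudgetOps = 1e8;  // assumed ops per 60 fps frame
    const double counts[] = {1e3, 1e4, 1e5};
    for (double n : counts) {
        double pairs = n * (n - 1.0) / 2.0;  // n*(n-1)/2 pair evaluations
        std::printf("n=%-7.0f pairs/frame=%.2e %s\n", n, pairs,
                    pairs > frameBudgetOps ? "(blows the budget)" : "(fits)");
    }
}
```

A thousand sparks fit comfortably on any modern CPU, which is why fog and sparks alone don't justify GPU offload; a hundred thousand interacting particles are another matter.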
 

BenSkywalker

Diamond Member
Oct 9, 1999
In fact, from MS's perspective, it actually makes a lot of sense...you ease the development of games between MS's game consoles and Windows

What are MS's big franchises using for physics on the PC now? Halo3/Halo ODST, Forza3? I think looking into what their PC ports run for physics will help people understand how seriously MS is taking the advancement of PC games right now.
 

badb0y

Diamond Member
Feb 22, 2010
Did I say anything at all about DirectX 11? Let me check... nope, sure didn't.
Haha, I didn't see you had responded to my post.

I believe that above being a game that supports PhysX and 3D, Metro 2033 is a DirectX 11 game. You can argue all you want, but the fact remains that 3D Vision (or whatever it's called) is a pretty small part of the GPU market. PhysX is a nice feature, but it requires you to use an nVidia card, which not everyone uses.

So besides those features, I think most people want to play this game because it supports DirectX 11 more than anything else. When you completely ignore the DirectX 11 parts currently available, I think that's quite a slap in the face to ATi/AMD and to people who are actually using ATi/AMD cards.

Yes, I know that certain features in the game require nVidia's cards, but how come there's no ATi/AMD part mentioned for the recommended or even the minimum spec? That is what I find unacceptable.

Also, was it so hard to just say "Or equivalent"? I mean, you don't have to list ATi/AMD's offerings, but you should acknowledge them.
 

Madcatatlas

Golden Member
Feb 22, 2010
Why does this have so many people on edge?

Is it your feeling of righteousness that is being stepped on?
In a competitive market, you either have to do what Nvidia does, namely work to cut off your competitor and fortify yourself so you can get the upper hand, or you work at it from another angle, like just having a superior product line.

I'm sure there are game boxes where only ATI graphics parts are mentioned.

As a sidestep, but an on-topic remark: even in the AMD Athlon's glory days, game boxes still had some form of text like this: "Intel Pentium 4 3GHz or the equivalent required". Should it not have been "AMD Athlon or equivalent"?

I blame ATI for this happening, honestly. Where is the ATI equivalent of The Way It's Meant To Be Played?
 

akugami

Diamond Member
Feb 14, 2005
What are MS's big franchises using for physics on the PC now? Halo3/Halo ODST, Forza3? I think looking into what their PC ports run for physics will help people understand how seriously MS is taking the advancement of PC games right now.

They don't have one. Nor is it likely they will have one any time soon, but Halo would definitely be a candidate. I don't know who owns the Forza franchise, but racing games are not really my thing. Racing games are definitely a candidate for realistic physics though.

I think there is also a "chicken and egg" thing to a degree. If MS had a decent solution then developers would likely use it. To a large degree, I think it's why GPU PhysX has had such minimal support from developers, and for not much more than graphical eye candy. You'll notice that software-based PhysX actually has very good support (as does software Havok).

In the case of a game like Battlefield: Bad Company 2, the software-based physics has gone way beyond what is seen from GPU-accelerated physics, probably because it works on almost any PC (if you have powerful hardware) and on consoles. That's something you're arguing for, and I agree 100%. It's just that too many nVidia fanboys are arguing for PhysX and not seeing how nVidia's stance on PhysX (let's not get into whose fault PhysX's current status is, because I think both nVidia and ATI are to blame) is actually hindering GPU-accelerated physics.
 

Kuzi

Senior member
Sep 16, 2007
No matter how you slice and dice it, this is the bottom line:
The world is ruled by physics.
Games tend to mimic the real world.
We have gotten quite good at making pretty dead worlds.
The next step is to get more physics involved.

You see, the world is ruled by "Physics", not "PhysX". Developers can and will use open standards such as OpenCL and DirectCompute to run their physics engines on.

I commend Nvidia for always pushing new standards and technologies, but no matter how you slice it and dice it, closed proprietary standards such as PhysX are a dead end.
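As a concrete sketch of what "physics on an open standard" could look like, here is a minimal OpenCL host program with a gravity-only integrator kernel. Kernel and buffer names are illustrative, error handling is omitted, and this is a sketch of the approach rather than any shipping engine's code:

```cpp
// Minimal OpenCL sketch: integrate particles on whatever GPU is present,
// vendor-agnostic. Build with the OpenCL headers/library installed.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

// One work-item advances one particle by a single Euler step.
static const char* kSrc = R"CLC(
__kernel void integrate(__global float4* pos, __global float4* vel,
                        const float dt, const int n) {
    int i = get_global_id(0);
    if (i >= n) return;
    vel[i].y -= 9.81f * dt;   // gravity
    pos[i]   += vel[i] * dt;  // Euler step
}
)CLC";

int main() {
    const int n = 65536;
    const size_t bytes = n * 4 * sizeof(float);
    std::vector<float> pos(n * 4, 0.0f), vel(n * 4, 0.0f);

    // Boilerplate: first platform, first GPU device (no error checks: sketch only).
    cl_platform_id plat;  clGetPlatformIDs(1, &plat, nullptr);
    cl_device_id dev;     clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
    clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "integrate", nullptr);

    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR, bytes, pos.data(), nullptr);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR, bytes, vel.data(), nullptr);

    float dt = 1.0f / 60.0f;
    clSetKernelArg(k, 0, sizeof(cl_mem), &dPos);
    clSetKernelArg(k, 1, sizeof(cl_mem), &dVel);
    clSetKernelArg(k, 2, sizeof(float), &dt);
    clSetKernelArg(k, 3, sizeof(int), &n);

    size_t global = n;
    clEnqueueNDRangeKernel(q, k, 1, nullptr, &global, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, dPos, CL_TRUE, 0, bytes, pos.data(), 0, nullptr, nullptr);
    std::printf("particle 0 y after one step: %f\n", pos[1]);
}
```

Essentially the same kernel body ports almost line-for-line to a DirectCompute HLSL compute shader, which is the point: the physics code stops caring whose GPU it runs on.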

Here is a history example for you. Nvidia's first 2D/3D card, the NV1, had pretty impressive 3D performance for its time (1995), but it used a proprietary rendering method called "quadratic texture mapping". While a few games were released for it, developers in the end decided to go with the open standards Direct3D and OpenGL instead.

From firingsquad: "Although the NV1 was technologically superior to other chips of that era from a number of perspectives, the proprietary quadratic texture mapping of the NV1 was its death sentence. When Microsoft finalized Direct3D not too long after the NV1 had reached store shelves, polygons had been chosen as the standard primitive, and despite NVIDIA's and Diamond's best efforts, developers were no longer willing to develop for the NV1."

http://www.firingsquad.com/features/nvidiahistory/page2.asp

Of course, the situation with PhysX is a bit different, because Nvidia has a longer head start this time, and their cards also support OpenCL/DirectCompute. But all this means is that it will take a bit longer before PhysX fades into oblivion.
 

Tempered81

Diamond Member
Jan 29, 2007
"Although the NV1 was technologically superior to other chips of that era from a number of perspectives, the proprietary quadratic texture mapping of the NV1 was its death sentence. When Microsoft finalized Direct3D not too long after the NV1 had reached store shelves, polygons had been chosen as the standard primitive, and despite NVIDIA's and Diamond's best efforts, developers were no longer willing to develop for the NV1."

http://www.firingsquad.com/features/nvidiahistory/page2.asp

Of course, the situation with PhysX is a bit different, because Nvidia has a longer head start this time, and their cards also support OpenCL/DirectCompute. But all this means is that it will take a bit longer before PhysX fades into oblivion.

Fade into oblivion, and rightfully so.
Hopefully, with PhysX gone, Nvidia will have an incentive to produce more compelling hardware innovations that keep the market competitive. Proprietary software technology and registered trademarks out the wazoo make for a pretty uneven playing field in our PC gaming arena. If enough devs get on board with PhysX, Nvidia's sales rise, and ATI has to produce hardware twice as capable just to maintain sales (which is unlikely) if they don't have their own proprietary solution to compete with. Then we are left with a stalemate: stagnant hardware progress & innovation, 20 more generations of rebranded G80s & G92s, proprietary trademarking of all IP from the Nexus C++ debugger to the lowercase n in the Nvidia logo. Nobody wants the GTX 780 to be a G92...
 

NoQuarter

Golden Member
Jan 1, 2001
What are MS's big franchises using for physics on the PC now? Halo3/Halo ODST, Forza3? I think looking into what their PC ports run for physics will help people understand how seriously MS is taking the advancement of PC games right now.

The Halo games all use Havok; dunno about Forza 3, but I would imagine an in-house physics engine, since it's supposed to simulate car physics specifically.