any consideration to the PhysX factor?


nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
Originally posted by: nRollo

Do you know how many tens of millions of people that market includes?

Selling ONE million copies of a game is a very big deal to a developer. Might make it a little easier to get there if you put some code in that 30-40 million people can take advantage of and make the game look better.
Or they could use Havok which everyone can use, so hundreds of millions would benefit.

Software Havok is actually quite excellent. Far Cry 2 uses it and it has amazing physics, especially when things blow up, and when vegetation sways in the wind. I can actually think of a large number of games that I play that use Havok and have great physics which everyone can use.

In the business world, money drives decisions, and money only.
In that case why implement a feature tens of millions can use when you can implement a feature everyone can use?

Have you been gaming long enough to remember GLQuake BFG10K?

At the time, the only people who could run it were those with 3DFX cards, and yet John Carmack went to the trouble to do the OpenGL port just for them so his game could look better.

Same with a lot of titles back then. Developers want to advance their product, not be limited by the feeble amounts of physics processing they can get from the CPU.

The number of games in development now with PhysX and the number of big development firms who have adopted it as their physics standard is evidence of this.

 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: nRollo
Next month you'll be able to play Mirror's Edge, and in February you'll be able to play Cryostasis.

There are many games slated for launch this year; undoubtedly some will make it, some will not.
And again, how much GPU will it take to run those games with PhysX effects enabled and still maintain a respectable framerate? Those Cryostasis benchmarks you posted don't look promising at all in that regard. Can you assure us that Mirror's Edge or any of the other upcoming games will be any different? For all you actually know, he might even wind up worse off in practice by picking the card with PhysX support over the alternative.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
OP, I hope this thread answered your question. The same handful of people (some focus group members who are compensated with free hardware to push Nvidia products/features and a fanboy) are telling you Physx is incredible. Pretty much everyone else says at this point it's pretty 'meh' but could be worthwhile in the future.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: SlowSpyder
OP, I hope this thread answered your question. The same handful of people (some focus group members who are compensated with free hardware to push Nvidia products/features and a fanboy) are telling you Physx is incredible. Pretty much everyone else says at this point it's pretty 'meh' but could be worthwhile in the future.

Really?

Guru3d

So then, I have to admit to liking what I tested today. Overall, PhysX adds a much more immersive experience to gaming. NVIDIA's implementation as it is right now is downright good, as they offer combinations.

Elite Bastards

Well, now we know, and it has to be said that NVIDIA has turned PhysX into a great marketing point for their GeForce series of graphics boards in a quicker time than any of us would have thought possible. Here we are, six months later, with a GPU-accelerated version of PhysX in our laps, and a handful of real-world game titles to use it on. It works, and it works well, and kudos is due to the software engineers who worked on this translation via CUDA for their efforts.

Do you mean to say forum users who aren't particularly fond of NVIDIA or NFG don't care for PhysX, as reviewers obviously do?
 

Atechie

Member
Oct 15, 2008
60
0
0
Originally posted by: rogue1979
Based on the only game that I play with Physx, it isn't worth worrying about.

GRAW2's PhysX implementation is too weird. You have the option to turn the PhysX setting to "extreme".

What does this add to the game? Explosions look a little more detailed, as do the debris field and some damage to the actual environment. But I turned it down to normal because there were small bits of "stuff" constantly floating around on the ground and sometimes in the air. Way too distracting; it takes away from realistic gameplay.

Honestly, I noticed the extra eye candy only when stopping to "watch" an explosion. When I was busy fragging away I didn't notice anything except the distracting weird crap flying around.

I was using an overclocked 9600GT which handled the game just fine, even with physx turned on.

It was so important to me that I switched back to an overclocked HD 3850 which looks better to me in some older games...

Somebody sounds like they have never been on a real battlefield...in combat.
 

Atechie

Member
Oct 15, 2008
60
0
0
Originally posted by: SlowSpyder
OP, I hope this thread answered your question. The same handful of people (some focus group members who are compensated with free hardware to push Nvidia products/features and a fanboy) are telling you Physx is incredible. Pretty much everyone else says at this point it's pretty 'meh' but could be worthwhile in the future.

I have no affiliation with NVIDIA, AMD or Intel...and I still think hardware accelerated physics is the biggest thing since accelerated 3D...so your little fallacy is nothing but that...a fallacy.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: nRollo
Do you mean to say forum users who aren't particularly fond of NVIDIA or NFG don't care for PhysX, as reviewers obviously do?

He obviously means that we consumers generally have yet to find much reason to appreciate PhysX support, unlike the NFG members and reviewers who don't have to pay for their hardware, and the few other people who hold a particularly strong fondness for Nvidia. Granted, you being on the latter side of the divide, it is easy to see how you misunderstood him.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I'm just gonna say that from a user standpoint, I think that what NV has been doing with CUDA is fantastic. PhysX is interesting as well, but it will take some time before it's useful, and will probably require next-gen GPUs to be enjoyed in its full glory.

Really I think the biggest deal is going to be the use of the GPU for general processing tasks, such as Photoshop, Video Editing, and 3D Modeling. I know from experience that 3D Modeling programs need all the processing power they can get; hopefully AMD and NV can contribute something here.

The whole PhysX thing just shows how much the two companies hate each other, and how unwilling they are to work on a standard together. It's pretty stupid. I'm not sure which company is at fault.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Atechie
Originally posted by: SlowSpyder
OP, I hope this thread answered your question. The same handful of people (some focus group members who are compensated with free hardware to push Nvidia products/features and a fanboy) are telling you Physx is incredible. Pretty much everyone else says at this point it's pretty 'meh' but could be worthwhile in the future.

I have no affiliation with NVIDIA, AMD or Intel...and I still think hardware accelerated physics is the biggest thing since accelerated 3D...so your little fallacy is nothing but that...a fallacy.


If you look through the 100+ replies to this thread you'll see that most people not affiliated with Nvidia are not impressed with PhysX... yet. I think hardware accelerated physics is going to be a huge deal going forward, but not necessarily PhysX. Please realize that when we are talking about hardware accelerated physics and PhysX, they are not the same thing. PhysX may very well become the next big thing in hardware accelerated physics. Havok might be the next big thing if they go GPU accelerated. Some new standard with DX11 might become the next big thing. I think we can all agree that hardware accelerated physics is important and will be a big deal, but with that said, there is no guarantee that PhysX will be the hardware accelerated physics to have. I'm just sitting on the sidelines at this point waiting to see how things play out, and right now, today, what PhysX has to offer does not impress me. A year from now PhysX may be by far the most advanced hardware accelerated physics on the market and may have a huge lead on anything else. If lots of games support it and it truly does change the way we play games, then I'll be on board with it 100%. Right now that just isn't the case, and there's no guarantee that it will be the case with so much up in the air right now regarding hardware accelerated physics.

I completely agree with you 100% that hardware accelerated physics is going to be the next big thing. I never said that hardware accelerated physics is not, or will not be, important. I said that PhysX has very little to offer at this point and that there is no guarantee that it will be relevant 1-2 years from now (as another hardware accelerated physics standard may surpass it and be used by everyone). On the other hand, it could be 'the' must-have in a year. Right now it's just too early to tell, and I would certainly not make a buying decision based on PhysX at this point.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: WelshBloke
Originally posted by: nRollo
There's really only one question that needs to be considered for the original post:

Are you willing to gamble that there won't be any games the OP would like to play launching with PhysX effects he/she would like to see for the lifespan the card purchased will have?

If he/she is willing to gamble "No there won't be" then the cards can be evaluated strictly on other factors.

There are many PhysX games coming this year, and the Physx versions will only be acceptably accelerated on one brand. CPU won't cut it.


There is no way in hell that game developers are going to design their games for G80 and up GPU's only.

If you want PhysX then yes, they will; otherwise no PhysX, unless you like 15fps.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: BFG10K
Originally posted by: nRollo

Do you know how many tens of millions of people that market includes?

Selling ONE million copies of a game is a very big deal to a developer. Might make it a little easier to get there if you put some code in that 30-40 million people can take advantage of and make the game look better.
Or they could use Havok which everyone can use, so hundreds of millions would benefit.

Software Havok is actually quite excellent. Far Cry 2 uses it and it has amazing physics, especially when things blow up, and when vegetation sways in the wind. I can actually think of a large number of games that I play that use Havok and have great physics which everyone can use.

In the business world, money drives decisions, and money only.
In that case why implement a feature tens of millions can use when you can implement a feature everyone can use?

Havok is more limited in complexity. A barrel blowing sky-high isn't impressive to me at all. I want individual particles of snow, ice, or water to be individually and dynamically simulated. Only hardware acceleration can do that.
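To give a sense of what "individual particles" means computationally, here is a minimal sketch in Python (purely illustrative; PhysX's internals are nothing this simple). Each particle's update is independent of every other's, which is exactly the kind of work that maps well onto a GPU and overwhelms a CPU once the counts reach the tens of thousands per frame:

```python
# Naive per-particle integration: every particle gets its own
# position/velocity update each frame. The loop body is independent
# per particle, which is the property a GPU exploits by running
# thousands of these updates in parallel.
def step_particles(particles, dt=1.0 / 60.0, gravity=-9.81):
    """Advance each (x, y, vx, vy) particle by one frame under gravity."""
    out = []
    for x, y, vx, vy in particles:
        vy += gravity * dt              # gravity pulls the particle down
        x, y = x + vx * dt, y + vy * dt
        if y < 0.0:                     # crude ground plane: damped bounce
            y, vy = 0.0, -vy * 0.5
        out.append((x, y, vx, vy))
    return out

# 10,000 snow particles dropped from 10 units up, stepped one frame
snow = step_particles([(float(i), 10.0, 0.0, 0.0) for i in range(10_000)])
```

Even this toy version does tens of thousands of floating-point operations per frame for a modest particle count, before any collisions between particles are considered.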
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: TheSnowman
Originally posted by: nRollo
Next month you'll be able to play Mirror's Edge, and in February you'll be able to play Cryostasis.

There are many games slated for launch this year; undoubtedly some will make it, some will not.
And again, how much GPU will it take to run those games with PhysX effects enabled and still maintain a respectable framerate? Those Cryostasis benchmarks you posted don't look promising at all in that regard. Can you assure us that Mirror's Edge or any of the other upcoming games will be any different? For all you actually know, he might even wind up worse off in practice by picking the card with PhysX support over the alternative.

25fps is acceptable. Have you ever played Crysis with everything maxed and only getting 20fps? The game isn't slow, not a slideshow; it's a very playable game.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: TheSnowman
Originally posted by: nRollo
Do you mean to say forum users who aren't particularly fond of NVIDIA or NFG don't care for PhysX, as reviewers obviously do?

He obviously means that we consumers generally have yet to find much reason to appreciate PhysX support, unlike the NFG members and reviewers who don't have to pay for their hardware, and the few other people who hold a particularly strong fondness for Nvidia. Granted, you being on the latter side of the divide, it is easy to see how you misunderstood him.

Only consumers who are blinded by their current card purchases, forgetting that you won't be using your current card forever. I had a 4870, and I did not buy a GTX 280 because of PhysX; that's just a bonus I will take advantage of when I can.

There will be a time when every card will have hardware accelerated physics, whether it's Nvidia's current implementation or DX11 or Havok.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: cmdrdredd
25fps is acceptable. Have you ever played Crysis with everything maxed and only getting 20fps? The game isn't slow, not a slideshow; it's a very playable game.
I play Crysis; it seems just as choppy at such low framerates as any other game.

Originally posted by: cmdrdredd
Only consumers who are blinded by their current card purchases. Forgetting that you won't be using your current card forever.
My current card is an 8800gt, which I mentioned previously, and I just ordered a gtx260 on Friday. So, what about those purchases are you suggesting is blinding me here?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Originally posted by: nRollo

Have you been gaming long enough to remember GLQuake BFG10K?

At the time, the only people who could run it were those with 3DFX cards,
That's not true, as there were OpenGL workstation cards that could run it too.

Same with a lot of titles back then. Developers want to advance their product, not be limited by the feeble amounts of physics processing they can get from the CPU.
So again I'll ask, where are the games? You keep telling us what developers want and keep marketing PhysX, but you can't provide a single list of substance to back your claims.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: TheSnowman
Originally posted by: cmdrdredd
25fps is acceptable. Have you ever played Crysis with everything maxed and only getting 20fps? The game isn't slow, not a slideshow; it's a very playable game.
I play Crysis; it seems just as choppy at such low framerates as any other game.

Originally posted by: cmdrdredd
Only consumers who are blinded by their current card purchases. Forgetting that you won't be using your current card forever.
My current card is an 8800gt, which I mentioned previously, and I just ordered a gtx260 on Friday. So, what about those purchases are you suggesting is blinding me here?

You mentioned previously it was for an HTPC...not a gaming rig. So that means no overclocking etc., am I right?

And no, Crysis is far from choppy at 25fps. It is smoother than Fallout 3 at 50fps, since Fallout has engine issues that create frame skipping.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
My HTPC is my primary rig, and where I tend to do my gaming, and I always overclock my CPUs but generally don't bother with videocards. Why do you ask?

And again, I don't find 25fps in Crysis any less choppy than any other game. As for Fallout 3, I got judder when I turned all the settings up, but at the settings I play the game it runs much smoother than I could ever hope to get Crysis to on my rig.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: TheSnowman
My HTPC is my primary rig, and where I tend to do my gaming, and I always overclock my CPUs but generally don't bother with videocards. Why do you ask?

And again, I don't find 25fps in Crysis any less choppy than any other game. As for Fallout 3, I got judder when I turned all the settings up, but at the settings I play the game it runs much smoother than I could ever hope to get Crysis to on my rig.

Well, honestly Crysis feels and plays fine with everything maxed out at 1680x1050 on my system. DX10 very high etc.

That's beside the point anyway. The point is people tend to shoot down PhysX when they can't use it, but once they actually see the things it can do, it's a different story.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: TheSnowman
Originally posted by: nRollo
Next month you'll be able to play Mirror's Edge, and in February you'll be able to play Cryostasis.

There are many games slated for launch this year; undoubtedly some will make it, some will not.
And again, how much GPU will it take to run those games with PhysX effects enabled and still maintain a respectable framerate? Those Cryostasis benchmarks you posted don't look promising at all in that regard. Can you assure us that Mirror's Edge or any of the other upcoming games will be any different? For all you actually know, he might even wind up worse off in practice by picking the card with PhysX support over the alternative.

I ran the cryostasis demo with everything maxed in DX10 at 1920x1200 and got the following results.

Average FPS: 36.7
Minimum FPS: 22.7
Maximum FPS: 99.6

So it will be completely playable if this tech demo is any indication.

I'll run the same test in a minute with AA.

edit: it doesn't seem to function with AA, even when set to forced in the control panel. FPS did not change, and it was hard for me to determine whether it was working. I'm guessing it isn't, but a 36fps average for a game running hardware PhysX at this level is good, especially with an engine that isn't optimized.
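For context on how summary numbers like these are produced, benchmark tools generally record per-frame render times and then report average, minimum and maximum FPS. A quick sketch (illustrative only, not the Cryostasis demo's actual benchmark code):

```python
# Derive the usual benchmark summary (average / minimum / maximum FPS)
# from a list of per-frame render times in milliseconds. Note that the
# true average FPS is frames divided by total seconds, not the mean of
# the instantaneous FPS values, which would overweight fast frames.
def fps_summary(frame_times_ms):
    fps = [1000.0 / t for t in frame_times_ms]
    avg = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
    return avg, min(fps), max(fps)

avg, lo, hi = fps_summary([20.0, 40.0, 10.0, 30.0])
```

This is also why the minimum FPS matters more than the average for playability: one 40 ms frame drags the experience down even when the average looks healthy.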
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: cmdrdredd
Well, honestly Crysis feels and plays fine with everything maxed out at 1680x1050 on my system. DX10 very high etc.
Sure, your quad is clocked almost as high as my dual, and your FSB is a bit higher as well.

Originally posted by: cmdrdredd
That's beside the point anyway. The point is people tend to shoot down PhysX when they can't use it, but once they actually see the things it can do, it's a different story.
I can use it, I've got a videocard that supports it right here, I just haven't seen it do any good for me yet. As for your benchmarking, that looks in line with the ones Rollo posted above. You can find impressions above as well if you like.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Originally posted by: cmdrdredd
Originally posted by: TheSnowman
My HTPC is my primary rig, and where I tend to do my gaming, and I always overclock my CPUs but generally don't bother with videocards. Why do you ask?

And again, I don't find 25fps in Crysis any less choppy than any other game. As for Fallout 3, I got judder when I turned all the settings up, but at the settings I play the game it runs much smoother than I could ever hope to get Crysis to on my rig.

Well, honestly Crysis feels and plays fine with everything maxed out at 1680x1050 on my system. DX10 very high etc.

That's beside the point anyway. The point is people tend to shoot down PhysX when they can't use it, but once they actually see the things it can do, it's a different story.

No, people don't shoot down physics; they just say it isn't a consideration yet, especially because ATI cards can provide physics acceleration. They're just not using PhysX because it is Nvidia property, and Nvidia wants compensation for its use by AMD/ATI (which is acceptable).

But AMD/ATI doesn't want to pay, especially because they expect another physics engine to become the standard.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: GaiaHunter
Originally posted by: cmdrdredd
Originally posted by: TheSnowman
My HTPC is my primary rig, and where I tend to do my gaming, and I always overclock my CPUs but generally don't bother with videocards. Why do you ask?

And again, I don't find 25fps in Crysis any less choppy than any other game. As for Fallout 3, I got judder when I turned all the settings up, but at the settings I play the game it runs much smoother than I could ever hope to get Crysis to on my rig.

Well, honestly Crysis feels and plays fine with everything maxed out at 1680x1050 on my system. DX10 very high etc.

That's beside the point anyway. The point is people tend to shoot down PhysX when they can't use it, but once they actually see the things it can do, it's a different story.

No, people don't shoot down physics; they just say it isn't a consideration yet, especially because ATI cards can provide physics acceleration. They're just not using PhysX because it is Nvidia property, and Nvidia wants compensation for its use by AMD/ATI (which is acceptable).

But AMD/ATI doesn't want to pay, especially because they expect another physics engine to become the standard.

AMD or ATI had their shot to implement it; they didn't want it. The 2900XT even lists hardware accelerated physics as a selling point on the front of the box. ATI dropped the ball, and Nvidia picked it up and ran with it. That's how it goes. PhysX is the first real implementation of physics that uses hardware acceleration and can do calculations complex enough to be worth something. I don't consider Havok to be worth all that much as it is now. Far Cry 2, for example, uses it, and when you walk into a tree or bush it moves with you as if you pushed it over. However, it springs back up like nothing happened as soon as you move from that spot and look at it again. That's hardly realistic. Now imagine if it was hardware accelerated and every leaf, every droplet of water, and every flurry of snow was calculated in real time, dynamically changing depending on your interaction with it. You could brush a bush and scrape leaves off and watch them flutter in the wind, individually modeled, perhaps landing in a pool of water where every ripple and wave reacted realistically and in real time.

I myself said it wasn't worth anything and was a gimmick, but having used it and seen how it can add to a game (looking at Cryostasis) while not hindering the game's playability, I am very impressed and want it to continue to grow into a standard. Physics needs to be hardware accelerated to work; otherwise you get ragdoll physics, which is a joke, or the way Havok is used now, which is not complex enough to add any extra immersion for the player.
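The vegetation complaint above is essentially about simulation state that persists between interactions. A toy sketch of the idea, with all names and numbers invented purely for illustration (no resemblance to any engine's actual API): a branch modeled as an elastic element that keeps part of a large bend as permanent deformation instead of always springing back to its original pose.

```python
# Hypothetical branch model illustrating persistent state: bends within
# the elastic limit spring back fully, but a bend past the "plastic"
# threshold permanently shifts the branch's rest pose, so it does not
# snap back to pristine the moment the player walks away.
def bend_branch(rest_angle, push, plastic_limit=0.5, plasticity=0.2):
    """Return the branch's new rest angle (radians) after a push."""
    deflection = push - rest_angle
    if abs(deflection) > plastic_limit:
        sign = 1.0 if deflection > 0 else -1.0
        # only the portion of the bend beyond the limit becomes permanent
        rest_angle += plasticity * (deflection - sign * plastic_limit)
    return rest_angle

angle = bend_branch(0.0, 1.0)     # a hard shove leaves a permanent bend
angle2 = bend_branch(angle, 0.2)  # a gentle brush springs back fully
```

Tracking that extra state for every branch, leaf, and ripple is precisely the per-object bookkeeping that makes persistent-world physics so much more expensive than the reset-on-idle behavior described above.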
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: TheSnowman
Originally posted by: cmdrdredd
Well, honestly Crysis feels and plays fine with everything maxed out at 1680x1050 on my system. DX10 very high etc.
Sure, your quad is clocked almost as high as my dual, and your FSB is a bit higher as well.

Originally posted by: cmdrdredd
That's beside the point anyway. The point is people tend to shoot down PhysX when they can't use it, but once they actually see the things it can do, it's a different story.
I can use it, I've got a videocard that supports it right here, I just haven't seen it do any good for me yet. As for your benchmarking, that looks in line with the ones Rollo posted above. You can find impressions above as well if you like.

Yeah, I looked at those, but what I'm saying is that the game, going by the tech demo, is playable, and I'd bet even a 3.2GHz C2D with a GTX260 would play it just fine maxed out at a resolution like 1680x1050.

That tells me that PhysX can work; in fact, it does work. Like I said in my previous post, physics needs to be hardware accelerated to go forward.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: cmdrdredd
AMD or ATI had their shot to impliment it, they didn't want it.

And Nvidia wouldn't have wanted it either if AMD owned it. To do so is akin to playing poker with someone who is free to change the rules as the game goes along. It was a sucker's bet, and AMD's only smart move was to pass.

Originally posted by: cmdrdredd
Yeah I looked at those but what I'm saying is the game based on the tech demo is playable, and I'd bet even a 3.2Ghz C2D with a GTX260...

And I'm saying that with the 9800GTX+ the OP was thinking of getting, it may be too damn slow to be worth using the PhysX effects at all.

Originally posted by: cmdrdredd
That tells me that physx can work, it in fact does work. Like I said in my previous post, physics needs to be hardware accelerated to go forward.

And going forward it is likely that newer games will need even faster cards to make PhysX support useful.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: TheSnowman
Originally posted by: cmdrdredd
AMD or ATI had their shot to impliment it, they didn't want it.

And Nvidia wouldn't have wanted it either if AMD owned it. To do so is akin to playing poker with someone who is free to change the rules as the game goes along. It was a sucker's bet, and AMD's only smart move was to pass.

Originally posted by: cmdrdredd
Yeah I looked at those but what I'm saying is the game based on the tech demo is playable, and I'd bet even a 3.2Ghz C2D with a GTX260...

And I'm saying that with the 9800GTX+ the OP was thinking of getting, it may be too damn slow to be worth using the PhysX effects at all.

Originally posted by: cmdrdredd
That tells me that physx can work, it in fact does work. Like I said in my previous post, physics needs to be hardware accelerated to go forward.

And going forward it is likely that newer games will need even faster cards to make PhysX support useful.

Well then, it's a good gosh darn thing that faster cards come out quite frequently. Isn't that just darling?