PhysX and multi-core support

Status
Not open for further replies.

toyota

Lifer
Apr 15, 2001
12,957
1
0
The current consoles already support PhysX.

I expect it to be more popular than DX11 this year, just as it was more popular than DX10 last year.

With a string of successful titles already out there and more sure to come, it has nowhere to go but up.

Most major game developers have already announced some sort of support for PhysX over the past year or so.

Richard Huddy is really giving AMD a black eye by going around spreading all this FUD and misinformation.

ATI was offered PhysX and they refused it. They are the ones blocking the standard. How come it runs on everything else but ATI cards?
Consoles only have low-level software PhysX effects. It's the hardware-level PhysX that's at issue here.
 

SHAQ

Senior member
Aug 5, 2002
738
0
76
A lot of games have PhysX, but they run it off the CPU. GPU PhysX titles are very few.
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
I disagree. AAA games like Mirror's Edge, Batman, and Dark Void have been released across several platforms and enhanced for the PC. Metro 2033 is an upcoming game that is going to use PhysX, and you can be sure that when Fermi hits, Nvidia will spill the beans on more upcoming AAA games using PhysX.

The current console cycle is expected to extend to 2012 or beyond. If developers aren't leveraging the abundant excess GPU power available on PCs by then, there won't be much to argue about or look forward to in the future of GPUs and PC gaming.

You seem to have missed the key word: "meaningful".

Only the latter offers all the features, like the swirling jet streams of the jetpack (according to Nvidia, up to 100,000 particles) via the APEX Turbulence module, or noticeably more particles when using certain weapons. The Disintegrator, for example, dissolves the aliens into a huge spray of particles (Nvidia mentions 30,000 particles) which rain down afterwards. The Magnetar or the Lightning Gun rip small lumps from surfaces; the effects are illustrated with a lot of sparks, of course. On 'Medium' settings the jet stream swirls vanish, and on 'Low' the default particles stay on screen a little longer and collide with the environment.

Those aren't meaningful effects.
Meaningful effects would have an impact on gameplay. Current PhysX effects are as meaningful as AA. AA doesn't impact gameplay.

And again in reference to Wreckage: AMD already supports PhysX.
PhysX runs on two AMD-graphics-powered game consoles (the Xbox 360 and the Nintendo Wii); it also runs on consumer PCs with AMD CPUs, and on computers with AMD graphics hardware (any computer with an ATI graphics chip).

So how can AMD be the ones blocking anything?
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
A lot of games have PhysX, but they run it off the CPU. GPU PhysX titles are very few.

It's something the green trolls totally forget. Or their mental capacity is too small to accept this little detail. Or they're actually dumb enough to believe it's otherwise. Only GeForce 8-series and higher cards can utilize hardware PhysX ("GPU PhysX"), and only on the PC (plus the Ageia PPU). Every other platform has low-level CPU-based physics, just like Havok.

Here's another example of a non-civil post that earns an infraction. Leave the insults in some other forum, please. -Admin DrPizza
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
You seem to have missed the key word: "meaningful".

Really? You want to talk about meaningful?

DX11 has no "meaningful" effects on gameplay.
Tessellation has no "meaningful" effects on gameplay.
Anisotropic filtering has no "meaningful" effects on gameplay.
Anti-aliasing has no "meaningful" effects on gameplay.
The list goes on, and on, and on, and on.

Downgrade back to a 1900XT and play everything at 640x480 at minimal settings, since better visual effects are not meaningful and therefore do not improve your gaming experience. Or, if you want meaningful effects on gameplay, grab a Wii remote and have yourself a jolly good time slashing it around the room in a display of a "meaningful" effect on gameplay. Meanwhile, I'll take my "meaningless" GPU PhysX effects and much, much better eye candy.
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
Really? You want to talk about meaningful?

DX11 has no "meaningful" effects on gameplay.
Tessellation has no "meaningful" effects on gameplay.
Anisotropic filtering has no "meaningful" effects on gameplay.
Anti-aliasing has no "meaningful" effects on gameplay.
The list goes on, and on, and on, and on.

Downgrade back to a 1900XT and play everything at 640x480 at minimal settings, since better visual effects are not meaningful and therefore do not improve your gaming experience. Or, if you want meaningful effects on gameplay, grab a Wii remote and have yourself a jolly good time slashing it around the room in a display of a "meaningful" effect on gameplay. Meanwhile, I'll take my "meaningless" GPU PhysX effects and much, much better eye candy.

You say all that, yet you say you'll go back to your PhysX.
Now, if you disagree that PhysX is meaningless, then you must think DX11 has meaning as well. And since having PhysX means you can't have DX11, you are missing out on DX11, which means you are missing out in the same way anyone who doesn't have hardware PhysX is.

So why is PhysX so meaningful where DX11 isn't?


Also you just said that PhysX is only meaningful for eye candy, which was kind of my point. You give a laundry list of GRAPHICAL things, and talk about eye candy.
PhysX should be about PHYSICS, not about GRAPHICS. More effects to make things look pretty are GRAPHICS, not PHYSICS. If the only thing the extra PHYSICS processing power is used for is improving GRAPHICS, then the PHYSICS part is meaningless.
 

Jovec

Senior member
Feb 24, 2008
579
2
81
Nvidia is trying to leverage PhysX to make more money. Currently, that strategy means offering support to developers to implement PhysX so that Nvidia sells more video cards. The marketing aspect of this is more important than the actual PhysX effects in games. For example, Nvidia hasn't lost much ground these last 18 months, even though they do not have the better card at most price points and still don't have a current-generation card out. Nvidia's branding (including PhysX) is a big reason why they still have twice as many cards on Newegg as AMD, despite losing to them in the graphics wars for 18 months.

I'm sure Nvidia offered to license PhysX to AMD. I'm equally sure that the terms of that deal were unreasonably high. The likelihood of PhysX becoming a de facto standard is low, and AMD knows they can stall on hardware physics support until OpenCL is mainstream. They know that developers aren't going to restrict their potential customer base to not just customers with Nvidia cards, but customers with SLI motherboards and two Nvidia cards. Even though PhysX and graphics can run on a single Nvidia card, I don't think single-GPU is the target market: those users want high resolutions, all the eye candy, and PhysX effects, and that means a dedicated GPU (or two) for rendering plus a dedicated PhysX card.

Both companies are screwing us. Why doesn't Nvidia offer PhysX for free and work with AMD and MS to incorporate it into DX? Why does Nvidia block using an Nvidia card for PhysX alongside an AMD graphics card (on Win7)? Why isn't AMD pushing harder to get hardware physics on their cards via OpenCL or DX?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Consoles only have low-level software PhysX effects. It's the hardware-level PhysX that's at issue here.

I was pointing out that, because of the broad support for PhysX in general, it's not going anywhere.

There are already a number of top-shelf titles using GPU PhysX, and more on the way.

Unlike DX11, PhysX can be used to enhance gameplay.
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
I was pointing out that, because of the broad support for PhysX in general, it's not going anywhere.

There are already a number of top-shelf titles using GPU PhysX, and more on the way.

Unlike DX11, PhysX can be used to enhance gameplay.

Technically you are correct, but non-technically you're full of crap.

Are you saying DirectCompute can't be used to enhance gameplay, even though it basically allows GPGPU calculations?

I mean, technically DirectCompute isn't DX11-specific, since it also runs on DX10-class hardware, but the spirit is there.
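To make the GPGPU point concrete, here is a minimal CPU stand-in for the kind of per-particle kernel a DirectCompute (or any GPGPU) dispatch would run once per thread. The function names and constants are invented for the illustration and are not from any shipping engine:

```python
# Illustration only: on a GPU, the body of step_particle would be the
# kernel executed by one thread per particle. The ground-hit flag is
# the kind of result that could feed back into gameplay logic, which
# is what makes this more than pure eye candy.
def step_particle(pos, vel, dt=0.016, gravity=-9.81):
    """Semi-implicit Euler step for one particle."""
    x, y, z = pos
    vx, vy, vz = vel
    vy += gravity * dt                   # integrate velocity first
    x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
    hit_ground = y < 0.0                 # gameplay-relevant event
    if hit_ground:
        y, vy = 0.0, 0.0                 # clamp to the ground plane
    return (x, y, z), (vx, vy, vz), hit_ground

def step_all(positions, velocities, dt=0.016):
    """CPU stand-in for dispatching the kernel over every particle."""
    stepped = [step_particle(p, v, dt) for p, v in zip(positions, velocities)]
    pos, vel, hits = zip(*stepped)
    return list(pos), list(vel), list(hits)
```

The data-parallel shape (no particle depends on another within a step) is exactly what makes this workload portable between a CPU loop and a compute shader.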
 

WelshBloke

Lifer
Jan 12, 2005
30,439
8,108
136
PhysX should be about PHYSICS, not about GRAPHICS. More effects to make things look pretty are GRAPHICS, not PHYSICS. If the only thing the extra PHYSICS processing power is used for is improving GRAPHICS, then the PHYSICS part is meaningless.

Good point.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
I disagree with Wreckage. He's mixing up two different types of PhysX implementations. Saying PhysX runs on pretty much anything is wrong and just shows a total lack of understanding of the subject: the software version runs on CPUs, any CPUs, be it in a PS3, a Wii or an iPhone, or a PC (Intel/AMD CPUs). And this is the one that's a lot more common. It's nothing comparable to GPU PhysX, the topic at hand; it's a "clone" of Havok at best.

Then there's GPU PhysX, which runs only under Windows when you have a GF 8-series or higher card (and is the subject of this thread). It's nothing like the first PhysX and only shares the name. Again, this one runs ONLY on Windows and ONLY on a limited number of cards. Not to mention you need a GTX260 or better to enjoy those effects in games... So saying the 8-series supports it is a stretch, as you can't comfortably play games using GPU PhysX on those cards anyway. They work fine as dedicated PhysX cards though (well, 9600GSO and up do). So all the 8400s, 8500s, 8600s, 9400s and 9500s don't have enough "oomph" even to serve as dedicated cards. Neglecting the arguably bigger part of the current 8-series+ cards on the market that doesn't have the power to run GPU PhysX anyway shows another sign of ignorance (and trolling, as this has been pointed out numerous times).

So that people get it: I was wondering whether GPU PhysX, when running on the CPU, can be multithreaded. The AMD PR guy said nVidia is blocking this feature. The nVidia PhysX dev said it's there. We can clearly see it's there in 3DMark Vantage. Ergo, the AMD PR guy accused the competition of something they're not doing. They should retract and apologize. They most likely won't. I could probably write a few more paragraphs here, but I won't bother... Looks like the trolls won't let the thread last (and we even proved something that puts nVidia in a good light and AMD in a bad one, for once in an eternity).

EDIT: Also, I don't mind GPU PhysX adding fluff. But IMO it's not worth giving up the speed advantage the current AMD cards offer in every other, non-GPU-PhysX game. Then again... once Fermi comes out and shows its muscles, this will most likely change. It will offer everything the Radeons have, plus PhysX. At a price premium, sure, but if you want to see some different stuff, it will be the only way to go.
small edit for tone/mild personal attack -Admin DrPizza
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
My facts are based on results from my own computer, thank you. Demos like CellFactor ran in a single thread while using the AGEIA card for PhysX acceleration; later on I will install the demo and run it in software mode to see if your rant is true.

I'm waiting... let me know if you have issues running PhysX mode on the CPU... it's a simple "hack".
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
I'm waiting... let me know if you have issues running PhysX mode on the CPU... it's a simple "hack".

Simple hack? WTF are you talking about? My AGEIA card works thanks to the GenL patch, because nVidia blocked PhysX acceleration when an ATi card is present. Unlike you, I work all day and have very little time to spend on other stuff; if you are in a hurry, then do it yourself, Mr. Waiting.
 

Modular

Diamond Member
Jul 1, 2005
5,027
67
91
Perhaps hardware-based PhysX running on the CPU is multithreaded only when an nVidia GPU is detected in the system (hey, it's far-fetched, I know, but you never know...)


LoL. Not these days. The companies engage in numerous practices like this to screw over gamers. nVidia has been leading the way by limiting PhysX and anti-aliasing in certain games to nVidia cards only.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
LoL. Not these days. The companies engage in numerous practices like this to screw over gamers. nVidia has been leading the way by limiting PhysX and anti-aliasing in certain games to nVidia cards only.

It has been proven not to be the case here, as happy medium had all four of his cores pegged at 100% when the PhysX part was running on his CPU. And he has an HD5750.

Also, that line of the first post is there more as a joke than anything serious (kind of a small slap for the PhysX lockout, actually :p)
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,230
2
0
PhysX should be about PHYSICS, not about GRAPHICS. More effects to make things look pretty are GRAPHICS, not PHYSICS. If the only thing the extra PHYSICS processing power is used for is improving GRAPHICS, then the PHYSICS part is meaningless.

That's my view as well. Obviously no one would mind more eye candy, but seriously, PhysX should be about enhancing GAMEPLAY, so that you can do things you couldn't do before... Right now it's just a bunch of fog and banners, hardly exciting.

But yes, I won't fall into a double standard and say DX11 is more important in that sense, since DX11 is also about graphics...

And I don't really have anything to say about the whole CPU vs GPU stuff. It's a fact that some of the games with the best physics out there had them done on the CPU, so until PhysX shows us something amazing, I'll keep my skepticism.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
What is the real issue here? ATI's speeches? Nvidia's responses? PhysX itself? How PhysX utilizes hardware? Or how PhysX impacts games?

It is funny how the red team keeps saying that the way PhysX utilizes hardware is problematic, when PhysX isn't even enabled for their hardware. On the other hand, the green team suffers a performance hit from PhysX, but somehow they don't feel it.

I mean, I can understand why ATI users would complain if some game used 100% of the CPU when an Nvidia card is present but only 1% when an ATI card is, but that is not the case. The CPU is not blocked by Nvidia in any way, right? Why is the red team so upset about what Nvidia has, and not upset about what they don't have? Why doesn't the green team even question the obvious? If physics actually runs better on the CPU than the GPU, thus producing better FPS, then why not run it there?

As of now, having your video card handle some flying paper is okay, but anything more complicated than that hurts FPS unless you have another video card sitting in your PC doing nothing but eating electricity and running PhysX. The green team is okay with it, I don't know why. The red team has a problem with it, I also don't know why.

I should really go back to school.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Simple hack? WTF are you talking about? My AGEIA card works thanks to the GenL patch, because nVidia blocked PhysX acceleration when an ATi card is present. Unlike you, I work all day and have very little time to spend on other stuff; if you are in a hurry, then do it yourself, Mr. Waiting.

To run it all on the CPU... and debunk the false AMD PR FUD about multi-core PhysX being crippled by NVIDIA.

But I guess I will be waiting forever... because the test will not show what you hope for... but I can wait.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
What is the real issue here? ATI's speeches? Nvidia's responses? PhysX itself? How PhysX utilizes hardware? Or how PhysX impacts games?

It is funny how the red team keeps saying that the way PhysX utilizes hardware is problematic, when PhysX isn't even enabled for their hardware. On the other hand, the green team suffers a performance hit from PhysX, but somehow they don't feel it.

I mean, I can understand why ATI users would complain if some game used 100% of the CPU when an Nvidia card is present but only 1% when an ATI card is, but that is not the case. The CPU is not blocked by Nvidia in any way, right? Why is the red team so upset about what Nvidia has, and not upset about what they don't have? Why doesn't the green team even question the obvious? If physics actually runs better on the CPU than the GPU, thus producing better FPS, then why not run it there?

As of now, having your video card handle some flying paper is okay, but anything more complicated than that hurts FPS unless you have another video card sitting in your PC doing nothing but eating electricity and running PhysX. The green team is okay with it, I don't know why. The red team has a problem with it, I also don't know why.

I should really go back to school.

Your post doesn't make sense at all. I don't have any complaint, because I have an AGEIA card and I can run PhysX just fine. Current GPUs aren't suitable for collision detection while physics is being calculated; nVidia is doing fine using a second card for PhysX calculations. CPU PhysX runs the same regardless of the GPU. What you posted isn't even related to the thread; you definitely should go back to school.

To run it all on the CPU... and debunk the false AMD PR FUD about multi-core PhysX being crippled by NVIDIA.

But I guess I will be waiting forever... because the test will not show what you hope for... but I can wait.

I ran tests and it never pegged any core at all. It used only one core at an average of 45%, never reaching 100% usage, while the other cores averaged about 7%, which shows that PhysX is single-threaded. It performed the same and used the cores in the same way regardless of whether PhysX acceleration was on or off.

I used the CellFactor demo and it showed very good physics effects. I wonder why no games come close to CellFactor in terms of physics effects, and why current games run PhysX so slowly in a single thread, pegging one core at 100% usage.
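As a side note on methodology, this kind of per-core check can be scripted instead of eyeballing Task Manager. A rough sketch follows; it assumes the third-party psutil package is installed, and the 50% "busy" threshold is an arbitrary choice for the illustration:

```python
# Sketch: sample per-core load while a PhysX title runs, then apply a
# crude heuristic to judge whether the work is spread across cores.
# psutil is a third-party package (pip install psutil) and the 50%
# threshold is arbitrary.
import time

def sample_per_core(seconds=5.0, interval=0.5):
    """Average per-core utilisation over the sampling window."""
    import psutil  # assumed installed
    samples = []
    end = time.time() + seconds
    while time.time() < end:
        samples.append(psutil.cpu_percent(interval=interval, percpu=True))
    cores = len(samples[0])
    return [sum(s[i] for s in samples) / len(samples) for i in range(cores)]

def looks_multithreaded(per_core_loads, threshold=50.0):
    """More than one busy core suggests the workload is multithreaded."""
    return sum(1 for load in per_core_loads if load >= threshold) > 1
```

A reading like [45, 7, 7, 6] matches the single-threaded pattern described above, while [100, 100, 100, 100] matches the all-cores-pegged reports elsewhere in the thread.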
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Your post doesn't make sense at all. I don't have any complaint, because I have an AGEIA card and I can run PhysX just fine. Current GPUs aren't suitable for collision detection while physics is being calculated; nVidia is doing fine using a second card for PhysX calculations. CPU PhysX runs the same regardless of the GPU. What you posted isn't even related to the thread; you definitely should go back to school.

I ran tests and it never pegged any core at all. It used only one core at an average of 45%, never reaching 100% usage, while the other cores averaged about 7%, which shows that PhysX is single-threaded. It performed the same and used the cores in the same way regardless of whether PhysX acceleration was on or off.

I used the CellFactor demo and it showed very good physics effects. I wonder why no games come close to CellFactor in terms of physics effects, and why current games run PhysX so slowly in a single thread, pegging one core at 100% usage.
You are a very funny person. The OP quoted Richard Huddy accusing Nvidia of removing "multicore optimization" from the original AGEIA PhysX. You come in with a test done on your AGEIA card showing that it does not utilize multiple cores to begin with. However, you don't seem to realize what you have proved. I will spell it out for you: you have proved that Richard Huddy's accusation is baseless, since the original code doesn't utilize multiple cores to begin with, and Nvidia therefore has not removed anything from the original PhysX code.

So let me get this right. You are saying that Richard Huddy has no idea what he's talking about? Geez, I didn't see that coming from you.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Quoted by "HappyMedium":
"All my cores were pegged at 100%. 5750 gpu"

Quoted by "Evolution8":
"I ran tests and it never pegged any core at all. It used only one core at an average of 45%, never reaching 100% usage, while the other cores averaged about 7%, which shows that PhysX is single-threaded. It performed the same and used the cores in the same way regardless of whether PhysX acceleration was on or off."

Maybe HappyMedium and Evolution8 should compare notes instead of arguing through third parties.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
You are a very funny person. The OP quoted Richard Huddy accusing Nvidia of removing "multicore optimization" from the original AGEIA PhysX. You come in with a test done on your AGEIA card showing that it does not utilize multiple cores to begin with. However, you don't seem to realize what you have proved. I will spell it out for you: you have proved that Richard Huddy's accusation is baseless, since the original code doesn't utilize multiple cores to begin with, and Nvidia therefore has not removed anything from the original PhysX code.

So let me get this right. You are saying that Richard Huddy has no idea what he's talking about? Geez, I didn't see that coming from you.

Well, those are the results of my tests using CellFactor, a demo that predates nVidia's acquisition of AGEIA. Unlike you, I don't wear tinted glasses and I have no problem admitting anything; recommending nVidia cards or proving Richard Huddy wrong doesn't give me an allergic reaction. That's why I might be funny, but you are hilarious!!!
 
Dec 30, 2004
12,554
2
76
Well, those are the results of my tests using CellFactor, a demo that predates nVidia's acquisition of AGEIA. Unlike you, I don't wear tinted glasses and I have no problem admitting anything; recommending nVidia cards or proving Richard Huddy wrong doesn't give me an allergic reaction. That's why I might be funny, but you are hilarious!!!

Nothing wrong with funny glasses, it's called 3D, dude.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Quoted by "HappyMedium":
"All my cores were pegged at 100%. 5750 gpu"

Quoted by "Evolution8":
"I ran tests and it never pegged any core at all. It used only one core at an average of 45%, never reaching 100% usage, while the other cores averaged about 7%, which shows that PhysX is single-threaded. It performed the same and used the cores in the same way regardless of whether PhysX acceleration was on or off."

Maybe HappyMedium and Evolution8 should compare notes instead of arguing through third parties.
Well, I ran PhysX for Batman on the CPU and it used 70-75% of my dual core, which is only slightly less than the CPU usage when running it on my GTX 260. The load was distributed a bit differently, as running PhysX on the CPU seemed to favor one core over the other, but the end result was about the same overall usage. Of course, running PhysX on the E8500 results in about 10-15 fps, whereas it's 40-45 fps on the GTX 260.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Richard Huddy is both right and wrong.

For the software-based physics effects, which are what the game devs basically use, it runs on as many threads as the game devs like; since the game is probably a console port, that isn't very many.

For the hardware-based physics effects, which are pretty much all nvidia-sponsored (and, I suspect, nvidia-written), it seems to use only one thread in software mode. That's because no one has bothered to code the software fallback for the hardware mode in a multi-threaded way.

Is this wrong? Personally, I don't think so. If it's nvidia's work that provides the hardware path (via TWIMTBP), then why would nvidia make it work multithreaded? They don't sell quad-core CPUs, they sell nvidia graphics cards, and that's why they offer the hardware PhysX path: to sell nvidia GPUs, not AMD CPUs. nvidia isn't a charity.

However, it doesn't give a fair comparison (for example, in Batman) of what a quad-core CPU could really manage if fully utilised for those effects. If you want that comparison, just use 3DMark, I suppose; they coded it to maximise physics for both the CPU and GPU paths.

If you want games with hardware physics to fully utilise multi-threading, you'll need to ask the game devs, not nvidia. However, the bottom line seems to be that the game devs don't care and will just give us straight console ports and no more; it's only nvidia or ATI getting involved that gives the PC version anything beyond higher resolutions and textures.
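The "no one has bothered" point is plausible because the chunk-and-dispatch pattern such a fallback needs is conceptually simple. Here is a toy sketch of it; pure Python and purely illustrative (a real engine would do this in native code, where the threads genuinely run in parallel), with all names invented for the example:

```python
# Toy sketch of a multi-threaded software fallback: split the particle
# set into chunks and integrate each chunk on its own worker thread.
from concurrent.futures import ThreadPoolExecutor

GRAVITY = -9.81

def integrate_chunk(chunk, dt=0.016):
    """Advance one chunk of (height, vertical velocity) pairs by a
    single semi-implicit Euler step."""
    return [(y + (vy + GRAVITY * dt) * dt, vy + GRAVITY * dt)
            for y, vy in chunk]

def integrate_parallel(particles, workers=4, dt=0.016):
    """Dispatch the chunks to a thread pool and stitch the results
    back together in their original order."""
    size = max(1, len(particles) // workers)
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(lambda c: integrate_chunk(c, dt), chunks))
    return [p for chunk in results for p in chunk]
```

Because each chunk is independent within a step, the parallel result matches the single-threaded one exactly, which is what makes such a fallback a scheduling decision rather than an algorithmic one.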
 