Interesting take on Kanter's article "PhysX87: Software Deficiency"


Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Because hardware acceleration was moving from the PPU to the GPU. It wasn't about hardware-accelerated PPUs anymore.

My zest for GPU physics started here, with ATI and nVidia and the humble beginnings of HavokFX's potential:

http://www.neoseeker.com/Articles/Hardware/Previews/havokfx-nvidia/

http://www.firingsquad.com/news/newsarticle.asp?searchid=10649

That last link is funny, this tidbit hits home:

"We know the CPU can and will continue to push into the 1000’s of objects for game-play; while the GPU can carve out 10’s of thousands of collidable object simulations to add visual fidelity that no one thought was possible with an off the shelf graphics card"

Havok speaking.
Not NVIDIA PR.
GPU > CPU for physics...by a major factor.

This part was also interesting:


"FiringSquad: One of Havok's competitors' , AGEIA, has said of the ATI-Havok FX hardware set up, "Graphics processors are designed for graphics. Physics is an entirely different environment. Why would you sacrifice graphics performance for questionable physics? You’ll be hard pressed to find game developers who don’t want to use all the graphics power they can get, thus leaving very little for anything else in that chip. " What is Havok's response to this?
Jeff Yates: Well, I’m sure the AGEIA folks have heard about General Purpose GPU or “GP-GPU” initiatives that have been around for years. The evolution of the GPU and the programmable shader technology that drives it have been leading to this moment for quite some time. From our perspective, the time has arrived, and things are never going to go backwards. So, if people are going to purchase extra hardware to do physics, why not purchase an extra GPU, or better yet relegate last year’s GPU to physics, and get a brand new GPU for rendering? The fact is that this is not stealing from the graphics – rather it gives the option of providing more horsepower to the graphics, or the physics, or both – depending on what a particular game needs. I fail to see how that’s a bad thing. Not to mention that downward pricing for “last year’s” GPUs are already feeding the market with physics-capable GPUs at the sub $200 price point –even reaching the magic $100 price point."

"Oddly" PhysX wasn't implemented on SM3.0 hardware.
It was implemented on the G80.
The GPU where NVIDIA began putting GPGPU into their designs.

So it would seem that AGEIA was right at the time...SM3.0 cards were not geared for physics; they needed a more GPGPU-oriented base.

Even the fact that NVIDIA bought AGEIA didn't alter that.
And that might explain why AMD/ATI hasn't delivered...a problem with the underlying architecture...a problem that NVIDIA is tackling by adding a lot of GPGPU capability into their GPUs.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
So nothing to do with PhysX at all?
Just checking because I was confused, since this thread is about PhysX, and NV recently said there was no reason for them to port PhysX to OpenCL.

Of course there isn't a reason to port it. Their rival won't even support OpenCL out of the box. Why make the effort?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
That last link is funny, this tidbit hits home:

"We know the CPU can and will continue to push into the 1000’s of objects for game-play; while the GPU can carve out 10’s of thousands of collidable object simulations to add visual fidelity that no one thought was possible with an off the shelf graphics card"

Havok speaking.
Not NVIDIA PR.
GPU > CPU for physics...by a major factor.

A lot of irony there!:)

Here is the take of the players after HavokFX was killed by Intel:

http://www.xbitlabs.com/news/multim...Now_Says_AMD_s_Developer_Relations_Chief.html
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
You might want to look at how PhysX has gone from nothing (2006) to besting Havok (2009), since the growth of PhysX contradicts your assumption:

http://bulletphysics.org/wordpress/?p=88

Notice where it's posted...Bullet...the physics API AMD talks and talks about...hardly NVIDIA PR at work...

.oO(Again, what is up with all the posting against PhysX based on either ignorance or flawed assumptions in this thread?)

And here is what I am hoping to see in games in a few years:
http://www.youtube.com/watch?v=ZwoJ-upjeKo
http://www.youtube.com/watch?v=1JrM4ujLY_A

I'm not saying that nvidia gpu-specific physx CAN'T be competitive, I'm saying that it's not compelling enough RIGHT NOW to justify the amount of PR that nvidia throws at it. You seemed to validate my point by showing us what you hope to see "in a few years". physx has been just a few years away from being something special for 5 years now. nvidia has some of the best engineers in the tech industry, and they have a lot of them, so my recommendation that they put those people to use is reasonable. If they can change physx from "meh" to the avg user to "gotta have it" like AA is then you'll have a true sea change on your hands; if they can't do that and/or are forced to go to an open standard then their proprietary physx will end up as a bust.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I would wager that nVidia would really enjoy a GPU physics standard, and if GPU PhysX did die, at least it helped bring awareness, bring choice and some content, and get the ball rolling.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I'm not saying that nvidia gpu-specific physx CAN'T be competitive, I'm saying that it's not compelling enough RIGHT NOW to justify the amount of PR that nvidia throws at it. You seemed to validate my point by showing us what you hope to see "in a few years". physx has been just a few years away from being something special for 5 years now. nvidia has some of the best engineers in the tech industry, and they have a lot of them, so my recommendation that they put those people to use is reasonable. If they can change physx from "meh" to the avg user to "gotta have it" like AA is then you'll have a true sea change on your hands; if they can't do that and/or are forced to go to an open standard then their proprietary physx will end up as a bust.

By those definitions AA and AF would have died a long time ago

I remember the first time I enabled AA on a GeForce 3 Ti 500.
2xAA cut my FPS by more than half.
4xAF was a FPS eater too.
Today it's a different scenario.

I suspect your memory is a bit rusty about AA...or AF.
It might be better to look at a more recent feature...HDR:
http://www.firingsquad.com/hardware/far_cry_1.3_midrange/page18.asp

Are you saying that we shouldn't have done AA, AF or HDR because the initial results were very poor?

And FFS...
DirectX is proprietary...should we skip it and go OpenGL?
X86 is proprietary...should we skip that too?

I'd rather have something proprietary...than nothing "open-and-free-for-all"...n'est-ce pas?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I would wager that nVidia would really enjoy a GPU physics standard, and if GPU PhysX did die, at least it helped bring awareness, bring choice and some content, and get the ball rolling.

Some people sadly can't look beyond vendor bias...and thus deem that all progress not from their favourite vendor is evil and must die.

Physics in games didn't really progress from 2000 to 2006.
It took AGEIA to get the ball rolling there.
Now AGEIA is no more...but the ball is still rolling...although some apparently seem to prefer that the ball had never been invented in the first place...what to do?

It would also be a lot easier if all the critique wasn't based on...well...FUD...to be frank.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
By those definitions AA and AF would have died a long time ago

I remember the first time I enabled AA on a GeForce 3 Ti 500.
2xAA cut my FPS by more than half.
4xAF was a FPS eater too.
Today it's a different scenario.

I suspect your memory is a bit rusty about AA...or AF.
It might be better to look at a more recent feature...HDR:
http://www.firingsquad.com/hardware/far_cry_1.3_midrange/page18.asp

Are you saying that we shouldn't have done AA, AF or HDR because the initial results were very poor?

And FFS...
DirectX is proprietary...should we skip it and go OpenGL?
X86 is proprietary...should we skip that too?

I'd rather have something proprietary...than nothing "open-and-free-for-all"...n'est-ce pas?

I can't believe I actually read this entire thread...my head hurts. Probably because you've been going after everybody who hasn't 100% agreed with everything that you've said.

I'm not entirely sure whom you're arguing with here. I provided an opinion about a way that nvidia could improve physx, and somehow you end up talking about hdr, aa, af, dx, ffs, openGL, and x86. I didn't say that they should give it up. I didn't say that physx has no place in the future of gaming. I think that you just pulled an own-goal strawman on yourself!
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I can't believe I actually read this entire thread...my head hurts. Probably because you've been going after everybody who hasn't 100% agreed with everything that you've said.

I'm not entirely sure whom you're arguing with here. I provided an opinion about a way that nvidia could improve physx, and somehow you end up talking about hdr, aa, af, dx, ffs, openGL, and x86. I didn't say that they should give it up. I didn't say that physx has no place in the future of gaming. I think that you just pulled an own-goal strawman on yourself!

AA was pretty meh to start with...that was your argument against the success of PhysX, right?
And did you miss all the false claims in this thread?
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Because bickering and whining is off-putting. Nobody wants to read that. How about a little nicer discussion rather than people twisting other people's statements to fit their own agenda? Doesn't that sound better?
Would you care to prove that? All you ever do is just spread misinformation without presenting any facts. My much more accurate opinio...facts say bickering and whining is on putting.

Myth:
"Bickering is off putting"
False, your wrong
Status: Debunked.

Your hippie agenda doesn't belong here. Go take it to a hippie forum such as "Motherboards" where problems get worked out. This is "Video Cards and Graphics"; we only have room for red roosters and green goblins, not tie-dye tampons.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Didn't you get the memo?
OpenCL is gonna make PhysX look bad ;)

The DX11 cloth test certainly makes NV look bad, what with the HD5870 outperforming the supposedly compute superior GTX470.

(I'm ignoring the results in the second post because all of them, ATI or NV, seem to be vastly different from every other result posted, such as an overclocked GTX460 in post 2 getting 148fps in DX11 cloth, while someone further down posts 700+fps for a GTX470 at the same resolution).
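
For anyone wondering what these cloth tests actually compute per frame: the heart of a mass-spring cloth solver is a Verlet integration pass plus a few rounds of distance-constraint relaxation, and every particle and constraint within a pass is independent, which is why the workload maps so well onto thousands of GPU threads. A minimal illustrative CPU-side sketch of the technique (not the actual Bullet demo code):

Code:
#include <cmath>
#include <vector>

// One cloth particle: current and previous position (Verlet integration).
struct Particle { float x, y, z, px, py, pz; };

// A distance constraint ("spring") between two particles at rest length.
struct Constraint { int a, b; float rest; };

// One simulation step: integrate all particles, then relax all constraints.
void stepCloth(std::vector<Particle>& p, const std::vector<Constraint>& c,
               float dt, float gravity = -9.8f)
{
    for (auto& q : p) {                       // Verlet integration pass
        float nx = 2 * q.x - q.px;
        float ny = 2 * q.y - q.py + gravity * dt * dt;
        float nz = 2 * q.z - q.pz;
        q.px = q.x; q.py = q.y; q.pz = q.z;
        q.x = nx;   q.y = ny;   q.z = nz;
    }
    for (int iter = 0; iter < 4; ++iter) {    // constraint relaxation passes
        for (const auto& k : c) {
            Particle& a = p[k.a];
            Particle& b = p[k.b];
            float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
            float d = std::sqrt(dx * dx + dy * dy + dz * dz);
            if (d < 1e-6f) continue;          // avoid division by zero
            float corr = 0.5f * (d - k.rest) / d;
            a.x += dx * corr; a.y += dy * corr; a.z += dz * corr;
            b.x -= dx * corr; b.y -= dy * corr; b.z -= dz * corr;
        }
    }
}

int main()
{
    // Two particles one unit apart, connected by a single constraint.
    std::vector<Particle> p = { {0,0,0, 0,0,0}, {1,0,0, 1,0,0} };
    std::vector<Constraint> c = { {0, 1, 1.0f} };
    for (int frame = 0; frame < 60; ++frame)
        stepCloth(p, c, 1.0f / 60.0f);
    return 0;
}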
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
The DX11 cloth test certainly makes NV look bad, what with the HD5870 outperforming the supposedly compute superior GTX470.

Wake me up when OpenCL is more than beta and a finished standard...and all the tests actually work on AMD GPUs...

(I'm ignoring the results in the second post because all of them, ATI or NV, seem to be vastly different from every other result posted, such as an overclocked GTX460 in post 2 getting 148fps in DX11 cloth, while someone further down posts 700+fps for a GTX470 at the same resolution).

(image: 3 monkeys)


Then you should have ignored all the tests...not gone into selective cherry-picking mode.

But hey!!!

What does your post have to do with the OP?
(You asked for it...)
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
Wake me up when OpenCL is more than beta and a finished standard...and all the tests actually work on AMD GPUs...



(image: 3 monkeys)


Then you should have ignored all the tests...not gone into selective cherry-picking mode.

But hey!!!

What does your post have to do with the OP?
(You asked for it...)

What did he ask for? And "DAMN", why should another person have any opinions when you can have opinions FOR him... You sure got a point there, Lonbjerg.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
What did he ask for?

Read the thread, then you will understand.

And "DAMN" why should another person have any opinions. When you can have opinions FOR him.. You sure got a point there Ljonberg

Opinions are fine, as long as they are not presented as facts.
A lot of "opnions" in this thread is not based on facts, but false information.
Even when debunked, the "claims" are used again and again and again and again and again and again and again and again and again...

My gripe is that it's 2010.
PhysX has worked since 2006.
Bullet 2.77 OpenCL clearly doesn't work very well.

Which reminds me, I need to write a few emails about something relevant to this thread.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Bullet 2.77 OpenCL clearly doesn't work very well.

The fun part is that AMD was spouting off in the media about their OpenCL support and collaboration with the Bullet project.
If AMD was so committed to Bullet and OpenCL, you would at least expect the stuff to WORK. Apparently it was all talk... again...

I know Erwin Coumans was under NDA from AMD, and was not allowed to comment on AMD's drivers and hardware performance. He mentioned that in an interview a few months ago, I think on HiTech Legion, or something.
I suppose the NDA ran out, and Erwin figured it was time to release OpenCL support for Bullet, even though it still doesn't work properly on AMD hardware.
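
Incidentally, anyone can verify what their driver actually exposes: enumerating OpenCL platforms and GPU devices is only a few calls against the standard OpenCL C API. A minimal sketch (nothing vendor-specific assumed):

Code:
#include <CL/cl.h>
#include <cstdio>

// Print every OpenCL platform and the GPU devices it exposes.
// If a driver doesn't ship OpenCL "out of the box", its platform
// simply won't show up here.
int main()
{
    cl_platform_id platforms[8];
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(8, platforms, &numPlatforms);

    for (cl_uint i = 0; i < numPlatforms; ++i) {
        char name[256];
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                          sizeof(name), name, NULL);
        printf("Platform: %s\n", name);

        cl_device_id devices[8];
        cl_uint numDevices = 0;
        if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU, 8,
                           devices, &numDevices) != CL_SUCCESS)
            continue;                         // no GPU devices on this platform
        for (cl_uint j = 0; j < numDevices; ++j) {
            clGetDeviceInfo(devices[j], CL_DEVICE_NAME,
                            sizeof(name), name, NULL);
            printf("  GPU device: %s\n", name);
        }
    }
    return 0;
}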
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Then you should have ignored all the tests...not gone into selective cherry-picking mode.


Second tests:

"Q9550 4ghz+460GTX 850Mhz max window for 1920×1080
DX11 cloth 148fps"

"Core i7 860 stock+5850 stock max window for 1920×1080
DX11 cloth 35fps"

Subsequent posts:
"For reference I get>500fps on DX11 cloth at nearly 2560x1600 on a 5870."
"DirectX11 cloth demo on NVIDIA GTX 470, 1920x1200: 723 FPS
DirectX11 cloth demo on Radeon 5870, 1920x1200: 1126 FPS"
"On a 5870 with a i7 930 I get over 1600 fps on the dx11 cloth demo"
"In the DX11 cloth demo It stays around 1620 fps"

So the guy who does a load of benchmarks gets scores that are nothing like everyone else's scores, by a factor of 5~30. Everyone else has much higher numbers.

Any sane person would assume that the very low numbers of poster #2 are wrong, because both ATI and NV cards show different numbers according to everyone else.
It's not a vendor specific thing, it's both NV and ATI.
Therefore the logical conclusion is that #2 did it wrong somewhere. Thus the results should be ignored.

"Cherry picking" makes sense when something is obviously out of whack with everything else. Any logical, sane, sensible person would be able to understand that. (Also I did point this out in my original post, you just seem to have ignored it).
Most people call them anomalous results.
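
If you want the outlier logic spelled out: take the median of all the reported scores and flag anything off from it by more than a few times. A quick illustrative sketch using the fps figures quoted above (mixed cards and resolutions, so only gross outliers matter):

Code:
#include <algorithm>
#include <cstdio>
#include <vector>

// Flag benchmark scores wildly out of line with the median of all reports.
int main()
{
    // DX11 cloth fps figures quoted above, poster #2's two results included.
    std::vector<double> fps = { 148, 35, 500, 723, 1126, 1600, 1620 };

    std::vector<double> sorted = fps;
    std::sort(sorted.begin(), sorted.end());
    double median = sorted[sorted.size() / 2];    // 723 for this set

    for (double f : fps) {
        double factor = (f > median) ? f / median : median / f;
        printf("%7.0f fps %s\n", f,
               factor > 3.0 ? "<-- anomalous (off by >3x from median)" : "");
    }
    return 0;
}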
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Second tests:

"Q9550 4ghz+460GTX 850Mhz max window for 1920×1080
DX11 cloth 148fps"

"Core i7 860 stock+5850 stock max window for 1920×1080
DX11 cloth 35fps"

Subsequent posts:
"For reference I get>500fps on DX11 cloth at nearly 2560x1600 on a 5870."
"DirectX11 cloth demo on NVIDIA GTX 470, 1920x1200: 723 FPS
DirectX11 cloth demo on Radeon 5870, 1920x1200: 1126 FPS"
"On a 5870 with a i7 930 I get over 1600 fps on the dx11 cloth demo"
"In the DX11 cloth demo It stays around 1620 fps"

So the guy who does a load of benchmarks gets scores that are nothing like everyone else's scores, by a factor of 5~30. Everyone else has much higher numbers.

Any sane person would assume that the very low numbers of poster #2 are wrong, because both ATI and NV cards show different numbers according to everyone else.
It's not a vendor specific thing, it's both NV and ATI.
Therefore the logical conclusion is that #2 did it wrong somewhere. Thus the results should be ignored.

"Cherry picking" makes sense when something is obviously out of whack with everything else. Any logical, sane, sensible person would be able to understand that. (Also I did point this out in my original post, you just seem to have ignored it).
Most people call them anomalous results.

What you just posted speaks more to the current state of OpenCL/Bullet 2.77 than anything else.

If you want to use guesswork as argumentation, that is your choice...just don't expect me to play along.

Besides, I hope to soon have answers from the horse's own mouth...regarding OpenCL and Bullet 2.77...and a lot of other questions.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
The fun part is that AMD was spouting off in the media about their OpenCL support and collaboration with the Bullet project.
If AMD was so committed to Bullet and OpenCL, you would at least expect the stuff to WORK. Apparently it was all talk... again...

I think AMD suffers from a lack of personnel.
NVIDIA has a lot more software people employed and has actually taken CUDA from being a nice concept to a workable platform with tools, support, framework etc. in place.
I guess they looked at Intel and learned.

I know Erwin Coumans was under NDA from AMD, and was not allowed to comment on AMD's drivers and hardware performance. He mentioned that in an interview a few months ago, I think on HiTech Legion, or something.
I suppose the NDA ran out, and Erwin figured it was time to release OpenCL support for Bullet, even though it still doesn't work properly on AMD hardware.

You can't wait forever...although some people seem to have that stance.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
And speaking of Erwin Coumans.

For people claiming that their CPUs can match GPUs for physics:

Erwin Coumans said:
GPU NVIDIA Geforce 260:
Profiling: Root (total running time: 0.704 ms) ---
0 -- stepSimulation (99.57 %) :: 0.701 ms / frame (1 calls)
Unaccounted: (0.426 %) :: 0.003 ms
...----------------------------------
...Profiling: stepSimulation (total running time: 0.701 ms) ---
...0 -- synchronizeMotionStates (0.00 %) :: 0.000 ms / frame (1 calls)
...1 -- solveSoftConstraints (75.75 %) :: 0.531 ms / frame (1 calls)
...2 -- internalSingleStepSimulation (12.55 %) :: 0.088 ms / frame (1 calls)
...Unaccounted: (11.698 %) :: 0.082 ms


CPU Intel Quadcore Q6800 at 2.93 Ghz:
Profiling: Root (total running time: 12.511 ms) ---
0 -- stepSimulation (99.96 %) :: 12.506 ms / frame (1 calls)
Unaccounted: (0.040 %) :: 0.005 ms
...----------------------------------
...Profiling: stepSimulation (total running time: 12.506 ms) ---
...0 -- synchronizeMotionStates (0.01 %) :: 0.001 ms / frame (1 calls)
...1 -- solveSoftConstraints (68.69 %) :: 8.590 ms / frame (1 calls)
...2 -- internalSingleStepSimulation (15.21 %) :: 1.902 ms / frame (1 calls)

That should put that false claim to rest: the same Bullet workload takes 0.704 ms per frame on the GPU versus 12.511 ms on the CPU, roughly an 18x difference.
We now have NVIDIA (PhysX), Intel (Havok) and Bullet Physics all saying the same thing.
GPU > CPU for physics...by a long shot.

Anyone claiming otherwise is speaking against the facts and can only be considered to be trolling or spreading misinformation.

And more OpenCL support should come in Bullet 3.x; more information about that in March 2011 at the Game Developers Conference in San Francisco.
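
For anyone who wants to reproduce that kind of output: Bullet 2.x ships a built-in hierarchical profiler, and dumping it after a simulation step is all it takes. A minimal sketch against the Bullet 2.x API (a plain rigid-body world here; the quoted run used Bullet's soft-body world, set up analogously):

Code:
#include <btBulletDynamicsCommon.h>
#include <LinearMath/btQuickprof.h>

int main()
{
    // Standard Bullet 2.x world setup.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -10, 0));

    // ...create and add bodies here...

    // Step once, then dump the hierarchical timings; this is what
    // produces the "Profiling: Root" output quoted above.
    world.stepSimulation(1.0f / 60.0f);
    CProfileManager::dumpAll();
    return 0;
}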
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
Wait...has someone said that CPU physics capability is superior to the GPU's? What is this monologue thing you're doing, Lonbjerg?
 

Scali

Banned
Dec 3, 2004
2,495
0
0
"DirectX11 cloth demo on NVIDIA GTX 470, 1920x1200: 723 FPS

Well, my Gigabyte GTX460 1 GB OC (715 MHz) gets close to 900 fps... okay, it's 1920x1080, but that's probably not the reason, as it runs at the same speed in the standard 800x600 window (and 1920x1200 is only about 11% more pixels than 1920x1080). Those last few pixels won't drop it by over 160 fps.
http://bohemiq.scali.eu.org/BulletDX11Cloth.png
And that's with an E6600 CPU.

Not sure why they score less with a more powerful CPU and GPU.