Why I don't think PhysX is a big deal

Page 3 - AnandTech Forums

thilanliyan

Lifer
Jun 21, 2005
12,048
2,262
126
Originally posted by: Wreckage
Is that why Intel chipsets will now be supporting SLI? Do you ever get tired of being wrong?

Which Intel chipsets will support SLI? Or are you talking about Nehalem chipsets...which will require an EXTRA chip to run SLI?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: apoppin
Same ole rumor

You still did not link to the ATI Havok driver or supported games.

I know, I know. No such thing exists and you are using rumor as fact.

Maybe in 2012 you can actually argue this point. maybe.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Wreckage
Originally posted by: apoppin
Same ole rumor

You still did not link to the ATI Havok driver or supported games.

I know, I know. No such thing exists and you are using rumor as fact.

Maybe in 2012 you can actually argue this point. maybe.

Rumor? From the FIVE links I quoted saying that AMD is DEFINITELY GOING TO use Havok for physics calculations on both their CPUs and GPUs in PC gaming?

If that fact escapes you, nothing will open your eyes. Yes, in 2012 PhysX may be completely dead.

And if you are talking about *today* .. let's see more than your tech demos
- right now .. show me!

 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
This whole sh!t about PhysX reminds me of the SM3.0 days of the 6-series. People were screaming about it. I mean people were literally saying *ignore ATi*, to be blunt. And how did it end? Well, sure, SM3.0 *was* the deal in the end, but the 6-series sucked so badly at it in regular titles that it was meaningless until the next gen hit the shelves. So if you're basing your purchase decision on PhysX, you should reconsider, cause, frankly, I am pretty sure that by the time the next gen hits the market both nV and ATi will support it. And this gen... there are no titles that will require it just to run the game! It was the same deal with SM3.0. And no, sucky tech-demo gameplay titles do not count.

The HD4870 is faster than the GTX260 - fact.
The HD4870X2 rips the GTX280 a new one - fact.
The HD4850 is the best mid-range card - fact.

And no, it's only in the US that the GTX260 is cheaper and the GTX280 has a fair price. In the EU, Asia, Africa... the prices haven't changed a lot and won't.
 

toadeater

Senior member
Jul 16, 2007
488
0
0
"PhysX by NVIDIA - A review of what to expect"

http://www.guru3d.com/article/physx-by-nvidia-review/1

Judging by the performance of PhysX on the GPU--even a budget GPU in SLI, Intel/AMD are going to have a rough time beating it with Havok on the CPU. Who wants to buy an expensive Nehalem or Larrabee, and whatever it is AMD is still trying to get off the drawing board, when they can plug in a new GPU on their existing systems and get a MASSIVE boost in physics and graphics performance? Gamers don't need new CPUs, they need new GPUs. That is the cheapest, most effective upgrade route for the next couple of years. Gamers don't have to pay some Intel-AMD CPU tax and be forced to buy a new motherboard, DDR3 RAM in addition to a new GPU they'll have to buy anyway.

But Nvidia is going to have a rough time selling SLI for future multi-GPU desktop "supercomputers," so Nvidia will likely have to support Crossfire unless Intel once again licenses SLI. At least Nvidia doesn't have to worry about GPU computing since it will be included in DX11 and OpenCL.
 

deerhunter716

Member
Jul 17, 2007
163
0
0
I still say Havok is looking better and better especially with Intel behind it now!! Time will tell if either really becomes anything at all.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: toadeater
"PhysX by NVIDIA - A review of what to expect"

http://www.guru3d.com/article/physx-by-nvidia-review/1

Judging by the performance of PhysX on the GPU--even a budget GPU in SLI, Intel/AMD are going to have a rough time beating it with Havok on the CPU. Who wants to buy an expensive Nehalem or Larrabee, and whatever it is AMD is still trying to get off the drawing board, when they can plug in a new GPU on their existing systems and get a MASSIVE boost in physics and graphics performance? Gamers don't need new CPUs, they need new GPUs. That is the cheapest, most effective upgrade route for the next couple of years. Gamers don't have to pay some Intel-AMD CPU tax and be forced to buy a new motherboard, DDR3 RAM in addition to a new GPU they'll have to buy anyway.

But Nvidia is going to have a rough time selling SLI for future multi-GPU desktop "supercomputers," so Nvidia will likely have to support Crossfire unless Intel once again licenses SLI. At least Nvidia doesn't have to worry about GPU computing since it will be included in DX11 and OpenCL.

Here is their conclusion, and their problem:

The enigma however remains that though a userbase of 70 million potential GPUs can open up broader game support, it's still based on CUDA. And though CUDA is an open standard, it's tied to GeForce graphics cards. ATI surely won't use it, and neither will Intel or even S3. So a large chunk of the market (read: ATI users) cannot access this technology. However I do see an interesting option to use a GeForce card solely for PhysX, and maybe even a Radeon card to render the games. Unless NVIDIA will prevent this in its drivers of course .. from my point of view, there is no obstruction for that construction.

What is this about using a GeForce card to run PhysX and a Radeon card for the graphics?

IF so, that might turn out much better for Nvidia .. and for all of us
- a cheapass Nvidia GPU
:D

So your PC might be powered by Nvidia/Intel and AMD
:Q

Even though I already read it, on second look the article is interesting .. here is what the author likes and dislikes about PhysX; it is a pretty good summary:

NVIDIA's GeForce PhysX implementation

I have to be honest here, all respect to the PhysX team for making this happen. NVIDIA has created three sets of circumstances on how you can choose to use your PhysX setup from within the PhysX driver, let's have a look:

*Standard - one GPU renders both Graphics + PhysX (not ideal as you'll need a lot of GPU horsepower).
*SLI mode - have two GPUs render both Graphics + PhysX.
*Multi-GPU mode - GPU1 renders Graphics and GPU2 renders PhysX.

For me personally the last option is by far the ideal one, as this is a situation where, with any mainboard with two x8 or x16 PCIe slots, you can use your old adapter as a PhysX unit.

GeForce PhysX - you do not need a powerhouse of a graphics card to deal with the PhysX calculations. . . . you use your 9600 GT (or even 8600 GT) as a PhysX unit and the GTX 260 for graphics.

Why do I like that so much? Because you can do something new with an old, outdated graphics card and ... and you are not bound to an nForce platform since you are not running SLI mode.

What's there to dislike?

... when you have two GPUs at full load hard at work, it inevitably will increase the power consumption.

Here is what everything hinges on, imo:
Fact is, for PhysX to succeed the software industry will need to see broad support spread out over all vendors. In roughly 18 months DirectX 11 will be released. And that's where everything will change for sure.

My 4870 will be ancient in 18 months; I believe we will not see widespread industry acceptance of any one standard by then, and Nvidia may even open things up to allow my Radeon to run the graphics and my 8800GTX to run the PhysX.

PhysX is unimportant to a graphics card purchase today. But that is my opinion, and it is subject to change as more info becomes available.

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
PhysX can run on the CPU, or it can run on the CPU AND the GPU at once; adding the GPU just makes it run faster.
Most of these games use so few PhysX effects that the CPU is the only thing used, and not maxed out at that. So really, those games don't matter. What matters is games where PhysX can take things to the next level...

Check out this
http://forums.anandtech.com/me...=2215882&enterthread=y

Basically GRAW is the first instance I saw where PhysX is not just "extra particle effects". The player can go break off a single part of a plank in a wall, put his gun through it, and shoot people while remaining hidden. Something previously impossible. And the AI uses those tricks AS WELL. So now you have gone to a whole new level of immersion.

PhysX is very VERY intensive; it eats through your frame rate. GPU-accelerated PhysX greatly mitigates that. But that intensity is there for a reason: there is more data to calculate. AA is super intensive as well and gives a fraction of what PhysX does. I would think it is obvious that I would rather disable AA than disable PhysX.
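To make "more data to calculate" concrete, here is a toy sketch (plain Python, not the PhysX SDK or anything Nvidia ships) of the per-frame integration work a physics engine has to do. Every extra simulated object adds work on every frame, which is why heavy physics effects eat frame rate and why offloading the calculations to a GPU helps:

```python
GRAVITY = -9.81   # m/s^2
DT = 1.0 / 60.0   # one frame at 60 fps

def step(particles):
    """Advance every particle one frame with semi-implicit Euler.

    particles: list of dicts with 'pos' and 'vel' (vertical, metres).
    Cost grows linearly with particle count just for integration;
    collision checks in a real engine grow much faster, which is
    exactly the workload GPU acceleration targets.
    """
    for p in particles:
        p["vel"] += GRAVITY * DT        # integrate acceleration
        p["pos"] += p["vel"] * DT       # integrate velocity
        if p["pos"] < 0.0:              # crude floor collision
            p["pos"] = 0.0
            p["vel"] = -0.5 * p["vel"]  # bounce, losing energy
    return particles

# One frame of 10,000 debris particles dropped from 2 m:
debris = [{"pos": 2.0, "vel": 0.0} for _ in range(10_000)]
step(debris)
```

A game doing this for thousands of splinters, fluid blobs, or cloth vertices repeats that loop 60 times a second, on top of rendering, so the frame-rate cost taltamir describes falls straight out of the object count.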
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
So Havok will also run on the CPU and the GPU

What is the point?

They are competing systems: Intel/AMD support Havok and Nvidia supports PhysX.

Who will win?
- can you see the future clearly; can anyone else here?

I am simply betting on Intel/AMD. Even though they may have a *perceived* "late start", AMD had already been working on GPU-calculated physics when they recently and very publicly decided to go with Havok, which already has a nice library; they just need to port some of it, and I expect to see AAA titles eventually showcasing it too. It is not like writing a completely new game [I believe]. I do not think AMD/Intel are so far behind. But that is my reading of my own crystal ball =P

The really COOL thing is, I get to check it out too

From that, http://www.guru3d.com/article/physx-by-nvidia-review/8
The proof is in the pudding. NVIDIA will release its PhysX Pack #1 next Tuesday, August the 12th. It will contain PhysX-related freebies that will get you guys going on the right track. This is what you can expect:

nVIDIA PhysX Pack #1

* Full version of Warmonger
* Full version of UT III PhysX Mod Pack (Includes three PhysX-specific levels)
* Latest Ghost Recon Advanced Warfighter 2 patch (1.05)
* First peek at Nurien, an upcoming social networking service, based on UE III
* First peek at Metal Knight Zero, an upcoming PhysX-capable title
* "The Great Kulu" technology demo
* "Fluid" technology demo

Obviously at that date the release of GeForce Forceware 177.79 drivers and the new PhysX drivers 8.07.18 is pending as well.

I'll take a free game to try on my 8800GTX .. if I like it, I can pick up a cheap-as-hell 2nd Nvidia GPU

best of all worlds

Maybe I will agree with you

I doubt it .. based on 1 game
- but that is a prediction .. my mind is open to new experiences

Hopefully tonight .. at the party
:wine: