Opinion - Nvidia for PhysX or ATI?


Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
In the ranking of what I take into consideration when buying a graphics card, PhysX scores a few slots below the box art in importance. In other words, I wouldn't factor it into my purchasing decision at all.

I think it was 2006 when Ageia launched their PPU card, and Nvidia bought them in 2008. After all these years it has still had very, very little impact on gaming. I just do not see hardware PhysX taking off.

It's had four years; it's not taking off. It's a dead-in-the-water feature. They ought to take the money they waste on PhysX and put it toward making faster cards.

18 games in four years, 75% of which are trash: free demos, 5% of the full game.
 

sandorski

No Lifer
Oct 10, 1999
70,783
6,341
126
Nvidia, by locking out the use of competitor cards for video, has essentially killed PhysX.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
It's had four years; it's not taking off. It's a dead-in-the-water feature. They ought to take the money they waste on PhysX and put it toward making faster cards.

18 games in four years, 75% of which are trash: free demos, 5% of the full game.

News flash: tessellation's adoption rate is slower than PhysX's. Tessellation is a dead-in-the-water feature. They ought to take the money they waste on tessellation and put it towards making faster cards.

News flash: OpenCL doesn't have an adoption rate. It is an aborted fetus. It doesn't have a single game that uses it. They ought to stop talking about OpenCL and instead talk about faster cards.

News flash: the adoption rate of Eyefinity/Surround is lower than that of PhysX games. Eyefinity/Surround is dead in the water. They ought to stop wasting money on it and instead use it to make faster cards.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
IMHO, hardware PhysX is not a must; it's something nice to have, but not necessary. Many naysayers are right: not many games support PhysX, and those that do are likely not fun to play. However, hardware PhysX is not a separate unit on the card, but a way of utilizing the CUDA cores, meaning that nothing is wasted on non-PhysX games. What gets confusing is that PhysX works regardless of the brand of video card. Since games are designed for players and not video cards, the core mechanics of any game will work on both camps, so you won't miss a beat. At most, you lose some eye candy in exchange for a better framerate.

Having said that, Nvidia video cards are not more expensive than AMD cards. You don't pay more for hardware PhysX; you get it if you have an Nvidia video card. Check out AnandTech's benchmarks and then check the price of the cards, and you will realize that the more you pay, the better the card you will get. If that card is Nvidia, then you are open to hardware PhysX. If the card is AMD, then you are open to triple display with a single card.

There is no downside to hardware PhysX, so getting an Nvidia card does give you the possibility of a better visual experience in, say, 1 or 2 good games over the life of the card. If hardware PhysX isn't strong enough motivation when it comes to selection, then pass, and you will still be able to enjoy PhysX with other cards. Note that other than PhysX, Nvidia's cards are also better at tessellation. As of now, naysayers claim that current games don't need that level of tessellation. If tessellation does get adopted in general, games will only get more demanding in that department, meaning that an Nvidia card may age a bit slower than an AMD card. Also, with red/blue 3D glasses, you can experience 3D on an Nvidia card with no other extra equipment.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Surely this thread is a wind-up, fellas... The OP is asking which card to choose for PhysX, when there is only one choice right now... Regardless of other people's opinions on the value of PhysX, it is obviously of some interest to the OP...
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Wow, back in 2008? Can we remember that far back? When I asked Nvidia about PhysX at Nvision08, they said all they did was port Ageia PhysX to CUDA. Basically, they did nothing with it until 2009, when they went to their dev partners. And it takes about 3 years to make it from concept to finished game.

Alice: Madness Returns is impressive with PhysX on high. I am interested in seeing what Batman: Arkham City brings.

Hey, how well does PhysX perform on the CPU now? In Alice, I mean. I've been curious ever since Mafia II had much improved performance for PhysX, and nV did say they were working on getting PhysX to run better on the CPU.

I saw in some benchmarks that an overclocked i7 920 was performing just as fast as a GTX 460. I would like to see how a 2600K handles PhysX now.

 

WelshBloke

Lifer
Jan 12, 2005
33,106
11,281
136
Surely this thread is a wind-up, fellas... The OP is asking which card to choose for PhysX, when there is only one choice right now... Regardless of other people's opinions on the value of PhysX, it is obviously of some interest to the OP...

No he isn't. He was asking if it was worth spending more on a card that has PhysX.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
And I believe everyone who cares to, has given their opinion. Now it's up to the OP.
 

PrayForDeath

Diamond Member
Apr 12, 2004
3,478
1
76
I have to say that Alice video looked great. It's a shame nVidia wouldn't let others use their technology.
 
May 13, 2009
12,333
612
126
I don't know why everyone just dismisses 3D completely? It's not like PhysX or whatever, where you really have to look to see it. 3D is right there in your face. It's freaking awesome IMO. I played BF2 a few days ago in 3D after not playing PC games for a few weeks and I was like a kid in a candy store. I was even telling my wife to try the glasses on and look at the game. It was like discovering its awesomeness all over again.
 

Arg Clin

Senior member
Oct 24, 2010
416
0
76
While PhysX is nice to have, it doesn't really seem to have a big impact right now, or in the near future. If it takes off 2-3 years from now, you'll probably want a new GPU by then anyway.

I'd let PhysX take a backseat to other concerns like your opinion on driver quality, heat & noise, and (naturally) price. If all else is equal, sure, why not get PhysX. All else is rarely equal, though...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Hey, how well does PhysX perform on the CPU now? In Alice, I mean. I've been curious ever since Mafia II had much improved performance for PhysX, and nV did say they were working on getting PhysX to run better on the CPU.

I saw in some benchmarks that an overclocked i7 920 was performing just as fast as a GTX 460. I would like to see how a 2600K handles PhysX now.
Surprisingly well. You can get playable framerates in Alice: Madness Returns with PhysX on *medium* - most of the time. When the pepper grinder really gets going, it will choke the CPU, however. I tested it with my i7-920 at 4.0GHz; the i7-2600K is not that much faster.

I have to say that Alice video looked great. It's a shame nVidia wouldn't let others use their technology.
PhysX *does* run on the CPU, but not on 'high'. You just cannot run the fluid animations.

It takes a GTX 560 Ti to generally run Alice on 'high'; the CPU simply cannot run it. However, when the pepper grinder is churning out smoke and particles (filling the entire screen) and you have the oily PhysX fluid animations going everywhere, you will want a GTX 580 (or even a GTX 590 at 2560x1600).

I don't know why everyone just dismisses 3D completely?
It's mostly a case of "sour grapes" - 'I don't have it, so it must suck'.

I have put 5 months into an S3D evaluation with nearly 100 games. I also brought 20 of my friends and neighbors in to evaluate it with me. NOT ONE person disliked it, and I couldn't get the kids out of here. 20 for 20 were impressed. One medical doctor now wants his own 3D Vision kit (and a big-screen HD 3DTV).
:whistle:
 
Last edited:

Seero

Golden Member
Nov 4, 2009
1,456
0
0
I don't know why everyone just dismisses 3D completely? It's not like PhysX or whatever, where you really have to look to see it. 3D is right there in your face. It's freaking awesome IMO. I played BF2 a few days ago in 3D after not playing PC games for a few weeks and I was like a kid in a candy store. I was even telling my wife to try the glasses on and look at the game. It was like discovering its awesomeness all over again.

Previously, AMD was inactive on both 3D and gaming. Fanboys believed it was a gimmick and tried to sabotage it. Now AMD is once again active on both fronts: for 3D, although they don't support it directly, many monitors come with their own 3D software, and AMD has actually introduced a 3D development kit. On the gaming side, they now have the "Gaming Evolved" program, which focuses on gaming. With both parties playing actively, we should see an interesting immediate future.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Surprisingly well. You can get playable framerates in Alice: Madness Returns with PhysX on *medium* - most of the time. When the pepper grinder really gets going, it will choke the CPU, however. I tested it with my i7-920 at 4.0GHz; the i7-2600K is not that much faster.

That's actually good news to me, and it should be to everyone. I wonder if more cores would increase performance. I'd like to see what a 6-core SB-E and Bulldozer would be able to do with PhysX.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
That's actually good news to me, and it should be to everyone. I wonder if more cores would increase performance. I'd like to see what a 6-core SB-E and Bulldozer would be able to do with PhysX.
Nvidia is finally reworking PhysX. Just over two years ago they did a straight port of Ageia's PhysX to CUDA, and it worked on a single CPU core. Now, I believe it is optimized for more than one core, and the next iteration of PhysX (3.0) was released recently; it should be even more optimized for future games.
This is a pretty good tweak guide for optimizing PhysX in Alice:

http://physxinfo.com/news/5883/gpu-physx-in-alice-madness-returns/

Arkham City is the next PhysX showcase. I am looking forward to seeing what it brings.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Because the developer has to code for it, and why spend time and money coding for something that <50% of people will use?

Oh right, logic apparently doesn't work in this thread. That opportunity cost link was pasted around here somewhere...

Why code for higher-end systems or have maxed graphical settings when 100 percent of the end-users can't utilize them?
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
It's mostly a case of "sour grapes" - 'I don't have it, so it must suck'.

I have put 5 months into an S3D evaluation with nearly 100 games. I also brought 20 of my friends and neighbors in to evaluate it with me. NOT ONE person disliked it, and I couldn't get the kids out of here. 20 for 20 were impressed. One medical doctor now wants his own 3D Vision kit (and a big-screen HD 3DTV).
:whistle:

Is this you trying to somehow impress upon us that your "evaluation" was empirically sound?

This sounds like your personal opinion to me, and although I'll be the first to admit you have way more experience in testing and way more hardware available to enjoy, it's still only your opinion. The guy sitting next to you could just as well hate it.


Watching a movie in 3D at a cinema is great. Having glasses on while sitting in your own home, fragging deformed humans in Duke Nukem, is just not great. More like "I need some friends".
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Nvidia is finally reworking PhysX. Just over two years ago they did a straight port of Ageia's PhysX to CUDA, and it worked on a single CPU core. Now, I believe it is optimized for more than one core, and the next iteration of PhysX (3.0) was released recently; it should be even more optimized for future games.
This is a pretty good tweak guide for optimizing PhysX in Alice:

http://physxinfo.com/news/5883/gpu-physx-in-alice-madness-returns/

Arkham City is the next PhysX showcase. I am looking forward to seeing what it brings.

Imho,

That's what I never understood -- all the talk about trying to make the CPU look bad on purpose, when most PhysX titles were CPU-based. It was in nVidia's self-interest to make the CPU look good too, given the number of titles that utilize the CPU, now and in the future. For PhysX to be taken seriously, one has to innovate not only for the GPU but for the CPU as well, because both have their strengths.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Is this you trying to somehow impress upon us that your "evaluation" was empirically sound?

This sounds like your personal opinion to me, and although I'll be the first to admit you have way more experience in testing and way more hardware available to enjoy, it's still only your opinion. The guy sitting next to you could just as well hate it.


Watching a movie in 3D at a cinema is great. Having glasses on while sitting in your own home, fragging deformed humans in Duke Nukem, is just not great. More like "I need some friends".
I am not trying to impress anyone here. Especially the 3D haters.
- I am not selling 3D, and I don't get a penny for recommending it to all of my friends now.

I got a 3D Vision kit and I spent 5 months with nearly 100 games. To *validate* my OWN EXPERIENCE (and to make for a much more interesting evaluation - which I am writing now), I brought in my friends and neighbors (for many hours) to evaluate it with me - including another ABT senior editor who was visiting me for a few days.

Let's see - the difference between your experience and mine: yours is sitting home alone fragging deformed humans in 2D, while mine is in 3D. :p
- Maybe it's more like "you need some friends" .. I have had my share over the past few months evaluating S3D with me.
:whistle:

20/20 liked 3D Vision - those are awesome odds; I am still looking for the person who hates it (who has *really* tried it). S3D gaming is not the same as the cinema S3D experience at all; the experience happens "inside" the box, with few "pop-outs".
:thumbsup:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Imho,

That's what I never understood -- all the talk about trying to make the CPU look bad on purpose, when most PhysX titles were CPU-based. It was in nVidia's self-interest to make the CPU look good too, given the number of titles that utilize the CPU, now and in the future. For PhysX to be taken seriously, one has to innovate not only for the GPU but for the CPU as well, because both have their strengths.
I asked Nvidia this same question back at their first GTC in '09. Their answer was "we did a straight port of Ageia PhysX to CUDA and haven't spent any time with it yet". That was 2 years ago.

It evidently took a couple of years to rewrite the PhysX SDK, and they are actively optimizing it. I believe they are up to 3.0 now (and Alice is pre-3.0). Yes, they are actively innovating PhysX for both the CPU and the GPU; however, it will always run better on the GPU due to its nature.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
Let's see - the difference between your experience and mine: yours is sitting home alone fragging deformed humans in 2D, while mine is in 3D. :p
- Maybe it's more like "you need some friends" .. I have had my share over the past few months evaluating S3D with me.
:whistle:

Haha! :) good one! :)
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
20/20 liked 3D Vision - those are awesome odds; I am still looking for the person who hates it (who has *really* tried it). S3D gaming is not the same as the cinema S3D experience at all; the experience happens "inside" the box, with few "pop-outs".
:thumbsup:
I have 3D Vision, and it's not that I hated it, but I do have it off most of the time when playing multi-player games. It isn't that it doesn't work; it's more that it works too well.

In 3D, everything is more intense, which makes it more difficult to compete. Yes, the person further away does look further away, but in 2D my eyes don't work as hard, if you know what I mean. I don't know about S3D in general, but Nvidia 3D does not play nice with SLI in some games. I enjoy playing single-player games a lot in 3D, but not in a serious raid or multi-player FPS; it is just too much to handle.

Other than WoW, none of the other MMOs have a 3D cursor. Even SC2 doesn't have a 3D cursor, and without one the whole freaking control scheme is just not right, which makes me want to throw the entire setup off the table (exaggerating). If you know a trick or two about setup, please please please share.

And I still see ghosts even on an independent USB3 PCIe card...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Yes, I DO know exactly what you mean. I could not play Bulletstorm on the first play-through without getting "confused". Some games (read: EA-sponsored) only pay lip service to S3D while generally botching the experience. After tweaking the game thoroughly, I can appreciate the S3D experience - but still not in super-widescreen (5760x1080) 3D Vision.

The first two months that I had 3D Vision, I felt a bit like you did. Then I discovered the hotkeys and the infinite tweaking you can do to reduce issues (convergence/ghosting), and it got much, much better. I hope you are using Nvidia's crosshair and not the one supplied in-game.
:thumbsup:

I have 3D Vision, and it's not that I hated it, but I do have it off most of the time when playing multi-player games. It isn't that it doesn't work; it's more that it works too well.

In 3D, everything is more intense, which makes it more difficult to compete. Yes, the person further away does look further away, but in 2D my eyes don't work as hard, if you know what I mean. I don't know about S3D in general, but Nvidia 3D does not play nice with SLI in some games. I enjoy playing single-player games a lot in 3D, but not in a serious raid or multi-player FPS; it is just too much to handle.

Other than WoW, none of the other MMOs have a 3D cursor. Even SC2 doesn't have a 3D cursor, and without one the whole freaking control scheme is just not right, which makes me want to throw the entire setup off the table (exaggerating). If you know a trick or two about setup, please please please share.

And I still see ghosts even on an independent USB3 PCIe card...
 
Last edited: