New Mafia II PhysX ON/OFF video.

bryanW1995

Lifer
May 22, 2007
11,144
32
91
oops, never mind, the real reason that I keep buying nvidia is obvious:

Buying a video card without PhysX would be like buying a video card without AA or AF.

thanks for clearing that up, Wreckage; you just verbalized what we were all thinking!
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
What happened to innovation being about producing powerful GPUs and video cards? Just straight-up raw horsepower. Being the leader in delivering higher-performing parts?

NVIDIA also has the fastest chip. So I guess they nailed that area as well. :thumbsup: Probably why you have a pair in your signature. Because ATI just couldn't do it for you.


Besides, the difference between 40fps and 10000fps is just a number you read on a screen. I guess it only matters if you are looking for some sort of fps ego trip. :rolleyes:
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I have no idea what he was complaining about with nVidia. He complains about the fastest cards and speed, yet that is exactly what nVidia offers. If you don't like a feature, you don't have to use it, and you can still enjoy all the performance you desire -- if that is important to ya.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Downplaying a feature you don't have is textbook jealousy.

actually jealousy is what you feel when your preferred vendor gets smoked like a cheap cigarette for 6 months b/c its dear leader wants to go after a market that doesn't even exist instead of making tons of money in a market that he used to dominate. Then you feel it even more when your preferred vendor comes out with one or two decent offerings, only to be smoked even more by the imminent release of newer hardware again from the competition. You REALLY get jealous when you look up and realize that it will be 18 mos between the last time your company had the fastest card and the next time that it could even POSSIBLY regain the performance crown, even though everything your company does is (allegedly) geared towards this one goal. And, finally, jealousy is what prompts you to brag about physx and 3d because you have nothing else to talk about.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
This is the myth right there: because nvidia implements some features unique to their cards that are very limited in scope and only work with a small number of titles, they are suddenly innovative and trying harder?

What happened to innovation being about producing powerful GPUs and video cards? Just straight-up raw horsepower. Being the leader in delivering higher-performing parts?

This constant reference to 'innovation' because of things like physx and 3dvision is getting tired and old.

Right now the innovation is coming from ATI. They're delivering higher-performing parts to market faster than nvidia. They are making better technology available to the end-user sooner than nvidia.

Higher framerates give a better experience than useless stuff like physx. You can continue to ask what the difference is between 200 and 100fps, and I agree: nothing. But when we are talking about games that are visually stunning, higher framerates are everything, because they're the difference between playing the game and a stuttering slideshow.

If we are talking about increasing resolutions to higher and higher levels, which arguably makes a big visual impact, we need raw performance, not junk like physx.

And for perspective, my system gets about 80-90fps without physx in Mafia 2 and anywhere from as low as 25 to as high as 50 with physx on. It makes the game unplayable in certain areas, so it really does matter. If it were actually delivering a significant visual improvement I wouldn't find it so unreasonable, but it adds so little that a roughly 50% framerate cut is laughable.

When a developer wants to again push the envelope in real, tangible visual improvement the way a game like Crysis did, it's not physx that will do it; it's raw GPU horsepower and the ability to deliver those visuals at a playable framerate.

Awesome post right there.

Now, since I can use physx, I can say it's not worth the performance hit or worth considering when buying a card.

The effects aren't even that good: Vito's coat moves around like it's made of silk or something, the stone chunks have no inertia, and the cars always go straight up when they explode. SF4 and FF13 have better cloth effects than that game.

The other physx effects look like they were added as an afterthought after the game was finished (just like some of you were saying about the DX11 stuff in Dirt 2 and BC2 that AMD helped add, although that didn't kill performance).
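
As a quick sanity check of the framerate hit quoted above (taking the reported 80-90 fps with PhysX off and 25-50 fps with it on at face value; this is just back-of-the-envelope arithmetic, not a new benchmark):

```python
# Rough check of the PhysX framerate hit using the figures reported in the post above.
fps_off = (80, 90)   # reported range with PhysX disabled
fps_on = (25, 50)    # reported range with PhysX enabled

best_case = 1 - fps_on[1] / fps_off[0]    # 50 fps vs 80 fps -> ~38% drop
worst_case = 1 - fps_on[0] / fps_off[1]   # 25 fps vs 90 fps -> ~72% drop

print(f"PhysX framerate hit: {best_case:.0%} to {worst_case:.0%}")
```

So the "roughly 50%" figure sits in the middle of that range; in the worst spots the cut is closer to 70%.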
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
NVIDIA also has the fastest chip. So I guess they nailed that area as well. :thumbsup: Probably why you have a pair in your signature. Because ATI just couldn't do it for you.

Quoted for hilarity

Fortunately bryan already said everything that needed to be said, [redacted]


Personal attacks are not acceptable.

Moderator Idontcare
 
Last edited by a moderator:

Wreckage

Banned
Jul 1, 2005
5,529
0
0
I have no idea what he was complaining about with nVidia. He complains about the fastest cards and speed, yet that is exactly what nVidia offers. If you don't like a feature, you don't have to use it, and you can still enjoy all the performance you desire -- if that is important to ya.

Agreed :thumbsup:

I can't wait until NVIDIA buys AMD and we can end these arguments once and for all. ;)
 

golem

Senior member
Oct 6, 2000
838
3
76
This is the myth right there: because nvidia implements some features unique to their cards that are very limited in scope and only work with a small number of titles, they are suddenly innovative and trying harder?

What happened to innovation being about producing powerful GPUs and video cards? Just straight-up raw horsepower. Being the leader in delivering higher-performing parts?

This constant reference to 'innovation' because of things like physx and 3dvision is getting tired and old.

Right now the innovation is coming from ATI. They're delivering higher-performing parts to market faster than nvidia. They are making better technology available to the end-user sooner than nvidia.

Higher framerates give a better experience than useless stuff like physx. You can continue to ask what the difference is between 200 and 100fps, and I agree: nothing. But when we are talking about games that are visually stunning, higher framerates are everything, because they're the difference between playing the game and a stuttering slideshow.

If we are talking about increasing resolutions to higher and higher levels, which arguably makes a big visual impact, we need raw performance, not junk like physx.

And for perspective, my system gets about 80-90fps without physx in Mafia 2 and anywhere from as low as 25 to as high as 50 with physx on. It makes the game unplayable in certain areas, so it really does matter. If it were actually delivering a significant visual improvement I wouldn't find it so unreasonable, but it adds so little that a roughly 50% framerate cut is laughable.

When a developer wants to again push the envelope in real, tangible visual improvement the way a game like Crysis did, it's not physx that will do it; it's raw GPU horsepower and the ability to deliver those visuals at a playable framerate.

ATI is currently faster to market, there's no question about that. But Nvidia currently has the fastest GPU. Even when the HD 68xx comes out, it's not like Nvidia cards will suddenly become slow; they just won't be the fastest until they refresh. So if you count having a fast GPU as innovation, then both companies innovate.

But if you want something outside of just raw speed, then Nvidia provides more innovation than ATI does, because besides Eyefinity there's not really much else from ATI.

It's very true that Physx has a small number of titles (I think 3d can be used with almost any title?), but what about raw speed above a certain level? It's obviously important to you. But besides a few titles at high resolutions (over 19x12), what isn't playable on a midrange card (GTX 460 and above)? (I know that's subjective, but then so is saying physx, 3d, or multiple monitors aren't necessary or suck.) The number of titles that need more than a GTX 460 at mainstream resolutions is probably EVEN smaller than the number of titles that use Physx. I'm talking about quantity of games, not quality.

So ATI is innovating for this small, uber performance-demanding crowd. But so is Nvidia, maybe not as fast as you would like currently, but they are. Nvidia is also innovating for people that would maybe like something besides just raw speed. Why is this bad?

To increase resolutions so they have a big impact, you need an expensive monitor. I would argue that people that have these monitors already have cards that can drive them with current titles. Do you foresee any titles within the next upgrade cycle that are more demanding than Metro 2033 or Crysis? If not, or not many, then I would argue a faster card is just as useless as you claim physx to be. Not for all, but for many.

You claim to not want to do tweaking to run physx well. That's fine and up to you. But if you plan on getting the fastest ATI cards and Crossfiring them, are you going to bitch about Crossfire too? Because currently it does require some tweaking to get working optimally. Until the latest drivers, it was pretty much broken for the last couple of months for several games.

Regarding what developers want... name some PC games that required raw horsepower to give a tangible visual improvement after Crysis (a game that's several years old). Not many; besides StarCraft 2, most noteworthy games are console ports now. Developers develop for the console first and then port it over to the PC with little or no modification (unless of course some video card vendor pays for it). Crysis 2 may change that, but that's not until next year, when NI and Fermi 2 should be coming out.

So is it bad to have more speed than you need? Of course not. You buy the card knowing you're spending the funds on something you don't really need right now or might not ever need. But then you could say the same about physx, 3d, eyefinity, etc. It's built into the cost of the card, already paid for; if you don't use it, fine, but it's there in case you do.
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
NVIDIA also has the fastest chip. So I guess they nailed that area as well. :thumbsup: Probably why you have a pair in your signature. Because ATI just couldn't do it for you.


Besides, the difference between 40fps and 10000fps is just a number you read on a screen. I guess it only matters if you are looking for some sort of fps ego trip. :rolleyes:

[redacted]


Personal attacks are not acceptable.

Moderator Idontcare
 
Last edited by a moderator:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Agreed :thumbsup:

I can't wait until NVIDIA buys AMD and we can end these arguments once and for all. ;)

I don't know; AMD has gained the discrete edge, has a very balanced line-up with impressive performance per watt, and is simply executing with fine precision. They have an opportunity to gain traction with their CPUs on a level playing field, and are bringing in Fusion to create new revenue streams and platform potential. They have the CPU and the GPU and they have talent -- a tall order here.

nVidia has their hands full, but I certainly enjoy their pro-active nature and their attempts to create new gaming experiences for the PC platform.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
It's obvious that followers of the green are heavily promoting this good-turned-bad tech, physx, in this topic.


What is so hard to understand about this scenario: you have two of the highest-performing Nvidia GPUs --> you turn on a feature called physx, which adds some debris and eyecandy --> for all intents and purposes it now feels like you only have one GTX 480 beast instead of two.

You could either think: Damn! Mafia 2 has some gorgeous and taxing graphics.

Or you could think: But... Crysis (insert random other better-looking game here) looks better... so why is this crap feature disabling one of my cards?



Jeez... I got better things to do than edit this post.
 
Last edited:

Wreckage

Banned
Jul 1, 2005
5,529
0
0
I don't know; AMD has gained the discrete edge, has a very balanced line-up with impressive performance per watt, and is simply executing with fine precision. They have an opportunity to gain traction with their CPUs on a level playing field, and are bringing in Fusion to create new revenue streams and platform potential. They have the CPU and the GPU and they have talent -- a tall order here.

nVidia has their hands full, but I certainly enjoy their pro-active nature and their attempts to create new gaming experiences for the PC platform.

All that needs to happen (and several financial analysts have been predicting it) is for AMD to file for bankruptcy. Then NVIDIA (which has cash in the bank) could buy AMD. The recent FTC ruling even helps with the whole x86 issue (many were thinking NVIDIA might buy VIA because of this).

This could all easily happen before the end of next year.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
ATI is currently faster to market, there's no question about that. But Nvidia currently has the fastest GPU. Even when the HD 68xx comes out, it's not like Nvidia cards will suddenly become slow; they just won't be the fastest until they refresh. So if you count having a fast GPU as innovation, then both companies innovate.

But if you want something outside of just raw speed, then Nvidia provides more innovation than ATI does, because besides Eyefinity there's not really much else from ATI.

It's very true that Physx has a small number of titles (I think 3d can be used with almost any title?), but what about raw speed above a certain level? It's obviously important to you. But besides a few titles at high resolutions (over 19x12), what isn't playable on a midrange card (GTX 460 and above)? (I know that's subjective, but then so is saying physx, 3d, or multiple monitors aren't necessary or suck.) The number of titles that need more than a GTX 460 at mainstream resolutions is probably EVEN smaller than the number of titles that use Physx. I'm talking about quantity of games, not quality.

So ATI is innovating for this small, uber performance-demanding crowd. But so is Nvidia, maybe not as fast as you would like currently, but they are. Nvidia is also innovating for people that would maybe like something besides just raw speed. Why is this bad?

To increase resolutions so they have a big impact, you need an expensive monitor. I would argue that people that have these monitors already have cards that can drive them with current titles. Do you foresee any titles within the next upgrade cycle that are more demanding than Metro 2033 or Crysis? If not, or not many, then I would argue a faster card is just as useless as you claim physx to be. Not for all, but for many.

You claim to not want to do tweaking to run physx well. That's fine and up to you. But if you plan on getting the fastest ATI cards and Crossfiring them, are you going to bitch about Crossfire too? Because currently it does require some tweaking to get working optimally. Until the latest drivers, it was pretty much broken for the last couple of months for several games.

Regarding what developers want... name some PC games that required raw horsepower to give a tangible visual improvement after Crysis (a game that's several years old). Not many; besides StarCraft 2, most noteworthy games are console ports now. Developers develop for the console first and then port it over to the PC with little or no modification (unless of course some video card vendor pays for it). Crysis 2 may change that, but that's not until next year, when NI and Fermi 2 should be coming out.

So is it bad to have more speed than you need? Of course not. You buy the card knowing you're spending the funds on something you don't really need right now or might not ever need. But then you could say the same about physx, 3d, eyefinity, etc. It's built into the cost of the card, already paid for; if you don't use it, fine, but it's there in case you do.

You make valid points. The distinction is there, though, that the 5xxx series has been available for a year, while the 4xx series has been available for just over four months.

I have no problem tweaking, but what was offered as a 'tweak' was removing the largest aspect of physx in that game: the cloth effects. To me, a tweak is generally a small adjustment for improvement. Isn't removing the largest aspect of a feature's implementation going beyond a tweak, to basically turning off 75% of the feature? Of course, the argument is open there to call that a tweak to its function or an amputation of its arms and legs.

The 6XXX series is coming before the year is up. I think it's clear which vendor is playing catch up and which vendor is leading in performance.

Do we have more speed than we need for most games? Yes. But having more power available is what will make innovations in visuals in future games possible. It also allows users to increase resolution, which has a big impact on image quality.

Innovation is an amazing thing, the pursuit of it is what keeps everything in a constant state of improvement.

Physics in PC games has been around for a long time; the innovative aspect of introducing it is long past. Is the implementation of physx radically re-defining what was already available to us on the CPU? In my opinion, nowhere close. Is the idea of using the GPU rather than the CPU innovative? I'd say once the idea has proven its merits with tangible improvement, then it can be labeled as innovative. In its current implementation, it's having a hard time offering anything we didn't already have and is proving itself to be a huge resource hog. :thumbsdown:
 
Last edited:

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
All that needs to happen (and several financial analysts have been predicting it) is for AMD to file for bankruptcy. Then NVIDIA (which has cash in the bank) could buy AMD. The recent FTC ruling even helps with the whole x86 issue (many were thinking NVIDIA might buy VIA because of this).

This could all easily happen before the end of next year.

Nvidia can't buy AMD. They don't have enough money. They have enough money to buy a significant amount of stock and cause grief, but they have nowhere near enough $$ to make an offer to buy AMD.
 

extra

Golden Member
Dec 18, 1999
1,947
7
81
PhysX is meh.

PhysX effects in Mafia 2 do add something to the game, which is nice. However, it is very minor. The effects could have easily been done on the CPU with much, much less overhead than the PhysX CPU path takes. It's nowhere near as impressive as the physics in many other games that don't use PhysX. Not complaining that I get an added bonus, though. Shrug. The 470 plays the game nicely and the game is fun. :) Thanks, EVGA!
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
And back then, I don't think anyone cared how it was done, only that it "was" done. Times are a-changing.

Here's the point: they are driving a feature that looks no better than a CPU-driven feature from 10 years ago (hell, Max Payne doesn't even support dual core) as a PhysX-only development.

The same effect clearly could have been CPU driven, but was not.

CPU-driven physx runs like crap and screams "intentional crippling"; hell, even GPU-driven physx is a large performance hit.

This is coming from someone who wanted to invest in Ageia back in the day when they started showing off their 1st PPU.
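
For a sense of what "CPU driven" means here: the kind of cloth motion being discussed has been done on CPUs since long before GPU PhysX, typically with Verlet integration plus constraint relaxation. A minimal, purely illustrative sketch (not Mafia II's or PhysX's actual code; all names, shapes, and constants are made up for the example):

```python
# Illustrative sketch only: a bare-bones CPU cloth step (Verlet integration +
# Jakobsen-style constraint relaxation), the approach games used years before
# GPU PhysX. Everything here is hypothetical example code.

import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])
DT = 1.0 / 60.0        # one 60 Hz frame
ITERATIONS = 8         # constraint-relaxation passes per frame

def step_cloth(pos, prev_pos, constraints, rest_len, pinned):
    """Advance a particle-grid cloth by one frame, entirely on the CPU.

    pos, prev_pos : (N, 3) arrays of current / previous particle positions
    constraints   : (M, 2) integer array of particle-index pairs ("springs")
    rest_len      : (M,) array of rest lengths for those pairs
    pinned        : (N,) boolean mask of fixed particles (e.g. the shoulders of a coat)
    """
    # Verlet integration: velocity is implicit in (pos - prev_pos).
    new_pos = pos + (pos - prev_pos) + GRAVITY * DT * DT
    prev_pos[:] = pos
    pos[:] = np.where(pinned[:, None], prev_pos, new_pos)

    # Iteratively pull each constrained pair back toward its rest length.
    i, j = constraints[:, 0], constraints[:, 1]
    for _ in range(ITERATIONS):
        delta = pos[j] - pos[i]
        dist = np.linalg.norm(delta, axis=1) + 1e-9
        corr = 0.5 * ((dist - rest_len) / dist)[:, None] * delta
        np.add.at(pos, i, np.where(pinned[i, None], 0.0, corr))
        np.add.at(pos, j, np.where(pinned[j, None], 0.0, -corr))
    return pos, prev_pos
```

A coat's worth of particles stepped this way is a trivial per-frame cost on a modern CPU, which is the point being made about the effect not inherently needing the GPU.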
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Here's the point: they are driving a feature that looks no better than a CPU-driven feature from 10 years ago (hell, Max Payne doesn't even support dual core) as a PhysX-only development.

They have a feature that looks better than anything their competitor has to offer. That in itself is enough right there.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Nvidia can't buy AMD. They don't have enough money. They have enough money to buy a significant amount of stock and cuase grief but they have no where near enough $$ to make a offer to buy AMD.

Once AMD files for bankruptcy, they can do just what AMD did to ATI: cash + stock + borrow.

Based on market cap, AMD is worth less today than what they paid for ATI. :eek:



Not even in the farthest nether-regions of my drunken stupor can I rationalize how this has anything to do with the thread title/topic.

Moderator Idontcare
 
Last edited by a moderator:

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
Once AMD files for bankruptcy, they can do just what AMD did to ATI: cash + stock + borrow.

Based on market cap, AMD is worth less today than what they paid for ATI. :eek:


Not really, but if you believe that tripe, go for it. You can't always buy a company for market cap, either; book value comes into play, as well as IP and assets.

Secondly, AMD isn't going bankrupt. They have less debt now than they did a year ago.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Lol, even on ignore I see Wreckage's posts in nested quotes...

You do realize, Wreckage, that nvidia is worth HALF of their value from the same period as well? It's called a recession.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
Lol, even on ignore I see Wreckage's posts in nested quotes...

You do realize, Wreckage, that nvidia is worth HALF of their value from the same period as well? It's called a recession.


The annoying part is that rampant fanboyism seems to color the opinions of those who have the disease. I have owned a myriad of products. I own tools from companies A, B, and C, because each one does something better than the others at one time or another.

With physx, I just see gimmicky special effects. People don't get it. Nvidia's days are numbered unless they can really make the discrete workstation market a core business.

At best this feature can be seen as a last-ditch effort to stay relevant, which, honestly, given its proprietary nature, seems stupid.

What nvidia should do is what ARM does: license its IP out. It would be a good business move for them, considering how boxed in they are.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
What nvidia should do is what ARM does: license its IP out. It would be a good business move for them, considering how boxed in they are.

Nvidia would never do something like that due to the fear factor alone. It wouldn't make them look so hot if Intel or AMD whooped their arse with their own IP :)
 
Last edited:

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
Nvidia would never do something like that due to the fear factor alone. It wouldn't make them look so hot if Intel or AMD whooped their arse with their own IP :)


Well, it all ties into these gimmicky things that they are doing and why they are trying very hard to lock people into proprietary architectures like CUDA, etc.

They should maybe buy ATI cores and write their own drivers for them. They should also license IP. It would make them far more profitable and fit to survive. There simply will not be enough left at the table in the next 5 years for them to survive as a hardware manufacturer.

It's what I'd do if I were in nvidia's shoes.