New Mafia II PhysX ON/OFF video.

Hardly, you don't see people rushing out to buy a 2nd card for physx. The only thing changing is the marketing from Nvidia and ATI making you think you need it.

I don't think you can prove that they "aren't" either. And as far back in recorded history as you can go, when did an advertiser not market their products as "something you need to have!"?

You'll never see any company selling a product yelling out, "Well, you could buy our products, but you really don't need them."

This simply ISN'T done, so I don't know what you think has "changed" in the history of advertising as we know it.
 
Hardly, you don't see people rushing out to buy a 2nd card for physx. The only thing changing is the marketing from Nvidia and ATI making you think you need it.

The part that I like is that I can use my dated GPU as a discrete PhysX card and don't have to buy one. For me it has offered value -- try that with a dated CPU. Instead of collecting dust as a paperweight, it can improve PhysX performance over a single GPU doing both rendering and PhysX.
 
I think most game developers are not impressed either, given the very small number of games that even use it. It has such a small footprint in PC gaming, it's astonishing how much it gets pushed around.

Because you're basing that on the GPU PhysX context, imho, when it is an engine, library, tool-set, and middleware that is multi-platform and multi-device.

What do game developers really think of PhysX?

This is a bit dated but may offer some insight on what developers think:

http://bulletphysics.org/wordpress/?p=88
 
The part that I like is that I can use my dated GPU as a discrete PhysX card and don't have to buy one. For me it has offered value -- try that with a dated CPU. Instead of collecting dust as a paperweight, it can improve PhysX performance over a single GPU doing both rendering and PhysX.

What if you switch sides? Aren't you in the same situation?

I have a 9800 GTX+ I took out of the GF's PC, I'd love to stick it into mine for some PhysX playtime. Unfortunately, I got a 5870 for my main card.

Guess that 9800 GTX+ is now "collecting dust or a paper weight."

(I've tried to get her to install it as second dedicated PhysX card but she doesn't want the extra heat in her case. And I can't use it 🙁 )
 
I guess you'd have a similar problem if you changed from an Intel mobo to an AMD mobo. You may be able to re-use the RAM, but the CPU is now useless. Sure, that is a physical hardware limitation, while your 9800GTX not working with your 5870 for PhysX is a software one, but the end result is you can't use an older piece of hardware in both cases.
I guess you can say that switching from Nvidia to AMD is becoming more of a platform switch when it comes to PhysX.
Of course folks can take bits and pieces of this post and wildly put some twists on each bit, but the end result will be the same. You can now sell your 9800GTX and get a few bucks for it. Put it towards your next vid card or some other upgrade, like an SSD.
 
I seem to remember a thread a while back where some people were saying tessellation was not worth the performance hit, that it sucked, etc., etc. It was mainly the ATI guys, because the 5000 series sucks with it.
Can someone find it?

I bet the tune changes when the 6000 series comes out. It will suddenly become the greatest thing since bread and butter. 🙂

I think we are seeing the same argument here with physx.
As soon as ATI catches up, tessellation will then be needed.
Anyone like to take bets?

You can say this about every feature from the past, though; it's part of forum lore whenever there are sides or competitors. It's easy for me -- they're all welcome, because the features improve upon the gaming experience. There are some that desire to rip things apart, and there are posters that build, give credit, and offer constructive criticism to help improve things. It is something to be studied, and I have termed the condition: The ATI/AMD vs nVidia Paradox Syndrome! There is no end to the side mentality, fueled by product families or generations and specific features. All in all, it is a good thing though, because it creates passionate and spirited discussion on forums.
 
Dude, it's pretty funny that in order for people to try to defend physx and its performance hit in this game, they point to your ability to gut the feature.

So it's great, because, we can reduce physx functionality to make it playable, and this is necessary on nvidia hardware no less.

Wow you should have been a local politician (if you're not already). I didn't realize removing but 1 of 4 added beneficial features is the same as "gutting." And that's all that was recommended to do with the tweak link. I always thought gutting meant to remove the majority of the inner workings. Like gutting a fish - if you just remove 1 organ, say for instance the liver, I don't think anyone is going to consider that gutting. But I guess for you that's all it takes.

So we have this awesome feature, but hey, we need to reduce it to make it work, what little there even is of it to begin with.

Yeah it's called scalability. The very thing you said you love about PC gaming.

You have to love a feature that literally halves your framerate for a fairly small visual impact.

And once again, I don't care if it halves my frame rate if my frame rate was originally at 100 fps. I'd rather play this game with more special effects and/or higher graphical fidelity at "half of 100 fps."
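To put numbers on that trade-off (plain arithmetic, not benchmark data from the game), thinking in frame times rather than fps makes the cost concrete:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering one frame at a given fps."""
    return 1000.0 / fps

# Halving 100 fps to 50 fps grows the per-frame budget from 10 ms to
# 20 ms, so the extra effects cost about 10 ms of work per frame.
print(frame_time_ms(50.0) - frame_time_ms(100.0))   # 10.0

# The same "halving" from a 40 fps baseline costs 25 ms per frame and
# lands at 50 ms/frame: identical ratio, much bigger absolute hit.
print(frame_time_ms(20.0) - frame_time_ms(40.0))    # 25.0
```

Which is really what both sides are circling: a 50% fps drop is cheap at high framerates and brutal near the playability floor.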

I'm sorry it is so offensive to you that I do not like physx and point out its glaring flaws. You're free to your opinion, I'm free to mine.

I'm not offended that you don't like implementations of physx. I'm offended that you continue to complain about it when turning it off is the simple, obvious, most wanted solution in your case.

And this is not some cutting edge demanding game, that's the point, it's just demanding, not cutting edge. The game's visuals are dated. It shouldn't need to be tweaked, the graphics are just nothing special. These are not 'significant' graphical additions, they're pretty mediocre.

That is certainly your opinion, but it isn't in the majority.
http://pc.ign.com/articles/111/1115076p2.html
http://www.gamespot.com/pc/action/mafia2/review.html
http://www.gameshark.com/reviews/3605/Mafia-II-Review.htm
http://www.strategyinformer.com/pc/mafia2/reviews.html

Even reviews that are critical of the game praise its graphics.

But rev up the physx train, because that is about the only thing nvidia will have to distinguish their cards for the next six+ months.

Although this last bit is completely unrelated to physx itself, I hope you're wrong.


This whole argument to me is like going to eat at a restaurant with your friends, let's say Mexican food. They give chips and salsa to everyone - so whether you see the chips and salsa as 'free' or already included in the price is up to you. But you go to this restaurant, and you don't like the chips and salsa there, but they're still going to give it to you and your friends. Your friends like it, but you don't. And now suddenly you're offended that these horrible chips and salsa are even being offered in the first place! How dare they! Why do they waste their time on something you think sucks so much! Even though your friends like it, you complain and complain and never shut up about it because it's not good enough for you, and if you don't like it, no one else should be allowed to try it out and ... (gasp...) maybe enjoy it! You think the restaurant should just stop serving free appetizers and not ever, ever try to give out free stuff to their customers.
 
You can, there are tweaks you can do if that is truly important to you.

Even Rage3d offered some PhysX combined performance numbers with an ATI product:

I'm aware of the PhysX hacks and driver options. I'd have to keep up with the underground of hacked drivers and stuff to keep this an option. Something I'm not a big fan of.

What if I'm using Windows XP or Vista? What's my solution there?

I guess you'd have a similar problem if you changed from an Intel mobo to an AMD mobo. You may be able to re-use the RAM, but the CPU is now useless. Sure, that is a physical hardware limitation, and your 9800GTX not working with your 5870 for PhysX via software, but the end result is you can't use an older piece of hardware in both cases.
I guess you can say that switching from Nvidia to AMD is becoming more of a platform switch when it comes to PhysX.
Of course folks can take bits and pieces of this post and wildly put some twists on each bit, but the end result will be the same. You can now sell your 9800GTX and get a few bucks for it. Put it towards your next vid card or some other ypgrade, like an SSD.

A simple statement with various solutions yet in the end the simple statement still applies.

The bolded part is an interesting concept. I wonder if PC gaming will go the console route. When will either side start paying for exclusive content (such as downloadable maps, mission campaigns, and extra vehicles)? That's when we know we're screwed haha.
 
I'll try to be brief and not take sides here - and yes, I'm a Radeon user at the moment.

PhysX was initially a stand-alone add-on card you could use with whatever PC hardware you wanted. nVidia changed that, and now you have to use an nVidia graphics card for PhysX to function. The reasons behind this seem to be in dispute, and lie somewhere between "nVidia evil" and "ATI evil", and I don't want to go there.

By limiting PhysX to run only on all-nVidia systems (possibly artificially), a promising technology is denied to half the systems out there. If anyone could buy and run a PhysX card, regardless of the rest of the hardware in their PC, things would be better. We would just be discussing the benefits of PhysX.

This is the way it used to be. The appearance is that by buying Ageia, nVidia have taken a good thing away from anyone who uses a competing graphics card - presumably this is to make people buy an nVidia card for their primary GPU. What nVidia could have done is buy Ageia, make PhysX run on nVidia GPUs, and use this fact to sell nVidia GPUs as physics cards to anyone who wants one.

Sure, they can do what they like, they're a company out to make as much money as possible...but it feels needlessly malicious to do things the way they have. And that part is my opinion - the world would be a better place with stand-alone GPU-agnostic physics cards, which might or might not happen to also be GPUs. I guess that on the whole that sounds like nVidia bashing, but in my opinion they've done something unnecessary that just serves to fragment the PC gaming platform that we all love.
 
I'm aware of the PhysX hacks and driver options. I'd have to keep up with the underground of hacked drivers and stuff to keep this an option. Something I'm not a big fan of.

What if I'm using Windows XP or Vista? What's my solution there?

I don't know what to tell ya that you don't already seem to know. Forcing idealism where none exists is like banging your head against the wall.
 
Wow you should have been a local politician (if you're not already). I didn't realize removing but 1 of 4 added beneficial features is the same as "gutting." And that's all that was recommended to do with the tweak link. I always thought gutting meant to remove the majority of the inner workings. Like gutting a fish - if you just remove 1 organ, say for instance the liver, I don't think anyone is going to consider that gutting. But I guess for you that's all it takes.



Yeah it's called scalability. The very thing you said you love about PC gaming.



And once again, I don't care if it halves my frame rate if my frame rate was originally at 100 fps. I'd rather play this game with more special effects and/or higher graphical fidelity at "half of 100 fps."



I'm not offended that you don't like implementations of physx. I'm offended that you continue to complain about it when turning it off is the simple, obvious, most wanted solution in your case.



That is certainly your opinion, but it isn't in the majority.
http://pc.ign.com/articles/111/1115076p2.html
http://www.gamespot.com/pc/action/mafia2/review.html
http://www.gameshark.com/reviews/3605/Mafia-II-Review.htm
http://www.strategyinformer.com/pc/mafia2/reviews.html

Even reviews that are critical of the game praise its graphics.



Although this last bit is completely unrelated to physx itself, I hope you're wrong.


This whole argument to me is like going to eat at a restaurant with your friends, let's say Mexican food. They give chips and salsa to everyone - so whether you see the chips and salsa as 'free' or already included in the price is up to you. But you go to this restaurant, and you don't like the chips and salsa there, but they're still going to give it to you and your friends. Your friends like it, but you don't. And now suddenly you're offended that these horrible chips and salsa are even being offered in the first place! How dare they! Why do they waste their time on something you think sucks so much! Even though your friends like it, you complain and complain and never shut up about it because it's not good enough for you, and if you don't like it, no one else should be allowed to try it out and ... (gasp...) maybe enjoy it! You think the restaurant should just stop serving free appetizers and not ever, ever try to give out free stuff to their customers.

It is gutting the feature. Do you even know why removing the cloth physx effects makes it perform so much better?

It's because it's the most prevalent physx effect in the game; it's persistent. It's the part of the physx in the game that gets the most 'stage time', because there are always characters on your screen. And it still is something you have to actually stop playing and try to see by staring at the hem of a jacket or some coat tails 🙄

Take that away, and all that is left is when you decide to stop aiming at the enemies and start blasting away at a wall to watch rocks fall on the ground.

Your restaurant analogy is apples to oranges. If my friends raved to me about the free salsa and told me I should go to this restaurant only because of their chips and salsa, that this set them apart from all other restaurants, and then I went and discovered the chips and salsa were crap, yeah man, I'd be pissed. Of course they are my friends, so I would say nothing, grin, and say I enjoyed it.

But nvidia is not my friend or my pal. And physx is the crap chips and salsa that was recommended to me by the salesman who wants my money.

Look if you appreciate physx that's fine. My stance is that it's a shitty feature and gpu physx is present in about 10 games, making it even more of a lousy unimportant feature. And when people start tossing around that it's a good reason to go nvidia, or even more ridiculous, buy a dedicated card for it, then that is just crap.

I freely say that the two cards I have now in SLI deliver amazing framerates with everything cranked up at high resolutions. They do it better than 5870CF or a 5970 can, noticeably better. But I couldn't care less about physx. I would care if it actually were a feature with some impact, not one that brings little to the table that was not already there and, making it more useless, comes at the cost of a huge framerate hit.

I don't have a hard set vendor preference, I care about what cards deliver the best frames and to some extent at times which cards do it for the best value. Physx is nowhere near being a valid enough feature to even put a dent in that metric when measuring the quality of a video card.
 
I'll try to be brief and not take sides here - and yes, I'm a Radeon user at the moment.

PhysX was initially a stand-alone add-on card you could use with whatever PC hardware you wanted. nVidia changed that, and now you have to use an nVidia graphics card for PhysX to function. The reasons behind this seem to be in dispute, and lie somewhere between "nVidia evil" and "ATI evil", and I don't want to go there.

By limiting PhysX to run only on all-nVidia systems (possibly artificially), a promising technology is denied to half the systems out there. If anyone could buy and run a PhysX card, regardless of the rest of the hardware in their PC, things would be better. We would just be discussing the benefits of PhysX.

This is the way it used to be. The appearance is that by buying Ageia, nVidia have taken a good thing away from anyone who uses a competing graphics card - presumably this is to make people buy an nVidia card for their primary GPU. What nVidia could have done is buy Ageia, make PhysX run on nVidia GPUs, and use this fact to sell nVidia GPUs as physics cards to anyone who wants one.

Sure, they can do what they like, they're a company out to make as much money as possible...but it feels needlessly malicious to do things the way they have. And that part is my opinion - the world would be a better place with stand-alone GPU-agnostic physics cards, which might or might not happen to also be GPUs. I guess that on the whole that sounds like nVidia bashing, but in my opinion they've done something unnecessary that just serves to fragment the PC gaming platform that we all love.

As far as it being just about the money, I find that statement hard to swallow.

If it was all about the money, then would it not be more profitable to allow those with an ATI card as primary the option to purchase an Nvidia card and use it as a dedicated physX card? All it would take is a couple of keystrokes to allow it, and it could also be locked to a minimum-spec card....Hence it's not about the money!

People bitch and complain about physX all the time. How it doesn't add much to the game experience other than some eye candy. People complain how it could be put to much better use but it's not. People say it's not worth the framerate hit. I don't think it's a problem with physX itself; it's more like how it is being handled in general. Currently it's being used only as a marketing tool by nvidia. A gimmick or feature, as they call it, that you may or may not be able to use depending on the card you purchase and the games you play. Yep, it would be nice to have, but depending on the card and the resolution you play at, it could just be a feature you paid for but can't use anyway!

The only problem with physX is nvidia's spoiled rich kid attitude of my way or the highway! Until this is changed, Ageia's work was a waste of manpower and technology!

Now on the other hand, if AMD and Intel came up with a replacement slot for PCI-E and didn't license it to nvidia, or made it non-functional if an nvidia card was detected as primary, a lot more people would bitch. Cry foul and state that it was done on purpose. How it wasn't fair! How there is no logical reason why it shouldn't work! These are the guys that will defend nvidia till the end....No matter what they do and no matter how much they hurt the industry as a whole!

I don't hate nvidia and have considered buying their cards and have had many in the past. Almost went for a pair of GTX 460's but decided to hold out for something with a little more staying power. We'll see if they can make something similar power/heat wise in the near future!
 
I don't know what to tell ya that you don't already seem to know. Forcing idealism where none exists is like banging your head against the wall.

Isn't that the truth. About 90% of posts here are just that.

Makes for interesting reads though 😀
 
What we need from the software community is a way to standardize game features etc into some sort of standard API so we can avoid this BS going forward.

It levels the hardware playing field too.
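As a sketch of what that could look like - every name here (PhysicsBackend, CpuFallback, pick_backend) is invented for illustration, not any real API or proposal - the idea is one interface the game codes against, with vendor backends plugged in behind it:

```python
# Hypothetical sketch of a vendor-agnostic physics API: the game codes
# against one interface, and each hardware vendor ships a backend.
# All names here are invented for illustration.
from abc import ABC, abstractmethod

class PhysicsBackend(ABC):
    @abstractmethod
    def simulate_cloth(self, vertices: list) -> list: ...

class CpuFallback(PhysicsBackend):
    """Always available, so no vendor can lock the feature out."""
    def simulate_cloth(self, vertices):
        # Toy "gravity" step: pull each vertex down, clamped at the floor.
        return [(x, max(y - 0.1, 0.0)) for x, y in vertices]

def pick_backend(available: dict) -> PhysicsBackend:
    # Prefer whatever accelerated backend the installed driver registered,
    # otherwise degrade gracefully instead of disabling the feature.
    for name in ("nvidia_gpu", "amd_gpu", "cpu"):
        if name in available:
            return available[name]
    return CpuFallback()

backend = pick_backend({"cpu": CpuFallback()})
print(backend.simulate_cloth([(0.0, 1.0)]))  # [(0.0, 0.9)]
```

The point of the dispatch step is that the game never names a vendor: whoever ships the fastest conforming backend wins, and nobody's hardware gets locked out.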
 
snip

But nvidia is not my friend or my pal. And physx is the crap chips and salsa that was recommended to me by the salesman who wants my money.

Look if you appreciate physx that's fine. My stance is that it's a shitty feature and gpu physx is present in about 10 games, making it even more of a lousy unimportant feature. And when people start tossing around that it's a good reason to go nvidia, or even more ridiculous, buy a dedicated card for it, then that is just crap.

I freely say that the two cards I have now in SLI deliver amazing framerates with everything cranked up at high resolutions. They do it better than 5870CF or a 5970 can, noticeably better. But I couldn't care less about physx. I would care if it actually were a feature with some impact, not one that brings little to the table that was not already there and, making it more useless, comes at the cost of a huge framerate hit.

I don't have a hard set vendor preference, I care about what cards deliver the best frames and to some extent at times which cards do it for the best value. Physx is nowhere near being a valid enough feature to even put a dent in that metric when measuring the quality of a video card.

In my analogy, your friends are not Nvidia. The restaurant is Nvidia. Your friends are you and I. You did not interpret that analogy entirely correctly. But I've gotten several PMs from nonbiased people saying I hit the nail on the head, so I'll just keep referring back to this in the future if need be.

I never said you cared what your vendor preference was, nor did I ever make a suggestion that you, I, or anyone should care. But if it weren't for Nvidia at least trying this out, we wouldn't even be having these discussions. We wouldn't be talking about computeCL or Bullet physics; we'd be playing straight up port after port with no differentiation. Whenever I ask physx haters to name one time AMD or Intel helped developers add features specific to a PC port of a multiplatform game outside of Microsoft-sanctioned parameters, they always ignore the question. And without this particular initiative to innovate, there would be no incentive for other hardware companies (AMD or Intel) to do anything in regards to accelerated physics.

And once again, I've said this over and over: 100, 150, 200 fps is a waste. I'd rather get 50 fps with more effects and/or higher graphical fidelity any moment of any day of any week of the year. I've never once turned off a graphical feature so I could go from 60 fps to 90 fps.

And here you are saying it's a lousy, unimportant feature, yet you take time to make complaint after complaint about it, and rather than having SOMETHING, ANYTHING differentiating a PC game from its console counterpart, you would just as soon not have the option at all. Well, I think 16x AA and 32x AA is a lousy, unimportant feature, and I'd rather just see AMD and Nvidia not work to provide support or performance improvements at those levels of AA because the frame hit is just too much and it doesn't make anything look better. (see what I did there???)

Being on the platform that prides itself on scaling, pushing the envelope, being the first with new graphical features, and sometimes outdating hardware months after its release, I'll take something over nothing.
 
What we need from the software community is a way to standardize game features etc into some sort of standard API so we can avoid this BS going forward.

It levels the hardware playing field too.

That is what I thought OpenCL was going to do. Has ATI included support for it in their drivers yet without requiring the SDK be installed as well?

I sense a common theme on physics progression. AMD holding it back.
 
This all sounds like AMD damage control. They have a new card coming out soon and it probably won't offer any new features over last years card. So how are they going to sell it? Not very well is my guess. They need a physics solution, they need a 3D solution, they need to improve their image quality, drivers, developer relations. They need a new architecture. They are years behind NVIDIA and I don't see them catching up.

So bring out Havok or games using Bullet or something. Otherwise AMD and its various supporters sound downright envious of these features.
 
This all sounds like AMD damage control. They have a new card coming out soon and it probably won't offer any new features over last years card. So how are they going to sell it? Not very well is my guess. They need a physics solution, they need a 3D solution, they need to improve their image quality, drivers, developer relations. They need a new architecture. They are years behind NVIDIA and I don't see them catching up.

So bring out Havok or games using Bullet or something. Otherwise AMD and its various supporters sound downright envious of these features.

Well... I'd certainly like them to have a GPU physics solution.. but the rest of what you said is kind of stretching it towards some other reality of fanboi logic..

By the by, 3D works fine on ATI cards... I have 3rd-party 3D working just great, and the new Envy 14/17 ship with 3D as well.. on ATI graphics. Why do they need to have driver-level 3D functioning? Sure, I'd love to not have to pay extra for 3D, but it is what it is, and the software is a pittance compared to the hardware (unless you go red/blue).
 
The only problem with physX is nvidia's spoiled rich kid attitude of my way or the highway! Until this is changed, Ageia's work was a waste of manpower and technology!

I get what you are trying to say but in most situations I just don't think it makes sense for Nvidia.

I do think Nvidia should allow an AMD/Nvidia card combo to work together on physics (with no official support/but no lock outs either). But I just don't see how any other situations would make business sense for Nvidia.

Having Physx run on AMD cards is not something Nvidia would want to pay the support for, and probably not what AMD would allow anyway. For it to run in an optimized manner, AMD would have to provide details on how their hardware and drivers work for Nvidia to write code for this.. not gonna happen; neither side would do this. You could say AMD would write their own drivers.. but again I don't think this would happen. AMD doesn't even have standalone OpenCL or DirectCompute drivers; are they really going to spend the resources to write drivers for a product put out by their competitor?

Better Physx on CPU. Again this would be nice, but it makes no business sense for Nvidia. They would be spending resources with no tangible benefit to themselves (remember they have physx to help sell video cards, not as a product in itself) and they would be also reducing demand for their cards.

This is not a spoiled rich kid attitude (well, maybe the first one); it's an "our shareholders would kill us if we did it" attitude.
 
That is what I thought OpenCL was going to do. Has ATI included support for it in their drivers yet without requiring the SDK be installed as well?

I sense a common theme on physics progression. AMD holding it back.


What you're seeing is a community using different API standards for every stupid game feature that comes out. What you're also seeing is a hardware manufacturer basically saying: here is the hardware. We aren't going to rewrite our driver because someone refuses to use standard API features to accomplish a task.

The whole issue is that the API still sucks because it varies so widely.

Nvidia is screwed if they don't get ahead of this stuff because they basically have no market left, so it's all they have. That's why they are on it so hard. All they have is features, because AMD and Intel are about to literally shove them out of the profitable low-end market.

All of this again comes down to the API not being cohesive enough for hardware manufacturers to really support it without massive $$$$$, and nobody wants to spend money because some guy doesn't know how to properly code game effects.

Wanna know why ATI got behind DX11 so heavily? Because DX11 is a codified standard API.

OpenGL is nice, but as I understand it, it's difficult to work with because it's almost an assembly language, not a C++ type of code base.

Also, even if ATI/AMD did better support OpenGL "which you can download and install", how many game engines are actually making use of it?

The problem is gaming engines etc. are becoming a lot like Linux with too many distros. It's more sensible for ATI to get better DX9, DX10, DX11 compliance and let the OS vendor fix the other issues.

It's what I would do.
 
What you're seeing is a community using different API standards for every stupid game feature that comes out. What you're also seeing is a hardware manufacturer basically saying: here is the hardware. We aren't going to rewrite our driver because someone refuses to use standard API features to accomplish a task.

The whole issue is that the API still sucks because it varies so widely.

Nvidia is screwed if they don't get ahead of this stuff because they basically have no market left, so it's all they have. That's why they are on it so hard. All they have is features, because AMD and Intel are about to literally shove them out of the profitable low-end market.

All of this again comes down to the API not being cohesive enough for hardware manufacturers to really support it without massive $$$$$, and nobody wants to spend money because some guy doesn't know how to properly code game effects.

Wanna know why ATI got behind DX11 so heavily? Because DX11 is a codified standard API.

OpenGL is nice, but as I understand it, it's difficult to work with because it's almost an assembly language, not a C++ type of code base.

My understanding is OpenCL, not OpenGL, was supposed to be that standard API. Yet AMD has not been willing to support it outside of lip service. In the meantime, while AMD sits on their ass, Nvidia pushes PhysX and Cuda until OpenCL becomes the standard. And Nvidia has support in each driver for OpenCL, and compilers and tools that will work for it.
 
My understanding is OpenCL, not OpenGL, was supposed to be that standard API. Yet AMD has not been willing to support it outside of lip service. In the meantime, while AMD sits on their ass, Nvidia pushes PhysX and Cuda until OpenCL becomes the standard. And Nvidia has support in each driver for OpenCL, and compilers and tools that will work for it.


OK, let me repeat this again. OpenCL is nice, but show me where it fits into DX11. Also, it is difficult to code with, and AMD actually does support OpenCL if you download the SDK, which takes about 5 minutes.

Is there a console on the market with DX10 or DX11? Hmmm, really? You don't think AMD wants that embedded business to stay put?

It isn't in the interest of AMD to support every stupid thing Nvidia comes up with, and they will soon enough, "if there's enough profit in it", start supporting Nvidia-like features. Not only that, but the benefits are small; I think physics in a lot of games obscures vision and makes it harder to play.

My take, however, is that we need a bit less splintering and a lot more cohesion.

That would mean giving everyone the same all-encompassing API and then allowing the hardware manufacturers to just build the damned hardware, not waste millions on driver enhancements for games, which is stupid. Why are hardware companies fixing drivers because of bad game coding?

It's like the Microsoft vs. other OS argument. Windows is a fairly poorly written OS with a lot of feature creep and code bloat. We all suffer with it, so why introduce more of that into the damn system when the games themselves are the problem?

Usually it's just lazy coders wanting to use pre-done effects which Nvidia supplies.

Nvidia is actually making the problem worse, not better. But from Nvidia's perspective, they are losing market share, and this is a strategy that keeps them in the game longer.
 
This all sounds like AMD damage control. They have a new card coming out soon and it probably won't offer any new features over last years card. So how are they going to sell it? Not very well is my guess. They need a physics solution, they need a 3D solution, they need to improve their image quality, drivers, developer relations. They need a new architecture. They are years behind NVIDIA and I don't see them catching up.

It would seem the market disagrees with you. Did they not recently overtake nVidia in overall discrete marketshare? I'm pretty sure that also includes notebooks, but they certainly aren't falling behind in the desktop market. 5800 series prices haven't dropped so their sales must be good. They have their upcoming Fusion chips (along with Intel) which will more than likely take over the low end market. They have the new 6000 series coming out soon which will offer even better performance, and hopefully better prices for us all. What exactly does nVidia have going for it to combat this? Offering better looking debris, clothes, and smoke at the cost of locking you into their own hardware? PhysX is not a game changer. It's nVidia who needs to get their act together (the 460 is definitely a step in the right direction). I certainly don't want to see them fall behind, but - slowly but surely - that seems to be precisely what is happening.

All this despite the apparent "need" for a physics solution, which by the way PhysX is not (it provides a few extra visuals, but there is no physical change in how the game plays).
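That distinction - effects physics versus physics the game rules actually read - can be sketched with a toy example (entirely invented; not how Mafia II or PhysX is implemented):

```python
# Toy illustration of "effects physics" vs "gameplay physics".
# Effects particles are write-only eye candy: game rules never read them
# back, so removing them cannot change an outcome. Gameplay physics
# feeds into state that the rules are evaluated against.

class World:
    def __init__(self):
        self.crate_y = 5.0   # gameplay object: its position matters
        self.debris = []     # effects-only particles

    def step(self, effects_on: bool):
        self.crate_y = max(self.crate_y - 1.0, 0.0)    # gameplay physics
        if effects_on:
            self.debris.append(("chip", self.crate_y))  # cosmetic only

    def player_can_jump_on_crate(self) -> bool:
        return self.crate_y == 0.0   # a rule that reads gameplay state only

a, b = World(), World()
for _ in range(5):
    a.step(effects_on=True)
    b.step(effects_on=False)

# Same gameplay outcome with or without the effects layer:
print(a.player_can_jump_on_crate() == b.player_can_jump_on_crate())  # True
```

Because the debris list is never read by any rule, toggling it changes what you see but not what you can do, which is the sense in which the post says PhysX here is not a "physics solution."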
 