Nvidia Fermi is recommended for Metro 2033 game


BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
They don't have one. Nor is it likely they will have one any time soon, but Halo would definitely be a candidate. Don't know who owns the Forza franchise, but racing games are not really my thing. Racing games are definitely a candidate for realistic physics though.

I think my point didn't get through. MS cares so much about PC gaming that they don't even release their biggest franchises on the PC anymore. You really think they are a better option than nVidia for promoting PC gaming technology now? nV pays to get PC-exclusive content, and nV has paid for all PC users (including ATi owners) to get, for free, DLC their console counterparts had to pay for. Obviously nV has their own best interest in mind, but the reality is that MS is actively reducing the impact of PC gaming- they are not releasing their own top games on the PC.

Something you're arguing for, and I agree 100%. It's just that too many nVidia fanboys are arguing for PhysX and not seeing how nVidia's stance on PhysX (let's not get into whose fault PhysX's current status is, cause I think both nVidia and ATI are to blame) is actually hindering GPU-accelerated physics.

nVidia's stance of supporting every single physics standard and proactively developing the most robust system available is hindering GPU accelerated physics? You honestly think that?

Here is a history example for you. Nvidia's first 2D/3D card, the NV1, had pretty impressive 3D performance for its time (1995), but it used a proprietary rendering method called "quadratic texture mapping". While a few games did release for it, developers in the end decided to go with the open standards Direct3D and OpenGL instead.

I take it you are too young to remember those days- that is stunningly revisionist, to put it mildly. Direct3D and OpenGL were almost completely ignored for several years following the nV1 as devs stuck with Glide. D3D was shockingly bad, and OpenGL wasn't optimized for performance in that era. The entirely proprietary Glide ended up being the dominant platform for PC game development for quite some time, and no one seemed to care too much.

Also- the nV1 was a console chip that ended up getting released for the PC. It was the Sega Saturn's 3D chip.

The Halo series all use Havok; dunno about Forza 3, but I would imagine an in-house physics engine, since it's supposed to simulate car physics specifically.

Actually, on the PC they use Intel's Pentiumtastic physics. That would be imaginary and made up ;)
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
I think my point didn't get through. MS cares so much about PC gaming that they don't even release their biggest franchises on the PC anymore. You really think they are a better option than nVidia for promoting PC gaming technology now? nV pays to get PC-exclusive content, and nV has paid for all PC users (including ATi owners) to get, for free, DLC their console counterparts had to pay for. Obviously nV has their own best interest in mind, but the reality is that MS is actively reducing the impact of PC gaming- they are not releasing their own top games on the PC.

Games for Windows is Microsoft's campaign for promoting PC gaming. AFAIK Gears of War 2 is the only game worth mentioning that hasn't been released on PC. Reducing the PC games baseline will reduce Windows sales. If I can't play good games in Windows, there's no reason for me to pay for an OS when I can do the same thing, except gaming, in Linux.

nVidia's stance of supporting every single physics standard and proactively developing the most robust system available is hindering GPU accelerated physics? You honestly think that?

nVidia only supports PhysX; it doesn't support any other physics standard. Havok doesn't run on GPUs- it can run even on an Intel IGP.

Also- the nV1 was a console chip that ended up getting released for the PC. It was the Sega Saturn's 3D chip.

And yet, PS1 had better graphics.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
I take it you are too young to remember those days- that is stunningly revisionist, to put it mildly. Direct3D and OpenGL were almost completely ignored for several years following the nV1 as devs stuck with Glide. D3D was shockingly bad, and OpenGL wasn't optimized for performance in that era. The entirely proprietary Glide ended up being the dominant platform for PC game development for quite some time, and no one seemed to care too much.

Actually I owned an NV1 card, the Diamond Edge 3D, and used it for a couple of months before switching to the S3 ViRGE-based Diamond Stealth 3D card, which had way better 2D performance (3D was slow). Then a few months later I bought an Orchid Righteous 3D (Voodoo1); I remember playing some awesome Glide games on it such as Quake & Quake 2, Tomb Raider 1 & 2, Fatal Racing, Descent, etc. Anyways, I wasn't too young as you say, because I bought those 3 cards in 1996 for about $300 each.

Eventually Glide support died out, like NV1 support before it, and as I mentioned earlier, developers chose Direct3D and OpenGL. Your post only proved my point, thanks for the support :)

Also- the nV1 was a console chip that ended up getting released for the PC. It was the Sega Saturn's 3D chip.

Sega Saturn used a custom chip, the VDP1, to do the texturing, but it was actually the Hitachi CPU that did most of the 3D work; Sega added a second CPU at the last minute to improve 3D performance. The NV1 was never in the Sega Saturn, but NV1-based PC cards had two Saturn joypad ports, and there were a few Sega Saturn games that were ported to the NV1 PC cards. The Diamond Edge that I bought came with Virtua Fighter and Panzer Dragoon.

Is my post still "revisionist" for you?
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
I think my point didn't get through. MS cares so much about PC gaming that they don't even release their biggest franchises on the PC anymore. You really think they are a better option than nVidia for promoting PC gaming technology now? nV pays to get PC-exclusive content, and nV has paid for all PC users (including ATi owners) to get, for free, DLC their console counterparts had to pay for. Obviously nV has their own best interest in mind, but the reality is that MS is actively reducing the impact of PC gaming- they are not releasing their own top games on the PC.

I didn't say better. But it would have wider developer support, because nVidia is part of the reason why GPU-accelerated physics is not taking off. That's why.

While I agree that it's puzzling they don't port some of their biggest games to the PC (perhaps they are waiting until all possible Xbox sales have dried up), that wasn't what I was arguing. My argument was that it would make sense for them to make further ties between PCs and their Xbox console. It seems that your argument was that they have abandoned PC gaming. Different arguments.

nVidia's stance of supporting every single physics standard and proactively developing the most robust system available is hindering GPU accelerated physics? You honestly think that?

There's a lot of "hidden" stuff, and if you read between the lines from various places you'll see that nVidia is not as altruistic as it seems from a casual look. This comes through in comments by both ATI and nVidia- too many comments, spread over a period of time, for me to even want to attempt to look them up again. There is also the undeniable fact that too many companies abuse what power they have.

ATI would be stupid, and if I was a shareholder of ATI I'd sue them, for putting themselves in a position where a key tech they support is 100% controlled by their largest competitor. Except when said tech has become the de facto industry standard, of course. But that's not the case with PhysX, as GPU PhysX is still very, very small.

If nVidia really wanted GPU PhysX to take off with no strings attached (as they have stated, and which I don't believe) they would have already done so. If you truly believe that nVidia was not trying to get one over on ATI then you are being naive. Every company is trying to get one over on the competition. That's why it's nVidia's "fault" that GPU physics is being hindered. Keep in mind that this decision makes huge business sense from nVidia's perspective, and if I was an nVidia shareholder I'd applaud the move.

The most puzzling move is the fact that nVidia even locked out GPU PhysX when using an ATI card as the primary. That, more than all the "reading between the lines", shows me that nVidia never meant to be open and honest in regards to PhysX with ATI. One would think that allowing PhysX to flourish, even in conjunction with an ATI video card as the primary, would benefit PhysX in the long term and actually force ATI to support it eventually.

With all that said about nVidia's very good developer support, they are still actively locking out at least 35% of the video cards out there (ATI), and developers would have to think long and hard, or get paid a lot of money, before putting experience-changing GPU PhysX gameplay in.

This is stuff that has already been argued to death though. No one has said anything to change my opinion that GPU-accelerated physics is being held back by nVidia for not fully opening up PhysX to ATI, as well as locking ATI out completely when they couldn't get their way. And this is not to say ATI isn't at fault either, cause they haven't gotten their sh*t together.

Unless you have new information you'd like to share with everyone, I don't want to continue about the GPU PhysX issue, since most of it has already been covered in previous threads. And again, nothing anyone has said has changed my mind. If you want to go over a few points, I'd be happy to do so over PMs or a new thread.

I take it you are too young to remember those days- that is stunningly revisionist, to put it mildly. Direct3D and OpenGL were almost completely ignored for several years following the nV1 as devs stuck with Glide. D3D was shockingly bad, and OpenGL wasn't optimized for performance in that era. The entirely proprietary Glide ended up being the dominant platform for PC game development for quite some time, and no one seemed to care too much.

I know the comment was not directed at me, but the very first video card I ever purchased was a GeForce 256, if that tells you anything. My very first experience with multi-player gaming was on the Mac... Marathon was much better than Doom. Sadly, it was on the Mac... dooming it to oblivion.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Games for Windows is Microsoft's campaign for promoting PC gaming. AFAIK Gears of War 2 is the only game worth mentioning that hasn't been released on PC.

You ignore Forza? Me too- FPSs are a dime a dozen, but a top-tier, fully fleshed-out racing title on the PC does not exist and hasn't for years. Microsoft decided that PC gamers weren't good enough for them to port it over. Same with MS's largest gaming franchise ever- they decided PC users were not good enough to get that either- and these are the people you want to see leading PC gaming? I would have exactly the same PoV if we were talking about Sony or Nintendo. MS is a console gaming company that has some PC games in 2010, not all that different from Sony with their approach as of late.

Reducing the PC games baseline will reduce Windows sales

Not by much.

nVidia only supports PhysX; it doesn't support any other physics standard. Havok doesn't run on GPUs- it can run even on an Intel IGP.

nVidia chairs OpenCL and has supported Bullet for quite some time.

Eventually Glide support died out, like NV1 support before it, and as I mentioned earlier, developers chose Direct3D and OpenGL. Your post only proved my point, thanks for the support

OpenGL support has died out, and it was open; D3D is 100% proprietary. It is the API most developers chose, and hence everyone is forced to support it. The same could happen with another API for a different element of gaming.

And yet, PS1 had better graphics.

Your Saturn must have been broken :) Best to best during their runs, the Saturn had better visuals; the PS1 just had much better games.

Sega Saturn used a custom chip, the VDP1, to do the texturing, but it was actually the Hitachi CPU that did most of the 3D work; Sega added a second CPU at the last minute to improve 3D performance.

Is this like the PS3 not having a G70 chip in it? It's actually a custom RSX? BTW- while it seems Google got you a couple of correct answers, the Saturn had dual VDPs too, not just CPUs, and all 3D hardware back then was simply rasterizers- the original GeForce was the first GPU.

Is my post still "revisionist" for you?

Quite so; you do a wonderful job of maintaining that theme :)

ATI would be stupid, and if I was a shareholder of ATI I'd sue them, for putting themselves in a position where a key tech they support is 100% controlled by their largest competitor. Except when said tech has become the de facto industry standard, of course. But that's not the case with PhysX, as GPU PhysX is still very, very small.

You aren't a shareholder of ATi? Then are you one of MS, Sony or Nintendo? I truly cannot comprehend why people would be against PhysX. I don't see even the most over-the-top rabid nVidia fans so much as hinting that developers shouldn't support Eyefinity- but that is exactly what the anti-PhysX crowd has been doing. You aren't missing anything that wouldn't have been there without nV. ATi operates in a free society; they are completely free to make a counterpart, as is MS. Given the choice between having the option and no one having it, I'd rather the option be there. It's much like nVidia's 3D gaming- I think it's fairly stupid and always have, but in no way would I lament developers supporting another avenue of potential progress, which is precisely what the anti-PhysX crowd has been doing with a nigh religious zeal.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
OpenGL support has died out, and it was open,

Which API do you think Mac games run on?

D3D is 100% proprietary. It is the API most developers chose, and hence everyone is forced to support it. The same could happen with another API for a different element of gaming.

I will correct your comment. "It is the API all developers chose, hence Nvidia/3DFx were forced to support it"

Do you know the reason why developers chose DirectX? A DirectX (or OpenGL) game runs on all supporting hardware, even crappy Intel IGPs, while PhysX for example only runs on Nvidia hardware. Hopefully you can see the difference.


Is this like the PS3 not having a G70 chip in it? It's actually a custom RSX? BTW- while it seems Google got you a couple of correct answers, the Saturn had dual VDPs too, not just CPUs, and all 3D hardware back then was simply rasterizers- the original GeForce was the first GPU.

You mentioned the NV1 was in the Saturn, so I was just commenting that the custom chip or chips in the Saturn were not made by Nvidia. The Sega Saturn did not have any Nvidia hardware. The RSX in the PS3 is an Nvidia chip. If I'm not mistaken, the first console to use an Nvidia chip was the original Xbox.

You aren't a shareholder of ATi? Then are you one of MS, Sony or Nintendo? I truly cannot comprehend why people would be against PhysX. I don't see even the most over-the-top rabid nVidia fans so much as hinting that developers shouldn't support Eyefinity- but that is exactly what the anti-PhysX crowd has been doing. You aren't missing anything that wouldn't have been there without nV. ATi operates in a free society; they are completely free to make a counterpart, as is MS. Given the choice between having the option and no one having it, I'd rather the option be there. It's much like nVidia's 3D gaming- I think it's fairly stupid and always have, but in no way would I lament developers supporting another avenue of potential progress, which is precisely what the anti-PhysX crowd has been doing with a nigh religious zeal.

I don't have anything against PhysX; I'm just fed up with reading over and over again about it as if it is the be-all end-all feature, with Nvidia & gang trying to shove this "dead" API down our throats.

If a developer creates a game with Eyefinity support, and Nvidia releases a similar multi-monitor technology, wouldn't that game run on multi-monitors on the Nvidia GPU? I'm just asking cause I'm not sure about this, never was interested in Eyefinity.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Which API do you think Mac games run on?

If Mac gaming is considered a viable metric, PhysX is far more dominant than any open 3D gaming API :)

I will correct your comment. "It is the API all developers chose, hence Nvidia/3DFx were forced to support it"

The largest game publisher in the world has released a total of zero D3D titles ever. Saying all developers chose D3D is flat-out insane. Also, nVidia was the first vendor to use D3D as their standard API- 3Dfx, Rendition, SGI and even ATi were all using a proprietary API when the Riva hit.

Do you know the reason why developers chose DirectX? A DirectX (or OpenGL) game runs on all supporting hardware, even crappy Intel IGPs, while PhysX for example only runs on Nvidia hardware. Hopefully you can see the difference.

DirectX runs on 100% of gaming PCs and the 360. PhysX runs on 66% of gaming PCs, the 360, the Wii, the PS3, and even the iPhone. PhysX supports more hardware used for gaming, by a huge margin, than DirectX.

The RSX in the PS3 is an Nvidia chip.

No, it's a Sony chip. It is a Sony chip that utilizes an nVidia design (the NV2A used in the original Xbox was an actual nV chip, made by nV; RSX is licensed IP).

I don't have anything against PhysX; I'm just fed up with reading over and over again about it as if it is the be-all end-all feature, with Nvidia & gang trying to shove this "dead" API down our throats.

Better physical interaction is the next big step in 3D gaming. nVidia is actively trying to promote this. ATi and MS are actively trying to hold the industry back. I don't like any approach that is anti-progress- one of the reasons I used to bash 3dfx. In those days I was pushing that hardware dedicated to geometry processing and pixel shading was the way forward, and 3dfx fans insisted that more pixel-pushing power was all we needed. nVidia was continually bashed by anti-progress fans for doing something that wasn't needed and wasn't utilized (PhysX has a much higher adoption rate now than they were getting back then). In those days, ATi decided on the side of progress and lived to see those against it die off. Nothing is stopping ATi from coming up with their own physics solution, nor MS for that matter. It has been years now that we have been waiting while they continue to remain an anchor around the industry's neck.

If a developer creates a game with Eyefinity support, and Nvidia releases a similar multi-monitor technology, wouldn't that game run on multi-monitors on the Nvidia GPU? I'm just asking cause I'm not sure about this, never was interested in Eyefinity.

I'm not even sure, but I am sure that two orders of magnitude more gamers can use PhysX than Eyefinity. I'm also sure that nV is covering a large portion of the costs and development of getting PhysX effects into games. So, what do I consider to be a better use of developers' time? Be that as it may, I'm not big on holding back the industry, so I won't be lamenting devs adding any new feature to a game no matter how small the audience for that feature is.
 
Jan 24, 2009
125
0
0
The largest game publisher in the world has released a total of zero D3D titles ever. Saying all developers chose D3D is flat-out insane. Also, nVidia was the first vendor to use D3D as their standard API- 3Dfx, Rendition, SGI and even ATi were all using a proprietary API when the Riva hit.

Ok, clearly you aren't talking about Activision Blizzard, which is what I, and presumably most other people, would think of when hearing the words 'world's largest game publisher'.

By your definition, who is the world's largest game publisher?
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
You ignore Forza? Me too- FPSs are a dime a dozen, but a top-tier, fully fleshed-out racing title on the PC does not exist and hasn't for years. Microsoft decided that PC gamers weren't good enough for them to port it over. Same with MS's largest gaming franchise ever- they decided PC users were not good enough to get that either- and these are the people you want to see leading PC gaming? I would have exactly the same PoV if we were talking about Sony or Nintendo. MS is a console gaming company that has some PC games in 2010, not all that different from Sony with their approach as of late.

I don't care about racing games. I don't know why MS decided not to port some of their larger published games over to the PC, and frankly, I'm not going to waste brain cells worrying about it. I've also already stated that they probably won't release them soon because they want to move people towards the Xbox games console, where they could hopefully recoup the billions spent on the console as well as make a buck on each game sold, as opposed to the current Windows situation where they make zero (at least directly) from any game sales.

I already said why we should not rule MS out from releasing a physics middleware. You can agree or disagree. I'm not going to spend five pages arguing about it. I've said my piece and am ending that part of the argument. If you want to continue to rage about it, by all means continue to do so, but it won't be with me responding.

You aren't a shareholder of ATi? Then are you one of MS, Sony or Nintendo?
Umm...are you an nVidia shareholder? What kind of question is that? Anyone, and I'm sure including you, could see where I was coming from with that comment. If you couldn't understand what I wrote or misread what I wrote, then go back and reread it. Cause I'm sure most people who read these forums could.

I truly can not comprehend why people would be against PhysX.
I have nothing against PhysX. What I posted was my personal opinion and analysis on why nVidia is holding GPU-accelerated physics back. Hypothetically, if nVidia had developed another physics package instead of buying PhysX, and pulled the same stunts with that internally developed package, I'd say the same thing about it. What I said had nothing to do with PhysX directly, but rather with nVidia's business decisions in handling a piece of technology they own.

And instead of providing a counter argument on why my opinion and analysis is wrong, your only rebuttal can be summed up as "OMGWTFBBQ, everyone hates PhysX."

I don't see even the most over-the-top rabid nVidia fans so much as hinting that developers shouldn't support Eyefinity- but that is exactly what the anti-PhysX crowd has been doing. You aren't missing anything that wouldn't have been there without nV. ATi operates in a free society; they are completely free to make a counterpart, as is MS. Given the choice between having the option and no one having it, I'd rather the option be there. It's much like nVidia's 3D gaming- I think it's fairly stupid and always have, but in no way would I lament developers supporting another avenue of potential progress, which is precisely what the anti-PhysX crowd has been doing with a nigh religious zeal.
Frankly, I couldn't care less about Eyefinity or 3D Vision. Some people don't like nVidia's handling of PhysX not because they're ATI fanboys or anti-PhysX, but because they aren't blinded by bias and can see how PhysX, in its current form, can fracture the market. The unbiased can also see why ATI, from a business standpoint, won't support PhysX natively on their cards: they don't want a key tech under 100% control of their largest competitor. And nVidia's recent handling of some games in the TWIMTBP program further reinforces the belief that they will try to pull a fast one on ATI.

And...umm...my zeal? You're trying to imply I'm biased? I provided reasons why I felt nVidia's handling of PhysX was holding GPU physics back. Instead of merely saying I'm biased, provide counter arguments to what I've written.

And I've already stated the absolute very first video card I purchased, way back in 2000, was a GeForce 256. Last I checked, that was an nVidia video card. I currently have a Radeon 4870. Card before that, a GeForce 8800GTS. My brother has two 9800s in the closet I could use for PhysX. Guess who won't allow that to happen?

PhysX had a real chance to be THE physics standard, especially with nVidia's Tegra platform (another platform for PhysX), their strong position in the PC video card market, and the fact that they cater to the console market as well. I've argued why it makes sense for MS to release a physics middleware. It makes equal, if not more, sense for nVidia to open up PhysX a bit and allow nVidia cards as PhysX accelerators to work freely with ATI cards. This would allow more widespread developer support than is currently enjoyed by PhysX on the PC front. It'd also allow nVidia to go to MS, Sony and Nintendo and push for them to include an nVidia GPU in their next console systems, because it'd make developers happy. Why would developers be happy? Because they'd have more options when coding a game, and if they use PhysX, it'd make it easier (cheaper) to port multi-console games.

Nothing you've said in this thread has swayed my perception that ATI made the right business moves in regards to native PhysX support on ATI cards. It is nVidia, who you claim is promoting physics, that is locking out 35% of the market from being able to use PhysX. This is a very real roadblock in PhysX adoption.

It is a fact that at least two games using Havok have provided far better use of physics in the environment to promote gameplay than any PhysX game. I think this quote from Ben Kuchera of Ars Technica sums up Bad Company 2's use of Havok physics well: "In one tense standoff I was pinned inside a house, fire coming from nearly every angle. My solution? I threw a grenade behind me, blew a hole through the wall, and exited, flanking my adversaries." Sadly, no PhysX game has provided that type of interaction, because of nVidia's handling of PhysX.

It is still my opinion that nVidia bungled PhysX. With these types of games, Havok is showing itself to be a very capable package. It's CPU-intensive, but CPUs, much like GPUs, are getting more and more powerful.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
If Mac gaming is considered a viable metric, PhysX is far more dominant than any open 3D gaming API :)

Gaming or non-gaming metric, you said that OpenGL was dead, when it's alive and well on the Mac platform. MS did a great job with DirectX to the point where it caught up with OpenGL and overtook it. That's why OpenGL is less popular on the PC these days.

The largest game publisher in the world has released a total of zero D3D titles ever. Saying all developers chose D3D is flat-out insane. Also, nVidia was the first vendor to use D3D as their standard API- 3Dfx, Rendition, SGI and even ATi were all using a proprietary API when the Riva hit.

I'm not sure which company you're talking about; Activision-Blizzard and EA, the two largest third-party publishers, have been making DirectX games for over a decade. Maybe you are talking about a Japanese company (Nintendo, Sony?) that doesn't even make any PC games in the first place?

DirectX runs on 100% of gaming PCs and the 360. PhysX runs on 66% of gaming PCs, the 360, the Wii, the PS3, and even the iPhone. PhysX supports more hardware used for gaming, by a huge margin, than DirectX.

Now that you brought that up, let's check the numbers, shall we? Worldwide Wii, 360, and PS3 sales courtesy of http://www.vgchartz.com/:

Wii: 68 Million, 360: 38 Million, PS3: 32 Million. Wii+360 = 106 Million

Now let's modify your statement: PhysX supports more ATI-based hardware used for gaming, by a huge margin, than Nvidia hardware (106M vs. 32M). That's kinda sad, isn't it? You can add the PC numbers if you like; that won't help much. Anyways, enjoy those awesome facts.

The Wii and 360 use ATI GPUs, how come Nvidia didn't lock out PhysX on those platforms like they did on the PC? I mean they both use non-nV GPUs.

Oh wait, those two consoles are the market leaders, and Nvidia as I said, wants to shove PhysX down everyone's throats. ATI hardware must be really powerful, since even the weak ATI chip in Wii can run PhysX :)

I'm not even sure, but I am sure that two orders of magnitude more gamers can use PhysX than Eyefinity. I'm also sure that nV is covering a large portion of the costs and development of getting PhysX effects into games. So, what do I consider to be a better use of developers' time? Be that as it may, I'm not big on holding back the industry, so I won't be lamenting devs adding any new feature to a game no matter how small the audience for that feature is.

For all we know, adding Eyefinity support to a game may be a trivial thing, while developing a true PhysX title takes lots of time and resources, so they are not directly comparable as you are implying. People that might find a use for Eyefinity are a small fraction of the market, so it's up to the developer to decide whether to add support for it or not.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
If Mac gaming is considered a viable metric, PhysX is far more dominant than any open 3D gaming API :)
A growing(?) number of OS X games are done using Cider to port the PC versions, which means they are using a wrapper to make the D3D version work.

DirectX runs on 100% of gaming PCs and the 360. PhysX runs on 66% of gaming PCs, the 360, the Wii, the PS3, and even the iPhone. PhysX supports more hardware used for gaming, by a huge margin, than DirectX.

Actually, PhysX works on 100% of gaming PCs.
I'm not sure why you would divide the PC market into NV/ATI and then say that the console market can be added to the NV side.
Consoles run 'software'/CPU PhysX, as can all PCs; only NV cards can run hardware PhysX. Why would you say PhysX runs on all consoles and 66% of the PC market, when it either runs on all consoles and all PCs, or it just runs on 66% of the PC market? You're being rather fuzzy here.
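To make the software/hardware split concrete, here's a mock in C++ (none of this is the real PhysX API- just an illustration of one middleware entry point with two execution paths):

Code:
#include <cstdio>

// Pretend capability check; a real SDK would probe for a CUDA-capable GPU.
bool cudaCapableGpuPresent() { return false; }

void simulateOnGpu(float dt) { std::printf("GPU step: %.4fs\n", dt); }
void simulateOnCpu(float dt) { std::printf("CPU step: %.4fs\n", dt); }

// One middleware call: "hardware PhysX" just means this branch goes one way
// on NV cards; every PC and every console can still take the CPU path.
void physicsStep(float dt)
{
    if (cudaCapableGpuPresent())
        simulateOnGpu(dt);
    else
        simulateOnCpu(dt);
}

int main() { physicsStep(1.0f / 60.0f); }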

Also, as a counterpoint, UnrealEngine3 runs on 100% of PCs, the PS3, the 360, the Wii and the iPhone.
That means that everyone should use UnrealEngine 3 for their games. I mean, if we're comparing an API to a middleware engine, why not compare an API to a graphics engine?
Why not compare a middleware engine to a game engine which makes use of said middleware?

Oh, also:

Havok Physics is the fastest and most scalable physics engine available. It is extensively optimized on all of today's gaming platforms: Microsoft® Xbox 360®, Sony® PLAYSTATION®3, Nintendo® Wii™, Microsoft® Xbox®, Sony PlayStation®2, PSP™, and the PC.


My main question is:
Why are you comparing physics middleware to an API, and why are you blurring the lines between hardware PhysX and software PhysX to present an argument that's both full of fallacy and lacking in logic?
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
I'm not even sure, but I am sure that two orders of magnitude more gamers can use PhysX than Eyefinity. I'm also sure that nV is covering a large portion of the costs and development of getting PhysX effects into games. So, what do I consider to be a better use of developers' time? Be that as it may, I'm not big on holding back the industry, so I won't be lamenting devs adding any new feature to a game no matter how small the audience for that feature is.

Not even gonna respond to the other shit, but "Eyefinity support" (i.e., ultra-wide resolution) is so goddamn trivial it's a shame more games don't support it. It's a simple matter of calculating the viewport out naturally instead of arbitrarily limiting it when you hit certain resolutions- and the limiting is actually an extra step in code. It takes more work to not support it than it does to support it.


Either way you have to calculate the viewport; the proper way supports ultra-wide aspect ratios, and the improper way is the same amount of work (or more) and doesn't. Even games like GoldenEye that came out before the widescreen format was popular do ultra-widescreen properly.
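Here's roughly what the "proper way" looks like- hor+ scaling, where you keep the vertical FOV fixed and derive the horizontal FOV from whatever aspect ratio the user actually has (the function and names are mine, not from any particular engine):

Code:
#include <cmath>
#include <cstdio>

// Hor+ viewport math: a wider aspect ratio yields a wider horizontal FOV,
// with no special-casing per resolution.
double horizontalFovDegrees(double verticalFovDegrees, double aspectRatio)
{
    const double kPi = 3.14159265358979323846;
    const double vFov = verticalFovDegrees * kPi / 180.0;
    const double hFov = 2.0 * std::atan(std::tan(vFov / 2.0) * aspectRatio);
    return hFov * 180.0 / kPi;
}

int main()
{
    // 16:9 at a 60-degree vertical FOV -> ~91 degrees horizontal;
    // a triple-monitor 48:9 span -> ~144 degrees. Same code path.
    std::printf("16:9 -> %.1f\n", horizontalFovDegrees(60.0, 16.0 / 9.0));
    std::printf("48:9 -> %.1f\n", horizontalFovDegrees(60.0, 48.0 / 9.0));
}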
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
My main question is:
Why are you comparing physics middleware to an API, and why are you blurring the lines between hardware PhysX and software PhysX to present an argument that's both full of fallacy and lacking in logic?

Well, according to him, PhysX is running on more ATI hardware than Nvidia hardware. Because, as you said, on the PC side he only counted hardware PhysX, then he counted PhysX on consoles, which makes their PhysX "hardware based" too. And of course the Wii and 360 use ATI GPUs, so we have PhysX running on mostly ATI hardware. Maybe Nvidia should thank ATI :)

Anyway, I would like the answer too, although I'm expecting more bunk and no substance.
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
I hate phys-x, but a compute standard is going to be needed once in-game effects become too complex for a cpu. This is starting to take place now, but it isn't fog or sparks that are too complicated for a cpu.
What effects would these be? Now that 6 cores are coming, and we've got 12 on a server CPU (so in a year or more AMD can start rolling it out for desktops if they wanted to and if there was a place for it, but there isn't currently), plus Bulldozer promising an 8-core version, it seems to me that there's a lot of spare CPU power lying around. Which effects would be too complex for a CPU? Cloth? Water? Smoke? In 2 years we can have 2-6 cores dedicated to just physics (I mean it can be programmed that way, since the hardware exists) and the remaining 2 cores for everything else the game needs.
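As a toy sketch of what "cores dedicated to just physics" could look like (Body, stepBodies and the chunking are all made up for this post, not from any real engine), you'd carve the bodies into ranges and hand each range to its own thread:

Code:
#include <functional>
#include <thread>
#include <vector>

struct Body { float pos[3]; float vel[3]; };

// Integrate one range of bodies; a stand-in for a real solver step.
void stepBodies(std::vector<Body>& bodies, size_t begin, size_t end, float dt)
{
    for (size_t i = begin; i < end; ++i)
        for (int k = 0; k < 3; ++k)
            bodies[i].pos[k] += bodies[i].vel[k] * dt;
}

// Split one physics step across a fixed number of "physics cores",
// leaving the remaining cores free for everything else the game needs.
void parallelPhysicsStep(std::vector<Body>& bodies, float dt, unsigned cores)
{
    std::vector<std::thread> workers;
    const size_t chunk = bodies.size() / cores; // cores must be > 0
    for (unsigned c = 0; c < cores; ++c) {
        size_t begin = c * chunk;
        size_t end = (c + 1 == cores) ? bodies.size() : begin + chunk;
        workers.emplace_back(stepBodies, std::ref(bodies), begin, end, dt);
    }
    for (auto& w : workers)
        w.join();
}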

I guess unless they want fur/cloth effects that are lifelike and dynamic instead of scripted, I don't see game developers hurriedly pushing out a standard GPU API for physics. This way, all CPU power available is used, and the GPU power is reserved for graphics details (tessellation comes to mind, since it can sap performance quite a bit).

I'm no expert, merely my musings about it.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I think Tviceman just did.
I think there may be a problem advertising PhysX and 3DVision and then listing AMD hardware in any of the specs- min, recommended or optimal. I'm pretty sure the dev knows that AMD hardware is unable to execute these features.

Thing is guys, you might be complaining even more if they DID list AMD in the specs.
"How dare they list AMD in the specs if they can't run PhysX or 3DVision!!! This is false advertising!!"
Know what I mean? If they DID list AMD hardware, there would have to be a "*" next to that entry, and the comment at the bottom would read something like:

*"AMD hardware can run this game with DirectX 10 and 11 features, but cannot run PhysX or 3DVision. Listed Nvidia graphics cards are needed for these features."

Maybe the dev was trying to be nice and spared AMD this sort of embarrassment?
I doubt it, but maybe they're just covering their butts.

This is a joke. PhysX and 3DVision are ~ Eyefinity in my book. Physics effects can be enabled without nVidia-branded PhysX, and nVidia cards can also run multiple monitors with a little extra effort.

Could you see the dirt2 devs listing "eyefinity" as a feature and then only listing amd cards?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Now that you brought that up, let's check the numbers, shall we? Worldwide Wii, 360, and PS3 sales courtesy of http://www.vgchartz.com/:

Wii: 68 Million, 360: 38 Million, PS3: 32 Million. Wii+360 = 106 Million

I see it is silly season.
PhysX runs on the SPUs in the PS3.
Last I looked, the Cell was Sony/IBM/Toshiba... not AMD/ATI.

  • Strike 1...
PhysX runs on the Xenon in the Xbox 360.
Last I looked, Xenon was MS/Chartered/IBM.
  • Strike 2...
PhysX on the Wii runs on the "Broadway" CPU... that's IBM again (and a CPU for the third time).
Last I looked, IBM still wasn't AMD/ATi.

  • Strike 3... you are out!!!
So could you please rewrite your post to include the facts?


Who needs drugs when you have "argumentum ad ignorantiam" in these quantities?
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
I see it is silly season.
PhysX runs on the SPUs in the PS3.
Last I looked, the Cell was Sony/IBM/Toshiba... not AMD/ATI.

  • Strike 1...
PhysX runs on the Xenon in the Xbox 360.

Last I looked, Xenon was MS/Chartered/IBM.
  • Strike 2...
PhysX on the Wii runs on the "Broadway" CPU... that's IBM again (and a CPU for the third time).
Last I looked, IBM still wasn't AMD/ATi.

  • Strike 3... you are out!!!
So could you please rewrite your post to include the facts?


Who needs drugs when you have "argumentum ad ignorantiam" in these quantities?

Maybe you should try telling this to BenSkywalker?
He's the one who equated NV hardware PhysX with general PhysX, which runs on a variety of platforms.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Maybe you should try telling this to BenSkywalker?
He's the one who equated NV hardware PhysX with general PhysX, which runs on a variety of platforms.


No need to... there is so much FUD about PhysX (and Havok/Bullet) and CUDA (and OpenCL and DirectCompute) that it's impossible to sort out all the ignorance/lies.

I just strike down when I see something that shouldn't be posted on a geek forum:
ignorance of the third order.

Actually I have seen ~10 people that get it (on the AT forums)... and then a lot of FUDsters that don't get it, but sure want to sound like they do.


I look forward to (in, let's say, 6-10 years) when AMD/ATI actually delivers some GPGPU physics... besides talk.

Then we can look at the performance and the debate will be more intelligent.
It was actually getting there... with Hellgate: London.
Supposedly it was to run ATi GPU physics via Havok... until it came out that it wasn't to be.
Right before that cancellation the debate was much sounder... suddenly physics on the GPU mattered.

Until all the red boys got shafted... and we went back to ignorant land.

Please carry on.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
No need to... there is so much FUD about PhysX (and Havok/Bullet) and CUDA (and OpenCL and DirectCompute) that it's impossible to sort out all the ignorance/lies.

I just strike down when I see something that shouldn't be posted on a geek forum:
ignorance of the third order.

Actually I have seen ~10 people that get it (on the AT forums)... and then a lot of FUDsters that don't get it, but sure want to sound like they do.

I look forward to (in, let's say, 6-10 years) when AMD/ATI actually delivers some GPGPU physics... besides talk.

Then we can look at the performance and the debate will be more intelligent.
It was actually getting there... with Hellgate: London.
Supposedly it was to run ATi GPU physics via Havok... until it came out that it wasn't to be.
Right before that cancellation the debate was much sounder... suddenly physics on the GPU mattered.

Until all the red boys got shafted... and we went back to ignorant land.

Please carry on.

AMD/ATI shouldn't be delivering GPGPU physics; they should be delivering hardware which runs an API that a physics middleware product can be built upon/make use of.
So far they haven't (or hadn't) done that yet (unless ATI have finally sorted out their OpenCL driver...).

That's also the problem with PhysX. The accelerated version is a piece of proprietary middleware which runs on a proprietary API, and that's the problem.
The middleware itself is fine and dandy, because while it's proprietary, it runs on basically everything in software mode.

It's taking us back to the dark ages, when we had multiple graphics card makers each with their own API alongside the more standard ones, so games like Unreal had the option of running in D3D, OpenGL or Glide, and you could use S3TC if you had an S3 card to give you better textures.
UnrealEngine runs on the PS3, Xbox 360, PC etc., and it runs on ATI, NV, whatever. And that's how it should be. You make middleware, and it runs on systems using the standard API(s) for that system.
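And that vendor neutrality isn't abstract- the exact same host code enumerates whatever OpenCL driver happens to be installed, NV, ATI or Intel. A minimal sketch, error handling trimmed:

Code:
#include <CL/cl.h>
#include <cstdio>

int main()
{
    cl_platform_id platforms[8];
    cl_uint numPlatforms = 0;
    if (clGetPlatformIDs(8, platforms, &numPlatforms) != CL_SUCCESS)
        return 1;

    for (cl_uint p = 0; p < numPlatforms; ++p) {
        char name[256];
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                          sizeof(name), name, nullptr);

        // Middleware built on this API doesn't care whose silicon
        // or whose driver answers the query.
        cl_device_id devices[8];
        cl_uint numDevices = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                       8, devices, &numDevices);
        std::printf("%s: %u GPU device(s)\n", name, numDevices);
    }
    return 0;
}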
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
I see it is silly season.

No, I think it is Whale Bloodbath season?

PhysX runs on the SPUs in the PS3.
Last I looked, the Cell was Sony/IBM/Toshiba... not AMD/ATI.

  • Strike 1...
PhysX runs on the Xenon in the Xbox 360.

Last I looked, Xenon was MS/Chartered/IBM.
  • Strike 2...
PhysX on the Wii runs on the "Broadway" CPU... that's IBM again (and a CPU for the third time).
Last I looked, IBM still wasn't AMD/ATi.

  • Strike 3... you are out!!!

So PhysX on consoles is software-based and runs on the CPU, just like Havok physics, for example. But let's check Ben's post again:

"DirectX runs on 100% of gaming PCs and the 360. PhysX runs on 66% of gaming PCs, the 360, the Wii, the PS3 and even the iPhone."

As you know, DirectX 3D rendering is hardware-based, done on the GPU, and he mentions DirectX on the PC and 360, then compares it with hardware PhysX on the PC. After that he adds up the software PhysX on consoles and the iPhone, and declares:

"PhysX supports more hardware used for gaming, by a huge margin, then DirectX."

He did some mixing and matching to make PhysX sound more impressive, when in truth console PhysX is no better than Havok, and Havok is actually more widely used and probably faster.

Of course I did my own mixing and matching, which seems to have agitated some Nvidia parrots/fanatics. There will be more facts coming.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
He did some mixing and matching to make PhysX sound more impressive, when in truth console PhysX is no better than Havok, and Havok is actually more widely used and probably faster.

Of course I did my own mixing and matching, which seems to have agitated some Nvidia parrots/fanatics. There will be more facts coming.

According to Game Developer Magazine, the world's most popular physics API is nVidia PhysX, with 26.8% market share [in case there was any doubt- the claim that nVidia PhysX isn't popular was the defense line from many AMD employees], followed by Intel's Havok at 22.7%- but the open-sourced Bullet Physics Library is third, with 10.3%.
(From last September).

There are probably 2 main reasons for this.
1) PhysX is free for the basic SDK (Havok is only free if your game sells for less than $10 and is on PC)
2) UnrealEngine 3 uses PhysX.
Although if Epic decided to use it, it probably wasn't because it was cheap, so there may be merit to it; but it would need a game developer with experience to chip in.
Certainly for lower budget titles an inexpensive physics engine is a boon though.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
AMD/ATI shouldn't be delivering GPGPU physics; they should be delivering hardware which runs an API that a physics middleware product can be built upon/make use of.
So far they haven't (or hadn't) done that yet (unless ATI have finally sorted out their OpenCL driver...).

That's also the problem with PhysX. The accelerated version is a piece of proprietary middleware which runs on a proprietary API, and that's the problem.
The middleware itself is fine and dandy, because while it's proprietary, it runs on basically everything in software mode.

It's taking us back to the dark ages, when we had multiple graphics card makers each with their own API alongside the more standard ones, so games like Unreal had the option of running in D3D, OpenGL or Glide, and you could use S3TC if you had an S3 card to give you better textures.
UnrealEngine runs on the PS3, Xbox 360, PC etc., and it runs on ATI, NV, whatever. And that's how it should be. You make middleware, and it runs on systems using the standard API(s) for that system.
Well said, and this is an argument that seems to have gotten lost in the last 50 posts of factoid-slinging and e-peen.

PhysX isn't worth anything until a developer actually does something interesting with it, and no developer is going to waste time on it as long as it stays proprietary and the market distribution is what it is. You can talk about the advantages of GPU-accelerated physics until you're blue in the face, but something as simple as the destructible environments in BF:BC2 has done more for in-game physics and gaming in general than any released GPU-accelerated PhysX implementations. Let that give some of you perspective on the situation.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
Well said, and this is an argument that seems to have gotten lost in the last 50 posts of factoid-slinging and e-peen.

PhysX isn't worth anything until a developer actually does something interesting with it, and no developer is going to waste time on it as long as it stays proprietary and the market distribution is what it is. You can talk about the advantages of GPU-accelerated physics until you're blue in the face, but something as simple as the destructible environments in BF:BC2 has done more for in-game physics and gaming in general than any released GPU-accelerated PhysX implementations. Let that give some of you perspective on the situation.

lol I've said this in every single one of my posts.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Well said, and this is an argument that seems to have gotten lost in the last 50 posts of factoid-slinging and e-peen.

PhysX isn't worth anything until a developer actually does something interesting with it, and no developer is going to waste time on it as long as it stays proprietary and the market distribution is what it is. You can talk about the advantages of GPU-accelerated physics until you're blue in the face, but something as simple as the destructible environments in BF:BC2 has done more for in-game physics and gaming in general than any released GPU-accelerated PhysX implementations. Let that give some of you perspective on the situation.

[game screenshot]

Havok.

[Red Faction: Guerrilla screenshot]

Havok.

http://www.youtube.com/watch?v=6o6YVLmOs74#t=50s
Modified Havok.

Interesting uses of physics. Three games which don't have hardware acceleration of physics available (since they all run on consoles).
And what do we get on PC? "realistically waving flags and realistic volumetric fog".

And this is why we don't give a damn.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
lol I've said this in every single one of my posts.
Eh, I breezed through it, exclude yourself from my generalization if you didn't deserve it :p.

Havok.


Havok.

http://www.youtube.com/watch?v=6o6YVLmOs74#t=50s
Modified Havok.

Interesting uses of physics. Three games which don't have hardware acceleration of physics available (since they all run on consoles).
And what do we get on PC? "realistically waving flags and realistic volumetric fog".

And this is why we don't give a damn.
A-frickin'-men.