blastingcap
Diamond Member
- Sep 16, 2010
- 6,654
- 5
- 76
I'm pretty sure if that was the case they would put some standard hair on her...
In fact I have no doubt she already has standard hair for people with medium to lower end systems. And don't forget the consoles.
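For context on what GPU hair simulation actually involves: systems like TressFX are usually described as per-strand Verlet integration plus length constraints. Below is a toy single-strand sketch in Python — purely illustrative of that general approach, not AMD's actual implementation; all names and constants are invented.

```python
# Toy single-strand hair simulation: Verlet integration plus
# edge-length constraints. Illustrative only -- not TressFX itself,
# just the constraint-based approach such systems are described with.

GRAVITY = -9.8     # m/s^2, applied on the y axis only
REST_LENGTH = 1.0  # target length of each hair segment
ITERATIONS = 10    # constraint-relaxation passes per step

def step(pos, prev, dt=0.016):
    """Advance strand particles one frame; pos[0] is the pinned root."""
    # Verlet integration: new = 2*pos - prev + a*dt^2 (y axis only here).
    new = [(x, 2 * y - py + GRAVITY * dt * dt)
           for (x, y), (_, py) in zip(pos, prev)]
    new[0] = pos[0]  # the root follows the head; here it is just pinned
    # Relax edge-length constraints so segments keep their rest length.
    for _ in range(ITERATIONS):
        for i in range(len(new) - 1):
            (x0, y0), (x1, y1) = new[i], new[i + 1]
            dx, dy = x1 - x0, y1 - y0
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = (dist - REST_LENGTH) / dist
            if i == 0:
                # Root is pinned: move only the child particle.
                new[i + 1] = (x1 - dx * corr, y1 - dy * corr)
            else:
                # Split the correction between the two particles.
                new[i] = (x0 + dx * corr * 0.5, y0 + dy * corr * 0.5)
                new[i + 1] = (x1 - dx * corr * 0.5, y1 - dy * corr * 0.5)
    return new, pos
```

Real systems run something like this for thousands of strands in parallel on the GPU, with extra shape and collision constraints; a "standard hair" fallback presumably skips the simulation entirely and uses an animated mesh.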
LOL, exactly! The Xbox 360 version of Batman: AA still had fog in it, but in the PC version, turning PhysX "off" removed the fog altogether. That was not nice. Did the developer really want to do that to 40-50% of PC gamers?

I like PhysX. It has nice effects. My only problem with PhysX is that graphics get cut from the non-PhysX path to inflate the sense of improved image quality.
Mirror's Edge:
PhysX ON: nice flags and plastic interaction
PhysX OFF: no flags, no plastic
Batman: AA:
PhysX ON: nice volumetric smoke
PhysX OFF: no smoke
If Lara's hair were done with PhysX, she would probably be bald with PhysX off.
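The ON/OFF lists above amount to a capability gate in the renderer. A hypothetical sketch of the pattern being complained about — all function names and effect names here are invented for illustration, no real engine API is implied:

```python
# Hypothetical sketch: cosmetic effects gated on GPU physics support.
# All names are invented for illustration; no real engine is implied.

def select_effects(gpu_physics_available: bool) -> dict:
    """Pick which cosmetic effects to enable for this machine."""
    if gpu_physics_available:
        return {"flags": "simulated", "fog": "volumetric", "debris": "persistent"}
    # What the posts above complain about: with PhysX off, the effects
    # are removed outright instead of falling back to cheaper versions
    # (static flag meshes, baked fog, despawning debris).
    return {"flags": "none", "fog": "none", "debris": "none"}
```

A friendlier fallback branch would return something like `{"flags": "static_mesh", "fog": "baked", "debris": "despawning"}`, so the non-PhysX path loses fidelity rather than losing whole features.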
No more PhysX. DirectCompute/OpenCL is going to do a better job. PhysX always sucked. It was always a bit laggy for me, with stuttering - especially in Mirror's Edge, Batman: AA, UT3, Warmonger, etc. PhysX was never a real part of the gameplay, never actually built into the game engine itself upon which gameplay operated. It was always an add-on visual gimmick, and nothing more. Oftentimes, the performance didn't match. The game could be running very, very smoothly, then suddenly become a stuttering, jittery mess dropping to 20-ish fps (with frame times making it feel much worse than that), and then once the billion pieces of broken glass shards magically disappear, the frame rate goes back to normal.

There is only one PhysX.
There is no CPU-PhysX...no GPU-PhysX.
It's all PhysX.
I think you're confusing that with performance caps implemented by the developer within the game itself.
Take dynamic fog.
It could be run via the CPU... at single-digit framerates as a result.
So the developers cap certain features in the game's control panel/config, simply because the CPU doesn't have the performance to run them in any useful way, and they would be useless features then.
It's all just PhysX...no difference in language/code/compiler.
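A made-up illustration of why such caps exist: give physics a fixed slice of the frame and divide it by the per-particle cost of each backend. Every number below is invented for illustration — real engines profile rather than hardcode — but the arithmetic shows the shape of the problem.

```python
# Made-up illustration of per-backend feature caps: physics gets a
# fixed time slice of the frame, and the affordable particle count is
# that budget divided by the backend's per-particle cost. All numbers
# are invented for illustration.

BUDGET_NS = 4_000_000  # 4 ms physics slice of a 16.6 ms (60 fps) frame

COST_PER_PARTICLE_NS = {
    "cpu": 4000,  # invented cost of one particle on a scalar CPU path
    "gpu": 100,   # invented cost on a massively parallel GPU path
}

def max_particles(backend: str) -> int:
    """How many simulated particles fit in the frame budget."""
    return BUDGET_NS // COST_PER_PARTICLE_NS[backend]
```

With these invented costs, `max_particles("cpu")` is 1,000 while `max_particles("gpu")` is 40,000 - which is why a feature like dynamic fog gets capped or cut on the CPU path instead of running at single-digit framerates.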
LOL, exactly! The Xbox360 version of Batman:AA still had fog in it, but for the PC version, turning PhysX to "off" removed fog altogether. That was not nice. Did the developer really want to do that for 40-50% of PC gamers?
No more PhysX. DirectCompute/OpenCL is going to do a better job. PhysX always sucked. It was always a bit laggy for me, with stuttering - especially in Mirror's Edge, Batman: AA, UT3, Warmonger, etc. PhysX was never a real part of the gameplay, never actually built into the game engine itself upon which gameplay operated. It was always an add-on visual gimmick, and nothing more. Oftentimes, the performance didn't match. The game could be running very, very smoothly, then suddenly become a stuttering, jittery mess dropping to 20-ish fps (with frame times making it feel much worse than that), and then once the billion pieces of broken glass shards magically disappear, the frame rate goes back to normal.
It was always poorly optimized code, often overdone to try to look impressive, yet still hardly impressive. Ghostbusters, whose Infernal engine's Velocity physics made good use of 4 CPU cores, had far better-looking physics than most hardware PhysX games out there at the time. Even Crysis and Far Cry 2 made all those PhysX games seem unimpressive at large. Usually, such PhysX effects would be limited to only a very small handful of add-on "effects" here and there.
I have heard people like you declare "PhysX is dead!" since 2006.
The "argument" ran out of steam YEARS ago...
IMHO, my own views are irrelevant in the big picture that is the overall market. Instead of the silly bickering and personal jabs, let the market decide. Pretty simple, actually.
You don't like proprietary? Fine! Some don't!
It hasn't affected the marketplace negatively overall; and because of proprietary tech, nVidia is growing and so is the PC platform.
The screenshots look good, but I haven't seen any videos of gameplay made available by AMD. Isn't physics inherently difficult to show in a screenshot?
It would have been very nice in severely polygonal games like Doom 3, where every single monster head looked like an octagonal stop sign. Especially the boobs! We had to put up with polygonal boobs for the entire decade, basically, when the GPUs were most certainly capable of smoothing them out to some degree, at least.
Yup...you need video to see stuff like that:
http://www.youtube.com/watch?v=WpW6kPBnyQ4
Oh crap...I posted a link to an actual game...some people are going to go nuts now ^^
You and some other posters seem to be either confused or not understanding that proprietary graphics/physics technology is not advancing the market. Every game that uses it feels like a new PhysX demo and nothing more. Physics effects via PhysX lock out 50-60% of the GPU market. For this reason, developers are reluctant to go to the next level. This is why most games that use PhysX do a piss-poor job of it or end up with poorly exaggerated versions of what we call physics in real life.

In other words, proprietary PhysX will not be very prevalent in games unless NV has 80-90% market share. Making realistic physics or DX11 effects with an open-standard compute language is HOW to make games, because developers can use the GPU's compute shaders to make games better looking while NV and AMD focus on making faster hardware with more advanced DirectCompute capability.
PhysX will continue to be a failure unless NV lets AMD cards use it.
PCGH: AMD claims that PhysX is proprietary. What's your reaction?
Nadeem Mohammed: PhysX is a complete physics solution which runs on all major platforms - PS3, Xbox 360, Wii, PC with Intel or AMD CPUs, and PC with GeForce cards; it even runs on iPhone. It's available for use by any developer for inclusion in games on any platform - all free of license fees. There's nothing restrictive or proprietary about that. We have been told that some AMD spokespeople talk about PhysX being like 3DFX's GLIDE API - that's an even more inaccurate analogy: games written for GLIDE simply would not run on any system without a 3DFX card, whereas PhysX runs on more platforms than any other physics solution out there, and comes with tools and plug-ins, like APEX, which help developers create content that can actually scale between different solutions. So please try out some of the latest PC titles - and give feedback to the developers on what gamers really want in games - let's keep on pushing the industry to make killer games together!
Godfrey Cheng, Director of Technical Marketing in AMD's Graphics Product Group, has said that PhysX will die if it remains a closed and proprietary standard.
"There is no plan for closed and proprietary standards like PhysX," said Cheng. "As we have emphasised with our support for OpenCL and DX11, closed and proprietary standards will die."
Same old broken record eh?
So the last 7 years of games with physics is nothing...but AMD making hair is "the shizzle"...gotcha...
Sounds more like AMD trying to say no to PhysX... and then blaming NVIDIA because it doesn't run on AMD GPUs... cart before horse.
ATI had TruForm, and NV had their own tessellation algorithm over 10 years ago, but both companies didn't want to collaborate in any way at all.
Of course ATI/AMD didn't want to adopt NV's PhysX, because we've actually had quiet conversations with them.
Jen-Hsun Huang said: It's been quite a journey and quite an investment. But now CUDA is impacting every aspect of our business. In Tesla, you could see the progress there we just talked about. CUDA is also making it possible for workstations and design applications to design and simulate at the same time. CUDA is also making it possible for our PC gaming GeForce to be able to use simulation for special effects and for materials and dynamics in the virtual world. And so CUDA has proven to be a real lift for our entire GPU business. And I would go so far as to say that, because of CUDA, we have kept the GPU continuing to grow into more and more applications, and as a result, continue to grow the business.
No, my post is how someone who is objective would approach PhysX vs. DirectCompute. It's pretty funny how you are so brand brainwashed that you can't even see how a particular piece of tech locked to a particular brand is alienating a certain portion of consumers. Now try to connect the dots between that and how developers and publishers think...
GCN is the first true ground-up DirectCompute architecture from AMD, and it's slightly more than a year old. At this stage, games that use DirectCompute to accelerate or advance graphical effects are popping up a lot quicker than PhysX games. NV and AMD can continue improving their GPU architectures over the next 5-10 years to handle even more graphically demanding effects via DirectCompute. Compute shaders and geometry shaders (tessellation) can make games better for everyone. I realize you haven't been able to get this concept through your head for 7 years now. :hmm:
That entire comment is BS. You just linked a quote claiming PhysX is an open standard, and in the same reply you acknowledge it doesn't run on AMD GPUs at all. That means PhysX is 100% proprietary. BL2's .ini hack is more proof than ever that NV blocks PhysX unless you have an NV GPU. Why can't PC gamers with AMD cards run PhysX in games like BL2, Batman: AC, etc.?
Whatever the case is, the argument is just stupid to begin with. With more powerful DirectCompute GPUs, the entire gaming community benefits. With PhysX, you have to have an NV card. The more developers push for open standards like tessellation and compute shaders to make games more realistic, the better. PhysX hasn't moved at all in the last 7 years. Same unrealistic looking effects.
My point was proprietary has translated into growth for nVidia:
Don't call BS and then use lies and BS as a counter... I'm not impressed.
You and some other posters seem to be either confused or not understanding that proprietary graphics/physics technology is not advancing the market. Every game that uses it feels like a new PhysX demo and nothing more.
Physics effects via PhysX locks out 50-60% of the GPU market. For this reason, developers are reluctant to go to the next level and make a game where PhysX effects impact actual gameplay. This is why most games that use PhysX do a piss-poor job doing so or end up with poorly exaggerated/inaccurate versions of what we call "physics" in real life.
In other words, proprietary PhysX will not be very prevalent in games unless NV has 80-90% market share. Trying to make realistic physics or DX11 effects using an open-standard compute language, etc. is HOW to make games. This is because the developers could use Compute shaders of the GPU to make games better looking and let NV and AMD focus on making faster hardware with more advanced capability for DirectCompute.
PhysX will continue to be a failure unless NV lets AMD cards use it. It's good to see that AMD is working closer with developers to make better looking games for everyone.
What would you rather have: (1) Next generation graphical effects available to everyone or (2) if NV and AMD each spent millions of dollars advancing proprietary graphical/physics tech in games that would not work on the competing brand's products?
Just answer 1 or 2. Don't say let the market decide.
The only way anyone thinks PhysX is great is if they exclusively use NV cards and plan on doing so forever and/or if they are a shareholder of NV and/or if they are an employee of NV or its affiliated partners/AIBs.
DirectCompute effects in games are open to any developer on any GPU capable of running them. So now we have geometry shaders, pixel shaders, and compute shaders. All of these are standard.
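The point about standard compute shaders is that work is expressed as a vendor-neutral data-parallel kernel. Here is a CPU-side Python sketch of the DirectCompute dispatch model — illustrative only; real compute shaders are written in HLSL, and this merely mimics the (group, thread) indexing scheme with invented names:

```python
# CPU-side sketch of the compute-dispatch model: a kernel function is
# invoked once per thread, indexed by (group_id, thread_id). Purely
# illustrative; real compute shaders run the kernel in parallel on any
# DX11-capable GPU, which is the portability point being made above.

GROUP_SIZE = 4  # analogous to declaring [numthreads(4, 1, 1)]

def dispatch(kernel, num_groups, data):
    """Run `kernel` for every thread in every group."""
    for group_id in range(num_groups):
        for thread_id in range(GROUP_SIZE):
            i = group_id * GROUP_SIZE + thread_id  # flat dispatch index
            if i < len(data):  # guard threads past the end of the buffer
                kernel(i, data)

def scale_kernel(i, data):
    """Trivial kernel: one 'thread' scales one buffer element."""
    data[i] *= 2.0
```

Because the kernel only sees its index and the buffer, the same logic maps onto whatever parallel hardware runs it - there is nothing vendor-specific in the programming model.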
Thinking so highly of yourself that others have to impress you now? You really are a very weird individual. Your entire argument falls flat on its face because a certain fraction of the market cannot use PhysX. Everyone can run DirectCompute, on NV or AMD. Any advancement in visuals that is open for developers to exploit is welcome. I know you can't think outside the box since you'll just be using NV-branded GPUs forever, which is why you can't get this simple point: proprietary PhysX alienates part of the gaming market and prevents developers from truly using it in a more advanced manner.
