Call of Duty Ghosts will have PhysX and TXAA

Status
Not open for further replies.

AnandThenMan

Diamond Member
Nov 11, 2004
Nvidia claims PhysX is not proprietary because it runs on multiple platforms. Given that, I would not take anything NV says about PhysX seriously.
 

Keysplayr

Elite Member
Jan 16, 2003
Nvidia claims PhysX is not proprietary because it runs on multiple platforms. Given that, I would not take anything NV says about PhysX seriously.

Given that? How does this comment make any sense whatsoever?

Hmmf. Given that...
 

Carfax83

Diamond Member
Nov 1, 2010
Nvidia claims PhysX is not proprietary because it runs on multiple platforms. Given that, I would not take anything NV says about PhysX seriously.

PhysX is proprietary, in the sense that it is NVidia's intellectual property.

But that doesn't mean it's closed to other companies or developers.
 

SPBHM

Diamond Member
Sep 12, 2012
PhysX isn't entirely limited to nVidia hardware -- with titles like Arma 3 and Bioshock Infinite, it's much more than just the GPU component!

the CUDA-optimized effects NVIDIA uses are a complete disaster in terms of performance when running on the CPU, so it's pretty much NV hardware only...

it started with Ageia trying to sell those PCI cards... at least that wouldn't lock you to one vendor of graphics cards... unlike Nvidia, which disables PhysX if it detects you're using an AMD or Intel GPU for traditional rendering alongside an NV card for PhysX.

It isn't limited to NV hardware. It can be licensed by competitors. IMHO AMD is intentionally holding the industry back. PhysX would be much further along right now if AMD licensed PhysX from Nvidia, which they have every ability to do, and cost restraints have been shown to be non-existent.

What it boils down to:
Nvidia users get PhysX.
AMD users do not. Why is that?

AMD doesn't need to license anything; NV just needs to make their GPU-accelerated PhysX software work with the standards the industry is using, so any Intel, Nvidia or AMD hardware can run it. I think the game developers adopting PhysX should be asking Nvidia to make that work

but, with Nvidia's marketing relying so heavily on PhysX, it's hard to see that happening, or them offering reasonable "licensing" terms to AMD like you suggest.

still, with the next gen of consoles having much stronger GPUs than CPUs (compared to PCs), and people demanding 4K displays, I wonder whether using GPU resources for physics is a good idea for the next couple of years?
 

cmdrdredd

Lifer
Dec 12, 2001
IMHO Nvidia is intentionally holding the industry back. Why won't they submit PhysX to a standards body, or better yet open-source it? I don't think anybody can complain about AMD until Nvidia makes PhysX available to everyone, no strings attached.

Or just maybe someone else can make something that competes. How about that for a change? Instead of complaining, and AMD talking smack, they could be developing their own physics API that works through OpenCL. Get a few games to use it and show the benefits, but no, they would rather just say "Nvidia is evil, booo" or whatever. *shrug*
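For what it's worth, the core of what such an OpenCL-backed effects layer would compute is embarrassingly parallel. A toy sketch in plain Python (every name here is invented for illustration; this is not PhysX or any real physics API, just the shape of a per-particle kernel):

```python
# Toy particle step: the same independent per-particle update that an
# OpenCL kernel would run once per work-item, written as a plain loop.
def step_particles(pos, vel, dt=0.016, gravity=-9.81):
    """Advance one ~60 fps frame; pos and vel are lists of (x, y) tuples."""
    new_pos, new_vel = [], []
    for (x, y), (vx, vy) in zip(pos, vel):
        vy += gravity * dt                   # integrate acceleration
        x, y = x + vx * dt, y + vy * dt      # integrate velocity
        if y < 0.0:                          # bounce off the floor,
            y, vy = 0.0, -vy * 0.5           # losing half the energy
        new_pos.append((x, y))
        new_vel.append((vx, vy))
    return new_pos, new_vel

# Three frames of a single particle drifting right and falling.
pos, vel = [(0.0, 1.0)], [(1.0, 0.0)]
for _ in range(3):
    pos, vel = step_particles(pos, vel)
```

Each loop iteration touches only its own particle, which is exactly why debris-style effects physics ports naturally to OpenCL, CUDA, or compute shaders alike.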
 

YBS1

Golden Member
May 14, 2000
I still believe nVidia could get some real traction with PhysX if they would simply allow it to run on a secondary nVidia GPU regardless of which brand of GPU was being used as the primary display adapter. They could probably even create a card purpose-built for this, with the extraneous features of their normal GPUs stripped out as much as possible, fit it on a tiny single-slot card, and make use of those PCIe x1 slots everyone has going unused. Then not only could they pocket a little cash off the AMD crowd, they would probably even catch a few double sales from their own customers.

Quite honestly, I doubt many people choose their GPU on the basis of PhysX. I'm sure they exist, but I'm betting the number is quite small and would easily be eclipsed by the number of cards they would sell to the staunch AMD patrons who would otherwise never buy an nVidia card.
 

Keysplayr

Elite Member
Jan 16, 2003
NV just needs to make their GPU-accelerated PhysX software work with the standards the industry is using, so any Intel, Nvidia or AMD hardware can run it. I think the game developers adopting PhysX should be asking Nvidia to make that work

Not exactly the behavior of a company who wants to differentiate themselves from the competition all that much more. Are you in the practice of creating technology so your competitors can benefit without paying royalties or licensing fees? Let me know, I'll be right over with a shopping cart.
 

SirPauly

Diamond Member
Apr 28, 2009
it started with Ageia trying to sell those PCI cards

Indeed! At 299 dollars for this luxury -- and now added value to GeForce! How dare nVidia do this!

Edit:

the CUDA-optimized effects NVIDIA uses are a complete disaster in terms of performance when running on the CPU

And that's why it was welcome to see PhysX debris particles running on the CPU in Hawken -- shared my views at Rage3D!

Zogrim's in-depth article touches on this as well:

As for CPU execution of PhysX effects, the CPU can operate particle debris with surprisingly decent performance (40-50 fps).

http://physxinfo.com/news/10642/gpu-physx-in-hawken/

Was very glad to see strong CPU performance so many more gamers could potentially enjoy the debris particles.
 

Phynaz

Lifer
Mar 13, 2006
Not exactly the behavior of a company who wants to differentiate themselves from the competition all that much more. Are you in the practice of creating technology so your competitors can benefit without paying royalties or licensing fees? Let me know, I'll be right over with a shopping cart.

Apple seems to be doing pretty well at it.
 

Carfax83

Diamond Member
Nov 1, 2010
Was very glad to see strong CPU performance so many more gamers could potentially enjoy the debris particles.

Does Hawken use PhysX 3.0 or better? That would explain the better CPU performance compared to older SDKs.

That's where the real battle will take place. NVidia cannot bet the future of PhysX on hardware acceleration. PhysX must be able to run on at least medium settings in software with good performance.

The high setting should be reserved for hardware acceleration, or for a future AVX/AVX2 optimized version.
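On the CPU-optimization point: a large part of what reportedly made later PhysX SDKs faster on CPUs was better use of SSE and multithreading, and SIMD only pays off if the data layout cooperates. A hedged pure-Python illustration of that layout idea (illustrative only, not SDK code; all numbers are made up):

```python
# Array-of-structs: each particle is its own record, so the x fields are
# scattered in memory and a SIMD unit can't load four of them at once.
aos = [{"x": float(i), "y": 0.0, "vx": 1.0} for i in range(8)]

# Struct-of-arrays: each field is one contiguous array -- the layout an
# SSE/AVX-optimized solver wants, since 4 (SSE) or 8 (AVX) lanes of x
# can then be loaded and updated with a single instruction.
soa = {
    "x": [float(i) for i in range(8)],
    "y": [0.0] * 8,
    "vx": [1.0] * 8,
}

def step_x(soa, dt=0.016):
    """One pass over a contiguous field -- the loop shape a compiler's
    auto-vectorizer (or hand-written AVX intrinsics) can chew through."""
    soa["x"] = [x + vx * dt for x, vx in zip(soa["x"], soa["vx"])]
    return soa

step_x(soa)
```

Python itself won't vectorize this, of course; the point is only the memory layout a C++ solver would choose so that a "medium settings in software" path is realistic.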
 

myocardia

Diamond Member
Jun 21, 2003
Apple seems to be doing pretty well at it.

lmao, you wouldn't happen to be talking about the same Apple that sues any company that makes a smartphone that uses icons, would you? :)

You wouldn't happen to have noticed that this is a discussion about Call of Duty: Ghosts having PhysX and TXAA, would you?
-- stahlhart
 

SirPauly

Diamond Member
Apr 28, 2009
The same Apple that helped create OpenCL!

I'm more interested to see what type of dynamic effects the developers can come up with in COD!
 

Baasha

Golden Member
Jan 4, 2010
Is this for Call of Duty's cutting-edge, next-gen, must-have feature: Fish AI? :D

http://www.youtube.com/watch?v=yRriF6Pu1kk&feature=c4-overview&list=UUaupSIOToYMVsMygUA8lvwQ
http://www.youtube.com/watch?v=KJIIgdSW6O4

BF4 is going to dump all over this game. I think IW is resorting to bringing Nvidia in to try to stay relevant tech-wise against the Battlefield 4 juggernaut.

http://www.youtube.com/watch?v=3_xaIv7Wo1A

It's a little sad.

Couldn't agree more. After seeing the previews of both games, BF4 is head and shoulders above CoD graphics-wise.

I like FPS games in general, so I'll be getting both, but the bulk of my gaming time will be on BF4 -- especially online with friends. :D
 

StrangerGuy

Diamond Member
May 9, 2004
Never used or cared about PhysX, and probably never will. Especially funny when NV harps on it for a game that NEEDS to run well on a 5770-class GPU to reach its desired market penetration.
 

SirPauly

Diamond Member
Apr 28, 2009
Don't see the humor or understand the point! Borderlands 2 -- DirectX 9 -- playable on a 5770 -- was harped on by nVidia -- and this title had a very welcome dynamic addition with PhysX!

PhysX is in DirectX 11 titles, DirectX 9 titles, some graphics powerhouses, different genres from MMOs to FPS, and different models from conventional and traditional to free-to-play!
 

SPBHM

Diamond Member
Sep 12, 2012
Indeed! At 299 dollars for this luxury -- and now added value to GeForce! How dare nVidia do this!

you can pay $1K for a Titan, but if you commit the sin of using something else for rendering in your game, Nvidia wants your dedicated PhysX card to act more like a... brick.

yes, the Ageia card was expensive and suffered from a lack of games (still a big problem for Nvidia)

opening up GPGPU PhysX more would help to improve this, but as I said... I'm not seeing GPU physics as something that amazing for the next few years...

as for the rest of your post:

Don't see the humor or understand the point! Borderlands 2 -- DirectX 9 -- playable on a 5770 -- was harped on by nVidia -- and this title had a very welcome dynamic addition with PhysX!

Borderlands 2 running the GPU-optimized effects on the CPU was a disaster... and with heavy action (mainly in MP), even high-end NVidia cards had performance issues with PhysX + higher settings
 

StrangerGuy

Diamond Member
May 9, 2004
you can pay $1K for a Titan, but if you commit the sin of using something else for rendering in your game, Nvidia wants your dedicated PhysX card to act more like a... brick.

yes, the Ageia card was expensive and suffered from a lack of games (still a big problem for Nvidia)

opening up GPGPU PhysX more would help to improve this, but as I said... I'm not seeing GPU physics as something that amazing for the next few years...

as for the rest of your post:



Borderlands 2 running the GPU-optimized effects on the CPU was a disaster... and with heavy action (mainly in MP), even high-end NVidia cards had performance issues with PhysX + higher settings

PhysX or not, BL2 wasn't graphically impressive enough to justify the turdy performance.
 

Carfax83

Diamond Member
Nov 1, 2010
Borderlands 2 running the GPU-optimized effects on the CPU was a disaster... and with heavy action (mainly in MP), even high-end NVidia cards had performance issues with PhysX + higher settings

PhysX in BL2 is currently screwed up because the game engine has issues handling multiple threads, and not because of PhysX specifically... or so NVidia says.

And I'm inclined to believe them, because I get stuttering and lag in certain areas even in single player with my dedicated PhysX card (a GTX 650 Ti), and my rig is powered by an overclocked 3930K.

Disabling HT and running on four cores supposedly offers the best performance.
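For readers who would rather not disable HT in the BIOS, the same effect can be approximated by pinning the game to one logical CPU per physical core. A small helper for building the affinity mask, assuming the common enumeration where logical CPUs 2i and 2i+1 share physical core i (that pairing is an assumption; check your own topology):

```python
def physical_core_mask(num_cores):
    """Bitmask selecting one logical CPU per physical core, assuming
    HT pairs logical CPUs (2i, 2i+1) on physical core i."""
    return sum(1 << (2 * i) for i in range(num_cores))

# Four physical cores -> logical CPUs 0, 2, 4, 6 -> mask 0x55,
# e.g. for Windows' `start /affinity 55 Borderlands2.exe`.
print(hex(physical_core_mask(4)))  # -> 0x55
```

The `start /affinity <hexmask>` form is a standard Windows cmd feature; the game executable name above is just a placeholder.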
 

SirPauly

Diamond Member
Apr 28, 2009
you can pay $1K for a Titan, if you commit the sin of using something else for rendering in your game, Nvidia wants your dedicated PhysX card to act more like a... brick.

All imho,

Ideally, I'd like nVidia to rethink this, since it would still be a discrete PhysX card with an nVidia GPU, but I don't have the support costs and data nVidia has! AMD could simply license CUDA and PhysX for their customers and spend resources on them, or at least offer some competition here as well. It's tough to have idealism with competitors at all times!

yes, the Ageia card was expensive, and suffered with lack of games (still a big problem for Nvidia)

It was a problem with content, but recently there has been very welcome content: Borderlands 2, PlanetSide 2, Hawken, Warframe, Rise of the Triad, Metro: Last Light and XCOM, with potentially CoD, EverQuest, Batman: Arkham Origins and The Witcher to look forward to!


Borderlands 2 running the GPU-optimized effects on the CPU was a disaster... and with heavy action (mainly in MP), even high-end NVidia cards had performance issues with PhysX + higher settings

Borderlands 2 was a hoot with PhysX and its added dynamic immersion! There was flexibility with the PhysX settings, and nVidia render owners could officially use a discrete PhysX card!
 

Keysplayr

Elite Member
Jan 16, 2003
Apple seems to be doing pretty well at it.

Which one? Differentiating themselves, or creating technology for others to benefit from? Because AFAIK, Apple doesn't actually "create" anything.

http://www.youtube.com/watch?v=wFeC25BM9E0 (17:20, quoting Tim Cook)

And also, ask Samsung whether Apple appreciates someone using a design idea and lets them get away with it. Apple stopped an entire line of Samsung phones from being released in the US.

Nvidia has every bit as much right to protect their IP as any other company. AMD users are honked off because they either don't have the ability to utilize PhysX, or have to run some sort of hack to maybe get it working. If I were a fan, I'd be honked off at AMD for not allowing me, as a customer, everything it was possible to offer in gaming.

Call of Duty: Ghosts is just another title that AMD users will complain about. It doesn't matter whether the game is great or terrible; it still has an Nvidia feature.
 

Phynaz

Lifer
Mar 13, 2006
Which one? Differentiating themselves, or creating technology for others to benefit from? Because AFAIK, Apple doesn't actually "create" anything.

Here's a partial list:
- Webkit
- OpenCL
- Darwin
- Bonjour

All available without licensing fees or royalties, and used by Apple's competitors. Why won't Nvidia do the same I wonder?
 

wand3r3r

Diamond Member
May 16, 2008
PhysX in BL2 is currently screwed up because the game engine has issues handling multiple threads, and not because of PhysX specifically... or so NVidia says.

And I'm inclined to believe them, because I get stuttering and lag in certain areas even in single player with my dedicated PhysX card (a GTX 650 Ti), and my rig is powered by an overclocked 3930K.

Disabling HT and running on four cores supposedly offers the best performance.

This sounds like it's made up. :sneaky:

If it were true, it would have been plastered everywhere to disable HT.

Didn't Gearbox deny that the problem was theirs, too?
 

Keysplayr

Elite Member
Jan 16, 2003
Here's a partial list
- Webkit
- OpenCL
- Darwin
- Bonjour

All available without licensing fees or royalties, and used by Apple's competitors. Why won't Nvidia do the same I wonder?

Webkit
http://www.osnews.com/story/24705/Apple_Withholds_iOS_LGPL_WebKit_Source_Code

OpenCL
http://en.m.wikipedia.org/wiki/Khronos_Group

Darwin
http://en.m.wikipedia.org/wiki/Darwin_(operating_system)
It's all UNIX at the core, babe.

Bonjour
Bonjour is released under a terms-of-limited-use license by Apple. It is freeware for clients, though developers and software companies who wish to redistribute it as part of a software package or use the Bonjour logo may need a licensing agreement. The source code for mDNSResponder is available under the Apache License.

Maybe you should have considered posting a complete list instead of a partial one? :)

While you're putting that together, I'll busy myself putting together a partial list of Apple IP that truly does require a license fee. Keep in mind that if I manage to find even a single one, it would render all seventy billion of your "examples" moot.
I don't really want to go down this pointless road, do you?
 

Phynaz

Lifer
Mar 13, 2006
it would render all seventy billion of your "examples" moot.

Why would it do that?

Your original post:
Not exactly the behavior of a company who wants to differentiate themselves from the competition all that much more. Are you in the practice of creating technology so your competitors can benefit without paying royalties or licensing fees? Let me know, I'll be right over with a shopping cart.

I have shown that a company that created technology and gave it away freely to its competitors, without royalties or licensing fees, can be quite successful. They can even be "different", as you say.

As far as your shopping cart is concerned, you can fill it up right here:
http://www.opensource.apple.com/
It even has the iOS 6.1 WebKit source that the two-year-old article you linked says doesn't exist.
 

Keysplayr

Elite Member
Jan 16, 2003
Oh, I see. I don't think I'll be playing the out-of-context warrior game with you today.

But anyway, you only tell half the story. So if you do this, we can't have a conversation. But we never could anyway.

Maybe we should get back to the root of the "conversation" ?

I'll start. "I think AMD is intentionally holding back the industry..."

Then you,

"I think Nvidia is holding back the industry...."

We go on for eleventy billion pages NOT changing each other's minds.

No thanks.
 