The physics "claims" thread...


SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
It's quite simple: the GPU is already doing something else. The CPU is not. So we use the CPU for physics.

As GPUs grow in power, graphics demands will grow with them, so GPUs will never have spare cycles to use on physics, even if they are a hundred times more efficient at it.

As CPUs grow in power, the only workload left to scale up and use the extra power is physics calculations.

Physics will always be a rarity on the GPU for that reason. Physics will simply evolve slowly with CPU speed, especially as CPUs gain integrated parallel processing capability.

Then why are so many things moved onto GPUs because CPUs can't handle them, from transform and lighting to 3D rendering to video? Why would physics be any different?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Also from that TH link, a review of Batman. And new today, the collaboration with Nvidia: Epic Shows Off Mind-Blowing Next-Gen Graphics

I'm not an expert, but do any of the effects shown in the link use PhysX? Subsurface scattering, high-quality shadows, bokeh depth of field, reflections...

If I remember correctly, Nvidia didn't want to support hybrid video card systems because of liability issues. The drivers would have had to be redesigned to support AMD hardware, and Nvidia would have had to supply tech support as well.

That's the marketing line. Reality... they had to do nothing to have it work. It worked fine until they decided to disable it simply because they could. They wanted to punish all those people who were buying nice shiny new 5000-series cards (while nVidia had nothing new to offer) and using their old nVidia cards for PhysX.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
They have shiny new GeForce products, and yet they still don't officially support AMD as a primary. By your logic there'd be no need to punish users now, because nVidia has competitive products.
 

gorobei

Diamond Member
Jan 7, 2007
3,957
1,443
136
Then why are so many things moved onto GPUs because CPUs can't handle them, from transform and lighting to 3D rendering to video? Why would physics be any different?

Physics, folding, etc. require a lot of smaller (not exactly simpler) calculations done very quickly. GPUs are a parallel brute-force approach (able to do a wide number of calculations at the same time) vs the CPU (branching and conditional out-of-order stuff). The number of objects in dynamic simulations requires a parallel approach to get the math done in real time.

However, since AMD and Intel are building those parallel vector functions into their APUs, at some point it will be easier to run those dynamics computations on the CPU side rather than the GPU (especially if the dynamics affect gameplay: blowing a hole in a wall, collapsing a multi-story building, toppling trees in a forest to block a tank, etc.), since you are guaranteed that every system with an APU has those functions, versus only maybe having a GPU with PhysX.
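The wide-vs-branchy distinction here can be sketched in a few lines (a toy example of my own, not from the thread): updating thousands of independent bodies is the same arithmetic repeated across an array, which is exactly the shape of work that SIMD and GPU-style hardware is built for.

```python
import numpy as np

# Toy particle step: thousands of independent position/velocity updates,
# all doing the same arithmetic - ideal for wide parallel hardware.
def step_particles(pos, vel, dt, gravity=-9.81):
    vel = vel.copy()
    vel[:, 1] += gravity * dt          # same update applied to every particle
    new_pos = pos + vel * dt
    # ground-plane collision: a branchless mask, again across the whole array
    below = new_pos[:, 1] < 0.0
    new_pos[below, 1] = 0.0
    vel[below, 1] *= -0.5              # crude restitution
    return new_pos, vel

pos = np.zeros((10000, 3))
pos[:, 1] = 10.0                       # drop every particle from 10 m
vel = np.zeros((10000, 3))
pos, vel = step_particles(pos, vel, dt=0.1)
```

Here numpy stands in for the wide hardware; the point is just that there is no per-object branching, so the same kernel maps naturally onto hundreds of GPU lanes (or onto APU vector units, as the post argues).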
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Then why are so many things moved onto GPUs because CPUs can't handle them, from transform and lighting to 3D rendering to video? Why would physics be any different?

It's not a question of which handles it better, it's how much the GPU can do at once. All those tasks, like rasterization and T&L, have already been offloaded to the GPU, and even the highest-end GPU will choke on a modern game like Metro 2033. If you want GPU physics it's only gonna fly in an older game engine that isn't stressing the GPU.


You can have Crysis graphics with CPU physics or Mafia II graphics with GPU physics, good luck getting both at once.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
It's not a question of which handles it better, it's how much the GPU can do at once. All those tasks, like rasterization and T&L, have already been offloaded to the GPU, and even the highest-end GPU will choke on a modern game like Metro 2033. If you want GPU physics it's only gonna fly in an older game engine that isn't stressing the GPU.


You can have Crysis graphics with CPU physics or Mafia II graphics with GPU physics, good luck getting both at once.

It seems to me: why wouldn't someone try to take advantage of a technology's strengths? It's just logical to me. It would be a waste not to.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
If I remember correctly, Nvidia didn't want to support hybrid video card systems because of liability issues. The drivers would have had to be redesigned to support AMD hardware, and Nvidia would have had to supply tech support as well.

nVidia used to sell mobo chipsets. When they first started with SLI, they made it so that their own northbridge was required, an entirely artificial limitation, since any pair of x16 lanes would work, as many have demonstrated.

When their chipset market dried up, they thought and thought about how to preserve this, and came up with the idea of encrypted license keys they would sell to mobo vendors: the vendors put the keys in the BIOS and pay nVidia $5 per board. nVidia's drivers then had a DRM component added that checks for the keys and disables SLI if they are not found. Cracked drivers obviously ensued, which kicked off an escalation of ever more complicated DRM to defeat the latest crack techniques, such as hiding things like gravity-reversal code in "game-specific optimizations".

Anyways, today mobos that can support both AMD xfire and nVidia SLI only support the license-free xfire unless the vendor pays the $5 to nVidia, so sometimes you have the exact same board (with an Intel chipset) come with AMD xfire support, or with nVidia SLI + AMD xfire support for a few bucks more.

This is an example of nVidia's insanity: they could have sold so many more cards for people to run SLI if they weren't so stupidly greedy about it. It's no different than record companies going broke from constantly suing people (paying themselves and the lawyers but not the artists, instead sending them a letter saying there's no money left to pay them since it all went to lawsuits), or companies putting DRM on their products that costs them millions, decreases their sales, and is broken in under a week.

Anyways, you were talking about hybrid systems, not their SLI, but the principle is the same. This is in fact the SECOND piece of crackable DRM in nVidia's drivers; those are the only two pieces of DRM code they contain.

nVidia actually accidentally released a DRM-free build of one beta driver not too long ago. They quickly pulled it, then put it back due to bad publicity, spun this whole "reliability and compatibility" nonsense, and waited for the fuss to die down before continuing business as usual with later versions. That build let you use an AMD primary with an nVidia secondary perfectly.

Not only is nVidia shooting themselves in the foot, losing valuable business they could have had while cementing their position as a physics-middleware monopoly (just as MS has their DX monopoly), they are also setting themselves up for an eventual lawsuit, since it is blatantly anti-competitive behavior. The only reason we haven't seen one is that AMD recognized nVidia's behavior as self-destructive and actually aiding AMD's goals...
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
It seems to me: why wouldn't someone try to take advantage of a technology's strengths? It's just logical to me. It would be a waste not to.

Is it logical to add more work to a unit operating at 99% capacity? If you add physics to it you have to take something else away.

You need a dedicated physics card or an APU. Physics is the red-headed stepchild of gaming: the GPU is too busy for it and the CPU isn't that great at it.

People aren't that willing to buy a dedicated physics card. APU will eventually get market penetration and then you may see physics leveraging the strength of parallel processing.

So physics will continue to pace CPU growth.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
People aren't that willing to buy a dedicated physics card. APU will eventually get market penetration and then you may see physics leveraging the strength of parallel processing.

Everyone I talked to who owns a video card said they are willing to buy one... just that the ones being offered suck because:
1. There is not even a single good first-order game (don't give me the chicken-and-egg theory; consoles manage to have exclusive games, and even without special contracts most games start out on a single platform). The best implementation is an Unreal Tournament 3 map mod that does very little.
2. There are issues with bad business practices driving away customers.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
No, the point of PhysX is to IMPROVE upon said effects; however, the improvement proved so minuscule that they cheat. They say "either you use PhysX or we completely erase all fog, debris, and cloth from the game".
Remember that game that had FLAGS and cloth DISAPPEAR when PhysX was disabled? Same principle.
Bad: No flags/fog/etc.
Good: Scripted flags/fog/etc.
Slightly better: Interactive flags/fog/etc.

This is completely untrue. Disabling PhysX doesn't erase anything. Batman Arkham Asylum is a game that was primarily developed for consoles, and the console versions have none of those effects; even in scripted animation form.

These effects were added specifically for Nvidia hardware users.

That's the marketing line. Reality... they had to do nothing to have it work. It worked fine until they decided to disable it simply because they could. They wanted to punish all those people who were buying nice shiny new 5000-series cards (while nVidia had nothing new to offer) and using their old nVidia cards for PhysX.

Nothing like a good old conspiracy theory, eh? :rolleyes: I find it hard to believe that a profit-seeking company like Nvidia would decline prospective sales because they were angry that people were buying AMD hardware and not Nvidia.

Is it logical to add more work to a unit operating at 99% capacity? If you add physics to it you have to take something else away.

You need a dedicated physics card or an APU. Physics is the red-headed stepchild of gaming: the GPU is too busy for it and the CPU isn't that great at it.

People aren't that willing to buy a dedicated physics card. APU will eventually get market penetration and then you may see physics leveraging the strength of parallel processing.

So physics will continue to pace CPU growth.

Most gamers have old video cards lying around. I know I do; I have several, in fact.

If you want PhysX, you can plug in an old Nvidia video card, or buy a new or slightly used low-end one on eBay for PhysX processing only.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Nothing like a good old conspiracy theory, eh? :rolleyes: I find it hard to believe that a profit-seeking company like Nvidia would decline prospective sales because they were angry that people were buying AMD hardware and not Nvidia.

Nothing at all like a conspiracy theory. It's a fact that you could use an AMD card as the primary rendering card and add an nVidia card alongside it for PhysX. Nothing extra that nVidia had to do, support, work on, etc... It just worked. So they decided to disable it. They don't want you to use an AMD card for anything. If you want nVidia's PhysX, you need to run nVidia hardware exclusively (or use hacked/old drivers).

Just like you don't need anything more on your mobo to run SLI than you do to run Crossfire. nVidia doesn't get their five bucks though and you can't do it on your board. (Again, unless you use a hack)

It's the way they operate. Obviously, they believe locking people out will make them want nVidia hardware. While I'm sure that works to a certain extent, I think it does just as much to hurt them and their reputation.
 

thilanliyan

Lifer
Jun 21, 2005
12,039
2,251
126
This is completely untrue. Disabling PhysX doesn't erase anything. Batman Arkham Asylum is a game that was primarily developed for consoles, and the console versions have none of those effects; even in scripted animation form.

These effects were added specifically for Nvidia hardware users.

I think he was talking about that free-running game, Mirror's Edge? I seem to remember flags and such disappearing when PhysX was turned off.
 
Last edited:

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Most gamers have old video cards lying around. I know I do; I have several, in fact.

If you want PhysX, you can plug in an old Nvidia video card, or buy a new or slightly used low-end one on eBay for PhysX processing only.

1. Some of us sell off older hardware to help offset the cost of constant upgrades and staying on the bleeding edge.

2. Many of us have AMD as our primary video card and thus do not have the option of a dedicated PhysX GPU, because nVidia prevents that from happening without some sort of hack/workaround.

I used to own an 8800GT that would have worked perfectly as a dedicated PhysX GPU, but I sold it off long ago as it was useless to me, considering it played second fiddle to a 4850 and then a 5850, each of which would have required the hack before I could even begin to think of bothering with PhysX.

Now that I'm on dual GTX 470s, PhysX has been neat to try out, but ultimately I'm glad I never kept the 8800GT to bother with PhysX; the feature just hasn't been worthwhile to go out of the way for at all, IMO.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
Then why are so many things moved onto GPUs because CPUs can't handle them, from transform and lighting to 3D rendering to video? Why would physics be any different?

Because the future of the CPU is multicore; heck, even right now we have more cores than we use. To maximize its potential we must use that untapped power, or soon we will be wasting our 20-core CPUs. Or developers should use the GPU on the APU for physics calculations: since hardcore gamers won't use the onboard GPU, and future CPUs will all be APUs anyway, why not use it for GPU physics?


Everyone I talked to who owns a video card said they are willing to buy one... just that the ones being offered suck because:
1. There is not even a single good first-order game (don't give me the chicken-and-egg theory; consoles manage to have exclusive games, and even without special contracts most games start out on a single platform). The best implementation is an Unreal Tournament 3 map mod that does very little.
2. There are issues with bad business practices driving away customers.

This. If there were games that used PhysX the way BF:BC2 did, then I would buy an NVIDIA graphics card, and not just for two crappy games no one has heard of.

This is completely untrue. Disabling PhysX doesn't erase anything. Batman Arkham Asylum is a game that was primarily developed for consoles, and the console versions have none of those effects; even in scripted animation form.

These effects were added specifically for Nvidia hardware users.



Nothing like a good old conspiracy theory, eh? :rolleyes: I find it hard to believe that a profit-seeking company like Nvidia would decline prospective sales because they were angry that people were buying AMD hardware and not Nvidia.



Most gamers have old video cards lying around. I know I do; I have several, in fact.

If you want PhysX, you can plug in an old Nvidia video card, or buy a new or slightly used low-end one on eBay for PhysX processing only.

Did you try it yourself??? I have several nVidia video cards lying around (8600GT) that I used, but after nVidia tried to put DRM on it I gave up and trashed it. I don't need more DRM on my computer; I have enough of it already.
 
Last edited:

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Disabling PhysX on its own card when the primary card is AMD is the biggest mistake nVidia made.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Disabling PhysX on its own card when the primary card is AMD is the biggest mistake nVidia made.

Especially considering that AMD had the upper hand in performance for about a year: they were first to DX11, and nVidia's cards were bigger, slower, hotter, etc.
So who buys an AMD card at that point? The people with the money and the willingness to spend it on top-end hardware, the exact people who would buy a secondary physics card. nVidia could have specialized in that for about a year while getting their act together, but instead they added DRM to prevent it, as if that would force people to use an nVidia main card plus an nVidia secondary physics card... instead of just harming GPU physics. If nVidia had a monopoly it could have forced the result they wanted, but they don't, so it backfired. And that was the obvious result.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Everyone I talked to who owns a video card said they are willing to buy one... just that the ones being offered suck because:
1. There is not even a single good first-order game (don't give me the chicken-and-egg theory; consoles manage to have exclusive games, and even without special contracts most games start out on a single platform). The best implementation is an Unreal Tournament 3 map mod that does very little.
2. There are issues with bad business practices driving away customers.

Yep, I bought one card, and PhysX wasn't even worth the $20 I paid for the 9600GT, let alone all the fiddling I had to do to get the drivers to work. I might have kept it if I didn't have to fool around with drivers.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
When PhysX first came out, the CPU was woefully incapable with parallel loads. Not so much now, with 4/6/8-core processors running 4/6/8 or even 12 threads. Unless nVidia really gets it in gear, the window of opportunity will close on GPU PhysX. While GPUs are getting more and more taxed, CPU parallelism is growing faster than the software is able to take advantage of it.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
When PhysX first came out, the CPU was woefully incapable with parallel loads. Not so much now, with 4/6/8-core processors running 4/6/8 or even 12 threads. Unless nVidia really gets it in gear, the window of opportunity will close on GPU PhysX. While GPUs are getting more and more taxed, CPU parallelism is growing faster than the software is able to take advantage of it.

You have a problem though.
On a 4c/8T CPU, you spend almost as much time waiting for physics threads to finish as you do computing them.

It's about being parallel, and 8 threads are no match for 512 CUDA cores... do the math.

Or read claim #1... with documentation from Bullet Physics saying otherwise.

It's really sad that myths are being used as facts.
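The "do the math" above can be made concrete with a back-of-envelope sketch (all hardware figures below are my own illustrative assumptions, roughly a GTX 580-era GPU vs a quad-core CPU, not numbers from the thread):

```python
# Rough peak-throughput comparison. Illustrative numbers only, not benchmarks.
cpu_cores = 4             # physical cores; SMT threads share the FP units
cpu_flops_per_cycle = 8   # e.g. 256-bit SIMD: 8 single-precision ops/cycle/core
cpu_clock_ghz = 3.4

gpu_cores = 512           # CUDA cores on a GTX 580-class part
gpu_flops_per_cycle = 2   # one fused multiply-add per core per cycle
gpu_clock_ghz = 1.5       # shader clock

cpu_gflops = cpu_cores * cpu_flops_per_cycle * cpu_clock_ghz   # ~108.8
gpu_gflops = gpu_cores * gpu_flops_per_cycle * gpu_clock_ghz   # 1536.0
print(f"CPU ~{cpu_gflops:.1f} GFLOPS, GPU ~{gpu_gflops:.1f} GFLOPS, "
      f"ratio ~{gpu_gflops / cpu_gflops:.1f}x")
```

Even this crude count gives the GPU an order-of-magnitude edge in raw number-crunching; the real gap for game physics is smaller once synchronization, PCIe transfers, and the branchy parts of a simulation are counted, which is exactly where the CPU claws back ground.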
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Is it logical to add more work to a unit operating at 99% capacity? If you add physics to it you have to take something else away.

You need a dedicated physics card or an APU. Physics is the red-headed stepchild of gaming: the GPU is too busy for it and the CPU isn't that great at it.

People aren't that willing to buy a dedicated physics card. APU will eventually get market penetration and then you may see physics leveraging the strength of parallel processing.

So physics will continue to pace CPU growth.

Why would a gamer choose an APU?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
When PhysX first came out, the CPU was woefully incapable with parallel loads. Not so much now, with 4/6/8-core processors running 4/6/8 or even 12 threads. Unless nVidia really gets it in gear, the window of opportunity will close on GPU PhysX. While GPUs are getting more and more taxed, CPU parallelism is growing faster than the software is able to take advantage of it.

I think having more cores and threads is great for CPUs. I also think taking advantage of the GPU's strengths for GPU processing is great as well.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
Why would a gamer choose an APU?

Because it's where our CPUs are heading. Look at SB: it's an enthusiast CPU but it has a GPU in it, and since enthusiasts tend to be hardcore gamers too, that GPU sits useless. That's why we need open physics (like Bullet) to become more of a standard, and not nVidia's crappily implemented, limited PhysX.