Batman: Arkham City, no physics at all if you don't use PhysX?

Page 16 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Am I the only one surprised by this game's extensive use of tessellation? There is tessellation on buildings, on statues, on floors/pavement, walls....even trees..

Batman AC uses tessellation almost as much as Crysis 2, which is pretty damn awesome!

In this screenshot, you can see tessellation on the bricks in the building across the street, and on the cobbled pavement:



And in this one, you can see it on the tree:





It's too bad DX11 isn't working properly yet, as the DX11 mode is truly spectacular.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Did anyone EVER ask for solid proof of a direct money transfer from Nvidia to Rocksteady, or retract the statement? Please enlighten me.
Although most of what you say is reasonable, you still haven't answered my inquiry. It was bolded, but I guess you didn't see it.

...Unfortunately the ION LE isn't the smoking gun you think it is...ION LE was a discounted ION chipset to go after the netbook/nettop market by disabling DX10 in return for a lower cost. NVIDIA was willing to take a lower margin to move more IONs, and doing it in this manner would avoid undercutting the DX10-capable ION.
So cutting DX10 reduces cost. How? Is it just the hardware binning process? The ION LE runs DX10 fine after a simple INF change. Will you explain again how the cost is reduced?

Since when was it my intention to say that this is MS's evil doing? I explicitly said that was not my intention. The whole point of the ION LE case is to show that Nvidia cut costs by removing DX10 support.

If you check the PhysX developer agreement, it's free to use unless you need the source code. I don't know if Rocksteady needed the source, but it's unlikely.
http://developer.nvidia.com/physx-downloads
The NVIDIA binary PhysX SDK is 100% Free for both commercial and non-commercial use and is available for immediate download by registered PhysX developers. To become a registered PhysX Developer please complete the registration form (steps provided below) on NVIDIA's PhysX Developers Website.
I read that as: use of the PhysX SDK is free. That doesn't cover the right to sell games featuring PhysX.

Additional Information

For additional information about licensing the PhysX Source Code SDK or questions concerning NVIDIA's PhysX SDK Support offerings, contact us directly by sending an email with your questions to: PhysX Licensing
I read this as, if you want support or other licenses, you call.

MS isn't giving it out for free. You're paying for it when you buy Windows.
First, I never bought Windows; I bought a license to use Windows and some of its features. If an 8 GB memory cap under a 64-bit OS is acceptable to you, then I have nothing to say. If you too think that is ridiculous, then we can start talking.

Developers get to use it for free because it's applications that make Windows worth using.
There is no reason to go in a loop, is there? To put it simply: I say the commercial license is not free since it is IP; you say it is free because you bought Windows. I say any hardware and/or software that requires IP from others to operate needs to pay up; you say it is free if it is from MS because you bought Windows. BFG10K believes that I am embarrassing myself and must supply solid proof of what I said or retract my statement. You believe that BFG10K's behavior and demands are perfectly in line, while mine is on the edge of violating forum rules.

Did I miss anything?
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Although most of what you say is reasonable, you still haven't answered my inquiry. It was bolded, but I guess you didn't see it.
I'm not sure what you're asking?

So cutting DX10 reduces cost. How? Is it just the hardware binning process? The ION LE runs DX10 fine after a simple INF change. Will you explain again how the cost is reduced?
Example: ION is $20, ION LE is $10. ION LE lets NVIDIA sell a lower cost part without cannibalizing the higher end part, because ION LE lacks support for DX10. This is product differentiation in action. They're physically the same, with software defining the features available (which is the same for Quadro/GeForce/Tesla).

Since when was it my intention to say that this is MS's evil doing? I explicitly said that was not my intention. The whole point of the ION LE case is to show that Nvidia cut costs by removing DX10 support.
I'm responding to "Can I get a piece of evidence indicating that Nvidia and/or AMD may need to pay to operate Dx10/11 on their hardware? Yes, the ION LE case", which is where you submitted the ION LE as evidence that MS was charging AMD/NVIDIA. The fact that it's NVIDIA purposely crippling ION LE to go after the low-end market nullifies that claim.

I read it as, the use of PhysX SDK is free. That doesn't cover the right of selling games featuring PhysX.
Use in this case includes releasing games using it; commercial use is practically self-defining. The whole point being that PhysX is free. NVIDIA gives it away for free so that developers will use it, because using it adds value to NVIDIA's products.

First, I never brought windows, I brought a license to use windows and some of its features. If 8 Gb memory cap under 64 bit OS is acceptable to you, then I have nothing to say. If you too think that is ridiculous, then we can start talking.
Eh? What does this have to do with whether MS is charging developers for DirectX? (I'm trying not to get off-topic here since the only thing I'm here to argue is whether devs are charged to use DirectX)

There is no reason to go in a loop, is there? To simply put, I say the commercial license is not free as they are IPs, you say it is free because you brought windows. I say any hardware and/or software that requires IP from others to operate needs to pay up, you say it is free if it is from MS because you brought windows. BFG10K believes that I am embarrassing myself, must supply solid proof of what I said or retract my statement. You believe that the behavior and demand of BFG10K are perfectly in line, while mine is on the edge of violating forum rules.

Did I miss anything?
Here's the one (and only) thing I am arguing: you claim that MS charges developers to use DirectX; I claim it does not. You have not been able to back that up with any solid evidence that MS charges to use DirectX on Windows, which is consistent with the fact that it is free (as in beer) to use.

Meanwhile you've also started conflating free (as in speech) with all of this. Which is another matter entirely and doesn't have any relevance as to whether MS is charging developers to use DirectX.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Am I the only one surprised by this game's extensive use of tessellation? There is tessellation on buildings, on statues, on floors/pavement, walls....even trees..

Batman AC uses tessellation almost as much as Crysis 2, which is pretty damn awesome!

In this screenshot, you can see tessellation on the bricks in the building across the street, and on the cobbled pavement:


It's too bad DX11 isn't working properly yet, as the DX11 mode is truly spectacular.

Tessellation overdone like Crysis 2's is stupid. It reduces performance for nothing. I don't think overdone tessellation looks realistic at all; honestly, it looks like a cartoon to me, which isn't as appealing, IMO. Also, a good number of people tell me that Crysis 2 has tessellation going on that you cannot see (underground) just to make AMD cards tank in FPS and look bad. If Batman does that then... I don't know. It makes me hate game companies.

Not sure if anyone even cares anymore about this, but:

For AMD users, get the 11.11b Hotfix driver, it increased my performance in the 80% range on a 5870.

For those using AMD + GeForce hardware for PhysX (i.e. me), don't use the latest PhysX software; it has a bug. Use the version included with the Forceware 285.62 WHQL driver, ending in 1062.

Install the PhysX Mod from NQHF and remove the PhysXDevice.dll file from the Binaries folder.

Just reinstalled all the drivers again and it's better. It still doesn't reflect great numbers in the benchmark (min still dropping heavily), but playing the game now is 50-60 fps. Whatever was going on before to give me below 20 fps standing still is no longer a problem.
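If you'd rather script the DLL step above than delete the file by hand, here's a minimal sketch. The file name and Binaries folder are the ones described in this post; the actual path depends on your install, and renaming (instead of deleting) is my own precaution so the change is easy to undo:

```python
from pathlib import Path

def disable_physx_dll(binaries_dir: str, dll_name: str = "PhysXDevice.dll") -> bool:
    """Rename the PhysX device DLL out of the way (safer than deleting it).

    Returns True if the DLL was found and renamed, False otherwise.
    """
    dll = Path(binaries_dir) / dll_name
    if not dll.exists():
        return False
    # Keep a .bak copy so the change is trivial to undo.
    dll.rename(dll.with_suffix(dll.suffix + ".bak"))
    return True
```

Call it with your game's Binaries folder path; to restore the file, just rename the `.bak` copy back.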
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Tessellation overdone like Crysis 2's is stupid. It reduces performance for nothing. I don't think overdone tessellation looks realistic at all; honestly, it looks like a cartoon to me, which isn't as appealing, IMO.

If performance is that big of a concern to you, then you can always disable it, just like you can disable PhysX, or AA, or any other IQ enhancing feature.

I, on the other hand, want it turned on! Although I must admit I find it humorous that it's always AMD users who complain about the amount of tessellation being used in DX11 games... I wonder why? :D

Also, a good number of people tell me that Crysis 2 has tessellation going on that you cannot see (underground) just to make AMD cards tank in FPS and look bad. If Batman does that then... I don't know. It makes me hate game companies.

Well, that's not really true. Apparently, that underground tessellation was just a by-product of how the engine works. The graphics card wasn't actually expending any resources in rendering it.
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I, on the other hand, want it turned on! Although I must admit I find it humorous that it's always AMD users who complain about the amount of tessellation being used in DX11 games... I wonder why? :D

Because AMD has 1/8 the tessellation performance; it judged such high tessellation capability unnecessary and improved the chip in other respects instead. And the claim is that levels of tessellation high enough to crash performance on AMD cards do not improve discernible visual quality. Aka, cheating.

I have yet to see enough evidence to settle those claims, but they don't seem totally baseless. See the Crysis 2 invisible ocean:
http://techreport.com/articles.x/21404/3

Note that this DOES harm performance on Nvidia cards too. It just harms AMD cards much, much more.
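The arithmetic behind the 1/8 figure is easy to sketch. Here's a toy cost model (every number invented purely for illustration, not measured) showing how the same tessellation load hurts both vendors while hitting the slower tessellator far harder:

```python
def frame_time_ms(base_ms: float, tess_work: float, tris_per_ms: float) -> float:
    """Toy model: frame time = fixed raster/shading cost + tessellation cost."""
    return base_ms + tess_work / tris_per_ms

# Hypothetical throughputs: one GPU tessellates 8x faster than the other.
fast, slow = 8.0, 1.0   # tessellated triangles per ms (made-up units)
base = 15.0             # ms of non-tessellation work per frame (made up)
load = 40.0             # tessellated triangles per frame (made up)

t_fast = frame_time_ms(base, load, fast)  # 20 ms, i.e. 50 fps
t_slow = frame_time_ms(base, load, slow)  # 55 ms, i.e. ~18 fps
```

Both cards pay for the load relative to the 15 ms baseline, but the slower tessellator pays eight times the tessellation cost, which is the pattern the linked article describes.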

Well, that's not really true. Apparently, that underground tessellation was just a by-product of how the engine works. The graphics card wasn't actually expending any resources in rendering it.
I am going to need to see proof of that claim.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Did anyone EVER ask for solid proof of a direct money transfer from Nvidia to Rocksteady, or retract the statement? Please enlighten me.
I'm not sure what you're asking?
I can see that this thread is heavily moderated. Derailing, trolling, personal attacks, and abuse did not occur.

As to the rest of the argument, it is done. I have supplied evidence and reasoning, and spent several days over one statement. In case you haven't realized, others are staying away and going back to what the thread is about. Let's spend several days on the statement above for a change. Do you really not understand such a simple question?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
As to the rest of the argument, it is done. I have supplied evidence and reasoning, and spent several days over one statement.

I never saw any of that. I saw a link to MS's general IP policy.
Can you please tell me which post in this thread has that info? (Posts are numbered.)
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
There is no charge to use DirectX in games or anything else. If you require help (support) from Microsoft, they will generally charge you for that, although they do have limited support for MSDN subscribers. There is also free-to-use code, runtimes, etc., as well as the developer's forum where you can ask other subscribers for free assistance.

Microsoft obviously does this to promote DirectX, and therefore Windows.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Because AMD has 1/8 the tessellation performance; it judged such high tessellation capability unnecessary and improved the chip in other respects instead. And the claim is that levels of tessellation high enough to crash performance on AMD cards do not improve discernible visual quality. Aka, cheating.

I know why. I was just poking a bit of fun at AMD users :biggrin:

I am going to need to see proof of that claim.

Crysis 2 uses hardware occlusion culling, so if you can't see it, it's not being rendered.

Also, I think I remember that invisible body of water being there in the first game as well.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I know why. I was just poking a bit of fun at AMD users :biggrin:



Crysis 2 uses hardware occlusion culling, so if you can't see it, it's not being rendered.

Also, I think I remember that invisible body of water being there in the first game as well.

Its vertices are being calculated. Even if it's not rendered, it's using GPU cycles to calculate the millions of polys. Its presence is not a product of the engine; it's there because it was easier to do it that way in the original game. It's stupid to do it that way. It's also stupid to apply all of the tessellation they do to so many flat surfaces. It is smart only in that it creates a large performance difference between nVidia and AMD hardware. It hurts everyone's performance, though, and if it were done correctly, it wouldn't.

Remember when tessellation was first being talked about? It was touted as a way to increase detail and performance. Instead it's become a marketing tool hurting performance for everyone.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
DirectX is proprietary to Microsoft, meaning that applies.

This is Microsoft's general IP policy! This is not evidence that DirectX costs money. Your claim of providing evidence is a lie if this is all you ever provided.

That page says that MS licenses its IP, sometimes for free. On SDKs specifically, all it says is:
Application to software developer kits (SDKs): Microsoft recognizes that in the past independent software vendors (ISVs), as part of the SDK program for developing software that runs on the Windows platform, have received certain IP licenses from Microsoft. We do not expect to make changes to our existing approach of granting specific licenses to develop such software under our SDK agreements.

And nowhere on that page does it mention anything about licensing DirectX. I checked.

Also see this thread in MSDN http://social.msdn.microsoft.com/Fo.../thread/66e45b84-19da-4d34-837a-b4b4ba9d08c3/
 
Last edited:

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Because AMD has 1/8 the tessellation performance; it judged such high tessellation capability unnecessary and improved the chip in other respects instead. And the claim is that levels of tessellation high enough to crash performance on AMD cards do not improve discernible visual quality. Aka, cheating.

I have yet to see enough evidence to settle those claims, but they don't seem totally baseless. See the Crysis 2 invisible ocean:
http://techreport.com/articles.x/21404/3

Note that this DOES harm performance on Nvidia cards too. It just harms AMD cards much, much more.


I am going to need to see proof of that claim.

Also, AMD has a tessellation slider to reduce the impact as much as anyone likes. Or, you know, maybe just don't enable it?

Anyway.................
Where is the proof that anyone cheated AMD? These claims are rather childish, one-sided, and baseless.
You wouldn't be another one of those people who makes a claim without any proof at all but demands that others disprove it? Something which you yourself have not proven.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Anyway.................
Where is the proof that anyone cheated AMD?

First, read this
taltamir said:
I have yet to see enough evidence to settle those claims, but they don't seem totally baseless.

I said it's a possibility, and I was actually being overly generous to nVidia, because the evidence presented in the article I linked is strong enough to suggest that as the most likely scenario. I wouldn't say it's beyond any shadow of a doubt, but it is pretty strong evidence.
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Its vertices are being calculated. Even if it's not rendered, it's using GPU cycles to calculate the millions of polys. Its presence is not a product of the engine; it's there because it was easier to do it that way in the original game. It's stupid to do it that way. It's also stupid to apply all of the tessellation they do to so many flat surfaces. It is smart only in that it creates a large performance difference between nVidia and AMD hardware. It hurts everyone's performance, though, and if it were done correctly, it wouldn't.

Remember when tessellation was first being talked about? It was touted as a way to increase detail and performance. Instead it's become a marketing tool hurting performance for everyone.

This is the explanation a knowledgeable guy (Novum) on another forum gave concerning the body of water:

There is a good reason why the water is always "visible".

It's derived from the water rendering algorithm of Crysis 1/CryEngine 2. It's a camera-aligned mesh that is rendered after all opaque geometry, and therefore invisible parts of it are very quickly rejected by the GPU's Hier-Z mechanism. This is very efficient without tessellation, because the geometry load is minimal without it. So always rendering it was actually the fastest way.

With tessellation it's another story, because the geometry load actually is quite substantial with all the domain/vertex/hull shader work to do. But doing the visibility calculations on the CPU instead or using occlusion queries wouldn't have been trivial.

As with the other D3D11 stuff it seems to have been implemented in a rush or as an afterthought.

Nothing wrong with that from my perspective, because it's a free add on. I also don't think there is some NVIDIA conspiracy behind it, but you never know ;)

I'm not a graphics programmer so I can't say for certain why the body of water is there, but it seems ridiculous to attribute it to some conspiracy against AMD and AMD hardware users.
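Novum's explanation can be sketched as a toy cost model (every number invented purely for illustration): Hier-Z makes the hidden pixels of the flat water mesh nearly free, but hull/domain-shader work runs before any depth rejection, so the tessellated mesh costs real time even when fully occluded:

```python
def water_cost_ms(tessellated: bool, visible_fraction: float) -> float:
    """Toy model of the always-submitted water mesh (illustrative numbers).

    Hier-Z rejects hidden pixels before shading, but tessellation work
    (hull/domain shaders) is paid per patch regardless of visibility.
    """
    vertex_ms = 0.1                           # flat mesh: trivial vertex load
    tess_ms = 4.0 if tessellated else 0.0     # paid even when fully occluded
    shade_ms = 3.0 * visible_fraction         # only visible pixels are shaded
    return vertex_ms + tess_ms + shade_ms

hidden_flat = water_cost_ms(False, 0.0)  # ~0.1 ms: "always render" is cheap
hidden_tess = water_cost_ms(True, 0.0)   # ~4.1 ms: hidden, but not free
```

This is why "you can't see it" and "it costs nothing" are different claims once tessellation is enabled.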
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
*Yawn* I can guarantee I've been PC gaming longer than you have...

Further, I never said I hate PhysX. Your particular quote from me mentioned the performance hit. NOBODY, and I do mean NOBODY, will say that PhysX in Batman: AC runs well unless you're running a crazy card like a 570 just for PhysX. Maybe if you have SLI it'll be all right; CrossFire doesn't have good scaling from what I've read. I find it pretty unbelievable that my GTX 295 being used for PhysX still gets below 20 fps, when in Batman: AA and every other game/test I have for PhysX it reflects great performance. Maybe the game just needs an update, maybe drivers at this point. Who knows. This is in DX9, BTW.
Also... where did you get 5 fps in Skyrim? I'm running everything maxed, sliders and all. Never below 20.



Hack it like I did. Still, this game runs below 20 fps much of the time when I set PhysX to normal, with my GTX 295 handling PhysX and my 6950 running everything else in DX9 at 1920x1200 with no AA. Turn AA to 4x MSAA and PhysX off and it never drops below 40. It's pretty bad.

Maybe you just need SLI to get it working or something. With as old an engine as this game uses, I think it's pretty sad that it runs as poorly as it does. The textures are horrid at times.

Are you sure you've been gaming on the PC for long? Whining over the extra features made me think you're new to PC gaming.

8x AA usually causes my card to crawl, and therefore I just don't use it. The performance hit is way too much for the visual effect. In PC gaming you sometimes have to turn down settings to make games more enjoyable.

I could say "8x AA is terrible, they shouldn't have made it, I hate game makers. There is a setting that, when I enable it, makes my game run slower!!!!" I could say that sort of stuff, but I think it is sort of childish. Just because I think 8x AA is overkill, not worth the hit, doesn't make me go on a religious mission. It's just a setting; you have to enable it... and with one click it goes away!

Tessellation overdone like Crysis 2's is stupid. It reduces performance for nothing. I don't think overdone tessellation looks realistic at all; honestly, it looks like a cartoon to me, which isn't as appealing, IMO. Also, a good number of people tell me that Crysis 2 has tessellation going on that you cannot see (underground) just to make AMD cards tank in FPS and look bad. If Batman does that then... I don't know. It makes me hate game companies.

Why the hate? If you want identical performance across the board, then I recommend a console (which is completely 100% AMD). You don't have to worry about PhysX performance drops. You don't have to worry about tessellation. Buy an Xbox, and you won't have to worry at all about evil, evil Nvidia.

I mean, all these games exist without all this terrible, terrible stuff you can't stand. They only have these added features on the PC, which you seem not to enjoy. The "game companies" give us the games in their true form on the consoles. Nvidia tries to spice them up for the customers who bought their expensive GPUs for the PC. What a crime!!!!

But guess what? You are not obligated to play these games with their added features. There is nothing forcing you to lose performance on features you don't want. They are added features, and it's your choice to enable them. They take nothing from the original game, nothing. These are added features that you don't like, but these games exist without them, entirely in their original form, and nothing is stopping you from playing them that way!

It seems to me that you just don't like the added options and features PC gaming offers. They do cause performance hits. Why bother with all the settings and features that upset you? It seems consoles would be better suited for your gaming needs. Simple, easy, and none of these added features.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Are you sure you've been gaming on the PC for long? Whining over the extra features made me think you're new to PC gaming.

8x AA usually causes my card to crawl, and therefore I just don't use it. The performance hit is way too much for the visual effect. In PC gaming you sometimes have to turn down settings to make games more enjoyable.

I could say "8x AA is terrible, they shouldn't have made it, I hate game makers. There is a setting that, when I enable it, makes my game run slower!!!!" I could say that sort of stuff, but I think it is sort of childish. Just because I think 8x AA is overkill, not worth the hit, doesn't make me go on a religious mission. It's just a setting; you have to enable it... and with one click it goes away!



Why the hate? If you want identical performance across the board, then I recommend a console (which is completely 100% AMD). You don't have to worry about PhysX performance drops. You don't have to worry about tessellation. Buy an Xbox, and you won't have to worry at all about evil, evil Nvidia.

I mean, all these games exist without all this terrible, terrible stuff you can't stand. They only have these added features on the PC, which you seem not to enjoy. The "game companies" give us the games in their true form on the consoles. Nvidia tries to spice them up for the customers who bought their expensive GPUs for the PC. What a crime!!!!

But guess what? You are not obligated to play these games with their added features. There is nothing forcing you to lose performance on features you don't want. They are added features, and it's your choice to enable them. They take nothing from the original game, nothing. These are added features that you don't like, but these games exist without them, entirely in their original form, and nothing is stopping you from playing them that way!

It seems to me that you just don't like the added options and features PC gaming offers. They do cause performance hits. Why bother with all the settings and features that upset you? It seems consoles would be better suited for your gaming needs. Simple, easy, and none of these added features.

The PS3 GPU was developed by Nvidia.

Who's whining about features? The whole point of the last few pages (whatever part is specific to PhysX performance) is that the performance hit is sometimes very large when it should not be. Well, I guess we should all just accept that 15 fps is normal now?

Added features cause performance hits? How come Battlefield 3 runs with HBAO, tessellation, etc. and doesn't get 10 fps? Explain that to me, oh wise one. Oh? What's that? DICE isn't a bunch of idiots? Bingo!

You can make a game look good and not run like a turd. You actually defend the idea of running tessellation underground just to make performance tank on AMD hardware? Really? That IS what you are implying.

BTW: I do own consoles, all of them in fact, and I play them as well. Thing is, I choose certain titles for PC because they look better. Why do you have a problem with someone expecting them to, well, you know, work without getting 10 fps? Shouldn't we all expect more from a PC game? Yes, I think we should. You and others like you are the reason game companies sometimes release garbage ports that only work correctly on one set of hardware. You put up with it and just tell everyone to go buy a console. That's backwards as all hell. I'd rather see game companies create a game that works on the PC and actually reflects the power of a PC over a console, without bias against one GPU or the other. It isn't that hard... I hate to sound like a broken record, but Battlefield 3 is a perfect example of a game that was done right for the PC, IMO. It's not without certain issues related to netcode or Origin, but the engine is well done overall.

Anyway, it seems to me you'd rather game companies pump out ports from the Xbox that need 10 patches to run right without crashing all the time, or that run at 10-15 fps when you turn on a feature. It isn't wrong to expect stuff to run right without artificially crippling a specific GPU.
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I'm not a graphics programmer so I can't say for certain why the body of water is there, but it seems ridiculous to attribute it to some conspiracy against AMD and AMD hardware users.

That was a plausible explanation of how such a mess-up could occur due to time and money constraints rather than malice.

But it doesn't support the following:
The graphics card wasn't actually expending any resources in rendering it.
In fact, it explicitly contradicts it.

Also, there is nothing ridiculous about such conspiracies, since they are known to have happened before. Conspiracies are real; people are sitting in prison for conspiring to commit crimes, and people admit to conspiracies that are not illegal. You are using the word "conspiracy" to ridicule opposing views, which is a logical fallacy and not a nice thing to do.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I'm not a graphics programmer so I can't say for certain why the body of water is there, but it seems ridiculous to attribute it to some conspiracy against AMD and AMD hardware users.

Similar things have been proven in the past: cheating on IQ to get more fps, for example. I think it's a likely possibility. That, or Crytek are a bunch of lazy SOBs who won't go about things the right way. Perhaps Crytek is even purposely crippling, or attempting to cripple, performance in DX11, similar to how Crysis did it with DX10, to make it the benchmark game. Remember the joke? "But can it run Crysis?"
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
Because if an AMD card is detected, PhysX is disabled. Nvidia paid for it, and it's an Nvidia-only function or whatever it's called. But we all know the hacks to bypass that, so it's not much of an issue anymore, unless they stop updating them.

But we own the card. It's a really dirty tactic. Thank God there are not many PhysX games that are worth using a dedicated PhysX card for.

*Yawn* I can guarantee I've been PC gaming longer than you have...

Further, I never said I hate PhysX. Your particular quote from me mentioned the performance hit. NOBODY, and I do mean NOBODY, will say that PhysX in Batman: AC runs well unless you're running a crazy card like a 570 just for PhysX. Maybe if you have SLI it'll be all right; CrossFire doesn't have good scaling from what I've read. I find it pretty unbelievable that my GTX 295 being used for PhysX still gets below 20 fps, when in Batman: AA and every other game/test I have for PhysX it reflects great performance. Maybe the game just needs an update, maybe drivers at this point. Who knows. This is in DX9, BTW.
Also... where did you get 5 fps in Skyrim? I'm running everything maxed, sliders and all. Never below 20.



Hack it like I did. Still, this game runs below 20 fps much of the time when I set PhysX to normal, with my GTX 295 handling PhysX and my 6950 running everything else in DX9 at 1920x1200 with no AA. Turn AA to 4x MSAA and PhysX off and it never drops below 40. It's pretty bad.

Maybe you just need SLI to get it working or something. With as old an engine as this game uses, I think it's pretty sad that it runs as poorly as it does. The textures are horrid at times.


Nah, it's really not worth it. I gave up on PhysX a year ago after Nvidia pulled this crap.

What do you expect from a 7-year-old engine? It's ancient and it's a console port, not worth a dime. Better to play BF3 right now.
 

nsavop

Member
Aug 14, 2011
91
0
66
Cmdrdredd, what's your GPU usage when running Batman: AC in DX9? My usage stays in the 50s, yet I'm getting similar dips in frames. Just curious, because the game looks like it should be pushing my GTX 570 much harder than that.

Definitely wouldn't be surprised if it's another bug with this game.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
Cmdrdredd, what's your GPU usage when running Batman: AC in DX9? My usage stays in the 50s, yet I'm getting similar dips in frames. Just curious, because the game looks like it should be pushing my GTX 570 much harder than that.

Definitely wouldn't be surprised if it's another bug with this game.

It's a very bad console port. I mean, it doesn't have BF3's IQ, yet the performance is horrible. Heck, it even has problems running DX11.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
It's a very bad console port. I mean, it doesn't have BF3's IQ, yet the performance is horrible. Heck, it even has problems running DX11.
Do you even own the game? Your comment shows that you do not even know that the DX11 implementation is the MAIN problem, and for most people it is the only problem. It runs just fine in DX9 for 99.9% of people, including myself. With vsync and the framerate cap off, the game will use much more of the GPU. With PhysX on, it does seem to use less of the GPU during heavy PhysX spots, but that happens in other PhysX games too.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
This is Microsoft's general IP policy! This is not evidence that DirectX costs money. Your claim of providing evidence is a lie if this is all you ever provided.

That page says that MS licenses its IP, sometimes for free. On SDKs specifically, all it says is:


And nowhere on that page does it mention anything about licensing DirectX. I checked.
Did you see the line stating that licenses/rights must be expressly stated? Dig into EULAs and the like and you will find that you can't even quote them without written permission. Licensing of patents is generally non-exclusive and determined by MS. DirectX is MS's intellectual property, so licensing is a must; ViRGE agreed to that. The argument was: he believes the licensing is free, while I believe it isn't. Didn't I say I don't have proof of that in post #371? Well, does he have proof stating otherwise? So far, ViRGE has only been able to show that the SDK is free to use. Yes, it is free for people to develop things with, but that has nothing to do with games featuring DX code paths, right? Why am I the only one who needs to back up my statements while others can just state things? Where is the proof that MS doesn't get a share of the revenue made by a game that uses a DX code path? All I see is "Oh, MS wants people to use DX, so it is free since we bought Windows." Really? To the end user, that is the case. To the studio that made the game? I don't think so. Yet my claim is said to be extraordinary.

I used the ION case as evidence that cutting cost means cutting DX10 support. I really thought this was trivial, since DX10 support can be enabled by a simple INF change. Yet I guess it isn't trivial to others.

Seriously, a forum post? It doesn't even answer the question at hand.
That being said, the DirectX SDK does not require a license (other than the one for Windows). It's "free" to use for development, and the DirectX Runtime is free to distribute as well. For full details, read the license agreement.

Microsoft XNA is a tool made by MS to develop games. Did you miss that? It has a yearly license fee of 99 USD. Did you miss that? Creators, meaning game developers, are paid 70% of sales revenue as a baseline. Is that not proof of game devs paying?
http://en.wikipedia.org/wiki/Microsoft_XNA

Of course it doesn't explicitly state that part of it is for licensing a certain right to feature DirectX. It isn't proof, but it is evidence.

Say you bought an Xbox 360 and a game for it; do you know that you have an EULA to click? Yes, you bought a copy of the game all right, but the EULA states that it is a license to run that game on this Xbox ONLY. As for developers, XNA showed that they need to pay. That is, you bought an Xbox, which is a source of income for MS. You bought a game, and part of that payment also goes to MS. Every single time you buy a game, you are paying MS, not just whatever payment you made to get that Xbox 360. I really thought this was well known.

That is not just Xbox, but PS3 and Wii too. Devs need to share revenue with them because their code runs on those IPs. If PC games could escape this, there probably wouldn't be console gaming. When you pay 39.99 USD for a game, I don't think Rocksteady gets more than 10 bucks.
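The $39.99 arithmetic can be laid out explicitly. The 30% platform cut mirrors the 70% XNA baseline mentioned above; the retailer and publisher cuts below are purely hypothetical placeholders, since real contract terms aren't public:

```python
def developer_take(retail_price: float, retailer_cut: float,
                   platform_cut: float, publisher_cut: float) -> float:
    """Chain the splits from retail price down to the developer's share."""
    after_retail = retail_price * (1.0 - retailer_cut)    # store keeps its cut
    after_platform = after_retail * (1.0 - platform_cut)  # console holder's cut
    return after_platform * (1.0 - publisher_cut)         # publisher's cut

# Platform cut of 30% mirrors the XNA 70% baseline; the other cuts are guesses.
take = developer_take(39.99, retailer_cut=0.20,
                      platform_cut=0.30, publisher_cut=0.55)
# take works out to roughly $10 of a $39.99 sale under these assumed splits
```

Under these (assumed) splits the developer indeed ends up with around ten dollars of the sticker price, which is the ballpark claimed above.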
 
Last edited: