How to enable Nvidia PhysX on ATI cards in Batman: AA


BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Originally posted by: Shaq

The fact that it can be enabled by changing the vendor ID proves nothing but that people can circumvent the check and get it for free. If AMD had paid the devs to put the AA in the game, it would be there. Complain to ATI that they don't support owners of their hardware and don't tear down companies that do to make ATI look better.
The point is that it's an artificial block by the developer. We know there's nothing inherently special about nVidia's hardware to make AA work in-game because AA works when you defeat the vendor check. Since nVidia stated they weren't responsible for the block in the other thread, that leaves the developer to blame.
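The "vendor check" being argued about can be sketched in a few lines. This is purely an illustrative reconstruction, not Batman: AA's actual code; only the PCI vendor IDs (0x10DE for NVIDIA, 0x1002 for AMD/ATI) are real. It shows why the gate is "artificial": the AA code path itself never depends on the check, so spoofing the reported vendor ID defeats it.

```python
# Illustrative sketch only -- not the game's actual code.
NVIDIA_VENDOR_ID = 0x10DE  # real PCI vendor ID for NVIDIA
AMD_VENDOR_ID = 0x1002     # real PCI vendor ID for AMD/ATI

def ingame_aa_available(adapter_vendor_id: int) -> bool:
    """Gate the in-game AA toggle on the GPU's reported vendor ID."""
    return adapter_vendor_id == NVIDIA_VENDOR_ID

def apply_msaa(samples: int) -> str:
    """Stand-in for the actual MSAA setup, which is vendor-agnostic."""
    return f"{samples}x MSAA enabled"

# The toggle is greyed out on an ATI adapter, available on an NVIDIA one,
# yet apply_msaa() itself never consults the vendor ID at all.
print(ingame_aa_available(AMD_VENDOR_ID))
print(ingame_aa_available(NVIDIA_VENDOR_ID))
print(apply_msaa(4))
```

Because the gate reads only the reported ID, anything that makes an ATI card report 0x10DE (the trick discussed in this thread) exposes the toggle unchanged.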

I got the same framerate forcing AA or using it in-game.
If that's the case then why bother implementing in-game AA in the first place?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Originally posted by: Wreckage

Stop distorting the truth. The Unreal 3 engine does not support AA.
According to Sweeney it does under DX10. The options might not be there for the end user to enable it, but the code certainly is there.

This coupled with in-game AA working on ATi cards in Batman when the vendor check is defeated - along with claims that driver AA runs the same speed as in-game - makes me struggle to understand just what the developer's effort actually achieved.

Seriously out of this huge list you can only count a few games?
http://www.nzone.com/object/nzone_physxgames_all.html

I see a lot of great games.
We've been over this before, yet you keep producing that disingenuous list. Most of the games on that list show no benefit from hardware acceleration, and there are console-only titles there too.

The real list looks more like this one: http://www.anandtech.com/video/showdoc.aspx?i=3539&p=8

With recent games like Batman we can conclude that the master list of hardware accelerated PhysX titles numbers somewhere around 15-20.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Originally posted by: BFG10K

BFG can you run through the logic you used to get to that conclusion? Don't take it in a bad way, I just can't see the way you got there.
This is an artificial block. The work has already been done, there's nothing inherently special about nVidia's hardware, and we know in-game AA works on ATi's hardware when the vendor check is defeated.

Since the OP's quote states nVidia isn't responsible for the block, that leaves the developer to blame.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Wreckage
Originally posted by: akugami


Click on the link at the bottom of that list for a list of games that actually use GPU PhysX and the list becomes...much less impressive.

How about the list of games that actually use GPU Havok? You do know, don't you?

You keep talking about GPU accelerated physics, and how Havok lacks that right now, then you produce a list in which the vast majority of games named are software accelerated. Distortion of truth yet again...
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: BFG10K
Since the OP's quote states nVidia isn't responsible for the block, that leaves the developer to blame.

And that naturally begs the question - why would the developer implement the block?

The Occam's razor here would suggest it was about quality control...if it takes effort (money, resources) to qualify/validate software as running properly and fully on hardware then why would the developer leave themselves open to risk ruining their credibility by releasing a game that potentially would be viewed as being "buggy" on AMD hardware?

To reduce this risk the developer would have to expend resources (money) to validate the code as working properly on AMD hardware...why would they do this?

What would it gain them? Would they sell more copies of the game if it had AA support for AMD cards? Would they recover their investment?

The fact that they weren't going to put AA into the game even for NV hardware unless NV ponied up for it tells me their internal sales projections were telling them (the developer) that investing in AA wasn't going to create enough additional sales to justify the AA pricetag (for either AMD or NV or both combined).

We really need not invoke actions of sinister ne'er-do-wells being afoot to explain the outcome here...logical, rational, typical business decisions made from the simple assessments of return on investment as well as quality control and validation (routine things in business) are ample to explain what happened here.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K

According to Sweeney it does under DX10. The options might not be there for the end user to enable it, but the code certainly is there.
What about DX9? What work is needed for the DX10 version? Have you written the code and can you clear this up for us?
This coupled with in-game AA working on ATi cards in Batman when the vendor check is defeated - along with claims that driver AA runs the same speed as in-game - makes me struggle to understand just what the developer's effort actually achieved.
If AMD wanted in-game AA why did they not work with the company to implement it? Why do you think they should be rewarded for doing nothing? People should stop buying their cards until they start supporting games correctly.

We've been over this before, yet you keep producing that disingenuous list. Most of the games on that list show no benefit from hardware acceleration, and there are console-only titles there too.
That was a discussion (not involving you) regarding Havok vs Physx. Stop trying to bounce back and forth between CPU and GPU in order to skew a viewpoint. If you are going to discuss Havok vs Physx then the list I used is entirely accurate. Since Havok still does not support GPU acceleration there are zero games to compare it against Physx. I would hope you know this for your "reviews" at the other hardware site you work at.

Originally posted by: SlowSpyder

You keep talking about GPU accelerated physics, and how Havok lacks that right now, then you produce a list in which the vast majority of games named are software accelerated. Distortion of truth yet again...
That was a discussion (not involving you) regarding Havok vs Physx. Stop trying to bounce back and forth between CPU and GPU in order to skew a viewpoint. If you are going to discuss Havok vs Physx then the list I used is entirely accurate. Since Havok still does not support GPU acceleration there are zero games to compare it against Physx.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K
This is an artificial block.
The AA code may not have been fully tested on AMD hardware. So it may cause problems to enable it.

The work has already been done
By NVIDIA. AMD should work with developers more so games won't run poorly on their hardware.

there's nothing inherently special about nVidia's hardware,
They still have the fastest video card out there, that's kind of special. ;)

and we know in-game AA works on ATi's hardware when the vendor check is defeated.
We know this from an AMD marketing blog. The game developer seems to think otherwise.
Since the OP's quote states nVidia isn't responsible for the block, that leaves the developer to blame.

No it leaves AMD to blame for not working with the developer.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Originally posted by: Wreckage

No, it's clearly and 100% AMD's fault.
How? Their hardware already supports the in-game implementation, and the work has already been done.

Why should AMD get a free lunch? So while NVIDIA did all the groundwork to help AA get going with the developer, AMD should just get to sit at home and wait for a check to arrive.
You might have a point if there was something inherently special about the implementation that only made in-game AA possible on nVidia's cards. But that isn't the case because it works unaltered on ATi cards. It also appears to be no faster than driver AA, so the argument that it's more efficient doesn't seem to apply either.

Your argument failed. AA is not a "standard feature" of the Unreal Engine.
According to Sweeney the engine supports it under DX10.

Wrong again. It has to be programmed into the game for it to work "in game"; otherwise you have to force it globally, which is less efficient.

In fact AA does run on Batman by using CCC, it's just slower. So what really have they lost out on?
Someone tested it in the other thread and they claimed there's no performance difference between in-game AA and nVidia's control panel. If true it again begs the question of what makes it so special that it expended so much effort from nVidia and the developer, and why it needed to be artificially blocked on ATi's parts.

Well then AMD should have worked with the developer.
Why, given their GPUs already work with the implementation out of the box? The only thing stopping it is the artificial vendor check. Again, you might have a point if it required something special inherent to nVidia's hardware, but that doesn't seem to be the case.

This whole thing is looking more and more like a marketing and advertising tool by having an AA toggle that's artificially grayed out on ATi's cards.

Kind of like how Crysis' very high options were grayed out on XP despite many settings having absolutely nothing to do with DX10, like sound and physics. Or like how Halo 2 only works on Vista despite not even being DX10.

I guess in these instances you'd claim XP is at fault despite the artificial locks from the developer, and despite the fact that both locks can be defeated on XP?

AMD is holding back game development by not working with game developers, not implementing physics, delaying games and all their other shenanigans.
If you honestly think that nVidia would allow ATi to compete with them with PhysX then your arguments are naïve. If ATi implemented PhysX and it happened to run faster on their boards, do you honestly think nVidia would sit idly by and do nothing? ATi don't even support PhysX and we've already seen nVidia apply a vendor-lock to PhysX by disabling it in systems with ATi cards.

Also DX11 is not an open standard.
Yes it is. Any vendor is free to implement it on their hardware without restriction or royalty requirements. Furthermore Microsoft is vendor agnostic so there's no conflict of interest, unlike someone trying to compete with nVidia and their own PhysX API.

As for non-MS platforms, I guess you're not familiar with Crossover games.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Originally posted by: Wreckage

The AA code may not have been fully tested on AMD hardware. So it may cause problems to enable it.
Do you have any evidence to back the claim that enabling it causes problems?

By NVIDIA. AMD should work with developers more so games won't run poorly on their hardware.
It runs the in-game AA code just like nVidia's hardware does.

They still have the fastest video card out there, that's kind of special. ;)
Stop trolling and address the point.

We know this from an AMD marketing blog. The game developer seems to think otherwise.
Got any evidence to back that claim? Last time I checked the developer hadn't even commented on the issue.

No it leaves AMD to blame for not working with the developer.
Nope, because their cards run in-game AA just like nVidia's cards do. Furthermore there is evidence to suggest this AA is nothing more than a toggle that doesn't actually implement anything special, like I mentioned above.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
BFG your whole argument ignores that the developers had to add AA themselves to the game and that NVIDIA helped them do this.

Until you come to grips with that fact... you have no argument.

From the first post in this thread...

Batman is based on Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Originally posted by: Pantalaimon
AMD delayed Dirt 2 to add DX11 as a marketing feature. And not just a delay of a few weeks. You can play it now on the consoles but not until the end of the year on the PC.

It's been delayed to add DX11, which will also benefit the owners of the new nvidia cards. DX11 is not restricted to ATI cards only. The game doesn't lock out DX11 features when a DX11 nvidia card is used. So it's nowhere near nvidia's tactic.

:thumbsup:

How delaying a game launch to add DX11 (which will benefit both ATi and nVidia cards) can be seen as negative is beyond me :confused:

Batman and Mirror's Edge were delayed so that the devs could implement PhysX - it works only on nVidia. So if anything, this *could* be seen as something negative - but how? Extra features aren't added by a tick in options... And PhysX is nVidia only (unlike AA).

Even if we assume such a delay is something bad and unacceptable (which is crazy talk anyway), which one would be worse?

1. delay a game to add a standard vendor-independent feature but block part of the market from using it (Batman)
2. delay a game to add a standard vendor-independent feature that will benefit everybody (DiRT2)

Well? Anyway, this was more rhetorical, as the whole concept of "delay due to extra features = bad" is absurd imo.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Originally posted by: Idontcare

The Occam's razor here would suggest it was about quality control...if it takes effort (money, resources) to qualify/validate software as running properly and fully on hardware then why would the developer leave themselves open to risk ruining their credibility by releasing a game that potentially would be viewed as being "buggy" on AMD hardware?
Again this might be a valid point if the AA was actually doing something special, but it appears to be nothing more than regular AA. If it's standard AA that doesn't rely on vendor-specific features then it'll work on any such hardware with the same capabilities.

To put it another way, should all current DX10.1 and DX11 games implement anti-nVidia blocks on the basis that such code has never been tested on them? Of course not, because it's standard code. And we know Batman's in-game AA code is standard because it runs on ATi's cards when the vendor locks are defeated.
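The distinction BFG10K is drawing here is between gating a feature on vendor identity versus gating it on reported hardware capability. A hypothetical sketch (the function names and capability dict are illustrative, not from any real engine; only the PCI vendor IDs are real):

```python
# Hypothetical contrast of the two gating strategies argued about above.
def aa_gate_by_vendor(vendor_id: int) -> bool:
    # The approach attributed to Batman in this thread:
    # expose AA only when the adapter reports NVIDIA's PCI vendor ID.
    return vendor_id == 0x10DE

def aa_gate_by_capability(caps: dict) -> bool:
    # The standard approach: expose AA wherever the hardware reports
    # MSAA support, regardless of which vendor made it.
    return caps.get("msaa_samples", 0) >= 2

# An ATI DX10 card that fully supports 8x MSAA:
ati_card = {"vendor_id": 0x1002, "msaa_samples": 8}
print(aa_gate_by_vendor(ati_card["vendor_id"]))  # blocked despite support
print(aa_gate_by_capability(ati_card))           # exposed, as with DX10.1/DX11 paths
```

Under the capability test the same card that the vendor gate rejects passes cleanly, which is the analogy to DX10.1/DX11 titles not blocking future nVidia hardware.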
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Originally posted by: Wreckage

BFG your whole argument ignores that the developers had to add AA themselves to the game and that NVIDIA helped them do this.

Until you come to grips with that fact... you have no argument.

From the first post in this thread...

Batman is based on Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing.
http://www.evga.com/gaming/gaming_news/gn_100.asp

Originally posted by: Tim Sweeney

The most visible DirectX 10-exclusive feature is support for MSAA on high-end video cards. Once you max out the resolution your monitor supports natively, antialiasing becomes the key to achieving higher quality visuals.
 

Compddd

Golden Member
Jul 5, 2000
1,864
0
71
I was going to buy Batman AA, but I just changed my GTX 285 out for a 5870, and after seeing this bullsh*t I'm not going to bother buying it.

You just lost a customer, Eidos! And I'm going to tell all my friends the same thing :)
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K
Originally posted by: Wreckage

BFG your whole argument ignores that the developers had to add AA themselves to the game and that NVIDIA helped them do this.

Until you come to grips with that fact... you have no argument.

From the first post in this thread...

Batman is based on Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing.
http://www.evga.com/gaming/gaming_news/gn_100.asp

Originally posted by: Tim Sweeney

The most visible DirectX 10-exclusive feature is support for MSAA on high-end video cards. Once you max out the resolution your monitor supports natively, antialiasing becomes the key to achieving higher quality visuals.

I was unaware that Batman was a DX10 only game.
 

Fenixgoon

Lifer
Jun 30, 2003
33,590
13,286
136
Originally posted by: Wreckage
BFG your whole argument ignores that the developers had to add AA themselves to the game and that NVIDIA helped them do this.

Until you come to grips with that fact... you have no argument.

From the first post in this thread...

Batman is based on Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing.

I'm curious about this because GoW PC has AA enabled via DX10.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Originally posted by: Wreckage

What about DX9? What work is needed for the DX10 version? Have you written the code and can clear this up for us?
No, I haven't worked on the code, but Sweeney has, given he wrote the engine.

It's possible the work they refer to is implementing AA under DX9, but again this would be standard code that would work on any DX9 card, and indeed works on ATi's boards.

If AMD wanted in-game AA why did they not work with the company to implement it? Why do you think they should be rewarded for doing nothing?
Why do you think nVidia should be rewarded when they ship DX10.1/DX11 hardware and benefit from games that were developed on ATi's hardware?

Should all current DX10.1/DX11 titles have anti-nVidia blocks to stop nVidia from being rewarded for doing nothing?

People should stop buying their cards until they start supporting games correctly.
Should people stop buying nVidia's cards because they don't support games "correctly" by not having DX10.1/DX11?

That was a discussion (not involving you) regarding Havok vs Physx. Stop trying to bounce back and forth between CPU and GPU in order to skew a viewpoint. If you are going to discuss Havok vs Physx then the list I used is entirely accurate. Since Havok still does not support GPU acceleration there are zero games to compare it against Physx.
Fair enough then. But you do know that the list of software Havok titles far outnumbers the list of PhysX titles, right? PhysX of course has the advantage in hardware titles, given Havok's count is basically zero, but with 15-20 titles I'm not sure that constitutes a large one.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K

No, I haven't worked on the code, but Sweeney has, given he wrote the engine.
So you took a comment from him that says DX10 AA is supported in the engine to mean it works flawlessly without testing and without even having to write any code? I've programmed for DX in the past. It just ain't that simple. Besides the fact that it still ignores DX9.

It's possible the work they refer to is implementing AA under DX9, but again this would be standard code that would work on any DX9 card, and indeed works on ATi's boards.
Nope. You should try reading game reviews sometime, as games tend to perform differently on different cards. Based on your understanding of video cards, they should be able to use the same drivers.

Why do you think nVidia should be rewarded when they eventually ship DX10.1/DX11 hardware and benefit from games that were developed with the support of ATi's hardware?
Ah the old straw man argument. DX would exist with or without AMD. AA in Batman would not exist without NVIDIA.

Should people stop buying nVidia's cards because they don't support games correctly by not having DX10.1/DX11?
Considering Dirt 2 won't be here till the end of the year, sounds like NVIDIA will have their support in time. NVIDIA is at least working on DX11 support, instead of bitching about not getting it for free in a blog comment.

Fair enough then. But you do know that the list of software Havok titles far outnumbers the list of PhysX titles, right? PhysX of course has the advantage in hardware titles, given Havok's count is basically zero, but with 15-20 titles I'm not sure that constitutes a large one.
Your opinion is duly noted. However I was still right.
 

NYHoustonman

Platinum Member
Dec 8, 2002
2,642
0
0
Originally posted by: Keysplayr
Originally posted by: adriantrances
Might not be GOTY worthy, but it's still a good game

I'd definitely put it in the top 10 as far as games go. I really enjoy it. Great gameplay, storyline, graphics. At any rate, it's a AAA title.

I'd personally have no problem calling it my GotY to this point, great game.

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Originally posted by: Wreckage

So you took a comment from him that says DX10 AA is supported in the engine to mean. It works flawlessly without testing and without even having to write any code?
Nope, that's a strawman on your part. But if a developer states an engine supports something, you can at least assume it actually works.

Nope. You should try reading game reviews sometime, as games tend to perform differently on different cards.
We aren't talking about performance, we're talking about the feature being enabled or not.

Based on your understanding of video cards, they should be able to use the same drivers.
Nope, because drivers individually target specific hardware; Batman's AA does not.

DX would exist with or without AMD.
But DX10.1/DX11 render paths in games wouldn't exist unless developers implemented them. With current games they implemented them on ATi cards, and now nVidia will be rewarded for doing nothing when they release such cards.

AA in Batman would not exist without NVIDIA.
Exactly the same thing applies to current DX10.1/DX11 implementations and ATi.

Remember, this is your logic, not mine. You're happy for nVidia to be rewarded for doing nothing, but not for ATi to do the same. Why?

Again, why aren't you advocating that developers implement anti-nVidia blocks in current DX10.1/DX11 titles because they weren't tested on nVidia's hardware, and because nVidia shouldn't be rewarded for doing nothing? These are the exact same arguments you're using for Batman's AA.

Why the rampant double standard?

Considering Dirt 2 won't be here till the end of the year, sounds like NVIDIA will have their support in time.
Yeah? And whose hardware do you think Dirt 2's DX11 path is being developed on? That and Stalker will be here in November, and DX10.1 titles have already been out for months. Again, all free rewards for nVidia when they release such hardware.

Again, should these developers implement anti-nVidia blocks like Batman's developer did?

NVIDIA is at least working on DX11 support, instead of bitching about not getting it for free in a blog comment.
Nice attempted deflection, but it does nothing to address your woeful arguments.

However I was still right.
Yeah? So count the Havok titles and compare them to your list, and tell me which is bigger. Hint: it?s not PhysX.
 

Red Irish

Guest
Mar 6, 2009
1,605
0
0
Bad move on Nvidia's part and very bad publicity for their company, I hope they suffer for it in terms of lost sales.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Fine BFG since you keep dodging the issue. I will make it nice and simple.

Either

A.) Batman is a DX10 only game and the developers and or NVIDIA are doing something fishy.

B.) Batman is not a DX10 only game and extra work\testing had to be done to get AA working.

If it's A I will admit to being wrong and publicly apologize. If it's B, you are intentionally ignoring that just to bait\troll me.

So which is it....A or B?
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Wreckage
Fine BFG since you keep dodging the issue. I will make it nice and simple.

Either

A.) Batman is a DX10 only game and the developers and or NVIDIA are doing something fishy.

B.) Batman is not a DX10 only game and extra work\testing had to be done to get AA working.

If it's A I will admit to being wrong and publicly apologize. If it's B, you are intentionally ignoring that just to bait\troll me.

So which is it....A or B?

He's dodging the issue? He's addressed all of your (weak) points and explained why you are wrong. Answer his question, should Nvidia cards be locked out of DX10.1 features since they were more than likely developed on AMD hardware?
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
C.) Nvidia is engaging in business practices that will only serve to further fragment the PC gaming industry and they deserve all the scorn being directed at them until they can learn to play nice with others.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Originally posted by: Wreckage

So which is it....A or B?
It's "B", and here are some more "B" statements:

Stalker CS is not a DX10.0 only game and extra work/testing had to be done to get DX10.1 working.

Stalker CoP is not a DX10.0 only game and extra work/testing had to be done to get DX11 working.

The rampant double standards in your arguments can't acknowledge other "B" statements, and can't see that your vendor lock arguments ("rewards for free" and "not tested") equally apply to them.

Again, why aren't you advocating that the developers lock out nVidia from their DX10.1/DX11 render paths like they locked out ATi from Batman's AA?