How to enable NVIDIA PhysX on ATI cards in Batman: AA


Schmide

Diamond Member
Mar 7, 2002
5,788
1,092
126
Originally posted by: Wreckage
Originally posted by: Schmide

Or working with open standards like DX11, DirectCompute, OpenCL, etc that benefit all instead of a few kickback receivers.

NVIDIA was the first to offer an OpenCL driver. In fact I don't even know if AMD offers one to the public yet. Nice try though.

EDIT: Also DX11 is not an open standard. It is closed source that only runs on Microsoft operating systems.


Originally posted by: Schmide

On a Mac. :shocked: Nice try though.

DX11 works on a Mac?

Edit: I got confused by an out-of-order post and an edit. If you add something, put an edit tag on it FFS. You edited your post to specifically make it look like I said something different. Straw Man Edit FTW. Kind of like a dirty nVidia trick.

It only works on a Mac!

Originally posted by: Ryan Smith
As it stands, neither AMD nor NVIDIA have a complete OpenCL implementation that's shipping to end-users for Windows or Linux. NVIDIA has OpenCL working on the 8-series and later on Mac OS X Snow Leopard, and AMD has it working under the same OS for the 4800 series, but for obvious reasons we can't test a 5870 in a Mac. As such it won't be until later this year that we see either side get OpenCL up and running under Windows. Both NVIDIA and AMD have development versions that they're letting developers play with, and both have submitted implementations to Khronos, so hopefully we'll have something soon.
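
If anyone wants to see what their own drivers actually expose, here is a quick sketch against the standard OpenCL API. It assumes you have the Khronos headers installed; it just prints whatever platforms the installed ICDs report, nothing vendor-specific:

// Rough sketch: enumerate whatever OpenCL platforms the installed
// drivers (ICDs) report. Nothing here is vendor-specific.
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id platforms[8];
    cl_uint count = 0;
    if (clGetPlatformIDs(8, platforms, &count) != CL_SUCCESS || count == 0) {
        std::printf("No OpenCL platforms installed.\n");
        return 1;
    }
    for (cl_uint i = 0; i < count; ++i) {
        char name[128] = {0};
        char vendor[128] = {0};
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof(name), name, nullptr);
        clGetPlatformInfo(platforms[i], CL_PLATFORM_VENDOR, sizeof(vendor), vendor, nullptr);
        std::printf("Platform %u: %s (%s)\n", i, name, vendor);
    }
    return 0;
}

If neither vendor's Windows driver ships an ICD yet, this prints nothing, which is the point of the quote above.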

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Wreckage
Originally posted by: nitromullet

Agreed again. It looks like Batman:AA is selling really well (over 2M), but I haven't been able to find a breakdown of the sales by platform. I'd be willing to bet that Xbox 360 leads, followed by PS2, and then PC.

Actually it may be selling better on the PS3 because of proprietary content (you get to play as the Joker). ;)

Yeah, but I said "PS2". ;)
(fixed the typo in my original post)

You might be right though. Either way, I'm pretty sure that the individual console versions are both outselling the PC version. Personally, I'm probably just going to borrow the 360 version from a buddy of mine when he's done with it.
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
I think the real party at fault here is the game developer, not nVidia or AMD. It has been proven that AA can be enabled on AMD cards and works quite well with a very simple change, and I find it hard to believe the developer would not know about it. If the money they get from nVidia comes with a condition that they could not even make that simple change to get AA going, then things are looking kind of dirty.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Originally posted by: Keysplayr
Look at these two statements and tell me what you think is good policy for big business.
"We pay you to have this feature only for us."
"We pay you to have this feature for us, and our competition."

If you were paying for it, which would you pick?

I'm selecting a third statement:

We pay you to block ATI cards from using a standard feature.

This particular AA is not something only nVidia cards can run. It's not PhysX or another CUDA-enabled thing. It's standard MSAA, which ATI cards run just fine (if not better), and this has been proven by tricking the game into thinking you have an nVidia card. It's not magic, it's a simple block.
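
To make it concrete, here is the kind of vendor-ID gate I'm describing. This is purely illustrative (nobody outside the developer has seen the actual code); it assumes a D3D9 renderer, and the helper name and constants are mine:

// Illustrative sketch only, not the game's actual code: a vendor-ID
// gate of the kind being described, assuming a D3D9 renderer.
#include <d3d9.h>

static const DWORD VENDOR_NVIDIA = 0x10DE; // PCI vendor ID: NVIDIA
static const DWORD VENDOR_ATI    = 0x1002; // PCI vendor ID: ATI/AMD

bool AllowInGameMSAA(IDirect3D9* d3d) {
    D3DADAPTER_IDENTIFIER9 id;
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;
    // The hardware could do MSAA either way; the "block" is just this compare.
    return id.VendorId == VENDOR_NVIDIA;
}

Spoofing the reported vendor ID is enough to flip a check like this, which matches exactly what people are seeing.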

You pay a developer and help him with his application; you get your logo at start-up and some praise in the press release. It has always been like that. But you do not artificially block your competition from running something they're perfectly capable of running (to make your cards seem better than they are). This is totally unacceptable. Not to mention AA is not an nVidia invention, nor is it owned by them, and yet the game is actively blocking it from running on ATI cards.

This thing is so thick and foul, I can't possibly understand how one cannot see it... I mean, the game already uses PhysX. Great, nVidia helped with that, and it's a technology that only nVidia cards can run. I have no gripe with that, but come on, anti-aliasing? It's just sad and pathetic.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: Qbah
Originally posted by: Keysplayr
Look at these two statements and tell me what you think is good policy for big business.
"We pay you to have this feature only for us."
"We pay you to have this feature for us, and our competition."

If you were paying for it, which would you pick?

I'm selecting a third statement:

We pay you to block ATI cards from using a standard feature.

This particular AA is not something only nVidia cards can run. It's not PhysX or another CUDA-enabled thing. It's standard MSAA, which ATI cards run just fine (if not better), and this has been proven by tricking the game into thinking you have an nVidia card. It's not magic, it's a simple block.

You pay a developer and help him with his application; you get your logo at start-up and some praise in the press release. It has always been like that. But you do not artificially block your competition from running something they're perfectly capable of running (to make your cards seem better than they are). This is totally unacceptable. Not to mention AA is not an nVidia invention, nor is it owned by them, and yet the game is actively blocking it from running on ATI cards.

This thing is so thick and foul, I can't possibly understand how one cannot see it... I mean, the game already uses PhysX. Great, nVidia helped with that, and it's a technology that only nVidia cards can run. I have no gripe with that, but come on, anti-aliasing? It's just sad and pathetic.


Apparently you didn't read the update on this. What you described is not what happened.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Originally posted by: OCGuy
Apparently you didn't read the update on this. What you described is not what happened.

You mean the one from the OP?

Both of these claims are NOT true. Batman is based on Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing.

Or the statement by Epic's Tim Sweeney regarding UE3?

The most visible DirectX 10-exclusive feature is support for MSAA on high-end video cards. Once you max out the resolution your monitor supports natively, antialiasing becomes the key to achieving higher quality visuals.

Because the two statements clearly contradict each other. And I would say the engine creators know a bit more.

So which update are you talking about?


EDIT: Not to mention, what will we see next? nVidia QA'ing high-resolution textures in a game, or resolutions above 1280x1024? Or running a DX9 engine in general? You start the game and get an error: "Sorry, DX9 is not supported by your hardware." Or grayed-out "high" details? "Awesome" :disgust:
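
On the technical side, the DX10 MSAA support Sweeney mentions is exposed through an ordinary device query; nothing there is vendor-specific. A rough sketch (illustrative only, the function name is mine):

// Illustrative sketch: under DX10 a title asks the device how many
// quality levels it offers at a given sample count. Zero levels means
// the mode is unsupported on that hardware/format.
#include <d3d10.h>

bool Supports4xMSAA(ID3D10Device* dev) {
    UINT quality = 0;
    if (FAILED(dev->CheckMultisampleQualityLevels(
            DXGI_FORMAT_R8G8B8A8_UNORM, 4, &quality)))
        return false;
    return quality > 0;
}

The answer this returns depends on the hardware and format, not on who the vendor is.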
 

Fattysharp

Member
Nov 23, 2005
95
0
0
There is too much speculation about what actually happened here. ATI is saying one thing, nVidia another, and the game dev has no comment yet. We have too many "what ifs", but it certainly makes all the fanatics come out of the woodwork.

IF nVidia told the game devs they could only use AA with their cards to get funding for the game, then that is a valid choice and decision the game devs made. Is it really a good choice to limit the experience of your audience based on brand? Probably not. These sorts of fiascos have a way of following the companies involved onto future titles.

If the devs are unwilling to work with ATI for whatever reason, be it nVidia's involvement or not, that is also the devs' choice. Again, probably not a very good choice.

The Unreal Engine does not natively support AA, and there are other games where AA does not work with ATI cards without tricks. The first one I can think of is Fallout 3. ATI cards cannot force AA in this title, and will not use AA from the game menu without renaming the exe.

We do have good reasons to suspect nVidia was holding back DX10 and DX10.1 titles (Assassin's Creed, anyone?), so it is reasonable to suspect they are involved in limiting AA here as well. However, it is not fact yet, so argue away!
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: Qbah


I'm selecting a third statement:

We pay you to block ATi cards from using a standard feature.

This particular AA is not something nVidia cards can run only. It's not PhysX or another CUDA-enabled thing. It's standard MSAA, which ATi cards run just fine (if not better) - and this has been proved by tricking the game into thinking you have an nVidia card. It's not magic, it's a simple block.

What I'm curious about is whether this particular type of AA is actually a standard feature in UE3 games. Not a driver force, but something IN GAME. Was it something that had to be specifically added to this game?

If IN GAME AA is not a standard feature, and Nvidia paid/assisted in getting it in there, then why shouldn't they be able to request a vendor ID check so that only their customers can access it?

ATI users are still able to driver-force it, correct? It's only the in-game AA they are locked out of?
 

WelshBloke

Lifer
Jan 12, 2005
33,488
11,632
136
Originally posted by: Wreckage
Originally posted by: WelshBloke
I do however care about PC gaming.

Exactly, that's why I pick a card from a company that actually works with game developers. Instead of doing nothing and crying about it.

AMD is holding back game development by not working with game developers, not implementing physics, delaying games and all their other shenanigans.

So you'd be happy if ATI did exactly the same thing and half of all video games ran well on NV hardware and half ran well on ATI?
 

WelshBloke

Lifer
Jan 12, 2005
33,488
11,632
136
Originally posted by: OCguy
Originally posted by: Qbah
Originally posted by: Keysplayr
Look at these two statements and tell me what you think is good policy for big business.
"We pay you to have this feature only for us."
"We pay you to have this feature for us, and our competition."

If you were paying for it, which would you pick?

I'm selecting a third statement:

We pay you to block ATI cards from using a standard feature.

This particular AA is not something only nVidia cards can run. It's not PhysX or another CUDA-enabled thing. It's standard MSAA, which ATI cards run just fine (if not better), and this has been proven by tricking the game into thinking you have an nVidia card. It's not magic, it's a simple block.

You pay a developer and help him with his application; you get your logo at start-up and some praise in the press release. It has always been like that. But you do not artificially block your competition from running something they're perfectly capable of running (to make your cards seem better than they are). This is totally unacceptable. Not to mention AA is not an nVidia invention, nor is it owned by them, and yet the game is actively blocking it from running on ATI cards.

This thing is so thick and foul, I can't possibly understand how one cannot see it... I mean, the game already uses PhysX. Great, nVidia helped with that, and it's a technology that only nVidia cards can run. I have no gripe with that, but come on, anti-aliasing? It's just sad and pathetic.


Apparently you didn't read the update on this. What you described is not what happened.

Yeah, apparently you can hack the PhysX to get it to run as well, even without an NV card... :shocked:

 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
Originally posted by: Idontcare
Originally posted by: GaiaHunter
IDC: Is nVidia paying to add AA to the game or are they paying to not allow AA on AMD hardware?

And even if it is the first case, the line between both situations is a slim one.

I'm sorry but I just don't see what the problem is if it is the former.

So can I infer you find the second situation a problem?

It will be next to impossible to prove it was the second situation rather than the first, but:

If I'm not mistaken, the Xbox version of this game does support some AA (2xAA, it seems), and if I'm not mistaken again, the Xbox 360 graphics are done by ATI hardware.

I don't know anything about porting games from a console to a PC, or how different coding AA for the Xbox is from coding it for a PC in the case of a port. But considering that the Xbox is from MS, and they are fiercely interested in protecting DX, I'm not sure the differences are so substantial that the devs would have to spend incredible amounts of resources, given the Xbox version already has some AA. (If the Xbox version doesn't have AA, or if the difference between coding AA for the Xbox 360 and for the PC is like night and day, then my point is moot.)

Actually, it is interesting that the only way the devs have to stop ATI hardware from running the AA is by blocking the vendor...

And I'll have the same opinion if any company tries to do this.

I enjoy having a system where I can add hardware from whatever manufacturer I want, and it will still work with any hardware-agnostic API/piece of software/etc.

I don't really care about this game, and if I did, I currently have both AMD and nVidia hardware, so I could run it. But I might not in the future, and I don't want a war to break out where the next time I go shopping I need a checklist to see if that software will work with all my different bits of hardware.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Fattysharp
There is too much speculation about what actually happened here. ATI is saying one thing, nVidia another, and the game dev has no comment yet. We have too many "what ifs", but it certainly makes all the fanatics come out of the woodwork.

IF nVidia told the game devs they could only use AA with their cards to get funding for the game, then that is a valid choice and decision the game devs made. Is it really a good choice to limit the experience of your audience based on brand? Probably not. These sorts of fiascos have a way of following the companies involved onto future titles.

If the devs are unwilling to work with ATI for whatever reason, be it nVidia's involvement or not, that is also the devs' choice. Again, probably not a very good choice.

The Unreal Engine does not natively support AA, and there are other games where AA does not work with ATI cards without tricks. The first one I can think of is Fallout 3. ATI cards cannot force AA in this title, and will not use AA from the game menu without renaming the exe.

We do have good reasons to suspect nVidia was holding back DX10 and DX10.1 titles (Assassin's Creed, anyone?), so it is reasonable to suspect they are involved in limiting AA here as well. However, it is not fact yet, so argue away!

Are you kidding me? I have been playing Fallout 3 with in-game 4xAA just fine on my Radeon 4890. No tricks needed whatsoever. Moreover, the devs already get funded by the publisher to create the game; do they need NV's money so badly that they'd lock out features from non-NV cards just to get it? Either way, it's a shady practice from either NV's side or the developers'... or both.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
So AMD and NVIDIA both go to the car wash. NVIDIA pays extra to get a wax. AMD does not and then complains that their car is not as shiny. Now AMD could have paid for a wax, they could also wax the car themselves. Instead they think they should get a wax for free and not have to do any work.

Good luck with that, AMD.

Originally posted by: WelshBloke

So you'd be happy if ATI did exactly the same thing and half of all video games ran well on NV hardware and half ran well on ATI?

Does the game not run well on ATI? That's not the impression I got from the Anand review.

Originally posted by: munky


Moreover, the devs already get funded by the publisher to create the game; do they need NV's money so badly that they'd lock out features from non-NV cards just to get it? Either way, it's a shady practice from either NV's side or the developers'... or both.

If not for NVIDIA the feature would not even be there in the first place. So you are basically trying to reward AMD for doing nothing. It's like welfare or something.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
Originally posted by: Wreckage

If not for NVIDIA the feature would not even be there in the first place. So you are basically trying to reward AMD for doing nothing. It's like welfare or something.

That is what they say now...

 

CP5670

Diamond Member
Jun 24, 2004
5,692
796
126
Originally posted by: Tempered81
Originally posted by: CP5670
UE3 games have worked with driver-forced MSAA on both Nvidia and AMD cards for a long time now, on both DX9 and DX10. Does that not work in this game by default?
That's the catch: no NV vendor ID, no MSAA.

So does that mean no MSAA even with driver forcing? I don't care whether the game natively supports it or not, but there is no reason for it not to work if the driver forces it on. This works in all other UE3 games regardless of the DX version. In that case it's obviously an intentional lockout, and Eidos is as much at fault as Nvidia for it.

In fact, I have only ever come across 3 or 4 games (all of which are several years old now) that don't work with driver-forced AA. 99% of games work with it.
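
For reference, on the game's side native MSAA support is just a caps query, which is exactly why driver-forced AA works almost everywhere: the driver overrides the render target settings regardless of what the game asked for. A rough D3D9 sketch (illustrative only, the helper name is mine):

// Illustrative sketch: the standard D3D9 caps query a game would make
// before offering 4x MSAA in its menu. Driver-forced AA sidesteps this
// entirely; the driver overrides the render-target settings no matter
// what the game requested.
#include <d3d9.h>

bool DeviceOffers4xMSAA(IDirect3D9* d3d, D3DFORMAT backBufferFmt) {
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, backBufferFmt,
        FALSE /* FALSE = fullscreen */, D3DMULTISAMPLE_4_SAMPLES,
        &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}

So if even the driver-forced path is blocked, that points at something beyond the engine simply lacking the feature.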
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Wreckage
Originally posted by: munky


Moreover, the devs already get funded by the publisher to create the game; do they need NV's money so badly that they'd lock out features from non-NV cards just to get it? Either way, it's a shady practice from either NV's side or the developers'... or both.

If not for NVIDIA the feature would not even be there in the first place. So you are basically trying to reward AMD for doing nothing. It's like welfare or something.

I disagree. In-game AA is a pretty standard feature, and has been for the last 5 years or more. If it weren't for Nvidia, the game would likely have AA working on all modern video cards.
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: munky


Moreover, the devs already get funded by the publisher to create the game; do they need NV's money so badly that they'd lock out features from non-NV cards just to get it? Either way, it's a shady practice from either NV's side or the developers'... or both.

If not for NVIDIA the feature would not even be there in the first place. So you are basically trying to reward AMD for doing nothing. It's like welfare or something.

I disagree. In-game AA is a pretty standard feature, and has been for the last 5 years or more. If it weren't for Nvidia, the game would likely have AA working on all modern video cards.

Is it a common feature in UE3 games, though?
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: munky

I disagree. In-game AA is a pretty standard feature, and has been for the last 5 years or more. If it weren't for Nvidia, the game would likely have AA working on all modern video cards.

Not with that engine.

Originally posted by: CP5670
So does that mean no MSAA even with driver forcing? I don't care whether the game natively supports it or not, but there is no reason for it not to work if the driver forces it on. This works in all other UE3 games regardless of the DX version. In that case it's obviously an intentional lockout, and Eidos is as much at fault as Nvidia for it.

In fact, I have only ever come across 3 or 4 games (all of which are several years old now) that don't work with driver-forced AA. 99% of games work with it.

Yes, it works with driver forcing, just not as well.


Edit: Holy hell the timewarps are massive today
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
Originally posted by: Wreckage
Originally posted by: thilanliyan
Originally posted by: Wreckage
delaying games and all their other shenanigans.

This is exactly what nVidia did with Arkham Asylum (to add those "special" effects).

Batman was delayed?

The console version shipped first, and a while back it was decided that the PC version would come later to introduce PhysX and, from what we now know, the in-game AA as well.

EDIT: Benskywalker beat me to it.

Originally posted by: Wreckage
AMD is holding back game development by not working with game developers, not implementing physics, delaying games and all their other shenanigans.

So are you now going to say nVidia is also holding back game development?