How to enable Nvidia PhysX on ATI cards in Batman: AA


Schmide

Diamond Member
Mar 7, 2002
Originally posted by: Wreckage
Originally posted by: Schmide

Or working with open standards like DX11, DirectCompute, OpenCL, etc., that benefit all instead of a few kickback receivers.

NVIDIA was the first to offer an OpenCL driver. In fact I don't even know if AMD offers one to the public yet. Nice try though.

EDIT: Also DX11 is not an open standard. It is closed source that only runs on Microsoft operating systems.

Originally posted by: Schmide

On a Mac. :shocked: Nice try though.

DX11 works on a Mac?

Seriously change it back. Your edit makes me say something I didn't!!!!!!!!!!!!!!
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: Keysplayr
AFAIK, only Nvidia spent resources to get AA working on its hardware when it wasn't even present in the engine in the first place.

AA is present in the engine...there are other UE3 games that have AA. A game like Mass Effect (UE3) had to have AA forced through the driver on both nV and ATI...so it's definitely present in the engine.
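For what it's worth, stock UE3 exposes a multisample cap through the [SystemSettings] section of its engine ini, which is part of why people say the capability is already in the engine. Purely as an illustration (the exact ini file name and whether a given title actually honors the setting vary per game, so treat this as a sketch, not a Batman-specific fix):

; Illustrative UE3-style engine ini tweak; the file is usually something like
; <Game>Engine.ini, and deferred rendering effects can still break MSAA even
; when the engine accepts the value.
[SystemSettings]
MaxMultiSamples=4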
 

golem

Senior member
Oct 6, 2000
Originally posted by: thilanliyan
Originally posted by: Keysplayr
AFAIK, only Nvidia spent resources to get AA working on its hardware when it wasn't even present in the engine in the first place.

AA is present in the engine...there are other UE3 games that have AA.

Which other games besides Gears of War under DX10 (in game AA)?
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: golem
Originally posted by: thilanliyan
Originally posted by: Keysplayr
AFAIK, only Nvidia spent resources to get AA working on its hardware when it wasn't even present in the engine in the first place.

AA is present in the engine...there are other UE3 games that have AA.

Which other games besides Gears of War under DX10 (in game AA)?

Mass Effect for one, as I mentioned in my updated post. I just read a thread saying it's possible to enable it in Section 8 as well.
 

Udgnim

Diamond Member
Apr 16, 2008
Mirror's Edge has AA support for DX9

Dead Space has an AA option but I wouldn't really consider it AA
 

golem

Senior member
Oct 6, 2000
Originally posted by: thilanliyan
Originally posted by: golem
Originally posted by: thilanliyan
Originally posted by: Keysplayr
AFAIK, only Nvidia spent resources to get AA working on its hardware when it wasn't even present in the engine in the first place.

AA is present in the engine...there are other UE3 games that have AA.

Which other games besides Gears of War under DX10 (in game AA)?

Mass Effect for one, as I mentioned in my updated post. I just read a thread saying it's possible to enable it in Section 8 as well.

Crap, I worded that wrong.

I meant:

Are there any UE3 games that have in game AA besides Gears of War (which has it but only with DX10 and not DX9)?

I think in Mass Effect you have to force AA through the driver?
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: golem
Fair enough, I haven't tried myself to see if Arkham supports AA in DX9, it's just something I read somewhere.

But if Arkham does support DX9 in-game AA, that does tend to support the argument of no AA w/o Nvidia's help.

DX10 in-game AA is a different matter... but if no other game besides one published by the game engine maker has it, that doesn't totally support or disprove the no-AA-without-Nvidia argument.

I'm wondering, though, whether Arkham even supports DX10? Seems like in this day and age any half-decent new game should have the option to run in DX10, along with whatever advantages that rendering mode has.
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: SlowSpyder
Originally posted by: Wreckage

If not for NVIDIA the feature would not even be there in the first place. So you are basically trying to reward AMD for doing nothing. It's like welfare or something.

I wonder how much Nvidia benefited from AMD working with TSMC to iron out the bugs on the 40nm process...

Usually the early adopter benefits with better wafer pricing. Particularly the guy in the pole-position on a new node as their first chip is the so-called "qual-device" for the node. The cost-savings for that player can be substantial. (TI was the qual-device customer for TSMC at 130nm, 90nm, and 65nm...intentionally absent at 45nm)
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: golem
Crap, I worded that wrong.

I meant:

Are there any UE3 games that have in game AA besides Gears of War (which has it but only with DX10 and not DX9)?

I think in Mass Effect you have to force AA through the driver?

That I'm not sure about. Ever since we got to the point where AA was hit or miss depending on game/engine/etc, I stopped bothering with AA and only really use it if it's ingame.
 

SlowSpyder

Lifer
Jan 12, 2005
Originally posted by: Idontcare
Originally posted by: SlowSpyder
Originally posted by: Wreckage

If not for NVIDIA the feature would not even be there in the first place. So you are basically trying to reward AMD for doing nothing. It's like welfare or something.

I wonder how much Nvidia benefited from AMD working with TSMC to iron out the bugs on the 40nm process...

Usually the early adopter benefits with better wafer pricing. Particularly the guy in the pole-position on a new node as their first chip is the so-called "qual-device" for the node. The cost-savings for that player can be substantial. (TI was the qual-device customer for TSMC at 130nm, 90nm, and 65nm...intentionally absent at 45nm)

I was just trying to make the point that it's not like only Nvidia drives this industry, and AMD just borrows their advances in technology. Obviously both companies push forward and borrow from each other. I think the point was valid in the context of what Wreckage said... hardly throwing shit at the wall and seeing if it sticks.

AMD creating a 40nm GPU months ago, when 40nm seemed to still be a work in progress, benefits both companies today.
 

golem

Senior member
Oct 6, 2000
Originally posted by: SlowSpyder
Originally posted by: Idontcare
Originally posted by: SlowSpyder
Originally posted by: Wreckage

If not for NVIDIA the feature would not even be there in the first place. So you are basically trying to reward AMD for doing nothing. It's like welfare or something.

I wonder how much Nvidia benefited from AMD working with TSMC to iron out the bugs on the 40nm process...

Usually the early adopter benefits with better wafer pricing. Particularly the guy in the pole-position on a new node as their first chip is the so-called "qual-device" for the node. The cost-savings for that player can be substantial. (TI was the qual-device customer for TSMC at 130nm, 90nm, and 65nm...intentionally absent at 45nm)

I was just trying to make the point that it's not like only Nvidia drives this industry, and AMD just borrows their advances in technology. Obviously both companies push forward and borrow from each other. I think the point was valid in the context of what Wreckage said... hardly throwing shit at the wall and seeing if it sticks.

AMD creating a 40nm GPU months ago, when 40nm seemed to still be a work in progress, benefits both companies today.

What you're saying is true. AMD creating the 40nm GPU months ago helps both AMD and Nvidia, but AMD got the extra benefit of lower pricing because they were the early adopter.

IF Nvidia did help make in-game AA possible with the UE3 engine under DX9, shouldn't they have the benefit of restricting it to their cards for that game? I imagine since it's been done once, future UE3 games will have in-game DX9 AA for both ATI and Nvidia. But for THIS game, shouldn't Nvidia be allowed to restrict the benefits from their investment to their cards?
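For anyone curious how a per-vendor lock like that is typically built: it usually amounts to reading the adapter's vendor ID at startup and gating the menu option on it. Here's a minimal sketch assuming a plain D3D9 query and a made-up AllowInGameMSAA helper; it's not taken from the game's code, just an illustration of the mechanism.

// Minimal sketch (not the game's actual code): gate an in-game MSAA toggle on
// the GPU vendor ID reported by Direct3D 9. Vendor IDs: 0x10DE = NVIDIA,
// 0x1002 = ATI/AMD. Build against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

// Hypothetical helper; returns true only on NVIDIA hardware.
bool AllowInGameMSAA()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return false;

    D3DADAPTER_IDENTIFIER9 ident = {};
    HRESULT hr = d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &ident);
    d3d->Release();

    // Whitelist NVIDIA; every other vendor gets the AA option greyed out.
    return SUCCEEDED(hr) && ident.VendorId == 0x10DE;
}

int main()
{
    std::printf("In-game MSAA option: %s\n",
                AllowInGameMSAA() ? "enabled" : "hidden");
    return 0;
}

Whether a gate like that is fair when one vendor funded the work is exactly what's being argued here.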

 

BenSkywalker

Diamond Member
Oct 9, 1999
Maybe, but why can't Eidos devs do in game AA if Epic can? Are you implying that UE3 is so complex to work with, that only Epic, the guys who made the engine, would know how to do AA with it?

What buffer effects/writes are they doing with the game? UE is rather flexible; it may be an issue of certain effects causing problems, and that may even only be the case with certain drivers. Do I think Epic could get AA to work where others would fail running UE? I would hope the hell so. Do I think nVidia could get AA to work on their hardware running UE even better than Epic? Again, I would hope the hell so :)
 

Arglebargle

Senior member
Dec 2, 2006
Did Nvidia really get enough sales/increased rep out of this move, versus the bad rep/increased sales for ATI? Hard to say. The combination of poor pricing and bad rep tipped the scales when I voted with my money. That vote did not go Nvidia.

Funny thing though: Not interested in Arkham Asylum at all, will probably never play it.
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: SlowSpyder
Originally posted by: Idontcare
Originally posted by: SlowSpyder
Originally posted by: Wreckage

If not for NVIDIA the feature would not even be there in the first place. So you are basically trying to reward AMD for doing nothing. It's like welfare or something.

I wonder how much Nvidia benefited from AMD working with TSMC to iron out the bugs on the 40nm process...

Usually the early adopter benefits with better wafer pricing. Particularly the guy in the pole-position on a new node as their first chip is the so-called "qual-device" for the node. The cost-savings for that player can be substantial. (TI was the qual-device customer for TSMC at 130nm, 90nm, and 65nm...intentionally absent at 45nm)

I was just trying to make the point that it's not like only Nvidia drives this industry, and AMD just borrows their advances in technology. Obviously both companies push forward and borrow from each other. I think the point was valid in the context of what Wreckage said... hardly throwing shit at the wall and seeing if it sticks.

AMD creating a 40nm GPU months ago, when 40nm seemed to still be a work in progress, benefits both companies today.

Yeah, I'm not really paying all that much attention to whatever else is going on here (re: the Wreckage comment, etc.). I just saw your post, and since it hits home I thought I'd respond with some background.

FWIW, I suspect the value to the rest of the foundry-user club of NV unwittingly uncovering TSMC's chip-packaging weakness, as well as pushing manufacturing and packaging to deal with 500mm^2+ die sizes, can't really be overstated either.

Everyone pays into that community social program; that is the underlying premise of a foundry business model. The customer ultimately is the one paying for whatever you are doing.

That said, AMD paid dearly for the privilege in this one instance, a situation that wouldn't have bitten them in any prior technology node. It was the unlucky timing of a personnel shift at TSMC and the resource shift at TI. Without TI's resources (and load commitments) to drive the leading-edge qual device for 45nm (unlike the prior three nodes), combined with being without Morris at the helm and their lead process-development guy going on hiatus, the perfect storm was set at TSMC for AMD to unfortunately fall into regarding the leakage/yield issue (and the respins that its resolution required), which the rest of the 40nm customers absolutely have AMD to thank for.

(for people wondering what happened to TI and TSMC at 45nm...TI changed strategy at 45nm and decided to liquidate their internal node development team as well as lock themselves down into solely relying on UMC for their access to 45nm technology)
 

akugami

Diamond Member
Feb 14, 2005
Originally posted by: Wreckage
Originally posted by: WelshBloke
I do however care about PC gaming.

Exactly, that's why I pick a card from a company that actually works with game developers. Instead of doing nothing and crying about it.

AMD is holding back game development by not working with game developers, not implementing physics, delaying games and all their other shenanigans.

Wrong on all counts.

ATI is not holding back game development any more than nVidia is. It's business. ATI is not in the business of developing games. Neither is nVidia. Both are in the business of selling GPUs and video cards. If they feel a game will be popular, they may want to help promote it by providing funding; ATI and nVidia get a little marketing in exchange via the splash screens. Ultimately it is up to the developers to obtain the proper funding and talent to develop a game or game engine. If that means they are getting some of it from either ATI or nVidia, then by all means do so. But don't claim that ATI is holding back game development because they don't provide funding to developers. For those that don't know, ATI does have a developer funding program similar to nVidia's. It's under the "Get in the Game" moniker.

To claim ATI is holding back game development because they do not provide funding to developers is just plain ludicrous. It's almost like claiming that it's the fault of John that Timmy does not drive because John is not buying Timmy a car. If Jimmy wants to drive, he better find a way to get a car.

Both ATI and nVidia have to provide support and make sure that games and game engines work with their hardware and hardware drivers. If there are developers with issues, ATI and nVidia will provide tech support. This is, relatively speaking, a very basic level of support. Generally this support does not include funding.

The nVidia program, The Way It's Meant to be Played, which provided "support" is basically a marketing campaign by nVidia. The "support" is essentially money paid to a developer as well as extra tech support with part of the agreement being to put up an nVidia splash screen in the final game.

From a business standpoint and as a business oriented person, this is to be applauded. They are trying to persuade consumers into buying their cards due to the added features, which in this case is a little more eye candy. From a consumer standpoint and as a gamer, this sucks to high heaven.

I hope you're not serious about ATI not implementing physics. I think you meant to say ATI is not implementing PhysX, a proprietary standard from nVidia. I don't feel like going back into why it's a sound business decision for ATI not to implement PhysX. Just use the search function. Also, when using the search function, look up all the threads where you yourself argue against Havok and how ATI is stupid for implementing that instead of PhysX.

How in the heck is ATI delaying games? Please explain. This is on the same level of funny as the ATI not implementing physics claim.

We won't even get into detail on some of the many shenanigans that have been pulled by nVidia over the years, some of which involve Doom 3 and Assassin's Creed as well as cheating on benchmarks. Not that ATI is a clean company by any means.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: akugami
It's almost like claiming that it's the fault of John that Timmy does not drive because John is not buying Timmy a car. If Jimmy wants to drive, he better find a way to get a car.
You mean Timmy? :p

How in the heck is ATI delaying games? Please explain. This is on the same level of funny as the ATI not implementing physics claim.
ATI didn't delay the game (the developer is the one delaying it) but Dirt2 is delayed until November while the console version came out in September. Same thing happened with Arkham Asylum and nVidia.
 

Wreckage

Banned
Jul 1, 2005
Originally posted by: akugami

The nVidia program, The Way It's Meant to be Played, which provided "support" is basically a marketing campaign by nVidia. The "support" is essentially money paid to a developer as well as extra tech support with part of the agreement being to put up an nVidia splash screen in the final game.
http://alt.3dcenter.org/artikel/2004/10-08_english.php

Not one dollar will change hands between either side in an agreement on a title for that title to be included in the "The Way Its Meant To Be Played" program.


I hope you're not serious about ATI not implementing physics. I think you meant to say ATI is not implementing PhysX, a proprietary standard from nVidia.
Feel free to link games using GPU physics for ATI.
How in the heck is ATI delaying games? Please explain. This is on the same level of funny as the ATI not implementing physics claim.
http://forums.steampowered.com...howthread.php?t=913278
AMD delayed Dirt 2 to add DX11 as a marketing feature. And not just a delay of a few weeks. You can play it now on the consoles but not until the end of the year on the PC. It's been rumored that AMD paid them over $1,000,000. So if you want to accuse a company of payola there you go.

Originally posted by: thilanliyan
but Dirt2 is delayed until November while the console version came out in September.
Amazon says December 31st.
http://www.amazon.com/Dirt-2-P...&qid=1254369627&sr=8-4

BTW what's with the name change (nothing to do with that video I hope)?
 

MrK6

Diamond Member
Aug 9, 2004
Originally posted by: BenSkywalker
If NVIDIA had any kind of weight, their PhysX games wouldn't be mostly from crappy third-tier developers, but from mainstream developers. I think Eidos is the only known developer they've landed yet, and that's only because Eidos probably needed the cash considering they haven't released a decent game since Hitman: Blood Money. The only reason developers bother coding for PhysX is because of monetary compensation from NVIDIA.

Eidos is a publishing label owned by Square Enix; do you know anything at all about the gaming market? Eidos is not a development studio. In terms of nV using money to get PhysX in games, what do you think gives you power in this market? Only two possible choices: marketshare or money. In terms of marketshare nV has ATi bested 2:1; in terms of money it isn't remotely that close. I also made it rather clear that I was talking about nVidia paying to get all of these titles ported. In the console market this type of practice is rather common, offering monetary compensation to land exclusive content or just plain exclusivity. nVidia has the cash on hand to fund this, and they are doing so now on a limited basis, which I expect they will continue. Why do you think that won't be the case?

No developer is going to waste time and resources coding for a proprietary solution with little market penetration. NVIDIA's pockets aren't nearly as deep as you think.

There are 100 million PhysX-capable cards in consumer machines. That is almost double the PS3 and 360 combined. Little market penetration? Have you even glanced at what types of numbers you are talking about? As far as how deep nVidia's pockets are... oops, you are wrong again. nV has $1.46 billion in cash and equivalents; their pockets are actually a little deeper than I implied in my post (click on the balance sheet if you are having problems).

The fact that you bring up "ATI fans" and "NV fans" shows your over-focus on a fanboy war and not the actual situation presented.

Like Eidos being a development studio, or nVidia not having over $1 billion they could play with if they wanted to? Or the fact that nV has a staggering installed base of PhysX parts? Or that they have a 2:1 edge in marketshare? My apologies, I'm quite familiar with every one of these elements. The only people that are going to whine about it are the fans.

As a business choice, as I have very clearly explained, this makes sense for the developers to accept in no uncertain terms. The only people that stand to lose in this situation are nVidia themselves, if their title selection fails to promote their parts, or perhaps those devoted to ATi who want to convince themselves that they are being cheated.

But hey, if you think NVIDIA is going to save PC gaming, who am I to ruin your dreams? Real life will do that quickly enough as it is. Some of the posts on this forum are comedy gold.

I never so much as implied that nV was going to save anything at all. What I stated is that they are helping to make sure that we see at least some enhanced ports making it onto the PC from the consoles. This is a point of fact. You can try to consider it comedy; much as I have a hard time not laughing at your lack of knowledge on this topic altogether, I would suggest that in the future you look a few things up before trying to debate them when it comes to the marketplace :)
How is that relevant to my point that Eidos hasn't put out a decent game since Blood Money? I'm viewing a game as a product influenced by several groups. The point is Eidos is hungry for a good title to tack on to its list, and that probably drove some of this. Maybe I'm oversimplifying the situation, but if NVIDIA comes strolling in offering some big cash, you don't think there's going to be pressure on the studio?

Also, I don't think marketshare means much in this case. Think about it - NVIDIA has twice as many GPUs out there as ATI, but how many can A) even play games and B) play games with PhysX (not just have the capability, but actually be able to produce playable framerates as well). NVIDIA has 100 million "PhysX capable" GPUs out there. So that's what, every card since the 8 series? Now that number isn't nearly as high as it once was.

The simple fact of the matter is, you can throw out revenue numbers, "units sold" and other market numbers, and it means little because we have yet to see any real substance come out of PhysX. If you want to formulate a theory, go ahead; I'm reserving my judgment. As I said, you have this arcadian image of NVIDIA marching in and saving PC gaming; that's exactly what your post sounds like. Console games aren't all of a sudden going to stop being ported because NVIDIA doesn't come in and give them cash. Give me a break.

Right now you have Batman: AA, which was ported through funding by NVIDIA, pimped with their proprietary technology, and broken on other hardware. Did NVIDIA fix a shitty port for their cards only, or rush what could have been a decent port to try to hoodwink people into getting their technology? Either way, it doesn't matter. Proprietary technology stagnates the market and always will.

Oh, and your posts are comedy gold sometimes. Batman: AA PC GOTY? lmfao
 

zebrax2

Senior member
Nov 18, 2007
Originally posted by: Wreckage
Originally posted by: akugami

How in the heck is ATI delaying games? Please explain. This is on the same level of funny as the ATI not implementing physics claim.
http://forums.steampowered.com...howthread.php?t=913278
AMD delayed Dirt 2 to add DX11 as a marketing feature. And not just a delay of a few weeks. You can play it now on the consoles but not until the end of the year on the PC. It's been rumored that AMD paid them over $1,000,000. So if you want to accuse a company of payola there you go.

Nvidia did it with Batman, so why not with Dirt 2? Plus, they are pushing a technology that will be adopted by Nvidia as well, which means it benefits all PC gamers.

I would like to see a link discussing that $1m rumor.

edit:
fixed my quotations
 

Pantalaimon

Senior member
Feb 6, 2006
AMD delayed Dirt 2 to add DX11 as a marketing feature. And not just a delay of a few weeks. You can play it now on the consoles but not until the end of the year on the PC.

It's been delayed to add DX11, which will also benefit owners of the new Nvidia cards. DX11 is not restricted to ATI cards only; the game doesn't lock out DX11 features when a DX11 Nvidia card is used. So it's nowhere near Nvidia's tactic.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: Wreckage
Originally posted by: thilanliyan
but Dirt2 is delayed until November while the console version came out in September.
Amazon says December 31st.
http://www.amazon.com/Dirt-2-P...&qid=1254369627&sr=8-4

BTW what's with the name change (nothing to do with that video I hope)?

I was going by this:
http://pc.ign.com/objects/143/14300762.html
and
http://pc.gamespy.com/pc/dirt-2/

About the name, I wanted to change it before anyway (get part of my last name in there), and now is as good a time as any. I would have made an unrecognizable nick if it was about that. :)
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: adriantrances
Might not be GOTY worthy, but it's still a good game

I'd definitely put it in the top 10 as far as games go. I really enjoy it. Great gameplay, storyline, graphics. At any rate, it's a AAA title.