
How to enable Nvidia PhysX on ATI cards in Batman:AA

Originally posted by: Wreckage

In fact AA does run on Batman by using CCC, it's just slower. So what really have they lost out on?

yes, correct. But the method that NV supposedly "paid to have all the groundwork done on" is like scene-selective multisampling. This is not what you get from the CCC; on ATI, from the CCC you get FSAA (old & slow). If you change the device ID to Nvidia, you can force MSAA in CCC and it's 4x faster than FSAA. That's the catch: NV only enabled the more efficient AA modes for its own hardware. The hackers found it was totally capable on ATI hardware because it's just MSAA being protected by a vendor flag. I need BFG10K to say something about these AA modes, because afaik he knows more about them than anyone here.
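To make the vendor-flag point concrete, here's a minimal, purely illustrative Python sketch of how an engine can gate its in-engine MSAA path on the GPU's PCI vendor ID. The vendor IDs are the real PCI IDs; the function and mode names are hypothetical and this is not actual Batman:AA code:

```python
# Illustrative sketch only: gating an in-engine AA path on PCI vendor ID.
# 0x10DE (NVIDIA) and 0x1002 (ATI/AMD) are real PCI vendor IDs;
# everything else here is a hypothetical stand-in for the game's logic.
NVIDIA = 0x10DE
ATI = 0x1002

def available_aa_modes(vendor_id: int) -> list:
    """Return the AA modes the engine exposes for a given GPU vendor."""
    if vendor_id == NVIDIA:
        # In-engine, scene-selective MSAA: multisample only where needed.
        return ["2xMSAA", "4xMSAA", "8xMSAA"]
    # Everyone else is left to driver-forced supersampling (FSAA),
    # which shades every sample and is far slower.
    return []

# Spoofing the device ID makes an ATI card report as NVIDIA, which is
# why the hack re-enables the faster path on otherwise capable hardware:
assert available_aa_modes(ATI) == []
assert available_aa_modes(NVIDIA) == ["2xMSAA", "4xMSAA", "8xMSAA"]
```

The hardware itself supports MSAA either way; only the branch on `vendor_id` decides who gets the fast path.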
 
Originally posted by: Wreckage
Originally posted by: BFG10K
This is clearly the developer at fault then because we know the game's AA works fine on ATi cards when it's tricked through the device ID.

No it's clearly and 100% AMDs fault.

Why should AMD get a free lunch? So while NVIDIA did all the ground work to help AA get going with the developer, AMD should just get to sit at home and wait for a check to arrive.

Terrible logic.

Already been covered.

"Why does ATI have to pay to have AA support? Just because Nvidia did? Since when did card makers have to pay to have AA? That seems more like a cover than anything else. ATI and Nvidia don't pay for AA support in pretty much any other game, it is a default feature that almost all developers implement and support for both companies. It seems to me that Nvidia "paid" to have AA just so they could say ATI didn't pay when they didn't get AA. In other words, they paid for ATI to not have AA because if they didn't pay then both would have had AA just like EVERY other game out there."
 
Originally posted by: dguy6789


"Why does ATI have to pay to have AA support? Just because Nvidia did? Since when did card makers have to pay to have AA? That seems more like a cover than anything else. ATI and Nvidia don't pay for AA support in pretty much any other game, it is a default feature that almost all developers implement and support for both companies. It seems to me that Nvidia "paid" to have AA just so they could say ATI didn't pay when they didn't get AA. In other words, they paid for ATI to not have AA because if they didn't pay then both would have had AA just like EVERY other game out there."

ATI has AA support that you can enable through CCC :confused:
 
Originally posted by: Tempered81

yes, correct. But the method that NV supposedly "paid to have all the groundwork done on" is like scene selective multisampling. This is not what you get from the CCC, on Ati from the CCC you get FSAA (old & slow).

Well then AMD should have worked with the developer. Instead of crying about it later.

 
The developer should have respected the DirectX caps flags and designed their game appropriately instead of adding blue crystals for greenbacks.

Edit: quote snip.
 
Worked with developer = paid an advertising fee to have a splash screen on boot. And some vendor specific feature lock-out.

Think about what you're saying for a moment. Do you want to be able to pick up a game title and be assured it'll run properly on your hardware or have to do research to see which hardware vendor paid to have the game crippled on the opposing hardware? Do you want to have both an "nvidia" and "ati" computer to be assured of running all interesting games, or do you want console-like exclusives locked to one manufacturer or another?
 
Originally posted by: thilan29
Both of these claims are NOT true. Batman is based on Unreal Engine 3, which does not natively support anti-aliasing.

You'd think that from all the money Epic gets for this engine they would add native support for AA.


 
IDC: Is nVidia paying to add AA to the game or are they paying to not allow AA on AMD hardware?

And even if it is the first case, the line between both situations is a slim one.
 
Originally posted by: Wreckage
Originally posted by: Tempered81

yes, correct. But the method that NV supposedly "paid to have all the groundwork done on" is like scene selective multisampling. This is not what you get from the CCC, on Ati from the CCC you get FSAA (old & slow).

Well then AMD should have worked with the developer. Instead of crying about it later.

Why?
 
Quote from Tim Sweeney (of Epic, makers of the engine in question), talking about UE3:

The most visible DirectX 10-exclusive feature is support for MSAA on high-end video cards. Once you max out the resolution your monitor supports natively, antialiasing becomes the key to achieving higher quality visuals.

The engine natively supports MSAA, at least with DX10. No doubt the support has to be turned on, which is the "extra work" referred to. Once turned on, it has to be actively TURNED OFF for a specific vendor in order not to work.
 
Wow did half of you even read what the nvidia rep said? They said they had no part in "locking" ati out, but UE3 doesn't support AA by default, they put the extra cash in to get AA into the game, and they in no way prevent developers they have worked with from working with ati. So it sounds to me that you guys are all butthurt because ati chose not to go "help" some game developer. Get over it.
 
Originally posted by: GaiaHunter
IDC: Is nVidia paying to add AA to the game or are they paying to not allow AA on AMD hardware?

And even if it is the first case, the line between both situations is a slim one.

I'm sorry but I just don't see what the problem is if it is the former.

AMD pays Intel so AMD can add certain ISA's to their CPUs, and they do this without even thinking once about broadening the license to include VIA. VIA is on their own when it comes to paying/negotiating their access to the market.

So what if NV pays a developer to go the extra mile and incorporate a featureset into the game on NV's dime and in support of NV's hardware? If the developer wasn't going to incorporate the feature otherwise, and the only reason it was implemented was because NV sponsored it, I just don't see what the deal is here.

I would think the first stop on the ethics train in that scenario is with the game developer themselves, for whoring themselves out to the highest bidder in exchange for "locking down" their game's AA support. Who controls the game? The developer. They are the monopoly in this context and they are deciding who gets to have an exclusive in their game, for a price.

If you really have a bone to pick, boycott the game and the developer house's other games.

We aren't about to punish Dell or HP or any of the other Intel customers who knowingly took rebates from Intel to the harm of the consumer, those poor resellers were just trying to keep an edge over their competitor. NV did no different, here is a developer who wanted to exploit the hardware guys, and presumably was willing to take the highest bidder if we really believe sinister motive were afoot, so NV did no worse than DELL and offered themselves up for exploitation in hopes that at least that was the lesser of the evils.

See, if we put our collective intelligence to work on it we can paint just about anyone or anything to be the villain or the victim. Is the world a better place yet? Or are we just needlessly making hot air again?
 
Let's all go back to the days before caps bits and standards. NVIDIA, ATI and other card vendors worked with Microsoft to get these standards into DirectX. Same goes for OpenGL. It is disingenuous at best to believe that both cards do not support the full DX10 rendering pipeline.

If nVidia did something outside the realm of DX then it may be a little less damning, but the fact that it can be hacked by an ID change affirms the damage.
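The caps-bits approach described above can be sketched simply: the engine asks the device which multisample levels it supports (as Direct3D's capability queries such as `CheckDeviceMultiSampleType` do) and negotiates from that, never branching on vendor identity. The helper below is a hypothetical illustration, not code from any real engine:

```python
# Hedged sketch of capability-based AA selection: choose a sample count
# from what the device reports it supports, with no vendor check anywhere.
def pick_msaa(supported_sample_counts: set, requested: int) -> int:
    """Return the highest supported sample count not exceeding the request.

    supported_sample_counts would come from a caps query (e.g. Direct3D's
    CheckDeviceMultiSampleType per level); 1 means no multisampling.
    """
    candidates = [s for s in supported_sample_counts if s <= requested]
    return max(candidates, default=1)

# Any DX10-class part from either vendor reports 4x support, so any
# such card gets 4x; weaker hardware degrades gracefully:
assert pick_msaa({1, 2, 4, 8}, 4) == 4
assert pick_msaa({1, 2}, 8) == 2
assert pick_msaa(set(), 4) == 1
```

This is the point of caps bits: the same negotiation works on every vendor's hardware without anyone paying to be on a whitelist.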
 
UE3 games have worked with driver-forced MSAA on both Nvidia and AMD cards for a long time now, on both DX9 and DX10. Does that not work in this game by default?
 
Originally posted by: Keysplayr
Look at these two statements and tell me what you think is good policy for big business.
"We pay you to have this feature only for us."
"We pay you to have this feature for us, and our competition."

If you were paying for it, which would you pick?

I'd pick whatever gets me the most money. Seeing how AMD cards are perfectly capable of supporting that feature, but I chose to lock them out, then I shouldn't be surprised if consumers boycott my products because they don't like my business practices. And in that case, the first choice is the wrong one.
 
Originally posted by: munky
Originally posted by: Keysplayr
Look at these two statements and tell me what you think is good policy for big business.
"We pay you to have this feature only for us."
"We pay you to have this feature for us, and our competition."

If you were paying for it, which would you pick?

I'd pick whatever gets me the most money. Seeing how AMD cards are perfectly capable of supporting that feature, but I chose to lock them out, then I shouldn't be surprised if consumers boycott my products because they don't like my business practices. And in that case, the first choice is the wrong one.

Plus when did we start cheerleading for what's good for companies rather than what's good for gamers?😕

I don't give a shit what's good for ATI/NV. I do however care about PC gaming.

 
Originally posted by: WelshBloke
I do however care about PC gaming.

Exactly, that's why I pick a card from a company that actually works with game developers. Instead of doing nothing and crying about it.

AMD is holding back game development by not working with game developers, not implementing physics, delaying games and all their other shenanigans.
 
Originally posted by: Idontcare
Originally posted by: GaiaHunter
IDC: Is nVidia paying to add AA to the game or are they paying to not allow AA on AMD hardware?

And even if it is the first case, the line between both situations is a slim one.

I'm sorry but I just don't see what the problem is if it is the former.

AMD pays Intel so AMD can add certain ISA's to their CPUs, and they do this without even thinking once about broadening the license to include VIA. VIA is on their own when it comes to paying/negotiating their access to the market.

So what if NV pays a developer to go the extra mile and incorporate a featureset into the game on NV's dime and in support of NV's hardware? If the developer wasn't going to incorporate the feature otherwise, and the only reason it was implemented was because NV sponsored it, I just don't see what the deal is here.

I would think the first stop on the ethics train in that scenario is with the game developer themselves, for whoring themselves out to the highest bidder in exchange for "locking down" their game's AA support. Who controls the game? The developer. They are the monopoly in this context and they are deciding who gets to have an exclusive in their game, for a price.

If you really have a bone to pick, boycott the game and the developer house's other games.

We aren't about to punish Dell or HP or any of the other Intel customers who knowingly took rebates from Intel to the harm of the consumer, those poor resellers were just trying to keep an edge over their competitor. NV did no different, here is a developer who wanted to exploit the hardware guys, and presumably was willing to take the highest bidder if we really believe sinister motive were afoot, so NV did no worse than DELL and offered themselves up for exploitation in hopes that at least that was the lesser of the evils.

See, if we put our collective intelligence to work on it we can paint just about anyone or anything to be the villain or the victim. Is the world a better place yet? Or are we just needlessly making hot air again?
Good point, it's just like Intel forcing HP & Dell to buy less than 5% AMD CPUs for their builds.
It's just the way it is.

Originally posted by: CP5670
UE3 games have worked with driver-forced MSAA on both Nvidia and AMD cards for a long time now, on both DX9 and DX10. Does that not work in this game by default?
That's the catch: no NV vendor ID, no MSAA.

 
Originally posted by: Idontcare
I would think the first stop on the ethics train in that scenario is with the game developer themselves, for whoring themselves out to the highest bidder.

Agreed. IMO, this whole issue should be moot. Batman:AA is a Games for Windows Live title. Microsoft imposes minimum standards for games to carry the Games for Windows label, which should include minimum 4x AA in-game support as a requirement.

Clearly MS hasn't given up on GFWL http://www.neoseeker.com/news/...-up-games-for-windows/ but if they don't leverage this to the benefit of consumers there is really no incentive for us to support GFWL.

Originally posted by: v8envy
Fragmenting the already niche PC game market with vendor-paid for and enabled features is going to backfire, hard.

Agreed again. It looks like Batman:AA is selling really well (over 2M), but I haven't been able to find a breakdown of the sales by platform. I'd be willing to bet that Xbox 360 leads, followed by PS3, and then PC.
 
Originally posted by: Wreckage
Originally posted by: WelshBloke
I do however care about PC gaming.

Exactly, that's why I pick a card from a company that actually works with game developers. Instead of doing nothing and crying about it.

AMD is holding back game development by not working with game developers, not implementing physics, delaying games and all their other shenanigans.

Or working with open standards like DX11, DirectCompute, OpenCL, etc that benefit all instead of a few kickback receivers.
 
Originally posted by: Schmide

Or working with open standards like DX11, DirectCompute, OpenCL, etc that benefit all instead of a few kickback receivers.

NVIDIA was the first to offer an OpenCL driver. In fact I don't even know if AMD offers one to the public yet. Nice try though.

Also DX11 is not an open standard. It is closed source that only runs on Microsoft operating systems.

Originally posted by: Schmide

On a Mac. :shocked: Nice try though.

DX11 works on a Mac?
 
Originally posted by: Wreckage
Originally posted by: Schmide

Or working with open standards like DX11, DirectCompute, OpenCL, etc that benefit all instead of a few kickback receivers.

NVIDIA was the first to offer an OpenCL driver. In fact I don't even know if AMD offers one to the public yet. Nice try though.

On a Mac. :shocked: Nice try though.
 
Originally posted by: Wreckage
Originally posted by: WelshBloke
I do however care about PC gaming.

Exactly, that's why I pick a card from a company that actually works with game developers. Instead of doing nothing and crying about it.

AMD is holding back game development by not working with game developers, not implementing physics, delaying games and all their other shenanigans.

That's funny, I thought Nvidia was the one delaying DirectX implementation in the form of not even having DX10.1 cards out, not to mention DX11 now

:roll:

You must be the favourite AT Moderator troll considering you're still here after all the thread crapping you do
 
Originally posted by: Wreckage
Originally posted by: WelshBloke
I do however care about PC gaming.

Exactly, that's why I pick a card from a company that actually works with game developers. Instead of doing nothing and crying about it.

AMD is holding back game development by not working with game developers, not implementing physics, delaying games and all their other shenanigans.

*snort*

Nv is holding back game development by working with game developers to actively prevent features from working on their competitors' cards, by purposely disabling physics processing when an ATI card is detected, delaying DX10.1 adoption by saying DX10 is enough, and all their other (many, MANY) shenanigans.
 
Originally posted by: nitromullet

Agreed again. It looks like Batman:AA is selling really well (over 2M), but I haven't been able to find a breakdown of the sales by platform. I'd be willing to bet that Xbox 360 leads, followed by PS3, and then PC.

Actually it may be selling better on the PS3 because of proprietary content (you get to play as the Joker). 😉
 