How to enable Nvidia PhysX on ATI cards in Batman: AA


Schmide

Diamond Member
Mar 7, 2002
5,718
992
126
I think this question has already been answered.

1. Yes, Unreal Engine 3 does not support AA natively, and developers have to code it in by themselves.

Correction: Unreal Engine 3 does not support the combination of MSAA and HDR in DirectX 9 under Windows XP.

Which actually brings up a side topic: why didn't MSAA and HDR work together under DirectX 10+?
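
(For the curious, here's a minimal, untested sketch of the DX9-era limitation in question: many of those GPUs couldn't multisample FP16 HDR render targets at all, which an engine can detect with the real CheckDeviceMultiSampleType call. The function name below is hypothetical, not engine code.)

Code:
// Untested sketch: querying D3D9 for MSAA support on an FP16 HDR
// render-target format. Many DX9-era GPUs report zero quality levels
// here, which is one reason HDR+MSAA was off the table under XP.
#include <d3d9.h>

bool HdrMsaaSupported(IDirect3D9* d3d)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,      // 16-bit float per channel (HDR)
        TRUE,                      // windowed mode
        D3DMULTISAMPLE_4_SAMPLES,  // 4x MSAA
        &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}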
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Since you won't answer any of my questions, BFG, and I know your agenda won't allow you to take this to PM, I will just finish the conversation with this.

http://forums.anandtech.com/showpost.php?p=28693105&postcount=104

You responded.
http://forums.anandtech.com/showpost.php?p=28693177&postcount=107

Like I said, "if it's B, you are intentionally ignoring that just to bait/troll me." Since you admitted it was B and did not dispute you were doing it to bait/troll me.
Uh, no. You don’t get to ask a loaded question like that and then claim I admitted to something because I didn’t comment on the loaded part. Not commenting on something is not an admission of anything.

You stated in this thread “you even admitted to trolling in that discussion”.

I did absolutely no such thing, and since you’ve failed to provide a direct quote of that admission, you’ve been reported for intentionally spreading lies.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Uh, no. You don’t get to ask a loaded question like that and then claim I admitted to something because I didn’t comment on the loaded part. Not commenting on something is not an admission of anything.

You stated in this thread “you even admitted to trolling in that discussion”.

I did absolutely no such thing, and since you’ve failed to provide a direct quote of that admission, you’ve been reported for intentionally spreading lies.

You asked for proof and I provided it. You never explained in that post why you intentionally left out information from the quote, nor did you deny that you were trolling/baiting me. If the mods want me to remove the quote, fine. Or you or the others can remove it.

However, as always, I provided what was asked, and you just refuse to accept it or deal with it in PM, as it should be.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
You asked for proof and I provided it.
You did no such thing. You made up shit from your loaded question and started spreading lies from it.

Also, don't PM me again unless it's for mod business. I see enough of your crap in the forums without my PM inbox getting it too.
 

Schmide

Diamond Member
Mar 7, 2002
5,718
992
126
deal with it in PM as it should be.

WTH? You can spew crap in public all day, but when it comes time to put up or apologize you don't have the balls?

I'm not afraid to admit fault when my logic or information fails me, are you afraid?

Shenanigans
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
WTH? You can spew crap in public all day, but when it comes time to put up or apologize you don't have the balls?

I'm not afraid to admit fault when my logic or information fails me, are you afraid?

Shenanigans

This is nothing new. Many times before, he's trolled the forums, and when people completely shut down his arguments with facts and figures, he just ignores it and goes to post his stuff in another thread, acting as if the previous thread never happened. It's frankly disappointing that the mods have not addressed the problem after it's been an issue for so long. It's no secret: ask any regular video forum member who the biggest troll on the forum is, and you will get a unanimous response.

Thank goodness for the new ignore feature.
 

dflynchimp

Senior member
Apr 11, 2007
468
0
71
You asked for proof and I provided it. You never explained in that post why you intentionally left out information from the quote, nor did you deny that you were trolling/baiting me. If the mods want me to remove the quote, fine. Or you or the others can remove it.

However, as always, I provided what was asked, and you just refuse to accept it or deal with it in PM, as it should be.

Stop trying to be eloquent, dude. You're only coming off as pretentious.
 

AyashiKaibutsu

Diamond Member
Jan 24, 2004
9,306
4
81
It amazes me that people can come in here and talk about how Nvidia cripples AMD. This game WOULD NOT have had AA for anyone if nvidia didn't step in and write it for their cards; if you went out and put effort into getting something to work, would you want a bunch of people coming along and piggybacking on your efforts? I don't like what's going on, but the way the fanboys are spinning it is just ridiculous. There are tons of games getting released that don't have it for either AMD or nvidia without doing some tricks to force it and taking a huge performance hit (just look at Borderlands). That all these fanboys don't even realize this makes me question whether you AMD fanboys even play games. Or do you just buy a red card and spend all your time posting about how evil nvidia is and how great the philanthropist red team is?
 

Red Irish

Guest
Mar 6, 2009
1,605
0
0
It amazes me that people can come in here and talk about how Nvidia cripples AMD. This game WOULD NOT have had AA for anyone if nvidia didn't step in and write it for their cards; if you went out and put effort into getting something to work, would you want a bunch of people coming along and piggybacking on your efforts? I don't like what's going on, but the way the fanboys are spinning it is just ridiculous. There are tons of games getting released that don't have it for either AMD or nvidia without doing some tricks to force it and taking a huge performance hit (just look at Borderlands). That all these fanboys don't even realize this makes me question whether you AMD fanboys even play games. Or do you just buy a red card and spend all your time posting about how evil nvidia is and how great the philanthropist red team is?

Haven't you just fallen into the same display of mindless favouritism of which you accuse others?
 

waffleironhead

Diamond Member
Aug 10, 2005
7,047
551
136
It amazes me that people can come in here and talk about how Nvidia cripples AMD. This game WOULD NOT have had AA for anyone if nvidia didn't step in and write it for their cards; if you went out and put effort into getting something to work, would you want a bunch of people coming along and piggybacking on your efforts? I don't like what's going on, but the way the fanboys are spinning it is just ridiculous. There are tons of games getting released that don't have it for either AMD or nvidia without doing some tricks to force it and taking a huge performance hit (just look at Borderlands). That all these fanboys don't even realize this makes me question whether you AMD fanboys even play games. Or do you just buy a red card and spend all your time posting about how evil nvidia is and how great the philanthropist red team is?

This isn't about crippling AMD. It's about crippling the consumer. Are we now going to need different spec sheets for each game labeling which features will be present depending on the brand of card you happen to have?
 

AyashiKaibutsu

Diamond Member
Jan 24, 2004
9,306
4
81
This isn't about crippling AMD. It's about crippling the consumer. Are we now going to need different spec sheets for each game labeling which features will be present depending on the brand of card you happen to have?

If the trend continues and developers keep doing shitty console ports unless someone comes in and gives them a hand, it might be likely. As I said, I don't like the situation. I'd post about how bad Wreckage is, but he has a few AMD fanboys following him wherever he goes, and within all the crap they spew is also all the criticism I could ever level at him.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Proof the AA code is actually standard DirectX code:

http://www.brightsideofnews.com/new...-nvidia-vs-eidos-fight-analyzed.aspx?pageid=1

Naturally, seeing the code that states MSAA being a trademark of any company would raise the alarm in competing technologies. According to Richard, both ATI and nVidia recommend the following standard technique for enabling MSAA: "storing linear depth into the alpha channel of the RenderTarget and using that alpha value in the resolve pass. You only need to linearize it if you are storing something like 8-bit color so that you can get depth approximation, stuff it into alpha channel and then when you've finished rendering at high resolution you simply filter down the color values and use the depth value to maintain some kind of quality in your Anti-Aliasing so you don't just average."
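
In plain terms (my own rough reading of that description; the C++ weighting below is a made-up illustration, not the game's actual shader code), the resolve looks something like this:

Code:
// Rough CPU-side illustration of a depth-aware resolve: each sample
// carries RGB plus linear depth stored in alpha; samples nearer the
// closest surface get more weight, instead of a plain box-filter
// average that would blur edges against the background.
#include <cstddef>
#include <cmath>

struct Sample { float r, g, b, depth; };  // depth lives in alpha

Sample Resolve(const Sample* s, std::size_t n)
{
    float minDepth = s[0].depth;              // find nearest surface
    for (std::size_t i = 1; i < n; ++i)
        if (s[i].depth < minDepth) minDepth = s[i].depth;

    Sample out = {0.0f, 0.0f, 0.0f, minDepth};
    float total = 0.0f;
    for (std::size_t i = 0; i < n; ++i) {
        // Samples far behind the nearest surface contribute less.
        float w = 1.0f / (1.0f + std::fabs(s[i].depth - minDepth));
        out.r += s[i].r * w;
        out.g += s[i].g * w;
        out.b += s[i].b * w;
        total += w;
    }
    out.r /= total; out.g /= total; out.b /= total;
    return out;
}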

We have asked our "panel of independent judges", i.e. developers, about this method to ensure objectivity of this article, and in both cases our developers told us that the description above is a standard technique advocated to gamers by both AMD and nVidia. With the description above being explained to us as "standard", it turns out that the code that was labeled "MSAA Trademark NVIDIA Corporation" was identical to the one recommended by both sides, with one very important difference: nVidia's code features a Vendor ID lock that allows for the AA menu inside the game only if nVidia hardware is detected.
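
(In Direct3D 9 terms, such a lock boils down to something like the sketch below. GetAdapterIdentifier is the real D3D9 call; the function name and the menu logic around it are my own illustration, not Batman: AA's actual code.)

Code:
// Hypothetical illustration of a Vendor ID lock in Direct3D 9.
#include <d3d9.h>

bool ShowAaMenu(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;
    // PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI/AMD.
    return id.VendorId == 0x10DE;  // AA menu only on NVIDIA hardware
}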

Also some of that code already runs on ATi's hardware, even if AA is disabled:

What got AMD seriously aggravated was the fact that the first step of this code is done on all AMD hardware: "'Amusingly', it turns out that the first step is done for all hardware (even ours) whether AA is enabled or not! So it turns out that NVidia's code for adding support for AA is running on our hardware all the time - even though we're not being allowed to run the resolve code!
So… They've not just tied a very ordinary implementation of AA to their h/w, but they've done it in a way which ends up slowing our hardware down (because we're forced to write useless depth values to alpha most of the time...)!"

There's your proof, Wreckage: it's standard DirectX code. I guess this means you think Microsoft owns it, right? :rolleyes:

Oh, and here's the proof you asked for about ATi's code contributions:

"We haven't locked a single line of code to AMD hardware. No fragments of code anywhere that I am aware of, is locked to any AMD hardware.”
But I’m sure the next time the issue pops up, you’ll pretend this was never posted and go right back to your antics.
 

AyashiKaibutsu

Diamond Member
Jan 24, 2004
9,306
4
81
I have no preference when it comes to graphics cards, but then again, that doesn't sit well with your position.

My position is that people are blowing this out of proportion and twisting it around. You see me leveling criticism at people blowing the situation out of proportion and contorting it into something more than it is, and you assume I'm an nvidia fanboy; I was just poking fun at you for making such a leap in logic.

Currently I like nvidia because they're the ones pushing GPUs to do more than just render graphics, but I don't favor nvidia by default. AMD is at least starting to get on the ball with it, and I could conceivably get one of their cards in the future if they show they're willing to push support for stuff like GPU physics; all they've done so far, though, is have PowerPoint discussions and a checkbox feature. Granted, PhysX isn't all that either, but at least it's getting a little real-world traction. That you don't have a preference is fine with my position.
 

Red Irish

Guest
Mar 6, 2009
1,605
0
0
My position is that people are blowing this out of proportion and twisting it around. You see me leveling criticism at people blowing the situation out of proportion and contorting it into something more than it is, and you assume I'm an nvidia fanboy; I was just poking fun at you for making such a leap in logic.

Currently I like nvidia because they're the ones pushing GPUs to do more than just render graphics, but I don't favor nvidia by default. AMD is at least starting to get on the ball with it, and I could conceivably get one of their cards in the future if they show they're willing to push support for stuff like GPU physics; all they've done so far, though, is have PowerPoint discussions and a checkbox feature. Granted, PhysX isn't all that either, but at least it's getting a little real-world traction. That you don't have a preference is fine with my position.

I feel that you were to some extent excusing Nvidia's marketing practices, practices that will only hurt the consumer in the long run. I don't think the situation can be blown out of proportion. Can you imagine a world wherein we have to ask if a given game is an ATI title or an Nvidia title before purchasing? By playing down the importance of this issue, I assumed you were an Nvidia zealot.
 
Last edited:

AyashiKaibutsu

Diamond Member
Jan 24, 2004
9,306
4
81
I feel that you were to some extent excusing Nvidia's marketing practices, practices that will only hurt the consumer in the long run. I don't think the situation can be blown out of proportion. Can you imagine a world where we have to ask if a given game is an ATI title or an Nvidia title before purchasing? By playing down the importance of this issue, I assumed you were an Nvidia zealot.

I personally think they should have just left it alone and let it release without AA for anyone, but I guess since they were focusing on it for GPU PhysX, they wanted to get the rest of the shitty console stuff out of it for their cards. Developers are already hurting consumers with their shitty ports; video card companies going out of their way to fix them isn't really a bad thing in theory, although I agree this could be bad in the long run. And saying it's plausible that this is going to turn into ATI titles and nvidia titles as a general rule is pretty much the definition of blowing it out of proportion. SCPs (shitty console ports) should be a far greater concern than that, especially since they're at the root of this problem, in my opinion.
 

Red Irish

Guest
Mar 6, 2009
1,605
0
0
I personally think they should have just left it alone and let it release without AA for anyone, but I guess since they were focusing on it for GPU PhysX, they wanted to get the rest of the shitty console stuff out of it for their cards. Developers are already hurting consumers with their shitty ports; video card companies going out of their way to fix them isn't really a bad thing in theory, although I agree this could be bad in the long run. And saying it's plausible that this is going to turn into ATI titles and nvidia titles as a general rule is pretty much the definition of blowing it out of proportion. SCPs (shitty console ports) should be a far greater concern than that, especially since they're at the root of this problem, in my opinion.

Instead of "blowing out of proportion", think of it as an attempt to "nip this crap in the bud".

I agree with you as far as SCPs go ;)
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Why all the fuss about enabling/disabling AA in an average game that no one will be playing in 6 months? Move on, everyone...
 

KIAman

Diamond Member
Mar 7, 2001
3,342
23
81
So, here's an interesting thought. If Nvidia continues to get away with their practice and games are labeled "For Nvidia GPUs" or "For AMD GPUs", then... we've basically created a really expensive console... at the PC level. Will this eventually degrade to games that only work with one GPU or the other, rather than merely lacking features?

Let's say Nvidia gets away with this with little to no harm. Will they then have the courage to lock out competing products from their motherboard chipsets? Maybe, eventually, a CPU-less computer running GPGPUs on an Nvidia chipset... with Nvidia memory?

For some reason, I feel Nvidia is following in Apple's footsteps: defiantly pushing consumers in one direction or the other, all the while depending on their fanatical customer base to ride it out.

Why all the fuss about enabling/disabling AA in an average game that no one will be playing in 6 months? Move on, everyone...

IMO, the point is not enabling/disabling AA; it's about Nvidia getting away with their actions, being emboldened by them, and continuing this nonsense in the future.
 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
IMO, the point is not enabling/disabling AA; it's about Nvidia getting away with their actions, being emboldened by them, and continuing this nonsense in the future.

Exactly. If this doesn't get stomped now, the PC market will get fragmented into games coded specifically for one GPU or another. If NVIDIA is so gung-ho on pushing their own code and standards, they should set up their own game development studio and write their own games, which they can lock to their own hardware to their hearts' content.
 

Schmide

Diamond Member
Mar 7, 2002
5,718
992
126
Why all the fuss about enabling/disabling AA in an average game that no one will be playing in 6 months? Move on, everyone...

Why try and squelch it? Do you understand the issue and the implications? If you don't want to discuss it, move on, but don't try and silence it.
 

Schmide

Diamond Member
Mar 7, 2002
5,718
992
126
That all these fanboys don't even realize this makes me question whether you AMD fanboys even play games.

I play tons of games. Ironically, it has made me question every "nVidia the way you're meant to be played" game. I started playing Shattered Horizon. I have two 4870s in Crossfire with a Q9550 at 3.6GHz. It runs fine, but if I enable AA it runs fine for a while, then after about 5 minutes becomes a slide show. So I have to run without it. I fear that every time the nVidia engineers stamp their seal of approval on a game, I'm getting screwed.

I may try and play on my gtx260 and see if it offers a difference, but it's hardly an equal comparison.

Edit: It's all watercooled so it's not a throttling issue.

Edit2: Update. I played on my gtx260 and it's relatively unplayable at 1920x1200 even without AA, so the comparison just isn't there. I just want to reiterate: this is about the perception that sabotage of one game or competitor leads to the fear that it may be happening more often than not.
 
Last edited:

dflynchimp

Senior member
Apr 11, 2007
468
0
71
I play tons of games. Ironically, it has made me question every "nVidia the way you're meant to be played" game. I started playing Shattered Horizon. I have two 4870s in Crossfire with a Q9550 at 3.6GHz. It runs fine, but if I enable AA it runs fine for a while, then after about 5 minutes becomes a slide show. So I have to run without it. I fear that every time the nVidia engineers stamp their seal of approval on a game, I'm getting screwed.

I may try and play on my gtx260 and see if it offers a difference, but it's hardly an equal comparison.

Edit: It's all watercooled so it's not a throttling issue.

Okay Schmide, don't make that sort of argument, because the fallacy in the logic plays right into their hands. If ATI hardware starts to crap out on TWIMTBP games, people like Wreckage will attribute that to Nvidia "co-developing" and "holding hands" with the game devs, rather than consider the possibility that Nvidia is utilizing anti-competitive practices.

Of course, that does not rule out the possibility that ATI simply did not have the driver optimizations necessary to perform. The only way we could come to a conclusion remotely close to the truth is if we had details.

In this case, the AA code from Batman: AA is not unique to Nvidia, and since changing vendor IDs clearly allowed ATI hardware to run AA, we can say with a measure of certainty that it's not incompatibility of the code that is preventing ATI users from accessing AA.

The only thing left to argue, really, is whether Nvidia's move is acceptable to end users, or whether the business aspect of the deal outweighed the cons of a public backlash.

For the record, neither Nvidia, AMD, nor Eidos has ANY obligation to please customers. As companies, their goal is profit. It just so happens that pleasing customers tends to net them more money. Seriously, if card designers and driver devs were as immune to customer boycotts as the oil industry, you can bet your asses they'd be pulling all sorts of mark-up, screw-you-over crap if it meant more gold in their pockets.