Batman AA fiasco: Who's telling the truth?


dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
I think he means how various people found a way to get AA to work on ATI cards. As in we, the enthusiasts.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
I think he means how various people found a way to get AA to work on ATI cards. As in we, the enthusiasts.

The only work the enthusiasts did was to remove the Vendor ID block. I believe ATi already suggested removing the block in one of those emails.
 
Last edited:

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Did you even read the link?

It’s also worth noting here that AMD have made efforts both pre-release and post-release to allow Eidos to enable the in-game antialiasing code - there was no refusal on AMD’s part to enable in game AA IP in a timely manner.

Said another way "nVidia paid to get all that work done, why shouldn't ATi benefit?"

ATi was offered a chance to write their own AA code; they chose not to. That was entirely their choice; they don't think they need to support their customers in that fashion. The same can be said about nV and DX11 in certain titles. Big difference is you don't see the temper tantrums quite the same, not even from Wreckage.

This, though, is the real kick in the ass: part of the code is running on ATi hardware, enabled or not.

Do you understand what they are saying in that quote? They are stating that they need to run a different version of the engine altogether. Writing a depth value for alpha isn't something you toggle on the fly as you can do with AA.

This is clearly Eidos' fault for not putting in AA themselves if they really wanted it, or at least for denying the use of the code once they found out Nvidia was going to block some of Eidos' customers from it.

Eidos already stated they blocked it based on what their legal team told them to do. They claimed they did this due to the contract with nV; nV's contract is based on PhysX code ownership, and it does not encompass AA.

AA is part of the DX API and shouldn't be locked to one vendor.

Is this too complicated for you on a technical level? Do you understand how they are doing AA, and that it isn't anything remotely like DX's AA support? If a game comes out with accumulation-buffer-style SSAA built into the engine because ATi wanted it coded that way, it would be nV's job to get support up and running on their boards; it is not covered under the DX spec.
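To picture what "accumulation-buffer-style SSAA built into the engine" means, here is a rough sketch of the idea -- render several sub-pixel-jittered passes of the frame and average them. This is purely illustrative; the Framebuffer struct and renderScene stub are hypothetical placeholders, not anything from the Unreal engine:

Code:
#include <cstddef>
#include <vector>

struct Framebuffer {
    std::vector<float> rgb;  // w * h * 3 color values
    int w, h;
};

// Hypothetical stub: render the scene with the projection offset by
// (jx, jy) sub-pixel units. A real engine would jitter the projection
// matrix here.
void renderScene(Framebuffer& fb, float jx, float jy) { /* ... */ }

// Accumulation-style SSAA: average N jittered renders of the frame.
Framebuffer accumulateSSAA(int w, int h, int samples) {
    Framebuffer accum{std::vector<float>(w * h * 3, 0.0f), w, h};
    Framebuffer pass{std::vector<float>(w * h * 3, 0.0f), w, h};
    for (int s = 0; s < samples; ++s) {
        // Simple 2x2 ordered-grid jitter; real implementations use
        // better sample patterns.
        float jx = (s % 2) * 0.5f - 0.25f;
        float jy = ((s / 2) % 2) * 0.5f - 0.25f;
        renderScene(pass, jx, jy);
        for (std::size_t i = 0; i < accum.rgb.size(); ++i)
            accum.rgb[i] += pass.rgb[i] / samples;  // box-filter average
    }
    return accum;
}

Nothing in the DX runtime drives that loop; the engine does all the work, which is the point being made.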
 

Schmide

Diamond Member
Mar 7, 2002
5,786
1,085
126
Said another way "nVidia paid to get all that work done, why shouldn't ATi benefit?"

ATi was offered a chance to write their own AA code; they chose not to. That was entirely their choice; they don't think they need to support their customers in that fashion. The same can be said about nV and DX11 in certain titles. Big difference is you don't see the temper tantrums quite the same, not even from Wreckage.

NVidia was given the chance to follow the DirectX guidelines. It was entirely their choice to make rendering decisions based on the Vendor ID and not the DirectX caps bits.

I wouldn't call this a tantrum, more of a WTF -- the industry has been moving towards these standards for some time, and now all of a sudden they want to go rogue on us, ignoring the industry for their own benefit: a few game sales and some PR pluses and minuses.

Do you understand what they are saying in that quote? They are stating that they need to run a different version of the engine altogether. Writing a depth value for alpha isn't something you toggle on the fly as you can do with AA.

Actually, it is. You check the caps bits, and if the surface supports it, you can do it. Since DirectX9 doesn't support a multisample FP resolve in the final surface, this must be done on a texture. A small hit.
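To make the distinction concrete, here is a minimal sketch of the two approaches, assuming an already-created IDirect3D9 interface (error handling omitted). The caps-driven path asks the runtime whether the format supports multisampling; the vendor-ID path -- the one being criticized -- branches on who made the card:

Code:
#include <windows.h>
#include <d3d9.h>

// The caps way: ask D3D9 whether this surface format supports the
// requested multisample type on the default adapter.
bool SupportsMsaa(IDirect3D9* d3d, D3DFORMAT fmt, D3DMULTISAMPLE_TYPE ms) {
    return SUCCEEDED(d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, fmt,
        TRUE /* windowed */, ms, NULL));
}

// The vendor-ID way: branch on the PCI vendor of the adapter,
// regardless of what the hardware actually supports.
bool IsNvidia(IDirect3D9* d3d) {
    D3DADAPTER_IDENTIFIER9 id = {};
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
    return id.VendorId == 0x10DE;  // 0x10DE = NVIDIA, 0x1002 = ATI
}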

Before you act all high and mighty, you should read the AMD perspective here. They explain in great detail what they found the engine doing.

Eidos already stated they blocked it based on what their legal team told them to do. They claimed they did this due to the contract with nV, nV has a contract based on PhysX code ownership, it does not encompass AA.

Whatever the legal team said, it was the marketing team making the decisions.

Is this too complicated for you on a technical level? Do you understand how they are doing AA and that it isn't anything remotely like DX's AA support? If a game comes out with accumulation buffer style SSAA built into the engine because ATi wanted it coded that way it would be nV's job to get support up and running on their boards, it is not covered under the DX spec.

It is too complicated if you talk in half-truths and make assumptions about the DirectX level of compatibility of ATI cards.

Edit: Since when is deferred shading not allowed by DirectX calls?
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
Before you act all high and mighty, you should read the AMD perspective here. They explain in great detail what they found the engine doing.

That article was an interesting read. I'm not sure if it was already linked in a separate thread or not, but maybe it would be a good idea for thilanliyan to add it to the OP.
 
Last edited:

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
@BenSkywalker

I can't believe you condone Nvidia's actions here.

In your mind it must be perfectly fine that Nvidia implemented in-game AA code that is partially running on ATi's hardware all the time.

You're right though, I don't understand how Nvidia can get away with sabotaging the competitor's performance.

Please enlighten me, and while you're at it please explain how Nvidia's code works perfectly on ATi cards if it doesn't follow the DX spec.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Let's discuss this "running partially all the time" -- personally, I've had some posters claim a detrimental performance loss, but where is this loss in frame-rate with no AA?

If anything, ATI offers much more performance with an HD4890 over a GTX-260; it's a lot faster, and not even close.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Let's discuss this "running partially all the time" -- personally, I've had some posters claim a detrimental performance loss, but where is this loss in frame-rate with no AA?

If anything, ATI offers much more performance with an HD4890 over a GTX-260; it's a lot faster, and not even close.
All cards run part of this code with 0xAA, so running without AA the 4890 should be faster.

The problem is most users will turn on some AA; even the X360 runs this title with 2xAA.
http://forum.beyond3d.com/showpost.php?p=1113344&postcount=3

1920x1200 0xAA / 2xAA
HD4890 - 84.3FPS / 53.4FPS
GTX260 - 71.9FPS / 58.3FPS
http://www.firingsquad.com/hardware/ati_radeon_5770_5750_performance/page13.asp

The 4890 takes double the performance drop when running 2xAA, but the question is how much does this extra code add to the mix?
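(Working that out from the numbers above: the HD4890 drops (84.3 - 53.4) / 84.3 ≈ 37% going from 0xAA to 2xAA, while the GTX260 drops (71.9 - 58.3) / 71.9 ≈ 19% -- so roughly double the relative hit, as stated.)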
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The Xbox 360 is different from DirectX 9 when it comes to the Unreal engine and AA.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
NVidia was given the chance to follow the directX guidelines.

Of course, it isn't Eidos' job, it wasn't Epic's job, and it wasn't ATi's job, it was nVidia's job to get AA running on ATi hardware by doing it the way that you think it should be done. So, if nV releases a part at some point that is more than twice as fast as ATi's, is it then nVidia's fault for not doing ATi's R&D for them too? Is it nVidia's job to pay ATi enough to be profitable too?

Actually it is. You check the CAPS BITS and if the surface supports it, you can do it.

So nVidia is now a game developer? Seems to me they were asked to get AA working on their hardware and they did. ATi was asked to get AA working on their hardware and they did not. And somehow it's nVidia's fault. I understand some people have the need to keep ATi in a divine spot where they can do no wrong. Since this is such a typical AA routine, why didn't Sweeney use it in the first place?

Before you act all high and mighty, you should read the AMD perspective here.

Them whining because they didn't want to get the work done for their own parts. They were given the offer, they turned it down. It was their choice, they had the option to do it themselves.

It is too complicated if you talk in half truths and make assumptions on the DirectX level of compatibility of ATI cards.

I don't doubt it will work on ATi cards. What I find absurd is that nV should be qualifying a routine for ATi. How on Earth is that logical?

I can't believe you condone Nvidia actions here.

Is AA working on nV cards? Yes. They did their job. They are not in the business of making games, they do not have final say on any code base that is in the game. The developers and the publisher both had complete freedom to refuse their alternative. Furthermore ATi was offered a chance to get it running on their hardware. It is not nV's job to make ATi look good.

You right though I don't understand how Nvidia can get away with sabotaging the competitor's performance.

When did nV release a game? Get back to me when that happens; until it does, it's just loyalist rage.

Please enlighten me, and while your at it please explain how Nvidia's code works perfectly on ATi cards if it doesn't follow DX spec.

The AA routine they are running doesn't follow DX's guidelines for AA. Is anyone here honestly under the impression that Sweeney wouldn't have added it in-engine himself if it was some sort of implementation that followed normal DX routines? Seriously, if you think nV is better than Sweeney at working with Sweeney's game engine, you must have a lot less faith in his abilities than I do.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Man you need to do some reading in this debate.

I'm not saying these things are true, but those are the things AMD claim.

- They said they circumvented the vendor lock and AA worked fine;

- They claimed that even though AA isn't working on their cards, the initial AA code, despite being disabled for ATI hardware, is being sent to their cards, so they are getting the performance penalty as if they were using AA but getting no result.

All this has been talked about in other threads here, and you can find them for yourself.

Again, I'm not saying who is right or wrong, who's lying or not, just correcting your assertions by providing the actual AMD claims and the way they present their side of the story.
Sorry for the late reply. I can see and understand your arguments, and I hadn't looked at any other threads about this topic.

"Batman: Arkham Asylum" is developed by Rocksteady Studios Ltd.. Eidos Interactive Ltd. is the Publisher. It is clear that it isn't the in game setting that people are upset about, but the fact that AA does nothing when changing the settings through CCC. AA only works when AA is specified in game, and only Nvidia card can do it by default.

The original version from Rocksteady does not support AA, as the graphics files aren't stored correctly to utilize AA. Nvidia redid all the graphics so they utilize AA. Because of this, Eidos put a simple vendor ID check in the final version, as ATI never submitted their version of the graphics. ATI claims that the final version of the graphics would be identical to the ones Nvidia submitted, but it wouldn't. They may look alike, but not be identical, as it would be done by different people, machines, software and at a different time.

Another thing ATI complains about is the depth value before AA, which does nothing for ATI's HW as they don't support the 3D version. Without the depth value, 3D images appear to be wrong, flickering.

The fix isn't to remove the vendor check, but to redo all the graphics. The current optimized graphics belong to Nvidia, and won't be used when AA is not enabled. That means even if an ATI user chooses to enable AA in CCC, the graphics will not be enhanced. Now, Eidos could have Rocksteady redo all its graphics just so ATI's HW works without using Nvidia's optimized graphics, but why would they want to do that? After all, Nvidia helped them through the entire development, and redoing all the graphics requires time and money. All Eidos asks is for ATI to contribute something for their HW to work, instead of just asking for it. There are much better ways to lock out ATI's HW than a simple vendor ID check, but that one will do, as Eidos does want ATI users to play its game.

The idea is not to prevent ATI's hardware from using AA, but to make them do it themselves, as those optimized graphics belong to Nvidia, not Rocksteady Studios Ltd. This means that without Nvidia, AA would have no effect, and it has nothing to do with the DirectX API calls.

This is how I understood it after reading HEXUS and BSN. I may be wrong, but I am in no position to judge your point of view; I'm simply sharing mine. I quoted you not because I disagree with you, but because I found it funny that ATI claims a lot without working a lot.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Of course, it isn't Eidos' job, it wasn't Epic's job, and it wasn't ATi's job, it was nVidia's job to get AA running on ATi hardware by doing it the way that you think it should be done.
Oh, come on! It's a Vendor ID lockout! Remove the lockout and the AA works just fine on ATi hardware. Nvidia actually had to do extra work to prevent it from working on ATi cards by adding the lockout.



So nVidia is now a game developer? Seems to me they were asked to get AA working on their hardware and they did. ATi was asked to get AA working on their hardware and they did not.
Again, you're deliberately ignoring the fact that the MSAA works just fine on ATi hardware if you remove the lockout. The lockout is extra coding on top of the programming necessary for AA to function.



And somehow it's nVidia's fault.
Because it is. I suppose the developers just threw that ATi lockout in there for the heck of it. Yes, I'm sure deliberately making their game look worse for a good portion of their potential customer base is a standard business practice for them. :rolleyes:

I understand some people have the need to keep ATi in a divine spot where they can do no wrong. Since this is such a typical AA routine, why didn't Sweeney use it in the first place?
We could ask you why you have this need to keep Nvidia in a divine spot since obviously (to you) they can do no wrong.



Them whining because they didn't want to get the work done for their own parts. They were given the offer, they turned it down. It was their choice, they had the option to do it themselves.
Why would they need to? It's standard code. So they needed to fly an engineer to Rocksteady Studios to tell them to implement the same AA code that they were informed would already be present?

I don't doubt it will work on ATi cards. What I find absurd is that nV should be qualifying a routine for ATi. How on Earth is that logical?
Nobody was asking Nvidia to qualify the coding for ATi cards. They were doing their own qualifying in-house.



Is AA working on nV cards? Yes. They did their job. They are not in the business of making games, they do not have final say on any code base that is in the game. The developers and the publisher both had complete freedom to refuse their alternative.
Just because the developer has the right to refuse the code does not absolve Nvidia of insisting that the Vendor ID lockout be included if Rocksteady wanted to use the MSAA. It's dirty pool no matter which direction you try to twist it.



When did nV release a game? Get back to me when that happens; until it does, it's just loyalist rage.
And this has exactly what to do with anything? Nice smoke screen, however.



The AA routine they are running doesn't follow DX's guidelines for AA. Is anyone here honestly under the impression that Sweeney wouldn't have added it in-engine himself if it was some sort of implementation that followed normal DX routines? Seriously, if you think nV is better than Sweeney at working with Sweeney's game engine, you must have a lot less faith in his abilities than I do.
How does that excuse including a Vendor ID lockout? You don't think that Nvidia cards have displayed content in the past that was put forth through the efforts of ATi? You don't think they will again in the future, especially with DX11?

I simply can't understand how an end user can condone their actions in this as we're the ones being hurt by it.
 
Last edited:

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
On the "nVidia paid for it, why should ATI benefit" argument. ATI has provided funding for other games in the past and prior to this but the fact is that these games have never been coded to perform worse or have less features on competing products. While ATI's position in the market is weaker than nVidia you don't see them engaging in anti-competitive moves such as stipulating that a game looks worse or is less featured on a competitor's product. And I'm not talking about coding in features that uses effects that only your hardware can provide such as PhysX but "standardized" features.

The question for those who seem to be defending nVidia's decision is: are you saying it is OK for future games that ATI sponsors or provides funding for to lock out DX11 features from working on nVidia hardware? Last I heard, ATI was working with developers on about 20 games utilizing DX11. Is it OK for these games to have code that locks out nVidia hardware on DX11 and uses a different DX variant such as DX9 or DX10 instead? I mean, ATI paid to have these features put in, why should nVidia benefit? This is not a new question, and it has been asked many times of the supporters of nVidia's decision in this AA lockout issue.

This isn't about whether ATI or nVidia can do no wrong, as you put it. This is about me as a consumer. The bottom line is that this move by nVidia hurts consumers in the long run. I don't want to see the day when I have a very basic game that can run on any hardware, but I need an ATI card to get certain effects because ATI sponsored it, and need to pop in an nVidia card for a different game to get the most out of it because nVidia sponsored that one.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
While it is true that anti-aliasing is not supported when deferred shading is used under DX9, nVidia and ATi both found a hack to enable it under the Unreal 3 engine. Changing the vendor ID of my card allows me to use nVidia's AA in Batman: AA and it works like a charm; the image quality is slightly worse than AA forced in the CCC, but I don't understand the cheap explanation that nVidia's AA implementation runs buggy on ATi hardware, because it doesn't.

I condemn any vendor lockout, because it stagnates the gaming market and it affects the gamers/users more than the companies.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
On the "nVidia paid for it, why should ATI benefit" argument. ATI has provided funding for other games in the past and prior to this but the fact is that these games have never been coded to perform worse or have less features on competing products. While ATI's position in the market is weaker than nVidia you don't see them engaging in anti-competitive moves such as stipulating that a game looks worse or is less featured on a competitor's product. And I'm not talking about coding in features that uses effects that only your hardware can provide such as PhysX but "standardized" features.

The question for those who seem to be defending nVidia's decision is are you saying it is ok for future games that ATI sponsors or provides funding for to lock out DX11 features from working on nVidia hardware? Last I heard, ATI was working with developers on about 20 games utilizing DX11. Is it ok for these games to have code that locks out nVidia hardware on DX11 and use a different DX variant such as DX9 or DX10 instead? I mean, ATI paid to have these features put in, why should nVidia benefit? This is not a new question and has been asked many times by supporters of nVidia's decision in this AA lockout issue.

This isn't about whether ATI or nVidia can do no wrong as you put it. This is about me as a consumer. The bottom line is this move by nVidia hurts consumers in the long run. I don't want to have the day when I have a very basic game that can run on any hardware but will need an ATI card to get certain effects cause ATI sponsored it and need to pop in an nVidia card in a different game to get the most out of it because nVidia sponsored it.

If this was the rule of things -- I would agree. But it's the exception to the rule. The reason for improved standards is so these things don't happen. This was a DirectX 9 title using the Unreal engine -- and a workaround, imho. We are in DirectX 11 now.

Can see points for the lockout, but if ATI can guarantee and take responsibility for nVidia's work on this -- would like to see nVidia bend and make it possible somehow.

But, if this was so trivial and easy to do -- ATI could just do the work themselves and be done with it, too, instead of airing dirty laundry aimed at gamers and consumers.

Show everyone that they are willing to be just as pro-active and much more open, too, instead of complaining about it. Do more -- complain less -- and show what developer relations are supposed to be like.

Would rather see very pro-active developer relations with some questionable examples once in a while than reactive developer relations that may be more ethical but yet whiny.

It's not what you say but what gets actually done.
 
Last edited:

Seero

Golden Member
Nov 4, 2009
1,456
0
0
On the "nVidia paid for it, why should ATI benefit" argument. ATI has provided funding for other games in the past and prior to this but the fact is that these games have never been coded to perform worse or have less features on competing products. While ATI's position in the market is weaker than nVidia you don't see them engaging in anti-competitive moves such as stipulating that a game looks worse or is less featured on a competitor's product. And I'm not talking about coding in features that uses effects that only your hardware can provide such as PhysX but "standardized" features.

The question for those who seem to be defending nVidia's decision is are you saying it is ok for future games that ATI sponsors or provides funding for to lock out DX11 features from working on nVidia hardware? Last I heard, ATI was working with developers on about 20 games utilizing DX11. Is it ok for these games to have code that locks out nVidia hardware on DX11 and use a different DX variant such as DX9 or DX10 instead? I mean, ATI paid to have these features put in, why should nVidia benefit? This is not a new question and has been asked many times by supporters of nVidia's decision in this AA lockout issue.

This isn't about whether ATI or nVidia can do no wrong as you put it. This is about me as a consumer. The bottom line is this move by nVidia hurts consumers in the long run. I don't want to have the day when I have a very basic game that can run on any hardware but will need an ATI card to get certain effects cause ATI sponsored it and need to pop in an nVidia card in a different game to get the most out of it because nVidia sponsored it.
As a consumer, you question why your ATI card can't run AA in Batman. As a consumer, you know Nvidia is the one who spent their money on the development, yet settled for a vendor check. As a consumer, you want everything to work on your purchase rather than buying the right one.

If money is not a problem, you can simply go buy an Nvidia video card. If money is a problem, then I don't see what is wrong with the vendor check, as Nvidia is the one who spent money on it. Nvidia video card owners spent money on it.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
It is clear that it isn't the in-game setting that people are upset about, but the fact that AA does nothing when changing the settings through CCC. AA only works when AA is specified in game, and only Nvidia cards can do it by default.

Wrong. People get AA when using the CCC panel options - it takes a larger performance hit though, as it is a brute-force technique. The in-game AA option even refers you to CCC if an ATi card is present.

The original version from Rocksteady does not support AA, as the graphics files aren't stored correctly to utilize AA. Nvidia redid all the graphics so they utilize AA. Because of this, Eidos put a simple vendor ID check in the final version, as ATI never submitted their version of the graphics. ATI claims that the final version of the graphics would be identical to the ones Nvidia submitted, but it wouldn't. They may look alike, but not be identical, as it would be done by different people, machines, software and at a different time.

Some part of that I don't understand, but the main point is that the AA CODE DOES WORK for ATi cards if the vendor lock is hacked.

Another thing ATI complains about is the depth value before AA, which does nothing for ATI's HW as they don't support the 3D version. Without the depth value, 3D images appear to be wrong, flickering.

The fix isn't to remove the vendor check, but to redo all the graphics. The current optimized graphics belong to Nvidia, and won't be used when AA is not enabled. That means even if an ATI user chooses to enable AA in CCC, the graphics will not be enhanced. Now, Eidos could have Rocksteady redo all its graphics just so ATI's HW works without using Nvidia's optimized graphics, but why would they want to do that? After all, Nvidia helped them through the entire development, and redoing all the graphics requires time and money. All Eidos asks is for ATI to contribute something for their HW to work, instead of just asking for it. There are much better ways to lock out ATI's HW than a simple vendor ID check, but that one will do, as Eidos does want ATI users to play its game.

Again, that is a completely different interpretation of the facts and claims, and people in this thread have already shown AA screenshots of the game on ATi hardware.

The idea is not to prevent ATI's hardware from using AA, but to make them do it themselves, as those optimized graphics belong to Nvidia, not Rocksteady Studios Ltd. This means that without Nvidia, AA would have no effect, and it has nothing to do with the DirectX API calls.

This is how I understood it after reading HEXUS and BSN. I may be wrong, but I am in no position to judge your point of view; I'm simply sharing mine. I quoted you not because I disagree with you, but because I found it funny that ATI claims a lot without working a lot.

The problem is that the implementation being used is the standard one - both NVIDIA and ATi can generate AA from the same input, and then each vendor handles that code in their own way.

I think you are making an interpretation no one else here does.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
But, if this was so trivial and easy to do -- ATI could just do the work themselves and be done with it, too, instead of airing dirty laundry aimed at gamers and consumers.
Why would they need to? The standard code provided by Nvidia works just fine on ATi hardware. The standard code + Nvidia Vendor ID lockout is the problem.



Show everyone that they are willing to be just as pro-active
So I suppose the time and money ATi is spending on DX11 titles won't be of benefit to Nvidia at some point in the future?

and much more open, too, instead of complaining about it. Do more -- complain less -- and show what developer relations are supposed to be like.
Did you really just call Nvidia more open than ATi?!? The same ULi SLI lockout, PhysX lockout, Batman: AA lockout Nvidia? Um... I'm not even sure how to respond to that.



Would rather see very pro-active developer relations with some questionable examples once in a while than reactive developer relations that may be more ethical but yet whiny.
Well, finally something we can agree on. The part about Nvidia being questionable, that is. And ATi is being pro-active with regards to DX11 content. There you go.

Why do you defend Nvidia at every turn? Can't you simply state the fact that this was a very dickish move by Nvidia without throwing in something to try and make ATi look bad at the same time? Simply call it what it is without apologizing for Nvidia in the same breath.



It's not what you say but what gets actually done.
Oh, we can see what gets done when developers bow to Nvidia's pressure, all right.
 
Last edited:

Schmide

Diamond Member
Mar 7, 2002
5,786
1,085
126
Of course, it isn't Eidos' job, it wasn't Epic's job, and it wasn't ATi's job, it was nVidia's job to get AA running on ATi hardware by doing it the way that you think it should be done. So, if nV releases a part at some point that is more than twice as fast as ATi's, is it then nVidia's fault for not doing ATi's R&D for them too? Is it nVidia's job to pay ATi enough to be profitable too?

No! If you actually read what I wrote, it's nVidia's job to uphold the DirectX standards they helped create!

This isn't about profits, it's about standards. Apparently you put profits above community. Sad.

So nVidia is now a game developer? Seems to me they were asked to get AA working on their hardware and they did. ATi was asked to get AA working on their hardware and they did not. And somehow it's nVidia's fault. I understand some people have the need to keep ATi in a divine spot where they can do no wrong. Since this is such a typical AA routine, why didn't Sweeney use it in the first place?

Apparently you confused me with a fanboi. I will chastise any company for such dirty tricks. Guess what? AMD did this during their push for AMD64, and I think it's mentioned in this thread or the other (it was the other, here). ATI did some groundbreaking driver hax as well.

It's really funny when someone like yourself just repeats the nVidia line regardless of the truth. At first it seemed like you really understood how it was implemented (technically); now I'm starting to wonder if you really understand the issue. It's starting to seem like bullshit.

Them whining because they didn't want to get the work done for their own parts. They were given the offer, they turned it down. It was their choice, they had the option to do it themselves.

Repeating yourself? Nothing new here.

I don't doubt it will work on ATi cards. What I find absurd is that nV should be qualifying a routine for ATi. How on Earth is that logical?

Did you even read the link at BSN? I guess not, since you continue to spew your misinformed line. AMD was qualifying the builds. Duh, enough said.

Is AA working on nV cards? Yes. They did their job. They are not in the business of making games, they do not have final say on any code base that is in the game. The developers and the publisher both had complete freedom to refuse their alternative. Furthermore ATi was offered a chance to get it running on their hardware. It is not nV's job to make ATi look good.

Apparently, their lawyers do have the final say. Regardless, you can continue to blame AMD while toeing the nVidia excuse line. You just end up sounding like an uninformed fanboi.

When did nV release a game? Get back to me when that happens; until it does, it's just loyalist rage.

No, but they provided a piece of DirectX code that doesn't follow Microsoft's guidelines for making rendering decisions. If they want to use DirectX caps only when it's convenient for them, they should drop support for the standard! An unsupported standard is not worth supporting. Use it or GTFO.

The AA routine they are running doesn't follow DX's guidelines for AA. Is anyone here honestly under the impression that Sweeney wouldn't have added it in-engine himself if it was some sort of implementation that followed normal DX routines? Seriously, if you think nV is better than Sweeney at working with Sweeney's game engine, you must have a lot less faith in his abilities than I do.

If you know so much, I'm calling you on it. Exactly how does one render MSAA and HDR in DirectX9?

I understand the two general methods for MSAA and HDR in DirectX9. I really don't think you do! If you did, you wouldn't be making the above argument. I'm not afraid of being proven wrong, so get to it! Name the two methods and tell me how they avoid using DirectX calls.

This isn't a Sweeney issue so stop trying to divert it. You're making a false choice!
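For anyone following the technical back-and-forth, here is a minimal sketch of one of the standard DX9 workarounds being alluded to -- render into a multisampled render target, StretchRect-resolve into a plain texture, then run the HDR/post pass from that texture. It assumes a created device and formats already validated with CheckDeviceMultiSampleType; it's an illustration of the general technique, not Rocksteady's or nVidia's actual code:

Code:
#include <windows.h>
#include <d3d9.h>

void RenderWithMsaaResolve(IDirect3DDevice9* dev, UINT w, UINT h) {
    IDirect3DSurface9* msaaRT = NULL;
    IDirect3DTexture9* resolveTex = NULL;
    IDirect3DSurface9* resolveSurf = NULL;

    // Multisampled FP16 color buffer; D3D9 cannot sample this directly
    // as a texture, and FP16 MSAA itself is hardware-dependent.
    dev->CreateRenderTarget(w, h, D3DFMT_A16B16G16R16F,
                            D3DMULTISAMPLE_4_SAMPLES, 0, FALSE,
                            &msaaRT, NULL);
    // Plain (non-multisampled) texture to resolve into.
    dev->CreateTexture(w, h, 1, D3DUSAGE_RENDERTARGET,
                       D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT,
                       &resolveTex, NULL);
    resolveTex->GetSurfaceLevel(0, &resolveSurf);

    dev->SetRenderTarget(0, msaaRT);
    // ... draw the scene into the multisampled target here ...

    // The resolve: StretchRect averages the samples during the copy.
    dev->StretchRect(msaaRT, NULL, resolveSurf, NULL, D3DTEXF_NONE);

    // ... bind resolveTex and run the tone-map / post-process pass ...

    resolveSurf->Release();
    resolveTex->Release();
    msaaRT->Release();
}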
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Did you really just call Nvidia more open than ATi?!? The same ULi SLI lockout, PhysX lockout, Batman: AA lockout Nvidia? Um... I'm not even sure how to respond to that.

Actually, it was the other way around, but through your extremism you read it the way you want to read it.
 

Schmide

Diamond Member
Mar 7, 2002
5,786
1,085
126
Sorry for the late reply. I can see and understand your arguments, and I hadn't looked at any other threads about this topic.

I'm sorry you ignored my reply to you. A lot of what you misstate here was answered there.

"Batman: Arkham Asylum" is developed by Rocksteady Studios Ltd.. Eidos Interactive Ltd. is the Publisher. It is clear that it isn't the in game setting that people are upset about, but the fact that AA does nothing when changing the settings through CCC. AA only works when AA is specified in game, and only Nvidia card can do it by default.

The original version from Rocksteady does not support AA, as the graphics files aren't stored correctly to utilize AA. Nvidia redid all the graphics so they utilize AA. Because of this, Eidos put a simple vendor ID check in the final version, as ATI never submitted their version of the graphics. ATI claims that the final version of the graphics would be identical to the ones Nvidia submitted, but it wouldn't. They may look alike, but not be identical, as it would be done by different people, machines, software and at a different time.

Uhh. I don't think you understand MSAA. Do more research or wait for BenSkywalker to reply.

Another thing ATI complains about is the depth value before AA, which does nothing for ATI's HW as they don't support the 3D version. Without the depth value, 3D images appear to be wrong, flickering.

Wrong again. The depth value is used to resolve the alpha in the MSAA. A lot more complicated than z-buffer flicker.

(snip not worth it just FUD and misunderstanding)

The idea is not to prevent ATI's hardware from using AA, but to make them do it themselves, as those optimized graphics belong to Nvidia, not Rocksteady Studios Ltd. This means that without Nvidia, AA would have no effect, and it has nothing to do with the DirectX API calls.

Please don't act like you know what you're saying. Guess what. You're wrong again.

This is how I understood it after reading HEXUS and BSN. I may be wrong, but I am in no position to judge your point of view; I'm simply sharing mine. I quoted you not because I disagree with you, but because I found it funny that ATI claims a lot without working a lot.

Wow, a truthful statement in bold. You may want to preface all your arguments with that.

PS I guess you can find things funny when you don't really understand them. Huh.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Wrong. People get AA when using the CCC panel options - it takes a larger performance hit though, as it is a brute-force technique. The in-game AA option even refers you to CCC if an ATi card is present.

Some part of that I don't understand, but the main point is that the AA CODE DOES WORK for ATi cards if the vendor lock is hacked.

Again, that is a completely different interpretation of the facts and claims, and people in this thread have already shown AA screenshots of the game on ATi hardware.

The problem is that the implementation being used is the standard one - both NVIDIA and ATi can generate AA from the same input, and then each vendor handles that code in their own way.

I think you are making an interpretation no one else here does.
I know AA works on ATI cards by hacking. I know the images created by Nvidia will also work on ATI cards. I know both vendors can generate AA from the same input, but where does the input come from?

The original code from Rocksteady doesn't support AA, meaning that even if you can specify AA in game, nothing is enhanced. Eidos could have released it as is, with no in-game AA, but Nvidia came in with code that enables AA. Eidos requested this code from ATI, but they said it was going to be the same. To this day, ATI still hasn't sent in their version of the code. Nvidia is not stopping ATI from using AA, just not with Nvidia's code.

Think of it this way. Say Nvidia can make Batman more beautiful via its own driver; would you claim that Nvidia's driver should work on ATI's cards? Right now, Nvidia has a little "driver" within the game that checks the vendor ID. Yes, the driver will also work on ATI's cards, but it is really an Nvidia driver, and they have the right to restrict the conditions under which it runs.

The "driver" is not little as it contains all the images of batman, ones that will enhanced by AA. Nvidia created them from taking the images from Rocksteady's code and modified them. If Rocksteady rejects this modification, then Nvidia can't use it, and Eidos can do nothing but to remove it from the game. However, Rocksteady is okay with it, and requested ATI to modify the source code so that batman can run in AA for both parties. ATI never submit their version of it.

Now, Rocksteady could redo all the images so that both vendors can use AA, but they did not. This has nothing to do with Nvidia. The question is who is going to reinvent the wheel: Rocksteady or ATI? Nvidia is not stopping either of them from writing it, but it does have ownership of the code that it has written.

Legally speaking, Eidos cannot alter any code supplied by Nvidia. They can, however, "accept" or "reject" publishing the code. If they reject it, then no vendor can have AA, because the code supplied by Rocksteady doesn't support it. If they accept it, then they cannot remove just the vendor check. For ATI cards to run AA legally, ATI must supply their version of it, which no one other than Nvidia has.

Again, the actual API calls to enable AA are not blocked by Nvidia, as they can't be; but calling the API without their code has no result, meaning AA does not enhance the graphics via the original code.

Is this a dirty trick? Well, Nvidia didn't need to write that piece of code; they did it for their own customers. Can Nvidia create something for its own customers? Yes, I know the code they have written does work on ATI, but that isn't the question.

Say you bought a game and your friend wants you to share the CD key. Is it wrong not to share? Are you hurting your friend by not sharing? Is the invention of the CD key bad, since its sole purpose is to protect developers and ownership?
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
HMMM But it does look so much better on Nvidia with PhysX.

I don't have an nVidia card and yet I'm enjoying the PhysX effects thanks to my AGEIA PPU; Batman without PhysX doesn't feel alive. But it's a pity that content which should be a standard for everyone is locked to a single vendor, like the anti-aliasing. DirectX vs GLIDE again.

Such bold moves show that nVidia is the Apple of video cards.
 

Schmide

Diamond Member
Mar 7, 2002
5,786
1,085
126
I know AA works on ATI cards by hacking. I know the images created by Nvidia will also work on ATI cards. I know both vendors can generate AA from the same input, but where does the input come from?

You really, really don't understand rendering, do you? Stop acting like you do.

The original code from Rocksteady doesn't support AA, meaning that even if you can specify AA in game, nothing is enhanced. Eidos could have released it as is, with no in-game AA, but Nvidia came in with code that enables AA. Eidos requested this code from ATI, but they said it was going to be the same. To this day, ATI still hasn't sent in their version of the code. Nvidia is not stopping ATI from using AA, just not with Nvidia's code.

Actually, you're wrong here. Again. The Unreal Engine supports MSAA and HDR, just not at the same time. There are two DirectX-specific methods for getting around this; until you understand them, stop acting like you do.

If all you're going to do is repeat the nVidia line, you do yourself and your community no justice.

Think of it this way. Say Nvidia can make Batman more beautiful via its own driver; would you claim that Nvidia's driver should work on ATI's cards? Right now, Nvidia has a little "driver" within the game that checks the vendor ID. Yes, the driver will also work on ATI's cards, but it is really an Nvidia driver, and they have the right to restrict the conditions under which it runs.

The "driver" is not little as it contains all the images of batman, ones that will enhanced by AA. Nvidia created them from taking the images from Rocksteady's code and modified them. If Rocksteady rejects this modification, then Nvidia can't use it, and Eidos can do nothing but to remove it from the game. However, Rocksteady is okay with it, and requested ATI to modify the source code so that batman can run in AA for both parties. ATI never submit their version of it.

Now, Rocksteady could redo all the images so that both vendors can use AA, but they did not. This has nothing to do with Nvidia. The question is who is going to reinvent the wheel: Rocksteady or ATI? Nvidia is not stopping either of them from writing it, but it does have ownership of the code that it has written.

(Not literal) Are you on crack? The driver contains images?

Please, please stop while you're ahead. At the very least, wait for BenSkywalker to enlighten us.

(snip not even worth it.)
 
Last edited: