I think he means how various people found a way to get AA to work on ATI cards. As in we, the enthusiasts.
Did you even read the link?
It’s also worth noting here that AMD have made efforts both pre-release and post-release to allow Eidos to enable the in-game antialiasing code - there was no refusal on AMD’s part to enable in game AA IP in a timely manner.
This, though, is the real kick in the ass: part of the code is running on ATi hardware whether AA is enabled or not.
This is clearly Eidos's fault, either for not putting in AA themselves if they really wanted it, or at least for denying the use of the code once they found out Nvidia was going to block some of Eidos's customers from it.
AA is part of the DX API and shouldn't be locked to one vendor.
Said another way: "nVidia paid to get all that work done, why shouldn't ATi benefit?"
ATi was offered a chance to write their own AA code; they chose not to. That was entirely their choice - they don't think they need to support their customers in that fashion. The same can be said about nV and DX11 in certain titles. The big difference is you don't see the temper tantrums quite the same way, not even from Wreckage.
Do you understand what they are saying in that quote? They are stating that they need to run a different version of the engine altogether. Writing a depth value for alpha isn't something you toggle on the fly as you can do with AA.
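To make that distinction concrete, here is a minimal Direct3D 9 sketch of my own - nothing from the game or either vendor, and the shader fragment in the comment is purely illustrative. Toggling multisampling on an already-multisampled target is a render state you can flip at runtime; packing a depth value into the alpha channel is something the shaders themselves have to output, which means a different engine path.

```cpp
#include <d3d9.h>

// Runtime toggle: only meaningful if the render target was created as a
// multisampled surface in the first place.
void ToggleMsaa(IDirect3DDevice9* dev, bool enable)
{
    dev->SetRenderState(D3DRS_MULTISAMPLEANTIALIAS, enable ? TRUE : FALSE);
}

// "Writing a depth value for alpha", by contrast, happens inside the pixel
// shader (illustrative HLSL only, not the game's actual shader):
//
//   float4 main(PSIn i) : COLOR
//   {
//       float3 lit = Shade(i);                    // normal shading
//       return float4(lit, i.viewDepth * InvFar); // depth packed into alpha
//   }
//
// Switching to or from that scheme means building and selecting a different
// shader/engine path, not flipping a state at runtime.
```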
Eidos already stated they blocked it based on what their legal team told them to do. They claimed they did this due to the contract with nV; nV's contract is based on ownership of the PhysX code, and it does not encompass AA.
Is this too complicated for you on a technical level? Do you understand how they are doing AA and that it isn't anything remotely like DX's AA support? If a game came out with accumulation-buffer-style SSAA built into the engine because ATi wanted it coded that way, it would be nV's job to get support up and running on their boards; it is not covered under the DX spec.
Before you act all high and mighty, you should read the AMD perspective here. They explain in great detail what they found the engine doing.
All cards run part of this code with 0x AA, so running without AA the 4890 should be faster.
Let's discuss this "running partially all the time" - personally, I've had some posters claim a detrimental performance loss, but where is this loss in frame rate with no AA? If anything, ATI offers much more performance with an HD 4890 over a GTX 260 - and it's not even close.
NVidia was given the chance to follow the DirectX guidelines.
Actually it is. You check the CAPS BITS and if the surface supports it, you can do it.
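For anyone following along, this is roughly what "checking the caps" looks like in Direct3D 9 - a sketch of my own, with the 4x sample count and back-buffer format picked just for the example:

```cpp
#include <d3d9.h>

// Returns true if the adapter can render 4x MSAA to a common back-buffer format.
// No vendor ID appears anywhere - the API reports what the hardware supports.
bool Supports4xMsaa(IDirect3D9* d3d, UINT adapter)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(adapter,
                                                 D3DDEVTYPE_HAL,
                                                 D3DFMT_X8R8G8B8,          // assumed back-buffer format
                                                 TRUE,                     // windowed
                                                 D3DMULTISAMPLE_4_SAMPLES,
                                                 &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}
```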
Before you act all high and mighty, you should read the AMD perspective here.
It is too complicated if you talk in half-truths and make assumptions about the DirectX level of compatibility of ATI cards.
I can't believe you condone Nvidia's actions here.
You're right though, I don't understand how Nvidia can get away with sabotaging their competitor's performance.
Please enlighten me, and while you're at it, please explain how Nvidia's code works perfectly on ATi cards if it doesn't follow the DX spec.
Sorry for the late reply. I can see and understand your arguments and I didn't look at any threads about this topic.
Man, you need to do some reading in this debate.
I'm not saying these things are true, but those are the things AMD claim.
- They said they circumvented the vendor lock and AA worked fine;
- They claimed that even though AA is disabled for ATI hardware, the initial AA code is still being run on their cards, so they are getting the performance penalty as if they were using AA but getting no result.
All of this has been discussed in other threads here, and you can find them for yourself.
Again, I'm not saying who is right or wrong, or who's lying or not; I'm just correcting your statements by providing the actual AMD claims, or at least the way they present their side of the story.
Oh, come on! It's a Vendor ID lockout! Remove the lockout and the AA works just fine on ATi hardware. Nvidia actually had to do extra work to prevent it from working on ATi cards by adding the lockout.
Of course, it isn't Eidos's job, it wasn't Epic's job, and it wasn't ATi's job; it was nVidia's job to get AA running on ATi hardware by doing it the way that you think it should be done.
Again, you're deliberately ignoring the fact that the MSAA works just fine on ATi hardware if you remove the lockout. The lockout is extra coding on top of just that programming necessary for AA to function.
So nVidia is now a game developer? Seems to me they were asked to get AA working on their hardware and they did. ATi was asked to get AA working on their hardware and they did not.
Because it is. I suppose the developers just threw that ATi lockout in there for the heck of it. Yes, I'm sure deliberately making their game look worse for a good portion of their potential customer base is a standard business practice for them.
And somehow it's nVidia's fault.
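For what it's worth, the kind of check being argued about amounts to something like this - my own guess at its shape, not the game's actual code, with the function name and gating logic purely hypothetical:

```cpp
#include <d3d9.h>

static const DWORD VENDOR_NVIDIA = 0x10DE;   // PCI vendor ID for NVIDIA

// Hypothetical gate: offer the in-game MSAA option only when the primary
// adapter reports NVIDIA's vendor ID, regardless of what the hardware can do.
bool AllowInGameMsaa(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;
    return id.VendorId == VENDOR_NVIDIA;     // ATI (0x1002) and everyone else refused
}
```

Removing a check like that is all the reported "hack" amounts to; the argument in this thread is over whether the check should have been there at all.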
We could ask you why you have this need to keep Nvidia in a divine spot, since obviously (to you) they can do no wrong.
I understand some people have the need to keep ATi in a divine spot; they can do no wrong. Since this is such a typical AA routine, why didn't Sweeney use it in the first place?
Why would they need to? It's standard code. So they needed to fly an engineer to Rocksteady Studios to tell them to implement the same AA code that they were informed would already be present?
This is them whining because they didn't want to do the work for their own parts. They were given the offer and they turned it down. It was their choice; they had the option to do it themselves.
Nobody was asking Nvidia to qualify the coding for ATi cards. They were doing their own qualifying in-house.
I don't doubt it will work on ATi cards. What I find absurd is that nV should be qualifying a routine for ATi. How on Earth is that logical?
Just because the developer has the right to refuse the code does not absolve Nvidia from insisting that the Vendor ID lockout be included if Rocksteady wanted to use the MSAA. It's dirty pool no matter how many directions you try to twist it.
Is AA working on nV cards? Yes. They did their job. They are not in the business of making games; they do not have final say on any code base that is in the game. The developers and the publisher both had complete freedom to refuse their alternative.
And this has exactly what to do with anything? Nice smoke screen, however.
When did nV release a game? Get back to me when that happens; until it does, it's just loyalist rage.
How does that excuse including a Vendor ID lockout? You don't think that Nvidia cards have displayed content in the past that was put forth through the efforts of ATi? You don't think they will again in the future, especially with DX11?
The AA routine they are running doesn't follow DX's guidelines for AA. Is anyone here honestly under the impression that Sweeney wouldn't have added it to the engine himself if it were some sort of implementation that followed normal DX routines? Seriously, if you think nV is better than Sweeney at working with Sweeney's own engine, you must have a lot less faith in his abilities than I do.
On the "nVidia paid for it, why should ATI benefit" argument. ATI has provided funding for other games in the past and prior to this but the fact is that these games have never been coded to perform worse or have less features on competing products. While ATI's position in the market is weaker than nVidia you don't see them engaging in anti-competitive moves such as stipulating that a game looks worse or is less featured on a competitor's product. And I'm not talking about coding in features that uses effects that only your hardware can provide such as PhysX but "standardized" features.
The question for those who seem to be defending nVidia's decision is: are you saying it is OK for future games that ATI sponsors or provides funding for to lock DX11 features out from working on nVidia hardware? Last I heard, ATI was working with developers on about 20 games utilizing DX11. Is it OK for these games to have code that locks out nVidia hardware from DX11 and makes it use a different DX variant such as DX9 or DX10 instead? I mean, ATI paid to have these features put in, so why should nVidia benefit? This is not a new question; it has been asked many times by supporters of nVidia's decision in this AA lockout issue.
This isn't about whether ATI or nVidia can do no wrong, as you put it. This is about me as a consumer. The bottom line is that this move by nVidia hurts consumers in the long run. I don't want to see the day when I have a very basic game that can run on any hardware but need an ATI card to get certain effects because ATI sponsored it, and then need to pop in an nVidia card for a different game to get the most out of it because nVidia sponsored it.
As a consumer, you question why your ATI card can't run AA in Batman. As a consumer, you know Nvidia is the one who spent their money on the development, yet you still object to the use of a vendor check. As a consumer, you want everything to work on whatever you bought rather than buying the right card.
"Batman: Arkham Asylum" is developed by Rocksteady Studios Ltd., and Eidos Interactive Ltd. is the publisher. It is clear that it isn't the in-game setting that people are upset about, but the fact that AA does nothing when you change the settings through CCC. AA only works when it is specified in game, and only Nvidia cards can do it by default.
The original version from Rocksteady does not support AA, as the graphics files aren't stored correctly to utilize AA. Nvidia re-stored all the graphics so that AA can be used. Because of this, Eidos put a simple vendor ID check in the final version, as ATI never submitted their own version of the graphics. ATI claims that their final version of the graphics would be identical to the ones Nvidia submitted, but it wouldn't be. They might look alike, but not identical, as they would be done by different people, machines, software and at a different time.
Another thing ATI complains about is the depth value written before AA, which does nothing for ATI's hardware as they don't support 3D Vision. Without the depth value, 3D Vision images appear wrong and flicker.
The fix isn't to remove the vendor check, but to redo all the graphics. The current optimized graphics belong to Nvidia, and won't be used when AA is not enabled. That means even if an ATI user chooses to enable AA in CCC, the graphics will not be enhanced. Now, Eidos could have Rocksteady redo all the graphics just so that ATI's hardware works without using Nvidia's optimized graphics, but why would they want to do that? After all, Nvidia helped them throughout development, and redoing every graphic takes time and money. All Eidos asks is for ATI to contribute something to make their hardware work instead of just asking for it. There are much better ways to lock out ATI's hardware than a simple vendor ID check, but that will do, as Eidos does want ATI users to play its game.
The idea is not to prevent ATI's hardware from using AA, but to make them do it themselves, as those optimized graphics belong to Nvidia, not Rocksteady Studios Ltd. This means that without Nvidia, AA would have no effect, and it has nothing to do with the DirectX API calls.
This is how I understood it after reading HEXUS and BSN. I may be wrong, but I am in no position to judge your point of view; I am simply sharing mine. I quoted you not because I disagree with you, but because I found it funny that ATI claims a lot without doing a lot of the work.
Why would they need to? The standard code provided by Nvidia works just fine on ATi hardware. The standard code plus the Nvidia Vendor ID lockout is what's the problem.
But if this was so trivial and easy to do, ATI could just do the work themselves and be done with it, too, instead of airing dirty laundry aimed at gamers and consumers.
So I suppose the time and money ATi is spending on DX11 titles won't be of benefit to Nvidia at some point in the future?
Did you really just call Nvidia more open than ATi?!? The same ULi SLI lockout, PhysX lockout, Batman: AA lockout Nvidia? Um... I'm not even sure how to respond to that.
Show everyone that they are willing to be just as pro-active and much more open, too, instead of complaining about it. Do more, complain less, and show how developer relations are supposed to work.
Well, finally something we can agree on. The part about Nvidia being questionable, that is. And ATi is being pro-active with regards to DX11 content. There you go.
I would rather see very pro-active developer relations with some questionable examples once in a while than reactive developer relations that may be more ethical but is whiny.
Oh, we can see what gets done when developers bow to Nvidia pressure, all right.
It's not what you say but what actually gets done.
Of course, it isn't Eidos's job, it wasn't Epic's job, and it wasn't ATi's job; it was nVidia's job to get AA running on ATi hardware by doing it the way that you think it should be done. So, if nV releases a part at some point that is more than twice as fast as ATi's, is it then nVidia's fault for not doing ATi's R&D for them too? Is it nVidia's job to pay ATi enough to be profitable too?
So nVidia is now a game developer? Seems to me they were asked to get AA working on their hardware and they did. ATi was asked to get AA working on their hardware and they did not. And somehow it's nVidia's fault. I understand some people have the need to keep ATi in a divine spot; they can do no wrong. Since this is such a typical AA routine, why didn't Sweeney use it in the first place?
This is them whining because they didn't want to do the work for their own parts. They were given the offer and they turned it down. It was their choice; they had the option to do it themselves.
I don't doubt it will work on ATi cards. What I find absurd is that nV should be qualifying a routine for ATi. How on Earth is that logical?
Is AA working on nV cards? Yes. They did their job. They are not in the business of making games; they do not have final say on any code base that is in the game. The developers and the publisher both had complete freedom to refuse their alternative. Furthermore, ATi was offered a chance to get it running on their hardware. It is not nV's job to make ATi look good.
When did nV release a game? Get back to me when that happens; until it does, it's just loyalist rage.
The AA routine they are running doesn't follow DX's guidelines for AA. Is anyone here honestly under the impression that Sweeney wouldn't have added it to the engine himself if it were some sort of implementation that followed normal DX routines? Seriously, if you think nV is better than Sweeney at working with Sweeney's own engine, you must have a lot less faith in his abilities than I do.
I know AA works on ATI cards by hacking. I know the images created by Nvidia will also work on ATI cards. I know both vendors can generate AA from the same input, but where does the input come from?
Wrong. People get AA when using the CCC panel options - it takes a larger performance hit though, as it is a brute-force technique. The game's AA option even refers you to CCC if an ATi card is present.
Some part of that I don't understand, but the main point is that the AA CODE DOES WORK on ATi cards if the vendor lock is hacked.
Again, that is a completely different interpretation of the facts and claims, and people in this thread have already posted AA screenshots of the game on ATi hardware.
The problem is that the implementation being used is the standard one - both NVIDIA and ATi can generate AA from the same input, and then each vendor handles that code in their own way.
I think you are doing an interpretation no one else here does.
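On the "standard implementation" point above: whether or not Batman's engine path looks exactly like this, here is a minimal sketch of my own (not code from the game, the engine, or either vendor) of ordinary, vendor-agnostic MSAA in Direct3D 9 - render into a multisampled target and let the driver resolve it. The function name, format, and sample count are assumptions, and a matching multisampled depth buffer plus the rest of the setup are omitted for brevity.

```cpp
#include <d3d9.h>

// Assumes 'dev' is an existing device and 'backBuffer' is the plain surface
// the resolved image should end up in.
HRESULT RenderWithMsaa(IDirect3DDevice9* dev, IDirect3DSurface9* backBuffer,
                       UINT width, UINT height)
{
    IDirect3DSurface9* msaaTarget = nullptr;
    HRESULT hr = dev->CreateRenderTarget(width, height, D3DFMT_X8R8G8B8,
                                         D3DMULTISAMPLE_4_SAMPLES, 0, FALSE,
                                         &msaaTarget, nullptr);
    if (FAILED(hr)) return hr;

    dev->SetRenderTarget(0, msaaTarget);
    // ... draw the scene into the multisampled target here ...

    // The resolve: the driver averages the samples down into the back buffer.
    hr = dev->StretchRect(msaaTarget, nullptr, backBuffer, nullptr, D3DTEXF_NONE);

    msaaTarget->Release();
    return hr;
}
```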
HMMM But it does look so much better on Nvidia with PhysX.
I know AA works on ATI cards by hacking. I know the images created by Nvidia will also work on ATI cards. I know both vendors can generate AA from the same input, but where does the input come from?
The original code from Rocksteady doesn't support AA, meaning that even if you could specify AA in game, nothing would be enhanced. Eidos could have released it as it was, with no in-game AA, but Nvidia came in with code that enables AA. Eidos requested this code from ATI as well, but they said it would be the same. To this day, ATI still hasn't sent in their version of the code. Nvidia is not stopping ATI from using AA, just not with Nvidia's code.
Think of it this way: say Nvidia can make Batman more beautiful via its own driver; would you claim that Nvidia's driver should work on ATI's cards? Right now, Nvidia has a little "driver" within the game that checks the vendor ID. Yes, that driver would also work on ATI's cards, but it is really an Nvidia driver, and they have the right to restrict the conditions under which it runs.
The "driver" is not little, as it contains all the images of Batman, the ones that will be enhanced by AA. Nvidia created them by taking the images from Rocksteady's code and modifying them. If Rocksteady had rejected this modification, then Nvidia couldn't have used it, and Eidos could have done nothing but remove it from the game. However, Rocksteady was okay with it, and requested that ATI modify the source code so that Batman could run with AA for both parties. ATI never submitted their version of it.
Now, Rocksteady could redo all the images so that both vendors can use AA, but they did not. This has nothing to do with Nvidia. The question is who is going to reinvent the wheel: Rocksteady or ATI? Nvidia is not stopping either of them from writing it, but it does have ownership of the code that it has written.
