Batman AA fiasco: Who's telling the truth?


Seero

Golden Member
Nov 4, 2009
1,456
0
0
I'm sorry you ignored my reply to you. Much of what you get wrong here was answered there.

Uhh. I don't think you understand MSAA. Do more research or wait for BenSkywalker to reply.

Wrong again. The depth value is used to resolve the alpha in the MSAA. It's a lot more complicated than z-buffer flickering.

Please don't act like you know what you're saying. Guess what. You're wrong again.

Wow a truthful statement in bold. You may want to preface all your arguments with that.

PS I guess you can find things funny when you don't really understand them. Huh.
I think you are too aggressive to reply to. You have 30 years of programming experience and don't know the existence of "junk code".

I am not the one who said depth value is useless, AMD did.
Naturally, seeing the code that states MSAA being a trademark of any company would raise the alarm in competing technologies. According to Richard, both ATI and nVidia recommend the following standard technique for enabling MSAA: "storing linear depth into the alpha channel of the RenderTarget and using that alpha value in the resolve pass. You only need to linearize it if you are storing something like 8-bit color so that you can get depth approximation, stuff it into alpha channel and then when you've finished rendering at high resolution you simply filter down the color values and use depth value maintain some kind of quality to your Anti-Aliasing so you don't just average."
...
What got AMD seriously aggravated was the fact that the first step of this code is done on all AMD hardware: "'Amusingly', it turns out that the first step is done for all hardware (even ours) whether AA is enabled or not! So it turns out that NVidia's code for adding support for AA is running on our hardware all the time - even though we're not being allowed to run the resolve code!
So… They've not just tied a very ordinary implementation of AA to their h/w, but they've done it in a way which ends up slowing our hardware down (because we're forced to write useless depth values to alpha most of the time...)!"
Source
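The resolve step Huddy describes, averaging the high-resolution color samples while using a depth value stashed in the alpha channel to keep edge quality, can be sketched numerically. This is an illustrative Python model of the idea, not the game's actual shader; the weighting function is my own assumption:

```python
# Illustrative sketch (not the game's shader code): resolving MSAA samples
# when linear depth has been stored in the alpha channel.
# Each sample is (r, g, b, a), with a = linear depth in [0, 1].

def plain_resolve(samples):
    """Naive box filter: average every channel, depth included."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(4))

def depth_weighted_resolve(samples, sharpness=8.0):
    """Weight each color sample by how close its depth is to the nearest
    depth in the pixel, instead of blindly averaging (hypothetical weights)."""
    near = min(s[3] for s in samples)
    weights = [1.0 / (1.0 + sharpness * (s[3] - near)) for s in samples]
    total = sum(weights)
    return tuple(
        sum(w * s[i] for w, s in zip(weights, samples)) / total
        for i in range(3)
    )

# Four samples on a silhouette edge: three hit the near red object
# (depth 0.1), one hits the far blue background (depth 0.9).
edge = [(1.0, 0.0, 0.0, 0.1)] * 3 + [(0.0, 0.0, 1.0, 0.9)]
print(plain_resolve(edge))           # plain average: background bleeds in fully
print(depth_weighted_resolve(edge))  # depth-aware: background contribution damped
```

The point of the sketch is only that "you don't just average": the stored depth lets the resolve favor the surface that actually owns the pixel.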

Yes, I know you think that I don't know anything about MSAA, but that is your opinion and it doesn't contradict what I have said. Mind you, I could say I have 3000 years of programming experience and can fly too. Welcome to the internet.

Yes, the code from Nvidia is based on this standard, but that does not mean any code written to a standard should run for free on every piece of hardware that supports that standard. My PC can run Vista, yet I still have to buy it due to licensing. Yes, I could "hack" it so that I don't need to pay, but I have no problem paying for what I use. All games that run on DirectX follow the same standard, but you still have to pay for each game individually if you want to play it. Nvidia created something and it is free for Nvidia users. I really don't see any contradiction.

I don't act as if I know, and I have clearly stated that those were my opinions. Why are you having problems with that? I don't reply to you because I have no intention of starting rages in this thread.
 
Last edited:

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
No! If you actually read what I wrote, it's nVidia's job to uphold the DirectX standards they helped create!

DirectX is 100% controlled by MS without exception. It is an entirely closed and proprietary standard. It is in no way, shape or form nVidia's job to uphold MS's standards. What you are doing now would be akin to saying it is MS's job to uphold the PhysX standard. It simply is not true.

Apparently you put profits above community.

I am a capitalist, not a communist. In no way do I think it is anyone's obligation to spend their resources for the benefit of their competitor. If they want to do so that is their business, but in the real world that is best described most of the time as bad business. This isn't like we are talking about saving the lives of sick children here.

At first it seemed like you really understood how it was implemented (technically), now I'm starting to wonder if you really understand the issue.

Is the issue with deferred rendering using AA under DX9, or with the render-to-texture style workaround that they are using? Or are you simply talking about a vendor ID lockout over DX caps? Which issue would you like to discuss?

Did you even read the link at BSN? I guess not since you continue to spew your misinformed line. AMD was qualifying the builds.

Your assertion seems to be that it is nVidia's obligation to make ATi look better. ATi was offered the chance to submit their own code; they declined. Per AMD's legal department, they have abridged the emails they are releasing to the web community. Handy.

No but they provided a piece of DirectX code that doesn't follow Microsoft's guidelines for making rendering decisions.

MS does not own the World. nVidia created a workaround to a limitation in Sweeney's engine, that in no way means that ATi has the right to use that code.

Exactly how does one render MSAA and HDR in DirectX9?

The issue is with deferred rendering, not HDR. You sure you want to debate these points?

This isn't a Sweeney issue so stop trying to divert it.

It is a limitation of Sweeney's engine, nVidia created a workaround for their own cards. This is not a diversion, it is what happened. nVidia took care of their customers, most people consider that a good thing. If I go to Burger King and get shitty fries I don't expect McDonalds to take care of the problem for me :)

But it's a pity that part of a game's content which should be standard for everyone, like the anti-aliasing, is locked to a single vendor. DirectX vs GLIDE again.

That is a reasonable stance to have, IMO. ATi was, however, given the option to get it running on their own hardware. It seems that the general consensus amongst the red leaners is that no one should have gotten any AA in this game at all. nVidia took the time to do it for their customers; ATi did not.

Such bold moves shows that nVidia is the Apple of the videocards.

That is a stretch; if that were the case the game wouldn't run at all on hardware other than nV's ;)
 

Schmide

Diamond Member
Mar 7, 2002
5,587
719
126
I think you are too aggressive to reply to. You have 30 years of programming experience and don't know the existence of "junk code".

To so called junks to ati's card are not junks to nvidia's card.

Big difference between saying "junk code" and "called junks". I was trying to form some acronym around it.

I aggressively research and get my facts right, something you may want to try and do. When you spew misinformation as fast as you do, it deserves an aggressive reply.

I am not the one who said depth value is useless, AMD did.

No you were the one that took AMD's use of depth value in the Alpha channel out of context. Was it wrong for me to correct it?

Yes, I know you think that I don't know anything about MSAA, but that is your opinion and it doesn't contradict what I have said. Mind you, I could say I have 3000 years of programming experience and can fly too. Welcome to the internet.

I now call on you and BenSkywalker to grace us with this great knowledge. Put up or shut up. Pretending will just get you called out.

Yes, the code from Nvidia is based on this standard, but that does not mean any code written to a standard should run for free on every piece of hardware that supports that standard. My PC can run Vista, yet I still have to buy it due to licensing. Yes, I could "hack" it so that I don't need to pay, but I have no problem paying for what I use. All games that run on DirectX follow the same standard, but you still have to pay for each game individually if you want to play it. Nvidia created something and it is free for Nvidia users. I really don't see any contradiction.

You need to go to more Microsoft DirectX conferences. Once you do, you'll understand.

I don't act as if I know, and I have clearly stated that those were my opinions. Why are you having problems with that? I don't reply to you because I have no intention of starting rages in this thread.

Actually you act as if you know. We all have opinions, but you phrase your opinions as fact which is disingenuous. When you're corrected (so called aggressively) you ignore it and continue. If you don't want to be corrected, check your facts.

BTW I'm not upset. I can separate the argument from the individual. I enjoy your contributions to this thread, it shines a light on what misinformation is out there.
 
Last edited:

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I now call on you and BenSkywalker to grace us with this great knowledge. Put up or shut up. Pretending will just get you called out.

Call out on what? On how to perform AA? Which method do you want to get into? You want to go back old school and talk about accumulation-buffer style multisampling, or would you rather use the simpler supersampling that became popular for a while, or maybe 3dfx's take on AB techniques with their VSA series? Or the DX version of multisampling, where you need differing Z values for edge AA only? Or perhaps you want to discuss a deferred render-to-texture with object-flagged AA? What types of sampling patterns would you like to use? Ordered grid, rotated grid, stochastic, pseudo-stochastic (either OG or RG varying on pixel cluster)?

What type would you like to discuss?
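Of the methods listed above, the easiest to show concretely is ordered-grid supersampling: render at a multiple of the target resolution, then box-filter down. A minimal illustrative sketch, with a tiny grayscale "render target" standing in for a real framebuffer:

```python
# Minimal sketch of ordered-grid supersampling (OGSS): render at 2x2 the
# target resolution, then box-filter each 2x2 block down to one pixel.
# "hi_res" stands in for the oversized render target.

def downsample_2x(hi_res):
    """Average each 2x2 block of a grayscale image (list of rows)."""
    out = []
    for y in range(0, len(hi_res), 2):
        row = []
        for x in range(0, len(hi_res[0]), 2):
            block = (hi_res[y][x] + hi_res[y][x + 1] +
                     hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

# A hard, jagged edge rendered at 4x4 becomes a 2x2 image whose edge
# pixels take intermediate values -- that is the anti-aliasing.
hi = [
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 1.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 1.0, 1.0, 1.0],
]
lo = downsample_2x(hi)
print(lo)  # [[0.25, 1.0], [0.25, 1.0]]
```

MSAA differs from this brute-force approach in that only coverage/depth is sampled at the higher rate, which is why the edge-only behavior BenSkywalker mentions applies.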
 

Schmide

Diamond Member
Mar 7, 2002
5,587
719
126
The issued with deferred rendering using AA under DX9, or the render to texture style workaround that they are using? Or are you simply talking about a vendor ID lockout over DX caps? Which issue would you like to discuss?

MS does not own the World. nVidia created a workaround to a limitation in Sweeney's engine, that in no way means that ATi has the right to use that code.

The issue is with deferred rendering, not HDR. You sure you want to debate these points?

Why would I not ask it? But since you put forth a little:

The other method is an RGBE surface, which avoids the need for an FP16 surface. (Source engine)
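The RGBE trick mentioned here (used by the Source engine, and the same idea as Radiance's .hdr format) packs an HDR color into four 8-bit channels by storing 8-bit mantissas plus a shared exponent in the fourth channel. A rough sketch of the Radiance-style scheme, not Valve's actual code; engine-specific variants differ:

```python
import math

# Sketch of RGBE encoding: store an HDR RGB color in four 8-bit channels
# by keeping 8-bit mantissas plus a shared exponent in the fourth channel.
# Radiance-style scheme; real engine implementations vary.

def rgbe_encode(r, g, b):
    v = max(r, g, b)
    if v < 1e-32:
        return (0, 0, 0, 0)
    mantissa, exponent = math.frexp(v)   # v = mantissa * 2**exponent
    scale = mantissa * 256.0 / v
    return (int(r * scale), int(g * scale), int(b * scale), exponent + 128)

def rgbe_decode(re, ge, be, e):
    if e == 0:
        return (0.0, 0.0, 0.0)
    scale = math.ldexp(1.0, e - 128 - 8)  # 2**(e - 128) / 256
    return (re * scale, ge * scale, be * scale)

# Round-trip an HDR value well above 1.0 -- impossible in plain 8-bit RGB.
packed = rgbe_encode(12.0, 3.0, 0.5)
print(rgbe_decode(*packed))  # approximately (12.0, 3.0, 0.5)
```

The relevance to the AA debate is that an RGBE target is an ordinary 8-bit-per-channel surface, so it sidesteps hardware limits on multisampled FP16 render targets.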

Now move on to how the above deferred rendering is done outside DirectX calls.
 

Schmide

Diamond Member
Mar 7, 2002
5,587
719
126
Call out on what? On how to perform AA? Which method do you want to get into? You want to go back old school and talk about accumulation-buffer style multisampling, or would you rather use the simpler supersampling that became popular for a while, or maybe 3dfx's take on AB techniques with their VSA series? Or the DX version of multisampling, where you need differing Z values for edge AA only? Or perhaps you want to discuss a deferred render-to-texture with object-flagged AA? What types of sampling patterns would you like to use? Ordered grid, rotated grid, stochastic, pseudo-stochastic (either OG or RG varying on pixel cluster)?

What type would you like to discuss?

The one that goes outside of DirectX calls please.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The one that goes outside of DirectX calls please.

Wow, maybe do a bit of research first. I listed multiple types of AA that DirectX does not support at all, including hardware controlled methods and ones that were never part of the DirectX API. As someone with as much experience as you have, it should be simple to spot them :)

Now move on to how the above deferred rendering is done outside DirectX calls.

The problem is with the engine, not the API. The engine under DX9 doesn't support AA with deferred rendering. Go ahead and look it up :)
 

Schmide

Diamond Member
Mar 7, 2002
5,587
719
126
Wow, maybe do a bit of research first. I listed multiple types of AA that DirectX does not support at all, including hardware controlled methods and ones that were never part of the DirectX API. As someone with as much experience as you have, it should be simple to spot them :)

Well I guess if you consider the shader code "Outside of DirectX"?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Well I guess if you consider the shader code "Outside of DirectX"?

TBuffer and ABuffer types of AA are not supported by DirectX and never have been.

Not even in a texture render target?

That is a workaround, Sweeney could have had that built into the engine by default, but he didn't.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Big difference between saying "junk code" and "called junks". I was trying to form some acronym around it.

I aggressively research and get my facts right, something you may want to try and do. When you spew misinformation as fast as you do, it deserves an aggressive reply.

No you were the one that took AMD's use of depth value in the Alpha channel out of context. Was it wrong for me to correct it?

I now call on you and BenSkywalker to grace us with this great knowledge. Put up or shut up. Pretending will just get you called out.

You need to go to more Microsoft DirectX conferences. Once you do, you'll understand.

Actually you act as if you know. We all have opinions, but you phrase your opinions as fact which is disingenuous. When you're corrected (so called aggressively) you ignore it and continue. If you don't want to be corrected, check your facts.

BTW I'm not upset. I can separate the argument from the individual. I enjoy your contributions to this thread, it shines a light on what misinformation is out there.
I may be inexperienced when it comes to forum chatting. You are the most aggressive person I have yet met. Greetings, my friend.

You mentioned something about research and facts. What are they? Can you enlighten me on some of the facts that you have obtained in another thread? I don't think this thread is about "How Multisample anti-aliasing is done on Batman:Arkham Asylum".

My own interpretation of my research is that Nvidia wrote something which allows AA to enhance Batman's graphics. ATI was given the offer but never actually submitted similar code on this matter. I know that Nvidia has no right to "block AA", and they have not. I know that Nvidia put a "vendor block" on the code of theirs that allows AA to enhance the graphics. I know that ATI has not contributed to solving this problem.

To ease your emotions: I do not have access to the source code, contracts, or any documentation between Rocksteady, Eidos, ATI, and/or Nvidia. I don't know how exactly AA was done in Batman, and I am not good with terms. I don't know which code belongs to Nvidia and which code belongs to Rocksteady.

I do know that every company wants to make more money. I believe that Eidos will publish a game for maximum profit. Allowing a "vendor block" contradicts the maximum-profit idea, so they must have a reason. I do know that Batman's graphics won't be enhanced by AA without Nvidia's code. I do know that Rocksteady would have to do a lot of work to natively support AA; as a result, it does not. If it did, Eidos could tell Nvidia to go away and enable AA for everyone.

If having MSAA is such an easy task and you claim that you know how, why don't you write it out and send it to Eidos so all ATI users can enjoy AA without the need for "hacking"? Either you ...
...Put up or shut up. Pretending will just get you called out.
 
Last edited:

Schmide

Diamond Member
Mar 7, 2002
5,587
719
126
I may be inexperienced when it comes to forum chatting. You are the most aggressive person I have yet met. Greetings, my friend.

Thanks. In the end, one's style is only as good as one's argument.

You mentioned something about research and facts. What are they? Can you enlighten me on some of the facts that you have obtained in another thread? I don't think this thread is about "How Multisample anti-aliasing is done on Batman:Arkham Asylum".

Well you can take the world on faith or you can use conjecture. When part of the argument rests on whether the solution was outside DirectX, you must discuss the method.

My own interpretation of my research is that Nvidia wrote something which allows AA to enhance Batman's graphics. ATI was given the offer but never actually submitted similar code on this matter. I know that Nvidia has no right to "block AA", and they have not. I know that Nvidia put a "vendor block" on the code of theirs that allows AA to enhance the graphics. I know that ATI has not contributed to solving this problem.

This is a fair opinion to have. You are however, just repeating the nVidia statement.

To ease your emotions: I do not have access to the source code, contracts, or any documentation between Rocksteady, Eidos, ATI, and/or Nvidia. I don't know how exactly AA was done in Batman, and I am not good with terms. I don't know which code belongs to Nvidia and which code belongs to Rocksteady.

We are all in the same boat. Only so many facts have been released. If you don't understand a term or something, we can help you.

I do know that every company wants to make more money. I believe that Eidos will publish a game for maximum profit. Allowing a "vendor block" contradicts the maximum-profit idea, so they must have a reason. I do know that Batman's graphics won't be enhanced by AA without Nvidia's code. I do know that Rocksteady would have to do a lot of work to natively support AA; as a result, it does not. If it did, Eidos could tell Nvidia to go away and enable AA for everyone.

You have to understand: for those of us who don't wish to return to the time when coding for each card was a unique experience with varying results, standards and definitions of capabilities mean a lot. We've all spent too much time and effort to let the industry regress like this.

Edit for edit:

If having MSAA is such an easy task and you claim that you know how, why don't you write it out and send it to Eidos so all ATI users can enjoy AA without the need for "hacking"? Either you ...

It's not easy. It has, on the other hand, been shown in other engines and papers, if it is the method we all think it is.
 
Last edited:

akugami

Diamond Member
Feb 14, 2005
5,656
1,850
136
I am a capitalist, not a communist. In no way do I think it is anyone's obligation to spend their resources for the benefit of their competitor. If they want to do so that is their business, but in the real world that is best described most of the time as bad business. This isn't like we are talking about saving the lives of sick children here.

I agree with what you're saying in that no one is obliged to spend resources to benefit their competitor. From a business standpoint, it makes more than perfect sense. I've said so in the past and will continue to say so in the future. I actually understand why nVidia did what it did from a business standpoint.

It's just that from a consumer standpoint it sucks to heck and I can see that if this were allowed to continue it is the consumers who stand to lose out. That's why I'm opposed to it. Not out of some misguided support for a faceless corporation but because I'm looking out for my own interest.

And taken another way, can we assume that you think it's ok for ATI to ask developers to lock out nVidia games from being able to run DX11? After all, ATI is helping developers implement DX11 in about 20 games with the list likely growing. Why should ATI spend resources to benefit nVidia?

As a consumer, you question why your ATI card can't run AA in Batman. As a consumer, you know Nvidia is the one who spent their money on the development, yet you object to the use of a vendor check. As a consumer, you want everything to work on your purchase rather than buying the right one.

If money is not a problem, you can simply go buy an Nvidia video card. If money is a problem, then I don't see what is wrong with the vendor check, as Nvidia is the one who spent money on it. Nvidia video card owners spent money on it.

I know why my ATI card can't run certain AA modes in Batman. nVidia paid the developer and got some strings pulled to exclude ATI cards. There is nothing from a technical standpoint preventing ATI from running those AA modes. In fact, the "hack" is nothing more than something that tricks the game into allowing you to run the same exact code that allows for the extra AA modes on an ATI card and not disable those modes if it knows you have an ATI card. nVidia may not have provided cash directly to the developer but nVidia most certainly paid for it to happen.
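The gate being described here keys off the graphics adapter's identity: under D3D9 a game can read the GPU's PCI vendor ID (0x10DE for NVIDIA, 0x1002 for ATI/AMD) via GetAdapterIdentifier. A toy Python model of that kind of check, and of why spoofing the ID "fixes" it; the function names are hypothetical, not the game's actual code:

```python
# Toy model of a vendor-ID gate of the kind described above (all names
# hypothetical). D3D9's GetAdapterIdentifier reports the GPU's PCI vendor
# ID: 0x10DE for NVIDIA, 0x1002 for ATI/AMD.

NVIDIA, ATI = 0x10DE, 0x1002

def aa_menu_enabled(vendor_id):
    """The contested check: in-game AA offered only on one vendor's cards."""
    return vendor_id == NVIDIA

def spoofed(vendor_id):
    """What the community 'hack' amounts to: report NVIDIA's ID so the
    identical rendering code path is allowed to run."""
    return NVIDIA if vendor_id == ATI else vendor_id

print(aa_menu_enabled(ATI))           # False: AA option greyed out
print(aa_menu_enabled(spoofed(ATI)))  # True: the same code now runs
```

This illustrates the consumer complaint: nothing in the rendering path changes, only the answer the ID check receives.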

You never answered the question of whether you feel it is ok for ATI to stipulate that the games they are supporting get vendor locks to disable DX11 in nVidia cards. It is an open question to everyone supporting the vendor lock in. A lot of nVidia supporters seem to ignore and dodge that little question and instead shunt the argument into ATI's lack of developer support. An argument that has been proven false over and over and over again since ATI has supported many developers.

The point I was trying to make is that I don't want to have to buy Brand X card to enjoy one game and Brand Y card to enjoy another. The reason for "standards" such as OpenGL and DirectX is so that we get a reasonable feature set supported on any card, without having to worry whether a certain feature will or won't work on a different card. The fact that I personally can afford to buy both an ATI and an nVidia card doesn't mean I want to, nor do I feel I need to. I just want to get one card and that's that. If the nVidia card is the best value for my money, then I will buy it. Just like I bought an 8800GTS before my current video card, which is a Radeon 4870.

As a consumer I don't want to go down this path. If you're ok with it then that's fine since you are entitled to your opinions but I and plenty of others, even those who have primarily bought nVidia cards (my first video card ever was a Geforce 256) do not want to go down this path. If I wanted to get locked into specific hardware and software combinations then I sure as hell would never have made the move to a PC and just shoulda stuck with a Mac. I've used computers for nearly two decades and my very first computer was a Macintosh Performa (good ol' pizza box).

I'd rather not have the extra AA modes than allow a precedent to be set where we go down the path of having to buy specific hardware if we want specific features enabled in certain games. I'd rather have a few less features if it means that my games and other programs work just as well (quality of hardware aside) on Brand X hardware as on Brand Y hardware. Most of the people here would abhor the idea of a closed system dictated by an overlord, such as how Macs are set up, but supporting nVidia in their decision to lock game features to their own closed hardware is leading down that path IMHO.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
I don't think there'd be as much screaming if Eidos required ONLY nvidia hardware on the box. Unfortunately, ATI hardware is there not only as minimum but also recommended.

That's the real problem. Unless you're fortunate enough to own hardware from the company insisting on the vendor locks you'll have to have both an ATI and NVidia computer to properly experience new games. And the customer has to research who bought off the dev and publisher before buying any title to avoid getting something rendered partly functional by the other team. Nobody with a clue wants that, not even the vendors.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
I agree with what you're saying in that no one is obliged to spend resources to benefit their competitor. From a business standpoint, it makes more than perfect sense. I've said so in the past and will continue to say so in the future. I actually understand why nVidia did what it did from a business standpoint.

It's just that from a consumer standpoint it sucks to heck and I can see that if this were allowed to continue it is the consumers who stand to lose out. That's why I'm opposed to it. Not out of some misguided support for a faceless corporation but because I'm looking out for my own interest.

And taken another way, can we assume that you think it's ok for ATI to ask developers to lock out nVidia games from being able to run DX11? After all, ATI is helping developers implement DX11 in about 20 games with the list likely growing. Why should ATI spend resources to benefit nVidia?
If ATI owns a piece of code and doesn't wish other vendors to execute it, yes. Should ATI lock Dx11 out from Nvidia cards just because they helped in development? They cannot, unless developers and publishers agree with them. Will a publisher publish a game that only works for a portion of the market? I don't think so.

I know why my ATI card can't run certain AA modes in Batman. nVidia paid the developer and got some strings pulled to exclude ATI cards. There is nothing from a technical standpoint preventing ATI from running those AA modes. In fact, the "hack" is nothing more than something that tricks the game into allowing you to run the same exact code that allows for the extra AA modes on an ATI card and not disable those modes if it knows you have an ATI card. nVidia may not have provided cash directly to the developer but nVidia most certainly paid for it to happen.

You never answered the question of whether you feel it is ok for ATI to stipulate that the games they are supporting get vendor locks to disable DX11 in nVidia cards. It is an open question to everyone supporting the vendor lock in. A lot of nVidia supporters seem to ignore and dodge that little question and instead shunt the argument into ATI's lack of developer support. An argument that has been proven false over and over and over again since ATI has supported many developers.

The point I was trying to make is that I don't want to have to buy Brand X card to enjoy one game and Brand Y card to enjoy another. The reason for "standards" such as OpenGL and DirectX is so that we get a reasonable feature set supported on any card, without having to worry whether a certain feature will or won't work on a different card. The fact that I personally can afford to buy both an ATI and an nVidia card doesn't mean I want to, nor do I feel I need to. I just want to get one card and that's that. If the nVidia card is the best value for my money, then I will buy it. Just like I bought an 8800GTS before my current video card, which is a Radeon 4870.

As a consumer I don't want to go down this path. If you're ok with it then that's fine since you are entitled to your opinions but I and plenty of others, even those who have primarily bought nVidia cards (my first video card ever was a Geforce 256) do not want to go down this path. If I wanted to get locked into specific hardware and software combinations then I sure as hell would never have made the move to a PC and just shoulda stuck with a Mac. I've used computers for nearly two decades and my very first computer was a Macintosh Performa (good ol' pizza box).

I'd rather not have the extra AA modes than allow a precedent to be set where we go down the path of having to buy specific hardware if we want specific features enabled in certain games. I'd rather have a few less features if it means that my games and other programs work just as well (quality of hardware aside) on Brand X hardware as on Brand Y hardware. Most of the people here would abhor the idea of a closed system dictated by an overlord, such as how Macs are set up, but supporting nVidia in their decision to lock game features to their own closed hardware is leading down that path IMHO.
Again, I do support ownership. I think a product ID check online is smart, as it protects programmers. The game, Batman: Arkham Asylum, can be played regardless of vendor. It does not, however, support AA. Nvidia wrote code to make it possible; ATI didn't. If Nvidia had never written it, then no one would see Batman with MSAA. This has nothing to do with standards, but with a piece of code that does something, was created by Nvidia, and thus should only be used the way Nvidia sees fit. If ATI writes something in Dx11, it can put whatever it sees fit in it. Nvidia's help with Batman: Arkham Asylum is more than just AA. It gave hardware to Rocksteady to develop the game, and that part of the game can be used freely by ATI users.

The misunderstanding people have is that Nvidia is blocking ATI from using AA, but that is not true. Nvidia has written a piece of code that allows AA to be utilized in Batman: Arkham Asylum. ATI could have someone in their lab do it too, but they didn't. If ATI decided to rewrite Batman: Arkham Asylum in Dx11 for free and Nvidia didn't do the same thing, then we would have a Batman: Arkham Asylum that can only run Dx11 on ATI's hardware.

Have I fully answered the question on my behalf?
 
Last edited:

Schmide

Diamond Member
Mar 7, 2002
5,587
719
126
Since BenSkywalker, whose knowledge I do respect, has exited this argument at the moment of reasoning, I will explain for those wondering. Please correct me where I'm wrong; I have never implemented this method.

The MSAA texture method works as follows.

Create 2 textures, one being an MSAA render target. Render all the objects that need MSAA into this texture (maybe all of them). Copy this texture into the other texture, then use that texture for whatever you need. The reason for the copy is twofold: the multisamples stored in the MSAA render target need to be resolved (basically, combine all the multisample values into a single pixel per pixel). Some may think that rather than copying the texture, you could just use it directly as a texture for future renderings. If you do that, every time you fetch a pixel from the texture it will recompute the multisample resolve for that pixel. So you take a little hit copying some surfaces around, but it's really not as much as you may think.
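In D3D9 terms this flow is roughly a CreateRenderTarget call with a multisample type, plus a StretchRect resolve into a plain texture. A vendor-neutral numeric sketch of the "resolve once, then sample freely" idea; the cost counters are illustrative, not real GPU cycle counts:

```python
# Numeric sketch of the two-texture method: resolve the MSAA target into
# a plain texture once, so later texture fetches are cheap single reads
# instead of re-combining all the samples on every fetch.

SAMPLES = 4  # e.g. 4x MSAA

def resolve(msaa_target):
    """One-time resolve: average the samples of every pixel (the job a
    hardware resolve such as D3D9's StretchRect performs)."""
    return [sum(px) / SAMPLES for px in msaa_target]

# msaa_target: one entry per pixel, each holding SAMPLES color samples.
msaa_target = [[0.0, 0.0, 1.0, 1.0], [1.0, 1.0, 1.0, 1.0]]
resolved = resolve(msaa_target)  # [0.5, 1.0]

# Illustrative cost model: resolve once, then fetch many times...
fetches = 1000
cost_resolved = len(msaa_target) * SAMPLES + fetches
# ...versus re-resolving the pixel's raw samples on every single fetch.
cost_unresolved = fetches * SAMPLES
print(cost_resolved, cost_unresolved)  # the one-time copy wins quickly
```

This is why the copy Schmide describes is a small, fixed price compared with paying the per-sample cost on every subsequent read of the surface.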

It is bad to say this isn't done with DirectX calls, or to imply that it isn't supported, and that is the reason I took this argument so far.
 

akugami

Diamond Member
Feb 14, 2005
5,656
1,850
136
If ATI owns a piece of code and doesn't wish other vendors to execute it, yes. Should ATI lock Dx11 out from Nvidia cards just because they helped in development? They cannot, unless developers and publishers agree with them. Will a publisher publish a game that only works for a portion of the market? I don't think so.


Again, I do support ownership. I think a product ID check online is smart, as it protects programmers. The game, Batman: Arkham Asylum, can be played regardless of vendor. It does not, however, support AA. Nvidia wrote code to make it possible; ATI didn't. If Nvidia had never written it, then no one would see Batman with MSAA. This has nothing to do with standards, but with a piece of code that does something, was created by Nvidia, and thus should only be used the way Nvidia sees fit. If ATI writes something in Dx11, it can put whatever it sees fit in it. Nvidia's help with Batman: Arkham Asylum is more than just AA. It gave hardware to Rocksteady to develop the game, and that part of the game can be used freely by ATI users.

The misunderstanding people have is that Nvidia is blocking ATI from using AA, but that is not true. Nvidia has written a piece of code that allows AA to be utilized in Batman: Arkham Asylum. ATI could have someone in their lab do it too, but they didn't. If ATI decided to rewrite Batman: Arkham Asylum in Dx11 for free and Nvidia didn't do the same thing, then we would have a Batman: Arkham Asylum that can only run Dx11 on ATI's hardware.

Have I fully answered the question on my behalf?

Give developers enough money and they will do anything including locking out nVidia cards from DX11. Look at all the console exclusives. Microsoft paid for exclusives on their Xbox 360 including console RPG's which are very popular in Japan and with their Xbox selling extremely poorly in Japan. Why? Because MS paid them enough money to make it worth their while. Most of the exclusives are due to money or because of overwhelming market share by a console maker such as in the Playstation 2 days. The PC games market is a little different but the same reason for console exclusives exists. Money.

Yes. You have. Just like every other nVidia apologist, and ATI apologist for that matter, you are defending the company of your preference whether it hurts consumers or not. What you refuse to see is that the "code" written by nVidia is not exactly special. Yes, it does take resources (coders & time = money), which were provided by nVidia. Yes, from a business perspective nVidia can and should dictate what is done with that code, but I refuse to believe the code itself was anything special that could not have been done by a company the size of Eidos. This decision by both nVidia and Eidos is harmful to consumers, and if you wish to let a company bend you over, that's your right. I refuse to do so, and if it was ATI pulling this crap, I'd say the same thing.

Having vendor lock-in code does not protect programmers. How the heck did you come up with that conclusion? There is no reason for it in a case like the Batman AA one. Each incident should be evaluated separately on its own merits, so I can't speak for future events. But in no way, shape, or form does it protect programmers. I'd really love to see the logic behind that thinking.

I guess the whole issue doesn't really matter now since nVidia tried this and lost. They likely won't be trying something similar in the future since there's a ton of bad PR over this issue.

And I hope no one should take this as me saying there shouldn't be any PhysX features in the Batman (or any other) game. Those are something that ATI simply can't provide at the moment either because of lack of tools or because of hardware deficiencies. Much like how Adobe CS5 is going to include CUDA enhancements due to a lack of development tools in developing GPGPU enhancements that would utilize a vendor agnostic method such as DirectCompute or OpenCL. I have no problems with something like that.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
I know AA works on ATI cards via hacking. I know the images created by Nvidia will also work on ATI cards. I know both vendors can generate AA from the same input, but where does the input come from?

The original code from Rocksteady doesn't support AA, meaning that even if you can specify AA in game, nothing is enhanced. Eidos could have released it as is, with no in-game AA, but Nvidia came in with code that enables AA. Eidos requested this code from ATI, but they said it would be the same. To this day, ATI still hasn't sent in their version of the code. Nvidia is not stopping ATI from using AA, just not with Nvidia's code.

Think of it this way. Say Nvidia can make Batman more beautiful via its own driver. Would you claim that the Nvidia driver should work on ATI's cards? Right now, Nvidia has a little "driver" within the game that checks the vendor ID. Yes, that "driver" would also work on ATI's cards, but it is really Nvidia's driver, and they have the right to restrict the conditions under which it runs.

The "driver" is not little, as it contains all the images of Batman, the ones that will be enhanced by AA. Nvidia created them by taking the images from Rocksteady's code and modifying them. If Rocksteady had rejected this modification, then Nvidia couldn't use it, and Eidos could do nothing but remove it from the game. However, Rocksteady was okay with it, and requested that ATI modify the source code so that Batman could run with AA for both parties. ATI never submitted their version of it.

Now, Rocksteady could redo all the images so both vendors can use AA, but they did not. This has nothing to do with Nvidia. The question is who is going to reinvent the wheel: Rocksteady or ATI? Nvidia is not stopping either of them from writing it, but it does have ownership of the code that it has written.

Legally speaking, Eidos cannot alter any code supplied by Nvidia. They can, however, "accept" or "reject" publishing that code. If they reject it, then no vendor gets AA, because the code supplied by Rocksteady doesn't support it. If they accept it, then they cannot remove just the vendor check. For ATI cards to run AA legally, ATI must supply their own version of the code, which no one other than Nvidia has.

Again, the actual API calls to enable AA are not blocked by Nvidia, as they can't block them, but calling the API without Nvidia's code has no effect, meaning AA does not enhance the graphics via the original code.

Is this a dirty trick? Well, Nvidia didn't need to write that piece of code; they did it for their own customers. Can Nvidia create something for its own customers? Yes, I know that what they have written does work on ATI, but that isn't the question.

Say you bought a game and your friend wants you to share the CD key. Is it wrong not to share? Are you hurting your friend by not sharing? Is the invention of the CD key bad because its sole purpose is to protect developers and ownership?
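For what it's worth, the vendor-ID gate being argued about here amounts to only a few lines of logic. The PCI vendor IDs below are the real ones (0x10DE for NVIDIA, 0x1002 for ATI/AMD), but the function name and structure are purely illustrative; this is a sketch of the general idea, not the game's actual code:

```python
# Hypothetical sketch of a vendor-ID check like the one discussed in
# this thread. The vendor IDs are the real PCI-SIG assignments; the
# function and its name are illustrative, not taken from the game.

NVIDIA_VENDOR_ID = 0x10DE
ATI_VENDOR_ID = 0x1002

def ingame_msaa_allowed(vendor_id: int) -> bool:
    """Return True if the game's built-in MSAA path is exposed.

    The option passes only on NVIDIA hardware, which is why changing
    the reported DeviceID/VendorID (e.g. with a tweak tool) re-enables
    the in-game AA menu on other cards.
    """
    return vendor_id == NVIDIA_VENDOR_ID

# The in-game AA option appears on NVIDIA cards but not ATI cards:
assert ingame_msaa_allowed(NVIDIA_VENDOR_ID) is True
assert ingame_msaa_allowed(ATI_VENDOR_ID) is False
```

Spoofing the reported vendor ID defeats a check like this entirely, which is exactly the "hack" described earlier in the thread.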

Yep, from now on games will need to have 2 copies of exactly the same code, one with an AMD vendor lock and one with an NVIDIA vendor lock...

Whatever floats your boat...
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Give developers enough money and they will do anything, including locking nVidia cards out of DX11. Look at all the console exclusives. Microsoft paid for exclusives on their Xbox 360, including console RPGs, which are very popular in Japan, even though their Xbox sold extremely poorly there. Why? Because MS paid them enough money to make it worth their while. Most of the exclusives are due to money, or to the overwhelming market share of a console maker, such as in the PlayStation 2 days. The PC games market is a little different, but the same reason for console exclusives exists. Money.

Yes. You have. Just like every other nVidia apologist, and ATI apologist for that matter, you are defending the company of your preference whether it hurts consumers or not. What you refuse to see is that the "code" written by nVidia is not exactly special. Yes, it does take resources (coders & time = money), which were provided by nVidia. Yes, from a business perspective nVidia can and should dictate what is done with that code, but I refuse to believe the code itself was anything special that could not have been done by a company the size of Eidos. This decision by both nVidia and Eidos is harmful to consumers, and if you wish to let a company bend you over, that's your right. I refuse to do so, and if it was ATI pulling this crap, I'd say the same thing.

Having vendor lock-in code does not protect programmers. How the heck did you come up with that conclusion? There is no reason for it in a case like the Batman AA one. Each incident should be evaluated separately on its own merits, so I can't speak for future events. But in no way, shape, or form does it protect programmers. I'd really love to see the logic behind that thinking.

I guess the whole issue doesn't really matter now since nVidia tried this and lost. They likely won't be trying something similar in the future since there's a ton of bad PR over this issue.

And I hope no one should take this as me saying there shouldn't be any PhysX features in the Batman (or any other) game. Those are something that ATI simply can't provide at the moment either because of lack of tools or because of hardware deficiencies. Much like how Adobe CS5 is going to include CUDA enhancements due to a lack of development tools in developing GPGPU enhancements that would utilize a vendor agnostic method such as DirectCompute or OpenCL. I have no problems with something like that.
The code is not special, but Nvidia does own that piece of non-special code, and ATI never submitted theirs. So your conclusion is that this piece of non-special code should basically be free for anyone?

The vendor lock is nothing but protection for Nvidia's ownership of its non-special code. In other words, the vendor who owns the code is protected. If every single program should be free for everyone, do you really think programmers and companies will work their butts off? If the vendor lock must be removed, then ATI will never write something that helps gamers, and Nvidia will not waste the resources either. Who is the loser here? Not the vendors, but us, the gamers.

I am a gamer, and I love new tech: new methods to reduce lag and to enhance graphics and gameplay. That new stuff is never free. Dx11 is meant to make programming easier and to utilize hardware better, but it isn't free. You need to own either Vista or 7, along with a video card that supports Dx11. None of these are free. Should Dx11 be usable by Mac users?

You know that Nvidia's cards are priced higher than ATI's. Many choose to buy ATI simply because it is cheaper. Nvidia uses that extra money not only on its own development, but also on game developers. Batman is one of the games that got Nvidia's support before it had a publisher. Is this a bad thing?

Have you ever donated money to freeware? What have you actually done to help programmers make better code?
 
Last edited:

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
It's just that from a consumer standpoint it sucks to heck and I can see that if this were allowed to continue it is the consumers who stand to lose out. That's why I'm opposed to it. Not out of some misguided support for a faceless corporation but because I'm looking out for my own interest.

I haven't felt left out on any of it for a long time now, around the time I ditched my 1800xt. This is one area everyone in this forum knows is a factor before they buy their parts. It is one of the reasons, one of the main reasons, people are willing to pay a bit more for nVidia. This is what nV is going for, and if you check out the gaming forums, people overwhelmingly are not upset with nV or Eidos over this situation. This is just part of the gaming market now.

And taken another way, can we assume that you think it's ok for ATI to ask developers to lock out nVidia games from being able to run DX11? After all, ATI is helping developers implement DX11 in about 20 games with the list likely growing. Why should ATI spend resources to benefit nVidia?

Any code written by ATi, they absolutely have the right to lock. That said, I haven't heard of them authoring any code for any of the upcoming DX11 titles at all, just offering support and guidance. Either way, any code ATi writes themselves should absolutely fall under their guidelines for usage. In a real-world sense, that means locking out about 70% of PC gamers at the moment, so it will likely take a much larger incentive for them to get it done than just donating the code at this point.

I don't think there'd be as much screaming if Eidos required ONLY nvidia hardware on the box. Unfortunately, ATI hardware is there not only as minimum but also recommended.

Not only does ATi run the game extremely well, it runs AA in Batman too; use the CCC. The real issue is the level of performance it offers. nV's workaround is far more sophisticated, and if people would drop the ranting loyalist bit, they might happen to notice that ATi's AA in Batman looks better than nV's.

That's the real problem. Unless you're fortunate enough to own hardware from the company insisting on the vendor locks you'll have to have both an ATI and NVidia computer to properly experience new games. And the customer has to research who bought off the dev and publisher before buying any title to avoid getting something rendered partly functional by the other team. Nobody with a clue wants that, not even the vendors.

How's Bioshock's in-game AA using DX9? It doesn't work. Are people now going to start blaming ATi because nV didn't pay to do that for them too? It would be one thing if we were talking about a feature, any feature, removed from a game. We are not talking about that; we are talking about bonus features for owners of nV hardware. Why is that a problem? ATi is absolutely free to do the exact same thing, and neither one of them is hurting any consumer by doing it, unless they are rather extreme loyalists who feel ripped off that someone got something extra for paying for something different.

I guess the best way to put this would be: imagine nV fans freaking out if any game came with Eyefinity support. I honestly hope that it is widely supported, and I'm sure as hell not going to throw a hissy fit if it is. ATi owners bought their cards knowing that was a probability, and it was paid for in the purchase price. nV owners buy their cards knowing that extra in-game attention is expected and paid for in the purchase price.

Create 2 textures, 1 being an MSAA render target. Render all the objects that need MSAA into this texture (maybe all of them). Copy this texture into the other texture, then use that texture for whatever you need. The reason for the copy is twofold: the multisamples stored in the MSAA render target need to be resolved (basically, combine all the multisamples into a single pixel per pixel). Some may think that, rather than copying the texture, you could just use it directly as a texture for future renderings. If you do that, every time you fetch a pixel from the texture it will compute the multisample pixel each time. So you take a little hit copying some surfaces around, but it's really not as much as you may think.

From all the information I have seen that is what I am fairly certain they are doing, although I have not looked at the source code.
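The resolve step described above can be modelled as a toy simulation: each pixel of the multisampled target holds several sub-samples, and the resolve collapses them into one value per pixel by averaging. This is a plain-Python sketch of the general technique, not the engine's or anyone's actual code:

```python
# Toy model of an MSAA resolve: each entry of the "render target" is a
# list of N sub-sample values for one pixel, and resolving averages
# them into a single value per pixel. Real resolves operate on RGBA
# colour samples on the GPU; scalars are used here for clarity.

def resolve_msaa(msaa_target):
    """Collapse a list of per-pixel sample lists into one value per pixel."""
    return [sum(samples) / len(samples) for samples in msaa_target]

# A 3-pixel scanline with 4x MSAA: the edge pixel (partial coverage)
# averages to an intermediate value, which is where the smoothed
# edge of anti-aliasing comes from.
scanline = [
    [1.0, 1.0, 1.0, 1.0],  # fully covered pixel
    [1.0, 1.0, 0.0, 0.0],  # edge pixel, half covered
    [0.0, 0.0, 0.0, 0.0],  # uncovered pixel
]
resolved = resolve_msaa(scanline)
# resolved == [1.0, 0.5, 0.0]
```

This also illustrates why sampling the unresolved target directly is wasteful, as the post notes: every fetch would have to redo this averaging, whereas a one-time resolve copy pays the cost once.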

It is bad to say this isn't done with DirectX calls, or to infer that it isn't supported, and that is the reason I took this argument so far.

It isn't a supported approach to AA, it is a very creative workaround to an issue with a game engine.

but I refuse to believe it the code itself was anything special that could not have been done by a company the size of Eidos.

What you are ignoring is how tiny the PC market is for them. The PC version is on pace to sell 10% of what the consoles sold (that would still make it a profitable port, but only because it is a port). It's possible the title will hit 1 million units, but if the console version stopped selling today (versus the big sales spike it will get in a couple weeks) they would hit ~20% of what the console version sold. The reality of the market right now is that PC gaming is not going to see any perks unless nV/ATi push it. That is the way things are, and it isn't because Eidos or anyone else can't afford it; it is because it is stupid business to spend more resources on something that is going to generate significantly less revenue. Oh, and another thing: if you think Eidos couldn't come up with it, why the hell didn't Sweeney do it in the first place? ;)

I guess the whole issue doesn't really matter now since nVidia tried this and lost. They likely won't be trying something similar in the future since there's a ton of bad PR over this issue.

Overall they got way more positive rep than you seem to realize. Step away from the tech forums and into the general gamer forums. If you do, you'll get a glimpse as to why nVidia held such a dominant market position over the 4xxx series despite what the overwhelming majority of tech enthusiasts thought.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Not only does ATi run the game extremely well, it runs AA in Batman too; use the CCC. The real issue is the level of performance it offers. nV's workaround is far more sophisticated, and if people would drop the ranting loyalist bit, they might happen to notice that ATi's AA in Batman looks better than nV's.

That's true and I can confirm it. For some reason, when I enabled PhysX on my PPU using the GenL patch, I couldn't enable anti-aliasing in the CCC afterwards no matter what. So I had to use ATT to change the DeviceID/VendorID to use the in-game anti-aliasing, and it looks considerably worse: it has a lot of jaggies everywhere, as if anti-aliasing only worked on some edges and not on others.

I don't understand the fanboyism of some guys here who approve of such an attitude of locking in-game anti-aliasing to only one vendor. I can understand that attitude with PhysX, but anti-aliasing? It doesn't matter who codes it; it's a standard feature of DirectX. While it's true that the Unreal 3 engine natively doesn't support anti-aliasing under DX9, favoring one vendor heavily and supporting all their proprietary technologies will simply cut them off from potential buyers who own ATi cards. Believe me, BAA is far less immersive when it's played without PhysX and AA; it looks like a console port.

Now that ATi has the upper hand with DX11, using the fanboyism logic of someone here, let's just block nVidia from doing any anisotropic filtering in DX11 games and add unnecessary junk data like overfiltering so they perform slower. As stupid as it can get.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
That's true and I can confirm it. For some reason, when I enabled PhysX on my PPU using the GenL patch, I couldn't enable anti-aliasing in the CCC afterwards no matter what. So I had to use ATT to change the DeviceID/VendorID to use the in-game anti-aliasing, and it looks considerably worse: it has a lot of jaggies everywhere, as if anti-aliasing only worked on some edges and not on others.

I don't understand the fanboyism of some guys here who approve of such an attitude of locking in-game anti-aliasing to only one vendor. I can understand that attitude with PhysX, but anti-aliasing? It doesn't matter who codes it; it's a standard feature of DirectX. While it's true that the Unreal 3 engine natively doesn't support anti-aliasing under DX9, favoring one vendor heavily and supporting all their proprietary technologies will simply cut them off from potential buyers who own ATi cards. Believe me, BAA is far less immersive when it's played without PhysX and AA; it looks like a console port.

Now that ATi has the upper hand with DX11, using the fanboyism logic of someone here, let's just block nVidia from doing any anisotropic filtering in DX11 games and add unnecessary junk data like overfiltering so they perform slower. As stupid as it can get.
Your comment is the reason why there should be a vendor check. You said the in-game AA looks worse compared to the AA from the CCC. Now who is responsible for that? The ones who developed the game? The ones who made AA possible? Or ATI?

The dilemma is this. If Nvidia allows ATI to run its code, then fanbois will say Nvidia deliberately hurt ATI's performance. If Nvidia disallows ATI from running its code, then fanbois will say Nvidia is "blocking" an open standard. So clearly, it is a lose-lose situation for Nvidia, and a score for ATI.

Other than complaining, did ATI even try to make AA possible for their customers? No, they did nothing, and have no plans to. So they don't offer their customers what Nvidia offers theirs. Yet humans are smart: if they don't have what they want, they will go get it. Unfortunately, some like to steal, and so they found a way to steal what belongs to Nvidia's customers, just like you did. What is evil isn't just the stealing part, but the claim that they shouldn't need to steal because the owner should have given it out for free, for everyone. They claim the owner is evil for not sharing, after they have stolen from him, and then complain that the stolen goods only half work for them. Marvellous.

A thief breaks into a person's house, beats up the owner for having a locked door, takes anything he sees fit, then beats up the owner again because the goods don't fit the thief perfectly. After that, the thief goes around various forums teaching others how to break that owner's door. Others who learn this break in again, claiming that it is evil to have a locked door and that it really should be removed so others don't need to break it.

Is this what you call logic? In reality, there are police and laws that enforce safety, yet people still lock their doors. Those who violate laws face consequences, but often only after the fact. In the digital world, things are much worse. Stealing is as simple as the click of a button. There is no one to protect you but yourself. Thieves are everywhere. If one person finds a way in, thousands will be in within 5 minutes. They feel no shame and no pity. The good thing is they rarely kill people, but the bad thing is they will do anything else.

Logic? Yeah, I take what I can and do what I want. I like Nvidia's code, so I'll take it, bitch about it, and tell people how to do it. Pay? BS, it's free. Am I logical enough?
 

Forumpanda

Member
Apr 8, 2009
181
0
0
The dilemma is this. If Nvidia allows ATI to run its code, then fanbois will say Nvidia deliberately hurt ATI's performance. If Nvidia disallows ATI from running its code, then fanbois will say Nvidia is "blocking" an open standard. So clearly, it is a lose-lose situation for Nvidia, and a score for ATI.
And what if nVidia makes ATI run their code AND blocks them?

I gave up on the rest of your post, it was just way out there.

Is there really anyone who still thinks it is as simple as ATI shipping them a bit of code?
Do people really think ATI would bother sending them multiple emails over weeks of time if the problem could simply be solved by sending a piece of standard code that they already have available as an example elsewhere?

Obviously the problem is more complicated than what most posters are making it out to be.