Is there more to it than TWIMTBP? (Personal commentary)


Seero

Golden Member
Nov 4, 2009
1,456
0
0
So what's the reason to keep game developers around? Just let Nvidia's and ATI's dev rel teams turn into gaming studios. If people think it's perfectly fine to lock out code the dev rel teams put into the games, isn't this the path we'll go down?

You won't buy a gaming PC anymore; you'll buy an Intel, Nvidia, or ATI PC console.
Why have students when there are teachers?
 
Last edited:

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
Pick another example: Dragon Age. It released the console version at the same time as the PC version, and reviewers said the PC version is actually better than the console version. With or without 3D.

Dragon Age can run in 2D mode? :hmm:

Yeah, running DX11 benchmarks is the one thing people with DX11 cards do most, because there is nothing better to do with them. Bravo. I wonder which is more fun, Pac-Man or that.
There are AFAIK 3 DX11 games out right now, not just benchmarks. That is much better than what we got with DX10.

It should be clear why developers need help: not so much to complete what they started out with, but to use new features they wouldn't use without the help, and TWIMTBP does just that. Keep in mind that ATI also has such a team, just a lot smaller. Without them, we may as well go buy consoles.
Something like PhysX I can understand devs needing help with, but did Rocksteady, for example, really need help implementing AA in Batman? That's really hard to believe (but I could be wrong). Batman was not the first UE3 game with AA, was it?
 
Last edited:

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Something like PhysX I can understand devs needing help with, but did Rocksteady, for example, really need help implementing AA in Batman? That's really hard to believe (but I could be wrong). Batman was not the first UE3 game with AA, was it?

We are not just beating a dead horse with Batman but digging up a dead horse and beating its skeleton.

AA is not standard in the UE3 engine. You have to code it separately. NVIDIA assisted in getting the code written and tested on their cards. AMD did not.

It's just that simple.
 

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
We are not just beating a dead horse with Batman but digging up a dead horse and beating its skeleton.

AA is not standard in the UE3 engine. You have to code it separately. NVIDIA assisted in getting the code written and tested on their cards. AMD did not.

It's just that simple.

You're beating that dead horse because you think I was insulting nV when I was not. I was implying that Rocksteady should have included it to begin with. I think AA should be a feature available in all games, but that's just my opinion.

To address your point, kudos to nV for helping Rocksteady, but ATI did the same thing (added MSAA) for Stalker Clear Sky with GSC Game World, and ATI did not lock it to just their hardware. You should stop bringing up what you wrote to try to say nV is the only company doing any work, because they're not.
 
Last edited:

Seero

Golden Member
Nov 4, 2009
1,456
0
0
You're beating that dead horse because you think I was insulting nV when I was not. I was implying that Rocksteady should have included it to begin with. I think AA should be a feature available in all games, but that's just my opinion.

To address your point, kudos to nV for helping Rocksteady, but ATI did the same thing (added MSAA) for Stalker Clear Sky with GSC Game World, and ATI did not lock it to just their hardware. You should stop bringing up what you wrote to try to say nV is the only company doing any work, because they're not.
I think you are confused. GSC added MSAA to the game in a patch. However, it cost too much FPS and required DX10.1. ATI released a hotfix for its driver to ease the hit. Not that ATI is bad, but they did not aid Nvidia on this in any way, as ATI's driver doesn't work on Nvidia's hardware.

To clarify the problem: DX10 was too new and didn't do what it was supposed to. There are two parts to DX10, the hardware and the software. Upgrading the software is easy (via Windows Update); upgrading the hardware is the problem, as existing DX10 cards may not support the upgrade. This created mass confusion for users, as they don't really know what supports what. Some DX10 cards support DX10.1; some DX10 cards don't have a problem to begin with.

There is a similarity between Stalker Clear Sky and Batman: AA: they both use something called deferred shading. This prevents AA in general from working (edges still look jaggy after AA). There are two solutions. One requires another rendering pass, but there are no functions in DX9 that handle this second pass; that is why UE3 can't support MSAA on DX9. DX10.1 has a function that handles this but, as I mentioned above, it takes a serious performance hit. The second solution, which both vendors have retrofitted, is FSAA through their control panel, which more or less does the same thing with no extra performance hit, so everything is good.
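To make that "second pass" concrete: in DX10 terms it is an explicit resolve from a multisampled render target into a plain texture that later passes can read. A minimal sketch, assuming D3D10 and 4x MSAA (the names and setup are my own illustration, not anyone's shipping code):

// Minimal D3D10 sketch of the extra MSAA step a deferred renderer needs.
// Illustrative only; not Rocksteady's, NVIDIA's, or GSC's actual code.
#include <d3d10.h>

ID3D10Texture2D* CreateMsaaTarget(ID3D10Device* device, UINT width, UINT height)
{
    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 4;                      // 4x MSAA
    desc.Usage            = D3D10_USAGE_DEFAULT;
    desc.BindFlags        = D3D10_BIND_RENDER_TARGET;

    ID3D10Texture2D* tex = nullptr;
    device->CreateTexture2D(&desc, nullptr, &tex);  // check the HRESULT in real code
    return tex;
}

// After rendering into the MSAA target, the samples must be collapsed
// ("resolved") into a single-sample texture before shaders can read it.
void ResolveMsaa(ID3D10Device* device, ID3D10Texture2D* msaaSrc, ID3D10Texture2D* plainDst)
{
    device->ResolveSubresource(plainDst, 0, msaaSrc, 0, DXGI_FORMAT_R8G8B8A8_UNORM);
}

Roughly speaking, a forward renderer gets that resolve done for it when the frame is presented; a deferred renderer has to schedule it by hand for every buffer it wants to read back, which is where the extra work and the performance hit come from.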

Now you can do some research on anti-aliasing to find out more about the technical side of it. There are many different ways to implement AA, but FSAA and MSAA are the norm. However, the actual code that does MSAA may differ, and thus performance and stability can vary across different hardware setups.

Now, ATI's attitude towards Nvidia is no different from ATI fanboys' attitude towards Nvidia fanboys: everything the other side does or says is biased and bad. Wreckage is a good example, as every single word s/he says is seen as Nvidia PR speech. On the removal of DX10.1 from Assassin's Creed, ATI claimed it was done because Nvidia paid Ubisoft to do it, since the issue supposedly barely appeared on Nvidia's hardware. Lots of ATI fanboys confirmed it was fine on Nvidia. However, the fact is some Nvidia hardware crashed, and other cards had light bleeding through walls and memory leaks. If everything Wreckage says or does is biased and wrong, then why do people think ATI's PR speech is legit?

So how reliable is the testing done by people who change the vendor ID? Will they be biased? I don't know. What I do know is that Nvidia spent the time implementing MSAA in games developed on UE3, and appears to have further optimized it in newer games like Borderlands, where Nvidia beats the crap out of AMD hardware when AA is enabled. It may have something to do with driver optimization, and Nvidia gained experience introducing MSAA to Batman: AA that ATI didn't have a chance to get. A 285 beating a 5870 by more than 30% is just not right. If ATI fails to fix the problem when the game releases, there will be another round of red/green war about Nvidia's sabotage.

Edit: Suppose Nvidia really is that bad and is trying to block ATI hardware from using something; do you really think the hack would be as easy as changing the vendor ID? PhysX is something Nvidia doesn't want ATI users to use, so they disable it through their driver. It won't work even if you have an Nvidia card in the system. Wouldn't that method be more effective?
 
Last edited:

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
We are not just beating a dead horse with Batman but digging up a dead horse and beating its skeleton.

AA is not standard in the UE3 engine. You have to code it separately. NVIDIA assisted in getting the code written and tested on their cards. AMD did not.

It's just that simple.

It's not just that simple. AA may not be standard in UE3; however, it is standard in DirectX, and the implementation NVidia apparently provided is a standard DirectX implementation using publicly available API calls, not anything specific or proprietary to NVidia. This and this alone shows either a willful attempt at a vendor lockout, or the ineptness of the game studio in not being able to write a single, well-documented DirectX API procedure. Personally, since the studio has been around a while, I'd like to give them the benefit of the doubt here. Money and greed are powerful motivators, as I'm sure you're well aware.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
It's not just that simple. AA may not be standard in UE3; however, it is standard in DirectX,
Uh no. It can be written using DirectX, but it's not standard in DirectX.
and the implementation NVidia apparently provided is a standard DirectX implementation using publicly available API calls, not anything specific or proprietary to NVidia.
Again, it was written using DirectX. It was also only tested on NVIDIA hardware.

This and this alone shows either a willful attempt at a vendor lockout, or the ineptness of the game studio in not being able to write a single, well-documented DirectX API procedure.
Have you ever written DirectX code? It's not as simple as you want it to be. Also, it's not the game developer's fault; it's Epic who wrote the game engine. It takes a lot of work to add features like this to a game, and a lot of testing to make sure it works correctly. That's probably why AMD chose not to do the work.

Personally, since the studio has been around a while, I'd like to give them the benefit of the doubt here. Money and greed are powerful motivators, as I'm sure you're well aware.
No money changed hands, unless you have proof otherwise.

Again, AA is not standard in the Unreal game engine. It's also not "standard" in DirectX (there are standard ways of writing the code using DirectX). In fact, I believe more than one game released on the Unreal engine doesn't use AA. So NVIDIA wanted AA on their cards and wrote the code. Now AMD and their fans want a free ride for doing nothing. Good luck with that.
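To be clear about "standard ways": the DX SDK gives you calls to query and create multisampled surfaces, but nothing in it turns AA on for an engine by itself. A minimal sketch of the query side, assuming D3D10 (my own illustration, not SDK sample code):

// Standard D3D10 building block: ask the device whether it supports
// 4x MSAA for a given format. This is a query, not a switch that enables AA.
#include <d3d10.h>

bool Supports4xMsaa(ID3D10Device* device)
{
    UINT qualityLevels = 0;
    HRESULT hr = device->CheckMultisampleQualityLevels(
        DXGI_FORMAT_R8G8B8A8_UNORM, 4, &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}

Everything past that query (creating the targets, restructuring the render passes, resolving the buffers) is work the engine still has to do itself.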
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
It's not just that simple. AA may not be standard in UE3; however, it is standard in DirectX, and the implementation NVidia apparently provided is a standard DirectX implementation using publicly available API calls, not anything specific or proprietary to NVidia. This and this alone shows either a willful attempt at a vendor lockout, or the ineptness of the game studio in not being able to write a single, well-documented DirectX API procedure. Personally, since the studio has been around a while, I'd like to give them the benefit of the doubt here. Money and greed are powerful motivators, as I'm sure you're well aware.
There is no lockout. ATI can implement it whenever they like. However, what if their implementation is not as good as the one implemented by Nvidia? Hey, Nvidia did sit in through development and probably knows several "corner-cutting" tricks when it comes to implementing MSAA specifically for Batman: AA. How will ATI fanboys feel if the performance of ATI's implementation is worse than what they get by using Nvidia's implementation through changing the vendor ID? What will the arguments be then?
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Uh no. It can be written using DirectX, but it's not standard in DirectX.

:D I had to quote that just for the hilarity of the statement itself. Thanks, I needed a good laugh.

Have you ever written DirectX code? It's not as simple as you want it to be. Also, it's not the game developer's fault; it's Epic who wrote the game engine. It takes a lot of work to add features like this to a game, and a lot of testing to make sure it works correctly. That's probably why AMD chose not to do the work.

Sounds like you've written DirectX code, and more particularly UE code. Am I right? And to answer the question you posed: why yes, yes I have. And still do, no less.

It's also not "standard" in DirectX (there are standard ways of writing the code using DirectX).

You're not a developer, are you.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
:D I had to quote that just for the hilarity of the statement itself. Thanks, I needed a good laugh.



Sounds like you've written DirectX code, and more particularly UE code. Am I right? And to answer the question you posed: why yes, yes I have. And still do, no less.



You're not a developer, are you.

Well then, since you are a developer who writes games using the Unreal 3 engine, show me the "standard" code included in the DX SDK that enables AA in UE games. Go ahead and just copy and paste it... :rolleyes:

Being that Epic is such a new game developer, they must have missed it :hmm:

I am not a game developer, but I have used the DX SDK.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
So how reliable is the testing done by people who change the vendor ID? Will they be biased? I don't know.
What's to be biased about? They changed the vendor ID and MSAA suddenly became functional. And it wasn't just some random guy who reported it in a forum post that nobody else could duplicate. It is well documented by multiple end users, multiple websites, and by ATi themselves.


What I do know is that Nvidia spent the time implementing MSAA in games developed on UE3, and appears to have further optimized it in newer games like Borderlands, where Nvidia beats the crap out of AMD hardware when AA is enabled.
And what does this have to do with the vendor ID lockout? ATi is helping multiple developers with DX11 coding that Nvidia will be able to immediately take advantage of when Fermi is finally released. And ATi has stated that none of it will be locked to their hardware the way Nvidia is locking B:AA and PhysX.


It may have something to do with driver optimization, and Nvidia gained experience introducing MSAA to Batman: AA that ATI didn't have a chance to get. A 285 beating a 5870 by more than 30% is just not right. If ATI fails to fix the problem when the game releases, there will be another round of red/green war about Nvidia's sabotage.
I highly doubt it. The B:AA MSAA scandal came to the front burner because somebody changed the vendor ID and discovered they could then enable MSAA. The only reason there would be another uproar would be if it could be proven that Nvidia was deliberately messing around again. I sincerely hope the fallout from B:AA was enough to dissuade them from trying something similar anytime in the immediate future.


Edit: Suppose Nvidia really is that bad and is trying to block ATI hardware from using something;
We don't have to suppose; they've already done it (B:AA, PhysX).


do you really think the hack would be as easy as changing the vendor ID? PhysX is something Nvidia doesn't want ATI users to use, so they disable it through their driver. It won't work even if you have an Nvidia card in the system. Wouldn't that method be more effective?
Just how would an Nvidia driver get on an ATi system in the first place?
 

Rezist

Senior member
Jun 20, 2009
726
0
71
Wreckage is right: nVidia wrote the code and helped fund the game. If AMD wants AA, they can pay the money and write the code.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
What's to be biased about? They changed the vendor ID and MSAA suddenly became functional. And it wasn't just some random guy who reported it in a forum post that nobody else could duplicate. It is well documented by multiple end users, multiple websites, and by ATi themselves.
Where is the documentation you speak of? There is a difference between "not crashing" and "working without problems." Do you have a side-by-side comparison between various video cards? Not just the FPS count, but the image produced. Has anyone examined various scenes for edges, bleed, and blur? What about memory usage? Missing effects? How does a normal ATI user know where and what to check?

If you can link ONE source showing the details above, then that source is unbiased. Otherwise, all I see is that the code doesn't cause crashes in some cases.

And what does this have to do with the vendor ID lockout? ATi is helping multiple developers with DX11 coding that Nvidia will be able to immediately take advantage of when Fermi is finally released. And ATi has stated that none of it will be locked to their hardware the way Nvidia is locking B:AA and PhysX.
Again, there is no lockout. There is a piece of code written by Nvidia which overcomes the known problem with deferred shading that caused AA to have no effect. This piece of code is not part of Rocksteady's code and was developed and QAed by Nvidia. The testing likely involved running Batman with it on various Nvidia hardware. Unfortunately, Nvidia did not run any tests on ATI's hardware.

Now, speaking of changing the vendor ID, how do you know which ID will work for your ATI card? Which ID should I use if I have a 5850? Which ID should I use if I have a 4870? Did anyone cross-match the combos and see which one runs best? Who is to blame if your vendor ID change causes a BSoD on my PC with an ATI card?

I highly doubt it. The B:AA MSAA scandal came to the front burner because somebody changed the vendor ID and discovered they could then enable MSAA. The only reason there would be another uproar would be if it could be proven that Nvidia was deliberately messing around again. I sincerely hope the fallout from B:AA was enough to dissuade them from trying something similar anytime in the immediate future.
MSAA can be done in games developed on UE3; Nvidia has proven that by writing code to explicitly support frame buffer resolves. This piece of code belongs to Nvidia; MSAA itself doesn't. ATI, Rocksteady, or Eidos can implement it themselves. There is an ID check within this code. The reason may be that different hardware requires different code paths to maximize performance. For example:
if it is a GTX 280 then
    do this
elsif it is a GT100 then
    do that
...
end if
Now you understand it as:

if it is an Nvidia card then
    do MSAA
...
Now, neither you nor I have access to the code. I don't know what that piece of code does exactly, but it belongs to Nvidia. ATI can easily write similar code that does the same thing, as they claim. The hard part is making it run effectively. See my post above.
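To illustrate what an ID check even looks like (purely my guess at its shape, since as I said neither of us has seen the actual code), in DX9 it is only a few lines:

// Illustrative sketch of a DX9 vendor ID check; NOT the real Batman: AA code.
#include <d3d9.h>

const DWORD VENDOR_ID_NVIDIA = 0x10DE;

bool IsNvidiaAdapter(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;
    // Spoofing the reported VendorId defeats exactly this kind of test.
    return id.VendorId == VENDOR_ID_NVIDIA;
}

Whether the real check is a simple gate like this or the front end of several per-hardware code paths is exactly what we can't tell from the outside.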

BTW, what fallout are you speaking of? Batman: AA sold great. If you aren't biased, then you know that AA from CCC works perfectly.

We don't have to suppose; they've already done it (B:AA, PhysX).

Just how would an Nvidia driver get on an ATi system in the first place?
Let me rephrase it. Why doesn't Nvidia block MSAA in Batman the way they block PhysX? Wouldn't that be more effective than an ID check?
 
Last edited:

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Wreckage is right: nVidia wrote the code and helped fund the game. If AMD wants AA, they can pay the money and write the code.
So I guess all the game developers who are accepting ATi assistance with DX11 effects should lock out those effects from ever working on Nvidia cards?

And that helps us, the end users, and the PC gaming industry as a whole... how?
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
So I guess all the game developers who are accepting ATi assistance with DX11 effects should lock out those effects from ever working on Nvidia cards?

And that helps us, the end users, and the PC gaming industry as a whole... how?
You seriously think they won't if they can?

ATI "Hey Nvidia, we have this tessellation algorithm that is much better than the one we use to have, want it?"

Nvidia "You shouldn't have ATI. Hey guess what, our 3D vision actually work better with this algorithm too, lets work it out!"

ATI "Yeah, hey let us send you those engineers who work on tessellation to assist you with your 3D toy."

Nvidia "No no no, let us both send our top guys to intel to share this great news."

Back to the real world now.
 

Rezist

Senior member
Jun 20, 2009
726
0
71
So I guess all the game developers who are accepting ATi assistance with DX11 effects should lock out those effects from ever working on Nvidia cards?

And that helps us, the end users, and the PC gaming industry as a whole... how?

They may at some point if nVidia keeps this up.
We as consumers may not want it, but if nVidia gets every game vendor-locked, you have to buy their hardware.
 
Last edited:

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Where is the documentation you speak of? There is a difference between "not crashing" and "working without problems." Do you have a side-by-side comparison between various video cards? Not just the FPS count, but the image produced. Has anyone examined various scenes for edges, bleed, and blur? What about memory usage? Missing effects? How does a normal ATI user know where and what to check?

If you can link ONE source showing the details above, then that source is unbiased. Otherwise, all I see is that the code doesn't cause crashes in some cases.
The documentation I speak of is end-user observations. From the various websites I visited, nobody spoke of widespread crashes or graphical corruption. This subject was under a LOT of scrutiny, and from what I read, nobody saw any graphical differences between the Nvidia MSAA and the ATi MSAA. No, I don't recall seeing a scene-by-scene comparison between the two. Did you happen to hear that there was a difference between them?


Again, there is no lockout.
Anytime a game deliberately disables certain features from being used on a card that is capable of displaying said features, it's a lockout.


Now, speaking of changing the vendor ID, how do you know which ID will work for your ATI card? Which ID should I use if I have a 5850? Which ID should I use if I have a 4870? Did anyone cross-match the combos and see which one runs best?
Simply use the ID of a recent Nvidia card, such as a GTX260. That would mean, for vendor ID use 10DE and for the device ID use 5E2. For the device description, use 'NVIDIA GeForce GTX 260'. It shouldn't matter which Nvidia device ID you use, as long as it is newer than an 8800GT.


Who is to blame if your vendor ID change causes a BSoD on my PC with an ATI card?
Since nobody has been reporting BSODs in B:AA while using the vendor ID spoofing, I don't think you need to worry.


MSAA can be done in games developed on UE3; Nvidia has proven that by writing code to explicitly support frame buffer resolves. This piece of code belongs to Nvidia; MSAA itself doesn't. ATI, Rocksteady, or Eidos can implement it themselves. There is an ID check within this code. The reason may be that different hardware requires different code paths to maximize performance. For example:
if it is a GTX 280 then
    do this
elsif it is a GT100 then
    do that
...
end if
Now you understand it as:

if it is an Nvidia card then
    do MSAA
...
Now, neither you nor I have access to the code. I don't know what that piece of code does exactly, but it belongs to Nvidia. ATI can easily write similar code that does the same thing, as they claim. The hard part is making it run effectively. See my post above.
Except that it has already been shown to work effectively on ATi cards. If it increases performance, even if not quite to the same level as an Nvidia card, who cares? Besides, from what I've read, the AA used is a standard implementation. There's nothing in it that is Nvidia-specific.

Let me rephrase it. Why doesn't Nvidia block MSAA in Batman the way they block PhysX? Wouldn't that be more effective than an ID check?
Because PhysX is blocked from operating in conjunction with an ATi card in the PhysX driver itself. There is no Nvidia driver necessary to run Batman:AA on an ATi card, so the lockout had to be implemented using device IDs within the game.
 
Last edited:

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
You seriously think they won't if they can?
No, I don't. ATi haven't up until now, and they've publicly stated numerous times that they refuse to engage in the same tactics that Nvidia seems so fond of. The work ATi is doing on DX11 titles will also benefit Nvidia in the end.

Perhaps Nvidia should implement some self-blocking driver scripts so they don't inadvertently utilize any ATi DX11 work and end up looking like hypocrites once Fermi is released. ;)


Back to the real world now.
If you say so.
 
Last edited:

AyashiKaibutsu

Diamond Member
Jan 24, 2004
9,306
4
81
Something like PhysX I can understand devs needing help with, but did Rocksteady, for example, really need help implementing AA in Batman? That's really hard to believe (but I could be wrong). Batman was not the first UE3 game with AA, was it?

At least Borderlands didn't have AA, and I'm pretty sure there are many other titles that don't.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
The documentation I speak of is end-user observations. From the various websites I visited, nobody spoke of widespread crashes or graphical corruption. This subject was under a LOT of scrutiny, and from what I read, nobody saw any graphical differences between the Nvidia MSAA and the ATi MSAA. No, I don't recall seeing a scene-by-scene comparison between the two. Did you happen to hear that there was a difference between them?
Again, those tests can only show that the code doesn't crash on some setups that use ATI. Sorry, QA is more than having someone play the game. If it were that simple, no QA would be needed.

Simply use the ID of a recent Nvidia card, such as a GTX260. That would mean, for vendor ID use 10DE and for the device ID use 5E2. For the device description, use 'NVIDIA GeForce GTX 260'. It shouldn't matter which Nvidia device ID you use, as long as it is newer than an 8800GT.
Gee, when you say "shouldn't," you mean you are not sure, right? Have you e-mailed ATI technical support for an exact setting?

Since nobody has been reporting BSODs in B:AA while using the vendor ID spoofing, I don't think you need to worry.
You are avoiding the question, my friend, unless you believe Nvidia is some kind of god who can create code that won't break.
Murphy's law states that "anything that can go wrong will go wrong."
Except that it has already been shown to work effectively on ATi cards. If it increases performance, even if not quite to the same level as an Nvidia card, who cares? Besides, from what I've read, the AA used is a standard implementation. There's nothing in it that is Nvidia-specific.
What do you mean by "effectively"? Are you saying that ATI is incapable of producing code that runs as effectively as the code written by Nvidia? Or do you really mean ATI is incapable of producing code that enables MSAA at all?

You have read it wrong. First you need to understand what AA is, then how to measure the quality of AA. For example, you start a game and select 16x MSAA. That means you use standard AA with 16 samples per pixel. How you came to the conclusion that the code Nvidia wrote to make MSAA actually work through UE3 is free, and that Nvidia has no ownership of it, is beyond me. If you don't like what Nvidia wrote, then don't use it.

Because PhysX is blocked from operating in conjunction with an ATi card in the PhysX driver itself. There is no Nvidia driver necessary to run Batman:AA on an ATi card, so the lockout had to be implemented using device IDs within the game.
You either fail to understand my question or are simply not answering it.

To sum up what you said here: you are saying that this is robust code developed by Nvidia, QAed and well documented by some users who randomly chose a vendor/device ID belonging to any Nvidia card newer than an 8800GT. Maintenance is not needed, and it will never fail on any ATI video card under any circumstances. This piece of code is supposed to be free for anyone who wants to use it, but evil Nvidia only allows the select few they see fit to use it, and won't let others who would also benefit from it use it.

Am I right?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
So I guess all the game developers who are accepting ATi assistance with DX11 effects should lock out those effects from ever working on Nvidia cards?

And that helps us, the end users, and the PC gaming industry as a whole... how?

:rolleyes:

60-70% of people own NVIDIA cards. I doubt any game developer would want to shoot for such a small audience as the remainder. Besides, there are hundreds of games in the TWIMTBP program, all of which have benefited PC gamers... even the ones with ATI cards :shock: ...

Also, AA does work in Batman on ATI cards, so this is just another ATI fan rant with zero substance. Hopefully Fermi will push NVIDIA up to 80% market share or more so that PC gaming will continue to be supported.
 
Last edited:

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
I think you are confused. GSC added MSAA to the game in a patch. However, it cost too much FPS and required DX10.1. ATI released a hotfix for its driver to ease the hit. Not that ATI is bad, but they did not aid Nvidia on this in any way, as ATI's driver doesn't work on Nvidia's hardware.

I know it was from a patch...this is from the patch release notes:
"Added MSAA for alpha-tested objects ("alpha-to-coverage" for DX10.0, custom "alpha-to-coverage" for DX10.1 for better performance, native DX10.1 implementation for better quality)."

DX10 did have AA from what I've read, but you got better performance running DX10.1. My point stands... ATI did help implement MSAA and did NOT lock it to their own cards.
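For anyone wondering what "alpha-to-coverage" in those notes refers to: it's a standard blend-state flag that lets MSAA smooth the edges of alpha-tested geometry like fences and foliage. A rough sketch in D3D10 form (my own illustration, not GSC's actual code):

// Enabling alpha-to-coverage through a standard D3D10 blend state.
// Illustration only; not taken from the Clear Sky patch.
#include <d3d10.h>

ID3D10BlendState* CreateAlphaToCoverageState(ID3D10Device* device)
{
    D3D10_BLEND_DESC desc = {};
    desc.AlphaToCoverageEnable = TRUE;     // MSAA derives sample coverage from alpha
    desc.SrcBlend       = D3D10_BLEND_ONE; // blending itself stays disabled
    desc.DestBlend      = D3D10_BLEND_ZERO;
    desc.BlendOp        = D3D10_BLEND_OP_ADD;
    desc.SrcBlendAlpha  = D3D10_BLEND_ONE;
    desc.DestBlendAlpha = D3D10_BLEND_ZERO;
    desc.BlendOpAlpha   = D3D10_BLEND_OP_ADD;
    for (int i = 0; i < 8; ++i)
        desc.RenderTargetWriteMask[i] = D3D10_COLOR_WRITE_ENABLE_ALL;

    ID3D10BlendState* state = nullptr;
    device->CreateBlendState(&desc, &state);  // bind with OMSetBlendState
    return state;
}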
 
Last edited:

Rebel_L

Senior member
Nov 9, 2009
457
66
91
Excuse my ignorance of the TWIMTBP program or the ATI one, but is it normal for developers/studios to accept pieces of code to put into a game without also taking ownership of said code so that they can modify it if the need arises?

I'm no programmer and don't have any experience in the field at all, but from observing MMORPG patches it seems like almost any code can be a problem when you start patching things. Wouldn't a developer/studio insist on having the rights to do with the code what they please before actually putting it into a release product? I suppose a game like Batman isn't really made expecting major patches, but it still seems like a very odd practice to me.
 
Dec 30, 2004
12,553
2
76
:rolleyes:

60-70% of people own NVIDIA cards. I doubt any game developer would want to shoot for such a small audience as the remainder. Besides, there are hundreds of games in the TWIMTBP program, all of which have benefited PC gamers... even the ones with ATI cards :shock: ...

Also, AA does work in Batman on ATI cards, so this is just another ATI fan rant with zero substance. Hopefully Fermi will push NVIDIA up to 80% market share or more so that PC gaming will continue to be supported.

You're right. AMD doesn't do anything to work with developers. :/
http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/1