Is there more to it than TWIMTBP? (Personal commentary)


evolucion8

Platinum Member
Jun 17, 2005
Keys, please, you can do better than that. Calling a lock an "enabler" only passes for a valid reason among fanboys or people who cannot see the truth. The only wrong thing in the code is the vendor-ID lock, period. The anti-aliasing works on ATi hardware with no problems whatsoever. Nvidia only needs to QA the anti-aliasing and its performance on their own hardware and forget about ATi; there is no need to spend time putting in a lock. If the game has issues on ATi, then ATi should be the one to blame, and the one to fix it. What's in there is simply a lock, not an enabler: I was able to "enable" the anti-aliasing with the Device ID cheat, and it worked fine with no artifacts and better performance than standard anti-aliasing.
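To make concrete what is being argued over, here is a minimal sketch of what a vendor-ID gate of this kind looks like under Direct3D 9. The function name and surrounding logic are hypothetical, not taken from Batman: AA's actual code; only the GetAdapterIdentifier call and the PCI vendor IDs are real.

Code:
#include <windows.h>
#include <d3d9.h>

// Hypothetical sketch of a vendor-ID gate in a D3D9 title.
bool InGameMsaaAllowed(IDirect3D9* d3d, UINT adapter)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(adapter, 0, &id)))
        return false;  // cannot identify the adapter, hide the option

    // 0x10DE is NVIDIA's PCI vendor ID; ATi adapters report 0x1002 and are
    // steered past the in-game MSAA option, even though the MSAA code
    // itself runs on any DX9 part.
    return id.VendorId == 0x10DE;
}

Note that nothing in the check touches rendering; it only decides who is offered the option, which is why spoofing the reported ID is enough to switch the feature back on.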

It would be as if ATi thought that the DX11 path of DiRT 2 would not run well on Fermi and then put a lock, er, "enabler", on DX11 so that only the DX9 path is enabled when nVidia hardware is detected. The same trick was pulled back in the Half-Life 2 days with the DX8 path on the GeForce FX series, and it was foul play. Even though the FX series didn't really support DX9 the way it was supposed to, that was nVidia's problem; ATi didn't have the right to lower the DX path for the GeForce FX.
 

Keysplayr

Elite Member
Jan 16, 2003
evolucion8 said:
Keys, please, you can do better than that. Calling a lock an "enabler" only passes for a valid reason among fanboys or people who cannot see the truth. The only wrong thing in the code is the vendor-ID lock, period. The anti-aliasing works on ATi hardware with no problems whatsoever. Nvidia only needs to QA the anti-aliasing and its performance on their own hardware and forget about ATi; there is no need to spend time putting in a lock. If the game has issues on ATi, then ATi should be the one to blame, and the one to fix it. What's in there is simply a lock, not an enabler: I was able to "enable" the anti-aliasing with the Device ID cheat, and it worked fine with no artifacts and better performance than standard anti-aliasing.

It would be as if ATi thought that the DX11 path of DiRT 2 would not run well on Fermi and then put a lock, er, "enabler", on DX11 so that only the DX9 path is enabled when nVidia hardware is detected. The same trick was pulled back in the Half-Life 2 days with the DX8 path on the GeForce FX series, and it was foul play. Even though the FX series didn't really support DX9 the way it was supposed to, that was nVidia's problem; ATi didn't have the right to lower the DX path for the GeForce FX.

Umm, no, I can't do better than the truth. You pretending that it isn't the truth doesn't help anybody, well, except whoever you intend to help.
If a game has issues with ATI, guess who will get the fallout? Eidos. It is what it is, EV.
And it's most definitely not what you'd love it to be: a lockout. That's a word you love to use because it puts an image of Nvidia with devil horns on its corporate head.
I wouldn't bother with this topic if it hadn't gone so far. You see, and create, all the flak you read on the net about this subject. Nvidia tried to make this game better than it was, and did, in a couple of ways. You do not appreciate this because you can't, or won't let yourself, acknowledge it.

As for the comparison of ATI locking out the DiRT 2 DX11 path? You know what? I know Nvidia would work with the devs to get it done for NV hardware. This is a no-brainer. It's not Nvidia that sits around and lets the other guy do the heavy lifting. That title belongs to ATI and Richard Huddy's "this is not a commitment" mentality. How hysterical is that?

@ Creig: In this post, there are facts along with some of my opinions about "what ifs", as per Evolucion8.
 

NIGELG

Senior member
Nov 4, 2009
Keysplayr said:
Umm, no, I can't do better than the truth. You pretending that it isn't the truth doesn't help anybody, well, except whoever you intend to help.
If a game has issues with ATI, guess who will get the fallout? Eidos. It is what it is, EV.
And it's most definitely not what you'd love it to be: a lockout. That's a word you love to use because it puts an image of Nvidia with devil horns on its corporate head.
I wouldn't bother with this topic if it hadn't gone so far. You see, and create, all the flak you read on the net about this subject. Nvidia tried to make this game better than it was, and did, in a couple of ways. You do not appreciate this because you can't, or won't let yourself, acknowledge it.

As for the comparison of ATI locking out the DiRT 2 DX11 path? You know what? I know Nvidia would work with the devs to get it done for NV hardware. This is a no-brainer. It's not Nvidia that sits around and lets the other guy do the heavy lifting. That title belongs to ATI and Richard Huddy's "this is not a commitment" mentality. How hysterical is that?

@ Creig: In this post, there are facts along with some of my opinions about "what ifs", as per Evolucion8.
Unfortunately, we will never know if money changed hands, and of course the developer will not tell you if it were true. You only have their word, which you believe is gospel truth. Huddy's words are not gospel truth either; maybe only a forensic audit will tell us whether money changed hands. But the overall bad guy, in many people's opinion, is Nvidia, and that will not change anytime soon unless Nvidia changes some of their tactics with their proprietary stuff.
 

Creig

Diamond Member
Oct 9, 1999
Keysplayr said:
And it's most definitely not what you'd love it to be: a lockout. That's a word you love to use because it puts an image of Nvidia with devil horns on its corporate head.
You can call it an "ENABLER" until you're blue in the face, but the majority of people see it as a "DISABLER".

Yes, Nvidia did a good thing by putting forth the effort to get MSAA included in B:AA. Nobody is disputing that. But then they went and disabled it for anybody not running an Nvidia card. In other words, they locked the feature out from working on anybody else's card. Hence the term 'lockout'.

And that is not an opinion either, but a solid fact.

The reason that Nvidia rightfully caught flak for it is that it is bad for consumers. It is bad for ATi owners because even though they paid the same price for the game as an Nvidia owner, they cannot access the same level of content even though their video card is perfectly capable of displaying it. It is also bad for Nvidia owners because it now gives ATi an excuse to do the same to them.

ATi is working with developers to add DX11 enhancements to current and upcoming titles. Even though my current gaming rig has an ATi card in it, I don't want to see ATi locking out features any more than I want Nvidia to. If ATi starts locking out features from games, that means I won't have access to those graphics if I decide to switch to an Nvidia card later on down the road.

It doesn't matter if you're a current member of the green team or the red team. Both sides should realize that what Nvidia did was not in our best interests and should voice their displeasure so that Nvidia knows it. After all, it's our wallets that ultimately decide if a company sinks or swims. If Nvidia's policies push enough people away from them and towards ATi, they will rethink their closed-door strategy.



Keysplayr said:
I wouldn't bother with this topic if it hadn't gone so far. You see, and create, all the flak you read on the net about this subject. Nvidia tried to make this game better than it was, and did, in a couple of ways. You do not appreciate this because you can't, or won't let yourself, acknowledge it.
You're wrong. People DO see that Nvidia made B:AA a better game. But what you refuse to see (or acknowledge) is that they were wrong to make it a better game for Nvidia users only.



Keysplayr said:
As for the comparison of ATI locking out the DiRT 2 DX11 path? You know what? I know Nvidia would work with the devs to get it done for NV hardware. This is a no-brainer.
Why should one company have to waste resources re-coding something that another company has already done? How does that help us? So rather than one company spending money to improve game A and the other company helping with game B (which is how it has been up until now), both companies end up spending money on identical improvements to game A because of a lockout.

So now both companies are spending money to do the same improvements to the same game and have less time and money available to improve other titles.

Yes, that's real progress there.



Keysplayr said:
It's not Nvidia that sits around and lets the other guy do the heavy lifting. That title belongs to ATI and Richard Huddy's "this is not a commitment" mentality. How hysterical is that?
If you want to totally ignore the work that ATi's engineers and programmers have done in the past and continue to do today, that's up to you. But whether you like it or not, that work has directly improved, and will keep improving, the gaming experience for ATi and Nvidia owners alike. In the future, Fermi owners will benefit from the work that ATi is doing with DX11 titles. And unlike Nvidia, ATi has no plans to lock that content solely to their own cards:

It was not my intention to claim that DX11 is an AMD-only feature, and I think if you re-read my question you'll see I did not suggest that.

My point here is that if NVIDIA believes that it's right for NVIDIA to lock DX antialiasing support to NVIDIA-only hardware in a game, then I assume you'd agree that AMD locking DX11 functionality in upcoming games would also be right. Yes?

Doesn't doing this undercut the whole point of DirectX?

If you also take note of Tim Sweeney's statements (available on BrightSideOfNews) that the DX10 version of UE3 fully supports antialiasing, then this whole situation is clearly just NVIDIA locking generic functionality to its own hardware.

Can we have a poll on Hexus please asking the simple question "Do you feel that AMD would be right to lock any generic DirectX 11 functionality in forthcoming DirectX 11 games (like DiRT2 or AvP etc) so that these features are available only on AMD's hardware?"

I'd be astonished if the answer on a poll came out as yes. And right now I can tell you that AMD does _NOT_ plan to do it!

Thanks,


Richard Huddy [Graphics Developer Relations, AMD]



As Painman stated so directly, succinctly (and graphically):

Painman said:
You're so frantically swinging your big green dick, Jen-Hsun, whacking anyone and everyone including your own customers, that you swung it underneath your own foot.


Again.
 

Wreckage

Banned
Jul 1, 2005
Creig said:
Why should one company have to waste resources re-coding something that another company has already done? How does that help us?
Why should one company spend resources testing code on another company's systems? Who would do that?

Creig said:
If you want to totally ignore the work that ATi's engineers and programmers have done in the past and continue to do today, that's up to you.

If you want to totally ignore the work that NVIDIA's engineers and programmers have done in the past and continue to do today, that's up to you. Hundreds of games have benefited from TWIMTBP, and they work just fine on ATI cards. The Source engine is a fine example of this.


You can repeat that this is a big issue over and over, but that won't make it true. It's only a big issue for the ATI fanatics, and I doubt NVIDIA really cares what they say, because those people will go out of their way to bash NVIDIA no matter what.

DiRT 2 gets delayed for several months and then conveniently lacks SLI support. Of course that's different because... :rolleyes:

I'm sure AMD paying Codemasters $1,000,000 had nothing to do with it. The door swings both ways; you just refuse to see it. AMD admitted they could write the code to include AA, which would make the whole issue go away. So why don't you petition them?
 

Seero

Golden Member
Nov 4, 2009
NIGELG said:
Unfortunately, we will never know if money changed hands, and of course the developer will not tell you if it were true. You only have their word, which you believe is gospel truth. Huddy's words are not gospel truth either; maybe only a forensic audit will tell us whether money changed hands. But the overall bad guy, in many people's opinion, is Nvidia, and that will not change anytime soon unless Nvidia changes some of their tactics with their proprietary stuff.
Proving something is one thing; pointing at the moon and insisting it is square is another. There are ethics in this world, though they are very hard to define. Intentionally damaging a competitor is said to be unethical; doing better than a competitor is ethical. Having vendor-specific functionality is not new, and it is ethical. Your car has a sunroof? My car is a 4WD. Eventually, if a feature is good, it gets adopted by other vendors and becomes a standard feature.

Think of Nvidia as a movie producer, and the ATI users here as the people who download the movie over BitTorrent. Nvidia created PhysX and the TWIMTBP team. In Batman: AA, a few ATI users tried to use what Nvidia produced by using a cheap Nvidia card plus a hack to enable everything that TWIMTBP created. Ethically it is wrong, but ethics mean next to nothing on the internet. The bottom line is that users "can" use "it" at a much lower cost, which is unethical, but okay.

What is NOT okay is that there are a few people who claim TWIMTBP is actually bad, that making a game work better only for a select few is bad and damages the game industry. Yet they are the same people who pay little or nothing to use the very things they claim are bad. Instead of admitting this, they accuse TWIMTBP of locking away something that should be open to everyone. So they go around telling people that hacking is fine, teaching others how to do it, and at the same time bad-mouthing the ones who made the things they are hacking for: the good stuff that was made, or featured, by TWIMTBP.

There can only be two outcomes: a) TWIMTBP continues to create things exclusive to Nvidia users, and in return generates more sales of Nvidia products and thus indirect revenue; or b) TWIMTBP comes to be seen as a bunch of suckers who do nothing good for the company and gets cut to reduce expenses.

If Nvidia pulls the plug, TWIMTBP will cease to exist. While a few ATI users with no sense of ethics will be happy, every other gamer should cry out loud, because a part of gaming will have just died.

Sending out a team to assist developers in retrofitting vendor-specific code is ethical. If they deliberately inserted code to cause competing vendors' hardware to crash or perform badly, that would be unethical, but that isn't happening. And not maximizing a competing vendor's hardware is not unethical; preventing others from doing so would be, and that is not happening either. Seriously, they are not angels, just a group of employees with skills. The question is: where is the counterpart from ATI? If GITG can come out once again, then we can "Get In The Game" which is "The Way It's Meant To Be Played." As of now, Richard Huddy is working hard and doing a supreme job in his field, but that isn't what I want. I want ATI to make gaming better the way Nvidia does. I want these programs to become a standard, and only then can we truly get into the next phase of gaming.

What I really wanted to know from the beginning is: did the programmers who coded that MSAA get punished? Keys, if you have the connections, can you pass the following along to the team for me, please?
"Forget what the naysayers said; their purpose in life is to say negative things. I don't know if you got punished for the work that caused all this, but to me, you made the game better with your own hands. Your hands added flavor to what was already good, making it better. You showed us it is possible, regardless of what others said. Without you, we would not have known The Way It's Meant To Be Played. Unfortunately you won't be known as heroes or inventors, but there are people like me who see it and really want to say it to your faces: thank you. (Edit) And guess what: what the naysayers are really saying is that this is 'The Way It's Meant To Be Played'."
 

Creig

Diamond Member
Oct 9, 1999
Seero said:
Sending out a team to assist developers in retrofitting vendor-specific code is ethical. If they deliberately inserted code to cause competing vendors' hardware to crash or perform badly, that would be unethical, but that isn't happening.
Unfortunately, that's exactly what did happen in Batman:

Richard Huddy said:
'Amusingly', it turns out that the first step is done for all hardware (even ours) whether AA is enabled or not! So it turns out that NVidia's code for adding support for AA is running on our hardware all the time - even though we're not being allowed to run the resolve code!
So… They've not just tied a very ordinary implementation of AA to their h/w, but they've done it in a way which ends up slowing our hardware down (because we're forced to write useless depth values to alpha most of the time...)!
 

Seero

Golden Member
Nov 4, 2009
Creig said:
Unfortunately, that's exactly what did happen in Batman:
Sorry to say, but are you aware that you keep quoting speeches from the same person, whose job is to engineer speeches that make his company look good? Can you get someone from Eidos or Rocksteady to confirm this? Can you link a performance chart showing that ATI hardware runs unusually badly in Batman: AA?

(edit)
Richard Huddy said:
'Amusingly', it turns out that the first step is done for all hardware (even ours) whether AA is enabled or not! So it turns out that NVidia's code for adding support for AA is running on our hardware all the time - even though we're not being allowed to run the resolve code!
So… They've not just tied a very ordinary implementation of AA to their h/w, but they've done it in a way which ends up slowing our hardware down (because we're forced to write useless depth values to alpha most of the time...)!
To be honest, I tried to understand what Richard meant when he said it, but I just can't. I guess what he meant was:
'Amusingly', it turns out that the first step is done for all hardware (even ours) whether AA is enabled or not! So it turns out that NVidia's code for adding support for AA is running on our hardware all the time - even though we're not being allowed to run the resolve code!
The game code doesn't have anything special in it. The code Nvidia owns runs fine on hardware from every vendor, and with it MSAA can be enabled regardless of vendor, but it is restricted to Nvidia hardware.
So… They've not just tied a very ordinary implementation of AA to their h/w, but they've done it in a way which ends up slowing our hardware down
That means the code enables MSAA on Nvidia's hardware, and ATI users are being forced to use FSAA instead.
(because we're forced to write useless depth values to alpha most of the time...)!
Because having the scene in 2D doesn't require depth values most of the time. In 3D they mean a lot, but ATI doesn't support it.

The way he said it makes it sound like Nvidia put something into the code that prevents ATI hardware from performing well, without saying so directly. He also never mentioned the complexity of the code itself, just said it works on their hardware. He deliberately avoided the term MSAA and used "a very ordinary implementation of AA", which, technically speaking, is right. In fact, the whole statement is legit.

Very well said. He isn't being paid for nothing.

(Edit) I spent an hour just trying to put the right words together. Yes, my English sucks, I got it.
 

evolucion8

Platinum Member
Jun 17, 2005
Seero said:
The game code doesn't have anything special in it. The code Nvidia owns runs fine on hardware from every vendor, and with it MSAA can be enabled regardless of vendor, but it is restricted to Nvidia hardware.

That means the code enables MSAA on Nvidia's hardware, and ATI users are being forced to use FSAA instead.

I've been telling you that for ages: the code doesn't have anything special in it except the lock, aka the "enabler". That's why it ran fine through the Device ID change on my ATi card, with no issues or glitches.
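For what it's worth, the "Device ID cheat" mentioned above boils down to rewriting two fields in the adapter identifier before the game reads them. Below is a minimal sketch, with a hypothetical helper name and an illustrative device ID, of what a proxy d3d9.dll would do after forwarding IDirect3D9::GetAdapterIdentifier to the real driver.

Code:
#include <windows.h>
#include <d3d9.h>

// Hypothetical helper: a proxy d3d9.dll would call this on the struct
// returned by the real GetAdapterIdentifier, before the game sees it.
void SpoofVendorAsNvidia(D3DADAPTER_IDENTIFIER9* id)
{
    id->VendorId = 0x10DE;  // NVIDIA's PCI vendor ID (ATi parts report 0x1002)
    id->DeviceId = 0x0193;  // an illustrative GeForce-class device ID
}

Since the lock is nothing more than a comparison against that VendorId, the game then takes its NVIDIA path unchanged, which matches the report that MSAA ran fine on ATi hardware once the ID was spoofed.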
 

deimos3428

Senior member
Mar 6, 2009
The video card companies (on both sides) have dumped the problem on the end user, forcing them to take sides. They seem to want an endless fanboy war. No thanks.

As end users, demand better. What we need is an open-standard video abstraction layer. (Something similar to Hydra, though I'm not sure how open that solution might be.)
 
Dec 30, 2004
deimos3428 said:
The video card companies (on both sides) have dumped the problem on the end user, forcing them to take sides. They seem to want an endless fanboy war. No thanks.

As end users, demand better. What we need is an open-standard video abstraction layer. (Something similar to Hydra, though I'm not sure how open that solution might be.)

It's called DirectX, and apparently Nvidia will pay people to implement features for NVIDIA cards only.
 

Creig

Diamond Member
Oct 9, 1999
Seero said:
Because having the scene in 2D doesn't require depth values most of the time. In 3D they mean a lot, but ATI doesn't support it.
No, Seero, it doesn't have anything to do with Nvidia 3D Vision, but rather with the additional pixel depth values that are being written. Here is a good explanation of what is going on:

The Unreal engine needs a depth value for every pixel to render correctly. Reading back depth values isn't supported at all in DX9, but nvidia and ATI have implemented some "driver hacks" to work around this limitation. Unfortunately, this works only for non-AA depth buffers. Epic ignores this problem, and therefore the engine doesn't support AA on its own. If you force AA in the driver, the driver uses AA color and depth buffers but ensures that they look like non-AA buffers to the engine. This is a heavy operation.

To support AA directly, the engine was modified to write the depth value into the alpha channel. This requires some additional shader instructions. Later, the AA color buffer is resolved to a non-AA buffer, and the depth value can be read from the alpha channel.

As the color buffer needs to be resolved anyway, they save the work of resolving the depth buffer that driver-forced AA needs.

The reason why they write the depth value to alpha even without AA is simple: to disable it, you would need a complete additional shader set and a switch based on the AA state. Such a change would not be an easy quick hack.
So by modifying the Unreal engine to add MSAA, Nvidia is making the engine write additional depth values. That by itself is not a problem. What is a problem is that by implementing MSAA and then DENYING ATi cards access to it, ATi cards still have to write these MSAA-only depth values without being able to use them for anything. This has the effect of artificially lowering framerates on ATi cards without any improvement in graphics whatsoever for the ATi user.

In a nutshell, the Nvidia-only MSAA causes ATi cards to run Batman:AA at a lower framerate than they normally would if the MSAA code hadn't been added in the first place. And there is absolutely no benefit at all given to ATi users in return for this decreased framerate.

That's my understanding of the situation.
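To put that summary in code terms, here is a minimal sketch, assuming a D3D9 renderer and using hypothetical names, of the one step the quoted explanation says the engine saves when its own MSAA path is allowed. The scene shaders have already spent extra instructions writing depth into alpha on every pixel, on all hardware; only cards allowed down the MSAA path ever cash that in.

Code:
#include <windows.h>
#include <d3d9.h>

// Sketch of the engine's MSAA resolve. One StretchRect yields both color
// (RGB) and readable depth (A) in a non-MSAA surface, working around DX9's
// lack of multisampled depth readback.
void ResolveSceneBuffer(IDirect3DDevice9* dev,
                        IDirect3DSurface9* msaaScene,  // multisampled RT,
                                                       // alpha carries depth
                        IDirect3DSurface9* resolved)   // plain RT for post-FX
{
    dev->StretchRect(msaaScene, NULL, resolved, NULL, D3DTEXF_NONE);
    // On the locked-out path, ATi cards did the depth-to-alpha work in the
    // shaders but never reach this call: pure per-pixel overhead with no
    // image-quality return, which is the slowdown described above.
}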
 