This is also why you can't blame Blizzard. The AA support has to come from the vendor's driver. You can blame AMD for not providing it for their customers, or blame DX9 for not supporting it in the first place (fixed in DX10, though).
So you're saying that AMD should have programmed Starcraft for Blizzard? Blizzard chose not to include AA in the game. A boneheaded move. AMD had nothing to do with that. The only fault AMD has is that they should have kept feature parity with nVidia and spent the extra resources to force it in. But I don't see how one can come to the conclusion that this is somehow AMD's fault. Convoluted reasoning at its best.
After Batman: Arkham Asylum I would have thought AMD learned their lesson. Their response about performance insults everyone who reads that article. The game is not that demanding, with newer cards hitting 100fps+, so there is plenty of horsepower left for AA. From the screenshots it looks like 4xAA is more than enough to make the game look good.
I read somewhere that they have been working on the game for at least 6 years. So that would probably rule out DX10. Not to mention that they also want it to run on a wider range of hardware.
The only lesson AMD can take from the Batman fiasco is not to put in vendor-locked code, or you will be blitzed by negative PR.
As I stated, as soon as I got the game I'd investigate the RadeonPro utility. It doesn't work. However, one thing that was reinforced is my belief that you won't notice it. Seriously, if you are playing Starcraft 2 and have time to zoom up really, really close to see if there are any jaggies, you're playing it wrong. In a new game we all ogle the graphics. Quite a few of us are going to pause the game here and there or take a few screencaps to look at and analyze.
However, when playing the game, I didn't have time to notice any jaggies, and if you're playing at 16x10 or higher resolutions, it's barely noticeable even when looking at the screen. You really gotta zoom up way close in a screencap to notice it. Meaning it's there, but it's not there.