No AA in StarCraft II with ATI Cards

Status
Not open for further replies.

Creig

Diamond Member
Oct 9, 1999
What a stupid response by AMD. Doesn't all AA impact performance?!?!?!?!?!?!?

Did you add enough question marks and exclamation points?!?!?!?!?!?!?

Of course AA affects performance. But what AMD actually said was, "support for AA within StarCraft II would only be made possible by including it in the driver, an approach that could significantly impact performance."

The way I read that statement is that driver-level AA would incur a larger framerate hit than game-engine AA, so they chose not to go that route. But they also said, "In discussions during the development of StarCraft II, Blizzard indicated that they would not initially include options to set levels of in-game anti-aliasing (“AA”)."

The word "initially" leaves open the possibility that AMD is currently working with Blizzard to have in-game AA added at a future date. In addition, "We will continue to work with Blizzard on this matter and hope to offer our customers an acceptable AA solution at a later date."

Remember, AMD is the company that helps developers make games look good for everybody. Nvidia is the company that helps developers make games look good only on Nvidia hardware. So to have AMD assisting Blizzard with an in-game AA that anybody can use rather than a driver level AA that only AMD users would benefit from would be completely in character for them.
 

SlowSpyder

Lifer
Jan 12, 2005
I think this is being blown out of proportion...

While it sucks that AA isn't an option, I don't see how this is AMD's fault. The only fault I see on AMD's part is that Nvidia went above and beyond for their customers by creating a workaround, while AMD has not (yet?). A few of the more vocal ones are all over AMD for this, but where is the anger at Blizzard? It's midway through 2010; AA should be available.

And, if AMD and Nvidia start adding features that should be added by the developer, where does it stop? What other features will developers just get lazy on and expect hardware manufacturers to implement?

While I'm not happy that I won't have AA (at least not initially) I also don't fault AMD here as much as some.
 

SirPauly

Diamond Member
Apr 28, 2009
Originally Posted by Creig
Remember, AMD is the company that helps developers make games look good for everybody. Nvidia is the company that helps developers make games look good only on Nvidia hardware. So to have AMD assisting Blizzard with an in-game AA that anybody can use rather than a driver level AA that only AMD users would benefit from would be completely in character for them.

While you wait for that great day, nVidia customers can enjoy AA in StarCraft.
 

vshin

Member
Sep 24, 2009
AA is a nice luxury but as soon as average fps drops below 60, it's the first thing I turn off to speed things up. In terms of value, it's a marginal benefit for relatively high cost in performance (up to 20 fps!). I don't notice the difference unless I look at a magnified screenshot.
 

SirPauly

Diamond Member
Apr 28, 2009
Originally Posted by SlowSpyder
And, if AMD and Nvidia start adding features that should be added by the developer, where does it stop? What other features will developers just get lazy on and expect hardware manufacturers to implement?

I think it is great to see IHVs be proactive in trying to improve the experience over what the developers intended, whether through getting in-game AA included, enhancements like transparency AA, stereo 3D, GPU physics or processing, or multi-monitor support.

There is no blaming or pointing fingers. Right now, StarCraft on AMD hardware looks like what the developers intended. With nVidia there are AA enhancements. That's just the way it is.
 

akugami

Diamond Member
Feb 14, 2005
Originally Posted by SirPauly
While you wait for that great day, nVidia customers can enjoy AA in StarCraft.

Below is what I wrote in this thread on page 2. If people using Radeon cards wanted AA, they can already enable it.

And as I said in the other thread, there is RadeonPro and ATI Tray Tools to override the AA settings and enable AA in Starcraft 2 if you really feel you can't live without it.
 

Dribble

Platinum Member
Aug 9, 2005
Originally Posted by Creig
Remember, AMD is the company that helps developers make games look good for everybody. Nvidia is the company that helps developers make games look good only on Nvidia hardware. So to have AMD assisting Blizzard with an in-game AA that anybody can use rather than a driver level AA that only AMD users would benefit from would be completely in character for them.

Translated ...

Remember, AMD is the company that talks about helping developers make games look good for everybody, but never actually does it. Nvidia is the company that actually helps developers make games look good only on Nvidia hardware. So to have AMD talking about assisting Blizzard with an in-game AA that anybody can use rather than a driver level AA that only AMD users would benefit from would be completely in character for them.

Action > words imo
 

BFG10K

Lifer
Aug 14, 2000
Originally Posted by akugami
Below is what I wrote in this thread in page 2. If people using Radeon cards wanted AA, they already can enable it.
Link? If there’s no specific code in the driver to handle Starcraft’s engine, those utilities won’t do squat.
 

Grooveriding

Diamond Member
Dec 25, 2008
Enabling AA in Starcraft 2 on my system gives a large performance hit. Not sure how viable it is for those with single/lower end nvidia cards.

Also, it causes anomalies in all the video cutscenes. The mouths of speaking characters and other fast-moving objects in isolated areas of the screen develop what I can only call some sort of blurring effect. It just looks weird.

That said, enabling AA does work. :thumbsup:
 

heat901

Senior member
Dec 17, 2009
I don't know what everyone is crying about... I played the beta a handful of times with my 4870 and it looks great without AA... I really do not know how much better it is going to look with AA on, to be honest. I think everyone is crying over something stupid. If this was an FPS I could understand everyone going crazy, but this is an RTS where you are zoomed out high above the clouds like a benevolent god!!!!
 

akugami

Diamond Member
Feb 14, 2005
Originally Posted by BFG10K
Link? If there’s no specific code in the driver to handle Starcraft’s engine, those utilities won’t do squat.

Admittedly I haven't tried this out. Heading out to BB now to get Starcraft 2. The location I'm going to had a midnight launch but I was out getting wasted with some friends. I'll let you guys know how it goes.

http://forums.amd.com/game/messageview.cfm?catid=260&threadid=136531&highlight_key=y

Found the info originally in the link above. Below is the Guru3D thread on it.

http://forums.guru3d.com/showthread.php?t=322031
 

SirPauly

Diamond Member
Apr 28, 2009
Originally Posted by heat901
I don't know what everyone is crying about... I played the beta a handful of times with my 4870 and it looks great without AA... I really do not know how much better it is going to look with AA on, to be honest. I think everyone is crying over something stupid. If this was an FPS I could understand everyone going crazy, but this is an RTS where you are zoomed out high above the clouds like a benevolent god!!!!



Having an AA choice is stupid -- no doubt about it. The key is if one desires to feel like a benevolent god, high in the clouds, well, one has the choice to disable AA on nVidia products as well.
 

Scali

Banned
Dec 3, 2004
If I were to blame anyone, it'd be the developers. If they chose to use DX10.1 or DX11, they could enable AA via a standard API, and vendor-specific solutions wouldn't be required in the first place.
Because they chose DX9 and still went for a deferred rendering approach, they ruled out AA in a vendor-agnostic way.
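Scali's ordering argument can be sketched in a few lines. This is a toy illustration (all function names and numbers are hypothetical, not Blizzard's code): with forward rendering, DX9's implicit MSAA resolve averages already-shaded colors, while a deferred renderer on DX9 resolves the G-buffer before the lighting pass can read it.

```python
# Toy sketch of the two pass orderings discussed above.

def rasterize_msaa(tri):
    # 4 subsample depths for one pixel on a triangle edge (made up)
    return [0.1, 0.1, 0.9, 0.9]

def resolve(samples):
    # the fixed-function MSAA resolve: average the subsamples
    return sum(samples) / len(samples)

def shade(depth):
    # stand-in for any nonlinear lighting/shading function
    return 1.0 / (1.0 + depth * depth)

# Forward DX9: rasterize -> shade per subsample -> resolve. AA works.
forward = resolve([shade(s) for s in rasterize_msaa("tri")])

# Deferred on DX9: rasterize -> resolve -> shade. The subsamples are
# gone before lighting runs, so the edge pixel is shaded from a
# blended, meaningless depth.
deferred = shade(resolve(rasterize_msaa("tri")))

print(forward == deferred)  # False: the deferred path lost the AA
```

Because shading is nonlinear, the two orderings produce different pixels, which is exactly why the resolve has to happen after shading for AA to be correct.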
 

Creig

Diamond Member
Oct 9, 1999
Translated ...
Originally Posted by Creig
Remember, AMD is the company that talks about helping developers make games look good for everybody, but never actually does it. Nvidia is the company that actually helps developers make games look good only on Nvidia hardware. So to have AMD talking about assisting Blizzard with an in-game AA that anybody can use rather than a driver level AA that only AMD users would benefit from would be completely in character for them.



Action > words imo

Sooooo..... The DX11 content that both AMD and Nvidia users are enjoying today had nothing at all to do with AMD, their programmers or their developer relations teams?
 

Wreckage

Banned
Jul 1, 2005
It has been said before, but mostly ignored, so I'll reiterate it:
The game uses deferred rendering, which means that the z-buffer contents are determined first (along with other per-pixel data in other buffers), and shading is applied later, as a sort of post-processing stage, not directly geometry-based.
The problem with DX9 is that AA only works when rendering geometry directly. The AA is resolved implicitly. So if you want to do anything later, in a post-processing stage, the subsample information for AA is no longer available.
Forcing AA on in the control panel is not going to solve this, it will not produce correct results.

What nVidia has done is add extra code using driver extensions, taking advantage of their DX10+ hardware, which DOES allow you to access subsample information in later render passes.
So this is nVidia-specific code, which cannot be done with vanilla DX9 code, nor by forcing AA in the control panel.
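A quick numeric sketch (toy numbers, not real G-buffer data) shows what per-subsample access in the lighting pass buys a deferred renderer, and why forcing AA on a DX9 path produces incorrect results:

```python
# Per-subsample G-buffer access (the DX10-style capability described
# above, unavailable in vanilla DX9) lets the lighting pass shade each
# subsample and resolve afterwards. All numbers are made up.

def lighting(depth):
    return 1.0 / (1.0 + depth * depth)   # some nonlinear lighting

gbuffer_subsamples = [0.1, 0.1, 0.9, 0.9]  # edge pixel, two surfaces

# DX9 deferred path: subsamples were already averaged away, so the
# lighting pass shades one blended depth -> incorrect edge result.
dx9_result = lighting(sum(gbuffer_subsamples) / 4)

# DX10-style path: read each subsample in the lighting shader, shade
# it, then average the shaded colors -> a proper antialiased edge.
dx10_result = sum(lighting(d) for d in gbuffer_subsamples) / 4

print(round(dx9_result, 4), round(dx10_result, 4))  # 0.8 0.7713
```

The two results differ, so overriding AA in the control panel (which can only trigger the pre-lighting resolve) cannot reproduce what the vendor-specific code does.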

If AMD wants to offer AA as well, they also need to supply custom rendering code for the game, like nVidia did.

This is also why you can't blame Blizzard. The code has to be done by the vendor. You can blame AMD for not providing the code for their customers or blame DX9 for not having it in the first place (fixed in DX10 though).

After Batman: Arkham Asylum I would have thought AMD learned their lesson. Their response about performance insults everyone who reads that article. The game is not that demanding, with newer cards hitting 100+ fps, so there is plenty of horsepower left for AA. From the screenshots it looks like 4xAA is more than enough to make the game look good.

Originally Posted by Scali
If I were to blame anyone, it'd be the developers. If they chose to use DX10.1 or DX11, they could enable AA via a standard API, and vendor-specific solutions wouldn't be required in the first place.
Because they chose DX9 and still went for a deferred rendering approach, they ruled out AA in a vendor-agnostic way.
I read somewhere that they have been working on the game for at least 6 years. So that would probably rule out DX10. Not to mention that they also want it to run on a wider range of hardware.
 

Seero

Golden Member
Nov 4, 2009
Difference here:

Batman AA - NVIDIA wrote code that was part of the actual game's code that allowed AA to be enabled on NVIDIA cards, but disabled it when an ATI card was detected.

SC2 - The game doesn't ship with AA functionality, NV has created a work around in their drivers to enable, ATI has not (yet).
Correction: the code Nvidia has written is not part of the game, but comes with the game. In fact, the two sets of code from Batman and SC2 are the same in principle, as they solve the same problem.

Another correction: it is possible to force AA on Batman, but not in SC2.

If you were in the beta, then you will know that the Nvidia driver didn't like SC2 much and Nvidia did a lot of fixes to make it work. Remember the overheating problem from the NV driver? The game isn't special; it is just another DX9 game. Why would there be so many problems with NV? Well, since bundling a vendor-specific solution is not a good idea, as there were too many negative voices, this time they embedded the code within their drivers.
 

smb

Senior member
Mar 7, 2000
I'm sure that most of you are aware that this game was written with the lowest system specs in mind... which is probably Intel integrated graphics.
 

Scali

Banned
Dec 3, 2004
Originally Posted by Wreckage
I read somewhere that they have been working on the game for at least 6 years. So that would probably rule out DX10. Not to mention that they also want it to run on a wider range of hardware.

Not really.
The 3D API is just a small part of the overall game.
You think all the DX10/11 games we have today started development AFTER DX10/11 was released? Not quite.
It's not THAT difficult to upgrade the rendering API in a game. Especially with D3D, where you won't have to rewrite your shaders to use them in DX10/11.
For such a big title, I would think it'd be worth the investment.

Besides, you can keep DX9 around, side-by-side. A lot of DX10/11 games still have a DX9 path for XP.
 

golem

Senior member
Oct 6, 2000
Originally Posted by Creig
Sooooo..... The DX11 content that both AMD and Nvidia users are enjoying today had nothing at all to do with AMD, their programmers or their developer relations teams?

Besides maybe Dirt2, what other DX11 game did AMD help with?

Actually, I heard they were helping with the DX11 features of Dirt2, but what did that involve?
 

lopri

Elite Member
Jul 27, 2002
I've got the game and started the campaign, and the game is pretty taxing on graphics at Ultra setting. My stance hasn't changed in that AMD should get AA sorted out ASAP, but it looks like AA is out of the question for competitive play. FPS drops below 60 frequently @2560x1600, everything maxed out. But watching replays with AA would be nice. A couple screenshots from the first mission.

Cutscene (direct link, 2560x1600)

In-game (direct link, 2560x1600)
 