No AA in StarCraft II with ATI Cards


lopri

Elite Member
Jul 27, 2002
I agree with the posters above me that your analogy is way off.
There is no analogy in my previous post.

Maybe it won't really matter? It just seems very Apple-like of them; 'you don't need AA for our games' is a lot like 'we didn't make the iPhone so you can look at porn.'
Does anyone get this 'analogy'? :confused:

Why is the game being released in DX9? DX11 has been out for how long? Not even DX10?
Do you realize that Windows XP still commands more than half of the OS market? And Blizzard doesn't make Xbox and PS3 games.
 

Ika

Lifer
Mar 22, 2006
yes this.

Why is the game being released in DX9? DX11 has been out for how long? Not even DX10?

You can't be talking down ATI for laziness or missing the parade with any credibility if you are not willing to dump a big shovel of it in Blizzard's lap.

It's being released in DX9 so it can still run on my ancient 6800NU P4 Dell :) That's Blizzard's goal, not eye candy.
 

Zargon

Lifer
Nov 3, 2009
I am not saying they shouldn't support DX9, but they certainly could try to support DX10/11 as well.

There is no analogy in my previous post.

Does anyone get this 'analogy'? :confused:

Do you realize that Windows XP still commands more than half of the OS market? And Blizzard doesn't make Xbox and PS3 games.

Sorry, but your comparison doesn't hold. Not supporting a non-required feature (AA, especially in an RTS) is nothing close to not supporting the game, or to having it not run on their cards.
 

Zargon

Lifer
Nov 3, 2009
imagoon: no, you aren't; a good buddy of mine swears it gives him headaches when enabled.
 

Genx87

Lifer
Apr 8, 2002
yes this.

Why is the game being released in DX9? DX11 has been out for how long? Not even DX10?

You can't be talking down ATI for laziness or missing the parade with any credibility if you are not willing to dump a big shovel of it in Blizzard's lap.

Sure we can, when Nvidia gets it done.
 

Zargon

Lifer
Nov 3, 2009
Sure we can, when Nvidia gets it done.

:rolleyes:

So when someone releases a game that needs a driver workaround to be played at anything but 1024x768, it will be ATI's or Nvidia's fault? Good to know.

I never said ATI wasn't culpable, but Blizzard surely is as well. They left a feature their customers wanted out of the game.

So far only Nvidia has provided a workaround, and it's STILL not as good as if Blizzard had just done it right the first time.
 

Genx87

Lifer
Apr 8, 2002
:rolleyes:

So when someone releases a game that needs a driver workaround to be played at anything but 1024x768, it will be ATI's or Nvidia's fault? Good to know.

I never said ATI wasn't culpable, but Blizzard surely is as well. They left a feature their customers wanted out of the game.

So far only Nvidia has provided a workaround, and it's STILL not as good as if Blizzard had just done it right the first time.

Who cares if it is as good as Blizzard's would have been? The fact is Nvidia will have it ready at launch and AMD won't. And it isn't like this came out of nowhere for AMD. This is a AAA title. If they can't get it done for this game, they can't get it done for any game.
 

Ashkael

Member
May 5, 2010
I highly doubt Blizzard cares about making their game as pretty as Crysis or Metro 2033. After all, people are still playing SC1, a 10+ year old game. Blizzard's goal is probably to make the game as accessible as possible, even to people who are running ancient dinosaur computers.

I played the beta with a good chunk of friends, all of whom are drooling over SC2 finally getting pushed out of the door. These guys are the type of people that watch SC2 and SC1 replays and tournaments on a regular basis. I haven't heard a single one of them remark about the game's graphical quality or lack of AA features. They only talk about strategy, unit mechanics and all that stuff.

Granted, anecdotal evidence is hardly proof of anything, but when it comes to a franchise that is a cornerstone of e-sports, I'm willing to put my hand in the fire and say that a good chunk of the people buying this game couldn't care less about the graphics as long as the game is playable.

Yes, it's dumb that SC2 doesn't support AA. However, and as much as I hate to quote a tired Internet meme, if you are buying this game for the graphics you are quite honestly doing it wrong.
 

Attic

Diamond Member
Jan 9, 2010
Who cares?

AMD has been out pushing and delivering awesome DX11 GPUs for 6 months before its competition, and we are supposed to knock them over Blizzard not implementing AA in a DX9 game?

Again, how about who gives a shit. Gamers should be calling on developers to take care of this, not accepting that GPU makers make up for a game's shortcomings. If we accept this crap, then developers will have less motivation to include what gamers value.
 

imagoon

Diamond Member
Feb 19, 2003
Who cares?

AMD has been out pushing and delivering awesome DX11 GPUs for 6 months before its competition, and we are supposed to knock them over Blizzard not implementing AA in a DX9 game?

Again, how about who gives a shit. Gamers should be calling on developers to take care of this, not accepting that GPU makers make up for a game's shortcomings. If we accept this crap, then developers will have less motivation to include what gamers value.

What I find amusing is that the "force AA" box on my ATI card does exactly that. I haven't played SC2, but it does force it in games like FFXI. I actually force it to off...
 

Sylvanas

Diamond Member
Jan 20, 2004
1. This has been known for a while now; look at HC's GPU performance article.

2.
Why is the game being released in DX9? DX11 has been out for how long? Not even DX10?
This game has been in development for 6 years, well before DX10 even started to appear.

3. ATI will inevitably release a patch for AA in SC2; it's just a matter of time. The same thing happened with UE3 games: Nvidia and ATI each had to implement their own shader algorithm in order to support AA in an engine using deferred rendering. It took time, but it eventually arrived from both camps.

We all know Nvidia's developer relations are more involved than ATI's. SC2 is perhaps the biggest game this year, so positioning your product as 'the best in SC2' (considering many people will be benching this game against the competition) makes sense even with the additional expenditure required to implement AA (and later stereoscopic 3D). Nvidia has invested heavily in this, and with posts like Wreckage's all around the internet today, it's exactly what they're aiming for.
 

bunnyfubbles

Lifer
Sep 3, 2001
What I find amusing is that the "force AA" box on my ATI card does exactly that. I haven't played SC2, but it does force it in games like FFXI. I actually force it to off...

forcing it on doesn't work with SC2

as was previously mentioned, SC2 uses deferred rendering:

Another rather important disadvantage is that, due to separating the lighting stage from the geometric stage, hardware anti-aliasing does not produce correct results any more: although the first pass used when rendering the basic properties (diffuse, normal etc.) can use anti-aliasing, it's not until full lighting has been applied that anti-alias is needed. One of the usual techniques to overcome this limitation is using edge detection on the final image and then applying blur over the edges.[2] DirectX 10 introduced features allowing shaders to access individual samples in multisampled render targets (and depth buffers in version 10.1), making hardware anti-aliasing possible in deferred shading. These features also make it possible to correctly apply HDR luminance mapping to anti-aliased edges, where in earlier hardware any benefit of anti-aliasing may be lost, making this form of anti-aliasing desirable in any case.

source: http://en.wikipedia.org/wiki/Deferred_Rendering
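
To make that concrete: the edge-detect-and-blur workaround the quote describes boils down to something like the rough Python/NumPy sketch below. In a real engine this runs as a pixel shader over the final lit frame; the luminance threshold and plain box blur here are simplifications picked for illustration, not anyone's actual implementation.

    import numpy as np

    def edge_blur_aa(frame, threshold=0.1):
        # frame: float array of shape (H, W, 3) in [0, 1], the fully
        # lit image coming out of the deferred pipeline.
        luma = frame @ np.array([0.299, 0.587, 0.114])  # Rec. 601 luminance

        # Crude edge detection: luminance difference against the
        # left and upper neighbors.
        gx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
        gy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
        edges = (gx + gy) > threshold

        # 3x3 box blur of the whole frame (a cheap stand-in for the
        # smarter directional filters real implementations use).
        h, w = luma.shape
        padded = np.pad(frame, ((1, 1), (1, 1), (0, 0)), mode='edge')
        blurred = sum(padded[y:y + h, x:x + w]
                      for y in range(3) for x in range(3)) / 9.0

        # Replace only the detected edge pixels with their blurred
        # versions; everything else keeps full sharpness.
        out = frame.copy()
        out[edges] = blurred[edges]
        return out

This is also why simply forcing MSAA in the driver does nothing here: the hardware resolve happens before the lighting pass, so lighting gets computed on already-averaged data. The DX10 per-sample access mentioned in the quote exists precisely so a shader can do its own resolve after lighting.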
 

Sahakiel

Golden Member
Oct 19, 2001
For SC2, Blizzard has a specific set of goals in mind.

One of them is the oft-quoted emphasis on making games playable for as many people as possible. It's a cornerstone of their technical decisions and of their gameplay decisions (looking at you, WoW).
That's why the game engine is DX9. OS and video card compatibility as well as system performance mean it'll be a while before they make a game for DX11 (since it's basically an extension of DX10).

Another goal is to keep gameplay consistent for competitive players. That's the main reason you have limited graphical options, and the main reason SC1 stayed at 640x480 all these years. Sure, more than likely someone will find a way to get the game to run at a higher resolution and/or across triple monitors, but Blizzard has not and will not support that, and may actually play cat and mouse to block the functionality.

The universal goal is to make money. Blizzard spent a lot, and although expectations are that they will earn it back, it's not guaranteed. Alienating a significant portion of their user base in order to write a game engine that supports AA is therefore not a good choice, and spending exponentially more time and money to write a second engine isn't economically feasible right now. Perhaps it will be in the future, after their initial investment pays off and funding for future projects exceeds the baseline.


Overall, it seems the AA feature is limited by the engine itself, though driver settings can override it to provide some measure of functionality. Blizzard has no compelling reason to write a new engine to support it, but plenty of reasons not to, or to set it aside for a future date. It's not as if you flip a switch and get AA; it's not some separate piece of functionality in the video card that simply sits idle when disabled. If you enable AA, you usually want to make sure not only that it doesn't make everything too blurry, but also that it won't break the game, and, when it's a driver-level workaround, that it won't break any other game either.
 

extra

Golden Member
Dec 18, 1999
Screw you, Blizz, for not including AA... inexcusable, imho. Don't give me any "this game has been in the works for too long, blah blah" crap. Blizzard could afford to make AA work. Any newly released game that doesn't support AA is inexcusable, imho. (Though I doubt this thread would have existed if ATI's driver forced AA to work but Nvidia's didn't, because we seem to have a lot of people who love to diss ATI's drivers.)
 

OCGuy

Lifer
Jul 12, 2000
Who cares?

AMD has been out pushing and delivering awesome DX11 GPUs for 6 months before its competition, and we are supposed to knock them over Blizzard not implementing AA in a DX9 game?

Huh? What does that have to do with this thread again?
 

Modelworks

Lifer
Feb 22, 2007
Blizzard chose to use DX9 with deferred lighting; they are to blame, not ATI or Nvidia. The best you can hope for is either a blur filter or per-pixel filtering on the lighting stage. Neither will be the AA you are used to from traditional AA setups.

You can force AA on, but that still will not make it render the way it is supposed to. Deferred lighting and AA are like oil and water.
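
To see the oil-and-water problem in miniature, here's a toy Python/NumPy sketch. The single directional light and the two-sample "G-buffer" of surface normals are invented for the example; a real renderer would do the per-sample version in a shader, which is what the DX10 features quoted earlier enable.

    import numpy as np

    LIGHT_DIR = np.array([0.0, 0.0, 1.0])  # made-up directional light

    def lambert(normal):
        # Plain Lambertian shading of one surface normal.
        return max(float(np.dot(normal, LIGHT_DIR)), 0.0)

    def resolve_then_light(samples):
        # The order hardware MSAA forces on a deferred renderer:
        # average the G-buffer samples first, then light once.
        # The averaged normal describes a surface that doesn't exist.
        avg = samples.mean(axis=0)
        avg = avg / np.linalg.norm(avg)
        return lambert(avg)

    def light_then_resolve(samples):
        # The correct order: light every sample, then average the
        # resulting colors.
        return sum(lambert(n) for n in samples) / len(samples)

    # An edge pixel half-covered by two different surfaces:
    pixel = np.array([[0.0, 0.0, 1.0],   # facing the light
                      [1.0, 0.0, 0.0]])  # facing sideways, unlit

    print(resolve_then_light(pixel))   # ~0.71: lights a fictitious surface
    print(light_then_resolve(pixel))   # 0.50: the correctly blended edge

The gap between those two numbers is the error you get on every edge pixel if you let the hardware resolve run first, which is why forced MSAA doesn't render "the way it is supposed to" here.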
 

akugami

Diamond Member
Feb 14, 2005
If the developer doesn't bother with in-game AA settings, is it really ATI's fault?

In a way the fault is with ATI, in that they should have been on the ball and gotten an AA profile into their drivers to force AA in Starcraft 2. Barring that, they need to enable the easy creation of user-created profiles. This is only one of the biggest games of the past few years, and a game that people have been clamoring for for the last decade.

And FU, Blizzard, for not including a feature that should be standard by now. Unfortunately for gamers, this missing feature isn't big enough to keep Starcraft 2 from selling millions of copies.

I must be the only person on the planet that thinks AA makes the image look fuzzy and out of focus. The fact that he needed to zoom in like 300% for me to even see the effect confuses me even more.

Anyway on topic: Nvidia is just overriding the settings for the game. So this is a 'missing' feature from Blizzard.

Also am I missing something? My 5770 has: "override application settings" and "application profiles" in the control panel. One of the options is override AA.

One of the reasons I think this isn't as big of a deal as some make it out to be is that Starcraft is by nature a very fast-paced game. If you've got time to notice very, very tiny jaggies that require zooming way in to see, then you're playing it wrong.

One of the reasons for AA was always that older computer systems were unable to run games at a decent pace at sufficiently high resolutions, so AA was a stopgap to smooth out graphics. With many gamers moving to resolutions like 19x12 (1920x1200), the need for AA has lessened by a large amount.

Bottom line is that if someone wants to play SC2 with AA, NV cards are the only solution for now. I will see what happens tomorrow (official release), but this game isn't a 'just another console port' type of thing. Blizzard is one of the few PC-only developers left (correction: PC & Mac). If AMD wants to duck away from a game of this magnitude, it might as well duck away from the PC GPU market.

And as I said in the other thread, there are RadeonPro and ATI Tray Tools to override the AA settings and enable AA in Starcraft 2 if you really feel you can't live without it.
 

zerocool84

Lifer
Nov 11, 2004
They made SC2 to cater to the lowest common denominator so they can make the most money. It sucks but that's the way of the game.
 

Cerb

Elite Member
Aug 26, 2000
I must be the only person on the planet that thinks AA makes the image look fuzzy and out of focus. The fact that he needed to zoom in like 300% for me to even see the effect confuses me even more.
No, they zoom in to make it insanely obvious to those who don't see the originals as bad. You already noted, for instance, that you couldn't. Even without my glasses, in the pcper link, the non-zoomed-in Zerg shot looks to me like it was made from construction paper cutouts.

Making edges look blurry has always seemed like a poor substitute for higher resolutions.
A Prius is a poor substitute for a fuel-cell car, too; where can you buy one of those? 96 DPI viewed from around 3-5 ft is still considered fairly dense, and affordable monitors only get to around 24", so the need for AA is still here. Somewhere around 400 DPI from 3-5 ft, it will be obsolete. Alternatively, we could get hardware flexible enough to let game developers do near-perfect edge rendering in their games, rather than relying on many point samples blended together (if you've never noticed, that is a huge chip on Tim Sweeney's shoulder).
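
For the curious, the back-of-the-envelope arithmetic behind those figures, assuming the textbook one-arcminute resolving limit of 20/20 vision (a simplification; the real cutoff varies per person):

    import math

    def dpi_limit(viewing_distance_inches, arcmin=1.0):
        # Pixel density at which a single pixel subtends `arcmin`
        # minutes of arc, i.e. where 20/20 vision stops resolving
        # individual pixels.
        pixel_size = viewing_distance_inches * math.tan(math.radians(arcmin / 60.0))
        return 1.0 / pixel_size

    for feet in (3, 5):
        print(f"{feet} ft: ~{dpi_limit(feet * 12):.0f} DPI")
    # prints: 3 ft: ~95 DPI, 5 ft: ~57 DPI

So 96 DPI at 3 ft sits right at the per-pixel acuity limit, yet jaggies stay visible because we detect edge misalignment (vernier acuity) at a small fraction of an arcminute; that gap is roughly where a figure like 400 DPI comes from.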

You can force AA on, but that still will not make it render the way it is supposed to. Deferred lighting and AA are like oil and water.
The nV AA screenshots look perfectly good to me. Given that not all the edges look bad w/o AA, a workaround like nV's, with a very low AA setting (do they even have a 2x option, anymore?), might do the job. As long as there aren't oddly-high-contrast staircases lining shape edges, I'll be fine.

akugami said:
And as I said in the other thread, there are RadeonPro and ATI Tray Tools to override the AA settings and enable AA in Starcraft 2 if you really feel you can't live without it.
...and the 'trick' used by those who successfully applied AA on newer Radeons has been revealed for all! :p

As someone who has always had AA forced on, and has had it that way since a GF2 GTS, even I think folks are getting a little too riled up about it. Microsoft and Blizzard are the ones to blame, but even then, the situation doesn't look that bad. ATi will have a proper hack out, I'm sure, and everyone will be happy (well, with AA support, anyway).
 

apathy_next2

Member
Jun 15, 2010
Blizzard and ATI apologists can say all they like. AA may be fuzzy or blurry for some of you, it may even make your tummies hurt, but I like the fact that Nvidia spent its resources and got it done.
I hope ATI gets it done soon as well.
But kudos to Nvidia for going above and beyond for its customers here.
 