No AA in StarCraft II with ATI Cards

Status
Not open for further replies.

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
So you think that Nvidia/AMD should be the ones adding AA to game titles?

Personally, I'd like IHVs to be proactive here when developers aren't, and to try to improve the gaming experience beyond what the developers may have intended. This is why developer relations are so important: so gamers can enjoy the hardware and features the IHVs offer in gaming titles.

I don't expect idealism; it is a pipe dream.

I also expect titles not to include AA at times, and when they don't, I'd like the IHVs to try for their customer base. Of course this isn't ideal, but it's a way to improve the gaming experience for their customers.

One of my fondest memories on ATI hardware was the day I could force HDR+AA in Oblivion. It transformed that title because of the way it handled HDR. Sure, it would have been nice to have in-game AA, but I was grateful that ATI tried to bring more to their customer base.

There are always different mind-sets:

IHV mind-set

Developer mind-set

Gamer mind-set

My bias is the gamer mind-set: having the choice to improve immersion rather than being dictated to by what the developer intended. What happens if there is application filtering and the quality is suspect? Having the ability to force filtering or add super-sampling helps improve immersion over what the developer intended.

What happens if there is no application AA? That's it? End of story?

If developers have all the tools and yet don't use them, why should the gamer be locked in and that's it? Allow the gamer some control to improve immersion in their titles. Sure, it is more work for the IHVs, but don't ya think gamers are worth it?
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Should they? Probably not, but when given the opportunity to pick up the slack from the devs, it is a nice value-add by the vendor. In this case Nvidia added value to their product by having this ready by launch day. AMD, imo, should have also had a similar fix just to save face. But the higher-ups at AMD again dropped the ball and let themselves be smeared across the tech boards over this. The management at AMD infuriates me. For being such a second-rate player, they sure don't help themselves by not being more proactive than the big players in their markets (Intel and Nvidia). And this has been AMD's motto since I started following them in 1997.

So you agree that AMD/Nvidia shouldn't be the ones adding AA to games and then go on a rant because AMD didn't have it released already.

Way to contradict yourself.

Was AMD "proactive" enough for you when they released their 5800 series six months ahead of Nvidia's Fermi? Or when they were assisting developers with DX11 content while Fermi was nothing more than a hacked-off PCB and wood screws?
 
Last edited:

golem

Senior member
Oct 6, 2000
838
3
76
So you agree that AMD/Nvidia shouldn't be the ones adding AA to games and then go on a rant because AMD didn't have it released already.

Way to contradict yourself.

Was AMD "proactive" enough for you when they released their 5800 series six months ahead of Nvidia's Fermi? Or when they were assisting developers with DX11 content while Fermi was nothing more than a hacked-off PCB and wood screws?

No, the developer should be the one to add AA to games. That would be the best option. But if the developer doesn't, it would be nice if AMD/Nvidia stepped in and provided a workaround so AA works.

Well, wasn't Nvidia pretty much raked over the coals for being late, and ATi generally applauded for having a DX11 card out?
 

golem

Senior member
Oct 6, 2000
838
3
76
Dirt 2, BC2, Stalker, BattleForge. That's all I can think of off the top of my head.

BC2 and BattleForge both run faster in DX11 than in DX10, so I'm inclined to believe AMD will get Blizzard to add AA that doesn't kill performance, just like they did with BC2. I'm also sure that nV cards will not be locked out either.

Oh, and they did the same thing with Stalker: CS; it wasn't going to have in-game AA either.

Nobody blamed nV there. Hypocrites

EDIT: I wasn't calling you a hypocrite, golem.

Thanks Skurge, I didn't know about BC2 and BattleForge.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
You'd better have some good horsepower from Nvidia to enable AA in StarCraft 2. It will crush a GTX 460, imo; on my rig there is a noticeable performance impact with it.

It also is a flawed implementation, as there are graphical errors in the game when AA is turned on from the control panel.

I'm interested to see a comparison of the performance impact of ATI's AA in the game vs Nvidia's.
 

MJinZ

Diamond Member
Nov 4, 2009
8,192
0
0
You'd better have some good horsepower from Nvidia to enable AA in StarCraft 2. It will crush a GTX 460, imo; on my rig there is a noticeable performance impact with it.

It also is a flawed implementation, as there are graphical errors in the game when AA is turned on from the control panel.

I'm interested to see a comparison of the performance impact of ATI's AA in the game vs Nvidia's.

Yes, my GTX 480 can do it; my Mobility 5870 (5770) would probably choke, die, and stab itself if it even attempted 1x AA.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
So you agree that AMD/Nvidia shouldn't be the ones adding AA to games and then go on a rant because AMD didn't have it released already.

Way to contradict yourself.

Was AMD "proactive" enough for you when they released their 5800 series six months ahead of Nvidia's Fermi? Or when they were assisting developers with DX11 content while Fermi was nothing more than a hacked-off PCB and wood screws?

I didn't contradict myself. In an ideal world the devs would put this in the game. When they don't, the hardware vendor picking up the slack is great. I am ranting against AMD because they again dropped the ball. If Nvidia can write this into their drivers, so can AMD. Why wasn't this tested and done before launch? AMD's excuse is beyond pathetic as well. AA will drop performance? And let me guess, water is wet?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
AMD's excuse is beyond pathetic as well. AA will drop performance? And let me guess, water is wet?

This makes no sense to me. If that's the case then how come their hotfix is almost done? Either they did not want to add it because of the performance hit or they used this lame excuse to cover the lack of a fix at launch.

Either way it makes them look very bad.

My bet is that the hotfix is a ways off. Otherwise they would not have made such a lame statement.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I didn't contradict myself. In an ideal world the devs would put this in the game. When they don't, the hardware vendor picking up the slack is great. I am ranting against AMD because they again dropped the ball. If Nvidia can write this into their drivers, so can AMD. Why wasn't this tested and done before launch? AMD's excuse is beyond pathetic as well. AA will drop performance? And let me guess, water is wet?

You must be livid with Blizzard if you are this annoyed at a company that isn't responsible for the code and features that go in a game. Yet I'm not seeing any posts from you complaining about Blizzard.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
You must be livid with Blizzard if you are this annoyed at a company that isn't responsible for the code and features that go in a game. Yet I'm not seeing any posts from you complaining about Blizzard.

It's not Blizzard's fault. It's a flaw in DirectX 9 that requires the GPU vendor to create a work around. This has been explained in this thread already.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
It's a flaw in DirectX 9 that requires the GPU vendor to create a work around.

It's not a flaw in DirectX 9. The hardware to perform AA this way simply didn't exist yet in the days of the DX9 standard.
The AA workaround only works on DX10/11 hardware, not on actual DX9 hardware.
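The hardware issue Scali describes can be illustrated with a toy computation. With a deferred renderer, lighting is non-linear in the G-buffer contents, so averaging (resolving) the MSAA sub-samples before lighting gives a different answer than lighting each sub-sample and then averaging. Here is a minimal numeric sketch in Python, not real renderer code; the lighting function and sample values are invented purely for illustration:

```python
# Toy illustration: why MSAA + deferred shading needs per-sample
# G-buffer access rather than a resolve before the lighting pass.

def light(normal_z):
    """A deliberately non-linear 'lighting' function (specular-like)."""
    return max(normal_z, 0.0) ** 8

# An edge pixel covered by two triangles with very different normals:
samples = [1.0, 0.0]  # per-sub-sample normal.z values in the G-buffer

# Wrong: resolve (average) the G-buffer first, then light once.
resolved_then_lit = light(sum(samples) / len(samples))

# Right: light every sub-sample, then average the shaded colours.
lit_then_resolved = sum(light(s) for s in samples) / len(samples)

print(resolved_then_lit, lit_then_resolved)  # → 0.00390625 0.5
```

Because the two results differ, a correct workaround has to fetch the individual G-buffer sub-samples during the lighting pass, which is exactly the per-sample access that DX10.1-class hardware exposes and DX9-era hardware and API did not.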
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
It's not Blizzard's fault. It's a flaw in DirectX 9 that requires the GPU vendor to create a work around. This has been explained in this thread already.

I disagree. It sounds to me like they chose an outdated method to create a game for a 2010 launch. I mean, wouldn't it be Ford's fault if they introduced a car today that didn't have airbags and their excuse was that they started designing the car years ago? Just about every other game has this option; it's pretty much a universal standard at this point.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
I disagree. It sounds to me like they chose an outdated method to create a game for a 2010 launch. I mean, wouldn't it be Ford's fault if they introduced a car today that didn't have airbags and their excuse was that they started designing the car years ago? Just about every other game has this option; it's pretty much a universal standard at this point.

You are comparing AA to a safety feature like airbags? :eek: OK, let's use your analogy. Since no companies were making airbags when they designed their cars, their cars don't have airbags. GM, however, decided to produce their own airbags for their customers. Ford is AMD, GM is NVIDIA.

The bottom line is that the DirectX 9 method that Blizzard used does not support AA. NVIDIA provided a fix for its customers, AMD did not.

These are the facts.
 

Stoneburner

Diamond Member
May 29, 2003
3,491
0
76
I agree with insomniator. I love mah graphical appltitudes when it comes to like a shooty game. But SC2 is about pew pew pew not Alcoholics Anonymous.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
AFAIK, the DX9 API is to blame. When deferred rendering is used with anti-aliasing, it either shows incorrect results or doesn't work at all. They're using standard DX9 code, so a workaround like a driver hack is needed to bypass that limitation. And it's an API limitation, not a hardware limitation, because DX10 can do deferred rendering with anti-aliasing and there's hardware that works with it.

I think the best middle ground would be for the GPU vendors to add control-panel AA that works with both vendors' hardware, not favoring one vendor, since both have important market share.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
AFAIK, the DX9 API is to blame. When deferred rendering is used with anti-aliasing, it either shows incorrect results or doesn't work at all. They're using standard DX9 code, so a workaround like a driver hack is needed to bypass that limitation. And it's an API limitation, not a hardware limitation, because DX10 can do deferred rendering with anti-aliasing and there's hardware that works with it.

The DX9 API is not to blame.
The DX9 API and DX9 hardware go hand in hand.
It WAS a hardware limitation when we had DX9 hardware.
Only DX10.1 hardware (and nVidia's DX10 hardware) and above can do it, which is why it is part of the DX10.1/11 API.
You can't add to an API what doesn't exist in hardware, so you can't fault the API for not supporting it. DX9 is just terribly outdated; there have been two major and one minor D3D revisions since.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
It seems there is some confusion about AA and DX9. I don't know what is so complicated: DX9 + deferred shading = no AA; the API doesn't support it. It is as simple as that. How can this be anyone's fault except Microsoft's?

I don't know what techniques Nvidia used, but AMD could have done what they did in BC2, which resulted in working AA.

In theory, the only solution is to do it at the driver level. This can only be done through direct communication between developers and hardware vendors. The actual control can be in-game or in the driver control panel. Note that DX9, unlike DX10 and above, has bit flags all over the place which may or may not work, let alone do the same thing. Since the game itself is based on DX9, it is possible that the game somehow bypasses the OS and talks to the driver directly, and that made AA possible, or maybe it is just forced SSAA no matter what options you choose.
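The forced-SSAA fallback mentioned above is conceptually just rendering at a higher resolution and box-filtering down, which sidesteps the deferred-shading conflict entirely, at a large performance cost. Here is a hypothetical sketch in Python, using a plain 2D list as a stand-in for a framebuffer:

```python
def downsample_2x(img):
    """Box-filter a 2H x 2W image down to H x W (an ordered-grid 4x SSAA resolve)."""
    h, w = len(img) // 2, len(img[0]) // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Average the 2x2 block of high-resolution samples per output pixel.
            out[y][x] = (img[2*y][2*x] + img[2*y][2*x + 1] +
                         img[2*y + 1][2*x] + img[2*y + 1][2*x + 1]) / 4.0
    return out

# A hard edge rendered at 2x resolution (white geometry over black background)...
hi_res = [[1.0, 1.0, 1.0, 0.0],
          [1.0, 1.0, 1.0, 0.0]]

# ...resolves to a softened edge at the target resolution:
print(downsample_2x(hi_res))  # → [[1.0, 0.5]]
```

Because lighting has already happened at the high resolution, this works with any renderer, deferred or not, which is why it is a plausible brute-force option for drivers; the cost is shading four times the pixels.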

If you think AA is standard nowadays, you don't get this standard with ATI cards. You can buy two 5970s, run at 230 FPS, yet still have no AA. It isn't a matter of whose fault it is; it is simply a fact.

Now the next question is how Blizzard will enable AA in-game. Going to DX10/11 appears to be an option, but unlike other games, SC2 has its own engine that people can make games with, so it may not be ideal, even if it is feasible. We know that Batman: AA is written on DX9, and its AA code, which was developed by Nvidia, actually works for ATI too. This means it is possible to enable AA without being bound to one vendor. Blizzard may be able to buy that piece of code from Nvidia, assuming they are willing to sell it, or create one themselves. The question is: will it, as Rocksteady said, require a complete rewrite?

Again, the problem isn't as simple as people say. ATI engineers may not be able to solve it without Blizzard's cooperation. And since the game was just released yesterday, there may be lots of issues far more important than AA on ATI cards, like balance issues (lol) and crashes.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
How can this be anyone's fault except Microsoft?

It's not Microsoft's fault.
It's nobody's fault. DX9 is just old, and at the time, this technology was not feasible yet. If anything, nVidia/ATi are to blame for not adding hardware support at the time of DX9, but it was not realistic to expect it at the time.

The fault here is that Blizzard STILL chooses DX9 as their API, while they go for a deferred renderer. This means AA is not an option, and apparently people expect AA these days (and so does Microsoft... since DX10.1, the hardware has to support 4xAA as a minimum).
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
It's not Microsoft's fault.
It's nobody's fault. DX9 is just old, and at the time, this technology was not feasible yet. If anything, nVidia/ATi are to blame for not adding hardware support at the time of DX9, but it was not realistic to expect it at the time.

The fault here is that Blizzard STILL chooses DX9 as their API, while they go for a deferred renderer. This means AA is not an option, and apparently people expect AA these days (and so does Microsoft... since DX10.1, the hardware has to support 4xAA as a minimum).

This is kind of my point. There are a few here who appear pretty pissed at AMD for not having some kind of patch, but where is the blame for Blizzard using an old API for a huge AAA release in mid-2010?
 