Can I use 8xMSAA on Battlefield 3?

Frost_Bites

Junior Member
Sep 24, 2012
Hello everyone,

I want to ask a question: can I use 8xMSAA in Battlefield 3? I tried enabling it in the NVIDIA Control Panel by setting the AA mode to 8x (in Program Settings, not Global Settings) and disabling 4xMSAA and AA Post in BF3's video settings, but nothing worked.

If it is possible, can you tell me how to do it?

Also, what is the difference between AA Mode and AA Transparency in the NVIDIA Control Panel (NCP)?

Thanks,

Regards,

Frost_Bites
 

aaksheytalwar

Diamond Member
Feb 17, 2012
With 8x MSAA, you will need at least a 670 or 7950 to maintain good fps in single player at 1080p. You might even need a 680 or 7970.
 

Durvelle27

Diamond Member
Jun 3, 2012
Why would you want to use that? Your FPS is going to tank; even with 4xMSAA, the GTX 670 only gets a little over 60 FPS at 1920x1080. You would need a dual-GPU setup to make that playable, and I don't think BF3 even supports 8xMSAA, only 4xMSAA.
 

Frost_Bites

Junior Member
Sep 24, 2012
Durvelle27 said:
Why would you want to use that? Your FPS is going to tank; even with 4xMSAA, the GTX 670 only gets a little over 60 FPS at 1920x1080. You would need a dual-GPU setup to make that playable, and I don't think BF3 even supports 8xMSAA, only 4xMSAA.

Based on my setup, I mostly get 70-90 FPS with all max settings @1080p, with dips to 45-60 FPS. I don't know about other people's setups, sorry.

That's why I'm asking; I would never have asked this question if I knew my GPU couldn't handle it. Furthermore, I won't use these settings all the time, I just want to know about it.

I'm also asking because I saw in a Tom's Hardware review that they enabled 8xMSAA in Battlefield 3.

See this if you don't believe me:

http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-7.html

Look at the BF3 benchmarks; they use 8xMSAA.
 

Durvelle27

Diamond Member
Jun 3, 2012
Frost_Bites said:
Based on my setup, I mostly get 70-90 FPS with all max settings @1080p, with dips to 45-60 FPS. I don't know about other people's setups, sorry.

That's why I'm asking; I would never have asked this question if I knew my GPU couldn't handle it. Furthermore, I won't use these settings all the time, I just want to know about it.

I'm also asking because I saw in a Tom's Hardware review that they enabled 8xMSAA in Battlefield 3.

See this if you don't believe me:

http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-7.html

Look at the BF3 benchmarks; they use 8xMSAA.

It's very hard to believe you're getting over 70 FPS with a single GTX 670, unless it's in single player.

http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/5.html
 

aaksheytalwar

Diamond Member
Feb 17, 2012
I get 80-100+ fps average at 1080p with 4x MSAA in BF3 single player, on a single 7970 at 1125/1575 with very old drivers. Single player only, though. The minimum used to be 60-75 or so, and almost never below 65-70.

But I had a 2600K, and this is one game where a stock CPU or a 2500K would be slower by 10-20 fps, especially in minimum fps.
 

Frost_Bites

Junior Member
Sep 24, 2012
aaksheytalwar said:
I get 80-100+ fps average at 1080p with 4x MSAA in BF3 single player, on a single 7970 at 1125/1575 with very old drivers. Single player only, though. The minimum used to be 60-75 or so, and almost never below 65-70.

But I had a 2600K, and this is one game where a stock CPU or a 2500K would be slower by 10-20 fps, especially in minimum fps.

Yep, you're right. I only have an i5 3570K, not an i7 3770K, and I'm running my GPU at stock clock (1150 MHz).
 

SomeoneSimple

Member
Aug 15, 2012
Instead of arguing, here are some answers:

Frost_Bites said:
I tried enabling it in the NVIDIA Control Panel by setting the AA mode to 8x (in Program Settings, not Global Settings) and disabling 4xMSAA and AA Post in BF3's video settings, but nothing worked.

You can't force MSAA in DX10/11 games via drivers.

Frost_Bites said:
What is the difference between AA Mode and AA Transparency in NCP?

'AA mode' in NCP is the method of applying AA. If a game supports MSAA itself, you should choose 'Enhance' instead of 'Override'. It will then enhance the sampling pattern of the in-game AA with additional colour/coverage samples if you choose a higher AA setting in the driver, and it adds support for transparency AA. 'Enhance' generally has better compatibility and performance than plain overriding.

Transparency AA is a driver implementation of alpha-to-coverage; you can enable it to use multi- or supersampling on 2D textures to reduce aliasing. Plain MSAA only works on geometry, so enabling Tr-AA will anti-alias transparent, flat objects (like leaves on a tree) as well. This isn't forceable in DX10/11 either.
In BF3's case, Tr-AA doesn't really matter, since BF3 has implemented alpha-to-coverage in its engine.
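
(For reference, alpha-to-coverage is something the engine itself switches on at the API level, which is also why the driver can't force it in DX10/11. Here's a minimal D3D11 sketch of what an engine does, not BF3's actual code, and assuming a device, context, and an MSAA render target already exist:)

Code:
#include <d3d11.h>

// Enable alpha-to-coverage for subsequent draw calls (e.g. foliage).
// The pixel shader's output alpha is converted into a per-sample
// coverage mask, so MSAA anti-aliases the cutout edges.
void EnableAlphaToCoverage(ID3D11Device* device, ID3D11DeviceContext* context)
{
    D3D11_BLEND_DESC desc = {};
    desc.AlphaToCoverageEnable = TRUE; // the key switch
    desc.RenderTarget[0].BlendEnable = FALSE;
    desc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

    ID3D11BlendState* blendState = nullptr;
    if (SUCCEEDED(device->CreateBlendState(&desc, &blendState)))
    {
        const FLOAT blendFactor[4] = { 0.f, 0.f, 0.f, 0.f };
        context->OMSetBlendState(blendState, blendFactor, 0xFFFFFFFF);
        blendState->Release();
    }
}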
 

Barfo

Lifer
Jan 4, 2005
Your name is Frost Bites, you should know better than any of us :awe:
 

Frost_Bites

Junior Member
Sep 24, 2012
SomeoneSimple said:
Instead of arguing, here are some answers:

You can't force MSAA in DX10/11 games via drivers.

If I can't, then how about this:

[screenshot from the Tom's Hardware review showing BF3 being benchmarked at 8x MSAA]

http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-7.html

How can I enable 8xMSAA in BF3? That's the main question I've been asking all of you so far, but I think no one has answered it.

SomeoneSimple said:
'AA mode' in NCP is the method of applying AA. If a game supports MSAA itself, you should choose 'Enhance' instead of 'Override'. It will then enhance the sampling pattern of the in-game AA with additional colour/coverage samples if you choose a higher AA setting in the driver, and it adds support for transparency AA. 'Enhance' generally has better compatibility and performance than plain overriding.

Transparency AA is a driver implementation of alpha-to-coverage; you can enable it to use multi- or supersampling on 2D textures to reduce aliasing. Plain MSAA only works on geometry, so enabling Tr-AA will anti-alias transparent, flat objects (like leaves on a tree) as well. This isn't forceable in DX10/11 either.
In BF3's case, Tr-AA doesn't really matter, since BF3 has implemented alpha-to-coverage in its engine.

Thanks for your explanation.
 

Durvelle27

Diamond Member
Jun 3, 2012
aaksheytalwar said:
I get 80-100+ fps average at 1080p with 4x MSAA in BF3 single player, on a single 7970 at 1125/1575 with very old drivers. Single player only, though. The minimum used to be 60-75 or so, and almost never below 65-70.

But I had a 2600K, and this is one game where a stock CPU or a 2500K would be slower by 10-20 fps, especially in minimum fps.

That's about right.


[BF3 benchmark chart]
 

VulgarDisplay

Diamond Member
Apr 3, 2009
Frost_Bites said:
If I can't, then how about this:

[screenshot from the Tom's Hardware review showing BF3 being benchmarked at 8x MSAA]

http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-7.html

How can I enable 8xMSAA in BF3? That's the main question I've been asking all of you so far, but I think no one has answered it.

Thanks for your explanation.

It's probably working, but since MSAA in a deferred rendering engine only applies AA to parts of the scene, you probably don't notice it. MSAA in any deferred rendering engine is a waste of performance if you ask me, especially since FXAA and MLAA are far better than they used to be. Not nearly as blurry as they were.
 

SomeoneSimple

Member
Aug 15, 2012
VulgarDisplay said:
It's probably working, but since MSAA in a deferred rendering engine only applies AA to parts of the scene, you probably don't notice it.

That has nothing to do with what Frost_Bites is asking.

Not sure what you're getting at either; deferred rendering doesn't change the quality of MSAA, as long as the sampling is done correctly (which is more complex than in a forward renderer, and primarily costs more memory bandwidth). That said, Frostbite 2's MSAA implementation is practically a showcase of how MSAA in a deferred renderer is properly done.

And no, post-process AA hasn't changed much at all. Nvidia stopped active development on FXAA some time ago, and SMAA 1x is still the latest extension of MLAA, and that's more than a year old.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
SomeoneSimple said:
That has nothing to do with what Frost_Bites is asking.

Not sure what you're getting at either; deferred rendering doesn't change the quality of MSAA, as long as the sampling is done correctly (which is more complex than in a forward renderer, and primarily costs more memory bandwidth). That said, Frostbite 2's MSAA implementation is practically a showcase of how MSAA in a deferred renderer is properly done.

And no, post-process AA hasn't changed much at all. Nvidia stopped active development on FXAA some time ago, and SMAA 1x is still the latest extension of MLAA, and that's more than a year old.

I'm referring to the fact that MSAA in a deferred engine does not anti-alias the entire scene. There will still be jaggies all over the place. The only way to anti-alias the entire scene in a deferred game is with post AA. So why take the huge performance hit of MSAA when you still have to use FXAA or MLAA?

It's a waste.
 

SomeoneSimple

Member
Aug 15, 2012
VulgarDisplay said:
I'm referring to the fact that MSAA in a deferred engine does not anti-alias the entire scene. There will still be jaggies all over the place. The only way to anti-alias the entire scene in a deferred game is with post AA.

This is very much incorrect.

If MSAA is properly implemented in a deferred engine, it is equal to a proper MSAA implementation in a forward renderer.

If a scene isn't fully anti-aliased, then either the developers were incredibly sloppy and don't really care about accurate MSAA, or it was done deliberately to increase performance.

Frostbite 2 uses compute shaders to calculate the lighting for each pixel: if the contrast between a pixel's samples is low, it shades a single sample per pixel; if the contrast is high, it shades each of that pixel's samples. This gets merged with the fully multisampled geometry in a final pass, which significantly lowers the memory footprint and thus costs less bandwidth. The anti-aliasing might theoretically be less accurate compared to 'full scene' AA, but this should not be noticeable at all. Roughly, the idea looks like the sketch below.
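
(A minimal CPU-side sketch of that adaptive idea; the struct layout, thresholds, and the ShadeSample stand-in are all hypothetical, and the real engine does this in an HLSL compute shader:)

Code:
#include <cmath>

// Hypothetical per-sample G-buffer entry; the real layout differs.
struct GSample { float depth; float nx, ny, nz; };

// Stand-in for the (expensive) per-sample lighting evaluation.
static float ShadeSample(const GSample& s) { return 0.5f + 0.5f * s.nz; }

// Adaptive resolve for one pixel with N MSAA samples: shade once when the
// samples agree (pixel interior), shade every sample and average otherwise.
template <int N>
float ShadePixel(const GSample (&s)[N])
{
    bool edge = false;
    for (int i = 1; i < N && !edge; ++i)
    {
        float depthDelta = std::fabs(s[i].depth - s[0].depth);
        float normalDot  = s[i].nx * s[0].nx + s[i].ny * s[0].ny + s[i].nz * s[0].nz;
        edge = (depthDelta > 0.001f) || (normalDot < 0.99f); // arbitrary thresholds
    }
    if (!edge)
        return ShadeSample(s[0]); // low contrast: one shading op per pixel

    float sum = 0.0f;             // high contrast: shade all N samples
    for (int i = 0; i < N; ++i)
        sum += ShadeSample(s[i]);
    return sum / N;
}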

Sadly, however, there are a ton of examples of awful MSAA implementations, in forward engines just as much as in deferred engines. Well-known forward-rendering engines with awful MSAA implementations are Unreal Engine 3 and the EGO engine.

And in case you're not aware, MSAA only affects geometry. Games with tons of sharp specular maps and shader effects (EVE Online comes to mind) will still have a lot of aliasing, even with 8x MSAA. There are techniques to significantly reduce such aliasing without reaching for blurring post-process AA, but nobody has cared enough to implement them outside of tech demos.
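
(One well-known example of such a technique, if you're curious, is Toksvig's normal-map filtering, which widens the specular highlight wherever the mip-mapped normals disagree instead of letting it shimmer. A rough sketch, not from any shipping engine:)

Code:
// Toksvig's trick (sketch): a mip-mapped normal gets shorter where the
// averaged normals disagree, so use that shortening to reduce the specular
// exponent and widen the highlight. 'lenNa' is the length of the averaged
// (unnormalized) normal in (0,1]; 'specPower' is the gloss exponent.
float ToksvigSpecularPower(float lenNa, float specPower)
{
    float ft = lenNa / (lenNa + specPower * (1.0f - lenNa));
    return ft * specPower; // ft is 1 on flat areas, < 1 on noisy ones
}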
 

VulgarDisplay

Diamond Member
Apr 3, 2009
SomeoneSimple said:
This is very much incorrect.

If MSAA is properly implemented in a deferred engine, it is equal to a proper MSAA implementation in a forward renderer.

If a scene isn't fully anti-aliased, then either the developers were incredibly sloppy and don't really care about accurate MSAA, or it was done deliberately to increase performance.

Frostbite 2 uses compute shaders to calculate the lighting for each pixel: if the contrast between a pixel's samples is low, it shades a single sample per pixel; if the contrast is high, it shades each of that pixel's samples. This gets merged with the fully multisampled geometry in a final pass, which significantly lowers the memory footprint and thus costs less bandwidth. The anti-aliasing might theoretically be less accurate compared to 'full scene' AA, but this should not be noticeable at all.

Sadly, however, there are a ton of examples of awful MSAA implementations, in forward engines just as much as in deferred engines. Well-known forward-rendering engines with awful MSAA implementations are Unreal Engine 3 and the EGO engine.

And in case you're not aware, MSAA only affects geometry. Games with tons of sharp specular maps and shader effects (EVE Online comes to mind) will still have a lot of aliasing, even with 8x MSAA. There are techniques to significantly reduce such aliasing without reaching for blurring post-process AA, but nobody has cared enough to implement them outside of tech demos.

You do realize that BF3's MSAA doesn't affect the whole scene, right?