[guru3d] Total War: Warhammer DX12 benched


ShintaiDK

Lifer
Apr 22, 2012
The fog is missing in FXAA:

That's nothing to do with FXAA as such.

[Image: Singularity.jpg]


The conclusion from the thread you took the screens from is that FXAA is a tad better than MLAA.
 

Jaydip

Diamond Member
Mar 29, 2010
My issue with MLAA is that it blurs text pretty badly. I remember using it in Dishonored and immediately reverting it (FXAA was not that great in that game either).
 

Headfoot

Diamond Member
Feb 28, 2008
MLAA and FXAA both suck pretty bad in most actual games I've played: blurred, smeared textures. In FO4 it was just awful.
 

Hitman928

Diamond Member
Apr 15, 2012
Both MLAA and FXAA tend to blur. Each does its own thing well; it kinda comes down to personal preference and how the developers have implemented them.
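For anyone wondering where the blur comes from, here's a deliberately simplified Python/NumPy sketch of a post-process AA pass in the FXAA/MLAA family (not either real algorithm, which add sub-pixel offset estimation and, for MLAA, edge-shape classification): flag high-contrast pixels from luma, then blend them with their neighbours. Text and crisp texture detail trip the same contrast test, which is exactly the smearing people complain about:

```python
# Toy post-process AA: luma edge detection + neighbour blending.
# Illustrative only -- a rough sketch of the FXAA/MLAA idea, not a
# shipping implementation.
import numpy as np

def postprocess_aa(rgb, contrast_threshold=0.1):
    """rgb: float array of shape (H, W, 3) with values in [0, 1]."""
    # Perceptual luma, the quantity FXAA-style filters use for edge detection.
    luma = rgb @ np.array([0.299, 0.587, 0.114])

    # Local contrast against the 4 axis neighbours.
    padded = np.pad(luma, 1, mode='edge')
    n = padded[:-2, 1:-1]
    s = padded[2:, 1:-1]
    w = padded[1:-1, :-2]
    e = padded[1:-1, 2:]
    local_max = np.maximum.reduce([luma, n, s, w, e])
    local_min = np.minimum.reduce([luma, n, s, w, e])
    edge = (local_max - local_min) > contrast_threshold

    # Blur only the pixels flagged as edges. High-contrast text and
    # texture detail get flagged too -- hence the smearing.
    pad_rgb = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode='edge')
    blurred = (pad_rgb[:-2, 1:-1] + pad_rgb[2:, 1:-1] +
               pad_rgb[1:-1, :-2] + pad_rgb[1:-1, 2:] + rgb) / 5.0
    return np.where(edge[..., None], blurred, rgb)
```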
 

Bacon1

Diamond Member
Feb 14, 2016
That's nothing to do with FXAA as such.

Then what caused it when that was the only option they changed? Posting another unrelated screenshot isn't proof.

Also where is the proof that it hurts Nvidia performance a lot? I showed that it has almost no perf hit.
 

ShintaiDK

Lifer
Apr 22, 2012
MLAA and FXAA both suck pretty bad in most actual games I've played: blurred, smeared textures. In FO4 it was just awful.

It is the cheap man's AA, that's for sure. Running 4xAA instead of MLAA in Warhammer is a big improvement.
 

Headfoot

Diamond Member
Feb 28, 2008
It is the cheap man's AA, that's for sure. Running 4xAA instead of MLAA in Warhammer is a big improvement.

Agreed. I usually either go with no AA at all and reap the higher FPS, or I use real MSAA. TXAA I've seen in both good and bad implementations; some look pretty good for not too much FPS cost.
 

Spjut

Senior member
Apr 9, 2011
Good increases in DX12, but it's bad if the game doesn't support DX12 feature level 11_0/11_1.
The same thing frustrated me back when the first DX11 games hit and many locked DX10 hardware to the DX9 path.
 

f2bnp

Member
May 25, 2015
SMAA for the win. I don't understand why developers don't provide it 90% of the time.
 

IllogicalGlory

Senior member
Mar 8, 2013
I tend to go SSAA whenever possible. Every other AA just sucks in comparison.
SSAA is super GPU intensive though and it's not really any different from DSR/VSR. If you're already playing at a high resolution, SMAA, FXAA and MLAA can be useful.
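To illustrate why SSAA and DSR/VSR land in essentially the same place, here's a toy Python/NumPy sketch: render at a higher resolution, then average each block of samples down to one output pixel (DSR uses a smarter Gaussian-ish filter, but the principle is identical; the "scene" here is just a made-up test pattern):

```python
# Toy supersampling: render big, box-filter down. This is the whole trick
# behind both SSAA and DSR/VSR.
import numpy as np

def render_scene(h, w):
    """Stand-in for the GPU: a hard-edged diagonal, worst case for aliasing."""
    y, x = np.mgrid[0:h, 0:w]
    return (x * 0.37 + y * 0.61 > w * 0.3).astype(float)

def ssaa(h, w, factor=2):
    # Render at factor^2 the pixel count, then average each
    # factor x factor block down to one output pixel.
    hi = render_scene(h * factor, w * factor)
    return hi.reshape(h, factor, w, factor).mean(axis=(1, 3))

aliased = render_scene(90, 160)      # native res: stair-stepped edge
smoothed = ssaa(90, 160, factor=2)   # 4x SSAA: edge pixels become gradients
print(np.unique(aliased).size, np.unique(smoothed).size)  # 2 levels vs several
```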
 

sxr7171

Diamond Member
Jun 21, 2002
SSAA is super GPU intensive though and it's not really any different from DSR/VSR. If you're already playing at a high resolution, SMAA, FXAA and MLAA can be useful.

Sometimes I need to add some post-processing AA to a game running at DSR settings to get rid of all the jaggies.
 

DiogoDX

Senior member
Oct 11, 2012
Didn't like DSR when I used it.

Now that I'm on native 4K, the best combination of quality and performance is SMAA (or another post-process AA) + 2xMSAA. Playing Hitman: Absolution and BF4 MP with custom medium-high settings and it looks great.

Since this game will also support MSAA, this combination will be possible.
 

faseman

Member
May 8, 2009
Lol my 970 is getting trounced. Might be one of the worst cards I've ever bought for longevity.
 

biostud

Lifer
Feb 27, 2003
Hopefully the benchmark will be included in the DX12 patch, and also work under DX11 if my 7990 is not supported for DX12.
 

Piroko

Senior member
Jan 10, 2013
The strange thing is, they are getting some wildly different results from wccftech...
http://wccftech.com/total-war-warhammer-benchmarks-unveiled-invest-powerful-cpu/

The "High" results are actually not bad for Nvidia, except for the GTX1080, for some reason the "Ultra" setting swifts things way too much in favor of AMD.
Hm? They only tested 1080p with high/ultra, and that's CPU-bottlenecked for the high-end cards even at ultra settings. The only trends I can reliably pull out of that are that the 380 pulls ahead another ~10% on the 960 and the 390 gains ~10% on the 980. Might be MLAA related, could have other reasons, hard to tell.
 

Shivansps

Diamond Member
Sep 11, 2013
Hm? They only tested 1080p with high/ultra, and that's CPU-bottlenecked for the high-end cards even at ultra settings. The only trends I can reliably pull out of that are that the 380 pulls ahead another ~10% on the 960 and the 390 gains ~10% on the 980. Might be MLAA related, could have other reasons, hard to tell.

Actually, the only thing they are missing is testing 4K/2K at High instead of Ultra.

It looks OK to me. At 1080p under DX12 (avg FPS / 97th percentile):

A 390:
76/61 - Ultra
94/73 - High
113/93 - Medium

A 970:
66/52 - Ultra
92/75 - High
106/93 - Medium

The 970 is able to keep up with the 390 on DX12 at High and Medium, until something happens at Ultra. The same thing happens to all Nvidia cards except the 980 Ti, which stays close to the Fury X in every test. Maybe it's VRAM, or MLAA, no idea.

That the 380 gets better perf than a 960 (strange that the 4GB 960 is slower than the 2GB one), and that the 390 beats the 970 on DX12 by a small margin, is all expected. After all, this is another AMD DX12 title, and AMD cards are actually good; they just perform horribly under DX11, so DX12 balances things out.
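For reference, figures like those are usually derived from a frame-time log. Assuming the "97th percentile" column means the frame rate sustained over 97% of frames (the common reviewer convention), the math looks like this rough Python sketch; the frame-time trace here is made-up data, not from this benchmark:

```python
# Compute "avg FPS / 97th percentile" numbers from per-frame times (ms).
import numpy as np

def fps_metrics(frame_times_ms):
    ft = np.asarray(frame_times_ms, dtype=float)
    avg_fps = 1000.0 / ft.mean()
    # 97% of frames render at least this fast (97th-percentile frame time,
    # converted back to a rate).
    p97_fps = 1000.0 / np.percentile(ft, 97)
    return avg_fps, p97_fps

# Fake trace: mostly ~13 ms frames with occasional ~19 ms spikes.
rng = np.random.default_rng(0)
trace = np.where(rng.random(5000) < 0.05, 19.0, rng.normal(13.2, 0.8, 5000))
print("avg %.0f fps / 97th pct %.0f fps" % fps_metrics(trace))
```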
 

Lepton87

Platinum Member
Jul 28, 2009
By product stack positioning, only the 980 Ti looks fine compared to AMD's counterpart; its performance is in the same ballpark. The rest of NV's cards are underperforming compared to AMD's competing cards. That's not counting the 1080, because it doesn't have a competitor. As for the 1080's performance relative to Maxwell cards: people expected the 1080 to only gain performance relative to older NV cards, but that shouldn't happen, because the 1080 is a die shrink, not a new architecture. It has the same weaknesses.
 

showb1z

Senior member
Dec 30, 2010
So any more info on when this patch is coming? Should be this week if they want to keep to their schedule.

Then what caused it when that was the only option they changed? Posting another unrelated screenshot isn't proof.

Also where is the proof that it hurts Nvidia performance a lot? I showed that it has almost no perf hit.

C'mon man, why bother actually providing any proof for your claims?
It's way more efficient to just let out a continuous stream of unsubstantiated spin and half-truths and simply ignore anyone who calls you out on it.

1080FE doesn't throttle, Polaris is a joke in performance/W, 290X TDP is 250W, DX12 is pointless, etc.
 

boozzer

Golden Member
Jan 12, 2012
So any more info on when this patch is coming? Should be this week if they want to keep to their schedule.



C'mon man, why bother actually providing any proof for your claims?
It's way more efficient to just let out a continuous stream of unsubstantiated spin and half-truths and simply ignore anyone who calls you out on it.

1080FE doesn't throttle, Polaris is a joke in performance/W, 290X TDP is 250W, DX12 is pointless, etc.
and it is allowed on a serious hardware forum :(