Battlefield 1 Benchmarks (Gamegpu & the rest)


kraatus77

Senior member
Aug 26, 2015
266
59
101
Actually, it's the truth so far, looking at their DX11 vs DX12/Vulkan results in other games. Couple that with the async compute driver, which is still coming god knows when.
Also remember DX12 for Fermi? It's nowhere to be found. So much for 100% DX12 support for all DX11 GPUs.

But simply saying "FUD" solves everything, I guess.

Anyway, this is a benchmark thread for BF1, so let's stick to discussing this game. For other topics please make a separate thread; we have enough derailing already.

Nice to see another optimized game from DICE; other devs can learn a thing or two. Also, if anyone wants to try it, you can play it right now if you have EA Access, but only for 10 hours.
 

greatnoob

Senior member
Jan 6, 2014
968
395
136
This is unsubstantiated FUD.

I think at this point it's been proven an exhaustive number of times throughout this entire forum.
1) DX12 driver optimisations are not possible in the same way DX11 ones were because the API layer is now much thinner, so Nvidia, AMD and Intel have significantly less control over what they can do to optimise DX12 'drivers' (see the sketch after this list).
2) Maxwell: no async compute, after more than a year of promising it'll come soon.
3) "This is unsubstantiated FUD" is not a valid argument; why is it 'FUD'?

@swilli89 is spot on when he says "Either their software, hardware, or a combination of both aren't well adapted to future APIs unfortunately for nvidia buyers"
Kepler hasn't aged well at all, and neither will Maxwell (for DX12; DX11 seems to be completely fine though). Saying Nvidia cards aren't adapted for future APIs (DX12 and Vulkan) seems like a perfectly tenable thing to say, especially when there's data showing a very obvious performance regression in Vulkan and DX12 games on older Nvidia cards. On the other hand, "This is unsubstantiated FUD" seems like a knee-jerk defense mechanism to buyer's remorse, akin to blocking your ears and covering your eyes because you don't want to hear what you think to be false. Bring up evidence to counter what has been posted, or don't bother posting at all.
 
Aug 11, 2008
10,451
642
126
That's because there's a performance regression of the 1080 in DX12, so you're GPU bound...
That's true, but not the point of what I was saying. The poster I was replying to said DX11 was faster only because they were testing with a 5960X. That is not correct: DX11 is faster with the 6700 as well and equal with the 4770.

Edit: also, I would not consider the 4770 GPU bound in DX11, but it is in DX12.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Edit: also, I would not consider the 4770 GPU bound in DX11, but it is in DX12.

[Charts: b1_proz_11.png and b1_proz_12.png (Gamegpu CPU results, DX11 and DX12)]


The 4770 is basically even in DX11 vs DX12. It is definitely CPU bound in DX11, as the 1080 drops from 121/139 to 109/125.

All the other CPUs benefit from DX12 or are basically even.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
OMG my rig is going to wreck this game. This is great news for me. SLI works well and oh, that 2600K! Wow. I would have liked to see a 6 core chip thrown in there, but it won't matter anyway. A 2600K with a moderate OC will be enough to thrash this game, let alone something with more cores. i5s are dying. Sad but true. Was that a SKYLAKE i5 being edged out by an ancient Sandy Bridge?
Also, an FX 8350 neck and neck with a new Skylake chip under DX12? Good god. Did I mention that i5's are dead?
Also, I bet Zen would be a beast in this game. As good or better than anything else out there I bet. Not many games use cores like this one seems to.
 
Last edited:

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
I would have liked to see a 6 core chip thrown in there, but it won't matter anyway

The 5960X is 8-core and the 6700 performs the same, so it looks like it tops out at ~4 physical cores / 8 threads.

Also, an FX 8350 neck and neck with a new Skylake chip under DX12? Good god. Did I mention that i5's are dead?

Yeah, I feel bad for everyone who cheaped out on Pentiums / i3s over similarly priced 8320s or used i7 2000 / 3000 / 4000 series chips.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
The 5960X is 8-core and the 6700 performs the same, so it looks like it tops out at ~4 physical cores / 8 threads.

They perform the same due to a GPU bottleneck, as usual. A proper test would be 1080 SLI or a Titan X @ 1080p.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
They perform the same due to a GPU bottleneck, as usual. A proper test would be 1080 SLI or a Titan X @ 1080p.

The 5960X is 8-core and the 6700 performs the same, so it looks like it tops out at ~4 physical cores / 8 threads.



Yeah, I feel bad for everyone who cheaped out on Pentiums / i3s over similarly priced 8320s or used i7 2000 / 3000 / 4000 series chips.

I feel bad for the people that suggested i3s or, even worse, 2C/2T Pentiums over any Vishera only to push an agenda. And even worse for the poor lads that paid attention to that backhanded advice.

Ultimately, DX12 is good for everyone. Less money on the CPU --> more budget allocated to the GPU. Who wouldn't want that? We know who: the people that say DX12 is the son of the devil just because Paxwell can't do well on this API outside of UE4 games or blatantly suspicious synthetic benches such as Time Spy. Time to save a whole lot of posts for when Volta arrives and DX12 suddenly becomes good again. Yeah, just like when it was first announced and Nvidia was endorsing it with that five-year development time lie.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
I feel bad for the people that suggested i3s or, even worse, 2C/2T Pentiums over any Vishera only to push an agenda. And even worse for the poor lads that paid attention to that backhanded advice.
So in one of the few game engines that can take advantage of lots of cores, running with an API that can take advantage of lots of cores, the FX-8350 barely gets 10 FPS more than the i3-6100, and that's bad for i3s how exactly?
If anything it's a testament to how ridiculously outdated the FX cores are; they should be able to completely destroy a dual-core CPU since they have six more cores.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Ultimately, DX12 is good for everyone. Less money on the CPU --> more budget allocated to the GPU. Who wouldn't want that? We know who: the people that say DX12 is the son of the devil just because Paxwell can't do well on this API outside of UE4 games or blatantly suspicious synthetic benches such as Time Spy. Time to save a whole lot of posts for when Volta arrives and DX12 suddenly becomes good again. Yeah, just like when it was first announced and Nvidia was endorsing it with that five-year development time lie.

What? DX12 performs fantastically on Maxwell and Pascal in Tomb Raider and Gears of War. Only Gaming Evolved games are broken.

It is funny that Nixxes was able to implement DX12 in their first iteration without any problems in Tomb Raider, and yet their DX12 implementation in Deus Ex: Mankind Divided is still broken and totally useless. Maybe there is a correlation between AMD, money and certain publishers and developers...
 

frowertr

Golden Member
Apr 17, 2010
1,372
41
91
I wouldn't hold your breath. Nvidia's driver team has been about as reliable as your Georgia secondary against Tennessee's Hail Marys when it comes to putting out DX12 updates. Either their software, hardware, or a combination of both aren't well adapted to future APIs, unfortunately for Nvidia buyers.

We are talking about an 8-10 FPS difference here between the 1060 6GB and the RX 480 at 1440p. Hardly the stuff of legends...

I'll take my lower power draw, the card size (I have the Gigabyte Mini), and the quieter card any day of the week with differences this small.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
136
We are talking about an 8-10 FPS difference here between the 1060 6GB and the RX 480 at 1440p. Hardly the stuff of legends...

I'll take my lower power draw, the card size (I have the Gigabyte Mini), and the quieter card any day of the week with differences this small.

On top of this, GTX 1060 in DX11 performs exactly like RX 480/390X in DX12, with (much) better minimum FPS.

http://www.sweclockers.com/test/22807-snabbtest-battlefield-1-i-directx-11-och-directx-12

There is room for improvement as well, just like AotS's DX12 implementation on NVIDIA cards (launch vs now).
 
Last edited:

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
Best implemented DX12 game yet, and it's a bloodbath o_0

Looking forward to more benchmarks. This is one of the biggest games of 2016, so this is pretty much a big deal :)
No, it isn't. It's a DirectX 11 game with a DX12 back-end renderer.
 

Spjut

Senior member
Apr 9, 2011
932
162
106
On top of this, GTX 1060 in DX11 performs exactly like RX 480/390X in DX12, with (much) better minimum FPS.

http://www.sweclockers.com/test/22807-snabbtest-battlefield-1-i-directx-11-och-directx-12

There is room for improvement as well, just like AotS's DX12 implementation on NVIDIA cards (launch vs now).

For those not translating the text, Sweclockers talks about DX12 having more stutters and irritating "micro stops" for both AMD and Nvidia, but it's worse for AMD than for Nvidia
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
A proper test would be with a full 64-player game going. That's what sucks up the CPU time.

Most review sites are too chicken to do that. They are so terrified of not being able to maintain perfect repeatability that they totally sacrifice any kind of useful data in these multiplayer games by doing CPU tests in single player. But still, avoiding GPU bottlenecks is critical if you want to see relative CPU performance. Of course, most sites get this part wrong as well.
 
Mar 10, 2006
11,715
2,012
126
Most review sites are too chicken to do that. They are so terrified of not being able to maintain perfect repeatability that they totally sacrifice any kind of useful data in these multiplayer games by doing CPU tests in single player. But still, avoiding GPU bottlenecks is critical if you want to see relative CPU performance. Of course, most sites get this part wrong as well.

If they did, then fans of a particular CPU maker would cry foul if their preferred brand does worse than expected :p



Your purposeful, inflammatory post was unnecessary.


esquared
AnandTech Forum Director
 
Last edited by a moderator:
  • Like
Reactions: CHADBOGA

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
If they did, then fans of a particular CPU maker would cry foul if their preferred brand does worse than expected :p

Maybe so, but I expect that dynamic will shift drastically when Zen comes barreling through our lives like the Wabash Cannonball!
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Testing Battlefield games for CPU performance in single player is so dumb it's basically misleading. Reviewers need to test 64-player MP for CPU or not even bother putting up CPU tests. Nobody cares how great the single player campaign runs when you'll play it for 1/400th of the total time you play BF.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
For those not translating the text, Sweclockers talks about DX12 having more stutters and irritating "micro stops" for both AMD and Nvidia, but it's worse for AMD than for Nvidia

That's terrible. Hopefully official drivers will fix some of these issues.

DX12 is not looking like how I envisioned it...
 
  • Like
Reactions: Arachnotronic

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
For those not translating the text, Sweclockers talks about DX12 having more stutters and irritating "micro stops" for both AMD and Nvidia, but it's worse for AMD than for Nvidia
The culprit is Sweclockers' test bench which has 8GB of RAM. DX12 tends to increase memory usage, and 16GB is recommended so it's not a surprise. Battlefront also had poor frame times if you had 8GB.

16GB seems to be the new 8GB, at least for gaming.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
So in one of the few game engines that can take advantage of lots of cores, running with an API that can take advantage of lots of cores, the FX-8350 barely gets 10 FPS more than the i3-6100, and that's bad for i3s how exactly?
If anything it's a testament to how ridiculously outdated the FX cores are; they should be able to completely destroy a dual-core CPU since they have six more cores.

He's not talking about the latest i3 but the older ones, I guess, the ones many people recommended over the FX 8350. It's good that you recognize that the older FX 8350 beats the newest i3 in today's titles. I play everything on my FX machine, where the limiting part is the 7970 I have with it.
 

pcslookout

Lifer
Mar 18, 2007
11,959
157
106
The culprit is Sweclockers' test bench which has 8GB of RAM. DX12 tends to increase memory usage, and 16GB is recommended so it's not a surprise. Battlefront also had poor frame times if you had 8GB.

16GB seems to be the new 8GB, at least for gaming.

I agree with you here!

Surprised it took so long for 16 GB to become the new 8 GB!
 
  • Like
Reactions: Bacon1
Mar 10, 2006
11,715
2,012
126
That's terrible. Hopefully official drivers will fix some of these issues.

DX12 is not looking like how I envisioned it...

DX12 sucks right now. It adds unnecessary burden to the developer in search of gains that just aren't worth it, especially given that gaming PCs have much faster CPUs than the consoles.

There is a reason DX12-like APIs didn't really come to PC until AMD started aggressively pushing Mantle.
 
  • Like
Reactions: Sweepr

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
DX12 sucks right now. It adds unnecessary burden to the developer in search of gains that just aren't worth it, especially given that gaming PCs have much faster CPUs than the consoles.

There is a reason DX12-like APIs didn't really come to PC until AMD started aggressively pushing Mantle.

Doom is always going to be there as a counterexample whenever you try bagging on DX12 and low-level APIs in general. There's a learning curve involved with these APIs, as we've seen, but the gains are very real and worthwhile.
 
  • Like
Reactions: Bacon1