Battlefield 1 Benchmarks (Gamegpu & the rest)

Page 16 - AnandTech Forums
I'm working on an article comparing a number of video cards in BF1 multi-player, but started with CPUs to determine which would be the best platform to test on.

You all might be interested in checking out the CPU results pitting a 6600K@4.4, 6700K@4.4, and 6900K@4.3 against each other in both single-player and multi-player modes.

Nice, thanks! Can you also do a sample with them not OC'd, to see how much the OC affects it vs. cores?
 
This was well done! It actually shows what the CPUs can give you in multiplayer as well as single-player. No one has the guts to test multiplayer, but you straight up went there, and the information is valuable and interesting to read. Thanks for sharing.
I was thinking of another hex-core for my next CPU, but now I'm thinking it might make sense to stretch a little farther and grab an 8-core next year. People can't make decisions like this if review sites only show bottlenecked scenarios, so thanks for the interesting and useful info.
 
Glad you guys liked this. Yeah, the singleplayer benches will only take you so far. It's taking me a ton of time to get repeatable results in multiplayer, but my GPU benches should be up by tomorrow.

As for running the CPUs at stock, that will have to come later. I'm in the middle of switching through GPUs on the OC'd 6900K right now.
 
Love to see what an 8c/16t Zen would do, even though I just built an Intel system and couldn't wait.
 
It looks like I probably would not be able to tell the difference during game play?

And considering the CPU prices, I'd probably want to stick with the 4 core chips.

The 6900K is 3 to 4 times the price.
 
Furthermore, the 6900K is running at a lower overclock, since Broadwell-E can't clock as high as Skylake.

Well, the 6900K has the biggest overclock, doesn't it? It's at 1.1GHz over its base clock, which is far more of an overclock than the 6700K at only 400MHz over its base clock.
 
Yes, it does, but just barely, because your calculations are based on published base clocks rather than actual operating speeds. The 6900K has an 800MHz overclock over its 3.5GHz standard operating speed. That's 23%. The 6600K's standard operating speed is 3.6GHz under quad-core load, so it has a 22% overclock at 4.4GHz. Most 6700K samples will not hit that kind of overclock, which would translate to 4.9GHz.

The point, however, is that it's clearly cores, rather than clockspeed, that are making a huge difference here. The 6900K at 4.3GHz is at a 10% disadvantage in terms of per-core output versus a 6700K or 6600K at 4.4GHz, and yet it's way ahead.
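If anyone wants to double-check the math, here's a quick sketch (all-core turbo figures as quoted above; Python just for the arithmetic):

```python
# Overclock percentages relative to each chip's standard
# all-core operating speed, using the figures quoted above.
chips = {
    "6900K": {"base_ghz": 3.5, "oc_ghz": 4.3},
    "6600K": {"base_ghz": 3.6, "oc_ghz": 4.4},
    "6700K": {"base_ghz": 4.0, "oc_ghz": 4.4},
}

for name, c in chips.items():
    pct = (c["oc_ghz"] / c["base_ghz"] - 1) * 100
    print(f"{name}: {c['oc_ghz']}GHz is a {pct:.0f}% overclock over {c['base_ghz']}GHz")
```

That works out to 23% for the 6900K, 22% for the 6600K, and only 10% for the 6700K (which would need roughly 4.9GHz to match the other two).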
 
The one thing that sucks about having a 100Hz monitor is that my CPU requirements have decreased. It's hard to get excited about a monster CPU when I can only use 100fps anyway.
 
Thanks for the work, Termie. My 5960X and GTX 1080 are reporting for duty in BF1 MP!

Nice! Share your gamer tag (either here or by PM), and I'll friend you on Origin!

By the way, just about all my results are now up, including 3 AMD cards, 3 Nvidia cards, and VRAM usage. Just a few configurations (980 Ti, 1080, and 1080 SLI) have been locked out by EA's stupid DRM scheme (no more than five configurations allowed in 24 hours). I'm hoping Origin lets me back in tomorrow.
 
Frostbite 3, which underpins BF1, is one of the few game engines that scales really well, not just with more CPU cores but also with more GPUs. There was an article a few days ago on one of the major tech sites (can't remember which) that looked at CPU scaling, and basically most games are still by and large capped at 4-core/8-thread CPUs. Going beyond that does not help in most cases. In some, there was even a regression (like in Witcher 3).

I'd personally wait until Zen is out to see if there is more competition and thus lower prices. Even so, it will still be a few years until an i7 becomes what an i5 is today compared to the 8-10 core CPUs.
 
Good stuff, thanks for the effort! Confirms what some quad core folks had been reporting and as others have said it's great to see a game engine that scales so well with cores.
 
Nice testing. You might want to point out that the Fury Air has often been under $300 lately, even for the Nitro OC'd version. It took me a while to realize that you were using the Air, not the X; maybe state "Air" or the model on the graph? I didn't realize it until the note where you said it was air-cooled, and was wondering why it was behind the 980 non-Ti.
 
There was an article a few days ago on one of the major tech sites (can't remember which) that looked at CPU scaling, and basically most games are still by and large capped at 4-core/8-thread CPUs.

It was Tom's: http://www.tomshardware.com/reviews/multi-core-cpu-scaling-directx-11,4768.html

But their methods were flawed, as they turned off hyperthreading, which created non-existent SKUs. It was purely academic and not helpful to owners of actual Core i7s. They should at least have used a Core i5 instead.

My own published article on this topic may be more relevant: http://techbuyersguru.com/intels-core-i5-6600k-vs-i7-6700k-vs-i7-6900k-games
 
So with the latest drivers, the GTX 1060 seems to be on par with, or sometimes a couple frames faster than, the RX 480 at 1080p, correct?
One question: is there a way to force DirectX 11 on Windows 10, instead of DirectX 12?
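EDIT: I believe (from memory, not verified on my install) BF1 stores the DX12 toggle in Documents\Battlefield 1\settings\PROFSAVE_profile, so besides the in-game video option, setting this line there should force DX11:

```
GstRender.Dx12Enabled 0
```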
 
Awesome article thus far. Are you going to add any OC results for the GPUs?

Well, I chose to run all cards at stock to make it more of an apples-to-apples comparison. Keep in mind that not a single Pascal card I've benched (I've purchased five so far) has achieved more than a 12% overclock. If you see way more than that in published reviews on press sample cards, you can bet they were cherry picked before being sent out for testing.

My Fury can hit 1120 (which ironically works out to 12% as well), so I think the days of sky-high overclocks are in the past. My GTX 980 achieved 20%, and the R9 290 can do that as well with voltage. The 980 Ti was somewhere in between, with one of my samples hitting 1400MHz and the other hitting 1450MHz. I'd be happy to run a special "F2F Bench" of the 980 Ti at 1400MHz if you'd like. 😉

EDIT: Woo-hoo, EA just released the lock on my BF1 license, after only 20 hours!!! How generous! Benching the 980 Ti shortly, along with 1080 and 1080 SLI!
 
Termi, you're too kind bro 😀
Your 980 Ti would need to hit 1533MHz for it to be a F2F bench, but hey, what can you do 😉
Jokes aside, I wouldn't mind seeing how a 1430-1450MHz 980 Ti performs in BF1 in comparison to the other cards.
Anyway, thanks for doing this, and I look forward to reading the finished article.
 
Just benched the 980 Ti at 1424/7800 at 2560x1440. Hit 96.9 avg/82.5 min, which put it well above a stock 1070 (89.2/78.3). While I don't have time to switch in the 1070 again for OC testing, my previous benches on that card suggest it would end up even with the OC'd 980 Ti. With your 1533MHz overclock, you'd be well above the performance that any 1070 can achieve in BF1. You'd also be above the 96Hz cap on your monitor! 😉
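For a rough sense of the gap, here's the quick math (Python, just arithmetic on the numbers above):

```python
# Percentage advantage of the 1424MHz 980 Ti over a stock GTX 1070
# in BF1 at 2560x1440, using the figures quoted above.
ti_avg, ti_min = 96.9, 82.5        # overclocked 980 Ti
stock_avg, stock_min = 89.2, 78.3  # stock GTX 1070

avg_gain = (ti_avg / stock_avg - 1) * 100
min_gain = (ti_min / stock_min - 1) * 100
print(f"avg: +{avg_gain:.1f}%, min: +{min_gain:.1f}%")
```

So roughly +8.6% on the average and +5.4% on the minimum, before you even factor in a higher overclock like yours.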
 
Awesome. Thanks for testing that, I've been meaning to pick up the game. Great result for an overclocked GTX 980 Ti.
 