Battlefield 1 Beta CPU scaling performance

Page 3 - AnandTech community forums

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
So do badly performing quad-, hex-, and octo-cores.

Look again. You can push a 1080p 144 Hz monitor with FX octo-cores or an i5 2500K/i7 2600K. In my book that's decent performance in 2016, when 1080p is by far the most popular resolution. Can you guys say the same for dual-core CPUs???
 

wingman04

Senior member
May 12, 2016
393
12
51
lol. Look at this CPU usage and GPU usage. The GPU isn't reaching 99%; it's usually at 80% and sometimes at 60% because of the CPU.

It's unstable as fu**
Must be a virus on his PC. Look at this scaling below.
[Image: bf1_proz_11.png]
 

AnanUser

Junior Member
Apr 3, 2015
14
0
11
Lol, yeah, everyone with an i5 has a virus.

Don't go by benchmarks from strange sources. Test it yourself. Check other YouTube videos, check the official forums: does everyone with an i5 have malware on their PC?

That graph is a pure lie. I have a 3450, which has basically the same performance as the 2500K at stock clocks, and I can't get near that ;)

Another example: 2500K, 40 fps at the beginning of the video, and he is in an area where there are no players: https://www.youtube.com/watch?v=-RpECi12EgA

100% CPU usage, 50~90% GPU usage.
 

wingman04

Senior member
May 12, 2016
393
12
51
Lol, yeah, everyone with an i5 has a virus.

Don't go by benchmarks from strange sources. Test it yourself. Check other YouTube videos, check the official forums: does everyone with an i5 have malware on their PC?

That graph is a pure lie. I have a 3450, which has basically the same performance as the 2500K at stock clocks, and I can't get near that ;)

Another example: 2500K, 40 fps at the beginning of the video, and he is in an area where there are no players: https://www.youtube.com/watch?v=-RpECi12EgA

100% CPU usage, 50~90% GPU usage.
LoL, it was a virus. Eat this: BF1 runs sweet with an i5 6600K. BF1 - GTX 1070 - i5 6600k - 1080p

 

AnanUser

Junior Member
Apr 3, 2015
14
0
11
https://www.youtube.com/watch?v=d1Ar94-fJKM - i5 6400 + 1060
https://www.youtube.com/watch?v=esi_kem46Gs - another i5 6400 + 1060

Do you want more? I can post a dozen videos here... Maybe everyone is infected with a "virus".

So, by your logic, the game is well optimized when an i5 6600K is the minimum to play? Even in that video his CPU is bottlenecking the GPU at some moments, and we are talking about a 6600K!!!

Do you want the link to the official forums with dozens of pages talking about this?

Dude...

I'm a fanboy of the Battlefield series, but I'm not blind. You don't gain anything by defending something that isn't right. Or do you work at DICE?
 

Spjut

Senior member
Apr 9, 2011
932
162
106
The i5 in Wingman's video is running at 4.5 GHz.

Anyway, people should use the DrawGraph command for checking CPU and GPU load.
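For anyone following that suggestion: Frostbite titles expose an in-game console (tilde key), and the commonly cited overlay commands are the ones below. Worth double-checking against the current build, since console variables can change between releases.

```
PerfOverlay.DrawFps 1
PerfOverlay.DrawGraph 1
```

The first draws an on-screen fps counter, the second the CPU/GPU frame-time graph. Both can reportedly also be placed in a user.cfg in the game's install folder so they apply at launch.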
 

AnanUser

Junior Member
Apr 3, 2015
14
0
11
I didn't see that. The 6600K comes with a 3.5 GHz base clock, so 4.5 GHz is +1 GHz. Thanks for the observation.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Look again. You can push a 1080p 144 Hz monitor with FX octo-cores or an i5 2500K/i7 2600K. In my book that's decent performance in 2016, when 1080p is by far the most popular resolution. Can you guys say the same for dual-core CPUs???

You can take a look on the chart in the OP and tell me what you think.
 
  • Like
Reactions: zentan

wingman04

Senior member
May 12, 2016
393
12
51
https://www.youtube.com/watch?v=d1Ar94-fJKM - i5 6400 + 1060
https://www.youtube.com/watch?v=esi_kem46Gs - another i5 6400 + 1060

Do you want more? I can post a dozen videos here... Maybe everyone is infected with a "virus".

So, by your logic, the game is well optimized when an i5 6600K is the minimum to play? Even in that video his CPU is bottlenecking the GPU at some moments, and we are talking about a 6600K!!!

Do you want the link to the official forums with dozens of pages talking about this?

Dude...

I'm a fanboy of the Battlefield series, but I'm not blind. You don't gain anything by defending something that isn't right. Or do you work at DICE?
You post your dozen i5 6600K videos, man. I don't work for DICE; however, the game will run fine on the Xbox One. It is not the best port to PC and has a little coding trouble, but that has nothing to do with a 4-core processor. The i5 6600K runs BF1 fine without malware, viruses, multitasking, or bloatware.

Man, look at this video: BF1 Battlefield 1 Open Beta ULTRA | Core i5 6600k 3,5GHz (Stock) @3.9 Intel Turbo Boost | GTX 970 | DDR4 16GB. Its GPU utilization is even better than in my last video post.
 
Last edited:

AnanUser

Junior Member
Apr 3, 2015
14
0
11
OK, it's a waste of time... Go search for other i5s from the 4xxx, 3xxx, or even 2xxx series and then draw your own conclusions. If the 6600K, which is basically the top i5 on the market right now, runs the game at 90~100% usage much of the time, what happens with other, somewhat less powerful i5s?

Everyone has the same opinion, everywhere (Reddit, official forums, Twitter, etc.), so I doubt I'm wrong ;)
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
What do you have for FPS in BF1?

It was averaging around 75-80 fps maxed out at 3440x1440 on a single 980 Ti. Mins around the mid-60s, and max FPS at a G-Sync-locked 100 fps. Once SLI starts working I'd expect it to stay north of 100 most of the time.
 
Mar 10, 2006
11,715
2,012
126
It was averaging around 75-80 fps maxed out at 3440x1440 on a single 980 Ti. Mins around the mid-60s, and max FPS at a G-Sync-locked 100 fps. Once SLI starts working I'd expect it to stay north of 100 most of the time.

wth, mGPU doesn't work in this game either? :(
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
wth, mGPU doesn't work in this game either? :(
mGPU will probably only ever work in very compute-intense games like AotS, because splitting up compute workloads is at least somewhat easily doable and doesn't have to be realtime (hence the async part). I very much doubt it will ever work like SLI/CrossFire, because syncing two unequal cards up will be a total nightmare.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
I didn't try DX12 because, from what I saw, DX12 made the damn game run worse. I didn't even bother with it, so who knows about mGPU. SLI will work with it for sure. It's a BF game.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
I'm sorry, but based on the BETA performance I am officially declaring these CPU "requirements" to be WRONG. They are wrong, guys.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I'm using Windows 10.

https://www.youtube.com/watch?v=WOI-aeWF0bE (45 fps at the end of the video, when he is inside the city, with 100% CPU usage and 70% GPU usage).

https://www.youtube.com/watch?v=E63RBubSJIM (45 to 60 fps, always moving, on a 24-player server with a GTX 1080!)

The game runs at between 85 and 100% on almost every i5. On i7s it runs much better, with 50~70% CPU usage, and 80~90% on less powerful i7s like the 3770K.

Let's wait for the final game and inevitable patches and AMD/NV GPU driver updates. All of these things should improve performance and stability. Having said that, I do have 2 comments:

1) It's strange to complain that an i5 2500K series CPU is starting to severely bottleneck 2016 GPUs in cutting-edge 2H 2016 games, especially because by January 2017 this very CPU will be 6 years old. It should not be too expensive to sell the i5 2500K + mobo + DDR3 memory and upgrade to even an i5 6400 + Asus Z170-E/A or ASRock Z170 Pro, etc. When considering the total cost of ownership (TCO) over those 6 years, the upgrade will be dirt cheap, and an i5 6400 with BCLK overclocking can achieve 4.5-4.6 GHz. There are plenty of examples already where an i5 2500K, even when OCed, is bottlenecking modern GPUs.

2) It's somewhat pointless to discuss stock performance of the 2500K, which runs at only 3.3-3.7 GHz but can be overclocked up to 5 GHz, with 4.5 GHz easily achieved on a $20 CM 212 Evo style CPU cooler.

In general, it can even be stated that an i5 6600K @ 4.6 GHz is not that future-proof a CPU for next-gen titles. The idea that i5s are just as good as i7s and that HT is useless for gaming has distorted reality and people's expectations of i5s, and unfortunately this wrong information has continued to spread on various forums over the last 6 years with no end in sight.

i7 6700 stock vs. i5 6600K @ 4.6 GHz:
https://www.youtube.com/watch?v=f9cVxka2fns

At the 0:50-0:55 mark, in The Witcher 3, the i5 6600K @ 4.6 GHz has 1 core at 92%, 2 cores hitting 95%, and 1 core at 100%. In contrast, the i7 6700's cores generally hover in the 60-70% range in the same scene.

At the 1:30-1:32 mark, in Crysis 3, the i5 6600K @ 4.6 GHz has 1 core at 80%, 1 core at 84%, 1 core at 85%, and 1 core at 92%. The i7 6700's highest core is loaded to just 54%. At the 1:34 mark, the i5's cores jump to 88%, 2 cores hit 91%, and the 4th core is at 97%. The i7's highest core reaches just 66%.

At the 2:36-2:37 mark, in GTA V, the i5 6600K @ 4.6 GHz's 4 cores are loaded at 94%, 95%, 96%, and 100%! In the exact same area, the i7's 2 highest-pegged cores/threads are at only 75% and 81%, with the rest hovering in the 30-50% range.

While it is true that overall the i5 6600K @ 4.6 GHz isn't exactly severely bottlenecking modern GPUs just yet, it doesn't have a lot of headroom for next-generation titles should they be even more CPU demanding. Sooner or later, we will have a $400 GPU that's up to 2X faster than the GTX 1080.

Fact is, the i5, even when overclocked, was never as good a gaming CPU for the most CPU-demanding titles as the i7, going all the way back to the i5 2500K vs. i7 2600K era. The difference is that there are just more well-threaded AAA titles coming out now that take advantage of the i7's HT, and yet the outdated myth that the i5 is "just as good or 99% as good for gaming" as the i7 persists even in late 2016. Literally, a stock i7 6700 is an overall superior gaming CPU to a max-overclocked i5 6600K... which means an i5 2500K on the brink of 2017 is going to show its age.
 
Last edited:
  • Like
Reactions: ozzy702

wingman04

Senior member
May 12, 2016
393
12
51
Let's wait for the final game and inevitable patches and AMD/NV GPU driver updates. All of these things should improve performance and stability. Having said that, I do have 2 comments:

1) It's strange to complain that an i5 2500K series CPU is starting to severely bottleneck 2016 GPUs in cutting-edge 2H 2016 games, especially because by January 2017 this very CPU will be 6 years old. It should not be too expensive to sell the i5 2500K + mobo + DDR3 memory and upgrade to even an i5 6400 + Asus Z170-E/A or ASRock Z170 Pro, etc. When considering the total cost of ownership (TCO) over those 6 years, the upgrade will be dirt cheap, and an i5 6400 with BCLK overclocking can achieve 4.5-4.6 GHz. There are plenty of examples already where an i5 2500K, even when OCed, is bottlenecking modern GPUs.

2) It's somewhat pointless to discuss stock performance of the 2500K, which runs at only 3.3-3.7 GHz but can be overclocked up to 5 GHz, with 4.5 GHz easily achieved on a $20 CM 212 Evo style CPU cooler.

In general, it can even be stated that an i5 6600K @ 4.6 GHz is not that future-proof a CPU for next-gen titles. The idea that i5s are just as good as i7s and that HT is useless for gaming has distorted reality and people's expectations of i5s, and unfortunately this wrong information has continued to spread on various forums over the last 6 years with no end in sight.

i7 6700 stock vs. i5 6600K @ 4.6 GHz:
https://www.youtube.com/watch?v=f9cVxka2fns

At the 0:50-0:55 mark, in The Witcher 3, the i5 6600K @ 4.6 GHz has 1 core at 92%, 2 cores hitting 95%, and 1 core at 100%. In contrast, the i7 6700's cores generally hover in the 60-70% range in the same scene.

At the 1:30-1:32 mark, in Crysis 3, the i5 6600K @ 4.6 GHz has 1 core at 80%, 1 core at 84%, 1 core at 85%, and 1 core at 92%. The i7 6700's highest core is loaded to just 54%. At the 1:34 mark, the i5's cores jump to 88%, 2 cores hit 91%, and the 4th core is at 97%. The i7's highest core reaches just 66%.

At the 2:36-2:37 mark, in GTA V, the i5 6600K @ 4.6 GHz's 4 cores are loaded at 94%, 95%, 96%, and 100%! In the exact same area, the i7's 2 highest-pegged cores/threads are at only 75% and 81%, with the rest hovering in the 30-50% range.

While it is true that overall the i5 6600K @ 4.6 GHz isn't exactly severely bottlenecking modern GPUs just yet, it doesn't have a lot of headroom for next-generation titles should they be even more CPU demanding. Sooner or later, we will have a $400 GPU that's up to 2X faster than the GTX 1080.

Fact is, the i5, even when overclocked, was never as good a gaming CPU for the most CPU-demanding titles as the i7, going all the way back to the i5 2500K vs. i7 2600K era. The difference is that there are just more well-threaded AAA titles coming out now that take advantage of the i7's HT, and yet the outdated myth that the i5 is "just as good or 99% as good for gaming" as the i7 persists even in late 2016. Literally, a stock i7 6700 is an overall superior gaming CPU to a max-overclocked i5 6600K... which means an i5 2500K on the brink of 2017 is going to show its age.
I don't care about CPU utilization; that has to do with how efficiently the game engine feeds data to the CPU, which always runs at 100%. What I would like to see are new gaming benchmark links comparing FPS between the i5 6600K and the i7 6700K.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I don't care about CPU utilization; that has to do with how efficiently the game engine feeds data to the CPU, which always runs at 100%. What I would like to see are new gaming benchmark links comparing FPS between the i5 6600K and the i7 6700K.

You should care. You missed the point that even if the game is well threaded and loads 4 cores, that does NOT mean we should automatically see 90-100% per-core utilization. On a hypothetical 10 GHz i5 6600K, we would not see the 96% average CPU usage that occurs in the GTA V test from the link I provided. Since the 6600K has only 4 cores, to ensure no bottleneck exists those cores need to be clocked high enough for all 4 of them to handle the entire workload. Obviously an i5 6600K @ 4.6 GHz manages to do it in all 3 games I linked, but a stock one may not be enough, never mind a stock 2500K. The entire point was that the i5 6600K is already close to its limits in some big titles released in the last 3 years, and that's not even pairing a Titan XP / 1080 Ti SLI with it, which would shift the bottleneck even more to the processor.
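The hypothetical above (a 10 GHz i5 would not sit near 96% usage) can be made concrete with a toy model. The workload numbers below are invented for illustration, not measured from BF1: the CPU's time per frame shrinks as the clock rises, while the frame interval is set by whichever of CPU or GPU is slower.

```python
def cpu_utilization(cpu_work_ms_at_base, base_ghz, actual_ghz, gpu_frame_ms):
    """CPU time per frame scales inversely with clock speed; the frame
    interval is set by the slower of CPU and GPU."""
    cpu_ms = cpu_work_ms_at_base * (base_ghz / actual_ghz)
    frame_ms = max(cpu_ms, gpu_frame_ms)
    return cpu_ms / frame_ms

# Same imaginary workload: 12 ms of CPU work per frame at a 3.5 GHz
# base clock, with a GPU that needs 10 ms per frame (100 fps).
stock = cpu_utilization(12.0, 3.5, 3.5, 10.0)   # CPU-bound: pegged
oc    = cpu_utilization(12.0, 3.5, 4.6, 10.0)   # overclocked: just under the GPU
fast  = cpu_utilization(12.0, 3.5, 10.0, 10.0)  # hypothetical 10 GHz part

print(round(stock, 2), round(oc, 2), round(fast, 2))  # 1.0 0.91 0.42
```

Same game, same engine: the faster the CPU, the lower its reported utilization once the GPU sets the frame rate, which is why a pegged i5 signals a CPU bottleneck rather than a well-fed one.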

I am pretty sure an overclocked 6600K will run the game just fine.
 

guachi

Senior member
Nov 16, 2010
761
415
136
Looks like every AMD chip improves its average fps in DX12, with the average gain of those tested at +4.27%. The increase ranges from 3.03% to 6.67%.

Intel is all over the place, with some chips actually dropping in DX12, like the 2500K, 4670K, 6600, and 6700.

I wonder if this holds true for other implementations of DX12. As someone who actually owns an AMD chip, the 8350, it'd be nice to know I'd likely get a bigger boost switching to DX12 than other chips would.

In DX12 it's 70.2% as fast as the 6700 (vs. 66.8% in DX11) at 34.5% of the price ($100 for my 8350 vs. $290 for a 6700).
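The price/performance arithmetic in that last line checks out; a quick script using only the figures quoted in the post:

```python
# Prices and relative performance as quoted in the post.
fx8350_price = 100          # USD
i7_6700_price = 290         # USD
relative_perf_dx12 = 0.702  # FX-8350 at 70.2% of the i7 6700's DX12 fps

relative_price = fx8350_price / i7_6700_price
print(f"{relative_price:.1%} of the 6700's price")  # 34.5% of the 6700's price
print(f"{relative_perf_dx12 / relative_price:.2f}x the performance per dollar")  # 2.04x
```

So at these prices the 8350 delivers roughly twice the DX12 frames per dollar, which is the implicit point of the comparison.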
 

wingman04

Senior member
May 12, 2016
393
12
51
You should care. You missed the point that even if the game is well threaded and loads 4 cores, that does NOT mean we should automatically see 90-100% per-core utilization. On a hypothetical 10 GHz i5 6600K, we would not see the 96% average CPU usage that occurs in the GTA V test from the link I provided. Since the 6600K has only 4 cores, to ensure no bottleneck exists those cores need to be clocked high enough for all 4 of them to handle the entire workload. Obviously an i5 6600K @ 4.6 GHz manages to do it in all 3 games I linked, but a stock one may not be enough, never mind a stock 2500K. The entire point was that the i5 6600K is already close to its limits in some big titles released in the last 3 years, and that's not even pairing a Titan XP / 1080 Ti SLI with it, which would shift the bottleneck even more to the processor.

I am pretty sure an overclocked 6600K will run the game just fine.
A CPU is not like a car engine; the CPU runs at 100% speed all the time. With a good game engine, or an app that runs well, you should see 100% utilization on any CPU no matter how fast it is. When a game engine splits the game into more than 4 threads, you see a delay in feeding the CPU with data, which causes less than 100% CPU utilization.

Let me put it another way: an FPS game is not on a clock cycle; it will run as fast as the CPU can run. Games, encoding, and other apps that are not on a clock will try to run the CPU at 100% no matter how fast the CPU is or how many cores it has.

So it comes down to the game engine for the difference in utilization across cores or threads. Just remember, all game engine developers try to use 100% of the CPU; there is no throttling of the CPU yet, the way GPUs have Vsync.
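The distinction drawn above, an uncapped render loop that consumes every cycle versus a vsync-style cap that sleeps away the leftover frame budget, can be sketched as follows. The per-frame simulation time is invented for illustration.

```python
import time

def run_loop(frames, sim_ms, cap_hz=None):
    """Render `frames` frames and return achieved frames per second.
    `sim_ms` stands in for per-frame simulation/draw work (a busy-wait);
    `cap_hz`, if set, emulates a vsync-style frame cap."""
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        while (time.perf_counter() - t0) * 1000.0 < sim_ms:
            pass  # burn CPU, as an uncapped game loop does
        if cap_hz:
            # vsync-like throttle: sleep out the rest of the frame budget
            remaining = 1.0 / cap_hz - (time.perf_counter() - t0)
            if remaining > 0:
                time.sleep(remaining)
    return frames / (time.perf_counter() - start)

uncapped = run_loop(30, sim_ms=2.0)            # runs as fast as the CPU allows
capped = run_loop(30, sim_ms=2.0, cap_hz=60)   # throttled to ~60 fps
print(f"uncapped ~{uncapped:.0f} fps, capped ~{capped:.0f} fps")
```

The uncapped loop keeps a core pegged at 100% regardless of how fast it finishes each frame; the capped loop idles between frames, which is why utilization alone does not tell you whether a CPU is struggling.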
 
  • Like
Reactions: VirtualLarry