Battlefield 1 Benchmarks (Gamegpu & the rest)

No bottleneck; my CPU will do 4.5 GHz, and when overclocked the $150 GTX 970 will demolish a $170 RX 470. I can still get over $100 for my GTX 960, so $50 for 47% more performance from a GTX 970 is great value.
Yes, I will grab a nice upgrade when they refresh cards late next year.
Maybe a cheap GTX 1070 🙂 to go with an i7 Kaby Lake CPU.
He probably means that an i3 will not be enough for BF1, just like a dual core had lower performance in BF4.
 
http://www.techspot.com/review/1267-battlefield-1-benchmarks/

I wonder if it was tested in single player or multiplayer?

Single player:

Testing Methodology
Our benchmark pass lasted 60 seconds; we started at the beginning of the "Through Mud & Blood" story, where you take command of a British Mark V tank. This test features plenty of AI-controlled characters and, more importantly, an easy-to-follow path that allowed us to reproduce the test repeatedly with a high degree of accuracy.

The most useful part of the test was the CPU:

[Chart: CPU scaling with Fury X (CPU_FuryX.png)]

[Chart: CPU scaling with GTX 1080 (CPU_GTX1080.png)]


Amazing how poor the minimums are on low-core-count machines in DX12.

Wish we had 99th-percentile or frame-time data, but it's still good info.
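For anyone wondering what 99th-percentile / "1% low" figures actually measure: they come from the raw frame-time log rather than the averaged FPS counter. A minimal sketch of the usual calculation (the log format and numbers here are made up for illustration; real captures would come from a tool like PresentMon):

```python
# Sketch: deriving average FPS and "1% lows" from a frame-time log.
# All values below are hypothetical, purely to show the math.

def percentile(sorted_vals, p):
    """Nearest-rank percentile of an ascending-sorted list."""
    k = max(0, min(len(sorted_vals) - 1, round(p / 100 * len(sorted_vals)) - 1))
    return sorted_vals[k]

def summarize(frametimes_ms):
    """Return (average FPS, 1% low FPS) for a list of frame times in ms."""
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    ordered = sorted(frametimes_ms)     # ascending: slowest frames at the end
    p99_ms = percentile(ordered, 99)    # 99th-percentile frame time
    return avg_fps, 1000 / p99_ms       # "1% low" expressed as FPS

# Example: mostly 10 ms frames (100 FPS) with a few 25 ms stutters.
times = [10.0] * 97 + [25.0] * 3
avg, low1 = summarize(times)
print(f"avg: {avg:.1f} FPS, 1% low: {low1:.1f} FPS")
# → avg: 95.7 FPS, 1% low: 40.0 FPS
```

The point: a handful of 25 ms stutters barely dents the average (95.7 FPS) but drops the 1% low to 40 FPS, which is exactly the kind of thing plain minimum/average charts hide.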
 
The game looks really nice when you view it from a distance. Objects at close range don't look amazing when you focus on them. Vegetation is a bit lame.
 
Yet another failed DX12 implementation in the form of a blockbuster AAA title with a massive budget. This is despite EA having very deep pockets, and DICE not being amateurs by any stretch of the imagination.

Yet in the other thread we're led to believe that even indie developers will get "automatic" DX12 performance advantages simply by using Unity, and the market will be completely taken over by DX12. 🙄
 
Yet another failed DX12 implementation in the form of a blockbuster AAA title with a massive budget. This is despite EA having very deep pockets, and DICE not being amateurs by any stretch of the imagination.

Yet in the other thread we're led to believe that even indie developers will get "automatic" DX12 performance advantages simply by using Unity, and the market will be completely taken over by DX12. 🙄
Well, publishers with deep pockets and excellent coders have incredibly well-optimized DX11 paths that are very difficult to outperform. Yes, DICE's DX12 path should have been much better at day 1, I do agree, especially since it's a console title as well, so low-level optimization should already be integrated.

Indies, on the other hand, have incredibly bad DX11 paths, so even automatic DX12 from the game engine will probably provide some improvement.
Just look at Dota 2: they came out with a Vulkan path very quickly, and it provides a good amount of improvement precisely because their DX11 path was pretty bad.
 
Yet another failed DX12 implementation in the form of a blockbuster AAA title with a massive budget. This is despite EA having very deep pockets, and DICE not being amateurs by any stretch of the imagination.

Yet in the other thread we're led to believe that even indie developers will get "automatic" DX12 performance advantages simply by using Unity, and the market will be completely taken over by DX12. 🙄

This is why I'm still on Windows 8.1. I don't want to deal with the privacy nightmare of W10 when literally the only reason I'd ever upgrade is for DX12. I'm hoping more devs follow the route of id and go for Vulkan.

It would also make Linux gaming more viable since, as Raja of AMD has pointed out, DX12/Vulkan means the amount of work that has to be done in drivers is minimal. This would solve a big problem with Linux gaming (though not all of them, of course) and make the transition even smoother. Not being tied to the whims of MS would be a golden opportunity.
 
Battlefield 1 is extremely fun, and it runs great on my rig. Between this and the upcoming CoDs, I'm going to have more multiplayer goodness than I can handle.
 
Yet another failed DX12 implementation in the form of a blockbuster AAA title with a massive budget. This is despite EA having very deep pockets, and DICE not being amateurs by any stretch of the imagination.

Yet in the other thread we're led to believe that even indie developers will get "automatic" DX12 performance advantages simply by using Unity, and the market will be completely taken over by DX12. 🙄

When even the forefathers of DX12 can't get it to perform better than DX11, you're left wondering...

At this point DX12 is AMD's Mantle 2.0. They need it to stay competitive; NV can just rest on their laurels as long as their DX11 performance is good enough.

That list that got regurgitated ad nauseam isn't looking so good now that the games are finally hitting the market. AMD needs to get their developer relations in order. DX12 patches arriving after game releases, with performance regressions, are not helping their cause.
 