
[GPU.RU] Battlefield 4 Beta Bench

Jaydip

Diamond Member
http://gamegpu.ru/action-/-fps-/-tps/battlefield-4-beta-test-gpu.html

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_1920.jpg


http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_2560.jpg


http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_1920_msaa.jpg


http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_2560_msaa.jpg


http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_proz.jpg


This game is brutal @ 1440p with 4x MSAA. You will either need mGPU or probably the R9 290X.
http://gamegpu.ru/images/remote/htt...-Action-Battlefield_4_Beta-test-bf_4_proz.jpg
 
Thanks, looks like the 7970, or an overclocked 7950, should be good to go at 1080p max settings 🙂
 
I think the most interesting part was the core-load chart at the end of the article. It seems BF4 will load up to 8 cores pretty heavily.
 
6 observations:

1) HD7970GE is 2.3x faster than HD6970 at 1080p with MSAA!!! Very nice upgrade for 6950/6970 owners.
2) GTX770 seriously needs a price drop asap.
3) GTX680 has one of the largest leads over GTX580 in a modern game - 47%!

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_1920_msaa.jpg


4) GTX780/Titan run out of steam at 1600p. Not sure what the deal is, but they are barely faster than the 770/7970GE here. Driver issue? If that's the case, I can see the R9 290X beating Titan with the Mantle API, given the level of performance NV's flagship has right now.

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_2560_msaa.jpg


5) I wonder if 2GB introduces stuttering/hitching in this game that doesn't show up in FPS scores?

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Battlefield_4_Beta/test/bf_4_vram.jpg


6) FX8350 @ 4GHz is keeping up with the i7 2600K and embarrassing the old 1100T. Very nice optimization for Vishera here. If only more games took advantage of 6-8 CPU threads.
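As a back-of-the-envelope illustration of why an 8-thread engine narrows the FX-8350/2600K gap, here is a hypothetical Amdahl's law sketch (the parallel fraction below is an assumed illustrative value, not anything measured in the article):

```python
# Amdahl's law: speedup over one thread when a fraction p of the
# per-frame workload parallelizes across n threads.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.90  # assume 90% of the frame workload is parallelizable
for threads in (2, 4, 6, 8):
    print(f"{threads} threads: {amdahl_speedup(p, threads):.2f}x")
```

With these assumed numbers, going from 4 to 8 threads still yields a sizable gain, which matches the heavy core-load chart in the article.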
 
Good to see this game taking advantage of the 8 cores in the 8350. Even the minimums look pretty good. It will be interesting to see final-version benchmarks with and without Mantle. If these graphs are anything to go by, and if Mantle gives Tahiti owners gains worth talking about, it could make my OC'd 7950s feel like GTX 780s 🙂
 
Seems like it's well multithreaded indeed.
I'd have liked to see DX11.1 performance though, since DICE said it improves CPU performance.
 
4) GTX780/Titan run out of steam at 1600p. Not sure what the deal is, but they are barely faster than the 770/7970GE here. Driver issue?
It seems to happen in a lot of games. The gap between the 7970GE and 780 closes as the resolution increases. It's too consistent to be a driver issue. I suspect it's an efficiency issue; AMD's rasterizers and ROPs are probably hitting higher efficiency levels at greater resolutions (where triangles cover larger fractions of their rasterization tiles).
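The tile-coverage argument above can be sketched with some toy arithmetic (the triangle count is an assumed scene-complexity figure, purely for illustration): with a fixed triangle count, average pixels per triangle grows with resolution, so rasterizer tiles are more fully covered.

```python
# Average screen pixels per triangle for a scene of fixed complexity.
def pixels_per_triangle(width, height, triangles):
    return (width * height) / triangles

TRIANGLES = 2_000_000  # assumed visible-triangle count, illustration only
for w, h in ((1920, 1080), (2560, 1600)):
    print(f"{w}x{h}: ~{pixels_per_triangle(w, h, TRIANGLES):.2f} px/triangle")
```

Roughly doubling pixels per triangle at 2560x1600 would plausibly let coverage-limited rasterizer/ROP hardware run closer to peak efficiency, consistent with the gap closing at higher resolution.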
 
My 7950 is crying from within its box, and the 4670K is scared that it won't even be bought. Dang.

If Mantle ends up requiring an AMD CPU, I'm gonna wait a bit on buying this game.
 
It seems to happen in a lot of games. The gap between the 7970GE and 780 closes as the resolution increases. It's too consistent to be a driver issue. I suspect it's an efficiency issue; AMD's rasterizers and ROPs are probably hitting higher efficiency levels at greater resolutions (where triangles cover larger fractions of their rasterization tiles).

Even if you look at the 780/Titan in isolation against the GTX680, I cannot explain why they are barely faster given the spec difference. Also, the HD7990, which can be bought for $580-600, is only 6 fps slower than GTX780 SLI, which costs $1,300. I think NV will need to improve performance in this game. Once AMD introduces the Mantle API, even if it brings only a 10% increase, the $300 R9 280X will beat the $400-450 770, and the R9 290 would beat the Titan.
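The value math above can be sketched quickly. The prices come from the post, but the FPS figures and the 10% Mantle uplift are illustrative assumptions, not the article's measured numbers:

```python
# Performance-per-dollar comparison with a hypothetical API uplift.
def fps_per_dollar(fps, price):
    return fps / price

def with_uplift(fps, pct):
    """Apply a hypothetical percentage performance uplift (e.g. Mantle)."""
    return fps * (1 + pct / 100)

# A setup that is 6 fps slower but less than half the price wins on value:
print(f"780 SLI @ $1300: {fps_per_dollar(60, 1300):.4f} fps/$")  # assumed 60 fps
print(f"HD7990  @  $590: {fps_per_dollar(54, 590):.4f} fps/$")   # assumed 54 fps

# And a card less than 10% behind pulls ahead with a 10% uplift:
print(with_uplift(100, 10) > 108)
```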
 
This game is brutal @ 1440p with 4x MSAA. You will either need mGPU or probably the R9 290X.

Or just find those ~2 settings that are always there in every game, eat tons of FPS for minor fidelity gain, and turn them down. People are always gaga about running max settings, but I've never seen a game that didn't look almost as good with 1-3 settings on medium instead of max while running 10-20 FPS better.
 
Or just find those ~2 settings that are always there in every game, eat tons of FPS for minor fidelity gain, and turn them down. People are always gaga about running max settings, but I've never seen a game that didn't look almost as good with 1-3 settings on medium instead of max while running 10-20 FPS better.

This, so much this. Maxing games out is pretty dumb these days because not even a Titan can "max out" some newer titles at 1080p; basically, use common sense and lower the 2-3 settings that have the largest effect on framerate without a real IQ gain. For instance, Crysis 3 looks identical on Very High and High quality settings, yet Very High runs 20-25 fps slower consistently. In Metro 2033, ADOF causes such a tremendous loss of framerate it's ridiculous, and it doesn't increase visual quality whatsoever. There are many more examples I could mention as well.

Maxing games out became untenable several years ago. It's just a common-sense issue: you don't need $2,000 worth of GPUs to play a game. Generally speaking, every AAA game has 2-3 settings that lower framerate by a ton without a corresponding image-quality increase. Lower those settings by one notch, image quality won't noticeably decrease, and you'll be good to go.

Or if you want to blow two grand on Titans to "max" games out at 1600p, go right ahead; some folks apparently get some sort of self-pride out of being able to proclaim "Hey, I spent two grand to max games out." Whatever. I think most folks opt for the common-sense approach rather than the 2-3 grand approach.
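The "lower the 2-3 most expensive settings" approach boils down to ranking settings by FPS recovered per unit of visual quality given up. A toy sketch, where every FPS cost and (subjective) quality score is a made-up example, not a measurement:

```python
# Rank graphics settings by FPS recovered per point of visual quality lost.
settings = {
    # name: (fps recovered by dropping one notch, rough visual loss 0-10)
    "MSAA 4x -> 2x":          (18, 2),
    "Ultra -> High shadows":  (10, 1),
    "HBAO -> SSAO":           (6, 1),
    "Ultra -> High textures": (2, 5),
}

# Best candidates first: most FPS back per point of quality given up.
ranked = sorted(settings.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
for name, (fps, loss) in ranked:
    print(f"{name}: recover ~{fps} fps for {loss}/10 visual loss")
```

With these toy numbers, shadows and MSAA are the obvious first notches to drop, while textures stay maxed.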
 
Good to see this game taking advantage of the 8 cores in the 8350. Even the minimums look pretty good. It will be interesting to see final-version benchmarks with and without Mantle. If these graphs are anything to go by, and if Mantle gives Tahiti owners gains worth talking about, it could make my OC'd 7950s feel like GTX 780s 🙂

The CPU charts are kind of weird, though. The first chart, showing raw FPS, is at 1680x1050, while the core-load charts are at 1080p. Could the first chart be a misprint? Normally they test at 1080p.

Anyway, a nice result for the 8350, but the 6-core Vishera falls off a lot. The game also appears to use Hyper-Threading fairly well.
 
6 observations:
1) HD7970GE is 2.3x faster than HD6970 at 1080p with MSAA!!! Very nice upgrade for 6950/6970 owners.

This has always been the case. In any modern DX11 game with MSAA, the HD 7970 GHz is anywhere from 1.8-2.2x the speed of the HD 6970.
2) GTX770 seriously needs a price drop asap.

That will happen when the R9 280 and 290 series launch, especially the R9 290.

3) GTX680 has one of the largest leads over GTX580 in a modern game - 47%!

shows how well balanced GK104 is :thumbsup:

4) GTX780/Titan run out of steam at 1600p. Not sure what the deal is, but they are barely faster than the 770/7970GE here. Driver issue? If that's the case, I can see the R9 290X beating Titan with the Mantle API, given the level of performance NV's flagship has right now.

If this is the Titan's performance at 1600p, it sucks: 30+% faster than the HD 7970 GHz at 1080p but just 15% faster at 1600p. The R9 290X will beat Titan in DX11.1 while destroying it with Mantle.

6) FX8350 @ 4GHz is keeping up with the i7 2600K and embarrassing the old 1100T. Very nice optimization for Vishera here. If only more games took advantage of 6-8 CPU threads.

With next-gen consoles driven by 8-core Jaguar CPUs, developers like DICE will already have worked on scaling their game engines to 8 threads, so it's easy to see future games doing well on AMD FX.
 
These don't mean anything, other than for funsies, since they still have to address the current problems with the GPU/CPU not playing nice in the beta. You know it's messed up when my LOW settings on a GTX 580 look the same as Ultra settings. lol

FYI, I don't get any hitching; it seems to affect just certain cards.
 
GameGPU.ru is the fishiest reviewer site out there. So many of their benchmarks have been way off from what other reviewers have posted.

I don't understand why people give them credit.
 