CIV: BE benchmarked


f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
It's not a joke metric: if the minimum stays above what you consider reasonable, you will get good performance. But that isn't to say that a lower minimum can't also mean very good performance, if that minimum is only hit very rarely or when it doesn't matter.


Rarely? How about once?

be-25.png


That image explains quite clearly why minimum FPS is a far less meaningful metric than people here are led to believe.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
I suppose the point must be brought up (no, I am not trolling here, only playing devil's advocate) whether Mantle the API brings benefits above what any hardware-specific API would bring. Are the benefits from Mantle due to Mantle being an inherently better API than DX? Or is it simply that the API is designed specifically for certain GPUs and therefore runs better than an API designed for nonspecific GPUs (not fundamentally better, just designed to run on GCN, so performance is better on GCN)?
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
Rarely? How about once?

be-25.png


That image explains quite clearly why minimum FPS is a far less meaningful metric than people here are led to believe.

Well, depending on when and where it happens, that may or may not be a problem, as I just said.

But then again, even a single low frame time, or a small number of them, in 10 or 20 minutes can be a big deal. This is a short test, so I can't tell you how well it would perform in-game. That's why the minimum can help you out: you know it won't be a problem.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
Well, all those who bought and are still on the 6xxx series got shafted then. And yes, if you look at many other (non-Mantle) games, you generally see that Nvidia performs with less overhead in DX11.

Are there any benchmarks done on AMD's older series? The HD 6000 series and lower would presumably need separate optimizations due to the architecture being much different from the GCN GPUs.
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
With FCAT they've got the mother of all benchmarking tools, but everyone is using it as they see fit.
There is no common standard for its use.

With Civ: BE we are even seeing the comeback of "minimums".
Really? Minimum FPS in 2014?

With images like this, dafuq would I need a "minimum" FPS that carries no info on how long that minimum lasts?
be-25.png


Where are the 1% and 99% percentiles? Where is the frame time variation?
Instead we are getting "minimums", LMAO.

Frame time perception varies from one eye to another. The frame time metric is important for detecting bad behavior in how frames are distributed on the screen.

The frame time metric has its worth, but it is not the ultimate or the only metric that matters. Minimum FPS is as important as the frame time metrics, since minimums can cause much more perceptible (for most end users) differences in the smoothness of gameplay.
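
As an aside, here is a minimal Python sketch of how the 1%/99% figures and frame time variation being asked for here are typically derived from a captured frame time log. The frame time values are made up for illustration, and the "1% low" formula used is one common convention rather than any particular site's methodology.

```python
import numpy as np

# Assumed input: per-frame render times in milliseconds, e.g. exported from a
# FRAPS/FCAT-style capture (exact file format varies by tool; values invented here).
frame_times_ms = np.array([16.7, 16.9, 17.1, 16.8, 45.2, 16.6, 17.0, 16.9, 16.7, 33.4])

# 99th-percentile frame time: 99% of frames complete at or below this time.
p99_ms = np.percentile(frame_times_ms, 99)

# "1% low" FPS: average FPS over the slowest 1% of frames
# (one common definition; reviewers differ on the exact formula).
slowest_1pct = np.sort(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
one_pct_low_fps = 1000.0 / slowest_1pct.mean()

# Frame time variation: mean absolute change between consecutive frame times,
# a simple indicator of stutter rather than of overall speed.
variation_ms = np.abs(np.diff(frame_times_ms)).mean()

print(f"99th percentile frame time: {p99_ms:.1f} ms")
print(f"1% low: {one_pct_low_fps:.1f} FPS")
print(f"frame-to-frame variation: {variation_ms:.1f} ms")
```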
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Frame time perception varies from one eye to another.

There is nothing subjective or perception-dependent about frame times.
Frame times are just a set of numbers.
Along with input latency, they fully describe the game experience performance-wise.

The frame time metric has its worth, but it is not the ultimate or the only metric that matters. Minimum FPS is as important as the frame time metrics, since minimums can cause much more perceptible (for most end users) differences in the smoothness of gameplay.

Frame times incorporate minimums and maximums and whatnot.
Every other performance metric is derived from frame times, so yes:

Frame times are the ultimate and the only metric that matters.

This is not even debatable. The only thing that's debatable is how you are going to present them.
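
To make the "every other metric is derived from frame times" claim concrete, here is a short sketch using the same made-up numbers as the example above: the usual avg/min/max FPS figures all fall out of the raw frame time array.

```python
import numpy as np

# Same illustrative frame times (in milliseconds) as the sketch above.
frame_times_ms = np.array([16.7, 16.9, 17.1, 16.8, 45.2, 16.6, 17.0, 16.9, 16.7, 33.4])

# Average FPS is total frames divided by total time,
# not the mean of per-frame instantaneous FPS values.
avg_fps = 1000.0 * len(frame_times_ms) / frame_times_ms.sum()
min_fps = 1000.0 / frame_times_ms.max()  # slowest single frame
max_fps = 1000.0 / frame_times_ms.min()  # fastest single frame

print(f"avg {avg_fps:.1f} / min {min_fps:.1f} / max {max_fps:.1f} FPS")
```

The reverse does not hold: the three summary numbers cannot reconstruct the trace, which is why how you present the frame time data ends up being the real debate.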
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
There is nothing subjective or perception-dependent about frame times.
Frame times are just a set of numbers.
Along with input latency, they fully describe the game experience performance-wise.



Frame times incorporate minimums and maximums and whatnot.
Every other performance metric is derived from frame times, so yes:

Frame times are the ultimate and the only metric that matters.

This is not even debatable. The only thing that's debatable is how you are going to present them.

Yeah, but frame time graphs are usually high resolution; a standard avg/min/max chart might be simpler to parse.
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
Frame times are the ultimate and the only metric that matters.

This is not even debatable. The only thing that's debatable is how you are going to present them.

But minimums are more perceptible to the user. Yes, frame times are important too, but they are not the only metric. By that logic, no one should use multi-GPU configs then...

This is not even debatable.

This is your opinion.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
But minimums are more perceptible to the user. Yes, frame times are important too, but they are not the only metric. By that logic, no one should use multi-GPU configs then...



This is your opinion.

I think you guys are talking past each other; looking at the frame times will give you all the other metrics [min, max, avg]. It is also more accurate and relatively high resolution. How you display this info is debatable, though.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
More data: it seems Jarred W. fixed an error [omission] in his previous review of Civ: BE.
http://www.anandtech.com/show/8643/...crossfire-with-mantle-sfr-not-actually-broken
CivBE-R9-290X-CrossFire-Mantle-Frame-Rates_575px.png

CivBE-R9-290X-CrossFire-D3D11-Frame-Rates_575px.png

One must be careful with these comparisons. The D3D frame times are the WORST I have ever seen for 290X CrossFire, much worse than in other games such as BF4, TR, etc. It looks like AMD did no driver optimization for this game (which is completely fine, as you can use Mantle); therefore, comparisons between AMD's DX and Mantle frame times are a little suspect.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
One must be careful with these comparisons. The D3D frame times are the WORST I have ever seen for 290X CrossFire, much worse than in other games such as BF4, TR, etc. It looks like AMD did no driver optimization for this game (which is completely fine, as you can use Mantle); therefore, comparisons between AMD's DX and Mantle frame times are a little suspect.

Fair point, but the difference you are seeing is due to different multi-GPU rendering techniques, SFR vs. AFR, so direct comparisons don't really work. Unless I am missing something?
 
Feb 19, 2009
10,457
10
76
I like this:

"But didn't we have SFR way back in the early days of multiple GPUs? Of course we did! 3dfx initially called their solution SLI – Scan Line Interleave – and had each GPU rendering every other line."

If SFR can get a smoother result with a higher minimum FPS, I'm happy to sacrifice some average FPS; if the frame latency is smooth, that's just an added bonus.
 
Feb 19, 2009
10,457
10
76
But if you really want:
After browsing through several benchmarks it's evident that NV's DX works almost as well as Mantle; ofc not quite there.
It's AMD's DX that is in another, lower league altogether.

To be fair, an R9 290X beating the 980 with Mantle is a significant gain, because normally the R9 290X is what, 20-25% slower? Ofc, NV's DX11 is much better than AMD's in CPU-limited situations; that much is clear now.

Edit: Also, regarding min FPS, it matters if it's repeatable at a certain stage of the game, and in this case it seems so:

"Basically, Civilization: Beyond Earth hits minimum FPS when you zoom all the way out, particularly when there are a lot of units on the screen, and that's exactly what happens in the benchmark at about the halfway point." - AT's benchmark

One would expect that late game this is impacted the most, especially on big maps with lots of factions.
 
Last edited:

Blue_Max

Diamond Member
Jul 7, 2011
4,227
153
106
I know the talk is all about framerate, but what's going to improve the time between turns? Is that 100% CPU?
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
To be fair, an R9 290X beating the 980 with Mantle is a significant gain, because normally the R9 290X is what, 20-25% slower? Ofc, NV's DX11 is much better than AMD's in CPU-limited situations; that much is clear now.

Edit: Also, regarding min FPS, it matters if it's repeatable at a certain stage of the game, and in this case it seems so:

"Basically, Civilization: Beyond Earth hits minimum FPS when you zoom all the way out, particularly when there are a lot of units on the screen, and that's exactly what happens in the benchmark at about the halfway point." - AT's benchmark

One would expect that late game this is impacted the most, especially on big maps with lots of factions.

But looking at those graphs, the spikes occurred before and after this zoom-out (the plateau/hill in the frame time graph) on the 980. IMO it looks like the spikes occur during the zoom action (when units enter the frame) rather than when fully zoomed out. But I agree with you: if it's repeatable, then it matters. This is why a long testing period is needed. One or two spikes every 10 minutes is unnoticeable, but one or two spikes every 30 seconds is quite bothersome.

It's also worth mentioning that using Mantle gains GCN about 2-10% in GPU-limited scenarios (coding for GCN specifically rather than for generic hardware), which adds to the 290X's lead over the 980 (this does not appear to be related to the lower CPU overhead).
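
To put rough numbers on the "one or two spikes every 30 seconds vs. every 10 minutes" distinction, here is an illustrative sketch that counts frames over a chosen frame time threshold in each 30-second window. The synthetic trace, the 50 ms threshold, and the injected long frames are all assumptions made up for the example, not anyone's captured data.

```python
import numpy as np

# Build a fake ~5-6 minute trace: ~17 ms frames with a handful of long frames mixed in.
rng = np.random.default_rng(0)
frame_times_ms = rng.normal(17.0, 1.0, 20000)
frame_times_ms[rng.integers(0, 20000, 25)] += 60.0  # inject a few stutter frames

spike_threshold_ms = 50.0  # what counts as a noticeable spike (assumed)
window_s = 30.0            # window size for counting spikes

timestamps_s = np.cumsum(frame_times_ms) / 1000.0        # when each frame finished
spike_times = timestamps_s[frame_times_ms > spike_threshold_ms]

n_windows = int(np.ceil(timestamps_s[-1] / window_s))
spikes_per_window = np.histogram(spike_times, bins=n_windows,
                                 range=(0.0, n_windows * window_s))[0]

print(f"trace length: {timestamps_s[-1]:.0f} s, total spikes: {len(spike_times)}")
print(f"worst 30 s window: {spikes_per_window.max()} spikes")
```

A long capture matters for exactly this reason: a short benchmark run cannot distinguish a once-per-run hitch from a spike that recurs every window.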
 
Feb 19, 2009
10,457
10
76
I know the talk is all about framerate, but what's going to improve the time between turns? Is that 100% CPU?

It would be 100% CPU.

I would think that Mantle reducing CPU overhead may then free up the CPU to process turns faster, but that's just speculation. It would be nice to see some proper benchmarks of late-game turn times.
 

Blue_Max

Diamond Member
Jul 7, 2011
4,227
153
106
It would be 100% CPU.

I would think that Mantle reducing CPU overhead may then free up the CPU to process turns faster, but that's just speculation. It would be nice to see some proper benchmarks of late-game turn times.

Dang... that may trigger the upgrade bug. :twisted:
Thanks
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
It would be 100% CPU.

I would think that Mantle reducing CPU overhead may then free up the CPU to process turns faster, but that's just speculation. It would be nice to see some proper benchmarks of late-game turn times.


I would 100% love to see IGP acceleration for something like this. I'm surprised Mantle doesn't do it (it would mesh extremely well with AMD's APUs, but AMD likely doesn't have the budget for it), and I hope DX12 will perhaps support IGP physics acceleration. It really looks like a way to extract performance with the CPU and GPU sitting right next to each other (HSA).
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
One must be careful with these comparisons. The D3D frame times are the WORST I have ever seen for 290X CrossFire, much worse than in other games such as BF4, TR, etc. It looks like AMD did no driver optimization for this game (which is completely fine, as you can use Mantle); therefore, comparisons between AMD's DX and Mantle frame times are a little suspect.

The graphs you quoted are frame rates, not frame times.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I would 100% love to see IGP acceleration for something like this. I'm surprised Mantle doesn't do it (it would mesh extremely well with AMD's APUs, but AMD likely doesn't have the budget for it), and I hope DX12 will perhaps support IGP physics acceleration. It really looks like a way to extract performance with the CPU and GPU sitting right next to each other (HSA).

You are mentioning AMD being budget-constrained. To assume that, you would need to know what their budget is and what the costs incurred would be. Do you have this info? If you don't, then you can't make any meaningful observations, and you shouldn't just throw it out there as if it adds anything at all. It's purely conjecture and smacks of FUD.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Going from 14.9 to 14.9.2 improved DX performance for my 7870 from 28 FPS to 36 FPS in the most demanding benchmark scene. Mantle gained some too, around 10%. That confirms three things:
1. AMD is working on their DX path.
2. A driver update is needed to get anything reasonable out of DX.
3. Mantle works better without the updated driver than DX does with it.

Here is my performance table for the benchmark (7870 @ 1.2/6.0 GHz + FX-6300 @ 4.2 GHz):
FHD Ultra avg FPS, 14.9 | 14.9.2:
DX: 24 | 33
Mantle: 35 | 40

Mantle is roughly 50% faster on the old driver (35 vs. 24) and about 20% faster after the driver update (40 vs. 33). It's interesting that with Mantle minFPS = avgFPS = maxFPS; in DX it's the usual ups and downs.

I can't be bothered to test how downclocking the CPU affects FPS. I would love to see numbers for a high-thread-count CPU, like an FX-8350 downclocked to 2 GHz.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
Are there any benchmarks done on AMD's older series? The HD 6000 series and lower would presumably need separate optimizations due to the architecture being much different from the GCN GPUs.

Checking the GameGPU numbers, it looks like this game runs better than average on the old VLIW cards compared to Nvidia.

A 6870 ahead of the GTX 570 and a 6950 ahead of the GTX 580 is very unusual for new games.

http://gamegpu.ru/images/remote/htt...ion_Beyond_Earth-test-civilizationbe_1920.jpg
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
Going from 14.9 to 14.9.2 improved DX performance for my 7870 from 28 FPS to 36 FPS in the most demanding benchmark scene. Mantle gained some too, around 10%. That confirms three things:
1. AMD is working on their DX path.
2. A driver update is needed to get anything reasonable out of DX.
3. Mantle works better without the updated driver than DX does with it.

Here is my performance table for the benchmark (7870 @ 1.2/6.0 GHz + FX-6300 @ 4.2 GHz):
FHD Ultra avg FPS, 14.9 | 14.9.2:
DX: 24 | 33
Mantle: 35 | 40

Mantle is roughly 50% faster on the old driver (35 vs. 24) and about 20% faster after the driver update (40 vs. 33). It's interesting that with Mantle minFPS = avgFPS = maxFPS; in DX it's the usual ups and downs.

I can't be bothered to test how downclocking the CPU affects FPS. I would love to see numbers for a high-thread-count CPU, like an FX-8350 downclocked to 2 GHz.


How did you calculate the average?

I am not sure of your methodology.