Civilization VI: Performance Analysis (Techpowerup)

Because zoomed out you're rendering way more stuff in general (GPU load could stay constant or potentially increase), and the CPU workload is a lot higher with 10-100x as many things on screen.

I don't think we can make any accurate judgments without more information.
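Rough illustration of the scaling I mean (made-up numbers, not measurements from Civ VI): the per-frame CPU cost of culling, animating, and submitting objects grows with how much is on screen, so the framerate ceiling drops fast as you zoom out.

```python
# Purely illustrative: a per-frame CPU cost model for a strategy-game camera,
# assuming a hypothetical fixed per-object update/submission cost.
def frame_cpu_ms(visible_objects, per_object_us=5.0, fixed_ms=2.0):
    """Estimated CPU milliseconds per frame for a given visible-object count."""
    return fixed_ms + visible_objects * per_object_us / 1000.0

for zoom, objects in [("zoomed in", 200), ("mid zoom", 2000), ("zoomed out", 20000)]:
    ms = frame_cpu_ms(objects)
    print(f"{zoom:10s}: {objects:6d} objects -> ~{ms:6.1f} ms CPU -> cap ~{1000 / ms:4.0f} fps")
```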

Just look at the actual scene though. Watch the video. The framerate goes nuts and spikes all over, and the only animations I could see @ 1080p upscaled to 1440p were the three I marked.

Do you honestly think that the graphics shown should be bringing $600 cards to their knees? I mean, 4X / turn-based games should have the lowest hardware requirements compared to RTS / FPS.
 
Just look at the actual scene though. Watch the video. The framerate goes nuts and spikes all over, and the only animations I could see @ 1080p upscaled to 1440p were the three I marked.

Do you honestly think that the graphics shown should be bringing $600 cards to their knees? I mean, 4X / turn-based games should have the lowest hardware requirements compared to RTS / FPS.

I'm not really talking about performance, just about trying to guess how the game works.

It doesn't seem that bad though. How is a 1080 with 70+ fps at 4K (with 8xMSAA too) being brought to its knees?

Edit: Just looked at the settings list for this game. Some of the stuff that gets enabled at VHQ/Ultra is pretty heavy: 8K shadows, 4-pass water reflections (I want to know what that means exactly...), 8xMSAA obviously, etc.
 
Just look at the actual scene though. Watch the video. The framerate goes nuts and spikes all over, and the only animations I could see @ 1080p upscaled to 1440p were the three I marked.

Do you honestly think that the graphics shown should be bringing $600 cards to their knees? I mean, 4X / turn-based games should have the lowest hardware requirements compared to RTS / FPS.

I'm not really talking about performance, just about trying to guess how the game works.

It doesn't seem that bad though. How is a 1080 with 70+ fps at 4K (with 8xMSAA too) being brought to its knees?

In general, with turn-based games this is an issue when the CPU is too busy doing other things (AI primarily) and cannot feed the graphics card. Civ VI doesn't appear to be poorly optimized on the graphics pipeline side; it's poorly threaded/optimized on the CPU side, which makes it slow everywhere else.
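A minimal sketch of that pattern (invented timings, not Firaxis' actual loop): if AI/turn processing runs on the same thread that submits draw calls, the GPU sits idle while the CPU works, no matter how fast the graphics pipeline itself is.

```python
# Sketch of the "AI starves the renderer" pattern on a single thread.
# All timings are invented for illustration; this is not the game's real loop.
import time

def simulate_frame(ai_ms, render_submit_ms):
    """Total frame time when AI and draw-call submission share one thread."""
    start = time.perf_counter()
    time.sleep(ai_ms / 1000)             # game-state / AI update (CPU only, GPU idle)
    time.sleep(render_submit_ms / 1000)  # building command buffers for the GPU
    return (time.perf_counter() - start) * 1000

frame_ms = simulate_frame(ai_ms=25, render_submit_ms=5)
print(f"frame time ~{frame_ms:.0f} ms -> ~{1000 / frame_ms:.0f} fps, GPU mostly idle")
```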
 
Again, how do we know that?

A GPU at 30% utilization is a good hint.

The charts here are another. I'm playing it at 3000x2000 on an i5-6300U with a GeForce ~940M at an acceptable framerate. It seems to have an acceptable performance floor, but the ceiling is incredibly low because of poor optimization.
 
Yeah, it's CPU bound. That doesn't necessarily mean it's poorly optimized.

Turn the resolution up to 4K and GPU utilization will improve; that doesn't mean the game suddenly optimized itself, it just means there are more factors you aren't considering.

It's CPU bound without using all of the resources available to the CPU. Is that not, by nature, poorly optimized?
 
Just look at the actual scene though. Watch the video. The framerate goes nuts and spikes all over, and the only animations I could see @ 1080p upscaled to 1440p were the three I marked.

Do you honestly think that the graphics shown should be bringing $600 cards to their knees? I mean, 4X / turn-based games should have the lowest hardware requirements compared to RTS / FPS.
I have seen two videos in this thread. One was done with a 980 Ti, and the FPS was mostly 100+ with dips down to the 80s. That is pretty good if you ask me. The other one, the one you linked and I assume you are harping on, was done with a Celeron and a GTX 650, and that one had poor performance. But consider that we are looking at a GTX 650, an entry-level GPU from a few generations ago.
 
Plus, it seems to have no problem utilizing 4 cores. The fact that per-core load decreases with higher core counts suggests that many tasks are parallelized.
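One way to read that observation (an Amdahl's-law style sketch; the serial fraction below is an assumption, not a measurement): if part of the per-turn work parallelizes, average per-core load naturally drops as cores are added even though the total work is the same.

```python
# Amdahl's-law style sketch: average per-core utilization for a workload with
# an assumed serial fraction. The 40% figure is a placeholder, not measured.
def avg_core_load(cores, serial_fraction=0.4):
    parallel = 1.0 - serial_fraction
    # Wall-clock time relative to single-core execution:
    wall_time = serial_fraction + parallel / cores
    # One unit of total work spread over `cores` cores for that wall time:
    return 1.0 / (cores * wall_time)

for cores in (2, 4, 6, 8):
    print(f"{cores} cores: ~{avg_core_load(cores):.0%} average per-core load")
```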
 
No, not necessarily.

That's pretty much a definition of poorly optimized.

If it takes more CPU execution resources to keep the GPU busy, and there are more CPU resources available but they go unused, that's poorly optimized. What I personally don't know is how resource utilization is reported in Windows: if the FPUs are all busy but the INT units are unused, is that 100% utilization?
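Setting the FPU/INT question aside, the headline CPU% that Task Manager style tools show is just busy time averaged over all logical cores, so one pegged thread on a many-core chip already reads as low overall utilization. A quick illustration with invented per-thread loads:

```python
# Illustration only: overall CPU% is an average across logical cores, so a game
# whose main thread is saturated can still report low utilization. Numbers invented.
thread_loads = [1.00, 0.35, 0.10, 0.05, 0.00, 0.00, 0.00, 0.00]  # 8 logical cores

overall = sum(thread_loads) / len(thread_loads)
print(f"main thread: {thread_loads[0]:.0%} busy (the real bottleneck)")
print(f"reported overall CPU usage: ~{overall:.0%}")
```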
 
That's pretty much a definition of poorly optimized.

If it takes more CPU execution resources to keep the GPU busy, and there are more CPU resources available but they go unused, that's poorly optimized. What I personally don't know is how resource utilization is reported in Windows: if the FPUs are all busy but the INT units are unused, is that 100% utilization?

I have no idea how it measures that either.

We need more info on those CPU usage charts: what was the GPU usage, memory and VRAM usage, etc.
 
The other one, the one you linked and I assume you are harping on, was done with a Celeron and a GTX 650, and that one had poor performance. But consider that we are looking at a GTX 650, an entry-level GPU from a few generations ago.

What? Why do you think it was on a Celeron? I linked to the benchmark from TPU.

It doesn't seem that bad though. How is a 1080 with 70+ fps at 4K (with 8xMSAA too) being brought to its knees?

[image: 1080.png]


Lots of cards aren't even hitting 80+ @ 1080p, and as we already discussed, it's not CPU bound until ~90 fps.
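Put another way, whichever side takes longer per frame sets the framerate. A toy model with assumed numbers (the ~11 ms CPU cost is just the figure that corresponds to a ~90 fps ceiling):

```python
# Toy model: the slower of CPU and GPU per-frame time determines fps.
# The 11 ms CPU figure is an assumption matching the ~90 fps ceiling mentioned above.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 11.0  # assumed CPU cost per frame
for card, gpu_ms in [("fast card @ 1080p", 8.0), ("slow card @ 1080p", 14.0), ("fast card @ 4K", 13.5)]:
    bound = "CPU" if cpu_ms >= gpu_ms else "GPU"
    print(f"{card:18s}: ~{fps(cpu_ms, gpu_ms):5.1f} fps ({bound} bound)")
```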
 
What? Why do you think it was on a Celeron? I linked to the benchmark from TPU.



[image: 1080.png]


Lots of cards aren't even hitting 80+ @ 1080p, and as we already discussed, it's not CPU bound until ~90 fps.

Because I watched the video, and it showed it had a Celeron in it. That is the one with poor performance. The one with 80-100+ FPS had something else, obviously.

This is the one with poor performance. https://www.youtube.com/watch?v=8eQcyfFJDFo
At the 2-second mark, it shows that it has a Celeron and a GTX 650 in it. The only other video I saw in this thread had awesome performance.
 
Because I watched the video, and it showed it had a Celeron in it. That is the one with poor performance. The one with 80-100+ FPS had something else, obviously.

This is the one with poor performance. https://www.youtube.com/watch?v=8eQcyfFJDFo
At the 2-second mark, it shows that it has a Celeron and a GTX 650 in it. The only other video I saw in this thread had awesome performance.
No, the video with the "good" performance shows bad performance as well; it has heavy framerate spikes at the same points where the Celeron stalls. Stronger CPUs just brute-force through it, but it is still bad performance for both systems.
 
[image: frametime graph]


TIL frametimes like that and mid-40s fps count as awesome performance.
If 40 FPS is the lowest it got, that's pretty good performance for a turn-based game. That said, I did not see the 47 fps dip, but still. It's a turn-based game with some ridiculous settings that can be turned down. Just because settings exist doesn't mean they have to be used.
 
If 40 FPS is the lowest it got, that's pretty good performance for a turn-based game. That said, I did not see the 47 fps dip, but still. It's a turn-based game with some ridiculous settings that can be turned down. Just because settings exist doesn't mean they have to be used.

40 FPS is not the lowest it got. The 99th percentile is at 33.9 ms, which is equal to 29.5 FPS.
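For anyone following along, the conversion is just the reciprocal of the frametime:

```python
# Convert a frametime percentile (in milliseconds) to the equivalent framerate.
def frametime_to_fps(frametime_ms):
    return 1000.0 / frametime_ms

print(round(frametime_to_fps(33.9), 1))  # 29.5 fps, the 99th-percentile figure above
```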


Early impressions (1, 2) would seem to suggest that DX12 performs worse than DX11 (by as much as 20%).
 
Hmm, I can see a boost for AMD cards in those reports.

Let's wait for some performance reviews.

Yeah, there seems to be one guy with a 290X who sees better results (about a 10% boost), but there is also a guy with a 280X (a 17% drop) and another with a 380 (a 6% drop).

Either way, I wouldn't get my hopes up.

Edit: There's also a guy with a 480 who sees a 4.6% boost.
 
This is sad to read. I hope reviews show something better. Considering the game is heavily CPU bottlenecked, you'd expect DX12 to be a godsend.
 