
[GameGPU] Just Cause 3

And again Gameworks = bad port.


Mad Max (no Gameworks) uses the same engine and has stellar performance and multi-GPU support.
 
This benchmark solidifies the view that we don't yet have a decent 4K capable single dGPU today, no matter the spin from either NV or AMD about either the Fury X or the 980 Ti.

I wonder what an OC'd 980 Ti would do here. 15-20% gains are not uncommon with an overclock, and that puts minimums above 30fps and averages close to 50 with SMAA enabled. Disabling SMAA (which probably isn't as important at 4K) should offer another jump in performance without much loss of visual quality.
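As a rough sanity check on that math, here's a quick sketch of the projection. The baseline figures below are illustrative assumptions (min just under 30fps, average in the low 40s), not the review's exact numbers:

```python
# Hypothetical overclock scaling projection for a 980 Ti at 4K.
# Baseline fps values are assumptions for illustration only.
baseline_min = 27.0   # assumed stock minimum fps
baseline_avg = 42.0   # assumed stock average fps

for gain in (0.15, 0.20):  # the 15-20% OC range mentioned above
    projected_min = baseline_min * (1 + gain)
    projected_avg = baseline_avg * (1 + gain)
    print(f"+{gain:.0%} OC: min {projected_min:.1f} fps, avg {projected_avg:.1f} fps")
```

With those assumed baselines, even the low end of the OC range pushes minimums past 30fps, and the high end brings averages to about 50.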

I also expect some gains to be had with driver and game updates. The sense I get from reading this and the Ars Technica review is that this was rushed and unoptimized.

Mad Max, which is supposed to be on the same engine, has far greater performance.

[attached 4K benchmark screenshot]
 
A few notes: the 390X beating the Fury (non-X) is quite disconcerting.

Yeah what is that about? The Fury seems like it has lost ground to the 390X since launch.

Makes me wish I'd got a 290X when I got my 970. Looking forward to some time to play this game.
 
And again Gameworks = bad port.


Mad Max (no Gameworks) uses the same engine and has stellar performance and multi-GPU support.

Check the Digital Foundry analysis of the console versions; it's also not ideal over there (regular drops to the mid 20s), and it's not using Gameworks on consoles.

And overall it looks good for AMD compared to Nvidia, so I wouldn't blame much on Gameworks.

This game is just trying to do more, I think, with the map design, physics and so on.
 
Check the Digital Foundry analysis of the console versions; it's also not ideal over there (regular drops to the mid 20s), and it's not using Gameworks on consoles.

And overall it looks good for AMD compared to Nvidia, so I wouldn't blame much on Gameworks.

This game is just trying to do more, I think, with the map design, physics and so on.

And the award for "understatement of the century" goes to...

This performance looks good enough for me to jump in, but at the same time I see this Intel code selling on eBay for almost $40...
 
I don't get the comparison to Mad Max. They're two different games. Vegetation is much more demanding than flat desert, along with buildings, water, view distance, etc...
 
NVIDIA on JC3: "The engine is incompatible with multi-GPU solutions."

http://www.geforce.com/whats-new/gu...hics-and-performance-guide#comment-2386875225

What a disappointing answer. My guess is NV is too lazy to make a proper SLI profile. What a way to disappoint your best customers.

If that's the case then screw that game for sure. Also screw buying two cards next gen because too many games don't even support SLI. I want my SLI to work every time!

-gamer whining complete-
 
SMAA looks excellent in Crysis 3, Evolve, and pretty much every game it's been used in; it was superior to FXAA with less blur.

Here, it looks grainy, though apparently that's the case for all the AA options.

Must be the cinematic feel they are chasing. :/

Ethan Carter Redux has all sorts of AA options, including SMAA and FXAA, and surprisingly I thought TXAA looked best. Not that SMAA looked bad; actually, all the AA options in that game looked good.
 
If that's the case then screw that game for sure. Also screw buying two cards next gen because too many games don't even support SLI. I want my SLI to work every time!

-gamer whining complete-

Isn't it only this game and Arkham Knight so far, plus a few that you needed to wait a few weeks for?
 
Andrew Burnes is the editor in chief of Geforce.com. He isn't an engineer and therefore may have faulty information.

Perhaps the information is faulty, and I hope it is; it isn't fair to their best customers who buy multiple cards, and it's disappointing to me as someone who just got an NV card.

But he is more qualified than anyone else who has spoken about SLI so far. He was the author of the optimization guide for JC3, so he had access to the game early and did some type of research into whether SLI can work before doing his testing and publishing the article. Until someone more qualified says something to clear it up, he represents NV's stance on SLI compatibility in this game.
 
Game GPU states that both Mad Max and JC3 use Avalanche Engine 3.0. SLI works in Mad Max so why wouldn't it work in JC3?

Good question, send an email to Andrew Burnes 🙂. I hope he is wrong. I have to think it's technically possible, especially since the engine is shared with an SLI-compatible game. But I think this is just NV being lazy about creating an SLI profile.
 
Game GPU states that both Mad Max and JC3 use Avalanche Engine 3.0. SLI works in Mad Max so why wouldn't it work in JC3?

You could say the same about Arkham Knight and the previous Arkham games, though to a lesser degree. Perhaps SLI becomes more difficult to implement as the scale of games increases?
 
[attached benchmark charts]


From: http://www.computerbase.de/2015-12/just-cause-3-grafikkarten-benchmarks/

Note that SweClockers used old Nvidia drivers; this one is with the Game Ready drivers. Not that it would affect AMD though.

Since the 390X has higher clocks than the Fury, it has a slight ROP advantage. But honestly, given the shader and memory bandwidth deficit, it doesn't make sense for the 390X to beat the Fury except where the 4GB barrier is reached, which certainly isn't the case in this game. PC Games Hardware has the Fury slightly over the 390X: http://www.pcgameshardware.de/Just-Cause-3-Spiel-9784/Specials/Test-Benchmarks-1179397/
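A quick back-of-the-envelope comparison makes the point; the specs below are the commonly listed reference numbers for the two cards (assumed here, check reviews for board-specific clocks):

```python
# Theoretical throughput of the R9 390X vs the R9 Fury (non-X),
# using assumed reference specs: shaders, core clock (MHz), ROPs, bandwidth (GB/s).
cards = {
    "R9 390X": {"shaders": 2816, "clock_mhz": 1050, "rops": 64, "bw_gbs": 384},
    "R9 Fury": {"shaders": 3584, "clock_mhz": 1000, "rops": 64, "bw_gbs": 512},
}

for name, c in cards.items():
    # FP32 throughput: shaders * 2 ops per clock (FMA) * clock
    tflops = c["shaders"] * 2 * c["clock_mhz"] / 1e6
    # Pixel fillrate: ROPs * clock
    gpix = c["rops"] * c["clock_mhz"] / 1e3
    print(f"{name}: {tflops:.2f} TFLOPS, {gpix:.1f} Gpix/s, {c['bw_gbs']} GB/s")
```

With those numbers the 390X only wins on pixel fillrate (67.2 vs 64.0 Gpix/s, purely from the clock difference), while the Fury has a clear lead in shader throughput and memory bandwidth, which is why a 390X > Fury result looks anomalous.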

Personally I am running 1440p with FXAA and everything maxed out except LOD at High and Bokeh off. FYI, I have the global AMD driver setting for maximum tessellation at 8x, as always. Usually mid 50s through mid 70s, most commonly around 58-63. Heavy explosive action brings it into the 40s.

Minor bug? If I alt-tab, it caps my framerate at 60fps unless I change the resolution to something else and then back to 1440p. Annoying.
 
Looks like really poor Fury X optimization; the 390X is managing to keep up with the 980 at 1440p, so this game isn't NV-biased like other recent GW titles.

This means AMD needs to step up and optimize for Fury owners.
 
It's CPU limited / AMD CPU overhead. That's why the Fury X has almost the same performance as the 390X at 1080p.
Google translation:
Just Cause 3's performance shows an exciting because unusual picture. At 1920x1080 the fastest AMD graphics cards have to contend with running into the CPU limit.
 
SMAAT2X looks very bad to me. It's less noticeable in screenshots, but in motion it looks atrocious. Here are some pics I took. For all comparisons, Global Illumination and Bokeh (ex1 ex2) are off with LOD set to High, for performance and preference reasons.

For the first comparison, pay no attention to the tiny bit of a tree you can see peeking out from behind the rock in the top left. It was in between LOD rendering.

FXAA: http://abload.de/img/testfxaa9npuz.png

SMAA: http://abload.de/img/testsmaafgq3x.png

SMAAT2X: http://abload.de/img/testsmaa22poia.png

Here's a comparison just between SMAAT2X and FXAA. This really shows the strange, ugly effect SMAAT2X has, very noticeable in the leaves in the top right.

FXAA: http://abload.de/img/fxaaandgioff2vsssy.png

SMAAT2X: http://abload.de/img/smaa2bdskp.png

Three-way comparison with a town.

FXAA: http://abload.de/img/cityfxaaxqpp5.png

SMAA: http://abload.de/img/citysmaa5eqzx.png

SMAAT2X: http://abload.de/img/citysmaa2i2rlc.png

I originally went with FXAA, but I'll try SMAA for a while. FXAA makes the textures awfully soft for me, but SMAA has more shimmering in motion.
 
If that's the case then screw that game for sure. Also screw buying two cards next gen because too many games don't even support SLI. I want my SLI to work every time!

-gamer whining complete-

That's not whining...that is a legit gripe.
 
I think it's just a case of devs not doing it properly, like the lack of CF/SLI support on an engine that's proven to support it and scales excellently.
 
And again Gameworks = bad port.


Mad Max (no Gameworks) uses the same engine and has stellar performance and multi-GPU support.

But the console version is running like crap and is plagued with problems as well, so your bad port theory sucks.
 