Ghost Recon Wildlands Benchmarks (Release!) [PCGH]

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Fury X looks pretty bad; I'm not sure if it has to do with the per-game optimization AMD said is required. The game otherwise looks extremely demanding.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Fury X and 780 with poor showings here. Seems optimized for Maxwell foremost, as Maxwell cards do a bit better here vs Pascal and GCN than they do on average. But they don't use reference-clocked cards here, so YMMV.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Yeah, the engine heavily favors Nvidia hardware, which makes sense given how closely they worked together on it. Sad to see how low performance is even without all those advanced GameWorks features... with those on, performance will tank :(
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Can't really say whether the problem is the engine favoring NVidia hardware excessively. The game is still a console title first and foremost, which means it should have been optimized for GCN as well. I think the problem is more that it has very high amounts of detail and lots of tessellation in use. We should all know by now that AMD's DX11 driver isn't as efficient as NVidia's and is more subject to being bogged down due to CPU overhead, and also that AMD's tessellation performance is still two or more generations behind NVidia's.

That said, the game will very likely receive a DX12 patch in the future, which should boost performance significantly for AMD. That's just speculation on my part, but Ubisoft seems to be investing heavily in their AnvilNext engine, so I think it will eventually support DX12. Ironically, this is the same engine they used to create Assassin's Creed Unity, but unlike this game, that one didn't support tessellated environments. So I suppose they finally fixed whatever issue was preventing tessellation from being used heavily on surfaces.
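
To put the CPU overhead point into a toy model (all numbers here are hypothetical, purely to illustrate the scaling, not measured driver costs):

```python
# Toy model: if the driver burns t_draw ms of CPU time per draw call,
# CPU-side submission alone caps the achievable frame rate.
def cpu_capped_fps(draw_calls: int, t_draw_ms: float) -> float:
    return 1000.0 / (draw_calls * t_draw_ms)

# Hypothetical: same scene, one driver needing 50% more CPU time per draw call.
print(f"{cpu_capped_fps(10_000, 0.0010):.0f} fps cap")  # 100 fps
print(f"{cpu_capped_fps(10_000, 0.0015):.0f} fps cap")  # ~67 fps
```

Same GPU grunt, but the heavier per-draw cost alone knocks a third off the frame rate cap — which is why a game with lots of onscreen objects would hurt AMD's DX11 driver more.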
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
We should all know by now that AMD's DX11 driver isn't as efficient as NVidia's and is more subject to being bogged down due to CPU overhead

Or maybe the game engine just isn't good at feeding the hardware data? There are many DX11 games that run just as well or even better on AMD hardware in DX11. Heck, you were even saying that Hitman favors AMD even in DX11. There are other recent examples such as RE7, For Honor, and Titanfall 2.

Then there are other games where AMD hardware is heavily throttled, like Dishonored 2, this game, and others.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I listed two specific possible reasons why AMD hardware isn't performing well in GRW: high amounts of tessellation, and high amounts of draw calls. Both have been weaknesses for AMD for years relative to NVidia. The games you listed don't use copious amounts of tessellation and draw calls like GRW does, so I don't even know why you brought them up.

After all, I'm not saying AMD has bad DX11 performance in general. I'm specifically talking about DX11 games with lots of onscreen objects, which would exacerbate their CPU overhead issue in DX11.
 
  • Like
Reactions: Sweepr

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Checking out the Guru3d review. Seems about the same, with general AMD under-performance at the popular resolutions.

I hope Vega's NCU fixes their CPU/driver problem. Look at 1080p: Nvidia cards with significant TFLOP deficits easily beat their AMD counterparts. 980 = Fury X.

Then look at 4K. It's all GPU grunt there. Suddenly the 390 = 980 (a reference 980 boosts to about the same TFLOPs as a reference 390), the Fury X pulls ahead (though it almost never realises its TFLOP potential properly), and the 480 finally beats the 1060, although the latter still has better TFLOP efficiency. But all of these cards are unplayable at these settings anyway.

I'm not pointing out anything new here, just ranting over wasted potential. Even games that seem vendor-unbiased at FHD or QHD often switch to AMD-favoured at 4K. This title clearly favours Nvidia compared to the average game. But AMD cards not reaching their 4K performance tiers at lower resolutions is an issue AMD owns.

Hopefully AMD figures out how to extract this performance at FHD, or at least QHD, with Vega. But we saw AMD cards perform far better at higher resolutions back in the VLIW vs Fermi days, so GCN and newer drivers have yet to solve it.
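
As a rough sanity check on those TFLOP comparisons (shader counts are the official specs; the clocks are approximate reference/boost values, so treat the outputs as ballpark figures):

```python
# Single-precision throughput: 2 FLOPs (one FMA) per shader per clock.
def tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000

for name, shaders, clock in [
    ("GTX 980 (ref boost ~1.22 GHz)", 2048, 1.22),
    ("R9 390 (1.00 GHz)",             2560, 1.00),
    ("Fury X (1.05 GHz)",             4096, 1.05),
]:
    print(f"{name}: {tflops(shaders, clock):.1f} TFLOPs")
# ~5.0, ~5.1 and ~8.6 TFLOPs: the 980 and 390 land together, while the
# Fury X's ~70% paper advantage only shows up once the GPU is the bottleneck.
```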
 

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
Overall performance is lower than in the Open Beta. Lowering some options barely affects framerate at all, unlike before. The texture streaming problem from the Closed Beta has returned (it was fixed in the Open Beta, but Ubisoft is Ubisoft).

There are users reporting 100% CPU usage on their i5s and i7s. I haven't seen overall or single-core usage go above 80%, but CPU usage sits constantly around 60-70% across all cores. This game needs many fast cores, even at UHD.

Would love to see more CPU related benchmarks.

At UHD with the framerate unlocked, I am getting about 10 fps lower than in the Open Beta. 60 fps is still easily achievable, but I can't use the same graphical presets I used in the Open Beta. I hope a new driver or update is released to restore performance to previous levels.

The same annoying bugs from both betas are still in the game. I have a feeling it will stay in a beta state for another year or more. My top three annoying bugs are:

- Missile lock audio bug (restarting the game is the only fix).
- Frozen chat box after entering a vehicle (restarting the game fixes it).
- AI seeing through thick vegetation to headshot you (lowering vegetation helps a little...)

Oh, and by design, certain weapons have incredibly short range. The bullet just vanishes after 120m.

Can't wait to play this game on a 2080 ti @ 4k120 on my 8 core i7 in 3 years.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136

Hilbert used the in-game benchmark, which isn't really indicative of actual gameplay, I think. PCgameshardware.de is the best review so far, as they benchmark the actual game itself, which partly explains why the RX480 is beating the Fury X in their review. To elaborate, the RX480's superior geometry processing allows it to pull ahead of the Fury X at 1080p.
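
A rough sketch of why Polaris can pull ahead on geometry, assuming the commonly cited specs (four geometry engines on both Fiji and Polaris 10, roughly one triangle per engine per clock, with Polaris adding a primitive discard accelerator on top):

```python
# Peak front-end triangle throughput scales with geometry engines * clock.
def peak_gtris(engines: int, clock_mhz: int) -> float:
    return engines * clock_mhz / 1000  # billions of triangles per second

print(f"Fury X: {peak_gtris(4, 1050):.1f} Gtris/s")  # ~4.2
print(f"RX 480: {peak_gtris(4, 1266):.1f} Gtris/s")  # ~5.1, before counting
# Polaris's primitive discard accelerator, which culls tiny/invisible
# triangles early and helps most in tessellation-heavy scenes like GRW.
```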
 
Last edited:

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
Was hoping these sorts of benchmarks would come out tomorrow and include the 1080 Ti. Is it just me, or is 4K gaming at high fps in AAA titles (this is an AAA title, right?) still a long way off?
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Was hoping these sorts of benchmarks would come out tomorrow and include the 1080 Ti. Is it just me, or is 4K gaming at high fps in AAA titles (this is an AAA title, right?) still a long way off?

You can game fine @ 4K in most games if you just turn down a few settings. "Maxing" games usually means halving frame rates for very little IQ gain.

Also, the TPU review does mention the 1080 Ti a few times:

With the Ultra preset chosen, the game demands the best graphics hardware, even to play at a fluid 1080p HD. Only the GTX 1080, Titan X Pascal, and GTX 1080 Ti are able to achieve 60 FPS. When you go to 1440p, the GTX 1080 drops below 50 FPS, leaving only the Titan X Pascal and 1080 Ti with playable framerates.

Sadly, they too used the in-game benchmark, which isn't representative of actual gameplay.
 
  • Like
Reactions: guachi

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
The difference in how the Fury X (or AMD cards) performs vs the GTX 980 Ti (NV cards) comes down to which GameWorks settings each test enabled.
For example, the pcgameshardware.de review had both God Rays and Turf Effects enabled, while Guru3D benchmarked with and without those features (Very High doesn't use God Rays and Turf, whereas Ultra uses both).
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
The difference in how the Fury X (or AMD cards) performs vs the GTX 980 Ti (NV cards) comes down to which GameWorks settings each test enabled.
For example, the pcgameshardware.de review had both God Rays and Turf Effects enabled, while Guru3D benchmarked with and without those features (Very High doesn't use God Rays and Turf, whereas Ultra uses both).

PCgameshardware.de tested with Very High settings enabled, which disables ultra textures, enhanced God rays and Turf FX. The game just has a ton of geometry, which is why the RX480 is ahead of the Fury X in the 1080p test.
 
  • Like
Reactions: Sweepr

dark zero

Platinum Member
Jun 2, 2015
2,655
138
106
Fury X looks pretty bad , I am not sure if it has to do with the individual game optimization required as mentioned by AMD.The game otherwise also looks to be extremely demanding.
Only AMD? NVIDIA is suffering here too! And this game seems to be an obvious beta. Better to wait 3 weeks to see this game properly patched.
 

Jackie60

Member
Aug 11, 2006
118
46
101
At 4K with Titan X Pascal SLI and a 5960X @ 4.6 GHz, I think the game looks pretty crap. With settings mostly at Ultra and God Rays/Turf Effects disabled, it just looks a mess and still struggles to maintain 60 fps. One thread is heavily loaded and the cards sit at about 80-85%, but I really can't understand why it's so demanding while looking so mediocre. After the 'amazing' image quality reported, I was really excited, but it just doesn't look good or real. I'm getting a refund after 14 minutes of gameplay; if two Titans or 1080 Tis can't make this game look good at 4K, then I'm not going to continue. Biggest eye-candy disappointment in a long time: it looks like a lazily ported console game with plenty of heavily aliased foliage and nothing pretty to look at.
 
  • Like
Reactions: Bacon1

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Wow look at the 1060 3GB!

[benchmark chart: GTX 1060 3GB vs RX 470, average and minimum fps]


Getting beaten by the 470 in an Nvidia-favored game, and its minimums fall MUCH further below its average than the 6GB 1060's do. Hell, even the 960 has a smaller gap! That is crazy.

Has there ever been a GPU that was dead on arrival quite like the 3GB 1060? Every single person here who called it out as a poor choice is being vindicated one modern title at a time.
 
  • Like
Reactions: Bacon1 and ZGR

dark zero

Platinum Member
Jun 2, 2015
2,655
138
106
Lol... the 1060 3GB didn't deserve to be called a 1060 after all.

Even the RX 460 has a purpose, and that is to be a fanless card.
 
  • Like
Reactions: kawi6rr

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Has there ever been a GPU that was dead on arrival quite like the 3GB 1060? Every single person here who called it out as a poor choice is being vindicated one modern title at a time.

Ouch, 47% higher minimums and 24% higher averages on the 6GB version.

Nice to see they updated the CPU benchmarks. Looks like no gain beyond 4C/8T for Intel; 4C/4T has some dips but is pretty close overall at 4 GHz.

Underclocked, however, the 10C/20T chip still performs amazingly @ 2 GHz, so the game does use threads very well overall.

GameWorks settings hit even the 1080 Ti pretty hard @ 1080p:

51/56.8 fps -> 65/73.6 fps when turned off, i.e. roughly 30% performance gained from turning them off.
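
Quick check of that math, assuming those pairs are min/avg fps as PCGH usually reports them:

```python
# 1080 Ti @ 1080p, GameWorks on vs off (min fps, avg fps).
on_min, on_avg = 51.0, 56.8
off_min, off_avg = 65.0, 73.6

print(f"min fps gain: {off_min / on_min - 1:.1%}")  # ~27.5%
print(f"avg fps gain: {off_avg / on_avg - 1:.1%}")  # ~29.6%
```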
 

Samwell

Senior member
May 10, 2015
225
47
101
PCgameshardware.de tested with Very High settings enabled, which disables ultra textures, enhanced God rays and Turf FX. The game just has a ton of geometry, which is why the RX480 is ahead of the Fury X in the 1080p test.

PCGH also did a test with GameWorks on, and Nvidia cards lose more FPS than AMD's, but it's always the evil GameWorks, lol.

Just wait: if the new async-optimized GameWorks libraries perform better on AMD, maybe we'll see AMD fanboys insist on benching with GameWorks on :D
 
Last edited: