[gamegpu] Far Cry 4 performance


raghu78

Diamond Member
Aug 23, 2012
There isn't any in-game benchmark, is there? I just took a look and couldn't find one. That might explain why some sites got very different numbers and performance is basically all over the place.

[Image: farcry4-bench1.jpg (Far Cry 4 benchmark chart)]

That's an irrelevant test, as newer drivers and patches have changed the situation. HardOCP used the latest 1.6 patch with Nvidia's 347.07 WHQL and AMD's 14.12 Omega drivers.

Anyway, there are plenty of FC4 tests showing the Kepler cards losing to GCN cards: the R9 290 beats the GTX 780, and the R9 280X does the same against the GTX 770. Enhanced Godrays does hit older AMD cards harder on performance; the R9 285 seems to handle it better thanks to the improved tessellation unit in Tonga (GCN 1.2).

http://www.techspot.com/review/917-far-cry-4-benchmarks/
http://www.sweclockers.com/artikel/19647-snabbtest-grafikprestanda-i-far-cry-4
http://www.purepc.pl/karty_graficzn...e_test_kart_graficznych_i_procesorow?page=0,5
http://pclab.pl/art57559-6.html
 

raghu78

Diamond Member
Aug 23, 2012
But the 285 is just utterly terrible compared to the 280X in that benchmark. What's the explanation?

Maybe memory bandwidth is holding the card back, as HardOCP suggests. The stock core clock on the R9 285 is 918 MHz, while the R9 280X runs at 1000 MHz. Matching the core clocks and raising the memory clock to 6 GHz on the R9 285 would help close the gap. Still, the R9 280X will be faster until Enhanced Godrays is applied, at which point the R9 285 could pull ahead.

http://www.hardocp.com/article/2015/01/07/far_cry_4_video_card_performance_review/7#.VLJQrskVJ8E

"The AMD Radeon R9 285, being based on newer technology did better than we thought, but ultimately is held back in performance we think due to its memory bandwidth. What is interesting is that we were able to run with the highest Godray setting of Enhanced on the R9 285. We found that selecting between "Volumetric Fog" and "Enhanced" did not change performance much at all. There was only a few FPS difference. It is as if Enhanced Godrays just don't burden this GPU much.

This actually makes a lot of sense. You see, the R9 285 (Tonga) received some tessellation performance improvements in that new iteration of GCN. You can refer back to this table where we outlined all the improvements in each iteration of GCN. You will notice that improved tessellation is one of the upgrades with Tonga. Since Godrays are tessellation based, this newer architecture does very well with it compared to any other AMD GPU. Even R9 290X cannot touch how well R9 285 is doing tessellation now. The R9 285 also uses a new color compression scheme which also improves performance.

All of these advancements propels R9 285 to allowing Enhanced Godrays to be playable at 1080p with that video card. Not bad for a $230 video card. However, due to the memory bandwidth, we did have to set all the in-game settings to "High" quality just like the GTX 770. So while it is buffed in performance in one area, the rest of the specs hold it back in other areas for this game."
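To put rough numbers on the bandwidth theory, here's a quick back-of-the-envelope sketch using the cards' reference memory specs (256-bit at 5.5 GHz effective for the R9 285, 384-bit at 6 GHz for the R9 280X); the little helper below is just illustrative arithmetic, not anything from the reviews:

```python
# Peak GDDR5 bandwidth = (bus width in bits / 8 bytes) * effective data rate
def bandwidth_gbs(bus_width_bits: int, effective_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_rate_gbps

r9_285  = bandwidth_gbs(256, 5.5)   # reference R9 285:  176 GB/s
r9_280x = bandwidth_gbs(384, 6.0)   # reference R9 280X: 288 GB/s

print(f"R9 285 : {r9_285:.0f} GB/s")
print(f"R9 280X: {r9_280x:.0f} GB/s ({r9_280x / r9_285 - 1:.0%} more)")
```

The 280X ends up with roughly 64% more raw bandwidth, which is consistent with HardOCP's theory that memory bandwidth, not the core, is what limits the 285 here, and with why overclocking the 285's memory toward 6 GHz narrows the gap.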
 

RussianSensation

Elite Member
Sep 5, 2003
FC4 alone doesn't look like an outlier given Kepler's recent poor form. Just compare the 970/980 launch reviews with recent reviews and see where the 780 Ti/Titan/780/680/770 land. They are clearly underperforming in modern games. Considering NV has a history of a strong driver team and support, are we supposed to believe the 280X can approach Titan/780 performance because AMD found some miracle performance nearly three years after Tahiti's launch? I'm not buying that.

There have been way too many games now where the 280X is on the heels of a 780 and the 680/770 are getting a pounding. That tells me FC4 is not an outlier. If anything, the 780 and Titan should have aged better against the 280X/7970 GHz, given their superior tessellation performance and how many AAA games are GW titles. Instead, in TPU's latest reviews the 780 is barely faster than a 7970 GHz. I remember people, and even HardOCP, stating that a Titan was nearly as fast as 7970s in CrossFire. That's wishful thinking now: the Titan is only 21% faster than the 7970 GHz at 1440p, the 780 is shockingly just 13% faster, and the 780 Ti is just 2.4% faster than a reference, throttling 290X!

OK, I realize that the 780/780 Ti and Titan have massive overclocking headroom. However, their current stock performance is disappointing given their crazy-high launch prices:

http://www.techpowerup.com/mobile/reviews/Gigabyte/GeForce_GTX_980_G1_Gaming/27.html

NV's voltage locking for overclocking and its VRAM-gimped SKUs (570/580/670/680/770) all hint that NV doesn't really want to design cards to last. I think it's fair to say in hindsight that the 7970/7970 GHz was a much better product overall than the 680. GK104 won the popularity battle, but Tahiti XT won the war. Had I purchased 680 2GB SLI, I would have been forced to upgrade already.

---

As far as the 285 goes, while it's a fail as an overall product IMO, as a testbed it has addressed three major areas of weakness for GCN: color fill-rate, memory bandwidth efficiency, and tessellation performance. With those three aspects improved, AMD can focus on shaders/ALUs, TMUs, cache/IPC, HBM, and power efficiency. It's a lot easier for the 300-series team to tackle a new GPU when there are fewer areas left to improve.
 

AtenRa

Lifer
Feb 2, 2009
Hell, that is laughable. NV had better fix the drivers; this is a GameWorks title, after all.

[Image: 1420574520CH9QmTVFND_6_3.gif (Far Cry 4 benchmark chart)]

Edit: unless the game at those settings is using more than 3 GB of VRAM.
 

RussianSensation

Elite Member
Sep 5, 2003
^ When most 2014 games were unoptimized console ports, how else do you entice your userbase to upgrade?

Interesting that the latest FC4 testing shows that with NV's GW features turned on, the 980 leads the 290X by 25%. With them off and everything on Ultra, the 980's advantage drops to just 5%!

Looks like NV's GW kills two birds with one stone: (1) since NV has access to the game months before launch, it gets to choose which architecture to optimize for (in this case Maxwell over Kepler), forcing earlier upgrades to Maxwell, which is one of the main drivers of GW; and (2) it makes NV's cards seem way faster than AMD's by providing GW-specific, NV-optimized SDK source code directly to the developer. Well played, NV.

Also, it looks like the Dunia engine stuttering is still present in FC4 after being a nuisance in FC3. Yet another failed Ubisoft game engine. FC4 would have been a much better technical feat had it been built on UE4 or CryEngine. Too bad Ubisoft only cares about sales, not pushing the boundaries of PC gaming. It's rather shocking how little FC4 improved technically over FC3 in two years' time.

In two years DICE went from BF3 to BF4 and Crytek went from Crysis 2 to Crysis 3, both huge leaps in graphics. FC4, on the other hand, is FC3.5, and many found it less fun too.
 

AtenRa

Lifer
Feb 2, 2009
Personally, I don't have a problem with NV-optimized games; AMD does the same. What I have a problem with is the GTX 780's performance in a GameWorks title.

I could understand the GTX 970 (Maxwell) being that much faster than the GTX 780 (Kepler) when the game is optimized for the Maxwell architecture, but losing by that much to AMD's Hawaii-based R9 290, not even the 290X, is laughable.
 

ShintaiDK

Lifer
Apr 22, 2012
Personally, I don't have a problem with NV-optimized games; AMD does the same. What I have a problem with is the GTX 780's performance in a GameWorks title.

I could understand the GTX 970 (Maxwell) being that much faster than the GTX 780 (Kepler) when the game is optimized for the Maxwell architecture, but losing by that much to AMD's Hawaii-based R9 290, not even the 290X, is laughable.

Remember, it's a non-Ti card. The 290 and 970 were always faster than the GTX 780; the Ti version is 20-30% faster.
 

RussianSensation

Elite Member
Sep 5, 2003
AtenRa, there are only a handful of examples where AMD-optimized games are damaging to NV -- the compute-based global illumination in Dirt Showdown and TressFX in Tomb Raider. OTOH, GW's aim is to encourage developers to include as many GW SDK graphical effects as possible, all code written in-house by NV specifically to favour NV. The more of those features NV can entice a developer to include, the more last-gen NV GPUs get crippled and, as a side effect, the bigger the performance hit on AMD cards -- all of which encourages sales of new-generation NV cards. How many GE titles do you know where AMD provides SDK source code for HBAO, AA, shadows, the lighting model, tessellation, etc.?

Why is it that Maxwell performs well in GE titles -- DAI, Sniper Elite 3, Tomb Raider, Civ:BE -- but the minute we talk about GW titles like MGS or AC Unity, performance on AMD bombs? You think this is some kind of coincidence?

I mean, just think about it -- a tessellation "patch" post-launch for AC Unity and Crysis 2. WTH! How do you just "patch" in major new graphical features of this type after launch, when the game takes two years to design? If the developer had wanted tessellation in there to depict the game world, they would have used an advanced game engine or coded it in from the beginning.

It's pretty obvious from the graphics of FC4, Watch Dogs, and Unity that each of those games was just a PS4/XB1 console port with GW thrown in during the last 25% of development. It's no wonder Ubisoft has such a hard time optimizing their GW titles to run smoothly, because they didn't write the majority of the code for those advanced features.
 

RussianSensation

Elite Member
Sep 5, 2003
Remember, it's a non-Ti card. The 290 and 970 were always faster than the GTX 780; the Ti version is 20-30% faster.

You are not seeing the big picture.

Far Cry 4 (HardOCP) - 290 is 29% faster than a 780.
DAI (TechSpot) - 290 ties a 780Ti and beats the 780 by 16%

http://www.techspot.com/review/921-dragon-age-inquisition-benchmarks/page4.html

Shadow of Mordor (TPU) - 290 beats the 780 Ti and smokes the 780 by 30%!
http://www.techpowerup.com/mobile/reviews/MSI/GTX_970_Gaming/20.html

Dead Rising 3 - 280X beats a 780.
http://www.techpowerup.com/mobile/reviews/MSI/GTX_970_Gaming/15.html

There are many more examples. NV has shown in the last 6 months that it doesn't care about Kepler.

780 and 290 were just 1-4% apart at 290's launch:
http://www.techpowerup.com/mobile/reviews/AMD/R9_290/26.html

There is a reason so many gamers were cross-shopping the 290 and the 780. Clearly the 780 is now a total write-off compared to an aftermarket 290, while the 670/680/770 2GB are on life support.
 

ShintaiDK

Lifer
Apr 22, 2012
PCSS, for example, only works on Nvidia.

[Image: GameWorks-Far-Cry-4-6.jpg (Far Cry 4 GameWorks PCSS comparison)]

The only way to make it work on AMD is to manually edit gamerprofile.xml, and that carries a large performance penalty, while the game enables it automatically for Nvidia.
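For what it's worth, here's a hypothetical sketch of automating that edit with Python's stdlib ElementTree. The path is Far Cry 4's usual profile location, but the attribute name and value below are assumptions for illustration; check your own GamerProfile.xml for the real ones before changing anything:

```python
# Hypothetical sketch: toggle a shadow setting in Far Cry 4's GamerProfile.xml.
# The attribute "ShadowQuality" and the value "softshadow" are assumed for
# illustration; inspect your own file for the actual names.
import xml.etree.ElementTree as ET
from pathlib import Path

profile = Path.home() / "Documents" / "My Games" / "Far Cry 4" / "GamerProfile.xml"

tree = ET.parse(profile)
for elem in tree.getroot().iter():
    if "ShadowQuality" in elem.attrib:           # assumed attribute name
        elem.set("ShadowQuality", "softshadow")  # assumed PCSS-style value
tree.write(profile, xml_declaration=True, encoding="utf-8")
```

Back up the file first; the game may rewrite or reject values it doesn't expect.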
 

skipsneeky2

Diamond Member
May 21, 2011
With all the random games tossed into this epic GPU struggle, like a disaster people want to forget, I present the broken Wolfenstein: The New Order... at 900p and 1080p it's pretty much as silly as it can get, with the 290 losing to the 770 by nearly 10 fps at both resolutions.

http://www.techpowerup.com/mobile/reviews/MSI/GTX_970_Gaming/23.html Even more fun is just looking at the 290/970, because I never see this one posted. :awe: It's like, forget it!
 

cmdrdredd

Lifer
Dec 12, 2001
Hell, that is laughable. NV had better fix the drivers; this is a GameWorks title, after all.

[Image: 1420574520CH9QmTVFND_6_3.gif (Far Cry 4 benchmark chart)]

Edit: unless the game at those settings is using more than 3 GB of VRAM.

Far Cry 4 at 1440p with those settings uses around 3.5 GB on my GTX 970. Does memory usage go over the hardware limits on the 2 GB/3 GB cards and tank their performance?
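One way to actually check is to log VRAM usage while playing. Below is a minimal sketch using NVIDIA's NVML Python bindings (the nvidia-ml-py package); the tooling choice is my suggestion, not something from the reviews:

```python
# Log VRAM usage every 5 seconds via NVML (NVIDIA cards only).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()
```

Once the working set spills past physical VRAM, textures get swapped over PCIe and frame times tank, which would explain 2 GB/3 GB cards falling off a cliff at these settings.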
 

el etro

Golden Member
Jul 21, 2013
With all the random games tossed into this epic GPU struggle, like a disaster people want to forget, I present the broken Wolfenstein: The New Order... at 900p and 1080p it's pretty much as silly as it can get, with the 290 losing to the 770 by nearly 10 fps at both resolutions.

http://www.techpowerup.com/mobile/reviews/MSI/GTX_970_Gaming/23.html Even more fun is just looking at the 290/970, because I never see this one posted. :awe: It's like, forget it!

It's an OpenGL game, and Nvidia's drivers crush it in OpenGL.
 

skipsneeky2

Diamond Member
May 21, 2011
It's an OpenGL game, and Nvidia's drivers crush it in OpenGL.

It's a game on my backlog that I want to buy after I upgrade :) I've never really had much experience with OpenGL; it's more that it's a special case, and not broken, which is good. :biggrin: I hear so little about OpenGL that it's like unicorns when it lands in newer games?

Pure FPS in these games interests me quite a bit, but when it comes to technical details the stuff goes right over my head. This thread needs more charts. :biggrin:

I've been using the games I play most as the basis for deciding which GPU to upgrade to. FC4 is one game on my list that I want to play, and I guess these 1440p numbers translate to peasant resolutions like 1080p? At 4K I guess AMD dominates?
 

cmdrdredd

Lifer
Dec 12, 2001
It's a game on my backlog that I want to buy after I upgrade :) I've never really had much experience with OpenGL; it's more that it's a special case, and not broken, which is good. :biggrin: I hear so little about OpenGL that it's like unicorns when it lands in newer games?

Pure FPS in these games interests me quite a bit, but when it comes to technical details the stuff goes right over my head. This thread needs more charts. :biggrin:

I've been using the games I play most as the basis for deciding which GPU to upgrade to. FC4 is one game on my list that I want to play, and I guess these 1440p numbers translate to peasant resolutions like 1080p? At 4K I guess AMD dominates?

HardOCP has a good article on Far Cry 4. http://www.hardocp.com/article/2015/01/07/far_cry_4_video_card_performance_review/1#.VLKTHnvkgQM

Each review uses a different scene to benchmark, so they aren't comparable. HardOCP also lists the highest settings that gave playable frame rates, so you get an idea of what to expect in actual gameplay where you want the game to run smoothly. For game performance articles, I find this a superior method.
 

Erenhardt

Diamond Member
Dec 1, 2012
With all the random games tossed into this epic GPU struggle, like a disaster people want to forget, I present the broken Wolfenstein: The New Order... at 900p and 1080p it's pretty much as silly as it can get, with the 290 losing to the 770 by nearly 10 fps at both resolutions.

http://www.techpowerup.com/mobile/reviews/MSI/GTX_970_Gaming/23.html Even more fun is just looking at the 290/970, because I never see this one posted. :awe: It's like, forget it!

This is puzzling...

[Image: wolfenstein_2560.png (Wolfenstein: The New Order benchmark, 2560x1440)]

Even PCLab doesn't confirm those numbers. Do you have any other source that backs up those strange results?

Hard to believe the R9 290 doesn't manage a 60 fps average at full HD. Oh well.
 

skipsneeky2

Diamond Member
May 21, 2011
HardOCP has a good article on Far Cry 4. http://www.hardocp.com/article/2015/01/07/far_cry_4_video_card_performance_review/1#.VLKTHnvkgQM

Each review uses a different scene to benchmark, so they aren't comparable. HardOCP also lists the highest settings that gave playable frame rates, so you get an idea of what to expect in actual gameplay where you want the game to run smoothly. For game performance articles, I find this a superior method.

Yeah, from what I can gather between the 290/970, you could set Godrays to Volumetric on the 970 and pull nearly the same avg/min as the 290 gets without them enabled. The whole Kepler debate going on now doesn't matter to me as someone upgrading from a 770. :)

There are like four settings for Godrays in this game? Holy hell, forget High vs. Ultra in BF4; this is major-league Google-search territory for me. :biggrin: I'm entirely in the dark on the Godrays setting. I believe it's lighting?
 

KaRLiToS

Golden Member
Jul 30, 2010
HardOCP has a good article on Far Cry 4. http://www.hardocp.com/article/2015/01/07/far_cry_4_video_card_performance_review/1#.VLKTHnvkgQM

Each review uses a different scene to benchmark, so they aren't comparable. HardOCP also lists the highest settings that gave playable frame rates, so you get an idea of what to expect in actual gameplay where you want the game to run smoothly. For game performance articles, I find this a superior method.

This is from your linked review:

Unfortunately we cannot test CrossFire because CrossFire doesn't work yet in this game. That's right, 50 days of the game being out, almost 2 months, and no CrossFire support. CrossFire has been disabled on purpose by AMD because it states that Ubisoft needs to make client side updates for CrossFire to work properly. We reached out to AMD to ask why CrossFire was disabled, and the response was as such:

"We have currently disabled CrossFire support for Far Cry 4, as the developer must make additional client-side updates before mGPU will provide a smooth experience."

I'm really disappointed :thumbsdown:
At this point, we don't even know whose fault this is.
 

skipsneeky2

Diamond Member
May 21, 2011
Godrays are stunning; I can see the appeal of wanting those on. Some Google images really just shine; I've never seen something so beautiful. I'm still not sure about the whole Volumetric vs. Enhanced thing yet, but I've got the basic idea down.

Godrays must be a performance-tanking setting, like MSAA or the like?
 

cmdrdredd

Lifer
Dec 12, 2001
Yeah, from what I can gather between the 290/970, you could set Godrays to Volumetric on the 970 and pull nearly the same avg/min as the 290 gets without them enabled. The whole Kepler debate going on now doesn't matter to me as someone upgrading from a 770. :)

There are like four settings for Godrays in this game? Holy hell, forget High vs. Ultra in BF4; this is major-league Google-search territory for me. :biggrin: I'm entirely in the dark on the Godrays setting. I believe it's lighting?


Yeah, it has to do with the lighting and the glow through foliage from the sun. It was noticeable when I did a back-and-forth comparison, and the performance hit didn't seem that high when I turned it on. Honestly, there is no reason in my mind for it to be a locked-out feature for AMD users. Oh well.



This is from your linked review:



I'm really disappointed :thumbsdown:
At this point, we don't even know whose fault this is.


Yeah, I saw that, and I dunno. I always thought drivers made CrossFire and SLI work. I have no idea what needs to be done on the developer end; I didn't think that mattered a whole lot.
 

RussianSensation

Elite Member
Sep 5, 2003
With all the random games tossed into this epic GPU struggle, like a disaster people want to forget, I present the broken Wolfenstein: The New Order... at 900p and 1080p it's pretty much as silly as it can get, with the 290 losing to the 770 by nearly 10 fps at both resolutions.

http://www.techpowerup.com/mobile/reviews/MSI/GTX_970_Gaming/23.html Even more fun is just looking at the 290/970, because I never see this one posted. :awe: It's like, forget it!

Except TPU's WTNO results are an outlier compared with everyone else's. That's why it's hard to take TPU's WTNO numbers seriously, with the 770 beating the 290 by 10 fps.

1. http://us.hardware.info/reviews/547...sted-on--32-gpus-benchmarks-full-hd-1920x1080
and
http://us.hardware.info/reviews/547...on--32-gpus-benchmarks-ultra-hd--4k-3840x2160

2.
[Images: HardOCP Wolfenstein: The New Order benchmark charts]

http://www.hardocp.com/article/2014/05/21/wolfenstein_new_order_performance_review/4#.VLKwXHvCcc4

3.
[Images: gamegpu Wolfenstein: The New Order benchmarks at 2560x1440 and 3840x2160]

http://gamegpu.ru/action-/-fps-/-tps/wolfenstein-the-new-order-test-gpu.html

4.
[Image: PCGH Wolfenstein: The New Order 1080p benchmark]

http://www.pcgameshardware.de/Wolfe...enstein-The-New-Order-Test-Benchmark-1121737/

The poor performance of Kepler cards in modern games isn't an outlier; it's common in reviews across the net.

Far Cry 4 at 1440p with those settings uses around 3.5 GB on my GTX 970. Does memory usage go over the hardware limits on the 2 GB/3 GB cards and tank their performance?

Kepler's horrible performance in FC4 is well documented and corroborated by more than one reputable site. The 680/770/780/Titan all perform poorly for their segment, even at 1920x1200. A 270X is right on the heels of a 770.


[Image: r1920.png (Far Cry 4 benchmark at 1920x1200)]



Yeah, it has to do with the lighting and the glow through foliage from the sun. It was noticeable when I did a back-and-forth comparison, and the performance hit didn't seem that high when I turned it on. Honestly, there is no reason in my mind for it to be a locked-out feature for AMD users. Oh well.

Ya, well that's GW for you.

The Crew - HardOCP review:

"However, HBAO+ will only work on NVIDIA GPUs in this game. AMD GPUs like the AMD Radeon R9 290X an 290 will not be able to use HBAO+, it doesn't even show up in the graphics settings. This is odd because we know HBAO+ itself is vendor agnostic. Far Cry 4 allows you to run it both AMD and NVIDIA GPUs with no issues. Therefore, the developers have artificially locked out HBAO+ to AMD GPU users, which we do not like at all."
http://www.hardocp.com/article/2014/12/15/crew_performance_video_card_review#.VLK04XvCcc4


IMO, I would ban both GE and GW. It's become a game of who throws more money and engineers at developers, instead of developers optimizing the game engine and the game for PC gamers. I find both of these programs alienate PC gamers instead of unifying the gaming experience, especially GW with its locked-out features. In the past, games were made vendor-agnostic to provide a similarly great experience regardless of whether you owned an ATI or NV card. If ATI didn't support SM3.0, that was a hardware limitation; the restriction wasn't imposed by the game developer.

I don't like the idea of one firm with more financial resources essentially paying its way to victory. Imagine if Intel started paying developers hundreds of millions of dollars to optimize most games for Intel CPUs and GPUs; how fair would that be? The developer should solely decide how to make a game. If they want to hire talented programmers from NV/AMD/Intel, that's perfectly acceptable too, but taking code directly from a vendor is almost like sponsorship/marketing.
 