[H] Far Cry Primal performance review

Status
Not open for further replies.

tential

Diamond Member
May 13, 2008
7,355
642
121
Also, does anyone else find it HILARIOUS that he comes here to defend GameWorks, and not the quality of his review, which is the main issue in this topic right now? The quality of the actual analysis in HardOCP reviews sucks. The benches are great, and the testing methodology is different from other sites' and provides useful data points, but there is ZERO point in reading the text of the article.

But hey, Far Cry Primal looks worse than Far Cry 4... that sounds like a reviewer who knows his stuff when it comes to visual fidelity!
 

BrentJ

Member
Jul 17, 2003
135
6
76
www.hardocp.com
I'm just a gamer, and you know what, this is a very exciting time to be a gamer. With the new GPUs on the horizon, DX12 here now, and games that are coming this year and next, it is a very exciting time for a gamer indeed.

I'd rather have options I can disable if they're too demanding than not have them there at all. This gives the game room to grow into future GPUs: I can then enable those features, replay the game, and have an even better gameplay experience than I did initially. It gives us a reason to upgrade to new GPUs. I hope game devs take full advantage of DX12 and give us some awesome gameplay.

On my personal list this year is the next Deus Ex game, one of my favorite series. I friggin cannot wait.
 
Last edited:

thesmokingman

Platinum Member
May 6, 2010
2,307
231
106
You're not just a gamer, but you sure come off as biased. Gamers don't want ecosystems; we want games we can play without bias and without having to hack around tricks just to get decent, playable framerates.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
I'm just a gamer, and you know what, this is a very exciting time to be a gamer. With the new GPUs on the horizon, DX12 here now, and games that are coming this year and next, it is a very exciting time for a gamer indeed.

I'd rather have options I can disable if they're too demanding than not have them there at all. This gives the game room to grow into future GPUs: I can then enable those features, replay the game, and have an even better gameplay experience than I did initially. It gives us a reason to upgrade to new GPUs. I hope game devs take full advantage of DX12 and give us some awesome gameplay.

On my personal list this year is the next Deus Ex game, one of my favorite series. I friggin cannot wait.

What do you think about AMD's open-source approach towards gaming vs. the NVIDIA black-box approach?
 

tential

Diamond Member
May 13, 2008
7,355
642
121
You're not just a gamer, but you sure come off as biased. Gamers don't want ecosystems; we want games we can play without bias and without having to hack around tricks just to get decent, playable framerates.
Why not? It's not like he said "I'm an objective reviewer".

Seems like a fine comment to me.

If you want to approach reviews as a for-fun gamer, be my guest.
 

96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
OK, what are we talking about in this thread?

Complainers are complaining that Brent said he expects more demanding features in games like Far Cry Primal, something GW provides. I know, I know, you hate GW and blah blah blah; nobody cares.

Did I miss anything, or are you too dense to understand it yourself?


You need to stop attacking other members.

-Rvenger
 
Last edited by a moderator:

ImpulsE69

Lifer
Jan 8, 2010
14,946
1,077
126
More demanding features? I'd just like decent and somewhat original gameplay. Far Cry is quickly becoming the next CoD.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
The only thing [H] has going for them is the lengthy benchmark runs; everything else is meh.
 

flynnsk

Member
Sep 24, 2005
98
0
0
Brent, according to many posters here you guys are in bed with NV, so don't waste your time here. Even without being a member of [H], I appreciate the quality reviews you guys are doing. Keep it up!

suck-up.jpg
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Also, does anyone else find it HILARIOUS that he comes here to defend GameWorks, and not the quality of his review, which is the main issue in this topic right now? The quality of the actual analysis in HardOCP reviews sucks. The benches are great, and the testing methodology is different from other sites' and provides useful data points, but there is ZERO point in reading the text of the article.

But hey, Far Cry Primal looks worse than Far Cry 4... that sounds like a reviewer who knows his stuff when it comes to visual fidelity!

That is why I reported him for trolling.

Irrelevant to the gameplay experience.

I think it is best not to create any bias toward any vendor, to stay neutral, and to use whichever video card is best for a given game. That way, you always have the best experience. It comes down to which video card provides the best gameplay experience.

If I load a game and Card A plays it better, at higher settings, than Card B, then Card A is best.

If I load a game and Card B plays it better, at higher settings, than Card A, then Card B is best.

If you stay neutral and don't have any vendor bias, then you use whatever is best for your games at a given time, and you just enjoy the game and put all the petty BS behind you.

There's too much crap in the world to worry about the little things. Games are entertainment; I play games for entertainment, and I use whatever video card gives me the best experience in the game. I evaluate video cards and let our readers know which one is best for the game, which helps them make buying decisions. It's that simple, and it's that unbiased. I am not tied to, nor do I care, which vendor that ends up being.

I believe in the KISS principle: "Keep it simple, stupid."

Well, get used to the idea of not having a choice between every GPU for every game then, once the free GPUs are no longer sent to you. Like every other "just a gamer," you will have only one choice: picking which GPU to buy with your HARD-earned money.

If you are 'just a gamer', then stick to it and stop deceiving people on the internet.

Also:
ninja
 
Last edited:

thesmokingman

Platinum Member
May 6, 2010
2,307
231
106
^^Nice catch on the ninja. lol


Why not? It's not like he said "I'm an objective reviewer".

Seems like a fine comment to me.

If you want to approach reviews as a for-fun gamer, be my guest.


What? What are you going on about?

His statement that he's just a gamer, like us, is disingenuous at best. How many Ubi games have gone to total crap due to GW features? Yet for the first time in a LONG TIME we get a Far Cry without any biased obfuscation, and instead of applauding the throwback to the days when game developers actually developed their own games, they dismiss it as lacking the very thing that has broken so many games over the last few years!
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I evaluate video cards and let our readers know which one is best for the game, which helps them make buying decisions.

Which is exactly why there is an outcry for the reasoning behind not including Far Cry Primal in the test suite you use to "evaluate video cards".

Sea bottom and 10ft of mud.
 

Magee_MC

Senior member
Jan 18, 2010
217
13
81

Quote:
Originally Posted by Kenmitch View Post
What do you think about AMD's open-source approach towards gaming vs. the NVIDIA black-box approach?

Irrelevant to the gameplay experience.

I don't think it's irrelevant. What we have seen is that whenever GameWorks is inserted into a game, it is harder for the developers to optimize the game, because they are unable to alter the GW code. This almost invariably leads to worse performance on any given card, in any given engine, even if the GameWorks features are turned off.

Games using AMD's open-source features perform better on both AMD and NV cards than games using GameWorks, where both AMD cards and older NV cards take a performance hit that is disproportionate to the hit on NV's Maxwell cards.

That directly affects the gameplay experience, and that is the issue at hand. GameWorks can offer better features for games, but the cost in performance is disproportionately degrading the gameplay experience on anything except NV's Maxwell cards.

So the question is: is a better gameplay experience on Maxwell cards worth a worse gameplay experience for everything in the PC ecosystem that isn't a Maxwell card? I don't think it is, but you appear to have a different opinion.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
I'd rather have options I can disable if they're too demanding than not have them there at all. This gives the game room to grow into future GPUs: I can then enable those features, replay the game, and have an even better gameplay experience than I did initially. It gives us a reason to upgrade to new GPUs. I hope game devs take full advantage of DX12 and give us some awesome gameplay.

I get that long-term view, but I think some balance has to be given to immediate playability for your first time through. Playability is priority one; games like last year's Batman don't help anyone.

I also think putting in switches that are out of reach of $1000 GPUs at 1080p mocks those who invest the most in the hobby. It is basically the Crysis situation. I don't think it was good for gaming that a 2007 game wasn't playable maxed out at 60 fps at 1080p on a single card until the OG Titan came out six years later. We don't need more of that, especially when Moore's Law is down for the count. Restraint can be good.
 
Last edited:

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Brent, mind explaining why you guys always paint the Fury in a bad light, but don't do the same for the 980 Ti?

In Rise of the Tomb Raider:

http://www.hardocp.com/images/articles/1455189919EDyKUcGV8E_7_3_l.gif

The Fury (air) is 10% slower on average and 10% faster on minimums than the 980 Ti, and yet you said:

However, there is one card that lags behind the rest and that is the AMD Radeon R9 Fury. Even at 1440p it cannot play this game with the highest in-game settings. We had to lower it to the "Very High" preset.

Yet if we look at the apples-to-apples run, when it is at those same max custom settings as the other cards, we see it's only 10% slower on average than the 980 Ti, while maintaining 10% higher minimums. And that's a $480 card versus a $610 card.

The only video card we were not impressed with is the Radeon R9 Fury. For the price, and the fact it is based on the latest GCN technology, it doesn't perform up to our expectations. The Radeon R9 Fury X is better, but both of these video cards are limiting at 4K with their 4GB of VRAM.

Yet here in the Far Cry Primal review, the 980 Ti is 8% slower than the Fury X and you call that a small difference:

Here we are running the game at the absolute highest settings once again so we can compare performance at "Ultra" settings at 4K. We once again find the AMD Radeon R9 Fury X taking the lead at 4K. The overall average difference is small at 8%, but the Fury X is faster.

Also, looking at the "limited by 4GB of VRAM" comment from the Rise of the Tomb Raider conclusion, you actually went back and completely disproved that in your IQ testing, so why is it still in your conclusion? You have constantly said that the 4GB of VRAM limits the Fury, yet you haven't been able to show that happening at any playable settings.


Why are you disappointed in the Fury (air) for being 10% slower than the 980 Ti (and again, it had 10% faster minimums) while it's 25% cheaper, but the 980 Ti being 8% slower than the Fury X is a "small difference"?
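To make the arithmetic behind this complaint concrete, here is a rough Python sketch of the comparison being argued. The $480/$610 prices and the ~10% average-FPS gap come from the posts above; the FPS values themselves are placeholders chosen only to illustrate how the relative delta and performance-per-dollar shake out, not actual benchmark results.

Code:
def pct_slower(a, b):
    # Percent by which card "a" trails card "b" on average FPS.
    return (b - a) / b * 100

# Placeholder average FPS, chosen so the Fury trails the 980 Ti by ~10%;
# prices (USD) are the ones quoted in the thread.
fury_avg, ti_avg = 45.0, 50.0
fury_price, ti_price = 480, 610

print(f"Fury trails the 980 Ti by {pct_slower(fury_avg, ti_avg):.0f}% on average")
print(f"Fury:   {fury_avg / fury_price:.3f} fps per dollar")
print(f"980 Ti: {ti_avg / ti_price:.3f} fps per dollar")
# With these numbers the cheaper card delivers more frames per dollar,
# which is the crux of the "why is 10% damning but 8% small?" question.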
 

flynnsk

Member
Sep 24, 2005
98
0
0

Quote:
Originally Posted by Kenmitch View Post
What do you think about AMD's open-source approach towards gaming vs. the NVIDIA black-box approach?

Irrelevant to the gameplay experience.

I don't think it's irrelevant. What we have seen is that whenever GameWorks is inserted into a game, it is harder for the developers to optimize the game, because they are unable to alter the GW code. This almost invariably leads to worse performance on any given card, in any given engine, even if the GameWorks features are turned off.

Games using AMD's open-source features perform better on both AMD and NV cards than games using GameWorks, where both AMD cards and older NV cards take a performance hit that is disproportionate to the hit on NV's Maxwell cards.

That directly affects the gameplay experience, and that is the issue at hand. GameWorks can offer better features for games, but the cost in performance is disproportionately degrading the gameplay experience on anything except NV's Maxwell cards.

So the question is: is a better gameplay experience on Maxwell cards worth a worse gameplay experience for everything in the PC ecosystem that isn't a Maxwell card? I don't think it is, but you appear to have a different opinion.

Sums it up perfectly:

XiA7YbL.jpg
 
Last edited:

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Also, looking at the "limited by 4GB of VRAM" comment from the Rise of the Tomb Raider conclusion, you actually went back and completely disproved that in your IQ testing, so why is it still in your conclusion? You have constantly said that the 4GB of VRAM limits the Fury, yet you haven't been able to show that happening at any playable settings.


Why are you disappointed in the Fury (air) for being 10% slower than the 980 Ti (and again, it had 10% faster minimums) while it's 25% cheaper, but the 980 Ti being 8% slower than the Fury X is a "small difference"?

Meanwhile, the 3.5GB 970 was swept under the rug despite the issues it caused in at least 2 games (3 if you count modded Skyrim).
 

alexruiz

Platinum Member
Sep 21, 2001
2,836
556
126
I am for, not against, moving gaming forward by providing better visuals in games, and using graphics features that make games look better...

So you are the Brent of that site?
I commend you for your bravery in coming here, but I don't know if you realize that over here you cannot ban anyone who exposes your lack of professionalism, bias, and distortion of reality.

Notice the bold part of the quoted text. Look at the picture below, already linked by several members. Can you honestly, with a straight face, come here and argue that such graphics are anything short of stunning? By standing by such a "meh" assessment you would prove that you are either incapable of an objective evaluation, or biased and unprofessional enough to intentionally deny it. I don't know which situation is worse, the incompetent or the sleazy, but either one means that you should NOT be doing "professional" reviews.

New features that can take advantage of newer hardware are always welcome, but a feature that cannot provide a tangible benefit in the visuals-to-performance trade-off is not a good feature. How could you possibly defend a "feature" that costs a 30-40% performance penalty when the visual benefit is practically impossible to spot? How is that even defensible? For a 40% performance penalty, I would expect a noticeable visual improvement, but when we have a hard time spotting the improvement even in high-resolution static images, the 40% performance penalty is not justifiable at all. Furthermore, a "feature set" that is in there to handicap the competition cannot be considered "forward-looking," yet you almost worship it.

The rebellious nature of your website was refreshing at some point in the past, but the maverick website has become the online tabloid of computer hardware. If I wanted to read gossip I would have visited The Register, but now your site is the one taking that role. I know that a few of us won't give clicks to the new National Enquirer, so guys, play it fair and objective if you want to recover some of those clicks... unless, of course, the gossip brings more clicks, but then stop calling them "professional reviews" and call it "reality TV"... I mean "reality internet."

bD5nZZa.jpg
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
Why spend the time and money to add GameWorks when you already have your own implementation that you feel works better?

It would be kind of stupid to add a setting that is lower quality than your max setting and hurts performance more.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Why spend the time and money to add GameWorks when you already have your own implementation that you feel works better?

It would be kind of stupid to add a setting that is lower quality than your max setting and hurts performance more.

Yep, which is especially ironic since [H] was enthusiastic that Rise of the Tomb Raider used its "own PureHair" solution over TressFX... before they realized that PureHair was based on TressFX 3.0...

Rise of the Tomb Raider is based on a new game engine built in-house by Crystal Dynamics called the Foundation Engine. Some of the new features include physically based rendering, HDR, adaptive tone mapping, global illumination, deferred lighting, volumetric lighting, and a brand-new in-house hair simulation system. This time, instead of going with AMD's TressFX or NVIDIA's HairWorks, Crystal Dynamics devised its own hair simulation called PureHair. Also in use are NVIDIA's HBAO+ and, of course, tessellation.

PureHair

Let's talk a little bit about PureHair. In the last Tomb Raider game, Crystal Dynamics leveraged AMD's TressFX, which did look great. It moved realistically and made hair finally look "real" in a game. At its onset it did cost a hefty bit of performance and was initially slower on NVIDIA GPUs. Later, with driver updates, performance with this effect evened out, and with newer GPUs it became a non-issue and the performance drain was a lot lower.

In Rise of the Tomb Raider, Crystal Dynamics went down its own path this time and created PureHair. PureHair can add 30,000 strands of hair to a character model, in this case Lara. The individual hairs react to physics in the game and to the materials they are moving through, like air or water. There are three options: you can turn it off, turn it on, or set it to the "Very High" quality setting. The "On" option is the recommended setting for most players and for those needing more performance; it is a lower-quality version. The "Very High" option is the one that can, in some cases up close, create 30,000 strands of hair, and it is naturally the most performance-demanding.

As others pointed out, "Godrays" and other IQ features are still in the game, just done in-house, and they perform well.


But anyway, I don't expect Brent to reply to the questions raised in this thread about the Hitman review discrepancies, let alone the ones raised by me or others about the differences in their reviews when AMD is leading versus when NVIDIA is.
 
Feb 19, 2009
10,457
10
76
For me personally, I am torn. I don't mind basically getting plus versions of games, as 30 fps can be annoying and AA is nice. But I also like HairWorks and TressFX, and that is the kind of thing that might push me to upgrade faster to get it (whereas for, say, resolution I am less apt to upgrade my display). That is the kind of stuff you show off to buddies who play the same game on a console, and they go "wow."

TressFX is in the Xbone Tomb Raider. The reason they could use it was that it is open source, free to modify and optimize, so it has a very small ~10% performance hit. On the Xbone they also ran TressFX via async compute, so most likely there was no performance hit at all.

I don't mind what the consoles have, as long as the PC version has no frame-rate cap, some good AA options, and better textures. These have been the foundations of a great-looking game throughout the history of 3D gaming. That's not going to change.

Post effects work to make an already great-looking game look better. Mostly. Some of them blur the crap out of the game, and so they're a downgrade.

If [H] thinks FC4 is better because it's got GW stuff that brings the 30 fps "cinematic experience" to top GPUs, then they are anti-PC-gamer. We used to laugh at poorly optimized games.

When a game comes along with great visuals and great performance, it is to be commended.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Complainers are complaining that Brent said he expects more demanding features in games like Far Cry Primal, something GW provides. I know, I know, you hate GW and blah blah blah; nobody cares.

Did I miss anything, or are you too dense to understand it yourself?
I would love to know if you actually believe and agree with this :D

That is 2 attacks on me in a row :thumbsdown: Want me to respond? :cool:

I'm just a gamer, and you know what, this is a very exciting time to be a gamer. With the new GPUs on the horizon, DX12 here now, and games that are coming this year and next, it is a very exciting time for a gamer indeed.

I'd rather have options I can disable if they're too demanding than not have them there at all. This gives the game room to grow into future GPUs: I can then enable those features, replay the game, and have an even better gameplay experience than I did initially. It gives us a reason to upgrade to new GPUs. I hope game devs take full advantage of DX12 and give us some awesome gameplay.

On my personal list this year is the next Deus Ex game, one of my favorite series. I friggin cannot wait.
I guess you are a writer for a good reason :) Very, very nice job not answering in your answer :thumbsup:
 
Last edited: