Resident Evil 7 full game benchmarks


nurturedhate

Golden Member
Aug 27, 2011
1,767
773
136
Yea I bought a Gigabyte gtx1070 Extreme with 8gb of Vram instead of a 4gb Fury X for $590, just like you did. :) Marketing must have us both! hahahaha :)
You talk about us?
At least I didn't buy a $750 gtx780ti like you did. :)

Hey Bacon can I get a like for that?

Yea, Resident Evil runs great on my system too.
I jumped like 20 times. Good graphics.
I give it an 8 out of 10.
$500 for the 780 Ti, nowhere near $750. $350 for the 1070. Thanks for playing. Let's agree to end this though; we've dragged the thread far off topic.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
My post of

means = "People are willing to pay 37.5% more to get 10% more performance"
How did you interpret that as "criticizing the 290 series"?

You are picking the worst example to demonstrate the (well-known and obvious) diminishing returns of price-to-performance as raw performance increases. The 290X launched as the fastest GPU in the world, $450 and $100 cheaper than the slower Titan and 780, respectively. The 290X had better performance-per-dollar than its closest competitors.

As always, the barely cut-down card had better PP$. This is universal and has been retold countless times throughout GPU history; you could have used any example instead of one of the all-time performance-per-dollar architectures. You should know this.

Did you even manage to read my last post before you posted this?
And from your post I can conclude that you have English comprehension problems.
...
Wow, just wow. People are talking apples, you replied oranges
...
And last but not least, I'm talking objectively, while you're talking like I'm some Nvidia lover; did you check that my previous card was the R9 290 Tri-X?

What? I do have comprehension problems with your posts.

And why did you even bring up the 290X again? Your post:

Apparently the GTX 1070, which has the speed of a Titan X ($999), is the worst x70-series card since the GTX 470, even though it sells for only around $400

People paid an extra $150 for just 10% faster performance going from the 290 to the 290X

Now high-end performance for $400 is a worse purchase

Okay

There goes the objective logic down the drain

I'm piecing it together:

The 1070 is criticized as one of the worst x70s based on its near-30% gap to the 1080 in this game (remember, the topic)...
...but that's ok because people paid more money for the 290X over the 290, which launched at the same time, and the 1070 is cheaper than slower, more expensive cards from last gen that launched over a year prior. Therefore it's actually one of the best x70s ever.

It is a good card, but not historically great in the context of other x70s vs x80s, especially with its Resident Evil 7 performance vis-à-vis the 1080.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Did you miss my post? It runs just fine on 4K on my Fury. Provided you disable Shadow Cache.
I didn't say it doesn't run well. Isn't it outperformed by the 390X in those graphs?

Did I miss something?
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
It's recommended that 4GB users disable Shadow Cache. Maybe the Fury X is faster with this disabled?

Did Guru3D do this? If you can make sense of this professional editor's paragraph, draw your own conclusion:

"There have been an unfortunate three revisions of the benchmarks with results all over the place. The last batch (revision 3 as you see them) are valid. The Shadow Cache settings disabled offers the best game-play for the majority of cards hence it is the preferred and recommended setting, for 6GB and upwards shadow cache enabled can offer better results, use it at your own peril. In-between revision 1 and 3 of the review a lot has happened, new drivers 5 minutes after I posted the initial article, totally weird anomalies but also an error on my side that boosted perf on the earlier tests. I think this was the cause of my having interlaced modus activated (mistakenly) on some of the cards as I have been goofing around with that on day 1. Totally my bad, but I am not even 100% sure either that was issue as the game remains difficult to measure. We will retest the game in a few weeks when the drivers on both sides have matured and when the game has had a patch or two. The initial results up-to revision 3 have been far from ideal, but the rev3 results do match what you guys should get gameplay / FPS wise closest. I have also retested an rx480 with both the shadow cache on and disabled. The perf differential is not far away from each other really. However on the opposite side, going to a 4GB card with shadow cache enabled brings in issues, stutters and game perf differences as shown in the FCAT results. That said, the results as they stand with rev3 stand and are sound."
From:
http://www.guru3d.com/articles_pages/resident_evil_7_pc_graphics_performance_benchmark_review,9.html

Looks like they only retested the FCAT charts with Shadow Cache off. Hopefully Fury X, and Fury Air, are back ahead with that off.
 


IEC

Elite Member
Super Moderator
Jun 10, 2004
14,600
6,084
136
Only when using the max settings @ 4K, but those aren't very good settings since you are at mid-30s fps for both.

https://www.overclock3d.net/reviews/software/resident_evil_7_biohazard_pc_performance_review/12

Drop settings a little for almost same IQ and get much higher fps.

From the article:

Resident Evil 7 can be a massive VRAM hog, easily consuming 8GB of VRAM at 4K Very High settings, though it is clear that the game consumes a lot more VRAM than it needs to, presenting no major performance issues when used with 4GB GPUs with High or lower settings and even 2GB cards with a gaming resolution of 1080p at high settings.

Those that are having performance issues with this game should make sure that they update their GPU drivers and avoid Very High graphical settings and HBAO+, which both can be major performance and/or VRAM hogs. Downgrading to high settings will have minimal visual impact in-game while offering much more stable (higher) framerates, which is the most important thing when gaming.

Matches my experience at 4K. Future patches/performance optimization may allow higher settings (or closer to 60 fps average @ 4K High), but in-game I would be hard-pressed to tell a difference at anything greater than High settings anyways.
 

casiofx

Senior member
Mar 24, 2015
369
36
61
Apparently the GTX 1070, which has the speed of a Titan X ($999), is the worst x70-series card since the GTX 470, even though it sells for only around $400

People paid an extra $150 for just 10% faster performance going from the 290 to the 290X

Now high-end performance for $400 is a worse purchase

Okay

There goes the objective logic down the drain
You are picking the worst example to demonstrate the (well-known and obvious) diminishing returns of price-to-performance as raw performance increases. The 290X launched as the fastest GPU in the world, $450 and $100 cheaper than the slower Titan and 780, respectively. The 290X had better performance-per-dollar than its closest competitors.

As always, the barely cut-down card had better PP$. This is universal and has been retold countless times throughout GPU history; you could have used any example instead of one of the all-time performance-per-dollar architectures. You should know this.



What? I do have comprehension problems with your posts.

And why did you even bring up the 290X again? Your post:



I'm piecing it together:

The 1070 is criticized as one of the worst x70s based on its near-30% gap to the 1080 in this game (remember, the topic)...
...but that's ok because people paid more money for the 290X over the 290, which launched at the same time, and the 1070 is cheaper than slower, more expensive cards from last gen that launched over a year prior. Therefore it's actually one of the best x70s ever.

It is a good card, but not historically great in the context of other x70s vs x80s, especially with its Resident Evil 7 performance vis-à-vis the 1080.
The 290X was the fastest in the world for how long? Like 14 days? The 780 Ti released two weeks after the 290X's official release date, and in those early days the 780 Ti was the fastest GPU.
So do you not understand that people paid 37.5% more for 10% more performance, even though for most of the half year after its launch it was not the fastest GPU?

Talking about the worst x70-series cards: the GTX 970 almost matched the performance of the 780 Ti when it launched, and the GTX 1070 matched the performance of the 980 Ti when it launched; is that a fail? Cut down or not, you don't judge how good a GPU is by what percentage it is cut down, but by the absolute performance it offers.
The GTX 970 was great because ultimately the GTX 980 didn't even offer a big advantage over it; if anything, the GTX 980 failed there. What's so bad about the GTX 1070 and GTX 1080 when what you pay is what you get? Can you explain that to me, or does Nvidia owe you something, such that it must sell GTX 1080 performance at the GTX 1070's price?

Have you conveniently forgotten that the GTX 1080 is 45-55% more expensive than the GTX 1070? You're paying about half a GTX 1070's price extra to get an additional 30% performance. A good price if you're in for the high end, but people who opt for the cheaper GTX 1070 are getting a good deal anyway. In fact, you forget again: people buy GPUs based on the price and performance they offer.
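The price-versus-performance arithmetic being argued back and forth here can be sanity-checked in a few lines. This is just a sketch using the approximate launch prices and performance gaps quoted in this thread ($400/$550 for the 290/290X, roughly $400/$600 for the 1070/1080), not authoritative MSRPs:

```python
# Sanity-check of the price-vs-performance figures quoted in this thread.
# Prices and performance deltas are the approximate numbers posters cite.

def price_premium(base_price, fast_price, perf_gain_pct):
    """Return (% extra money paid, dollars paid per % of extra performance)."""
    extra_pct = (fast_price - base_price) / base_price * 100
    dollars_per_pct = (fast_price - base_price) / perf_gain_pct
    return extra_pct, dollars_per_pct

# R9 290 ($400) -> R9 290X ($550), ~10% faster
print(price_premium(400, 550, 10))   # -> (37.5, 15.0): 37.5% more money for 10% more speed

# GTX 1070 (~$400) -> GTX 1080 (~$600), ~30% faster in this game
print(price_premium(400, 600, 30))   # 50% more money; the extra performance costs under $7 per percent
```

By this crude measure both sides have a point: the 290X premium was steep per percentage point of performance, while the 1080 premium over the 1070, taking the ~30% gap in this game at face value, is considerably milder per point.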
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Only when using the max settings @ 4K, but those aren't very good settings since you are at mid-30s fps for both.

https://www.overclock3d.net/reviews/software/resident_evil_7_biohazard_pc_performance_review/12

Drop settings a little for almost same IQ and get much higher fps.

I'm not asking about whether I can get playable settings. It's as if you're deliberately trying to not answer my post while quoting me.
The 390x is outperforming Fury X. Nothing posted while quoting me has contradicted this.

So until there is something that shows otherwise, this is why I dislike Fury X.

Also, if CF is enabled in this game, then it's even worse looking for Fury X. Just makes NO sense to have a flagship GPU that does performance gymnastics like this.

Also, why is this game "optimized"? Any time people like a vendor, they claim a game is "optimized". By what reasoning are you all deducing these optimizations?
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
I'm not asking about whether I can get playable settings. It's as if you're deliberately trying to not answer my post while quoting me.
The 390x is outperforming Fury X. Nothing posted while quoting me has contradicted this.

Because it's a game with high variance in testing (no benchmark mode, just people running around). So 2 fps difference isn't much at what are basically non-playable settings anyway.

Also, why is this game "optimized"? Any time people like a vendor, they claim a game is "optimized". By what reasoning are you all deducing these optimizations?

It's optimized because it has good IQ and high FPS. It runs very well on a lot of hardware even at maxed settings, and amazingly well if you reduce shadows by just one notch, even at 4K. What do you consider well optimized?
 

MBrown

Diamond Member
Jul 5, 2001
5,726
35
91
I am getting anywhere between 80 and 120 fps with my setup, but the game stutters quite a bit. I am not sure what it is. I have the game on an SSD. Anyone else getting stutters? BTW, I did not experience these stutters in the demo. I still have the demo installed. I wonder if there may be some complication from having both installed? I have no idea.
 

SlickR12345

Senior member
Jan 9, 2010
542
44
91
www.clubvalenciacf.com
The game looks okay, nothing special, and seems to run decently well on both Nvidia and AMD. It's a good thing that PC is getting more love and more and more games are being properly optimized for it.
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
980/980 Ti/1060/1070/1080. Some interesting things: the gap between the 1070 and 1080 is bigger than between the 980 and 1070.
1070 vs 1080: 30%
980 vs 1070: 20-27%
A 980 Ti at 1200 MHz is faster than a 1070 at 1873 MHz.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
980/980 Ti/1060/1070/1080. Some interesting things: the gap between the 1070 and 1080 is bigger than between the 980 and 1070.
1070 vs 1080: 30%
980 vs 1070: 20-27%
A 980 Ti at 1200 MHz is faster than a 1070 at 1873 MHz.

Maxwell fares well in this game. The GTX 980 is slapping the 1060 around. I would be interested to see how OC scaling compares between the two architectures in this game, as both the 980 and 1060 have around 200-300 MHz left in them.
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
The 1060, like every Pascal card, has about 12% average OC headroom (2050/9000).
The 980 runs at 1250 MHz, so around 20-25% OC headroom (1500/8000).

A GTX 980 at 1500/8000 should be at GTX 1070 performance in this game. The gap is very small, and that much OC is possible.
An overclocked GTX 1060 is around stock GTX 980 performance.
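A quick sketch of the headroom arithmetic in that post. The 1250 MHz GTX 980 clock is the figure quoted above; the ~1830 MHz GTX 1060 stock boost is an assumption back-computed from the quoted 12% headroom, not a measured number:

```python
# Overclocking headroom as a percentage over the stock boost clock.

def oc_headroom_pct(stock_mhz, oc_mhz):
    return (oc_mhz / stock_mhz - 1) * 100

# GTX 980: 1250 MHz stock boost (as quoted above) -> 1500 MHz OC
print(round(oc_headroom_pct(1250, 1500)))   # -> 20

# GTX 1060: ~1830 MHz typical boost (assumed) -> 2050 MHz OC
print(round(oc_headroom_pct(1830, 2050)))   # -> 12
```

That puts the 980's headroom at roughly 20%, in the same ballpark as the figures in the post.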
 