
[gamegpu] Evil Within CPU benchmarks - SNB gets hammered

Page 2 - AnandTech Forums
I disagree there. I run a stock 4770, and it's good to see a baseline.

I never said they should stop testing stock clocks in favor of overclocked results for their CPU tests; I want to see both. A stock baseline is great, but I also want to see how the game might continue to scale with clock rate. We can extrapolate if a game climbs consistently with clock rate, but that's not always the case.
 
Gimped console game (locked at 30fps) exhibiting weird behavior when unlocked on PC?! NO FLIPPING WAY!!!

Pretty much. I just quit buying them till they are like $5.
Vote with your wallet and all that.


Still not a bad showing for the (upper end) old ass FX imo 😎
 
Digital Foundry has done some testing on the PS4 and Xbox One versions, and performance apparently isn't that great there either.
 
Loving this so far. Refreshing in a sea of tired AAA titles. It also hammers my 4770 hard - two cores frequently stay at 3.9GHz, the other two at 3.7GHz, with the 780Ti GHz nearly always under 50%. Solid 30FPS (HA!) at 1200p, maxed out with SMAA. It's odd playing at 30FPS - sluggish and somehow off during sharp movement, especially when you're used to 50FPS+. Still, nice gameplay!
 
Just, please, don't try to sell 30 FPS as an ideal. If you want your games to look more cinematic, try producing higher-quality pixels instead.
Last sentence on PCPer.

I agree 100 percent.

Don't piss on me and tell me it's raining.
 
I'm not arguing, but there's no small debate about whether higher-than-24fps movies look "less cinematic" for actual films. Might be related. Or not.
 
[Sighs]. 30fps, clunky controls, non-rebindable keys, 2.35:1 aspect ratio that's really "letterboxed 16:9" (ie, on a 2.35:1 monitor you'll simply 'enjoy' black bars around all 4 screen edges like an early non-anamorphic DVD)... :thumbsdown:

This is one game that's best left on the shelf. Seriously, the ONLY way they'll stop churning out these intelligence-insulting bad ports is to stop buying them!
 

Well, I bought it and played it for a couple of hours last night.

30fps - in this game it doesn't matter, as long as it never drops below that. It didn't drop a single frame during my play session.

The graphics are good - dynamic shadows on that engine, finally!

For me, though, the game's biggest problem is the aspect ratio. I can see what they're trying to do - make it feel like you're playing a cut scene and keep the game claustrophobic. I don't think it quite works, though; you just can't see enough of what's going on. I'd like to see them fix this, but I don't think they will.

Anyway, the game is a bit different to anything else, so for that reason alone it's worth a purchase.
 
PCGamesHardware.de has done some GPU and CPU benchmarks now:
http://www.pcgameshardware.de/The-E...s/Benchmark-Systemanforderungen-Test-1139281/

They tested an i7 4790k in different modes. The game does perform noticeably better when going from two cores/threads to four, but performs a little worse with eight threads.

The 4790K doesn't offer eight real cores, so to see whether it truly performs worse with eight threads they should have tested the game on a genuine eight-core CPU like the 5960X. What they actually found is that the game doesn't like HT.
BTW, I'm curious whether CMT on eight-thread AMD CPUs induces a similar performance penalty.
 
Could it be power saving? I've heard Haswell plays the turbo/core-waking game better.

Might be interesting to test with a 2500K with all power saving turned off.
 
There must be a lot of loops in the code that benefit heavily from the many minor microarchitectural changes that have come since SB:

Ivy Bridge: FP/integer divider throughput is 2x that of Sandy Bridge.

Ivy Bridge: MOV instructions no longer occupy an execution port (mov elimination), with potential for improved ILP when MOVs are present.

Ivy Bridge made the decode queue shared between threads, which gives a potential 100% increase in decode queue size for a single thread, depending on what the other thread is doing.

Branch-heavy code can make very good use of the second branch unit on Haswell's port 6.

And of course the code could be making extensive use of FMA, which cuts a dependent multiply-add from 8 cycles to 5 - a very easy ~40% improvement.
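Quick sanity math on that FMA figure - a sketch only, where the 5-cycle multiply, 3-cycle add, and 5-cycle FMA latencies are assumed illustrative values matching the numbers quoted above, not anything measured here:

```python
# Assumed FP latencies (cycles): multiply = 5, add = 3, FMA = 5.
mul_cycles, add_cycles, fma_cycles = 5, 3, 5

# A dependent multiply-then-add chain costs the sum of both latencies.
chain_cycles = mul_cycles + add_cycles  # 8 cycles

# Fusing the pair into one FMA collapses the chain to a single 5-cycle op.
speedup_pct = (chain_cycles - fma_cycles) / chain_cycles * 100

print(chain_cycles, fma_cycles, speedup_pct)  # 8 5 37.5
```

So the "easy 40%" is really 37.5% off the dependent-chain latency, and only in code where multiply-add pairs dominate.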

L1-to-L2 bandwidth is doubled on Haswell. Looping code sections that are too big to fit into L1 but small enough to fit in L2 are going to see massive performance increases. If I had to guess, I'd say there is constant cache thrashing going on with this game engine, because critical sections of code simply aren't fitting in L1 cache. L2 cache is fast, but if you have to hit it constantly, it's going to run like an AMD.
 
Maybe it's PCIe bandwidth? Sandy Bridge was PCIe 2.0, Ivy Bridge was PCIe 3.0. If the game is swapping textures all the time (due to being designed for a unified memory console), that might explain the difference.
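For scale, the raw bandwidth gap behind that hypothesis can be sketched from the standard published PCIe figures (line rate and encoding overhead per generation):

```python
# Per-lane, per-direction PCIe throughput in MB/s, derived from the
# line rate (GT/s) and the encoding overhead of each generation.
def lane_mb_per_s(gt_per_s, payload_bits, total_bits):
    return gt_per_s * 1e9 * payload_bits / total_bits / 8 / 1e6

pcie2 = lane_mb_per_s(5.0, 8, 10)     # 8b/10b encoding  -> 500 MB/s/lane
pcie3 = lane_mb_per_s(8.0, 128, 130)  # 128b/130b        -> ~985 MB/s/lane

# A x16 slot: roughly 8 GB/s vs ~15.75 GB/s per direction.
print(pcie2 * 16 / 1000, round(pcie3 * 16 / 1000, 2))
```

Nearly 2x the slot bandwidth, so constant texture streaming over the bus would at least be sensitive to the generation difference.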
 

As far as I know, there has never been a case where going from PCIe 2.0 to 3.0 in a single- or dual-GPU setup gives an FPS benefit equal to what we're seeing here.
 

I'm working off the "they ported really stupidly from console to PC" hypothesis. 🙂
 
30fps - in this game it does not matter , as long as it does not drop below that at any time. It did not drop a single frame during my play session
Well, it is an indication of a bad port. Letterboxed 16:9 = effective resolutions of 1920x816 or 1280x544. If it's still only 30fps at those, something's very wrong...
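Those effective resolutions check out; a minimal sketch, assuming the renderer rounds the image height to a multiple of 8 (an assumption on my part, not something the game documents):

```python
# Height of a 2.35:1 image letterboxed inside a frame of the given width,
# rounded to a multiple of 8 (assumed renderer alignment).
def letterboxed_height(width, aspect=2.35, multiple=8):
    return int(round(width / aspect / multiple)) * multiple

print(letterboxed_height(1920))  # 816
print(letterboxed_height(1280))  # 544
```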

Edit: Not intended for anyone here, but what amuses me in general are those funny Hollywood Artist Diva comments of "Oh, but you simply MUST play this at 30fps, darling, for the cinematic effect!". The sad truth is that the "superior" cinematic effect is entirely placebo. 24fps arose in the 1920s as a de facto fixed standard (to standardize audio pitch after the variable-fps silent-movie era ended), taken from the mean of the speeds cinemas were already equipped to project silent films at (typically 22-26fps), plus measurement convenience (at 24fps the film travels through the projector at exactly 18.0 inches per second). Likewise, 25Hz (PAL) and 30Hz (NTSC) were chosen to match the electrical grids: 25/30 interlaced frames = 50/60 fields per second, and early TVs used the mains AC frequency (240V @ 50Hz / 120V @ 60Hz) as the timer. None of these rates was selected for any "artistic" effect at all, and none of them "looks" better beyond the placebo of being conditioned to think a genre "belongs" to a certain frame rate simply because that's how it has historically been filmed, purely out of coincidental backwards compatibility. There's really zero relevance from film to modern PC / console games, which are rendered on the fly and not "filmed" at any static rate at all.

In reality, the so-called "glorious" 24p cinematic effect looks cr*p whenever motion is involved, if you play it back at that rate without motion blur to hide it (and that blur itself looks highly unrealistic in PC and console games, because the human eye doesn't see such blur when tracking a moving object in real life, thanks to the eye's effectively higher "frame rate"). Hence the amusing scenario of game developers claiming to aim for "24-30fps cinematics" (as an excuse to avoid admitting performance problems), while Blu-ray forums fill up with people who set their players to output "proper, native, as-it-was-filmed 24p" (vs deinterlaced 50-60Hz), then complain about judder every time the camera pans, and solve the problem by... wait for it... turning on their TV's frame-rate doubling / tripling / quadrupling interpolation. You just can't make it up...
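Two of the numbers above can be checked directly; a quick sketch using the standard figures (35mm film at 16 frames per foot, and the usual 3:2 pulldown for showing 24p on a 60Hz display):

```python
from itertools import cycle, islice

# 35mm film runs 16 frames per foot, so 24 fps means the film moves
# 24/16 = 1.5 feet = 18 inches through the projector every second.
inches_per_second = 24 / 16 * 12
print(inches_per_second)  # 18.0

# 3:2 pulldown: each 24p frame is held for alternately 3 and 2 refreshes
# of a 60Hz display. The uneven hold times are the judder seen on pans.
holds = list(islice(cycle([3, 2]), 24))  # refreshes per frame, one second
print(sum(holds))  # 60 -> 24 frames exactly fill one second at 60Hz
```

The 3-2-3-2 alternation means consecutive frames are on screen for unequal times, which is exactly why panning shots stutter until interpolation evens them out.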
 

What the shit? What kind of moron makes a game at a 2.35:1 aspect ratio? Surely there must be a mod? Is this also the case on consoles?
 

Best thing is, if you run it on an actual 2.35:1 monitor, you still get black bars... but now you get them on the sides, too 😀

[screenshot: letterboxed game image with black bars on all four edges]


Quality porting job.
 
What the shit? What kind of moron makes a game at a 2.35:1 aspect ratio? Surely there must be a mod? Is this also the case on consoles?
Yeah, it's pretty bad. I'm not even sure you can mod it if it's hard-coded. TBH, there used to be a time when I'd spend hours researching that stuff. Now, if a game is that badly broken, I just move on.
 
There must be a lot of loops in the code that benefit heavily from the many minor microarchitectural changes that have come since SB:

Ivy Bridge: FP/integer divider throughput is 2x that of Sandy Bridge.

Ivy Bridge: MOV instructions no longer occupy an execution port (mov elimination), with potential for improved ILP when MOVs are present.

Ivy Bridge made the decode queue shared between threads, which gives a potential 100% increase in decode queue size for a single thread, depending on what the other thread is doing.

It's too bad Gamegpu didn't test Ivy Bridge - then we would know for sure.
 
Best thing is, if you run it on an actual 2.35:1 monitor, you still get black bars... but now you get them on the sides, too 😀

[screenshot: letterboxed game image with black bars on all four edges]


Quality porting job.
D: Wow... Just.... wow...

At least in ye olde days of DOS they'd wrap the HUD and a bunch of art around it to cover it up.
[screenshot: Ultima Underworld: The Stygian Abyss, with HUD art filling the screen border]

Z
 
I suspect the AR is meant to give it a cinematic feel. Look at the upcoming Interstellar: it's being shown in a variety of formats, from 70mm to IMAX. Not many shoot in CinemaScope anymore.
 