[GameGPU] Hatred (UE4 game) - CPU and GPU testing


Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
The latest DA uses Havok, so I guess that may have been a bridge burned.

Well, Dragon Age: Inquisition switched from BioWare's in-house engine to DICE's Frostbite engine, and Frostbite doesn't use PhysX, so that was just part of that transition.

Anyway, I flipped through my Steam folder to see which games have PhysX files indicating use of CPU PhysX, in addition to the Mass Effect series, Dragon Age: Origins, and Dragon Age 2. I found:

Alpha Protocol (a UE3 game)
Borderlands (a UE3 game; Borderlands 2 had GPU PhysX, but the first game didn't)
Dungeon Siege 3

And I'd be willing to bet that the Batman: Arkham games use CPU PhysX even when GPU PhysX is turned off.

So yeah, CPU PhysX being used in games is nothing new, and it seems to be a standard part of UE3. It's basically been a competitor to Havok for years now. AMD has never had trouble with CPU PhysX before, so I don't see why it would start being a problem now.
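
For anyone who wants to repeat that folder check without clicking around, here's a minimal Python sketch. The Steam path is a hypothetical default (adjust to your install), and matching on "physx"/"apex" filename fragments is my own heuristic, not an official marker:

[CODE]
import os

# Hypothetical default Steam library path; adjust to your install.
STEAM_COMMON = r"C:\Program Files (x86)\Steam\steamapps\common"

# Filename fragments that suggest a game ships PhysX (or APEX) binaries.
PHYSX_HINTS = ("physx", "apex")

for game in sorted(os.listdir(STEAM_COMMON)):
    game_dir = os.path.join(STEAM_COMMON, game)
    if not os.path.isdir(game_dir):
        continue
    for _root, _dirs, files in os.walk(game_dir):
        hits = [f for f in files
                if f.lower().endswith(".dll")
                and any(h in f.lower() for h in PHYSX_HINTS)]
        if hits:
            print(f"{game}: {', '.join(sorted(hits))}")
            break  # report each game once
[/CODE]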
 

Snafuh

Member
Mar 16, 2015
115
0
16
Like Red Hawk said, this is probably not a PhysX issue. PhysX is basically the default physics engine for 3D games. Even Gaming Evolved titles like BioShock Infinite use it.
It's used for basic things like collision detection and Newtonian physics. The fancy NVIDIA-only effects in some games are only a small part of the PhysX library.
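
To make that concrete: the "basic" per-frame work of a physics engine is just integration and collision checks. Here's a toy Python sketch (plain Python, not actual PhysX code) of a body falling under gravity and bouncing off a ground plane:

[CODE]
# Toy illustration (not PhysX): the baseline work a CPU physics engine
# does each frame is Newtonian integration plus collision handling.
GRAVITY = -9.81   # m/s^2
DT = 1.0 / 60.0   # one frame at 60 fps

def step(pos_y, vel_y):
    """Advance a falling body one frame; bounce it off the ground plane."""
    vel_y += GRAVITY * DT     # integrate acceleration (Newton)
    pos_y += vel_y * DT       # integrate velocity
    if pos_y < 0.0:           # collision detection vs. the ground
        pos_y = 0.0
        vel_y = -vel_y * 0.5  # crude collision response (restitution)
    return pos_y, vel_y

pos, vel = 5.0, 0.0
for _frame in range(120):     # simulate two seconds
    pos, vel = step(pos, vel)
print(f"after 2 s: height={pos:.2f} m, velocity={vel:.2f} m/s")
[/CODE]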
 

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
I gave it a look both for fun and to upload something on my channel.

Didn't think to make a proper benchmark though. I deemed it not benchmark-worthy.

Still, I recorded a performance-oriented video with my external recorder, so I can give a better picture of how it performs.

Hatred - PC Gameplay on GTX 970 maxed - Digitally recorded - 1080p 60fps

Needing 80% GPU usage for 60 fps on a G1 GTX 970 seemed a bit too much for this level of graphics.

It's not too bad as a game, though. I found the idea of going completely against political correctness somewhat amusing.

A little mindless shooting to get my mind off The Witcher isn't too bad either.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
So what, we aren't developers and don't know how well it actually runs.
Actually, you can. There's a PhysX indicator built into NVIDIA's drivers. It will tell you whether it's currently using GPU PhysX or not.

"Show PhysX Visual Indicator"

http://www.ocmodshop.com/images/guides/disable_physx_gpu_message/nvidia_control_panel2.png

It's not reliable for determining precisely what's going on in mixed-load cases, such as when you have first-order CPU physics plus second-order GPU physics, but it can tell you whether everything is CPU-based or whether there is GPU processing going on.

So far it doesn't sound like this game uses any GPU physics, but I don't have the game so I reserve the right to be wrong.:p
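
For what it's worth, a crude cross-check (and it is crude: overall GPU utilization mixes rendering and physics load, which is exactly why the driver's indicator is the reliable tool) is to log utilization during play with nvidia-smi, which ships with NVIDIA's drivers. A small Python sketch:

[CODE]
import subprocess
import time

# Crude cross-check only: overall GPU utilization cannot separate
# rendering from physics work, unlike the PhysX visual indicator.
for _ in range(10):
    util = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True).strip()
    print(f"GPU utilization: {util}%")
    time.sleep(1)
[/CODE]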
 

MrTeal

Diamond Member
Dec 7, 2003
3,917
2,703
136
I don't even understand what's going on here. Are those screenshots a tiny crop of some massively larger frame? Is this some kind of homage to Virtua Fighter?

I almost want to play a demo just to see what kind of visuals make 60 FPS @ 1080p a struggle for some very powerful cards.
 

Shehriazad

Senior member
Nov 3, 2014
555
2
46
Another one of these games that doesn't seem to use more than 4 cores.


I would really love to see how an 860K @ 4.7 GHz and an FX-4300 @ 5 GHz do in games like these.

But they never bother to show that... especially considering that those two CPUs would both average around 53-55 fps, guesstimated from that review.

But that goes to show that those cheap chips would easily be able to handle these (unnecessarily) demanding games just fine.

But the GPU side is just sad... I'm not saying they "prefer" Nvidia, but the majority of the optimization definitely seems to have gone Nvidia's way, which does make sense seeing how the majority of desktop dGPUs are Nvidia... but still, a bit sad.
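
For what it's worth, here's the arithmetic behind that kind of guesstimate as a Python sketch. It assumes a CPU-bound game on at most 4 cores and roughly linear scaling with clock speed at similar IPC, which is a big simplification; the reference numbers are hypothetical placeholders, not figures from the review:

[CODE]
# Back-of-the-envelope estimate: CPU-bound fps scaled by the clock ratio.
# Assumes similar IPC and no other bottleneck; a big simplification.
def scaled_fps(ref_fps, ref_clock_ghz, target_clock_ghz):
    return ref_fps * (target_clock_ghz / ref_clock_ghz)

REF_FPS, REF_CLOCK = 45.0, 4.0  # hypothetical reference, not review data

for name, clock in [("Athlon 860K @ 4.7 GHz", 4.7),
                    ("FX-4300 @ 5.0 GHz", 5.0)]:
    print(f"{name}: ~{scaled_fps(REF_FPS, REF_CLOCK, clock):.0f} fps")
[/CODE]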
 

coercitiv

Diamond Member
Jan 24, 2014
7,236
17,024
136
I almost want to play a demo just to see what kind of visuals make 60 FPS @ 1080p a struggle for some very powerful cards.
Here's a gameplay video. Actually, I just noticed there was one already in the thread.

It's not too bad as a game, though. I found the idea of going completely against political correctness somewhat amusing.
Depends; I'm curious to see how many will find a suicide simulator just as fun.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
How boring. If you want to make a game to shock and upset make it a first person serial killer game. You go out at night and day in a city to murder and take trophies back to your lair or something. All in glorious DX 11. Chuck in some freeform rape and torture. Like GTA, but with serial killers. Press F to fondle, G to grope, B to reach for blowtorch. Hmmmm, I'd kickstart that.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Not sure why people are blaming Nvidia or PhysX.

It's a crappy job by the devs.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Holy moly, Nvidia is doing really well with this game

That's when you get the usual: the game's poorly optimized, the game engine sucks, Nvidia is evil, PhysX is killing performance, the game itself is horrible.

Fact is, there's no SLI/CrossFire. A 780 Ti is slightly faster than the 290X, and in some cases that's right where performance should be.

Those benchmarks paint the best picture for AMD: they use a 6-core CPU, which should help with the CPU PhysX, but I'm sure the OP knew that.
With a 4-core CPU, my guess is AMD would lose a few more percent.

You get what you pay for; you want cheap price/performance, you got it.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
That 680 beating a 690 just made me laugh.

I don't think there is an SLI profile. Look at all the other SLI cards; they're performing identically to the single-card configurations. Seeing that a 690 is clocked lower than a 680, the numbers actually make sense. Though it's kind of useless to spend all that time testing several SLI configurations when it doesn't work.
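
The clock point checks out numerically: with SLI not working, a 690 behaves like a single one of its GPUs at a lower clock than a 680. A quick Python sketch using NVIDIA's reference base clocks:

[CODE]
# With SLI broken, a GTX 690 acts like one GK104 GPU, which runs at a
# lower base clock than a GTX 680 (reference specs: 915 vs. 1006 MHz).
GTX_680_MHZ = 1006
GTX_690_MHZ = 915

deficit = 1 - GTX_690_MHZ / GTX_680_MHZ
print(f"Single-GPU clock deficit vs. a 680: {deficit:.0%}")  # about 9%
[/CODE]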
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
A 780 Ti is slightly faster than the 290X, and in some cases that's right where performance should be.
An R9 290 losing to a 680 in a GPU-limited situation is "right where it should be"? The performance difference between a 780 Ti and a 290X is, at best, around 3% in favor of the Ti (at 1080p); at 4K, the opposite is true. Using one game as proof of NVIDIA's superiority, especially when it's an indie game with questionable visuals, over the 20 diverse triple-A games in TechPowerUp's benchmark suite (which show something enormously different) is absolute cherry-picking.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
...

Moderator action taken for trolling
-Moderator Subyman
 