
gamegpu DARK SOULS III Benchmarks

csbin

Senior member
http://gamegpu.com/rpg/ролевые/dark-souls-iii-test-gpu


[Benchmark charts from gamegpu.com: GPU and CPU performance results]
 
Slightly surprised ...

Kepler architecture has good performance, but I thought GCN would win given console optimizations and all that. Then again, it's DX11, so not a surprise?
 
Performance is pretty unimpressive considering the visuals, though of course the game looks great from an art-direction standpoint regardless. Still capped at 60 as well, which just shouldn't be a thing.
 
Pretty meaningless CPU benchmarks, with the FPS apparently capped at 60. And Phynaz, seriously, do you really think a 1 FPS higher minimum and a maximum capped at 60 really show the i3 "beating" the 8350? By that logic, the i3 beats the 5960X by even more. Basically, though, anything FX-6300 or above is essentially all the CPU the game needs (or can utilize) with the framerate caps.
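To see why the caps flatten these results, here's a minimal sketch with made-up uncapped frame rates (the specific numbers are assumptions for illustration, not gamegpu's data):

```python
# Hypothetical CPU-limited frame rates before the game's 60 fps cap.
uncapped_fps = {"i3": 85, "FX-8350": 84, "FX-6300": 72, "i7-5960X": 110}

CAP = 60
capped_fps = {cpu: min(fps, CAP) for cpu, fps in uncapped_fps.items()}
print(capped_fps)
# {'i3': 60, 'FX-8350': 60, 'FX-6300': 60, 'i7-5960X': 60}
# Every chip fast enough to exceed the cap reports the same number, so a
# 1 fps gap in minimums says nothing about which CPU is actually faster.
```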
 
Why is there so much hate for a 60 fps cap? As long as the min fps is 60 too, the gameplay experience is excellent. This isn't a competitive online FPS; it's a SP action RPG.

Is it a PCMR thing? I've always been a PC gamer; 60 fps is my target, and it's good when it happens.
 
Why is there so much hate for a 60 fps cap? As long as the min fps is 60 too, the gameplay experience is excellent. This isn't a competitive online FPS; it's a SP action RPG.

Is it a PCMR thing? I've always been a PC gamer; 60 fps is my target, and it's good when it happens.

Do you use a high refresh rate monitor?
 
i3 beating 8350 again, and where it counts.

Didn't the FX-6300 embarrass itself vs. the i7-5960X, though? "Where it counts", shameful. I'll be cogitating over these benchmarks for some time. Hilarious.
 
It looks like Crossfire is broken at 4K while SLI works just fine. Need an updated driver from AMD/RTG?

EDIT: Actually, it could be broken completely. It's impossible to tell, because at lower resolutions the single Fury X is fps-capped, so CF could be working or not and have no impact.

EDIT2: It is broken. The 7990 gives exactly the same results as the 7970 in every case. AMD needs a new driver!
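To spell out the diagnostic logic (hypothetical numbers and helper, not from the article): scaling is only observable when a single card runs below the cap.

```python
CAP = 60  # the game's frame limit

def scaling_observable(single_gpu_fps, cap=CAP):
    # If one GPU already hits the cap, a second GPU can't raise the
    # reported number, so CF/SLI scaling is invisible either way.
    return single_gpu_fps < cap

print(scaling_observable(60))  # False: capped at lower resolutions, can't tell
print(scaling_observable(38))  # True: below the cap at 4K, so identical
                               # 7990 vs. 7970 numbers mean CF is broken
```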

Once you uncap, you never go back.

I've never used a high-refresh screen (well, not since the CRT days, LOL). Is there really that much difference?
 
I wonder if actual game systems run at 2x speed like in their previous tacked-on 60 fps efforts. If your weapons degrade faster because you're running at 60 instead of 30, that's not good. Maybe that was OK back when Quake 3 came out, but in 2016 it's embarrassing. Let's hope they figured that out.
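For anyone wondering what "systems working at 2x speed" means: the classic mistake is ticking game state per frame instead of per unit of real time, which is reportedly what made Dark Souls II weapons wear out twice as fast at 60 fps. A minimal sketch (hypothetical names and values, not From Software's actual code):

```python
DEGRADE_PER_SECOND = 2.0  # durability lost per second of use (made-up value)

# Frame-rate-dependent (buggy): a fixed amount per frame, tuned for 30 fps.
def degrade_per_frame(durability):
    return durability - DEGRADE_PER_SECOND / 30

# Frame-rate-independent (correct): scale by time elapsed since last frame.
def degrade_per_dt(durability, dt):
    return durability - DEGRADE_PER_SECOND * dt

for fps in (30, 60):
    buggy = fixed = 100.0
    for _ in range(fps):                  # simulate one second of frames
        buggy = degrade_per_frame(buggy)
        fixed = degrade_per_dt(fixed, 1.0 / fps)
    print(f"{fps} fps: per-frame={buggy:.1f}, delta-time={fixed:.1f}")
# 30 fps: per-frame=98.0, delta-time=98.0
# 60 fps: per-frame=96.0, delta-time=98.0  <- double the wear at 60 fps
```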
 
Slightly surprised ...

Kepler architecture has good performance, but I thought GCN would win given console optimizations and all that. Then again, it's DX11, so not a surprise?

290X nipping at the 980, Fury matching the 980 Ti. I'd call that a win for GCN, given results like that just didn't happen three months ago.
 
290X nipping at the 980, Fury matching the 980 Ti. I'd call that a win for GCN, given results like that just didn't happen three months ago.

Precisely. The 290X sold for substantially less, often $200-250 less than the 980 for most of its life. The same applies to the 390/390X vs. the 980, and to the after-market 290 vs. the 780 Ti. For a large chunk of the 780 Ti's generation it was possible to own 290 CF, and during the 980's generation 290X CF / 295X2, for barely more money than either of those NV cards. In that context, it's like having set aside $200-300 today towards a next-gen card that will be 50-75% faster than a 780 Ti/980.

That's why I stress price/performance as a key metric for most gamers. It might not be obvious right away, but if you buy a $550 card that performs barely better than a $280 card in modern games, it's akin to spending $270 extra for very little benefit other than lower power usage (quick math below). To me, that makes GCN far superior since it costs way less to own over time, which makes future GPU upgrades cheaper, or lets you spend that money on a better monitor, SSD, or platform upgrade. It's actually possible to sell an old 2500K/2600K + mobo + DDR3 and move to an i7-6700K + Z170 + DDR4 by reinvesting the old parts' value plus the $270 saved by not buying a 980.

It seems like at the end of this generation, a 980 is barely faster on average than a 290X/390X. Ouch. It's odd that many NV users ignore huge price disparities like that when comparing performance, as if every gamer here already has an i7-6700K or similar...
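A quick worked version of that math, using the round numbers above (the prices are from the post; the 95% performance ratio is an assumption for illustration):

```python
# Hypothetical comparison: a $550 card vs. a $280 card, with an assumed
# 95% relative performance for the cheaper one.
expensive_price, expensive_perf = 550, 1.00   # e.g. a 980-class card
cheap_price, cheap_perf = 280, 0.95           # e.g. a 290X/390-class card

saved = expensive_price - cheap_price               # dollars kept in pocket
perf_given_up = 1 - cheap_perf / expensive_perf     # fraction of perf lost

print(f"${saved} saved for giving up {perf_given_up:.0%} performance")
# -> $270 saved for giving up 5% performance; that money can fund a monitor,
#    an SSD, or a chunk of the next GPU upgrade.
```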
 
Precisely. The 290X sold for substantially less, often $200-250 less than the 980 for most of its life.

Yeah, but the 290X was only $50 cheaper than the 980 at the 980's launch. Its price only tanked after the 980 launched.

PS: If developers do focus on DX12, then I worry that they will focus on GCN 1.1 and nothing else. Driver optimization options are going to be limited; it's all up to the developer.
 