gamegpu DARK SOULS III Benchmarks

csbin

Senior member
Feb 4, 2013
http://gamegpu.com/rpg/р&#108...вые/dark-souls-iii-test-gpu


[gamegpu benchmark charts: GPU and CPU performance results]
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
Slightly surprised ...

Kepler is performing well here, but I thought GCN would win given console optimizations and all that. Then again, it's DX11, so maybe it's not a surprise?
 

motsm

Golden Member
Jan 20, 2010
Performance is pretty unimpressive considering the visuals, though of course the game looks great from an art-direction standpoint regardless. Still capped at 60 FPS as well, which just shouldn't be a thing.
 
Aug 11, 2008
Pretty meaningless CPU benchmarks, with the FPS apparently capped at 60. And Phynaz, seriously, do you really think 1 FPS higher minimum and a capped maximum of 60 really shows the i3 "beating" the 8350? By that logic, the i3 beats the 5960X by even more. Basically, though, anything FX-6300 or above is all the CPU the game needs (or can utilize) with the framerate cap.
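
To see why the cap makes the CPU chart useless, a quick sketch (the uncapped numbers are made up, not measured):

# Hypothetical uncapped, CPU-bound framerates; not measured data.
uncapped = {"i3-4330": 92, "FX-8350": 118, "i7-5960X": 140}

CAP = 60  # the game's hard framerate limit

# With the cap, every CPU that can push past 60 FPS reports the same
# number, so the benchmark can no longer tell them apart.
capped = {cpu: min(fps, CAP) for cpu, fps in uncapped.items()}
print(capped)  # {'i3-4330': 60, 'FX-8350': 60, 'i7-5960X': 60}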
 

Bacon1

Diamond Member
Feb 14, 2016
i3 beating 8350 again, and where it counts.

The $159.99 8350 beating a $1,049.99 i7-5960X, what a slaughter!!!

Seriously dude, give it a rest.

Fury X tied with the 980 Ti @ 4K

Nano making a good showing too
 
Feb 19, 2009
Why is there so much hate for the 60 FPS cap? As long as the minimum FPS is 60 too, the gameplay experience is excellent. This isn't a competitive online FPS, it's a single-player action RPG.

Is it a PCMR thing? I've always been a PC gamer, and 60 FPS is my target; it's good when it happens.
 
Mar 10, 2006
Why is there so much hate for the 60 FPS cap? As long as the minimum FPS is 60 too, the gameplay experience is excellent. This isn't a competitive online FPS, it's a single-player action RPG.

Is it a PCMR thing? I've always been a PC gamer, and 60 FPS is my target; it's good when it happens.

Do you use a high refresh rate monitor?
 

Spanners

Senior member
Mar 16, 2014
i3 beating 8350 again, and where it counts.

Didn't the FX-6300 embarrass itself vs the i7-5960X, though? "Where it counts" indeed; shameful. I'll be cogitating over these benchmarks for some time. Hilarious.
 

Denithor

Diamond Member
Apr 11, 2004
It looks like CrossFire is broken at 4K while SLI works just fine. Do we need an updated driver from AMD/RTG?

EDIT: Actually, it could be broken completely. It's impossible to tell, because at lower resolutions the single Fury X is FPS-capped, so CF could be working or not and have no impact.

EDIT2: It is broken. The 7990 gives exactly the same results as the 7970 in every case. AMD needs a new driver!
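
(The same check in a few lines of Python, with placeholder FPS values rather than gamegpu's actual numbers:)

# Placeholder FPS values; substitute the real chart numbers.
results = {"7970": 54, "7990": 54}

def scaling_broken(single_fps, dual_fps, tolerance=0.05):
    # If the dual-GPU card lands within ~5% of the single GPU,
    # the second GPU is effectively doing nothing.
    return dual_fps <= single_fps * (1 + tolerance)

print(scaling_broken(results["7970"], results["7990"]))  # True -> CF broken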

Once you uncap, you never go back.

I've never used a high-refresh screen (well, not since the CRT days, LOL). Is there really that much difference?
 

tential

Diamond Member
May 13, 2008
The FX is beating the 5960X by as much as the i3 is beating the FX. By one FPS. Imperceptible difference.

i3-4330 > FX-8350 > i7-5960X

What's wrong with that?
The i7-5960X is clearly the worst processor and the i3-4330 is the best, duh.
 

thesmokingman

Platinum Member
May 6, 2010
The numbers look somewhat surprising without the doom and gloom, but they still look whacked out.
 

Headfoot

Diamond Member
Feb 28, 2008
I wonder if actual game systems run at 2x speed like in their previous tacked-on 60 FPS efforts. If your weapons degrade faster because you're running at 60 instead of 30, that's not good. Maybe that was OK back when Quake 3 came out, but in 2016 it's embarrassing. Let's hope they figured that out.
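
(For reference, the usual fix is to tie game logic to elapsed time instead of frame count. A minimal sketch; the wear rates are made-up numbers:)

import time

WEAR_PER_SECOND = 0.5  # hypothetical durability loss per second

def frame_tied_wear(durability, wear_per_frame=0.01):
    # Buggy pattern: wear is charged per frame, so 60 FPS degrades
    # weapons twice as fast as 30 FPS.
    return durability - wear_per_frame

def time_based_wear(durability, dt):
    # Correct pattern: wear is charged per elapsed second, so the
    # result is identical at 30, 60, or 144 FPS.
    return durability - WEAR_PER_SECOND * dt

durability = 100.0
last = time.perf_counter()
for _ in range(10):          # stand-in for the render loop
    now = time.perf_counter()
    dt = now - last          # seconds since the previous frame
    last = now
    durability = time_based_wear(durability, dt)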
 

therealnickdanger

Senior member
Oct 26, 2005
Still prefer a 60 Hz plasma over a 144 Hz monitor with G-Sync for most games.

We already know that OLED has the response time necessary to achieve excellent motion, but they need to get pixel decay times down and brightness up before we can achieve CRT-parity... or exceed it.
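
(Rough numbers on why brightness is the sticking point; these figures are illustrative, not measured:)

# Back-of-the-envelope strobing math; all numbers are illustrative.
target_nits = 100.0          # desired average brightness
frame_ms = 1000 / 60         # ~16.7 ms per frame at 60 Hz
persistence_ms = 2.0         # CRT-like short flash per frame

# Perceived motion blur scales with how long each frame stays lit, so
# shorter persistence means sharper motion. But lighting the pixel for
# only 2 ms of every 16.7 ms frame means peak brightness must rise by
# the inverse of the duty cycle to keep the same average brightness.
duty_cycle = persistence_ms / frame_ms
print(round(target_nits / duty_cycle))  # ~833 nits peak required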
 

S.H.O.D.A.N.

Senior member
Mar 22, 2014
Slightly surprised ...

Kepler is performing well here, but I thought GCN would win given console optimizations and all that. Then again, it's DX11, so maybe it's not a surprise?

The 290X nipping at the 980, Fury matching the 980 Ti. I'd call that a win for GCN, given that results like that just didn't happen three months ago.
 

RussianSensation

Elite Member
Sep 5, 2003
The 290X nipping at the 980, Fury matching the 980 Ti. I'd call that a win for GCN, given that results like that just didn't happen three months ago.

Precisely. The 290X sold for substantially less, often up to $200-250 less for most of its life when compared to the 980. The same applies to the 390/390X vs. the 980, as well as to the after-market 290 vs. the 780 Ti. For a large chunk of the 780 Ti's generation it was possible to own 290 CF, and during the 980's generation 290X CF / 295X2, for barely more money than either of those NV cards.

In that context, it's like having set aside $200-300 today towards a next-gen card that will be 50-75% faster than a 780 Ti/980. That's why I stress price/performance as a key metric for most gamers. It might not be obvious right away, but if you buy a $550 card that performs barely better than a $280 card in modern games, you are spending $270 extra for very little benefit other than lower power usage.

To me, that makes GCN far superior, since it costs way less to own over time, which makes future GPU upgrades cheaper, or lets you spend that money on a better monitor, SSD, or platform upgrade. It's actually possible to sell an old 2500K/2600K + mobo + DDR3 and move to an i7 6700K + Z170 + DDR4 after reinvesting the old parts' value plus the $270 saved by not buying a 980.
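
To put the price/performance point in concrete terms (prices are the ones quoted above; the FPS figures are placeholders, not test data):

cards = {
    "290X": {"price": 280, "fps": 62},
    "980":  {"price": 550, "fps": 66},
}

for name, c in cards.items():
    # Dollars paid per frame of performance: lower is better.
    print(name, round(c["price"] / c["fps"], 2), "$/fps")

# ~4.52 $/fps vs ~8.33 $/fps: even if the 980 is ~6% faster, it costs
# nearly twice as much per frame delivered, i.e. the "$270 extra for
# very little benefit" above.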

It seems like at the end of this generation, a 980 is barely faster on average than a 290X/390X. Ouch. It's odd that many NV users always ignore huge price disparities like that when comparing performance, as if every gamer here already has an i7 6700K or similar...
 

jpiniero

Lifer
Oct 1, 2010
Precisely. The 290X sold for substantially less, often up to $200-250 less for most of its life when compared to the 980.

Yeah, but the 290X was only $50 cheaper than the 980 when the 980 launched; the price only tanked afterwards.

PS: If developers do focus on DX12, I worry they'll target GCN 1.1 and nothing else. Driver optimization options are going to be limited; it's all up to the developer.