[benchmarks] The Division - Steam Release

Feb 19, 2009
10,457
10
76
I'll add more data as other sites cover it. This one is the first in on today's Steam release.

http://www.gamestar.de/hardware/praxis/technik-check/3269013/the_division_pc_technik.html

They got NV's latest "Game Ready" driver for it and AMD's latest driver too.

For the benchmarks, Nvidia's new Division driver 364.47 is used; AMD has not yet prepared a custom driver, so the Crimson Edition driver 16.2.1 comes into use here. Our benchmark sequence consists of a short run through the streets of New York, which we play in Full HD at the "Ultra" quality preset.

2ZYbGsQ.jpg



---------------------------------

Kudos to Techspot for testing so many GPUs; a really thorough examination, including older tech to see how it holds up in modern games.

http://www.techspot.com/review/1148-tom-clancys-the-division-benchmarks/

It's not listed in the charts, but the cards are not all stock models. The list of exactly which GPUs were used is on the first page.

The 980Ti is a reference card, but the 970/980 are both Gigabyte G1 OC models which boost very high (along with high power consumption).

The 390/390X are running at 1050/1070MHz, the 380 is at its 1GHz stock clock, and the 960 is a factory OC model.

Ultra_1080p.png


Ultra_1600p.png


High_4K.png


CPU_01.png


---------------------------------

http://www.purepc.pl/karty_graficzn...ys_the_division_pc_bedzie_sie_dzialo?page=0,9

HBAO+ & PCSS OFF 1080p:

7kGJ8qc.jpg


HBAO+ & PCSS OFF 1440p:

2sk8AWz.jpg
 

MBrown

Diamond Member
Jul 5, 2001
5,726
35
91
I am afraid to use 364.47 because of the problems I've been reading about.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
FYI Don't update to the Nvidia game ready drivers if you have multiple monitors, they have been crashing / forcing reboot loops for lots of people. Wait for a hotfix.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Here we look at a GTX 980 Ti and a Core i7-4790K at a resolution of 2560x1440 pixels in a scene with fire and get a relatively low 38 fps.

If we move away from the fire (and especially the smoke) and look down the streets of the city instead, the fps climbs to a noticeably more fluid 52 frames per second.

Yikes, 52 fps @ 1440p not in combat, and fire brings it down to 38 fps?
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
Wait, what...?

R9 380X faster than GTX 970?


Most likely a compute-shader-heavy title, built for GCN on the consoles. A GTX 970 has ~3.5 TFLOPS and a 380X has ~4 TFLOPS.

Overclocking the GTX 970 to 1,200MHz should help it achieve parity in that metric. The MSI GTX 970 Gaming 4G is clocked at 1,140MHz. So a 60MHz increase would do it.
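
Rough back-of-the-envelope check (assuming the usual 1664 CUDA cores for the 970, 2048 stream processors for the 380X at its ~970MHz reference clock, and 2 FLOPs per core per clock):

1664 × 2 × 1.05GHz ≈ 3.5 TFLOPS (970 at the 1,050MHz reference base clock)
1664 × 2 × 1.20GHz ≈ 4.0 TFLOPS (970 overclocked to 1,200MHz)
2048 × 2 × 0.97GHz ≈ 4.0 TFLOPS (380X)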
 

Udgnim

Diamond Member
Apr 16, 2008
3,680
124
106
what are the clocks on the 980 & 970 they are testing

there's 25% difference in performance between the two

that isn't normal right?
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
what are the clocks on the 980 & 970 they are testing

there's 25% difference in performance between the two

that isn't normal right?
GTX 980 model: 1,203MHz
GTX 970 model: 1,140MHz.

That's around a 30% difference in theoretical compute throughput between the two. When you consider that L2 cache sharing between SMMs leads to higher arithmetic latency as you scale up the SMM count (GTX 970 --> GTX 980 --> GTX 980 Ti --> Titan X), the compute difference argument becomes plausible.
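
Quick sanity check on that, again assuming the usual core counts (2048 for the 980, 1664 for the 970) and 2 FLOPs per core per clock:

2048 × 2 × 1.203GHz ≈ 4.9 TFLOPS (980 at 1,203MHz)
1664 × 2 × 1.140GHz ≈ 3.8 TFLOPS (970 at 1,140MHz)
4.9 / 3.8 ≈ 1.30, i.e. roughly a 30% gap in theoretical throughput.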
 
Feb 19, 2009
10,457
10
76
Yeah but as I understand it, under heavy compute shader loads, Maxwell GPUs don't boost.

I wouldn't make that assumption. There needs to be evidence.

Gaming 970 actual in-game boost clocks: 1366MHz

http://hardocp.com/article/2014/09/29/msi_geforce_gtx_970_gaming_4g_video_card_review/8#.Vt4YDfl96Uk

Reference GeForce GTX 970 Clock - Base: 1050MHz / Boost: 1178MHz

We typically see a big performance gain from custom models, and it's due to much higher boost clocks.
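
As a rough sense of scale from those numbers: 1366 / 1178 ≈ 1.16, so the Gaming 4G runs about 16% above the reference boost spec, and roughly 30% above the 1,050MHz reference base clock.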

The Palit Jetstream 980 model IIRC, boosts to >1.4ghz, similar to the Gigabyte G1s.

It's a shame these review sites don't list the observed clocks in-game. :/
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
The Snowdrop engine is just absolutely beautiful. I would hope for the other in-house Ubisoft studios to adopt it and get it ported to DX12 as well...
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
I wouldn't make that assumption. There needs to be evidence.

Gaming 970 actual in-game boost clocks: 1366MHz

http://hardocp.com/article/2014/09/29/msi_geforce_gtx_970_gaming_4g_video_card_review/8#.Vt4YDfl96Uk

Reference GeForce GTX 970 Clock - Base: 1050MHz / Boost: 1178MHz

We typically see a big performance gain from custom models, and it's due to much higher boost clocks.

The Palit Jetstream 980 model IIRC, boosts to >1.4ghz, similar to the Gigabyte G1s.

It's a shame these review sites don't list the observed clocks in-game. :/
Well I'm not making a definitive assumption. That's why I used terms like "most likely" and "plausible".

I do know that this game was built for the XBox One and PS4 with the PC version being an afterthought.
 
Feb 19, 2009
10,457
10
76
I do know that this game was built for the XBox One and PS4 with the PC version being an afterthought.

It's actually quite impressive.

https://www.youtube.com/watch?v=UKtjDDsO6Y4

One of the best Ubisoft production titles, ever. The Beta was smooth and bug-free for me, which is something not associated with Ubi titles even post-release. ;)

I almost hit buy on Steam, something unthinkable for a release-day Ubi game. lol Normally it's a train wreck; you've gotta wait a few months for patches to fix a broken game.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
8,739
7,350
136
FYI Don't update to the Nvidia game ready drivers if you have multiple monitors, they have been crashing / forcing reboot loops for lots of people. Wait for a hotfix.

I had to DDU them and I'm running a single monitor.
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
I guess Nvidia forgot to add the "sabotage AMD" code into their "blackbox" we call GameWorks?

Or will that accusation only come out if other reviews show the game favoring Nvidia?
 
Feb 19, 2009
10,457
10
76
I guess Nvidia forgot to add the "sabotage AMD" code into their "blackbox" we call GameWorks?

Or will that accusation only come out if other reviews show the game favoring Nvidia?

They probably did forget, or maybe they only added it for Kepler, cos the beta results for Kepler are lol-worthy bad.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
They probably did forget, or maybe they only added it for Kepler, cos the beta results for Kepler are lol-worthy bad.

Or they added something that AMD has already worked out a fix for? Or the dev wouldn't allow their game to be buggy and broken? Lots of possibilities.
 
Feb 19, 2009
10,457
10
76
Or they added something that AMD has already worked out a fix for? Or the dev wouldn't allow their game to be buggy and broken? Lots of possibilities.

I mean, Kepler tanked in Rise of the Tomb Raider while AMD performs fine; the 390 is 20% faster than the 970 in that GameWorks title on release!

Now we're seeing NV-partnered titles running heaps better on GCN... it seems to me that AMD has GameWorks figured out for DX11, and the next-gen engines are optimized for the consoles' GCN. How else to explain the wave of next-gen AAA games showing such a huge lead for GCN?
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
I wouldn't make that assumption. There needs to be evidence.

Gaming 970 actual in-game boost clocks: 1366MHz

http://hardocp.com/article/2014/09/29/msi_geforce_gtx_970_gaming_4g_video_card_review/8#.Vt4YDfl96Uk

Reference GeForce GTX 970 Clock - Base: 1050MHz / Boost: 1178MHz

We typically see a big performance gain from custom models, and it's due to much higher boost clocks.

The Palit Jetstream 980 model IIRC, boosts to >1.4ghz, similar to the Gigabyte G1s.

It's a shame these review sites don't list the observed clocks in-game. :/

The same model does not always provide the same boost. You do not know that?
 

maddie

Diamond Member
Jul 18, 2010
5,147
5,523
136
I guess Nvidia forgot to add the "sabotage AMD" code into their "blackbox" we call GameWorks?

Or will that accusation only come out if other reviews show the game favoring Nvidia?
Too hard to keep track of the many releases. Some are slipping through.

A joke. :D