Watch Dogs CPU benchmarks, i7 (apparently) optional


jj109

Senior member
Dec 17, 2013
391
59
91
Because their benchmarks usually tally up with what most people see with similar hardware on dozens of other games? Far less egregious than other sites that posted Beta - even Alpha - builds of BF4 (simply to be "first post!") that ended up wildly different from the finished game...

Please tell me how TechSpot isn't "first posting" by releasing benchmarks on old drivers.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Interesting: from Tom's review, the FX-8350 is on par with the Ivy Bridge Core i5-3550, and that is with one of the fastest dGPUs available today. Seems WD is using more than 4 threads, as the 6-core 3960X pulls away even at 1080p Ultra settings.


http://www.tomshardware.com/reviews/watch-dogs-pc-performance,3833-8.html
[Attached image: CPU-FR.png]
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
It is indeed CPU heavy. The non-K 4770 + 780 Ti GHz I have still dips at 1200p to the mid-40s from a solid 60 with Vsync, everything maxed including that "pc" setting. I even had a BSOD on stock, non-overclocked (aside from the factory OC on the GPU), well-cooled hardware (GPU hit 72 Celsius, CPU less than 50). So it's heavy and unoptimized, and it doesn't look that pretty. FUN!
 

jj109

Senior member
Dec 17, 2013
391
59
91
What a bizarre selection of CPUs!

Yeah, you would have thought that testing the latest generation across the board, or something like that, would be expected from a major site like THG.

I think deferred context rendering is rearing its head in PCGH and THG benches.

[Attached image: LAVkEWp.png]
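For anyone wondering what "deferred context rendering" means here: it's the DX11 mechanism where worker threads record draw calls into command lists that the render thread replays later, and NVIDIA's driver supports it natively while AMD's (as I understand it) largely does not, which would fit the scaling pattern in those charts. A rough, hypothetical sketch of the API below - nothing to do with Watch Dogs' actual code, just the mechanism (needs the Windows SDK):

Code:
// Minimal sketch of D3D11 deferred contexts / command lists.
// Hypothetical illustration only - not Watch Dogs or driver code.
#include <d3d11.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d11.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D11Device> device;
    ComPtr<ID3D11DeviceContext> immediate;
    // Create a device on the default adapter; no swap chain needed for this sketch.
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, &immediate)))
        return 1;

    // In a real engine each worker thread would own one of these and record
    // its share of the frame's draw calls on it.
    ComPtr<ID3D11DeviceContext> deferred;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return 1;

    // Record some state on the deferred context (just a viewport here).
    D3D11_VIEWPORT vp{0.0f, 0.0f, 1280.0f, 720.0f, 0.0f, 1.0f};
    deferred->RSSetViewports(1, &vp);

    // Close the recording into a command list...
    ComPtr<ID3D11CommandList> cmdList;
    if (FAILED(deferred->FinishCommandList(FALSE, &cmdList)))
        return 1;

    // ...and the render thread replays it on the immediate context.
    immediate->ExecuteCommandList(cmdList.Get(), FALSE);
    return 0;
}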
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
Even the 3770K is going as low as 22 FPS with the Radeon; maybe it's time to change the thread title.

It looks more like an i5/i7 at 4.5GHz is required (if you want max details and always over 30 FPS, especially with a Radeon).
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Even the 3770K is going as low as 22 FPS with the Radeon; maybe it's time to change the thread title.

So at Tom's, the Radeon R9 290X drops to 22 fps, but at HardOCP it beats the 780 Ti? How can Tom's review even be taken seriously when they used FXAA and Medium textures and still saw performance drop to 22 fps at 1080p?

When reputable sites show the 290X performing much closer to the 780 Ti than Tom's does, one has to question their methodology, and to be honest theirs leaves a serious question mark for me.

Another question: if this game drops to 22 fps on a 3770K and a high-end Radeon at 1080p but doesn't look better than Crysis 3 or Metro LL, what conclusion can be made? No excuses, really, for that level of optimization or lack thereof. The CPUs inside the PS4/XB1 are crap compared to an i7 3770K, even at stock.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
So at Tom's, the Radeon R9 290X drops to 22 fps, but at HardOCP it beats the 780 Ti? How can Tom's review even be taken seriously when they used FXAA and Medium textures and still saw performance drop to 22 fps at 1080p?

When reputable sites show the 290X performing much closer to the 780 Ti than Tom's does, one has to question their methodology, and to be honest theirs leaves a serious question mark for me.

Another question: if this game drops to 22 fps on a 3770K and a high-end Radeon at 1080p but doesn't look better than Crysis 3 or Metro LL, what conclusion can be made? No excuses, really, for that level of optimization or lack thereof. The CPUs inside the PS4/XB1 are crap compared to an i7 3770K, even at stock.

I'm talking about the pcgameshardware graphic comparing CPU scaling on NVIDIA and AMD; it has nothing to do with Tom's Hardware.

The game is running poorly considering how it looks, sure, and I guess DX12 is really needed. But AMD's software also seems to be a lot less efficient, requiring a lot more CPU performance compared to NVIDIA, as you can see at both 3.4 and 4.6GHz. And, as usual, NVIDIA gains a lot more from more CPU threads.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Why test at 720p? I mean seriously, this type of game will murder most CPUs at that res. Should have 1080/1200p minimum . . . . .
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Why test at 720p? I mean seriously, this type of game will murder most CPUs at that res. Should have 1080/1200p minimum . . . . .
This really shouldn't have to be explained, but they test at 1280 to see if there are any real differences between CPUs. If they tested at 1920 you'd be more GPU limited, and you wouldn't know what performance to expect if you turned down some graphics settings or ran a faster GPU setup than what's in the review.
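To put the same point in numbers - a toy model with made-up figures, where the delivered frame rate is simply capped by whichever side (CPU or GPU) is slower:

Code:
// Toy model of why low-resolution CPU tests are useful.
// All numbers are invented for illustration.
#include <algorithm>
#include <initializer_list>
#include <cstdio>

int main() {
    const double cpu_fps_a = 60.0;  // hypothetical slower CPU
    const double cpu_fps_b = 90.0;  // hypothetical faster CPU
    // Two GPU scenarios: heavily loaded (1920-ish) vs. lots of headroom (1280-ish).
    for (double gpu_fps : {45.0, 140.0}) {
        double fps_a = std::min(cpu_fps_a, gpu_fps);
        double fps_b = std::min(cpu_fps_b, gpu_fps);
        std::printf("GPU cap %3.0f fps -> CPU A: %3.0f fps, CPU B: %3.0f fps\n",
                    gpu_fps, fps_a, fps_b);
    }
    // At the 45 fps GPU cap both CPUs read the same number; only the 140 fps
    // cap (i.e. the low-res test) exposes the difference between them.
    return 0;
}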
 

erunion

Senior member
Jan 20, 2013
765
0
0
I will agree they used a small selection of CPUs, but bizarre? Why??

No mainstream i7, but they included an EE.
The 3- and 4-module chips are Vishera, but the 2-module chip is BD.
A mid-level i5 was used, not the more common 3570.
The IVB parts are over 2 years old.

Clearly, when they made their selection, they weren't thinking about what would be most useful to readers shopping for a new CPU to play Watch Dogs on.
 

Bubbleawsome

Diamond Member
Apr 14, 2013
4,834
1,204
146
I hope for driver optimizations, but it blows my mind that even my brand-new i5 4670K couldn't max it at launch. I shouldn't need to rely on overclocking on a brand-new enthusiast CPU. Even the extra 0.4GHz I got on 1 core and the 0.7GHz I got on 4 cores doesn't look like it would push my 75Hz monitor. :|
 

crashtech

Lifer
Jan 4, 2013
10,695
2,293
146
I hope for driver optimizations, but it blows my mind that even my brand-new i5 4670K couldn't max it at launch. I shouldn't need to rely on overclocking on a brand-new enthusiast CPU. Even the extra 0.4GHz I got on 1 core and the 0.7GHz I got on 4 cores doesn't look like it would push my 75Hz monitor. :|
That's true for the 4960X as well, and that is a good thing! We want games that push the envelope. We should have games that need the highest-end card or SLI to play at Ultra settings at 60+ fps.
 

Bubbleawsome

Diamond Member
Apr 14, 2013
4,834
1,204
146
That's true for the 4960X as well, and that is a good thing! We want games that push the envelope. We should have games that need the highest-end card or SLI to play at Ultra settings at 60+ fps.
While I agree, that only works if it looks the part. I'm sure I could code a program that uses 15 cores just to display a pixel of a random color. Minecraft pushes my 4670K to its limits when modded, and my GTX 770 well past its limits with shaders, but if vanilla did that it would be considered broken. Watch Dogs isn't pushing limits, it's handicapped.
 

TrantaLocked

Junior Member
Jul 25, 2014
17
0
66
What confuses me is how the 4670K is matching the FX-9590 OC 5GHz in Watch Dogs. That just doesn't seem right, especially when the game was designed around AMD 8-core APUs in the consoles. I wonder if they were biased to cater to Intel-specific instructions for the PC version.
 
Last edited:

davie jambo

Senior member
Feb 13, 2014
380
1
0
What confuses me is how the 4670K is matching the FX-9590 OC 5GHz in Watch Dogs. That just doesn't seem right, especially when the game was designed around AMD 8-core APUs in the consoles. I wonder if they were biased to cater to Intel-specific instructions for the PC version.

Does not matter what CPU you use, the game runs like [garbage].

I've tried it on an FX-8350, an i7 3770K and a G850 (with 1x 7970, 2x 7970 and 1x 7850 cards).

It does not run well on any of them.

No profanity in the tech forums, please
-ViRGE
 
Last edited by a moderator:

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
What confuses me is how the 4670K is matching the FX-9590 OC 5GHz in Watch Dogs. That just doesn't seem right, especially when the game was designed around AMD 8-core APUs in the consoles. I wonder if they were biased to cater to Intel-specific instructions for the PC version.


Consoles run a custom OS and "API/driver"; PCs are running Windows, DX11, and Forceware/Catalyst.

The consoles' CPU is a totally different architecture compared to the 9590 (it's a lot slower), and as far as I know, on the PS4 games only have access to 6 cores (1.6GHz Jaguar).

I think the PC is just paying the price for lack of optimization and for Windows/DX11.
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Interesting: from Tom's review, the FX-8350 is on par with the Ivy Bridge Core i5-3550, and that is with one of the fastest dGPUs available today. Seems WD is using more than 4 threads, as the 6-core 3960X pulls away even at 1080p Ultra settings.


http://www.tomshardware.com/reviews/watch-dogs-pc-performance,3833-8.html
[Attached image: CPU-FR.png]

Hmm.

All three Intel chips have a base speed of 3.3GHz. The top model is 6C/12T, the middle is 4C/4T and the lowest is 2C/4T, making this pretty close to a straight comparison of core and thread counts.

Going from 2/4 to 4/4 nets a 54% increase in minimum fps.
Going from 4/4 to 6/12 nets a 38% increase in minimum fps.

So my take from this is that the game is fairly heavily multi-threaded, but even so there is a point of diminishing returns (performance plateaus), which also explains how well the AMD chips manage to do (the FX-8350 managing a small win over the i5-3550).
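Just to make the diminishing returns explicit - a quick check of those percentages using an assumed baseline, since the actual minimum-fps numbers are only in Tom's chart:

Code:
// Rough arithmetic on the quoted scaling figures. The 26 fps baseline is an
// assumption for illustration; only the +54% and +38% steps come from the post above.
#include <cstdio>

int main() {
    const double min_2c4t  = 26.0;              // assumed i3-class minimum fps
    const double min_4c4t  = min_2c4t * 1.54;   // +54% going 2C/4T -> 4C/4T
    const double min_6c12t = min_4c4t * 1.38;   // +38% going 4C/4T -> 6C/12T
    std::printf("2C/4T: %.1f  4C/4T: %.1f  6C/12T: %.1f (min fps)\n",
                min_2c4t, min_4c4t, min_6c12t);
    // Doubling the cores buys 54%, but adding two more cores plus SMT only buys
    // another 38% - scaling is real, yet clearly tapering off.
    std::printf("Total gain over the i3-class chip: %.0f%%\n",
                (min_6c12t / min_2c4t - 1.0) * 100.0);
    return 0;
}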
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Long story short, your CPU matters if:

1. You are using SLI or Crossfire
2. You are running 120fps at reduced settings
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
It is indeed CPU heavy. The non-K 4770 + 780 Ti GHz I have still dips at 1200p to the mid-40s from a solid 60 with Vsync, everything maxed including that "pc" setting. I even had a BSOD on stock, non-overclocked (aside from the factory OC on the GPU), well-cooled hardware (GPU hit 72 Celsius, CPU less than 50). So it's heavy and unoptimized, and it doesn't look that pretty. FUN!

Watch Dogs and Ubisoft are both a joke. That game runs so badly for no apparent reason. But are you sure it's the CPU? Most people would say it's some kind of memory-handling issue. I did the -disablepagefilecheck thing, and while it improved things, it still stutters at will.