PCGH: CoD Black Ops III Benchmarks


ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Also, it is strange to see a COD that runs so well on AMD GPUs. What happened? Is it not NV-sponsored anymore?

It doesn't look like it's sponsored by either IHV, so it's likely a vendor-agnostic game ...

As for why it runs well on AMD graphics: that's AMD's investment in consoles paying off, since developers have to optimize aggressively for VGPR usage on the GCN architecture ...
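To illustrate why VGPR usage matters so much on GCN, here is a rough back-of-the-envelope sketch (my own illustration, not from the thread): each GCN SIMD has a 256-entry VGPR file per lane, and the number of wavefronts that can stay resident (occupancy) is bounded by how many registers each shader thread uses. The figures below are public GCN limits; the function itself is just arithmetic.

```python
# Sketch: how VGPR pressure bounds wavefront occupancy on a GCN SIMD.
# 256 VGPRs per lane, at most 10 resident waves per SIMD, VGPRs
# allocated in granules of 4 -- all public GCN architecture limits.

def gcn_occupancy(vgprs_per_thread: int, max_waves: int = 10,
                  vgpr_file: int = 256) -> int:
    """Wavefronts per SIMD allowed by VGPR pressure alone."""
    if vgprs_per_thread <= 0:
        raise ValueError("shader must use at least one VGPR")
    granule = 4
    # round the allocation up to the granule size
    alloc = -(-vgprs_per_thread // granule) * granule
    return min(max_waves, vgpr_file // alloc)

for vgprs in (24, 48, 84, 128):
    print(vgprs, "VGPRs ->", gcn_occupancy(vgprs), "waves/SIMD")
```

Dropping from 84 to 48 VGPRs, for example, takes a shader from 3 to 5 resident waves per SIMD, which is the kind of latency-hiding headroom console developers chase.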
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I don't understand it, though. Don't they get that gamers see the failness of it all and wait till it's on sale? Instead of a smooth, polished launch that attracts players to buy it en masse at full price, they push out buggy crap and expect us to beta-test the game for them.

I didn't buy BF4 until months later because of that. I'm not touching any AAA titles at full price when they're buggy.

Nope. It's all just some Excel sheet for the MBAs.
 
Feb 19, 2009
10,457
10
76
Guru3D and GameGPU.ru have tested the singleplayer levels and AMD runs really awfully there, like CoD: AW2... so whatever optimizations they did, they're MP-only.

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Call_of_Duty_Black_Ops_III/test/blackops3_1920.jpg


pcgameshardware mentioned they tested singleplayer and it loaded their i7 CPU to ~100% on all threads with a 980.

PhysX multi-threaded in action (AW2) vs single-thread on AMD = gg.

It's almost as if SP and MP are entirely different engines.

This explains it:
http://www.pcgameshardware.de/Call-...el-55478/News/Gameworks-Ankuendigung-1166977/
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The money I had for this game just bought me the SWTOR Amazon starter pack and I have some change to spare.

Speak with your wallets. This game can wait for a Steam Sale.
 

psolord

Golden Member
Sep 16, 2009
1,766
1,128
136
Guru3D and GameGPU.ru have tested the singleplayer levels and AMD runs really awfully there, like CoD: AW2... so whatever optimizations they did, they're MP-only.

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Call_of_Duty_Black_Ops_III/test/blackops3_1920.jpg


pcgameshardware mentioned they tested singleplayer and it loaded their i7 CPU to ~100% on all threads with a 980.

PhysX multi-threaded in action (AW2) vs single-thread on AMD = gg.

It's almost as if SP and MP are entirely different engines.

This explains it:
http://www.pcgameshardware.de/Call-...el-55478/News/Gameworks-Ankuendigung-1166977/


That's a lot closer to what I am getting.

970 runs great, 7950 runs like a turd.

Also the high cpu usage is fixed with the released hotfix.

Very disappointed with my 7950's performance. It's less than half that of the 970, which is not the norm. Usually the 970 is 50-60% faster and that's it.

Actually, I had to turn shadow maps down a notch for it to be borderline acceptable.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
It's really funny, isn't it, when Nvidia does this crap? Basically, if Nvidia says they are working closely with a dev, expect a broken game. That's one consistent thing, save a few exceptions. It seems to be getting worse.
 

happy medium

Lifer
Jun 8, 2003
14,387
479
126
What has happened to the performance of the Kepler-based cards too??

The 780 series is doing fine; the 770, 760 and 680 need more VRAM, it seems.
The 960 2GB seems to do a better job at memory compression.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Just some more benchmarks from Guru3D:


The 780 series is doing fine; the 770, 760 and 680 need more VRAM, it seems.
The 960 2GB seems to do a better job at memory compression.

No. Considering the 780 Ti was a $699 card and the R9 290 was a $399 card, while the 780 cost $650 and dropped to $499 when the 290 was still $399, the entire GTX 700 series is performing horribly if we factor in their pricing across most of the HD 7970/R9 290 vs. GTX 600/700 generations.

Interesting how you specifically singled out the 1080p graph when, on the same page, there is a consolidated graph with 1080p, 1440p and 4K. That changes the picture, as many of us have 1440p monitors.

Poor Kepler, it's like a coin flip now to guess if Kepler will bomb in a new AAA game or not.


Guru3d and GameGPU.Ru has tested the singleplayer levels and AMD runs really awful there, like CoD: AW2.. so whatever optimizations they did, its only for MP.

And who pays $60 for COD to play its SP campaign? More alarming is that COD BO3 might signal a trend for future 2016-2017 console ports with similar system RAM and VRAM requirements. It looks like 4GB could soon be the bare minimum for GPU VRAM, and 16GB might start edging towards mainstream system memory for smooth gaming.

In any case, this port is still a giant turd as per Guru3D:

"The problem however remains, the game looks good ... but remains to be just that -- nothing excels in PC graphics quality to a level that amazes. When it works the game does run smooth enough on pretty much any modern age graphics card.

So the results you see today are INDICATIVE and not precise. This game is a mess to measure. We found a sweet spot setting that works pretty good and measure in a level (In Darkness Quarantine Zone, Singapore) that allows fairly consistent frame-rates. However the number you see WILL change even your GTX 980 TI or Fury X at one point will drop to 30 FPS. These means that the results shown today are indicative, not a precise measurement by any standard."


Yet another failed console port.
 

happy medium

Lifer
Jun 8, 2003
14,387
479
126
Interesting how you specifically singled out the 1080P graph

Hmmm, maybe because I play @ 1080p, like most people in this world! I don't care about other resolutions.
Some people actually look at reviews to see how their rig will play, not so they can tell someone how to save a nickel in 40 paragraphs.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Hmmm, maybe because I play @ 1080p, like most people in this world! I don't care about other resolutions.
Some people actually look at reviews to see how their rig will play, not so they can tell someone how to save a nickel in 40 paragraphs.

You made a claim about Kepler's performance and how the 780 was doing fine. There is plenty of information that proves otherwise. From all angles (price/performance, minimum FPS), Kepler is doing horribly in this game, unless you find it "fine" that a $700 780 Ti is barely keeping up with a $400 R9 290 at 1080p and gets crushed at 1440p. csbin provided even more supporting data. The 280X beating the 770 by 53% is a crazy bad showing for 2GB cards/Kepler. Good decision by NV to discontinue the 960 2GB. Hopefully AMD follows suit and discontinues the 380 2GB.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
When will the PC master race realize the idiocy of proudly paying big bucks for faster hardware only to have it negated by shitty ports? There is clearly an incentive for this to happen from both the GPU and gaming industries.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
I'm mind-boggled at why people aim for 1080p just because they have a 1080p monitor. I have a 1080p screen and I will NEVER play at native 1080p ever again; VSR/DSR makes it easy. The increase in IQ with VSR was worth enough to me to get the R9 290 after already having the 7950, although I really wanted Fury and just couldn't justify it for 4K.
AMD REALLY dropped the ball by not letting more GPUs support 4K VSR. It honestly makes no sense whatsoever for AMD to cap the 7000 series at 1440p and the R9 200 series at 1800p, especially when we saw mods that enabled 4K VSR on the R9 200 series.
Too bad, though, that we're buying faster and faster GPUs and getting ZERO IQ increase.
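For anyone wondering why those VSR caps matter for performance, here is some quick arithmetic (my own illustration, not from the thread): VSR/DSR renders the frame at the virtual resolution before scaling it down to the display, so GPU load grows roughly with the virtual pixel count.

```python
# Pixel-load multipliers for common VSR/DSR virtual resolutions,
# relative to a native 1080p render. Downsampling renders at the
# virtual resolution first, so cost scales ~linearly with pixels.

RES = {"1080p": (1920, 1080), "1440p": (2560, 1440),
       "1800p": (3200, 1800), "4K": (3840, 2160)}

base = RES["1080p"][0] * RES["1080p"][1]
for name, (w, h) in RES.items():
    print(f"{name}: {w * h:,} px, {w * h / base:.2f}x the 1080p load")
```

So 4K VSR is roughly a 4x pixel load versus native 1080p, which is why a single midrange card struggles with it even in well-optimized titles.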
 
Feb 19, 2009
10,457
10
76
@tential
You must be joking if you think 4K VSR is relevant for 7950/7970 GPUs, or even the R9 290/X. We're seeing new titles hammer GPUs at 1080p already, and that weaker class has no grunt for those high resolutions.

If you're such a fan of IQ/VSR, you should appreciate that AMD's VSR does not blur the scene; that alone is a huge deal.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
When will the PC master race realize the idiocy of proudly paying big bucks for faster hardware only to have it negated by shitty ports? There is clearly an incentive for this to happen from both the GPU and gaming industries.

I'm already upset. I made a thread here when E3 happened about how, now that the new consoles were coming out, we'd get amazing games with Crysis 3 graphics....
I guess I see why people lol'd at me. This is a joke. This is NOT what I expected in the years since the new consoles released. We've regressed. We have minimal IQ improvements at best, for massive hardware requirement increases.

Maybe 2 years from now DX12 will mature and we'll see some good games with increases in quality.
It seems this whole console cycle, we'll have minimal increases in graphics fidelity. MINIMAL.
 
Feb 19, 2009
10,457
10
76
We have minimal IQ improvements at best, for massive hardware requirement increases.

Maybe 2 years from now DX12 will mature and we'll see some good games with increases in quality.
It seems this whole console cycle, we'll have minimal increases in graphics fidelity. MINIMAL.

Did you see Battlefront's reveal of the other maps? The forest looks better than Crysis 3 AND it's fluid and fast without requiring uber hardware.

There are studios that repeatedly release poorly optimized console ports to PC, and there are studios that make polished, optimized PC games first and foremost while toning them down for consoles.

Look at Alien: Isolation as another good example: on a moderate GPU setup at 1080p, it runs at over 100 fps with scenes that look better than this COD.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
@tential
You must be joking if you think 4K VSR is relevant for 7950/7970 GPUs. Or even R290/X. We're seeing new titles hammer GPUs on 1080p already, and these weaker class has no grunt for those high-resolution.

If you're such a fan of IQ/VSR, you should realize AMD's VSR does not blur the scene, that already is a huge deal.

I'd have two R9 290s in CrossFire right now to manage 4K VSR.
Instead, I have one R9 290 doing 1800p VSR, a hard cap set by AMD.
Why does AMD choose to hard-cap it at 1800p VSR when 4K VSR is clearly possible? AMD uses a clear segmentation strategy:
1440p VSR - 7000 series
1800p VSR - 200 series
4K VSR - Fiji
And that's acceptable? I thought we were sitting here complaining about Kepler tanking in new games so people buy Maxwell. AMD does the same thing.... I refuse to buy the "overclocker's dream" AMD. Well, at least not new anyway.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
I'd have two R9 290s in CrossFire right now to manage 4K VSR.
Instead, I have one R9 290 doing 1800p VSR, a hard cap set by AMD.
Why does AMD choose to hard-cap it at 1800p VSR when 4K VSR is clearly possible? AMD uses a clear segmentation strategy:
1440p VSR - 7000 series
1800p VSR - 200 series
4K VSR - Fiji
And that's acceptable? I thought we were sitting here complaining about Kepler tanking in new games so people buy Maxwell. AMD does the same thing.... I refuse to buy the "overclocker's dream" AMD. Well, at least not new anyway.

What is the max for the R9 285/380?
 
Feb 19, 2009
10,457
10
76
@tential

VSR was a software feature added later; the original Tahiti & Hawaii SKUs did not even support it. When they added it, they told everyone about the limited resolution support on those SKUs.

It's not like they hid it. You're equating limitations in an optional added feature with the actual obsolescence of Kepler? Way too much of a stretch.

These SKUs AMD sold are old. They could have simply NOT added any VSR support for Tahiti, but they instead chose to do so years after those SKUs launched, just as they continue to improve GCN performance over time. You want to bash them for that? Go ahead. I don't buy that argument one iota.
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
I'd have two R9 290s in CrossFire right now to manage 4K VSR.
Instead, I have one R9 290 doing 1800p VSR, a hard cap set by AMD.
Why does AMD choose to hard-cap it at 1800p VSR when 4K VSR is clearly possible? AMD uses a clear segmentation strategy:
1440p VSR - 7000 series
1800p VSR - 200 series
4K VSR - Fiji
And that's acceptable? I thought we were sitting here complaining about Kepler tanking in new games so people buy Maxwell. AMD does the same thing.... I refuse to buy the "overclocker's dream" AMD. Well, at least not new anyway.

I don't use super sampling because I really can't tell the difference, but that artificial restriction sucks.