gamegpu Call of Duty: Black Ops III Beta Benchmark

Page 2 - AnandTech Forums
Feb 19, 2009
10,457
10
76
Advanced Warfare is known to cache a ton of VRAM, but it didn't actually need it. Frame rates are fine on 4GB cards even at 4K, in SLI/CF.

But it is indeed interesting that this newer one runs so much faster than AW on the same engine.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This game is eating VRAM like hell. That seems like one of the reasons for these results. Kepler with 2GB totally tanks.

But 960 2GB is crushing the GTX770 2GB at PCGamesHardware.

960 = 55 fps
770 = 32 fps
http://www.pcgameshardware.de/Call-...cials/BO3-Beta-Benchmarks-Windows-10-1169217/

Also, I think you are on to something about VRAM, but it's still not clear cut how much VRAM is necessary. Once the resolution goes up, even mid-range GM204 Maxwell cards suffer greatly despite more or similar VRAM! Also, if 4GB were a hard limit, the 280X would bomb against 4GB cards, but it doesn't.

2560x1440

Case 1:
R9 280X 3GB = 51.5 fps (+30% vs. a 960 4GB)
R9 380 4GB = 45 fps
GTX960 4GB = 39.4 fps

Case 2:
Sapphire Tri-X R9 290 = 62 fps avg / 58 fps min
Asus Strix 980 = 61 fps avg / 51 fps min

390X shows 40% higher minimums vs. the 980 at 1440P.
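For anyone wanting to double-check those deltas, the percentages are just relative fps ratios. A quick sketch (the fps figures come from the cases above; the helper function itself is just my illustration):

```python
def pct_faster(a_fps: float, b_fps: float) -> float:
    """How much faster a_fps is than b_fps, in percent."""
    return (a_fps / b_fps - 1.0) * 100.0

# Case 1 at 2560x1440, numbers from the post above
print(round(pct_faster(51.5, 39.4), 1))  # 280X vs 960 4GB -> 30.7
print(round(pct_faster(62, 61), 1))      # Tri-X 290 vs Strix 980 (avg) -> 1.6
```

So the "+30% vs. a 960 4GB" claim checks out against the raw numbers.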

In any event, I've been warning gamers for the last 9 months that anyone recommending or buying a 2GB card to keep for 2-3 years, even at 1080P, would be making a big mistake. I would stay FAR away from any Kepler cards for those in the market for a used GPU, and I would never buy any $160-180 GPU with 2GB of VRAM. That's why I've been steering PC gamers to strongly consider spending a bit more for a GTX970 or R9 290, or at the very least to consider the R9 280X and skip all the 950/960 2GB/285 cards.

I mean, 2GB of VRAM becoming a major bottleneck shouldn't be news to anyone who didn't turn a blind eye to the benchmarks that foreshadowed what's coming for 2GB cards.

[Benchmark chart: som_1920_1080.gif]


[Benchmark chart: deadrising3_1920_1080.gif]


This is a good reminder for people who keep insisting on marketing features like HDMI 2.0 and 4K HEVC and ignoring performance-based criteria such as GPU horsepower and VRAM capacity when making GPU recommendations.

According to what I can gather from the PCGamesHardware translation, the engine is based on the same one used in Advanced Warfare, but with the Nvidia effects stripped out and replaced with their own, plus a few modern effects added.

AW runs much better on a variety of cards, though. It's hard to conclude anything without comparing the graphics. In the last game the 680/770 were performing well relative to the 280X, while the 780Ti beat both the 970 and 290X.


[Benchmark chart: Call of Duty Advanced Warfare, 2560x1440 (gamegpu.ru)]


But even in the last AW game, cards like the 670/680/770 start incurring a big performance hit against the HD7970/7970GHz/R9 280X once you turn on SSAA.

[Benchmark chart: Call of Duty Advanced Warfare, 1920x1080 with SSAA (gamegpu.ru)]


It's amazing how at one point certain sites called HD7970 a failure/short-lived. As of now, R9 280X (aka HD7970Ghz) is 68% faster than the GTX580 and has no problem beating a 770 4GB, a card that cost $150 more.
 
Last edited:
Feb 19, 2009
10,457
10
76
It's amazing how at one point certain sites called HD7970 a failure/short-lived. As of now, R9 280X (aka HD7970Ghz) is 68% faster than the GTX580 and has no problem beating a 770 4GB, a card that cost $150 more.

Which is why few take that site seriously. Almost every result has AMD performing lower than other major review sites.

The 7970 and 7950 must be one of the longest lived GPUs ever. They are still producing amazing results today, and the 7950 was an OC beast too; an 800MHz -> 1200MHz (50%) OC was common. Strange to see the likes of the rebadged 300 series using Tahiti silicon still kicking much A and taking names.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Which is why few take that site seriously. Almost every result has AMD performing lower than other major review sites.

For sure any site that shows NV winning in every or 99% of benchmarks in every review cannot be taken seriously.

The 7970 and 7950 must be one of the longest lived GPUs ever.

I know. Most gamers fell for cunning marketing tactics/hype despite data going back to November 2012 showing $280-300 HD7950 OC easily trading blows and beating a $450 GTX680.

Strange to see the likes of the rebadged 300 series using Tahiti silicon still kicking much A and taking names.

Interesting how some gamers insisted that a GTX680/770-style GPU would be sufficient to last until the end of the PS4/XB1 generation. It's only 2015, just 2 years into the current-gen consoles, and those GPUs are already cracking. Hopefully NV releases drivers to improve their performance.

Interesting seeing R9 390X 8GB and GTX980Ti 6GB pulling away from the field in PCGamesHardware's 1440P benchmarks. Didn't expect a COD game to be eating VRAM like that. Sounds like it might not be a bad idea to skip this entire generation and grab an 8GB HBM2 card.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Which is why few take that site seriously. Almost every result has AMD performing lower than other major review sites.

The 7970 and 7950 must be one of the longest lived GPUs ever. They are still producing amazing results today, and the 7950 was an OC beast too; an 800MHz -> 1200MHz (50%) OC was common. Strange to see the likes of the rebadged 300 series using Tahiti silicon still kicking much A and taking names.

Yep. My retired bitmining 7950 is still kicking in games OCed to 1150MHz, and I don't see much reason to upgrade to anything on the market.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Which is why few take that site seriously. Almost every result has AMD performing lower than other major review sites.

The 7970 and 7950 must be one of the longest lived GPUs ever. They are still producing amazing results today, and the 7950 was an OC beast too; an 800MHz -> 1200MHz (50%) OC was common. Strange to see the likes of the rebadged 300 series using Tahiti silicon still kicking much A and taking names.

My 7950 gets too hot at a 1050 OC. I have some fans to install for my case to try again =D, and a real CPU cooler now in the NH-D15 vs my stock Intel cooler. With a serious build, I'm interested in seeing how much further my 7950 can go. Too bad it can only VSR to 1440p, and I have a massive obsession with 4K, because the performance is great. It just goes to show there really hasn't been much of a performance jump as of late lol. We need 2016.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Yet another game where it seems the i7 CPUs outperform their i5 brethren significantly. It feels good to see that my advice to go for the i7 has been right.

That's one way to look at it. Another is an i5 2500k is achieving 106 minimum with the 2600k hitting 135 minimum. Step that up to a Haswell i5 4670k and you're at 123 minimum fps so even the folks running 120hz monitors are seeing the full benefit at all times. This is also assuming no overclocking, which is rare for PC gamers with these processors.

You can pat yourself on the back if it makes you feel better, but I don't think i5 owners are going to feel all that under-served by that level of performance. The charts look nice for the i7, but an appreciable difference between them there is not, not in this game anyway.
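2is's argument boils down to a simple threshold check: for a 120Hz monitor, what matters is whether the minimum fps stays at or above the refresh rate. A sketch using the minimums quoted above (the helper is my own illustration, not from any benchmark tool):

```python
def sustains_refresh(min_fps: float, refresh_hz: float = 120.0) -> bool:
    """True if the minimum fps never dips below the monitor's refresh rate."""
    return min_fps >= refresh_hz

# Minimum fps figures quoted in the discussion above
minimums = {"i5-2500K": 106, "i7-2600K": 135, "i5-4670K": 123}
for cpu, fps in minimums.items():
    print(cpu, sustains_refresh(fps))
# i5-2500K False / i7-2600K True / i5-4670K True
```

By this measure only the 2500K falls short of a 120Hz target, and even then by a modest margin.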
 

Spjut

Senior member
Apr 9, 2011
933
163
106
That's one way to look at it. Another is an i5 2500k is achieving 106 minimum with the 2600k hitting 135 minimum. Step that up to a Haswell i5 4670k and you're at 123 minimum fps so even the folks running 120hz monitors are seeing the full benefit at all times. This is also assuming no overclocking, which is rare for PC gamers with these processors.

You can pat yourself on the back if it makes you feel better, but I don't think i5 owners are going to feel all that under-served by that level of performance. The charts look nice for the i7, but an appreciable difference between them there is not, not in this game anyway.

Yeah, I've already patted myself on the back, and even had a better night's sleep than normal :rolleyes:

Anyway, this once again goes to show that going for the higher-tier CPU is worth it in the long run. The i7 2600k crowd is getting superior performance not only to the 2500k, but also to the 4670k. At stock speed.

We saw this same ordeal happen back when almost every PC forum proposed the E8400 over the Q6600.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Yeah, I've already patted myself on the back, and even had a better night's sleep than normal :rolleyes:

Anyway, this once again goes to show that going for the higher-tier CPU is worth it in the long run. The i7 2600k crowd is getting superior performance not only to the 2500k, but also to the 4670k. At stock speed.

We saw this same ordeal happen back when almost every PC forum proposed the E8400 over the Q6600.


A better CPU is always going to offer superior performance; I think you completely missed the point. Those additional frames are going to waste. If you didn't have these charts in front of you, you would not be able to tell the difference. In other words, people who did and did not heed your advice are having an equally good experience.

The E8400 vs Q6600 was not even close to being the same scenario. i3 vs i5 is more closely related to that than i5 vs i7.
 
Last edited:

Spjut

Senior member
Apr 9, 2011
933
163
106
A better CPU is always going to offer superior performance; I think you completely missed the point. Those additional frames are going to waste. If you didn't have these charts in front of you, you would not be able to tell the difference. In other words, people who did and did not heed your advice are having an equally good experience.

The E8400 vs Q6600 was not even close to being the same scenario. i3 vs i5 is more closely related to that than i5 vs i7.

Charts such as those clearly show the i7s will have a longer lifespan.

E8400 vs Q6600 was exactly the same scenario. Initially, higher-clocked C2Ds performed better than their C2Q brethren; with time the C2Q got a clear advantage, and eventually the C2D became useless for modern games.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
A dual vs quad core isn't the same scenario as a quad vs quad core as much as you'd like to tell yourself it is.

Faster processors will have a longer lifespan than slower ones, captain obvious. That isn't what you said. You made it seem like there's an appreciable difference in this game when there is not. Both are easily breaking 100fps.

If that 30 fps difference was, say, 40 vs 70 fps, that would be a completely different matter, but you're talking about 106 vs 130, and that's for a 4-generation-old comparison; those people already got their money's worth. The newer i5s are getting over 120fps.
 
Last edited:

Spjut

Senior member
Apr 9, 2011
933
163
106
A dual vs quad core isn't the same scenario as a quad vs quad core as much as you'd like to tell yourself it is.

Faster processors will have a longer lifespan than slower ones, captain obvious. That isn't what you said. You made it seem like there's an appreciable difference in this game when there is not. Both are easily breaking 100fps.

If that 30 fps difference was, say, 40 vs 70 fps, that would be a completely different matter, but you're talking about 106 vs 130, and that's for a 4-generation-old comparison; those people already got their money's worth. The newer i5s are getting over 120fps.

It's not simply quad vs quad, it's quad without HT vs quad with HT. And the scenario is exactly the same since the different CPUs get drastically different performance in benchmarks.

It's understandable that the i5 proponents suddenly start getting cold feet when benchmarks like these show up. Meanwhile, the i7 proponents' arguments about i7s aging better due to more games using the extra threads turn out to be spot-on.
And before you go saying "faster processors will have a longer lifespan than slower ones, captain obvious" (I'd actually take you more seriously if you skipped silly terms like "captain obvious", btw), you should know that a big argument the i5 folks have made all these years has been that the i7s aren't faster at all, because games did not and would not use the extra threads.

I've said what I've said and you can go on talking about terms like "appreciable difference" if you want to. This is going nowhere.
 
Last edited:

2is

Diamond Member
Apr 8, 2012
4,281
131
106
And you can go on thinking your advice was somehow useful even though it makes no difference in actual game play.

I'm sure all the 4690 owners out there are thinking "man, I'm only getting over 120fps, I should've listened to Spjut"
 

Spjut

Senior member
Apr 9, 2011
933
163
106
And you can go on thinking your advice was somehow useful even though it makes no difference in actual game play.

I'm sure all the 4690 owners out there are thinking "man, I'm only getting over 120fps, I should've listened to Spjut"

I'm sure there are a lot of i5 users who do think "damn, i7s are really starting to pull ahead and will probably last longer", however.


There's no need for you to take this debate personally btw.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
It's not personal, it's mathematical. I doubt anyone getting 120+ fps is going to be kicking themselves for not spending an extra $100. Perhaps you should have saved the gloating for a game where there was a real benefit to the i7, I think you jumped the gun a bit too soon.

Also not sure why you keep saying the i7 will last longer. I don't think anyone is under any illusion that a faster processor will last longer. You're trying to make two separate arguments but pretending they are one and the same. I don't know anyone who thinks a slower processor will last longer, nor have I ever heard of anyone making such a claim. Have you?

Will an i7 last longer? Sure. Should i5 owners have purchased an i7 instead because of this game? No

If you don't want to account for terms like "appreciable difference" then your own argument goes against what you're saying and you should be promoting Haswell-E hex cores.
 
Last edited:

Spjut

Senior member
Apr 9, 2011
933
163
106
It's not personal, it's mathematical. I doubt anyone getting 120+ fps is going to be kicking themselves for not spending an extra $100. Perhaps you should have saved the gloating for a game where there was a real benefit to the i7, I think you jumped the gun a bit too soon.

Also not sure why you keep saying the i7 will last longer. I don't think anyone is under any illusion that a faster processor will last longer. You're trying to make two separate arguments but pretending they are one and the same. I don't know anyone who thinks a slower processor will last longer, nor have I ever heard of anyone making such a claim. Have you?

Will an i7 last longer? Sure. Should i5 owners have purchased an i7 instead because of this game? No

There's been no jumping the gun a bit too soon. I confess English isn't my first language, but I do think that "Yet another game where it seems the i7 CPUs outperform their i5 brethren significantly" clearly shows I wasn't bragging about this game alone.

It seems like you don't know the history of the i5 vs i7 debate. Then it's understandable that you don't know that the argument against the i7 has been exactly that it's not a faster processor for games. That's what the essence of the debate has focused on and that's what's gradually being proven to be wrong.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I'm well aware of the debate, just as I'm well aware of the Q6600 vs E8400 debate. I was all for the Q6600 back then; it was a clear choice: 2 real cores vs 4 real cores at the same price. You're trying to compare 4 real cores to 4 real + 4 virtual cores with a $100 difference in cost. It's not the same thing. Neither the performance delta nor the economics are the same.

When people were pondering dual vs quad 8 years ago, that's all they had to consider. The processors cost the same, and choosing one over the other made no difference to the rest of the system. Not to mention, a game that actually used 4 cores was EASILY 50-60% faster; the original Black Ops was a very good example between these two processors.

In this case, you have a decision that can easily impact other parts of the system, since an i7 will cost an extra $100, and so far what we see, in a best-case scenario, is a 20% increase in performance, and that's with the i5 still providing over 100fps, compared to an E8400 which would be close to unplayable next to the Q6600. What happens to that 20% you wouldn't even notice anyway if you need to downgrade your GPU to spend that extra $100 on the CPU?

This is a drastically different set of circumstances compared to 8 years ago.
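The budget trade-off described above can be framed as a simple fps-per-dollar comparison. The +20% CPU-side figure comes from the post itself; the +30% GPU-tier gain below is a purely hypothetical number for illustration:

```python
def gain_per_dollar(fps_gain_pct: float, extra_cost_usd: float) -> float:
    """Percent fps gained per extra dollar spent on an upgrade."""
    return fps_gain_pct / extra_cost_usd

# i5 -> i7: best case ~20% more min fps for ~$100 (figure from the post)
cpu_route = gain_per_dollar(20.0, 100.0)
# Hypothetical GPU tier upgrade: ~30% more fps for the same $100 (illustrative only)
gpu_route = gain_per_dollar(30.0, 100.0)
print(gpu_route > cpu_route)  # -> True
```

Under those assumed numbers the same $100 buys more frames on the GPU side, which is the crux of the "downgrade your GPU to afford the i7" objection.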
 

Spjut

Senior member
Apr 9, 2011
933
163
106
If the original Black Ops is CoD: BO, that game was released in 2010. That's two years after the Q9550 was released, and as such it wasn't relevant back when the C2D vs C2Q debate took place.

Back then, the situation regarding dual core vs quad core was very similar to today's i5 vs i7. Games only benefited from two cores and most assumed the quads were useless for gaming.
Regarding the E8400 vs Q6600 specifically, Penryn was more efficient, so the E8400 even ended up performing slightly better in games that didn't use more than two cores.

Right now, the only early games I can remember showing better performance on C2Q back in 2008 were Far Cry 2 and Brothers in Arms:Hell's highway
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
If the original Black Ops is CoD: BO, that game was released in 2010. That's two years after the Q9550 was released, and as such it wasn't relevant back when the C2D vs C2Q debate took place.

Back then, the situation regarding dual core vs quad core was very similar to today's i5 vs i7. Games only benefited from two cores and most assumed the quads were useless for gaming.
Regarding the E8400 vs Q6600 specifically, Penryn was more efficient, so the E8400 even ended up performing slightly better in games that didn't use more than two cores.

Right now, the only early games I can remember showing better performance on C2Q back in 2008 were Far Cry 2 and Brothers in Arms:Hell's highway

You can substitute any game you want; the end result was a huge performance advantage for the Q over the E when more than 2 cores were used, and in those cases the E was providing a horrible experience at the same price point. Once again, you completely missed the point; you simply helped mine by providing more examples of how the debate back then wasn't the same as the one today, since that isn't what we see today between an i5 and an i7.

The cases where the E was better, it was by a small margin, and the Q still provided very good gameplay. There was very little reason to go with the e8400 back then even though many did.

Anyway, if you really want to prove your point and pat yourself on the back, start a poll and ask who is upset they didn't get a 4790 because their 4670/4690 is "only" getting 120+ fps. That's really what it boils down to, and when you look at the performance figures i5s are still pulling off, making a post saying "I told you so" sounds pretty ridiculous.
 
Last edited:

Piroko

Senior member
Jan 10, 2013
905
79
91
I'm sure there are a lot of i5 users who do think "damn, i7s are really starting to pull ahead and will probably last longer", however.
As a 4690k owner I would like to chip in with:
So far it isn't aging €110 better. And if money isn't an issue, then why stop at a 4770K? The 5960X beats it by a mile!
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
The Q6600 vs E8400 was a fun debate back in the day; moving up from an E6750 to a Q6600 even made multitasking smoother for me. Today, games like BO1 and BO2 love those 4 cores.

You would think a POS generation-1 Phenom X4 9150e @ 1.8GHz would be slower in BO1 and BO2, but it's quite a bit quicker. Got another rig with an E5200, and BO2 simply runs like a turd on that despite its 700MHz and C2D IPC advantages, and even BO1 favors the X4 9150e lol.
 

Spjut

Senior member
Apr 9, 2011
933
163
106
As a 4690k owner I would like to chip in with:
So far it isn't aging €110 better. And if money isn't an issue, then why stop at a 4770K? The 5960X beats it by a mile!

Yeah, and at the same time, folks are prepared to pay the premium just to get the highest-end GPU even if it's only ten extra FPS. And those without qualms even get them running in Crossfire and SLI.
 

Spjut

Senior member
Apr 9, 2011
933
163
106
You can substitute any game you want; the end result was a huge performance advantage for the Q over the E when more than 2 cores were used, and in those cases the E was providing a horrible experience at the same price point. Once again, you completely missed the point; you simply helped mine by providing more examples of how the debate back then wasn't the same as the one today, since that isn't what we see today between an i5 and an i7.

The cases where the E was better, it was by a small margin, and the Q still provided very good gameplay. There was very little reason to go with the e8400 back then even though many did.

Anyway, if you really want to prove your point and pat yourself on the back, start a poll and ask who is upset they didn't get a 4790 because their 4670/4690 is "only" getting 120+ fps. That's really what it boils down to, and when you look at the performance figures i5s are still pulling off, making a post saying "I told you so" sounds pretty ridiculous.

I miss the point because you don't have a point. Back then, there were almost no games that benefited from running on a C2Q instead of a C2D, just like it has been for i7 vs i5. Those games came later (quite evident given your example about Black Ops). The few games that did run better were exceptions rather than the norm, just like we saw the i7 outperforming the i5 in Crysis 3 (a 2013 game, two years back) and BF4 multiplayer.

And once again, you're pulling arguments out of thin air. I've simply stated that the i7 is indeed aging better than the i5 (clearly shown by both Sandy Bridge's and Haswell's results), which the i5 proponents have for years been dead-set on not being possible. And I've even made it extra clear to you that I wasn't talking about just this one game.

PC guys also always love to estimate future performance from today's games. I don't see why you're getting so fed up about me commenting that the i7 is a better purchase than the i5 in the long run.
Talking about i5 vs i7 performance differences due to hyper-threading is not that different from talking about AMD and Nvidia GPU differences based on their known architectural strengths and weaknesses.

Once again, it seems you're taking this personally. Did you help someone with an i5 build or what?
 
Last edited:

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Yeah, and at the same time, folks are prepared to pay the premium just to get the highest-end GPU even if it's only ten extra FPS. And those without qualms even get them running in Crossfire and SLI.

I don't think you can compare i5 vs i7 in a vacuum. You have to take the entire system and the entire system cost into consideration.

For someone running graphics cards in excess of $1000, spending the extra $100 on an i7 makes sense. If the graphics card budget is more in the $300-400 range, saving $100 on the cpu to increase the gpu budget will probably be a better bet.

It's all about balance. You're always going to have a bottleneck somewhere, and the trick is to maximize your budget so one component doesn't significantly hamper another.