ComputerBase PvZ: Garden Warfare 2 Benchmarks

Page 2

moonbogg

Lifer
Jan 8, 2011
Which clock are they using for that 1330 MHz number? Base? Boost? Max boost? Average in game?

Even though it's "clocked" at 1000 MHz, a reference 980 Ti will run FurMark at 1189 MHz.

The EVGA card they have comes with a base clock of 1102 MHz. I can almost guarantee they are not running a +228 MHz OC. I don't think I've ever seen an OC that high, and it would absolutely require watercooling to keep from melting a hole in the earth.

It seems more likely they are listing the in-game boost that they see from the card, which is in line with what I see on my EVGA 980 Ti SC at factory clocks.

If my guess is right, we're looking at a 10% factory-OC 980 Ti vs. a stock Fury X. I'm sure pcgameshardware.de would be happy to test a 10% OC Fury X, if one existed...
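As a quick sanity check on the percentages being argued here, a minimal sketch (the 1000 MHz reference base and 1102 MHz EVGA SC base clocks come from this post; the output is purely illustrative):

```python
# Quick sanity check on the clock numbers argued in this post
# (all clocks in MHz are taken from the thread, not official specs).

def oc_percent(clock_mhz: float, reference_mhz: float) -> float:
    """Overclock expressed as a percentage above a reference clock."""
    return (clock_mhz / reference_mhz - 1.0) * 100.0

REFERENCE_BASE = 1000  # reference 980 Ti base clock
EVGA_SC_BASE = 1102    # EVGA 980 Ti SC factory base clock

print(f"EVGA SC factory OC: {oc_percent(EVGA_SC_BASE, REFERENCE_BASE):.1f}%")
# -> ~10.2%, the "10% factory OC" figure above

print(f"1330 MHz as a base clock: +{1330 - EVGA_SC_BASE} MHz over the EVGA base "
      f"({oc_percent(1330, EVGA_SC_BASE):.1f}%)")
# -> +228 MHz, ~20.7%: the OC being called implausible here
```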

Aftermarket air-cooled 980 Tis can do that OC pretty typically. Also, heat damage eventually gets absorbed by the mantle and hasn't yet penetrated the core, so your exaggerations aren't helpful. "Through the earth" would require about 3x as much heat to make it through the other side. You are WRONG by a factor of THREE, noob.

Here, this kid pushed his air-cooled 980 Ti to 1475... I guarantee that hole doesn't go all the way through.

[image: air-cooled 980 Ti overclocked to 1475 MHz]
 
Silverforce11

Feb 19, 2009
While it's expected that AMD-partnered titles run well on GCN, the odd thing recently is GameWorks titles that run much better on GCN.

It seems the more modern the engine is, the better it performs on GCN.

This even applies to Ubisoft games, which are partnered with GameWorks.

Rainbow Six and, recently, The Division. It was very rare in the past to see an R290/390 trash the 970 in Ubisoft GameWorks titles on release, but as they update their engines, moving away from the 360/PS3 era into the Xbone/PS4 era, the gains for GCN have been very obvious.

It will be interesting to see how Ubisoft's Far Cry Primal performs with their updated engine.
 

moonbogg

Lifer
Jan 8, 2011
Wait wait wait.

When did Nvidia hardware start cooperating with Mantle?

No no, not like that. You are far more likely to burn a hole through the earth with your GPU than Nvidia is to work with the Mantle API. A cold day in hell it will be... with a big hole going straight through it.
 

Fallen Kell

Diamond Member
Oct 9, 1999
While it's expected that AMD-partnered titles run well on GCN, the odd thing recently is GameWorks titles that run much better on GCN.

It seems the more modern the engine is, the better it performs on GCN.

This even applies to Ubisoft games, which are partnered with GameWorks.

Rainbow Six and, recently, The Division. It was very rare in the past to see an R290/390 trash the 970 in Ubisoft GameWorks titles on release, but as they update their engines, moving away from the 360/PS3 era into the Xbone/PS4 era, the gains for GCN have been very obvious.

It will be interesting to see how Ubisoft's Far Cry Primal performs with their updated engine.

This is essentially the reason. The engines themselves are being optimized to better utilize GCN because those same engines are used to make games for the Xbone and PS4, which both have GCN hardware (and an older version than PCs have, which means the engine needs to be that much more optimized to squeeze every last drop of performance out of the hardware, since they still have trouble hitting 60 fps at 1080p on the consoles).

So, in reality, even though the engines may once have been optimized primarily for Nvidia, if the developers want the engine to be used at all for modern games, they have had to build serious GCN optimization paths so that they could ship console versions of their games (very few games are PC-only, aside from indie titles).

The question anymore isn't whether the game engine has optimizations for GCN; it's whether it has optimizations for Nvidia, since all the engines now have GCN optimizations.
 

pj-

Senior member
May 5, 2015
Aftermarket air-cooled 980 Tis can do that OC pretty typically. Also, heat damage eventually gets absorbed by the mantle and hasn't yet penetrated the core, so your exaggerations aren't helpful. "Through the earth" would require about 3x as much heat to make it through the other side. You are WRONG by a factor of THREE, noob.

Here, this kid pushed his air-cooled 980 Ti to 1475... I guarantee that hole doesn't go all the way through.

[image: air-cooled 980 Ti overclocked to 1475 MHz]

A 980 Ti running at a 1330 MHz core would be boosting into the 1600 MHz range. Please show me an air-cooled 980 Ti running stably at 1600 MHz.

I have the same card as them, and with a modded BIOS providing max voltage, a 121% power target, and 100% fan speed, it crashes immediately at a 1300 MHz core. So you're right, it wouldn't melt a hole in the earth, because a crashed driver doesn't use much power.

1475 MHz is almost exactly what I run at in game, and that requires a core clock of around 1230, which is a 23% OC from reference.
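To make those numbers concrete, here is a rough sketch of the core-clock-to-boost relationship described above; the ~245 MHz offset is inferred from pj-'s own figures (1230 MHz core, ~1475 MHz in game), not from any official spec:

```python
# Relationship between the set core clock and observed in-game boost,
# using pj-'s figures. The fixed offset is an inference, not a spec.

REFERENCE_BASE_MHZ = 1000        # reference 980 Ti base clock
BOOST_OFFSET_MHZ = 1475 - 1230   # ~245 MHz, inferred from pj-'s card

def in_game_boost(core_mhz: int) -> int:
    """Estimate in-game boost from the set core clock (rough model)."""
    return core_mhz + BOOST_OFFSET_MHZ

for core in (1230, 1330):
    oc = (core / REFERENCE_BASE_MHZ - 1) * 100
    print(f"{core} MHz core ({oc:.0f}% OC) -> ~{in_game_boost(core)} MHz in game")
# 1230 MHz core (23% OC) -> ~1475 MHz in game
# 1330 MHz core (33% OC) -> ~1575 MHz in game, i.e. "into the 1600 MHz range"
```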
 

poofyhairguy

Lifer
Nov 20, 2005
No no, not like that. You are far more likely to burn a hole through the earth with your GPU than Nvidia is to work with the Mantle API.

Damn. I miss Mantle from when I had a 7970. :(

Not using DirectX feels so rebel, like a flashback to the 3dfx days.
 

Headfoot

Diamond Member
Feb 28, 2008
The Division doesn't fit in the same box as other Ubi titles, because it's from a very PC-friendly developer they acquired (Massive Entertainment). They were the guys that made World in Conflict, which was a really good game, PC-only.

I'm willing to consider them on their own terms, but Ubi probably still influenced it. No getting it out of Udontplay in any case...
 
Silverforce11

Feb 19, 2009
The question anymore isn't whether the game engine has optimizations for GCN; it's whether it has optimizations for Nvidia, since all the engines now have GCN optimizations.

Well put.

Consoles have always been the target for AAA developers; the game simply must run with as high fidelity and performance as they can squeeze out of the weak hardware.

When it comes to the PC port, it's already optimized for GCN by default, unless there's something like GameWorks to gimp GCN performance.

Even without AMD "game ready" drivers, as seen in the NV GameWorks-sponsored Rise of the Tomb Raider, day-one (not the pre-release alpha build, but the Steam release build) performance:

[charts: Rise of the Tomb Raider 2560x1440 benchmark results]
 

el etro

Golden Member
Jul 21, 2013
I don't really see AMD having a huge lead over Nvidia. The game just doesn't favor either GPU maker.

And it seems that both GPU vendors have already optimized their drivers for the game.
 

IllogicalGlory

Senior member
Mar 8, 2013
Why are people complaining about a game where you can get 80+ FPS at 1440p with a 390, and that delivers a good single-GPU 4K experience? Sounds great to me.
 

desprado

Golden Member
Jul 16, 2013
While it's expected that AMD-partnered titles run well on GCN, the odd thing recently is GameWorks titles that run much better on GCN.

It seems the more modern the engine is, the better it performs on GCN.

This even applies to Ubisoft games, which are partnered with GameWorks.

Rainbow Six and, recently, The Division. It was very rare in the past to see an R290/390 trash the 970 in Ubisoft GameWorks titles on release, but as they update their engines, moving away from the 360/PS3 era into the Xbone/PS4 era, the gains for GCN have been very obvious.

It will be interesting to see how Ubisoft's Far Cry Primal performs with their updated engine.

Wrong again.

The Division, Fury X vs. GTX 980 Ti, both at stock.

https://www.youtube.com/watch?v=MaEoMx0niPs

If the 980 Ti is overclocked it can be 25% to 30% better; however, it is winning even at stock.
 
Silverforce11

Feb 19, 2009
Wrong again.

The Division, Fury X vs. GTX 980 Ti, both at stock.

https://www.youtube.com/watch?v=MaEoMx0niPs

If the 980 Ti is overclocked it can be 25% to 30% better; however, it is winning even at stock.

You always boil it down to Fury X vs. 980 Ti, then you OC the 980 Ti 25 to 30%.

What about the rest of the cards?

The 390 kills the 970. A max-OC 970 is required to reach stock 390 performance.

https://www.youtube.com/watch?v=fq6GyUzyuJQ

I mean, look at even the REFERENCE 290 (we all know how crap those coolers were, hot and throttling) beating the 970.

[charts: gamegpu.com, Tom Clancy's The Division beta GPU benchmarks at 1920x1080 and 2560x1440]


Now, is it a good thing that a card that's slower and more expensive has to OC just to match something cheaper running at stock?

Before you say GCN can't overclock, you have to realize there's OC without +vcore and there's OC with +vcore.

I've owned GCN GPUs since the first generation: my 7850 did a 50% OC, my 7950 did a 50% OC, and heck, the 7970 I got for mining ran at 1.25 GHz!

My R290s did a 947 MHz to 1.2 GHz OC (~25%+). My current R290X goes from 1 GHz to 1.2 GHz with +vcore.

Fiji actually gets to 1.2 GHz with +vcore too, per TPU's own data. Afterburner has had +vcore support for Fiji for a while now, so those stock-voltage OC results are invalid.

Even [H] (!!) got a decent OC on their 390 with +vcore.. -_-

[chart: [H]ardOCP R9 390 overclocking results]


Now, if you want to argue about power usage when overclocked, sure, GCN is always worse on that metric. But if you argue about performance and OC performance, you're barking up the wrong tree, or you're simply not aware of manual +vcore OC because of how easy and noob-proof NV's OC has been.
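For reference, the GCN overclocks claimed in this post, worked out as percentages. The stock clocks for the 7850, 7950, and Fury X are not given in the post; the reference-spec values (860, 800, and 1050 MHz) are assumed here, so treat the output as illustrative:

```python
# GCN overclock claims from the post above, as percentage gains.
# Stock clocks marked "assumed" come from reference specs, not the post.
claims = [
    ("HD 7850", 860, round(860 * 1.5)),   # stock assumed; "50% OC" claimed
    ("HD 7950", 800, round(800 * 1.5)),   # stock assumed; "50% OC" claimed
    ("R9 290",  947, 1200),               # both clocks from the post
    ("R9 290X", 1000, 1200),              # both clocks from the post
    ("Fury X",  1050, 1200),              # stock assumed; TPU +vcore result
]
for card, stock, oc in claims:
    gain = (oc / stock - 1) * 100
    print(f"{card}: {stock} -> {oc} MHz = +{gain:.0f}%")
# R9 290: 947 -> 1200 MHz = +27%, the "~25%+" in the post
# Fury X: 1050 -> 1200 MHz = +14%
```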
 
Silverforce11

Feb 19, 2009
@desprado
Ultimately, what you're saying is that it doesn't matter that GCN is wrecking (faster + cheaper!) Kepler and now Maxwell in modern games and DX12, because you can just OC to catch up?

Fair enough. You can view it like that.

Imagine this analogy applied to cars.

You buy a more expensive car and have to mod the engine, adding turbochargers and a full kit, to get similar performance to a cheaper one running stock.

Sounds ridiculous? Yes.

I also want to address your maths. A 30% OC is stretching it even for Maxwell.

A reference 980 already clocks to a 1.265 GHz boost. You can see this here: http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/22

If you OC it to 1.5 GHz, what % is that? Less than 20%, correct?

Likewise, the Titan X and 980 Ti actually boost to 1.2 GHz in their reference form. You can also see this here: http://www.anandtech.com/show/9306/the-nvidia-geforce-gtx-980-ti-review/17

Again, if you OC it to 1.5 GHz, what % is that? 25%, correct?

Is it true or false that not all games actually scale perfectly with X% overclocks?
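Worked out, the percentage arithmetic in that post (boost clocks as cited from the linked AnandTech reviews):

```python
# Working through the OC percentages quoted above.
# Reference in-game boost clocks cited from the linked AnandTech reviews:
# GTX 980 ~1265 MHz; Titan X / 980 Ti ~1200 MHz.
TARGET_MHZ = 1500

for card, boost in (("GTX 980", 1265), ("Titan X / 980 Ti", 1200)):
    gain = (TARGET_MHZ / boost - 1) * 100
    print(f"{card}: {boost} -> {TARGET_MHZ} MHz = +{gain:.1f}%")
# GTX 980: 1265 -> 1500 MHz = +18.6%  (the "less than 20%")
# Titan X / 980 Ti: 1200 -> 1500 MHz = +25.0%
```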
 

Mahigan

Senior member
Aug 22, 2015
@desprado
Ultimately, what you're saying is that it doesn't matter that GCN is wrecking (faster + cheaper!) Kepler and now Maxwell in modern games and DX12, because you can just OC to catch up?

Fair enough. You can view it like that.

Imagine this analogy applied to cars.

You buy a more expensive car and have to mod the engine, adding turbochargers and a full kit, to get similar performance to a cheaper one running stock.

Sounds ridiculous? Yes.

I also want to address your maths. A 30% OC is stretching it even for Maxwell.

A reference 980 already clocks to a 1.265 GHz boost. You can see this here: http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/22

If you OC it to 1.5 GHz, what % is that? Less than 20%, correct?

Likewise, the Titan X and 980 Ti actually boost to 1.2 GHz in their reference form. You can also see this here: http://www.anandtech.com/show/9306/the-nvidia-geforce-gtx-980-ti-review/17

Again, if you OC it to 1.5 GHz, what % is that? 25%, correct?

Is it true or false that not all games actually scale perfectly with X% overclocks?

Slow clap
 

desprado

Golden Member
Jul 16, 2013
@desprado
Ultimately, what you're saying is that it doesn't matter that GCN is wrecking (faster + cheaper!) Kepler and now Maxwell in modern games and DX12, because you can just OC to catch up?

Fair enough. You can view it like that.

Imagine this analogy applied to cars.

You buy a more expensive car and have to mod the engine, adding turbochargers and a full kit, to get similar performance to a cheaper one running stock.

Sounds ridiculous? Yes.

I also want to address your maths. A 30% OC is stretching it even for Maxwell.

A reference 980 already clocks to a 1.265 GHz boost. You can see this here: http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/22

If you OC it to 1.5 GHz, what % is that? Less than 20%, correct?

Likewise, the Titan X and 980 Ti actually boost to 1.2 GHz in their reference form. You can also see this here: http://www.anandtech.com/show/9306/the-nvidia-geforce-gtx-980-ti-review/17

Again, if you OC it to 1.5 GHz, what % is that? 25%, correct?

Is it true or false that not all games actually scale perfectly with X% overclocks?

It gives 5% more boost due to the memory OC. That is why I said experience matters.
 

casiofx

Senior member
Mar 24, 2015
You always boil it down to Fury X vs. 980 Ti, then you OC the 980 Ti 25 to 30%.

What about the rest of the cards?

The 390 kills the 970. A max-OC 970 is required to reach stock 390 performance.

https://www.youtube.com/watch?v=fq6GyUzyuJQ

I mean, look at even the REFERENCE 290 (we all know how crap those coolers were, hot and throttling) beating the 970.

[charts: gamegpu.com, Tom Clancy's The Division beta GPU benchmarks at 1920x1080 and 2560x1440]

Now, is it a good thing that a card that's slower and more expensive has to OC just to match something cheaper running at stock?

Before you say GCN can't overclock, you have to realize there's OC without +vcore and there's OC with +vcore.

I've owned GCN GPUs since the first generation: my 7850 did a 50% OC, my 7950 did a 50% OC, and heck, the 7970 I got for mining ran at 1.25 GHz!

My R290s did a 947 MHz to 1.2 GHz OC (~25%+). My current R290X goes from 1 GHz to 1.2 GHz with +vcore.

Fiji actually gets to 1.2 GHz with +vcore too, per TPU's own data. Afterburner has had +vcore support for Fiji for a while now, so those stock-voltage OC results are invalid.

Even [H] (!!) got a decent OC on their 390 with +vcore.. -_-

[chart: [H]ardOCP R9 390 overclocking results]

Now, if you want to argue about power usage when overclocked, sure, GCN is always worse on that metric. But if you argue about performance and OC performance, you're barking up the wrong tree, or you're simply not aware of manual +vcore OC because of how easy and noob-proof NV's OC has been.

It has always been like this. The 290/X were trashed because all the reviews used throttling reference cards, and NV fans laughed at them. Not to mention they talked as if the 290/X couldn't OC.

Now that the reverse has happened, it's a big issue.
 

moonbogg

Lifer
Jan 8, 2011
Damn, I'm looking at those charts above and comparing my old GTX 670s to my 980 Tis... holy crap right there. That's about a 3x increase in performance. NICE. Still, a 70 fps minimum @ 1440p could be improved upon, so hopefully an overclock will help with that.
 
Aug 11, 2008
@desprado
Ultimately, what you're saying is that it doesn't matter that GCN is wrecking (faster + cheaper!) Kepler and now Maxwell in modern games and DX12, because you can just OC to catch up?

Fair enough. You can view it like that.

Imagine this analogy applied to cars.

You buy a more expensive car and have to mod the engine, adding turbochargers and a full kit, to get similar performance to a cheaper one running stock.

Sounds ridiculous? Yes.

I also want to address your maths. A 30% OC is stretching it even for Maxwell.

A reference 980 already clocks to a 1.265 GHz boost. You can see this here: http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/22

If you OC it to 1.5 GHz, what % is that? Less than 20%, correct?

Likewise, the Titan X and 980 Ti actually boost to 1.2 GHz in their reference form. You can also see this here: http://www.anandtech.com/show/9306/the-nvidia-geforce-gtx-980-ti-review/17

Again, if you OC it to 1.5 GHz, what % is that? 25%, correct?

Is it true or false that not all games actually scale perfectly with X% overclocks?

Your analogy falls apart from the beginning. You don't have to buy or install any parts to overclock. And actually, except for a very, very few games, the 980 Ti starts out equal or faster and pulls ahead even further when overclocked.
 

96Firebird

Diamond Member
Nov 8, 2010
Yet another thread derailed; looks like Silverforce11 started it when he, for some reason, brought other games (and GameWorks?) into a thread about PvZ: GW2.

Good job guys, you took the bait.
 
Silverforce11

Feb 19, 2009
Yet another thread derailed; looks like Silverforce11 started it when he, for some reason, brought other games (and GameWorks?) into a thread about PvZ: GW2.

Good job guys, you took the bait.

Really?

http://forums.anandtech.com/showpost.php?p=38056352&postcount=7

http://forums.anandtech.com/showpost.php?p=38056493&postcount=14

http://forums.anandtech.com/showpost.php?p=38056538&postcount=17

Good job, 96Firebird: false accusations and smear. :/

Is there something false in my post?

While it's expected that AMD-partnered titles run well on GCN, the odd thing recently is GameWorks titles that run much better on GCN.

It seems the more modern the engine is, the better it performs on GCN.

This even applies to Ubisoft games, which are partnered with GameWorks.

Rainbow Six and, recently, The Division. It was very rare in the past to see an R290/390 trash the 970 in Ubisoft GameWorks titles on release, but as they update their engines, moving away from the 360/PS3 era into the Xbone/PS4 era, the gains for GCN have been very obvious.

It will be interesting to see how Ubisoft's Far Cry Primal performs with their updated engine.

Frostbite games (of which PvZ: GW2 is one!) are generally AMD-partnered, so they run well on GCN. True.

Surprisingly, GCN has been running well in non-AMD-sponsored and even NV-sponsored titles lately. True again.

So we delve into the reasons why. Consoles have GCN, and AAA studios develop for the consoles first. True again.

I make a prediction: if Far Cry Primal runs very well on GCN, it validates my theory. We'll see.

If you disagree with that, you can post why, preferably with evidence.

If you have a problem and think what I posted is off-topic, what are your thoughts on post #7, a random smirking anti-Frostbite/AMD jibe that brought up Unreal Engine 4, which everyone knows is an NV-sponsored game engine, as some form of valid comparison? Frostbite is a neutral engine; NV has sponsored games on it in the past. Frostbite does not have any proprietary tech the way UE4 does. GameWorks PhysX ring a bell?

Oh, and the very reason I brought up recent games is that there's a trend of excellent GCN performance in new games. If you didn't notice that, I can understand your issues.
 

96Firebird

Diamond Member
Nov 8, 2010

I'm not sure what you're trying to say here...

The first link talks about Frostbite, the engine that powers the game.

The second link goes to another review of the game this thread is about.

The third link talks about the review from the second link.

Your post, however, says nothing about the game in the OP, which the thread is about. Instead, you go on about other games, GameWorks, and future games.

This isn't too hard to understand, Silverforce11.