[GameGPU.ru] In Nvidia we trust: the case of War Thunder


railven

Diamond Member
Mar 25, 2010
6,604
561
126
Vote with your wallet. The only way for developers to stop using GimpWorks is if you stop buying AAA games at full price and purchase them at bargain-bin prices. The more of us who do it, the more they'll get the message: jeez, maybe they should create next-gen PC games on their own, or at least add open-source, developer-created and optimized graphical effects to PC games?! Welcome to 3-4 decades of how PC gaming used to be ;).

That won't even work. With GMG, Steam, Best Buy, and now it seems even Amazon, you can save 20% or more if you buy a game on day one. Gone are the days of paying full price for AAA titles.

Hell, I got Fallout 4 for $35: that was using Best Buy's Gamers Club (20% off new titles), plus $10 back in reward points for "pre-ordering" the day before release, and then 10% cash back on top.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
R290/X was very competitive versus Titan and 780 when it launched, with day 1 drivers.

The fact that they are now competitive versus 970/980 with the same chip is testament to the fact that GCN matures better.

It will keep on maturing better well into the DX12 era, you can be certain of this and quote me on it in a year's time.

None of that helped AMD when the card launched. And going forward it doesn't seem to help AMD much either, since by the time DX12 games are out in numbers that card is, or should be, irrelevant.

AMD has good products; they just aren't as good on the day it seems to matter most: the day they are released.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
I'm not the only one who noticed that the CPUs are different between the presented tests, am I? The ADF doing its usual top-notch job.

Change a bunch of variables, then try to compare the results to support a conclusion. The scientific method at work.
 

provost

Member
Aug 7, 2013
51
1
16
All things being equal, I would take an "engineering first" GPU company over a "software first" company. Software can always be optimized (or not), but hardware lasts much longer. As they used to say: software comes and goes, but hardware is forever...
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
I don't think that post contributes to the discussion. There is no AMD technology in Battlefront. The fact that the game performs better on GCN is absolutely not the same as injecting a proprietary black-box SDK into a game and causing demonstrably worse performance on the competition. Even for games that do have Gaming Evolved, you have to come to terms with the difference between open source and proprietary: GameWorks is the latter, which, by the way, does not apply to Gaming Evolved or even the old TWIMTBP.

The whole Frostbite engine is a "proprietary black box" developed by a company which is in bed with AMD. Just because there is no "AMD technology" in it doesn't mean it is not 100% optimized for AMD hardware.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The whole Frostbite engine is a "proprietary black box" developed by a company which is in bed with AMD. Just because there is no "AMD technology" in it doesn't mean it is not 100% optimized for AMD hardware.

Can you give evidence that there are any special optimizations for AMD hardware relative to nVidia?
 

DrMrLordX

Lifer
Apr 27, 2000
22,707
12,669
136
I'm not the only one who noticed that the CPUs are different between the presented tests, am I? The ADF doing its usual top-notch job.

Change a bunch of variables, then try to compare the results to support a conclusion. The scientific method at work.

What you observe actually bolsters the point made by the OP. There is no change in CPU between the 1.37 and 1.39 benchmarks, where we see AMD's performance collapse. The 1.53 benchmark has a newer, faster CPU (a 5960X), and surprise, surprise: every card is now performing much faster than before.

That's not to say the CPU change was solely responsible for the increase in framerates, but it likely played a part. If you reproduced the conditions of the 1.39 test, changing only the War Thunder version to 1.53 and (maybe) the driver versions to the latest/best-performing drivers available for all the cards involved, you might not see such a big difference in performance between 1.39 and 1.53. You might also see performance from the 290X that is lower than it was under 1.37 (again, using the older/slower drivers).
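For illustration, here is a minimal sketch of the variables confounded across the three runs as described above. The CPU used for the 1.37/1.39 tests and the exact driver versions are not given in the thread, so those entries are placeholders rather than real data:

```python
# Sketch of which variables changed between the GameGPU runs discussed above.
# The CPU used for the 1.37/1.39 tests and the exact driver versions are not
# given in the thread, so those values are placeholders, not real data.

runs = {
    "1.37": {"cpu": "same as the 1.39 run (unspecified)", "drivers": "period drivers", "game": "1.37"},
    "1.39": {"cpu": "same as the 1.37 run (unspecified)", "drivers": "period drivers", "game": "1.39"},
    "1.53": {"cpu": "Core i7-5960X (newer, faster)",      "drivers": "newer drivers",  "game": "1.53"},
}

# A controlled re-test would change only the game version against the 1.39 setup:
controlled = dict(runs["1.39"], game="1.53")

for label, cfg in list(runs.items()) + [("1.39 hardware + 1.53 game", controlled)]:
    print(f"{label}: {cfg}")
```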
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
There is no GameWorks in 1.39 either. Unless you're trying to tell me the developers lied and only disclosed its inclusion this past month.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Can you give evidence that there are any special optimizations for AMD hardware relative to nVidia?

You mean outside of the whole engine? :\
It is optimized for AMD, with its high cache and bandwidth requirements.
A 1350MHz GTX 980 Ti is less than 50% faster than an 1100MHz GTX 780 Ti while having around 70% more usable compute performance.

This game is limited by bandwidth.
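As a rough check of that figure: on raw specs the gap between those two clock speeds is only about 20%, and the ~70% only appears if you also apply Nvidia's advertised ~1.4x per-core efficiency gain for Maxwell, which is a marketing number and an assumption here. Paper memory bandwidth, by contrast, is identical on both cards, which is what the bandwidth-limited argument rests on. A minimal sketch of the arithmetic:

```python
# Back-of-the-envelope numbers behind the compute/bandwidth comparison above.
# The 1.4x "per-core Maxwell efficiency" factor is Nvidia's own marketing
# claim and is an assumption here, not a measurement.

def fp32_tflops(cores, clock_mhz):
    """Theoretical FP32 throughput: 2 FLOPs per CUDA core per clock."""
    return cores * 2 * clock_mhz * 1e6 / 1e12

gtx_780ti = fp32_tflops(2880, 1100)   # GK110 at the quoted 1100 MHz: ~6.3 TFLOPS
gtx_980ti = fp32_tflops(2816, 1350)   # GM200 at the quoted 1350 MHz: ~7.6 TFLOPS

raw_ratio = gtx_980ti / gtx_780ti     # ~1.20x on raw specs
usable_ratio = raw_ratio * 1.4        # ~1.68x if the 1.4x per-core claim holds

# Both cards use a 384-bit bus with 7 Gbps GDDR5, so paper bandwidth is the same.
bandwidth_gbs = 384 / 8 * 7           # 336 GB/s on either card

print(f"raw: {raw_ratio:.2f}x  usable: {usable_ratio:.2f}x  bandwidth: {bandwidth_gbs:.0f} GB/s")
```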
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
You mean outside of the whole engine? :\
It is optimized for AMD, with its high cache and bandwidth requirements.
A 1350MHz GTX 980 Ti is less than 50% faster than an 1100MHz GTX 780 Ti while having around 70% more usable compute performance.

This game is limited by bandwidth.

[TechPowerUp chart: overall relative performance at 3840x2160 (perfrel_3840_2160.png)]

[TechPowerUp chart: Battlefield 4 at 3840x2160 (bf4_3840_2160.png)]


So, a 15-game average compared with BF4, and you say it looks like it favors AMD? The 980 Ti is ~10% faster in BF4 but ~5% slower overall. The 780 Ti is faster than the 290X in BF4 but slower overall. And this is at 4K, where, if the game were starved for bandwidth, it would be even more apparent. If anything, a claim could be made that it favors nVidia, since the performance is better in BF4 than it is overall.

DICE does not gimp any cards, new or old, from either brand. All games should be this well optimized. We'd have better performance and better-looking games.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
More than anything Nvidia has "done", it's AMD and their own core issues that ruin it for me. AMD needs to get their drivers for new architectures into an optimal state quicker. That way they can compete on launch day instead of asking us to "wait bro, it'll eventually be better".

AMD needs to figure out a marketing strategy. I have heard the line that "AMD is an engineering company first". That makes zero sense. It's a business and it should be run as such. That quote makes me never want to buy another AMD product again. Make products consumers want and give me what I need. In a gaming card I don't care about performance in tasks I don't do; there are other cards for those tasks.

AMD needs to figure out its supply issues. Both the Fury X and the 290X were unavailable at launch. The 290X was even worse, since the price skyrocketed and AIB coolers weren't ready, so an otherwise OK GPU got horrible reviews. That's on AMD again. You have a winning chip, and internally, while testing it, you never used the cooler? Ever? Seriously, what does AMD do testing-wise? Do they ever test a GPU before sending it out, or do they just YOLO it? It's an unfinished job each time: just one step away from perfection in key parts, which turns an otherwise great product into one we can easily tear apart.

Sure, maybe eventually we could recommend the Fury X over the 980 Ti in some magical world where the Fury X becomes faster in DX12 games. But who will find those reviews then? They will most likely find one of the reviews where the 980 Ti wins.

I like what AMD wanted to do with the Fury X, but it was just the wrong time. Imagine if AMD had done this with the 290X versus the Titan and the 780/Ti. Imagine coming out with the Fury X style cooler then, and no reference designs. We'd be talking about the Fury X and the R9 290 Sapphire Tri-X, AMD launches the Hawaii GPU series, it's a slaughterhouse... We'd still be picking up broken pieces of Titans from angry Nvidia GPU owners... And with actually decent launch drivers, none of this "wait, eventually AMD comes out ahead." Dude, how long? Are you sure? I'm not buying a product based on hoping for future performance; that's ridiculous. The 290X/290 launch should have been devastating to Nvidia, but AMD can't get good drivers out early and can't plan a GPU release. They're simply inept.

AMD only has itself to blame. It's late to every party, and when it's early, it's hours early. I mean, OK, showing up late is one thing, lol. But who shows up at 9 AM to a party that starts at 11 PM? AMD does.

FreeSync has had a number of monitors, but no standout, amazing monitor to really showcase FreeSync. AMD should have been working with someone to have a FreeSync monitor available with a huge range, strobing, and so on, something that says "this is the best it can be". Working with Asus or any other company to get a premium FreeSync monitor out would be useful; instead AMD throws it to the wolves and we're constantly trying to figure out the real refresh-rate window. Why not have a database on AMD's site so we don't have to guess?

I could go on and on until the food I'm cooking eventually begins to burn, and I still wouldn't be done getting through the list of issues AMD has. GameWorks is annoying, but AMD has other issues far more annoying than GameWorks. Who forgets to put HDMI 2.0 on a GPU meant for HTPC use? AMD does... But don't worry, the adapter is coming soon.

I can't tell which group is more out of touch: the people who defend AMD, or the people who work there.

I don't disagree with you.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
You mean outside of the whole engine? :\
It is optimized for AMD, with its high cache and bandwidth requirements.
A 1350MHz GTX 980 Ti is less than 50% faster than an 1100MHz GTX 780 Ti while having around 70% more usable compute performance.

This game is limited by bandwidth.

Which is in line with full PBR engines needing more bandwidth.

So you are implying DICE shouldn't make their engine fully PBR and raise the bar in graphics fidelity because a (mostly) bandwidth-starved Maxwell v2 doesn't gain as much as it does in other games with lower overall IQ?

This only proves that Nvidia got greedy with their memory controller design this time around. They could get away with narrower buses on Kepler versus GCN 1.0-1.1, but they went even further with Maxwell v2.

First the crossbar issue found in the 970, leaving it with 3.5GB of fast memory plus 0.5GB of dog-slow memory; now the whole lineup looks pale in full PBR engines like the one in BF:SW (with the 960 being a yellow flag against its direct competitors). If full PBR is the new black in graphics engine trends, I see a new Kepler situation on the horizon for Nvidia and Maxwell v2's aging.
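To put rough numbers on the bus-width point, here is a sketch of paper memory bandwidth (bus width over 8, times effective data rate) for the Maxwell v2 cards versus the GCN parts they were priced against. Reference memory speeds are assumed and partner boards differ, so treat these as ballpark figures:

```python
# Paper memory bandwidth (GB/s) = bus width in bits / 8 * effective data rate in Gbps.
# Reference memory speeds assumed; partner boards ship other clocks.

def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

cards = {
    # Maxwell v2
    "GTX 960 (128-bit)":     bandwidth_gbs(128, 7.0),   # ~112 GB/s
    "GTX 970 (256-bit)":     bandwidth_gbs(256, 7.0),   # ~224 GB/s on paper; the last
                                                        # 0.5 GB segment is far slower
    "GTX 980 (256-bit)":     bandwidth_gbs(256, 7.0),   # ~224 GB/s
    "GTX 980 Ti (384-bit)":  bandwidth_gbs(384, 7.0),   # ~336 GB/s
    # GCN competitors at similar price points
    "R9 380 (256-bit)":      bandwidth_gbs(256, 5.5),   # ~176 GB/s (some boards run 5.7 Gbps)
    "R9 390 (512-bit)":      bandwidth_gbs(512, 6.0),   # ~384 GB/s
    "Fury X (4096-bit HBM)": bandwidth_gbs(4096, 1.0),  # ~512 GB/s
}

for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")
```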
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
DICE, more specifically Johan Andersson, has been a talking head for AMD for a while now. Anyone who doesn't see that should remove their head from the sand.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
DICE, more specifically Johan Andersson, has been a talking head for AMD for a while now. Anyone who doesn't see that should remove their head from the sand.

That is not evidence that their engine is hurting nVidia. I've already posted graphs that show the opposite.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
DICE, more specifically Johan Andersson, has been a talking head for AMD for a while now. Anyone who doesn't see that should remove their head from the sand.

I would argue that his various statements about AMD have been largely a result of AMD's connection to Mantle (which Johan Andersson is the main author of), and not really anything more sinister than that.
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
That is not evidence that their engine is hurting nVidia. I've already posted graphs that show the opposite.

I never said anything about their engine, stop putting words in my mouth! :'(

I would argue that his various statements about AMD have been largely a result of AMD's connection to Mantle (which Johan Andersson is the main author of), and not really anything more sinister than that.

I don't think anything sinister is going on. But to say there is no connection between AMD and DICE is wrong.

Most of this is off-topic anyway; I honestly thought I was in the 2x Fiji thread, which is why I mentioned @repi.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The whole Frostbite engine is a "proprietary black box" developed by a company which is in bed with AMD. Just because there is no "AMD technology" in it doesn't mean it is not 100% optimized for AMD hardware.

I never said anything about their engine, stop putting words in my mouth! :'(


It is not anti-competitive when a developer doesn't care about other companies. DICE has done the same with Battlefront and nobody is calling the collaboration between AMD and DICE/EA "anti-competitive".
Again, show any shred of evidence that DICE doesn't care about "other companies" (maybe you want to clarify that? You do mean nVidia, don't you? Not Ford or GM, or someone else?)

Most of this is off-topic anyway; I honestly thought I was in the 2x Fiji thread, which is why I mentioned @repi.

Oh, so all of this was in regards to the Fiji X2? I missed that.
 

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
[TechPowerUp chart: overall relative performance at 3840x2160 (perfrel_3840_2160.png)]

[TechPowerUp chart: Battlefield 4 at 3840x2160 (bf4_3840_2160.png)]


So, a 15-game average compared with BF4, and you say it looks like it favors AMD? The 980 Ti is ~10% faster in BF4 but ~5% slower overall. The 780 Ti is faster than the 290X in BF4 but slower overall. And this is at 4K, where, if the game were starved for bandwidth, it would be even more apparent. If anything, a claim could be made that it favors nVidia, since the performance is better in BF4 than it is overall.

DICE does not gimp any cards, new or old, from either brand. All games should be this well optimized. We'd have better performance and better-looking games.

Pretty impressive results at 4K for the Fury X. In retrospect I haven't been giving it as much credit as I should have. Is this 4K competitiveness a result of newer drivers? I recall the 980 Ti being quicker at launch, even at 4K at stock clocks, so I am kind of surprised to see the Fury X's relative performance at 4K being better than the 980 Ti's.

These older results show the 980 Ti winning ~90% of the 4K tests: http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/13

Overclocking is the spoiler, but at reference speeds it is an impressive showing by the Fury X.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
I don't think anything sinister is going on. But to say there is no connection between AMD and DICE is wrong.

Most of this is off-topic anyway; I honestly thought I was in the 2x Fiji thread, which is why I mentioned @repi.

Sure there's a connection, but a simple connection is quite a different thing from being a "talking head" as you claimed. Johan Andersson is (was) certainly a "talking head" for Mantle (which is only natural, seeing as he was the main architect behind it), but I don't think you can claim that he was ever a "talking head" for AMD in any capacity beyond Mantle.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Pretty impressive results at 4K for the Fury X. In retrospect I haven't been giving it as much credit as I should have. Is this 4K competitiveness a result of newer drivers? I recall the 980 Ti being quicker at launch, even at 4K at stock clocks, so I am kind of surprised to see the Fury X's relative performance at 4K being better than the 980 Ti's.

These older results show the 980 Ti winning ~90% of the 4K tests: http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/13

Overclocking is the spoiler, but at reference speeds it is an impressive showing by the Fury X.

Nah, notice it's the overall scores he linked. TechPowerUp changed their lineup and removed some NV-leaning games. But if you look at the games featured in the old lineup, you see no change:

[TechPowerUp charts: Shadow of Mordor at 3840x2160, August 2015 (som_3840_2160.gif) vs. November 2015 (som_3840_2160.png)]


52.7 vs 51.6, well within the margin of error.

EDIT: Also, higher resolutions tend to benefit Fury cards more than Nvidia. In the same game at lower resolutions it's a complete reversal:
[TechPowerUp chart: Shadow of Mordor at 1920x1080 (som_1920_1080.png)]


At 1440p where I was gaming, the Fury X fell short of beating out the 980 Ti.
 

DrMrLordX

Lifer
Apr 27, 2000
22,707
12,669
136
Ask the developer. You seem to be looking for an excuse.

An excuse? Neither AMD nor I caused the slowdown.

You can go through every changelog and tell me what you find:
http://warthunder.com/en/game/changelog

Erenhardt already did, and found that the dev was already working with Nvidia before the October 2nd blog entry.

Pretty impressive results at 4K for the Fury X. In retrospect I haven't been giving it as much credit as I should have. Is this 4K competitiveness a result of newer drivers?

Probably so, though the Fury X has always done its best at 4K vs. the 980 Ti. It is less competitive at lower resolutions, which is very interesting. With the RAM situation being what it is on the Fury X, you'd think it would be the other way around.
 