GameGPU Gears of War Ultimate Edition Benchmarks

Page 3 - AnandTech community forums

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
I wonder how Intel lost the suit over paying PC builders not to use AMD CPUs back then, and even gimping AMD with their compilers?

It's actually a very similar situation here: paying devs to implement poorly optimized code that runs much worse on competitor hardware.

Do you have definitive/hard proof of said wrongdoing, however? Any developers coming forward to back up these claims, or bank statements proving Nvidia paid off developers? Without said proof, this conspiracy theory is just that: a conspiracy theory.
 
Feb 19, 2009
10,457
10
76
Do you have definitive/hard proof of said wrongdoing, however? Any developers coming forward to back up these claims, or bank statements proving Nvidia paid off developers? Without said proof, this conspiracy theory is just that: a conspiracy theory.

Yeah, that's why I said I wonder how Intel lost that suit; it would have taken some real damning evidence.
 

Game_dev

Member
Mar 2, 2016
133
0
0
I have always wondered: is AMD able to press charges against NV/devs after witnessing all these practices?

I mean, a game not supporting some very specific features from one IHV to the next is one thing, but making games a mess is totally different, is it not?

If there was any illegal activity a suit would have been filed long ago. Corporations the size of AMD have an army of lawyers.

You can't go to court and say the competition is beating you and that's not fair.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
I wonder how Intel lost the suit over paying PC builders not to use AMD CPUs back then, and even gimping AMD with their compilers?

It's actually a very similar situation here: paying devs to implement poorly optimized code that runs much worse on competitor hardware.

Intel did not lose; AMD just used the situation to force Intel into allowing them to use GlobalFoundries (a breach of contract on AMD's side that they had already committed), and AMD succeeded in that.

http://www.amd.com/Documents/AMD_Intel_Settlement_Agreement_-_Full.pdf
Under the terms of the Settlement Agreement, AMD agreed to drop all pending litigation against Intel, including the case in the U.S. District Court in Delaware and the two cases pending in Japan. AMD also agreed to withdraw all of its regulatory complaints against Intel worldwide. AMD and Intel obtained patent rights from a new 5-year cross license agreement, and AMD and Intel relinquished any claims of breach from the previous license agreement. Intel also entered into a license agreement with Global Foundries, Inc., a manufacturing entity formed by AMD and Advanced Technology Investment Company. The parties agreed that the settlement was intended solely as a compromise of disputed claims, and was not to be understood as a concession or determination that either party has engaged in any wrongdoing.
 

airfathaaaaa

Senior member
Feb 12, 2016
692
12
81
Why do they need to "admit" anything? Games often use game engines licensed from third parties without saying on the box what engine it uses. What's next, should each setting in the game include a particular author credit for that engine feature, along with whatever "biases" that programmer might have?

Get real.

Anyway, interesting to see this huffing and puffing from somebody who strongly implied to have had a part in NVIDIA losing a bit of market segment share last quarter...
Because hiding features that can potentially harm competitors is the very nature of corporate-to-corporate sabotage, which falls under the economic espionage law of 1996...
https://en.wikipedia.org/wiki/Economic_Espionage_Act_of_1996
(The whole AMD vs. Intel mess had a foothold in this act back then, and that case was far shadier than this one.)
 

Hitman928

Diamond Member
Apr 15, 2012
6,642
12,245
136
My GF gifted me this game (seems I didn't rant to her about how broken it was, haha), and I'm scratching my head about where the game is broken for Nvidia. It runs pretty smooth on my rig. (Actually, I should test it on the 290X.)

However, I spent about 20 minutes in matchmaking and just gave up.

I read you can get a refund, so that's what I'll be doing.

The matchmaking system is also currently broken from what I've read on their forum.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
It's actually a very similar situation here, paying devs to implement poorly optimized code that runs much worse on competitor hardware.

Why is everyone so sure Gameworks is what messes up AMD in this game?

I mean, the 280x and 7970 SLAY. Those cards aren't "gimped" at all; they are out-performing the equivalent Nvidia cards (even the OG Titan, the old twice-as-expensive rival). The 7870 is also doing great, but the 7950 does so-so.

If Nvidia was sabotaging AMD cards with Gameworks, why does full Tahiti do great while cut Tahiti (and Hawaii and Fiji) does not? We have never seen Gameworks pick and choose which AMD cards to boost before; it was just all Kepler and all AMD that took a dive. I don't even know if Nvidia knows how to make JUST full Tahiti and full Pitcairn do better while making everything else tank. That sounds like the magic JFK bullet to me; I can't believe it.

I think the real answer no one wants to face, for some reason, is the one that makes sense to me: the game developers did a terrible job polishing the game. Maybe they had QA systems with every Maxwell card, but on the AMD side they only had two: one with a 280x and one with a 270x. Putting the blame on the developer is the only thing that makes sense to me. Gameworks can't be blamed for this crime.

Hopefully that isn't an indication of the QA testing we'll get in the DirectX 12 era. It would be scary if PC games became like Android games, where they only run well on the actual devices the developer bothered to QA with, even though technically all devices should work fine.
 

Hitman928

Diamond Member
Apr 15, 2012
6,642
12,245
136
Why is everyone so sure Gameworks is what messes up AMD in this game?

I mean, the 280x and 7970 SLAY. Those cards aren't "gimped" at all; they are out-performing the equivalent Nvidia cards (even the OG Titan, the old twice-as-expensive rival). The 7870 is also doing great, but the 7950 does so-so.

If Nvidia was sabotaging AMD cards with Gameworks, why does full Tahiti do great while cut Tahiti (and Hawaii and Fiji) does not? We have never seen Gameworks pick and choose which AMD cards to boost before; it was just all Kepler and all AMD that took a dive. I don't even know if Nvidia knows how to make JUST full Tahiti and full Pitcairn do better while making everything else tank. That sounds like the magic JFK bullet to me; I can't believe it.

I think the real answer no one wants to face, for some reason, is the one that makes sense to me: the game developers did a terrible job polishing the game. Maybe they had QA systems with every Maxwell card, but on the AMD side they only had two: one with a 280x and one with a 270x. Putting the blame on the developer is the only thing that makes sense to me. Gameworks can't be blamed for this crime.

Hopefully that isn't an indication of the QA testing we'll get in the DirectX 12 era. It would be scary if PC games became like Android games, where they only run well on the actual devices the developer bothered to QA with, even though technically all devices should work fine.

Agree 100%.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The matchmaking system is also currently broken from what I've read on their forum.

Yeah, this game has been a mess. I filed for a refund and it's in progress (I expect it to be approved by Monday).

Sadly, the game is now locked on my Live account, otherwise I'd bench the 290X.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
If there was any illegal activity a suit would have been filed long ago. Corporations the size of AMD have an army of lawyers.

You can't go to court and say the competition is beating you and that's not fair.

They can't really. Even if there is compelling evidence, they'd be persecuted all over the web as cry babies, poor sports, etc.

The funny thing is that it's Nvidia that needs to play dirty to be competitive with the underdog, and they get rewarded with $'s by those who feel beating the underdog with dirty tactics is somehow what's best for all concerned. AMD with Nvidia's $'s would probably move the PC master race along at a much faster pace in the end.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The funny thing is that it's Nvidia that needs to play dirty to be competitive with the underdog, and they get rewarded with $'s by those who feel beating the underdog with dirty tactics is somehow what's best for all concerned. AMD with Nvidia's $'s would probably move the PC master race along at a much faster pace in the end.

Question, when did ATI GPUs become the underdog? When AMD bought em or after AMD basically cratered them?

I never recall ATI being considered underdogs. Hell, they fought on equal grounds with NV.
 

linkgoron

Platinum Member
Mar 9, 2005
2,598
1,238
136
They can't really. Even if there is compelling evidence, they'd be persecuted all over the web as cry babies, poor sports, etc.

The funny thing is that it's Nvidia that needs to play dirty to be competitive with the underdog, and they get rewarded with $'s by those who feel beating the underdog with dirty tactics is somehow what's best for all concerned. AMD with Nvidia's $'s would probably move the PC master race along at a much faster pace in the end.

While this is a bit off-topic, I disagree that nVidia needs to play dirty to be competitive. nVidia has great hardware and great engineering.

The factory-OC'd 980 Ti cards are easily the best today, and the 980 was dominant for quite a while before that. AMD really blew it with the 290X release, and it wasn't nVidia's fault.

http://www.anandtech.com/show/7481/the-amd-radeon-r9-290-review/17

The problem is that while the 290 is a fantastic card and a fantastic story on a price/performance basis, in chasing that victory AMD has thrown caution into the wind and thrown out any kind of balance between performance and noise. At 57.2dB the 290 is a loud card. A very loud card. An unreasonably loud card. AMD has quite simply prioritized performance over noise, and these high noise levels are the price of doing so.

To get right to the point then, this is one of a handful of cards we’ve ever had to recommend against. The performance for the price is stunning, but we cannot in good faith recommend a card this loud when any other card is going to be significantly quieter. There comes a point where a video card is simply too loud for what it does, and with the 290 AMD has reached it.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Why is everyone so sure Gameworks is what messes up AMD in this game?

I mean, the 280x and 7970 SLAY. Those cards aren't "gimped" at all; they are out-performing the equivalent Nvidia cards (even the OG Titan, the old twice-as-expensive rival). The 7870 is also doing great, but the 7950 does so-so.

If Nvidia was sabotaging AMD cards with Gameworks, why does full Tahiti do great while cut Tahiti (and Hawaii and Fiji) does not? We have never seen Gameworks pick and choose which AMD cards to boost before; it was just all Kepler and all AMD that took a dive. I don't even know if Nvidia knows how to make JUST full Tahiti and full Pitcairn do better while making everything else tank. That sounds like the magic JFK bullet to me; I can't believe it.

I think the real answer no one wants to face, for some reason, is the one that makes sense to me: the game developers did a terrible job polishing the game. Maybe they had QA systems with every Maxwell card, but on the AMD side they only had two: one with a 280x and one with a 270x. Putting the blame on the developer is the only thing that makes sense to me. Gameworks can't be blamed for this crime.

Hopefully that isn't an indication of the QA testing we'll get in the DirectX 12 era. It would be scary if PC games became like Android games, where they only run well on the actual devices the developer bothered to QA with, even though technically all devices should work fine.

I tend to agree with this guy.:thumbsup:
 

Game_dev

Member
Mar 2, 2016
133
0
0
Question, when did ATI GPUs become the underdog? When AMD bought em or after AMD basically cratered them?

I never recall ATI being considered underdogs. Hell, they fought on equal grounds with NV.

Wasn't the combined power of AMD and ATI supposed to dominate, with NVIDIA pushed out?

Also, having all the consoles was supposed to do the same thing.

People are starting to sing the same tune about Polaris and Zen.
 

linkgoron

Platinum Member
Mar 9, 2005
2,598
1,238
136
Wasn't the combined power of AMD and ATI supposed to dominate, with NVIDIA pushed out?

Also, having all the consoles was supposed to do the same thing.

People are starting to sing the same tune about Polaris and Zen.

Do you have a link to support that? Anyone saying that AMD+ATI would *dominate* nVidia and "push" them out? I'd prefer someone reputable, and not some random fanboy on a random forum.

I'd also love to see a link to someone saying anything similar about Polaris and Zen.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,348
642
121
Why is everyone so sure Gameworks is what messes up AMD in this game?

I mean, the 280x and 7970 SLAY. Those cards aren't "gimped" at all; they are out-performing the equivalent Nvidia cards (even the OG Titan, the old twice-as-expensive rival). The 7870 is also doing great, but the 7950 does so-so.

If Nvidia was sabotaging AMD cards with Gameworks, why does full Tahiti do great while cut Tahiti (and Hawaii and Fiji) does not? We have never seen Gameworks pick and choose which AMD cards to boost before; it was just all Kepler and all AMD that took a dive. I don't even know if Nvidia knows how to make JUST full Tahiti and full Pitcairn do better while making everything else tank. That sounds like the magic JFK bullet to me; I can't believe it.

I think the real answer no one wants to face, for some reason, is the one that makes sense to me: the game developers did a terrible job polishing the game. Maybe they had QA systems with every Maxwell card, but on the AMD side they only had two: one with a 280x and one with a 270x. Putting the blame on the developer is the only thing that makes sense to me. Gameworks can't be blamed for this crime.

Hopefully that isn't an indication of the QA testing we'll get in the DirectX 12 era. It would be scary if PC games became like Android games, where they only run well on the actual devices the developer bothered to QA with, even though technically all devices should work fine.

I don't think people are saying Gameworks is the only problem with this game. I think the people who say Gameworks is an issue know full well that there are far more issues at play here. This game is clearly a mess.
 

Mercennarius

Senior member
Oct 28, 2015
466
84
91
I've had both a Sapphire 290X and 390X and neither were loud under load. In fact when gaming I rarely even hear my 390X. Pretty quiet actually.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Do you have a link to support that? Anyone saying that AMD+ATI would *dominate* nVidia and "push" them out? I'd prefer someone reputable, and not some random fanboy on a random forum.

I'd also love to see a link to someone saying anything similar about Polaris and Zen.
You are asking him to back up and substantiate his claims? :D That goes against the entire purpose of his account and all his posts thus far.
 

linkgoron

Platinum Member
Mar 9, 2005
2,598
1,238
136
I've had both a Sapphire 290X and 390X and neither were loud under load. In fact when gaming I rarely even hear my 390X. Pretty quiet actually.

Sure, now they're quiet. I know, I have an R9 390. From what I gather, at release, AMD's reference coolers were horrible. Am I wrong? The name was ruined; they botched the release. Things were fixed later, but when people Googled, I'm sure the reference coolers were what they saw.
 
Feb 19, 2009
10,457
10
76
Kepler generation.

~38% market share* isn't bad. It was Maxwell 970/980 that made them sink to ~20%, then ~18%, and now back up to ~23%.

It's odd to say the R290 and R290X weren't competitive versus Kepler 780, Titan and 780Ti.

Or to suggest the 7950 and 7970 weren't competitive versus Kepler 670 and 680.


* Source: https://jonpeddie.com/press-release...ket-down-in-q2-nvidia-holds-market-share-lead

[Image: JPR AIB GPU market share chart]


It went downhill from there after Maxwell's launch in Q3 2014. Prior to that, they were never a major underdog, just 40:60 or even 50:50 going back further, despite much lower R&D spending.
 
Last edited:

desprado

Golden Member
Jul 16, 2013
1,645
0
0
~38% market share* isn't bad. It was Maxwell 970/980 that made them sink to ~20%, then ~18%, and now back up to ~23%.

It's odd to say the R290 and R290X weren't competitive versus Kepler 780, Titan and 780Ti.

Or to suggest the 7950 and 7970 weren't competitive versus Kepler 670 and 680.


* Source: https://jonpeddie.com/press-release...ket-down-in-q2-nvidia-holds-market-share-lead

[Image: JPR AIB GPU market share chart]


It went downhill from there after Maxwell's launch in Q3 2014. Prior to that, they were never a major underdog, just 40:60 or even 50:50 going back further, despite much lower R&D spending.

It was Maxwell's performance per watt and efficiency that caused problems for AMD.

People forget that the R9 390 uses nearly 50% more power to deliver 10% more performance than the GTX 970. That is why Maxwell is a goldmine for OC.

"However, our chosen stress test - the street shoot-out when seizing your base of operations - sees a significant performance drop on AMD, where it only narrowly outperforms the GTX 970. It's an interesting example of how benchmarking specific games can see performance differentials between the two vendors change significantly according to context. Regardless, 1080p60 on ultra is off the table for both cards - we got closer by dropping down to high settings with medium shadow resolution. This improved in-game fluidity no end, but it required a meaty +200MHz core overclock along with +400MHz to the memory in order to get the GTX 970 to more consistently hit the target. Generally speaking, we don't recommend overclocking the R9 390 beyond the 1050-1070MHz territory seen in factory-overclocked cards - power consumption and heat generation rise enormously."
https://www.youtube.com/watch?v=Jne8VWuE2a4

http://www.eurogamer.net/articles/digitalfoundry-2016-the-division-pc-tech-analysis

The R9 390 is an OC edition of the R9 290, and it has very little OC headroom left.

If you OC a GTX 970 you get 20% more performance out of it; however, the red team's demand is that benchmarks be fair, so they are fair.
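For what it's worth, the perf-per-watt claim above can be sanity-checked with some back-of-the-envelope arithmetic. The ~50% power and ~10% performance deltas are the poster's figures, not independent measurements, so treat this purely as an illustration of what those numbers would imply:

```python
# Back-of-the-envelope perf-per-watt comparison using the figures
# quoted in the post above (assumed, not measured): the R9 390 draws
# ~50% more power than a GTX 970 while delivering ~10% more performance.
gtx970_power, gtx970_perf = 1.00, 1.00   # normalized baseline
r9_390_power, r9_390_perf = 1.50, 1.10   # relative to the GTX 970

# Efficiency = performance delivered per unit of power drawn.
r9_390_eff = r9_390_perf / r9_390_power
maxwell_advantage = (gtx970_perf / gtx970_power) / r9_390_eff

print(f"R9 390 perf/watt relative to GTX 970: {r9_390_eff:.2f}")   # ~0.73
print(f"Implied Maxwell perf/watt advantage: ~{maxwell_advantage:.2f}x")  # ~1.36x
```

Under those assumed numbers, Maxwell would hold roughly a 1.36x perf-per-watt lead, which is consistent with the thread's point that efficiency, not raw performance, was where the 970 pulled ahead.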
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Kepler generation.

Must be something recent. I guess when they ditched the ATI name, the storied history of AMD carried over and suddenly their GPUs were "underdogs."

Telling you, the AMD name is worthless. Glad they're trying to distance themselves from it.