Gears of War Ultimate Edition Benchmarks


tential

Diamond Member
May 13, 2008
7,348
642
121
Having no new product to answer Maxwell with for a long time obviously leads to a massive loss in market share.

The simplest answer is most often the most obvious.
Must be something recent. I guess when they ditched the ATI name, the storied history of AMD carried over. Suddenly their GPUs were "underdogs."

Telling you, the AMD name is worthless. Glad they're trying to distance themselves from it.

The genius at AMD who bought ATi, then was like "We have this brand value that's worth something on the balance sheet.... Let's destroy it!!!!!"

Like why? That was actual money.

"AMD Acquires ATi Graphics to continue selling ATi GPUs and improve AMD CPUs."
You keep ATi Brand value and you get to bolster your own brand with a well known and liked brand.

The number of bad moves AMD can make is mind-boggling.
 
Last edited:

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
I think one of the things that hurt AMD badly was the long stretch when they were selling what became the 390 as the 290, because the reference cooler blackened the card's reputation even though the aftermarket cards were amazing. They had a product that was worth sales, and it wasn't selling. That the 290 line got an immediate boost when they released the 390 shows what they were leaving on the table.

Getting hosed on 20nm and not having the money or the strategy to make something good on 28nm really hurt them badly, and I'm hoping it's a one-time thing.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
I think one of the things that hurt AMD badly was the long stretch when they were selling what became the 390 as the 290, because the reference cooler blackened the card's reputation even though the aftermarket cards were amazing. They had a product that was worth sales, and it wasn't selling. That the 290 line got an immediate boost when they released the 390 shows what they were leaving on the table.

Getting hosed on 20nm and not having the money or the strategy to make something good on 28nm really hurt them badly, and I'm hoping it's a one-time thing.

The main thing that hurt AMD is its own community. They hype AMD's products to the extreme, and when a product releases it becomes a disappointment.

For example, a few posters were saying Fury X would be god-like, 20% to 40% faster than Titan X, etc.

Another example: Lisa said Fury X was an overclocker's dream and the world's fastest card, and then some of the same posters on this website hyped the card to the extreme, which led to Fury X being seen as a failure.

The same thing happened with Bulldozer, where some posters hyped that CPU and said it would be 40% faster than Sandy Bridge, and it ended up 40% slower than Sandy Bridge.
 
Last edited:
Feb 19, 2009
10,457
10
76
For example, a few posters like 3D, Sliverfox, etc. were saying Fury X would be god-like, 20% to 40% faster than Titan X.

Stop making up FUD. I never said such things. You don't understand English and the subtleties of "maybe", "could be", or "should be" versus "it will be", "it's going to be", or "it definitely is"...

I said at the time that I was hoping it would be ~15% above Titan X so I could upgrade. It turned out not to be, and so I did not buy it.

You seem to keep spreading these lies, putting words in my mouth. That must be some kind of rule-breaking going on there, right?
 

PhonakV30

Senior member
Oct 26, 2009
987
378
136
The main thing that hurt AMD is its own community. They hype AMD's products to the extreme, and when a product releases it becomes a disappointment.

For example, a few posters like 3D, Sliverforce, etc. were saying Fury X would be god-like, 20% to 40% faster than Titan X.

Another example: Lisa said Fury X was an overclocker's dream and the world's fastest card, and then some of the same posters on this website hyped the card to the extreme, which led to Fury X being seen as a failure.

The same thing happened with Bulldozer, where some posters hyped that CPU and said it would be 40% faster than Sandy Bridge, and it ended up 40% slower than Sandy Bridge.

AMD CEO Lisa Su never said it. Please find a source. It was Joe Macri who said it was an overclocker's dream, not Lisa Su.

Source: PCWorld
The Radeon R9 Fury X comes with integrated closed-loop water cooling, much like AMD’s dual-GPU Radeon R9 295x2. “You’ll be able to overclock this thing like no tomorrow,” AMD CTO Joe Macri says. “This is an overclocker’s dream.”


Source: WCCFTech

Matt Skynner, CVP and General Manager of AMD's GPU and APU products, touted the company's upcoming flagship Fury X as the world's fastest GPU. He proclaimed at the company's Computex press conference that "HBM enables us to build the fastest GPU in the world" when referring to Fiji, reports Hardwareluxx.

You really try hard. Control your emotions, and before that, control your mouth.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,348
642
121
Stop making up FUD. I never said such things. You don't understand English and the subtleties of "maybe", "could be", or "should be" versus "it will be", "it's going to be", or "it definitely is"...

I said at the time that I was hoping it would be ~15% above Titan X so I could upgrade. It turned out not to be, and so I did not buy it.

You seem to keep spreading these lies, putting words in my mouth. That must be some kind of rule-breaking going on there, right?

It's a free speech forum. You're welcome to claim that Fury X is 20% faster than the 980Ti right now. Since we don't require quotes for these claims, it really doesn't matter.

You can claim the CEO of Nvidia will personally come to your house and break your PC if you use AMD. No one is really stopping you.

It just ruins your credibility.... lol... like that matters here anymore.

To me, the 980Ti is one of the best cards to recommend to a person.
For me personally? I would get Fury X Crossfire if I'm at the high end. A single 980Ti is utterly useless to me. 980Ti SLI has heat concerns. Fury X Crossfire makes far more sense. And I wouldn't game at 1440p, which is where Nvidia shines. I'd game at 4K.

So I could really claim Fury X is the best card out. I wouldn't be wrong. It's just how you feel.

In desperado's case, he considers getting a 980Ti on sale, max OCing it, vs. a Fury X, and comparing at 1440p. He's not wrong. That's the resolution/product he'd buy, and I wouldn't buy a Fury X to max OC it and play at 1440p either....

Just like he's not wrong that a 970 is better than a 390, because he isn't considering 1440p. He wants to play at 1080p on such a GPU, and Nvidia shines at lower resolutions.
I personally would rather not game than game at 1080p. After VSR/DSR came out, I couldn't handle not playing at 1440p. But he isn't into DSR/VSR, so for him Nvidia does better at the 1080p he likes.

Really you can claim what you want, twist your story to fit it. That's life...

You could say the 980Ti was a failure and that if it had Freesync compatibility, better 4K performance, Async compatibility, and the latest VRAM tech it'd be good. You wouldn't be wrong either....
 
Last edited:

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
It's a free speech forum. You're welcome to claim that Fury X is 20% faster than the 980Ti right now. Since we don't require quotes for these claims, it really doesn't matter.

I'm actually kind of curious, because the one thing the rules are a bit touchy about is habitual attacks, and you could make an argument that repeated libeling of a person is in fact an attack. I'd kind of expect it to be, since it's a loophole you can drive a truck through otherwise.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
It's a free speech forum. You're welcome to claim that Fury X is 20% faster than the 980Ti right now. Since we don't require quotes for these claims, it really doesn't matter.

You can claim the CEO of Nvidia will personally come to your house and break your PC if you use AMD. No one is really stopping you.

It just ruins your credibility.... lol... like that matters here anymore.

To me, the 980Ti is one of the best cards to recommend to a person.
For me personally? I would get Fury X Crossfire if I'm at the high end. A single 980Ti is utterly useless to me. 980Ti SLI has heat concerns. Fury X Crossfire makes far more sense. And I wouldn't game at 1440p, which is where Nvidia shines. I'd game at 4K.

So I could really claim Fury X is the best card out. I wouldn't be wrong. It's just how you feel.

In desperado's case, he considers getting a 980Ti on sale, max OCing it, vs. a Fury X, and comparing at 1440p. He's not wrong. That's the resolution/product he'd buy, and I wouldn't buy a Fury X to max OC it and play at 1440p either....

Just like he's not wrong that a 970 is better than a 390, because he isn't considering 1440p. He wants to play at 1080p on such a GPU, and Nvidia shines at lower resolutions.
I personally would rather not game than game at 1080p. After VSR/DSR came out, I couldn't handle not playing at 1440p. But he isn't into DSR/VSR, so for him Nvidia does better at the 1080p he likes.

Really you can claim what you want, twist your story to fit it. That's life...

You could say the 980Ti was a failure and that if it had Freesync compatibility, better 4K performance, Async compatibility, and the latest VRAM tech it'd be good. You wouldn't be wrong either....
haha, this is a pretty good argument against both sides. It's just a matter of goalpost shifting on both sides. The best argument for AMD, though, is that everything you buy has the potential to age really, really well vs. any equivalent NV GPU at the time of purchase :D

That is huge for people who don't buy a GPU every year, which is most people.
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
Last time I bought AMD, that a-hole had my cat poisoned.

I've bought AMD the last several generations...thank goodness I don't have any pets... :D

Can confirm Nvidia Pet Euthanasia Program (NPEP). I've lost 3 fish and a dog (not mine, family members not off limits by the way) to those bastards. :colbert:

EDIT:

I never fed my fish so I'm now wondering how Nvidia managed to poison them. Must have something to do with Nvidia Water Reconditioning Program (NWRP). Somebody has to get a handle on these guys: first Gameworks, now this?
 
Last edited:

Goatsecks

Senior member
May 7, 2012
210
7
76
Can confirm Nvidia Pet Euthanasia Program (NPEP). I've lost 3 fish and a dog (not mine, family members not off limits by the way) to those bastards. :colbert:

EDIT:

I never fed my fish so I'm now wondering how Nvidia managed to get to them. Must have something to do with Nvidia Water Reconditioning Program (NWRP). Somebody has to get a handle on these guys: first Gameworks, now this?

I've heard that Nvidia GPUs emit a certain radiation that turns people into terrorists.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I'm actually kind of curious, because the one thing the rules are a bit touchy about is habitual attacks, and you could make an argument that repeated libeling of a person is in fact an attack. I'd kind of expect it to be, since it's a loophole you can drive a truck through otherwise.

The most important thing to understand about the rules is that the mods typically only respond to reported posts. There are some posters who report every time someone gets a little snarky with them. When someone says, "I don't know how you get away with posting such FUD, etc...", if you don't report them, that's part of the reason. Also, mods are human, and therefore they have biases too. Some are better at realizing it and managing them than others. So you don't always get even-handed justice. It's only a forum though, so nothing to get too upset about. Unless someone makes their living here. ;)
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
haha, this is a pretty good argument against both sides. It's just a matter of goalpost shifting on both sides. The best argument for AMD, though, is that everything you buy has the potential to age really, really well vs. any equivalent NV GPU at the time of purchase :D

That is huge for people who don't buy a GPU every year, which is most people.
Well, as far as GCN goes anyway. As for those who bought into the HD 6000 series for the long term...

For supporting older architectures, AMD and Nvidia both royally suck. It just so happens that AMD stuck to the same/similar architecture for an abnormally long time this time around.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Well, as far as GCN goes anyway. As for those who bought into the HD 6000 series for the long term...

For supporting older architectures, AMD and Nvidia both royally suck. It just so happens that AMD stuck to the same/similar architecture for an abnormally long time this time around.
well, going for a new uarch and a die shrink from 6000 to 7000 series, I do expect a huge performance difference. if every year, we get a new uarch and die shrink, you will never see me complain about lack of support for old, outdated tech. maybe I will go back to yearly upgrades again when that happens.
 

Game_dev

Member
Mar 2, 2016
133
0
0
The main thing that hurt AMD is its own community. They hype AMD's products to the extreme, and when a product releases it becomes a disappointment.

For example, a few posters were saying Fury X would be god-like, 20% to 40% faster than Titan X, etc.

Another example: Lisa said Fury X was an overclocker's dream and the world's fastest card, and then some of the same posters on this website hyped the card to the extreme, which led to Fury X being seen as a failure.

The same thing happened with Bulldozer, where some posters hyped that CPU and said it would be 40% faster than Sandy Bridge, and it ended up 40% slower than Sandy Bridge.

I'm starting to hear the same type of "hype" regarding Polaris and Zen.
 

parvadomus

Senior member
Dec 11, 2012
685
14
81

That's a really interesting driver release. I always thought Fiji throttles a lot to stay power efficient, but at the cost of performance.
And this driver fixes GoW: "Core clocks may not maintain sustained clock speeds resulting in choppy performance and or screen corruption (Fixed with new Power Efficiency feature toggled to off)".

And then they add the following to the configurable settings: "Power Efficiency Toggle: A new feature introduced in the Radeon™ Settings Gaming tab for select AMD Radeon™ 300 series and AMD Radeon™ Fury X available under "Global Options". This allows the user to disable some power efficiency optimizations."

Maybe Fiji users gain single-digit performance (or maybe more?) in all games with that turned off.
 

linkgoron

Platinum Member
Mar 9, 2005
2,598
1,238
136
Wccftech just tested.
http://wccftech.com/amd-radeon-crimson-16-3-drivers/

Code:
Graphics Card     Fury (16.2.1)   Fury (16.3)   GTX 980
1440p (avg FPS)   27 FPS          40 FPS        60 FPS
4K (avg FPS)      10 FPS          15 FPS        30 FPS

Also some Fury Nano benches:
http://www.golem.de/news/radeon-sof...unigt-gears-of-war-drastisch-1603-119665.html

[Chart: Gears of War, integrated benchmark (D3D12, max details)]

1080p is basically the same;
42 to 53 FPS at 1440p.

Obviously they're running something different from wccftech, but it seems the drivers really do change things significantly.
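For scale, here's a quick sketch of the percentage uplifts implied by the wccftech table above (using only the FPS figures quoted in this post):

```python
def percent_gain(before_fps: float, after_fps: float) -> float:
    """Percentage frame-rate improvement from one result to another."""
    return (after_fps - before_fps) / before_fps * 100

# Fury, driver 16.2.1 -> 16.3 (wccftech figures quoted above)
print(round(percent_gain(27, 40), 1))  # 1440p: ~48.1% faster
print(round(percent_gain(10, 15), 1))  # 4K: 50.0% faster

# Gap that remains to the GTX 980 at 1440p even after the fix
print(round(percent_gain(40, 60), 1))  # 980 still ~50% ahead
```

So the driver buys Fiji roughly a 50% uplift in this title, while the 980 still leads by about the same margin at 1440p.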
 
Last edited:

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Gears of War does not use multi-threaded rendering. Look at the CPU charts. It does not scale past 2 cores with HT.

GoW also does not make use of multi-engine support. It's basically a DX9 game running on the DX12 API.

I spoke to Dan Baker about this, and he told me that barring a complete rewrite of UE3, the engine itself could not be representative of DX12.

You're clinging to false hope. I'm sorry, friend, but even though these forums allow you to speak freely, no developer and nobody at Beyond3D or any GPU/API guru is going to side with you.

Take care,

Mahigan.

So, Dan Baker has insight? Great, maybe he can explain this:
2 threads - 57 FPS:
[screenshot: gow_2_57fp2c18.png]

4 threads - 87 FPS (54% gain)
[screenshot: gow_4c_88fab7d.png]

8 threads - 92 FPS (5% gain, and less work on each thread)
[screenshot: gow_8t_92fe15e.png]

Looks like a perfectly multi-threaded rendering engine.

/edit: At 1080p the frame rate jumps from 57 FPS with 2 threads to 106 FPS with 4 threads. That is an ~86% increase. Hm.
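As a back-of-envelope check on those numbers, Amdahl's law can be inverted to estimate how much of the frame time actually scales with threads. Treating FPS as inversely proportional to frame time (an assumption, and using only the FPS figures quoted above), the 57 → 106 FPS jump at 1080p when doubling from 2 to 4 threads implies a parallel fraction of roughly 92%:

```python
def parallel_fraction(speedup: float, factor: float) -> float:
    """Invert Amdahl's law: given the observed speedup from multiplying
    the thread count by `factor`, return the parallel fraction p.

    speedup = 1 / ((1 - p) + p / factor)  =>  p = (1 - 1/speedup) / (1 - 1/factor)
    """
    return (1 - 1 / speedup) / (1 - 1 / factor)

# 1080p: 57 FPS on 2 threads -> 106 FPS on 4 threads (2x the threads)
p = parallel_fraction(106 / 57, 2)
print(round(p, 2))  # ~0.92
```

A parallel fraction around 0.92 would be hard to reconcile with a renderer that "does not scale past 2 cores".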
 
Last edited: