[TT]AMD continues to cut spending on R&D, down 40% in the last five years

"The core on our sample would only go 60MHz higher"

http://www.techspot.com/review/1024-and-radeon-r9-fury-x/page12.html

5% OC.

A custom 980Ti with a 15% OC is not thermally restricted; get your facts straight. It gets beaten soundly out of the box by a reference Fury X. 22% faster frame times is not a small difference.

"A perfect example of this is Gainward's GTX 980 Ti "Golden Sample", which features a 15% factory overclock"

"The R9 Fury X Crossfire cards were on average 22% faster when comparing the 99th percentile data."

They pushed it a further 15% above its factory OC to beat Fury X's 5% OC. -_-
 
"The core on our sample would only go 60MHz higher"

http://www.techspot.com/review/1024-and-radeon-r9-fury-x/page12.html

5% OC.

A custom 980Ti with a 15% OC is not thermally restricted; get your facts straight. It gets beaten soundly out of the box by a reference Fury X. 22% faster frame times is not a small difference.

"A perfect example of this is Gainward's GTX 980 Ti "Golden Sample", which features a 15% factory overclock"

They pushed it a further 15% above its factory OC to beat Fury X's 5% OC. -_-

Then I was wrong not to see that detail, but a 5% OC still leaves the 980 Ti with roughly 10% more OC headroom while the Fury X is shooting up volts, again only to theoretically match the 980 Ti OC in the comparison while sucking more power. And by thermally restricted, I'm talking about your asinine "but tri/quad GPUs" point.

That 5% on the Fury X is still its limit before requiring more volts and significantly more watts, and it still has an 11% performance deficit versus the 980 Ti OC to cover. As the reviewer noted, the 980 Tis still sucked down less power at that point. You're missing the point; that 15% factory OC + more is a testament to how much potential GM200 has that Nvidia and even AIB partners didn't tap into. It's not a negative in comparison to the Fury X; it's an accurate measure of its microarchitectural capabilities. Maxwell consistently clocks extremely high; Nvidia just didn't clock it out of the box anywhere near as aggressively as AMD did the Fury X. What's not to understand? Cut-down GM200 still wins while being more efficient. And that is Fiji's best-case scenario.
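(To make the arithmetic in this back-and-forth explicit, here's a quick sketch. The linear clock-to-performance scaling is my simplifying assumption; the percentages are the ones quoted in this thread.)

```python
# Rough OC arithmetic from the figures quoted in this thread.
# Simplifying assumption: performance scales linearly with core clock.

ti_oc = 1.15      # Gainward 980 Ti "Golden Sample": 15% factory OC
fury_oc = 1.05    # the 5% OC TechSpot reached on the Fury X

# Expected gap if both cards scaled perfectly with clock:
gap = ti_oc / fury_oc - 1
print(f"Expected clock-for-clock gap: ~{gap * 100:.1f}%")   # ~9.5%, near the ~11% measured

# The compounded "further 15% on top of the 15% factory OC" scenario:
max_ti = 1.15 * 1.15
print(f"Max OC 980 Ti over reference: +{(max_ti - 1) * 100:.0f}%")  # ~+32%
```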
 
Why do you think custom-cooled 980Tis are thermally restricted? PCPER tested on an OPEN bench, the best-case scenario for open-air cards. No throttling. It still lost.

It loses in all out-of-the-box testing. It draws even if a max OC is applied to both, given that a 5% OC Fury X is so close to a max OC 980Ti. A vcore OC Fury X CF will still match max OC 980Ti SLI. That's including HairWorks ON.

Sure, Maxwell is more power-efficient when vcore OCs are compared; you can have that point. But you cannot have "980Ti SLI is faster at 4K", because that's false.

If you have a problem with cut-down GM200 not winning, then ask NV to release a non-crippled GM200 with a water cooler.

Let's re-examine the numbers in 2016 (or whenever NV releases a custom full GM200), because I believe the Fury X has plenty of room to improve via optimizations, not even including DX12.

Edit: The reason I chimed in is that I am optimistic about AMD's future direction; it's not something I could say in the past. In fact, I posted many times over recent years joining the "doom & gloom" choir. I mean, another "AMD is dead" article or thread, just like "PC gaming is dead" or "dGPU is dead", etc.

Things are looking up for AMD (after the reference R9 290/X disaster vs. the 970/980): Nintendo NX, Zen, HBM2, 14nm FinFET finally catching up to Intel's node, etc. The OP insinuates the lower R&D budget means they aren't competitive, but the Fury X is extremely competitive in its intended usage scenario: gaming at 4K.
 

DustinBrowder

Member

Can people stop using the OC results as some sort of actual real-life thingy? Every card I've had and overclocked, even minimally (usually up to 50MHz), has broken down within a year.

Cards in the office machines that have never been overclocked but have basically run 24/7 have lasted me 4+ years.

A consistently high overclock does reduce the lifespan, not to mention that the small power-efficiency gains you otherwise had are completely lost. This just proves people have no clue what they are saying when they quote lower power usage yet overclock to the max and increase the wattage. Your lower power usage is completely gone.
 

railven

Diamond Member

Stock Fury X CF vs SLI OC 980Ti model.



Notice I did not bring OC into the equation; I'm comparing stock results across generations. Stock vs. stock, the 5870 was far behind the 480, as was the 6970, as was the R9 290X vs. the 980Ti.

Now we're getting a stock Fury X destroying a STOCK reference 980Ti and beating OC 980Ti models.

That is an improvement. Give credit where it's due.

Against a max OC 980Ti: a 15% OC on top of the factory-OC model, putting it at >1.5GHz.



11% faster than the gimped (5%) OC Fury X. Get TPU's OC numbers with vcore. A >1.2GHz Fury X with a vram OC will equalize that 11% delta.

[TPU chart: memory overclocking]


So no, max OC 980Ti SLI isn't faster than max OC Fury X CF. The TechSpot article used a gimped OC Fury X: 5%.

Also, relying on a max OC to beat something is NOT a good result. What if you don't have silicon-lottery luck? What if your 980Ti can't reach a stable daily-use >1.5GHz?

Out of the box, Fury X CF is supreme over even OC custom 980Ti SLI, with 22% smoother frame times to boot! That's major progress from AMD.

Now, wanna compare the ULTRA enthusiast? How about 3-4 GPUs? Fury X destroys Titan X/980Ti.

https://www.youtube.com/watch?v=d8hKhlbrhQ4

https://www.youtube.com/watch?v=G1EoFWrD3lE

When was the last time AMD had an ULTRA setup victory over NV? Never. This gen is the first time they've done it. Next gen, I fully expect them to continue this progress and clearly take the overall lead, single-GPU and multi-GPU, as we enter the DX12 era next year.

Odd that you ended your quote of the TechSpot conclusion where you did. The next few lines are rather telling:

However, of the games we tested, the Fury X combo was faster in only four of them and that includes a 1% advantage in Battlefield 4.

Where the Fury X Crossfire setup won big was in Thief where it was 50% faster and Total War: Attila where it was 36% faster. Removing Thief's result sees the Fury X cards losing to the GTX 980 Tis overall by 1%.

So the 980 Ti combo was faster in 6 out of 10 games, and one particular game changed it from a 1% NV win to a 4% AMD win.

I recall someone saying we should remove outlier results. Haha.
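(The swing is easy to verify from the numbers TechSpot gives, assuming they combined results as a simple arithmetic mean across the ten games; that averaging method is my assumption.)

```python
# Verifying the outlier swing from the quoted TechSpot figures.
# Assumption: the overall result is a simple arithmetic mean of ten games.

num_games = 10
avg_all = 4.0    # % overall Fury X CF lead across all ten games
thief = 50.0     # % Fury X CF lead in Thief alone

avg_without_thief = (num_games * avg_all - thief) / (num_games - 1)
print(f"Average without Thief: {avg_without_thief:+.1f}%")  # ~ -1.1%, i.e. a ~1% loss
```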
 
If TechSpot had included Project Cars, the 980Ti would win by even more... -_-

I just find it funny that people used to say it was unfair when reference vs. reference comparisons were made and CF Fury X pwned the SLI 980Ti hard.

PCPER also had custom 980Tis, and now at TechSpot, in a large sample of games including some GameWorks titles & running with HairWorks, Fury X CF manages to defeat a great custom-cooled OC 980Ti SLI out of the box, with much smoother frame times.

Not bad for a company with so few resources that is constantly labeled "dead".

Some of the folks here find it so difficult to give AMD any bit of credit. I bet if DX12 boosts performance heaps and AMD destroys NV in DX12 games, some of the folks will find negatives and run with that; the mantra "so what, DX12 is only a handful of games!" will be a common theme. Nothing good to say at all, even when AMD manages to win, something so abhorrent and believed to be impossibru!
 

Abwx

Lifer

You're missing the point; that 15% factory OC + more is a testament to how much potential GM200 has that Nvidia and even AIB partners didn't tap into.


What's not to understand? Cut-down GM200 still wins while being more efficient. And that is Fiji's best-case scenario.


This is total nonsense. A 15% overclock requires 32.5% more power to stay within the same stability margin, while perf/Watt is inherently degraded by 25%. Transistor physics is what it is, and it's certainly not Nvidia's cards that will negate those proven laws.
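(For context, the first-order model behind claims like this is CMOS dynamic power, P ≈ C·V²·f: power scales linearly with clock and quadratically with voltage, so what a 15% OC costs depends entirely on how much extra voltage it needs. The two voltage scenarios below are illustrative assumptions, not measurements.)

```python
# First-order CMOS dynamic power model: P ~ C * V^2 * f.
# What a +15% core clock costs under different voltage assumptions.

def rel_power(freq_mult: float, volt_mult: float) -> float:
    """Relative dynamic power for given frequency/voltage multipliers."""
    return freq_mult * volt_mult ** 2

oc = 1.15  # +15% core clock

for label, v in (("no extra voltage", 1.00),
                 ("voltage raised with clock", oc)):
    p = rel_power(oc, v)
    # perf/W assumes performance tracks clock, which is optimistic
    print(f"{label}: +{(p - 1) * 100:.0f}% power, perf/W x{oc / p:.2f}")
# no extra voltage: +15% power, perf/W x1.00
# voltage raised with clock: +52% power, perf/W x0.76
```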
 

LTC8K6

Lifer

The problem for AMD is that Fury / Fury X was supposed to be the Cat's Ass.

It just isn't. It's only in the right ballpark.

I think they should have released Nano first as an appetizer.

Nano is probably a more impressive card overall than Fury X.
 

Keysplayr

Elite Member

The problem for AMD is that Fury / Fury X was supposed to be the Cat's Ass.

It just isn't. It's only in the right ballpark.

I think they should have released Nano first as an appetizer.

Nano is probably a more impressive card overall than Fury X.

Ummm, I think that would be the "Cat's Meow". I don't think being the "ass" of anything is considered a good thing.

:D
 

Keysplayr

Elite Member

If TechSpot had included Project Cars, the 980Ti would win by even more... -_-

I just find it funny that people used to say it was unfair when reference vs. reference comparisons were made and CF Fury X pwned the SLI 980Ti hard.

PCPER also had custom 980Tis, and now at TechSpot, in a large sample of games including some GameWorks titles & running with HairWorks, Fury X CF manages to defeat a great custom-cooled OC 980Ti SLI out of the box, with much smoother frame times.

Not bad for a company with so few resources that is constantly labeled "dead".

Some of the folks here find it so difficult to give AMD any bit of credit. I bet if DX12 boosts performance heaps and AMD destroys NV in DX12 games, some of the folks will find negatives and run with that; the mantra "so what, DX12 is only a handful of games!" will be a common theme. Nothing good to say at all, even when AMD manages to win, something so abhorrent and believed to be impossibru!

In all fairness, you kind of give enough credit to AMD for all of us. You find a silver lining in every cloud, but over time this can become quite tedious to read. I want to see AMD do better also, but I am also realistic about it. Or try to be.
 

4K_shmoorK

Senior member

These Ti vs. X, CF vs. SLI, 4K vs. 1440p arguments get so circular. I swear, every thread in the graphics section comes back to the same people making the same arguments.

If you want AMD, buy AMD. But don't claim that the Fury X can match a custom 980 Ti in overclocking performance. That is a false claim. Not to mention, 980 Tis are widely available and the Fury X is not. Plain and simple.
 

Ma_Deuce

Member

I must be in the wrong place. I thought this thread was about AMD's reductions in R&D spending...

Has AMD also dropped any product lines in the last 5-6 years? Cutting costs on R&D isn't automatically a bad thing. There is always a lot of fat that can be trimmed, and it doesn't have to directly affect the quality of the departments that are making some cuts.

I think that AMD may have to get much smaller to survive or be able to profit: put all of their focus into a couple of things and do them really well before the well runs dry. Then try to build up again from there.
 

tential

Diamond Member

I must be in the wrong place. I thought this thread was about AMD's reductions in R&D spending...

Has AMD also dropped any product lines in the last 5-6 years? Cutting costs on R&D isn't automatically a bad thing. There is always a lot of fat that can be trimmed, and it doesn't have to directly affect the quality of the departments that are making some cuts.

I think that AMD may have to get much smaller to survive or be able to profit: put all of their focus into a couple of things and do them really well before the well runs dry. Then try to build up again from there.
Refreshing only your top products and letting everything else stagnate isn't a great sign of good R&D spending....
 
I must be in the wrong place. I thought this thread was about AMD's reductions in R&D spending...

Has AMD also dropped any product lines in the last 5-6 years? Cutting costs on R&D isn't automatically a bad thing. There is always a lot of fat that can be trimmed, and it doesn't have to directly affect the quality of the departments that are making some cuts.

I think that AMD may have to get much smaller to survive or be able to profit: put all of their focus into a couple of things and do them really well before the well runs dry. Then try to build up again from there.

AMD is spending less than NVIDIA on R&D while trying to compete with both NVIDIA in dGPUs and Intel in PC CPUs as well as server chips.

It's a bad thing for AMD, IMO.
 

tg2708

Senior member

I must be in the wrong place. I thought this thread was about AMD's reductions in R&D spending...

Has AMD also dropped any product lines in the last 5-6 years? Cutting costs on R&D isn't automatically a bad thing. There is always a lot of fat that can be trimmed, and it doesn't have to directly affect the quality of the departments that are making some cuts.

I think that AMD may have to get much smaller to survive or be able to profit: put all of their focus into a couple of things and do them really well before the well runs dry. Then try to build up again from there.

Well said, had a similar thought earlier today.
 

tential

Diamond Member

It's better as a company to take large short-term losses and invest in yourself than to worry about short-term profitability and fail long term. I see AMD's R&D cuts as a massive red flag.
 
It's better as a company to take large short-term losses and invest in yourself than to worry about short-term profitability and fail long term. I see AMD's R&D cuts as a massive red flag.

They needed to make these cuts; otherwise they would simply go under. AMD isn't cutting R&D to puff up their profits; they've done so in order to keep the proverbial lights on.

AMD seems to be in a "death spiral." Sales tank, so the company needs to cut expenses in order to stay afloat. Those cuts allow AMD to live to fight another day, but they wreck its ability to compete, leading to more revenue declines, budget cuts, etc., until the company can no longer continue.

In the meantime, though, I suspect that AMD's top brass will keep talking a big game, promising amazing new technology, while they take what they can from the sinking ship.

http://www.benzinga.com/analyst-rat...nalyst-on-amds-executive-bonuses-unbelievable
 

tential

Diamond Member

They needed to make these cuts; otherwise they would simply go under. AMD isn't cutting R&D to puff up their profits; they've done so in order to keep the proverbial lights on.

AMD seems to be in a "death spiral." Sales tank, so the company needs to cut expenses in order to stay afloat. Those cuts allow AMD to live to fight another day, but they wreck its ability to compete, leading to more revenue declines, budget cuts, etc., until the company can no longer continue.

In the meantime, though, I suspect that AMD's top brass will keep talking a big game, promising amazing new technology, while they take what they can from the sinking ship.

http://www.benzinga.com/analyst-rat...nalyst-on-amds-executive-bonuses-unbelievable
I'd need more time to dive deep into their financials, and I'm already doing that for my own company, but from what I've seen of AMD, it's one of the reasons I don't want to commit to FreeSync if I can get 60+ fps with an NVIDIA card next gen vs. just below 60 (a good FreeSync range) with an AMD card.

But GCN is a forward-looking design; I hope it pays off as we continue to move to higher resolutions.
 

Ma_Deuce

Member

It's better as a company to take large short-term losses and invest in yourself than to worry about short-term profitability and fail long term. I see AMD's R&D cuts as a massive red flag.

I would hope that they are not blindly throwing money at problems. If they cannot get a good return on their investment, then any short-term losses are pointless. It's not completely about how much $$$ you invest; more importantly, it's about making the money that you do invest pay off.
 

RampantAndroid

Diamond Member

A custom 980Ti with a 15% OC is not thermally restricted; get your facts straight.

The ONLY scenario you've presented where the Fury X takes any crown is when you're pitting it and its aggressively-clocked factory default against thermally-restricted reference GM200 chips.

...

Cherry-picking thermally-crippled and conservatively-clocked results for one product simply doesn't support your assertion that AMD are relatively ahead in uarch development, period.

Emphasis mine. He was clear that he was referring to the reference cards.
 

railven

Diamond Member

This is total nonsense. A 15% overclock requires 32.5% more power to stay within the same stability margin, while perf/Watt is inherently degraded by 25%. Transistor physics is what it is, and it's certainly not Nvidia's cards that will negate those proven laws.

This is the second time I've seen this user post this. Is this accurate info?

My own experience:
[screenshot: benchmark graphics scores before/after OC]


~21% gain on Graphics Score
~10% power increase at the PSU.

I'm using a standard Zotac reference model. The system was stable with no issues, and I use the same clocks for gaming: +300 core / +300 mem. Though I think I can go higher on the mem from what I've read others are getting.

EDIT:
Using the TechPowerUp article:
Average: Metro: Last Light at 1920x1080 because it is representative of a typical gaming power draw. The average of all readings (12 per second) while the benchmark was rendering (no title/loading screen) is used. In order to heat up the card, the benchmark is run once without measuring power consumption.

[TPU chart: average power consumption]


~15% power increase for:

[TPU chart: Metro: Last Light, 1920x1080]


~12% performance increase.

Woof. I must have gotten a damn good sample then! I guess I can't complain about my dinky ref Zotac :D
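(Running the same perf-per-watt arithmetic on the figures above, taken at face value; note the wall reading includes the whole system, so the GPU-only power delta is likely larger than 10%.)

```python
# Perf/W deltas from the figures quoted above, taken at face value.

def rel_perf_per_watt(perf_gain_pct: float, power_gain_pct: float) -> float:
    """Relative perf/W after an OC, given % performance and % power gains."""
    return (1 + perf_gain_pct / 100) / (1 + power_gain_pct / 100)

# The Zotac reference sample: ~21% graphics score for ~10% more power at the PSU.
print(f"Zotac sample: perf/W x{rel_perf_per_watt(21, 10):.2f}")  # ~1.10, efficiency up

# TechPowerUp's sample: ~12% performance for ~15% more power.
print(f"TPU sample:   perf/W x{rel_perf_per_watt(12, 15):.2f}")  # ~0.97, slightly down
```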
 

RampantAndroid

Diamond Member

This is the second time I've seen this user post this. Is this accurate info?

Use GPU-Z to look at your ASIC quality. It's a general guide to how much voltage is needed to OC. Not all samples are the same, and it of course depends on how high you're going. My MSI 980 is at 1503MHz core and 3800MHz memory (which is a pretty good OC), and I'm running with a MAX voltage of 1.281V, but in reality I don't see it go that high - no need to even cross past 1.3V. (I did up the min voltage at some clock steps since I was seeing some instability when closing Steam, for some reason.)

Also, it's ABWX. He's got an extreme brand preference.

Warning issued for callout.
-- stahlhart
 

yacoub

Golden Member

Since Lisa Su has abandoned the price/performance strategy, I would give yet another advantage to NV since at the same price and similar performance, most consumers will pick NV anyway.



All things being equal, I would choose an AMD product simply because their drivers have been rock solid for me since the HD6000 series and even before that. NVIDIA's drivers were always more prone to crashing during gaming. I haven't had a crash related to GPU drivers in so many years that I've forgotten how nice it is to not have that happen. I am very close to going with a 980 Ti this time around, but I may opt for a 390X, enjoy what I expect to be continued rock-solid stability, and just wait out a couple more years of tech before migrating to something entirely new in whatever the playing field looks like by then.

The problem for AMD is that Fury / Fury X was supposed to be the Cat's Ass.

It just isn't. It's only in the right ballpark.

I think they should have released Nano first as an appetizer.

Nano is probably a more impressive card overall than Fury X.


Agreed, and they could have fixed that by debuting the Fury and Fury X at a $50 lower MSRP than they did. That would have made a big wave of price/performance buzz. But since they are so desperate for revenue, they couldn't do the one thing that would have made up for the Fury and Fury X being pretty much "just okay": an aggressive price.
 

StrangerGuy

Diamond Member

They needed to make these cuts; otherwise they would simply go under. AMD isn't cutting R&D to puff up their profits; they've done so in order to keep the proverbial lights on.

AMD seems to be in a "death spiral." Sales tank, so the company needs to cut expenses in order to stay afloat. Those cuts allow AMD to live to fight another day, but they wreck its ability to compete, leading to more revenue declines, budget cuts, etc., until the company can no longer continue.

In the meantime, though, I suspect that AMD's top brass will keep talking a big game, promising amazing new technology, while they take what they can from the sinking ship.

http://www.benzinga.com/analyst-rat...nalyst-on-amds-executive-bonuses-unbelievable

They milked K8 to death, and now they are milking GCN to death.