AMD and ATI breaking up?


Azix · Golden Member · Joined Apr 18, 2014

What "new era in gaming"? You mean when their new flagship is card is finally available that will probably trade blows with the GTX980Ti that is already on the market?

I also don't know what "better architecture" you are talking about unless it is HBM. But that is only for a few halo products, and they limited themselves to 4GB. What they need is a better, more efficient architecture through the whole product stack, not just what I will generously call "refreshes" of rebrands, while nVidia has had a whole new, more efficient architecture on the market for over a year. And nVidia will have HBM as well at some point.

DX12

virtual reality

HBM

Maxwell is not more efficient. It simply has fewer features. I think AMD is in a better position for DX12, since that seems to be what they built GCN for from the start. Considering what we are seeing with GCN consoles, the idea of being closer to the hardware seems promising.

https://www.youtube.com/watch?v=a_uuQyUGYH4
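
For a concrete sense of what "closer to the hardware" means here: in D3D12 the application itself records command lists and synchronizes the CPU and GPU with fences, work the DX11 driver used to do behind the scenes. A minimal generic sketch (not taken from the linked video; error handling omitted):

```cpp
// Minimal D3D12 submission sketch: the app, not the driver, owns
// command recording and CPU/GPU synchronization. Error checks omitted.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // Explicit command queue, allocator, and list -- DX11 hid all of this.
    D3D12_COMMAND_QUEUE_DESC qDesc = {};
    qDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qDesc, IID_PPV_ARGS(&queue));

    ComPtr<ID3D12CommandAllocator> alloc;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                   IID_PPV_ARGS(&alloc));
    ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              alloc.Get(), nullptr, IID_PPV_ARGS(&list));

    // ... record draw/dispatch calls here ...
    list->Close();

    ID3D12CommandList* lists[] = { list.Get() };
    queue->ExecuteCommandLists(1, lists);

    // Explicit fence: the app decides when (and whether) to wait on the GPU.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE evt = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    queue->Signal(fence.Get(), 1);
    fence->SetEventOnCompletion(1, evt);
    WaitForSingleObject(evt, INFINITE);
    CloseHandle(evt);
    return 0;
}
```

That explicit control is the whole pitch: less driver overhead, more responsibility on the engine, which is roughly how console GCN programming already works.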
 
Joined Aug 11, 2008

DX12

virtual reality

HBM

Maxwell is not more efficient. It simply has fewer features. I think AMD is in a better position for DX12, since that seems to be what they built GCN for from the start. Considering what we are seeing with GCN consoles, the idea of being closer to the hardware seems promising.

https://www.youtube.com/watch?v=a_uuQyUGYH4

Point is, none of those things are exclusive to AMD. Right now HBM is, but some would argue AMD jumped the gun with HBM1 because it limits them to 4GB of VRAM on a high-end card (HBM1 stacks are 1GB each, and Fiji's interposer carries four of them). In any case nVidia will have it soon.
 

Udgnim · Diamond Member · Joined Apr 16, 2008

probably went like this

can we split the company so one side retains the GPU & CPU IP while the other side retains the server IP (SeaMicro) and all outstanding AMD debt?

no?

oh well, we'll announce we shut SeaMicro down in the Q1 2015 earnings report then
 

JDG1980 · Golden Member · Joined Jul 18, 2013

Maxwell is not more efficient. It simply has fewer features.
https://www.youtube.com/watch?v=a_uuQyUGYH4

The problem is that this is not true. Maxwell is actually better than Kepler in compute performance. It's true that they dropped Double Precision support from their top chip, but that was always a niche feature that wasn't fully enabled in consumer products anyway.

Here is Anandtech's review of GTX 980's compute performance. The conclusion: "Overall, while NVIDIA can’t win every compute benchmark here, the fact that they are winning so many and by so much – and otherwise not terribly losing the rest – shows that NVIDIA and GM204 have corrected the earlier compute deficiencies in GK104. As an x04 part GM204 may still be first and foremost consumer graphics, but if it’s faced with a compute workload most of the time it’s going to be able to power on through it just as well as it does with games and other graphical workloads."
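
The double-precision point is also easy to verify on any given card: OpenCL reports whether a device exposes usable FP64 at all. A minimal sketch (assumes an OpenCL SDK is installed and checks only the first GPU found; illustrative, not from the review):

```cpp
// Query whether the first GPU reports double-precision (FP64) support.
// Minimal sketch, no error handling; assumes an OpenCL SDK is installed.
#include <CL/cl.h>   // <OpenCL/opencl.h> on OS X
#include <stdio.h>

int main() {
    cl_platform_id platform;
    clGetPlatformIDs(1, &platform, NULL);

    cl_device_id device;
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    char name[256];
    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);

    // A zero config means no usable FP64 on this device.
    cl_device_fp_config fp64 = 0;
    clGetDeviceInfo(device, CL_DEVICE_DOUBLE_FP_CONFIG,
                    sizeof(fp64), &fp64, NULL);

    printf("%s: FP64 %s\n", name, fp64 ? "supported" : "not supported");
    return 0;
}
```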
 

JDG1980 · Golden Member · Joined Jul 18, 2013

Point is, none of those things are exclusive to AMD. Right now HBM is, but some would argue AMD jumped the gun with HBM1 because it limits them to 4GB of VRAM on a high-end card. In any case nVidia will have it soon.

I think AMD is betting that their experimentation with first-generation HBM will enable them to beat Nvidia to market with FinFET+HBM2 GPUs next year. That's really the only way I can see this gamble paying off; Fury isn't going to give enough revenue to justify the R&D costs by itself.

AMD desperately needs the FinFET process so they can finally get the aging rebrands off the market and introduce a new lineup. The best-case scenario for AMD would be beating Nvidia to market with FinFET+HBM2 by six months or so, which would give them a clear lead in both performance and perf/watt. While Nvidia would do a better job of holding their own, AMD could make some major inroads into winning back their lost market share and expanding their presence in the professional world.

On the other hand, there are a lot of things that could potentially go wrong. HBM2 development isn't under AMD's control, and neither is GloFo's work on implementing Samsung's 14nm FinFET fab processes. If HBM2 is delayed until 2H 2016, then it's likely AMD and Nvidia are both going to be waiting on it, and AMD's chances of getting to market first become slim. And if it turns out that TSMC's 16FF+ process is considerably better suited for GPUs than Samsung/GloFo's 14LPP, then AMD could be in big trouble.
 

bryanW1995 · Lifer · Joined May 22, 2007

From Ars Technica's "article" on it:



Source:
http://arstechnica.com/business/2015/06/amd-weighing-a-business-break-up-or-spin-off-reuters-says/

Consider this rumor started by panicked short sellers of AMD stock to be officially debunked.

http://www.eweek.com/c/a/Desktops-and-Notebooks/AMD-Denies-Fab-Sell-Off

Sound familiar?

AMD has to deny it, no matter whether they do it or not.

That's a very different situation, and a very different impetus for the move they made then.

But feel free to compare apples and oranges.

Remember, you have to take these sorts of statements in a very literal manner. What she said was "we have not hired an outside company to...".

That could be taken to mean they haven't had any discussions about it at all. But that's not what they said; what they said was only that they have not hired an outside company.

I knew a VP at Compaq when they were going through the merger with HP... it was brutal. He had to lay off 23,000 people! Can you comprehend what that pressure is like? AMD is worse off today than Compaq was back in the early 2000s... it doesn't take a computer (or rocket) scientist to figure out that they have at least explored the option of a breakup.

More likely, they're hoping that somebody with deep pockets will swoop in and buy them. The only problem is that no potential buyer (even Samsung) is going to want to go head to head with Intel in anything. Nvidia, sure, maybe, but Intel? Why would I pay $2b just to get my (deleted) kicked by the 800 lb gorilla in the tech industry? Realistically, the graphics division is the only part of the company with a future, the cpu side could literally fold up tomorrow and nobody but us old-timers would even miss it.

The real question then becomes, "How much is the graphics division worth?" How much do they need from the cpu side to continue selling APUs for the consoles? Is it even feasible to extricate the two companies anymore? Maybe that's the real issue: they discussed it and decided that you just can't split up the company.

I wouldn't be surprised to see AMD go bankrupt. Another potential option is for a private entity to purchase everything. Think about it... AMD headquartered in the UAE would be outside the long reach of the NSA. OK, maybe not outside of it, but a lot harder to get to, anyway. And lots of countries intensely distrust US technology firms in the wake of the Edward Snowden scandal. Perhaps China, or Russia, or the UAE, or especially Iran, would prefer to do business with somebody who at least appears less likely to have backdoors for the next Stuxnet built into every single chip.

I think AMD is betting that their experimentation with first-generation HBM will enable them to beat Nvidia to market with FinFET+HBM2 GPUs next year. That's really the only way I can see this gamble paying off; Fury isn't going to give enough revenue to justify the R&D costs by itself.

AMD desperately needs the FinFET process so they can finally get the aging rebrands off the market and introduce a new lineup. The best-case scenario for AMD would be beating Nvidia to market with FinFET+HBM2 by six months or so, which would give them a clear lead in both performance and perf/watt. While Nvidia would do a better job of holding their own, AMD could make some major inroads into winning back their lost market share and expanding their presence in the professional world.

On the other hand, there are a lot of things that could potentially go wrong. HBM2 development isn't under AMD's control, and neither is GloFo's work on implementing Samsung's 14nm FinFET fab processes. If HBM2 is delayed until 2H 2016, then it's likely AMD and Nvidia are both going to be waiting on it, and AMD's chances of getting to market first become slim. And if it turns out that TSMC's 16FF+ process is considerably better suited for GPUs than Samsung/GloFo's 14LPP, then AMD could be in big trouble.

You're focusing on the wrong part. AMD graphics is at least competitive these days, and NV is not such a huge behemoth that AMD can't compete head to head with them in consumer and professional graphics. It's not unrealistic, in fact, to think that AMD will hold something like 1/3 of the gpu market indefinitely. Heck, they might even end up with a significant advantage and more market share if GF proves more capable than TSMC of delivering the goods.

What IS unrealistic, however, is to expect AMD to go head to head in the CPU space with a company that is nearly 100x as valuable. They've been making some good inroads into other markets, but they need to do more, and quickly, if they hope to stay alive through what's coming over the next decade.
 

sm625 · Diamond Member · Joined May 6, 2011

Before drumming up the old villain of "short sellers", you have to keep in mind the possibility that the market might prefer a breakup of some sort. In this bubble market, pretty much any rumor causes a stock to rise. If they can bubble the share price up above $5, then they would get more institutional investor support. It could easily rise to $8, all on a baseless rumor. It happens all the time. Disclaimer: I am long AMD, but would not benefit from a move to $8 unless it happened in the next couple of weeks. If it did, I'd be retiring. That would be my first (and last) fifty-bagger.
 

Glo. · Diamond Member · Joined Apr 25, 2015

The problem is that this is not true. Maxwell is actually better than Kepler in compute performance. It's true that they dropped Double Precision support from their top chip, but that was always a niche feature that wasn't fully enabled in consumer products anyway.

Here is Anandtech's review of GTX 980's compute performance. The conclusion: "Overall, while NVIDIA can’t win every compute benchmark here, the fact that they are winning so many and by so much – and otherwise not terribly losing the rest – shows that NVIDIA and GM204 have corrected the earlier compute deficiencies in GK104. As an x04 part GM204 may still be first and foremost consumer graphics, but if it’s faced with a compute workload most of the time it’s going to be able to power on through it just as well as it does with games and other graphical workloads."
I suggest comparing, let's say, a GTX 970 and an R9 290 in Final Cut Pro on a Mac Pro 5,1.

The GTX gets demolished in compute performance in real-world tasks. Simple benchmarks are meaningless. Only one thing works in Nvidia's favor for compute: CUDA. But not on a Mac, anymore.

I know this from my personal experiments. In real-world compute tasks AMD is still better, especially on the Apple platform, no matter what LuxMark shows.
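
If you'd rather not trust any single benchmark, it's not hard to time your own workload. A rough host-side OpenCL sketch using profiling events (the saxpy kernel and the problem size here are placeholders; swap in whatever your real task looks like):

```cpp
// Time one OpenCL kernel on the first GPU using profiling events.
// Rough sketch: the kernel and sizes are placeholders for a real workload.
#include <CL/cl.h>   // <OpenCL/opencl.h> on OS X
#include <stdio.h>
#include <vector>

static const char* kSrc =
    "__kernel void saxpy(__global float* y, __global const float* x, float a) {"
    "  size_t i = get_global_id(0); y[i] = a * x[i] + y[i]; }";

int main() {
    cl_platform_id plat; clGetPlatformIDs(1, &plat, NULL);
    cl_device_id dev;    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    // Enable profiling so the queue records start/end timestamps per event.
    cl_command_queue q =
        clCreateCommandQueue(ctx, dev, CL_QUEUE_PROFILING_ENABLE, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "saxpy", NULL);

    const size_t n = 1 << 24;  // placeholder problem size (16M floats)
    std::vector<float> host(n, 1.0f);
    cl_mem x = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                              n * sizeof(float), host.data(), NULL);
    cl_mem y = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                              n * sizeof(float), host.data(), NULL);
    float a = 2.0f;
    clSetKernelArg(k, 0, sizeof(cl_mem), &y);
    clSetKernelArg(k, 1, sizeof(cl_mem), &x);
    clSetKernelArg(k, 2, sizeof(float), &a);

    cl_event ev;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, &ev);
    clWaitForEvents(1, &ev);

    cl_ulong t0, t1;  // device timestamps, in nanoseconds
    clGetEventProfilingInfo(ev, CL_PROFILING_COMMAND_START, sizeof(t0), &t0, NULL);
    clGetEventProfilingInfo(ev, CL_PROFILING_COMMAND_END, sizeof(t1), &t1, NULL);
    printf("kernel time: %.3f ms\n", (t1 - t0) / 1e6);
    return 0;
}
```

Run the same kernel on each card and you have a number for your workload, which is the whole point: a memory-bound video pipeline like Final Cut Pro can rank cards very differently than a synthetic renderer like LuxMark does.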