AMD acquisition rumors not so unfounded...


superstition

Platinum Member
Feb 2, 2008
2,219
221
101
I like being able to actually buy AMD processors, so no thanks.
Once upon a time Apple was popular with home buyers (the Apple II line) and was very innovation-minded (Lisa, Macintosh).

I suppose the iPhone, iPod, and iPad can all be called innovative as well, but none of those are on the level of the Lisa or even the Macintosh. Both products had flaws, but they were major steps forward in important ways.

Apple has a ton of cash and could really compete with Intel and Nvidia if it wanted to. The old Apple also had the R&D focus to develop more than just me-too products. But, the current focus on high margins and minimal R&D outlay means it won't happen.
 
Mar 10, 2006
11,715
2,012
126
Once upon a time Apple was popular with home buyers (the Apple II line) and was very innovation-minded (Lisa, Macintosh).

I suppose the iPhone, iPod, and iPad can all be called innovative as well, but none of those are on the level of the Lisa or even the Macintosh. Both products had flaws, but they were major steps forward in important ways.

Apple has a ton of cash and could really compete with Intel and Nvidia if it wanted to. The old Apple also had the R&D focus to develop more than just me-too products. But, the current focus on high margins and minimal R&D outlay means it won't happen.

Are you being serious?

[chart comparing Apple's and AMD's annual R&D spending]


Apple spends nearly four AMDs per year on R&D.
 
Mar 10, 2006
11,715
2,012
126
Yes, but Apple's sales are something like 20 AMDs. So proportionally, superstition's point still stands.

That just means Apple gets a really good return on its R&D budget. In absolute terms, $7.1B (and growing) is huge as far as tech R&D budgets go.

Only a handful of tech companies come to mind that can afford $7B+ annual R&D investments.
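For the arithmetic behind both halves of that point, here's a quick back-of-the-envelope check using only the rough ratios cited in this thread (a sketch, not audited financials):

```python
# Rough ratios cited in this thread (assumptions, not filings):
apple_rnd_in_amd_rnd = 4.0   # "Apple spends nearly four AMDs per year on R&D"
apple_rev_in_amd_rev = 20.0  # "Apple's sales are something like 20 AMDs"

# R&D intensity = R&D spend / revenue, with AMD's intensity as the baseline 1.0.
apple_intensity = apple_rnd_in_amd_rnd / apple_rev_in_amd_rev

print(f"Apple's R&D-to-revenue ratio is {apple_intensity:.0%} of AMD's")
# -> Apple's R&D-to-revenue ratio is 20% of AMD's
```

So both statements hold at once: AMD puts roughly five times more of each revenue dollar into R&D, even though Apple's absolute budget dwarfs AMD's.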
 

roob

Junior Member
Apr 26, 2013
18
0
0
Are you being serious?

[chart comparing Apple's and AMD's annual R&D spending]


Apple spends nearly four AMDs per year on R&D.

He's right though. Apple spends just 3% of revenue on R&D, compared to 10-15% for Microsoft, Oracle, or Google, and 20% for Intel. He's wrong, though, about this being a new development: Apple used to spend only 2% just a few years back.

Edit: Google, Microsoft, and Intel all spend more on R&D in absolute numbers, and there are a lot of tech companies at $5-6B. Apple's R&D spend is average at best, and dead last when taking revenue into account.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
He's right though. Apple spends just 3% of revenue on R&D, compared to 10-15% for Microsoft, Oracle, or Google, and 20% for Intel. He's wrong, though, about this being a new development: Apple used to spend only 2% just a few years back.

Edit: Google, Microsoft, and Intel all spend more on R&D in absolute numbers, and there are a lot of tech companies at $5-6B. Apple's R&D spend is average at best, and dead last when taking revenue into account.

While I haven't read the entire thread and I'm really not sure how we got onto Apple's R&D, Apple would be the first choice to buy AMD: the acquisition cost would be low, AMD's R&D spend is not crazy high, and Apple could use the technology across its huge portfolio. Apple loses very little in this buy.

My guess is that Apple will acquire AMD in the next year or so.
 
Mar 10, 2006
11,715
2,012
126
While I haven't read the entire thread and I'm really not sure how we got onto Apple's R&D, Apple would be the first choice to buy AMD: the acquisition cost would be low, AMD's R&D spend is not crazy high, and Apple could use the technology across its huge portfolio. Apple loses very little in this buy.

My guess is that Apple will acquire AMD in the next year or so.

I think Apple has already hired lots of AMD's talent and will have no problems continuing to poach engineers as it desires.
 

DrMrLordX

Lifer
Apr 27, 2000
23,204
13,289
136
If Apple bought AMD, you'd still be able to buy AMD processors. The single-core, cache-free Semprons would be $999, though. ;)

And I'll have to wait in line to get one. Sonofa . . .

But hey, I'd be able to run OS X, woop woop.

Once upon a time Apple was popular with home buyers (the Apple II line) and was very innovation-minded (Lisa, Macintosh).

I suppose the iPhone, iPod, and iPad can all be called innovative as well, but none of those are on the level of the Lisa or even the Macintosh. Both products had flaws, but they were major steps forward in important ways.

Apple has a ton of cash and could really compete with Intel and Nvidia if it wanted to. The old Apple also had the R&D focus to develop more than just me-too products. But, the current focus on high margins and minimal R&D outlay means it won't happen.

My family had an Apple //c growing up. I remember being vaguely jealous of the people who had the older //es, because you could do more modding on those.

Regardless, Apple's stuff has been pretty expensive compared to the alternatives since at least the Macintosh days. You can still find Amiga guys bragging about how much cheaper their systems were than Macs. Then they'll go on about how they could use Emplant cards to emulate Macs, and do it faster than real ones for less. Oh, those spooney Amiga users!

Maybe Apple wouldn't do anything too horribly evil with AMD, but I suspect the stream of cheap PC APUs, CPUs, and dGPUs from AMD would dry up and become Apple-exclusive stuff.
 

superstition

Platinum Member
Feb 2, 2008
2,219
221
101
He's right though. Apple spends just 3% of revenue on R&D, compared to 10-15% for Microsoft, Oracle, or Google, and 20% for Intel. He's wrong, though, about this being a new development: Apple used to spend only 2% just a few years back.
I wasn't talking about a few years back. How is the Lisa, for instance, a few years back?
 

scannall

Golden Member
Jan 1, 2012
1,960
1,678
136
How was the Lisa innovative?


I'm guessing you're very young.

Apple Lisa

Saw one at a trade show, shortly after graduating from college. It was like a lightning bolt had struck. It was different in every way from any other computer on the market.

And yes, a lot different and better than a Xerox Star too. Gawd, that was a very rough machine.
 
Mar 10, 2006
11,715
2,012
126
I'm guessing you're very young.

Apple Lisa

Saw one at a trade show, shortly after graduating from college. It was like a lightning bolt had struck. It was different in every way from any other computer on the market.

And yes, a lot different and better than a Xerox Star too. Gawd, that was a very rough machine.

Have you ever used an iPhone?

Seriously, the kind of performance and functionality an iPhone brings to your pocket is pure magic.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
AMD has plenty of useful tech that either Samsung or Apple could benefit from.

Samsung would have an amazing x86 and graphics portfolio that, coupled with their lithography advances, could directly compete with Intel in just about every market where x86 has a real presence.

AMD could be Apple's ticket to independence from Intel. While they initially wouldn't have Intel's lithography advantages, they could truly customize their x86 chips to suit their needs, much like their iPhone and iPad chips. My fear, however, is that there would be no more boxed AMD processors and graphics cards for sale unless AMD were left to operate independently.
 

jji7skyline

Member
Mar 2, 2015
194
0
0
tbgforums.com
AMD has plenty of useful tech that either Samsung or Apple could benefit from.

Samsung would have an amazing x86 and graphics portfolio that, coupled with their lithography advances, could directly compete with Intel in just about every market where x86 has a real presence.

AMD could be Apple's ticket to independence from Intel. While they initially wouldn't have Intel's lithography advantages, they could truly customize their x86 chips to suit their needs, much like their iPhone and iPad chips. My fear, however, is that there would be no more boxed AMD processors and graphics cards for sale unless AMD were left to operate independently.

Look at what they did with their Beats acquisition. I doubt Apple would outright kill AMD if they acquired them. Then again, I doubt they'd spend any money reviving the brand if it were already bankrupt.
 

jhu

Lifer
Oct 10, 1999
11,918
9
81
I'm guessing you're very young.

Apple Lisa

Saw one at a trade show, shortly after graduating from college. It was like a lightning bolt had struck. It was different in every way from any other computer on the market.

And yes, a lot different and better than a Xerox Star too. Gawd, that was a very rough machine.

And it was a commercial flop. Innovations don't mean much if no one finds them useful or can afford them.

Have you ever used an iPhone?

Seriously, the kind of performance and functionality an iPhone brings to your pocket is pure magic.

I would agree with this. I would argue that the iPhone was more innovative than the Lisa, and it definitely had more impact.
 

Centauri

Golden Member
Dec 10, 2002
1,631
56
91
And it was a commercial flop. Innovations don't mean much if no one finds them useful or can afford them.

Innovative products are rarely successful commercially because they're almost always ahead of their intended markets.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
He's right though. Apple spends just 3% of revenue on R&D, compared to 10-15% for Microsoft, Oracle, or Google, and 20% for Intel. He's wrong, though, about this being a new development: Apple used to spend only 2% just a few years back.

Apple just makes a few products. Intel and the others pour R&D money into things that are not their core strength. How much money do you think they spent on the Larrabee project? Itanium? Memory and I/O standards? Intel Play? MeeGo? The cancelled TV project?

Microsoft: Nokia stealth funding (and killing internal projects), user-experience projects, Xbox, music service.
 

scannall

Golden Member
Jan 1, 2012
1,960
1,678
136
And it was a commercial flop. Innovations don't mean much if no one finds them useful or can afford them.


Moving the goalposts? It was a commercial flop because, at $10,000, it was priced too far above the market. But computers in general were very expensive at the time. RAM in 1982, for example, was around $500 for 256 KB, so getting 2 MB into your Lisa cost around $4,000. A 20 MB hard drive in 1982 was in the $3,000 to $5,000 range as well.
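For what it's worth, the arithmetic on those prices checks out (the figures are remembered, not sourced data):

```python
# 1982 RAM pricing, as remembered above (an assumption, not sourced data).
price_per_256_kb = 500        # dollars per 256 KB of RAM
lisa_ram_kb = 2 * 1024        # a 2 MB Lisa configuration

ram_cost = lisa_ram_kb / 256 * price_per_256_kb
print(f"2 MB of RAM: ${ram_cost:,.0f}")  # -> 2 MB of RAM: $4,000
```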

But, regardless of the price or the commercial success or failure of the Lisa, it was just stunning to see that at a trade show in the era of teletypes, green screen monitors, punch cards and floppies.

And it did lay the groundwork for the Macintosh, which came a year later for a whole lot less money, though it was much less capable.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
Apple poaches AMD's engineer Jim Keller to create a technology called Macroscalar, with out-of-order execution of data.
Jim Keller goes back to AMD to work on AMD's Zen CPUs.
Apple hires people from IBM to work on silicon designs.
Apple talks with IBM on a business deal.
Apple talks with IBM about selling its factories and technology to Global Foundries.
Apple shuts down every patent dispute with Samsung, makes business deals, and tightens their cooperation on technology.
Global Foundries and Samsung sync their silicon production processes.
AMD goes back to GloFo for production of its chips.

Rumors start to go around about Samsung buying AMD.

Reasons? Apple and Jim Keller. Also, from a technological point of view, AMD has wind in its sails. And it has financial problems.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Why would Apple need x86? They control the software stack, so there's no reason not to begin a transition to ARM, unless AMD is significantly stronger in performance per watt than Apple's in-house chips. Even if it bought AMD, it'd probably be preferable for Apple to transition to ARM.
AMD's GPUs might be worthwhile though, especially if Apple's in-house GPU isn't all that great or can't scale up to PCs. AMD's cost looks almost like a steal just to get a lock-in on Radeon graphics and burn the rest of the company.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
Apple would need AMD GPUs for their computers and, eventually, AMD APUs if they perform well enough for Apple.

OpenCL is the most important factor, and if Zen is as good as it is hyped here and there, then it's possible that Apple will ditch Intel from their notebooks.

I don't believe they would want to buy AMD. Samsung is a completely different story. They are Apple's closest contractor.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
Apple poaches AMD's engineer Jeff Keller to create a technology called Macroscalar, with out-of-order execution of data.
Jeff Keller goes back to AMD to work on AMD's Zen CPUs.
Apple hires people from IBM to work on silicon designs.
Apple talks with IBM on a business deal.
Apple talks with IBM about selling its factories and technology to Global Foundries.
Apple shuts down every patent dispute with Samsung, makes business deals, and tightens their cooperation on technology.
Global Foundries and Samsung sync their silicon production processes.
AMD goes back to GloFo for production of its chips.

Rumors start to go around about Samsung buying AMD.

Reasons? Apple and Jeff Keller. Also, from a technological point of view, AMD has wind in its sails. And it has financial problems.

He's quite the mover and shaker is Jeff. :awe:
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
Jim, thanks for the correction ;).

I have no idea why I remembered him as Jeff...
 

scannall

Golden Member
Jan 1, 2012
1,960
1,678
136
Why would Apple need x86? They control the software stack, so there's no reason not to begin a transition to ARM, unless AMD is significantly stronger in performance per watt than Apple's in-house chips. Even if it bought AMD, it'd probably be preferable for Apple to transition to ARM.
AMD's GPUs might be worthwhile though, especially if Apple's in-house GPU isn't all that great or can't scale up to PCs. AMD's cost looks almost like a steal just to get a lock-in on Radeon graphics and burn the rest of the company.

While they have done migrations in the past, it was with reluctance and for obvious reasons. Moving from 68k to PowerPC was obvious, as Motorola had given up on the architecture. And the PowerPC was a great deal faster. Enough faster that running 68k code in emulation worked very well.

Going from PowerPC to x86 was also obvious. IBM was unwilling to work on performance, and performance per watt, for their consumer-grade CPUs. You can't put a 140-watt CPU into a laptop. When the G5 first came out, it was the fastest CPU you could get on the consumer side. But IBM tossed their roadmap out and fell behind by a lot. And the G5 had become very power hungry. When you have to have liquid cooling because air just isn't enough, you have a power consumption problem.

So, going to x86 made sense. Indeed, it was pretty much mandatory unless they wanted to give up the laptop market. Not going to happen, as it's a major profit center.

The Core architecture was fast enough that they could run the PowerPC code in emulation as well. That and fat binaries made the transition fairly painless.

To transition to ARM just doesn't fit what's going on. Their own ARM CPUs would need 40 to 50 percent higher IPC than Intel's to make emulation palatable. And that ain't gonna happen any time soon.
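To put numbers on that 40-to-50-percent figure, here's a toy back-of-the-envelope in Python (the emulation-efficiency values are assumptions on my part, not measurements):

```python
# Toy model: how much faster the host CPU must be, per clock, for emulated
# guest code to match guest-native speed. Efficiency = fraction of native
# throughput the binary translator retains (assumed values below).

def required_ipc_uplift(emulation_efficiency: float) -> float:
    """Fractional IPC advantage the host needs over the guest."""
    return 1.0 / emulation_efficiency - 1.0

for eff in (0.65, 0.70):
    print(f"{eff:.0%} efficient emulation -> "
          f"{required_ipc_uplift(eff):.0%} higher native IPC needed")
# 65% efficient emulation -> 54% higher native IPC needed
# 70% efficient emulation -> 43% higher native IPC needed
```

An emulator that keeps roughly two-thirds of native speed is about where that 40-to-50-percent IPC requirement comes from.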

Further, while Intel has botched a few delivery dates, they have worked with Apple and made them the parts they want. The glaring exception being the SoCs for the iPhone, iPad, etc.

There just isn't any compelling reason to transition to ARM, and lots of reasons not to.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,115
136
Going from PowerPC to x86 was also obvious. IBM was unwilling to work on performance, and performance per watt, for their consumer-grade CPUs. You can't put a 140-watt CPU into a laptop. When the G5 first came out, it was the fastest CPU you could get on the consumer side. But IBM tossed their roadmap out and fell behind by a lot. And the G5 had become very power hungry. When you have to have liquid cooling because air just isn't enough, you have a power consumption problem.

FWIW... IBM was willing to improve clocks and perf/watt. The problem was that IBM said they needed to ditch AltiVec and substitute a less complex SIMD unit to do so. Apple said no to eliminating AltiVec and switched to Intel, which didn't have AltiVec to begin with. IMHO, Apple was using IBM as leverage to get a good deal from Intel, because that's the direction they wanted to go in anyway.