Will Intel buy NVIDIA?


AyashiKaibutsu

Diamond Member
Jan 24, 2004
9,306
3
81
Haha, he's got a point though, at the moment Nvidia isn't looking too good. Or would you insist that Nvidia is better instead?

From a business perspective, there's no question Nvidia is better, although it's arguable Intel is even better, even just graphics-wise. Benchmarks don't determine how successful a business is.
 

Schmide

Diamond Member
Mar 7, 2002
5,596
733
126
If it did happen, it would have to be farther down the road. The hurdles of getting past what could be seen as stifling nVidia's business (i.e., chipsets) and then buying them would be too much.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
It makes me sad that people need to comment to point out how ridiculous this is.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,674
146
106
www.neftastic.com
Troll much?

Pot, meet kettle.

Despite what the article mentions, I don't believe there are any real regulatory hurdles to Intel buying nVidia. I think that if Intel really wanted to buy out nVidia, they could point to AMD/ATI as sufficient competition on the CPU and GPU fronts.

Tegra, IMHO, is likely to be a huge platform. Mobile devices are really the next big thing. Intel has Atom, and that is x86, which is a plus for anyone wanting to stay in the Windows OS family, but the number of devices based on ARM CPUs and not running Windows is growing. Platforms like the iPhone and even the Zune show that it's not necessary to have Windows compatibility in order to succeed. In fact, it is my opinion that mobile devices and platforms like Tegra pose the biggest threat to Intel's dominance.

While this partnership makes a lot of sense on a lot of fronts, I don't think it will happen. There are lots of egos involved on both the Intel and nVidia sides. There's also a huge chance nVidia may strike it big on the mobile front.

Cut out the irrelevant crud. As postmortem says:

I doubt Intel can do anything until all antitrust proceedings are done.

Intel will have absolutely no shot at any large scale industry acquisitions until well after all investigations are closed and settled. Just because they settled with AMD doesn't mean the government will turn a blind eye again.

As for Tegra? Yeah, it's a solid platform, but Nvidia is already going to take it away from Windows into ARM-land. Besides, most people in the mobile world want to do their best to stay away from Windows, and if you take a good look at Windows Mobile, pretty much zero devices out there run on anything other than ARM anyway, so that point is moot.

Simply put, it makes a lot less sense than you think.
 

akugami

Diamond Member
Feb 14, 2005
6,102
2,477
136
@SunnyD

Just trying to show that nVidia is healthy and that they don't really need Intel at this moment in time. While there are major challenges ahead that are going to potentially hurt nVidia on the GPU front which has been their bread and butter, there are developments in other niches that may help nVidia strike it rich. Even if Intel buys nVidia, it would be years down the line with a downtrodden nVidia that has hit hard times.

I just think that there might be fewer regulatory hurdles than some might presume. Intel can always point to AMD for CPU or GPU competition. On the mobile front it can point to ARM-based CPUs as well as Imagination (even if it holds shares in Imagination).
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Nvidia's market cap is roughly $8B-$9B; typical M&A deals take a 2x multiple of existing market cap, plus or minus some allowance for extenuating circumstances.

Except when a company is losing markets faster than anyone can write an article about it, and especially not when it's losing ground to the prospective buyer, khm. :awe:

In other words I doubt it would be more than its market cap - OTOH I highly doubt Intel would buy it.
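To put rough numbers on that multiple (a back-of-the-envelope sketch using the figures quoted above; the multiples are rules of thumb, not anything from an actual deal):

```python
# Back-of-the-envelope deal pricing: purely illustrative figures.
def implied_price_range(cap_low, cap_high, multiple=1.0):
    """Implied acquisition price range for a given multiple of market cap."""
    return cap_low * multiple, cap_high * multiple

# Nvidia's market cap as cited above: roughly $8B-$9B.
for m in (1.0, 2.0):  # 1x = "no more than market cap", 2x = the quoted M&A rule of thumb
    low, high = implied_price_range(8e9, 9e9, multiple=m)
    print(f"{m:.0f}x multiple: ${low/1e9:.0f}B - ${high/1e9:.0f}B")
```

At 2x that works out to $16B-$18B, which is exactly why the "losing ground to the prospective buyer" point matters for the price.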

NV has been shaking its @ss to attract buyers for a long time now - Intel has never been interested, and the rumored deal with XXXX (not AMD, I cannot recall what I read a long time ago :D) fell through on the egos of Huang and his buddies (so-called 'executive managers', mostly leftover people from the startup days, stuck in their chairs with little clue about current things, spending their time on pet projects) - or at least that's what the rumor said... :)
 

sandorski

No Lifer
Oct 10, 1999
70,382
5,942
126
Mark(Quote) my Words: Intel will never make a High End Vid Card on their own. If they want into that Market, Nvidia is their only option to do so.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Mark(Quote) my Words: Intel will never make a High End Vid Card on their own. If they want into that Market, Nvidia is their only option to do so.

What a silly claim - Intel is more than capable of doing it if they commit enough resources (= $$$$ to siphon off experienced top engineers and then give them everything they ask for).
 

sandorski

No Lifer
Oct 10, 1999
70,382
5,942
126
What a silly claim - Intel is more than capable of doing it if they commit enough resources (= $$$$ to siphon off experienced top engineers and then give them everything they ask for).

Not gonna happen. Nvidia/ATI have spent years developing Expertise into the Market. Intel could spend $billions in an attempt to catch up, but by the time they do catch up, Nvidia/ATI will just step ahead of Intel again. At some point it makes no Financial sense and I suspect Intel is pretty close to surpassing that point already, thus the dramatic rethinking on Larrabee.

Buying Nvidia gets them immediately into the game. Product, Engineers, Expertise. Continued R&D just $Wastes. Intel is just not as brilliant as most assume, outside of their core business anyway.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
:D

I would like to have seen Intel competing on the GPU front as well but I guess it may never happen now.

Who knows how much they really even invested in Larrabee. It could have just been one of their hundreds of pet projects that they decided to publicize. Which they did get a ton of publicity out of.

I still say if Intel really was serious about this, they would just cut NVIDIA a check and license what they need. Or even throw them an x86 license in trade. No need to buy the whole company. This is the mistake AMD made and is being punished for. AMD has lost a lot of money on ATI. Not just the initial investment, but ATI has been operating at a net loss since the purchase. Fusion is still nowhere to be found, and AMD still has not even talked about using their own foundries (which they were forced to sell to stay afloat) to make ATI chips.

Here's to hoping AMD recovers, Intel improves their IGPs (at least), Fermi gets launched, and everyone has a good Christmas. ():)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Mark(Quote) my Words: Intel will never make a High End Vid Card on their own. If they want into that Market, Nvidia is their only option to do so.

The same sentiment was echoed of Japan's auto and steel industries in the 50's and 60's.

There is a fallacy in implied continuance of market dominance by way of legacy alone.

So long as the barrier to entry is fiscally surmountable (and Intel is in that position) with resources at hand and there is a will to enter that marketspace then it can and will probably happen within a meaningful time period.

The challenge for Intel will continue to be the story of gross margins. Neither Nvidia nor AMD extract >40% GM from the discrete GPU market today even with the substantial performance delivered by their products.

The challenge for Intel's decision makers is to develop a business strategy that they can have confidence in (pie-in-the-sky wishful thinking doesn't cut it) which spells out a technologically feasible approach to hit or best AMD's and NVidia's performance while simultaneously commanding the sort of ASP's and cost structure necessary to sustainably generate 50% gross margins year after year.
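As a side note on the margin math: a minimal sketch with made-up cost and price figures (nothing disclosed by Intel, Nvidia, or AMD) showing why a 50% gross-margin target effectively means the ASP has to be at least double the unit cost:

```python
# Gross-margin arithmetic with hypothetical numbers (illustrative only).
def gross_margin(asp, unit_cost):
    """Gross margin as a fraction of revenue: (ASP - unit cost) / ASP."""
    return (asp - unit_cost) / asp

def required_asp(unit_cost, target_gm):
    """Minimum ASP needed to hit a target gross margin: cost / (1 - GM)."""
    return unit_cost / (1.0 - target_gm)

# Hypothetical discrete GPU: $120 to build, sold at a $200 ASP.
print(f"GM at a $200 ASP: {gross_margin(200, 120):.0%}")          # 40%
print(f"ASP needed for 50% GM: ${required_asp(120, 0.50):.0f}")   # $240
```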

They saw a way to do this with their SSD product lineup, it involved/required forming a joint-venture with Micron to lower the cost of flash production combined with leveraging their in-house expertise on microcontroller designs to hit the performance numbers needed to command the ASPs required to generate gross margins.

What path do they see for discrete GPU? They either pursue a strategy of out-racing the competition's performance based on node cadence (the current CPU strategy: shoot for higher-ASP-enabling SKUs), or they fall back to the "recycle fully depreciated fabs and equipment" strategy that lowers cost as a means of getting those gross margins (the current strategy for their IGP).

I'm thinking they are going with the "outpace them with our unrivaled node timeline and process tech performance" strategy. Cancel 45nm Larrabee and shoot for releasing 32nm Larrabee in a year or two. If that doesn't cut the mustard, then fall back to 22nm Larrabee.

They know the foundries aren't going to be able to keep up with them in terms of node cadence and performance, so as long as they are willing to bide their time, eventually the wind at their back will put them ahead of the competition. It just might take another 5 yrs and the cost of EUV added to the mix before it happens.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
huh, not sure i follow you. Please elaborate on this. What does nvidia have that can trump an hd5970?

He's talking out of his @ss again - Nvidia isn't first in anything. Fastest single card: ATI, fastest dual card: ATI, biggest market share: Intel.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Not gonna happen. Nvidia/ATI have spent years developing Expertise into the Market.

Ehhh? Just WTF are you talking about, seriously? Especially when Writing like This Randomly?

"Expertise into the Market" - really sounds like PR BS to me, sorry.
FYI Intel has been the #1 VGA vendor for ages; what "Expertise into the Market" do they need to promote their future products?

You're not making any sense...

Intel could spend $billions in an attempt to catch up, but by the time they do catch up, Nvidia/ATI will just step ahead of Intel again. At some point it makes no Financial sense and I suspect Intel is pretty close to surpassing that point already, thus the dramatic rethinking on Larrabee.

Pure speculation without any meaningful fact or number.
Again: what on Earth are you talking about? Intel is a chip designer, and a rather good one, with more resources than ATI+NV combined and well ahead of both when it comes to manufacturing processes - not to mention it owns its own fabs.

Is Intel a good GPU designer? Of course not, and I did predict this failure for LRB - but this cancellation does not mean anything; Intel is so freakin' rich they could run 2-3 LRB projects in the background.
Yes, they could afford it.

Buying Nvidia gets them immediately into the game. Product, Engineers, Expertise. Continued R&D just $Wastes. Intel is just not as brilliant as most assume, outside of their core business anyway.

What is this single "core business anyway" in your mind?
FYI Intel offers server, desktop and mobile CPUs (and they only sold their entire XScale/ARM division a couple of years ago, to Marvell IIRC), wireless chips, chipsets, VGA controllers (yes, they do, surprising, huh?), mobile integrated systems, a shitload of software tools, and they own Havok, the best physics middleware package on Earth (yes, it's far ahead of the free PhysX), etc etc etc.
BTW, did you know that Intel used to be the 3rd or 4th biggest PC maker about 15 years ago? They made and shipped systems unbranded to MINOs, who put their badges on them and sold them as their own.

Anyway, my point is that even though there's some truth to your point, you obviously cannot back it with any meaningful argument - here's one for you: if Huang gives up his control (= his ego), then, and only then, can such a merger work.

Heck, I'm defending Intel (never liked them), what a crazy day. :D
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
The same sentiment was echoed of Japan's auto and steel industries in the 50's and 60's.

There is a fallacy in implied continuance of market dominance by way of legacy alone.

So long as the barrier to entry is fiscally surmountable (and Intel is in that position) with resources at hand and there is a will to enter that marketspace then it can and will probably happen within a meaningful time period.

The challenge for Intel will continue to be the story of gross margins. Neither Nvidia nor AMD extract >40% GM from the discrete GPU market today even with the substantial performance delivered by their products.

The challenge for Intel's decision makers is to develop a business strategy that they can have confidence in (pie-in-the-sky wishful thinking doesn't cut it) which spells out a technologically feasible approach to hit or best AMD's and NVidia's performance while simultaneously commanding the sort of ASP's and cost structure necessary to sustainably generate 50% gross margins year after year.

They saw a way to do this with their SSD product lineup, it involved/required forming a joint-venture with Micron to lower the cost of flash production combined with leveraging their in-house expertise on microcontroller designs to hit the performance numbers needed to command the ASPs required to generate gross margins.

What path do they see for discrete GPU? They either pursue a strategy of out-racing the competition's performance based on node cadence (the current CPU strategy: shoot for higher-ASP-enabling SKUs), or they fall back to the "recycle fully depreciated fabs and equipment" strategy that lowers cost as a means of getting those gross margins (the current strategy for their IGP).

I'm thinking they are going with the "outpace them with our unrivaled node timeline and process tech performance" strategy. Cancel 45nm Larrabee and shoot for releasing 32nm Larrabee in a year or two. If that doesn't cut the mustard, then fall back to 22nm Larrabee.

They know the foundries aren't going to be able to keep up with them in terms of node cadence and performance, so as long as they are willing to bide their time, eventually the wind at their back will put them ahead of the competition. It just might take another 5 yrs and the cost of EUV added to the mix before it happens.

Exactly my point. It's all about long-term profitability for Intel, and I'm quite sure at some point they will re-enter - the question is whether it will be discrete desktop graphics or a Fusion-type CPU+GPU solution.
My money is on the latter...
 

sandorski

No Lifer
Oct 10, 1999
70,382
5,942
126
Ehhh? Just WTF are you talking about, seriously? Especially when Writing like This Randomly?

"Expertise into the Market" - really sounds like PR BS to me, sorry.
FYI Intel has been the #1 VGA vendor for ages; what "Expertise into the Market" do they need to promote their future products?

You're not making any sense...



Pure speculation without any meaningful fact or number.
Again: what on Earth are you talking about? Intel is a chip designer, and a rather good one, with more resources than ATI+NV combined and well ahead of both when it comes to manufacturing processes - not to mention it owns its own fabs.

Is Intel a good GPU designer? Of course not, and I did predict this failure for LRB - but this cancellation does not mean anything; Intel is so freakin' rich they could run 2-3 LRB projects in the background.
Yes, they could afford it.



What is this single "core business anyway" in your mind?
FYI Intel offers server, desktop and mobile CPUs (and they only sold their entire XScale/ARM division a couple of years ago, to Marvell IIRC), wireless chips, chipsets, VGA controllers (yes, they do, surprising, huh?), mobile integrated systems, a shitload of software tools, and they own Havok, the best physics middleware package on Earth (yes, it's far ahead of the free PhysX), etc etc etc.
BTW, did you know that Intel used to be the 3rd or 4th biggest PC maker about 15 years ago? They made and shipped systems unbranded to MINOs, who put their badges on them and sold them as their own.

Anyway, my point is that even though there's some truth to your point, you obviously cannot back it with any meaningful argument - here's one for you: if Huang gives up his control (= his ego), then, and only then, can such a merger work.

Heck, I'm defending Intel (never liked them), what a crazy day. :D

You wait. 5 years from now, Intel will not be a Player. Unless they Buy Nvidia.