Xbit: Nvidia needs access to x86 for long-term survival


v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
However, NV is primarily an intellectual property/R&D company. They could partner with VIA to design a workable x86 CPU. I'm not saying they would, or that it would make sense, but it might be one way to combine VIA's x86 license with NV's R&D budget.

That said, I'm with the camp that thinks x86 compatibility is becoming less and less relevant. Microsoft is pushing managed code, cloud computing is all the rage, and Java is still with us, in both its EE and ME flavors. CPUs optimized for running a VM of one sort or another are the future -- cell phones have it right. Support a Java (or similar) VM and a standard API, and who cares what hardware is behind the scenes (so long as it's fast).
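
(A toy sketch of that hardware-agnostic point, assuming nothing beyond a stock JVM -- the class name is made up for illustration. The same compiled .class file runs unmodified on x86 or ARM, and can ask at runtime what it landed on:)

public class WhereAmI {
    public static void main(String[] args) {
        // The VM, not the binary, absorbs the CPU difference: identical
        // bytecode reports "x86", "amd64", "arm", ... depending on the host.
        System.out.println("arch: " + System.getProperty("os.arch"));
        System.out.println("os:   " + System.getProperty("os.name"));
    }
}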

Whether or not they go down the x86 route, it's obvious the future isn't as bright for NV as it was even a few years ago. In the tech market, if you don't grow you become irrelevant. NV desperately needs to open up some new markets, and fast.

The same goes for AMD, though. If a 2012 cell phone with a Bluetooth keyboard and (wireless?) HDMI out can replace a typical home PC, then only content creators will need a full-function workstation. We're not far from that vision now -- several smartphones support Bluetooth keyboards, a few have HDMI out, and at least one runs a desktop OS variant with a real browser. At the pace of mobile device innovation, it's only a matter of time until all three features land on a single $400-600 device.
 

Meghan54

Lifer
Oct 18, 2009
11,684
5,225
136
The same goes for AMD, though. If a 2012 cell phone with a Bluetooth keyboard and (wireless?) HDMI out can replace a typical home PC, then only content creators will need a full-function workstation. We're not far from that vision now -- several smartphones support Bluetooth keyboards, a few have HDMI out, and at least one runs a desktop OS variant with a real browser. At the pace of mobile device innovation, it's only a matter of time until all three features land on a single $400-600 device.


Now for the obligatory... can those smartphones play Crysis? ;)
 

Sokar

Banned
Aug 5, 2010
13
0
0
x86 will not make sense for NV no matter what happens; creating a new product for a market that doesn't exist yet is a million times easier for them. Even if they buy VIA, they start with zero engineers experienced in designing high-performance x86 chips, because Cyrix abandoned that about ten years ago. Then they need to spend a ton of money on research just to figure out what direction to take with a product that won't launch for at least 3-4 years. Then they need to hire the best CPU engineers they can find. Even with tons of money they could probably assemble a competitive development team, but it would take six months before they could even start design, and probably a year or more before a real, full team is in place. That puts them 9-12 months behind even if they start today.

Then they need to catch up not only to where AMD and Intel are today, but to where they will be in 3-5 years when the product is ready to launch. Then you have to factor in that they have zero x86 server market share. I read somewhere last week that over 40% of AMD's stock value comes from the enterprise server sector, and that is not even one of their best markets (they are doing better in consumer desktops and discrete GPUs, imo). And you can't expect to break into the enterprise market with your first product either.

Then you have risk analysis. If AMD invests a billion dollars in a new architecture, they will run some risk analysis on the project: say the worst case is $1 billion out and $200 million back in, and the best case is $10 billion in (they know they can always get some revenue even if the design flops). NV's risk analysis would look more like: worst case, nothing back; best case, no idea at all. Combine that with the need to invest $3-5 billion instead of one, and it's completely unrealistic. Shareholders would sack the CEO and liquidate the company before letting him take on a hugely speculative project like this.
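
(For illustration only, here is the shape of that comparison in code. The dollar figures come from the post; the 50/50 probability split and the class name are made-up placeholders, since the poster's point is precisely that NV has no credible numbers to plug in.)

public class RiskSketch {
    // Two-outcome expected value: worst case with probability pWorst,
    // best case otherwise, minus the up-front cost (all figures in $B).
    static double ev(double cost, double pWorst, double worst, double best) {
        return pWorst * worst + (1 - pWorst) * best - cost;
    }

    public static void main(String[] args) {
        // AMD: $1B in, a $200M floor, a $10B ceiling (figures from the post).
        System.out.printf("AMD expected value: $%.1fB%n", ev(1, 0.5, 0.2, 10));
        // NV: $3-5B in (call it $4B), a floor of zero, and no credible
        // ceiling estimate -- the $10B ceiling here is pure charity.
        System.out.printf("NV expected value:  $%.1fB%n", ev(4, 0.5, 0.0, 10));
    }
}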
 

PandaBear

Golden Member
Aug 23, 2000
1,375
1
81
In some ways NV is screwed, because even if they can design a capable CPU, they won't have the massive resources to manufacture it like AMD and Intel do. Look at VIA: they haven't kept their x86 CPU up to date in years, and even if they had, they can't build it at TSMC, UMC, or Chartered. AMD has spent a lot of money on fabs, and although they are not as efficient or as high quality as Intel in production, they are better than anyone else other than the memory companies (Samsung, Toshiba, Micron, Hynix, Elpida). Even if Intel waived the licensing and AMD still let NV make chipsets, NV won't have the upper hand once the chipset is integrated into the CPU in a few years.

A friend of mine works for NV, and he basically agrees with what I said: right now they can only put all their eggs in GPGPU and integration with ARM. If the momentum is strong enough for, say, Apple or Android platforms needing extensive graphics power, NV might get lucky selling a platform with a 2GHz ARM processor and a powerful GPU.

Why ARM? Why not SPARC or MIPS? Because ARM has huge support in development tools and platforms (ten years ago an ARM-based embedded device was shipping something like every 7 seconds, and it's probably far more often now), it is power efficient, small, and licensed to every fab, and most importantly it will stick around the way x86 has.

If (and only if) Apple comes up with a computer that uses ARM and a GPU, or Android becomes the next Windows and supports Tegra, or Microsoft decides to support ARM as its next platform, then NV will survive -- and face a lot of competition from everyone under the sun. If not, well, they'll gradually shrink and go the way of 3dfx, S3, Matrox, etc.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Maybe long-term success and the computer industry don't coexist. Except for Microsoft and Intel, of course, but even those dogs will likely have their day.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Consider that even as long ago as the early 1970s computers weren't "new" -- the industry had already existed for nearly 30 years, yet x86 didn't exist at that time. Now consider how many of the big-name titans of 1970 don't even exist today (except perhaps in name only, e.g. Cray).

It doesn't take a lifetime in this industry, or any other when it comes down to it, for the entire face of the industry to change.

If history tells us anything, it's that we are practically guaranteed Intel and Microsoft will not dominate their respective industries 30 years from now. Their fate seems predestined; only the reason for their future decline from power remains to be determined.
 

Sokar

Banned
Aug 5, 2010
13
0
0
Consider that even as long ago as the early 1970s computers weren't "new" -- the industry had already existed for nearly 30 years, yet x86 didn't exist at that time. Now consider how many of the big-name titans of 1970 don't even exist today (except perhaps in name only, e.g. Cray).

It doesn't take a lifetime in this industry, or any other when it comes down to it, for the entire face of the industry to change.

If history tells us anything, it's that we are practically guaranteed Intel and Microsoft will not dominate their respective industries 30 years from now. Their fate seems predestined; only the reason for their future decline from power remains to be determined.

Well, the industry changed because of a paradigm shift. What really happened was that the industry moved from mainframes and mostly corporate clients to mainstream users on individual machines (along with the graphical interface and the mouse). Unless a major shift happens and Intel/MS don't see it in time or can't react fast enough, it's going to be hard to unseat them. In this information age it's going to be harder and harder to sneak in under the radar and take over or change the industry the way Apple and MS did in the '80s.

The only thing I think has the potential to do this is quantum computing (it could make everything we have obsolete tomorrow morning and effectively reset the playing field).
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
However, NV is primarily an intellectual property/R&D company. They could partner with VIA to design a workable x86 CPU. I'm not saying they would, or that it would make sense, but it might be one way to combine VIA's x86 license with NV's R&D budget.

That said, I'm with the camp that thinks x86 compatibility is becoming less and less relevant. Microsoft is pushing managed code, cloud computing is all the rage, and Java is still with us, in both its EE and ME flavors. CPUs optimized for running a VM of one sort or another are the future -- cell phones have it right. Support a Java (or similar) VM and a standard API, and who cares what hardware is behind the scenes (so long as it's fast).

Whether or not they go down the x86 route, it's obvious the future isn't as bright for NV as it was even a few years ago. In the tech market, if you don't grow you become irrelevant. NV desperately needs to open up some new markets, and fast.

The same goes for AMD, though. If a 2012 cell phone with a Bluetooth keyboard and (wireless?) HDMI out can replace a typical home PC, then only content creators will need a full-function workstation. We're not far from that vision now -- several smartphones support Bluetooth keyboards, a few have HDMI out, and at least one runs a desktop OS variant with a real browser. At the pace of mobile device innovation, it's only a matter of time until all three features land on a single $400-600 device.

You guys are daydreaming, seriously - the chance that "x86/64 is becoming less and less relevant" is literally zero. If anything it's becoming more and more relevant, as Intel keeps pushing it lower and lower (it's well known they are aiming at smartphones with the next iteration of their Atom derivative, the Moorestown successor).
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
You guys are daydreaming, seriously - the chance that "x86/64 is becoming less and less relevant" is literally zero.

Really?

The research firm Gartner Inc. has said that PC shipments will hit 257 million units this year, which will be a drop of 11.9 percent from 2008 sales figures.

http://www.product-reviews.net/2009/03/03/2009-sharpest-decline-in-world-computer-sales-ever/

IDC said in its report late Thursday that manufacturers shipped 317.5 million phones in the quarter.

http://gadgetophilia.com/global-cel...with-smaller-makers-growing-fastest-idc-says/

Even if Intel were responsible for 100% of the entire PC market, which we'll assume for the sake of argument, phones are still selling more in a quarter (317.5 million) than PCs sell in a whole year (257 million), by a healthy margin. x86 has already taken a huge hit to its relevance, and the pace is accelerating.
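
(Annualizing the quoted figures makes the gap starker; a trivial sanity check, using only the two numbers cited above:)

public class Shipments {
    public static void main(String[] args) {
        double phonesPerQuarter = 317.5e6;  // IDC, one quarter
        double pcsPerYear       = 257e6;    // Gartner, full-year forecast
        // One quarter of phones already beats a full year of PCs...
        System.out.printf("quarter vs year: %.2fx%n",
                phonesPerQuarter / pcsPerYear);      // ~1.24x
        // ...and a rough annualization puts phones near 5x PC volume.
        System.out.printf("annualized:      %.1fx%n",
                phonesPerQuarter * 4 / pcsPerYear);  // ~4.9x
    }
}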

it's well known they are aiming at smartphones with the next iteration of their Atom derivative, the Moorestown successor

Intel has to prove they can be competitive in the ultraportable market, which they certainly haven't come close to doing yet. Giving the competition a several-billion-unit head start has entirely eliminated x86's biggest advantage; in fact, it has turned it into a handicap. They have zero mindshare in the ultra-mobile space, and even if they did come out with a great part (which hasn't happened yet), the lead time would put them into late 2011/early 2012 before any parts actually shipped to consumers (that's how the ultraportable market works). By then they will have lost another 1.5-2 billion potential customers.

Dell is now pushing into the tablet market using Snapdragon (former AMD tech alongside ARM) at the low end and Tegra 2 (nV/ARM) at the high end; Apple is using ARM for its iPads as well. Intel has been MIA in the ultraportable market, and it is exploding in popularity without them. x86 has lost a ton of relevance, and the pace is accelerating far faster than x86's popularity ever grew.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
The irony of it all is that Intel *had* an ARM department (acquired from Digital in a lawsuit settlement). Intel's StrongARM/XScale parts were quite popular options in the early days of PocketPCs and whatnot.
But Intel sold its ARM division to Marvell about four years ago.
http://en.wikipedia.org/wiki/XScale
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
The irony of it all is that Intel *had* an ARM department (acquired from Digital in a lawsuit settlement). Intel's StrongARM/XScale parts were quite popular options in the early days of PocketPCs and whatnot.
But Intel sold its ARM division to Marvell about four years ago.
http://en.wikipedia.org/wiki/XScale

That is funny, I was actually just thinking about this a few minutes ago. I remember Intel selling off its ARM division. Both AMD and Intel made horrible business decisions regarding their mobile divisions in the last few years.
 

Sickamore

Senior member
Aug 10, 2010
368
0
0
I don't know much about the chipset wars. From my understanding, since Nvidia doesn't have access to the processor side, it's keeping them from making high-performance cards. Correct me if I'm wrong: if they tried, would they infringe on patents? Someone explain it to me, please.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
You guys are daydreaming, seriously - the chance that "x86/64 is becoming less and less relevant" is literally zero. If anything it's becoming more and more relevant, as Intel keeps pushing it lower and lower (it's well known they are aiming at smartphones with the next iteration of their Atom derivative, the Moorestown successor).

Getting the x86 code museum as small or as power efficient as a simpler, cleaner, legacy-free architecture is the daydream. To use a poor auto analogy: it takes far more resources to get a car to run an 11-second quarter mile than a motorcycle. Non-x86 CPUs (it doesn't have to be ARM) have an inherent competitive advantage in not needing all those transistors for backwards compatibility with a '70s microprocessor. I wouldn't be too shocked to see a Sun SPARC derivative in a desirable cell phone, but I'd be stunned to see an x86.

x86 is relevant on the desktop for backwards compatibility with legacy binaries from decades past. That is simply not an issue for portable devices: any vaguely interesting application was developed not in the '80s or '90s, but more likely less than six months ago. And it probably runs under a VM that doesn't care what the underlying CPU is.

So yes, rapidly becoming more and more irrelevant. Only servers and workstations will require an x86 CPU.

Oh, and a cell phone COULD play Crysis through something like OnLive. Which is another reason I think NV is doomed -- phones in the near future will remain too feeble to run console-like games natively, so a powerful GPU will only ever be needed on a small % of devices. Console games will remain the domain of consoles and services like OnLive.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
So yes, rapidly becoming more and more irrelevant. Only servers and workstations will require an x86 CPU.

Only office machines, really.
The server/workstation market was, and is, mostly a *nix market.
As such, there isn't that much legacy x86 code involved. Most code was ported from other architectures to x86 only recently, and could probably just as easily be ported back.
 

shangshang

Senior member
May 17, 2008
830
0
0
Some people seem to see the x86/Intel entity as an immovable object. IBM was once in such a position; they aren't anymore. Basically, every dominant player has declined, and they decline because of paradigm shifts, not because they didn't see the iceberg coming. Did Microsoft see Google coming? You bet they did, but they weren't able to do a damn thing about it -- not because MS lacked money, but because of the paradigm shift, and MS was too large to shift with it. Apple was dominant and then got killed, and it is only recently that Apple has resurged. Who would have thought a bunch of hippies would set a worldwide trend with all those iPhones and iPods and iPads? MS saw it coming, but discounted them as trendy little things. In all these cases, the dominant players failed not because they didn't see it coming; they failed because of their size.

I think some of you are jumping on the "NV can't survive without x86" bandwagon a bit much. The mobile/handheld market is still growing rapidly, and at some point it will consolidate; I see NV as one of the main players once it has. Jensen is a good CEO, and he knows where to get talent.
 
Dec 30, 2004
12,553
2
76
Branch prediction costs a lot of $$$, both in research dollars and in die area. ARM was traditionally aimed at the embedded market: the apps that run there usually don't have a lot of logical branches... and you don't need raw performance either... the focus was on real time (real time meaning guaranteed minimum performance, rather than being fast) rather than on unpredictable raw performance (i.e., what happens when the branch predictor misses?). ARM was traditionally used in just-enough-performance applications, i.e., just enough performance to make a cell phone call.

Branch prediction also uses up quite a bit of area, and since we are mass producing these chips, every mm^2 counts.



I am not familiar with ARM process technologies, but I would imagine so: their pipeline is simple enough that it should be relatively easy to scale up the frequency, as long as you are willing to sacrifice power/area (i.e., bigger transistors to drive more current).



Not sure what you mean -- you cannot easily or economically embed DRAM onto CMOS logic.



Sure -- but we also want to look forward to ARM in more than just 3G phones... e.g., 64 ARM chips in a server, each one serving a different Apache process.

Also, don't we all wish our 3G phones could do more? E.g., run iPhone games with better gfx, or render a webpage faster?

Even the simplest branch prediction algorithms, developed in the early '90s, are over 95% accurate at predicting branches in most applications: http://www.utdallas.edu/~sxr049100/...obal and local branch history information.pdf

Depending on the application, the worst these freely available algorithms do is on the order of 80% prediction accuracy. Also, with the right register implementation, a mispredicted branch is no worse than stalling until the real branch result becomes available.
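
(To make that concrete, here is a minimal sketch of the classic two-bit saturating-counter predictor from that era of papers -- my own toy reconstruction, not any shipping design. On a synthetic loop branch it shows why even this simple scheme clears 95%:)

public class TwoBitPredictor {
    private final int[] counters; // 0..3: 0-1 predict not-taken, 2-3 predict taken
    private final int mask;

    TwoBitPredictor(int tableSize) {          // tableSize must be a power of two
        counters = new int[tableSize];
        mask = tableSize - 1;
        java.util.Arrays.fill(counters, 1);   // start weakly not-taken
    }

    boolean predict(long pc) {
        return counters[(int) (pc & mask)] >= 2;
    }

    void update(long pc, boolean taken) {
        int i = (int) (pc & mask);
        if (taken) counters[i] = Math.min(3, counters[i] + 1);
        else       counters[i] = Math.max(0, counters[i] - 1);
    }

    public static void main(String[] args) {
        TwoBitPredictor bp = new TwoBitPredictor(1024);
        long pc = 0x4000;           // one hypothetical loop-closing branch
        int correct = 0, total = 0;
        // A loop branch: taken 99 times, then falls through once, repeated.
        for (int rep = 0; rep < 100; rep++) {
            for (int it = 0; it < 100; it++) {
                boolean actual = it < 99;
                if (bp.predict(pc) == actual) correct++;
                bp.update(pc, actual);
                total++;
            }
        }
        System.out.printf("accuracy: %.1f%%%n", 100.0 * correct / total);
        // Prints ~99% -- the two-bit hysteresis only misses the loop exits,
        // which is why such a tiny structure does so well on loopy code.
    }
}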
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
I don't think graphics cards have much of a future in the home market. The trend is toward devices tailored to a specific purpose, which is the opposite of the general-purpose PC: consoles for video games, ebook readers for reading, set-top boxes for video. For office applications and other mainstream software, the focus is on cloud computing. MS has put a ton of cloud computing support into Windows 8 development, even planning to launch it with its own online store where you never actually install the software -- it runs in the cloud. With things going that way you don't need a fast local video card; the server does.

OnLive is out now and has come a long way from the first demonstrations. I think that as the internet gets faster and they add more data centers, it will become the way home gaming is done. The biggest issue is latency: right now 20ms of latency gets you 30fps, and if the networks improve you should be able to get 10ms of latency and 60fps.
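
(Rough input-to-photon arithmetic with those numbers; the 10ms encode/decode budget and the class name are my own assumptions, purely illustrative:)

public class StreamLatency {
    // Worst case an input just misses a frame, so budget one full frame
    // on the server plus the network round trip plus codec time.
    static double inputToPhotonMs(double rttMs, double fps, double codecMs) {
        return rttMs + 1000.0 / fps + codecMs;
    }

    public static void main(String[] args) {
        System.out.printf("today (20ms, 30fps): ~%.0f ms%n",
                inputToPhotonMs(20, 30, 10));  // ~63 ms
        System.out.printf("hoped (10ms, 60fps): ~%.0f ms%n",
                inputToPhotonMs(10, 60, 10));  // ~37 ms
        // Halving the frame time buys about as much as halving the network delay.
    }
}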

ARM is a good way to go for any company. It is extremely modular, and adding more cores does not run into the same obstacles you face with x86. It also takes the approach of dedicating hardware to each task: on x86, if you need to process sound or video, most of the time it is done on the CPU in software, or at most the GPU does some of it; on ARM it would be done by a DSP dedicated to the task, leaving the CPU idle.

MS has Windows CE 7 for ARM in beta, and I have been using it for a few weeks now; I can see where ARM could definitely be a contender.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
In some ways NV is screwed, because even if they can design a capable CPU, they won't have the massive resources to manufacture it like AMD and Intel do.
You mean like Intel does. AMD no longer has its own fabs. NVIDIA could just contract IBM.

As for ARM, Microsoft is already on board.
http://www.pcmag.com/article2/0,2817,2366904,00.asp

A Tegra 3 chip could easily power netbooks and tablets, plus entry-level desktops and laptops. While a Windows 7 port would help (and is certainly possible), Android would work just fine.

Honestly, all NVIDIA really needs is for Microsoft to provide a Windows port, and then NVIDIA could make any chip they wanted -- Cell or PowerPC or whatever.