My understanding of the motivation behind NV's Fermi and future product developments


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Good correction on Bulldozer. The point is that AMD's integrated graphics on Llano are going to be low-end by the time it ships in 2011. This is why AMD has its discrete mobile GPU cards. Not to mention the person buying Llano will have to choose between a superior Intel CPU and a slower AMD CPU. Might as well get Intel + dedicated graphics if you care about graphics. If you don't care about graphics, then you'll likely get the faster Intel CPU, since you won't play games anyway. It's likely cheaper for AMD to sell a CPU with an integrated GPU than to sell their Phenom IIs with a crappy chipset/discrete graphics. This is likely the primary reason for Fusion: cost savings for AMD.

And Bobcat? That's going to compete with the Atom, so its GPU is going to be immaterial compared to any discrete graphics card. Plus, the market for Bobcat is users who care about the longest battery life. This means that unless the Bobcat CPU itself is better than the Atom, its graphics performance is not going to matter to consumers. AMD laptops have forever had better integrated graphics than Core 2 Duo + Intel HD, yet they have failed in the mobile space because there the CPU matters more for non-gamers.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
@ModestGamer

Just one question: what makes you think that 100% of that 85% is going to ditch their existing computer for Fusion and never ever buy another computer from any company other than AMD?
 
Apr 20, 2008
10,067
990
126
The coming APUs from AMD and Intel pretty much kill off discrete for them below the mid range: the $100-and-under cards, the highest-volume sellers. They also lose the integrated graphics chipset sales.

They saw this coming years ago when AMD bought ATI; they knew exactly what it meant then and shifted business strategy.

So it is what it is.

Really? An APU back then?

ATI wasn't even producing a video card with stream processors until a year after the final sale, never mind that the deal was made at least a year prior. It was only pixel pipelines back then, not stream processors. They brought up Fusion later that year as an all-in-one chip, a CPU and GPU combination, as in: the GPU only handles video and the CPU only handles processing. Not a combined processor. That thought didn't come up until later.

Stop playing "know it all" and "told ya so" when nobody knew what an APU would be.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Really? An APU back then?

ATI wasn't even producing a video card with stream processors until a year after the final sale, never mind that the deal was made at least a year prior. It was only pixel pipelines back then, not stream processors. They brought up Fusion later that year as an all-in-one chip, a CPU and GPU combination, as in: the GPU only handles video and the CPU only handles processing. Not a combined processor. That thought didn't come up until later.

Stop playing "know it all" and "told ya so" when nobody knew what an APU would be.

AMD indeed was producing chips with stream processors: Xbox 360, anyone? Plus, cards take a long time to develop. G80 was launched in 2006, but it had been in development for more than 3 years. The HD 2900XT was launched in 2007, and it had been in development well before AMD bought ATI. ATI started developing the RV770 in 2005!
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
@ModestGamer

Just one question: what makes you think that 100% of that 85% is going to ditch their existing computer for Fusion and never ever buy another computer from any company other than AMD?


It matters not if it is AMD or Intel. The OEMs control the market: Dell, IBM, HP, Compaq, etc. Enthusiasts are a minor nuisance in reality.

It's all a matter of what OEMs buy that makes Nvidia's business plan succeed or fail.

Think about this: laptop integrated graphics. Why have them now? Right? The Intel and AMD APUs are going to kill that on all but the gaming laptops, which make up nearly zero of the whole laptop market. Not only that, but AMD and Intel (more likely AMD) will be better able to offer laptop graphics that get a big boost in performance when coupled with their APU design.

Intel might be leading on CPU technology, but AMD has Intel beat on GPU tech. Currently.

The real issue becomes: where will Nvidia sell products built on their IP?

You have to ask that question. The discrete card market is good to Nvidia, no doubt about it. I think some here are a bit emotional about the product, which is silly. Nvidia makes a great product; the issue is, will OEMs want to spend Nvidia money to add parts they no longer need? OEM PC and laptop builders want a replacement model only.

To be honest, no, they won't, if 85% of users never even open the box; they just replace the old box with a new box. Repairs are rare, warranties are high, and upgrades on most machines are at best infrequent. Without an OEM selling position, Nvidia is in big trouble.

Product movement is cash flow. OEMs buy more product than gamers etc. by leaps and bounds.

You're trying to frame the argument this way:

Nvidia makes a great product, so they will survive.

My point is:

Nvidia is losing a huge amount of cash flow and OEM sales. How will they remain in business with a model based mostly on the very small gamer community?

So ask yourself again: where will Nvidia sell product and IP when the OEMs are looking at AMD and Intel to provide cheaper, more profitable integrated CPU/GPU units?

Think microcontroller. That's where this is headed.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
Very nicely done, RussianSensation. All of that is pretty much fact.

I mean, AMD has its low-budget consumer cards, so Nvidia must compete and have that too.

All those Macs and cheap PCs, they throw in a $100-to-$150 card. That is just how it is nowadays.

You want the fast card? Then pay $400 and get that. Thanks and gb and gl,
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
Really? An APU back then?

ATI wasn't even producing a video card with stream processors until a year after the final sale, never mind that the deal was made at least a year prior. It was only pixel pipelines back then, not stream processors. They brought up Fusion later that year as an all-in-one chip, a CPU and GPU combination, as in: the GPU only handles video and the CPU only handles processing. Not a combined processor. That thought didn't come up until later.

Stop playing "know it all" and "told ya so" when nobody knew what an APU would be.


Why else would they buy a GPU technology company?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Why else would they buy a GPU technology company?

Lots of other reasons:

1. CASH FLOW from current products - ATI would continue to develop and sell standalone PCIe-based graphics cards. You are then paying for the future cash flows of that division: you may pay a multiple of 7-8x annual cash flow, while the ATI graphics division could operate for 50 or more years. (Check)

2. Chipsets - Buying ATI freed AMD from its reliance on companies such as VIA for chipsets, as the combined company could churn out chipsets tailored for AMD systems. (Check)

3. It also gave AMD an alternative to Intel's integrated graphics until Fusion was developed. This was necessary if AMD intended to compete at all in the mobile space from 2006 onward. (Check)

4. AMD would make money off the sale of Intel systems with ATI discrete graphics cards. (Check)

5. It gave AMD an entry into the cell phone, console gaming and digital TV markets such as HDTVs, as ATI makes chips for those products. (Check)

6. Those who know that AMD and ATI merged score their brand preference for AMD 3x higher than those who are not aware of the merger. In other words, if consumers know AMD and ATI are one company, they believe in the product more. On the other hand, by keeping the brands distinct, AMD may actually have been limiting or handicapping its products when sold standalone. (Check)
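The payback arithmetic behind the multiple in point 1 can be sketched in a few lines of Python; the cash flow figure and the 8x multiple below are made-up illustrations, not ATI's actual financials:

```python
# Hypothetical numbers: assume the graphics division throws off $500M/year
# and the buyer pays an 8x multiple of that annual cash flow.

def acquisition_price(annual_cash_flow: float, multiple: float) -> float:
    """Price paid, expressed as a multiple of annual cash flow."""
    return annual_cash_flow * multiple

def payback_years(price: float, annual_cash_flow: float) -> float:
    """Years of flat cash flow needed to earn back the purchase price."""
    return price / annual_cash_flow

cash_flow = 500e6                          # assumed $500M/year
price = acquisition_price(cash_flow, 8)    # 8x multiple -> 4.0e9
print(payback_years(price, cash_flow))     # prints 8.0
```

The point of a 7-8x multiple is that the buyer earns the price back in under a decade of flat cash flows, while the division keeps operating long after.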
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
AMD has been talking about integrated graphics chips and CPUs for over a decade.

The explosion of low-power devices in the early 2000s set that market on fire.

To address your points: while those are nice Monday-morning quarterbacking, the only one that makes sense was the chipset business, but they wanted the IP and brand presence.

The whole thing from the start with the ATI merger was about the integrated GPU/CPU, and right after they finalized the deal, they announced Fusion products in development.

The motives are glaringly obvious. They are going to pay big dividends in market share too.

AMD isn't the dumb company everybody thinks they are.




Lots of other reasons:

1. CASH FLOW from current products - ATI would continue to develop and sell standalone PCIe-based graphics cards. You are then paying for the future cash flows of that division: you may pay a multiple of 7-8x annual cash flow, while the ATI graphics division could operate for 50 or more years. (Check)

2. Chipsets - Buying ATI freed AMD from its reliance on companies such as VIA for chipsets, as the combined company could churn out chipsets tailored for AMD systems. (Check)

3. It also gave AMD an alternative to Intel's integrated graphics until Fusion was developed. This was necessary if AMD intended to compete at all in the mobile space from 2006 onward. (Check)

4. AMD would make money off the sale of Intel systems with ATI discrete graphics cards. (Check)

5. It gave AMD an entry into the cell phone, console gaming and digital TV markets such as HDTVs, as ATI makes chips for those products. (Check)

6. Those who know that AMD and ATI merged score their brand preference for AMD 3x higher than those who are not aware of the merger. In other words, if consumers know AMD and ATI are one company, they believe in the product more. On the other hand, by keeping the brands distinct, AMD may actually have been limiting or handicapping its products when sold standalone. (Check)
 
Apr 20, 2008
10,067
990
126
AMD indeed was producing chips with stream processors: Xbox 360, anyone? Plus, cards take a long time to develop. G80 was launched in 2006, but it had been in development for more than 3 years. The HD 2900XT was launched in 2007, and it had been in development well before AMD bought ATI. ATI started developing the RV770 in 2005!

AFAIK, the 360 is based on the X1800 architecture, not the 2xxx series.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
AFAIK, the 360 is based on the X1800 architecture, not the 2xxx series.

So what? It is a unified architecture nevertheless. Why don't you read the Xbox 360 specifications before you post? The Xbox 360 GPU has 48 unified processors which can run either pixel shaders or vertex shaders; it is based on the X1800 architecture but has some features of the R600 architecture, going beyond the DX9 spec but falling short of the DX10 spec.
 

Petey!

Senior member
May 28, 2010
250
0
0
Pretty easy to see that Nvidia is in trouble getting shut out of the integrated market. Seems some people don't realise how massive this is. And a lot of people don't understand business. All the money is in OEMs and getting your product out in mass volumes. That's where all the cash flow comes from, small margins or not. Discrete is nice, and they wallop the workstation side of things, but if that's all that's gonna keep them afloat, they'll be one tiny company compared to what they are now.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
It matters not if it is AMD or Intel. The OEMs control the market: Dell, IBM, HP, Compaq, etc. Enthusiasts are a minor nuisance in reality.

It's all a matter of what OEMs buy that makes Nvidia's business plan succeed or fail.
*Snip*
Okay then, OEM isn't a market for Nvidia and Nvidia should just try harder? Yeah, they tried that several years ago. In fact, two years ago ION arrived, where the Nvidia GPU actually offloaded the Intel CPU by a lot. Guess what: Intel used its relationships to jack up the cost of OEM parts for Nvidia, which increased their BoM cost, and they therefore failed to compete.

Guess what: after AMD successfully sued Intel, Nvidia sued Intel for the reason above; Intel was pressured to settle the case, but Intel insisted on bringing it to trial.
Source from money.cnn.com

Contrary to what you believe, that Nvidia has been playing turtle, they were actually forced out of the market by Intel, and average Joes like you never knew it happened. AMD, on the other hand, is not affected in the same way, as they make their own CPUs and chipsets. With Intel being forced to settle the case, AMD has a freeway on OEM. Oh wait, Sandy!

Because of Intel, Nvidia was forced out of that part of the market. Interestingly, it kind of flowed into a bigger market, aka the cell phone market. Smartphones really were not a visible market several years ago, but with the introduction of the iPhone (not the first, but the best known), the entire smartphone market arose and a "data plan" is logical for consumers now. Nvidia's Tegra was more or less for OEMs and laptops but couldn't compete due to Intel's dirty tricks, and it would be too late to go back now, as Intel and AMD have more or less covered that ground. Yet in cell phones Nvidia is the strongest of the three; phones powered by Tegra 2 are rolling out, and Tegra 3 is baking hot.

Did I mention that Nvidia is going to court with Intel? It is a risky move IMO, but there are much smarter people at Nvidia than me who have calculated all the risk factors for their decisions, so we will see how it ends up. If Intel loses, then Nvidia can take a share of the OEM market from Intel, weakening its position against AMD. Haven't you heard that your enemy's enemy is your friend? Tell me, why would AMD want Nvidia to die? No matter what the outcome is, AMD will be on the winning side. That is the definition of "well played."

While these three players are busy stepping on each other's feet, Apple shamelessly took over the Walkman, CD player, MP3, and phone markets. Nvidia has been in a good relationship with Apple and almost made it into the iPhone; too bad the iPhone business is too good to be shared. Once again Nvidia is forced to play against the sleeping dragon. Remember how people used to laugh at iMac users? No one laughs anymore, as the iPhone 4 is virtually sold out everywhere, the iPad is still selling like (no words to describe it), and the rest of the mobile companies have formed an alliance against the iPhone. Nvidia has placed itself in the war zone, not necessarily on either side, but as an option to both sides. Where is AMD here?

As to GPU computing, Nvidia believes that pouring money into education will eventually bring them strategic advantages. They are spending on university programs and funds like crazy. In a few years' time these seeds will unleash the power within those CUDA-core GPUs in supercomputing, because people will know how. That is strategy.

There are stories far more interesting than the ones I know. Don't take it just from me, as my understanding is limited. Read around and you will find stuff that may make you wet your pants, and realize how big the world is.
 

golem

Senior member
Oct 6, 2000
838
3
76
Pretty easy to see that Nvidia is in trouble getting shut out of the integrated market. Seems some people don't realise how massive this is. And a lot of people don't understand business. All the money is in OEMs and getting your product out in mass volumes. That's where all the cash flow comes from, small margins or not. Discrete is nice, and they wallop the workstation side of things, but if that's all that's gonna keep them afloat, they'll be one tiny company compared to what they are now.

I agree that getting shut out of the integrated market will hurt Nvidia; it won't kill them, but it does hurt, though not necessarily for the cash flow or profit reasons. Integrated volume allows them to spread their fixed costs and R&D over a lot more units, so the allocated cost for each unit is lower than if they didn't have integrated units. The profit from this division isn't really that high, but it helps to increase the profits from the other units.
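The cost-spreading point above is simple fixed-cost allocation; here is a minimal sketch with invented figures (the R&D budget and per-chip cost are placeholders, not Nvidia's real numbers):

```python
# Invented figures: the shape of the result is the point, not the values.

def unit_cost(fixed_costs: float, variable_cost: float, units: int) -> float:
    """Allocated cost per unit: variable cost plus each unit's share of fixed costs."""
    return variable_cost + fixed_costs / units

rnd_budget = 100e6    # assumed fixed R&D spend shared across the product line
per_chip = 20.0       # assumed variable manufacturing cost per chip

print(unit_cost(rnd_budget, per_chip, 5_000_000))   # with integrated volume: 40.0
print(unit_cost(rnd_budget, per_chip, 1_000_000))   # discrete-only volume: 120.0
```

Losing the high-volume integrated units pushes the allocated cost of every remaining unit up, which is why a low-profit division can still subsidize the rest of the lineup.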

What a lot of people are saying, though, is that this GPU/CPU integration doesn't really help AMD much either. It probably won't increase their CPU sales, because Intel is doing the same thing, but it will decrease their discrete sales.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81

"Nvidia's achievement for Q2 2010 is quite remarkable since most of the Quadro graphics cards Nvidia shipped were based on previous generation technology, according to Alex Herrera, a senior analyst at Jon Peddie Research. In fact, ATI launched its FirePro V8800 and V7800 graphics accelerators based on Cypress chips in early April, hence, they have been shipping since Q1 2010, whereas Nvidia unveiled its Quadro 4000 and Quadro 5000 products (powered by GF100 chip) in late July, which means that there were not a lot of such boards shipped in calendar 2010 to OEMs."

Interesting findings....
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Seems some people don't realise how massive this is. And a lot of people don't understand business. All the money is in OEMs and getting your product out in mass volumes. That's where all the cashflow comes from, small margins or not.

No, you clearly don't grasp the way NV operates as a company. 31% of its net income is derived from professional graphics, so to state that all their cash flow comes from OEMs is incorrect. Again, sales to OEMs include both integrated and discrete products, and no one is disputing that selling standalone discrete graphics to OEMs is important. The argument is largely with how some believe that NV's integrated chipsets/graphics are VERY important to NV.

Also, revenue != cash flow. Margins, however, have an immediate effect on cash flows.

In simplest terms, Revenues * Net Margin = Net Income (NYU slide 91: http://pages.stern.nyu.edu/~adamodar/pdfiles/dcfinput.pdf). Since margin is what remains of revenue after costs such as COGS, if you have small margins, you have small cash flows.
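The identity above can be checked numerically; the revenue and margin figures below are invented purely to contrast a high-margin line with a thin-margin one:

```python
# Invented revenue/margin figures, purely to contrast high- and thin-margin lines.

def net_income(revenue: float, net_margin: float) -> float:
    """Revenues * Net Margin = Net Income."""
    return revenue * net_margin

revenue = 1_000_000_000.0
print(net_income(revenue, 0.30))   # ~300M: high-margin professional graphics
print(net_income(revenue, 0.03))   # ~30M: thin-margin OEM volume parts
```

Same revenue, an order of magnitude apart in income once margin shrinks, which is the sense in which small margins mean small cash flows.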

I still think the people arguing that the integrated chipset business is a big deal for NV don't get it: NV does not consider integrated chipsets an important revenue stream (Seero pointed this out for you guys already!!!). The bottom-line impact is immaterial today; NV has moved on to better things. Therefore, to state that "that's where all the cashflows come from, small margins or not" is misleading, because it implies that cash flows are material irrespective of margins.

Jen-Hsun Huang - "They (Intel) have disrupted our chipset business. The damage has been done. We've been out of the chipset business for well over a year, so if this got resolved we're not expecting to ramp back up the thousand engineers that we had working on chipsets." http://news.cnet.com/8301-13924_3-20013543-64.html
 

extra

Golden Member
Dec 18, 1999
1,947
7
81
I'm betting AMD is wondering the same thing these days.

AMD buying ATI is the smartest thing they've ever done, other than x86-64. And I hafta laugh at this, especially because the graphics side of AMD has been a lot like Intel the past couple years--an absolute execution machine. Putting their CPU side to shame.

To state the obvious, Intel is also moving in the same direction; Sandy Bridge is just the start. Do you really think that a company as huge and as resourceful and with as many smart engineers as Intel won't eventually get its GPU (well, let's call it the graphics part of the APU for this discussion) up to a pretty decent point? No, they probably won't catch AMD in integrated graphics anytime soon (other than the temporary lead they may get with SB for a little while), and they won't catch Nvidia's low-end parts. That's not the point.

The point is that Intel has, with SB, gotten their integrated graphics performance up to the "good enough" for most people to play games (low settings on some, but PLAYABLE). And do you not think that when Haswell rolls around the gap will close further? I do. Not to mention what AMD's APU will look like at that point, GPU-side wise.

Nvidia WILL get squeezed out of the sub-$100 graphics card market, because that market will likely disappear. Not in 2011 or likely even in 2012 or 2013. But it will shrink and eventually disappear. Discrete will probably exist at least through 2020 in higher priced stuff though (I hope!).

Being cut out of selling mainstream graphics cards and relegated to enthusiast/gamer stuff only is going to hurt. I don't care what you say about revenue percentages, etc. It's going to get really tough for Nvidia in the coming years unless they can find some niches to expand into.

Professional graphics and GPU-compute stuff can carry them, but not forever. There's going to come a point where APUs will start stealing a lot of that market as well. They need to branch out into areas where their disadvantages vs. Intel and AMD are minimized and their strengths are maximized. I think Tegra is a good start. Tegra is, despite what Charlie says, a good chip (play the racing game on the Zune HD for a nice taste of what it is capable of). Anyway. :)
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
All the money is in OEMs and getting your product out in mass volumes.

This, more than anything, is why this thread is so amusing. Consumer PCs/laptops now account for less than 20% of all CPU volume sales, and that rate is in a rapid state of decline. With the exception of workstation/crunch racks, which are already nV's cash cow, nVidia is in a very strong position moving forward. Simply take a look at all of the high end phones and tablets coming out in the first six months of 2K11.

nVidia is supposedly doomed because they are being pushed out of the lowest-margin chip segment of the rapidly contracting PC segment of consumer electronics, despite currently being the 'holy grail' of the white-hot and exploding ultraportable market (Toshiba, HTC, Samsung, Motorola, Dell, MSI, Notion, LG, Asus are *ALL* using Tegra 2 for their flagship ultraportable devices for announced 2K11 products). So nV gets pushed out of the lowest-end portion of a ~300 million unit a year market and takes over the high end of a 1.2 billion unit a year market, and somehow this spells doom. I don't see the logic in this.

If all you follow is the PC market, you are missing the much, much bigger world out there. AMD's dumping of their handheld division right before it exploded in popularity is the biggest business mistake they have made in many years: $65 million to Qualcomm, which has already made *significantly* more profit than that back on Snapdragon alone (which is an ARM/ATi-designed SoC). If they had held on to their handheld division and continued to develop it, they could have had a potential market larger than either their CPU or GPU business within the next few years; unfortunately, they exited the market right before it moved directly in line with where their designs were perfectly suited.

Current analyst projections on the broader market versus PC shipments have PCs falling to likely well under 10% of the CPU market by 2020. How does this all tie back around to Fermi?

The Fermi design gives them a huge lead in the workstation/crunch rack over Intel or AMD; even Intel's own PR talks about how badly they get beaten (I'd hope Intel fired the idiot who released that one ;) ). On the other end of the computing spectrum, they are pushing platforms that are going to compete with Intel's Atom line in the MID space. Right now, MS has three different OSs running on CPUs made by nVidia, and the fastest growing OS by a long shot right now (Android) is also running on nV CPUs. While a lot of people seem to have a laser-like focus on how AMD and Intel are going to squeeze nV out of the PC space, they fail to notice that nVidia is right now positioned to do very well in every other computing market while both Intel and AMD are decidedly not (outside of consoles, where AMD is still looking strong outside the handheld market, thanks to their boneheaded sale).

Fermi designs, moving forward, are nV's best bet when they try pushing up from the bottom. A netbook with a Tegra 4 chip could have more raw compute power than a Pentium D while using less than a watt (nV uses the workstation market to figure out the best approach, then scales units down to put in the Tegra line). That kind of squeeze coming up from the bottom could be a problem for Intel and AMD. General users' compute power needs aren't going up anywhere near as fast as they have in years past, and everyone involved in the industry knows it. What do you do to make a desktop PC compelling when a tablet can do everything people use their desktop PC for (many users are *already* getting this with the iPad today; it's not like it's some far-off pipe dream)?

All of this assumes that nV can execute. People who follow only the PC market may mistake lead time for failure on the part of the Tegra line, but the design turnaround in the ultraportable market is *FAR* longer than what people are used to on these boards. Tegra 2 is going to be a major player in the ultraportable space in the not too distant future; anyone can simply follow Engadget for a couple of weeks, read all the news, and see that that is simply a statement of fact.

If they can continue to push their way both up and down in the ultraportable market while continuing their high-end dominance with Fermi-type parts, they will be more than comfortable with their market position. Those two factors are both big *ifs* to be sure, but that they have been ready for this chain of events for years cannot be doubted. They have everything in place they possibly can to be ready to exit the PC market entirely within the next decade and potentially be far larger than they are today (not that I expect that to happen at all, just looking at it from a positioning perspective).
 

extra

Golden Member
Dec 18, 1999
1,947
7
81
If they can continue to push their way both up and down in the ultraportable market while continuing their high-end dominance with Fermi-type parts, they will be more than comfortable with their market position. Those two factors are both big *ifs* to be sure, but that they have been ready for this chain of events for years cannot be doubted. They have everything in place they possibly can to be ready to exit the PC market entirely within the next decade and potentially be far larger than they are today (not that I expect that to happen at all, just looking at it from a positioning perspective).

I don't think they are dominating on the high end (unless you are talking about Quadro; then sure, I agree). But I do agree with your post on the importance of Tegra, 100%. If they can execute, it has serious potential. If they can't, I don't think GPU compute and Quadro will save them. Tegra isn't nearly as bad as the FUD Charlie spreads about it makes people think (not to get into a Charlie argument, I rather enjoy his stuff, just saying). Go check out a Zune HD if you need proof: it's great. Go look at the racing game; it's a free download, I believe. It's a good example of its potential. And that's the original Tegra.

However, I don't agree that Intel and AMD won't be able to compete here. They just want to use x86. Can't blame them, either. I think they'll pull it off. And I know what you mean about Tegra being able to have more compute power in a sub-1W part than a semi-modern dual-core Pentium, but having that theoretical power there and actually having it usable in the stuff that people are going to be running on a netbook? Two VERY VERY different things.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I don't think they are dominating on the high end (unless you are talking about Quadro, then sure, I agree)

Yeah, I was talking about Quadro/Tesla :)

However, I don't agree that Intel and AMD won't be able to compete here. They just want to use x86.

For me this reads like "Company X will be able to compete in Formula 1, they just want to use their cement truck platform to do it" :) Hundreds of thousands of transistors needing to be powered in a UP device just to translate x86 calls to uOps is a horrific idea and starts you off with a massive handicap.

And I know what you mean about Tegra being able to have more compute power in a sub-1w part than a semi-modern dual core pentium, but---having that theoretical power there and actually having it usable in the stuff that people are going to be running on a netbook? Two VERY VERY different things.

I honestly see this as being one of the areas where Fermi helps them out a lot. Universities are already teaching numerous classes on programming in CUDA, and simply looking at the number of these classes, all those students aren't going to be working for Los Alamos. We are going to see a sharp spike in the number of developers who are already used to programming for CUDA/GPGPU, and the emerging ultraportable market is one that encourages young startups, as they don't need a publisher and have no major existing power base to deal with (relatively speaking, the UP software market is still in its infancy). I'm not saying I think they will get a direct 100% comparison, but I think the gap will be much, much smaller than what we are looking at now.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
You're just being ignorant. AMD has a sub-9-watt integrated APU. PC/laptop might make up 20% of CPU sales if you add in the size and growth of embedded, which is phones, where ARM owns all; which, BTW, Intel and AMD intend to shut Nvidia out of, and they already have better product in the pipeline.

A GPU is nowhere near capable of running a machine, and AMD/Intel will be looking to compete there.

Nvidia is stuck with discrete solutions, for as long as that lasts.

ARM margins are small too, so they are very volume dependent.

Not to mention Intel and AMD both plan to compete here.

AMD pretty much will have the console market locked up.

Whether you like it or not, client is the cash cow that keeps the industry growing.

Nice try on the spin though.

Nvidia: "dead man walking"

This, more then anything is why this thread is so amusing. Consumer PCs/laptops now account for less then 20% of all CPU volume sales now, and that rate is in a rapid state of decline. With the exception of workstation/crunch racks which is already nV's cash cow, nVidia is in a very strong position moving forward. Simply take a look at all of the high end phones and tablets coming out in the first six months of 2K11.

nVidia is supposedly doomed because they are being pushed out of the lowest margin chip segment of the rapidly contracting PC segment of consumer electronics, despite currently being the 'holy grail' of the white hot and exploding ultra portable market(Toshiba, HTC, Samsung, Motorola, Dell, MSI, Notion, LG, Asus- are *ALL* using Tegra2 for their flagship ultraportable devices for announced 2K11 products). So nV gets pushed out of the lowest end portion of a ~300 million unit a year market and takes over the high end of a 1.2 billion unit a year market, and some how this spells doom. I don't see the logic in this.

If all you follow is the PC market you are missing the much, much bigger world out there. AMD's dumping their handheld division right before it exploded in popularity is the biggest business mistake they have made in many years, $65 million to Qualcomm which has already made *significantly* more profit then that back on Snapdragon alone(which is an ARM/ATi designed SoC). If they had held on to their handheld division and continued to develop it, they could have a potential market larger then either their CPU or GPU business within the next few years; unfortunately they exited the market right before it moved directly in line with where their designs were perfectly suited.

Current analysts' projections on the broader market versus PC shipments have PCs falling to well under 10% of the CPU market by 2020. How does this all tie back around to Fermi?

The Fermi design gives them a huge lead in the workstation/crunch rack over Intel or AMD- even Intel's own PR talks about how badly they get beaten (I'd hope Intel fired the idiot who released that one ;) ). On the other end of the computing spectrum, they are pushing platforms that are going to compete with Intel's Atom line in the MID space. Right now, MS has three different OSs running on CPUs made by nVidia, and the fastest-growing OS by a long shot right now (Android) is also running on nV CPUs. While a lot of people seem to have a laser-like focus on how AMD and Intel are going to squeeze nV out of the PC space, they fail to notice that nVidia is right now positioned to do very well in every other computing market while both Intel and AMD are decidedly not (outside of consoles, where AMD is still looking strong- everywhere, that is, except the handheld market, thanks to their boneheaded sale).

Fermi designs are nV's best bet when they try pushing up from the bottom moving forward. A netbook with a Tegra4 chip could have more raw compute power than a PentiumD while using less than a watt (nV uses the workstation market to figure out the best approach, then scales small units down to put in the Tegra line). That kind of squeeze coming up from the bottom could be a problem for Intel and AMD. General users' compute power needs aren't going up anywhere near as fast as they have in years past, and everyone involved in the industry knows it. What do you do to make a desktop PC compelling when a tablet can do everything people use their desktop PC for (many users are *already* getting this with the iPad today- it's not like it's some far-off pipe dream)?

All of this assumes that nV can execute. People who follow only the PC market may mistake lead time for failure on the part of the Tegra line, but the design turnaround in the ultraportable market is *FAR* longer than what people are used to on these boards. Tegra2 is going to be a major player in the ultraportable space in the not-too-distant future- anyone can simply follow Engadget for a couple of weeks, read all the news, and see that this is simply a statement of fact.

If they can continue to push their way both up and down in the ultraportable market while continuing their high-end dominance with Fermi-type parts, they will be more than comfortable with their market position. Those two factors are both big *ifs* to be sure, but that they have been ready for this chain of events for years cannot be doubted. They have everything in place they possibly can to be ready to exit the PC market entirely within the next decade and potentially be far larger than they are today (not that I expect that to happen at all- just looking at it from a positioning perspective).
 

extra

Golden Member
Dec 18, 1999
1,947
7
81
We are going to see a sharp spike in the number of developers who are already used to programming for CUDA/GPGPU, and the emerging ultra-portable market is one that encourages young start-ups, as they don't need a publisher and have no major existing power base to deal with (relatively speaking, the UP software market is still in its infancy). I'm not saying I think they will get a direct 100% comparison, but I think the gap will be much, much smaller than what we are looking at now.

You do have a good point here. I think CUDA was/is being damn well executed by NVIDIA and was extremely smart thinking on their part. If their Tegra stuff gets CUDA-enabled (pretty sure Tegra 2 isn't CUDA-enabled, but maybe 3 and 4 will be?)...something I hadn't thought of. :)