News [Reuters] Nvidia eclipses Intel as most valuable U.S. chipmaker

NTMBK

Lifer
Nov 14, 2011
10,207
4,939
136
(Reuters) - Nvidia (NVDA.O) has overtaken Intel (INTC.O) for the first time as the most valuable U.S. chipmaker.

In a semiconductor industry milestone, Nvidia’s shares rose 2.3% in afternoon trading on Wednesday to a record $404, putting the graphics chipmaker’s market capitalization at $248 billion, just above the $246 billion value of Intel, once the world’s leading chipmaker.


The AI bubble has been pretty amazing for NVidia!
 
  • Like
Reactions: Glo.

mikegg

Golden Member
Jan 30, 2010
1,740
405
136
The AI bubble has been pretty amazing for NVidia!
There is no AI bubble. We're going to depend more and more on AI. It's going to be a colossal industry and right now, Nvidia is ahead.

Nvidia is an innovative company. They're smart about what to focus their efforts on. Only fanboys are mad because Nvidia GPUs are so much better than AMD's that they have to pay a big premium for them. They should be mad at AMD instead.

As a shareholder, I'm quite happy.
 
Last edited:

NTMBK

Lifer
Nov 14, 2011
10,207
4,939
136
There is no AI bubble. We're going to depend more and more on AI. It's going to be a colossal industry and right now, Nvidia is ahead.

Nvidia is an innovative company. They're smart about what to focus their efforts on. Only fanboys are mad because Nvidia GPUs are so much better than AMD's that they have to pay a big premium for them.

As a shareholder, I'm quite happy.

To clarify - I think that CNNs are extremely powerful tech, and will continue to be usefully applied to lots of problems for years to come. I just think that right now we're in the "Peak of Inflated Expectations":

[Image: Gartner Hype Cycle]


People seem to think that we're about to get flipping Skynet, and it just isn't going to happen! CNNs aren't going to deliver general AI. I think there's going to be a period of hype deflation as people come to terms with that.
 

mikegg

Golden Member
Jan 30, 2010
1,740
405
136
Pretty much all businesses can be made more efficient by AI or future AI tech - down to your mom-and-pop businesses. Companies that can put out the best AI at a certain task can take over an industry overnight. And society-changing AI, such as self-driving cars and self-delivering robots, is still in the testing phase.

So I do disagree that we're at the "Peak of Inflated Expectations" point unless you have solid data and research to back that up.

Personally, I think we're at the infancy stage, much closer to "Technology Trigger" as companies try to figure out the best way to collect data, create ETL, train, etc.
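To make the "collect data, create ETL, train" part concrete, here's a toy sketch of the kind of extract-transform-load cleanup that has to happen before any training can even start. The record fields and values here are entirely made up for illustration:

```python
# Minimal sketch of the ETL step that precedes model training.
# The record shape ("user", "action", "ms") is hypothetical.

raw_events = [
    {"user": "a", "action": "view", "ms": "1200"},
    {"user": "b", "action": "buy", "ms": "340"},
    {"user": "a", "action": "buy", "ms": "oops"},  # dirty record
]

def transform(event):
    """Validate and normalize one raw event; return None if unusable."""
    try:
        return (event["user"], event["action"], int(event["ms"]) / 1000.0)
    except (KeyError, ValueError):
        return None

# "Load": keep only the clean rows, ready for feature extraction / training.
training_rows = [t for e in raw_events if (t := transform(e)) is not None]
```

In practice this stage (validation, deduplication, normalization) is where most of the engineering effort goes, long before a GPU ever gets involved.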

The biggest threat to Nvidia, as a shareholder, is that we're still so early in the game that we don't even know for sure if Nvidia's hardware is the best for AI. For example, could something like Cerebras disrupt Nvidia? How about Google's own internal training chips that they might make available via Google Cloud?

And just for fun, in a survey of data scientists, they predicted that general AI is coming in the 2040s.
 
Last edited:
  • Like
Reactions: moinmoin

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
Is it AI or datacenter demand that's driving Nvidia's growth? Gaming is now only about half of Nvidia's revenue; they have become tremendously diversified. As a shareholder, their valuation is a bit over the top @ today's $420. Mind you, I used to say that when I first got in @ $35 and it went to $150 in about a year... Nvidia is just a beast in the stock market. However, let's not forget that in 2018 it reached $290 and slid back hard to $130ish... there's no reason to expect it to stay over $400 once the rotation out of tech stocks begins.
 
Last edited:

Gideon

Golden Member
Nov 27, 2007
1,598
3,520
136
So I do disagree that we're at the "Peak of Inflated Expectations" point unless you have solid data and research to back that up.

Personally, I think we're at the infancy stage, much closer to "Technology Trigger" as companies try to figure out the best way to collect data, create ETL, train, etc.

That's not what some industry insiders think.

Yeah, you could easily try to discredit the blog post as "having an agenda" since the company went bankrupt, but IMO his reasoning is quite solid and mirrors the thoughts of quite a few developers I've spoken to.

Deep learning is very useful, but it is by no means a panacea and can be quite expensive vs. simpler methods.

People actually designing systems with deep learning are much more realistic about this stuff than hyped-up, clueless investors. In fact, I know of quite a few startups that have added AI-related features to their product not because they were a good fit (they weren't really required and produced no better results in that domain) but rather to attract clueless investors. AI and blockchain are just the two new magic words (like the "cloud" of the past) used for that purpose.
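On the "expensive vs. simpler methods" point: for plenty of business problems with a clear trend, a closed-form least-squares fit gets you a usable answer with no GPUs, no training loop, and no hyperparameter tuning. A toy sketch (the data here is synthetic, generated with a known slope and intercept):

```python
import random
import statistics

# Synthetic data: a known linear trend (y = 2x + 1) plus noise.
random.seed(0)
xs = [i / 20 for i in range(200)]
ys = [2.0 * x + 1.0 + random.gauss(0, 0.5) for x in xs]

# Closed-form ordinary least squares: no GPU, no training loop.
mean_x, mean_y = statistics.fmean(xs), statistics.fmean(ys)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x
```

The fit recovers the underlying slope and intercept to within the noise, which is the whole argument: reach for the deep model only when a method like this demonstrably falls short.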
 

mikegg

Golden Member
Jan 30, 2010
1,740
405
136
That's not what some industry insiders think.

Yeah, you could easily try to discredit the blog post as "having an agenda" since the company went bankrupt, but IMO his reasoning is quite solid and mirrors the thoughts of quite a few developers I've spoken to.

Deep learning is very useful, but it is by no means a panacea and can be quite expensive vs. simpler methods.

People actually designing systems with deep learning are much more realistic about this stuff than hyped-up, clueless investors. In fact, I know of quite a few startups that have added AI-related features to their product not because they were a good fit (they weren't really required and produced no better results in that domain) but rather to attract clueless investors. AI and blockchain are just the two new magic words (like the "cloud" of the past) used for that purpose.
I wouldn't put too much stock into that blog post. If anything, it reinforces that we're still in the early phase of AI. We're still years away from breakthrough AI that does more than just facial recognition for social media apps.

But it doesn't mean it's not coming.

It's like being in the 1980s and saying "computers are overhyped because they're too expensive" or in the 1990s and saying "the internet is overhyped because it's too slow".

For A.I., data collection techniques such as IoT, data storage and pipelining, and finally training and inference (Nvidia's market) all need to come together.

I do agree that blockchain is overhyped.

But please don't compare blockchain and AI. One is developed by a much smaller community and is mostly used by scammers; the other is being developed by nearly every large company, with real applications.

Comparing blockchain to AI is a false equivalence: https://en.wikipedia.org/wiki/False_equivalence

PS. The Medium blog post you linked to is by a guy who doesn't actually have a technical background. Before he started the "AI" company, he worked as a glorified salesperson for a co-working space. While I'm sure he's very, very intelligent, I'd believe it more if it were from a technical director working at DeepMind, for example. https://www.linkedin.com/in/stefanseltzaxmacher/
 
Last edited:

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
People seem to think that we're about to get flipping Skynet

Stock market wise, that wouldn't be all that compelling. A monolithic contract, even at, say, $50 billion, is nothing compared to the recurring revenue stream offered up by deals like the one with Mercedes-Benz (not that the MB deal itself would be that big). Infotainment alone has enormous potential revenue implications, and smart cruise and autopilot are potentially bigger still - and those are actual markets that are in varying states of evolution.

Now, justifying the current valuation requires investors to believe that nVidia is a prohibitive favorite to make serious headway in these markets as their valuation balloons.

They’re about to have very tough competition in client graphics, which makes this especially funny.

"Poor Volta" was three years ago and AMD still doesn't have anything that can beat it. AMD themselves took napalm to their credibility in terms of forward looking statements on the GPU sector, they need to actually deliver on their promises- beating nVidia's highest tier part by 5% isn't going to cut it in that regard, they need a nVidia "killer". If you want the benefits of speculative pricing, you under promise and over deliver, that's why nVidia is given such a relative bump.

Also, consumer graphics as a percentage of revenue is dropping quickly, and if you look at gross margins it's already smaller than datacenter. AMD just isn't competitive there at all. They have an opening with a couple of big wins, but you can look back and see that those big single contracts tend to take years to translate into appreciable market share (just look at nVidia).

None of this is bashing AMD, people on this forum see them as nVidia's largest competitor, but they just aren't to the financial markets(nor are they for most of nVidia's markets at this point).
 

Gideon

Golden Member
Nov 27, 2007
1,598
3,520
136
I wouldn't put too much stock into that blog post. If anything, it reinforces that we're still in the early phase of AI. We're still years away from breakthrough AI that does more than just facial recognition for social media apps.

But it doesn't mean it's not coming.

It's like being in the 1980s and saying "computers are overhyped because they're too expensive" or in the 1990s and saying "the internet is overhyped because it's too slow".

I do agree that blockchain is overhyped.

He didn't say it isn't coming. He said it has hit an S-curve. I suggest you read the entire blog post, including what exactly they were working on, how simple and bulletproof it was compared to unsupervised level 5 autonomous driving, and why and how they still failed. Not that everybody will fail, but look at what the actual hard parts are and what is hype.

I'll quote some relevant parts, but highly suggest to read the entire post:

Back in 2015, everyone thought their kids wouldn’t need to learn how to drive. Supervised machine learning (under the auspices of being “AI”) was advancing so quickly — in just a few years it had gone from mostly recognizing cats to more-or-less driving. It seemed that AI was following a Moore’s Law Curve:

...
Projecting that progress forward, all of humanity would certainly be economically uncompetitive in the near future. We would need basic income to cope, to connect with machines to stand a chance, etc.

Five years later and AV professionals are no longer promising Artificial General Intelligence after the next code commit. Instead, the consensus has become that we’re at least 10 years away from self-driving cars.

It’s widely understood that the hardest part of building AI is how it deals with situations that happen uncommonly, i.e. edge cases. In fact, the better your model, the harder it is to find robust data sets of novel edge cases. Additionally, the better your model, the more accurate the data you need to improve it. Rather than seeing exponential improvements in the quality of AI performance (a la Moore’s Law), we’re instead seeing exponential increases in the cost to improve AI systems — supervised ML seems to follow an S-Curve.

[Image: S-curve of AI performance vs. cost to improve]


The S-Curve here is why Comma.ai, with 5–15 engineers, sees performance not wholly different than Tesla’s 100+ person autonomy team. Or why at Starsky we were able to become one of three companies to do on-public road unmanned tests (with only 30 engineers).

TL;DR: Supervised learning is becoming exponentially more expensive for the same gains. Its development is slowing down. (Though obviously it's still really useful and has a lot of untouched fields left to improve upon.) So yes, we are in the "Peak of Inflated Expectations" as @NTMBK mentioned. It doesn't mean it won't continue to improve/evolve, but the exponential improvements are over and things are slowing down.
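The S-curve argument is easy to see numerically. Here's a toy logistic model (the curve's midpoint and all parameters are made up purely for illustration) where "performance" saturates as compute spend grows: past the knee, each doubling of spend buys a smaller gain.

```python
import math

def performance(compute):
    # Toy logistic S-curve: quality saturates as compute spend grows.
    # The midpoint (e**5, about 148 units of spend) is a made-up parameter.
    return 1.0 / (1.0 + math.exp(-(math.log(compute) - 5.0)))

# Double the spend at every step and track the marginal gain per doubling.
budgets = [2 ** k for k in range(4, 14)]
scores = [performance(b) for b in budgets]
gains = [b - a for a, b in zip(scores, scores[1:])]
```

Scores keep rising, but the per-doubling gains peak near the midpoint and then shrink toward zero - exactly the "exponentially more cost for the same improvement" dynamic the blog post describes.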

But please don't compare blockchain and AI. One is developed by a much smaller community and is mostly used for scammers, the other is being developed by nearly every large company with real application.

Comparing blockchain to AI is a false equivalence: https://en.wikipedia.org/wiki/False_equivalence

Agreed, I didn't try to compare them, just mentioned that those are two things investors have been hyped about (though blockchain is less popular by now).
 

Gideon

Golden Member
Nov 27, 2007
1,598
3,520
136
BTW, I would love to be wrong about level 5 self-driving, and Elon Musk seems hell-bent on proving it.

It's just that the actual sentiment inside the software-development circles I've been to (and they do use ML daily and are highly competitive in their fields) seems quite a bit different and very much aligned with the quoted Starsky blog post (regarding self driving in particular).
 

mikegg

Golden Member
Jan 30, 2010
1,740
405
136
He didn't say it isn't coming. He said it has hit an S-curve. I suggest you read the entire blog post, including what exactly they were working on, how simple and bulletproof it was compared to unsupervised level 5 autonomous driving, and why and how they still failed. Not that everybody will fail, but look at what the actual hard parts are and what is hype.

TL;DR: Supervised learning is becoming exponentially more expensive for the same gains. Its development is slowing down. (Though obviously it's still really useful and has a lot of untouched fields left to improve upon.) So yes, we are in the "Peak of Inflated Expectations" as @NTMBK mentioned. It doesn't mean it won't continue to improve/evolve, but the exponential improvements are over and things are slowing down.
I did read the entire blog post.

I also wrote this:

PS. The Medium blog post you linked to is by a guy who doesn't actually have a technical background. Before he started the "AI" company, he worked as a glorified salesperson for a co-working space. While I'm sure he's very, very intelligent, I'd believe it more if it were from a technical director working at DeepMind, for example. https://www.linkedin.com/in/stefanseltzaxmacher/

Again, I respect the author, but it's from someone who has zero technical background and ran an "AI" company that wasn't at the top of the automation field (Waymo currently is).

In addition, you can find content to support any of your views on the internet. Just google "A.I. is overhyped" and you'll get millions of hits.

So it's best not to put too much stock into one medium article.

Agreed, didn't try to compare them, just mentioned that there are 2 things investors have been hyped about (though blockchain is less popular by now)
By "investors", you mean clueless get-rich-quick schemers, right? Because that was what drove blockchain's hype.

AI is totally different.

Again, I would never compare AI to blockchain.

And Nvidia's valuation is mostly driven by revenue and profit projections - not pure hype.
 

mikegg

Golden Member
Jan 30, 2010
1,740
405
136
I should clarify that I don't think we need to reach a general AI within the next 10 years to fulfill AI's "hype". People who are Nvidia shareholders don't believe that.

In addition, self-driving cars are not general AI.

Waymo already has enough testing data to show that their self-driving cars are actually safer than human drivers. They've had these results for years already. They're waiting for regulation to catch up to allow self-driving cars on the road.

It's not surprising that an AI company founded by a former co-working space salesperson (medium article author) failed. You can only blame investors for that.

I work and live in San Francisco. There are a ton of self-driving cars being tested on the road here.

Lastly, we should not equate the success of AI to self-driving cars as AI has broad application.
 
Last edited:

Gideon

Golden Member
Nov 27, 2007
1,598
3,520
136
And Nvidia's valuation is mostly driven by revenue and profit projections - not pure hype.

I never disagreed with that. My point was directed at the "Peak of Inflated Expectations" that many still seem to have about AI. E.g. the "Pentium 4 is going to hit 10 GHz" claims of ~17 years ago.

Btw, the frequency analogy is quite good for portraying what I meant. Just because frequency scaling hit a brick wall in 2005 didn't mean that all computing innovation stopped or that there was no growth or innovation in the computing business. It was just that the "nineties were over" (the era of ridiculous improvements every 2 years or so). The same holds for process improvements slowing down (as we are seeing today).

Deep learning has a lot of uses; it's constantly being improved and can be utilized in way more places than it currently is. There are decades of untapped work ahead in this regard (even with zero HW improvements). It's just that in areas where it has been extensively used, things are slowing down. The ultra-rapid progress of the mid-2010s is no more, and getting incrementally better results often requires exponentially more hardware to be thrown at the problem.
 
  • Like
Reactions: Pohemi

mikegg

Golden Member
Jan 30, 2010
1,740
405
136
I never disagreed with that. My point was directed at the "Peak of Inflated Expectations" that many still seem to have about AI. E.g. the "Pentium 4 is going to hit 10 GHz" claims of ~17 years ago.

Btw, the frequency analogy is quite good for portraying what I meant. Just because frequency scaling hit a brick wall in 2005 didn't mean that all computing innovation stopped or that there was no growth or innovation in the computing business. It was just that the "nineties were over" (the era of ridiculous improvements every 2 years or so). The same holds for process improvements slowing down (as we are seeing today).

Deep learning has a lot of uses; it's constantly being improved and can be utilized in way more places than it currently is. There are decades of untapped work ahead in this regard (even with zero HW improvements). It's just that in areas where it has been extensively used, things are slowing down. The ultra-rapid progress of the mid-2010s is no more, and getting incrementally better results often requires exponentially more hardware to be thrown at the problem.
I respect your opinion. I just think we're on the opposite end of it.

I believe that AI, as a catalyst for efficiency, is just getting started. The number of companies that should be using some form of AI but aren't, due to a lack of talent or cheap tools, is astounding.

Put it this way: we're just at the beginning of data collection. We can collect data on just about everything, from business events to guest behavior at a restaurant to how many oxygen molecules are in the air. Data collection has to come before training and inference. We can't say AI is slowing down already when we're not even close to collecting all the data available.

I'm not an expert. I'm only a software engineer who's dabbled in AI a few times just for learning. But I do live in San Francisco, and I'm surrounded by AI companies and talent. The market for AI is huge. Hence, Nvidia has been able to eclipse Intel in valuation and is projected to continue to outgrow Intel.
 
Last edited:

DisEnchantment

Golden Member
Mar 3, 2017
1,587
5,703
136
BTW, I would love to be wrong about level 5 self-driving, and Elon Musk seems hell-bent on proving it.

It's just that the actual sentiment inside the software-development circles I've been to (and they do use ML daily and are highly competitive in their fields) seems quite a bit different and very much aligned with the quoted Starsky blog post (regarding self driving in particular).
As someone who works in this industry, hearing how L5 autonomy was something that could easily be hammered out with ML was very exciting... in 2015.
We have been buying a number of startups over the years to find those elusive solutions and "out of the box ideas" that, we were told, would give us the "missing pieces of the puzzle" in our domain. Several billion dollars and several reorganizations later, we are a lot wiser.
For me, a personal awakening moment was when new colleagues (no disrespect to them) had no qualms about putting commercial non-ASIL accelerators in a vehicle, coupled to the powertrain network.

Our industry is still struggling to build safe, scalable SW and HW platforms and a programming paradigm that hold accelerators and related SW to the same ASIL and ISO 26262 standards as traditional automotive ECU HW/SW.
So yeah, I am with you on this. If you recollect, everyone was so adamant about having L5 autonomy by 2021. Right now the industry is struggling with L3, let alone L5. It will happen, just not yet.
 

mikegg

Golden Member
Jan 30, 2010
1,740
405
136
As someone who works in this industry, hearing how L5 autonomy was something that could easily be hammered out with ML was very exciting... in 2015.
We have been buying a number of startups over the years to find those elusive solutions and "out of the box ideas" that, we were told, would give us the "missing pieces of the puzzle" in our domain. Several billion dollars and several reorganizations later, we are a lot wiser.
For me, a personal awakening moment was when new colleagues (no disrespect to them) had no qualms about putting commercial non-ASIL accelerators in a vehicle, coupled to the powertrain network.

Our industry is still struggling to build safe, scalable SW and HW platforms and a programming paradigm that hold accelerators and related SW to the same ASIL and ISO 26262 standards as traditional automotive ECU HW/SW.
So yeah, I am with you on this. If you recollect, everyone was so adamant about having L5 autonomy by 2021. Right now the industry is struggling with L3, let alone L5. It will happen, just not yet.
Do you work for Uber?
 
  • Haha
Reactions: DisEnchantment

mindless1

Diamond Member
Aug 11, 2001
8,021
1,432
126
It's pointless to talk about AI. That has little to do with nVidia et al.'s processing hardware; it's nearly 100% about the software and sensors to support it. It'll take much longer to mature than CPU/GPU hardware did.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
BTW, I would love to be wrong about level 5 self-driving, and Elon Musk seems hell-bent on proving it.

A little while ago, Tesla was having serious problems meeting demand, and an article came out saying that rather than using tried-and-true production methods, they wanted to revolutionize the space, and a lot of that involved much more automation.

Later, Elon Musk came out and said too much automation was the culprit and that it had hindered their production. He ended up hiring more people.

Since he's among the many who have talked about AI doomsday scenarios, you can see why he ran the company that way. He actually did achieve a lot, and that's due to his attitude that probably nothing is impossible. But reality does come knocking at your door and bring you back down to earth every now and then.
 
  • Like
Reactions: Elfear

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
GPU wise nVIDIA is unbeatable but CPU wise...
Nvidia's share price isn't high due to PC sales; the PC market is never going to be a huge growth area. No, Nvidia has a silly share price due to speculation that they will do well in other areas. Those other areas often don't use x86, ARM being the most common (because unlike x86, ARM allows you to build whatever specialist chip you like, with all the extra modules and functionality you want).

Nvidia has plenty of experience there. Not just in making things with ARM CPUs in them, but in building AI infrastructure, interconnect, video encode/decode, spatial 3D processing, server farms, supercomputers. All of these are key parts of what's gonna matter in the future. And it's not just the hardware that matters; just as important is that Nvidia has the software to go with it. It's the combination of all of that which has given them the silly share price.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Before saying nVidia is unbeatable, we need to see what nVidia and AMD release later this year.

If they have a Ryzen-style jump in performance in the GPU space, they are going to be slaughtered. With nVidia getting a full node drop too (and we have a history there), to make their performance claims come true AMD is going to need *well* over 100% uplift in performance over the 5700 XT.

Beating Intel when they are stuck on an antiquated node is, quite frankly, child's play compared to what is expected of their GPU division this time out.

Can they do it? We'll see, but RDNA was shockingly bad on an engineering basis(that is *Very* different than being a bad end product) so RDNA 2 is going to need to be a bigger jump than the 8800GTX or 9700Pro to match the claims.
 

Elfear

Diamond Member
May 30, 2004
7,094
632
126
If they have a Ryzen-style jump in performance in the GPU space, they are going to be slaughtered. With nVidia getting a full node drop too (and we have a history there), to make their performance claims come true AMD is going to need *well* over 100% uplift in performance over the 5700 XT.

Beating Intel when they are stuck on an antiquated node is, quite frankly, child's play compared to what is expected of their GPU division this time out.

Can they do it? We'll see, but RDNA was shockingly bad on an engineering basis(that is *Very* different than being a bad end product) so RDNA 2 is going to need to be a bigger jump than the 8800GTX or 9700Pro to match the claims.

It will help that AMD has some wiggle room in die size. I'm not sure what TSMC's 7nm node is capable of with decent yields, but the 5700 XT is only 250mm². If they can scale it up to anywhere close to Turing size, combined with IPC and clock speed increases, we should see some great competition in the GPU space. At least, that's the hope.