Question: An example of how Nvidia is a cancer for graphics technology progress


gdansk

Diamond Member
Feb 8, 2011
4,205
7,050
136
10 most valued US companies and their P/E ratio:
Apple: 34.6
Nvidia: 10.3 (had 10:1 stock split this year, that's why so low)
Microsoft: 35.8
Google: 31.4
Amazon: 96.4
Meta: 51.1
Berkshire Hathaway: 11.6
Broadcom: 7.53
Lilly: 170
Tesla: 64.2
Just

Some think a correction will happen, but I think that's pretty optimistic. It could very well end up in something much worse.
While AMD isn't big enough for its collapse to obliterate index funds (like NVDA could), it has a P/E ratio of 200.
Worth noting anyway.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,696
3,260
136
While AMD isn't big enough for its collapse to obliterate index funds (like NVDA could), it has a P/E ratio of 200.
Worth noting anyway.
It's possible there are companies with even higher P/E ratios, but I am too lazy to check every single one of them, so I limited it to the top 10 most highly valued companies.

edit: AMD had a P/E ratio as high as 1285 a year ago.
 
Last edited:

TESKATLIPOKA

Platinum Member
May 1, 2020
2,696
3,260
136

ARM's is over 300.
It looks like the site I originally took the P/E ratios from is not very accurate.
That one showed a P/E ratio of 7.53 for Broadcom, while your site shows 157.67!

So let's check it once more according to your site:
Apple -> Capitalization: $3.49 trillion ; P/E: 34.63
Nvidia -> Capitalization: $3.31 trillion ; P/E: 63.23
Microsoft -> Capitalization: $3.09 trillion ; P/E: 35.25
Alphabet -> Capitalization: $2.03 trillion ; P/E: 23.60
Amazon -> Capitalization: $1.98 trillion ; P/E: 45.17
Meta -> Capitalization: $1.49 trillion ; P/E: 30.11
Berkshire Hathaway -> Capitalization: $994.45 billion ; P/E: 14.64
Broadcom -> Capitalization: $847.62 billion ; P/E: 157.67
Eli Lilly -> Capitalization: $885.83 billion ; P/E: 114.79
Tesla -> Capitalization: $695.79 billion ; P/E: 61.18

P.S. Even Intel has a P/E ratio of 102.44.
Nvidia alone is overvalued by ~$2.2 trillion and is the most likely to have its stock crash after the AI bubble ends.
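
To show where a number like that can come from, here's a back-of-envelope sketch; the "fair" multiple of 20 is my own assumption, not something from the table:

```cpp
#include <iostream>

int main() {
    // Figures from the table above.
    double nvda_cap = 3.31e12;  // market capitalization, dollars
    double nvda_pe  = 63.23;    // P/E ratio

    // Hypothetical "fair" multiple, roughly in line with the megacap
    // peers above. This is an assumption, not a number from the post.
    double fair_pe = 20.0;

    // Earnings = cap / P/E, so repricing the same earnings at the fair
    // multiple scales the cap by fair_pe / nvda_pe.
    double fair_cap = nvda_cap * (fair_pe / nvda_pe);  // ~$1.05 trillion
    double excess   = nvda_cap - fair_cap;             // ~$2.26 trillion

    std::cout << "overvaluation ~ $" << excess / 1e12 << " trillion\n";
    return 0;
}
```

Pick a different fair multiple and the "overvaluation" moves accordingly; the point is only that a ~$2.2 trillion figure is consistent with assuming Nvidia should trade around a 20x multiple.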
 
Last edited:

MrTeal

Diamond Member
Dec 7, 2003
3,916
2,700
136
10 most valued US companies and their P/E ratio:
Apple: 34.6
Nvidia: 10.3 (had 10:1 stock split this year, that's why so low)
Microsoft: 35.8
Google: 31.4
Amazon: 96.4
Meta: 51.1
Berkshire Hathaway: 11.6
Broadcom: 7.53
Lilly: 170
Tesla: 64.2
Just

Some think a correction will happen, but I think that's pretty optimistic. It could very well end up in something much worse.
Isn't P/E price per share over earnings per share, and thus it wouldn't matter whether the stock split or not?
 

gdansk

Diamond Member
Feb 8, 2011
4,205
7,050
136
Isn't P/E price per share over earnings per share, and thus it wouldn't matter whether the stock split or not?
Yes, a split doesn't matter. P/E is price per share / earnings per share, and a split divides both by the same factor.
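
A quick sketch of that cancellation, with made-up numbers (purely illustrative, not real NVDA figures):

```cpp
#include <iostream>

int main() {
    // Made-up pre-split figures, purely illustrative.
    double price = 1200.0;  // price per share
    double eps   = 19.0;    // earnings per share

    // A 10:1 split gives each holder 10x the shares, so price per share
    // and earnings per share both drop by the same factor of 10.
    double pe_before = price / eps;
    double pe_after  = (price / 10.0) / (eps / 10.0);

    std::cout << pe_before << " == " << pe_after << "\n";  // the 10s cancel
    return 0;
}
```

So a split can't explain a low P/E; the 10.3 figure was just a bad data source.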
 

Thunder 57

Diamond Member
Aug 19, 2007
3,805
6,413
136
Sure wish I had told my wife eff u when AMD was 1.62 and I could have dropped 5 or 10k into it.

I wish I'd picked some up even at $5-10. Just use it as an excuse whenever she wants something? "Why can't we travel here and there and have a great time?" "Because you didn't give me wife dollars (I've heard this term) to buy a stock that skyrocketed!" Of course that will probably end in divorce, so maybe don't do that.
 

DaaQ

Golden Member
Dec 8, 2018
1,903
1,355
136
I wish I'd picked some up even at $5-10. Just use it as an excuse whenever she wants something? "Why can't we travel here and there and have a great time?" "Because you didn't give me wife dollars (I've heard this term) to buy a stock that skyrocketed!" Of course that will probably end in divorce, so maybe don't do that.
LOL rinse and repeat.
 
  • Haha
Reactions: igor_kavinski

marees

Golden Member
Apr 28, 2024
1,249
1,803
96
In an Acquired podcast recorded at NVIDIA's GTC 2025 venue, Gelsinger was asked how NVIDIA's CEO, Jensen Huang, came to realize how big AI actually is. In response, he not only said that Jensen got lucky with AI, but also argued that NVIDIA's current hardware stack is too expensive for AI inferencing workloads.


Intel’s Former CEO Pat Gelsinger Claims NVIDIA’s AI GPUs Are 10,000 Times More Expensive Than What Is Needed For AI Inferencing


https://wccftech.com/intel-former-c...nsive-than-what-is-needed-for-ai-inferencing/
 

marees

Golden Member
Apr 28, 2024
1,249
1,803
96
not only did he say that Jensen got lucky with AI,
Old headline:

Intel CEO laments Nvidia's 'extraordinarily lucky' AI dominance, claims it cuda-wuda-shuda have been Intel


Pat Gelsinger: Nvidia Got 'Extraordinarily Lucky' in Dominating the AI Market

If it hadn't been for that meddling company, Intel would be rich by now!

Things would be completely different if only Intel hadn't cancelled the Larrabee GPU.

https://www.extremetech.com/computi...aordinarily-lucky-in-dominating-the-ai-market

https://www.pcgamer.com/intel-ceo-l...ims-it-coulda-woulda-shoulda-have-been-intel/

During a broad-ranging discussion at MIT, Gelsinger was asked what Intel is doing to drive the development of AI.

Gelsinger's comments came in response to a professor who asked what Intel was doing along the lines of AI hardware. This query prompted Gelsinger to recap Intel's ill-fated history with GPUs and "throughput computing" (as opposed to scalar), where he noted that when Intel pushed him out of the company 11 years ago, it also cancelled its discrete GPU project named Larrabee. According to Gelsinger, if the company had stuck with that project, it would be Intel at the apex of the AI industry right now. Instead, Nvidia finds itself at the helm, which Gelsinger says results from Nvidia CEO Jensen Huang getting "extraordinarily lucky."


Gelsinger also claimed things would have been very different had Intel not cancelled the Larrabee project shortly after he left for an 11-year stint outside of Intel before returning as CEO in February 2021.

"When I was pushed out of Intel 13 years ago, they killed the project that would have changed the shape of AI," Gelsinger said of Larrabee.

Larrabee was an Intel GPU project from long before its current Arc graphics cards, intended to go head-on with Nvidia in the gaming and GPGPU markets courtesy of scores of tiny x86 CPU cores. The gaming graphics cards were cancelled in late 2009 and the rest of the Larrabee project withered thereafter.
 

jpiniero

Lifer
Oct 1, 2010
16,492
6,983
136
I don't think it was luck...

What was luck was that AI became the only thing that the entire financial industry cared about.
 
  • Like
Reactions: moinmoin
Jul 27, 2020
26,010
17,948
146
Intel’s Former CEO Pat Gelsinger Claims NVIDIA’s AI GPUs Are 10,000 Times More Expensive Than What Is Needed For AI Inferencing
Yup. Nvidia's price gouging is holding back AI advancements (can't believe I'm agreeing on something with Pat).

Here's what's happening:

1. The rich control the GPU supply.
2. Anyone who has a bright idea on how AI could be more beneficial obviously needs hardware to test their idea on.
3. Since they can't afford the hardware, they apply for a job at some AI firm, aka rich people with access to expensive GPUs.
4. The brilliant idea ends up making the inventor maybe just enough to retire or create their own startup, while the fat cats get even richer and buy up even more GPU hardware.
5. And the cycle continues.

This is what governments were supposed to prevent from happening by making things fair for everyone.

SURPRISE!

The rich control the governments now too. Congratulations. Capitalism working as intended.

One thing we need to ask ourselves: "Would there be a PC on every desk in modern offices had IBM prevented their x86 PC design from being cloned by other manufacturers?"

What Nvidia is doing is just preventing humanity from solving its problems faster. They need to be taken down by a consortium of AI players with open standards.
 
  • Like
Reactions: marees

GTracing

Senior member
Aug 6, 2021
478
1,112
106
Old headline:

Intel CEO laments Nvidia's 'extraordinarily lucky' AI dominance, claims it cuda-wuda-shuda have been Intel


Pat Gelsinger: Nvidia Got 'Extraordinarily Lucky' in Dominating the AI Market

If it hadn't been for that meddling company, Intel would be rich by now!

Things would be completely different if only Intel hadn't cancelled the Larrabee GPU.

https://www.extremetech.com/computi...aordinarily-lucky-in-dominating-the-ai-market

https://www.pcgamer.com/intel-ceo-l...ims-it-coulda-woulda-shoulda-have-been-intel/

During a broad-ranging discussion at MIT, Gelsinger was asked what Intel is doing to drive the development of AI.

Gelsinger's comments came in response to a professor who asked what Intel was doing along the lines of AI hardware. This query prompted Gelsinger to recap Intel's ill-fated history with GPUs and "throughput computing" (as opposed to scalar), where he noted that when Intel pushed him out of the company 11 years ago, it also cancelled its discrete GPU project named Larrabee. According to Gelsinger, if the company had stuck with that project, it would be Intel at the apex of the AI industry right now. Instead, Nvidia finds itself at the helm, which Gelsinger says results from Nvidia CEO Jensen Huang getting "extraordinarily lucky."


Gelsinger also claimed things would have been very different had Intel not cancelled the Larrabee project shortly after he left for an 11-year stint outside of Intel before returning as CEO in February 2021.

"When I was pushed out of Intel 13 years ago, they killed the project that would have changed the shape of AI," Gelsinger said of Larrabee.

Larrabee was an Intel GPU project from long before its current Arc graphics cards, intended to go head-on with Nvidia in the gaming and GPGPU markets courtesy of scores of tiny x86 CPU cores. The gaming graphics cards were cancelled in late 2009 and the rest of the Larrabee project withered thereafter.
Direct link to his comments:
It was posted Dec 14, 2023. Must be a slow news cycle.

Question: I think we're going to need increasingly more of that, because AI is here. AI is revolutionizing the marketplace. With ChatGPT, AI has become democratized. It's in everyone's pockets, but in order for AI to do its magic it needs the platforms; it needs the computers. My question is, what is Intel doing for the development of AI hardware and how do you see that as a competitive advantage?
Pat: Yeah, one of the things. A little bit of my own personal story. I was, I started at Intel so young, I went through puberty there. I grew up at the feet of Grove, Noyce and Moore. 30 years with the company, then I took an 11 year vacation to EMC and VMware--which by the way, I was outside of Intel almost to the day the same time Steve Jobs was out of Apple. It's like the death of the vision. I wanted to be the CEO of Intel but when I was pushed out of the company 13 years ago, they killed a project that would have changed the shape of AI. I had a project underway--it was called Larrabee at the time--to do high throughput computing in the x86 architecture. They killed it after I left.

Nvidia had an unmerited potential in the space of AI hardware, essentially throughput computing versus scalar computing. Nobody put pressure on them. And Jensen, by the way, he's a really good guy. I know him super well. He worked super hard at owning throughput computing primarily for graphics initially and then he got extraordinarily lucky. You know, they didn't even want to support their first AI project. Right. Big FLOPS machines were needed for these big data sets that they were looking at. Intel did nothing in the space for 15 years and now as I come I have a passion.

Okay, we're going to start showing up in that space and our strategy here is, number one, democratize AI because today you see the collision of these high performance computing AI for these 100 billion, soon going to trillion, parameter models that we're training on that take 30, 60, 90 days for training. How many of those can you do? It is the ultimate of high performance computing, weather modeling, nuclear simulation, high end training. You can't afford it. So what are we going to do? We want to democratize AI. We want to make it available to every application developer. You can put it under your desk. It's in your laptop. Our next generation client products will have 20 TOPS in a standard mediocre PC. It's 20 TOPS. That was more than was possible on a high-end supercomputer about 15 years ago. So democratize it. Also open up the software stack. For industry standards, get away from the proprietary technologies. We're driving a technology called SYCL, an open standardized parallel C so that we can eliminate proprietary technologies like CUDA. And we begin to offer our high-end computing chips to compete in that high-end training space as well. Build it into our standard Xeon processors to truly make AI available for any and all. Because it's the most important active workload today.

And it's evolving like crazy. I helped to create IEEE p854 which was the floating point standard at 64 and 80 bits. Now we're talking about fp8. I mean it's like what! Such trivial low-end data types are suitable because instead of big vectors let's create little things that are guessing at the right answer, and AI is just sort of a probabilistic guess for most things. So it's changing rapidly, sparse matrices, different algorithmic models, breakthroughs happening rapidly. And if we think about AI, essentially you look at something like ChatGPT, incredible technology, but it's on the simplest data set, text. We're going to start getting to some hard datasets going forward, which are going to be orders of magnitude more challenging than the ones that we're solving today with AI.
So I think we have at least a decade, probably two decades, of just sheer innovation in front of us, and all of it's going to need extraordinary amounts of data, extraordinary amounts of networking, and big honking compute. So we're going to build lots of fabs so we can build lots of compute.
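
As an aside, since SYCL comes up here: for anyone curious what that "open standardized parallel C" looks like in practice, below is a minimal SYCL vector-add sketch. It assumes a SYCL 2020 compiler (e.g. Intel's oneAPI DPC++); it's the open-standard counterpart of a basic CUDA kernel.

```cpp
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
    const size_t n = 1024;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    sycl::queue q;  // picks a default device: a GPU if present, else the CPU
    {
        // Buffers wrap the host vectors; the runtime moves data as needed.
        sycl::buffer<float> A(a.data(), sycl::range<1>(n));
        sycl::buffer<float> B(b.data(), sycl::range<1>(n));
        sycl::buffer<float> C(c.data(), sycl::range<1>(n));

        q.submit([&](sycl::handler& h) {
            sycl::accessor pa(A, h, sycl::read_only);
            sycl::accessor pb(B, h, sycl::read_only);
            sycl::accessor pc(C, h, sycl::write_only, sycl::no_init);
            // One work-item per element, like one CUDA thread per element.
            h.parallel_for(sycl::range<1>(n), [=](sycl::id<1> i) {
                pc[i] = pa[i] + pb[i];
            });
        });
    }  // buffer destructors copy the results back to the host vectors

    std::cout << c[0] << "\n";  // prints 3
    return 0;
}
```

Same single-source style as CUDA C++, but it's a Khronos standard that targets any vendor's hardware rather than one company's toolchain.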

A few unrelated thoughts:
  • I don't think his comments about Nvidia being "lucky" are out of line. No one, not even Nvidia, knew how big AI was going to be.
  • His comments about democratizing AI seem a bit hypocritical, since Intel (and AMD) profit immensely from the proprietary x86 ISA. I have no doubt that Intel would've been just as greedy if they were the ones who got "lucky".
  • Even if Larrabee had seen the light of day, chances are one of its successors would've been cancelled in the 10 years before AI came along. Though if it had never been cancelled, Gelsinger would've looked like Nostradamus.