Question: Intel Corp CEO Pat Gelsinger on the AI revolution.

Jul 27, 2020
13,143
7,810
106
iTANIC would have been a far better name.

All the rich people would have jumped on it.

And gone down with it :D
 

Lucky guy!
 
The natural progression would have been Ruiz kicked out by the board and Jensen installed.
That would have prevented the Xbox and PlayStation from sharing the same x86 internals.

The main reason AMD won both console makers over was Lisa's "success" with the PlayStation 3 (she got IBM, Toshiba, and Sony to collaborate on the Cell CPU). While the Cell was too ambitious for its time and a pain to program, Lisa likely saw the upside of getting thousands of experienced x86 programmers interested in programming x86 consoles too, and made it happen. It sucks that she's been lured by the Dark Side of enterprise profits...
 

A///

Diamond Member
Feb 24, 2017
4,352
3,151
136
That would have prevented the Xbox and PlayStation from sharing the same x86 internals.

The main reason AMD won both console makers over was Lisa's "success" with the PlayStation 3 (she got IBM, Toshiba, and Sony to collaborate on the Cell CPU). While the Cell was too ambitious for its time and a pain to program, Lisa likely saw the upside of getting thousands of experienced x86 programmers interested in programming x86 consoles too, and made it happen. It sucks that she's been lured by the Dark Side of enterprise profits...
No, it wouldn't have. Nvidia and Microsoft had a bit of a falling out over the original Xbox, so Microsoft chose what was then ATI Technologies, Nvidia's rival, for the 360. Historically, while the two were rivals, Nvidia had more successes than ATI. AMD overpaid for ATI; both parties said as much years after the fact. PlayStation and Xbox could have gone to ATI in an alternate timeline, but they would have tried AMD first, because Nvidia would have ceased to exist, having been absorbed by AMD. Su may have come back to x86 or she never would have; take her out of the timeline or put her in, it doesn't matter. You're assuming Microsoft or Sony wouldn't have opted for AMD when their problem, along with other companies' problem, was Nvidia. In that timeline AMD wouldn't have faltered so much from overpaying for ATI. Jensen, now or then, was a much more capable CEO than Hector Ruiz.

Enterprise subsidizes costs for the lower tiers; that's how it was at Intel. Sell 10 Epycs for the same margins you'd get selling a couple hundred Ryzens, for example (not a true example, but an example). If Microsoft really gave a crap, they wouldn't have entered a GeForce Now partnership with Nvidia. You'll know things are really messed up if Apple ever comes crawling back to x86, or asks Nvidia or AMD to help them design a new GPU. Apple's hardware is impressive, but their engineers love jumping ship for new challenges, and also because Apple's pay is not competitive. Hardware engineers are underpaid across the board for the amount of work they put in. There are some BS arguments about it, but there's no reason a 20-year veteran in hardware engineering should be paid maybe $300K total at some companies while a zit-faced moron who's three years out of a good college is making that much starting and will likely triple their yearly totals within a decade. It's all perverse.
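The "10 Epycs vs. a couple hundred Ryzens" claim above can be put into toy numbers. Every price and margin rate below is made up purely for illustration; none are real AMD figures.

```python
# Hypothetical numbers, purely to illustrate the margin claim above;
# none of these are real AMD prices or margins.
epyc_price, epyc_margin_rate = 8_000, 0.60    # assumed server ASP and margin
ryzen_price, ryzen_margin_rate = 300, 0.40    # assumed desktop ASP and margin

epyc_profit = 10 * epyc_price * epyc_margin_rate      # 10 server chips
ryzen_profit = 400 * ryzen_price * ryzen_margin_rate  # a few hundred desktop chips

print(epyc_profit, ryzen_profit)  # 48000.0 48000.0
```

Under these made-up numbers, ten server parts throw off the same gross margin dollars as four hundred desktop parts, which is the shape of the argument, whatever the real figures are.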
 
Jul 27, 2020
13,143
7,810
106
Hardware engineers are underpaid across the board for the amount of work they put in. There are some BS arguments about it, but there's no reason a 20-year veteran in hardware engineering should be paid maybe $300K total at some companies while a zit-faced moron who's three years out of a good college is making that much starting and will likely triple their yearly totals within a decade. It's all perverse.
Intellectuals get used by the morons everywhere. It seems to be the cool thing to do these days. Excuse me while I attend to a moron.
 

A///

Diamond Member
Intellectuals get used by the morons everywhere. It seems to be the cool thing to do these days. Excuse me while I attend to a moron.
Calling some or most of my colleagues, or myself, intellectuals is a stretch. To put it plainly, some are butt-hurt over the fact that a 30-year-old today working for a craphole like Meta or, I don't know, Amazon is clearing a million a year in total compensation. That feels unfair because they get to sit around at home in their pajama bottoms and play video games in between "work," which likely consists of copying and pasting prefabricated snippets of "code." It would be easier to be a woman in this field, have a faux affair, and get a large eight-figure settlement than to work your ass off until your 60s and then retire with a crap pension, shot eyesight, and back problems.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
27,396
17,057
146
Getting complaints about this thread going further off topic with every post. Get the discussion back on track or don't post in the thread.

Moderator DAPUNISHER.
 

cytg111

Lifer
Mar 17, 2008
22,342
12,079
136
I'll admit my mistake now. Your link said it, not you. But you can still take the credit--there is plenty of credit to go around and plenty of learning to do when it comes to AI.

"The Transformer is also able to parallelize its computations, which means that it can process multiple requests at the same time. This allows ChatGPT to respond to user queries very quickly."
Either way, transformer or not (if entropy is to be maintained, and I assume it is), you're going to need one set of neuron activations per layer per customer. The weights are static post-training, so it makes sense: you just need that one "neuron" context per layer executed. We don't know how many layers are in GPT, but suppose it were 6; then you'd be able to process 6 customers in parallel on the same network. And again, since the weights are static and by and large the most memory-intensive part of a network, there is no reason why you couldn't spawn a lot of networks with individual activations but sharing the same weights. That way you'd get parallelism both within the same network and across networks sharing weights.
(Maybe this is what a transformer does; I don't know yet.)
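A minimal sketch of the weight-sharing idea above, using a toy two-layer feed-forward network (all sizes, names, and values are made up for illustration): the weights are allocated once, and a whole batch of per-customer inputs flows through them in a single matrix multiply, so only the activations are per-customer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Static, post-training weights for a toy 2-layer network.
# These are shared by every request; only activations are per-request.
W1 = rng.standard_normal((8, 16))
W2 = rng.standard_normal((16, 4))

def forward(x):
    """Run one or more inputs through the shared weights.

    x has shape (batch, 8): each row is one customer's request,
    so a single matmul serves the whole batch in parallel.
    """
    h = np.maximum(x @ W1, 0.0)  # ReLU hidden layer
    return h @ W2

# Three "customers" processed in one pass over the same weights.
batch = rng.standard_normal((3, 8))
out = forward(batch)
print(out.shape)  # (3, 4)
```

The batch dimension gives parallelism within one copy of the network; spinning up more processes that memory-map the same weight arrays would give the "many networks sharing weights" variant described above.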
 

NTMBK

Lifer
Nov 14, 2011
10,191
4,884
136
Oh cool. A bunch of dark-silicon accelerators that won't get used, and will serve no purpose except press releases and propping up Intel's share price (and Gelsinger's compensation) for another year or two.

Meanwhile 90% of user time will be spent in unoptimized Electron apps.
 

A///

Diamond Member
Depending on how MTL does, and then Arrow Lake, it'll either make or break things for Pat "Gunslinger" Gelsinger. Whatever he signed off on won't affect Intel until the next guy, or maybe gal, is pushed in as CEO. I think this AI stuff is overblown. An LLM like ChatGPT is interesting, sometimes cool, but I don't see it being useful in the long run, especially for the havoc it causes when dumb people use it for the wrong dumb reasons. Negative press is negative press.
 
Mar 11, 2004
22,802
5,199
146
AI has two elements: training and inference. You are completely ignoring half of the picture.

Training requires massive amounts of power and data. That training will mostly be done in servers on the internet. Imagine things like teaching a computer what an image of a fish looks like--that will be done in the cloud.

But inference, where you use a model that the training created, often will not be done on servers. Things like asking PowerPoint to browse all of your computer photos to make a collage of photos of your father fishing for his memorial service--that is most likely done on individual devices. Focus the laptop camera only on your face (even more specifically, focus on your eyes) during a conference call--done on your laptop not the cloud. Getting photoshop to properly select the subject (all of it) and none of the background (none of it)--that will be done on your computer. Real-time voice translation from any language to any language works best on your device, especially if you are not in internet range. Having Microsoft Word write a cover letter summarizing your work is best done on your work desktop and not have all your sensitive work data sent to the cloud. Having Excel scan all your data to write your quarterly business report is best done in secret to avoid insider trading laws--that is on your computer not in the cloud. Etc.

You're missing the bigger picture, which is that the inferencing is needed so they can enforce their backdoor attempts at accessing data that people keep encrypted and/or off the cloud. Apple is literally already doing this, and I have a strong hunch that not only weren't they the first, but that others have been doing this under the guise of "training" their models. It gives them plausible deniability ("we didn't have anyone accessing your data, it was all done by AI!").

The ESRB wants people to upload selfies for age verification to be quantified via AI, but not to prevent minors from playing games they aren't supposed to (so what exactly is the age verification for?). And I'm guessing that if they haven't already been set up to, they have DRM primed to use AI for who knows what.
 

cytg111

Lifer
100%. I've been saying that for years: AI is going to be used to circumvent privacy laws, and it's not going to stop at Google++ either; think police and the intelligence community.
 

A///

Diamond Member
I don't suppose you have some proof of this?
He's one of those yobs who think Apple is selling your data to third parties despite them saying they don't. Apple doesn't, but it uses user data, as outlined in the link below, for its own metrics, and it uses those anonymized metrics to sell ad space only on its own platform. In 2022 Apple's ad revenue jumped from under $2B a year to nearly $20B. This is due to Google paying them more to be the default search engine, and also because advertisers dropped Meta and Google to go to Apple directly if they were targeting Apple consumers. You and I know the typical iPhone and MacBook Pro owner spends more on apps and services than the typical Android owner, and there's an income disparity too, with Apple product owners typically bringing in more than Android users at the normal-consumer level. Techies and execs... that's different.


Personally I don't care, because Apple is smart enough to minimise exposure of my habits to marketers who contract them directly, versus Meta or Google. My short experience with Android, which I've told you about in the past, was abysmal after the initial oohs and aahs. Google, try as they might, cannot polish that OS into something cohesive and performant. The lack of standardization means most Android devices are terrible and not worth the value of the parts they're made of, producing e-waste from the get-go. Long live Apple.