Intel reveals more details of its neuromorphic chip, Loihi

Dayman1225

Senior member
Aug 14, 2017
#1
First, let's head over to WikiChip for a die shot, floor plan, and some more info:

WikiChip said:
Loihi is fabricated on Intel's 14 nm process and has a total of 130,000 artificial neurons and 130 million synapses. In addition to the 128 neuromorphic cores, there are 3 managing Lakemont cores.
The die measures a mere 60 mm² and packs 2.07 billion transistors.
WikiChip said:
The chip itself implements a fully asynchronous many-core mesh of 128 neuromorphic cores. The network supports a wide variety of artificial neural network topologies such as RNNs, SNNs, sparse, hierarchical, and various others, where each neuron in the chip is capable of communicating with thousands of other neurons.

There are 128 neuromorphic cores, each containing a "learning engine" that can be programmed to adapt the network parameters during operation, such as spike timings and their impact. This makes the chip more flexible, as it allows various paradigms such as supervised/unsupervised and reinforcement/reconfigurable learning without mandating any particular approach. The choice of higher flexibility is intentional, in order to defer architectural decisions that could prove detrimental to research.
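
For anyone wondering what "spiking" neurons and programmable spike-timing learning rules actually look like, here is a minimal, generic sketch of a leaky integrate-and-fire neuron with a toy pre-before-post weight update. To be clear, this is not Loihi's programming model or Intel's SDK; the names, constants, and the update rule below are all illustrative assumptions, only meant to show the kind of dynamics these cores implement in hardware.

Code:
# Minimal leaky integrate-and-fire (LIF) neuron with a toy spike-timing
# weight update. Generic SNN illustration only -- NOT Loihi's actual
# programming interface; all names and constants here are made up.
import random

DT = 1.0           # timestep (ms)
TAU = 20.0         # membrane time constant (ms)
V_THRESH = 1.0     # firing threshold
V_RESET = 0.0      # membrane potential after a spike
LEARN_RATE = 0.01  # toy plasticity rate

def simulate(pre_spike_prob=0.3, steps=200):
    """Drive one LIF neuron from a single plastic input synapse."""
    v = 0.0          # membrane potential
    w = 0.5          # synaptic weight, updated online
    last_pre = None  # time of the most recent presynaptic spike
    out_spikes = []

    for t in range(steps):
        pre_spike = random.random() < pre_spike_prob
        if pre_spike:
            last_pre = t

        # Leaky integration: decay toward rest, add weighted input on a spike.
        v += DT * (-v / TAU) + (w if pre_spike else 0.0)

        if v >= V_THRESH:
            out_spikes.append(t)
            v = V_RESET
            # Toy "pre-before-post" rule: strengthen the synapse more the
            # closer the last input spike was to this output spike.
            if last_pre is not None:
                w = min(1.0, w + LEARN_RATE * max(0.0, 1.0 - (t - last_pre) / TAU))

    return out_spikes, w

if __name__ == "__main__":
    spikes, weight = simulate()
    print(f"output spikes: {len(spikes)}, final weight: {weight:.3f}")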

Now enter Intel's recent announcement here: they have launched a Neuromorphic Research Community to advance the 'Loihi' test chip.

Intel claims that its version of "Hello World" is scanning and training Loihi to recognize 3D objects from multiple angles; they say this uses less than 1% of Loihi, draws tens of milliwatts, and takes only a few seconds.

They have also sent out development boards to researchers who are working on a variety of applications, including sensing, motor control, information processing, and more. They also say software development is a key focus and that they look forward to demonstrating much larger-scale applications for Loihi.


Intel's 'Loihi' Neuromorphic Chip in the Lab - YouTube

In this video they show the Loihi chip being trained on an object (a bobblehead) in under 5 seconds, after which it is able to recognize it from any side.

If you want more details on the Loihi architecture, Intel has published them over at IEEE.
 

ClockHound

Senior member
Nov 27, 2007
#2
Probably thousands of man-hours of research and millions of dollars, and soon I'll be able to catalog my vast bobblehead collection without all the tedious looking at the stupid things. This is the kind of progress that will make Marvin the bobblehead AI personal servant possible. The future just keeps on looking to the future...
 

NostaSeronx

Platinum Member
Sep 18, 2011
#3
But can it white room rebuild Crysis (November 13, 2007)? Making a fully playable game from a new engine that emulates CryEngine2? That can be played on Android, Windows, Linux, MacOS, etc?
 
Mar 11, 2004
#5
But can it white room rebuild Crysis (November 13, 2007)? Making a fully playable game from a new engine that emulates CryEngine2? That can be played on Android, Windows, Linux, MacOS, etc?
Sure, but it'll also emulate you and your enjoyment of it. "NostaSeronx played the game. He was moderately entertained for a few hours."
 

krumme

Diamond Member
Oct 9, 2009
#6
But can it white room rebuild Crysis (November 13, 2007)? Making a fully playable game from a new engine that emulates CryEngine2? That can be played on Android, Windows, Linux, MacOS, etc?
Sure, the product is a pipecleaner.
 

EXCellR8

Platinum Member
Sep 1, 2010
#7
Neuromorphic you say?

Sounds like something Isaac Clarke would cross paths with...
 
Sep 19, 2000
#8
Nice.

It will likely be chips like this that power future self-driving cars and automated telemarketers.
 

Xpage

Senior member
Jun 22, 2005
#9
Only after it trains itself by smashing your car against the first couple hundred cars it doesn't recognize.

"Why would they put a gas burner on the road? That's a vintage... wait, the car doesn't recognize it?" *smash*
 

thecoolnessrune

Diamond Member
Jun 8, 2005
#10
Only after it trains itself by smashing your car against the first couple hundred cars it doesn't recognize.

"Why would they put a gas burner on the road? That's a vintage... wait, the car doesn't recognize it?" *smash*
That's probably the exact opposite of how it would play out if one of these were leveraged for such an application. One of the larger factors in humanity's progression is our growth in accumulated knowledge, set against the relatively static ability of humans to improve how they learn. Generation after generation resets the clock. Each person is required from birth to re-learn nearly everything of their ancestors and then some more. While we've developed better and better methods for instilling this knowledge into people more and more quickly, we're talking a timetable of decades, and a not insignificant portion of the population won't even make it to the point of acquiring and leveraging newfound knowledge.

Back to something like this: training would take time, but it gets a jump start by immediately absorbing training from some of our minds. Then it takes off. More units learn more things, and these units feed back to learn from each other. They get better and better at adapting. When their "generation" dies, their descendants immediately take advantage of all previously acquired training at inception. This is why machine learning is such a big deal. Until we can figure out a way to directly program knowledge into descendant brains, it's one of our more likely chances of carrying over learned knowledge, experience, and application of learning to levels that generational studies of people volunteering to work in such a field would take decades and decades to match.
 
Apr 27, 2000
#13
Interesting to see Lakemont pop up here.
 
Feb 23, 2017
#14
Judgement Day is upon us, though a few years later than predicted in The Terminator.
 
Oct 14, 2003
#16
Judgement Day is upon us, though a few years later than predicted in The Terminator.
Eh.

I think an AI researcher said that rather than machines outsmarting us and getting us into trouble, it will be our over-reliance on machines that are stupid that does so instead.
 

ericlp

Diamond Member
Dec 24, 2000
#18
Damned if you do... damned if you don't. Intel dabbling in AI... and, as usual, late to the party.
 

coercitiv

Diamond Member
Jan 24, 2014
#19
I think an AI researcher said that rather than machines outsmarting us and getting us into trouble, it will be our over-reliance on machines that are stupid that does so instead.
To paraphrase "somebody" on the Internet: if it's in Python then it's machine learning, if it's in PowerPoint then it's AI.
 

maddie

Platinum Member
Jul 18, 2010
#20
That's probably the exact opposite of how it would play out if one of these were leveraged for such an application. One of the larger factors in humanity's progression is our growth in accumulated knowledge, set against the relatively static ability of humans to improve how they learn. Generation after generation resets the clock. Each person is required from birth to re-learn nearly everything of their ancestors and then some more. While we've developed better and better methods for instilling this knowledge into people more and more quickly, we're talking a timetable of decades, and a not insignificant portion of the population won't even make it to the point of acquiring and leveraging newfound knowledge.

Back to something like this: training would take time, but it gets a jump start by immediately absorbing training from some of our minds. Then it takes off. More units learn more things, and these units feed back to learn from each other. They get better and better at adapting. When their "generation" dies, their descendants immediately take advantage of all previously acquired training at inception. This is why machine learning is such a big deal. Until we can figure out a way to directly program knowledge into descendant brains, it's one of our more likely chances of carrying over learned knowledge, experience, and application of learning to levels that generational studies of people volunteering to work in such a field would take decades and decades to match.
Quote:
"Each person is required from birth to re-learn nearly everything of their ancestors and then some more."

If only this were true, what a wonderful world it would be.
 
Oct 14, 2003
#21
Too many people are allowing too many sci-fi movies to influence how they view the real world.
 

firewolfsm

Golden Member
Oct 16, 2005
#22
About 9,000 of these would approximate the number of neurons in the human brain, for reference.
 
Oct 14, 2003
#23
The average human brain is said to have around 100 billion neurons. Just by that number you would need roughly 770,000 of these chips, not 9,000.
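
A quick back-of-the-envelope check, using the 130,000-neuron figure from the WikiChip quote in the first post. This counts neurons only and ignores synapses, connectivity, and dynamics, so treat it as a rough ballpark:

Code:
# Rough chip-count estimate from neuron counts alone.
neurons_per_chip = 130_000   # Loihi figure quoted in the first post
brain_neurons = 100e9        # commonly cited ~100 billion human neurons
chips_needed = brain_neurons / neurons_per_chip
print(f"{chips_needed:,.0f} chips")   # prints 769,231 -> roughly 770,000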
 

Shamrock

Golden Member
Oct 11, 1999
#24
Will still have an NSA backdoor, and will be vulnerable to Spectre.
 

