More than a dozen research groups from government, academia and the corporate world have joined Intel to help advance its neuromorphic computing technology.
Kapoho Bay, a USB-based system powered by the Loihi neuromorphic chip. Source: Intel
Intel’s initial foray into this area is based on Loihi, a neuromorphic test processor unveiled in September 2017. Fabricated on Intel’s 14nm process technology, the chip implements 130,000 neurons and 130 million synapses in digital form, and each neuron can communicate with thousands of others. Under the covers, Loihi is organized as a manycore mesh, with each core housing a learning engine that can support a variety of machine learning models – supervised, unsupervised, reinforcement, and others. Intel claims the chip is up to 1,000 times more energy-efficient than conventional processors when training neural networks.
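The basic unit Loihi implements in silicon is the spiking neuron, which accumulates input over time and fires only when a threshold is crossed. A minimal leaky integrate-and-fire (LIF) sketch in Python conveys the idea; the constants here are illustrative only and do not reflect Loihi's actual neuron model or parameters.

```python
# Illustrative leaky integrate-and-fire (LIF) neuron, the basic unit of a
# spiking neural network. Constants are made up for demonstration; they are
# not Loihi's real parameters.

def simulate_lif(input_current, v_thresh=1.0, leak=0.9):
    """Integrate input over discrete time steps, leaking charge each step,
    and emit a spike (1) whenever the membrane potential crosses threshold."""
    v = 0.0
    spikes = []
    for i in input_current:
        v = leak * v + i          # leaky integration of input current
        if v >= v_thresh:
            spikes.append(1)      # fire a spike
            v = 0.0               # reset membrane potential after spiking
        else:
            spikes.append(0)
    return spikes

# A steady sub-threshold input makes the neuron fire periodically.
print(simulate_lif([0.3] * 10))
```

Because neurons stay silent until their potential crosses threshold, activity (and hence energy use) is event-driven rather than continuous, which is the intuition behind the efficiency claims above.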
The collaboration announced this week will kick off 50 neuromorphic research projects to be funded by Intel. The participating universities for this work include University of Bern; University of California, Berkeley; University of California, San Diego; Cornell University; University of Göttingen; TU Graz; Harvard University; TU Munich; Radboud University; University of Tennessee; and Villanova University.
All of the research will be performed under the umbrella of the Intel Neuromorphic Research Community (INRC), an organization dedicated to building a software/hardware ecosystem around Intel’s neuromorphic computing technology. At this point, community members will use Loihi as the basis of their R&D work, with the eventual goal of commercializing the platform.
One of these projects, at the Institute for Theoretical Computer Science at Technische Universität Graz, will attempt to map a new class of spiking neural network (SNN), known as the long short-term memory SNN, or LSNN, onto Loihi. LSNNs are said to achieve classification accuracies on par with those of state-of-the-art deep learning models based on long short-term memory (LSTM) networks. LSTMs themselves are used in areas such as speech recognition, natural language processing, robotic control, and human action recognition, among others. Early results from the TU Graz team show accuracies within a few percent of the ideal model.
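For readers unfamiliar with the LSTM networks that LSNNs aim to match, the core mechanism is a set of learned gates that control what a memory cell keeps, forgets, and exposes at each time step. The following single-unit sketch shows the standard gate equations with scalar weights for clarity; it is a generic textbook LSTM, not the TU Graz LSNN implementation.

```python
import math

# Illustrative single-unit LSTM cell step. Weights are scalars for clarity;
# the dict maps each gate name to (input weight, recurrent weight, bias).
# This is the standard LSTM formulation, not the TU Graz spiking variant.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, w):
    """Advance one time step on scalar input x, hidden state h, cell state c."""
    i = sigmoid(w["i"][0] * x + w["i"][1] * h + w["i"][2])    # input gate
    f = sigmoid(w["f"][0] * x + w["f"][1] * h + w["f"][2])    # forget gate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h + w["o"][2])    # output gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h + w["g"][2])  # candidate value
    c_new = f * c + i * g          # cell state carries long-term memory
    h_new = o * math.tanh(c_new)   # hidden state is the gated output
    return h_new, c_new

# Run a short input sequence through the cell with toy weights.
w = {k: (0.5, 0.5, 0.0) for k in "ifog"}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 1.0]:
    h, c = lstm_step(x, h, c, w)
```

The challenge for an LSNN is to reproduce this gated, long-range memory behavior using spiking neurons that communicate only through discrete events.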
Another partner, Applied Brain Research Inc. (ABR), has provided some initial benchmarking of Loihi on real-time streaming-data inference applications using TensorFlow-trained networks. The results suggest the Intel chip can deliver 2x to 50x better energy efficiency than conventional hardware on these workloads. The benchmarks were implemented with ABR’s Nengo DL, a neural simulator that has been ported to the Loihi architecture; the software is freely available for research use.
In other work at Cornell University, neuromorphic algorithms running on Loihi for signal restoration and identification have demonstrated state-of-the-art learning and classification performance on chemosensor data sets. “These algorithms were derived from mechanistic studies of the mammalian brain’s olfactory circuits, but I anticipate that in generalized form, they will be applicable to a range of similar computational problems such as air and water quality assessment, cancer screening, and genomic expression profiling,” said Professor Thomas Cleland.
Loihi is also available in a USB form factor, known as “Kapoho Bay,” for INRC members who require direct access to hardware. Besides providing a USB hookup to Loihi, Kapoho Bay offers an interface to a silicon retina camera from iniVation, a specialist in neuromorphic vision systems. INRC researchers who don’t require a direct hookup to the hardware can get access to it via Intel’s neuromorphic cloud.
On the software side, Intel has released early versions of the Loihi software development kit, which researchers can use to develop applications and supporting libraries. The expectation is that all of this will lead to a growing body of software for Loihi, especially foundational algorithms that can be applied to further research and, eventually, real-world applications. Both the software and the research are expected to be released into the public domain.