MIT researchers claim to have developed analog synapses that are a million times faster than those found in the human brain.
Just as a digital processor needs transistors, and our biological brain needs neurons and synapses to form a network of connections between different brain regions, an analog processor needs programmable resistors. According to the press release, these resistors can form networks of analog synapses and neurons when properly configured.
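To see why programmable resistors are the analog counterpart of synapses, it helps to sketch the basic trick behind such hardware: storing each network weight as a resistor's conductance, so that applying input voltages makes the array compute a matrix-vector product directly through Ohm's and Kirchhoff's laws. The following is an illustrative simulation under that assumption, not code from the MIT work:

```python
import numpy as np

# Illustrative sketch: a crossbar of programmable resistors computes a
# matrix-vector product with physics. Each weight is stored as the
# conductance G[i][j] of one resistor; the input vector is applied as
# row voltages V, and the current collected on each column is, by
# Ohm's law (I = G * V) summed via Kirchhoff's current law, G^T @ V.
rng = np.random.default_rng(0)

G = rng.uniform(0.0, 1.0, size=(4, 3))   # conductances (siemens), one per resistor
V = np.array([0.2, 0.5, 0.1, 0.9])       # input voltages on the rows

I = G.T @ V   # column currents = the analog dot products

print(I)      # each entry is sum over i of G[i][j] * V[i]
```

In a physical crossbar this multiply-accumulate happens in a single step for all columns at once, which is the source of the speed and energy advantage the article describes.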
These analog synapses are not only incredibly fast but also highly efficient. That matters because as digital neural networks become more powerful, their energy requirements grow, significantly increasing their carbon footprint.
Achieving incredible speed
The researchers achieved nanosecond speeds—faster than biological synapses in the human brain—by eschewing traditional organic media in favor of a high-tech material: inorganic phosphosilicate glass (PSG).
“The speed certainly was surprising. Normally, we would not apply such extreme fields across devices, in order not to turn them into ash. But instead, protons shuttled across the device stack at immense speeds, about a million times faster than what we had before. And this movement doesn’t damage anything, thanks to the small size and low mass of the proton. It’s almost like teleporting,” says lead author and MIT postdoc Murat Onen.
Ju Li, senior author of the paper on the research and a professor of nuclear science and engineering, explained:
“Action potentials in biological cells rise and fall on a millisecond timescale, because the voltage difference of about 0.1 volts is constrained by the stability of water. Here we apply tens of volts across a special solid glass film of nanoscale thickness that conducts protons without permanently damaging it. And the stronger the field, the faster the ionic devices.”
Since PSG can withstand high voltages without breaking down, protons can travel at extraordinary speeds while the device remains extremely energy-efficient. Moreover, the material is widely available and easy to prepare, so it is not only the fastest option but also a practical one.
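The quoted voltages can be put in perspective with some back-of-envelope arithmetic: a small voltage across a very thin film still produces an enormous electric field. The thicknesses below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope comparison of electric field strengths.
# Thicknesses are illustrative assumptions, not values from the article.
biological_v = 0.1      # volts across a cell membrane (per the quote)
membrane_nm = 5.0       # assumed membrane thickness, nanometers

device_v = 10.0         # "tens of volts" across the PSG film
film_nm = 10.0          # assumed nanoscale film thickness, nanometers

e_bio = biological_v / (membrane_nm * 1e-9)   # field in V/m
e_dev = device_v / (film_nm * 1e-9)           # field in V/m

print(f"membrane field: {e_bio:.1e} V/m")
print(f"PSG film field: {e_dev:.1e} V/m")
print(f"ratio: {e_dev / e_bio:.0f}x stronger")
```

Even with these rough numbers, the field across the glass film comes out tens of times stronger than what a water-limited biological membrane can sustain, which is why the protons move so much faster.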
These programmable resistive neural networks greatly speed up training and significantly reduce the cost and energy required to train them. This could accelerate the development of deep learning models, which could be applied to fraud detection, medical image analysis or self-driving cars.
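The training speedup comes largely from updating all the resistors in parallel rather than reading and writing each weight one at a time. A common way this is described for analog accelerators is as a rank-1 (outer-product) update applied across the whole array in one pulse; the sketch below illustrates that idea in software, with names and shapes that are assumptions for illustration:

```python
import numpy as np

# Sketch of the in-memory update behind fast analog training: a rank-1
# (outer-product) weight update is applied to every resistor at once by
# pulsing rows and columns simultaneously, rather than updating weights
# one by one. All names and values here are illustrative assumptions.
rng = np.random.default_rng(1)

G = rng.uniform(size=(4, 3))     # conductances acting as weights
x = rng.uniform(size=4)          # forward activations, applied on rows
delta = rng.uniform(size=3)      # backpropagated errors, applied on columns
lr = 0.01                        # learning rate

G_before = G.copy()
G += lr * np.outer(x, delta)     # one parallel pulse ~ the full update
```

In a digital accelerator the same update costs one memory write per weight; in the analog array it is a single physical operation regardless of how many resistors there are.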
“Once you have an analog processor, you’re no longer training the networks everyone else is working on. Instead, you’ll train networks of unprecedented complexity that no one else can afford, and therefore outperform them all. In other words, it’s not a faster car, it’s a spaceship,” Onen adds.
MIT scientists hope their discovery will help advance the field of analog deep learning, a growing and innovative area of artificial intelligence.