Artificial Superconducting Synapses Enable More Efficient & More Human-Like AI
For those working in the field of advanced artificial intelligence, getting a computer to simulate brain activity is a gargantuan task, but it may become more manageable if the hardware is designed to work more like the brain from the start.
Artificial intelligence software has increasingly begun to imitate the brain. Algorithms such as Google’s automatic image-classification and language-learning programs use networks of artificial neurons to perform complex tasks. But because conventional computer hardware was not designed to run brain-like algorithms, these machine-learning tasks require orders of magnitude more computing power than the human brain does.
And now researchers at the National Institute of Standards and Technology may have overcome this significant hurdle by designing a chip with artificial synapses.
The researchers have built a superconducting switch that “learns” like a biological system and could connect processors and store memories in future computers operating like the human brain.

In the brain, neurons “talk” to one another by sending electrochemical impulses across tiny gates, or switches, called synapses. When a synapse receives a strong enough incoming signal from one neuron, it triggers an electrochemical reaction that produces an outgoing spike in a second neuron.
“The NIST synapse has lower energy needs than the human synapse, and we don’t know of any other artificial synapse that uses less energy,” NIST physicist Mike Schneider said in a statement.
Even better than the real thing, the NIST synapse can fire much faster than the human brain—1 billion times per second, compared to a brain cell’s 50 times per second—using just a whiff of energy, about one ten-thousandth as much as a human synapse.
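Taking the article’s figures at face value, the gap works out as a quick back-of-the-envelope calculation. (A sketch only: the absolute energy of a human synapse, roughly 10 femtojoules per event, is an outside assumption; the article itself gives only the one-ten-thousandth ratio.)

```python
# Back-of-the-envelope comparison using the figures quoted above.
# The human-synapse energy (~10 fJ/event) is an assumption; the
# article only states the 1/10,000 ratio.

human_rate = 50            # spikes per second (biological neuron)
nist_rate = 1_000_000_000  # spikes per second (NIST synapse)

human_energy = 10e-15                 # joules per spike (assumed)
nist_energy = human_energy / 10_000   # one ten-thousandth, per the article

speed_ratio = nist_rate / human_rate
print(f"Speed advantage: {speed_ratio:,.0f}x")   # 20,000,000x
print(f"Energy per spike: {nist_energy:.1e} J")  # 1.0e-18 J (about 1 attojoule)
```

Under those assumptions, the NIST synapse is some 20 million times faster per event and operates near the attojoule scale.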
The NIST synapse is a type of Josephson junction: a sandwich of two superconductors around an insulating layer. What makes these synapses special is that the insulating layer is packed with magnetic nanoclusters that let the researchers control how much current is needed to throw the switch, a threshold known as the critical current.
As Schneider noted, these junctions contain some 20,000 manganese and silicon nanoclusters per square micrometer, and it is these clusters that give the researchers the control they need.
“These are customized Josephson junctions,” he noted. “We can control the number of nanoclusters pointing in the same direction, which affects the superconducting properties of the junction.”
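The role of the tunable critical current can be sketched with a toy model: treat each junction as a switch that fires only when the incoming signal exceeds its threshold, and let “training” lower that threshold. (All numbers and the simple linear training rule here are hypothetical illustrations of the thresholding idea, not NIST’s device physics.)

```python
# Toy model of a tunable-threshold synapse (illustrative only; the
# numbers and the "train" rule are hypothetical, not NIST's design).

class ToySynapse:
    def __init__(self, critical_current):
        self.critical_current = critical_current  # firing threshold (arbitrary units)

    def fires(self, input_current):
        # The junction "throws the switch" only above the critical current.
        return input_current > self.critical_current

    def train(self, amount):
        # Aligning more nanoclusters changes the junction's properties;
        # modeled here as simply lowering the firing threshold.
        self.critical_current = max(0.0, self.critical_current - amount)

syn = ToySynapse(critical_current=1.0)
print(syn.fires(0.8))   # False: signal below threshold
syn.train(0.5)          # "learning" lowers the threshold to 0.5
print(syn.fires(0.8))   # True: the same signal now triggers a spike
```

The point of the sketch is the learning mechanism: rather than updating weights in software, the device itself changes how easily it fires.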
Ultimately, these synapses could play a critical role in making massively parallel, brain-like data processing a reality. Neuromorphic computers could be the next wave of computing, given the growing need for more speed at lower energy cost.
Moreover, the team adds, the synapses can be ‘stacked’ in a three-dimensional arrangement to form a larger system linking devices that act as neurons, which the team says can be built with conventional electronics fabrication methods.
Steven Furber, a computer engineer at the University of Manchester, UK, who studies neuromorphic computing, stresses that practical applications are far in the future. “The device technologies are potentially very interesting, but we don’t yet understand enough about the key properties of the [biological] synapse to know how to use them effectively,” he says.
“We’re optimistic that we can start to scale these devices somewhat aggressively,” said Schneider, who puts the timescale at between five and 10 years.