From UCLA 29/11/23
FINDINGS
An experimental computing system, modeled after the biological brain, achieved 93.4% accuracy in identifying handwritten numbers.
This success is attributed to a novel training algorithm providing real-time feedback during the learning process.
This surpassed traditional machine-learning methods, which train only after processing batches of data and reached 91.4% accuracy on the same task.
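The distinction between the two training regimes can be illustrated with a minimal sketch. This is not the study's actual algorithm, merely a generic perceptron contrasting "online" learning (weights corrected immediately after each sample, akin to the real-time feedback described above) with "batch" learning (corrections accumulated and applied only after a full pass over the data):

```python
# Illustrative sketch, NOT the study's algorithm: a toy perceptron
# showing online (per-sample) vs. batch (per-epoch) weight updates.
def train_perceptron(samples, labels, epochs=10, online=True):
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        grad_w = [0.0] * len(w)  # deferred corrections (batch mode)
        grad_b = 0.0
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:  # misclassified sample
                if online:  # real-time feedback: correct immediately
                    w = [wi + y * xi for wi, xi in zip(w, x)]
                    b += y
                else:       # batch mode: defer until epoch end
                    grad_w = [gi + y * xi for gi, xi in zip(grad_w, x)]
                    grad_b += y
        if not online:  # apply the accumulated batch correction
            w = [wi + gi for wi, gi in zip(w, grad_w)]
            b += grad_b
    return w, b

# Toy linearly separable data: label is the sign of (x0 - x1)
data = [(1.0, 0.0), (0.8, 0.1), (0.0, 1.0), (0.2, 0.9)]
labels = [1, 1, -1, -1]
w, b = train_perceptron(data, labels, online=True)
```

In the online mode each misclassification reshapes the weights before the next sample arrives, which is the loose analogue of the nanowire network physically adapting as pulses come in.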
The research highlighted that the system's ability to retain a memory of past inputs within its own structure enhanced its learning capabilities.
In contrast, conventional computing approaches rely on separate software or hardware for memory storage, distinct from the processor.
BACKGROUND
For 15 years, researchers at the California NanoSystems Institute at UCLA (CNSI) have been working on a new computational technology.
This brain-inspired system consists of a network of nanoscale wires containing silver, situated on electrodes.
The system operates by receiving and outputting data through electrical pulses.
Unlike conventional computers with fixed atomic structures in their memory and processing modules, this nanowire network physically adapts to stimuli.
The network’s memory is based on its atomic structure and is distributed throughout the system, similar to synaptic connections in a biological brain.
Collaborators at the University of Sydney developed a specialized algorithm to manage input and output, leveraging the system’s dynamic and parallel data processing abilities.
METHOD
The system was built with a material comprising silver and selenium, forming a network of entangled nanowires above 16 electrodes.
Researchers trained and tested the network using images of handwritten numbers from a standard dataset.
The images were communicated to the system pixel-by-pixel via electrical pulses, with varying voltages indicating different pixel shades.
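A hypothetical sketch of this encoding scheme follows. The linear pixel-to-voltage mapping and the voltage range `(V_MIN, V_MAX)` are assumptions for illustration, not parameters reported by the study:

```python
# Hypothetical illustration of the pixel-by-pixel pulse encoding
# described above. Each 8-bit grayscale pixel (0-255) is mapped
# linearly onto a pulse amplitude; the voltage range is an assumed
# placeholder, not a value from the study.
V_MIN, V_MAX = 0.0, 1.0  # assumed pulse-amplitude range, in volts

def pixel_to_voltage(pixel):
    """Linearly scale an 8-bit pixel shade to a voltage amplitude."""
    return V_MIN + (pixel / 255.0) * (V_MAX - V_MIN)

def stream_image(image):
    """Yield one voltage pulse per pixel, row by row."""
    for row in image:
        for pixel in row:
            yield pixel_to_voltage(pixel)

# 2x2 toy "image": black, mid-gray, white, black
tiny = [[0, 128], [255, 0]]
pulses = list(stream_image(tiny))
```

Streaming one pulse per pixel means the network receives the image as a time series rather than all at once, which is what lets its internal state carry information from earlier pixels into later ones.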
IMPACT
The nanowire network, still in development, is anticipated to be more energy-efficient than current silicon-based AI systems for similar tasks.
It shows potential in processing complex, time-varying data like weather or traffic patterns, which current AI finds challenging without massive data and energy resources.
The study’s co-design approach—simultaneous hardware and software development—suggests nanowire networks could complement traditional electronic devices.
With its brain-like memory and adaptive learning capabilities, the network could excel in “edge computing,” processing complex data locally without relying on distant servers.
Applications could span robotics, autonomous navigation in vehicles and drones, Internet of Things technology, health monitoring, and coordinating multi-location sensor data.