Adam Zewe | MIT News Office
July 28, 2022
As scientists push the boundaries of machine learning, the amount of time, energy, and money required to train increasingly complex neural network models is skyrocketing. A new area of artificial intelligence called analog deep learning promises faster computation with a fraction of the energy usage.
Programmable resistors are the key building blocks in analog deep learning, just like transistors are the core elements for digital processors. By repeating arrays of programmable resistors in complex layers, researchers can create a network of analog artificial “neurons” and “synapses” that execute computations just like a digital neural network. This network can then be trained to achieve complex AI tasks like image recognition and natural language processing.
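The crossbar idea described above can be sketched in a few lines. This is a hypothetical illustration, not the researchers' actual device model: each programmable resistor's conductance acts as a synaptic weight, and applying input voltages across the array yields output currents that sum according to Ohm's and Kirchhoff's laws, computing a matrix-vector product in a single analog step. All names and values below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_outputs = 4, 3

# Illustrative conductance matrix: one programmable resistor per
# (input line, output line) crossing, standing in for a synaptic weight.
conductances = rng.uniform(0.0, 1.0, size=(n_inputs, n_outputs))

# Illustrative input voltages applied to the rows of the crossbar.
voltages = rng.uniform(-1.0, 1.0, size=n_inputs)

# Each output current sums the contributions of every input line:
# I_j = sum_i V_i * G[i, j] -- the analog analogue of a dense layer.
currents = voltages @ conductances

print(currents.shape)  # one current per output "neuron": (3,)
```

In a physical array all of these multiply-accumulate operations happen simultaneously in the resistors themselves, which is where the speed and energy advantage over a digital processor comes from.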
A multidisciplinary team of MIT researchers set out to push the speed limits of a type of human-made analog synapse they had previously developed. By using a practical inorganic material in the fabrication process, they enabled their devices to run 1 million times faster than previous versions, which is also about 1 million times faster than the synapses in the human brain.
Complete article from MIT News.