Fall 2022 | Tuesdays & Thursdays, 3:30 – 5:00pm ET
Speaker: Song Han, MIT
The course will be streamed live on YouTube, with recordings available afterwards.
Join this online course taught by MIT’s Song Han as we dive deep into efficient machine learning techniques that enable powerful deep learning applications on resource-constrained devices. Topics cover efficient inference techniques, including model compression, pruning, quantization, neural architecture search, and distillation; efficient training techniques, including gradient compression and on-device transfer learning; application-specific model optimization for videos, point clouds, and NLP; and efficient quantum machine learning. Students will gain hands-on experience implementing deep learning applications on microcontrollers, mobile phones, and quantum machines through an open-ended design project related to mobile AI.
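To give a flavor of one inference technique listed above, here is a minimal sketch of magnitude-based weight pruning in NumPy. This is an illustrative toy, not course material: the function name `magnitude_prune` and its interface are assumptions for the example.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude weights.

    `sparsity` is the fraction of weights to remove (between 0 and 1).
    Returns a pruned copy; the original array is left untouched.
    """
    flat = np.abs(weights).ravel()
    k = int(flat.size * sparsity)  # number of weights to prune
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold  # keep only larger weights
    return weights * mask

# Example: prune half the weights of a tiny 2x2 layer
w = np.array([[0.1, -0.5],
              [0.9, -0.05]])
pruned = magnitude_prune(w, 0.5)
# The two smallest-magnitude entries (0.1 and -0.05) are zeroed,
# leaving -0.5 and 0.9 intact.
```

In practice, pruning is usually applied iteratively and interleaved with fine-tuning so the network can recover accuracy after each round of sparsification.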
Learn more about TinyML and Efficient Deep Learning.