Shrinking Massive Neural Networks Used to Model Language

Daniel Ackerman | MIT News Office

Researcher Jonathan Frankle’s “lottery ticket hypothesis” posits that, hidden within massive neural networks, leaner subnetworks can complete the same task more efficiently.

Shrinking Deep Learning’s Carbon Footprint

Kim Martineau | MIT Quest for Intelligence

In June 2020, OpenAI unveiled the largest language model in the world, a text-generating tool called GPT-3 that can write creative fiction, translate legalese into plain English, and answer obscure trivia questions.


Using AI to Make Better AI

Mark Anderson | IEEE Spectrum

New approach brings faster, AI-optimized AI within reach for image recognition and other applications


Kicking Neural Network Design Automation into High Gear

Rob Matheson | MIT News Office

MIT researchers have developed an efficient algorithm that could provide a “push-button” solution for automatically designing fast-running neural networks on specific hardware.
