Scientists Can Make Neural Networks 90% Smaller

Researchers from MIT found a way to create neural networks that are 90% smaller but just as smart.

In a new paper, researchers from MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) have shown that neural networks contain subnetworks that are up to one-tenth the size yet capable of being trained to make equally accurate predictions — and sometimes can learn to do so even faster than the originals.
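The paper in question is the "lottery ticket hypothesis" work, and while the researchers' full procedure involves iteratively pruning a trained network and re-training the surviving weights from their original initialization, the core pruning step can be illustrated with a much simpler sketch. The snippet below is only an assumption-laden, one-shot illustration of magnitude pruning (keeping the largest ~10% of weights, echoing the "one-tenth the size" figure), not the paper's exact method.

```python
import numpy as np

def magnitude_prune_mask(weights: np.ndarray, keep_fraction: float = 0.1) -> np.ndarray:
    """Return a binary mask that keeps only the largest-magnitude weights.

    This is a simplified, one-shot illustration; the actual paper prunes
    iteratively and re-trains the surviving subnetwork from its original
    initialization.
    """
    flat = np.abs(weights).ravel()
    k = max(1, int(len(flat) * keep_fraction))
    # Threshold is the magnitude of the k-th largest weight.
    threshold = np.partition(flat, -k)[-k]
    return (np.abs(weights) >= threshold).astype(weights.dtype)

# Example: prune a hypothetical 256x128 layer down to roughly 10% of its weights.
rng = np.random.default_rng(0)
layer = rng.normal(size=(256, 128))
mask = magnitude_prune_mask(layer, keep_fraction=0.1)
sparse_layer = layer * mask
print(f"Weights kept: {mask.mean():.1%}")
```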

This article stood out to me because if neural networks can be made this much smaller while staying just as smart, it might encourage more companies to run machine learning locally on the device, as Apple does.

Check It Out: Scientists Can Make Neural Networks 90% Smaller
