
Last Week in AI

Jesus Rodriguez
3 min read · May 3, 2020


Every week, Invector Labs publishes a newsletter that covers the most recent developments in AI research and technology. You can find this week’s issue below, and you can sign up for future issues below as well. Please do; our team works really hard on it:

From the Editor: Smaller, Faster Machine Learning

The catchy phrase that bigger is better definitely applies to machine learning these days. Bigger models trained on larger datasets have consistently outperformed groups of smaller models specialized in specific tasks. The examples are everywhere: Google’s BERT, OpenAI’s GPT-2, and Microsoft’s Turing-NLG all operate at scales whose computational costs are prohibitive for most organizations. As a result, we are starting to see efforts to create smaller and more efficient machine learning models.

The idea of optimizing the size of a machine learning model without sacrificing its performance is conceptually trivial but really hard to implement in practice. Most large-scale machine learning models grow to fairly large sizes during training, making it hard to understand which sections can be removed without affecting performance. This week, MIT researchers published a new method for pruning machine learning models, and we are likely to continue seeing more research in this area.
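To make the idea concrete, here is a minimal sketch of magnitude-based pruning using PyTorch’s torch.nn.utils.prune module. The toy network and the 30% sparsity level are illustrative assumptions for the sketch, not details from the MIT paper:

```python
# A minimal sketch of magnitude-based weight pruning in PyTorch.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A small example network standing in for a much larger trained model.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Zero out the 30% of weights with the smallest absolute values in each
# linear layer; the intuition is that low-magnitude weights contribute
# least to the output and can often be removed with little accuracy loss.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # make the pruning permanent

# Verify the resulting sparsity of the weight matrices.
zeros = sum((m.weight == 0).sum().item()
            for m in model.modules() if isinstance(m, nn.Linear))
total = sum(m.weight.numel()
            for m in model.modules() if isinstance(m, nn.Linear))
print(f"Weight sparsity: {zeros / total:.1%}")
```

In practice, the hard part the editorial points to is deciding which weights are safe to remove; simple magnitude heuristics like the one above often require retraining or fine-tuning afterward to recover accuracy.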

Written by Jesus Rodriguez

CEO of IntoTheBlock, President of Faktory, President of NeuralFabric and founder of The Sequence, Lecturer at Columbia University and Wharton, Angel Investor...
