The Sequence Scope: Transformers are Getting More Ambitious

Weekly newsletter with over 100,000 subscribers that discusses impactful ML research papers, cool tech releases, the money in AI, and real-life implementations.

Jesus Rodriguez
4 min readAug 8, 2021

📝 Editorial: Transformers are Getting More Ambitious

Recently, we started a new series at The Sequence covering the intricacies of transformer architectures. Considered by many to be the most important development in recent years of deep learning, transformers have gone on to revolutionize language intelligence. Models such as Google’s BERT or OpenAI’s GPT-3 have laid out the path to new highs in natural language processing (NLP). Despite all this success, constraining transformers to NLP scenarios would be a mistake. These days, transformers are getting more ambitious and are being applied to all sorts of deep learning scenarios.

It turns out that the same attention mechanisms that make transformers so effective for language models can…


Written by Jesus Rodriguez

CEO of IntoTheBlock, President of Faktory, President of NeuralFabric and founder of The Sequence, Lecturer at Columbia University, Wharton, Angel Investor...