
Transformers for Time Series? Inside Google’s Temporal Fusion Transformers

Jesus Rodriguez
3 min read · Aug 18, 2022


Image Source: https://nypost.com/2022/02/20/could-artificial-intelligence-really-wipe-out-humanity/

I recently started an AI-focused educational newsletter that already has over 125,000 subscribers. TheSequence is a no-BS (meaning no hype, no news, etc.) ML-oriented newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers, and concepts. Please give it a try by subscribing below:

Transformer architectures have been revolutionizing different areas of deep learning, from natural language processing to computer vision. Almost since the release of the initial attention-based transformer models, there have been attempts to adapt them to the universe of time series. After all, if transformer architectures produce a breakthrough in the time-series space, they could unleash an innovation race in areas such as quantitative models in financial markets. Several attempts have been made to adapt transformers to time-series forecasting scenarios, with mixed results. Among those, Google Research’s Temporal Fusion Transformer (TFT) stands out as one of the most solid models, which has been implemented in several…
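To make the core idea concrete, here is a minimal sketch of the scaled dot-product self-attention that attention-based transformers are built on, applied to a window of time steps with a causal mask so that no step can look at the future (a requirement for forecasting). The function name, shapes, and random weights are illustrative assumptions, not TFT's actual architecture, which adds gating, variable selection, and interpretable multi-head attention on top.

```python
import numpy as np

def causal_self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention with a causal mask, so each
    time step attends only to past and present steps (no look-ahead)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = k.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                      # (T, T) attention logits
    mask = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores[mask] = -np.inf                               # block attention to future steps
    # Numerically stable softmax over each row
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# Toy example: a window of T=6 time steps, each embedded in d=4 dimensions
rng = np.random.default_rng(0)
T, d = 6, 4
x = rng.normal(size=(T, d))
w_q, w_k, w_v = (rng.normal(size=(d, d)) for _ in range(3))
out, attn = causal_self_attention(x, w_q, w_k, w_v)
```

Each row of `attn` sums to 1 and has zeros strictly above the diagonal, i.e., step *t* only draws context from steps 0..*t*. Forecasting-oriented transformers like TFT build on exactly this masked-attention primitive to mix information across past time steps.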


Written by Jesus Rodriguez

CEO of IntoTheBlock, President of Faktory, President of NeuralFabric, founder of TheSequence, Lecturer at Columbia University and Wharton, Angel Investor...
