The Sequence Scope: Don’t Sleep on JAX
Weekly newsletter with over 120,000 subscribers that discusses impactful ML research papers, cool tech releases, the money in AI, and real-life implementations.
--
📝 Editorial: Don’t Sleep on JAX
The ecosystem of deep learning development frameworks has gone from incredibly fragmented to being concentrated around two big names: TensorFlow (Keras included) and PyTorch. A few years ago, a dozen deep learning stacks, such as MXNet, Caffe2 and Microsoft’s CNTK, showed levels of adoption comparable to TensorFlow and PyTorch. That picture has changed in the last few years, with the majority of deep learning research and development concentrating in TensorFlow and PyTorch to the point that it was hard to envision another framework having a real chance to rival those two. Somewhat quietly, though, a new framework has been building out its capabilities and gaining adoption within the machine learning community.
JAX was initially released by Google Research in 2018 with the objective of streamlining high-performance numerical computing. The framework enables capabilities such as vectorization, JIT compilation and gradient-based optimization in a very modular and simple programming model. While it was not intended as a deep learning framework in the first place, JAX has seen meaningful adoption within the deep learning community. This has been partly driven by its adoption at AI powerhouses like Google Research and, very notably, DeepMind, which has been very public about its use of JAX. As a result, JAX has quickly deepened its tech stack. Just this week, Google Research open-sourced a new library of ranking algorithms for JAX.
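To make the capabilities mentioned above concrete, here is a minimal sketch of JAX’s composable transformations (jax.grad for gradient-based optimization, jax.jit for compilation, jax.vmap for vectorization). The toy linear model and synthetic data are illustrative assumptions, not drawn from the newsletter:

```python
# Minimal sketch of JAX's core transformations: grad, jit, and vmap.
# The linear model and data are hypothetical, for illustration only.
import jax
import jax.numpy as jnp

def predict(params, x):
    # Simple linear model: y = w * x + b
    w, b = params
    return w * x + b

def loss(params, x, y):
    # Mean squared error between predictions and targets
    return jnp.mean((predict(params, x) - y) ** 2)

# grad: automatic differentiation for gradient-based optimization
grad_loss = jax.grad(loss)

# jit: just-in-time compilation to XLA for fast execution
fast_grad = jax.jit(grad_loss)

# vmap: automatic vectorization over a batch dimension
batched_predict = jax.vmap(predict, in_axes=(None, 0))

params = (2.0, -1.0)
x = jnp.array([1.0, 2.0, 3.0])
y = jnp.array([1.0, 3.0, 5.0])

print(fast_grad(params, x, y))      # gradients w.r.t. (w, b)
print(batched_predict(params, x))   # vectorized predictions
```

Because these transformations compose freely, the same pure functions can be differentiated, compiled and batched without rewriting them, which is a large part of JAX’s appeal.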
🔎 ML Research
Video-Text Learning
Google Research published a paper detailing a new method for question-answering in video streams → read more on the Google Research blog