The Sequence Scope: Using Transformers in Mainstream Deep Learning Applications
Weekly newsletter that discusses impactful ML research papers, cool tech releases, the money in AI, and real-life implementations.
The Sequence Scope is a summary of the most important published research papers, released technology and startup news in the AI ecosystem in the last week. This compendium is part of TheSequence newsletter. Data scientists, scholars, and developers from Microsoft Research, Intel Corporation, Linux Foundation AI, Google, Lockheed Martin, Cardiff University, Mellon College of Science, Warsaw University of Technology, Universitat Politècnica de València and other companies and universities are already subscribed to TheSequence.
TheSequence explains the main machine learning concepts and keeps you up-to-date with the most relevant projects and…
📝 Editorial: Using Transformers in Mainstream Deep Learning Applications
Not a week goes by in which we don’t learn about new, marvelous applications of transformers across deep learning domains. Considered one of the most important breakthroughs of the last few years in deep learning, transformers have gone on to establish remarkable milestones in domains such as natural language understanding (NLU) and computer vision. However, implementing transformer applications remains a privilege of the big technology firms and AI research labs that have access to vast data and compute resources. Making transformers more accessible to mainstream deep learning applications is one of the most interesting challenges for the next few years of practical deep learning.
The news is not all bad 😉 There are many ongoing efforts to simplify the process of incorporating transformers into mainstream deep learning applications. Just this week, startup Hugging Face raised an impressive $40M financing round to continue its efforts to advance NLU research with a specific focus on transformer architectures. In recent years, Hugging Face has become one of the most popular platforms for using transformer models in NLU programs (we discussed Hugging Face in a previous issue of TheSequence). The new funding should help expand the use of transformers into other domains such as computer vision and time-series analysis. Certainly, we should expect transformers to become one of the most popular architectures in modern, mainstream deep learning applications.
🔎 ML Research
Understanding Generalization in Deep Learning
Google Research published a paper proposing a new framework that uses online optimization techniques to better understand generalization in deep neural networks.
Adversarial Environment Generation
Google Research published a paper proposing an algorithm that uses adversarial dynamics between multiple agents to generate robust training environments.
A New Algorithm for Robust Reinforcement Learning Problems
Researchers from the Berkeley AI Research (BAIR) lab published a paper unveiling a new algorithm for robust reinforcement learning problems that is designed to adapt to drastic changes in environments.
🤖 Cool AI Tech Releases
New PyTorch Release
The new version of the PyTorch framework was released this week. The release includes enhancements in areas such as distributed training and mobile deep learning, as well as a good number of new libraries.
New Alexa Prize Challenge
Amazon announced a new Alexa Prize challenge designed to build chatbots that can operate well in multitask environments.
IBM MolGX
IBM Research released IBM Molecule Generation Experience (MolGX), a cloud platform that uses machine learning to help design new molecular structures that can aid the discovery of new materials.
💸 Money in AI
- Hugging Face closed a $40 million Series B funding round. The startup is an open-source provider of natural language processing (NLP) technologies. Its open-source framework Transformers has been downloaded over a million times and has amassed over 42,000 stars and 10,000 forks on GitHub. It’s one of the engines moving the AI industry forward.
- AI-powered HRtech startup retrain.ai raised a $9 million Series A. The company leverages AI and ML to help organizations unlock effective talent intelligence and upskill their employees to stay ahead of the curve.
- CapitalOne Ventures invested $24 million in Securonix, a security startup that reduces noise and prioritizes high-fidelity alerts with the behavioral analytics technology that pioneered the UEBA category. The company invests heavily in AI and machine learning for greater automation to keep up with the growing pace of cyberattacks.
- AI-powered bookkeeping startup Zeni raised $13.5 million in a Series A round. Zeni leverages a blend of artificial intelligence and human finance experts to perform daily bookkeeping and manage the different financial needs of a startup.
- Intelligent process automation provider WorkFusion raised $220 million. The company built proprietary cloud-federated learning technology: AI bots learn in real time from data and end users with “no-code” simplicity, then aggregate and share those learnings across the bot ecosystem.
- AI-powered cancer diagnostics startup Ibex Medical Analytics raised $38 million in funding. It creates AI solutions that detect misdiagnosed and mis-graded cancers in digitized slides, guiding pathologists to areas of cancer in support of a prompt review. It also develops AI-markers for prognostic and predictive applications used in cancer management and drug development.
- The cloud data governance and security startup Privacera raised $50 million. The company leverages an open-source AI and ML library for natural language processing to automate the discovery of personally identifiable data to tackle the data privacy and security challenges faced by large enterprises.