Every week, Invector Labs publishes a newsletter that covers the most recent developments in AI research and technology. You can find this week’s issue below, and you can sign up to receive it every week. Please do; our team works really hard on it:
From the Editor: The Biggest Roadblock for the Mainstream Adoption of Machine Learning
One word: data. To this day, unsupervised models have not shown their complete potential, and supervised methods still rule the machine learning space. Supervised training requires high-quality labeled datasets, and those are incredibly expensive to produce on an ongoing basis. These challenges have prevented even large organizations from adopting machine learning at scale and are a major roadblock for startups entering the space.
The labeled-data hurdle has two main solutions: either we develop methods for producing labeled datasets more efficiently, or we develop methods that can learn from smaller datasets. For the former, areas such as generative models are showing some promise for producing high-quality training datasets. For the latter, we have seen incredible research advances in areas such as semi-supervised learning, while one-shot and zero-shot learning are improving very rapidly. While the adoption of these techniques remains at a very nascent stage, the problem they are trying to solve remains the biggest challenge for the adoption of modern machine learning solutions.
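To make the "learning from smaller datasets" idea concrete, here is a minimal, self-contained sketch of self-training (pseudo-labeling), one of the simplest semi-supervised techniques: a model trained on a few labeled points assigns labels to unlabeled points it is confident about, then retrains on the grown set. The nearest-centroid "model", the toy data, and the confidence threshold are all hypothetical illustrations, not taken from any of the papers mentioned here.

```python
def centroids(labeled):
    """Per-class mean of the feature vectors in `labeled` [(point, label), ...]."""
    groups = {}
    for point, label in labeled:
        groups.setdefault(label, []).append(point)
    return {
        label: [sum(p[i] for p in pts) / len(pts) for i in range(len(pts[0]))]
        for label, pts in groups.items()
    }

def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def self_train(labeled, unlabeled, threshold=1.0):
    """Pseudo-label unlabeled points that fall within `threshold` of a class
    centroid, recompute the centroids, and repeat until nothing changes."""
    labeled, unlabeled = list(labeled), list(unlabeled)
    changed = True
    while changed and unlabeled:
        changed = False
        cents = centroids(labeled)
        remaining = []
        for point in unlabeled:
            # Nearest centroid and its distance for this point.
            label, d = min(
                ((lbl, dist(point, c)) for lbl, c in cents.items()),
                key=lambda t: t[1],
            )
            if d < threshold:
                labeled.append((point, label))  # confident: adopt pseudo-label
                changed = True
            else:
                remaining.append(point)  # not confident: leave unlabeled
        unlabeled = remaining
    return labeled, unlabeled

labeled = [([0.0, 0.0], "a"), ([4.0, 4.0], "b")]
unlabeled = [[0.5, 0.2], [3.8, 4.1], [2.0, 2.0]]
grown, leftover = self_train(labeled, unlabeled)
```

Here two labeled examples grow into four, while the genuinely ambiguous point `[2.0, 2.0]` is never pseudo-labeled; that thresholding is what keeps self-training from polluting the training set with bad labels.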
Now let’s take a look at the core developments in AI research and technology this week:
Understanding Glass Using Neural Networks
DeepMind published a very intriguing paper proposing a method based on graph neural networks to understand the puzzling phenomenon of glass transition.
Advancing Generative Networks
Microsoft Research published an insightful summary of three different projects in generative models that can facilitate the implementation of self-supervised learning methods.
Learning Tasks from Single Examples
Researchers from Amazon published a paper proposing a method that improves performance in meta-learning tasks without increasing the training data requirements.
Cool AI Tech Releases
The TensorFlow team released the Quantization Aware Training (QAT) API, which allows the implementation of faster and smaller machine learning models.
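The core idea behind quantization-aware training is to "fake-quantize" weights and activations to a low-precision grid during the forward pass, so the model learns to tolerate the reduced precision it will see after deployment. The sketch below illustrates only that rounding step from scratch in plain Python; it is not the TensorFlow QAT API itself, and the function name and value range are illustrative assumptions.

```python
def fake_quantize(w, num_bits=8):
    """Snap a float in roughly [-1, 1] to the nearest level of a symmetric
    signed integer grid (127 levels per side for int8), then map it back
    to float. During QAT this happens in the forward pass, so training
    sees the same rounding error that int8 inference will introduce."""
    levels = 2 ** (num_bits - 1) - 1      # 127 for int8
    q = round(w * levels)                 # nearest integer level
    q = max(-levels, min(levels, q))      # clamp out-of-range values
    return q / levels                     # back to float for the rest of the graph
```

With 8 bits the rounding error on a weight like `0.3` stays below one grid step (1/127), while values outside the range are clamped to ±1.0; shrinking `num_bits` coarsens the grid, which is the size/speed-versus-accuracy trade-off QAT lets a model adapt to during training.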
Sound Separation Dataset
Google open sourced the Free Universal Sound Separation dataset, intended to support the development of AI models that can separate distinct sounds from recording mixes.
AI in the Real World
$30M for AI Research
The US Department of Energy announced $30 million in funding for AI research projects.
Node Raises $6M
AI startup Node raised $6 million to help companies launch AI projects without requiring data science expertise.
AI to Study the Oceans
We know very little about our oceans. AI methods are helping researchers overcome some of the traditional challenges of analyzing the massive datasets produced by oceanic research.