Every week, Invector Labs publishes a newsletter that covers the most recent developments in AI research and technology. You can find this week’s issue below, and sign up for future editions there as well. Please do; our team works really hard on it:

From the Editor: The Biggest Roadblock for the Mainstream Adoption of Machine Learning

One word: data. To this day, unsupervised models have not realized their full potential, and supervised methods rule the machine learning space. Supervised training requires high-quality labeled datasets, and those are incredibly expensive to produce on an ongoing basis. These challenges have prevented even large organizations from adopting machine learning at scale and are a major roadblock for startups entering the space.

The labeled-data hurdle has two main solutions: either we develop methods for producing labeled datasets more efficiently, or we develop methods that can learn from smaller datasets. For the first, areas such as generative models are showing promise for producing high-quality training datasets. For the latter, we have seen incredible advances in research areas such as semi-supervised, one-shot, and zero-shot learning, which are improving very rapidly. While the adoption of these techniques remains at a very nascent stage, the problem they are trying to solve remains the biggest challenge for the adoption of modern machine learning solutions.

Now let’s take a look at the core developments in AI research and technology this week:

AI Research

Understanding Glass Using Neural Networks

DeepMind published a very intriguing paper proposing a method based on graph neural networks to understand the puzzling phenomenon of glass transition.

>Read more in this blog post from the DeepMind team

Advancing Generative Networks

Microsoft Research published an insightful summary of three different projects in generative models that can facilitate the implementation of self-supervised learning methods.

>Read more in this blog post from Microsoft Research

Learning Tasks from Single Examples

Researchers from Amazon published a paper proposing a method that improves performance in meta-learning tasks without increasing the training data requirements.

>Read more in this blog post from Amazon Research

Cool AI Tech Releases

TensorFlow QAT

The TensorFlow team released the Quantization Aware Training (QAT) API, which allows the implementation of faster and smaller machine learning models.

>Read more in this blog post from the TensorFlow team
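The core idea behind quantization-aware training is to simulate low-precision arithmetic during training so the model learns weights that survive quantization. Below is a minimal, hypothetical sketch of that "fake quantization" trick in plain NumPy; it is not the TensorFlow QAT API itself, and the function name and parameters are illustrative only.

```python
import numpy as np

def fake_quantize(x, num_bits=8):
    """Simulate integer quantization in float: map values to one of
    2**num_bits discrete levels, then dequantize back to float.
    The round-trip error mimics what a quantized model would see."""
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (x.max() - x.min()) / (qmax - qmin)   # step size between levels
    zero_point = qmin - x.min() / scale           # offset so x.min() maps to qmin
    q = np.clip(np.round(x / scale + zero_point), qmin, qmax)
    return (q - zero_point) * scale               # dequantized float values

weights = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
quantized = fake_quantize(weights)
```

During QAT, a pass like this is inserted into the forward computation so the loss is computed on quantized values, while gradients flow through as if the rounding were the identity (the straight-through estimator).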

Sound Separation Dataset

Google open sourced the Free Universal Sound Separation dataset, intended to support the development of AI models that can separate distinct sounds from recording mixes.

>Read more in this blog post from the Google Open Source team

AI in the Real World

$30M for AI Research

The US Department of Energy announced $30 million in funding for AI research projects.

>Read more in the official press release of the Department of Energy

Node AI

AI startup Node raised $6 million to help companies launch AI projects without requiring data science expertise.

>Read more in this coverage of VentureBeat

AI to Study the Oceans

We know very little about our oceans. AI methods are helping researchers overcome some of the traditional challenges of analyzing the very large datasets produced by oceanic research.

>Read more in this coverage of the New York Times

CEO of IntoTheBlock, Chief Scientist at Invector Labs, I write The Sequence Newsletter, Guest lecturer at Columbia University, Angel Investor, Author, Speaker.
