
Every week, my team at Invector Labs publishes a newsletter that covers the most recent developments in AI research and technology. You can find this week’s issue below, and you can sign up to receive it every week. Please do so; our team works really hard on it.


From the Editor: The Great Race for Cloud AI Dominance and Its Impact on the AI Startup Ecosystem

Amazon, Microsoft and Google have embarked on a frantic race to dominate the artificial intelligence (AI) market. To some extent, AI has become one of the few areas of differentiation between the three cloud giants. Every relevant capability of AI applications, from experimentation to optimization, can be found in the form of cloud services on these platforms. Just this week, Amazon unveiled a new batch of AI services for the AWS platform during its re:Invent conference.

The massive investments that the cloud giants are making in the AI ecosystem are certainly pushing innovation in the space to a new level, but they are also having tangible side effects on the AI startup ecosystem. In the quest to capture the best AI talent in the market, Microsoft, Amazon and Google have begun rapidly acquiring AI startups that show any sign of promise. The result is that most AI technology startups haven’t had the opportunity to flourish as independent companies, which constrains innovation in the space. This is certainly an atypical trend compared to previous technology markets, and one that certainly impacts the way investors and customers see the AI startup ecosystem.

Now let’s take a look at the core developments in AI research and technology this week:


AI Research

Superhuman Performance in Hanabi

Facebook AI Research (FAIR) published a paper describing an AI model that achieves superhuman performance in Hanabi, a card game in which players must work together.

>Read more in this blog post from the FAIR team

The Deep Double Descent Phenomenon

OpenAI published a paper describing a common phenomenon in deep neural networks in which performance first improves, then gets worse, and then improves again with increasing model size, data size, or training time (a toy illustration of the model-size case follows below).

>Read more in this blog post from OpenAI
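To make the model-size case concrete, here is a toy sketch (not code from the OpenAI paper): a random-features regression fit by minimum-norm least squares, where test error typically rises as the number of features approaches the number of training points and then falls again past that interpolation threshold. All sizes and data here are made up purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy regression task, just for illustration.
n_train, n_test, d = 100, 2000, 10
X_train = rng.normal(size=(n_train, d))
X_test = rng.normal(size=(n_test, d))
w_true = rng.normal(size=d)
y_train = X_train @ w_true + 0.5 * rng.normal(size=n_train)
y_test = X_test @ w_true

def relu_features(X, W):
    # Fixed random first layer with a ReLU nonlinearity.
    return np.maximum(X @ W, 0.0)

# Sweep "model size" (number of random features) across the
# interpolation threshold (~n_train) and record test error.
for n_features in [20, 50, 90, 100, 110, 150, 300, 1000, 4000]:
    W = rng.normal(size=(d, n_features)) / np.sqrt(d)
    Phi_train = relu_features(X_train, W)
    Phi_test = relu_features(X_test, W)
    # Minimum-norm least-squares fit; it interpolates the training
    # data once n_features >= n_train.
    coef, *_ = np.linalg.lstsq(Phi_train, y_train, rcond=None)
    test_mse = np.mean((Phi_test @ coef - y_test) ** 2)
    print(f"features={n_features:5d}  test MSE={test_mse:10.3f}")
```

In runs like this, test error often peaks when the number of features is close to the number of training examples and then drops again as the model keeps growing, which is the "double descent" shape the paper studies at much larger scale.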

Teaching Neural Networks to Remember

Microsoft Research published a paper proposing a method called Metalearned Neural Memory, which uses a neural network itself as a memory mechanism.

>Read more in this blog post from Microsoft Research


Cool AI Tech Releases

Netflix Open Sources Metaflow

Netflix open sourced Metaflow, a new framework for managing the lifecycle of data science projects (a minimal example of its programming model follows below).

>Read more in this blog post from the Netflix engineering team
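For a sense of Metaflow's programming model, here is a minimal flow sketch. The step bodies are placeholders, while FlowSpec, the @step decorator, and self.next come from Metaflow itself.

```python
from metaflow import FlowSpec, step


class TrainingFlow(FlowSpec):
    """A toy flow: load data, 'train', then report. The step bodies are placeholders."""

    @step
    def start(self):
        # Attributes assigned to self become versioned artifacts
        # that flow from step to step.
        self.data = list(range(10))
        self.next(self.train)

    @step
    def train(self):
        # Stand-in for real model training.
        self.model = sum(self.data) / len(self.data)
        self.next(self.end)

    @step
    def end(self):
        print(f"trained 'model': {self.model}")


if __name__ == "__main__":
    TrainingFlow()
```

Running `python training_flow.py run` executes the steps and persists every artifact assigned to self for each run, which is the lifecycle-management piece the release focuses on.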

AutoML on AWS

AWS unveiled SageMaker Autopilot, an AutoML capability that automatically builds, trains, and tunes models while keeping the process visible and interpretable (a rough sketch of launching a job follows below).

>Read more in this blog post from the AWS team
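As a rough sketch of what launching an Autopilot job can look like through boto3's SageMaker client; the bucket, job name, target column, and IAM role below are placeholders, and the exact request fields may differ slightly from what's shown.

```python
import boto3

sm = boto3.client("sagemaker")

# All names, S3 paths, and the IAM role are hypothetical placeholders.
sm.create_auto_ml_job(
    AutoMLJobName="churn-autopilot-demo",
    InputDataConfig=[
        {
            "DataSource": {
                "S3DataSource": {
                    "S3DataType": "S3Prefix",
                    "S3Uri": "s3://example-bucket/churn/train/",
                }
            },
            # Column Autopilot should learn to predict.
            "TargetAttributeName": "churned",
        }
    ],
    OutputDataConfig={"S3OutputPath": "s3://example-bucket/churn/output/"},
    RoleArn="arn:aws:iam::123456789012:role/ExampleSageMakerRole",
)
```

Autopilot then explores candidate pipelines and generates notebooks describing them, which is where the visibility and interpretability angle comes in.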

A New IDE for Machine Learning

AWS announced SageMaker Studio, a new IDE for machine learning applications.

>Read more in this blog post from AWS


AI in the Real World

Are China’s AI Investments Overblown?

China’s investments in AI seem a bit overblown, based on new numbers published this week.

>Read more in this coverage from MIT Technology Review

AI Assistant Returns to Space

CIMON-2, the second version of the famous floating robot for astronaut assistance, was sent to space as part of the SpaceX ISS resupply mission.

>Read more in this coverage from TechCrunch

The Dilemma Facing Chinese US-Educated AI Engineers

Many US-educated Chinese AI experts face a choice between pursuing great opportunities in the US and returning home to be part of China’s grand AI strategy.

>Read more in this coverage from the South China Morning Post

Written by

CEO of IntoTheBlock, Chief Scientist at Invector Labs, Guest lecturer at Columbia University, Angel Investor, Author, Speaker.
