
How Amazon Scaled Alexa to 1000 Languages

Self-supervised pretraining, transfer learning, and knowledge distillation were among the techniques used to scale Alexa across many languages.

3 min read · Sep 19, 2022
Source: https://www.aboutamazon.in/news/devices/customers-in-india-say-i-love-you-to-alexa-19-000-times-a-day

I recently started an AI-focused educational newsletter that already has over 125,000 subscribers. TheSequence is a no-BS (meaning no hype, no news, etc.) ML-oriented newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers, and concepts. Please give it a try by subscribing below:

In recent years, we have seen an explosion of multilanguage models across different natural language understanding (NLU) tasks. Digital assistants have been one of the most fertile environments for testing multilanguage models at scale. One of the many challenges digital assistants have surfaced is the gap between mastering tasks in high-resource languages like English, French, or Spanish and in low-resource languages that are not spoken by large populations. Building a comprehensive experience across both…
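To make one of the techniques named above concrete, knowledge distillation trains a compact "student" model to mimic the soft predictions of a larger "teacher" model, which is one way to carry capability from high-resource to low-resource settings. The sketch below is a generic PyTorch version of a distillation loss for a toy intent-classification task; it is not Amazon's implementation, and the temperature, alpha, and intent setup are purely illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with standard cross-entropy."""
    # Soften both distributions with the temperature.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between student and teacher, rescaled by T^2 (standard practice).
    kd = F.kl_div(log_probs, soft_targets, reduction="batchmean") * (temperature ** 2)
    # Hard-label loss on the ground-truth intents.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

# Toy usage: 8 utterances, 4 hypothetical intents.
teacher_logits = torch.randn(8, 4)                       # from a frozen teacher
student_logits = torch.randn(8, 4, requires_grad=True)   # from the student being trained
labels = torch.randint(0, 4, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

In practice the teacher would be a large multilingual model and the student a smaller per-device or per-locale model, but those specifics are beyond what this preview covers.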



Written by Jesus Rodriguez

Co-Founder and CTO of Sentora (fka IntoTheBlock), President of LayerLens, Faktory, and NeuralFabric. Founder of The Sequence, Lecturer at Columbia, Wharton.
