How Amazon Scaled Alexa to 1000 Languages
Self-supervised pretraining, transfer learning, and knowledge distillation were among the techniques used to scale Alexa across many languages.
I recently started an AI-focused educational newsletter that already has over 125,000 subscribers. TheSequence is a no-BS (meaning no hype, no news, etc.) ML-oriented newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers, and concepts. Please give it a try by subscribing below:
In recent years, we have seen an explosion of multilanguage models across different natural language understanding (NLU) tasks. Digital assistants have been one of the most fertile environments for testing multilanguage models at scale. One of the many challenges that digital assistants have surfaced is the difference between mastering tasks in high-resource languages like English, French, or Spanish and low-resource languages that are not spoken by large populations. Building a comprehensive experience across both…