The Sequence Scope: The Era of Foundation Models is Here
Weekly newsletter with over 150,000 subscribers that discusses impactful ML research papers, cool tech releases, the money in AI, and real-life implementations.
📝 Editorial: The Era of Foundation Models is Here
The term ‘foundation models’ is becoming one of the hottest buzzwords in machine learning (ML). Researchers from Stanford University originally coined the term to describe models that have been trained on large amounts of unlabeled data and can be fine-tuned for specific domains. Think about fine-tuning GPT-like models for domains such as law or science. Foundation models are shifting the ML development paradigm from creating brand-new models to fine-tuning large pretrained models.
The efforts around foundation models are increasing remarkably fast. Stanford University created the Center for Research on Foundation Models (CRFM), a new initiative focused on studying best practices around foundation models. Just this week, Snorkel AI released Data-centric Foundation Model Development, a new set of additions to the Snorkel Flow platform for fine-tuning and distilling foundation models. Meta AI also unveiled details about MultiRay, their platform for…