Meta AI is Studying the Human Brain to Build Better Language Models

A new collaboration between Meta and neuroscience institutions is comparing large language models with human brain activity patterns.

Jesus Rodriguez
3 min read · May 9, 2022


Source: https://towardsdatascience.com/neural-networks-is-your-brain-like-a-computer-d76fb65824bf

I recently started an AI-focused educational newsletter that already has over 125,000 subscribers. TheSequence is a no-BS (meaning no hype, no news, etc.) ML-oriented newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers and concepts. Please give it a try by subscribing below:

Language is one of the marvels of the human condition and, arguably, the area in which deep learning has made the most progress in the last few years. Natural language understanding (NLU) models like OpenAI’s GPT-3 or Google’s Switch Transformer have certainly pushed the boundaries of what we considered achievable in deep learning. However, despite that progress, those models are nowhere near understanding language the way humans do. Recently, Meta AI Research launched a partnership with several neuroscience institutions to understand how the human brain processes language in order to build better NLU models. The initial results confirm some of the suspicions NLU leaders have had for a long time.

The core of Meta AI’s study is based on using functional magnetic resonance imaging (fMRI) to capture snapshots of brain activity in reaction to language tasks. Meta AI also used magnetoencephalography (MEG), a scanner that takes snapshots of brain activity every millisecond. The combination allowed the researchers to determine which neural activations take place in the brain in response to specific language activities.

Some of the results showcased similarities between the architecture of large language models and the activations in the human brain. For instance, Meta AI discovered that, when reading a word, the brain produces representations that are very similar to those of CNNs trained for character recognition. Those representations are…
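
To make the idea of comparing model representations with brain activity more concrete, here is a minimal sketch of a linear encoding analysis of the kind commonly used in this line of research. It is not Meta AI’s actual pipeline; the array names, sizes, and the random data are purely hypothetical stand-ins for per-word model activations and recorded brain responses.

```python
# Illustrative sketch only -- not Meta AI's actual method.
# `model_activations` stands in for one language-model embedding per word;
# `brain_responses` stands in for fMRI/MEG signals recorded while a subject
# read the same words. Both are random placeholders here.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_words, emb_dim, n_signals = 500, 768, 100          # hypothetical sizes
model_activations = rng.standard_normal((n_words, emb_dim))
brain_responses = rng.standard_normal((n_words, n_signals))

# Fit a linear "encoding model": predict each brain signal from the
# language-model representation of the word being read.
X_train, X_test, y_train, y_test = train_test_split(
    model_activations, brain_responses, test_size=0.2, random_state=0
)
encoder = RidgeCV(alphas=np.logspace(-2, 4, 7)).fit(X_train, y_train)

# Score the mapping: correlate predicted and measured responses per signal.
# Higher correlations suggest the model's representations align with the
# brain activity evoked by the same words.
pred = encoder.predict(X_test)
corrs = [np.corrcoef(pred[:, i], y_test[:, i])[0, 1] for i in range(n_signals)]
print(f"mean prediction correlation: {np.mean(corrs):.3f}")
```

With real recordings in place of the random arrays, the per-signal correlations indicate how well a given layer’s representations account for the measured brain activity, which is the kind of evidence behind the similarities described above.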


Jesus Rodriguez

CEO of IntoTheBlock, President of Faktory, writer of The Sequence Newsletter, guest lecturer at Columbia University and Wharton, angel investor, author, speaker.