Meta building AI that processes speech & text as humans do

New Delhi: Meta (formerly Facebook) has announced a long-term artificial intelligence (AI) research initiative to better understand how the human brain processes speech and text, and to build AI systems that learn the way people do.

In collaboration with neuroimaging center Neurospin (CEA) and Inria, Meta said it is comparing how AI language models and the brain respond to the same spoken or written sentences.

“We’ll use insights from this work to guide the development of AI that processes speech and text as efficiently as people,” the social network said in a statement.

Over the past two years, Meta has applied deep learning techniques to public neuroimaging datasets to analyse how the brain processes words and sentences.

Children learn from just a few examples that “orange” can refer to both a fruit and a colour, but modern AI systems can’t do this as efficiently as people.

Meta research has found that the language models that most resemble brain activity are those that best predict the next word from context (like “once upon a… time”).

“While the brain anticipates words and ideas far ahead in time, most language models are trained to only predict the very next word,” said the company.

Unlocking this long-range forecasting capability could help improve modern AI language models.

Meta recently revealed evidence of long-range predictions in the brain, an ability that still challenges today’s language models.

For the phrase “Once upon a…”, most language models today would typically predict the next word, “time,” but they are still limited in their ability to anticipate complex ideas, plots and narratives the way people do.
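The kind of next-word prediction described above can be illustrated with a toy sketch. This is not Meta's actual model (which uses large neural networks); it is a minimal frequency-based n-gram predictor over an invented miniature corpus, shown only to make the "Once upon a… time" example concrete.

```python
from collections import Counter, defaultdict

# Invented toy corpus; real language models train on vastly more text.
corpus = ("once upon a time there was a story . "
          "once upon a time there lived a king .")
tokens = corpus.split()

# Count which word follows each two-word context (a trigram model).
counts = defaultdict(Counter)
for i in range(len(tokens) - 2):
    context = (tokens[i], tokens[i + 1])
    counts[context][tokens[i + 2]] += 1

def predict_next(w1, w2):
    """Return the most frequent word seen after the context (w1, w2)."""
    if (w1, w2) not in counts:
        return None
    return counts[(w1, w2)].most_common(1)[0][0]

print(predict_next("upon", "a"))  # -> "time"
```

Because "time" follows "upon a" most often in the corpus, the model predicts it, just as the article describes; but a counter like this has no notion of plot or ideas beyond the next token, which is the limitation the research highlights.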

In collaboration with Inria, Meta’s research team compared a variety of language models with the brain responses of 345 volunteers who listened to complex narratives while being recorded with fMRI.

“Our results showed that specific brain regions are best accounted for by language models enhanced with far-off words in the future,” the team said.

–IANS