Google ‘mind-reading’ AI can tell what you’re hearing by monitoring your brain signals

A team of researchers from Google and Osaka University in Japan has developed an AI-powered tool known as Brain2Music.

The tool can reconstruct music from brain scans taken while people listen to it.

The study, posted as a preprint on arXiv and not yet peer reviewed, is the first of its kind. Brain2Music works by analyzing brain imaging data collected from individuals while they listen to music, the Daily Sun has reported.

From that brain activity, the AI generates a new piece of music that matches the original in genre, rhythm, mood, and instrumentation.
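The article does not detail how the decoding works, but a common design for this kind of pipeline is a regularized linear regression that maps fMRI voxel responses to a vector in a learned music-embedding space, which a generative music model then turns into audio. Below is a minimal sketch of such a decoder; the data, array shapes, and regularization strength are all illustrative placeholders, not the study's actual setup.

```python
# Sketch: decode fMRI voxel responses into a music-embedding vector.
# All shapes and data here are synthetic stand-ins; in a real pipeline
# the predicted embedding would condition a generative music model.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_clips, n_voxels, emb_dim = 480, 6000, 128

# Voxel responses per music clip, and the "ground truth" embedding of
# each clip as produced by a pretrained music model (both synthetic).
X = rng.standard_normal((n_clips, n_voxels))   # fMRI features
Y = rng.standard_normal((n_clips, emb_dim))    # music embeddings

# Regularized linear decoder, fit jointly across embedding dimensions.
decoder = Ridge(alpha=100.0)
decoder.fit(X[:400], Y[:400])                  # training split

# Decode embeddings for held-out scans.
pred = decoder.predict(X[400:])
print(pred.shape)  # (80, 128)
```

Linear decoders are a standard choice here because fMRI datasets are small relative to the number of voxels, and the heavy ridge penalty guards against overfitting.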

To feed the AI pipeline, the researchers used functional magnetic resonance imaging (fMRI), a technique that tracks blood-flow changes to reveal regional, time-varying activity in the brain.

This let the AI identify which areas of the brain were active while participants listened to music.
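As a purely illustrative aside, one standard way to locate music-responsive voxels in fMRI data is to correlate each voxel's time series with a regressor derived from the audio, such as its loudness envelope. The sketch below does this on synthetic data and is not the study's own analysis; real analyses would also convolve the regressor with a hemodynamic response model.

```python
# Sketch: flag voxels whose activity tracks the music, by correlating
# every voxel time series with an audio-loudness regressor (synthetic).
import numpy as np

rng = np.random.default_rng(1)
n_timepoints, n_voxels = 300, 5000

envelope = rng.random(n_timepoints)                    # loudness proxy
bold = rng.standard_normal((n_timepoints, n_voxels))   # voxel time series
bold[:, :50] += 2.0 * envelope[:, None]                # plant responsive voxels

# Pearson correlation of each voxel with the regressor.
env_z = (envelope - envelope.mean()) / envelope.std()
bold_z = (bold - bold.mean(axis=0)) / bold.std(axis=0)
r = (bold_z * env_z[:, None]).mean(axis=0)

active = np.flatnonzero(np.abs(r) > 0.3)               # crude threshold
print(f"{active.size} voxels exceed |r| > 0.3")
```

On this toy data the 50 planted voxels clear the threshold while pure-noise voxels do not, which is the basic logic behind mapping "activated areas" from a scan.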

Written by staff