Pink Floyd song reconstructed from patients' brain waves
Scientists have reconstructed a Pink Floyd classic from the recorded brain waves of patients who were undergoing epilepsy surgery while listening to the song.
Researchers at the University of California, Berkeley, in the US, used artificial intelligence (AI) techniques to decode the brain signals, recreating the 1979 hit Another Brick in the Wall, Part 1.
This is the first time scientists have reconstructed a song from recordings of brain activity, according to an article published in the Independent quoting the study authors.
They said the famous phrase "All in all it's just another brick in the wall" is recognisable in the reconstructed song and the rhythms remain intact.
And although the words in the song sound muddy, they are decipherable, the scientists said.
According to the team, the findings, reported in the journal PLOS Biology, show that brain signals can be translated to capture the musical elements of speech (prosody) – such as rhythm, stress, accent and intonation – which convey meaning that words alone cannot express.
While technology that can decode words for people who are unable to speak exists, the researchers said the sentences produced have a robotic quality – much like the way the late Stephen Hawking sounded when he used a speech-generating device.
The scientists believe their work could pave the way for new prosthetic devices that can help improve the perception of the rhythm and melody of speech.
For the study, the researchers analysed brain activity recordings of 29 patients who underwent surgery a decade ago.
A total of 2,668 electrodes were used to record the patients' brain activity, and 347 of them were specifically related to the music.
Analysis of song elements revealed a new region in the brain that represents rhythm – which, in this case, was the guitar rhythm.
The scientists also discovered that some portions of the auditory cortex – located just behind and above the ear – respond at the onset of a voice or a synthesiser, while other areas respond to sustained vocals.
The researchers found that language processing "is more left brain, while music is more distributed, with a bias toward right".