Decoded Brain Waves Reveal Thoughts, Music Playing in Our Heads

  • 19 April 2018 11:02:57 AM
  • By sammire

Tell somebody this premise, and see how quickly they accuse you of science-fiction levels of paranoia or delusion: neuroscientists will soon be able to analyze brain activity in order to read a human’s thoughts.

To be sure, it’s no otherworldly premise. In fact, it’s simply the natural progression of neuroscience, and recent studies involving music have again illustrated just how capable researchers are of peeking into our temporal lobes to find out which thoughts or choruses happen to be running on a mental loop at a given time.

Early findings relating to the decoding of brain waves were revealed publicly in January 2012, when a study out of the University of California, Berkeley detailed how scientists decoded the information a subject was hearing from an ongoing conversation by examining activity in their temporal lobe, the “seat” of the human auditory system. Once the scientists had studied how the sounds the subject heard correlated with their brain activity, they were able to translate the brain activity alone into a surprisingly accurate prediction of what the subject had heard. At the time, the researchers could only imagine the day when the work would lead to innovations allowing for the translation of original thought.

“This is huge for patients who have damage to their speech mechanisms because of a stroke or Lou Gehrig’s disease and can’t speak,” said co-author Robert Knight, a UC Berkeley professor of psychology and neuroscience. “If you could eventually reconstruct imagined conversations from brain activity, thousands of people could benefit.”

As it turns out, the UC Berkeley team led by researcher Brian Pasley continued their mission to decode the inner workings of the brain. In 2014, they showed they could create digitized speech from the brain waves of epileptic subjects who were reading silently. Now they have achieved yet another milestone, this one involving the silent recitation of music within the temporal lobe.

Using brain activity, the team was able to predict what sounds a pianist was thinking of in their head. Pasley explained how they pulled off such a feat, one that holds great potential for those who can think but not speak – such as ALS patients – but will also lead many to consider the more concerning implications of such a capability.

“During auditory perception, when you listen to sounds such as speech or music, we know that certain parts of the auditory cortex decompose these sounds into acoustic frequencies — for example, low or high tones,” Pasley told Digital Trends. “We tested if these same brain areas also process imagined sounds in the same way you internally verbalize the sound of your own voice, or imagine the sound of classical music in a silent room. We found that there was large overlap, but also distinct differences in how the brain represents the sound of imagined music. By building a machine learning model of the neural representation of imagined sound, we used the model to guess with reasonable accuracy what sound was imagined at each instant in time.”
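To make the frequency decomposition Pasley describes concrete, the sketch below computes a spectrogram of a synthetic rising tone – the standard way to split a sound into low and high frequency bands over time. The signal, sample rate, and window size here are illustrative stand-ins, not values from the study.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 16_000  # sample rate in Hz (illustrative)
t = np.linspace(0, 2.0, 2 * fs, endpoint=False)
# A synthetic tone sweeping from low to high pitch stands in for music.
audio = np.sin(2 * np.pi * (220 + 440 * t) * t)

# Short-time Fourier analysis: each column holds the energy in every
# frequency band during one small window of time.
freqs, times, power = spectrogram(audio, fs=fs, nperseg=512)

print(power.shape)  # (frequency bands, time windows)
```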

The experiment was carried out in a similar manner to previous ones: the pianist’s brain activity was measured as they played the notes out loud, and then they were asked to imagine the same sequence of notes instead of playing it. By analyzing this brain activity, the researchers were able to build an algorithm that could be applied as a predictive model.
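As a rough illustration of that train-then-predict recipe, here is a minimal sketch using simulated data: a ridge-regression decoder is fit on neural activity recorded while the notes are played aloud, then reused on activity from an imagined performance. The electrode counts, array shapes, and linear model are assumptions for the demo, not the team’s actual pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_windows, n_electrodes, n_freq_bands = 500, 64, 32

# Simulated "heard" session: neural features plus the true spectrogram,
# linked by a hidden linear mapping with added noise (stand-in data only).
true_weights = rng.normal(size=(n_electrodes, n_freq_bands))
neural_heard = rng.normal(size=(n_windows, n_electrodes))
spectrogram_heard = (neural_heard @ true_weights
                     + 0.1 * rng.normal(size=(n_windows, n_freq_bands)))

# Fit the decoder on the audible trials.
decoder = Ridge(alpha=1.0)
decoder.fit(neural_heard, spectrogram_heard)

# Simulated "imagined" session: new neural activity, no audible sound.
neural_imagined = rng.normal(size=(n_windows, n_electrodes))

# The decoder's output is the model's guess at the imagined sound,
# one spectrogram slice per time window.
predicted_spectrogram = decoder.predict(neural_imagined)
print(predicted_spectrogram.shape)  # (time windows, frequency bands)
```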

Eventually, the Berkeley team hopes to develop a catch-all predictive algorithm to harness the thoughts of those who cannot speak. While they aren’t there yet, the team has made strides, and the latest predictive algorithm as it pertains to music represents significant progress toward the end goal.
