
Reconstructing Music from Human Auditory Cortex Activity Using Nonlinear Decoding Models


Music is core to the human experience, yet the precise neural dynamics underlying music perception remain unknown. Dr. Gerwin Schalk, Director of the Chen Frontier Lab for Neurotechnology, was part of a team of researchers who analyzed a unique intracranial electroencephalography (iEEG) dataset from 29 patients who listened to a Pink Floyd song, applying a stimulus reconstruction approach previously used in the speech domain. Remarkably, the team successfully reconstructed a recognizable song from direct neural recordings and quantified how different factors affected decoding accuracy.

Read the paper on the PLOS Biology website

Photo credit: pinkfloyd.com/