Predicting Dominant Beat Frequency from Brain Responses while Listening to Music

Pankaj Pandey*, Nashra Ahmad, Krishna Prasad Miyapuram, Derek Lomas

*Corresponding author for this work

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

Abstract

Modern neuroscience has shown that the brain is profoundly rhythmic and that the frequencies of neural rhythms are responsive to the frequencies of musical rhythms. We collected electroencephalography (EEG) responses to 12 naturalistic music stimuli (songs) from 20 participants. We retrieved the tempo and its sub-harmonics from our stimuli and used this information to predict the beats in the brain response using machine learning techniques. We observed a hierarchy of beats in each song, with a specific beat frequency dominating (i.e., higher in magnitude than) the others. This led us to form three groups of songs and their brain responses, each group indicating the frequency of the beat that dominated the beat hierarchy of that song. We used short segments of 1, 3, and 5 seconds of brain responses rather than the entire song duration. We further created two sets for classification of the three groups of brain responses and utilized two spatial filtering techniques, the mean across electrodes (ME) and the first principal component (PC1), alongside a Dense method using data from all electrodes. This was followed by feature extraction using band power. We developed univariate and multivariate classification models to demonstrate the significance of each frequency band representing a beat frequency. The Dense method outperformed ME and PC1. Features related to the eighth note produced the greatest discrimination between classes. We also observed a positive correlation between window length and the rate of correct prediction: accuracy improved significantly from the one-second to the five-second window in both sets. We achieved maximum accuracies of 70% for binary and 56% for ternary classification, about 20% above chance-level accuracy. Random Forest and kNN performed better than SVM. This work contributes to the growing body of knowledge on the underlying neural mechanisms of rhythm processing in the brain.
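The abstract outlines a concrete pipeline: segment the EEG into short windows, reduce the electrode array with a spatial filter (ME or PC1), extract band power at beat-related frequencies, and compare Random Forest, kNN, and SVM classifiers. Below is a minimal Python sketch of that pipeline using NumPy and scikit-learn on synthetic data. The sampling rate, band edges, electrode count, window shapes, and classifier settings are all illustrative assumptions, not the parameters reported in the paper.

```python
# Hypothetical sketch of the described pipeline; all constants are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 128                 # assumed EEG sampling rate (Hz)
WINDOW_S = 5             # window length in seconds (1, 3, or 5 in the paper)
BEAT_BANDS = {           # assumed beat-related bands (Hz); in practice these
    "half_note": (0.9, 1.1),      # would be derived from each song's tempo
    "quarter_note": (1.8, 2.2),   # and its sub-harmonics
    "eighth_note": (3.6, 4.4),
}

def spatial_filter(window, method="ME"):
    """Reduce an (electrodes, samples) window to a single time series."""
    if method == "ME":                    # mean across electrodes
        return window.mean(axis=0)
    if method == "PC1":                   # first principal component
        return PCA(n_components=1).fit_transform(window.T).ravel()
    raise ValueError(method)

def band_power_features(signal, fs=FS):
    """Mean power in each beat-related band of the FFT magnitude spectrum."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    return np.array([power[(freqs >= lo) & (freqs <= hi)].mean()
                     for lo, hi in BEAT_BANDS.values()])

# Synthetic stand-in for segmented EEG: (n_windows, n_electrodes, n_samples).
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((120, 32, FS * WINDOW_S))
y = rng.integers(0, 3, size=120)          # three dominant-beat classes

X = np.array([band_power_features(spatial_filter(w, "ME")) for w in X_raw])

for name, clf in [("RF", RandomForestClassifier()),
                  ("kNN", KNeighborsClassifier()),
                  ("SVM", SVC())]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```

The sketch shows only the ME and PC1 reductions; the paper's Dense method would instead concatenate band-power features from every electrode before classification.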

Original language: English
Title of host publication: Proceedings - 2021 IEEE International Conference on Bioinformatics and Biomedicine, BIBM 2021
Editors: Yufei Huang, Lukasz Kurgan, Feng Luo, Xiaohua Tony Hu, Yidong Chen, Edward Dougherty, Andrzej Kloczkowski, Yaohang Li
Place of Publication: Piscataway, NJ, USA
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 3058-3064
Number of pages: 7
ISBN (Electronic): 978-1-6654-0126-5
DOIs
Publication status: Published - 2021
Event: 2021 IEEE International Conference on Bioinformatics and Biomedicine, BIBM 2021 - Virtual, Online, United States
Duration: 9 Dec 2021 - 12 Dec 2021

Publication series

Name: Proceedings - 2021 IEEE International Conference on Bioinformatics and Biomedicine, BIBM 2021

Conference

Conference: 2021 IEEE International Conference on Bioinformatics and Biomedicine, BIBM 2021
Country/Territory: United States
City: Virtual, Online
Period: 9/12/21 - 12/12/21

Keywords

  • EEG
  • Entrainment
  • Machine Learning
  • Rhythm
