
Music Cognition

Relating to music perception and cognition

Model Cortical Responses For The Detection Of Perceptual Onsets And Beat Tracking In Singing

Connection Science
Authors: 
Martin Coath, Susan Denham, Leigh M. Smith, Henkjan Honing, Amaury Hazan, Piotr Holonowicz, Hendrik Purwins

Connection Science, 21(2-3), 2009, pages 193-205

Abstract: 

We describe a biophysically motivated model of auditory salience based on a model of cortical responses and present results that show that the derived measure of salience can be used to identify the position of perceptual onsets in a musical stimulus successfully. The salience measure is also shown to be useful to track beats and predict rhythmic structure in the stimulus on the basis of its periodicity patterns. We evaluate the method using a corpus of unaccompanied freely sung stimuli and show that the method performs well, in some cases better than state-of-the-art algorithms. These results deserve attention because they are derived from a general model of auditory processing and not an arbitrary model achieving best performance in onset detection or beat-tracking tasks.
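The abstract describes tracking beats from the periodicity patterns of a salience signal. As a minimal illustration of that idea (not the paper's cortical model), the sketch below estimates a beat period by picking the strongest autocorrelation lag of a sampled salience trace within a plausible tempo range; the function name, tempo bounds, and sampling rate are all illustrative assumptions.

```python
import numpy as np

def estimate_beat_period(salience, sample_rate, min_bpm=40, max_bpm=200):
    """Estimate a beat period (seconds) from a sampled salience signal
    by picking the strongest autocorrelation lag in a tempo range."""
    s = salience - salience.mean()
    ac = np.correlate(s, s, mode="full")[len(s) - 1:]  # non-negative lags
    min_lag = int(sample_rate * 60.0 / max_bpm)
    max_lag = int(sample_rate * 60.0 / min_bpm)
    lag = min_lag + int(np.argmax(ac[min_lag:max_lag + 1]))
    return lag / sample_rate

# Synthetic salience: impulses every 0.5 s (120 BPM) sampled at 100 Hz.
sr = 100
salience = np.zeros(10 * sr)
salience[::sr // 2] = 1.0
print(estimate_beat_period(salience, sr))  # → 0.5
```

A real salience trace from sung audio is noisier and the tempo drifts, which is why the paper evaluates against freely sung stimuli rather than impulse trains like this one.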

Evaluation of a Multiresolution Model of Musical Rhythm Expectancy on Expressive Performances

RPPW 2009
Authors: 
Leigh M. Smith
Abstract: 

A computational multi-resolution model of musical rhythm expectation has recently been proposed, based on cumulative evidence of rhythmic time-frequency ridges (Smith & Honing 2008a). This model was shown to demonstrate the emergence of musical meter from a bottom-up data processing model, thus clarifying the role of top-down expectation. Such a multiresolution time-frequency model of rhythm has also previously been demonstrated to track musical rubato well, with both synthesised (Smith & Honing 2008b) and performed audio examples (Coath et al. 2009). The model is evaluated for its capability to generate accurate expectations from human musical performances. The musical performances consist of 63 monophonic rhythms from MIDI keyboard performances, and 50 audio recordings of popular music. The model generates expectations as forward predictions of the times of future notes, a confidence weighting of the expectation, and a precision region. Evaluation consisted of generating successive expectations from an expanding fragment of the rhythm. In the case of the monophonic MIDI rhythms, these expectations were then scored by comparison against the onset times of the notes actually performed. The evaluation is repeated across each rhythm. In the case of the audio recording data, where beat annotations exist but individual note onsets are not annotated, forward expectation is measured against the beat period. Scores were computed using the information retrieval measures of precision, recall and F-score (van Rijsbergen 1979) for each performance. Preliminary results show mean PRF scores of (0.297, 0.370, 0.326) for the MIDI performances, indicating performance well above chance (0.177, 0.219, 0.195), but well below perfection. A model of expectation of musical rhythm has thus been shown to be computable. This can be used as a measure of rhythmic complexity, by measuring the degree of contradiction to expectation. As such, a rhythmic complexity measure is then applicable in models of rhythmic similarity used in music information retrieval applications.
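The PRF scoring described above can be sketched as follows: predicted onset times are greedily matched one-to-one against performed onset times within a tolerance window, and precision, recall, and F-score are computed from the match counts. The function name and the 50 ms tolerance are assumptions for illustration; the paper does not specify its matching window here.

```python
def prf_score(predicted, actual, tolerance=0.05):
    """Precision, recall and F-score for predicted onset times against
    performed onset times, with greedy one-to-one matching within a
    tolerance window (in seconds)."""
    remaining = list(actual)
    hits = 0
    for p in sorted(predicted):
        match = next((a for a in remaining if abs(a - p) <= tolerance), None)
        if match is not None:
            hits += 1
            remaining.remove(match)
    precision = hits / len(predicted) if predicted else 0.0
    total_actual = hits + len(remaining)
    recall = hits / total_actual if total_actual else 0.0
    f = (2 * precision * recall / (precision + recall)
         if (precision + recall) else 0.0)
    return precision, recall, f

# Three predictions, two of which fall within 50 ms of a performed onset.
p, r, f = prf_score([0.51, 1.00, 1.60], [0.50, 1.02, 1.48])  # each ≈ 0.667
```

Averaging such per-performance triples over a corpus yields mean PRF scores like the (0.297, 0.370, 0.326) reported above.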

Rhythmic Similarity Using Metrical Profile Matching

ICMC 2010
Authors: 
Leigh M. Smith
Abstract: 

A method for computing the similarity of metrical rhythmic patterns is described as applied to the audio signal of recorded music. For each rhythm, a combined feature vector of metrical profile and syncopation, separated by spectral subbands, hypermetrical profile, and tempo is compared. The descriptive capability of this feature vector is evaluated by its use in a machine learning rhythm classification task, identifying ballroom dance styles using a support vector machine algorithm. Results indicate that the full feature vector achieves a classification accuracy of 67%. This improves on previous results using rhythmic patterns alone, but does not exceed the best reported results. By evaluating individual features, the metrical profile, syncopation, and hypermetrical profile measures are found to play a greater role than tempo in aiding discrimination.
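Assembling the combined feature vector described above amounts to concatenating the per-subband metrical and syncopation profiles with the hypermetrical profile and the tempo. The sketch below shows that concatenation; the function name and all dimensions (4 subbands, 16 metrical positions, a 64-position hypermetrical profile) are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def rhythm_feature_vector(metrical_profiles, syncopation_profiles,
                          hypermetrical_profile, tempo_bpm):
    """Concatenate per-subband metrical and syncopation profiles, a
    hypermetrical profile, and tempo into one feature vector."""
    parts = [np.ravel(metrical_profiles),      # subbands x metrical positions
             np.ravel(syncopation_profiles),   # subbands x syncopation values
             np.ravel(hypermetrical_profile),
             np.array([tempo_bpm], dtype=float)]
    return np.concatenate(parts)

# Hypothetical dimensions: 4 subbands x 16 positions each, 64-position
# hypermetrical profile, plus one tempo value.
fv = rhythm_feature_vector(np.zeros((4, 16)), np.zeros((4, 16)),
                           np.zeros(64), 120.0)
print(fv.shape)  # (193,)
```

Vectors of this form, one per recording, would then be used to train and evaluate the SVM classifier over the ballroom dance styles.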

Detecting onsets and beat tracking in singing

As part of my work with the EmCAP project, a paper I collaborated on has finally appeared. It reports work carried out with researchers in Plymouth (Martin, Sue) and Barcelona (Hendrik, Amaury, Piotr), as well as with Henkjan. We computationally modelled the human perception of musical note onsets and the tracking of the musical beat. The software determines the beat period from unaccompanied singing with variable tempo.

EmCAP in the news

PhysOrg does a good job of explaining the findings of the EmCAP project, of which I was a part while working at UvA with Henkjan and Olivia, and collaborating with Martin, Sue, Emili, Ricard, Hendrik and everyone else in the EmCAP project. Great work, everyone!

Newborns Sense the Beat

My boss at the Universiteit van Amsterdam, Dr. Henkjan Honing, is interviewed by WNYC about a study done with Olivia Ladinig, my co-worker, and the group in Budapest on the ability of newborn babies to detect the beat of musical rhythms. Science Daily has further details and a link to the paper.

EmCAP showcase now available

The previous project I was working on at the Uni of Amsterdam, EmCAP, has a showcase of the work that each of the academic partners contributed. My computational model of the representation of musical rhythm, written in Common Lisp, is now available.

Listening Experiments

A project I've been involved with at the Music Cognition Group of the UvA is an online experiment into the ability of listeners to discriminate timing and tempo changes in Classical, Jazz and Rock music.

Rhythmic Contour Map of Ravel's "Bolero"


A "Rhythmic Contour Map" combining magnitude and phase outputs from the continuous wavelet analysis of a portion of the snare drum rhythm of Maurice Ravel's "Bolero".
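The map is built from the magnitude and phase of a continuous wavelet transform of the rhythm signal. As a minimal, self-contained sketch of that analysis (not the Common Lisp implementation above), the code below correlates an impulse-train rhythm with a complex Morlet wavelet at a range of scales; the function name, wavelet support of four standard deviations, and scale range are all assumptions for illustration.

```python
import numpy as np

def morlet_cwt(signal, scales, omega0=6.0):
    """Minimal continuous wavelet transform with a complex Morlet wavelet.
    Returns a (len(scales), len(signal)) complex array whose magnitude and
    phase can be plotted as a rhythmic contour map."""
    n = len(signal)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Sampled, scale-normalised Morlet wavelet over +/- 4 scale widths.
        t = np.arange(-4 * s, 4 * s + 1)
        wavelet = (np.exp(1j * omega0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
                   / np.sqrt(s))
        # Cross-correlate the signal with the wavelet at this scale.
        out[i] = np.convolve(signal, np.conj(wavelet)[::-1], mode="same")
    return out

# Impulse train resembling an isochronous rhythm.
x = np.zeros(256)
x[::32] = 1.0
scalogram = morlet_cwt(x, scales=np.arange(2, 31))
print(scalogram.shape)  # (29, 256)
```

Plotting np.abs(scalogram) shows ridges at scales matching the rhythm's periodicities, while np.angle(scalogram) carries the phase information that, combined with magnitude, gives a contour map like the one of the "Bolero" snare rhythm.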
