Gestural extraction from musical audio signals

Bolton, Jered (2004) Gestural extraction from musical audio signals. PhD thesis, University of Glasgow.

Full text available as:
PDF: 2004boltonphd.pdf (7MB)
Printed Thesis Information: https://eleanor.lib.gla.ac.uk/record=b2257051

Abstract

Conventional exploration of the gestures normally associated with musical instruments can be a costly and intrusive process. This thesis presents a novel approach to gestural extraction which overcomes these problems. The motivation behind this research is that the result of gestural input can be heard, and can therefore be extracted from the acoustic signal produced by a musical instrument. Accordingly, the guiding principles of this work are taken from the human auditory system.

The concept of temporal grouping, and the fact that any sound reaching the inner ear is conveyed to the brain, are two features of the auditory system mimicked by the presented system. Pertinent definitions are proposed for the sections of the note envelope, and musical instrument gestures are classified according to whether they are responsible for excitation or control.

The extraction of gestural information is dependent upon successful identification of note events. A note tracking system is presented which exploits the structure of a note in order to perform preliminary note onset detection. A backtracking function is employed to regress through the auditory data, providing a means of assigning an individual start point to each note harmonic. The note tracking system also records the end point of each note harmonic. Note information is validated by a bespoke musical comparison system which provides a means of comparing and evaluating different note detection methods.
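
To illustrate the backtracking idea only (this is a minimal sketch, not the thesis's algorithm), the following Python fragment assumes an STFT magnitude matrix, a preliminary onset frame from a coarse detector, and a known fundamental bin, then walks each harmonic track backwards (and forwards) past a level threshold to assign per-harmonic start and end points:

import numpy as np

def track_note_harmonics(magnitudes, onset_frame, f0_bin,
                         n_harmonics=5, threshold_db=-40.0):
    """Illustrative sketch: backtrack from a preliminary onset to
    per-harmonic start points, and scan forward to end points.

    magnitudes  : STFT magnitude matrix, shape (frames, bins)
    onset_frame : preliminary onset index from a coarse detector (assumed given)
    f0_bin      : bin index of the fundamental (assumed known)
    """
    ref = magnitudes.max() + 1e-12
    levels_db = 20.0 * np.log10(magnitudes / ref + 1e-12)
    n_frames = magnitudes.shape[0]
    spans = {}
    for h in range(1, n_harmonics + 1):
        bin_h = f0_bin * h
        if bin_h >= magnitudes.shape[1]:
            break
        track = levels_db[:, bin_h]
        # Backtrack: regress through the frames before the preliminary onset
        # while this harmonic is still above the threshold.
        start = onset_frame
        while start > 0 and track[start - 1] > threshold_db:
            start -= 1
        # Scan forward to record where the harmonic decays below the threshold.
        end = onset_frame
        while end < n_frames - 1 and track[end + 1] > threshold_db:
            end += 1
        spans[h] = (start, end)
    return spans

The per-harmonic (start, end) pairs returned here correspond to the individual start and end points the note tracking system records for each harmonic.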

Information provided by the note tracking system is used to extract gestural information regarding oboe key presses and excitation (articulation) methods of string instruments. System tests show that it is possible to correctly distinguish between bowed and plucked notes with an 89% success rate, using only three discriminators associated with the onset of a note.
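
As an illustration of onset-based discrimination between bowed and plucked notes, the sketch below uses three hypothetical features (attack time, attack slope, and early-energy ratio) as stand-ins, since the abstract does not name the thesis's actual discriminators; the threshold rule is a toy decision boundary, not the reported classifier:

import numpy as np

def onset_discriminators(envelope, frame_rate_hz):
    """Three hypothetical onset features: attack time, attack slope,
    and the ratio of attack energy to total note energy."""
    peak = int(envelope.argmax())
    peak_val = envelope[peak] + 1e-12
    rise_10 = int(np.argmax(envelope >= 0.1 * peak_val))
    rise_90 = int(np.argmax(envelope >= 0.9 * peak_val))
    attack_time = max(rise_90 - rise_10, 1) / frame_rate_hz   # seconds
    attack_slope = (envelope[rise_90] - envelope[rise_10]) / attack_time
    attack_energy = float(np.sum(envelope[:peak + 1] ** 2))
    total_energy = float(np.sum(envelope ** 2)) + 1e-12
    return np.array([attack_time, attack_slope, attack_energy / total_energy])

def classify_bowed_vs_plucked(features, attack_time_threshold=0.04):
    # Toy rule: plucked notes rise to their peak much faster than bowed notes.
    return "plucked" if features[0] < attack_time_threshold else "bowed"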

In this thesis the foundations of a multifaceted gestural extraction system are presented, with useful potential for further development.

Item Type: Thesis (PhD)
Qualification Level: Doctoral
Subjects: M Music and Books on Music > M Music
T Technology > TK Electrical engineering. Electronics. Nuclear engineering
Colleges/Schools: College of Science and Engineering > School of Engineering > Electronics and Nanoscale Engineering
Supervisor's Name: Supervisor, not known
Date of Award: 2004
Depositing User: Ms Anikó Szilágyi
Unique ID: glathesis:2004-5922
Copyright: Copyright of this thesis is held by the author.
Date Deposited: 14 Jan 2015 14:33
Last Modified: 14 Jan 2015 14:34
URI: https://theses.gla.ac.uk/id/eprint/5922
