Article · Open access · Peer reviewed

Automatic Extraction of Tempo and Beat From Expressive Performances

2001; Journal of New Music Research (Routledge); Volume: 30; Issue: 1; Language: English

10.1076/jnmr.30.1.39.7119

ISSN

1744-5027

Author

Simon Dixon

Topic(s)

Music Technology and Sound Studies

Abstract

We describe a computer program that estimates the tempo and the times of musical beats in expressively performed music. The input data may be either digital audio or a symbolic representation of music such as MIDI. The data is processed off-line to detect the salient rhythmic events, and the timing of these events is analysed to generate tempo hypotheses at various metrical levels. Based on these tempo hypotheses, a multiple-hypothesis search finds the sequence of beat times that best fits the rhythmic events. We show that estimating the perceptual salience of rhythmic events significantly improves the results. No prior knowledge of the tempo, meter, or musical style is assumed; all required information is derived from the data. Results are presented for a range of musical styles, including classical, jazz, and popular works with a variety of tempi and meters. The system calculates the tempo correctly in most cases, the most common error being a doubling or halving of the tempo. The calculation of beat times is also robust: when errors are made concerning the phase of the beat, the system recovers quickly and resumes correct beat tracking, despite the fact that no high-level musical knowledge is encoded in the system.
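The tempo-hypothesis stage described above can be illustrated with a small sketch: inter-onset intervals (IOIs) between pairs of detected rhythmic events are clustered, and each cluster centre becomes a candidate beat period, ranked by how many intervals support it. This is an illustrative reconstruction, not the paper's actual implementation; the cluster width, IOI bounds, and greedy clustering strategy are assumptions chosen for clarity.

```python
# Illustrative sketch of IOI clustering for tempo hypothesis generation.
# All thresholds (cluster_width, min_ioi, max_ioi) are assumed values,
# not taken from the paper.

def tempo_hypotheses(onset_times, cluster_width=0.025, min_ioi=0.07, max_ioi=2.5):
    """Cluster inter-onset intervals between pairs of onsets (in seconds);
    return candidate beat periods ranked by the number of supporting intervals."""
    # Collect intervals between all pairs of onsets within the allowed range,
    # so that intervals spanning several events can also support a hypothesis.
    iois = []
    for i, t1 in enumerate(onset_times):
        for t2 in onset_times[i + 1:]:
            ioi = t2 - t1
            if min_ioi <= ioi <= max_ioi:
                iois.append(ioi)
    # Greedy clustering: assign each interval to the nearest existing
    # cluster centre within cluster_width, or start a new cluster.
    clusters = []  # each entry is [sum_of_iois, count]
    for ioi in sorted(iois):
        for c in clusters:
            if abs(ioi - c[0] / c[1]) <= cluster_width:
                c[0] += ioi
                c[1] += 1
                break
        else:
            clusters.append([ioi, 1])
    # Rank cluster centres (candidate beat periods) by support.
    ranked = sorted(clusters, key=lambda c: -c[1])
    return [(c[0] / c[1], c[1]) for c in ranked]


# Hypothetical usage: onsets roughly every 0.5 s suggest a 120 BPM pulse,
# with the doubled and halved tempi appearing as weaker hypotheses,
# mirroring the doubling/halving errors noted in the abstract.
onsets = [0.0, 0.51, 1.0, 1.49, 2.0, 2.5]
for period, support in tempo_hypotheses(onsets)[:3]:
    print(f"period ~{period:.2f}s ({60.0 / period:.0f} BPM), support {support}")
```

The ranked hypotheses would then seed the multiple-hypothesis beat-time search; note that metrically related periods (e.g. 0.5 s and 1.0 s) both receive support, which is why tempo doubling or halving is the system's most common error.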
