MidiBERT-Piano: Large-scale Pre-training for Symbolic Music Classification Tasks

Abstract

This article presents a benchmark study of symbolic piano music classification using the masked language modelling approach of Bidirectional Encoder Representations from Transformers (BERT). Specifically, we consider two types of MIDI data: MIDI scores, which are musical scores rendered directly into MIDI with no dynamics and precisely aligned with the metrical grids notated by their composers; and MIDI performances, which are MIDI encodings of human performances of musical scoresheets. With five public-domain datasets of single-track piano MIDI files, we pre-train two 12-layer Transformer models using the BERT approach, one for MIDI scores and the other for MIDI performances, and fine-tune them for four downstream classification tasks. These include two note-level classification tasks (melody extraction and velocity prediction) and two sequence-level classification tasks (style classification and emotion classification). Our evaluation shows that the BERT approach leads to higher classification accuracy than recurrent neural network (RNN)-based baselines.
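The pre-training step described above follows BERT's masked language modelling recipe: a fraction of the input tokens is corrupted, and the model is trained to recover the originals. The sketch below illustrates the standard BERT masking scheme (15% of positions selected; of those, 80% replaced with a mask token, 10% with a random token, 10% left unchanged). It is an illustrative sketch only, not the authors' implementation; `MASK_ID` and `VOCAB_SIZE` are hypothetical placeholders for the MIDI token vocabulary.

```python
import random

MASK_ID = 1       # hypothetical id of the [MASK] token
VOCAB_SIZE = 100  # hypothetical size of the MIDI token vocabulary

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style corruption: select ~mask_prob of positions; replace 80%
    of the selected tokens with MASK_ID, 10% with a random token, and keep
    10% unchanged. Returns (corrupted, labels), where labels hold the
    original token id at selected positions and -100 (a conventional
    'ignore' value for the loss) everywhere else."""
    rng = random.Random(seed)
    corrupted, labels = list(tokens), [-100] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # model must predict the original token here
            r = rng.random()
            if r < 0.8:
                corrupted[i] = MASK_ID
            elif r < 0.9:
                corrupted[i] = rng.randrange(VOCAB_SIZE)
            # else: leave the token unchanged (still predicted)
    return corrupted, labels
```

During fine-tuning, the masking is dropped and the pre-trained encoder's outputs are fed to a small classification head, per token for the note-level tasks and pooled over the sequence for the sequence-level tasks.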

Keywords

Large-scale pre-trained model, Transformer, symbolic-domain music classification, melody recognition, velocity prediction, artist classification, emotion classification

How to Cite

Chou, Y., Chen, I., Ching, J., Chang, C. & Yang, Y., (2024) “MidiBERT-Piano: Large-scale Pre-training for Symbolic Music Classification Tasks”, Journal of Creative Music Systems 8(1). doi: https://doi.org/10.5920/jcms.1064

Authors

Yi-Hui Chou (Carnegie Mellon University)
I-Chun Chen (National Tsing Hua University)
Joann Ching (Academia Sinica, Taiwan)
Chin-Jui Chang (Academia Sinica, Taiwan)
Yi-Hsuan Yang (Academia Sinica, Taiwan)

Licence

Creative Commons Attribution 4.0

Peer Review

This article has been peer reviewed.

File Checksums (MD5)

  • PDF: 7e5313e5ecf8a1fccd5515856afba09e