MusicNet is a collection of 330 freely licensed classical music recordings, together with over 1 million annotated labels indicating the precise time of each note in every recording, the instrument that plays each note, and the note's position in the metrical structure of the composition. The labels were acquired by aligning musical scores to the recordings with dynamic time warping and were then verified by trained musicians; we estimate a labeling error rate of 4%. We offer the MusicNet labels to the machine learning and music communities as a resource for training models and a common benchmark for comparing results.
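The description above attributes the labels to score-to-audio alignment by dynamic time warping (DTW). The sketch below is a minimal, generic DTW alignment in Python with NumPy, not the authors' pipeline; the function name and the feature inputs (`score_feats`, `audio_feats`, e.g. framewise chroma rendered from the score and computed from the recording) are illustrative assumptions.

```python
import numpy as np

def dtw_align(score_feats, audio_feats):
    """Align two feature sequences with classic dynamic time warping.

    score_feats: (N, D) array of framewise features derived from the score.
    audio_feats: (M, D) array of framewise features from the recording.
    Returns the optimal warping path as a list of (score_idx, audio_idx) pairs.
    """
    n, m = len(score_feats), len(audio_feats)
    # Pairwise cost: Euclidean distance between every score frame and audio frame.
    cost = np.linalg.norm(score_feats[:, None, :] - audio_feats[None, :, :], axis=-1)

    # Accumulated cost matrix with the standard steps (1, 0), (0, 1), (1, 1).
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = cost[i - 1, j - 1] + min(
                acc[i - 1, j - 1], acc[i - 1, j], acc[i, j - 1]
            )

    # Backtrack from the end to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([acc[i - 1, j - 1], acc[i - 1, j], acc[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]
```

Each pair on the returned path maps a score frame to an audio frame, from which per-note onset and offset times in the recording can be read off.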
Source: MusicNet
Variants: MusicNet
This dataset is used in 1 benchmark:
| Task | Model | Paper | Date |
|---|---|---|---|
| Music Transcription | Residual Shuffle-Exchange network | Residual Shuffle-Exchange Networks for Fast … | 2020-04-06 |
| Music Transcription | Complex Transformer | Complex Transformer: A Framework for … | 2019-10-22 |
| Music Transcription | Concatenated Transformer | Complex Transformer: A Framework for … | 2019-10-22 |
| Music Transcription | Deep Complex Network | Deep Complex Networks | 2017-05-27 |
| Music Transcription | Deep Real Network | Deep Complex Networks | 2017-05-27 |
| Music Transcription | CNN (64 stride) | Learning Features of Music from … | 2016-11-29 |