
Purchase Erhu Dataset

The erhu is a classic Chinese two-stringed bowed instrument with a distinct sound palette. Our dataset enables machine learning models to understand the intricacies of erhu music, leading to breakthroughs in generative AI, Music Information Retrieval (MIR), and source separation.


Dataset Specifications

Total Audio Tracks: up to 100k erhu tracks
Type: Instrument (Erhu)
File Formats: WAV, FLAC, MP3, CSV, JSON

Per-track metadata includes (an example record follows this list):

  • Duration
  • Key
  • Tempo
  • BPM Range
  • Mood
  • Energy
  • Description
  • Keywords
  • Chord Progressions
  • Timestamps
  • Time Signature
  • Number of Bars
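
For illustration only, here is a minimal sketch of reading one track's record from the JSON metadata export. The file name and field keys are hypothetical stand-ins for the fields listed above; the actual schema ships with the dataset.

    import json

    # Hypothetical file name for the dataset's JSON metadata export;
    # each record is assumed to describe one track using fields like those above.
    with open("erhu_metadata.json", "r", encoding="utf-8") as f:
        tracks = json.load(f)

    first = tracks[0]
    print(first["key"], first["tempo"], first["time_signature"])
    print(first["mood"], ", ".join(first["keywords"]))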

Licensing Options

Annual License: Contact Us
Perpetual License: Contact Us

Free audio sample available

The erhu is a classic Chinese two-stringed bowed instrument with a millennia-long history and a distinct sound palette. Known for its expressive, emotive voice, it holds an important place in many musical genres, which makes well-annotated erhu recordings a strong foundation for generative AI, Music Information Retrieval (MIR), and source separation work.

Each track in the dataset is annotated with chords, instrumentation, key, tempo, and timestamps, giving models a thorough picture of the erhu's acoustic fingerprint. By training on this data, developers can build AI systems capable of producing culturally rich compositions influenced by the erhu's traditional tones. The detailed annotations also simplify MIR tasks by improving the analysis and classification of erhu-specific audio patterns.
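
As a minimal sketch of how this metadata could drive dataset curation, the snippet below assumes the CSV export has columns corresponding to the fields above (the column names here are hypothetical) and selects a tempo- and key-constrained subset for training:

    import pandas as pd

    # Hypothetical column names; adjust to the actual CSV header.
    meta = pd.read_csv("erhu_metadata.csv")

    # Keep moderate-tempo tracks in D for a style-consistent training subset.
    subset = meta[meta["bpm"].between(60, 90) & (meta["key"] == "D")]
    print(f"{len(subset)} of {len(meta)} tracks selected")

    # The selected rows can then be matched to their audio files
    # (WAV/FLAC/MP3) to assemble a model-specific training set.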

The erhu dataset is also a strong resource for source separation. Detailed metadata helps models learn to recognize and isolate the erhu's sound within a mix, supporting the development of audio processing technologies. Ground your machine learning efforts in the rich heritage of the Chinese violin and push the boundaries of musical invention with the Erhu Dataset.
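
A common way to use a clean single-instrument corpus like this for source separation is to synthesize training mixtures by adding erhu tracks to unrelated backing material at a controlled level. The sketch below is one such recipe under stated assumptions: the file names are hypothetical, both recordings share a sample rate, and the numpy and soundfile libraries are available.

    import numpy as np
    import soundfile as sf

    # Hypothetical files: a clean erhu track from the dataset and an
    # unrelated backing track recorded at the same sample rate.
    erhu, sr = sf.read("erhu_track_0001.wav")
    backing, sr_b = sf.read("backing_track.wav")
    assert sr == sr_b, "resample one signal if the rates differ"

    # Collapse to mono and trim both signals to the shorter length.
    if erhu.ndim > 1:
        erhu = erhu.mean(axis=1)
    if backing.ndim > 1:
        backing = backing.mean(axis=1)
    n = min(len(erhu), len(backing))
    erhu, backing = erhu[:n], backing[:n]

    # Scale the backing track so the mixture sits near 0 dB SNR.
    gain = np.sqrt(np.mean(erhu ** 2) / (np.mean(backing ** 2) + 1e-12))
    mixture = erhu + gain * backing

    # The (mixture, isolated erhu) pair is one training example for a
    # separation model whose target is the clean erhu stem.
    sf.write("mixture_0001.wav", mixture, sr)
    sf.write("target_0001.wav", erhu, sr)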


