Open Access Article

Title: Digital dance generation and application based on hybrid density network

Authors: Qian Lu

Addresses: Conservatory of Music, Huanggang Normal University, Huanggang 438000, China

Abstract: This article proposes a digital dance generation method based on a mixture density network (MDN), aiming to effectively capture and generate complex dance action sequences. First, we analyse the temporal dependencies and diverse features of dance movements, and design a multimodal temporal generation framework that combines an MDN with long short-term memory (LSTM) networks to capture the dynamic correlations and pose changes between dance movements. Given music or rhythm information as input, the framework generates action sequences that match the musical style with high continuity, coordination, and naturalness. We evaluate the dance motions generated by the model on a publicly available dance dataset and verify the effectiveness of the method through both subjective and objective quantitative indicators. The experimental results show that, compared with traditional generative models, the MDN-based model improves the fluency, naturalness, and diversity of the generated movements.
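
The paper itself provides no implementation details beyond the abstract; the following is a minimal, illustrative sketch of how an LSTM-plus-MDN generator of the kind described might look in PyTorch. All class names, feature dimensions, and hyperparameters (music_dim, pose_dim, number of mixture components, etc.) are assumptions for illustration, not the authors' code.

```python
# Illustrative sketch (not the paper's implementation): an LSTM over music/rhythm
# features and the previous pose, followed by a mixture density head that
# parameterises a Gaussian mixture over the next pose vector.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MDNDanceGenerator(nn.Module):
    def __init__(self, music_dim=64, pose_dim=51, hidden_dim=256, n_mixtures=8):
        super().__init__()
        self.n_mixtures = n_mixtures
        self.pose_dim = pose_dim
        # Temporal model over the conditioning sequence (music features + previous pose).
        self.lstm = nn.LSTM(music_dim + pose_dim, hidden_dim,
                            num_layers=2, batch_first=True)
        # MDN heads: mixture weights, component means, and per-dimension std devs.
        self.pi_head = nn.Linear(hidden_dim, n_mixtures)
        self.mu_head = nn.Linear(hidden_dim, n_mixtures * pose_dim)
        self.sigma_head = nn.Linear(hidden_dim, n_mixtures * pose_dim)

    def forward(self, music_feats, prev_poses, state=None):
        # music_feats: (B, T, music_dim); prev_poses: (B, T, pose_dim)
        x = torch.cat([music_feats, prev_poses], dim=-1)
        h, state = self.lstm(x, state)
        B, T, _ = h.shape
        log_pi = F.log_softmax(self.pi_head(h), dim=-1)                  # (B, T, K)
        mu = self.mu_head(h).view(B, T, self.n_mixtures, self.pose_dim)  # (B, T, K, D)
        sigma = F.softplus(self.sigma_head(h)).view(B, T, self.n_mixtures, self.pose_dim) + 1e-4
        return log_pi, mu, sigma, state

    def nll_loss(self, log_pi, mu, sigma, target_poses):
        # Negative log-likelihood of the target pose under the predicted mixture.
        target = target_poses.unsqueeze(2)                               # (B, T, 1, D)
        comp = torch.distributions.Normal(mu, sigma)
        log_prob = comp.log_prob(target).sum(dim=-1)                     # (B, T, K)
        return -torch.logsumexp(log_pi + log_prob, dim=-1).mean()

    @torch.no_grad()
    def sample_step(self, log_pi, mu, sigma):
        # Sample one pose from the mixture predicted at the last time step.
        k = torch.distributions.Categorical(logits=log_pi[:, -1]).sample()   # (B,)
        idx = k.view(-1, 1, 1).expand(-1, 1, self.pose_dim)
        mu_k = mu[:, -1].gather(1, idx).squeeze(1)
        sigma_k = sigma[:, -1].gather(1, idx).squeeze(1)
        return torch.normal(mu_k, sigma_k)                                    # (B, pose_dim)
```

At generation time such a model would typically be run autoregressively: the music features for the next frame and the previously sampled pose are fed back as input, so the resulting pose sequence remains conditioned on the music throughout.
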

Keywords: deep learning; computer music choreography; feature extraction; action filtering.

DOI: 10.1504/IJICT.2025.144054

International Journal of Information and Communication Technology, 2025 Vol.26 No.2, pp.51 - 66

Received: 08 Dec 2024
Accepted: 18 Dec 2024

Published online: 22 Jan 2025