Dance-based exergames such as Just Dance offer an enjoyable way to improve physical fitness and cognitive function. However, designing dance routines requires expertise in modeling and animation. We introduce an augmented reality (AR) personalized dance generation framework that synthesizes dance routines at specified physical and cognitive intensities. Our system draws on a curated library of motion-capture dance segments, which are combined through an optimization process to meet user-defined physical and cognitive goals while ensuring smooth transitions between movements for a natural dance flow. Users can further customize routines by specifying physical constraints or injuries. We implemented the framework in a depth-camera-based exergame that provides real-time performance feedback, and evaluated it through experiments and user studies, which confirmed its effectiveness in generating personalized routines with varying levels of physical and cognitive intensity.
The process begins by sampling dance segments from a curated library to form an initial sequence. An iterative optimization procedure then refines this sequence by minimizing a weighted combination of cost terms, including duration, transition smoothness, motion variety, and physical intensity, until convergence criteria are met.
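To make this concrete, the following is a minimal sketch of such a sampling-and-refinement loop. The segment fields, cost weights, scalar transition metric, and greedy acceptance rule are all illustrative assumptions made for exposition; they are not the framework's actual cost formulation.

```python
import random

# Hypothetical segment record: a dict with keys "duration" (seconds),
# "intensity", "pose_start", "pose_end" (scalar pose descriptors here;
# a real system would compare full joint configurations), and "move_id".
# All weights below are illustrative assumptions.
W_DURATION, W_TRANSITION, W_VARIETY, W_INTENSITY = 1.0, 2.0, 1.5, 3.0

def transition_cost(a, b):
    # Penalize mismatch between the end pose of one segment and the
    # start pose of the next (proxy for transition smoothness).
    return abs(a["pose_end"] - b["pose_start"])

def cost(seq, target_duration, target_intensity):
    duration = sum(s["duration"] for s in seq)
    intensity = sum(s["intensity"] * s["duration"] for s in seq) / max(duration, 1e-9)
    transitions = sum(transition_cost(a, b) for a, b in zip(seq, seq[1:]))
    variety = 1.0 - len({s["move_id"] for s in seq}) / len(seq)  # fraction of repeated moves
    return (W_DURATION * abs(duration - target_duration)
            + W_TRANSITION * transitions
            + W_VARIETY * variety
            + W_INTENSITY * abs(intensity - target_intensity))

def optimize(library, target_duration, target_intensity, n_init=8, iters=2000, seed=0):
    rng = random.Random(seed)
    seq = rng.sample(library, n_init)            # initial sequence sampled from the library
    best = cost(seq, target_duration, target_intensity)
    for _ in range(iters):
        cand = list(seq)
        i = rng.randrange(len(cand))
        if rng.random() < 0.5 and len(cand) > 1:
            j = rng.randrange(len(cand))         # swap two segments
            cand[i], cand[j] = cand[j], cand[i]
        else:
            cand[i] = rng.choice(library)        # replace one segment with a library draw
        c = cost(cand, target_duration, target_intensity)
        if c < best:                             # greedy acceptance
            seq, best = cand, c
    return seq, best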
In the Training Phase, users freely explored the system with real-time feedback on their movement and heart rate. In the Performance Phase, a virtual guide delivered live choreography while a preview panel in the lower right prompted upcoming moves. This design encouraged both physical engagement and learning.
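One way to picture the two-phase structure is a per-frame update that dispatches on the current phase; the sketch below is purely hypothetical, with the `Phase` enum, `Move` record, and return fields invented for illustration rather than taken from the actual system.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Phase(Enum):
    TRAINING = auto()
    PERFORMANCE = auto()

@dataclass
class Move:
    name: str
    duration: float  # seconds

def frame_update(phase, routine, t, heart_rate):
    """One tick of a hypothetical exergame loop (all names are illustrative)."""
    if phase is Phase.TRAINING:
        # Free exploration: surface only live movement/heart-rate feedback.
        return {"guide": None, "preview": None, "heart_rate": heart_rate}
    # Performance: locate the current move and preview the next one
    # (shown in the lower-right panel).
    elapsed = 0.0
    for i, move in enumerate(routine):
        if t < elapsed + move.duration:
            nxt = routine[i + 1].name if i + 1 < len(routine) else None
            return {"guide": move.name, "preview": nxt, "heart_rate": heart_rate}
        elapsed += move.duration
    return {"guide": None, "preview": None, "heart_rate": heart_rate}
```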
Experimental results under four training goals: “Light-Easy”, “Light-Hard”, “Intense-Easy”, and “Intense-Hard”. We visualize joint-level intensities by normalizing intensity values across joints and mapping them to a color scale.
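The normalization-and-colormap step can be reproduced in a few lines. The min-max normalization and the "jet" colormap below are assumptions, since the caption does not name the exact color scale, and the joint names and intensity values are fabricated solely for illustration, not measured data.

```python
import numpy as np
import matplotlib.pyplot as plt

def joint_colors(intensities, cmap=plt.cm.jet):
    """Min-max normalize per-joint intensities to [0, 1], then map to RGBA colors."""
    v = np.asarray(intensities, dtype=float)
    span = v.max() - v.min()
    norm = (v - v.min()) / span if span > 0 else np.zeros_like(v)
    return cmap(norm)  # low intensity -> blue, high -> red (assumed colormap)

# Illustrative joint intensities for one training goal (not measured data).
joints = ["hip", "knee_L", "knee_R", "shoulder_L", "shoulder_R", "wrist_L", "wrist_R"]
vals = [0.4, 1.2, 1.1, 2.3, 2.5, 3.0, 2.8]
plt.bar(joints, vals, color=joint_colors(vals))
plt.ylabel("joint intensity (arbitrary units)")
plt.show()
```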