The act of embodying or expressing an algorithmic process through movement and choreography, particularly through the physical body in space, led to the initial inquiries for Dancing the Algorithm (DaTA). An ongoing, process-led investigative project, DaTA concluded its prototype 1.0 phase with a rudimentary proof-of-concept piece: a duet between computational fragments and an individually coded movement DNA, based on dance artist Dapheny Chen and captured through motion capture technology.
Fundamentally fueled by the urge to identify, catalogue and archive movement data, these explorations unravel the latent potential of the artistic endeavour to understand living and dancing bodies past, present and future. DaTA was recently incubated at the National Arts Council Singapore and Esplanade Theatres on the Bay Performing Arts X Tech Lab 2023-2024.
Technical Specifications
Dancing the Algorithm (DaTA) operates as a real-time simulation that merges pose and motion-based machine learning with interactive choreography. Built using PyTorch, Sony’s Mocopi, and Unreal Engine, the system captures and processes motion data to create an immersive duet between the user and Dapheny Chen’s choreographic movement DNA.
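As a rough illustration of the data moving through this pipeline, a single frame of capture might be represented as below. This is a minimal Python sketch: the joint count, field names, and the PoseFrame type itself are assumptions made for illustration, not the Mocopi wire format or the project's actual data model.

```python
from dataclasses import dataclass
import numpy as np

# Hypothetical full-body skeleton size; the real joint count depends on
# how the Mocopi stream is decoded.
NUM_JOINTS = 27

@dataclass
class PoseFrame:
    timestamp: float           # seconds since capture start
    rotations: np.ndarray      # (NUM_JOINTS, 4) quaternion per joint
    root_position: np.ndarray  # (3,) hip/root translation

    def as_feature_vector(self) -> np.ndarray:
        """Flatten one frame into a single vector for the model."""
        return np.concatenate([self.rotations.ravel(), self.root_position])
```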
Lightweight motion data is captured through Sony’s Mocopi, a portable motion capture system. This data is fed into a custom PyTorch pipeline that trains a pose estimation model. The model labels and categorizes movements against Dapheny’s unique choreographic vocabulary, captured during the project's initial motion capture phase.
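A minimal sketch of what such a model could look like in PyTorch follows. The GRU architecture, the sixty-frame window, and the vocabulary size of twelve labels are all assumptions for illustration; the project's actual model and training code are not published here.

```python
import torch
import torch.nn as nn

class MovementClassifier(nn.Module):
    """Classifies a window of pose frames into one vocabulary label."""

    def __init__(self, feature_dim: int = 27 * 4 + 3,
                 hidden_dim: int = 128, num_labels: int = 12):
        super().__init__()
        self.encoder = nn.GRU(feature_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_labels)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, feature_dim)
        _, last_hidden = self.encoder(frames)      # (1, batch, hidden_dim)
        return self.head(last_hidden.squeeze(0))   # (batch, num_labels) logits

# One training step, on placeholder data standing in for windows of
# captured frames and their hand-assigned vocabulary labels.
model = MovementClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

batch = torch.randn(8, 60, 27 * 4 + 3)    # 8 windows of 60 frames
labels = torch.randint(0, 12, (8,))       # placeholder labels

optimizer.zero_grad()
loss = loss_fn(model(batch), labels)
loss.backward()
optimizer.step()
```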
The simulation runs on Unreal Engine, using the trained algorithm to generate responsive movements in real time. It predicts and mirrors how Dapheny would move based on the user’s live input, creating an interactive performance that blurs the line between physical and digital choreography. This approach turns movement into a dynamic dialogue between computational processes and human motion.
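The live loop could be sketched as follows, reusing the PoseFrame and MovementClassifier from the sketches above. Here read_mocopi_frame(), the UDP port, and the JSON message format are hypothetical stand-ins for whatever transport actually links Mocopi, the model, and Unreal Engine in the installation.

```python
import json
import socket
import numpy as np
import torch

# Hypothetical address of an Unreal Engine listener on the same machine.
UNREAL_ADDR = ("127.0.0.1", 7000)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
model = MovementClassifier()
model.eval()

window = []  # sliding window of the most recent feature vectors

while True:
    frame = read_mocopi_frame()  # hypothetical live-capture helper
    window.append(frame.as_feature_vector())
    window = window[-60:]        # keep a fixed-length window of frames
    if len(window) < 60:
        continue

    with torch.no_grad():
        feats = torch.from_numpy(np.stack(window)).unsqueeze(0).float()
        label = int(model(feats).argmax(dim=-1))

    # Tell Unreal which of Dapheny's recorded phrases to blend in response.
    sock.sendto(json.dumps({"phrase_id": label}).encode(), UNREAL_ADDR)
```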