04/30/2026 | Press release
It's been said that "writing about music is like dancing about architecture." Writing, or talking, about dancing can be similarly futile.
A Cornell doctoral student has helped develop a tool that lets dancers use video and extended reality (XR) headsets to create an immersive environment for analyzing and refining their movements.
In other words, dancers can actually dance about dancing.
"DanXeReflect" transforms two-dimensional video into a 3D virtual studio, where movements appear as interactive avatars. Users can reenact poses to search a catalog of sequences, perform alternative revisions alongside originals, and attach annotations directly to avatars' body parts.
"We found that when dancers are talking to each other, they're kind of demonstrating with their bodies to specify the movement," said Hyunju Kim, a doctoral student in information science and part of the Siegel PiTech Ph.D. Impact Fellowship program at Cornell Tech. "So I wanted to use that idea in an augmented reality and virtual reality interface."
Kim presented "DanXeReflect: Interacting with the Spatio-Temporal Past Movements for Embodied, Reflective Choreographic Collaboration," at the Association for Computing Machinery's Conference on Human Factors in Computing Systems (CHI '26), April 13-17 in Barcelona. The paper earned honorable mention at the conference.
François Guimbretière, professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science, is a co-author. The corresponding author, Bokyung Lee, is an assistant professor of interaction design and human-computer interaction at Yonsei University in Seoul, South Korea.