LuminAI is an interactive art installation in which participants improvise movement together with an AI dance partner projected onto a screen. The virtual agent segments users' motion into gestures, learns them, and reasons about them using both bottom-up learned knowledge (unsupervised learning algorithms that cluster similar gestures together) and top-down domain knowledge (encodings of the Laban Movement Analysis framework). The agent uses this knowledge to choose a relevant response to display. Through this expressive, movement-based interactive experience, the LuminAI research project explores questions in computational creativity, cognitive science, and dance.
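The bottom-up side of this pipeline (segmenting motion into gestures, then grouping similar gestures with an unsupervised algorithm) can be sketched roughly as below. This is a minimal illustration, not LuminAI's actual implementation: the fixed-length resampled feature vector and the choice of k-means are assumptions made for the example.

```python
import numpy as np
from sklearn.cluster import KMeans

def gesture_features(joint_positions, n_frames=8):
    """Turn a variable-length motion segment of shape (frames, joints, 3)
    into a fixed-length feature vector by resampling to n_frames frames.
    (Hypothetical representation; LuminAI's real features are not given here.)"""
    frames = joint_positions.shape[0]
    idx = np.linspace(0, frames - 1, n_frames).astype(int)
    return joint_positions[idx].ravel()

# Two synthetic "gesture" families standing in for segmented user motion:
# small movements near the origin vs. larger movements offset from it.
rng = np.random.default_rng(0)
family_a = [rng.normal(0.0, 0.1, size=(20, 5, 3)) for _ in range(5)]
family_b = [rng.normal(2.0, 0.1, size=(30, 5, 3)) for _ in range(5)]

# Cluster similar gestures together (unsupervised, bottom-up knowledge).
X = np.stack([gesture_features(g) for g in family_a + family_b])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```

In a system like this, a learned cluster label could then be combined with top-down descriptors (such as Laban Movement Analysis qualities) when selecting which stored gesture to display in response.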

Project Name
LuminAI
Faculty Lead(s)
Jason Freeman, Duri Long
Student Name(s)
Cassandra Naoimi, Sathvika Dannapaneni, Swar Gujrania, Lucas Liu
Main Contact
Brian Magerko
Lab Name
Expressive Machinery Lab (formerly ADAM Lab)
Video Title
Video URL