LuminAI is an interactive art installation in which participants improvise movement together with an AI dance partner projected onto a screen. The virtual agent segments users' motion into gestures, learns those gestures, and then reasons about them using both bottom-up learned knowledge (unsupervised learning algorithms that cluster similar gestures together) and top-down domain knowledge (encodings of the Laban Movement Analysis framework). The agent uses this knowledge to choose a relevant response to display. Through this expressive, movement-based interactive experience, the LuminAI research project explores questions in computational creativity, cognitive science, and dance.
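As a rough illustration of the bottom-up side of this pipeline, the sketch below clusters gesture feature vectors and picks a response from the cluster nearest an incoming gesture. It is a toy k-means written for clarity, not the actual LuminAI implementation; the feature representation, the deterministic initialization, and the function names (`cluster_gestures`, `choose_response`) are all assumptions for illustration.

```python
import numpy as np

def cluster_gestures(features, k, iters=20):
    """Toy k-means: group gesture feature vectors (rows) into k clusters.
    Initialization simply picks k evenly spaced stored gestures; a real
    system would use a more robust scheme (e.g. k-means++)."""
    centers = features[np.linspace(0, len(features) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # assign each gesture to its nearest cluster center
        labels = np.argmin(
            np.linalg.norm(features[:, None] - centers[None], axis=2), axis=1)
        # move each center to the mean of its assigned gestures
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return labels, centers

def choose_response(query, features, labels, centers):
    """Return the index of a stored gesture drawn from the same
    cluster as the incoming query gesture."""
    cluster = np.argmin(np.linalg.norm(centers - query, axis=1))
    members = np.flatnonzero(labels == cluster)
    # respond with the cluster member closest to the query
    return members[np.argmin(np.linalg.norm(features[members] - query, axis=1))]
```

In a full system, each row of `features` would be a descriptor extracted from a segmented gesture (for instance, joint trajectories summarized into a fixed-length vector), and the chosen cluster member would be played back by the projected agent.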
The Expressive Machinery Lab (formerly ADAM Lab) explores the intersection of cognition, creativity, and computation through the study of creative human endeavors and the construction of digital media artifacts that represent our findings. Applications of our findings range from AI-based digital performance to interactive narrative experiences to educational media design and development.