Project DANCR and DANCR-Tool accomplished
This post is a work in progress!
What you can do with the DANCR-Tool and how it was researched and developed:
The DANCR tool was developed in an intensive transdisciplinary collaboration between dancers, engineers, computer scientists, and theorists. It enables dancers to improvise dances with a humanoid robot and to conduct dance research. DANCR differs from conventional AI dance systems in that it relies neither on big data nor on prefabricated movement databases, nor is it geared towards dance with fixed steps, positions, or movements. DANCR therefore focuses on contemporary dance and is primarily geared towards improvisation.
With DANCR, AI models can now be generated that are trained exclusively on data from one individual dancer in each phase. The DANCR project has thus achieved a radical individualization of AI technology for dancers: the digital system does not work with as much data as possible but with the smallest possible amounts of data. The DANCR tool developed in this way makes it possible to refine one’s own artistic style and offers the opportunity to break out of routines and to develop elements of choreography through improvisation. The artificial intelligence is “embodied” in one (or more, physical or virtual) robot(s) and thus offers a spatial-material counterpart, a totally new kind of dance vis-à-vis.
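As a rough illustration of this small-data approach, a single dancer’s session could be used to fit a small next-pose model. The sketch below is a hypothetical minimal example in Python/PyTorch, not the project’s actual code; the file name, array shape, model size, and training schedule are all assumptions:

```python
# Minimal sketch (not the project's code): training a small next-pose
# model on recordings from one individual dancer only.
# File name, shapes, and hyperparameters are hypothetical.
import numpy as np
import torch
import torch.nn as nn

# One dancer's session: (frames, joints) array of joint angles in radians.
poses = torch.tensor(np.load("dancer_A_session.npy"), dtype=torch.float32)

n_joints = poses.shape[1]
model = nn.GRU(input_size=n_joints, hidden_size=64, batch_first=True)
head = nn.Linear(64, n_joints)
opt = torch.optim.Adam(list(model.parameters()) + list(head.parameters()),
                       lr=1e-3)

seq = poses.unsqueeze(0)                 # (1, frames, joints)
inputs, targets = seq[:, :-1], seq[:, 1:]  # predict the next frame

for epoch in range(200):                 # small data -> small model
    opt.zero_grad()
    hidden, _ = model(inputs)
    loss = nn.functional.mse_loss(head(hidden), targets)
    loss.backward()
    opt.step()
```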
Two different algorithmic frameworks can now be trained into any number of different AI models:
The “improAI” works by means of dynamic degrees of similarity and dissimilarity in relation to the previous movement. The framework offers a complex set of parameters for generating AI models, and the models can also be controlled in real time during runtime according to artistic premises. The aim of this AI is to allow dancers radical individuality in development and research.
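The similarity/dissimilarity principle can be illustrated with a hypothetical sketch (this is one reading of the idea, not OFAI’s implementation): candidate next poses are scored against the previous movement, and a realtime-adjustable target degree of similarity decides how close or how contrasting the chosen continuation should be.

```python
# Hypothetical illustration of similarity-driven movement generation.
# "candidates" would come from a trained model; here they are just
# an array of possible next poses.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def next_pose(prev_pose: np.ndarray, candidates: np.ndarray,
              target_similarity: float) -> np.ndarray:
    """Pick the candidate whose similarity to prev_pose is closest to
    target_similarity (1.0 ~ repeat, -1.0 ~ maximal contrast)."""
    scores = [abs(cosine_similarity(prev_pose, c) - target_similarity)
              for c in candidates]
    return candidates[int(np.argmin(scores))]

# target_similarity would correspond to a control in the improAI user
# interface, changed live during a performance.
```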
The “transferAI” works simultaneously with data sets from two dancers. It transforms the qualities of one person’s movement data and transfers them to the movement data of the other person. The aim of this AI is to offer dance researchers the opportunity to venture into movement patterns that are as yet unknown, but neither arbitrary nor random, and to explore these in a targeted manner.
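As a toy illustration of what “transferring movement qualities” can mean, the sketch below imposes one single quality, dancer A’s speed profile, onto the spatial path of dancer B. This is a deliberately simple stand-in for the real transferAI; the function name, shapes, and the choice of quality are assumptions:

```python
# Hypothetical quality transfer: rescale dancer B's frame-to-frame steps
# so that B's path is traversed with dancer A's speed profile.
import numpy as np

def transfer_speed_profile(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """a, b: (frames, joints) joint-angle sequences of equal length."""
    deltas_b = np.diff(b, axis=0)
    speed_a = np.linalg.norm(np.diff(a, axis=0), axis=1, keepdims=True)
    speed_b = np.linalg.norm(deltas_b, axis=1, keepdims=True) + 1e-9
    out = np.empty_like(b)
    out[0] = b[0]
    # Each step of B now moves as fast as A did at that frame.
    out[1:] = b[0] + np.cumsum(deltas_b * (speed_a / speed_b), axis=0)
    return out
```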
DANCR-Tool properties:
artistic processes enabled:
- a humanoid robot (model Pepper) improvises with a dancer
- individual movement generation by individual artists’ improvisation
- produce strictly individual artists’ data; no use of big data necessary
- train radically individual AI-models
- embrace and transform the bias
- model training by means of various artistic AI-training parameters
- model training by combination and selection of various data sources
- control improAI’s real-time behavior during model execution by means of the improAI user interface
- hybridize individual movements by using the transferAI to transfer qualities of certain individual movement styles onto other movements
technical processes enabled:
- data-recording
- data-selection
- AI-model training by various parameters
- AI-model execution on several virtual and physical robots, with various parameters controlled in real time during performances
- data user-interface
- improAI user-interface
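The sketch below shows, with invented stub functions, how these technical stages could chain together; none of the names are the tool’s actual API, and each stub stands in for a much richer component:

```python
# Hypothetical glue sketch of the pipeline implied by the list above.
import numpy as np

def record_session() -> np.ndarray:                # data-recording
    return np.zeros((100, 17))                     # placeholder: frames x joints

def select_takes(raw: np.ndarray) -> np.ndarray:   # data-selection (data UI)
    return raw[10:90]                              # e.g. trim warm-up frames

def train_model(data: np.ndarray):                 # AI-model training
    return lambda pose: pose                       # placeholder identity model

def execute(model, robots):                        # realtime execution
    for robot in robots:
        print(f"streaming model output to {robot} robot")

model = train_model(select_takes(record_session()))
execute(model, robots=["virtual", "physical"])
```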
based on an interplay of complex systems:
- dancing person
- sensor system (4 Kinect sensors, 1 wristband, 1 atmospheric sensor, 1 stream of robot states)
- Brekel fuses the 4 Kinect streams into 1 (Brekel is the only software that was bought)
- the angle-space system takes the sensor data stream and fits it in real time into the motion space of the robot (see the sketch after this list)
- computer network of the 5 computers executing the DANCR-Tool
- AIs: both the improAI and transferAI algorithms can be trained to produce a nearly limitless number of AI models
- the robot is either physical or virtual, or both at the same time; it stands vis-à-vis the dancer, is controlled by one of the AI models, and improvises with the dancer
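Here is a minimal sketch of the angle-space idea referenced above: live human joint angles are clamped into the robot’s reachable ranges, so that every incoming frame maps to a valid Pepper pose. The joint limits are approximate values from Pepper’s public documentation, and the function is an illustration, not the project’s implementation:

```python
# Hypothetical angle-space fitting: clamp human joint angles into
# Pepper's motion space. Limits are approximate, in radians.
import numpy as np

PEPPER_LIMITS = {
    "HeadYaw":        (-2.09, 2.09),
    "HeadPitch":      (-0.71, 0.64),
    "LShoulderPitch": (-2.09, 2.09),
    "LShoulderRoll":  ( 0.01, 1.56),
    "LElbowYaw":      (-2.09, 2.09),
    "LElbowRoll":     (-1.56, -0.01),
    "HipRoll":        (-0.51, 0.51),
    "HipPitch":       (-1.04, 1.04),
    "KneePitch":      (-0.51, 0.51),
}

def fit_to_robot(human_angles: dict) -> dict:
    """Map a dict of human joint angles into Pepper's motion space."""
    return {name: float(np.clip(human_angles.get(name, 0.0), lo, hi))
            for name, (lo, hi) in PEPPER_LIMITS.items()}
```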
Credits:
Experts in contemporary dance, two senior (incl. one choreographer) and two junior: anonymous due to data-protection regulations.
Contemporary dance, dance theory, movement analysis: Eva-Maria Kraft.
Philosophy of Technology, art-based research: Christoph Hubatschke.
Project initialization and coordination, Spatial Theory, art-based research: Oliver Schürer.
Human-Robot-Interaction development: Darja Stoeva, Clara Haider, Helena Frijns.
“improAI” development: Johann Petrak, Brigitte Krenn, OFAI (Austrian Research Institute for Artificial Intelligence).
“transferAI” development: Sridhar Bulusu.
In cooperation with MUK (Music and Arts Private University of the City of Vienna).
Humanoid Robot: Model Pepper.

Funded by
