AI that lets robots improvise contemporary dance (two AIs developed; research & development on DANCR-AI completed)

Briefly, the DANCR-AI is …

… a new instrument for 1) dance and improvisation in contemporary dance, as well as 2) setting up scientific experimentation on the properties, relations, and embodiments of the sociocultural interplay between humans, robots, and AI.

ad 1) The tool is intended for improvising contemporary dance and developing choreographies with artificial intelligence. DANCR-AI enables a new and original way of artistic-performative work with humans, abstractions of humans, and/or humanoid robots (physical and virtual).

ad 2) The system abstracts the manifold relations among the entities that make up AI systems. It provides a research environment on the entanglements of technologies, humans, and techniques, for researching questions of spatiality and embodiment.

(This post is a work in progress!)

Properties of the DANCR tool and its AIs:

artistic processes built: 

  • a humanoid robot of the model Pepper improvises with a dancer
  • individual movement generation through an individual artist’s improvisation
  • production of strictly individual artists’ data; no use of big data necessary
  • training of radically individual AI models
  • embracing and transforming the bias
  • model training by means of various artistic AI-training parameters
  • model training by combining and selecting various data sources
  • control of the improAI’s real-time behavior during model execution by means of the improAI user interface
  • hybridization of individual movements, by transferring qualities of certain individual movement styles to other movements by means of the transferAI

technical processes built:

  • data recording
  • data selection
  • AI-model training with various parameters (a minimal sketch of this pipeline follows after this list)
  • AI-model execution on several virtual and physical robots, with various parameters controlled in real time during performances
  • data user interface
  • improAI user interface
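As a rough illustration of how these steps could hang together, here is a minimal sketch in Python. Every class, field, and default value is a hypothetical stand-in for the kinds of parameters the DANCR user interfaces expose, not the project’s actual code.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class RecordingSession:                  # one data-recording step
    dancer_id: str                       # anonymized; one dancer per session
    frames: list = field(default_factory=list)   # recorded joint-angle frames

@dataclass
class TrainingConfig:                    # data selection plus training parameters
    sessions: list                       # the selected and combined data sources
    k_neighbours: int = 8                # breadth of the KNN search
    similarity_bias: float = 0.5         # favour similar vs. dissimilar movements
    model_name: str = "improAI-model-01"

def train_model(cfg: TrainingConfig):
    """Collect the selected frames into one searchable pose library."""
    frames = [f for s in cfg.sessions for f in s.frames]
    return {"name": cfg.model_name, "library": np.asarray(frames),
            "k": cfg.k_neighbours, "bias": cfg.similarity_bias}
```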

complex systems in an interplay: 

  • dancing person
  • sensor system (4 Kinect sensors, 1 wristband, 1 atmospheric sensor, 1 stream of robot states)
  • brekl fuses the 4 Kinect streams into 1 (brekl is the only piece of software that was bought)
  • the angle-space system takes the sensor data stream and fits it in real time into the motion space of the robot (a minimal sketch of this mapping follows after this list)
  • computer network of the 5 computers executing the DANCR tool
  • AIs: both the improAI and transferAI algorithms can be trained to produce a nearly limitless number of AI models
  • the robot is either physical or virtual, or both at the same time; it stands vis-à-vis the dancer, is controlled by one of the AI models, and improvises with the dancer
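The angle-space idea can be illustrated with a small sketch: each frame of fused human joint angles is clamped into the robot’s feasible joint ranges before it is sent on. The joint names and limit values below are illustrative placeholders, not Pepper’s actual specification.

```python
import numpy as np

# Hypothetical per-joint (min, max) limits of the robot's motion space, in radians.
JOINT_LIMITS = {
    "HeadYaw": (-2.0, 2.0),
    "LShoulderPitch": (-2.0, 2.0),
    "LElbowRoll": (-1.5, -0.01),
    "RShoulderPitch": (-2.0, 2.0),
    "RElbowRoll": (0.01, 1.5),
}

def fit_to_motion_space(human_angles):
    """Clamp one frame of human joint angles into the robot's feasible ranges."""
    return {joint: float(np.clip(angle, *JOINT_LIMITS[joint]))
            for joint, angle in human_angles.items()}

# One frame from the fused sensor stream (values made up for illustration).
frame = {"HeadYaw": 2.4, "LShoulderPitch": 0.3, "LElbowRoll": -0.8,
         "RShoulderPitch": -2.6, "RElbowRoll": 0.4}
print(fit_to_motion_space(frame))       # out-of-range values land on the limits
```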

DANCR-Tool research and development:

The DANCR tool was developed in an intensive transdisciplinary collaboration between dancers, engineers, computer scientists, and theorists. It enables dancers to improvise dances with a humanoid robot and to conduct dance research. DANCR differs from conventional AI dance systems in that it does not rely on big data and prefabricated movement databases, nor is it geared towards dance with fixed steps, positions, or movements. DANCR therefore focuses on contemporary dance and is primarily geared towards improvisation.

With DANCR, AI models can now be generated that are trained exclusively with data from one individual dancer in each phase. The DANCR project has thus achieved a radical individualization of AI technology for dancers. The digital system does not work with as much data as possible, but with the smallest possible amounts of data. The DANCR tool developed in this way makes it possible to refine one’s own artistic style, and it offers the opportunity to break out of routines and develop elements of choreography through improvisation. Artificial intelligence is “embodied” in one (or more, physical or virtual) robot(s) and thus offers a spatial-material counterpart: a totally new kind of dance vis-à-vis.

Two different algorithm frameworks can now be trained into any number of different AI models:

In DANCR, two critical and artistic AI systems (improAI, transferAI) have been developed and made accessible for arts-based research with the DANCR tool (a variable assemblage of AI systems, a sensor system, a hybrid space, and virtual and physical robots).

The “improAI” is based on a KNN algorithm framework (k-nearest neighbour) and makes decisions on subsequent movements in real time, by means of dynamic degrees of similarity and dissimilarity in relation to the previous movements. The algorithm framework makes it possible to generate models with a complex set of artistic parameters using the motion data of strictly one individual dancer: no use of big data, but instead the minimum amount of data necessary to make the improAI work for that particular dancer. In this way, a potentially infinite number of different AI models can be trained. Each of those generated AI models can then be controlled in real time according to artistic premises, whether following specific dance-research interests or performing on stage. The objective of this AI is to offer dancers radical individuality in the development phase.
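To make the mechanism concrete, here is a minimal sketch of such a KNN-style next-movement chooser in Python, assuming poses are fixed-length joint-angle vectors recorded from a single dancer. Every name and parameter here (pose_library, similarity_bias, the 17-dimensional pose) is a hypothetical stand-in, not improAI’s actual code or artistic parameters.

```python
import numpy as np

def choose_next_pose(current_pose, pose_library, k=8, similarity_bias=0.5, rng=None):
    """Pick the next pose from one dancer's recorded material.

    similarity_bias > 0 favours poses close to the current one;
    similarity_bias < 0 favours dissimilar poses; 0 picks uniformly among the k.
    """
    rng = rng or np.random.default_rng()
    dists = np.linalg.norm(pose_library - current_pose, axis=1)
    candidates = np.argsort(dists)[:k]          # the k nearest recorded poses
    weights = np.exp(-similarity_bias * dists[candidates])
    weights /= weights.sum()                    # distances -> selection weights
    return pose_library[rng.choice(candidates, p=weights)]

# Usage: a small library of 500 poses, 17 joint angles each (assumed dimensionality).
library = np.random.rand(500, 17)
pose = library[0]
for _ in range(4):                              # improvise a short phrase
    pose = choose_next_pose(pose, library)
```

In a live setting, a parameter like similarity_bias would be the kind of knob the improAI user interface exposes for real-time control during a performance.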

The “transferAI” is based on an NST algorithm (neural style transfer) and works with the data sets of two dancers at the same time. It transforms the qualities of one person’s movement data and transfers them to the movement data of the other. The objective of this AI is to give dance researchers the opportunity to advance to and strategically explore movements that are unknown, yet not accidentally random, within the contemporary dance discourse.
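The principle can be sketched as follows, assuming motion clips are (joints × time) arrays: a motion tensor is optimized so that its features match the “content” of dancer A while its feature statistics (Gram matrices) match the “style” of dancer B, following the classic neural-style-transfer recipe. The tiny, randomly initialized 1D-convolutional encoder and all hyperparameters below are illustrative stand-ins, not the transferAI internals.

```python
import torch
import torch.nn as nn

def gram(features):                       # style statistics of a feature map
    b, c, t = features.shape
    return features @ features.transpose(1, 2) / (c * t)

encoder = nn.Sequential(                  # stand-in motion feature extractor
    nn.Conv1d(17, 64, kernel_size=5, padding=2), nn.ReLU(),
    nn.Conv1d(64, 64, kernel_size=5, padding=2), nn.ReLU(),
)

content = torch.rand(1, 17, 240)          # dancer A: 240 frames, 17 joint angles
style = torch.rand(1, 17, 240)            # dancer B: source of the movement quality
result = content.clone().requires_grad_(True)
opt = torch.optim.Adam([result], lr=0.01)

with torch.no_grad():                     # fixed optimization targets
    content_feat = encoder(content)
    style_gram = gram(encoder(style))

for _ in range(300):                      # optimize the hybrid motion directly
    opt.zero_grad()
    feat = encoder(result)
    loss = ((feat - content_feat) ** 2).mean() \
         + 10.0 * ((gram(feat) - style_gram) ** 2).mean()
    loss.backward()
    opt.step()
# result now carries dancer A's movements, inflected by dancer B's style statistics.
```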

Credits: 

Experts in contemporary dance: two seniors (incl. one choreographer) and two juniors; anonymous due to data-protection regulations.
Contemporary dance, dance theory, movement analysis: Eva-Maria Kraft.
Philosophy of Technology, art-based research: Christoph Hubatschke.
Project initialization and coordination, Spatial Theory, art-based research: Oliver Schürer.

Human-Robot-Interaction development: Darja Stoeva, Clara Haider, Helena Frijns.
“improAI” development: Johann Petrak, Brigitte Krenn, OFAI (Austrian Research Institute for Artificial Intelligence).
“transferAI” development: Sridhar Bulusu.

In cooperation with MUK (Music and Arts Private University of the City of Vienna).

Humanoid Robot: Model Pepper.

Funded by