Dance performance that generates bodily expression through a mathematical, collective-intelligence-based methodology. With AI and machine learning offering new insights into the body schema and movement, the choreography is informed by numeric data and the analytical results derived from it. For instance, a neighborhood search system matches pose data extracted from stage footage and films against the dancers' pose data, projecting imagery of the poses closest to theirs into a rectangular on-stage frame. Footage of the audience, shot in the lobby on the day of the show, is analyzed together with the dancers' motion data to project audience members on-screen as dancers themselves, while on-stage implements from micro-drones to the frame respond to the dancers' movements according to various rules and algorithms, generating new forms of bodily expression. Presented since 2018 in cities around the world including Montréal (Canada), San Francisco (U.S.A.), Tokyo (Japan), and Barcelona (Spain).
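The pose-matching step described above can be understood as a nearest-neighbor lookup over pose vectors. The sketch below is a minimal illustration under assumed conditions (poses flattened into small keypoint vectors, plain Euclidean distance, a tiny in-memory library); the actual system presumably operates on full pose-estimation keypoints at far larger scale:

```python
import numpy as np

def nearest_pose(query: np.ndarray, library: np.ndarray) -> int:
    """Return the index of the stored pose closest to the query pose.

    Each pose is a flattened vector of (x, y) keypoint coordinates.
    Closeness is measured by Euclidean distance -- a hypothetical
    simplification of the neighborhood search described in the text.
    """
    dists = np.linalg.norm(library - query, axis=1)
    return int(np.argmin(dists))

# Illustrative data: three "poses" of two keypoints each (4 values).
library = np.array([
    [0.0, 0.0, 1.0, 1.0],
    [0.5, 0.5, 1.5, 1.5],
    [2.0, 2.0, 3.0, 3.0],
])
query = np.array([0.6, 0.4, 1.4, 1.6])
print(nearest_pose(query, library))  # → 1 (the middle pose is closest)
```

In a real pipeline, the query pose would come from a pose-estimation model running on live camera input, and the matched library index would select the frame of footage to project.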
© Rhizomatiks / ELEVENPLAY
A new form of expression may have just appeared in the world of dance, with its centuries-long history, thanks to new technology involving deep neural networks (DNN), multi-layered systems of artificial neurons. Combined with various algorithms, DNNs have enabled image recognition and generation with an unprecedented degree of precision. This work taps the full potential of DNNs to innovate forms of expression beyond human capability. ELEVENPLAY is a female dance unit, yet their stage partners are not dancers in the flesh. They are, rather, "virtual dancers" developed by studying the movements of ordinary people with DNNs, and undoubtedly the most advanced such partners existing technology could produce. Where William Forsythe eschewed emotion and Pina Bausch took the opposite tack, this work can be appraised as offering glimpses of a new, third category of dance expression distinct from both. (IKEGAMI Takashi)