
Mutual Arising

Included as part of the proceedings at the 2021 NYCEMF


Mutual Arising uses motion tracking, graphic score notation, and interpretive, reactive improvisation to experiment with remote, interdisciplinary performance practices. It combines stylistic elements found in fixed media, generative and interactive music systems, and gestural augmentation in order to leverage geographic distance as a composition parameter. Instead of relying on telematics or networked performance, the piece was realized as a sequence of recordings in which each improvisation was derived not from simultaneous group interplay but from the previous performer's response to the instructions provided in the score. Thus, the composition as a whole will not be heard until all of the parts have been played separately. The pianist (Karl Larson) reacts to the source sound provided by the electronics operator; the drummer reacts to the pianist's response to that sound file while wearing a motion tracking sensor (Mari Kimura's MUGIC®), which acts as a control source for the processing of both improvisations.

Max/MSP was used to build the system that analyzes the audio and maps the gestural data from the sensor to specified processing parameters. The motion sensor is also used to gesturally trigger sounds on and off (such as the spoken-word files) and to expand the timbral possibilities of a real-time drum set performance. This system, and the resulting composition, represent progress toward constructing a flexible software program for drummers that enables them to sonically expand their own unique gestural approach to the instrument.
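The patch itself lives in Max/MSP, but the core mapping idea can be sketched outside of it: the MUGIC sensor streams motion data wirelessly as OSC messages, and each incoming reading is scaled into the range of a processing parameter or compared against a threshold to toggle a sound on or off. The Python sketch below (using python-osc) is a minimal, hypothetical illustration of that receive-scale-trigger chain; the OSC address, port, value ranges, and the delay-feedback parameter are assumptions for the sake of the example, not details of the actual patch.

```python
# Illustrative sketch only: receive motion data over OSC and map one axis of
# acceleration to a 0..1 control value (e.g. a delay feedback amount) plus a
# simple gesture threshold that could toggle a sound file. The OSC address,
# port, and ranges are hypothetical, not taken from the Max/MSP patch.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer


def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map a clamped sensor reading into a processing-parameter range."""
    value = max(in_lo, min(in_hi, value))
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)


def handle_motion(address, *args):
    # Hypothetical message layout: first argument is acceleration on one axis.
    accel_x = float(args[0])
    feedback = scale(accel_x, -2.0, 2.0, 0.0, 1.0)  # drive a delay feedback parameter
    trigger = accel_x > 1.5                          # gesture threshold to toggle a sound file
    print(f"feedback={feedback:.2f} trigger={trigger}")


dispatcher = Dispatcher()
dispatcher.map("/mugic/motion", handle_motion)       # address is an assumption

server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
server.serve_forever()
```

In the piece itself this scaling and thresholding happens inside the Max/MSP patch, where the same values feed the audio processing applied to both improvisations.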
