Improvisations

As part of the performance preparation for Concentric Motion and the Gestural Études, I improvised both to observe and better understand the gestures I commonly make during performance, and to explore and refine Max/MSP patches designed for gesture detection and for controlling digital audio effects processing, looping and tempo.
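
In practice, the detection stage reduces to receiving streamed joint data and conditioning it before it is mapped to controls. The sketch below illustrates that stage in Python rather than as a Max/MSP patch, assuming joint positions are forwarded from Max/MSP over OSC; the OSC address and port are hypothetical.

```python
# A minimal sketch, not the actual Max/MSP patches: it assumes joint
# positions are forwarded from Max/MSP over OSC and smooths them before
# they are mapped to effect, looping, or tempo controls.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

SMOOTHING = 0.2  # exponential smoothing factor, 0..1
smoothed = {"x": 0.0, "y": 0.0, "z": 0.0}

def on_joint(address, x, y, z):
    # Exponentially smooth incoming coordinates to suppress sensor jitter.
    for key, value in zip(("x", "y", "z"), (x, y, z)):
        smoothed[key] += SMOOTHING * (value - smoothed[key])
    print(address, smoothed)

dispatcher = Dispatcher()
dispatcher.map("/joint/righthand", on_joint)  # hypothetical OSC address

# Port 7400 is a placeholder for wherever the Max/MSP patch sends OSC.
BlockingOSCUDPServer(("127.0.0.1", 7400), dispatcher).serve_forever()
```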

Part of a series of improvisations to study the main types of gestures I use in piano performance. The motion data was captured with the Kinect (left camera view) and a webcam (right camera view), and the joint data was also captured through Max/MSP and compared to the video data. I later practiced using the joint information to drive tempo changes in a MIDI composition in Ableton Live (shown by the shifting red line).
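
One plausible way to realise the tempo mapping is to rescale a joint coordinate into a MIDI CC value and send it to a virtual port that has been MIDI-mapped to Live's tempo control. The sketch below assumes that setup; the port name, CC number and mapping range are illustrative, not the project's actual configuration.

```python
# A hedged sketch of joint data driving Live's tempo: hand height is
# rescaled to a CC value and sent to a virtual MIDI port whose CC has
# been MIDI-mapped to the tempo control in Ableton Live.
import mido

PORT_NAME = "GestureControl"  # hypothetical virtual MIDI port
TEMPO_CC = 20                 # hypothetical CC number mapped to tempo

def hand_height_to_cc(y, y_min=0.0, y_max=1.0):
    """Rescale a normalised hand height into the 0-127 CC range."""
    y = min(max(y, y_min), y_max)
    return round(127 * (y - y_min) / (y_max - y_min))

with mido.open_output(PORT_NAME) as port:
    # Stand-in values for a stream of joint heights from the Kinect.
    for y in (0.2, 0.5, 0.9):
        port.send(mido.Message("control_change",
                               control=TEMPO_CC,
                               value=hand_height_to_cc(y)))
```

In Live, the CC would be bound to the tempo field via MIDI Map mode, so each incoming value nudges the playback tempo within the mapped range.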

Another example from the same series of improvisations. As above, the motion data was captured with the Kinect (left camera view) and a webcam (right camera view), and the joint data was captured through Max/MSP and compared to the video data.

Performer gestures control audio effects, a looper in Ableton Live, and the tempo of looper playback.
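
A hedged sketch of how such a mapping could be wired: a threshold crossing on one joint fires a MIDI note mapped to the looper's transport button, while a continuous gesture feature, here the distance between the hands, streams out as a CC for an effect parameter. The note and CC numbers and the port name are assumptions for illustration.

```python
# Not the project's actual mappings: an illustrative gesture-to-looper
# wiring where a rising threshold crossing triggers the looper and a
# continuous gesture feature drives an effect parameter.
import mido

LOOPER_NOTE = 60  # hypothetical note MIDI-mapped to the looper's transport
EFFECT_CC = 21    # hypothetical CC MIDI-mapped to an effect's dry/wet
THRESHOLD = 0.8   # normalised hand height that triggers the looper

previous_y = 0.0

def process_frame(port, y, spread):
    """Fire the looper on a rising threshold crossing and stream the
    distance between the hands as a continuous effect control."""
    global previous_y
    if previous_y < THRESHOLD <= y:  # rising edge only
        port.send(mido.Message("note_on", note=LOOPER_NOTE, velocity=127))
        port.send(mido.Message("note_off", note=LOOPER_NOTE))
    cc_value = max(0, min(127, round(127 * spread)))
    port.send(mido.Message("control_change",
                           control=EFFECT_CC, value=cc_value))
    previous_y = y

with mido.open_output("GestureControl") as port:  # hypothetical port name
    # Stand-in frames: (hand height, distance between hands), normalised.
    for y, spread in ((0.5, 0.3), (0.9, 0.6), (0.7, 0.4)):
        process_frame(port, y, spread)
```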

Improvised full-body motion control modifies the tempo of a MIDI composition and applies digital audio effects to it. The visualisation is Smokescreen, custom OpenGL software that represents joint position and amplitude.
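
Smokescreen itself is not reproduced here; the minimal pyglet stand-in below only illustrates the idea of drawing each joint as a point whose size follows the current amplitude. The joint layout and the amplitude envelope are simulated for the sketch.

```python
# A stand-in for the Smokescreen idea, not the software itself: each
# tracked joint is drawn as a point whose radius follows a simulated
# amplitude envelope. In the actual system, positions and amplitude
# would arrive from the Kinect and the audio chain.
import math
import pyglet
from pyglet import shapes

window = pyglet.window.Window(800, 600, caption="Joint visualisation")
batch = pyglet.graphics.Batch()

# Hypothetical resting joint layout (screen coordinates).
joints = [shapes.Circle(x, y, 5, color=(255, 80, 40), batch=batch)
          for x, y in ((400, 450), (330, 380), (470, 380),
                       (260, 300), (540, 300))]
time_elapsed = 0.0

def update(dt):
    """Scale each joint's radius by a simulated audio amplitude."""
    global time_elapsed
    time_elapsed += dt
    amplitude = 0.5 + 0.5 * math.sin(time_elapsed * 2.0)  # fake envelope
    for joint in joints:
        joint.radius = 5 + 20 * amplitude

@window.event
def on_draw():
    window.clear()
    batch.draw()

pyglet.clock.schedule_interval(update, 1 / 60)
pyglet.app.run()
```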

Augmented Piano Experiment demonstrates the relationship between expressive piano performance gestures and effects control in Ableton Live, accompanied by a smoke particle visualisation.