Let’s narrow down the goals for this project:

I want to create an intuitive interface for controlling – that is, performing – pre-composed pieces, very much in the manner of “tape music”. The idea is for the performer to make hand gestures while “practicing” with the piece; in the process, the system is supposed to “learn” the gestures, or rather the link between them and certain control parameters.

To do this, I harnessed Chris Kiefer’s Neural Network implementation for SuperCollider. My idea was to generate a training data set containing both the precomposed control values and the input from skeleton tracking. This data set is then fed to the neural network, which learns a mapping from the tracking input to the preexisting control data.
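As a rough illustration, here is a minimal sketch of the data collection in SuperCollider. The ~skeleton and ~controlsNow functions are hypothetical stand-ins for the tracker input and the current precomposed control values; the NeuralNet calls at the end are left as comments because the exact class and method names should be taken from the quark’s own documentation.

(
~inputs = List.new;    // skeleton-tracking snapshots (network inputs)
~targets = List.new;   // precomposed control values (network targets)

// While "practicing" with the piece, sample both streams at a fixed
// rate and store them as matching input/target pairs.
~collect = Routine {
    loop {
        ~inputs.add(~skeleton.value);      // hypothetical: current joint positions
        ~targets.add(~controlsNow.value);  // hypothetical: current control values
        0.1.wait;                          // sample at 10 Hz
    }
};
~collect.play;
)

// Afterwards the pairs are handed to the network, roughly:
// ~net = NeuralNet( ... );               // sizes per the quark's help files
// ~net.train(~inputs, ~targets, ... );   // method name is an assumption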

At performance time, the system switches from playing back the precomposed control data to mapping the performer’s physical gestures onto those controls.
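A similarly hedged sketch of that switch, under the same assumptions as above: ~mapGesture stands in for the trained network’s evaluation call (again, the real method name lives in the quark’s documentation), and ~applyControls for whatever actually sets the synth parameters.

(
~useGestures = false;  // flip to true to hand control over to the gestures

~perform = Routine {
    loop {
        var controls;
        controls = if (~useGestures) {
            ~mapGesture.value(~skeleton.value);  // gesture -> controls via the net
        } {
            ~controlsNow.value;                  // precomposed "tape" control data
        };
        ~applyControls.value(controls);          // hypothetical: set synth params
        0.05.wait;                               // 20 Hz control rate
    }
};
~perform.play;
)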

Details and short documentation of the result coming soon.