This category is devoted to documenting my final project for Mechatronic Art class at DXARTS.

I want to create a hybrid electro-acoustic (and mechanical) instrument, controlled without physical contact by hand position and movement. It should generate sounds at various levels of musical complexity – from single events to large-scale gestures – while maintaining real-time control over the material.

The concept is heavily influenced by the Isa Harp project, which I was briefly part of.

By creating this instrument I want to achieve the following goals:

  • create an integrated device for performance – controller, processing unit and sound generator in one
  • use a continuous scale for reading hand position/movement, allowing fine control
  • allow seamless movement from single-event control to large-scale gesture shaping
  • combine sounds generated by the computer (DA converter -> amplifier -> speaker) with the mechanical part of the instrument – e.g. solenoid-hit drums and motor-excited strings
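As a rough illustration of the second and fourth goals – none of this is decided yet, and every name, range and threshold below is a hypothetical placeholder – a single continuous hand-position reading could drive both a continuous synthesis parameter and a discrete solenoid trigger:

```python
def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map a raw sensor reading to a control range, clamped."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

def map_hand(distance_cm, trigger_threshold_cm=10.0):
    """Hypothetical mapping: hand distance -> (synth amplitude, solenoid trigger).

    A closer hand means a louder synthesized sound (continuous control),
    and crossing the threshold fires the solenoid (single-event control).
    """
    amplitude = scale(distance_cm, 5.0, 60.0, 1.0, 0.0)  # closer = louder
    trigger = distance_cm < trigger_threshold_cm          # very close hand fires solenoid
    return amplitude, trigger
```

The point of the sketch is that one sensor stream can serve both worlds at once: the continuous value shapes the electronic sound, while a threshold crossing on the same value triggers a mechanical event.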

There are still many parameters to specify and decisions to make:

  • technology for reading hand position (motion sensors, video tracking, magnetic field…)
  • the mechanical part of the instrument (range and types of robotic “instruments”)
  • types of synthesis used (or whether to use synthesis at all)
  • musical ideas behind large-scale gesture generation
  • use of transformations on mechanically generated sounds