The point of departure for our technology research is the idea of digital technology as embodied thought and, as such, an integrated part of us. In our choreographic work we use technologies that enable an experience of movement that would otherwise not be possible. At the core of this are MiniBee movement sensors, which let performers control and influence sound and light in real time. We write the software for this in SuperCollider, an open-source programming language designed for real-time audio synthesis, algorithmic composition, and live interaction.
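To give a flavour of this setup, here is a minimal sketch of how sensor data can drive sound in SuperCollider. It is not code from our pieces: the OSC address `/minibee/data` and the message layout (sensor ID followed by accelerometer values) are assumptions, and the actual mapping in performance is far richer than this single filter parameter.

```supercollider
// Minimal sketch: map one incoming sensor value to the cutoff
// of a filtered noise synth. OSC address and message layout are
// assumptions, not the format used in our pieces.
s.waitForBoot {
    var synth = { |cutoff = 500|
        // Lag smooths the control signal so parameter jumps don't click
        LPF.ar(PinkNoise.ar(0.2), cutoff.lag(0.1)).dup
    }.play;

    OSCdef(\minibee, { |msg|
        // msg[1] is assumed to be the sensor ID,
        // msg[2] an accelerometer axis normalised to 0..1
        var x = msg[2].asFloat;
        synth.set(\cutoff, x.linexp(0, 1, 200, 5000));
    }, '/minibee/data');
};
```

In rehearsal, the same pattern scales up: one `OSCdef` per sensor stream, each mapped to whatever sound or light parameters the piece calls for.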
As our work progresses we will post a series of tutorials on working with both the MiniBees and SuperCollider.
The code from our recent pieces can be found on GitLab. It is in the public domain and may be freely used and modified by anyone.