The movements were first created using physical responses to touch; we then framed these in a pattern of walking, adding a sense of urgency and chasing each other's impulses.
The sound is triggered by MiniBees, and is based on 1) granulated recordings, 2) automatic playback of detuned water bubble recordings, 3) comb-filtered recordings of voices and 4) highpass-filtered kick drums.
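As a rough illustration, two of these layers (the comb-filtered voice and the highpass-filtered kick) could be sketched in SuperCollider as below. The SynthDef names, parameter ranges and the `~voiceBuf` buffer are assumptions for the example, not the settings used in the piece:

```supercollider
(
// 3) comb-filtered voice recording (assumes a mono buffer loaded into ~voiceBuf)
SynthDef(\combVoice, { |out = 0, buf, delay = 0.01, decay = 2, amp = 0.2|
    var sig = PlayBuf.ar(1, buf, BufRateScale.kr(buf), loop: 1);
    sig = CombC.ar(sig, 0.05, delay, decay);  // comb filter, max delay 50 ms
    Out.ar(out, (sig * amp) ! 2);
}).add;

// 4) highpass-filtered kick drum
SynthDef(\hpfKick, { |out = 0, freq = 50, cutoff = 200, amp = 0.5|
    var env = EnvGen.kr(Env.perc(0.001, 0.3), doneAction: 2);
    var sig = SinOsc.ar(XLine.kr(freq * 4, freq, 0.05)) * env;  // pitch sweep
    sig = HPF.ar(sig, cutoff);  // removes the low fundamental
    Out.ar(out, (sig * amp) ! 2);
}).add;
)
```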
Requires MiniBeeUtils and movement sensors. We use MiniBees, but with a bit of rewriting MiniBeeUtils should work with any sensor data. See the full GitLab repo for more details on the implementation.
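For adapting the sketch to other hardware, any device that sends OSC can stand in for the MiniBees. The following is only a sketch of that idea; the `/sensor` address and the three-value accelerometer layout are assumptions, not the MiniBeeUtils message format:

```supercollider
(
~sensorData = [0, 0, 0];
// Receive generic sensor messages; msg is assumed to be [address, x, y, z]
OSCdef(\anySensor, { |msg|
    ~sensorData = msg[1..3];
}, '/sensor');
)
```

The rest of the sketch would then read movement values from `~sensorData` instead of the MiniBee input.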
In this sketch several things are going on at once, so you need to set up a few things before running the final ~doubling routine. Make sure the files numbers.scd, bd.scd, patBubbles.scd and unisonoBubbles.scd are in ~sketchdir, which you will need to create under the ~root directory if it doesn't exist.
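The setup step above could be sketched as follows; ~root is assumed to already hold the project's base path, and the actual folder name behind ~sketchdir is an assumption here:

```supercollider
(
~sketchdir = ~root +/+ "sketch";   // folder name is illustrative
File.mkdir(~sketchdir);            // create it if it doesn't exist
[ "numbers.scd", "bd.scd", "patBubbles.scd", "unisonoBubbles.scd" ].do { |f|
    (~sketchdir +/+ f).load;       // load each setup file before ~doubling
};
)
```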