I’m doing preparatory work for a future performance involving a vocalist, a percussionist, and real-time computer graphics. The goal is to design an environment that evolves by reacting to the sounds on stage. The performance is intended for schools, but it may also be presented to adults later, if they behave. In the meantime I’ll share sketches and bits of code along the way by updating this page.
The end result may turn out very different, but anyway, here is a first step:
In each video, new cells are created by the vocal sounds and modified (leg length, rotation) by the percussion sounds. The graphics are inspired by Jared Tarbell’s work, and you can get the Processing sketch here (does not include the audio analysis layer).
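As a rough illustration of that mapping, here is a minimal sketch in plain Java (Processing is Java-based). The class and method names, and the numeric ranges, are my assumptions for illustration, not the actual sketch: a vocal onset spawns a cell, and the percussion amplitude drives leg length and rotation.

```java
import java.util.ArrayList;
import java.util.List;

public class CellField {
    static class Cell {
        float x, y;
        float legLength = 10f;   // pixels; modified by percussion amplitude
        float rotation  = 0f;    // radians; accumulates with percussion hits
        Cell(float x, float y) { this.x = x; this.y = y; }
    }

    final List<Cell> cells = new ArrayList<>();

    // Hypothetical callback: the (not included) analysis layer would call
    // this when it detects a vocal onset.
    void onVocalOnset(float x, float y) {
        cells.add(new Cell(x, y));
    }

    // Hypothetical callback: called each frame with the percussion
    // amplitude normalized to [0, 1].
    void onPercussion(float amplitude) {
        for (Cell c : cells) {
            c.legLength = 10f + 40f * amplitude; // louder hits -> longer legs
            c.rotation += amplitude * 0.2f;      // louder hits -> faster spin
        }
    }
}
```

In a Processing sketch these two methods would be wired to the audio analysis callbacks and the cells drawn in `draw()`.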
Step 2:
New creatures are generated by randomly combining several body and leg shapes, drawn in one predefined color palette:
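A sketch of that generation step might look like the following. The shape inventories and palette colors here are placeholders of my own, not the ones in the actual sketch; the point is only the random combination from fixed sets.

```java
import java.util.Random;

public class CreatureFactory {
    // Illustrative shape inventories; the real sketch's shapes differ.
    static final String[] BODIES = {"round", "oval", "spiky"};
    static final String[] LEGS   = {"straight", "curved", "zigzag"};
    // One predefined palette (ARGB placeholders).
    static final int[] PALETTE = {0xFFE8D5A3, 0xFF5E3C2C, 0xFF9C2B1B, 0xFF2E4057};

    static class Creature {
        final String body, leg;
        final int color;
        Creature(String body, String leg, int color) {
            this.body = body;
            this.leg = leg;
            this.color = color;
        }
    }

    final Random rng;
    CreatureFactory(long seed) { rng = new Random(seed); }

    // Each new creature is an independent random pick from the fixed sets.
    Creature next() {
        return new Creature(
            BODIES[rng.nextInt(BODIES.length)],
            LEGS[rng.nextInt(LEGS.length)],
            PALETTE[rng.nextInt(PALETTE.length)]);
    }
}
```

Seeding the generator makes a given population reproducible between runs, which is handy when rehearsing a performance.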
You can get the Processing sketch here (does not include the audio analysis layer).