Julien Bloit
music & gesture
Aeolus
Category: projects

At Music Hack Day, I met Warren Stringer and Matt Howell, and we teamed up around the idea of creating a dynamic graphic environment reacting to facial expressions as well as voice or instrument sounds. This is what it looked like at the end of the weekend:

The face and sound tracking happen on the left-side computer. Whenever a blowing sound is detected, a particle is emitted in the direction the face is pointing. This is picked up by the graphic rendering program running on the right side, which uses the particle coordinates to drive a fluid graphics engine. The note information from the ocarina is also picked up, and used to modify the colors and affect the dynamics of the fluids.
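The glue between the two programs is OSC messages over the network. As a rough illustration of what travels on the wire (the `/particle` address here is a placeholder of mine, not necessarily the one we used), an OSC message carrying particle coordinates can be packed by hand in a few lines:

```python
import struct

def osc_message(address, *values):
    """Pack a minimal OSC message: a null-padded address string,
    a type-tag string (all float32 here), then big-endian payloads."""
    def pad(b):
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    tags = "," + "f" * len(values)
    payload = b"".join(struct.pack(">f", v) for v in values)
    return pad(address.encode()) + pad(tags.encode()) + payload

# A particle position, as it might travel from the tracker to the fluid engine
msg = osc_message("/particle", 0.42, 0.61)
```

In practice a library handles this for you (Max/MSP's `udpsend`, or the OSC addon in openFrameworks), but the packet really is this simple.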

How it’s done:

  • Face-driven particles are a hack of Jason Saragih and Kyle McDonald’s Face Tracker project, in the openFrameworks environment
  • Sound events are detected in Max/MSP, which sends OSC to the face tracker
  • The fluids are generated with the MSA Fluids library, also running in openFrameworks
  • The ocarina also sends OSC to the fluid engine
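On the receiving side, each program just dispatches on the incoming OSC address. A minimal sketch of that routing (the `/blow` and `/note` addresses are hypothetical, standing in for whatever the Max/MSP patch actually sent):

```python
def parse_osc_address(packet):
    """Read the OSC address pattern: the leading null-terminated string."""
    return packet[:packet.index(b"\x00")].decode()

def route(packet, handlers):
    """Dispatch an incoming OSC packet to the handler for its address."""
    address = parse_osc_address(packet)
    if address in handlers:
        handlers[address](packet)

# Hypothetical topology mirroring the list above: Max/MSP notifies the
# face tracker of a blow, the ocarina notifies the fluid engine of a note.
handlers = {
    "/blow": lambda pkt: print("emit particle along face direction"),
    "/note": lambda pkt: print("shift fluid color and dynamics"),
}
```

In openFrameworks the ofxOsc addon does the equivalent of this loop for you.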

Update (June 2012):
I reprogrammed the idea using a 3D particle system, mapped to evolve in the same reference frame as the face tracker. Here’s what it looks like. For now, only two sound categories change the color and behavior of the particles. I think there is more potential here, although the face tracker sometimes lags a little depending on lighting conditions.
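Keeping the particles in the tracker's reference frame mostly comes down to one coordinate mapping. A minimal version (the exact axis conventions of my patch are an assumption here: camera pixels to a normalized [-1, 1] frame, with y flipped so up is positive) might look like:

```python
def to_tracker_space(x_px, y_px, width, height, depth=0.0):
    """Map camera-pixel coordinates to a normalized [-1, 1] frame,
    flipping y so 'up' is positive; depth places the particle in 3D."""
    nx = (x_px / width) * 2.0 - 1.0
    ny = 1.0 - (y_px / height) * 2.0
    return (nx, ny, depth)
```

With this, a particle spawned at the tracked face position stays aligned with the face overlay regardless of the camera resolution.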

