Earlier this year, I started working on the Sabour project, which needed the dancer's gestures to be interpreted in real time. We used a Kinect to track body movements, but we needed something more time-reactive and localized to specific body parts, so we agreed to test wireless IMUs.
I designed and built three sensing units, each sending its 6-dimensional data (3-axis accelerometer, 3-axis gyroscope) every 15 ms (66.66 Hz) to a wireless receiver. The data is formatted as OSC packets over UDP and sent to the computer over Ethernet. The advantage of this design is that the data is transmitted very fast and can travel from the receiver to the computer over an Ethernet cable (compared to USB solutions, this is very handy for remote setups, where the computer has to be far away from the captured scene).
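To give an idea of what travels on the wire, here is a minimal sketch of how one sample can be packed as an OSC 1.0 message. The address pattern `/imu/1` and the argument order (ax, ay, az, gx, gy, gz) are assumptions for illustration, not the exact layout used in the project.

```python
import struct

def pad4(b: bytes) -> bytes:
    """Null-pad a byte string to the next 4-byte boundary (OSC 1.0 rule)."""
    return b + b"\x00" * (4 - len(b) % 4)

def encode_imu_osc(address: str, values) -> bytes:
    """Encode a list of floats as a single OSC message (big-endian, per OSC 1.0)."""
    msg = pad4(address.encode("ascii"))                      # address pattern
    msg += pad4(("," + "f" * len(values)).encode("ascii"))   # type tag string
    for v in values:
        msg += struct.pack(">f", v)                          # 32-bit big-endian float
    return msg

# One hypothetical sample from sensor 1: ax, ay, az, gx, gy, gz
packet = encode_imu_osc("/imu/1", [0.01, -0.98, 0.12, 1.5, -0.3, 0.0])
```

The resulting datagram is only 40 bytes, which is why a 66.66 Hz rate per sensor is no problem at all for UDP over Ethernet.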
In our case, the data packets were simply received in Max/MSP with the [udpreceive] object, which fed a live mapping module linking the motion data to the video engine.
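Nothing ties the receiving side to Max/MSP: any environment that can read UDP can consume the same packets. As a sketch (assuming a simple OSC message with only float arguments, and a listening port chosen arbitrarily here), a few lines of Python are enough to decode a sample:

```python
import struct

def decode_osc_floats(packet: bytes):
    """Extract the address and float arguments from a simple OSC message."""
    # Address pattern: null-terminated ASCII, padded to a 4-byte boundary
    end = packet.index(b"\x00")
    address = packet[:end].decode("ascii")
    offset = (end // 4 + 1) * 4
    # Type tag string, e.g. b",ffffff" (leading comma, one letter per argument)
    tag_end = packet.index(b"\x00", offset)
    tags = packet[offset + 1:tag_end].decode("ascii")
    offset = (tag_end // 4 + 1) * 4
    # Arguments: 32-bit big-endian floats
    values = [struct.unpack(">f", packet[offset + 4 * i:offset + 4 * i + 4])[0]
              for i, t in enumerate(tags) if t == "f"]
    return address, values

# To listen live (port number is an arbitrary example):
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.bind(("0.0.0.0", 9000))
# address, values = decode_osc_floats(sock.recvfrom(1024)[0])
```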
Hardware used for the sensing unit:
On the receiver side:
In order to have multiple sensors send their data to one receiver, I used the XBees in API mode, which sends data frames with a checksum and a sender ID.
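The framing and checksum rule comes from the XBee manual: a frame is `0x7E`, a 16-bit length, the frame data, then a checksum byte chosen so that the frame-data bytes plus the checksum sum to 0xFF modulo 256. The sketch below builds and verifies such frames; it assumes API mode 1 (no byte escaping) and the example frame data is hypothetical.

```python
def make_frame(frame_data: bytes) -> bytes:
    """Wrap raw frame data in an XBee API frame (start byte, length, checksum)."""
    checksum = 0xFF - (sum(frame_data) & 0xFF)
    return (bytes([0x7E, len(frame_data) >> 8, len(frame_data) & 0xFF])
            + frame_data + bytes([checksum]))

def xbee_checksum_ok(frame: bytes) -> bool:
    """Verify an XBee API frame: sum(frame data + checksum) & 0xFF must be 0xFF."""
    if len(frame) < 5 or frame[0] != 0x7E:
        return False
    length = (frame[1] << 8) | frame[2]
    if len(frame) < 4 + length:
        return False
    data = frame[3:3 + length]
    checksum = frame[3 + length]
    return (sum(data) + checksum) & 0xFF == 0xFF
```

For an 802.15.4 receive frame (frame type `0x81`), the 16-bit source address right after the frame type is what lets the receiver tell the three sensors apart. Note that API mode 2 (AP=2) additionally escapes special bytes, which this sketch does not handle.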
Demonstration of the sensors' data transmission:
The base box is made from a simple laser-cut design: