Researchers have developed a series of interactive wearable instruments that respond to bodily gestures
Watching accomplished musicians in action, it can often seem that their instruments are natural extensions of their bodies – as if that violin had grown from their collar bone, or the intestinal loops of that tuba had burst out of their ample gut.
A team of researchers at McGill University’s Input Devices and Music Interaction Lab have taken that idea one step further, by developing a series of prosthetic musical instruments that sprout directly from the bodies of performers like futuristic new limbs.
“We think instruments are less about the objects themselves than the bodily gestures they invite,” says PhD researcher Ian Hattwick, who has worked on the project for the last three years with colleague Joseph Malloch. “We’re taking that idea all the way, making instruments that are literally all about the performers’ movement.”
Glowing with a ghostly white light, a translucent spine erupts from the back of one performer, rippling as she writhes across the floor to manipulate an eerie synthesised sound. Another performer wrenches the spine from her back and begins to caress it, as if fondling a pet snake, transforming the otherworldly soundtrack as her fingers dance along the vertebrae.
“We wanted to blur the boundary of when an instrument is an object and when it is part of the body,” says Malloch. “Wearing these objects, the performers have to learn new gestures and modify their own gestures accordingly. If you had an external spine, you would move very differently.”
Using digital manufacturing techniques, including laser cutting and 3D-printing, the team has developed spines, rib cages and visors, each of which responds to movement in a different way. Formed from curved lengths of transparent acrylic, the ribs and visors contain inertial measurement units, and are touch-sensitive along their length, allowing performers to adjust the sound manually as well as through motion. Fitted with wireless data transmitters, the instruments send signals to an open-source peer-to-peer software system that transforms the gestures into sound.
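The core of such a system is routing each incoming sensor signal to a synthesiser parameter and rescaling it into a musically useful range. The sketch below is a minimal, hypothetical illustration of that idea; the signal names, parameter names and ranges are invented for the example, not taken from the team's actual software.

```python
def linear_map(value, in_lo, in_hi, out_lo, out_hi):
    """Scale a raw sensor reading into a synthesiser parameter range,
    clamping to the input range so noisy readings stay in bounds."""
    t = max(0.0, min(1.0, (value - in_lo) / (in_hi - in_lo)))
    return out_lo + t * (out_hi - out_lo)

# Hypothetical mapping table: each entry links a named gesture signal
# to a synth parameter and its input/output ranges.
mappings = {
    "spine/bend": ("filter_cutoff_hz", 0.0, 1.0, 200.0, 8000.0),
    "ribs/touch": ("amplitude",        0.0, 127.0, 0.0, 1.0),
}

def route(signal_name, raw_value):
    """Look up a signal's mapping and return the scaled parameter value."""
    param, in_lo, in_hi, out_lo, out_hi = mappings[signal_name]
    return param, linear_map(raw_value, in_lo, in_hi, out_lo, out_hi)

print(route("spine/bend", 0.5))  # a half-bent spine lands mid-range
```

Keeping the mappings in a table rather than hard-coded means performers can rewire which gesture drives which sound without touching the synthesis code, which is the appeal of a peer-to-peer mapping architecture.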
The spines allow natural bending and twisting, and contain a series of three measurement devices at specific points. They are fitted with a combination of accelerometers (those cunning things in your phone that can tell up from down), rate gyroscopes, which sense angular velocity, and magnetometers that pick up on the earth’s magnetic field. By mapping these sensors at points along the spinal column, the designers can build a picture of orientation and shape – which is then used to determine the sound that’s made.
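Combining those three sensor types is a standard fusion problem: the gyroscope tracks fast rotation but drifts over time, while the accelerometer gives a slow but absolute reference to gravity. A common way to blend them is a complementary filter, sketched below for a single tilt axis; this is a generic illustration of the technique, with invented sample values, not the team's own algorithm.

```python
import math

def accel_tilt(ax, az):
    """Tilt angle (radians) recovered from two accelerometer axes
    measuring the direction of gravity."""
    return math.atan2(ax, az)

def complementary_filter(accel_angle, gyro_rate, prev_angle, dt, alpha=0.98):
    """Blend the integrated gyroscope rate (smooth but drifting) with
    the accelerometer angle (noisy but drift-free)."""
    return alpha * (prev_angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Simulate a sensor held at a constant 30-degree tilt while the
# gyroscope reports a spurious constant drift of 0.05 rad/s.
true_angle = math.radians(30)
angle = 0.0       # initial estimate
dt = 0.01         # 100 Hz sample rate
gyro_bias = 0.05  # drift the accelerometer term must correct
for _ in range(2000):
    ax = math.sin(true_angle)  # gravity components at the true tilt
    az = math.cos(true_angle)
    angle = complementary_filter(accel_tilt(ax, az), gyro_bias, angle, dt)

print(round(math.degrees(angle), 1))  # settles close to 30 degrees
```

Running one such filter per sensor unit along the spine yields an angle at each vertebral point, from which the overall curve of the instrument can be reconstructed and mapped to sound.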
“We wanted the instruments to have the simple economy of a well-designed tool, even if the purpose of that tool might be mysterious,” says Malloch. He explains how the pieces are made without screws or nails, the plastic vertebrae simply held together through friction – again to mimic natural skeletal structures. “We didn’t want them to look like they had been hand-crafted, but like they had been made by some imaginary futuristic machine.”
They may look like science fiction, but with wearable technology becoming increasingly accessible, and with the advent of Google Glass and circuit-board tattoos, performative prosthetics may well find an audience beyond the stage.