Sonalog Gypsy MIDI

Motion Capture MIDI Controller Suit
Published October 2006
By Nick Rothwell

Want to turn movement into sound? Sonalog's 'mocap suit' is probably the world's first wearable MIDI controller.

The Gypsy MIDI Mocap controller is a prime example of a crossover product: something designed for one market which finds a niche in another. The manufacturers of the Gypsy MIDI suit, Animazoo, actually make a variety of motion capture ('mocap') products for the film, media and games industries — and now, through their sister company Sonalog, they are aiming a product at musicians.

It's probably worth taking a brief diversion into the history and principles of motion capture, since it will help to explain the technology. Motion capture (or, as it is sometimes called, performance capture) is the technique of recording the physical movements of an actor or dancer in a way that can be processed and reproduced by a computer; if you like, think of it as sequencing for dance or movement. This data can then be used to 'synthesize' virtual characters' movements for films and computer games.

Motion-capture rigs fall into two main categories. The first is where the actor wears a set of reflective tags and moves inside a large frame rigged with sensors, which track the positions of the tags. This kind of system allows freedom of movement, so long as the actor stays within the frame, but suffers from technical problems like occlusion, where tags are temporarily hidden from the sensors' view.

The second kind of mocap system is one where the actor essentially wears the technology — all the sensing apparatus is fixed to the body, and there is no need for any enclosing apparatus. In this setup, the system measures the angles of the joints of the body, which is enough to track an actor's movement (mostly: actions like walking need special sensors such as inertial gyroscopes). Wearable mocap allows the user to move around in the performance space, but the gear will usually restrict the movement of the body, so it's a trade-off.

Gypsy MIDI

The Gypsy MIDI is a wearable motion-capture suit which transmits MIDI. Strictly speaking, it's only part of a mocap suit, dealing with the torso, shoulders, arms and wrists: there's no attachment for the head, and nothing from the torso down. In fact, Gypsy MIDI appears to be part of the mechanics of a whole-body suit called the Gypsy 5, repurposed for MIDI output. As such, it makes sense to think of it as a rather exotic MIDI controller, and this is the approach we'll take for the rest of the article.

Some care is needed to ensure that the Gypsy MIDI suit's delicate joints don't get damaged.

The raw MIDI data generated by the suit merely encodes the positions of the arm sensors, and there are no configuration options, so the Gypsy MIDI only becomes useful as a MIDI instrument when its data is passed through the bundled calibration and mapping software called Exo. The unit comprises a central harness, which includes the processing unit, and two flexible arm sections. The harness sits on the shoulders and is strapped firmly to the chest by some sizeable strips of Velcro. The suit's arm sections hang from the shoulders of the harness, and are strapped securely to the wearer's arms at elbow and wrist. The arm sections are telescopic, and will scale to fit the wearer.

The processing unit has a power connector and a MIDI Out socket. Unless one forks out for the wireless option, the suit connects to the rest of the world through an ordinary MIDI cable. The unit comes with two small NiCad battery packs (one is a spare) and a charger. Since the suit is cabled anyway, it would be more convenient to run a 12V DC supply alongside the MIDI cable and not have to worry about keeping NiCads charged.

The suit is rather bulky and untidy when it's not being worn, so it comes with what looks like a modified microphone stand, on which it can be hung when not in use. (Although bulky and heavy, the suit is rather delicate — one of the joints on our evaluation unit actually came apart while we were testing it — so it needs to be treated with care.) Finally, in true Ikea style, there is a special Allen key for making adjustments to the harness.


All the control sources on the Gypsy MIDI are rotational, and are positioned at the joints of the arm sections: shoulders, elbows and wrists. Each joint has a pair of sensors (potentiometers) mounted at 90 degrees, to sense two independent directions of movement, so the suit has 12 sensors in total. (Actually, it has 14, because the wrist pads are mounted on pots which allow rotation, but these don't generate data.) The shoulder sensors track the upper arm moving out or forwards, the elbow sensors track elbow bend or upper-arm rotation, and the wrist sensors track wrist bend or lower-arm rotation. Since all control is rotational, the suit isn't capable of generating discrete events such as MIDI notes; this is left to the Exo software.

Resolution And Calibration

When it comes to actual use, all sensing systems have a major issue (or, if you prefer, challenge): they have to be calibrated. The minimum and maximum values which can be transmitted by an individual sensor generally won't correspond to the minimum and maximum values which will be reached in practice, and the latter will vary according to user and circumstance. Each potentiometer in the Gypsy MIDI appears to generate MIDI Controller values from 0 to 127 during the full range of its rotation, but this is much further than it will move when the suit is being worn, so the calibration process maps this limited range to the full MIDI range, before the data is processed further. Calibration is a separate process to the preset programming which turns the data into notes, and the two phases of MIDI manipulation don't interfere with each other.
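In pseudocode terms, the calibration pass amounts to something like the following sketch (in Python, with invented names — this is an illustration of the principle, not Sonalog's code): record the minimum and maximum value each sensor actually reaches while the suit is worn, then stretch every subsequent reading onto the full 0–127 MIDI range, optionally reversing the 'sense' of the sensor.

```python
class SensorCalibration:
    """Hypothetical per-sensor calibration: observe extremes, then rescale."""

    def __init__(self, reverse=False):
        self.lo = 127          # lowest raw value seen so far
        self.hi = 0            # highest raw value seen so far
        self.reverse = reverse # the 'sense' (forward/reverse) of the sensor

    def observe(self, value):
        """Call during the calibration pass for each incoming raw CC value."""
        self.lo = min(self.lo, value)
        self.hi = max(self.hi, value)

    def scale(self, value):
        """Map a raw reading onto the full 0-127 MIDI range."""
        if self.hi <= self.lo:
            return 0  # not yet calibrated
        v = (value - self.lo) * 127 // (self.hi - self.lo)
        v = max(0, min(127, v))          # clamp readings outside the extremes
        return 127 - v if self.reverse else v
```

A sensor that only swings between raw values 40 and 80 when worn would, after calibration, deliver the full 0–127 span to the mapping stage.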

Before you can use the Gypsy MIDI suit, it needs to be calibrated.

Calibration aside, there is also an issue of resolution. Since the suit's sensors generate MIDI Controller messages, they are only capable of seven-bit resolution at best, and when the suit is being worn, some sensors will range over only 30 or 40 controller values. For the purposes of note triggering, exact sensor values aren't critical, but a sensor mapped to (say) pitch-bend will often produce audibly stepped glissandi as it moves through its limited range.
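The arithmetic makes the problem concrete. Assuming a sensor that spans only 40 usable controller steps (a plausible figure for a worn joint), quantising the 14-bit pitch-bend range onto those steps gives jumps of around 400 pitch-bend units per step — coarse enough to hear with a wide bend range:

```python
# Back-of-envelope resolution check: how coarse is pitch-bend driven
# from a sensor that only spans 40 of the 128 possible CC values?
PITCH_BEND_RANGE = 16384   # 14-bit MIDI pitch-bend values (0-16383)
usable_steps = 40          # assumed usable span for a worn sensor

step_size = PITCH_BEND_RANGE // usable_steps
print(step_size)           # roughly 409 pitch-bend units per sensor step
```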


The Exo software package serves two purposes: it acts as an instrument to transform the raw controller data from the Gypsy MIDI into notes and chords, and it deals with calibration. Since the Gypsy MIDI hardware generates MIDI, Exo is basically a MIDI transformation system: controller data comes in from the suit's sensors, and Exo plays notes and transmits other MIDI Controllers in response.

The Exo software is written in Max/MSP, which, as I'm sure many readers already know, is a graphical software construction kit for building interactive systems for MIDI, media and audio. By and large, Max/MSP applications have a somewhat rough and homegrown appearance, and Exo is no exception — it's not going to win any design awards — but it seems to work well enough. The bundled version of the application was Mac OS X-only, but since Max/MSP runs on both Mac and Windows there's no reason why a Windows version shouldn't be bundled as well, and Gypsy's web site suggests that the suit now ships with both versions. Owners of Intel Macs will either have to wait until the Intel Mac version of Max/MSP ships (a public beta has just been announced as I type), or try running Exo under Rosetta translation, which I am led to believe should work, if a little slowly.

Exo displays a main 'patching' window, which we'll come to in a moment, but before you can use the suit in anger, you need to calibrate all the sensors. This is a pretty simple task, which involves opening the calibration window, priming the calibration process, and then attempting to move all the joints to their maximum extent; the software calibrates to the extremes of movement, and on-screen faders show how far each sensor joint is moving. It is also possible to manually choose the 'sense' (forward or reverse) of each sensor, which determines which extreme of movement will correspond to MIDI value 0, and which to 127.

The supplied Exo software consists of a number of MIDI processing modules, which can trigger notes or controller data.

The calibration settings can be saved in a named preset, although since the sensor pads won't always sit in exactly the same place on the body, it'll almost certainly be necessary to recalibrate each time the suit is worn; saving calibrations is therefore of limited use, except for remembering the forward/reverse settings. It is, however, important to load up the 'default input' preset before starting, since this knows which MIDI channels the suit's sensors transmit on.

It's probably worth mentioning how (or where) presets are stored, both for calibration and for performance setup. If the Exo system was running as a Max/MSP document, each preset would be stored as a text file on disk. (It's a dump file from Max's 'coll' object, in fact.) Since Exo is a stand-alone application under OS X, these files are saved inside the application 'bundle' — in essence, hidden within a secret directory inside the application. It's possible to examine these files and hence rename them, back them up, and so on, by opening the bundle from the Finder; or you can be lazy and just back up the entire application. Since Windows doesn't have application bundles, all the files should be directly visible. (I don't have a Windows version of Exo here, so can't check the details.)

Performance Mapping

The main performance window in Exo is pretty much a blank canvas into which one can install performance (MIDI processing) modules of various kinds. It's not possible to freely mix and match modules: the software is rather strict about how many modules there can be, and where they appear on the screen. It is possible to install up to seven note-trigger modules, nine controller modules, and eight matrix modules. When a module is installed, it appears at a fixed point in the window, and modules of the same type line up in a row — the maximum number of modules allowed seems to have been determined by the default window size.

The note-trigger modules are the most important, and transform the Continuous Controller data from the suit's sensors (after calibration scaling) into note events. A note-trigger module can operate in two monophonic modes, and one polyphonic mode. The monophonic modes are best explained if you think about the playing of notes in the traditional, analogue-synthesizer, CV-and-gate model: one signal selects the pitch, and a second signal gates the note on and off at that pitch. In the Exo software, one of the Gypsy sensors sweeps through a user-defined selection of note pitches, programmed from an on-screen keyboard, while another sensor acts as the note gate according to position. (One monophonic mode is a simpler version of the other, allowing only a single pitch value.) As an illustration, a shoulder sensor could sweep the pitches while a wrist acted as gate, so that you would raise your arm to select a pitch and twist the wrist to trigger. The polyphonic mode allows multiple notes to be latched — which is useful for triggering loops, if your audio software will synchronise them for you — but you'll have to remember to release any notes that you have triggered, otherwise they'll just hang.
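The CV-and-gate model described above can be sketched in a few lines of Python. This is an illustration of the idea, not Sonalog's implementation: the threshold, callback signature and class name are all invented. One calibrated sensor sweeps through a programmed list of pitches; a second sensor opens and closes the gate.

```python
GATE_THRESHOLD = 64  # assumed: gate opens above the sensor's mid-point

class MonoTrigger:
    """Hypothetical monophonic trigger: one sensor picks the pitch,
    another gates the note on and off, CV-and-gate style."""

    def __init__(self, pitches, send):
        self.pitches = pitches  # e.g. a scale programmed on-screen
        self.send = send        # callback taking (status, data1, data2)
        self.sounding = None    # pitch currently held, if any

    def update(self, pitch_cc, gate_cc):
        """Feed the two sensors' calibrated (0-127) controller values."""
        idx = pitch_cc * len(self.pitches) // 128   # zone within the scale
        pitch = self.pitches[idx]
        gate_open = gate_cc >= GATE_THRESHOLD
        if gate_open and self.sounding != pitch:
            if self.sounding is not None:
                self.send(0x80, self.sounding, 0)   # note off old pitch
            self.send(0x90, pitch, 100)             # note on new pitch
            self.sounding = pitch
        elif not gate_open and self.sounding is not None:
            self.send(0x80, self.sounding, 0)       # gate closed: note off
            self.sounding = None
```

In the shoulder-and-wrist example, the shoulder sensor would feed `pitch_cc` and the wrist sensor `gate_cc`, so raising the arm selects a pitch and twisting the wrist fires it.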

The controller modules do as you might expect, turning sensor input into MIDI Controller data (continuous or switch), with range scaling. The matrix modules act as a modulation mechanism: sensors can enable or disable trigger or controller modules, and a matrix module can also load a pre-programmed note scale into a note trigger, opening the door to all sorts of mode and key changes.
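The scale-loading trick is easy to picture as zones along a sensor's travel. The sketch below is purely illustrative (the scales and zone logic are assumptions, not taken from Exo): a single sensor's calibrated value selects which pre-programmed pitch set a note-trigger module sweeps through, so one gesture can change mode or key.

```python
# Hypothetical matrix-style scale selection: divide a sensor's 0-127
# range into equal zones, one pre-programmed scale per zone.
SCALES = [
    [60, 62, 64, 67, 69],   # C major pentatonic
    [60, 63, 65, 67, 70],   # C minor pentatonic
]

def select_scale(sensor_value):
    """Return the scale for the zone the sensor value falls into."""
    zone = sensor_value * len(SCALES) // 128
    return SCALES[zone]
```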

In Use

I was curious about the performance potential of the Gypsy suit, so I got in touch with the folks at body>data>space, a production company involved with artistic integration of the body, technology and the environment, and engaged the services of a couple of professional contemporary dancers to test out the suit under workshop conditions. Putting the suit on involves quite a bit of messing around with brackets, wires and Velcro, followed by a bit more messing around with calibration, but the process does get easier with practice.

The suit in action with an Edirol CG8 video synthesizer.

The dancers' first comments were that the suit does restrict one's movement to quite an extent, especially if you are expecting to dance in it. A learning process is also needed to get a feel for which kinds of body movement produce the greatest range of values in each of the sensors. And, of course, the sensor values are all relative, so it's necessary to think in terms of body posture rather than position in space when programming zones and triggers, and it's always going to be tricky to hit a particular controller value purely by gesture.

When it came to triggering notes, the suit seemed to perform well enough. We didn't notice any serious problems with latency, but sensing systems — especially those with intermediary software — are notorious for response and timing issues, so it would be unwise to expect one's virtual drum kit to respond as quickly as the real thing. Having played around with audio, we then plugged the Gypsy into an Edirol CG8 video synthesizer driving a big-screen projector, which gave us a couple of hours of harmless amusement, and there's no reason why the Gypsy couldn't be used for any other kind of MIDI application.


If we are to view the Gypsy MIDI as some kind of performance system, there are two obvious questions to be answered: does it work, and does it look good? I'm not really qualified to answer the second question: it really comes down to one's on-stage persona, tricks and techniques. I know some artists who could build great live performances around a Gypsy MIDI suit, and others who would merely look like plonkers.

As to the first question, here at Cassiel Central we've been through all manner of MIDI controllers and sensing systems, from fader boxes (motorised and not) through accelerometers, ultrasound systems, camera tracking, joysticks, game controllers and Buchla devices, and some common issues emerge. Conventional controllers such as keyboards and mixers have controls which can be physically moved, so that they can be operated by touch and inspected visually. If the controls do not move and offer no visual feedback, then a layer of cognitive support disappears, and one tends to lose accuracy and/or sophistication. If the controls cannot even be seen, then even more accuracy is lost.

So, the Gypsy MIDI is always going to be a low-resolution controller compared to traditional devices, but whether this matters depends on what you're trying to do with it. On the other hand, controlling sound (or for that matter, video) by moving one's arms and wrists is really rather fun. In the end, it's up to you to decide whether the performance potential outweighs the drawbacks.
