After nearly 40 years everybody's favourite digital music protocol is about to get an upgrade...
At the time of writing, we're approaching the 37th anniversary of MIDI, the Musical Instrument Digital Interface. At the time of its launch it was ground-breaking: connect two keyboard instruments together with a simple cable, and the keyboard on one can play the sounds on the other! It led the way to rackmount synthesizers and samplers (why pay for a physical keyboard for each device when you can share a single 'master keyboard' amongst a stack of compact sound engines?), and gave birth to computer-based MIDI sequencers for music composition and, in due course, to software instruments and effects.
But by modern standards, MIDI seems pretty antiquated: today's gigabit Wi‑Fi has a data rate 30,000 times faster than classic MIDI over 5-pin DIN cables, the sockets for those cables are awkward and bulky, and the MIDI messages themselves are low resolution, with no provision for the kind of detailed gestural expression found in traditional musical instruments — or, indeed, in modern electronic ones.
And yet, MIDI is still alive and well, whilst other contemporary or newer technologies are just museum relics. (SCSI anyone? Or how about ADB, the Apple Desktop Bus?) If you use a modern DAW there's every chance that you've recently been editing note durations or velocity values in a MIDI track in much the same way as a user of the Yamaha QX1 hardware sequencer might have done in 1984. Ironically, MIDI's primitive simplicity might be seen as an asset: the messages are so simple that producers can edit recorded performances easily, note by note.
For almost as long as we've had what is technically known as MIDI 1.0, pundits have been predicting the arrival of MIDI 2.0, or describing attempts to enhance the existing specification to address its shortcomings or add new features. Some attempts were short-lived, such as the MIDI Sample Dump Standard, a way of (very slowly) transferring audio samples over a MIDI cable. Other ideas were rather outlandish: a scheme which I remember being referred to as XMIDI proposed using the unconnected pins in the 5-pin MIDI connector to carry additional data. A completely different XMIDI proposal from IBM in 1992 added new controller messages and, of all things, a BASIC-style for-loop construct for stutter and drum-roll effects. To be fair, many MIDI 1.0 extensions have been more successful: General MIDI for consistent sound sets in consumer devices, and MPE (MIDI Polyphonic Expression) for multitouch gestures in controllers from ROLI, Roger Linn and many others.
Despite these various extensions, enhancements to MIDI 1.0 have been very conservative, as manufacturers and customers alike have resisted attempts to replace it and render existing equipment and software obsolete. If you still have a functioning Yamaha DX7 synthesizer from 1983, you can control it from any modern DAW with the addition of a cheap MIDI interface box. On the other hand, one technological enhancement has changed the MIDI ecosystem: the affordable, ubiquitous USB standard. MIDI over USB is fast, is expandable via standard USB hubs, and is bidirectional, so a DAW can see which devices are attached and get data from them easily. The convenience of MIDI-over-USB, the simplicity of the MIDI protocol itself, and resistance to technological obsolescence have all contributed to MIDI 1.0's 37 years of domination. (It is not without challengers, though: OSC, or Open Sound Control, is also widely used between internet-enabled devices.)
When I heard about MIDI 2.0 at the Audio Developer Conference in London last year, I initially assumed it was another speculative proposal with little chance of taking off. Then I got the chance to talk to Mike Kent, chairman of the MIDI-CI Working Group at the MIDI Manufacturers Association (MMA), who spent 20 years at Roland Japan and has been working on the MIDI 2.0 specification. As I write this, the core specifications of MIDI 2.0 have just been adopted by the MMA. No shipping hardware or software currently supports MIDI 2.0 natively, although various vendors (including Yamaha and Steinberg) have for a while been testing experimental systems with MIDI 2.0 data embedded in MIDI 1.0 SysEx messages. Yamaha, Roland and Korg demoed MIDI 2.0 prototypes exchanging data at NAMM this year, and Roland's new A-88MKII controller keyboard is touted as 'MIDI 2.0-ready'. Native MIDI 2.0 software applications will require operating system support for the new message types and format before we can make effective use of them in the studio.
MIDI 2.0 is an extension of MIDI 1.0 in the sense that two devices communicating using MIDI 1.0 can negotiate a conversion to 2.0 and take advantage of all its benefits. MIDI 2.0 is 'transport agnostic' — there's no specification for the physical connection, although it's most likely to be adopted over USB. MIDI 2.0 is not by any definition faster than 1.0 — the speed depends on the transport — although 2.0 is generally denser, carrying higher-resolution messages. The documentation published by the MIDI Association (the user group administered by the MMA) claims compatibility between 2.0 and 1.0 which preserves interoperability. That's a strong claim and needs some interpretation: in practice a studio or live rig might contain a mix of active 1.0 and 2.0 connections on distinct USB cables, and even do real-time conversion between the two, but a legacy MIDI 1.0 device will not understand MIDI 2.0 messages without conversion and possible loss of resolution. However, if a mix of MIDI 1.0 and 2.0 devices are plugged together, they will communicate: those capable of upgrading their connection to 2.0 will do so, while the others will stick with 1.0. The MIDI-CI mechanism, described below, is responsible for establishing the most appropriate type of MIDI to use for each device.
A central feature of MIDI 2.0 is MIDI-CI, or MIDI Capability Inquiry. At present it is the only part of the MIDI 2.0 specification formally adopted by the MMA. The first part of MIDI-CI is a process called Protocol Negotiation, where a pair of devices decide whether they are going to use MIDI 2.0 to communicate. This negotiation has to be done using traditional MIDI 1.0 System Exclusive messages — MIDI 2.0 cannot be activated until the devices have both agreed to do so. Obviously a bidirectional connection, such as over USB, is required; otherwise the initiating device would not be able to receive a reply.
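To make that handshake concrete, here is a toy sketch of the fallback logic. The function and message names are my own illustrative inventions, not the MIDI-CI wire format; only the overall shape (an inquiry carried over MIDI 1.0 SysEx, a reply, and a fall back to 1.0 on refusal or silence) follows the behaviour described above.

```python
def negotiate_protocol(send_sysex, wait_for_reply, timeout_ms=300):
    """Toy model of MIDI-CI Protocol Negotiation (names are illustrative).

    The inquiry travels as an ordinary MIDI 1.0 System Exclusive message,
    so a legacy device can safely ignore it. A bidirectional transport is
    required so that wait_for_reply() can ever return anything.
    """
    send_sysex("protocol-negotiation-inquiry")
    reply = wait_for_reply(timeout_ms)
    if reply == "accepts-midi-2":
        return "midi2"   # switch this connection to MIDI 2.0 packets
    return "midi1"       # refusal or timeout: stay on MIDI 1.0

# A responder that understands MIDI 2.0 upgrades the connection;
# a legacy device that never replies leaves it on MIDI 1.0.
upgraded = negotiate_protocol(lambda msg: None, lambda t: "accepts-midi-2")
legacy = negotiate_protocol(lambda msg: None, lambda t: None)
```

The key point the sketch captures is that the worst case is the status quo: a device that does not recognise the inquiry simply never answers, and the connection carries on as plain MIDI 1.0.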
If the negotiation succeeds and the connection is converted to MIDI 2.0, the devices then communicate using 'Universal MIDI packets', which are packets of data a multiple of 32 bits in size. (Coincidentally, 32-bit packets are already used for MIDI over USB, although in a different format.) If the devices use these MIDI 2.0 packets, they might still decide to work at the level of MIDI 1.0: there's a format for embedding MIDI 1.0 channel messages into MIDI 2.0 packets. This could be useful to encapsulate messages in a MIDI 2.0 data stream that are intended to control a legacy MIDI 1.0 device.
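As a sketch of what that embedding looks like, here is how a MIDI 1.0 note-on might be wrapped in a single 32-bit Universal MIDI packet. The field layout is based on my reading of the published packet format (a 4-bit message type, a 4-bit group, then the familiar status and data bytes) and should be treated as illustrative rather than definitive.

```python
def ump_from_midi1_note_on(group, channel, note, velocity):
    """Wrap a MIDI 1.0 note-on in one 32-bit Universal MIDI packet.

    Assumed layout, high bits first: 4-bit message type (0x2 denoting a
    MIDI 1.0 channel voice message), 4-bit group, then the ordinary
    status byte and two 7-bit data bytes.
    """
    status = 0x90 | (channel & 0x0F)      # note-on on the given channel
    return ((0x2 << 28) | ((group & 0x0F) << 24) |
            (status << 16) | ((note & 0x7F) << 8) | (velocity & 0x7F))

# Middle C at velocity 100, on the first group and channel (both encoded as 0):
packet = ump_from_midi1_note_on(0, 0, 60, 100)
# packet == 0x20903C64
```

Note that the status, note and velocity bytes are untouched: the packet is just the old message with an address label bolted on the front, which is what makes lossless encapsulation of legacy traffic straightforward.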
MIDI-CI also incorporates something called Profile Configuration. This allows devices to declare compatibility using standard profiles: a piano profile might specify how exactly a MIDI piano responds to sustain or to differences in note velocity, while an organ profile might detail the arrangement of its drawbars. Hence, a DAW might present a specialised on-screen interface depending on the profile(s) declared by instruments. Profiles could be useful even for non-musical applications, such as MIDI-controlled lighting systems.
Finally, Property Exchange is a general mechanism to exchange parameter information, preset names, device configuration and so on. This is the sort of information that would be contained in System Exclusive messages in MIDI 1.0, although software instruments and effects already make much of this kind of information available to their hosts via the VST standard.
Assuming that two devices (say, a keyboard and a sound generator) are both fully MIDI 2.0 capable and have negotiated to that effect, all communication is via Universal MIDI packets rather than the old 2- and 3-byte messages with status and data bytes. Whilst a 32-bit packet can contain a MIDI 1.0 channel message (note on/off or controller change), high-resolution MIDI 2.0 notes and control changes require 64-bit packets. New System Exclusive messages are delivered in 64-bit packets, whilst large data transfers can make use of 128-bit packets.
A MIDI 2.0 note requires 64 bits rather than the 24 bits of MIDI 1.0 because it carries much more information. For a start, alongside the MIDI channel (1-16) there is a 'group', also ranging from 1 to 16, which can be thought of rather like a cable selection. There are still only 128 possible pitch values, but 65536 possible velocity values. New attribute fields are provided for applications such as note-by-note microtuning (down to 1/512th of a semitone) or articulation. Control change messages now allow for 32768 controllers (registered or assignable, each in 128 banks of 128), with 32-bit resolution. There are also per-note controllers, for polyphonic expression or note-by-note modulation, with extremely fine pitch or value control: essentially a very high-resolution equivalent of MPE.
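A rough sketch of such a note message, again with the exact field layout taken as an assumption from my reading of the packet format: the first 32-bit word identifies the message, group, channel, note and attribute type, and the second word carries the 16-bit velocity alongside 16 bits of attribute data. The velocity upscaling helper is a simple bit-replication scheme of my own for illustration, not the specification's official translation rule.

```python
def upscale_7_to_16(v7):
    # Illustrative bit-replication upscaling: maps 0 -> 0 and 127 -> 65535
    # smoothly. The spec defines its own mapping; this is an approximation.
    return (v7 << 9) | (v7 << 2) | (v7 >> 5)

def ump_midi2_note_on(group, channel, note, velocity16, attr_type=0, attr_data=0):
    """Build an assumed 64-bit MIDI 2.0 note-on as two 32-bit words.

    Word 0: message type 0x4 (MIDI 2.0 channel voice), group, status,
    note number and attribute type. Word 1: 16-bit velocity, then
    16 bits of attribute data (e.g. per-note microtuning).
    """
    word0 = ((0x4 << 28) | ((group & 0x0F) << 24) |
             ((0x90 | (channel & 0x0F)) << 16) |
             ((note & 0x7F) << 8) | (attr_type & 0xFF))
    word1 = ((velocity16 & 0xFFFF) << 16) | (attr_data & 0xFFFF)
    return word0, word1

# Middle C at full velocity: the old 0-127 range now spans 0-65535.
w0, w1 = ump_midi2_note_on(0, 0, 60, upscale_7_to_16(127))
```

Whatever the precise field order turns out to be in shipping implementations, the arithmetic above shows why the message doubled in size: velocity alone has grown from 7 bits to 16, and the attribute fields are entirely new.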
Program change messages now incorporate an optional bank select operation, doing away with the messy combination of control change and program change needed to perform bank select in MIDI 1.0.
MIDI 2.0's system exclusive message format includes support for 8-bit data, overcoming the inconvenience (for software developers specifically) of having to pack 8-bit data into 7-bit values in MIDI 1.0.
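For illustration, here is one common way that 7-bit packing is done in MIDI 1.0 (this particular grouping scheme is an example of my own choosing; several variants exist in the wild): each run of up to seven 8-bit bytes becomes eight SysEx-safe bytes, with the seven stripped top bits collected into a leading header byte.

```python
def pack_7bit(data):
    """Pack 8-bit bytes into 7-bit-safe bytes for MIDI 1.0 SysEx.

    Each group of up to seven input bytes is emitted as a header byte
    (holding the gathered top bits) followed by the low 7 bits of each byte.
    """
    out = []
    for i in range(0, len(data), 7):
        chunk = data[i:i + 7]
        header = 0
        body = []
        for j, b in enumerate(chunk):
            header |= ((b >> 7) & 1) << j   # collect bit 7 of each byte
            body.append(b & 0x7F)           # keep only the low 7 bits
        out.append(header)
        out.extend(body)
    return out

def unpack_7bit(packed):
    """Reverse pack_7bit: restore the original 8-bit bytes."""
    out = []
    for i in range(0, len(packed), 8):
        header = packed[i]
        for j, b in enumerate(packed[i + 1:i + 8]):
            out.append(b | (((header >> j) & 1) << 7))
    return out
```

The shuffling costs one extra byte per seven and obscures the data in hex dumps; MIDI 2.0's 8-bit SysEx format removes the need for it entirely.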
MIDI 2.0 also includes a feature called Jitter Reduction. Part of the MIDI-CI negotiation allows devices to run their own internal timers, and for messages to be time-stamped and transmitted before they are needed, for accurate timekeeping. (The OSC protocol already has a similar feature.) As the name suggests, this is intended to reduce jitter or timing variations rather than compensate for overall latency. Jitter Reduction is optional, and MIDI 2.0 can be used in real time without it.
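The idea can be illustrated with a toy model (my own simplification, not the specification's actual mechanism): the sender stamps each message with its own clock, and the receiver renders each message a fixed safety margin after its stamp. Variable transmission delay then no longer matters, provided every message arrives within the margin.

```python
def render_times(stamped_events, safety_margin):
    # Each event is (sender_timestamp, message). Rendering every message
    # at timestamp + margin reproduces the sender's exact spacing,
    # whatever jitter the transport added along the way.
    return [(ts + safety_margin, msg) for ts, msg in stamped_events]

# Three events 480 ticks apart at the sender, rendered 5 ticks late
# but with the 480-tick spacing intact: jitter gone, latency constant.
events = [(0, "note on"), (480, "note off"), (960, "note on")]
rendered = render_times(events, 5)
```

This also shows why the feature reduces jitter but not latency: the safety margin is a small, constant delay added to everything.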
Until the entire MIDI 2.0 specification is ratified by the MMA, manufacturers start incorporating it into products, and the likes of Apple and Microsoft support it in their operating systems, I can only speculate about its impact in a studio or live rig. (Perhaps I'll revisit this article in a year or two and see how accurate these predictions were.)
In a setup where a computer is USB-connected to a selection of physical controllers and sound sources, nothing much will change in terms of wiring: the computer will still be a USB host and everything else a device, and USB hubs and cables will work unchanged. Adding a MIDI 2.0 device via USB won't affect an existing setup: some connections will negotiate to MIDI 2.0, some won't. MIDI 1.0 devices will continue to work as long as DAWs keep supporting 1.0 — or someone will sell a converter from MIDI 2.0 to 1.0.
High-resolution and polyphonic articulation should give us some exciting new controllers and sound sources — or software — capable of responding to them. MPE has brought us a long way, but it's implemented as a bit of a hack and it lacks resolution. Hopefully, current MPE-capable controllers will be in the frame for firmware upgrades to gain MIDI 2.0 compatibility.
It's slightly less clear what the implication will be for DAWs. These already support editing of automation data via curves and breakpoints, and some allow per-note controller editing. The resolution and richness of MIDI 2.0 performance data will require some clever interface design to facilitate editing, especially if new controllers let us record complex polyphonic performances, and tweaking individual MIDI messages might become a thing of the past. MIDI‑CI will allow DAWs to discover a lot more about external gear than they can at the moment, and might even allow editing panels to be built automatically. It's less clear how MIDI 2.0 will impact software plug-ins, since they already provide rich interfaces and fine control at the DAW level, but per-note articulation is a contender for improvement.
A lot of the pieces still have to fall into place before MIDI 2.0 takes off, but after 37 years I can't wait for the upgrade.