Last month, Paul D. Lehrman described how he became involved in a project to create a performable edition of George Antheil's prophetic, but never performed, Ballet Mécanique, in its original version with 16 synchronised player pianos and a human ensemble. With the process of sequencing the 1240‑measure work complete, he was now faced with the task of preparing the first ever live performance of the piece... This is the last article in a two‑part series.
By April 1999 I had the entire Ballet Mécanique sequenced and stored in my Macintosh, and using the synths and samplers in my home‑studio, as well as a newly installed surround monitoring system, I could revel in an all‑MIDI performance of this piece — a piece which no‑one, including the composer, had ever heard before.
I burned a couple of CDs and sent them off to Bill Holab, at that time the publications director at Schirmer, who had hired me for the project, and Charles Amirkhanian, the composer who had worked with Antheil's music for over a decade. My job for Schirmer was pretty much done. But I wanted to take the project even further.
The University of Massachusetts at Lowell, where I had been part of the faculty since 1988, has a first‑class percussion ensemble, led by one of Boston's top freelance percussionists, Jeff Fischer. The university runs an excellent course in Sound Recording Technology (which I taught), so there is a wealth of technical expertise among the students and faculty, as well as top‑notch recording and production facilities. There is also a 1000‑seat concert hall in the same building as the studios, with all the amenities of a professional theatre, including lighting and sound systems, and tie lines to the recording rooms. Since the hall regularly books professional theatrical performances, there's a full‑time box‑office manager and a publicity office.
Given the unique resources available, it seemed as if this unpretentious state university might be the perfect place to present the world premiere performance of the 75‑year‑old monsterpiece. Michael Bates, director of academic relations at Yamaha, had told Schirmer that his company would provide 16 Disklaviers — Yamaha's line of MIDI‑compatible acoustic player pianos — to the first group who performed the piece. I called Mike and told him of my idea, and his only question was "When do you need them?"
Jeff Fischer, like me, knew about the 1952 version of the Ballet Mécanique — the one without any player pianos — and he had actually played it, in a performance by a Boston new music group called Alea III. I showed him the 1924 score, and although he realised immediately that this was a very different piece from the one he knew, he decided that his ensemble would be able to cut it. We sat down with the box‑office manager, and found a date, November 18th, to put it on the concert schedule.
In the weeks that followed, I recruited the two 'live' pianists the piece requires. One of them was Juanita Tsu, head of the piano department at the university, who was initially a little uneasy about the performance — she is more experienced with classical and romantic pieces which don't call for banging the forearms on the keyboard — but she soon became enthusiastic. The other pianist was John McDonald, a composer and pianist on the faculty of Tufts University (where I am now teaching) who, it turned out, had played in the same Alea III performance of the piece as Jeff.
We decided that Juanita and John could not play conventional grand pianos. Up against the din of 16 player pianos — not to mention the other instruments and noisemakers — their parts would never be heard. We could mic them, but I was already foreseeing a technical horror show dealing with all the Disklaviers, and I wanted to simplify things. So they played on 88‑note MIDI keyboards (a Roland and a Kurzweil) through two of the Kurzweil MicroPianos I had used when I was programming the piece. These were fed through speaker wedges on the floor next to them on stage. So it came to pass that the acoustic pianos were being played by the computer, while the human beings played electronic pianos. I think the irony would have delighted George Antheil.
The programme's technical director, Bill Carman, hauled a Macintosh Quadra 650 out of an unused studio, and we loaded it with Opcode's Vision and a Digidesign SampleCell card to handle the propellers, bells, and siren. (Even though I had loaded the samples into a Kurzweil in my home studio, I still had the original samples in Peak, so I reformatted them for SampleCell.) A student, Wayne Cochrane, was called into service as my technical assistant, and a dozen or so other students volunteered for the various crews: stage, lighting, sound, video (we wanted to preserve this!), and also — since the Disklaviers would have to be shoved out of the way several times during the rehearsal process to allow other groups to use the stage — piano movers.
Our only serious problem was finding a sound system, not to mic the player pianos, as they were more than loud enough, but to handle all of the samples. When you counted the electronic pianos, we needed at least seven channels of amplification: one each for the three propellers, two for the siren (it was a stereo sample which would have lost something in mono) which could also handle the bells, and two for the piano modules. The school did not own a sound system that could handle the job and, unfortunately, with public education budgets what they are, they didn't have the funds to rent one (or anything else, for that matter).
Parsons Audio, a local pro audio dealer, put me in touch with Paul Carelli, national tour manager at EAW, the Massachusetts speaker manufacturer, and Paul rounded up seven LA‑series trapezoidal speakers, with appropriate stands and cables. Parsons loaned us a Hafler amp, one of the students donated a Mackie amp from his band's sound system, and the university had an extra Yamaha amp on hand. I contributed a venerable Mackie 1604 mixer from my closet full of antique hardware (which was going to be raided a lot), and a spare Nikko amp, and we were set.
Even though the Ballet Mécanique is played at a constant tempo, considering that there are over 600 time‑signature changes and several long silences, 'who's leading whom?' becomes an important question. Should the player pianos follow the conductor, and speed up or slow down when he or she wants to, or should the computer sequencer be the master, and the conductor follow along? And how do you get a conductor to follow a machine?
There are plenty of ways to make a sequencer follow a live conductor, from sophisticated devices like the 'MIDI baton', developed at the MIT Media Lab, or George Litterst's In Concert accompaniment program (which allows you to teach a sequence how to listen for specific input events) to simply assigning a musician to keep his hand on a slider or wheel that controls the tempo of the sequencer, and his eye on the conductor. This method would be very similar to what was available to Antheil, except that in his day the tempo of a piano‑roll mechanism would be adjusted by moving a lever.
With all of the Ballet Mécanique's metre changes, however, it seemed unlikely that any existing software‑based systems would be able to handle the job: most auto‑following programs are designed to work with relatively simple metres, and in this case, should the conductor make a single mistake, it might throw off the whole system. Having a musician control the tempo in real time is also very difficult when there is a half‑second delay in the response of the player pianos, as there is in the Yamaha Disklaviers. When the results of a tempo change aren't heard instantaneously, the chances of overshooting are very high.
So we decided that the computer would be the master and Jeff, the conductor, would be the slave. I wrote into the sequence a click track, using kick and sidestick sounds, which he could listen to over headphones. Jeff and I worked over the score in minute detail, deciding how bars with odd metres like 5/8 and 11/16 would be subdivided — the kick going on strong beats and the stick on weak beats. We used an old drum machine for the click track, but soon found that the sounds weren't working: even in headphones, with four bass drums and three xylophones (which by now had grown to four, since Jeff had realised that the parts were actually not playable by only three people) flailing away on the stage, the drum machine was getting lost. So we changed the sounds to a cowbell and a loose snare with a long decay. These sounds were so different from everything else that there was no way that Jeff could miss them.
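The subdivision scheme — one strong click starting each beat group, weak clicks filling it out — can be sketched in a few lines. The groupings shown (3+2 for 5/8, 3+3+3+2 for 11/16) are illustrative assumptions; the actual subdivisions Jeff and I settled on varied from bar to bar.

```python
# Sketch: generate a click pattern for an odd metre from its chosen
# beat groupings. 'kick' marks the strong beat that begins each group,
# 'stick' the weak beats inside it. The groupings used below are
# illustrative, not the ones from the actual performance score.

def click_pattern(groupings):
    """Return one click label per subdivision of the bar."""
    pattern = []
    for group in groupings:
        pattern.append('kick')                    # strong beat
        pattern.extend(['stick'] * (group - 1))   # weak beats
    return pattern

# A 5/8 bar felt as 3+2:
print(click_pattern([3, 2]))   # ['kick', 'stick', 'stick', 'kick', 'stick']

# An 11/16 bar felt as 3+3+3+2 — eleven clicks, four of them strong:
print(click_pattern([3, 3, 3, 2]))
```

In the actual sequence the two click sounds lived on their own track, so they could be muted for the recording sessions without touching anything else.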
As we got close to the concert we replaced the headphones with a Shure PSM600 in‑ear monitoring system, which worked incredibly well, and provided such a good seal in Jeff's ear that even during the long silences at the end, the audience never heard the click. It occurred to me that someday someone might want to perform the piece with every member of the ensemble wearing a personal monitor, which would allow it to be done without a conductor at all. I wonder if George would have liked that idea...
I also put vocal cues for rehearsal letters into the sequence, using samples of my own voice, so that if the conductor ever got lost, he could find his place again before too much time went by. And I put in some 'countdowns' to certain bars, to prepare Jeff for important transitions that might otherwise be hard to hear. The most important countdown was at the beginning. Because of the Disklaviers' built‑in delay (see last month's instalment), they needed a 'head start', so I inserted an extra three beats at the top of the sequence, during which my voice said "three, two, one", and the first chord of the player piano parts entered half a second before the fourth beat. Knowing the tempo of the piece, we could easily calculate how far in advance, in beats and ticks, the pianos had to enter.
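That head-start arithmetic is simple enough to show. The 500ms delay is from the Disklavier behaviour described above; the tempo of 150 bpm and the 480 ppqn sequencer resolution are illustrative assumptions, not the values actually used in the performance.

```python
# Sketch of the head-start calculation: how many beats and ticks the
# player-piano parts must be advanced to cancel the Disklaviers'
# response delay. Tempo and resolution here are assumed values.

DELAY_S = 0.5      # Disklavier response delay, per last month's instalment
TEMPO_BPM = 150    # assumed tempo
PPQN = 480         # assumed ticks per quarter note

seconds_per_beat = 60.0 / TEMPO_BPM            # 0.4 seconds
delay_in_beats = DELAY_S / seconds_per_beat    # 1.25 beats
delay_in_ticks = round(delay_in_beats * PPQN)  # 600 ticks

print(f"advance the piano parts by {delay_in_beats} beats "
      f"({delay_in_ticks} ticks)")
```

Since the piece runs at a constant tempo, this offset could be applied once, at the top of the sequence, rather than recalculated along the way.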
The chance to do a concert with 16 player pianos doesn't come up very often. Trying to make the most of it, I looked around for other composers and pieces that might benefit from this unique opportunity. Richard Grayson, chairman of the music department at Occidental College in Los Angeles, sent me MIDI files for three pieces using various combinations of Disklaviers and MIDI synths. I saw instantly that these ingenious, unique, and often very funny pieces would fit our concert perfectly.
Two of them used an intriguing shareware program from Japan called MIDIGraphy which displays notes and other commands on a graphic screen, much like the piano‑roll screen in many sequencers, but with more options and jazzier graphics. One piece, in fact, 'Fantasy on Broadway Boogie‑Woogie', used, as its raw material, a lines‑and‑colours painting by Piet Mondrian. Grayson broke down the artwork into individual elements, and then created a sequence that reconstructed the elements aurally and visually so, as the piece played, the painting assembled itself on the screen. The college's media centre lent us a Macintosh‑compatible video projector, which we could connect to Richard's Powerbook to display the images to the audience.
When we finally hooked Richard up, on the morning of the concert, we made a disturbing discovery: while MIDIGraphy is ostensibly OMS‑compatible, it can only address one OMS cable at a time. Our multi‑cable MIDI network would have to be completely taken apart to play Richard's pieces. Fortunately, my assistant Wayne realised that if we took a second MIDI interface (in this case an old Passport MIDI Transport), and attached it to Richard's computer, we could simply run a MIDI cable from it to one of the two unused inputs on my Kawai switcher. Instead of re‑cabling everything, all we had to do between pieces was to change the source selectors on the switcher.
From Dr. Jürgen Hocker in Germany (whose two custom MIDI player pianos were used by the Ensemble Moderne earlier in the year for their performance of the piece), I received MIDI files of two studies for two player pianos by the late expatriate composer Conlon Nancarrow. Nancarrow, an American who had fought in the Spanish Civil War, lived much of his life as a political exile in Mexico. His fascinating life and unique approach to music could fill a book (and has). He developed his own musical language using player pianos almost exclusively, but because he didn't have access to MIDI until late in his life, he barely got to hear any of the various pieces that he wrote for multiple player pianos in proper sync. Here was an opportunity for us to give them an audience.
And, for the occasion, I worked out an arrangement of the last movement of Mendelssohn's 'Italian' symphony, a piece I've played many times from various seats in the orchestra, as well as in an uncredited late‑19th century four‑hand piano version. Very fast and contrapuntal, almost Bach‑like, it lends itself well to multiple pianos. My version used eight different parts, each to be played by two Disklaviers.
To fill out the program, Jeff chose two percussion‑only pieces: 'Double Music', a meditative piece from 1941 by John Cage and Lou Harrison, which uses instruments like brake drums, water buffalo bells, sistrums, and a gong which is lowered into a tub of water whilst it is sounding; and 'Ritmicas No. 5 & 6', by Cuban composer Amadeo Roldán, written in 1930, and probably the first works ever written for percussion ensemble. These are very exciting pieces, using the whole vocabulary of what we now call Latin percussion, integrating African, Caribbean, and jazz rhythms in a way which still sounds fresh.
We certainly weren't going to let this event go by without recording it. On the plane to the AES convention the year before, I had run into Jonathan Wyner, a local classical engineer and owner of M‑Works Mastering. I told him about the project and he said, "I'm there". When I called him a year later for a firm commitment, he was still there.
We decided we would record the concert, but also schedule a second session in the hall, two days later, where we could do a studio‑quality recording, re‑taking any sections that we weren't happy with. Along with technical director Bill Carman, Jonathan and I designed two recording systems, using mics from the school's impressive collection as well as B&Ks and Schoeps from Jonathan's. The setup for the concert was relatively simple, since we couldn't move things around too much between pieces: a cardioid pair arranged in ORTF configuration was on the stage just behind the conductor's podium, and a pair of spaced omnis was placed in the first row of seats. The hall is very live, so additional mics further from the stage were unnecessary.
For the recording sessions we added spot mics on the four groups of Disklaviers, the xylophones, the gong, the bass drums (an M‑S pair), and the speaker array at the back; and also direct feeds from the samplers and synths. To offset the 'boominess' of the hall, I rented 12 dozen blankets from a local moving chain (I had to go to three locations), and our intrepid student crew covered the balcony and the rear of the orchestra seats with them.
Thanks to the generosity of the professional audio manufacturing community, we were able to produce a completely professional recording. Besides the speakers from EAW and the personal monitor from Shure, we borrowed three AD8000 A‑D convertors from Apogee Digital (who also donated DTRS tape), and 16 channels of mic preamps from Millennia Media. Tascam came through with three DA88s, and we got cables from Redco Audio, plus a snake from a colleague at another local college.
Robert Lyons, a producer at WGBH‑FM, the local National Public Radio station, is an old friend, and when I mentioned the project to him over dinner one night, his eyes lit up. One of Bob's current assignments was to find interesting things to put on the WGBH's web site, which streams the station's on‑air signal most of the time. Could wgbh.org webcast the concert live? Somehow, the idea of using end‑of‑the‑20th‑century technology to broadcast the vision of a Futurist composer from the early 20th century seemed most fitting.
By moving various segments of heaven and earth, Bob managed to get an ISDN line from our corner of the campus down to WGBH in Boston (about 35 miles), and from there split the audio three ways, to serve three different Web audio platforms: an ISDN signal (using a Telos Zephyr) went to RealNetworks in Seattle, Washington, who encoded and served the RealAudio streams; an audio signal went into a Macintosh, which converted it to QuickTime and sent it over the Internet to an Apple/Akamai distribution centre, which then served the QuickTime streams, which were available both on WGBH's site and Apple's QTV site; and the third audio signal was encoded as Windows Media and sent via the Internet to Public Radio Interactive and Westwind in Denver, who served the Windows streams. A little high (10kHz) and low (80Hz) shelving was applied to the signal before it was distributed, as well as 1:1.8 compression, but otherwise it was sent untouched from the 2‑track output of the school's Soundcraft mixing board.
The final days before the concert, as you might imagine, were insane. We had to deal with the usual last‑minute bugs, failures, and miscommunications that always accompany a big, tech‑heavy show. But as I listened to the final, relatively glitch‑free dress rehearsal, I was overwhelmed by the power of the piece. Although it had been pretty exciting to listen to in my studio, the sheer volume of sound filling the concert hall as the bass drums boomed, the xylophones (being played with super‑hard mallets) cracked, and the pianos crashed and banged, was stupendous. And when the siren sample came sailing out over the players' heads, it was almost too much to bear. Although the ensemble played the piece quite a bit slower than my sequenced version (in some parts of the piece, Antheil called on the players to do things that were simply inhuman at the specified tempo), the live musicians brought drama, shape, and above all, musicality to the piece that I had not been able to hear in my studio. It was quite a moment.
The night of November 18th went far more smoothly than any of us had a right to expect — nothing significant went wrong. We lost a speaker for a moment in one of Richard Grayson's pieces when a crew member tripped over a cable, but that was it. The students and pianists played brilliantly. The concert ran for almost three hours, but none of the 1000 people in the concert hall left, and the ensemble got a standing ovation.
We had convinced The House Ear Institute, the California‑based organisation which has done much to publicise musicians' hearing problems, to send us 1000 pairs of earplugs, and a box of literature about their very valuable work. Many audience members were happy to have them.
The webcast worked perfectly, with about 4000 listeners all over the world staying with the broadcast for at least 20 minutes. Charles Amirkhanian, the Antheil expert who is currently in Italy on a fellowship, sent me an email at 5am his time, absolutely ecstatic about it. The San Francisco Bay Guardian wrote a glowing review, calling us "one of the four most important artistic events" in the reviewer's life, also based on the webcast. Bob Lyons told me later that 40 percent of the audience was listening on Windows, 35 percent on RealAudio, and 25 percent on QuickTime. And I even got to listen to it: in his office near the control room, Bill Carman had WGBH's site up on two computers, and at one point during the piece, I was able to sneak offstage, run up to the office, and check it out. Very cool.
In January, Jeff Fischer and I sat down in my studio and listened to the rough mixes of the recording session, deciding which takes to keep, and where to make edits. A couple of weeks later, Jonathan Wyner and I went down to Sounds Interesting, a studio owned by musician and producer Erik Lindgren (who was at the concert), in Middleborough, a small town in horse country about an hour south of Boston. In the midst of a light snowstorm, we mixed the recording from the session on Erik's Yamaha O2R and his Stage Accompany monitors. Jonathan and Bill's choices of mics and their placement were spot on: the sense of immediacy, clarity, and air around each instrument is almost palpable. I wasn't happy with the two‑track mix from the concert — the same mix that went out over the Web — because, believe it or not, the Disklaviers weren't loud enough. However, this problem was easy to remedy in the mix, and we were able to get terrific instrumental balances. Jonathan edited and mastered the recordings on his Sonic Solutions system, and we were ready to make a CD. The recording — which includes the Cage/Harrison, two of Grayson's pieces, the Roldán, and my Mendelssohn arrangement — is now out on the Electronic Music Foundation label, and can be ordered from their web site, https://web.archive.org/web/2014....
And the new/old Ballet Mécanique has taken on a life of its own, being performed by the American Composers' Orchestra at Carnegie Hall in New York, and presented by the San Francisco Symphony, conducted by Michael Tilson Thomas, in an all‑Antheil concert at the orchestra's Davies Hall. I have even been talking with some producers about doing a documentary film about Antheil's lost masterpiece.
What was most remarkable about the Ballet Mécanique project was that it never stopped being fun. The technical challenges, the musical research, the 'aha!' factor when I figured something out, the 'wow!' factor as I heard the piece develop, the meeting of people and drawing them into the project, and seeing them get as enthusiastic as I, have all been a tremendous high. It has been a rare opportunity, something I feel very fortunate to have been given, to bring together just about everything I've ever learned or ever done — in terms of music composition and performance, computer technology, audio electronics, sound design, researching, and teaching — into one giant project. And it's even fun to write about it. I am deeply grateful to all of the people who helped to make it happen. And I think George Antheil would be too.
Because of the density of the MIDI data and the tight rhythms, it is imperative that each of the four player‑piano parts in the Ballet Mécanique be on its own MIDI cable. A Mark Of The Unicorn MIDI Time Piece served this purpose well, each group of pianos having its own output defined in OMS, with a fifth output for the bells. But when I started working with the Disklaviers, I made a discovery. Remember the Disklaviers' built‑in 500ms delay, which I mentioned last month? Not only does it show up at the keyboard, but it also shows up at the piano's MIDI Thru jacks! If we were to daisy‑chain the four Disklaviers in a group, each one would be later than the one before it in the chain, by half a second. This is a violation of the MIDI specification, but I wasn't about to argue the point with Yamaha.
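The arithmetic that ruled out daisy-chaining is worth spelling out: the delay is not fixed, it accumulates with each piano's position in the chain. A minimal sketch, assuming the ~500ms Thru delay described above:

```python
# Sketch: cumulative delay in a daisy-chained group of four
# Disklaviers, each adding roughly 500ms at its MIDI Thru jack.

THRU_DELAY_S = 0.5  # per-piano delay observed at the Thru jack

delays = [position * THRU_DELAY_S for position in range(4)]
for position, delay in enumerate(delays, start=1):
    print(f"piano {position} in chain: {delay:.1f}s behind the sequencer")
```

By the fourth piano the chain is a full one-and-a-half seconds behind, which in a piece built on tight rhythmic unisons is obviously hopeless; hence the splitter boxes described below.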
The solution was to use MIDI splitter boxes which, as it turned out, in this era of inexpensive multi‑port MIDI interfaces, are not that easy to come by. We pressed one MOTU MIDI Express into service, and set it to '1x6' mode, which took care of one of the four groups. Another composer whose works we were presenting lent us an old 4‑way splitter, so that was for group number two. I remembered that deep in my by now well‑rummaged antique closet I had a Kawai 4 x 8 manually switched MIDI patchbay, so I set that up in 2x4 configuration to handle the remaining two groups.
As we approached the concert date, it occurred to me that it might be worth investigating using real bells instead of samples. I had rescued an old Radio Shack alarm bell from my antiques closet to make my samples, and I combed the local hardware stores for others, but I only found two that would work. So I went onto the Web, and at the site belonging to Edwards Signals, I found more bells than I could possibly imagine. I chose four that seemed to have similar voltage and current requirements, and ordered them.
I also contacted a company in Vancouver, Canada, called MIDI Solutions, who make a MIDI‑to‑relay convertor: a device that responds to MIDI commands by opening and closing eight different relays. The box worked perfectly, and since a little SysEx doesn't scare me, it was straightforward to program. However, its relay contacts were not rated high enough to handle the substantial currents (and very high transients) that the bells drew, so I recruited Coleman Rogers, an engineer and fellow Lowell faculty member, to help design and build a system whereby the MIDI Solutions box's relays triggered seven other relays, which in turn controlled the bells.
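The logic of the note-to-relay stage is easy to sketch. This is a hypothetical illustration only: the note numbers, the `BASE_NOTE` mapping, and the behaviour are my assumptions, and the MIDI Solutions box's actual SysEx configuration protocol is not reproduced here.

```python
# Sketch: mapping MIDI note events to an 8-relay state, in the spirit
# of the MIDI-to-relay convertor described above. BASE_NOTE and the
# note-per-relay mapping are hypothetical, for illustration only.

BASE_NOTE = 60  # assumed: relay 0 responds to middle C, relay 1 to C#, etc.

def relay_for_event(status, note, relays):
    """Update an 8-element relay state list from one note event.
    A relay stays closed (bell ringing) for as long as the note is held."""
    index = note - BASE_NOTE
    if 0 <= index < len(relays):
        relays[index] = (status == 'note_on')
    return relays

relays = [False] * 8
relay_for_event('note_on', 62, relays)   # close relay 2: that bell rings
print(relays)  # [False, False, True, False, False, False, False, False]
relay_for_event('note_off', 62, relays)  # open it again: bell stops
```

Holding the relay closed for the duration of the note means the bells' ring lengths could be written into the sequence like any other note durations.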
Coleman mounted the bells on a piece of thick plywood, but when we turned them on, the plywood made more noise than the bells did — instead of 'ringing', it 'buzzed', since the plywood was acting as a sounding board for the buzzer mechanisms at the heart of the bells. We tried isolating them with rubber washers and pads, but it made little difference. In a flash of inspiration, I attached each bell to a small piece of heavy butcher‑block wood, and then screwed small hooks into the butcher‑block and into the bottom of the plywood mainframe. I put two large hooks at the top of the plywood, so it could be hung with heavy chains from a lighting pipe on stage and then, using several lengths of light, plastic‑coated chain, dangled the bells from the mainframe. It made all the difference, and sounded great. It looked a little Rube Goldberg‑ish, but in the right lighting it was very 'post‑industrial', especially after one of the students spray‑painted the whole thing black. Again, I think Antheil would have approved.
Paul D. Lehrman is a composer, author, and educator, and recently joined the faculty of Tufts University in Massachusetts. He is a columnist for Mix magazine (US) and has been a contributor to Sound On Sound since 1986. For more information about Paul's project, and some audio tidbits, visit www.antheil.org. You can order a CD of the Ballet Mécanique and other works from www.cdemusic.org/emfmedia.