Technology is a funny thing — it lumbers on for a few years giving you 'more' and 'faster' for less money, but then every once in a while there's a revolutionary step rather than an evolutionary one. Having looked at some of the very clever software that's around at the moment, especially in the fields of modelling and resynthesis, I have a feeling that we could witness a big upheaval in electronic synthesis over the next two or three years. Traditionally, synthesis has been directed mainly towards keyboard players, as electronic keyboards are essentially switches that ideally lend themselves to the unambiguous triggering of envelopes, oscillators and all the other trappings of sound generation. But when you look at the wider picture, the keyboard is just about the worst possible way to control an expressive instrument. Pianos, organs, harpsichords and other keyboard instruments are very limited in their capability for expression — you could even argue that the accordion is the most expressive acoustic keyboard instrument, because of the way the bellows and keys interact to produce the sound.
Of course we've had MIDI guitars, MIDI violins and probably MIDI spoons, come to think of it, but none of these has been entirely successful, because they all try to make the guitar or other instrument behave like a keyboard as far as MIDI is concerned. That means stripping away all the expression that normally goes with the instrument and replacing it with one or more MIDI controllers. Even if you can come to terms with playing one of these MIDI instruments, it soon becomes clear that you've thrown away all the subtle means of expression that attracted you to the instrument in the first place.
What am I leading up to here? Well, I think we could be on the threshold of re-inventing synthesis with the electric guitar, rather than the keyboard, as the prime means of control. The market is a lot bigger, for a start: there must be at least 30 guitar players for every keyboard player (judging by the makeup of bands in my area, at any rate), and next to the violin the electric guitar is one of the most expressive instruments there is. The new technology wouldn't use pitch-tracking, as most present guitar synths do, but it would still require a separate pickup for each string. Using resynthesis techniques, the waveform from each string could then be transformed directly into the desired synthesized sound, and, where appropriate, playing on different parts of the string, damping it or playing pinched harmonics would translate directly into the target sound. In other words, as the guitar waveform was modified by different playing techniques, so would be the resynthesized output, giving you a direct means of putting expression into the sound using normal playing techniques rather than wheels and levers.
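To make the idea concrete, here's a purely illustrative miniature (it reflects no real product, and every function name here is my own invention): it simulates a plucked string, extracts only the amplitude envelope, the simplest kind of 'expression', and uses that envelope to drive a completely different target timbre, so playing dynamics carry straight through to the synthesized sound.

```python
import math

SR = 8000  # sample rate in Hz, kept low for a quick sketch


def plucked_string(freq, seconds, decay=3.0):
    """Crude plucked-string stand-in: a decaying sum of three harmonics."""
    out = []
    for i in range(int(SR * seconds)):
        t = i / SR
        env = math.exp(-decay * t)  # natural decay of the string
        s = sum(math.sin(2 * math.pi * freq * h * t) / h for h in (1, 2, 3))
        out.append(env * s)
    return out


def rms_envelope(signal, frame=256):
    """Per-frame RMS level: the 'expression' extracted from the pickup."""
    env = []
    for start in range(0, len(signal), frame):
        chunk = signal[start:start + frame]
        env.append(math.sqrt(sum(x * x for x in chunk) / len(chunk)))
    return env


def resynthesize(envelope, freq, frame=256):
    """Drive a different timbre (odd harmonics, square-ish) from the
    extracted envelope, frame by frame."""
    out = []
    for fi, amp in enumerate(envelope):
        for i in range(frame):
            t = (fi * frame + i) / SR
            s = sum(math.sin(2 * math.pi * freq * h * t) / h for h in (1, 3, 5))
            out.append(amp * s)
    return out


guitar = plucked_string(110.0, 1.0)   # simulated open A string
env = rms_envelope(guitar)            # dynamics follow the player's touch
synth = resynthesize(env, 110.0)      # new timbre, same expression
```

Softer picking or damping would shrink the extracted envelope, and the synthesized sound would follow suit; a real system would track spectral shape and pitch per string as well, but the principle is the same.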
If this sounds familiar, that's not surprising, as these ideas (split-pickup resynthesis of the guitar's output, the ability to add expression directly to the output sound) are basically the same as those underpinning Roland's V-Guitar, originally launched a decade ago. However, the more limited processing available back then meant that the 'resynthesis' amounted to little more than altering some of the harmonics in the guitar's output. Today, much more could be done, although I'll admit that there are problems still to be overcome. Firstly, guitars don't have infinite sustain, so any synthesis system would probably have to incorporate a hold pedal, or perhaps a specially built guitar with infinite sustain, like the very first Roland guitar synth, the GR-500.
There's also the business of recording the sound into a sequencer in a way that allows it to be edited. The system wouldn't use MIDI, but it's just possible that the resynthesized sound could be represented as metadata that could be recorded and edited in much the same way as MIDI. If not, then it's no big deal: most guitar players don't complain that they can't edit their performances on a MIDI grid! Such an instrument wouldn't necessarily be able to provide an accurate emulation of every existing instrument, but then neither can MIDI keyboards. After all, when was the last time you heard a guitar patch on a synth that sounded like a guitar rather than some form of piano? And if an infinite number of sounds were all we needed, the present sound libraries would already come close to fulfilling our requirements. Ultimately, the sounds themselves matter far less than the expression the instrument lets you put into them using your normal playing techniques, which is why the electric guitar is capable of putting even the most versatile keyboard to shame.
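As an illustration of what such metadata might look like (a hypothetical format of my own devising, not any real standard), a resynthesized performance could be logged as time-stamped expression events that a sequencer could later adjust, much as it edits MIDI notes today:

```python
from dataclasses import dataclass, field


@dataclass
class ControlEvent:
    """One time-stamped expression reading taken from a string pickup."""
    time: float        # seconds from the start of the take
    string: int        # 1-6, as on a guitar
    amplitude: float   # extracted envelope level, 0.0-1.0
    brightness: float  # crude spectral-tilt measure, 0.0-1.0


@dataclass
class Take:
    """A recorded performance as an editable stream of events."""
    events: list = field(default_factory=list)

    def add(self, ev):
        self.events.append(ev)

    def scale_dynamics(self, factor):
        """The kind of edit a sequencer might offer: rebalance dynamics
        after the fact without re-recording the part."""
        for ev in self.events:
            ev.amplitude = min(1.0, ev.amplitude * factor)


take = Take()
take.add(ControlEvent(0.00, 6, 0.9, 0.7))  # hard pick on the low E
take.add(ControlEvent(0.25, 6, 0.6, 0.5))  # softer follow-up
take.scale_dynamics(0.5)                   # tame the whole passage
```

The attraction is that the events describe expression rather than notes, so editing them reshapes the performance without flattening it into keyboard-style note-on/note-off data.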
Paul White
Editor In Chief