
Hardware design paradigms: why use them?

Sounding Off
By Simon Price
Published May 2004


Now that studios are increasingly dominated by software, why are we still hanging on to hardware design paradigms?

If you've ever used Apple's OS X operating system, you'll probably have enjoyed the way that the icons in the 'Dock' (the strip of shortcuts at the bottom of the screen) grow and shrink as your mouse rolls over them in a pleasing ripple of smooth expansions and contractions. Just as in real life, as the graphical objects approach, new pieces of information are revealed.
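As a rough illustration of what the Dock is doing, here's a minimal sketch of proximity-based magnification: each icon's size is a function of its distance from the cursor, so detail grows only where your attention is. The names and numbers (DockIcon, the 96-pixel radius, the 2x maximum scale) are illustrative assumptions, not Apple's actual implementation.

```typescript
// A minimal sketch of dock-style magnification: each icon's rendered size
// depends on how close the cursor is, so information is revealed only
// where the user is actually looking. All names here are illustrative.

interface DockIcon {
  centreX: number;  // icon centre along the dock, in pixels
  baseSize: number; // size at rest, in pixels
}

// Returns the rendered size of an icon given the cursor position.
// Icons within `radius` of the cursor grow smoothly up to `maxScale`;
// everything further away stays at its base size.
function magnifiedSize(
  icon: DockIcon,
  cursorX: number,
  radius = 96,
  maxScale = 2.0
): number {
  const distance = Math.abs(cursorX - icon.centreX);
  if (distance >= radius) return icon.baseSize;
  // Cosine falloff: 1 directly under the cursor, 0 at the edge of the radius.
  const falloff = 0.5 * (1 + Math.cos((Math.PI * distance) / radius));
  return icon.baseSize * (1 + (maxScale - 1) * falloff);
}

// Example: three icons, cursor hovering over the middle one.
const icons: DockIcon[] = [
  { centreX: 40, baseSize: 32 },
  { centreX: 80, baseSize: 32 },
  { centreX: 120, baseSize: 32 },
];
console.log(icons.map((i) => magnifiedSize(i, 80).toFixed(1))); // ["52.1", "64.0", "52.1"]
```

The cosine falloff is just one plausible curve; the point is that the display is driven by where the user's attention is, not by a fixed layout.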

About The Author

Simon Price is a freelance engineer and Pro Tools editor, and is part of the Flatpack production team. He also writes Pro Tools Notes for SOS.

Whoever came up with the OS X Dock understood that we're generally only interested in a limited amount of information at any one time. And with a mouse or trackpad, we can certainly only alter one thing at a time. The Dock exaggerates the visual version of the 'cocktail party effect', where your hearing can only focus on one conversation at a time in a crowd of people. In fact, the human visual system is extremely centre-weighted. Try looking directly at the full-stop at the end of this sentence while trying to read what's in the next column. Unless you let your eyes flick across, the most you can say is that there is some text there, but that's about it!

On a page, all the information has to be there all of the time, even when it's not being 'used' (read): it's a fixed format. A computer screen is a fluid format: information stored in the computer can be displayed in any way and at any time. I'm stating the obvious, right? Then why does so much of my music software present me with a static screen crammed with tiny dials? I sometimes find using Native Instruments' Reaktor (for example) a physically painful experience. Trying to program a step sequence by making sub-millimetre mouse movements on a row of 16 knobs, each a few pixels across, feels like some kind of bizarre torture instead of a fun creative exercise. The SAS should use it as one of the 'stress positions' they put captured enemies in during interrogation. Why, in any case, does a 'virtual' row of knobs have to look anything like a real row of knobs?

There are, in fact, some good answers to this last question. Knobs and faders are very efficient interfaces because they both control a parameter and display its current value. A row of faders is in fact a decent information design solution, translating smoothly from the physical world to the on-screen GUI world. Furthermore, by mimicking existing physical instruments and equipment, you make the software instantly usable by anyone who already knows how to use the hardware. Besides, sometimes you wouldn't want it any other way: for Arturia's Minimoog V plug-in to not look like a Minimoog would be just plain disappointing. But how far can this go?

The cabling on the back of the virtual rack in Propellerhead Reason makes it easy to figure out how to connect devices together, as well as being an awesome marketing meme. But look what happens when you get loads of cables packed in there: you can't see where anything is going. In the real world, you'd grab hold of the cable and feed it through your hand until you found the end. In Reason, you can grab the end of a jack and the other cables become transparent so you can see what's happening. Here the need for a solution with no real-world equivalent was identified and met.
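A minimal sketch of that idea, assuming cables are drawn as coloured curves with a per-cable opacity; the names here (Cable, cableOpacity) are hypothetical and not taken from Reason itself:

```typescript
// A minimal sketch of Reason-style cable dimming: while one cable is being
// dragged, every other cable fades back so the grabbed one stands out.

interface Cable {
  id: string;
  colour: string;
  from: { x: number; y: number };
  to: { x: number; y: number };
}

// Decide how opaque each cable should be for the current drag state.
function cableOpacity(cable: Cable, grabbedId: string | null): number {
  if (grabbedId === null) return 1.0;           // nothing grabbed: draw everything solid
  return cable.id === grabbedId ? 1.0 : 0.15;   // dim everything except the grabbed cable
}

// Example: three patch cables, with "cable-2" currently being dragged.
const cables: Cable[] = [
  { id: "cable-1", colour: "red", from: { x: 0, y: 0 }, to: { x: 100, y: 40 } },
  { id: "cable-2", colour: "blue", from: { x: 10, y: 0 }, to: { x: 90, y: 80 } },
  { id: "cable-3", colour: "green", from: { x: 20, y: 0 }, to: { x: 60, y: 60 } },
];
for (const cable of cables) {
  console.log(cable.id, cableOpacity(cable, "cable-2")); // only cable-2 stays at 1.0
}
```

The design choice is the interesting part: rather than imitating the way you'd trace a cable by hand, the screen simply suppresses everything you're not interested in.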

It's interesting to look at circumstances where the software world surpasses the hardware interface that's being replaced. Two examples are audio editors and samplers. Chopping up and manipulating waveforms is far easier than editing tape, and a computer screen beats a hardware sampler's LCD hands down when creating a complex multi-zone sample patch. What's significant, I'd suggest, is that both these tasks were previously handled by devices with difficult and non-intuitive interfaces. A modular analogue synth, on the other hand, has a great user interface, but when reproduced on-screen, it's no longer so great.

Sooner or later the usefulness of simulating a hardware synth will dwindle, as the majority of users will have started out on software studios. What's needed is a complete re-think of how we interact with these devices. You can start by removing all the unnecessary graphics and making efficient use of the screen. Ditch that realistically drawn oscillator waveform selector knob that sits next to four LEDs and little squiggly wave pictures, and give me a big honking square button that steps through the options and displays the current shape. Let me just click on something and then take over with the cursor keys: if I have to struggle with fine mouse adjustments, the interface has failed. Show me one thing at a time and make it big!

Take a look at Native Instruments' Battery: you click on the drum pad you want to edit and half the screen gets devoted to it. Or try Pro Tools' simple editing system, with its extreme zoom key and context-sensitive cursor. Finally, consider Ableton's Live: someone has really thought about that bare-bones, pot-free, single-track-focus display. Try it out on someone who's under 10, and they're not worrying about the lack of little drop-shadowed fader caps: they're banging out tunes in next to no time.
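To make that concrete, here is a minimal sketch of the kind of control being argued for: one big button that steps through the oscillator shapes on a click and always shows the current one, with the cursor keys taking over once it's selected. Everything here (WaveformSelector, the list of shapes) is an illustrative assumption rather than any real plug-in's code.

```typescript
// A minimal sketch of a step-through waveform selector: click to cycle,
// cursor keys to adjust once selected, no fine mouse movement required.

type Waveform = "sine" | "triangle" | "saw" | "square";
const WAVEFORMS: Waveform[] = ["sine", "triangle", "saw", "square"];

class WaveformSelector {
  private index = 0;

  get current(): Waveform {
    return WAVEFORMS[this.index];
  }

  // A click simply steps to the next shape and wraps around.
  click(): Waveform {
    this.index = (this.index + 1) % WAVEFORMS.length;
    return this.current;
  }

  // Once the control is selected, the cursor keys adjust it directly.
  keyPress(key: "ArrowUp" | "ArrowDown"): Waveform {
    const step = key === "ArrowUp" ? 1 : -1;
    this.index = (this.index + step + WAVEFORMS.length) % WAVEFORMS.length;
    return this.current;
  }
}

// Example: click twice, then nudge back down with the keyboard.
const osc = new WaveformSelector();
console.log(osc.click());               // "triangle"
console.log(osc.click());               // "saw"
console.log(osc.keyPress("ArrowDown")); // "triangle"
```

A click or a key press is the whole gesture, and the current shape is always displayed, which is exactly the control-plus-display efficiency that made knobs and faders good hardware in the first place.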