What Windows music software works well with multi-touch screens, and what feels a bit out of touch? Find out in our in-depth guide.
Most of us now have an iPad or some sort of tablet or smartphone. What’s more, we’ve probably tried using it to make beats, play synths, do some field recording or control our studio computer with a swipe-swipe of our fingers. We’re completely at home with the multi-touch screen environment. But we’re also perhaps grumbling at the size of the tablet/phone screen, quickly running out of processing power, and troubled by how best to integrate our devices into a larger studio setup. Conversely, on our desktop and laptop studio machines, we have far more power available and access to all the software tools we could wish for, but often find ourselves reduced to controlling them with a mouse.
Multi-touch technology has been very slow to make any headway in the world of desktop computing — and not just in terms of music production software, by any means. Surprisingly, perhaps, given the ubiquity of the iPad, Apple’s OS X doesn’t support multi-touch, but on ‘the other side’ it’s been available in some form or other since Windows 7. And, with high-quality, 24-inch, 10-point touchscreens now available for a very modest outlay (around £300), Windows 8 maturing through version 8.1, and Windows 9 on the horizon, some developers are now making significant progress. So, maybe it’s finally time to figure out how and where multi-touch-capable software could enhance your own recording studio, whether that be a modest home-studio setup, or something on a grander scale.
A number of multi-touch technologies have battled it out for supremacy, and each has its pros and cons:
1. Resistive-touch technology uses pressure on the screen to register each input. Before the iPhone, it was found on most mobile touchscreen devices, such as PDAs and mobile phones, and tended to use a stylus for superior accuracy. This is still in use in the graphics tablet market.
2. Infra-red technology creates an optical grid across the screen, and registers a ‘touch’ when the beams are interrupted. This is particularly suited to larger screens. It’s what the Microsoft PixelSense was based on, and it can be found today in Slate Digital’s Raven MTX. It benefits from great accuracy, not having to use annoyingly reflective glass, and the ability to register touches from any object (not just a finger). However, infra-red screens are vulnerable to accidental ‘touches’ from elbows, clothing, insects and so on.
3. Capacitive technology has risen to the top, primarily through Apple’s use of it. It’s durable, reliable and accurate, with a good resolution (although not as good as a stylus), and the price has come down significantly due to the sheer number of phones and tablets using it. Capacitive screens work by creating a minute electrical field, from which a capacitive object (such as a finger) draws current, creating a voltage drop at that point of the screen. Most phones and tablets now use a variant called Projected Capacitive Touch technology, which essentially doubles up the grid for improved accuracy and tracking, and also supports passive styli and gloved fingers.
Microsoft introduced native multi-touch support with Windows 7, but two things prevented it from really catching on: the expense of the hardware, and the fact that a desktop OS designed primarily for a mouse isn’t particularly suited to being operated with your fingers. In an act of pure genius (or madness, depending on your point of view) Microsoft then designed Windows 8 to offer the user two distinct interfaces: the Modern UI (formerly known as Metro), launched from the Start screen and intended for touch, and the standard desktop, for use with a mouse and keyboard. It meant that the OS could be used on multiple devices: tablets, phones, laptops and desktops could all use it, and be touchable and futuristic in every environment.
Unfortunately, for the vast majority of people without a multi-touch interface, it was confusing and a bit frustrating, in that half of the OS couldn’t comfortably be used when all you had was a mouse! Microsoft addressed many of these concerns with Windows 8.1, and further updates have brought the Modern and the Desktop interfaces much closer together. The experience for the mouse and keyboard user is now much more that of an enhanced or augmented desktop, rather than a touch interface that was out of their reach. At the same time, sales of all-in-one computers with multi-touch screens, hybrid laptops and Surface tablets have grown at a healthy rate, and so multi-touch technology is fast becoming ‘normal’ on regular Windows computers.
So what does the Windows App Store offer in terms of music apps? Unfortunately, not a lot! There’s a sprinkling of piano and guitar-strumming apps, and some sample-triggering and remixing ones. There’s a DJ remix app from Magix called Music Maker Jam, which is pretty cool in a DJ-remixing-preset-samples kind of way, and a recording studio from Glouco, which does at least allow you to record via a mic input and sequence the included instruments. However, Windows Store apps have been hampered by two things when it comes to music production. Firstly there’s been no support for MIDI, and secondly, the output latency is only as good as the system’s soundcard using standard Windows drivers.
With Windows 8.1, Microsoft revealed a MIDI API (application programming interface), which for the first time allows a Modern app to use a MIDI input. Image Line took advantage of this in FL Studio Groove, a decent music-making app that lets you sequence, mix and mess about with a range of sample-based instruments and drums — and, of course, you can play the sounds from a MIDI keyboard.
The MIDI API is the first release from a new team of creative people at Microsoft who are working to improve the MIDI and audio aspects of Windows apps for the forthcoming Windows 9. This is unprecedented, and potentially very exciting for music makers. Pete Brown, from Microsoft’s DX Engineering Engagement and Evangelism department, said that they’re doing some serious engineering here, including on audio latency for Universal apps (apps that run across all Windows devices). They’re working with a lot of partners, big names and small, from both hardware and software worlds, to help prioritise, prototype apps, and so on. He tells me they’ve made sure that the approach is aligned with industry needs and requirements, and not just what Microsoft think needs to be done. It’s been the most open development process he’s ever seen, and he says the response from partners so far has been extremely positive.
My view, though, is that the Windows App Store misses many of the key points of running multi-touch technology on a desktop computer. These apps are designed to work across lots of different, low-power devices — which makes them little different from what we have now with Apple’s iOS. Yet my Windows computer has oodles of processing power, great big wads of RAM and vast canyons of storage. I don’t need the Anamoog or Sunriser synth in the Windows App Store (nice as that would be). What I really want is multi-touch control over my studio software — including my DAW, with its arrangement and editing pages and numerous faders on the mixer, and plug-ins such as Native Instruments Komplete, Vienna Strings, Omnisphere, the Arturia V Collection and Waves, all plugged in and working together.
The first thing to realise is that multi-touch is not mouse emulation, as it was on the old single-touch screens you might find in supermarkets or information kiosks. So, although it sometimes appears as though you’re just mousing about with your finger, it doesn’t always work as expected. In an application not designed specifically for touch control (we’ll call these ‘non-touch’), like Cubase, a single finger can access all the menu items, controls and parameters, just as with a mouse. But with some of the plug-ins you’ll find you can’t play the virtual keyboard unless you pull your finger across the notes, giving you a sort of Stylophone effect.
This problem is more obviously demonstrable in Adobe’s Photoshop: you can select all the tools and menu items, but you can’t actually use your finger on an image — nothing happens! Oddly, if you start with your finger off to the side of the page and then drag it onto the image, it then allows you to draw — but only as long as your finger stays in contact. Adobe say that this is because there’s no touch standard, and they already have their own APIs for use with Wacom’s pen and touchscreen products. They’re also waiting for Apple to join in the game, which, I fear, is something we’re going to hear from a lot of developers of software aimed at creative professionals.
Slight oddities aside, most DAW software actually works very well with single touches on a multi-touch screen. In my tests, Cubase, Pro Tools, Reason, Studio One, Reaper and Tracktion all happily let me poke around to my heart’s content. Ableton Live proved good for launching clips, entering notes and moving regions, but I ran into trouble when attempting to move parameters: once I’d grabbed a control with my finger, the knob or slider would zoom to the maximum or minimum value with the slightest finger movement without letting me easily set any value in between. Fortunately, there’s a fix for this. You have to create an ‘options.txt’ file in the Preferences folder, which lurks in the back end of the dusty reaches of your file system, and add the line “-AbsoluteMouseMode” (more precise information can be found at: www.ableton.com/en/articles/optionstxt-file-live). This allows all of the parameters to be moved much more smoothly. Unfortunately, I’m not aware of similar fixes for Bitwig Studio, which has exactly the same problem, or Digital Performer 8 (DP8), where some plug-ins exhibit this behaviour.
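To show just how small that Ableton fix really is, here’s a minimal Python sketch of the idea. The helper function is my own invention, not anything Ableton supply, and the exact location of the Preferences folder varies with your Live version and username, so treat the path comment as an example only:

```python
from pathlib import Path

def enable_absolute_mouse_mode(prefs_dir: Path) -> bool:
    """Append the '-AbsoluteMouseMode' flag to Options.txt in the given
    Preferences folder, creating the file if necessary. Returns True if
    the flag was added, False if it was already present."""
    options = prefs_dir / "Options.txt"
    existing = options.read_text() if options.exists() else ""
    if "-AbsoluteMouseMode" in existing:
        return False
    with options.open("a") as f:
        f.write("-AbsoluteMouseMode\n")
    return True

# On Windows the Preferences folder typically lives somewhere like
# %AppData%\Ableton\Live x.x.x\Preferences -- adjust for your own version.
```

Restart Live after editing the file, and the knobs and sliders should track your finger smoothly.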
In the project/session/arrange window, with the exception of Live, Bitwig and DP8, all DAWs responded to the pinch/zoom two-finger gesture to either expand the track height or extend the timeline. Reaper even managed to do both directions at the same time. So, although these programs are not multi-touch compatible, there’s not that much you can’t do with your fingertip, assuming you can get your fudgy finger on the sometimes tiny knobs. You could load Reason, Live or Pro Tools onto a Windows 8 Pro tablet, such as the Microsoft Surface, and get on with making music without having to add a mouse to the equation.
Similarly, with stand-alone plug-ins and instruments, such as Native Instruments Reaktor or the Arturia Mini V, you’ve got single-finger control over all the parameters — the only problem is the Stylophone effect on virtual keyboards described earlier. Even though, especially on a tablet, you can go without a mouse, there’s no reason why you would want to do so on a desktop machine. The beauty with multi-touch on the desktop is that you can use everything. So, you can use your mouse and keyboard as normal, but perhaps when tweaking plug-ins or working closely in the arrange page you can simply reach out and touch it. Which is fantastic!
Cakewalk’s Sonar X3 deserves a special mention because it’s pretty much the only mainstream DAW (I’ll discuss some less widely used ones later) to implement some degree of multi-touch. In fact, Cakewalk first introduced this a few years ago with Sonar X2a, and the implementation in Sonar X3 is largely the same. Here’s a quick overview of some key features:
1. Console View: all the faders and pan, sends and other controls are all gloriously multi-touchable. You can use all your fingers, all at once, and mix to your heart’s content.
2. Arrange View: the pinch/zoom works in both directions, allowing you to zoom in time or expand track height independently.
3. Matrix View: You can drag/drop samples, launch clips and move things about with as many fingers as you like, making this performance side of Sonar a real joy to use — Ableton and Bitwig really should take note.
Sonar struggles, though, when it comes to consistency. In the Pro Channel expansion to the Console view, the knobs in the EQ, Tube and Compressor respond to a rotary, sort of half-circle finger movement, but the knobs in the rest of the Pro Channel respond to an up-and-down movement. You can’t control more than one knob at a time, although you can move other controls on the console. The EQ’s visual display suffers from the same ‘zooming about’ problem found in Live, though they’ve dealt with this issue in X3 by sliding out a lovely large EQ window, which works perfectly with multi-touch.
Meanwhile, back in the arrange window, nothing is actually touchable! You can’t move any regions, cut them up or edit them in any way. Nor can you add notes to the piano roll or change any automation. All the arrange-page things that can be done in the other non-touch-enabled DAWs can’t be done in the touch-enabled Sonar, which is a bit strange. The changes they made to the Pro Channel EQ show exactly what’s required for touch to work effectively: you need big knobs. The Console view is hampered in places by the size and throw of the faders and some of the small controls. Cakewalk have built multi-touch into their existing GUI, and although it works well in some areas it also demonstrates why this might not be the best way to approach it — although, of course, you still have your mouse and keyboard.
There’s a handful of other music software out there that’s serious about multi-touch, from the simple, via the extraordinary, to the professional. Probably the best-known is OpenLabs’ StageLight, developed in part with Mike Shinoda from Linkin Park and Timbaland. It was released with the intention of being the ‘GarageBand alternative’ for Windows. Version 1 was simple, but it felt like there were some things missing; it just needed a bit more depth. Version 2 was just being released at the time of writing, and it seems they might have got things a bit more right. The look and feel is the same cool, flat, neon Tron style that we find on many touch-style music apps, and although it feels a lot like a Windows Store app, it lives on the desktop (albeit in full-screen), which gives it some distinct advantages over the likes of FL Studio Groove and Magix Music Maker: full USB MIDI control, access to ASIO drivers, and support for VST Instruments and plug-ins.
StageLight has the standard arrange window, with tracks and a timeline, piano roll and automation, but it also has drum pads, a step sequencer, and a virtual keyboard that you can lock to various preset tunings, making it very easy to play all the right notes. In version 2 they’ve added in some nice-sounding synths and instruments, all with touchy parameters, and they’ve introduced an Ableton Live-style loop arranger, with large clip boxes to poke with your fingers. One very neat feature is that, through its support of VST plug-ins, it includes a multi-touch GUI version of the standard VST parameters window. It’s very simple, with each parameter displaying just a slider and a value, but it hints at what’s possible. Matthew Presley, Product Manager at Open Labs, mentioned how the right-click element of touch — where you hold until a menu appears — is something they’ve found frustrating. In refreshing the interface for version 2, they decided to get around that by creating a ‘Charms’-style toolbar at the side with all the editing tools, including a ‘Duplicate’ button, which takes the finger pain out of copying and pasting. Their core concern was to make it easy, so that people can just get on and make music.
At $10, there’s nothing really to touch StageLight. The pricing model is similar to that used in so many iOS and Android apps, and it’s something we’ll probably see much more of — the standard software is very cheap, or free, and then, through an in-app store, you can purchase additional features as you get more serious. It’s a refreshing change from the ‘Lite’ versions of DAWs we’re so familiar with, where you always wish you could afford the ‘real’ version just to get a little more functionality. StageLight is increasingly being pre-installed on many Dell, Lenovo and Acer Windows 8 tablets, and as kids these days are unlikely ever to possess an actual desktop computer, this might very well be where they start making music.
Moving into the extraordinary, we find the fabulously named Usine Hollyhock from Sensomusic. It’s unique in form and function, very modular, with everything spread out and connected on a ‘workspace’. You can create little racks containing MIDI and audio effects and processes, and save them individually to work with other projects. It’s designed to be used live, to be customisable and as configurable as you want it to be. It’s very beautiful and interesting to play with and the Quick Tour guide really does make you go ‘wow’... although I’m not sure that it left me any the wiser as to what to do next! The multi-touch support is fully integrated into everything about this software, and it really begs you to interact with it. This is, in my view, the sort of approach that multi-touch needs: one not constrained by existing norms of DAW software.
More familiar territory is covered by Image Line’s FL Studio 11. With version 11 they’ve implemented some interesting gesture control for scrolling, zooming and rotating, as well as multi-touch in Performance Mode and for the piano roll. You can enable a multi-touch optimisation, which increases the size of the hot-spots around knobs and buttons, and this is essential given FL Studio’s otherwise small GUIs. The forthcoming version 12 features a new ‘vectorial’ interface, which can scale in size up to 400 percent, making it a far more user-friendly touch experience.
Harrison are in the process of making their Mixbus software suitable for touch interfaces. The virtual console lends itself very well to touch as, being based on an analogue mixer layout, it has a knob-per-function approach, rather than individual plug-in windows for EQ, dynamics and so on. Harrison have been using touchscreen technology in their consoles since the ’90s, so it seems completely natural for them to include it in their software, now that touch technology is more commonly available. They’ve had a lot of experience in getting over some of the design issues associated with touchscreens, too.
Ben Loftis, Product Manager for Mixbus, had this to say: “In a touch interface, you must accommodate calibration errors, parallax, and the splay of your finger. You don’t have any haptic feedback. So, if you want an analogue-console experience on a touchscreen, you will need a touchscreen that is larger than the analogue counterpart. But exactly how much larger depends on the hardware and the user. Currently, Mixbus v2 chooses between three sizes, based on your monitor resolution. But v3 will give the user an infinitely variable-scale slider, so we can accommodate more combinations of screen size and resolution. Also, our plug-ins (like the XT series) are arbitrarily scalable: if you stretch the plug-in window, all the knobs get bigger. We think this will be important for touch users, because many existing plug-ins use tiny buttons.”
The grandaddy of digital audio on the PC, Software Audio Workshop (SAW) Studio added multi-touch control to their Software Audio Console (SAC) live-sound mixer application as long ago as 2010. It was tied into the revolutionary 3M multi-touch screens, as favoured by Perceptive Pixel. Unfortunately, it hasn’t got any further than that, and still only supports multi-touch on these rather expensive screens. The layout of SAC lends itself brilliantly to multi-touch and currently works very well with a single touch — but it would be good to see this opened up to more current and cheaper technology.
The DAW software itself isn’t solely responsible for the ‘touchiness’ of its environment — we also rely a great deal on the GUIs of the plug-ins it hosts. Strange as it may seem, plug-ins can be fully multi-touch even when hosted in a non-multi-touch DAW. I discovered this when I first bought the LuSH-101 synth from d16 Group a couple of years ago, and was using it in Cubase on my touchscreen. I dropped d16 a line asking them why they decided to implement multi-touch, as few people seemed to have done so at that point — and they replied to say they had no idea that they had!
It turned out that the development libraries they were using (JUCE C++) contained multi-touch features, which were primarily intended for the iPad, and that these had simply translated into the GUI of the VST version. This is true of all their plug-ins. They’ve now released a larger-GUI option for LuSH-101, for people with fat fingers.
I had a similar experience more recently with Arturia’s Spark 2 soft synth. Arturia told me that they hadn’t planned to make it multi-touch, but that they’re very happy that it is. So, the programming languages already exist to allow developers to include multi-touch functionality without specialist add-ons or tools — which means plug-in manufacturers may start to produce their multi-touch GUIs even while the DAW makers drag their feet.
The alternative to direct touch control of the DAW or plug-ins is to use touchscreen technology as a controller. The Jazz Mutant Lemur, the first commercially available multi-touch controller, has now evolved to become an iPad app, and there are now dozens of iPad apps for controlling DAW software via MIDI or OSC — in fact, there are even a few for Android. They’re selling well and there’s obviously a desire and use for it. The lack of haptic feedback (an actual physical knob or fader) doesn’t appear to be a barrier to most users, despite the recent Kickstarter campaign to manufacture knobs that you can stick onto the surface of the iPad (http://sosm.ag/ipad-knob-kickstarter).
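For anyone curious what one of those OSC control messages actually looks like on the wire, here’s a short Python sketch of the encoding: a null-terminated address padded to four bytes, a type-tag string, then the value itself. The fader address is purely hypothetical — real addresses depend on whatever template your controller app is running:

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: the null-terminated address,
    padded to a multiple of four bytes, followed by a ',f' type-tag
    string (same padding rule) and the value as a big-endian 32-bit float."""
    def pad(b: bytes) -> bytes:
        # OSC strings always get at least one NUL, then pad to 4 bytes.
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# A hypothetical fader address -- actual paths depend on the controller layout.
packet = osc_message("/mixer/fader/1", 0.75)
```

Send that packet over UDP to your DAW’s OSC port and you’ve moved a fader — which is, in essence, all these touch controller apps are doing.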
James Ivey, Pro Tools Expert hardware editor (www.pro-tools-expert.com), who owns a Slate Pro Raven MTi controller, put to me the case for touchscreen controls over physical faders: “I was using a Euphonix [now Avid] Artist Control and Mix. So I had 12 faders to play with. With the MTi I have unlimited faders — what’s not to like? I really don’t buy into the ‘Oh, it’s not a real fader or pot’ thing. I’m so much faster on the Raven. It’s big, it’s clear and if I don’t like something about the workflow or arrangement I can change it.”
Perhaps more of a barrier, then, is the physical size of the iPad, and the connectivity when away from the cosy security of your home network. With a touchscreen attached directly to your desktop you have none of the connectivity problems, because the screen is right there: attached via HDMI or DVI, it’s part of your system via a virtual MIDI driver. Windows tablets may suffer from the same size issues as the iPad, but hybrids, all-in-ones and dumb touchscreens largely escape them, although even these won’t give you a proper console-sized surface to play with. Probably the most important point is that the controller can ‘be’ anything — knobs, faders, pads, XY controls; you are not stuck with a hardwired configuration.
SmithsonMartin’s Emulator Elite is an awesome crystal-clear, projected capacitive, 10-point touchscreen that folds out into a beautiful sheet of glass. This is then rear-projected upon to create what looks like the ultimate futuristic DJ performance tool. At $15,500, the ‘Elite’ part of its name is apt. However, a rather more reasonable $99 buys you the screen’s core controller software for use with the desktop. CEO Alan Smithson is a DJ and fully admits that 90 percent of their focus is on the DJ market, but the capabilities of Emulator Pro extend far beyond controlling Traktor and offering performance tools.
The beauty of the software is that you’re given a palette of knobs, sliders, buttons and X-Y pads and you can create whatever controller you wish. You can scale the controls as large as your fingers require them to be, you can orient them to any angle, for example to have controls that your fingers fall naturally upon rather than conforming to straight vertical lines, you can change colours, add images, add text — and pretty much do whatever you want. All controls can be assigned a MIDI channel and CC number or range. And once you’ve selected and arranged these controls, you’re off.
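Under the hood, each of those assigned controls boils down to a three-byte MIDI Control Change message travelling down the virtual MIDI driver. A quick Python sketch shows the format; the channel and controller choices below are just examples:

```python
def control_change(channel: int, cc: int, value: int) -> bytes:
    """Build the three-byte MIDI Control Change message a touch control
    emits: a status byte of 0xB0 plus the channel (0-15), then the
    controller number and value (both 0-127)."""
    assert 0 <= channel <= 15 and 0 <= cc <= 127 and 0 <= value <= 127
    return bytes([0xB0 | channel, cc, value])

# e.g. a fader on MIDI channel 1 (zero-based: 0), mapped to CC 7
# (channel volume), pushed to full scale.
msg = control_change(0, 7, 127)
```

Three bytes per gesture is all it takes, which is also why the resolution tops out at 128 steps — a limitation protocols like OSC and CopperLan are designed to get past.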
Emulator comes with its own internal virtual MIDI driver, which makes it a complete doddle to hook up with Cubase, Pro Tools and other such applications. There are a number of different pages, so you could have mixer control on one and plug-in control on another, or you can add controls to container windows which fold down to a button and reopen when you need them.
Emulator Pro runs only in full-screen mode, but that doesn’t mean it has to obscure the DAW: a feature that’s particularly useful for single-screen setups is the ability to ‘cut holes’ out of its GUI, so that the software running beneath is visible through it. That may be useful to reveal meters, a preview screen, or the arrange page, for instance — the possible configurations are endless.
If Emulator Pro is found lacking anywhere, it’s in the depth of the MIDI side of things. Channel and controller numbers are as far as it goes, so it can’t send SysEx commands or emulate a Mackie HUI, for example. However, Shane Felton (of www.alien-touch.com) has been working on an implementation to get 24 channels of Mackie HUI control into Pro Tools running on his Apple Mac. The result looks not unlike the Slate Raven MTi, and includes many of the same shortcut buttons and controls. He uses Bome’s MIDI Translator to provide the HUI emulation, along with three virtual MIDI drivers (one for each group of eight faders) that are set up in Pro Tools. The template files are available to download from his web site, though he stresses that it’s a work in progress and that he would value contributions.
Back in the world of Usine Hollyhock, we find all sorts of depth, function and parameter possibilities. It has its own ‘add-on’ community, where you can find all sorts of control scripts and patches. Whether it’s SysEx, OSC or even Mackie Control emulation, you’ll find that someone has created a patch for it. Usine Hollyhock has the potential to be an extremely versatile controller. It doesn’t quite have the design simplicity or focus of Emulator (since it’s much more than a controller) but it has far more depth and breadth in terms of control potential.
Another promising project that’s under development is GestureSpark, by Wouter Van Beek. His web site (www.gesturespark.com) suggests that nothing much has happened in the last couple of years, but Wouter assures me that since Windows 8.1 and getting hold of a new touchscreen he is pushing harder than ever to get GestureSpark into a releasable state. It offers a similar layout and design to Hollyhock and Emulator, but Wouter’s particular interest is in getting the inter-application and/or computer communication sorted, and for this, alongside MIDI and OSC, he favours CopperLan.
CopperLan is a networking protocol that connects compatible music software and hardware together. Each device can reveal its parameters by name and be controlled by any other device automatically. It’s a bit like MIDI control, but at a much higher resolution and without all that manual mapping and learning you have to do. There are wrappers for non-compatible plug-ins, but for these you have to manually configure the controls. A CopperLan-compatible touchscreen controller could potentially map itself automatically to whatever CopperLan-compatible plug-in is selected. It can also work internally, without the need for a network, which makes it such an interesting solution for a virtual controller running on the same machine as your DAW.
At the opposite end of the spectrum, an application called Xotopad could not be simpler or easier to use. Created by Hauke Menges of www.feelyoursound.com, it gives you a colourful bunch of drum pads to play, or XY pads to fiddle with. All the pads’ MIDI credentials are editable, and all you need to do is send it to a virtual MIDI driver and you can control whatever software you wish. It’s really simple and easy, but very useful.
I might have saved the best until last, because it looks as though Devil Technologies may have the whole Mackie HUI-control thing sewn up. Their DTouch for Pro Tools offers HUI-based multi-touch mixer integration with Pro Tools, along with a touchy toolbar of shortcuts and macros. It brings a smooth workflow experience that’s spookily similar to the Slate Raven — as you can see from the picture at the top of this article — but on a regular, HD resolution multi-touch monitor.
The DTouch mixer is essentially the fader section of Pro Tools’ mixer window, with cut-outs around the meters so that they shine through. Once the alignment is set up, the design is flawless and you wouldn’t know you were using anything other than the Pro Tools mixer. The toolbar provides all the usual transport controls as well as buttons to activate groups, open selected plug-ins and such like. In the edit window, although no multi-touch controls are overlaid, you get an expanded toolbar full of useful tools and functions. There’s also a load of buttons to which you can assign your own macros. The toolbar allows you to zoom around and perform edits without having to return to the mouse, which is what makes the workflow so effective.
At the time of writing, DTouch was available for Windows 7 only, Pro Tools only, and at a mandatory resolution of 1920 x 1080. However, Devil Technologies were kind enough to let me try their Windows 8.1 beta version, which should be available by the time you read this. One side-effect of the alignment and tight integration is that it’s not very flexible — there’s no ability to edit the controls or create knobs and faders for other things as there is with Emulator and Hollyhock. Instead, its beauty lies in the seamlessness with which it functions alongside the DAW.
I asked the company about the possibility of releasing a generic HUI-based controller, but they tell me they would much prefer to do something that’s designed for the specific DAW — and, encouragingly, they have a Cubase/Nuendo version in the works already. They are also testing out ways to support two screens; currently you have to have everything on the single touchscreen monitor, or the alignments start to shift. Having said that, I tried it over two screens and found that it can work very well, especially on the mix window. But I guess you are back to selecting clips in the edit window with the mouse if you’ve moved it to a non-touch screen.
One unique feature is the ability to incorporate an external, hardware HUI-compatible controller alongside DTouch, to give you the best of all worlds. At 200 Euros it’s more expensive than other, more flexible options I’ve discussed, but it’s a no-nonsense dedicated solution that’s supremely good at what it does.
Let’s get back to the DAW manufacturers. We know the multi-touch support is patchy at the moment, but what of their plans for the future? There appears to be a rough split between software manufacturers for whom multi-touch is already a priority and those who, though interested, are focusing their efforts elsewhere. It seems that most of the major firms are the least interested in implementing multi-touch, though. Of course, I should qualify this with the caveat that most major developers don’t like dropping hints about anything that’s unreleased — but Steinberg, for example, went as far as to state that they don’t currently see a market for multi-touch outside the iPad.
The industry’s strong focus on the iPad in recent years is certainly one reason why we see so little development on desktop multi-touch. The other major one is the absence of any lead from Apple into multi-touch support in OS X. It’s not hard to foresee the increasing convergence of iOS and OS X in the medium term, but a fully multi-touch-capable version of OS X still seems a long way off. This, to me, seems a little short-sighted.
James Woodburn, CEO at Tracktion, had this to say: “Eighteen months ago, we were asked regularly when Tracktion would be available for iPad — in fact, we already have a ported version of Tracktion that runs on iOS, but we chose not to release it, as it makes no sense to us to run a full DAW on a limited platform. We would rather design a solution that utilises the iPad’s key benefits and does not expose the limitations. The demand, at least from our user base, for iPad support has really dropped off a cliff in the past 12 months — and demand for PC touch is on the rise, albeit quite slowly.”
The key to unlocking multi-touch on the desktop is in the design and implementation of the interface. DAW software has to perform lots of different and precise tasks, some of which lend themselves to touch, but many more of which would be hampered by the fatness and inaccuracy of human fingers. It just makes no sense to build touch into something that would be better accomplished with a mouse. There are also issues with hands and fingers masking the very controls you’re trying to fiddle with.
Bremmers Audio Design have been working with touchscreens for many years, and in MultitrackStudio they’ve developed a neat pop-up window that materialises a couple of inches above what you have your finger on. That means you can see both the control and the value. Even with this function, trying to edit the score window accurately with a fingertip is truly an exercise in futility — but it’s no problem because that’s why we have a mouse. With the iPad it must all be about touch, but with the desktop you can use each and every tool at your disposal, be it single-touch, multi-touch, gestures, stylus, mouse, keyboard, trackpad, Leap Motion, Kinect, hardware controllers, or something else entirely. The desktop should remain an awesomely creative and versatile place.
Spending time over the last few weeks rummaging around in the world of touch-enabled software, I’ve created a bit of a personal wishlist of features. Let’s hope some developers are reading!
In a DAW, I don’t want to be restricted to touch any more than I want to be restricted to a mouse. I want touch controls to become available when I need them — like the way the Pro Channel EQ slides out in Sonar (their mixer needs to do something similar). I want to be able to pinch/zoom into a region and then draw in automation with my finger; but I don’t necessarily want to have to use my fingers to copy and paste, trim audio or move notes.
Scrolling has to be easy, perhaps gesture-based, so that I don’t have to fudge around trying to finger empty space between controls to move the GUI. But I don’t want to have to return to the mouse just to move the screen a bit. Reason has that neat side-panel showing a zoomed-out version of the rack, which is perfect for finger scrolling. Once you start adding hardware into the equation, a touchscreen makes for a much less jarring experience than putting your hand back on a mouse. Using Arturia’s Spark 2 with the Spark LE controller, it’s great to be able simply to tap the screen to change a parameter, preset or sample — a far more fluid experience than moving from a creative hardware place to that mouse zone. It’s also done my RSI no end of good!
Virtual control is perhaps even more useful, because a single multi-touch screen could work with many different pieces of music software, and here it holds enormous potential. It could be anything, and could control anything. Imagine your 24-inch multi-touch screen set before you like a mixer, placed physically beneath your main (non-touch) screen; whenever the focus changes to the mixer, a plug-in or an instrument, the touchscreen changes to display the appropriate controls. Something a bit like Novation’s Automap, which automatically pulls out the parameters and lays them out in front of you. CopperLan hints at this sort of power, but requires everyone to be compliant for it to reach its potential. That’s the sort of integration we really need.
Hopefully, as more manufacturers realise the advantages of the desktop platform for multi-touch interfaces, they will use it to enhance our music-making environments. With Windows 9 just around the corner, Universal apps and Microsoft’s new-found interest in music production, there’s very little competition and great buckets of processing power and potential for the software developer, which can only mean great things for us touchy-feely users.
DAWs:
Cakewalk Sonar X3
Image Line FL Studio
Software Audio Console (SAC)

Plug-ins:
d16 Group LuSH-101
Arturia Spark 2

Controllers:
Smithson Martin Emulator Pro
Emulator Mackie Control
www.alien-touch.com Usine Hollyhock
Devil Technologies DTouch
The history of touch technology can be traced back to the touch-sensitive capacitance sensors of early synthesizer pioneers such as Hugh Le Caine and Bob Moog — and it’s interesting to note that Apple’s iPad shares the basis of its touch technology with the humble Theremin! The concept of the multi-touch screen was first realised in the 1980s by Bell Labs, but probably made its way into the public consciousness through sci-fi films and series such as Dillinger’s desk in Disney’s Tron (1982) and Star Trek: The Next Generation (1987-94).
In computing terms, multi-touch refers to the ability of a surface to recognise the presence of more than one point of contact. This is distinct from single-touch interfaces, which essentially emulate the mouse input, and moves us through the world of pinch/zoom and gesture control, with which everyone’s familiar, to the possibility of individual touches creating individual actions and responses simultaneously.
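To make that distinction concrete, here's a minimal sketch (in Python, with hypothetical pixel coordinates) of the kind of calculation that only becomes possible with two simultaneous contact points: deriving a pinch/zoom factor from how far apart two fingers start and finish. A single-touch interface, which only ever sees one mouse-like point, has no second point to measure against.

```python
import math

def pinch_scale(p0_start, p1_start, p0_end, p1_end):
    """Zoom factor implied by two fingers moving from their start
    positions to their end positions (>1 = spread/zoom in,
    <1 = pinch/zoom out). Points are (x, y) tuples in pixels."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    d_start = dist(p0_start, p1_start)
    d_end = dist(p0_end, p1_end)
    if d_start == 0:
        raise ValueError("fingers cannot start at the same point")
    return d_end / d_start

# Two fingers 100px apart spread to 200px apart: a 2x zoom in.
print(pinch_scale((0, 0), (100, 0), (-50, 0), (150, 0)))
```

This is purely illustrative — real touch frameworks deliver streams of per-contact events rather than tidy start/end pairs — but the core of every pinch gesture recogniser is a ratio of inter-finger distances along these lines.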
The technology that we know today evolved out of a few sources: Fingerworks, a gesture-recognition company who pioneered a number of touchscreen products and were bought by Apple in 2005; Jeff Han’s Perceptive Pixel (bought by Microsoft in 2012) who, back in 2006, were demoing vast multi-touch walls and dazzling us with the concept of pinching photos and swiping maps; and the original Microsoft Surface, now called PixelSense, which started development in 2001 and was an interactive table that combined multi-touch capability with real-world object interaction.
It’s remarkable that in 2005 JazzMutant developed their own multi-touch technology to release the Lemur multi-touch OSC controller commercially. In 2007, with the release of the first iPhone and, a few months later, Microsoft’s Surface (PixelSense) 1.0, we had both ends of the multi-touch spectrum spectacularly catered for. But it would take a few more years for that middle space of tablets, hybrid laptops and multi-touch monitors to really find their technology and pricing sweet spots.
This is a very good question and a hard one to answer! The manufacturers are all over the place, with dual-touch being marketed, confusingly for the end user, as multi-touch, and screens designed for Windows 7 being pushed for Windows 8.
Windows 7, which is still favoured by so many music-makers, supports two-point multi-touch out of the box — so gestures and pinch/zoom all work fine. More points are supported via an additional download, but little of the Windows 7 OS itself made use of them, so your mileage with multi-touch on that platform will be almost entirely down to the software you’re running.
Multi-touch monitors designed for Windows 8.x must support at least five simultaneous touch points. Many all-in-one machines meet only this minimum requirement, whereas hybrids and tablets tend to offer 10. Quite honestly, having played with multi-touch for music-making for a while now, I’ve rarely found myself using more than two touch points at once, although sometimes I’ve used up to eight when messing about on a mixer to see what I could do. That said, a 10-point screen is more likely to be of the projected-capacitance type, and so of higher quality than screens offering fewer points.
I chose the Acer T232HL for my own use, because at the time it was the only thing available from the new generation of multi-touch screens. Dell kept promising one, but it kept getting delayed and finally came out about six months after I got the Acer. The Acer remains well regarded, particularly for its ability to lay almost flat, so it was a good choice in that respect.
The biggest gripe I have with multi-touch at the moment is touch latency — my Acer T232HL adds about 10-15ms (an estimate, based on trying to play drum pads, and so on). My understanding is that performance in this regard is rather better on tablets such as the Microsoft Surface, but I’ve not had a chance to test that, so whether, for example, it’s better enough for you to play drums without the latency proving a distraction, I can’t yet say. Unfortunately, the published specs won’t help: the ‘Response’ time listed in a monitor’s specification usually refers to the pixel change from black to white, and there’s no documentation of the touch response.
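For the curious, the informal drum-pad method used for that estimate can be sketched in a few lines: tap along with a steady click, log when the touch events actually arrive, and average each tap's offset from its nearest beat. All the numbers below are invented for illustration — they are not measurements from the Acer.

```python
def estimate_latency_ms(beat_times_ms, tap_times_ms):
    """Mean offset of each logged tap from its nearest metronome
    beat, in milliseconds. Assumes the player is tapping on the
    beat, so any systematic offset is attributable to latency."""
    offsets = []
    for tap in tap_times_ms:
        nearest_beat = min(beat_times_ms, key=lambda b: abs(tap - b))
        offsets.append(tap - nearest_beat)
    return sum(offsets) / len(offsets)

beats = [0, 500, 1000, 1500]    # a 120bpm click track
taps = [14, 511, 1013, 1512]    # hypothetical touch-event timestamps
print(estimate_latency_ms(beats, taps))
```

A crude method, of course — it lumps the player's own timing error in with the screen's latency — but averaged over enough taps it gives a serviceable ballpark figure of the kind quoted above.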
We music makers are a greedy bunch. While the rest of the world is moving to ever smaller and more portable devices, we seem to be accruing acres of screen real estate as we hook up multiple monitors to our DAW computers. One question that arose during the course of my research, therefore, was whether you could use two multi-touch screens concurrently with a single computer. This could give you greater access to more parameters spread across a larger desktop or, perhaps more interestingly, allow two people to work on different screens on the same project — for instance, one controlling the mixer, while another controls virtual instruments or effects.
So, is this possible? I posed the question to all the manufacturers I’d been in contact with, and opinions varied, from it being unlikely to work through to suggestions that some kind of driver ‘wrapper’ might be needed to allow the two input devices to work at the same time. But no-one had actually tried it!
Fortunately, just before we went to press, the Microsoft Surface Pro 3 was released. I was able to get my hands on one and plug in my Acer touchscreen to use alongside the Surface Pro 3’s built-in multi-touch screen. The results were mostly excellent, though a little odd in places. The screens on the Surface and the Acer always worked independently — I could touch whatever was on each screen without problems, and without any additional drivers, setup or configuration. Whether I could use both screens at the same time, though, depended on what I was doing. I could run two Windows Store apps and access both at once without much trouble, but on the Windows 8 desktop only one window can be in focus, so only the screen hosting the focused window was operable. When stretching an application like Microsoft Paint across both screens, one screen grabbed the focus and would paint while the other was touched in vain; the second screen would only respond once the window was moved a little towards it, at which point it grabbed the focus back.
It was a similar story with music applications: when trying to stretch Sonar X3’s mixing console across both screens, for example, the same question of which screen held the focus arose. The focus-grabbing is particularly obvious when the screens run at different resolutions, because the console jumps in size as you move it between them — and that jump is the moment one screen grabs the focus.
However, if I split different tasks out to different screens, the results were more intuitive. Moving Sonar’s mixing console to one screen and its project window to the other allowed me to use them simultaneously with no trouble at all. So, you could have one person mixing levels while the other triggers loops in the Matrix, all in one project on one computer. The bottom line is that multiple multi-touch monitor setups are feasible for music-making, and different applications can be controlled at the same time from different screens, but I suspect a little trial and error will be required to arrive at the best workflow for each application.