Creating Custom Touchscreen Interfaces In Logic

Logic Notes & Techniques By Geoff Smith
Published June 2009

You'll be surprised at how cheap and easy it can be to enhance the usability of your Logic setup with a touchscreen.

It will be little surprise to anyone that, thanks to Apple's iPhone and iPod Touch products, the profile of touchscreen interfaces has risen dramatically. However, what may be surprising to you is just how easy and affordable it is to buy a touchscreen and incorporate it into your music setup. Touchscreens regularly come up for sale second‑hand, as they have been used in shops, medical establishments and other outlets for many years now. eBay is probably the best place to find one; a quick search at the time of writing revealed that a used 15‑inch touchscreen can be bought for around $150. That size of screen is considerably bigger than the 12‑inch touchscreen on the JazzMutant Lemur controller (reviewed in SOS March 2007, /sos/mar07/articles/lemur.htm), which weighs in at around the $2000 mark, although it is worth pointing out that the Lemur's capabilities are a lot more advanced.

How Do Touchscreens Work?

A touchscreen is a display that can detect the location of a point of contact within the display area. The point of contact can come either from a finger or from some kind of stylus, depending on what type of touch the screen recognises. The two most common types of touchscreen are capacitive and resistive, the main difference between the two being that capacitive touchscreens are usually visually clearer than their resistive counterparts. (The iPhone has a capacitive touchscreen, while Palm's Treo and Motorola's ROKR E6 use resistive screens.) One benefit of a resistive screen is that it can be used with a stylus as well as your finger, but on the down side, resistive screens are more easily damaged by sharp objects. Capacitive touchscreens can also be more sensitive than their resistive counterparts, as they do not necessarily need to be pushed; some can detect your finger within 2mm of the screen, and therefore often respond to a lighter touch.

A touchscreen interfaces with your computer via two connections: a VGA or DVI connection for video, and a USB or serial port connection for the touch interface (Mac users need to make sure their screen has USB). It's easiest to think of the touch interface, in PC terms, as being similar to connecting a second mouse. These are the connectors on my own touchscreen, the catchily named ELO ET1525L-8UWC-1, which I found on eBay for around $150: left to right, power, stereo audio input for the built-in speakers, USB and VGA. The USB cable carries the X‑Y coordinates of your touch to the computer and requires a stand‑alone driver, in the same way that some specialist mice or tablets do, so when buying a touchscreen it's important to ensure that the one you choose has drivers and software for the operating system you use. This matters because the control interface needs to be set up as either a mirror of your first screen or a separate second screen, and then calibrated to the area you are working in. The calibration software usually takes the form of a simple program that generates a series of crosses on the screen, which you touch; from that, the program calculates the working area, so it can track your touch accurately. Be aware that, as with any device, there are good and bad examples. Watch out for screens that require firm pressure to register your touch, as these will quickly become fatiguing and make adjusting parameters via sliders hard work.
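The calibration routine described above boils down to fitting a straight-line mapping from the controller's raw touch readings to screen pixels, one axis at a time. Here's a minimal sketch of the idea in Python (an illustration only, with made-up raw values, not the code any real driver uses):

```python
def fit_axis(raw_a, raw_b, px_a, px_b):
    """Fit pixel = scale * raw + offset from two calibration touches."""
    scale = (px_b - px_a) / (raw_b - raw_a)
    offset = px_a - scale * raw_a
    return scale, offset

# Hypothetical calibration data: two on-screen crosses at known pixel
# positions, and the raw readings the screen reported when each was touched.
sx, ox = fit_axis(raw_a=180, raw_b=3900, px_a=50, px_b=974)   # X axis
sy, oy = fit_axis(raw_a=210, raw_b=3850, px_a=50, px_b=718)   # Y axis

def raw_to_pixel(raw_x, raw_y):
    """Convert a raw touch reading to screen coordinates."""
    return round(sx * raw_x + ox), round(sy * raw_y + oy)

print(raw_to_pixel(180, 210))   # the first cross maps back to (50, 50)
```

Real calibration programs typically add a third point to correct for axis skew, but the principle is the same.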

Because touchscreens are commonly found at tills in shops, and in other environments where a traditional desk is not present, they are available with very flexible mounting arrangements and can be attached to a wide variety of stands. This mounting flexibility is a bonus for musicians: manoeuvre the screen behind your keyboard for patch editing, or place it next to your computer to help with mixing.

The Touchscreen & Logic

Looking at the potential of a touchscreen as a controller with every music application is beyond the scope of this article, so I'll be explaining how to use one solely with one of the most popular DAWs around: Logic Pro, whose Environment lends itself very well to creating specialist interfaces.

A touchscreen can work just like a normal monitor and mouse combination, so straight off the bat you can interact with all aspects of Logic's interface. Changing levels and panning in the mix window are particularly easy. Editing plug‑ins is also much more immediate than using a mouse.

One of the things that has always set Logic Pro apart from other sequencers is the depth of what can be accomplished in the Environment window. Because the Environment provides a configurable virtual view of your MIDI studio and mixing objects, you can design your own Environments to suit a touchscreen interface. Some of the things you could build to take advantage of the touchscreen are a specialist touchscreen mixer, multiple X/Y Pads for synth control, and SysEx or continuous controller maps to give your hardware a touchscreen interface.

Basic Mixing Screen

The basic mixing screen offers 32 channels of volume and pan controls, plus solo and mute buttons.

When mixing or editing on your main monitor, you can have your touchscreen showing a useful Environment containing just the basics of what you want to adjust. On a 15‑inch touchscreen I've found that 32 channels of volume, pan, mute, and solo controls can be displayed at a size that works well. You could try other variations, such as a screen containing 128 tracks of solo and mute buttons.

To begin building the basic mixer, it's best to create a new template from scratch:

  • Start a new song and add an audio track.
  • Call up the Environment window (Command‑8), and create a monitor object (New > Monitor). In the Environment window, the Audio 1 channel object is cabled to a Monitor object.
  • Attach the Audio 1 object to the Monitor object. To accomplish this, go to the small triangle at the top right of the Audio 1 channel object. This arrow represents the output or outlet for the control signals of that channel. Make a connection between audio channel 1 and the monitor object, by click‑holding the triangle at the top of channel 1, moving the mouse to the centre of the monitor object and releasing the mouse button. You should now see the outlet of the Audio 1 object cabled to the inlet of the monitor object. The click‑hold method will be used throughout to connect objects in the Environment.

Now it's time to do a little investigation. Adjust the volume, pan, mute and solo of the track and you'll see that the monitor object displays the messages Logic uses for these functions. It's worth observing that all Logic's mixing functions inside the Environment can be controlled by two message types: Continuous Controller messages for volume and pan; and Fader commands for controlling solo/mute and plug‑in parameters. Continuous Controller messages start with a circular symbol, then have three columns of numbers. An example message you might see in the monitor box for the Audio 1 pan control would be '1 10 40', where 1 denotes channel 1, 10 denotes CC 10 (pan), and 40 is the value of the control signal (in this case, pan position in the range 0‑127).

Fader messages follow a similar format, but start with an 'F'. For example, a mute button could be turned on with the message 'F 1 9 1'. The 'F' means a Fader message; the first number column is the object position, in this case one (higher numbers denote the insert plug‑in slot number); the second column represents the parameter being controlled, in this case mute; and the last number is the control signal value, in this case one, meaning 'on'. Call up a plug‑in on audio channel 1 and move some of the controls; you should see a specific Fader message for each in the monitor object.
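The volume and pan controls use standard MIDI Continuous Controller messages, which on the wire are just three bytes: a status byte encoding the channel, the controller number, and the value. As a quick illustration (my own sketch, not anything Logic exposes), the '1 10 40' pan message seen in the monitor object could be built like this:

```python
def midi_cc_bytes(channel, controller, value):
    """Raw bytes of a MIDI Control Change message.
    channel is 1-16 as Logic displays it; the status byte stores 0-15."""
    assert 1 <= channel <= 16 and 0 <= controller <= 127 and 0 <= value <= 127
    return bytes([0xB0 | (channel - 1), controller, value])

# The pan example from the monitor object: channel 1, CC 10 (pan), value 40
print(midi_cc_bytes(1, 10, 40).hex(" "))   # prints: b0 0a 28
```

Fader messages, by contrast, are internal to Logic's Environment and have no standard byte representation outside it.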

Now let's get down to actually creating our own control objects! First, we'll make a Fader object to control channel volume.

  • Create a new Fader object (New > Fader > Vertical 1).
  • Cable the outlet of the Fader object into the inlet of the Audio 1 channel object. The Inspector for the volume control object.
  • Call up the Inspector by pressing 'I'. Click on the new Fader object to make sure the Inspector displays its parameters, and set them as in the screen just overleaf. You will have recreated the message for audio channel 1 that you saw in the monitor object when you moved the volume slider earlier. The Inspector for the pan control object.
  • We need an object to control the mixer's Pan pot ('New' menu > Fader > Knob 6). Cable the knob into the volume fader's inlet and enter the values on the right into the Inspector.
  • Test the two objects you have created, by moving them. If everything is connected correctly, you should see audio channel 1's volume slider and pan knob mirroring these movements.
  • Next, we need to create two buttons to control the solo and mute functions. Make one button (New > Fader > Button) and cable it to the inlet of the pan object.
  • Create another Button and cable the outlet of that into the previous button's inlet. You should now have a chain of objects comprising two buttons, a knob, and a fader. The Inspector for the Mute and Solo objects.
  • The solo and mute buttons in Logic use Fader control messages, as explained earlier. Set the two new buttons to control solo and mute respectively, and copy the parameters from the Inspector screens below right. Note that the output and input boxes are set to Fader. A complete signal loop is created by cabling the outlet of Audio Channel 1 to the inlet of the Mute button.
  • Delete the monitor object from audio channel 1 and cable the outlet of the channel to the inlet of the mute button, to create a loop. This loop will carry the value from the mixer objects to your new objects, and vice versa (see screen above).

You can now move and resize any of your new control objects, change the type of fader or button, and even modify the colours. Once you have a single‑channel interface you like, you can add more channels by simply copying and pasting the channel and its control objects. Then simply alter the audio object's channel in the Inspector from audio 1 to audio 2 or audio 3 (and so on). Note that you can change the audio channels to audio instrument channels and the control objects will still work.

Want Four Kaoss Pads?

Fancy owning four Kaoss Pads?

The Kaoss Pad is an effects processor from Korg that has a touchpad interface for altering, in real time, the parameters of whatever effect is selected. For example, if a Delay effect were selected, the X‑axis of the touchpad might control delay time and the Y‑axis the amount of feedback.

The Vector object inside Logic allows you to recreate Korg's touchpad interface on a touchscreen. The advantage of using a touchscreen and Logic is that you can create as many Vector objects as you like, and connect them to any instrument or plug‑in parameters. You could use the X‑axis of a Vector object to change oscillator type in the ES2 synth and the Y‑axis to add distortion and pitch modulation and open the filter, for example. Another Vector object could then control the parameters of any insert effects you put after the ES2.
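Conceptually, a Vector object turns a single touch position into two control messages, one per axis. The sketch below (my own illustration of the principle, not Logic code) maps a normalised touch position to the two ES2 Filter 2 Fader messages identified later in this article, where parameter 35 is Cutoff and 36 is Resonance on object position 2:

```python
def vector_to_messages(x, y, position=2, x_param=36, y_param=35):
    """Map a normalised touch position (0.0-1.0 per axis) to two
    Logic-style Fader messages: horizontal -> Resonance (36),
    vertical -> Cutoff (35), both on Environment object position 2."""
    to_midi = lambda v: max(0, min(127, round(v * 127)))
    horizontal = ("F", position, x_param, to_midi(x))
    vertical = ("F", position, y_param, to_midi(y))
    return horizontal, vertical

# A touch at the centre of the pad puts both parameters at 64
print(vector_to_messages(0.5, 0.5))
```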

  • Create a new project and add a single audio instrument track.
  • Go to the Environment page and load an ES2 on audio instrument channel 1 and attach a monitor object to the outlet of that channel.
  • Select the audio instrument 1 channel and press copy (Command‑C). Now you're going to create a new environment layer to put your Vector objects on.
  • Call up the Inspector by pressing 'I'. At the top of the Inspector is a menu that allows you to choose which layer of the Environment you want to view. From that menu, choose Create New Layer. To rename the layer, click on the layer name in the Inspector, type 'Kaoss Pad' and press Enter.
  • Paste audio instrument 1 on to the empty page (Command‑V).
  • Add a Vector object (New > Fader > Vector). Cable its outlet to the inlet of audio instrument channel 1.
  • Resize the Vector object to a quarter of the size of your touchscreen: click on the Vector object to select it, then go to bottom right and click‑drag to resize.
  • Hold down the Alt key and click-drag to create three more Vector objects. Name them Vector 1, Vector 2, and so on.

Now that you have your Vector objects, it's time to do something fun with them.

  • Go back to the Environment 'Mixer' layer that has your monitor object on it.
  • Call up an ES2 synth and select the preset Synth Leads > Big Trance Now.
  • Move the ES2 window so that you can still see your monitor object, and adjust the Cutoff and Resonance controls of Filter 2. You should see the following messages for Filter 2: Cutoff F 2 35 [value] (where value is the setting between 0 and 127 of the Filter 2 Cutoff), and Resonance F 2 36 [value] (again, between 0 and 127).
  • Return to your Kaoss Pad Environment layer and enter the two Filter 2 messages as the control destinations for Vector 1. Click on the Vector 1 object so that it's selected. In the Inspector, you should see that you can set vertical and horizontal messages. Set the vertical message to change Filter 2 Cutoff and the horizontal to change Filter 2 Resonance.

You should now have a fully working Vector controller for Filter 2's Cutoff and Resonance. Play a bass line on your keyboard and move the Vector object around to hear it working. You could then use your second Vector object to control some effects processing, as follows:

  • Go to the 'Mixer' layer of the Environment and call up Logic's PlatinumVerb plug‑in on insert 1 of audio instrument channel 1.
  • Move the plug‑in window out of the way so that you can see the monitor object you created earlier, and adjust the Wet output level control and the Reverb Time control. You should see the Fader commands F 3 19 [value] and F 3 8 [value] in the monitor object.
  • Return to your Kaoss Pad layer and click on the Vector 2 object to select it, then go to the inspector and enter those messages into the horizontal and vertical outputs. You should now have one vector object controlling Filter 2 and a second controlling Reverb.
  • Now set up the other two Vector objects to control other ES2 or plug‑in parameters of your choice.

Things To Make & Do

The touchscreen interface I made for my Akai MPC4000.
A set of buttons I made to use as 'presets' for the MPC. They control many different settings at once.
A useful Environment mixing template, offering (top to bottom): 0dB button, phase invert button, gain fader, high‑pass on/off and cutoff frequency controls, solo, mute and pan controls.

It's worth mentioning that you can use the outputs of the Vector objects in much more complex and flexible ways. For example, using the Transformer object you could enable the X‑axis of a Vector object to control many plug‑in parameters at one time in specific ranges. Vector objects can also be set up to feed the user‑definable controllers in ES2's MIDI section, which can then be used as 'sources' inside the modulation matrix, allowing that source to be a controller for any destination within ES2.

Because you can build System Exclusive and controller maps for your hardware synth inside Logic, you could also build your hardware instruments a touchscreen interface. I've created a series of Environments for my Akai MPC4000 that improve the user interface dramatically. The first significant gain is that I can now control any sample's parameters all on one page of my touchscreen, laid out exactly as I want them. The difference this makes is nothing short of amazing. Another interesting benefit is that because Logic allows long strings of SysEx data to be sent using just a button, you can effectively create presets for any groups of SysEx commands you like. It's easy, for example, to set up presets for all the synth and effect parameters within the MPC: one SysEx change could set the selected sample's pitch down a fourth, filter to low‑pass, filter cutoff to 50 and amplifier envelope decay to 34, and add reverb — all at the touch of a button.
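A SysEx message is simply a string of bytes framed by 0xF0 and 0xF7, with a manufacturer ID and the command data in between; a 'preset' button just fires several such strings at once. Here's a sketch of the framing (Akai's MIDI manufacturer ID really is 0x47, but the payload bytes below are purely hypothetical stand-ins, not real MPC4000 commands):

```python
def sysex(manufacturer_id, payload):
    """Frame a SysEx message: 0xF0, manufacturer ID, data bytes, 0xF7.
    Every data byte must stay in the 0-127 MIDI data range."""
    assert all(0 <= b <= 127 for b in payload)
    return bytes([0xF0, manufacturer_id, *payload, 0xF7])

# A hypothetical multi-command 'preset': each entry is one SysEx string,
# and a single Environment button could send the whole list in sequence.
preset = [
    sysex(0x47, [0x10, 0x01, 50]),   # stand-in for 'filter cutoff -> 50'
    sysex(0x47, [0x10, 0x02, 34]),   # stand-in for 'amp env decay -> 34'
]
print(preset[0].hex(" "))   # prints: f0 47 10 01 32 f7
```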


There are so many ways to use Logic's Environment with a touchscreen to improve your working methods that I think the case for owning and using one is very persuasive. I wouldn't recommend throwing out your mouse and keyboard yet, but as an addition, a touchscreen makes a lot of sense. Basic tasks such as soloing and muting different tracks while editing and mixing become so much easier. So before you spend a hundred pounds on a controller with a few plastic knobs and sliders, bear in mind that buying a touchscreen gets you a second monitor and a touchscreen interface for your DAW and hardware into the bargain!    
