Sometimes Studio One expresses itself differently to other DAWs, but don't let that put you off...
Every DAW has its own set of terms for describing entities, features, and processes in the program, and its own graphic representations of those entities. While basic definitions of these terms and graphic representations would seem so fundamental that they hardly rate discussion in a column like this, the reality is that grasping these definitions sometimes involves mastering subtleties that can become very important in practice. Ideally, once declared and defined, a program's use of terms would be completely consistent and its system of representation obvious, but this is not always the case. Sadly, ambiguity begets trouble.
So, this month, we examine some terms used in Studio One and some of the items you see on screen, and clarify the distinctions between them.
In terms of content, the fundamental units of currency in Studio One are Events and Parts. Generally, when the term 'Events' is used, it refers to Audio Events, which are called things like 'clips', 'regions', or 'soundbites' in some other DAWs.
The term 'Parts' most often refers to Instrument Parts, which are regions filled with performance information that most of us know as MIDI notes. Seems simple enough, eh? Well, it's not, really. The term 'Events' is used more generically in some places in Studio One (and the manual), in a way that encompasses both Audio Events and Instrument Parts. For example, if you want to rename an Instrument Part or an Audio Event, you would right-click on it and choose 'Rename Events' from the contextual drop-down menu that appears. You won't find 'Rename Parts' anywhere in the program; nor will you find 'Mute Parts', only 'Mute Events'. In fact, the Event/Musical Functions submenu in the main menu bar consists mostly of commands applying only to Instrument Parts.
So, Audio Events are always Events, but sometimes Parts are Events, too. Think you've got it? Let me add another twist: sometimes Events are Parts as well. One of Studio One's most powerful features is the ability to group a number of Audio Events as slices in a single entity called an Audio Part, like the one shown in the 'Audio Part' picture.
The thread of consistency is that both Instrument Parts and Audio Parts are single items made from multiple items. In both cases, the multiple items can still be edited and addressed individually, or the Part as a whole can be processed or handled. The distinction between Parts and Events becomes clearer when we consider the various ways in which Studio One can adapt the timing of recorded material to the grid.
Studio One has great time-stretching facilities, most of which make use of Audio Bend markers. Identifying and marking transients is the heart of beat detection, and there are at least three different ways in which transient detection can be applied to quantise audio in Studio One.
With the first method, Studio One inserts Audio Bend markers where it determines beats fall within an Event. The audio can then be quantised by time-stretching the material between markers to make it fit the grid. For better or worse, there is no explicit term for an Event with Audio Bend markers, but it is, in any event, still a single Audio Event.
However, not all material responds well to being time-stretched. This is especially the case with transient-rich audio such as drums and percussion: although Studio One's time-stretch algorithm tries to separate the transient from the rest of the note and leave it alone, the process often gives rise to audible artifacts.
This brings us to Studio One's second method of conforming audio to the grid. Since drum and percussion tracks consist mostly of short sounds with clearly discernible transients, a better way of quantising such instruments is often to cut the audio into individual hits and quantise the start times of those without using time-stretching, just as one quantises the start times of notes in Instrument Parts. This is one of the biggest reasons the slice-based Audio Part was created. Each slice within an Audio Part is an Event, and when the Part is quantised, the individual Events jump to bar and beat lines. In the 'Audio Part' screenshot, you can see the crossfades that mark the boundaries between these individual Events. Zooming in closer in the Audio editor makes the slices even more obvious.
This sort of beat-slicing is easy to do: it starts out just as in our first example, but instead of setting the Quantize mode in the Action section of the Audio Bend panel to Quantize, set it to Slice and configure the tick-box options beneath the field to the desired settings. Now, instead of inserting Audio Bend markers at the detected transients, the algorithm will separate the audio at transients to create multiple Events. The last step is to select all of those new Events and merge them into an Audio Part. If the Merge option is checked in the box below the Quantize mode field, this happens automatically. If not, select the Events you want to merge and choose Event/Merge Events from the main menu bar, or right-click on any of the Events and choose the command from the contextual menu that drops down. This is probably how you will most often make Audio Parts, though any group of Events on a track can be merged into an Audio Part.
The third method I mentioned, incidentally, doesn't affect the audio at all; instead, it conforms the timing of the grid to the audio. In this case we still need to detect transients in the Audio Event, but instead of fitting the transients to the grid with time-stretching, we drag the Event to the Groove Map in the Quantize panel. After transient detection, the Quantize mode for the Arrange view is set to the Groove Template extracted from the audio, and now the grid follows the audio.
Studio One doesn't have Audio tracks and MIDI tracks. In fact, Studio One barely has MIDI anything. Look through the whole program and you can count on one hand the number of times you find the term 'MIDI'. That's because Studio One does not use MIDI in its internal representation of performance information; it uses a much higher-resolution representation.
Thus, instead of 'MIDI tracks' you find 'Instrument tracks' (although a couple of mentions of 'MIDI tracks' managed to slip into the manual). You don't have a 'MIDI editor' for performance data in Studio One, you have a 'Music editor'; and you have 'musical performance data' instead of 'MIDI data'.
The rest of the world, however, still employs copious amounts of MIDI data, which is why you will find mention of MIDI in Studio One, and indeed an entire feature called the MIDI Monitor, which displays MIDI messages entering and leaving the program.
I've laid out a few of the more significant and subtle distinctions in how Studio One talks about and displays data, but there are more examples, such as features that go by different names in different parts of the program. In the Banks panel of the mixing console, for instance, you can configure whether each channel is shown or hidden, then save that configuration as a channel bank. A channel bank can be reloaded by choosing it from the drop-down menu in the Banks panel, or by right-clicking (or Ctrl-clicking on a Mac) anywhere in the Banks panel or Console and choosing it from the submenu there — which is actually named Console Scenes, a term I didn't find anywhere else in the program or manual.
It can be intimidating to see something in a program and not understand what it means. It is easy to just move on and use what you know, because we are generally more interested in accomplishing our tasks than taking the time to puzzle out a feature. But if you can take a few minutes to work something out, you can be rewarded with a new and potentially useful tool!