This set of audio example MP3s illustrates the ideas discussed in the main July 2022 workshop article:
Audio Examples 1 to 6 follow a similar format. In each case, the example is split into two sections. In the first section, you can hear the original audio performance (in all but the last, this is an electric guitar) panned slightly right, alongside a piano playing the block chord changes generated by the Audio To MIDI Chords process, without any editing of the Chord Events Cubase has generated.
In the second section, you hear the same performance after the Chord Editor has been used to manually refine the Chord Events.
In this first example, simple triad chords are played by the guitar. The Audio To MIDI Chords analysis does a good job of identifying both the chords and their timing. The small amount of editing required to the Chord Events took less than a minute to complete.
In this second example, simple triad chords are again played, but the guitar performance has a more complex strumming pattern. The Audio To MIDI Chords analysis again does a good job of identifying the chords, but the timing of some of the chord changes required a little more correction. Even so, the editing required to the Chord Events took only a fraction longer than in the first example.
In this third example, a heavily overdriven guitar was used as the audio source. Despite this, the results were similar to example 1 and required little by way of correction.
In this example, the clean guitar part consisted of open strummed chords containing some drone-style notes. The Audio To MIDI Chords function didn’t make quite such a good job of this type of performance. The timing of some chord changes was incorrect and one chord (a D11 created by a combination of a chord shape and some open strings) was not automatically detected and had to be added afterwards as part of the Chord Event editing process. The editing required took a few minutes, including using the MIDI note method mentioned in the main text to identify the D11 chord.
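As a rough illustration of that MIDI-note identification idea (a sketch only, nothing from Cubase itself; the template table and function names here are assumptions), a voicing's MIDI note numbers can be reduced to pitch classes and matched against chord-tone templates:

```python
# Sketch: identify a chord from its MIDI note numbers by reducing them to
# pitch classes and matching against a small table of chord-tone templates.
# Illustrative only -- not Cubase's algorithm; the templates are a tiny subset.

# Interval sets (semitones above the root) for a few chord qualities.
TEMPLATES = {
    "": {0, 4, 7},              # major triad
    "m": {0, 3, 7},             # minor triad
    "11": {0, 2, 4, 5, 7, 10},  # root, 9th, 3rd, 11th, 5th, b7
}

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def identify_chord(midi_notes):
    """Return the first chord name whose template matches the pitch classes."""
    pcs = {n % 12 for n in midi_notes}
    for root in range(12):
        intervals = {(pc - root) % 12 for pc in pcs}
        for suffix, template in TEMPLATES.items():
            if intervals == template:
                return NOTE_NAMES[root] + suffix
    return None

# The D11 from this example, voiced here as D F# A C E G:
print(identify_chord([50, 54, 57, 60, 64, 67]))  # -> D11
```

In practice you would simply play the notes into a MIDI track and let Cubase's own chord display name them, but the pitch-class reduction above is essentially what that naming involves.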
This example uses a simple ‘jazz-lite’ guitar part containing a number of more complex chords, some played as inversions, and with bass notes played fingerstyle with the thumb. The Audio To MIDI Chords process was not so effective with this performance and several of the Chord Events needed manual editing to make them follow the performance more accurately. Editing therefore required a little more work, but still only took a few minutes.
This audio example used a simple keyboard audio performance as the source for the Audio To MIDI Chord process. The keyboard part consisted of a combination of sustained organ chords and a simple piano arpeggio. The Audio To MIDI Chords process identified the chords correctly and only minimal correction to the exact timing of the changes was needed.
This final example shows how the Audio To MIDI Chords feature and the Chord Track might be combined to quickly create a virtual instrument performance from an audio performance. In this case, a clean guitar part (section 1 of the example) was used to create Chord Events on the Chord Track (heard as block chords played on a piano in section 2). The Chords To MIDI command described in the main text of the workshop was then used to generate a MIDI clip containing the chord sequence, and this was placed onto a virtual instrument track to trigger a synth sound with an arpeggiator active (section 3 of the audio example).
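One way of picturing that workflow (purely a sketch; the event format below is an assumption for illustration, not Cubase's internal data) is to treat the Chord Events as a timed chord list that can be expanded into note events: simultaneous notes for the piano's block chords, or stepped single notes for a simple 'up' arpeggiator.

```python
# Sketch of the Chord Events -> MIDI idea (illustrative only; the tuple
# format and timings are assumptions, not Cubase's internals).

# Chord Events: (start_beat, duration_beats, chord as MIDI note numbers)
chord_events = [
    (0.0, 4.0, [48, 52, 55]),  # C major
    (4.0, 4.0, [45, 48, 52]),  # A minor
]

def to_block_chords(events):
    """Expand each Chord Event into simultaneous note events (start, dur, note)."""
    return [(start, dur, n) for start, dur, notes in events for n in notes]

def to_arpeggio(events, step=0.5):
    """Step through each chord's notes in turn: a crude 'up' arpeggiator."""
    out = []
    for start, dur, notes in events:
        t, i = start, 0
        while t < start + dur:
            out.append((t, step, notes[i % len(notes)]))
            t += step
            i += 1
    return out

print(to_block_chords(chord_events)[:3])  # three notes sounding together
print(to_arpeggio(chord_events)[:4])      # the same chord as stepped notes
```

The point of the Chords To MIDI command is that this expansion happens for you: the same Chord Events feed either a block-chord part or, via an arpeggiator on the instrument, a moving synth line.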
For context, the final section of this example places this arpeggiated synth sound within a simple arrangement including drums, bass, piano and the original guitar performance.