Got an idea for a plug‑in, but no idea how to code? Read on...
If you make music with computers then you surely use plug‑ins. They’re everywhere: sequencers, samplers, instruments, inserts, effects, utilities such as spectrograms and MIDI filters, and more. But have you ever thought about making your own plug‑in? Perhaps you have an idea for a plug‑in that doesn’t yet exist, but you don’t know how to make it yourself?
Until now, your chances of making your own VST, AU or AAX plug‑in were limited: you needed to be able to write code and have the time to invest in building your idea, be able to afford to pay someone else to do it, or be famous enough to have companies competing to put your name on their plug‑in. Today, there’s another option: a new approach called vibe coding. Vibe coding is a process in which you describe the software you want in natural language, and have a large language model write the code for you. You don’t even need to read, or understand, the code.
Large language models, often abbreviated as LLMs, are so‑called ‘artificial intelligences’ that have been trained on a vast corpus of written material across all fields. This allows them to generate different types of material, from essays and forum posts to computer code. If you’ve already used LLMs such as ChatGPT, you’ll find vibe coding is similar, in that it feels like a conversation with someone, except you’re talking about the software you want to make. All you need is the ability to put your ideas into words, and the diligence to see it through, even when things don’t go to plan. These are skills you already use when writing songs. In this article I’ll outline the process you would need to follow to make your own plug‑ins, and I’ll illustrate my points with an example project.
Starting Points
Before you start messing around with AI code generation, you need to know what you want to build. It all starts with one question: what do you want or need? Perhaps you want to work with MIDI data: filter it, generate melodies or invent outlandish new jazz chords. Or you could work directly with audio, and modulate sample playback so that you can have a snare backbeat that constantly evolves over the course of a song. These are possible, as are many other ideas. My only recommendation is that your first attempt should be the simplest thing you can imagine. You want to learn the process before attempting to vibe code a complex plug‑in.
You will need to rein in your expectations, too. Some things you try simply won’t work. You can’t expect to create a competitor to plug‑ins that precisely model orchestras, or allow you to have granular control over every parameter of sample playback. Commercial plug‑ins with those features are complex, and need experts to build them. It’s best to think of vibe coding as a means to build something that’s uniquely yours, that may not be available to buy. Don’t underestimate it, though! I’ve frequently been surprised how far I could push my vibe‑coded efforts. The results usually worked better than I expected.
It’s a creative process, so I recommend that you approach it with an open mind. Your plug‑in is going to be created during a conversation with a machine, where you give it your ideas and the machine sends new ideas back to you.
In this example, we’ll use Cockos Reaper to host our generated plug‑in, for two reasons: Reaper’s JSFX plug‑in format has its own compiler built in, and Cockos generously offer a free trial to new users. For the code generation we’ll use Anthropic’s Claude, as they also offer a generous free tier, and in my trials it has consistently given better code‑generation results than OpenAI’s ChatGPT and Google Gemini. Claude’s free tier does limit the amount of code you can generate in any six‑hour period, so you may have to wait for the limit to reset, or consider upgrading to a paid plan.
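To give you a flavour of what JSFX looks like, here’s a minimal gain plug‑in written by hand rather than generated by Claude; the slider name and layout are just illustrative, not the output of any particular prompt. Saved as a plain‑text file in Reaper’s Effects folder, it compiles and runs inside Reaper with no external tools:

```jsfx
desc:Simple Gain

slider1:0<-24,24,0.1>Gain (dB)

@slider
// Convert the dB slider value to a linear amplitude multiplier.
gain = 10^(slider1/20);

@sample
// Apply the gain to the left and right channels of each sample.
spl0 *= gain;
spl1 *= gain;
```

Even a tiny example like this shows the structure an LLM has to produce: a description, slider declarations, and code sections that run when a control changes (@slider) or on every audio sample (@sample).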
Prompt Action
The vibe‑coding process I’ll outline isn’t very risky. As with all software, bugs and crashes are an inherent risk, and there’s a chance Reaper will fall over if you give it buggy code, though this has never happened to me. Another common concern with LLMs is handing over too much personal data, but you control what you share: the main thing to remember is never to tell the LLM anything that could identify you. It’s also worth noting that Anthropic let you decide whether your conversations can be used to train their models, in the Privacy section of Claude’s settings menu.
Let’s walk through the process I follow to create my own plug‑ins. While your goals may be different, the process you follow should be the same, and it always begins with considering what the final result should be:
1. Define your desired plug‑in.
2. Describe it in natural language.
3. Use Claude to create code.
4. Test the code in Reaper.
5. Repeat steps 3 and 4 to optimise the features and fix bugs, if needed.
6. Stop when it works (rather than waiting until it’s...