Embedded MIDI synthesiser on Raspberry Pi

In the brand new issue of Raspberry Pi Official Magazine, out today, we learn that a synthesiser squeezed into an accordion sounds amazing.

Although it’s often considered to be a quaint, old-fashioned instrument in large parts of Europe and North America, the accordion is more popular than you may think. As well as playing a significant role in traditional, folk and ethnic music, it’s used to produce jazz, pop and classical tunes. It’s also of great cultural importance and a powerful symbol of musical expression.

It still looks like an ordinary chromatic accordion, albeit with new control knobs

Sergey Antonovich is certainly a fan and he loves to play the button accordion. He also spends a good chunk of his free time studying music theory and creating digital music instruments and accessories. To that end, he’s been working on bringing digital synthesiser capabilities to an acoustic accordion while preserving the instrument’s authentic feel and sound.

To achieve this, Sergey has been using the Raspberry Pi Compute Module 3+, which he praises for its balance of performance and thermal efficiency. He’s also produced a version that uses the Raspberry Pi Zero 2 W computer and created it with simplicity in mind so that anyone can try it out and see how it works.

The result is a system that is more flexible and less expensive than off-the-shelf MIDI synthesiser alternatives, yet still capable of producing beautiful digital sounds. What’s more, Sergey has fitted all of the electronics inside the accordion rather than having them trail externally, which has allowed him to create a self-contained musical experience offering the very best sound quality.

Melodic move 

The project has been challenging. “When designing a digital musical instrument, the sound engine – the synthesiser – is not just a component. It is the instrument, in the ears of the audience,” Sergey explains. “No matter how sophisticated the keyboard, pressure sensors, or expressive interface may be, without a reliable and flexible sound engine, the instrument cannot speak.

This is the first headless prototype on Raspberry Pi CM3+. It boots straight into performance mode. With 3ms latency, it is in the same class as hardware synths

“In my work developing digital accordions, this challenge became central,” he adds. “I had already developed a working logic board based on a microcontroller – responsible for scanning buttons, detecting bellows pressure, and generating MIDI messages. But I needed a standalone, embedded MIDI synthesiser: a system that receives MIDI input (via hardware DIN or USB), and outputs audio in real time. Crucially, it had to be small, reliable, low-latency, and flexible enough to support wavetable synthesis with user-replaceable sound banks.”

Although Sergey did consider external MIDI synthesiser modules such as the MIDIPLUS miniEngine Pro, Ketron SD2 or Ketron SD1000, he didn’t want separate cables, power, and clunky mounting. He also thought about using integrated hardware solutions offered by a small number of companies whose integrated circuits provide sample-based wavetable synthesis in compact form. But, again, he saw problems.

“They present several challenges for independent developers,” he says. “There is no public access to development kits or documentation. They use proprietary sound bank formats, editable only with internal or vendor-provided tools. They have locked feature sets, meaning custom user banks are difficult to implement without proprietary tools, licences or vendor support. And cost-effective ICs like the SAM2695 or VS1053b rely on compact, generalised samples that may not fully capture the nuance of acoustic instruments.”

Bellows power

As a result, Sergey began exploring a fully open approach. “I decided to use FluidSynth, a software synthesiser that supports SoundFont 2 wavetable banks and can be tuned for real-time operation on Linux-based single-board computers,” he says.

FluidSynth responds to real-time MIDI input by generating audio, provides a shared library that can be embedded in other programs, and includes a built-in command-line shell. Running it on Raspberry Pi Compute Module 3+ struck Sergey as the most logical solution to his problems: it meant he could take advantage of the module’s four ARM cores and 1GB of RAM, which is enough to load most SoundFont 2 banks.
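As a rough illustration of how little glue code that embedding takes, here is a minimal C sketch using the public libfluidsynth API. The SoundFont path, the ALSA driver choice, and the build command are assumptions made for the example, not details of Sergey’s firmware.

    /* Minimal embedded MIDI-synth sketch with libfluidsynth.
     * Assumes libfluidsynth-dev is installed and a General MIDI SoundFont
     * exists at the path below (both are illustrative choices).
     * Build with: gcc midi_synth.c -lfluidsynth -o midi_synth */
    #include <fluidsynth.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        fluid_settings_t *settings = new_fluid_settings();

        /* Route audio through ALSA; the driver name is an assumption. */
        fluid_settings_setstr(settings, "audio.driver", "alsa");

        fluid_synth_t *synth = new_fluid_synth(settings);

        /* Load a SoundFont 2 bank; any .sf2 file can be swapped in here. */
        if (fluid_synth_sfload(synth, "/usr/share/sounds/sf2/FluidR3_GM.sf2", 1) == FLUID_FAILED) {
            fprintf(stderr, "could not load SoundFont\n");
            return 1;
        }

        /* The MIDI driver listens for incoming MIDI events and hands them
         * straight to the synth via the stock event handler. */
        fluid_midi_driver_t *mdriver =
            new_fluid_midi_driver(settings, fluid_synth_handle_midi_event, synth);

        /* The audio driver pulls rendered samples from the synth in real time. */
        fluid_audio_driver_t *adriver = new_fluid_audio_driver(settings, synth);

        /* Keep playing until the instrument is powered off. */
        pause();

        delete_fluid_audio_driver(adriver);
        delete_fluid_midi_driver(mdriver);
        delete_fluid_synth(synth);
        delete_fluid_settings(settings);
        return 0;
    }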

All of the components fit inside the right-hand half of the accordion, and the system avoids the limitations of closed systems and the hassle of external gear

“Using Raspberry Pi for audio synthesis offers several key advantages,” he says. It has an open software stack with no vendor lock-in and full control over tuning, latency, and effects. When combined with FluidSynth, it also offers full SoundFont 2 support, allowing users to replace or customise their sound banks, which is a highly requested feature among musicians.

“It also offers a good performance-to-power ratio, which is critical for embedded use in sealed housings,” adds Sergey. “And there’s a strong developer ecosystem with up-to-date kernels, documentation, overlays, and toolchains.”
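That user-replaceable bank workflow maps onto just a couple of library calls. The fragment below is a hedged sketch of how a bank swap could look at runtime; the function name, the file path argument and the choice of General MIDI preset 21 (accordion) are illustrative rather than taken from Sergey’s code, and the synth is assumed to be the one created in the earlier sketch.

    /* Illustrative runtime sound-bank swap using the SoundFont 2 support
     * in libfluidsynth; names and numbers are placeholders. */
    #include <fluidsynth.h>

    void swap_sound_bank(fluid_synth_t *synth, int old_sfont_id, const char *new_sf2_path)
    {
        /* Unload the previous bank and reset any presets that referenced it. */
        fluid_synth_sfunload(synth, old_sfont_id, 1);

        /* Load the musician's replacement bank... */
        int new_id = fluid_synth_sfload(synth, new_sf2_path, 1);

        /* ...and point MIDI channel 0 at bank 0, preset 21 (accordion in
         * 0-indexed General MIDI numbering) from the newly loaded file. */
        if (new_id != FLUID_FAILED)
            fluid_synth_program_select(synth, 0, new_id, 0, 21);
    }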

Raspberry Pi Compute Module 3+ can be integrated into a compact board design as well. “For benchtop prototyping, I used the Waveshare Compute Module PoE Board, Waveshare WM8960 Audio HAT, and a simple DIN MIDI input schematic on a 6N139 optocoupler,” Sergey says.
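On the MIDI input side, the 6N139 simply isolates the DIN current loop and presents clean serial data to a UART. The sketch below shows one plausible way to read and decode those bytes on a Raspberry Pi, assuming the port appears as /dev/serial0 and that the midi-uart0 device tree overlay is enabled so that a requested 38400 baud actually runs at MIDI’s 31250 baud; none of this is taken from Sergey’s schematic or firmware.

    /* Sketch: read DIN MIDI bytes from the Pi UART behind an optocoupler
     * and report note-on messages. Assumes the midi-uart0 overlay retunes
     * the UART so that 38400 baud yields 31250 baud, and that the port is
     * /dev/serial0 (both assumptions for this example). */
    #include <fcntl.h>
    #include <stdio.h>
    #include <termios.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("/dev/serial0", O_RDONLY | O_NOCTTY);
        if (fd < 0) { perror("open"); return 1; }

        struct termios tio;
        tcgetattr(fd, &tio);
        cfmakeraw(&tio);
        cfsetispeed(&tio, B38400);   /* retuned to 31250 baud by the overlay */
        cfsetospeed(&tio, B38400);
        tcsetattr(fd, TCSANOW, &tio);

        unsigned char byte, msg[3];
        int idx = 0;
        while (read(fd, &byte, 1) == 1) {
            if (byte >= 0xF8) continue;                    /* skip real-time bytes */
            if (byte & 0x80) { msg[0] = byte; idx = 1; continue; }  /* new status */
            if (idx == 0) continue;                        /* data byte, no status yet */
            msg[idx++] = byte;
            if (idx == 3) {                                /* note on/off are 3 bytes */
                if ((msg[0] & 0xF0) == 0x90 && msg[2] > 0)
                    printf("note on: key %d velocity %d\n", msg[1], msg[2]);
                idx = 1;                                   /* keep running status */
            }
        }
        close(fd);
        return 0;
    }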

Sound journey

For the instrument to work effectively, the time between pressing a key and the sound playing needed to be short. “The most critical metric for live performance is trigger-to-sound latency and, while hardware synthesisers typically achieve 1.5 to 3 milliseconds (ms), software solutions must carefully balance CPU load and audio buffering to stay within acceptable range – ideally below 10 ms to remain imperceptible to the performer,” Sergey explains.
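To put rough numbers on that buffering trade-off: at a 44.1kHz sample rate, a 64-frame audio period with two periods of buffering corresponds to about 64 × 2 / 44100 seconds, or roughly 2.9ms, of output latency, whereas a more conservative 256-frame period with three periods already adds around 17ms, which a performer will start to feel. These figures are illustrative rather than Sergey’s published settings.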

The system is also designed for safe power removal: thanks to a read-only root file system, the module can be switched off simply by cutting power

Using a minimal Buildroot-based Linux system and carefully tuned audio parameters, he achieved a consistent average latency of 3ms (from receiving a MIDI event to the sound being generated). Including MIDI transmission time of around a millisecond, the total response comes in at 4ms. “That’s fully competitive with dedicated hardware synths,” Sergey notes.

The system also handles playing 64 different notes simultaneously, which Sergey says is sufficient for a digital accordion with layered auto-accompaniment. “The audio output is clean, and no artifacts or dropouts occur under load.” 
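A short C fragment gives a flavour of the knobs FluidSynth exposes for this kind of tuning. The specific values are plausible examples consistent with the figures above, not Sergey’s published configuration, and the settings object is assumed to feed the same synth set up in the earlier sketch.

    /* Sketch of low-latency FluidSynth tuning; all values are illustrative. */
    #include <fluidsynth.h>

    static fluid_settings_t *make_low_latency_settings(void)
    {
        fluid_settings_t *settings = new_fluid_settings();

        /* Small audio periods keep buffering latency low (64 frames x 2
         * periods at 44.1kHz is roughly 2.9ms) at the cost of more CPU load. */
        fluid_settings_setint(settings, "audio.period-size", 64);
        fluid_settings_setint(settings, "audio.periods", 2);

        /* Cap polyphony at 64 voices: enough for a digital accordion with
         * layered accompaniment, while keeping the worst-case cost predictable. */
        fluid_settings_setint(settings, "synth.polyphony", 64);

        /* Match the synth's sample rate to the audio codec's configuration. */
        fluid_settings_setnum(settings, "synth.sample-rate", 44100.0);

        return settings;
    }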

The result is excellent latency, full musical responsiveness, and great sound quality in an instrument that is fully self-contained and can be played without relying on any external gear – enabling untethered movement on stage when paired with a basic wireless audio transmitter. 

And while the instrument can be played electronically, the acoustic sounds still respond naturally to bellows movement, and the digital voices can be configured to follow or ignore that same motion. “It confirms the viability of this setup in a real, stage-ready instrument,” he says.

Raspberry Pi Official Magazine #156 out NOW!

You can grab the latest issue right now from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available from our online store, which ships around the world. And you can get a digital version via our app on Android or iOS.

You can also subscribe to the print version of Raspberry Pi Official Magazine. Not only do we deliver worldwide, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico W!
