Show HN: Nallely a modular reactive Python framework for custom MIDI instruments
6 hours ago by drschlange
Hi HN! I'm Vince. I built Nallely, a modular reactive Python framework for creating custom MIDI instruments by patching signal-processing modules together, like a modular synthesizer for control systems. Nallely focuses on real-time, thread-isolated, reactive behavior, letting you experiment with emergent behaviors.

Demo video: https://www.youtube.com/watch?v=rbMnKAdqAVI (it shows building a patch from scratch, with hot-debugging of a running instance near the end).

Key features:

* Visual patching interface for connecting reactive modules (neurons)

* Extensible via the Python API, WebSocket, and/or code generation

* Integrates any input source (MIDI, webcam, ...) to control synthesizers

# Yes, but why?

Existing software and libraries for MIDI manipulation are powerful but not friendly to live experimentation. They are low-level, hard to rewire on the fly, and often heavy for embedded or headless setups. I wanted a system that could evolve dynamically, where modules can be patched, hot-swapped, and debugged in real time.

# Architecture

The system is built around a reactive threading model with no shared data: each neuron lives in its own thread and communicates by sending messages through channels. At the neuron level there are no CCs or other MIDI specifics: everything is a signal (a simple int/float value over time). There is no global tick; each neuron works on its own time. Because each neuron is reactive, it sleeps the majority of the time. The system takes heavy inspiration from the "Systems as Living Things" philosophy and Smalltalk, treating each thread as a small living entity more than a processing unit. Here is how to code a simple Sample&Hold module:

    class SampleHold(VirtualDevice):
        # Two virtual inputs: the signal to sample and the trigger.
        input_cv = VirtualParameter(name="input", range=(0, 127))
        # The ">0" policy coerces incoming values into the 0/1 trigger range,
        # so any signal can act as a gate.
        trigger_cv = VirtualParameter(name="trigger", range=(0, 1), conversion_policy=">0")

        # React only on the trigger's rising edge: emit the current value
        # of "input", i.e. sample it and hold it.
        @on(trigger_cv, edge="rising")
        def hold_value(self, value, ctx):
            return self.input
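
For intuition, here is a generic sketch of the thread-per-neuron model described above. This is not Nallely's actual internals and the names are illustrative: each neuron owns an input queue as its channel, blocks on it while idle, and only wakes up when a value arrives.

    import queue
    import threading

    class ToyNeuron(threading.Thread):
        """Toy reactive unit: sleeps on its channel, wakes on incoming signals."""
        def __init__(self):
            super().__init__(daemon=True)
            self.channel = queue.Queue()  # the neuron's only communication surface
            self.outputs = []             # channels of downstream neurons

        def react(self, value):
            return value  # identity; a real module transforms the signal

        def run(self):
            while True:
                value = self.channel.get()  # blocks (sleeps) until a signal arrives
                out = self.react(value)
                for channel in self.outputs:
                    channel.put(out)        # fan out to patched neurons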

The control layer uses a small WebSocket protocol that the React-based web UI uses to control and introspect sessions. A WebSocket-bus neuron lets external applications auto-register to it to send and receive signals, so another neuron in the network can serve signals captured from any source. This is also useful to distribute computation load across different machines.
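
As an illustration, an external client could look roughly like this. The URL, port, and message shapes below are invented for the example; the real bus protocol is described in the docs.

    import asyncio
    import json

    import websockets  # pip install websockets

    async def main():
        # Hypothetical endpoint and registration message -- check the docs
        # for the actual protocol.
        async with websockets.connect("ws://localhost:6789") as ws:
            await ws.send(json.dumps({"kind": "register", "name": "external-sensor"}))
            while True:
                await ws.send(json.dumps({"value": 0.5}))  # stream a signal
                await asyncio.sleep(0.02)                  # ~50 Hz

    asyncio.run(main())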

# What have I learned so far

A simple threading model can be powerful in a MIDI/music context:

* you can stop and resume a thread, seamlessly pausing part of the processing chain (a generic pattern for this is sketched after this list);

* overloaded neurons can mitigate the pressure without impacting the whole session;

* if a thread crashes, it is paused so you can debug the instance and resume it;

* plain WebSockets provide acceptable throughput.
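
The stop/resume point deserves a sketch. This is not Nallely's actual mechanism, just the classic cooperative pattern it resembles: a thread blocked on a threading.Event while paused costs essentially nothing.

    import threading
    import time

    class PausableWorker(threading.Thread):
        """Toy worker whose loop can be paused and resumed cooperatively."""
        def __init__(self):
            super().__init__(daemon=True)
            self._running = threading.Event()
            self._running.set()  # start unpaused

        def pause(self):
            self._running.clear()

        def resume(self):
            self._running.set()

        def run(self):
            while True:
                self._running.wait()  # blocks while paused, near-zero cost
                # ... process one step of the chain here ...
                time.sleep(0.01)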

I was expecting a system built entirely on Python threads to be really inefficient, but it's surprisingly reasonable. Empirically I see ~1-2% CPU per thread. A classical 20-thread session (~45 patches) uses roughly 21% CPU and 45MB RAM on CPython 3.13 (with the GIL). The CPython 3.14 free-threaded (no-GIL) build shows similar CPU but ~65MB RAM. Feedback loops raise usage (~38%). Interestingly, on CPython 3.13 the load spreads across multiple cores; I suspect the threads sleep enough to release the GIL often.
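
For reference, these figures are the kind you can reproduce with psutil (assuming a pip-installed psutil, run inside the session or attached to its PID):

    import psutil  # pip install psutil

    proc = psutil.Process()  # or psutil.Process(session_pid) from outside
    print(f"CPU:     {proc.cpu_percent(interval=1.0):.1f}%")  # averaged over 1 s
    print(f"RSS:     {proc.memory_info().rss / 1e6:.0f} MB")
    print(f"Threads: {proc.num_threads()}")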

# Try it!

You can grab a precompiled PyInstaller binary from the latest GitHub Actions artifacts. The doc is linked in the README, and deep-dive posts are available here: https://dr-schlange.github.io/nallely-midi/posts.

# I would love feedback

* What could be improved to make the system easier to get familiar with?

* Are there blind spots or design choices that could be problematic long-term?

* Although it's MIDI-oriented, the system is really signal-agnostic. Any ideas for non-audio use cases (e.g., visuals)?
