MIDI support within (custom) plugins

I’d touched on this in the modwiggler topic, but it got buried (understandably).

I write a lot of software (both DSP and drivers) around expressive controllers, using various protocols including MPE, and recently also within VCV.

I’m interested in how I could work in this area by creating plugins for MM.

I wanted to talk a little about what support exists in MM, focusing on the technical side.

My assumption is that all (USB) IO is done via the M4 processor, leaving the A7 for DSP.

This would mean that MIDI comes in on the M4 and is then shipped over to the A7 (I had a look at GitHub; I think there is a packet protocol between the chips).

OK, so let’s consider a concrete example…

Since there is no polyphonic cable support in MM, what if I wanted to write a polyphonic module that processed a MIDI stream, so that it can see multi-channel (MPE) MIDI? The module would then be a polyphonic voice, which I’d likely ‘mix down’ to (e.g.) a stereo output.

So this raises the question…
can a module get access to the full midi stream?

Obviously, this could not be done via a ‘jack input’, since there is no polyphonic cable support.


Here’s a little video of what I’ve done connecting the Eigenharps to VCV :slight_smile:

Not really feasible on MM, I suspect, as I’d guess I’d need to write custom firmware for the M4 to support the USB protocol for the Eigenharp… and the lack of poly cables in MM would be problematic.
But it does show the kind of thing I’m interested in, even if I end up approaching it in a different way.

btw: I’m also interested in I2C support in this area, but that’s a question for another day :wink:


Basically, yes. The A7 and M4 are separate cores on the same silicon chip, but they are independent except for sharing access to the various RAM regions and hardware peripherals.

The M4 handles the USB state machine and acts as a host. It parses the MIDI messages in Controls:

It mostly passes the MIDI messages un-altered, but it also uses the “polyphony number” of a patch to separate note events into separate channels. That’s the poly_chan field of the MIDI event.
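For illustration, here’s a rough sketch of what that note-to-channel separation could look like. The name `PolyAllocator` and the round-robin/steal policy are my own guesses for the sake of the example, not what the firmware actually does:

```cpp
#include <cstdint>
#include <optional>
#include <vector>

// Hypothetical voice allocator: fan incoming notes out across N poly
// channels. Policy here is round-robin with slot stealing when full.
struct PolyAllocator {
	explicit PolyAllocator(unsigned num_chans) : notes(num_chans, NoNote) {}

	// Note-on: reuse the slot if the note is already held, else take the
	// next free slot; if all are busy, steal the next slot in rotation.
	unsigned note_on(uint8_t note) {
		for (unsigned i = 0; i < notes.size(); ++i)
			if (notes[i] == note)
				return i;
		for (unsigned i = 0; i < notes.size(); ++i) {
			unsigned idx = (next + i) % notes.size();
			if (notes[idx] == NoNote) {
				notes[idx] = note;
				next = (idx + 1) % notes.size();
				return idx;
			}
		}
		unsigned idx = next;
		notes[idx] = note;
		next = (idx + 1) % notes.size();
		return idx;
	}

	// Note-off: free the slot holding this note. Returns nothing if we
	// never saw the note-on (mirroring the "drop unmatched events" rule
	// described above).
	std::optional<unsigned> note_off(uint8_t note) {
		for (unsigned i = 0; i < notes.size(); ++i)
			if (notes[i] == note) {
				notes[i] = NoNote;
				return i;
			}
		return std::nullopt;
	}

	static constexpr int NoNote = -1;
	std::vector<int> notes;
	unsigned next = 0;
};
```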

The M4 and A7 share a piece of memory called the ParamBlock. It’s double-buffered, so there are two ParamBlocks: while one core is writing a block, the other is reading the other block.
The ParamBlock has an array of Params elements, one for each frame in the audio block.
On each audio frame, if there’s a MIDI event, it’s stored in the current Params; you can see the kinds of things Params holds here:
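A minimal sketch of that double-buffering scheme, with illustrative types and sizes (the real ParamBlock/Params definitions in the firmware will differ):

```cpp
#include <array>
#include <cstdint>
#include <optional>

// Illustrative stand-ins for the shared types (not the firmware's own):
struct MidiEvent {
	uint8_t status, data1, data2;
};

struct Params {
	std::optional<MidiEvent> midi; // MIDI event for this frame, if any
	std::array<float, 8> knobs{};
};

constexpr int FramesPerBlock = 64; // assumed block size

struct ParamBlock {
	std::array<Params, FramesPerBlock> params; // one Params per audio frame
};

// Double-buffer: one core fills blocks[write_idx] while the other core
// reads blocks[1 - write_idx]; the roles swap at each block boundary.
struct DoubleBuffer {
	ParamBlock blocks[2];
	int write_idx = 0;

	ParamBlock &writing() { return blocks[write_idx]; }
	const ParamBlock &reading() const { return blocks[1 - write_idx]; }
	void swap() { write_idx = 1 - write_idx; }
};
```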

On the A7 side, the audio callback passes the midi event along to the PatchPlayer here:

The PatchPlayer has parsed the patch and knows what modules are connected to what kind of MIDI events, and routes accordingly.

In theory you could have a ton of input jacks on a module and map all MIDI events to them. But there’s probably a better way…

To get the full MIDI stream, I think the M4 would need some minor changes: report the channel (right now it ignores it), and skip tracking whether a note is pressed (right now it won’t report a NoteOff or Aftertouch event if it didn’t first get a NoteOn for that key).
That would get the full MIDI stream to the A7 audio callback, which it could pass on to the PatchPlayer.
A Patch could have a new kind of MIDI jack mapping, call it “Raw” or “All MIDI”, and PatchPlayer could pass on the MIDI event to any module with this type of mapping. It would be akin to a Panel cable, so polyphonic vs. monophonic is not an issue since the PatchPlayer would just stream the data to the module.
However… what we need is an API so the PatchPlayer can do that. Right now inputs, outputs, and params are all just 32-bit float types. Could we encode MIDI into the float? Maybe, but is it even a good idea? Or we could add a new function to the CoreProcessor API: set_MIDI_event(), which the PatchPlayer calls on each module with the All MIDI mapping.
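For what it’s worth, a sketch of how the set_MIDI_event() idea might look against a simplified CoreProcessor-style interface. The real API has more methods; the MidiEvent struct and the default no-op body are my assumptions:

```cpp
#include <cstdint>

struct MidiEvent {
	uint8_t status, data1, data2;
};

// Simplified stand-in for the CoreProcessor interface, extended with a
// hypothetical set_midi_event(). A default no-op body means existing
// modules compile unchanged and simply ignore raw MIDI.
class CoreProcessor {
public:
	virtual void update() = 0;
	virtual void set_input(int input_id, float val) = 0;
	virtual void set_midi_event(const MidiEvent &) {} // proposed addition
	virtual ~CoreProcessor() = default;
};

// A module opting in just overrides it:
class MidiMonitor : public CoreProcessor {
public:
	void update() override {}
	void set_input(int, float) override {}
	void set_midi_event(const MidiEvent &ev) override {
		last = ev;
		++count;
	}
	MidiEvent last{};
	int count = 0;
};
```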


First, thanks for the really detailed response; really interesting stuff.

Oh, and I noticed you have released the firmware on GitHub now, so I can start digging through it :slight_smile:

Oh, that’s super interesting… I didn’t realise they were on the same silicon!


In terms of MIDI… yeah, for MPE the main requirement is the MIDI channel.
As long as we then get note on/off, channel AT, and CC74, that covers the main bases.
(Theoretically, other CCs can be multi-channel, but it’s not common to use them that way.)
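To make that concrete, a hedged sketch of decoding just those MPE-relevant messages from a 3-byte MIDI message (the names `MpeMsg`, `decode_mpe`, etc. are illustrative):

```cpp
#include <cstdint>

// The message types an MPE receiver cares about, per the list above.
enum class MpeMsg { NoteOn, NoteOff, ChannelAT, Slide, Other };

struct Decoded {
	MpeMsg type;
	uint8_t channel; // per-note channel: the key piece MPE needs
	uint8_t a, b;    // data bytes (note/velocity, pressure, CC value…)
};

Decoded decode_mpe(uint8_t status, uint8_t d1, uint8_t d2) {
	uint8_t chan = status & 0x0F;
	switch (status & 0xF0) {
	case 0x90: // NoteOn with velocity 0 is a NoteOff by convention
		return {d2 ? MpeMsg::NoteOn : MpeMsg::NoteOff, chan, d1, d2};
	case 0x80:
		return {MpeMsg::NoteOff, chan, d1, d2};
	case 0xD0: // channel aftertouch (one data byte: pressure)
		return {MpeMsg::ChannelAT, chan, d1, 0};
	case 0xB0:
		if (d1 == 74) // CC74: the MPE "slide"/Y dimension
			return {MpeMsg::Slide, chan, d1, d2};
		return {MpeMsg::Other, chan, d1, d2};
	default:
		return {MpeMsg::Other, chan, d1, d2};
	}
}
```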

When considering this space, you might also want to consider poly AT, as it’s a bit different.

lol, funnily enough in my Eigenharp VCV module I do exactly this.
I treat the float as just 32 bits and package whatever data I need into it… I even use it as a ‘raw stream’ in some cases.

Of course, it’s a complete abuse of jacks, but fortunately VCV doesn’t know/care :wink:

But yeah, it’s probably not a great idea… for VCV desktop I had to use what was available, whereas you are able to extend the API :wink:
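For the curious, a minimal sketch of that bit-packing trick (my own illustration, not the actual Eigenharp module code):

```cpp
#include <cstdint>
#include <cstring>

// Pack three MIDI bytes into the 32 bits of a float. The 0x01 tag byte
// keeps the exponent field nonzero and below all-ones, so the result is
// always a normal float (never NaN/Inf/denormal) and round-trips safely.
float pack_midi(uint8_t status, uint8_t d1, uint8_t d2) {
	uint32_t bits = 0x01000000u | (uint32_t(status) << 16) | (uint32_t(d1) << 8) | d2;
	float f;
	std::memcpy(&f, &bits, sizeof f); // type-pun safely via memcpy
	return f;
}

void unpack_midi(float f, uint8_t &status, uint8_t &d1, uint8_t &d2) {
	uint32_t bits;
	std::memcpy(&bits, &f, sizeof bits);
	status = (bits >> 16) & 0xFF;
	d1 = (bits >> 8) & 0xFF;
	d2 = bits & 0xFF;
}
```

The memcpy is the important part: reinterpreting the bits through a pointer cast would be undefined behavior, while memcpy compiles down to the same single move.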

I made some progress here.

Firmware branch: raw-midi

Screenshots from simulator (just feeding it random notes):

(two screenshots of the MIDI monitor running in the simulator)

It works as expected, both in simulator and on hardware.

The “MIDI to CV” module is the VCV Rack Core MIDI_CV module. No changes to the original VCV source were necessary to get it running. The changes I did make were to use the display as a live MIDI monitor (of course, this is just for development; it’s still in the proof-of-concept phase).

The MIDI is transmitted in its raw form from the M4 core to the A7.

The way it works is that if a module has a rack::midi::InputQueue object, then constructing that object subscribes the queue to a global singleton MIDI Stream. The patch player feeds incoming MIDI to this global stream, which forwards it to its subscribers.
So, no special mappings or API changes are needed. At least so far.
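A simplified sketch of that subscribe-on-construction pattern (the real rack::midi::InputQueue and the MetaModule stream differ in detail; `MidiStreamHub` is a name I made up):

```cpp
#include <algorithm>
#include <cstdint>
#include <deque>
#include <vector>

struct Message {
	uint8_t bytes[3];
};

// Global singleton that fans the raw MIDI stream out to subscribers.
class MidiStreamHub {
public:
	static MidiStreamHub &instance() {
		static MidiStreamHub hub;
		return hub;
	}
	void subscribe(std::deque<Message> *q) { queues.push_back(q); }
	void unsubscribe(std::deque<Message> *q) {
		queues.erase(std::remove(queues.begin(), queues.end(), q), queues.end());
	}
	void push(const Message &m) {
		for (auto *q : queues)
			q->push_back(m);
	}

private:
	std::vector<std::deque<Message> *> queues;
};

// Constructing the queue subscribes it; destruction unsubscribes.
// So a module just needs to own one of these to receive raw MIDI.
struct InputQueue {
	InputQueue() { MidiStreamHub::instance().subscribe(&msgs); }
	~InputQueue() { MidiStreamHub::instance().unsubscribe(&msgs); }
	std::deque<Message> msgs;
};
```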

The hardware CPU base load (no modules in a patch, but MIDI connected) goes up a few percentage points (7% → 12%). This is because the extra data needs to be either copied or cleaned/invalidated from the cache, and more data means more cache misses.

If I disable the legacy method, then the load drops back down to lower than the legacy method alone. This is because there is some amount of processing the A7 audio task has to do for MIDI messages, to figure out which module to route them to, and also because the patch player has some large data structures used to speed up MIDI routing (more data = more cache misses). But as soon as you add a MIDI receiving module, that processing will happen inside that module, so this is not really lower CPU usage.

Next Steps:

  • Need to test other MIDI input modules (non-Rack Core especially). Anyone have recommendations?
  • Add right-click menu options for the MIDI->CV module
  • Polyphony needs to be handled. I’m thinking this:
    • We could have a mono version (same as shown)
    • And, have an octo-phonic version (with 8 x each of Voct, Gate, Vel, Aft, Retr), that automatically sets the poly number based on the highest numbered jack you have patched.
  • Modify the VCV MM Hub code to translate patches with MIDI in Rack to this format (should be easy, mostly removing the old MIDI translation code, though we need to keep the SPLIT module usage intact)
  • Figure out how to handle legacy MIDI patches. We probably could convert them on patch load, and save the converted version when the patch is saved. If having the octophonic monster module visible in the patch view is a problem, there could be an option to hide it(?)
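As an aside, the “highest patched jack sets the poly count” rule from the octophonic idea could be as simple as this (purely illustrative; the jack count of 8 matches the proposal above):

```cpp
#include <array>

// Poly channel count = index of the highest patched jack + 1.
unsigned poly_count(const std::array<bool, 8> &jack_patched) {
	unsigned count = 0;
	for (unsigned i = 0; i < jack_patched.size(); ++i)
		if (jack_patched[i])
			count = i + 1;
	return count;
}
```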

that sounds great…

A thought: would this be a good time to add MIDI output support?

Again, not necessarily adding the modules that use it at this stage, but getting that output queue in place within the ‘new’ SDK.

Happy to see more similarity with VCV the software. It should make things easier to navigate, and if polyphonic MIDI cables come through, voila!


Yes, I was thinking that too. Just to have the framework there.


If MIDI output is going to be supported (which is just great!), the VCV CV->MIDI modules become viable, which would be a big new capability for the MetaModule… controlling other MIDI gear from sequencers etc. running on MM.

Beyond the VCV modules, there are several others that would also add to the MM ecosystem, such as:

FYI: I am developing a SysEx-based VCV Rack module, derived from MIDI-CAT, to control other Rack modules with value and display feedback via the Electra One controller. Current demo here: https://youtu.be/rX2rZA0CANs. The beauty of the Rack/MM modular environment is that these kinds of custom things become possible once modules have direct access to the right APIs.

It would be awesome to be able to port that to MM at some point. Module access to the rack::midi::InputStream and OutputStream objects is an essential part of that. Plus read/write access to the parameters of other modules in the same patch… but that’s a different topic :slight_smile: