Triggering modules via MIDI only works for one channel.
When using multiple MIDI channels in the same patch (for triggering different voices), all MIDI messages get merged, and it also messes up module connections.
This also happens after the latest update (jacks in MIDI mappings).
Yeah, that's really sad. Ignoring the MIDI channel is very limiting, especially for large MIDI setups that rely on MIDI channels.
Is there a particular reason the MIDI channel gets ignored by the MM?
Can we hope that MIDI channels will be supported in future updates?
I agree. The mappings not being channel-specific makes it difficult when we have other channel-specific devices receiving MIDI CC; those are going to conflict with the mappings on the MetaModule.
I tried to sequence the MM via USB MIDI, using the Teenage Engineering EP-133 K.O. II as a sequencer. I'm curious whether there are any plans to fix this issue in the future.
We are looking at this now. Currently, MIDI on the MM just merges all channels.
Perhaps I can get some feedback on various ways to handle MIDI channels:
1) Global MIDI channel setting:
This idea is to add a setting in the System settings page that lets you choose the MIDI channel that MetaModule responds to. The options are 1 - 16, or All (current behavior).
This would be easy to implement and could appear in the next firmware, no problem.
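Roughly, the filtering for this could look like the sketch below (names and types are illustrative, not actual firmware code): a message passes if the global setting is All, or if its channel matches.

```cpp
#include <cstdint>

// Minimal sketch of a global channel filter (hypothetical names).
struct MidiMsg {
	uint8_t status; // channel voice messages carry the channel in the low nibble
	uint8_t data1;
	uint8_t data2;
};

// global_channel: 0 = All (current merge behavior), 1..16 = that channel only
inline bool passes_global_filter(const MidiMsg &msg, uint8_t global_channel) {
	if (global_channel == 0)
		return true; // Omni: accept everything
	if (msg.status >= 0xF0)
		return true; // System messages carry no channel, so always pass
	return ((msg.status & 0x0F) + 1) == global_channel;
}
```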
2) Per-patch MIDI channel setting:
When making a patch in VCV Rack, the MM Hub would detect which MIDI channels you have the MIDI modules set to, and would embed this information in the patch file. Then when you load the patch, the MM would just listen on that channel. Or, when creating a patch on the MM alone, you could set this in the patch info dialog box. There could still be a global MIDI channel setting (idea #1 above) which it defaults to if a patch doesn't specify the MIDI channel.
This also would not be too hard to implement, but it would require updating both the VCV plugin and the firmware. There also would be some confusing cases; for example, if you had a patch in VCV with the MIDI->CV module set to All Channels and the MIDI->CC module set to Channel 3, the MM Hub would have to guess which channels you want to listen to (so either CC messages would come through on all channels, or Note messages on channels other than 3 would be blocked).
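As a sketch of the fallback behavior (field names made up): the patch-level channel wins if it's set, otherwise the global setting applies.

```cpp
#include <cstdint>

// Sketch only: resolve which channel a patch listens on. 0 means "not
// specified"; a patch-level value overrides the global setting, and a
// result of 0 still means Omni (all channels).
inline uint8_t effective_midi_channel(uint8_t patch_channel, uint8_t global_channel) {
	return patch_channel != 0 ? patch_channel : global_channel;
}
```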
3) Per-mapping MIDI Channel settings:
This would add a MIDI Channel to each jack mapping. So when creating a patch with VCV, the MM Hub would look at the MIDI module used to create the mapping and read its channel setting, and then record that information into the MIDI mappings. When making a MIDI map from within MM, you would have the option to select the MIDI Channel for each mapping you create.
This is the most complex option, so we would only want to go for it if it's really what needs to happen.
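For illustration only (these are not real firmware structures), each mapping would carry its own channel and the routing check would consult it:

```cpp
#include <cstdint>

// Illustration: every MIDI mapping stores its own channel, so routing checks
// the mapping rather than a single patch-wide or global setting.
struct MidiJackMapping {
	uint8_t channel = 0; // 0 = All, 1..16 = respond only on this channel
	uint8_t cc_or_note;  // CC number or note, depending on the mapping type
	int dest_jack_id;    // virtual input jack this mapping drives
};

inline bool mapping_accepts(const MidiJackMapping &map, uint8_t status_byte) {
	// channel voice messages carry their channel in the status byte's low nibble
	return map.channel == 0 || ((status_byte & 0x0F) + 1) == map.channel;
}
```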
I think that option 3 (Per-mapping MIDI Channel settings) is the most valuable for MM.
I use the MM in a 4ms pod (148 ph) for traveling, in combination with the OXI Coral and OXI One (sequencer). While traveling I work mostly with MIDI and 4 patch cables.
And option 2 is the same as 3, but you must select the MIDI channels in VCV and can't change them on the MM?
In both cases you would be able to change it on MM or VCV.
It just depends on whether people actually want to have patches like this example: MIDI Channel 7 Note values going to a VCO V/oct input, and at the same time (in the same patch) MIDI Channel 3 Note values going to some other VCO, and CC 74 on all MIDI Channels going to a VCF module. And at the same time you want to ignore Notes on MIDI Channels 1, 2, 4-6, and 8-16 because you have those routed to some other synth. This kind of complex patch would be the reason for Option 3. With great control comes great complexity/confusion…
Or, Option 2 would be suited if you wanted to have a patch respond to all events on a particular channel. Then other synths could respond to other channels. This is a more traditional MIDI setup.
I think if your setup has just one MIDI cable (not sure if yours does), then Option 1 or 2 is probably sufficient unless there's some reason you want to filter particular MIDI Channels on particular events but not others.
But, maybe there's an example I'm not thinking about.
Is there an option to have different modules in the same patch responding to different channels?
So you could have one voice on channel 1 (pitch, gate, and CC control) and another module responding on channel 2.
Thinking of a multi-voice setup (like 4 Plaits, etc.).
I guess this means multichannel per patch.
Thanks!
For me, I think the option to use 3 VCOs inside one patch with different MIDI channels would be the best option. I have 4 sequencers in the OXI One with different MIDI channels per sequencer.
I think Option 1 is an obvious first step if it can be done quickly, and handles the basic case of treating MM like a single MIDI instrument listening to either a single specific channel or all (Omni).
I can't see how you could solve the problems you identified with option 2 if the MM still only processes a single MIDI channel or omni/merge. Which MIDI module in the Rack patch would "win"? Even if you could come up with some logic to decide, how would the user know or predict it? Maybe show the selected channel on the MM VCV Rack module somewhere: "With this MIDI setup, the channel I will select to use is … x". Sounds messy, and it ultimately still restricts you to just one MIDI stream into the MM.
Option 3 seems to be the "modular" way. Allow patchers to set up whatever MIDI mapping they want, and the MM then figures out which MIDI events on which channels to route to which patch cables. There should not be any ambiguity at that point about which MIDI event should be routed to which module jack.
Have you thought at all about possible MPE support in the future (I really don't want to hijack this thread, that's a big topic in itself…)? Which option(s) would support that being added later, or which options would make it more difficult to do in the future? I use Ahornberg MIDI Poly Expression in VCV Rack… one day the MM could parse that module like it does the VCV MIDI modules to route polyphonic MPE data… imagine!
+1 to option 3, as that would lend the most flexibility and interesting potential sonic ideas within a patch. But certainly, I would at least be happy with option 1 to start off…
I also think option 3 is the best because it allows us to respond to various scenarios.
1. EP-133 USB MIDI 1 -> bassline
2. EP-133 USB MIDI 2 -> kick for sidechain
3. EP-133 USB MIDI 3 -> physical modeling voice
Or
Launchcontrol (Novation) preset 1 for controlling 8-step sequencer 1
Launchcontrol preset 2 for controlling 8-step sequencer 2
Or
Ultimately, it would be even better if everything could be connected using a USB hub, allowing both of the above options to work.
(In this case, the EP-133's sequencer would use 1-8, and the Launchcontrol would use 8-16.)
It would be great if things like VCV Free's "MIDI to CV" and "MIDI to Trigger" could be ported over just as they are…
You could use a combo of all 3… basically a "filter hierarchy":
global filter → patch filter → jack filter
The higher level takes precedence…
So if you want to use the jack filter you need:
all → all → jack A = 1, jack B = 2
(though, honestly, I think global is not really needed, I think patch level is fine)
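A rough sketch of that cascade (made-up names): a message must pass every level in turn, and a level set to All (0 here) passes everything.

```cpp
#include <cstdint>

// Sketch: cascaded channel filters, highest level applied first.
// A filter value of 0 means "All" (no filtering at that level).
inline bool channel_matches(uint8_t filter, uint8_t msg_channel) {
	return filter == 0 || filter == msg_channel; // msg_channel is 1..16
}

inline bool passes_hierarchy(uint8_t global, uint8_t patch, uint8_t jack, uint8_t msg_channel) {
	return channel_matches(global, msg_channel)
	    && channel_matches(patch, msg_channel)
	    && channel_matches(jack, msg_channel);
}

// The "all -> all -> jack A = 1, jack B = 2" example above becomes:
//   passes_hierarchy(0, 0, 1, ch) for jack A, passes_hierarchy(0, 0, 2, ch) for jack B
```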
Also, perhaps per-jack is a little fine-grained, and it might be better as "module level", i.e. all jacks on that module.
The use case I'm thinking of here is an oscillator with v/oct and trig (e.g. Plaits); you won't want the note-on to come from separate channels, so it kind of makes sense that all MIDI input to that module comes from one channel.
I guess the counter-case would be something like a drum module where note-on trigs are used from different channels…
But all of this comes back to the dev question/point…
I think there are some interesting MIDI modules that could be used to do this mapping and that are present on the MM itself (unlike the VCV MIDI modules, which "disappear"); these would need access to the full MIDI stream.
But they'd save you from having to try to cope with all use cases in the VCV->MM translation.
I'm glad you have brought this up (early, and again now), because the more I think about it, this is really a viable option. It would solve all the issues. There are some downsides, but ultimately it's the most flexible solution and familiar to users coming from VCV Rack.
The downsides are that extra modules and cables on the screen make for more clutter in the patch, and there will be inherently more CPU consumed (since the MIDI module has to be run and its signals passed by the audio engine, vs. the current method of processing MIDI in parallel on the co-processor core). Technically there will be slightly more latency (but at minimum just one audio frame, so nothing anyone would notice).
The upside is that MIDI can be handled in any number of ways depending on what MIDI modules you choose for the patch.
Whether to continue processing MIDI internally in parallel with the raw MIDI stream would be a decision to make.
One hurdle is that without polyphonic cables, the MIDI modules would need Split modules tied to them, so we would create a custom MIDI->CV module with 8x CV outs, 8x Gate outs, 8x Vel outs, etc… It would be huge (40+ jacks), but there could be a monophonic version to save space.
Let's call this Option 4: allow MIDI modules to parse the raw MIDI stream and output CV, gates, or whatever signals they want on regular virtual jacks. E.g. the VCV Free MIDI->CV module would appear as a normal module.
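As a very rough sketch of the idea (this is not the actual MetaModule plugin API, just an illustration): a monophonic MIDI->CV-style module that receives raw channel-voice messages and exposes pitch and gate on ordinary virtual output jacks.

```cpp
#include <cstdint>

// Sketch only: a minimal monophonic MIDI->CV style module. The engine is
// assumed to hand it each incoming 3-byte MIDI message; it filters by channel
// and updates the values it presents on its virtual output jacks.
class MiniMidiToCV {
public:
	void set_channel(uint8_t chan) { channel = chan; } // 0 = All, 1..16

	void on_midi(uint8_t status, uint8_t data1, uint8_t data2) {
		if (channel != 0 && ((status & 0x0F) + 1) != channel)
			return; // not our channel
		switch (status & 0xF0) {
			case 0x90: // Note On
				if (data2 > 0) {
					pitch_volts = (data1 - 60) / 12.f; // 1V/oct, middle C = 0V
					gate_volts = 10.f;
					break;
				}
				[[fallthrough]]; // Note On with velocity 0 acts as Note Off
			case 0x80: // Note Off
				gate_volts = 0.f;
				break;
		}
	}

	// Read once per audio frame and written to the module's output jacks
	float pitch_volts = 0.f;
	float gate_volts = 0.f;

private:
	uint8_t channel = 0;
};
```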
Indeed, it does create a small overhead… but I think for the flexibility (and potential creativity in module design) it brings, it's worth it. Also, it could run alongside the existing solutions.
Polyphonic cables: indeed these help, but I think they must be considered a separate development, which would just benefit this scenario if/when implemented.
Indeed, until then you need various splitting options… as you do with other polyphonic modules.
That said, there are some use cases where a module could use multi-channel MIDI directly.
E.g. a polyphonic full-voice module which supports MPE.
I've been considering these for a while, since there are many optimisations that can be done when you are processing a full (polyphonic) voice within one module.
I guess, at the end of the day… the idea is: yes, at the user level we need multi-channel MIDI support. But, separately, allowing modules full access to MIDI streams brings new possibilities for what can be created within a single module.
OK, I'll make a branch and start testing this out. At first, in parallel to the existing MIDI framework.
And at first, for a proof of concept, we'll encode the raw MIDI values in the floats, so as not to create havoc with the API. You mentioned you already do this in the Eigenharp VCV plugin? I looked through the Metamorph project on your GitHub, but didn't see any MIDI stuff, so it must be elsewhere.
No, the Eigenharp uses a proprietary USB protocol… so I encode its data stream as 32-bit floats.
The interfacing to the Eigenharp is in my EigenLite sub-project; Metamorph builds on the C++ API I expose there, and then encodes (some of) those messages over a single jack/float.
So the above is where I encode a few (key) streams that I use…
There is another one for LEDs in LightWire.h.
The reason I do this is to avoid having to have 10 jacks for a single connection.
And as the Eigenharp is throwing data at 2 kHz, a serial data stream at sample rate (48 kHz) is fine.
Though… I do have to be careful with the LightWire, since I can potentially be sending LED information for over 120 LEDs! So I implement these as LED region messages.
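To make the float-encoding idea concrete, here is one possible way to pack a 3-byte MIDI message into a single float (an illustration and an assumption, not how EigenLite, Metamorph, or the MM actually encode anything). A real scheme would need to agree on a "no message" value and avoid bit patterns the audio path might mangle; the tag byte here keeps the value a normal, non-zero float.

```cpp
#include <cstdint>
#include <cstring>

// Illustrative packing: tag 0x40 in the top bits, then status/data1/data2.
// The resulting bit pattern is always a normal float (never 0, NaN, or denormal).
inline float pack_midi(uint8_t status, uint8_t data1, uint8_t data2) {
	uint32_t bits = 0x40000000u
	              | (uint32_t(status) << 16)
	              | (uint32_t(data1) << 8)
	              | uint32_t(data2);
	float f;
	std::memcpy(&f, &bits, sizeof f);
	return f;
}

inline bool unpack_midi(float f, uint8_t &status, uint8_t &data1, uint8_t &data2) {
	uint32_t bits;
	std::memcpy(&bits, &f, sizeof bits);
	if ((bits & 0xFF000000u) != 0x40000000u)
		return false; // not a packed MIDI frame (e.g. silence between messages)
	status = (bits >> 16) & 0xFF;
	data1 = (bits >> 8) & 0xFF;
	data2 = bits & 0xFF;
	return true;
}
```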