I've made some progress with the new MIDI mapping. A MIDI note can be active on multiple YM2164 voices at the same time (e.g. when doing 'dual' sound setups).
So what I've done is write a function, is_on_instrument(): you pass it a note and its MIDI channel, and it tells you whether or not the note should be played on that instrument.
I simply loop through all 8 instruments, and whenever is_on_instrument() returns true, I perform the required logic.
I've only done a quick-and-dirty implementation of note_on/off for now, with a simple default configuration where each instrument is mapped to a unique MIDI channel, with the full note range and a polyphony of 1.
The actual polyphony code still needs to be reimplemented (I'm moving it to a different place than before). More annoyingly, the entire codebase was designed to take the MIDI channel as a parameter, with one YM2151 instance mapped to each MIDI channel.
I need to rewrite all of that to map to the individual voices of a single YM2151 instead.
The annoying part is that it now matters whether certain processing is 'chip-wide' or 'instrument-wide'.
Hopefully the polyphony code will make things trivial enough that you just get a voice number out of it, with not much else needing to be done at the 'instrument-wide' level.
I'll just say that I'll be very happy once I can start LSL3 and it initializes all MIDI channels and instrument polyphony correctly, and sounds basically the same as it did before. That would mean the new MIDI mapping is done, and we 'only' have to solve the voices and any remaining SysEx stuff.