If you really had to send data from multiple sources to a single MIDI destination over a single cable, and a small global delay were acceptable, a smart scheduling algorithm with a 10-20 millisecond jitter buffer would probably handle it well enough that the upstream data wouldn't need to be tweaked.
(Note that if you stand with your guitar 5 meters from your 4x12 stack, you're hearing a 15 ms delay due to the speed of sound.)
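The jitter-buffer idea can be sketched in a few lines. This is a minimal illustration, not any real MIDI library: events are held for a fixed buffer length after arrival and released in deadline order, so bursts from several sources interleave smoothly on the shared output. The class and method names are hypothetical.

```python
import heapq

class JitterBufferMerger:
    """Hold incoming events for a fixed window, then release them in
    arrival-time order onto a single output (sketch, not a real API)."""

    def __init__(self, buffer_ms=15.0):
        self.delay_s = buffer_ms / 1000.0
        self._heap = []
        self._seq = 0  # tie-breaker so simultaneous events stay FIFO

    def push(self, event, arrival_s):
        # Schedule the event for release one buffer length after it arrived.
        heapq.heappush(self._heap, (arrival_s + self.delay_s, self._seq, event))
        self._seq += 1

    def pop_due(self, now_s):
        # Release every event whose deadline has passed, oldest first.
        due = []
        while self._heap and self._heap[0][0] <= now_s:
            due.append(heapq.heappop(self._heap)[2])
        return due

merger = JitterBufferMerger(buffer_ms=15.0)
merger.push(("note_on", 60), arrival_s=0.000)
merger.push(("note_on", 64), arrival_s=0.004)
merger.pop_due(0.010)  # nothing due yet
merger.pop_due(0.016)  # releases the first event
```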
Unfortunately, because of differences in instrument attack, which a MIDI controller has almost no knowledge of, I don't think random jitter alone would fix the issue.
An interrupt controller has no knowledge of device semantics; it can just prioritize them based on a simple priority value. The scheduler could do the same thing. It could even be configuration-free by using some convention, like lower instrument numbers have higher priority.
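The configuration-free convention could look something like this. A hypothetical sketch, assuming "instrument number" maps to the MIDI channel and that lower numbers win when events contend for the same output slot:

```python
# Sketch of convention-based priority: when several events arrive in the
# same scheduling window, drain them lowest-channel-first, the way an
# interrupt controller drains pending requests by priority value.
# The function name and event tuples are illustrative, not a real API.
def drain_by_priority(pending):
    """pending: list of (channel, event) pairs that arrived together.
    Returns the events with lower channel numbers sent first; ties
    keep their original arrival order (sorted() is stable)."""
    return [event for _, event in
            sorted(pending, key=lambda pair: pair[0])]

drain_by_priority([(9, "drums"), (0, "piano"), (3, "bass")])
# lower channel number goes out first
```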
Also, the physical layer of MIDI could simply be extended into higher baud rates while all else stays the same.
In the last 15 years, I can't remember using a serial line to an embedded system that wasn't pegged at 115 kbps. Bit rate is a relatively trivial parameter in serial communication; it doesn't give rise to a full-blown different protocol.
115 kbps is almost four times faster than MIDI's 31.25 kbps. Plain serial communication can go even faster. The current-loop style signaling in MIDI is robust against noise and good for distance. 400 kbps MIDI seems quite realistic.
This would just be used for multiplexed traffic, like sequencer to synth; no need for it from an individual instrument or controller.
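The back-of-envelope arithmetic behind those numbers: a MIDI byte costs 10 bits on the wire (start bit, 8 data bits, stop bit), and a typical Note On message is 3 bytes. A quick sketch of the per-message wire time at each rate (function name is made up for illustration):

```python
def message_time_ms(baud, message_bytes=3, bits_per_byte=10):
    """Wire time of one MIDI message at a given baud rate, in ms.
    Assumes the standard 8-N-1 framing: 10 bits per byte."""
    return message_bytes * bits_per_byte / baud * 1000

message_time_ms(31_250)   # classic MIDI: 0.96 ms per Note On
message_time_ms(115_200)  # common embedded UART rate: ~0.26 ms
message_time_ms(400_000)  # hypothetical fast MIDI: 0.075 ms
```

So at 400 kbps, a dozen multiplexed Note Ons would clear the cable in under a millisecond, versus roughly 12 ms at the classic rate.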