How will MIDI 2.0 change music? (2020) (qz.com)
179 points by kelnos on Oct 30, 2021 | 161 comments



Now, I agree that MIDI 1.0 was designed for keyboard based synthesizers, and I know that complaints about MIDI latency have been around about as long as MIDI has been. So, yes, it has always been imperfect.

However, the nice thing about MIDI is that it allows me to hook up my 2020/2021 synthesizer (a Behringer 2600) to a 1987 sequencer (a Yamaha QX5), a 2015 synth (a Roland JD-Xi), or pretty much any piece of pro synth gear made since 1983.

Now, with MIDI-over-USB and the like, we lose something which makes MIDI very nice: we can connect a hardware keyboard to a dedicated sound module without having to have a computer involved. While there are USB host synths which one can connect a MIDI-only-over-USB keyboard to directly, they are few and far between (my JD-Xi has USB, as does my 2600 and Arturia Keylab controller, but all three can only be hooked up to a computer over USB).

Until MIDI 2.0 has a way to allow synths to be connected to each other without involving computers, and there is a standard way to do that (I think USB-C would probably be best for this), DIN MIDI 1.0 will be a part of pro and bedroom recording studios.


> without having to have a computer involved

I shudder to think of a future where musical instruments require you to download an app to use them; where not only are your stove and washing machine phoning home, but your piano as well. No thanks.


To be fair, the computer in a USB MIDI setup where it's just connecting a physical controller to a physical synth is basically just forwarding MIDI messages verbatim from one port to another. You can buy MIDI host devices that really don't do anything at all except allow you to connect two USB devices and have MIDI work. (They tend to be expensive, and they're kind of annoying to have to deal with.)

This isn't a proprietary lock-in thing, it's just an unfortunate limitation of USB. Though I suppose a hardware manufacturer could release a product that only works with some special app if they wanted to go that route. (Some devices have settings that can only be accessed via a computer app.)
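To make concrete just how little such a host box does, here's a sketch of the forwarding loop in Python, with in-memory streams standing in for the real USB endpoints (which would need platform-specific APIs):

```python
import io

def forward_midi(src, dst, bufsize=64):
    """Copy MIDI bytes verbatim from one stream to another.

    This is essentially all a standalone USB MIDI host box does:
    no parsing, no interpretation, just moving bytes between ports.
    """
    while True:
        chunk = src.read(bufsize)
        if not chunk:
            break
        dst.write(chunk)

# Demo with in-memory streams in place of USB endpoints:
note_on = bytes([0x90, 60, 100])   # note-on, middle C, velocity 100
src = io.BytesIO(note_on)
dst = io.BytesIO()
forward_midi(src, dst)
assert dst.getvalue() == note_on   # bytes arrive unmodified
```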


Maybe. But it worries me whenever anything moves from an "embedded chip" mindset to a "let's add a touchscreen"[0] mindset. Some mid-level exec is going to look at that cheap but beefy ARM SoC they are using, and get ideas. The companies making digital pianos are not immune to the disease that causes online advertising to show up in refrigerator firmware.

[0] https://www.youtube.com/watch?v=G2PMzSo1Bss


You can sign up for the Pro edition piano for $9.99/mo, but you're going to have to call the sales team if you're in a touring band and need the Pro Enterprise edition.


Meanwhile, the fly-by-wire keys, which are prone to wear out and need an expensive service call (not included in price), introduce random timing jitter in the standard edition. Attempts to correct this jitter, either by performing surgery on your own device or by correcting it in software, are a violation punishable by jail time as specified in DMCA v2.

Attempting to even do the repair yourself will be a violation of the terms and conditions of your subscription and will void the ability for you or anyone else to register, and therefore use, your Piano. If you know or suspect someone of violating these terms, please turn them in for a free $100 gift card good for a fine assortment of musical instruments.

If we suspect you have violated these terms you have agreed to give us an opportunity to inspect and if needed disable your equipment. For an additional $200 we will haul off your now useless instrument.


How much extra for the black keys?


Starting from 5 cents per stroke, depending on velocity and pitch. Just make sure you have internet connectivity at all times, otherwise your contract with each of the keys cannot be checked.


Who says it involves "phoning home"? You can use offline apps, and turn off your internet if you want. I happen to like browser based MIDI apps, but then again, I write them, so if it is "phoning home" (which it usually isn't unless I want it to) it is often to localhost. :)

Your comment could be applied to typewriters. Do you prefer the days when, if you wanted to write a book or article or something, you used a typewriter that was completely standalone? I kinda like having my "typewriter-style keyboard" connected to a computer... it opens up a ridiculous number of possibilities that you can't do with a standalone machine without a big display and a fast processor and a big hard drive. And maybe internet.... you can't exactly have Hacker News with a standalone typewriter, but you can with a keyboard that connects to a computer.

Same with a "piano-style keyboard."


"Who says it involves "phoning home"? You can use offline apps, and turn off your internet if you want."

Those are getting fewer and farther between. The new trend is that you have to accept a 40-page license agreement to rent a bike, your fancy pet feeder doesn't feed your cat when their servers are offline, your Samsung TV adds ads that you cannot turn off, etc.

You own and control nothing despite paying thousands of dollars for it.


I’m sure that there still are some offerings that don’t do that. Vote with your wallet.


I am not aware of any smart TVs you can actually control, and that are not user hostile.

I think "Vote with your wallet" is just an excuse to sit around and do nothing. It did not stop companies spying on you, selling your data, electronics becoming unrepairable, schematics disappearing, etc. Vote with your actual vote, we are meant to be a democracy first, capitalism second.


You can buy a dumb TV instead, or watch something else (Web, DVDs etc.). You can read books, play a musical instrument, or go out. There are lots of forms of entertainment today, many of them don’t track or spy on you.


I'm more referring to the trend in "smart" consumer electronics to violate privacy. For instance: https://news.ycombinator.com/item?id=21657930 https://news.ycombinator.com/item?id=20206536 https://www.nbcnews.com/better/lifestyle/downside-connected-...

And it's not practical for your average consumer (or musician) to try to block it, or even detect it. Also, there are fewer and fewer options to buy "dumb" devices and appliances that won't spend their operating lives trying to exfiltrate data about you back to their manufacturers and their corporate partners.


I will shill for the retrokits rk006 all day long (I am not involved and do not benefit from the project).

https://retrokits.com/shop/rk006/

It's a MIDI USB host and multi-port thru-box that even boosts succulent growth while you make DAW-less jams.

It's particularly good for bridging USB and DIN MIDI.


This is why complaining about MIDI over USB is nonsense. Anyone with more than three or four MIDI 1.0 devices has always needed at least a simple hub - and possibly a full patch bay, perhaps with processing of key splits, patch mappings, velocity ranges, and so on - all those things you'd use a computer for today.

The MIDI 1.0 benefit of connecting keyboard-to-keyboard really didn't last long in practice. Of course you can always do it in theory, but most practical applications at all levels need those extra features.


> even boosts succulent growth

Go on...


Agreed- though for MIDI 1.0 Kenton has some helpful boxes that act as a USB MIDI host to DIN MIDI converter.


The converters are great & easy.

This protest that we must conserve an over-30-year-old standard and never ever update it is a breed of conservatism that makes me shudder. Basic USB is available on a vast range of micros, including a huge range of under-$1 ones. If you do want to use old gear, great, there are options.


We need more in the way of peer-to-peer and/or switchable USB (though I'm not quite sure how charging/power switching should work.) I remember reading about how to get USB to work peer-to-peer (e.g. like a simple bidirectional serial or ethernet cable) but practically it seems very hard.

Perhaps USB-C can do it as you suggest?


Replying to a past comment (https://news.ycombinator.com/item?id=22188146):

> - Violins come last, and it doesn't really matter because they're lazy and they'll take a few hundred milliseconds to really kick in anyway.

I think the assumption that violins don't need strong staccato/marcato attacks is wrong. I've needed it in the past, and was unable to get it from soundfonts which only have slow-attack samples. This issue is omnipresent in off-the-shelf soundfonts, and I've also sometimes seen poor articulation in music I've heard made using commercial sound libraries (which I've never purchased or pirated so I can't judge myself).

As an example of a piece suffering from slow attacks, https://www.youtube.com/watch?v=ua5VJt6jkaU is a nice remix, but feels sloppy because the flute, strings, and violin's 32nd notes are barely audible (goes from attack to release without reaching full volume) and smeared together.


Yeah, articulation should be standardized in the MIDI specification. It's always a custom mapping to control it, even when proper articulations are available. "Oh cool, custom mapping?!", you might say. Wrong. It sucks, because it tends to make old scores that utilize articulation unusable, since the mapping was stored in some obscure plugin configurations divided amongst many plugins.

It would be great if articulation messages were standardized, and then you could just expect them to be there, and if they are absent, the soundfont can simply fall back to the closest thing available. That kind of helpful logic is not possible with custom mappings.


Has been tried. Check Spitfire's UACC attempt. There are way too many articulations and ways of naming them to get a complete list, and even if there were such a list, you'd be asking yourself "hmm, this says 'Long Con Sordino Sul Pont', but this soundfont doesn't have that. What do I replace it with?"

Bear in mind that not all sample libraries have articulations. Some of the more modern libraries and plugins include a modelling component, or are pure modelling, which allows you to use controllers to get those articulations in a much more flexible way.


Off-the-shelf .sf2 files containing full-GM instrument banks simply don't have sharp attack samples. I hope they will start having them, but I don't know what it takes. Are single-instruments (or .sfz files) more likely to have them? Adding support in soundfont players for selectable articulation? MIDI 2? "Just buy a commercial plugin?"


SF2 ... forget it. SFZ can do this sort of thing already, as can Kontakt format and also Decent Sampler.

SF2 is dead. It was never capable enough as a format to deal with what really needed to be dealt with. It is a quintessentially dated technology, and unless your requirements for sample libraries are very limited ("make a gong-type sound when I hit middle C"), SF2 is not for you.


To my knowledge there are no GM SFZ instrument packs which contain enough instruments to play any MIDI file out there (https://github.com/sfzinstruments/Discord-SFZ-GM-Bank has like 10 melodic instruments so far). So I have to pick instruments from the ground up when creating a DAW project (and I don't have a good collection yet), rather than using a unified full-GM bank as a starting point. And don't even think about downloading MIDI files (e.g. Doom's) and playing them in a soundfont of your choice.

What's the spiritual successor to General MIDI? Or is the concept no longer viable for (or even wanted by) mainstream electronic, or even semi-orchestral, composers?


I would say that there is no longer a viable GM-like concept.

Take a look at the libraries released by Spitfire Audio. You won't find anything remotely like a GM-style collection.

There's a buzz word of the day - "cinematic" - that has been defining new sample library releases for a while now. They are definitely not focused on "here's a sort of complete set of sounds corresponding to traditional instruments".

People generally want more unique sounds. Maybe a GM bank is a good starting point for people who will work with fairly traditional instrumentation (even they eventually switch away from the GM bank itself). But it's not much use for people wanting more distinctive timbres and textures, and for "cinematic" work (read: games, films, soundscapes), that's normally the goal. It's also where, for now, the money is.


The most "GM-like" experience I've encountered in a modern commercial product is the Kontakt factory library. It has almost complete General MIDI 1 coverage, plus a handful of random additions (a dozen electric pianos, samples recorded from some vintage synths, an SATB choir singing six different vowels, several drum kits...)

The quality is good, but it's not in the same league as specialised libraries. There are usually only two or three velocity layers (for example, the choir can only sing a subdued "aaa" or an intense "AAA"). The orchestral instruments only come with a few articulations, and the guitars and pop brass only provide one articulation each. None of the instruments have sampled legato.

Unfortunately, I think this is probably the quality ceiling for a sample library with good General MIDI coverage. If you were to add more velocity layers, articulations and legato transitions, the number of samples would start to grow exponentially, and the cost of the library would grow to match. I'm not sure there's any way around it.

I went through a phase where I was trying to put together a really comprehensive collection of samples, but in hindsight it was a fool's errand - you'll get diminishing returns fast, unless you're willing to spend huge amounts of money. Accurately emulating a live performer is a nice option to have, but it isn't actually a requirement for making good music. Huge amounts of excellent music has been made using synthesiser presets, or using instrument samples which would be considered "cheap" or "low-quality" nowadays. As long as you're willing to adapt your compositions to the instruments you have available, rather than stubbornly writing string melodies which could only be performed by a real-life violinist, you'll do well.


> As long as you're willing to adapt your compositions to the instruments you have available

I've seen SNES and DS games whose sample sets handle staccato and sharp attacks better than the average soundfont out there. The cover I posted is worse than the original DS game in this aspect. I assume the Kontakt factory library is competent enough to get this right though.

I assume sample packs can't handle wide pitch bends and precise vibrato that well either. Synthesis is probably better, but the pricing I've seen is insane (but probably justified considering the complexity of simulating acoustic instruments).


This is quite a good example of the hard limitations of high-quality sampling. If you composed a violin melody line which absolutely needed a fast attack, Kontakt (a £359 library) would give you:

- A "sustain" articulation with fast attack on the loudest dynamic (which also happens to be molto vibrato), but a slower attack on quieter dynamics

- An "attack" dial which can emulate slower attacks, but not faster ones

- A "staccato" articulation which has an instant attack, but a duration of about 0.5 seconds

- No legato, so even if you get the attack right, fast runs and ornaments are going to sound a little strange

I've encountered similar limitations with dedicated orchestral libraries. If it's not the attack, it's the expression, or the maximum note duration, or the speed of an ornament, or the timbre of the specific instrument they recorded, or something else which always dangles perfection just out of your reach.

I suspect you'd have similar problems with even the most expensive sample libraries; it's one more reason to opt out of that game altogether. If you lower your standards for audio fidelity, your musical freedom increases. When you're composing for the Nintendo DS, nobody's going to notice if you manually tweak the attack of your string sample, or fake an ornament using pitch-bend, or crossfade between two velocity layers.

SWAM is one possible escape route, but you're correct that it costs a fortune. It also has high CPU requirements, so you might need to bounce your SWAM parts to an audio track, and you'd have limited ability to put together seven SWAM violins and call it a "violin section".


There are a lot of claims in this article about MIDI 1.0 that are misleading at best; it makes MIDI sound more limited and crude than it is. Still, it's indeed very nice to see some improvements here, especially since some manufacturers were transitioning to proprietary communication protocols over HID.


Some past related threads:

MIDI 2.0 Specifications Available for Download - https://news.ycombinator.com/item?id=22411765 - Feb 2020 (35 comments)

MIDI 2.0, first major overhaul to music interface standard from 1983 - https://news.ycombinator.com/item?id=22180731 - Jan 2020 (105 comments)

Details about MIDI 2.0 - https://news.ycombinator.com/item?id=20007638 - May 2019 (147 comments)

MIDI 2.0 Prototyping announced - https://news.ycombinator.com/item?id=18946222 - Jan 2019 (195 comments)


I just wanted to say, the next and prev links on comments are a brilliant feature. Thanks!


And the "root" link. But since they are plain links they pollute the History stack, which is a bit annoying.


We're thinking about handling them in JS to avoid that. But other users want it the other way. Might have to do a poll.


MIDI 2.0 has USB as a pre-req. I'm not sure how this improves things. One of my chief complaints, and why I moved away from USB to an all-MIDI setup, is how interrupts are handled, making USB sometimes unreliable, especially with large setups like mine. MIDI just works and is pretty easy to implement. I've never experienced a real limitation. Resolution has its workarounds but honestly isn't all that limiting.

What does 2.0 offer me that 1.0 doesn’t? I know it has new features, but I want a good argument for those features actually being useful.


MIDI 2.0 is transport-agnostic, so it doesn't necessarily require USB; however, USB is going to be the best solution for connecting many MIDI devices for a while.

The big underlying change is that MIDI 2 is duplex, which allows for device discovery, property exchange, and profile configuration. What that buys you is an arbitrary protocol for exchanging information about what things are connected to each other on the same network of devices. It should allow for a lot of cool features, like supplanting Mackie Control and Eucon with an open (or at least trivially open) protocol. One thing I think we'll see with these is more support for microtonal or arbitrary pitch systems that MIDI currently can't realize.

Jitter compensation is built into the protocol too so it should resolve some of the timing issues with USB.

Having read through the spec and written the data structures out for it already I'm not sold that it's overly complex. The additional annoyance is that it mandates JSON parsing but that isn't a tall order these days.
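For a feel of how much more regular MIDI 2 packets are, here is a sketch of packing a MIDI 2.0 note-on as a 64-bit Universal MIDI Packet, based on my reading of the UMP spec (message type 0x4 for 64-bit channel voice messages, opcode 0x9 for note-on) - treat the exact bit layout as an assumption to verify against the spec:

```python
def ump_note_on(group, channel, note, velocity16, attr_type=0, attr_data=0):
    """Pack a MIDI 2.0 channel-voice note-on as two 32-bit UMP words.

    Message type 0x4 = 64-bit MIDI 2.0 channel voice message;
    opcode 0x9 = note-on. Velocity is 16-bit (vs 7-bit in MIDI 1.0).
    """
    word0 = ((0x4 << 28) | ((group & 0xF) << 24) | (0x9 << 20)
             | ((channel & 0xF) << 16) | ((note & 0x7F) << 8)
             | (attr_type & 0xFF))
    word1 = ((velocity16 & 0xFFFF) << 16) | (attr_data & 0xFFFF)
    return word0, word1

# Middle C at full 16-bit velocity on group 0, channel 0:
w0, w1 = ump_note_on(group=0, channel=0, note=60, velocity16=0xFFFF)
assert (w0, w1) == (0x40903C00, 0xFFFF0000)
```

Fixed-size 32-bit words are a big part of why no state machine is needed past the handshake: the first nibble tells you the packet size up front.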


Mackie Control is open already (even if Mackie didn't really intend that to be the case).

Eucon is a perfect example of how to make a control protocol so powerful that nobody can ever really use it, or at least, not more than 5% of its capabilities.

Mackie and Eucon do not provide music performance data, so they would never be involved in pitch control. MTS can already be used to do microtonal stuff, but very few synthesizers are MTS aware, so the fact that you can do it already with MIDI 1.0 doesn't help a whole lot. Maybe 2.0 will encourage (a few?) more synth makers (soft/hard) to support MTS and/or the new stuff, but maybe it won't.


What would you have recommended instead of JSON?


It's weird to put variable-length text-based serialization formats into something that has to perform in realtime no matter what, on potentially limited hardware.


I think JSON makes a lot of sense. It's just orders of magnitude more complex to parse into useful data structures than MIDI1, or any other part of MIDI2.

Since MIDI is already going the opposite direction of say AES67 and reinventing wheels is ok, a custom binary encoding would have made more sense to me. Something that can be encoded in a C89 struct without a ton of trouble or doesn't need much thought put into an allocator on bare metal devices to get the handshake working.

I believe it's not strictly JSON though, I don't have the docs in front of me but iirc strings have a max length. So that's good.


JSON feels like a terrible choice to me, especially on the latency aspect of things.


msgpack comes to mind...


I'm in your camp on this one. The new features are neat, but MIDI 1.0 being stupid-simple is its killer feature to me. Plus everything from the last 30yrs supporting it is a big plus. I love that I can take my 1980s SuperJX and fit it right into my setup with brand new stuff, and a brand new DAW, and everything just works.


I don't know if I would call MIDI 1.0 simple in practice. It has a lot of really confusing idiosyncrasies, it's definitely got that feel of being a simple base that was organically developed and had tons of things piled into it over the years. The packet structure in MIDI 2.0 is a lot more... consistent, I guess is what you could say? I don't think you even need a state machine to implement it beyond the initial handshake.


Aside from increased resolution, one of the biggest improvements is MPE. It allows controlling parameters on a per-note basis. A side effect of this is making microtonal music significantly easier to express, since you no longer need multiple channels to do per-note pitch-bending. MPE support is already probably more robust than MTS (MIDI Tuning Standard,) although I could be wrong on that.

Essentially: for traditional western music, it seems to just offer better flexibility and precision, but it will make a huge difference for microtonal music. Honestly, being able to adjust parameters per-note simply makes sense.

(I am not a musician, just a programmer who has dabbled in a small amount of music software and protocols.)
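As a sketch of why per-note bend makes microtonality easier: once each note can carry its own bend, hitting an arbitrary frequency is just arithmetic. This assumes the MPE-style default bend range of ±48 semitones; the function name is mine:

```python
import math

def note_and_bend_for_freq(freq_hz, bend_range=48.0):
    """Map an arbitrary frequency to (MIDI note, 14-bit pitch-bend value).

    bend_range=48 is the MPE default for member channels;
    8192 is the 14-bit centre value (no bend).
    """
    midi = 69 + 12 * math.log2(freq_hz / 440.0)   # A440 = note 69
    note = round(midi)
    bend = 8192 + round((midi - note) / bend_range * 8192)
    return note, bend

# A just-intonation major third above A440 (ratio 5/4 = 550 Hz):
note, bend = note_and_bend_for_freq(550.0)
assert note == 73        # nearest equal-tempered note: C#5
assert bend < 8192       # just third is ~14 cents flat of ET, so bend down
```

Without per-note bend, that corrective bend would drag every other note on the channel along with it - hence the one-note-per-channel workaround.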


MPE is not a MIDI 2.0 improvement.

MPE is already implemented with MIDI 1.0.

The only limitation of using channels for MPE is that you can't daisy-chain MPE devices... Which nobody really does anyway (most of my gear doesn't even have a THRU port).


MPE also enables increased expressiveness that new kinds of multi-dimensional controllers such as the ROLI keyboards leverage (new kinds of MPE controllers are shown at https://www.midi.org/midi-articles/midi-polyphonic-expressio...). While MPE is part of MIDI 1.0, many synth manufacturers never bothered to support it since their MIDI implementations were basically considered a "solved" problem by product management. Many of them never even bothered to support polyphonic after-touch except in their high-end keyboards.

Now with MIDI 2.0 support becoming a 'checkbox' item for customers, all the manufacturers will need to revisit their MIDI implementations in upcoming product releases (some already have). I suspect many will use this as an opportunity to increase input dimensions and expressiveness - which is a very good thing for all of us users, even if they do so in ways which may have technically been possible before.


MIDI 1.0: 127 steps

MIDI 2.0: 4,294,967,296 steps

https://www.midi.org/midi-articles/details-about-midi-2-0-mi...


Number big, protocol better!

Ever heard of NRPN, my friend?

There's a reason why this part of the MIDI spec is rarely used. And the reason is that the musicians don't need it.

If you can hear stepping on your pitch bend or filter, it's not because MIDI, it's because the implementation on your device is stupid.


If you're using only one 7-bit CC (and not an MSB/LSB pair) to control a filter cutoff parameter that can range from 20Hz to 20kHz, it's not possible to dial in the exact frequency you want to match a particular harmonic.


Yup. MIDI 2: solving problems musicians don't have and creating new problems! (Excessive complexity -- MIDI 1 was hard enough to implement. MIDI 2 is telling everyone to jump 10ft high -- good luck with that.)


I agree that many of the things in MIDI 2.0 don't seem directly relevant to the bread-and-butter musicians, composers, mixers, producers and DJs who are the ultimate users. The numeric bumps in resolution, data and packets are all conceptually fine but it's not like they are addressing the standards gaps most holding back users and forward innovation.

Personally, I was disappointed they didn't even make a stab toward addressing higher layers in the stack, like standardizing multi-channel mixing control beyond the ancient 'de facto' Mackie HUI format, or some attempt at a framework for software plugins running on hosts to display a GUI on controllers (even something very open/extensible à la HTML/CSS). That could have unleashed waves of innovation from indie and smaller developers.


Maybe I'm old-fashioned, but keep HTML/CSS the heck out of my music instruments, please and thank you.


The suggestion was for a GUI running on the host, not on the instrument...


> MIDI 1 was hard enough to implement

MIDI 1 is one of the easiest protocols I've ever interfaced with myself. Getting started on any platform is easy, the API is not that large, and it's relatively straightforward. On Linux you can write bytes directly to /dev/midi and get up and running really quickly, and the specification is not that big. See https://ccrma.stanford.edu/~craig/articles/linuxmidi/ for some examples.

What did you find hard in implementing MIDI 1? Would be nice to hear the opinion for something I haven't heard in real life (yet?).
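For the "just write bytes" point above, a minimal Python sketch - the message construction is straight from the MIDI 1.0 spec, and the write to /dev/midi is guarded since the device only exists on a Linux box with MIDI hardware attached:

```python
import os
import time

def note_on(channel, note, velocity):
    """Status byte 0x9n (note-on), then two 7-bit data bytes."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(channel, note):
    """Status byte 0x8n (note-off); release velocity 0."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

msg = note_on(0, 60, 100)          # middle C on channel 1, velocity 100
assert msg == b"\x90\x3c\x64"

# On Linux with MIDI hardware, the raw device takes these bytes directly:
if os.path.exists("/dev/midi"):
    with open("/dev/midi", "wb") as midi:
        midi.write(note_on(0, 60, 100))
        time.sleep(0.5)
        midi.write(note_off(0, 60))
```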


Parsing running status + sysex + system RT messages is non-trivial.

A full state machine for MIDI is much more complex than it tends to initially appear. But yeah, it's not rocket science. It's not even as hard as grappling with std::move()
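A sketch of those interacting rules in Python (sysex payload handling is stubbed out; system common bytes just cancel running status, which is what the spec requires of them):

```python
# Data-byte count per channel-message opcode (high nibble of status byte):
STATUS_LEN = {0x8: 2, 0x9: 2, 0xA: 2, 0xB: 2, 0xC: 1, 0xD: 1, 0xE: 2}

def parse_midi(stream):
    """Parse channel messages with running status and interleaved
    real-time bytes. Sysex payloads are dropped in this sketch."""
    msgs, status, data = [], None, []
    for b in stream:
        if b >= 0xF8:                 # system real-time: legal *anywhere*,
            msgs.append((b,))         # even between a message's data bytes
            continue
        if b & 0x80:                  # a status byte
            status = None if b >= 0xF0 else b   # system common/sysex
            data = []                           # cancels running status
            continue
        if status is None:
            continue                  # stray data byte: drop it
        data.append(b)
        if len(data) == STATUS_LEN[status >> 4]:
            msgs.append((status, *data))
            data = []                 # keep status: that's running status

    return msgs

# Two note-ons sharing one status byte, with a clock byte (0xF8)
# arriving in the middle of the first message's data bytes:
out = parse_midi([0x90, 60, 0xF8, 100, 64, 100])
assert out == [(0xF8,), (0x90, 60, 100), (0x90, 64, 100)]
```

Even this toy version needs three pieces of state; add sysex buffering and it grows quickly, which is exactly the "more complex than it initially appears" point.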


Maybe the core specification isn't that big a deal if you're just trying to get started writing some note on/off messages, but the full set of specifications with all the controllers and sysex messages and extensions and supplements is pretty long: https://www.midi.org/specifications-old/item/the-midi-1-0-sp...

And you can add in that vendors also have weird proprietary sysex messages a lot of the time. Also I second a sibling comment here, there are a decent number of gotchas with the state machine.


> If you can hear stepping on your pitch bend or filter, it's not because MIDI, it's because the implementation on your device is stupid.

Or because you are stepping across 5 octaves instead of just one.


I have not heard of NRPN until now. Thanks for the pointer. Adding frequency support for any frequency and adding whole-number ratio harmonizing would have been preferable to me personally.


On this matter I must say that for playing music live, 127 steps provides a great deal of resolution. It's good enough for me that I have never felt my EWI's breath sensor, my piano's touch sensor, or a CC knob turned slowly was not fine enough. If I listen closely to an isolated note I can tell there is a difference in loudness between two neighboring velocity steps, but I probably couldn't tell in a live performance setting, and I would challenge listeners to see if they could tell the difference.

I would say 127 steps is "good enough", or almost is, for practical purposes. But that is just my data point and opinion. (For comparison, the most common video format is YUV420, which averages to 12 bits per pixel, and most monitors use 8 bits per color.) Had MIDI used the top bit to flag an optional extra resolution byte, that would have been great.
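A back-of-the-envelope check on that "could tell on an isolated note" intuition, assuming a hypothetical but common power-law velocity-to-amplitude curve (amp = (v/127)^2; real instruments use their own curves):

```python
import math

def step_db(v, power=2.0):
    """dB jump between adjacent velocities v and v+1, assuming the
    (hypothetical) power-law curve amp = (v / 127) ** power."""
    return 20.0 * power * math.log10((v + 1) / v)

mid = step_db(64)    # around mezzo-forte
low = step_db(8)     # pianissimo
assert mid < 0.3     # ~0.27 dB: hard to hear in context
assert low > 1.9     # ~2.0 dB: plausibly audible on an exposed quiet note
```

Which matches the experience above: mid-range steps are effectively continuous, and it's the very quiet end where 7-bit velocity gets coarse.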


I trained as a classical pianist before moving to the world of synthesizers ...and rock 'n' roll.

The very few electric pianos I've played that didn't feel painfully limiting in dynamic expression were high-end instruments with a whole lot more than 127 velocity values.

Similarly, I have the obscure but magnificent Nord G2X, which mostly uses MIDI for mapping physical knobs to parameters in patches, but its mod wheels are much higher resolution.

I strongly prefer the mod wheels for params that cover a wide span, like filter cutoff frequency, because I can feel and hear the stair-stepping when using knobs.


It won't. It will make certain small elements of music-making a little easier, but there's nothing in MIDI 2.0 that will fundamentally change what we can or cannot do in a way that will be noticeable in music in a broader sense. Compared to something like the advent of cheap digital recording, VST plugins, or fast computers, MIDI 2.0 is a drop in the bucket.

That said, I'm definitely looking forward to having it reduce the number of hacky workarounds we currently deal with to get high resolution data in convenient forms. Honestly to anyone not deep into the weeds on this stuff it will be meaningless.


Per-note pitch bend is a big deal, and it could impact the kinds of music people use MIDI for. Right now, it's pretty awkward to use MIDI with controllers that aren't piano-like or to play music that isn't in twelve-tone equal temperament.

That said, I'm kind of pessimistic about MIDI 2.0's prospects. Maybe it'll be adopted by most of the manufacturers, but it's not looking like anyone is in any great hurry to start making compatible hardware.

There are also some problems that MIDI 2.0 just doesn't address (as far as I know), like the limitation of 128 notes per channel, or the difficulty of expressing a volume swell in a way that a synthesizer will play in a predictable way.


> Per-note pitch bend is a big deal, and it could impact the kinds of music people use MIDI for.

MPE is a thing as well. As the owner of a Linnstrument, I get 10-finger polyphonic pitch bend over a standard MIDI connector to all synths that understand MPE (which are sadly not many, but hey).


You're not wrong, I just think that those who are going to explore this, but who don't already work around the limitations, are a small enough group that it's not going to be any big fundamental change. I'm personally looking forward to it, because I do that kind of thing, and right now have to use hacky solutions like tons of MIDI buses and running one note per MIDI channel to get per-note bend.

Don't get me wrong, an improvement on MIDI is long (like decades long) overdue, and I'm looking forward to it. It's just not the sea change the hyperbolic article claims, not even close.


I think it's hard to tell what the demand for microtonality would be if it were easy to do rather than hard. Right now if you want to do non-12-EDO music, there are roadblocks everywhere. Instruments are practically non-existent, aside from continuous-pitch instruments like violin or trombone. Standard notation works against you. One would think that electronic music would be the one place where microtonal music would be easy, but no.

I don't really expect mainstream music to suddenly start making extensive use of seven-limit just intonation just because MIDI 2.0 has per-note pitch bend, but for the people who care about it at least we could see a lot more interest in expressive instruments and non-12-EDO musical performance. Even if it's only 5% of musicians who care about any of this stuff, that's still a lot of people.


What would be an application for more than 128 notes per channel?


Scales other than 12-tone equal temperament, mostly.

Awhile ago, I built a keyboard instrument with 156 pressure-sensitive keys. It's 5 octaves, 31 notes per octave, based on a just-intonation tuning system. Basic sound demo here: https://www.youtube.com/watch?v=Ep52Vh6oAOE

The Lumatone has 280 keys. https://www.lumatone.io/

The Tonal Plexus H-Pi is basically a big array of buttons, arranged in 205-tone equal temperament across 6 octaves. That's over 1200 distinct notes. There aren't many of these in existence, but a friend of mine has one. https://hpi.zentral.zone/tonalplexus

For a 12-TET example, one could represent a 6-string guitar with 24 frets as having (24+1) * 6 distinct notes, or 150. Even that is over the limit.

It's possible to work around the limited number of notes by using the one-note-per-channel trick (formalized as MPE); then you're limited to 16 notes of simultaneous polyphony because that's how many channels MIDI has, but you can pitch-bend to exactly the note you want. But then you're using MIDI in a very different way than it's usually intended, and it's hard to do that without making the user experience worse in various ways.

MIDI 2.0 could have expanded the number of notes so that (with per-note pitch bend), no one would ever have to use the note-per-channel trick again, but that's not the choice they made.
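As a sketch of the trick described above (function name is mine, and it assumes the common ±2 semitone bend range): pick the nearest 12-TET note, then compute the 14-bit bend value needed to land exactly on an arbitrary target frequency.

```python
import math

def freq_to_note_and_bend(freq, bend_range=2.0):
    """Approximate a frequency as the nearest 12-TET MIDI note plus a
    14-bit pitch-bend offset (8192 = center, no bend)."""
    midi = 69 + 12 * math.log2(freq / 440.0)   # fractional note number
    note = round(midi)
    semis = midi - note                        # residual, in semitones
    bend = 8192 + round(semis / bend_range * 8192)
    return note, max(0, min(16383, bend))
```

Each such note then needs its own channel so its bend doesn't drag other sounding notes along, which is exactly the awkwardness being described.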


Yeah I mean the feature that matters most is high resolution MIDI CC so we can finally avoid stepping. It’s great, but long overdue and not really a game changer.


Exactly. It means we can stop using so many midi busses to work around the fact that we get one 14 bit per cc per channel right now (pitch bend), that kind of thing.


Exactly. Though with MPE we may see more controllers that support polyphonic expression. Personally I'm happy that ASM is making keyboards with polyphonic aftertouch.


You hit the nail 100% on the head there. I couldn't agree more with what you said, hence the comment and not just an upvote.


Of all technologies, I believe MIDI has been the most plug and play experience for me. Plug a midi cable in, boot up a sequencer and you are good to go. Really interesting to see what can be done with MIDI 2.0


I was trying to write a program using midi last year and was surprised by the lack of any libraries to work with it. Then I looked up the spec and was able to get all the functionality I needed with 10 lines of code interacting with /dev/midi. Incredible how powerful such a simple interface is.
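For the curious, a minimal sketch of that kind of raw /dev/midi interaction (the device path varies by system, e.g. /dev/midi1 or /dev/snd/midiC0D0, and the helper names are mine): sending a note is just writing three bytes.

```python
def note_on(channel, note, velocity):
    """Build a 3-byte Note On message: status 0x90 | channel, then two
    7-bit data bytes (note number, velocity)."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(channel, note, velocity=0):
    """Build a 3-byte Note Off message: status 0x80 | channel."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# Sending is just writing the bytes; no handshake, no framing:
# with open("/dev/midi", "wb", buffering=0) as port:
#     port.write(note_on(0, 60, 100))   # middle C down
#     port.write(note_off(0, 60))       # middle C up
```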


MIDI can plug-and-play so easily because there is no handshaking. Every MIDI-capable device has already agreed on the 31,250 baud serial rate, the 8 data bits + 1 stop bit format, and the commands.

The only occasional problem I'll experience with MIDI is if I unplug before sending a Note Off for a note; then that note might be stuck on forever.
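A sketch of the usual software remedy (assuming the receiver honors CC 123, which not all old gear does): send "All Notes Off" on every channel once communication is restored.

```python
def all_notes_off():
    """CC 123 ("All Notes Off") with value 0 on all 16 channels.
    Receivers that ignore CC 123 need a brute-force Note Off for
    every note number instead."""
    return b"".join(bytes([0xB0 | ch, 123, 0]) for ch in range(16))
```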


And even that is not really a protocol problem. It's trivial for a device to implement "reset-on-disconnect".


The problem is that there is no such thing as a disconnect signal, so a device can't know that a cable has been pulled or that something in between has been powered off. If that happens, the device is left hanging with a note on, and one must find a way to send the relevant Note Off (or the All Notes Off message) once communication is restored. Adding such controls (beyond simply reading the current-loop level, which doesn't solve all problems) would require one more layer, which would translate into more complexity and more latency.

If I had to implement such things on top of an improved protocol, I would likely also ditch the serial link, ignore USB completely, and jump straight to Ethernet, which is damn fast, near-realtime, supported pretty much everywhere (Ethernet switches would essentially work as MIDI Thru boxes), and free of royalties. Devices in the same local network wouldn't even need the transport and above layers to work, so latencies would be extremely low, but adding more IP stack layers would allow remoting things easily.


> jump straight to Ethernet, which is damn fast, near realtime

And it could optionally use PTP (Precision Time Protocol) over the Ethernet interface so that the local network clocks are synced to sub-microsecond precision, allowing timestamps on the MIDI messages so they can be properly sequenced.

And for increased reliability, incorporate AVB, which uses stream reservation to ensure priority packets get through.


Seems you guys should start up :)

Btw I think the hard part is probably not the implementation of Ethernet, but getting hardware vendors on board. MIDI 2.0 needed a lot of companies to be on board for the change.

Perhaps if you make adapters and translate devices to help folks transition.


There already are adapters around and software/libraries to build applications to tunnel MIDI over IP, but the hard part IMO is to define a standard so that different products can talk together. Small Ethernet capable microcontrollers are cheap, so once a standard is defined it would be easier to implement it on devices.

https://www.thomann.de/gb/bome_bomebox.htm

https://openmuse.org/transport/mip_oview.html

http://www.linuxsampler.org/ethernetmidi/

https://qmidinet.sourceforge.io/

etc.


>There is no such thing as a disconnect signal

Incorrect :) MIDI OUT provides power. No power = disconnect.

That's not to mention that you don't need a signal for a *physical* disconnect. Your stereo knows when you unplug headphones.


The midi out port provides a small current (a few mA) that is used only to drive the photocoupler at the receiver side (per MIDI specifications, transmitter and receiver have to be electrically insulated). It doesn't and can't supply anything else, although in the past some manufacturers produced little out-of-standard devices that took their supply from the MIDI signal, but they were mostly marketing gimmicks.

Moreover, MIDI is a digital signal which doesn't carry any status about the medium it travels through; I mean, 10 seconds of nobody playing would appear identical to 10 seconds after a pulled cable.


That's all good up until you try to synchronize different MIDI devices with their own sequencers. Even "voltage triggering" is more accurate...



The clock is not very accurate and devices have to interpolate it. Which is something they all do slightly differently, especially if the tempo changes - which doesn't happen much in pop, but happens regularly in media composition.

If you're lucky the result will be in-time-ish, but it's never going to be sample accurate.

This actually matters for some applications, including sound design. If you trigger two samples with a variable offset you can get phase cancellation and other easily audible effects.

DAWs are sample accurate now, but 5-pin DIN MIDI 1.0 really isn't.


Never mind what new features will come, I hope it'll always be backwards compatible with MIDI 1.0. MIDI today is so universal and so easy to understand, even on the very low hardware level.


Agreed. I have thousands of dollars of equipment using 5 pin midi connectors that all use current midi.

If new devices come out they're going to need to work with the rest of my setup, or they're going to be a hard sell.


I don't see why someone couldn't make an adapter from 1 -> 2. I think the other way around would be lossy and not desirable.


any adapter may add latency.


I don't know how that would work given they're switching from 8-bit to 32-bit.

I imagine most devices that come with MIDI 2.0 will have an option to run in either 1 or 2.


The physical layer is completely different; Midi 1.0 was a current loop with DIN five-pin connectors; Midi 2.0 uses Ethernet and Wifi.


The physical layer is the easiest thing to get an adapter for. It's the logic and processing that is hard to make cross/back compatible.


>The physical layer is the easiest thing to get an adapter for

Oh really? Riddle me this, then: a MIDI controller has a USB port. A synth has a USB port.

Connect the two without using an external power source for the adapter (Hint: you can't).


You're actually proving my point, rather than disproving it. Thanks, I guess?


What's your point? (And how do you define the physical layer?)

The data transmitted over MIDI and USB MIDI is the same, it's not the problem. It's the same protocol.

The problem is that MIDI OUT on a MIDI controller supplies power, and USB midi out on a device DOES NOT, because it's a slave device.

This a physical layer problem no matter how you slice it.

Now go and read the question I asked again.


You're talking like what you say isn't the same thing I already said. Maybe re-read what I said that you replied to from the beginning.


Yeah I did, and you said:

> Physical layer is the easiest thing to get an adapter for

I'm saying it's not.


And then you immediately talked about host/slave issues, which are not physical but logical and processing. Perhaps you don't understand what "physical" means?

Truthfully, there is nothing to be gained for me in continuing this exchange, because I have in fact plugged a Keith McMillen K-Board into an OP-Z and a Circuit Rhythm and watched it work perfectly. Your truisms just mean you don't know of any synths that are battery-powered USB hosts, a problem entirely unrelated to your inability to understand the difference between physical connectors and the logical models and data-processing issues you described as physical, in some weird need to repeat what I said while insisting I am wrong.

Have a good one.


I don't think that the new 32 bit packet based protocol will run over the 31kBit/s current loop.


I know that this article is meant for the general masses, but it's rather misinformed in nearly every sentence it utters regarding MIDI 1.0. That's a shame.

I'm pretty bummed out about MIDI 2.0's prospects. The protocol has been very slow to come out, and its design is ugly, overcomplex, lumbering, and kitchen sink. It is nothing like its predecessor. MIDI 1.0 was pushed by a single person with a small coterie of contributors, and has the design elegance and simplicity of such. But MIDI 2.0 has been wasting away for years in a large committee, and reeks of it.

MIDI has only a few, fairly easily dealt with problems:

- It's too slow and has a fixed rate.

- Its time synchronization mechanism is primitive.

- Its data model is too low resolution both in value and parameter space, and workarounds (NRPN etc.) unacceptably trade off speed to fix this. The model also does not define constraints, such as min/max values or categorical labels for values.

- Its parameters are global rather than per-note (resulting in workarounds like MPE).

- Its escape mechanism (Sysex) is too flexible and leaves too many nonstandard design choices to the manufacturer. Because the CC/NRPN data model is bad, manufacturers often have resorted to layering custom and proprietary protocols on top of sysex, many of which are very, very badly designed, and all of which are different from one another. Indeed, some manufacturers have lately treated their corner of MIDI Sysex as a trade secret, using non-documented protocols (ahem Arturia) or profoundly stupid and lazy sysex protocol designs (I'm looking at you, Roli).

This stuff is easy to fix. But 2.0 goes far further, dumping in two-way communication protocol options, device queries, ties to existing, non-peer-to-peer protocols (USB) and lots of data models that do not appear to be separable from the basic stuff. What a hairball. Will anyone want to implement this?
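A concrete illustration of the NRPN speed tradeoff mentioned above (a sketch, assuming no running status): one 14-bit NRPN write is four CC messages, 12 bytes on the wire, versus 3 bytes for a plain 7-bit CC, so you pay 4x the wire time for 2x the value resolution.

```python
def nrpn_write(channel, param, value):
    """One 14-bit NRPN write: select the parameter via CC 99/98,
    then send the value via Data Entry CC 6/38. param and value
    are both 14-bit (0..16383)."""
    st = 0xB0 | (channel & 0x0F)
    return bytes([st, 99, (param >> 7) & 0x7F,   # NRPN select, MSB
                  st, 98, param & 0x7F,          # NRPN select, LSB
                  st, 6,  (value >> 7) & 0x7F,   # Data Entry, MSB
                  st, 38, value & 0x7F])         # Data Entry, LSB
```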


Yes to all of this. MPE is good enough.


I wonder when Firefox will support MIDI 2.0, since they still haven't supported MIDI 1.0. (chromium and other browser engines have had MIDI support since maybe 2014) https://bugzilla.mozilla.org/show_bug.cgi?id=836897

Yes, cool things can be done with MIDI in browser, some that can't easily be done outside of browsers (such as synchronizing and overlaying on top of 3rd party videos, such as from YouTube). Here's some of the stuff I've been working on, mostly targeted at my kid, but adults love it too:

https://www.youtube.com/watch?v=khU5A6Y1dk4

(this one doesn't play on youtube because contentId flags it, but the app itself overlays on 3rd party YouTube videos and it is not a problem) https://www.karmatics.com/video/pianoplydemo.mp4

https://www.youtube.com/watch?v=p1TcyDZiuyI

https://www.youtube.com/watch?v=Gj2rlytbDsY


The article fundamentally speaks to how industry standards led to a proliferation of synthesizers and collaboration within the music industry.

The Metaverse, coincidentally, is at nearly the same crossroads as MIDI in the early 1980's.

Each vendor built their own hardware and defined their own protocols. When Roland open sourced the MPU-401 chipset and gave away the MIDI specs, the industry blossomed.

Anyway, a lot to learn from MIDI... if Meta can similarly align an industry around basic hardware, APIs, and formats.


I've never thought of it this way until I've read your comment, but now I cannot shake it off, because the comparison seems so clear and obvious in hindsight.


The only issue I've noticed (and this might just be an iOS thing) is that devices would randomly disconnect. I know it's not good for my iPad, but one day I found myself incessantly plugging the adapter in over and over again to try to get it to work. It's at the point where I'm tempted to go Bluetooth-MIDI-only when using my iPad for music creation, which prevents me from using my full-size piano. First world problems.


As long as it's 100% backwards compatible, including the 5-pin DIN connectors, I'd be on board. No 5-pin, no game.

More on this below.

----------------------

USB MIDI is one of the worst things that happened to MIDI. If I have a synth with USB MIDI and a controller with USB MIDI, I can't connect them to each other, which is idiotic and frustrating.

The original MIDI spec had its drawbacks, but the biggest have been addressed by MPE.

As for latency, jitter, granularity — as a gigging musician and producer, I've never had any issue with it. It's good enough. And most of the music you hear today is a reflection of that.

What's not good, it's that some manufacturers are ditching the 5-pin MIDI jack for USB or Bluetooth. Thanks, nobody asked. Same for losing the THRU port.

Because what musicians love is setting up their computer on stage while everyone is waiting, right?

Bidirectionality with MIDI can be achieved by replacing the dual 5-pin MIDI ports with a mini-DIN (yes, the same connector the old PS/2 mouse used). The Yamaha Tenori-On and the Reface series do that (though again, nobody asked).

And the entire NRPN space of MIDI is unused because it's unstandardized. So simply agreeing on how MIDI 1.0 should be used would address most of the problems that MIDI 2.0 claims to solve.

That's what MPE does, by the way, and that's why it has been silently becoming the de-facto standard.

The examples I see in promotional videos show a computer talking to MIDI 2.0 devices. We've had enough of that. What made MIDI 1.0 work is the simplicity of device-to-device connectivity, regardless of what the devices are (and whether one of them is a PC).

Think of this magic: if I have just ONE type of cable, I can connect ANY device to ANY OTHER device.

That means my Yamaha MX-88 I bought new last year can talk to Korg Poly-800 made before I was born.

I could have bought the cable in the 80s or yesterday, it's the same cable.

What MIDI 1.0 gave people is less thinking about how to connect devices, and more about making music.

USB midi did the same thing for devices connected to a computer, at the expense of being usable in a setup that does not have a computer. That implicit assumption — that everything will involve a computer — was flawed.

WTF, I want to use my ROLI to play my Yamaha — and I can't, because ROLI is too "modern" to talk to any hardware synth made the same year.

I hope that MIDI 2.0 avoids this fuckery, but I can't tell yet. Bidirectionality looks sus.

My litmus test is: if a device doesn't come with a 5-pin MIDI jack while having space for it, it's a gimmick (Teenage Engineering being an exception, because they make up for it with analog control: my pocket operators sync with my Korg Monologue via pulse, and Korg syncs with Yamaha over 5-pin MIDI).

And if Korg Volca could do it, so can pretty much everything else (hint: don't put it on the side, put it on the front panel if you want that slim look).

MIDI 2.0 can be a huge success if you wouldn't be able to tell a MIDI 2.0 device from a MIDI 1.0 at a glance; i.e. same connectors, same interoperability. Then it would be truly an upgrade, a 2.0, and not "thing that works like MIDI in some cases, but have fun buying dongles to connect old equipment to new gear".

MPE works on that model; and connectors aside, any multitimbral device is MPE-compatible.


My hope is that they finish the MIDI 2.0 rtpmidi protocol, and that it starts to be truly used in real hardware, not half-used like current rtpmidi.

This would allow easy connectivity with very low latency over normal LAN, including switches, power over Ethernet...

Disclaimer: I created a rtpmidi implementation for linux https://github.com/davidmoreno/rtpmidid


> "MPE works on that model; and connectors aside, any multitimbral device is MPE-compatible."

Technically that's not true. MPE adds a feature where you can dedicate a channel as a sort of "multicast" channel where any CC you send there affects all the other channels shared by the same instrument. If the synth doesn't support that it won't work. But it's possible to implement fallbacks that just do the multicast less efficiently from the controller.
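A hedged sketch of that fallback (function name and channel list are illustrative): instead of relying on the MPE master channel, the controller re-sends a zone-wide CC on each member channel itself, which is less efficient on the wire but works on any multitimbral synth.

```python
def multicast_cc(cc, value, member_channels):
    """Emulate an MPE master-channel ("multicast") CC by sending the
    same CC message on every member channel of the zone."""
    return b"".join(bytes([0xB0 | (ch & 0x0F), cc & 0x7F, value & 0x7F])
                    for ch in member_channels)
```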

I completely agree on USB though. It's awful that I can't just plug a USB controller into a USB synth.

(If there was going to be something to replace the old 5-pin DIN cables my vote would be for CAN-bus. It supports faster speeds, bidirectional communication, and can go pretty long distances.)


I generally agree but wouldn't be so dogmatic on the 5-pin.

I know some people consider 3.5mm to be the devil's plug, but as a eurorack player I don't. I feel like MIDI on 3.5, now that the polarity is standardized, is really helping the good, USB-less MIDI stay relevant.


An omission on my part!

I'm absolutely loving MIDI on 3.5mm, just didn't see enough of it in the wild. Hope it catches on.


This 100x. I am just getting into all of this hardware and it's so frustrating to need a computer in the mix just to get two pieces of otherwise wonderful single-purpose hardware to speak to each other.


Right?

MIDI 2.0 sounds like a solution in search of a problem.

MIDI 1.0 addressed the problem of connecting any two electronic music devices to each other.

MIDI 2.0 seems to un-solve it.


One great thing about musicians is they're not forced into upgrading things as fast as computer software.

So hopefully the lack of sales for MIDI 2.0 only devices will kill the protocol quickly.


MIDI 1.0 solved one problem. 30+ years later, the fact that this problem was solved revealed a new set of problems to be solved.


... which can be solved by standardizing the use of existing MIDI 1.0 spec (see: NRPN, MPE, ...).

The problems that MIDI 2.0 solves seem to be of the kind that musicians don't have, and definitely don't justify switching away from a simple three-wire connector that's the same on both ends (either 5-pin DIN or 3.5mm TRS which is gaining traction these days).


There is a dongle for that! They call it “MIDI host”

Here’s first search result: https://www.midiplus.com.tw/en/product-detail/USB_MIDI_HOST/


Oh fuck that.

I know this full well.

So now, you need an extra device (MIDI host) to connect two devices, plus extra cables, plus power supply, and have fun if you need to set it up on stage in two minutes between the sets.

And good luck if both the controller and sound module are USB slaves.

The device you linked does NOT solve this problem.


And good luck finding something that plugs into your computer’s type C port without another dongle! What a good time for making dongles we’re living in!


You're going to make me cry :(

Seriously, the only dongle I found that actually solves the problem is CME WIDI Master[1]

It's a 5-pin DIN - to - Bluetooth MIDI dongle that runs off MIDI power (i.e., just plug and forget).

Brings you from the 80s straight into 2020s with no noticeable latency, and no wires.

That's how I actually connect my newer controllers (with Bluetooth midi, which, thankfully, does not require the other device to provide power) to old synths.

Can highly recommend, 5/5, etc., but it's frustrating that I'd need to go through Bluetooth and a dongle for devices that could be linked with one short simple wire.

[1]https://www.cme-pro.com/widi-master/


$150 for one pair of ports and I need another power supply. Not exactly as simple as a $5 midi cable.


> As long as it's 100% backwards compatible, including the 5-pin DIN connectors, I'd be on board. No 5-pin, no game.

According to the specs which can be downloaded from midi.org there is no 5 pin DIN socket or 31kbit/s current loop anymore; the new Midi instead uses Ethernet and Wifi to connect "instruments". Apparently the protocol runs on UDP.

> As for latency, jitter, granularity ...

Well, maybe I play too fast or too many notes in a chord, but it happens quite often that I get an arpeggio instead of a chord.


If I have a synth with USB midi and a controller with USB midi, I can't connect them to each other, which is idiotic and frustrating.

Is there a universal MIDI router/multiplexer box with a whole bunch of USB and MIDI ports on it? If not, would there be a market for such a thing?


>Is there a universal MIDI router/multiplexer box with a whole bunch of USB and MIDI ports on it?

Nope. You can do this by plugging a USB hub into a computer/tablet/phone, and running several MIDI interfaces into the hub, so it's not an unsolvable problem for the musician, just a very clunky setup.

Didn't find a single device with multiple USB and MIDI in/out ports after an extensive search. And the devices that I did find cost more than the synths that they would connect.

>If not, would there be a market for such a thing?

There is a market for a compact device like that, especially if it can be battery-powered for live gigs, remembers the settings, and doesn't need a screen/external device to set up the routing.

Even something as simple as "Merge all IN ports, and send to all OUT ports" would be great, and there isn't such a thing.

Seems like most musicians who care about this simply don't buy gear without 5-pin ports to begin with.

But yes, I've been thinking about making this for years now, and if you need a startup/kickstarter idea - go ahead, I'll chip in.


Not exactly what you are looking for, but I created a web interface for ALSA sequencer, which effectively makes any raspberry pi what you are looking for.

For old MIDI DIN you would need a MIDI adapter or use the raspberry pi serial ports and some circuitry. Can be powered by an USB powerbank for example.

https://github.com/davidmoreno/aseqrc

I also added to the mix rtpmidi (https://github.com/davidmoreno/rtpmidid), and USB MIDI gadget (https://github.com/davidmoreno/midiconfigfs) on the PI, so then I can use the pi itself from one or several computers via Ethernet and USB.


That’d be a midi host box. There are a few, tho mostly you run into issues of complexity - they require too much configuration to be useful / fun.


Yeah, and they don't work for the problem I stated:

All the MIDI host boxes I've seen only have one USB port, so they can't be used to connect two USB-MIDI devices to each other.


Retrokits RK-006 may solve all your problems, but it's pretty pricey. I assume based on that price that it's also not a simple device.


Thank you!

It does solve all the problems I mentioned, and yes, it's not the simplest device (and sadly, I can't say the high cost is unjustified).


> USB midi is one of the worst thing that happened to MIDI. If I have a synth with USB midi and a controller with USB midi, I can't connect them to each other, which is idiotic and frustrating.

Isn't that what USB-C is for?


>Isn't that what USB-C is for?

Nope. USB is master-slave (aka host-device); two slave devices don't talk to each other (nor provide power).

USB-C just shoves this problem under the rug by making the cables look the same on both ends.


What about the venerable USB OTG?


Good luck finding equipment with USB host capability supporting class-compliant MIDI devices.


Like, can I connect two USB-C harddrives together to copy data? Usually (except when mentioned loudly by the manufacturer) USB midi devices are “device” devices, not “host” devices


But but ... you can't do planned obsolescence with MIDI 1.0! That is why MIDI 2.0 + USB will be pushed down everyone's throats.


Duplicate that I shared last night: https://news.ycombinator.com/item?id=29045277


>how will MIDI 2.0 change music

By breaking compatibility. Everything used to just work, now a layer of complication is added to the mix.

This is why we can't have nice things.


MIDI 2.0 is going to have a tough road ahead of it to get any adoption.

MPE solved some expressiveness and didn't break the world...


Yeah, I think that's the big problem with MIDI 2.0: it's complicated, and most manufacturers are probably only going to implement the parts they care about.

MIDI 2.0 has been out for a while and as far as I know there aren't any mainstream products that use it. (Maybe one or two companies have dipped their toes in the water.) That doesn't bode well for the future of the protocol. Something like the Expressive E Osmose, one would think, would be exactly the kind of instrument MIDI 2.0 is for, but nope, it's MPE. If you want an expressive controller, there's the Haken Continuum, the Linnstrument, the Roli Seaboard, the Osmose, etc., and they all work with MPE.

https://www.expressivee.com/2-osmose

https://expressivee.happyfox.com/kb/article/162-will-osmose-...


What happened to OSC?


OSC hasn't really been designed as a replacement for MIDI, I think. It is really a generic data communication protocol. They only specified the syntax and omitted any semantics. As a consequence, each application has to specify their own OSC messages. And that's what digital mixing consoles, lighting consoles, DAWs like REAPER or computer music programs like SuperCollider do. There really is no common ground between how these applications use OSC.

To make OSC a suitable replacement for MIDI, you would have to come up with a set of OSC messages everyone can agree upon. That's not likely to happen...


It is a shame OSC didn't specify a minimum standard set of message names. Looking it up, I found a proposal for a standard namespace [1], though I'm not sure it ever caught on.

[1] https://github.com/fabb/SynOSCopy/wiki


MIDI Manufacturers Association should have come up with a standard "Official MIDI™" namespace for naming OSC messages and which works nicely with regular "MIDI" devices. Instead of trying to make their own new spec (that is unlikely to be widely adopted and which is not physically compatible with "MIDI" anyway).


What happened, as in why isn't OSC "MIDI 2.0"?

The short story could be that the members of the MIDI Association chose to pursue their joint specification to be the next version of MIDI.

The actual story will have to be answered by someone who attended all of the various meetings that have occurred over the last three decades about what the MIDI 2.0 spec will be. I was only at a few.


The same thing that will happen to MIDI 2.0, I feel :D


I wouldn’t hold your breath on MIDI 2 guitar controllers being better than MIDI 1. The fundamental problem is pitch resolution and that isn’t really addressed by having a better protocol for representing pitch and intonation.


MIDI 1.0 has 14 bit (per channel) pitch bend, which is probably fine. What MIDI 2.0 adds is per-note pitch bend, so you don't have to dedicate a separate channel for every string, which is awkward and un-idiomatic. (MPE at least standardized how multiple-channel instruments should behave, but that was a relatively recent development.)
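Concretely, that 14-bit bend travels as a status byte (0xE0 | channel) plus two 7-bit data bytes, LSB first, with 8192 (0x2000) as center. A sketch (helper names are mine):

```python
def encode_bend(channel, value):
    """Pack a 14-bit pitch bend value (0..16383) into a 3-byte
    MIDI 1.0 Pitch Bend message, LSB before MSB."""
    return bytes([0xE0 | (channel & 0x0F), value & 0x7F, (value >> 7) & 0x7F])

def decode_bend(lsb, msb):
    """Recombine the two 7-bit data bytes into the 14-bit value."""
    return (msb << 7) | lsb
```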


MPE uses channel-per-note which is equally unidiomatic.


Oh I agree. MPE is kind of horrifying when you think about how it actually works. But, it gets the job done, and it's a standard. If you want to do electronic music with expressive instruments right now, MPE is what you would use.

I don't think MPE is the end of the road, though. It'll probably be replaced eventually by something that was designed from the start to do the kinds of things that MPE is for. I'm not entirely convinced that MIDI 2.0 is the MPE replacement I'm looking for. Maybe someone will just invent something entirely new.

(I have some thoughts in that direction that I plan on publishing eventually, not so much in the "hey look, here's a new protocol everyone ought to use" sense, but more like "if you were going to design a music protocol to do what MPE does but better, this is an example of what it could look like".)


> The fundamental problem is pitch resolution and that isn’t really addressed by having a better protocol for representing pitch and intonation.

But having 32 instead of 7 bits for pitch will give better pitch resolution, no? Or am I misunderstanding something?


My biggest problem with MIDI is the latency & jitter introduced by USB adapters (many computers these days lack a dedicated sound card or even the space for one). Does the new spec do anything about that?


And to be pedantic, USB is not officially part of the MIDI spec, so that problem introduced by USB adapters isn't really a problem with MIDI per se.

If you really want to interface MIDI with a computer in a fashion consistent with the original MIDI specs, then you can use an actual serial port at MIDI's 31,250 baud rate, which is possible on a Raspberry Pi, for instance (or on any computer whose serial port can be set to that baud rate). Then you get interrupts precisely when the MIDI bytes arrive.
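Back-of-envelope arithmetic on that fixed rate: each byte on 5-pin DIN is 10 bits on the wire (start + 8 data + stop), so a 3-byte Note On occupies the wire for just under a millisecond no matter what interface sits on the computer side. A sketch:

```python
BAUD = 31250          # fixed DIN MIDI rate
BITS_PER_BYTE = 10    # 1 start + 8 data + 1 stop bit

def wire_time_ms(n_bytes):
    """Time a message of n_bytes occupies a DIN MIDI link, in ms."""
    return n_bytes * BITS_PER_BYTE / BAUD * 1000.0

# wire_time_ms(3) -> 0.96 ms for one 3-byte Note On
```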


If you want MIDI without latency or jitter, you pretty much need an old Atari ST. USB isn't the only source of latency (and I believe MIDI over USB runs in a special mode that minimizes latency).


No, because most MIDI capable devices couldn't be bothered to implement the first version of the spec correctly even today.


Do you have examples?


When XHR requests are blocked, the OP's URL returns a 404. "Interesting" website design.


MIDI 1.0 is designed for musicians. MIDI 2.0 is designed for rights holders.


gonna take way too long before we get webmidi2. meanwhile the usual suspects are screaming hell that no it's too risky to do midi, we must protect the user, God forbid the web get good.


Just use control voltage.


I do kinda get a kick out of connecting my keep (homebrew analog micro modular) to a volca modular slave and midi out to a... you get the picture.


Agreed. Less latency and continuous.



