Where the MIDI 2.0 Ecosystem Stands Right Now

It’s March 2026. MIDI 2.0 has been in the works for years. Here’s an honest snapshot of where things actually are.

Hardware is making progress. KORG Keystage, Roland A-88MKII, StudioLogic SL series, Rhodes MK8 — the list of MIDI 2.0 keyboards is growing. Still niche, but it’s no longer a single device you can only find in press releases.

OS support finally spans both platforms. Apple has had UMP transport since macOS 11 / iOS 14 back in 2020. Windows MIDI Services went GA in February 2026. For the first time, both major platforms have OS-level MIDI 2.0 infrastructure in place. That’s a real milestone.

DAWs are catching up unevenly. Cubase 14 handles MIDI 2.0 → VST3 conversion. Logic Pro has Property Exchange auto-mapping. Studio One has MIDI-CI Discovery. Ableton and FL Studio haven’t shipped anything yet.

Libraries: as far as I know, MIDI2Kit is the only open-source Swift library with full MIDI-CI and Property Exchange support. That’s not a brag — it’s a gap in the ecosystem that someone needed to fill.

What MIDI2Kit Still Needs

More Test Hardware

Right now, everything is verified against KORG. That’s it. Roland, Yamaha, Native Instruments — I don’t have their hardware, and I genuinely don’t know how their MIDI 2.0 implementations differ.

The library has MockDevice presets for .rolandStyle and .yamahaStyle, but I built those from guesswork and spec reading. They haven’t been validated against real devices. Every manufacturer has quirks; I’ve only found KORG’s. The spec interpretation differences I uncovered while building MIDI2Kit — Mcoded7 encoding, boolean vs. string types, timing constraints — are almost certainly not unique to KORG.
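One of those spec-interpretation areas, Mcoded7, is easy to show concretely: Property Exchange payloads pack each group of up to seven 8-bit bytes into eight 7-bit-safe bytes, with a leading byte carrying the stripped MSBs. Here's a minimal encoder sketch based on my reading of the MIDI-CI spec (illustrative only, not necessarily how MIDI2Kit implements it internally):

```swift
// Mcoded7: each group of up to 7 data bytes becomes 1 MSB byte + up to 7 low-7-bit bytes.
// The MSB byte's bit 6 corresponds to the first byte in the group, bit 5 to the second, etc.
func mcoded7Encode(_ data: [UInt8]) -> [UInt8] {
    var out: [UInt8] = []
    var index = 0
    while index < data.count {
        let group = data[index..<min(index + 7, data.count)]
        var msbs: UInt8 = 0
        for (offset, byte) in group.enumerated() {
            msbs |= (byte >> 7) << (6 - offset)
        }
        out.append(msbs)                            // stripped high bits, packed
        out.append(contentsOf: group.map { $0 & 0x7F })  // low 7 bits of each byte
        index += 7
    }
    return out
}
```

The subtle part, and exactly the kind of thing that differs between naive implementations, is the bit ordering in the MSB byte.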

Network MIDI 2.0 in the Real World

macOS 26.4 added Network MIDI 2.0 support. The MIDI2Kit code should handle it — the architecture doesn’t treat network sessions differently — but I haven’t run Property Exchange over an actual network session yet. That test still needs to happen.

Documentation

DocC-based API docs and an interactive tutorial are in progress. This isn’t optional — a library without documentation is effectively inaccessible to anyone who didn’t build it. I want MIDI2Kit to be the thing you reach for when you need MIDI 2.0 in Swift, not the thing you look at once and give up on because nothing is explained.

Sample Apps

Two reference implementations are planned: a MIDI 2.0 Device Explorer for iOS, and a MIDI Monitor for macOS. Beyond just showing the library working, these would be the kind of tools I wish had existed when I started this whole thing.

What MIDI2Kit Is Actually For

CoreMIDI gives you a pipe. You send bytes in, bytes come out.
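Concretely, on a MIDI 2.0 connection those "bytes" are 32-bit UMP words, and packing them is up to you. A sketch of building a MIDI 2.0 Note On (message type 0x4, 64-bit channel voice message); the field layout follows the UMP spec, but the helper itself is mine:

```swift
// Pack a MIDI 2.0 Channel Voice Note On into two 32-bit UMP words.
func ump2NoteOn(group: UInt32, channel: UInt32, note: UInt32, velocity: UInt32) -> [UInt32] {
    // Word 0: [MT=0x4 | group] [status 0x9 | channel] [note] [attribute type = 0]
    let word0 = (0x4 << 28) | ((group & 0xF) << 24) | (0x9 << 20)
              | ((channel & 0xF) << 16) | ((note & 0x7F) << 8)
    // Word 1: 16-bit velocity in the high half, attribute data = 0 in the low half.
    let word1 = (velocity & 0xFFFF) << 16
    return [word0, word1]
}
```

That's the full extent of what the OS layer asks of you: correctly shaped words in, correctly shaped words out.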

CoreMIDI provides none of the layer above that: MIDI-CI Discovery coordination, Property Exchange transaction management, chunk reassembly, timeout handling, manufacturer-specific workarounds. Every app that wants to do anything serious with MIDI 2.0 has to build that layer itself.
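To make one piece of that layer concrete: Property Exchange replies arrive split into chunks, each tagged with its 1-based chunk number and the total count, and something has to buffer and reorder them per request. A toy reassembler sketch (illustrative structure only, not MIDI2Kit's actual API):

```swift
// Toy reassembler for chunked Property Exchange replies.
// Each chunk carries (requestID, chunkNumber, totalChunks, payload).
struct ChunkReassembler {
    // requestID -> chunkNumber -> payload
    private var pending: [UInt8: [Int: [UInt8]]] = [:]

    /// Returns the full payload once every chunk for this request has arrived, else nil.
    mutating func receive(requestID: UInt8, chunk: Int, of total: Int,
                          payload: [UInt8]) -> [UInt8]? {
        pending[requestID, default: [:]][chunk] = payload
        guard let chunks = pending[requestID], chunks.count == total else { return nil }
        pending[requestID] = nil
        // All chunks present: concatenate in chunk-number order.
        return (1...total).flatMap { chunks[$0] ?? [] }
    }
}
```

A real implementation also needs per-request timeouts and error replies; this sketch only shows why the bookkeeping can't live in application code for every app separately.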

I know exactly what that costs because I did it in SimpleMidiController: 2,800 lines of code to handle one KORG keyboard. That’s not a reasonable tax for every developer who wants to use MIDI 2.0.

MIDI2Kit pulls all of that into a library. The manufacturer quirks are in one place. The workarounds benefit every app that uses it. Nobody has to rediscover that PE Notify echo-back freezes the KORG LCD.
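The intended payoff is that application code shrinks to a few high-level calls. The snippet below sketches that shape only; every name in it is hypothetical, and the real API is whatever the GitHub repo documents:

```swift
// Hypothetical shape of discovery + Property Exchange through a library layer.
// None of these identifiers are guaranteed to match MIDI2Kit's actual API.
let session = try await MIDI2Session.start()
for device in try await session.discoverDevices() {      // MIDI-CI Discovery
    // Property Exchange, with chunking, timeouts, and vendor quirks handled inside.
    let info = try await device.getProperty("DeviceInfo")
    print(device.name, info)
}
```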

Honest Thoughts on Solo Development at This Scale

I’ll be direct: building a MIDI 2.0 library as a solo indie developer was kind of unreasonable. One KORG keyboard for testing. Paid spec documents. Apple’s undocumented APIs that you can only probe by writing code and running it on real hardware. BLE MIDI bugs in a closed-source stack. No access to the informal knowledge that MMA member companies share among themselves.

But I also think I couldn’t have built this any other way. If I’d started by trying to implement MIDI-CI from the spec, I’d have gotten something theoretically correct that didn’t work against any real device. The problems that made MIDI2Kit necessary — the timing constraints, the proprietary resources, the silent failures in CoreMIDI — only became visible when I was building something that actually needed to work.

The path was: build a small app, hit walls, understand what the walls actually are, build the library to handle them. That cycle worked. Not efficiently, not elegantly — but it led somewhere real.

Where This Goes

The MIDI 2.0 ecosystem is still early. OS support is finally in place. Hardware is arriving. DAW support is uneven. Cross-vendor interoperability remains largely theoretical.

There’s real work still to do on MIDI2Kit — more device support, validated documentation, sample applications. But the foundation is there: a library that handles real-world MIDI 2.0 on Apple platforms, including all the things that work differently from what the spec implies.

If you’re building something with MIDI 2.0 on iOS or macOS, I’d love to know what you run into. The whole point of putting this work into an open library was so that the next person doesn’t start from scratch.

Start Building with MIDI2Kit

Open source, MIT licensed. Handles the reality of MIDI 2.0 so your app doesn’t have to.

View on GitHub · Documentation

Related Articles

← Part 9: Apple vs. KORG · All Posts →