The Point Where App Code Becomes a Library

SimpleMidiController — my app that talks to KORG hardware via MIDI-CI — had grown into a monster. The MIDI-CI and Property Exchange implementation was 2,800 lines of Swift jammed directly into the app target. It worked, but it was untestable without physical hardware, impossible to evolve without breaking the app, and obviously something that other developers building MIDI apps would need too.

The decision to extract it into MIDI2Kit wasn’t hard. The execution was a three-phase migration:

  1. Phase 1: Add MIDI2Kit as a dependency alongside the existing code. Run both in parallel.
  2. Phase 2: Migrate the app’s UI layer to use MIDI2Kit’s API. Kill off the old code path.
  3. Phase 3: Delete the 2,800 lines. Commit b4ae272.

After Phase 3, import CoreMIDI no longer appeared anywhere in SimpleMidiController. The app had become a pure consumer of the library’s API. That felt like the right outcome.

Four-Layer Architecture

The library is organized into four layers:

MIDI2Kit       — CIManager, PEManager (high-level async API)
MIDI2CI/PE     — Discovery, PE chunking, subscriptions
MIDI2Transport — MIDITransport protocol, CoreMIDI/Mock/Loopback
MIDI2Core      — UMP types, MUID, DeviceIdentity, Mcoded7

Each layer only knows about the layers below it. The app only touches MIDI2Kit. CoreMIDI only appears in MIDI2Transport. This separation matters in practice: I can test the entire MIDI-CI and PE protocol stack without CoreMIDI, without hardware, without any external dependencies.
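In SwiftPM terms, that layering can be enforced in the manifest itself, so an accidental upward import fails at build time. A sketch of what such a manifest might look like, with target names assumed from the layer list above (whether MIDI2CI/PE is one target or two is my guess):

```swift
// swift-tools-version:5.9
import PackageDescription

// Hypothetical manifest: each target may depend only on the layer below it,
// so an upward or sideways import simply fails to resolve.
let package = Package(
    name: "MIDI2Kit",
    products: [
        .library(name: "MIDI2Kit", targets: ["MIDI2Kit"]),
    ],
    targets: [
        .target(name: "MIDI2Core"),                                   // UMP, MUID, DeviceIdentity, Mcoded7
        .target(name: "MIDI2Transport", dependencies: ["MIDI2Core"]), // the only target touching CoreMIDI
        .target(name: "MIDI2CI", dependencies: ["MIDI2Transport"]),   // Discovery, PE chunking, subscriptions
        .target(name: "MIDI2Kit", dependencies: ["MIDI2CI"]),         // CIManager, PEManager
    ]
)
```

The payoff is that the test targets for everything above MIDI2Transport never link CoreMIDI at all.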

The Transport Protocol: The Most Important Decision

The single design choice that made everything else possible was making the transport layer a protocol rather than a concrete type:

protocol MIDITransport: Sendable {
    /// Send raw MIDI bytes to a specific destination.
    func send(_ data: [UInt8], to destination: MIDIDestination) throws
    /// Incoming MIDI data, delivered as an async stream.
    func makeReceiveStream() -> AsyncStream<MIDIReceivedData>
}

Three implementations of this protocol cover everything I need:

  • CoreMIDITransport — production. Sends and receives on real MIDI ports.
  • MockMIDITransport — unit tests. Records all sends and receives for assertion.
  • LoopbackTransport — integration tests. Created in pairs: whatever one side sends, the other receives. This is what makes it possible to test Initiator and Responder in the same process.
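A recording mock in this style is small enough to sketch in full. The stand-in types below are assumptions for the sketch, not MIDI2Kit's real definitions; the point is that conforming to a two-method protocol is all a test double needs:

```swift
import Foundation

// Minimal stand-ins for the library's types (assumptions for this sketch).
struct MIDIDestination: Sendable { let name: String }
struct MIDIReceivedData: Sendable { let bytes: [UInt8] }

protocol MIDITransport: Sendable {
    func send(_ data: [UInt8], to destination: MIDIDestination) throws
    func makeReceiveStream() -> AsyncStream<MIDIReceivedData>
}

// In the spirit of MockMIDITransport: every send is recorded for later
// assertion, and tests can inject inbound data into the receive stream.
final class RecordingTransport: MIDITransport, @unchecked Sendable {
    private let lock = NSLock()
    private var _sent: [(destination: MIDIDestination, data: [UInt8])] = []
    private var continuation: AsyncStream<MIDIReceivedData>.Continuation?

    var sent: [(destination: MIDIDestination, data: [UInt8])] {
        lock.lock(); defer { lock.unlock() }
        return _sent
    }

    func send(_ data: [UInt8], to destination: MIDIDestination) throws {
        lock.lock(); defer { lock.unlock() }
        _sent.append((destination: destination, data: data))
    }

    func makeReceiveStream() -> AsyncStream<MIDIReceivedData> {
        AsyncStream { continuation in self.continuation = continuation }
    }

    // Test hook: simulate an inbound MIDI message.
    func inject(_ bytes: [UInt8]) {
        continuation?.yield(MIDIReceivedData(bytes: bytes))
    }
}
```

A loopback pair is the same idea with two instances cross-wired so that one side's send feeds the other side's stream.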

If I’d started with CoreMIDI baked in, the test story would be much worse. Every test that exercises the protocol stack would need physical hardware. As it stands, the full Discovery → PE Capability Exchange → ResourceList → GET/SET flow runs in unit tests in milliseconds.

Actor-Based Concurrency

MIDI is inherently concurrent. Messages arrive at any time on a background thread. The app reads and updates state from the main thread. Multiple PE requests can be in flight simultaneously. This is exactly the problem Swift actors were designed for.

CIManager and PEManager are both actors. No locks, no mutexes, no manually synchronized queues. The Swift concurrency runtime handles isolation:

actor CIManager {
    private var discoveredDevices: [MUID: DiscoveredDevice] = [:]

    func startDiscovery() async { ... }
    func handleDiscoveryReply(_ reply: DiscoveryReply) { ... }
}
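The "multiple PE requests in flight" part is where actor isolation earns its keep. One way to picture the bookkeeping is a continuation table keyed by request ID; all names here are illustrative, not MIDI2Kit's API:

```swift
// Sketch: callers park on a continuation until the matching reply arrives
// on the receive stream. Actor isolation makes the table race-free.
actor RequestTracker {
    private var pending: [UInt8: CheckedContinuation<[UInt8], Error>] = [:]

    /// Suspend the caller until the reply with the same ID arrives.
    func awaitReply(requestID: UInt8) async throws -> [UInt8] {
        try await withCheckedThrowingContinuation { continuation in
            pending[requestID] = continuation
        }
    }

    /// Called from the receive loop when a PE reply is parsed.
    func handleReply(requestID: UInt8, payload: [UInt8]) {
        pending.removeValue(forKey: requestID)?.resume(returning: payload)
    }

    /// Fail a request that never got an answer (timeout, NAK, ...).
    func fail(requestID: UInt8, error: Error) {
        pending.removeValue(forKey: requestID)?.resume(throwing: error)
    }
}
```

Presumably PEManager layers timeout and NAK handling on top of this same shape; with an actor there is no window where a reply and a timeout can both resume the same continuation.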

For event delivery, I went with AsyncStream rather than delegates. No protocol to implement, no weak reference management, no threading gotchas. You just iterate:

for await event in client.makeEventStream() {
    switch event {
    case .deviceDiscovered(let device):
        // update UI
    case .deviceLost(let muid):
        // clean up
    case .propertyChanged(let muid, let resource):
        // refresh data
    }
}

This integrates naturally with SwiftUI’s task modifier. The stream runs as long as the view is alive, and cancels automatically when it disappears.
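Wired into SwiftUI, the loop above sits inside the task modifier and its lifetime tracks the view's. A sketch with stub types standing in for the library's API (DiscoveredDevice and the event enum here are assumptions):

```swift
import SwiftUI

// Stub types standing in for the library's API (assumptions for this sketch).
struct DiscoveredDevice: Identifiable { let id = UUID(); let name: String }
enum CIEvent { case deviceDiscovered(DiscoveredDevice), deviceLost(UUID) }

struct DeviceListView: View {
    @State private var devices: [DiscoveredDevice] = []
    let events: AsyncStream<CIEvent>

    var body: some View {
        List(devices) { Text($0.name) }
            .task {
                // Runs while the view is alive; when the view disappears,
                // the task is cancelled and the iteration ends.
                for await event in events {
                    switch event {
                    case .deviceDiscovered(let device): devices.append(device)
                    case .deviceLost(let id): devices.removeAll { $0.id == id }
                    }
                }
            }
    }
}
```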

Testing Without Hardware: MockDevice

One of the biggest problems with the original app code was that the only way to test MIDI-CI behavior was to plug in a KORG keyboard. That's fine when you have the hardware, but it makes regression testing slow and rules out running the tests in continuous integration at all.

The solution was a MockDevice that simulates a real MIDI-CI device over a LoopbackTransport:

let (initiatorTransport, responderTransport) = LoopbackTransport.createPair()
let mock = MockDevice(transport: responderTransport, preset: .korgModulePro)
let client = MIDI2Client(transport: initiatorTransport)

// This runs against a simulated KORG device, no hardware needed
let info = try await client.getDeviceInfo(from: mock.muid)

Five presets cover the devices I’ve encountered: .korgModulePro, .generic, .rolandStyle, .yamahaStyle, and .minimal. The KORG preset faithfully reproduces all of KORG’s quirks — the Mcoded7 non-usage, the string-typed canGet/canSet, the proprietary X-ProgramEdit resource structure. If my code handles the mock correctly, it handles the real hardware correctly.

705 Tests Across 77 Suites

The library launched with 602 tests on March 8, 2026. As of this writing it’s at 705. The LoopbackTransport makes it practical to test the entire protocol flow end-to-end: Discovery through PE Capability Exchange through ResourceList through GET and SET, all in a unit test that runs in under a second.

The edge cases get their own tests: timeout behavior, NAK responses, chunk loss mid-transfer, malformed JSON in PE responses, Mcoded7 encoding errors, invalid MUID values. Each of these was discovered in production with real hardware. Now they’re in the test suite so they stay fixed.
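Mcoded7 is representative of the encodings worth pinning down with round-trip tests: it packs arbitrary 8-bit data into 7-bit SysEx bytes, one header byte of stripped MSBs per group of up to seven data bytes. A self-contained sketch, independent of the library (the bit ordering follows my reading of the spec; the round-trip property is what a test actually asserts):

```swift
// Encode: each group of up to 7 data bytes becomes 1 header byte holding
// the stripped MSBs, followed by the 7-bit remainders.
func mcoded7Encode(_ data: [UInt8]) -> [UInt8] {
    var out: [UInt8] = []
    var i = 0
    while i < data.count {
        let group = data[i..<min(i + 7, data.count)]
        var msbs: UInt8 = 0
        var body: [UInt8] = []
        for (j, byte) in group.enumerated() {
            msbs |= (byte >> 7) << (6 - j)   // MSB of byte j goes to bit (6 - j)
            body.append(byte & 0x7F)
        }
        out.append(msbs)
        out.append(contentsOf: body)
        i += 7
    }
    return out
}

// Decode: read the header byte, then restore each byte's MSB.
func mcoded7Decode(_ data: [UInt8]) -> [UInt8] {
    var out: [UInt8] = []
    var i = 0
    while i < data.count {
        let msbs = data[i]
        let body = data[(i + 1)..<min(i + 8, data.count)]
        for (j, byte) in body.enumerated() {
            out.append(byte | (((msbs >> (6 - j)) & 1) << 7))
        }
        i += 8
    }
    return out
}
```

A single assertion like mcoded7Decode(mcoded7Encode(payload)) == payload catches the off-by-one bit errors this kind of packing invites.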

16 Seconds to 144 Milliseconds

The original SimpleMidiController code took about 16 seconds to complete the initial connection handshake with KORG hardware. That’s the full sequence: BLE warm-up, MIDI-CI Discovery, PE Capability Exchange, ResourceList retrieval, then individual GET requests for each resource. All serial, all with conservative timeouts.

After migrating to MIDI2Kit and adding one optimization — when the device’s manufacturerName identifies it as KORG, skip ResourceList entirely and go directly to X-ParameterList — that number dropped to 144ms.

16.4 seconds → 144ms. 99.1% faster.

The optimization is possible because I know KORG’s resource layout from reverse engineering. The library applies it automatically; any app built on MIDI2Kit gets the benefit without knowing the detail exists. This is the advantage of centralizing device-specific knowledge in a library rather than spreading it across multiple apps.
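The fast path amounts to a single branch at connection time. A sketch (all names here are illustrative, not MIDI2Kit's API):

```swift
// Illustrative: pick the first PE resources to fetch based on the
// manufacturer name reported during Discovery.
enum Resource: Equatable {
    case resourceList       // the generic PE entry point
    case named(String)      // a vendor-specific resource
}

func initialResources(manufacturerName: String) -> [Resource] {
    if manufacturerName.lowercased().contains("korg") {
        // Layout known from reverse engineering: skip ResourceList
        // and go straight to the proprietary parameter list.
        return [.named("X-ParameterList")]
    }
    return [.resourceList]  // generic path: enumerate resources first
}
```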

What “The App No Longer Imports CoreMIDI” Means

The final state of SimpleMidiController has no import CoreMIDI, no import CoreBluetooth for MIDI purposes, and no direct SysEx construction. The app describes what it wants — “get the program list from this device,” “set this parameter” — and MIDI2Kit handles the protocol.

That’s the goal for anyone else building a MIDI app on Apple platforms. You shouldn’t need to know about MUID generation, Mcoded7 encoding, SysEx chunking, BLE warm-up sequences, or Apple’s half-finished MIDI-CI participant. MIDI2Kit knows about those things so you don’t have to.

Build on MIDI2Kit

A clean Swift API for MIDI 2.0, MIDI-CI, and Property Exchange. All the protocol complexity handled, all the device quirks encoded. Open source, MIT licensed.

View on GitHub · Documentation

More in This Series

← Part 5: The Bizarre World of BLE MIDI
All Posts →