Native support for PipeWire or alternatively Jack #17

Open
szszoke opened this issue Jun 17, 2023 · 6 comments
Labels: enhancement (New feature or request)

szszoke commented Jun 17, 2023

Hello!

I just discovered your project after screen-sharing a virtual keyboard application during my online music lesson.

I was already using OBS to bring everything together, and I think that the MIDI Jar overlay with an OBS Browser Source will be a much more elegant solution going forward.

The application has a built-in MIDI router. This is useful on operating systems like Windows where the MIDI routing capabilities are limited or non-existent by default.

On Linux, with PipeWire or Jack (with some tweaking), this sort of internal routing implemented by applications becomes redundant because there already is a node-based virtual patchbay where MIDI inputs/outputs can be routed to one or many outputs/inputs.

The few graphical applications that exist to manage the virtual patchbay look very similar to the routing page of MIDI Jar.

Each application that is aware of PipeWire/Jack registers itself as a node, with all the necessary ports.

In the case of MIDI Jar, the application would register itself as a node with three MIDI ports: one for the chord display in the app, one for the debugger, and another for the overlay chord display.

Routing would be handled outside of the application, with whatever tool the user already uses to route their other MIDI/audio devices and programs between each other.

By disabling the internal router and relying on the audio server, MIDI Jar could integrate better with the underlying system.

Is there any interest in this?

szszoke added the enhancement label on Jun 17, 2023
ArTiSTiX (Contributor) commented Jun 17, 2023

Hello @szszoke!

Thanks for your feedback.
I don't know PipeWire or Jack; I have only heard of those solutions.

However, MIDI Jar (a standalone Electron app using Node.js) was mainly developed to fill the lack of a Windows solution for routing MIDI (and for using MIDI in multiple apps at the same time, filling the gap loopMIDI leaves for routing messages from hardware devices). This is not an ideal solution (JS is not the best fit for realtime), but it was my quickest way to provide UI and MIDI for OBS.

I understand that with the node approach this routing seems redundant, but I don't see the point of exposing modules to PipeWire: the current routing solution is a visual alternative to the "Select Device" dropdown that most software has (VSTi/DAW).
What I could dig into is maybe providing some dedicated PipeWire ins and outs for Linux users.

But it would not work for macOS and Windows.

I have some plans to make MIDI Jar a more extensible solution with more advanced routing anyway, one that can have multiple modules instead of only one of each (multiple Keyboard Displays, multiple Chord Displays, and a View manager).
So this is going to become more and more redundant. But with a complete rewrite, you would be adding modules yourself (an Input Device, an Output Device, a Chord Display module, a Keyboard Display module, and, if possible, a PipeWire Input or PipeWire Output module).

I would be interested in a standard solution like a cross-platform PipeWire though, but currently my focus is Windows; macOS and Linux are bonuses.

ArTiSTiX (Contributor) commented Jun 17, 2023

Do you know of any PipeWire library for Node.js?

I see there is @kakxem/node-pipewire, but I don't see any example of MIDI, or of how to consume and send anything to PW ports from a Node.js environment.
And I will not create a library for a system I don't use.

szszoke (Author) commented Jun 17, 2023

> I would be interested in a standard solution like a cross-platform PipeWire though, but currently my focus is Windows; macOS and Linux are bonuses.

In an ideal world, something like PipeWire would have first-class support for Windows and macOS. Unfortunately, I don't think that will happen any time soon.

The way it works on Linux is that PipeWire is positioned to be the audio/video/MIDI server, and no userspace application talks to the hardware directly. Instead, applications expose themselves to PipeWire and have audio/video/MIDI routed to them either automatically (like the audio output of a browser going to the main audio output) or manually (a MIDI keyboard to a standalone synthesizer app's MIDI input and to the input of MIDI Jar; the output of the synth app could then be routed to an audio interface for some hardware effect processing, and then back into a recording application or DAW input).

It has gotten to the point where the system stays out of your way and runs well on "auto pilot", but if you want to take the output of a browser tab where you're playing a YouTube video and feed it into the microphone input of another browser tab where you're having an online meeting, it is as simple as dragging a "cable" between two points (sort of like the ROUTER page).

> What I could dig into is maybe providing some dedicated PipeWire ins and outs for Linux users.

I think that could be an approach, and maybe you could reuse what you learn in your upcoming modular system.

ArTiSTiX (Contributor) commented Jun 17, 2023

OK, thanks for the additional explanation.

Anyway, I currently can't find a JS library to do this, and development of MIDI Jar V2 should start this summer (as soon as I can publish 1.4.0 and fix the issues I currently have with ALSA by rewriting my internal MIDI routing/handling).

szszoke (Author) commented Jun 17, 2023

I will look at the library that you linked. I think it should be possible to use it to talk to PipeWire. My hunch is that you didn't see anything MIDI-related on the project page because the project is just a bridge, and you are supposed to use it according to the PipeWire docs.

Edit: it looks like the library is not capable of creating nodes/ports.

I don't know how much sense it makes to bring the whole PipeWire API into JavaScript. The current use-case would be to expose MIDI ports via PipeWire and, when MIDI messages arrive, send them to MIDI Jar.

Maybe this could be a whole separate executable that would talk to MIDI Jar via some sort of IPC or sockets?

Edit:
What if MIDI Jar exposed an HTTP/WebSocket endpoint where MIDI messages could be injected? Then somebody could write a small C program that sits between PipeWire and MIDI Jar and forwards everything via HTTP/WebSockets.

MIDI Jar would need an option to disable/hide the router, and support for launching an additional executable.
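
To make the idea concrete, here is a minimal sketch of what the injection side of such a bridge could look like, using a plain local TCP socket instead of HTTP/WebSockets. The port number (5161) and the one-byte length-prefix framing are hypothetical, not anything MIDI Jar currently exposes:

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

#include <cstdint>
#include <vector>

// Frame a single MIDI message as [length][bytes] and send it onwards.
// The framing is hypothetical; regular MIDI messages fit in one length byte.
static bool send_midi(int fd, const std::vector<std::uint8_t> &msg)
{
  const std::uint8_t len = static_cast<std::uint8_t>(msg.size());
  return send(fd, &len, 1, 0) == 1 &&
         send(fd, msg.data(), msg.size(), 0) == static_cast<ssize_t>(msg.size());
}

int main()
{
  const int fd = socket(AF_INET, SOCK_STREAM, 0);
  if (fd < 0)
    return 1;

  sockaddr_in addr{};
  addr.sin_family = AF_INET;
  addr.sin_port = htons(5161); // hypothetical MIDI Jar injection port
  inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);

  if (connect(fd, reinterpret_cast<const sockaddr *>(&addr), sizeof(addr)) < 0)
    return 1;

  send_midi(fd, {0x90, 0x3C, 0x64}); // Note On, middle C, velocity 100
  close(fd);
  return 0;
}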

szszoke (Author) commented Jun 18, 2023

I explored the path where an external application handles the PipeWire side and forwards the messages to a parent application. I have a working proof of concept.

Here is my PipeWire client:
#ifndef PW_MIDI_INPUT_H_
#define PW_MIDI_INPUT_H_

#include <memory>
#include <span>
#include <cstdint>
#include <functional>

#include <signal.h>

#include <spa/control/control.h>
#include <spa/param/audio/format-utils.h>

#include <pipewire/pipewire.h>
#include <pipewire/filter.h>

// A PipeWire filter node ("MIDI Server") that exposes MIDI input ports and
// dispatches incoming messages to the pure virtual handlers below.
class pw_midi_input
{
public:
  pw_midi_input() : _main_loop(nullptr, pw_main_loop_destroy),
                    _filter(nullptr, pw_filter_destroy),
                    _piano_roll_port(nullptr),
                    _debugger_port(nullptr),
                    _filter_events{
                        .version = PW_VERSION_FILTER_EVENTS,
                        .process = pw_midi_input::on_process}
  {
    _main_loop.reset(pw_main_loop_new(nullptr));

    // Quit the main loop cleanly on Ctrl+C or termination.
    pw_loop_add_signal(
        pw_main_loop_get_loop(_main_loop.get()),
        SIGINT,
        pw_midi_input::quit_main_loop,
        this);

    pw_loop_add_signal(
        pw_main_loop_get_loop(_main_loop.get()),
        SIGTERM,
        pw_midi_input::quit_main_loop,
        this);

    _filter.reset(pw_filter_new_simple(
        pw_main_loop_get_loop(_main_loop.get()),
        "MIDI Server",
        pw_properties_new(
            PW_KEY_MEDIA_TYPE, "Midi",
            PW_KEY_MEDIA_CATEGORY, "Capture",
            PW_KEY_MEDIA_ROLE, "DSP",
            nullptr),
        &_filter_events,
        this));
  }

  // Virtual destructor, since the class is meant to be derived from.
  virtual ~pw_midi_input() = default;

  // Register the two MIDI input ports. The per-port data is only used as an
  // opaque handle for pw_filter_dequeue_buffer()/pw_filter_queue_buffer().
  void createPorts()
  {
    _piano_roll_port = static_cast<port *>(pw_filter_add_port(
        _filter.get(),
        PW_DIRECTION_INPUT,
        PW_FILTER_PORT_FLAG_MAP_BUFFERS,
        sizeof(pw_midi_input),
        pw_properties_new(
            PW_KEY_FORMAT_DSP, "8 bit raw midi",
            PW_KEY_PORT_NAME, "Piano Roll",
            nullptr),
        nullptr,
        0));

    _debugger_port = static_cast<port *>(pw_filter_add_port(
        _filter.get(),
        PW_DIRECTION_INPUT,
        PW_FILTER_PORT_FLAG_MAP_BUFFERS,
        sizeof(pw_midi_input),
        pw_properties_new(
            PW_KEY_FORMAT_DSP, "8 bit raw midi",
            PW_KEY_PORT_NAME, "Debugger",
            nullptr),
        nullptr,
        0));
  }

  int connectAndRun()
  {
    if (pw_filter_connect(_filter.get(), PW_FILTER_FLAG_RT_PROCESS, nullptr, 0) < 0)
    {
      return -1;
    }

    pw_main_loop_run(_main_loop.get());

    return 0;
  }

private:
  std::unique_ptr<struct pw_main_loop, void (*)(struct pw_main_loop *)> _main_loop;
  std::unique_ptr<struct pw_filter, void (*)(struct pw_filter *)> _filter;

  struct port *_piano_roll_port;
  struct port *_debugger_port;

  const struct pw_filter_events _filter_events;

  // Called from the real-time thread with the raw bytes of one MIDI message.
  virtual void process_piano_roll_messages(std::span<uint8_t> messages) = 0;

  virtual void process_debugger_messages(std::span<uint8_t> messages) = 0;

  static void quit_main_loop(void *selfPtr, int)
  {
    auto self = static_cast<pw_midi_input *>(selfPtr);

    pw_main_loop_quit(self->_main_loop.get());
  }

  // Real-time process callback: drain queued buffers on both ports.
  static void on_process(void *selfPtr, struct spa_io_position *position)
  {
    auto self = static_cast<pw_midi_input *>(selfPtr);

    process_for_port(self->_piano_roll_port, [=](std::span<uint8_t> data)
                     { self->process_piano_roll_messages(data); });

    process_for_port(self->_debugger_port, [=](std::span<uint8_t> data)
                     { self->process_debugger_messages(data); });
  }

  using on_process_data_func = void(std::span<std::uint8_t>);

  // Dequeue one buffer from the port, parse the SPA POD sequence it contains,
  // and invoke the callback for every MIDI control found.
  static void process_for_port(struct port *port, std::function<on_process_data_func> on_process_data)
  {
    auto pw_buffer = pw_filter_dequeue_buffer(port);

    if (pw_buffer != nullptr)
    {
      auto spa_buffer = pw_buffer->buffer;

      auto spa_data = &spa_buffer->datas[0];

      if (spa_data->data != nullptr)
      {
        auto pod = static_cast<spa_pod *>(spa_pod_from_data(
            spa_data->data,
            spa_data->maxsize,
            spa_data->chunk->offset,
            spa_data->chunk->size));

        if (pod != nullptr && spa_pod_is_sequence(pod))
        {
          // Each control in the sequence can carry one raw MIDI message.
          struct spa_pod_control *pod_control;
          SPA_POD_SEQUENCE_FOREACH((struct spa_pod_sequence *)pod, pod_control)
          {
            if (pod_control->type != SPA_CONTROL_Midi)
            {
              continue;
            }

            auto value = static_cast<std::uint8_t *>(SPA_POD_BODY(&pod_control->value));
            auto size = SPA_POD_BODY_SIZE(&pod_control->value);

            on_process_data(std::span{value, size});
          }
        }
      }

      // Always requeue the buffer, even when it carried no MIDI sequence;
      // otherwise the filter eventually runs out of buffers.
      pw_filter_queue_buffer(port, pw_buffer);
    }
  }
};

#endif /* PW_MIDI_INPUT_H_ */

The idea is that you can inherit from pw_midi_input and implement the two virtual functions (process_piano_roll_messages and process_debugger_messages), which will be called when MIDI messages arrive on the Piano Roll or Debugger ports.
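
For a quick smoke test without any IPC, a subclass can be as small as the sketch below; the class name and stdout logging are just for illustration (and printing from the real-time callback is only acceptable for an experiment, not a real client):

#include <cstdio>

#include "pw_midi_input.h"

class stdout_midi_input : public pw_midi_input
{
  static void dump(const char *port, std::span<std::uint8_t> messages)
  {
    std::printf("%s:", port);
    for (auto byte : messages)
      std::printf(" %02x", static_cast<unsigned>(byte));
    std::printf("\n");
  }

  void process_piano_roll_messages(std::span<std::uint8_t> messages) override
  {
    dump("piano roll", messages);
  }

  void process_debugger_messages(std::span<std::uint8_t> messages) override
  {
    dump("debugger", messages);
  }
};

int main(int argc, char *argv[])
{
  pw_init(&argc, &argv);

  stdout_midi_input input;
  input.createPorts();

  return input.connectAndRun();
}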

My actual application is doing just that, but it also implements a gRPC client which is used to forward the MIDI messages to a server written in C#.

Application source code:
#include <iostream>

#include <grpcpp/grpcpp.h>

#include "midi_input.grpc.pb.h"
#include "pw_midi_input.h"

class MidiInputClient : public pw_midi_input
{
public:
  MidiInputClient(std::shared_ptr<grpc::Channel> channel) : _stub(midi_input::MidiInput::NewStub(channel)) {}

  // Blocking unary RPC; in this proof of concept it runs on the PipeWire
  // real-time thread, which is fine for a demo but not for production.
  void forwardMessage(const midi_input::MidiMessage_Target target, const std::string &value)
  {
    midi_input::MidiMessage request;

    request.set_target(target);
    request.set_value(value);

    grpc::ClientContext context;

    google::protobuf::Empty reply;

    auto status = _stub->ForwardMessage(&context, request, &reply);

    if (!status.ok())
    {
      std::cerr << status.error_code() << ": " << status.error_message() << std::endl;
    }
  }

private:
  std::unique_ptr<midi_input::MidiInput::Stub> _stub;

  void process_piano_roll_messages(std::span<std::uint8_t> messages) override
  {
    forwardMessage(
        midi_input::MidiMessage_Target::MidiMessage_Target_PianoRoll,
        {messages.begin(), messages.end()});
  }

  void process_debugger_messages(std::span<std::uint8_t> messages) override
  {
    forwardMessage(
        midi_input::MidiMessage_Target::MidiMessage_Target_Debugger,
        {messages.begin(), messages.end()});
  }
};

int main(int argc, char *argv[])
{
  pw_init(&argc, &argv);

  MidiInputClient client(
      grpc::CreateChannel("localhost:5160", grpc::InsecureChannelCredentials()));

  client.createPorts();

  return client.connectAndRun();
}
Protobuf file for the MIDI Input service:
syntax = "proto3";

import "google/protobuf/empty.proto";

package midi_input;

option csharp_namespace = "MidiServer.Protobuf";

message MidiMessage {
  enum Target {
    PianoRoll = 0;
    Debugger = 1;
  }

  Target target = 1;

  bytes value = 2;
}

service MidiInput {
  rpc ForwardMessage(MidiMessage) returns (google.protobuf.Empty) {}
}
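
The server I actually wrote is in C#, not shown here; purely for reference, an illustrative equivalent in C++ with grpcpp would only be a few lines (the logging body is a placeholder for whatever the host application does with the bytes, and the listening address matches the localhost:5160 channel used by the client above):

#include <iostream>
#include <memory>

#include <grpcpp/grpcpp.h>

#include "midi_input.grpc.pb.h"

class MidiInputServiceImpl final : public midi_input::MidiInput::Service
{
  grpc::Status ForwardMessage(grpc::ServerContext *,
                              const midi_input::MidiMessage *request,
                              google::protobuf::Empty *) override
  {
    // A real server would hand the payload to the router / chord display;
    // here we just log the target port and the number of MIDI bytes.
    std::cout << "target=" << midi_input::MidiMessage::Target_Name(request->target())
              << " bytes=" << request->value().size() << std::endl;
    return grpc::Status::OK;
  }
};

int main()
{
  MidiInputServiceImpl service;

  grpc::ServerBuilder builder;
  builder.AddListeningPort("0.0.0.0:5160", grpc::InsecureServerCredentials());
  builder.RegisterService(&service);

  std::unique_ptr<grpc::Server> server = builder.BuildAndStart();
  server->Wait();
  return 0;
}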

On the bottom left you can see the C# application that implements the server for the MidiInput gRPC service, and on the right is the C++ application that connects to PipeWire and implements the client for the MidiInput gRPC service.

There is also a virtual MIDI keyboard application that outputs via the Midi Through port, which is connected to the Piano Roll port of the MIDI Server node.

The MIDI Server node is in fact the C++ application that is running in the bottom-right terminal.
[screenshot of the setup described above]
