Add new Mixco based MIDI controller scripts #1016

Merged
daschuer merged 10 commits into mixxxdj:master from arximboldi:add-mixco-scripts
Sep 22, 2016

Conversation

@arximboldi
Contributor

Hi!

I have upgraded the controller scripts shipped with Mixco [1] to work properly with 2.0 and the current Mixxx master.

I provide three scripts:

  • M-Audio Xponent
  • Korg NanoKontrol 2
  • Novation Twitch

As we discussed on the mailing list a year ago, these scripts are pre-compiled. I did not include the sources, since building them would bring in Node.js as a dependency. To edit the scripts, one needs the Mixco framework. If Node.js and NPM are installed, it is trivial to install:

    $ npm install -g mixco

[1] https://github.com/arximboldi/mixco

@Be-ing
Contributor

Be-ing commented Sep 20, 2016

Thanks for submitting these. I think it would be good to include the original sources with Mixxx, with a link to Mixco's documentation at the top, even though Mixxx's build system won't build them (to avoid pulling in Node.js as a dependency).

Please create pages on the Mixxx wiki documenting the functionality of the mappings. See http://mixxx.org/wiki/doku.php/contributing_mappings#documenting_the_mapping for guidelines on documentation.

@sblaisot
Member

Please add the new files to the Windows uninstaller. See the wiki for instructions; I cannot link to it right now, but they are on the "contributing mappings" page.
Also, if this PR updates an existing mapping, please use the same filenames.

@arximboldi
Contributor Author

I am going to proceed with the suggested changes. I have a couple of questions on some of the comments:

  • @Be-ing part of the benefits of the framework is that it can generate documentation from the sources. Is it ok if I update the wiki to just point to the already existing docs? (e.g. [M-Audio Xponent](https://sinusoid.es/mixco/script/maudio_xponent.mixco.html))
  • @sblaisot I will remove the old M-Audio Xponent mapping since this new mapping is better, but I'd like to keep this naming since it makes upgrading from latest Mixco sources easier.
  • I don't know if you noticed, but I am tagging the name of the scripts so they show in the file chooser as [mixco] Novation Twitch. The idea is that this way there is a mechanism to potentially disambiguate mappings for the same controller that follow different approaches (e.g. for the Korg Nanokontrol 2). It also makes development a bit easier [1]. Are you ok with this?

Thanks!


[1] It is awesome that scripts reload automatically when the JavaScript code changes, but not when the XML changes. This means that when changing only the XML, one has to go to the preferences and re-select the script; it is thus easier for me to quickly spot my scripts on the list. However, the right change here is probably to automatically reload the XML source when it changes. This might be non-trivial work, because when a mapping is loaded, a new one is created where potential user modifications (via the GUI) will be saved. I guess some heuristic could be applied to keep track of the original XML as long as the user-selected mapping is not modified. Or just do the reload in some developer mode... I will look into that later.

It has been replaced by the new Mixco-based script which:

- Does not require the user to hold special keys on initialization for
  the lights to work.

- Uses effects, loop rolls, master sync and a few other recent Mixxx
  features.

- Uses fewer mappings via JavaScript, potentially reducing latency.
  https://sinusoid.es/mixco/script/maudio_xponent.mixco.html
@arximboldi
Contributor Author

I addressed your comments and updated the wiki page:
http://www.mixxx.org/wiki/doku.php/hardware_compatibility?&#community_supported_mappings

@Be-ing
Contributor

Be-ing commented Sep 21, 2016

@Be-ing part of the benefits of the framework is that it can generate documentation from the sources. Is it ok if I update the wiki to just point to the already existing docs? (e.g. https://sinusoid.es/mixco/script/maudio_xponent.mixco.html)

It's okay to point the wiki pages to the Mixco website. However, I think it will be confusing, and perhaps intimidating, for some users to see the code beside the functional documentation. Could you generate pages with just the documentation? If you want to put extra work into Mixco to make it output DokuWiki's format for the Mixxx wiki, that would be nice, but don't worry if you're not up to doing that.

Also, the current Mixco documentation pages have very wide horizontal scrolling on my screen, far past the longest line of code. I have a 1366x768 screen.

@sblaisot I will remove the old M-Audio Xponent mapping since this new mapping is better, but I'd like to keep this naming since it makes upgrading from latest Mixco sources easier.

IMO this is okay. It also makes it obvious which mappings are Mixco-generated. If the old mapping is removed, I think the new mapping should use the same title as the old one in the mapping selection drop-down menu in the Preferences.

I don't know if you noticed, but I am tagging the name of the scripts so they show in the file chooser as [mixco] Novation Twitch. The idea is that this way there is a mechanism to potentially disambiguate mappings for the same controller that follow different approaches (e.g. for the Korg Nanokontrol 2). It also makes development a bit easier [1]. Are you ok with this?

To most users, it doesn't matter that the mapping was generated with Mixco, so I'd rather these not be specially marked. Also having a prefix disrupts the alphabetical ordering of the drop down menu.

If you want to put in the work for automatic reloading of XML files, that would be great. It would sure make developing mappings more convenient.

@Be-ing
Contributor

Be-ing commented Sep 21, 2016

On the Korg NanoKontrol2 documentation:

  • "prehear volume" and "prehear mix" aren't terms used within Mixxx or the Mixxx documentation, which could be a bit confusing for some users. Could you switch these to "PFL" and/or "headphone"? Also, I think it's more accurate to call it headphone "gain" rather than "volume".
  • "The two furthest channel sections (1st and 8th) control the pitch related stuff and effects of the two decks." I don't see any mapping for effects.
  • Map sync buttons to the new sync_enabled CO to make use of master sync.
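A momentary handler along these lines could look like the sketch below. The script object name and MIDI numbers are hypothetical; `engine` is provided by Mixxx but stubbed here so the snippet is self-contained.

```javascript
// Minimal stub of Mixxx's engine API so the handler can be exercised
// outside Mixxx (in Mixxx, `engine` is provided by the host).
var engine = {
    values: {},
    setValue: function (group, control, value) {
        this.values[group + "," + control] = value;
    }
};

var NanoKontrol2 = {};  // hypothetical script object

// Pass both press (1) and release (0) through to sync_enabled so Mixxx
// can distinguish a tap (momentary sync) from a hold (sync lock).
NanoKontrol2.syncButton = function (channel, control, value, status, group) {
    engine.setValue(group, "sync_enabled", value > 0 ? 1 : 0);
};
```

Forwarding the release event matters: mapping only the press would make it impossible for Mixxx to detect a long hold.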

@arximboldi
Contributor Author

Thanks for the quick reply @Be-ing. I will proceed as follows:

  1. Remove the [mixco] from the script names. I agree it is inconsistent and irrelevant for users.
  2. I will fix the Korg Nanokontrol 2 documentation as you suggested.
  3. I will fix the scrolling issues. That appeared recently with an upgrade to pandoc; I might need to tweak the template.

On showing the code in the docs: I have mixed feelings about this. On the one hand, I agree it might be intimidating. On the other hand, it does not get much in the way (since it is in a separate column) and is a way to show users that coding "is not that hard" :-) As an intermediate solution, I will try to hide the code by default and put a button floating on the corner to reveal it. Would that satisfy you?

@Be-ing
Contributor

Be-ing commented Sep 21, 2016

On the Novation Twitch documentation:

  • The link to the programmer guide is broken.
  • "The first two knobs in the Master FX section are mapped to mix and super of the first effect unit." Do you mean the Depth knob and the Mod encoder?
  • typo: "The beats know can be used to change the selected effect."
  • typo: "The scroll encoder scrolles the current view. When pressed it moves faster."
  • Mixxx doesn't use the term "trim", so it would be helpful to clarify that this is the channel gain knob.
  • Is there soft take over on the channel faders? I'd imagine using fader FX would result in unintentional sudden volume changes if not.
  • It seems the cue button LED responds to the cue_default control. Use the cue_indicator control so it matches the skins and respects the cue mode selected by the user in the Preferences.
  • "The buttons 1 to 4 trigger the first four samplers. The sample plays as long as the button is held." I recommend mapping sampler buttons like they are on the Hercules P32 (feel free to use different colors): "Press an unlit pad to load the track selected in the library to that sampler. Pads are blue when the sampler is loaded but not playing and red when playing. Press a blue pad to play the sample from its cue point. Press a red pad to jump back to the sample's cue point. Press a red pad with shift to stop a playing sample. Press a blue pad with shift to eject a sample."
  • "The buttons 7 and 7 perform a stutter effect at different speeds." Do you mean 7 and 8?
  • It would be helpful to specify which loop lengths are in which row of pads for autoloop and loop roll modes.

@Be-ing
Contributor

Be-ing commented Sep 21, 2016

As an intermediate solution, I will try to hide the code by default and put a button floating on the corner to reveal it. Would that satisfy you?

Sure, seems like a good idea.

@Be-ing
Contributor

Be-ing commented Sep 21, 2016

It seems that none of the scripts use the cue_indicator and play_indicator COs for the respective LEDs. Use these to be consistent with the cue mode the user has chosen in the Preferences.
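Wiring the LEDs to these COs could look like the sketch below. The MIDI note numbers and the script object name are hypothetical, and `engine`/`midi` are stubbed so the snippet runs standalone; inside Mixxx both objects already exist.

```javascript
// Stubs of the host-provided APIs for illustration only.
var sent = [];
var midi = {
    sendShortMsg: function (status, note, value) { sent.push([status, note, value]); }
};
var engine = {
    connectControl: function (group, control, callback) { /* registered by Mixxx */ }
};

var SomeScript = {};  // hypothetical script object

// Drive the cue LED from cue_indicator so it follows the cue mode the
// user selected in the Preferences, just like the skins do.
SomeScript.cueLed = function (value, group, control) {
    midi.sendShortMsg(0x90, 0x33, value ? 0x7F : 0x00);  // 0x33: hypothetical cue LED note
};

SomeScript.init = function () {
    engine.connectControl("[Channel1]", "cue_indicator", SomeScript.cueLed);
    engine.connectControl("[Channel1]", "play_indicator", function (value) {
        midi.sendShortMsg(0x90, 0x34, value ? 0x7F : 0x00);  // 0x34: hypothetical play LED note
    });
};
```

The point is that the script never re-implements cue/play blinking logic; it only mirrors the indicator COs onto the hardware.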

@Be-ing
Contributor

Be-ing commented Sep 21, 2016

Regarding the Xponent mapping:

  • On the Mixxx wiki you put "The controller script has been updated such that it does not need special tricks to get the lighting work". Does the Windows ASIO driver still break the lights with this mapping?
  • "16. When in MIDI mode, the touch pad can be used to control the first two parameters of the first effect." What do you mean "when in MIDI mode"? What other mode is there? Is there anything the user has to do to activate MIDI mode?
  • "17. and 18. When in MIDI mode, the touch pad buttons toggle the first effect unit for the left and right channel respectively." Write "decks 1 & 2" instead of "left and right channel". At first I thought this selectively activated the effect unit on each side of the stereo output (which would probably sound awkward).
  • In the Microphone section, I'm not sure what you mean by "second knob" and "second button". Can you refer to the numeric label on the diagram?
  • Typo: "38. Punch-in/transform. While pressed, lets this track be heard overriding the corssfader."
  • "The little arrow buttons do beatjump — jump forward or back by one beat. When shift is pressed, they select the previous or next item of the browser sidebar." It seems a bit odd to put mappings for the library within the deck controls. On the Electrix Tweaker mapping, I mapped the beatjump buttons to forward/back by 4 beats and forward/back by 1 beat with shift pressed. I find it to work pretty well. What do you think about mapping the buttons like that?
  • Missing word: "The numbers set and trigger a loop of 4, 8, 16 and 32 beats respectively. When shift, they set loops of 1/8, 1/2, 1 or beats 2 long."
  • Typo: "When scrach mode is on, it will stop the song when touched on top and control the track play like a vinyl when moved." Also I think it would be clearer to remove the word "play" after "track".
  • Typo: "Otherwise, it can be used to nudge the playing speed up or down to synch the phase of tracks when the track is playing."
  • Typo: "26. Temporarily nudges the pitch down or up. When shift, they do it in a smaller ammount."

@arximboldi
Contributor Author

"16. When in MIDI mode, the touch pad can be used to control the first two parameters of the first effect." What do you mean "when in MIDI mode"? What other mode is there? Is there anything the user has to do to activate MIDI mode?

The controller includes a track pad that can be used as a mouse track pad to control the computer, or as MIDI input. There is a big button that says "MIDI" below the track pad to switch modes. You can see that in the schematics at the top.

For the rest of the wording problems and suggestions of alternative bindings: yes to all! Thanks for the very thorough review, I am in the process of fixing it now.

@Be-ing
Contributor

Be-ing commented Sep 21, 2016

The controller includes a track pad that can be used as a mouse track pad to control the computer, or as MIDI input. There is a big button that says "MIDI" below the track pad to switch modes. You can see that in the schematics at the top.

Okay, it would be helpful to explicitly mention this in the documentation.

@arximboldi
Contributor Author

@Be-ing One thing: you ask about ASIO and the Xponent but I don't have any Windows machine to try it out.

@Be-ing
Contributor

Be-ing commented Sep 21, 2016

Based on this forum thread, I presume the ASIO driver would break the lights regardless of the mapping.

@arximboldi
Contributor Author

Going through your notes...

Is there soft take over on the channel faders? I'd imagine using fader FX would result in unintentional sudden volume changes if not.

I am not using the faders for effects. I find it might be confusing (for reasons like the one you suggest), plus I prefer the fader to have a direct mapping (the MIDI signal maps directly to the control instead of going through the script) because it has a tiny but perceivable effect on latency, which I consider important for faders.

"The buttons 1 to 4 trigger the first four samplers. The sample plays as long as the button is held." I recommend mapping sampler buttons like they are on the Hercules P32 (feel free to use different colors): "Press an unlit pad to load the track selected in the library to that sampler. Pads are blue when the sampler is loaded but not playing and red when playing. Press a blue pad to play the sample from its cue point. Press a red pad to jump back to the sample's cue point. Press a red pad with shift to stop a playing sample. Press a blue pad with shift to eject a sample."

While my brain understands how your suggested behavior is better, I haven't actually used samplers enough to feel it. Do you mind if we leave those changes for a later PR, once I get to use samplers a bit more and get a feel for what's best? This PR is already taking some time...

The rest of the comments I have addressed in a branch in Mixco; I will push something here in a few minutes...

@arximboldi
Contributor Author

arximboldi commented Sep 21, 2016

Ok done.

One last remark: I think I prefer leaving the code visible by default. As I said before, I have mixed feelings. Let's call this an experiment to validate one of two possible hypotheses:

  1. Users are intimidated and dislike or don't use the documentation.
  2. This approach encourages users to take on MIDI scripting and improve the scripts/documentation.

I am torn between believing either is true, so I'd leave it as is and experiment. We can iterate later based on feedback. Would that work for you?

Besides, I think I have addressed all your other comments and hopefully this is ready to merge. Once again thanks for the thorough and insightful review.

@sblaisot
Member

.js source files are a problem because they will be included in the distribution packages (at least on Windows). There is a *.js inclusion. Either move them to a subdir or rename them, please; they should not be shipped to final users.

@Be-ing
Contributor

Be-ing commented Sep 21, 2016

they should not be shipped to final users.

I'm not so sure about this. I think there's value in shipping the original sources. They won't show up in the mapping selection drop down menu, so only users going into their system mapping folder to modify the mapping will know they are there.

@arximboldi
Contributor Author

@sblaisot they are already in a subdir called mixco inside the controllers folder. Do you mean something else?

Otherwise, I agree with @Be-ing on the sources being valuable, but then we should also include the .litcoffee files and the README.md. Admittedly, I don't have a Windows machine and don't care much about it, so I'd be happy to pick the least-effort option 😄

@arximboldi
Contributor Author

Damn! Ok, now I think "pre-hear" is gone 😄

@arximboldi
Contributor Author

arximboldi commented Sep 22, 2016

@Be-ing thanks for the link to your new framework and the pointers to the PRs that are on the way to improving the engine. I have some comments about your framework, but if you'd like I can wait until the design is a bit more stable. Just mention me in any PR whenever you think you would like some feedback on it.

In general, I agree that improving the expressivity of the code is possible with just JavaScript. This is why I built Mixco, and in general I think that syntactically anything can be achieved from JavaScript.

My biggest problem is that the current system relies too much on JavaScript for handling things. IMHO, handling the MIDI signals of performance-critical controls is best done in C++, where the signal path can be optimized to reduce latency. This is true for sample triggering, most faders, jog wheels, etc.

My problem is that Mixxx already provides plenty of COs that can handle these cases, but they cannot be set up dynamically. So if I have a shift-modified version of one control, I need to do all the processing of the events in JavaScript. Instead, I'd rather have an API that allows a script to do something like this:

function handleShiftPress(channel, control, value, status, group) {
    var shiftPressed = value > 0;   
    if (shiftPressed) {
        engine.mapInputControl(0x42, 0x13, "[SomeGroup]", "some_control");
    } else {
        engine.mapInputControl(0x42, 0x13, "[SomeGroup]", "other_control");
    }
}

A symmetric mapOutputControl could be provided.
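A sketch of how that symmetric call might combine with the input re-binding above. None of this API exists in Mixxx today; `engine.mapOutputControl` and the message numbers are pure assumptions, and `engine` is stubbed so the snippet runs standalone.

```javascript
// Stub of the proposed (hypothetical) API: bind a CO's state to an
// outgoing MIDI message entirely in C++; JS only re-routes the binding.
var engine = {
    outputs: {},
    mapOutputControl: function (status, note, group, control) {
        this.outputs[status + "," + note] = group + "," + control;
    }
};

function handleShiftPress(channel, control, value, status, group) {
    var shiftPressed = value > 0;
    // Re-bind the LED at 0x90/0x13 to whichever CO the button now controls,
    // mirroring the mapInputControl calls in the example above.
    engine.mapOutputControl(0x90, 0x13, "[SomeGroup]",
        shiftPressed ? "some_control" : "other_control");
}
```

With this, pressing shift re-routes both the input and the feedback LED once, and no JS runs on the individual button events afterwards.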


Anyway, enough general discussion of the MIDI framework. Is this PR done?

@Be-ing
Contributor

Be-ing commented Sep 22, 2016

Yes, this PR LGTM! 👍

My problem is that Mixxx already provides plenty of COs that can handle these cases, but they cannot be set up dynamically.

I think the solution here is to dynamically register input and output handlers from JavaScript instead of XML, like what you've suggested. For output, the beginning of this is already available via an ugly, undocumented interface (see #656). These handlers would be treated the same in C++ as those registered via XML, but they could be manipulated as JavaScript objects, providing an API similar to the JS library I've started. Getters and setters for different object attributes could be defined in C++ to allow manipulating the JS objects without having to register whole new objects. As a quick mockup:

var someController = {};
someController.init = function () {
    this.leftDeck = new this.Deck(1);
    this.rightDeck = new this.Deck(2);
};

someController.Deck = function (groupNumber) {
    this.group = "[Channel" + groupNumber + "]";
    this.someButton = new Input(0x40 + groupNumber, 0x13, this.group);
    this.shiftButton = new Input(0x40 + groupNumber, 0x14, this.group, this.handleShiftPress);
    this.handleShiftPress(groupNumber, 0, 0, 0, this.group); // initialize unshifted layer
};

someController.Deck.prototype.handleShiftPress = function (channel, control, value, status, group) {
    var shiftPressed = value > 0;
    if (shiftPressed) {
        this.someButton.CO = "some_control";
    } else {
        this.someButton.CO = "other_control";
    }
};

If Mixxx provides input and output handler objects on the C++ side, I think there would still be a role for my JS library for combining a pair of input & output handlers into objects representing the complete functionality of a physical control without having to duplicate the MIDI information (the first two bytes of each input and output message). If we're thoughtful, we could design the JS library's API so the implementation could change transparently behind the scenes when the C++ side is ready.

@arximboldi
Contributor Author

Yes, that is exactly the direction I was looking at. Whether the API is more OO (as you outline) or procedural (as I did), I don't care much, as either style can be wrapped in the other from JS.

Another nice thing about such an API is that it also removes the need to describe MIDI mappings in the XML file altogether. One can have scripts whose init is executed and then sets up whatever connections it sees fit, either direct or via JS. This would remove most of the conceptual complexity of Mixco, as the generation of the XML part could be deprecated. The user could still customize via the XML: we could say that XML bindings "shadow" bindings done via JS, which allows for the use case of a user customizing the mapping via the GUI.

@arximboldi
Contributor Author

Btw, what is the Mixxx process for getting a PR merged? Do we need a certain number of LGTMs? Who has merge rights?

@Be-ing
Contributor

Be-ing commented Sep 22, 2016

@daschuer
Member

Thank you all for the work!

@daschuer daschuer merged commit ae19720 into mixxxdj:master Sep 22, 2016
@Be-ing
Contributor

Be-ing commented Sep 22, 2016

Another nice thing about such an API is that it also removes the need to describe MIDI mappings in the XML file altogether. One can have scripts whose init is executed and then sets up whatever connections it sees fit, either direct or via JS. This would remove most of the conceptual complexity of Mixco, as the generation of the XML part could be deprecated. The user could still customize via the XML: we could say that XML bindings "shadow" bindings done via JS, which allows for the use case of a user customizing the mapping via the GUI.

I would also like to move towards an all-JS mapping option. IMO this would be better than working on automatically reloading XML files or adding functionality to the XML format.

The init and shutdown functions of scripts are implemented in a hacky way on the C++ side. I'd like to see the init functions replaced with a constructor of a Controller JS object. JS has automatic garbage collection, so it doesn't have any concept of a destructor function; the shutdown functions would have to stay.

@daschuer
Member

Here are some thoughts:

A solution for the button-shifting issue should be both easy to use and CPU-saving.

I have just looked up the CPU eaters in a current button-shift solution. Here's a random example:

XoneK2.PlayButton = function (midichannel, control, value, status) {
    var deck = XoneK2.IndexToDeck(control - 24);
    if (!value) return;

    var channel = "[Channel" + deck + "]";

    if (XoneK2.shift_status) {
        engine.setValue(channel, "cue_default", 1.0);
        engine.setValue(channel, "cue_default", 0.0);
    } else {
        engine.setValue(channel, "play", !engine.getValue(channel, "play"));
    }
};

What the CPU has to do:

  • find and evaluate the XML mapping
  • look up and call the script function
  • execute a JS function with string computing (today this should be comparably fast to a C++ equivalent)
  • call setValue and look up the ControlObject

I like the idea of a dynamic mapping because it would be equally fast compared to a direct CO mapping.
It will also work nicely, because it happens in the same thread that uses the lookup table later.

To be honest, I do not fully understand @Be-ing's mockup, but it looks like it still involves the same amount of lookups as today. Is that correct?

Any solution that allows changing the existing (or even new) fields we currently have in XML would be nice.

A feature that allows calling engine.setValue() directly on a CO instance would also be helpful to get around some string computing. Something like getControlProxy() for JS.
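A sketch of how such a getControlProxy() could feel from a script, assuming a hypothetical API (nothing here exists in Mixxx today; `engine` is stubbed so the snippet runs standalone): resolve the ControlObject once, then get/set it without per-event string computing.

```javascript
// Toy stub of the proposed engine.getControlProxy() (hypothetical API).
var engine = {
    _cos: {},
    getControlProxy: function (group, control) {
        var key = group + "," + control;
        if (!this._cos[key]) {
            // In Mixxx this would hold a pointer to the real ControlObject.
            this._cos[key] = {
                value: 0,
                get: function () { return this.value; },
                set: function (v) { this.value = v; }
            };
        }
        return this._cos[key];
    }
};

// Resolve once at init time...
var play = engine.getControlProxy("[Channel1]", "play");
// ...then the hot path touches the proxy directly, with no string lookups.
play.set(1);
```

Repeated calls with the same group/control would hand back the same cached proxy, so the string concatenation cost is paid once at init instead of on every MIDI event.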

We may also consider moving the shift feature to the XML mapping itself. This would finally allow mapping two controls to a MIDI message depending on the state of one of a set of shift COs. (Mixxx already has a [Controls],touch_shift control that can be used to right-click by touch.)

@daschuer
Member

JS has automatic garbage collection, so it doesn't have any concept of a destructor function; the shutdown functions would have to stay.

Not sure if this helps:
https://snippetrepo.com/snippets/c-like-disposable-in-javascript-with-underscore

@Be-ing
Contributor

Be-ing commented Sep 22, 2016

We may also consider moving the shift feature to the XML mapping itself. This would finally allow mapping two controls to a MIDI message depending on the state of one of a set of shift COs.

IMO this is the wrong direction. XML is not a scripting language and should not be hacked into one. We already have an issue with this in the skins. In working on PR #940 I ran into this problem. Handling the complex interaction of multiple conditions in XML is very cumbersome, confusing, and results in lots of code duplication that is difficult to maintain.

@Be-ing
Contributor

Be-ing commented Sep 22, 2016

To be honest, I do not fully understand @Be-ing's mockup, but it looks like it still involves the same amount of lookups as today. Is that correct?

No. There would be neither JavaScript execution nor string manipulation when the button is pressed or released, only when the script initializes or the shift button is pressed.

@Be-ing
Contributor

Be-ing commented Sep 22, 2016

The new Input/Output handler objects would only execute JavaScript when their constructor is passed a function object, or when they are later set to execute a JS function after initialization. If the constructor is passed a string, or their CO property is set to a string, Mixxx would manipulate the CO in C++ upon MIDI input. This would remove the need to execute JS on each MIDI input received in most cases.
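The string-vs-function dispatch described above can be sketched with a toy constructor. The `Input` object here is a stand-in for the proposed C++-backed handler; everything about it is an assumption for illustration.

```javascript
// Toy model of the proposed Input handler (hypothetical API).
// A string CO means Mixxx handles the event natively in C++;
// a function means a JS callback runs on each event.
function Input(status, note, group, target) {
    this.status = status;
    this.note = note;
    this.group = group;
    this.CO = target;  // string (C++ path) or function (JS path)
}

Input.prototype.handle = function (value) {
    if (typeof this.CO === "function") {
        return "js:" + this.CO(value);           // JS callback executed
    }
    return "c++:" + this.group + "," + this.CO;  // handled natively, no JS
};

var plainButton = new Input(0x90, 0x0B, "[Channel1]", "play");
var customPad = new Input(0x90, 0x0C, "[Channel1]", function (v) { return v * 2; });
```

Reassigning `plainButton.CO` at runtime (e.g. from a shift handler) would swap the native binding without any per-event JS cost.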

@Be-ing
Contributor

Be-ing commented Sep 22, 2016

I have updated the wiki pages:
http://mixxx.org/wiki/doku.php/m-audio_xponent
http://mixxx.org/wiki/doku.php/korg_nanokontrol_2
http://mixxx.org/wiki/doku.php/novation_twitch

and posted on the forum notifying users of the new mappings.

@Be-ing
Contributor

Be-ing commented Sep 22, 2016

To be honest, I do not fully understand @Be-ing's mockup

I have expanded the mockup a little bit. What do you not understand about it? If you're confused about the language, you may want to read the MDN article about how this works (you can ignore the sections about arrow functions and getters/setters, as Qt 4.8's JS engine doesn't support them).

@daschuer
Member

We may also consider moving the shift feature to the XML mapping itself. This would finally allow mapping two controls to a MIDI message depending on the state of one of a set of shift COs.

IMO this is the wrong direction. XML is not a scripting language and should not be hacked into one. We already have an issue with this in the skins. In working on PR #940 I ran into this problem. Handling the complex interaction of multiple conditions in XML is very cumbersome, confusing, and results in lots of code duplication that is difficult to maintain.

Yes, sure. I should have written:
"We may also consider introducing a Control-object-based shift feature. This would move the shift feature into the C++ mapping structures and would finally allow mapping two controls to a MIDI message depending on the state of one of a set of shift COs."

XML is only one interface for manipulating these structures; JS would be another.
This would also allow "learning" a shifted button. It is quite similar to the built-in shift function some controllers have, where a button sends different messages depending on an unmappable shift knob.
We have a similar concept for the left- and right-click COs in the skin files.

@Be-ing
Contributor

Be-ing commented Sep 22, 2016

Could you provide a JS mockup of how such an interface would work?

@arximboldi
Contributor Author

Thanks for the merge! It was a thorough and good review, very helpful indeed 😄


On the topic of evolving the controller API: I agree that having shift (or, more generally, modes) embedded in the engine could be useful, but I think the advantage is mostly in making this editable via the GUI. From the POV of the JavaScript developer, it is easier to build such an abstraction in the language directly, where it is easier to build composition into the system (e.g. Mixco value expressions, Ableton scripts' Layers, the Hercules P32's modes and layers, etc.). But these are two opposing views: from the POV of the GUI you want to be as concrete as possible to support good defaults and ergonomic workflows (e.g. only support "shift" buttons), but from the scripting POV you want full flexibility (with expressive, composable abstractions)...

I digress. My point, I guess, is that I consider it more important to add the possibility of creating dynamic mappings from the script. "Shift" in the engine is interesting to explore, but I would do it later, since adding "special cases" to the engine might make it more complicated and make generality harder to add after the fact.

@Be-ing
Contributor

Be-ing commented Sep 22, 2016

My point, I guess, is that I consider it more important to add the possibility of creating dynamic mappings from the script.

Likewise. I think a well-designed JS API wouldn't be much more complicated than a well-designed GUI but would be much more flexible.

@daschuer
Member

daschuer commented Sep 23, 2016

Here are some fragments of my incomplete ideas:

  • Fully expose MidiInputMapping and MidiOutputMapping to JS.
    I think they have to become QObjects and all members have to become Qt properties.
  • Allow adding, removing and editing inputMappings and outputMappings from JS.
  • Extend MidiInputMapping with caching features for ControlProxies and compiled JS delegates.
  • Extend MidiInputMapping with one or n bool modifiers changeable by a mappable ControlProxy.
  • Extend MidiInputMapping with a mapping option for each modifier state.

During the implementation, we have to be sure that we provide a smooth transition when a modifier is changed. For sliders, soft takeover should work; for buttons, we have to make sure that a release event is passed to the pressed ControlProxy even if the modifier changes during the press.

JS fragments:

var inputMapping = engine.getOrCreateInputMapping(0x42, 0x13);
inputMapping.options.softTakeover = true;
inputMapping.setModifier("[Controls]", "shift");
inputMapping.mappedControlProxy = engine.getOrCreateControlProxy("[Channel1]", "play");
inputMapping.callbackModified = doWhatYouWant1;
inputMapping.callbackUnmodified = doWhatYouWant2;

@Be-ing
Contributor

Be-ing commented Sep 23, 2016

My concern with that approach is how it could handle more than a single simple boolean modifier. What if a pad grid has an autoloop and a manual loop layer, and both have different behaviors when shift is pressed? What if there is also a sample layer for the same pads? Or what if a jog wheel behaves differently when shift is pressed, when a scratch/vinyl mode button is activated, and when both are activated? How could a GUI present such complexities in a manageable way?

Now that I see your JS mockup, I realize the approaches are not incompatible. As long as mappings can be modified through JavaScript, script authors can take advantage of that. The GUI could be extended to use a Traktor-like modifier system without interfering with that. I'm not sure this would really be worth the effort, though, if it can only handle simple cases. I'd rather not see Mixxx's mapping GUI turn into something like Traktor, where users are essentially programming with lots of tedious mouse clicks and little ability to organize a complex system that consists of a huge table of mappings with a lot of redundancy. There comes a point when it's easier to write code than to operate a complicated GUI.

I agree with @arximboldi that if a modifier system is added for the GUI, it would be better to focus on implementing dynamic mappings in JS first then add the modifier system later.

@arximboldi
Contributor Author

I'd rather not see Mixxx's mapping GUI turn into something like Traktor where users are essentially programming with lots of tedious mouse clicks and little ability to organize a complex system that consists of a huge table of mappings with a lot of redundancy. There comes a point when it's easier to write code than make a complicated GUI.

Just throwing out ideas: Bitwig Studio includes a JavaScript/CoffeeScript editor and console accessible from the preferences UI. This is another path to making casual edits and experimentation easy for users, where code is simpler than widget forms. There could be an "Edit code" button in the preferences next to the script chooser that replaces the preferences window with an editor. Below the editor there could be a minimalistic JavaScript REPL, so one can inspect and play with the state of the running script for debugging and API exploration.

In the mid to long term, JavaScript literacy is increasing among technology aficionados, especially given the efforts of various institutions to meet the growing demand for web-technology skills. There are many ways in which Mixxx could make it easier to access, understand, and modify its scripts, addressing the needs of this growing population.

@Be-ing
Contributor

Be-ing commented Sep 23, 2016

Thanks, I was not aware of Bitwig's mapping system. I see plenty of Bitwig mappings online, but I can't find any documentation for the Bitwig mapping API...

@daschuer
Member

My concern with that approach is how it could handle more than a single simple boolean modifier. ...

Hey, come on, this is a first draft and can be improved in any direction, and as you noticed, it is already fully scriptable :-D

The idea is that you can map more than one ControlProxy or JS callback to any message.
Switching between them via one or more modifier COs allows switching on any event you like: GUI, keyboard, or a second controller script. You can also write the modifier CO from an additional JS function, like you did in your example, and change the callbacks on the fly.
On the other hand, a simple two-feature button, like a GUI left/right-click button, is very easy to edit with a mapping wizard.

How could a GUI present such complexities in a manageable way?

I think the GUI should cover only simple two-feature buttons, not more, similar to what we can do in a skin: pick a modifier control, pick the unmodified mapping, pick the modified mapping, done. This should cover, let's say, 90% of all cases.
If we need more, we can provide a library of common script functions that can be mapped, or better, a CO interface to the complex feature implemented in C++. This would also allow the feature to be accessed from the GUI.
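The "pick a modifier, pick two mappings" flow can be sketched in plain JS, with the engine side mocked out; the mapping shape and names here are illustrative only:

```javascript
// A minimal sketch of the two-mapping-per-button model:
// one modifier control selects between an unmodified and a modified target.
function makeMapping(modifierGetter, unmodified, modified) {
  return {
    handle: function (value) {
      var target = modifierGetter() ? modified : unmodified;
      target(value);
    }
  };
}

var log = [];            // records which action fired, for illustration
var shiftPressed = false;

var mapping = makeMapping(
  function () { return shiftPressed; },
  function (v) { log.push(["play", v]); },        // unmodified action
  function (v) { log.push(["cue_default", v]); }  // modified action
);

mapping.handle(1);   // fires the unmodified "play" action
shiftPressed = true;
mapping.handle(1);   // fires the modified "cue_default" action
```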

We can also already write JS functions inline in the XML file ...

Currently we have the issue that some scripts provide features, like break effects, that are only accessible from the controller. It would be great if we found a way to access them from the GUI as well.

I agree with @arximboldi that if a modifier system is added for the GUI, it would be better to focus on implementing dynamic mappings in JS first then add the modifier system later.

I am afraid this will be the harder approach. I do not know how a dynamically typed JS mapping object would be processed in the C++ domain. Does Qt allow this? For me it would be much easier to go the other way: define a fixed-type mapping object in C++ that is fully extensible via JS callbacks.

@Be-ing
Contributor

Be-ing commented Sep 23, 2016

I am afraid this will be the harder approach. I do not know how a dynamically typed JS mapping object would be processed in the C++ domain. Does Qt allow this? For me it would be much easier to go the other way: define a fixed-type mapping object in C++ that is fully extensible via JS callbacks.

I'm not sure what you mean by "dynamic typed JS mapping object". By "dynamic mapping", I mean that the properties (MIDI signals, group, and ControlObject or JS callback) can be modified by scripts after initialization.

QtScript does do a lot of automatic type conversion between JS and C++ by the way.
http://doc.qt.io/qt-4.8/scripting.html

@daschuer
Member

In JS, you can define an object by initializing it. For example,
this.someButton = new Input(0x40 + groupNumber, 0x13, this.group);
from your example will add a new member "someButton" to the this object.
An object's type can change at runtime in JS.

You can also say:
var i = 0;      // i now holds a number
i = "Hallo";    // i is reassigned to a string at runtime

This is not (or hardly) possible in C++, where the class has to be fully known at compile time.

If you pass a QObject to the script, it becomes a JS object with a reference to the QObject, and all Qt properties can be accessed from JS. You can also add new members from JS, but IMHO they can't be seen from C++.

This means we need to decide on the new mapping C++ class first and cannot start with a pure JS solution.
