I wonder if UI and back-end dev could be parallelized in some way. As far as I understand, all our tweaks and amendments to any pedalboard ultimately get distilled down to a new pedalboard.ttl file, where all the fx, signal flows, MIDI assignments, etc. are stored.
Would it be possible, alongside the architectural changes to separate the UI and business layer, for people to work on some sort of UI app which works simply by parsing and amending pedalboard.ttl files? Of course this would mean that you couldn't audition live changes (i.e. change a value and hear the result), but it would allow for the creation of new pedalboards which could then be tweaked in audition.
This could be done by reading the pedalboard.ttl files directly from the device. But the question arises: what would be the characteristics of a new interface? I am thinking that the pedalboard is ultimately a graph of nodes (fx) and edges (cables). What would (for example) a mobile UI for something like that look like? A clickable UML diagram? A zoomable map, something like a city with buildings and roads? Or some multiscreen flow diagram where you could swipe between different regions of the pedalboard?
Or would something adapted to UI devices like a PlayStation controller be more effective? I.e. move the joystick to the different fx, select into an fx, and then move the joystick around the available controls (which would not be that much different to the existing MOD UI).
I personally would be interested in collaborating on some kind of pedalboard.ttl parser, if anyone else is.
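To make the "graph of nodes and edges" framing concrete, here is a minimal TypeScript sketch of the kind of structure such a UI app could render. All the type and field names are invented for illustration; the real pedalboard.ttl vocabulary is different.

```typescript
// Pedalboard-as-graph sketch: effects are nodes, cables are edges
// between named ports. Names are illustrative, not the real vocabulary.
interface FxNode {
  id: string;                      // instance name, e.g. "shiroverb_1"
  pluginUri: string;               // LV2 plugin URI
  params: Record<string, number>;  // control values
}

interface Cable {
  from: { node: string; port: string };
  to: { node: string; port: string };
}

interface PedalboardGraph {
  nodes: FxNode[];
  edges: Cable[];
}
```

A zoomable map, a clickable diagram, or a joystick navigator would then just be different renderers over the same structure.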
A TTL parser could be created in the frontend layer, but it would only be useful for visual representation.
If you want to make any change, such as changing a parameter value or adding a plugin, by editing only the .ttl file, it will be necessary to reload the entire pedalboard for each micro change, at least when using mod-host.
The architecture changes will not change the API behavior, so working in parallel is completely plausible.
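To illustrate the distinction: a live tweak is a single small command sent to mod-host over its TCP text protocol, whereas applying an edited .ttl means rebuilding the whole board. A rough Node.js sketch, assuming mod-host's default port 5555 and the param_set command as I recall them from the mod-host README (verify the exact commands and framing there):

```typescript
// Rough sketch: one live parameter change via mod-host's TCP text
// protocol. Port 5555 and the 'param_set <instance> <symbol> <value>'
// command are assumptions to verify against the mod-host README.
import { Socket } from 'net';

const sock = new Socket();
sock.connect(5555, 'localhost', () => {
  sock.write('param_set 0 gain 4.5'); // one small command per tweak
  // Applying an edited .ttl instead would mean removing and re-adding
  // every plugin and connection, i.e. reloading the entire pedalboard.
});
sock.on('data', (buf) => console.log('mod-host:', buf.toString()));
```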
TTL (Turtle, RDF 1.1 Turtle) is a serialization format for Resource Description Framework (RDF, RDF 1.1 Primer) data, both standardized by the World Wide Web Consortium (W3C).
RDF is also used by the LV2 plugin standard to describe plugin metadata such as name, description, ports, etc.
The content (pedalboard description and configuration) uses an RDF vocabulary which is specific to MOD.
When parsing the TTL file I suggest reusing existing functionality like RDF.JS (https://rdf.js.org/) or similar RDF libraries, which exist for all major programming languages, as there are many subtleties in RDF and those libraries handle all the underlying nitty-gritty details.
Any representation of the RDF data in a MOD-specific data structure can then be built on top of that low-level reading and parsing.
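As a concrete starting point, here is a small sketch using N3.js, a Turtle parser for JavaScript/TypeScript that implements the RDF.JS interfaces (the file name is just an example):

```typescript
// Parse a pedalboard.ttl into RDF quads with N3.js (npm install n3).
// A real reader would match the specific predicates of the MOD
// vocabulary instead of dumping every triple.
import { Parser } from 'n3';
import { readFileSync } from 'fs';

const ttl = readFileSync('pedalboard.ttl', 'utf-8');
const quads = new Parser().parse(ttl); // synchronous parse into an array of quads

for (const quad of quads) {
  console.log(quad.subject.value, quad.predicate.value, quad.object.value);
}
```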
I have some local test-changes-for-fun adding client-side console logging of websocket debug info for the Windows MOD app. It doesn't look like I can attach the screen capture to a forum post, however.
Use an image-hosting service like imgur.
(maybe not ideal for the indefinite future, but it saves the forum having to allow the upload and saving of a lot of data)
Maybe I am missing something, but isn't that websocket communication already visible in the Chrome F12 debug tools without any additional code changes? At least I see that data flow on my hardware MDX web UI:
You're absolutely right about that. It's just a first step towards getting a bit of middleware into the websocket API; I might want to create some kind of visualization for it, or even just filter out the ping/pong & stats. A tool to help me understand and document the protocol outside of the code itself.
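For what it's worth, that kind of middleware can live entirely client-side by wrapping the page's WebSocket before the app creates one. A sketch below; the "ping" and "stats" prefixes are only my guess at how those messages look on the wire:

```typescript
// Wrap the native WebSocket so incoming frames can be filtered before
// logging. The 'ping'/'stats' prefixes are guesses at the wire format.
const NativeWebSocket = window.WebSocket;

window.WebSocket = class extends NativeWebSocket {
  constructor(url: string | URL, protocols?: string | string[]) {
    super(url, protocols);
    this.addEventListener('message', (event: MessageEvent) => {
      const text = String(event.data);
      // Skip keep-alive and stats traffic; log everything else.
      if (!text.startsWith('ping') && !text.startsWith('stats')) {
        console.log('[ws]', text);
      }
    });
  }
} as typeof WebSocket;
```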
I was saying that the TTL files would only be used to get the pedalboard state, not to change the pedalboard state.
If you access the pedalboard path, you can see some TTL files. For instance, when I created a pedalboard locally, it created the following files:
The pedalboard data is saved into a TTL file, while snapshots are saved into a .json file.
Regarding the JSON data that you shared: the MOD team probably shares the pedalboard as .json because it is the most common format for sharing over the internet.
Well done for finding this! What I couldn't work out, though, was where the cables are in the JSON. I can see it has an autowah, a shiroverb (an array of fx) and has 35 connections, but I can't work out where it says what connects to what.
Here is my first attempt at a sort of mobile UI for the MOD, using the data @jetztgradnet found. It's an expo.dev "Snack", a prototyping environment for React Native. I think I've come to the conclusion that full pedalboard construction (as in connecting pedals to one another) is not really possible on a mobile phone; however, tweaking the settings of the various pedals in the board should be doable.
It reads a "pedalboard.json", which I guess should be received from the Dwarf. This JSON contains an href to a descriptor for each plugin used, which you can use to handle the plugin parameters.
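For anyone wanting to follow along, the loading flow is roughly the sketch below. The field names (plugins, href, ports) are my guesses at the schema, so check them against the actual pedalboard.json:

```typescript
// Fetch the pedalboard.json, then fetch each plugin's descriptor via
// its href to get the parameter metadata. Field names are guessed.
type PluginRef = { uri: string; href: string };
type Pedalboard = { title: string; plugins: PluginRef[] };

async function loadPedalboard(url: string) {
  const board: Pedalboard = await (await fetch(url)).json();
  for (const plugin of board.plugins) {
    const descriptor = await (await fetch(plugin.href)).json();
    console.log(plugin.uri, descriptor.ports); // plugin parameter/port info
  }
}
```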
Simple but really effective! Hope to see more soon
For me, this small step marks a moment that I think can have a big impact going forward. I think there's a lot of room to experiment with utility-style applications that can interact with a MOD device or application, and potentially provide solutions for many long-standing user wishes. Examples of things I think could work:
Improve the experience of viewing and editing mapped hardware: hand-assign MIDI values, batch edit assignments
Apply global in / out blocks to pedalboards
Better snapshot editing and management: batch update, list management, exclude plugins or parameters, allow for multiple snapshot “groups” per pedalboard
Better audio file management: assign files to hardware, more use of directories
I played around with your demo for a moment and am happy to test and provide technical direction where I can for these efforts. I have some familiarity with the pedalboard API and file formats and can help brainstorm ideas for implementations if you want to try things out.
I'm really looking forward to getting some personal issues here straightened out and getting back to this. I can see that I was just scratching the surface of the entire effects chain, and this adds another layer of possibilities that is absolutely mind-boggling.