Check out the MOD Assistant Beta to find a guitar tone with AI

That is really cool to hear.
Is this bound to this specific project? Meaning, can you also pay for the development time of the “supporting” tools with this grant?


The extension of tones is in scope for the project, @spunktsch, so within certain limitations, yes.

Hope there’s dev money for bassists in there… :grinning:


For bass we will run some experiments, @RashDecisionAudio. The general approach should be applicable as well, but honestly it is hard to say at this point what it will take to get good results.


Absolutely expect more precise results moving forward, @LievenDV; every serious use of the MOD Assistant Beta with feedback contributes!

What sounds like witchcraft is neural feature detection married with the power of the MOD platform. This allows @itskais to have the AI explore a huge number of combinations in a highly automated fashion.
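For anyone curious what that kind of feature-based matching can look like in principle, here is a minimal sketch. It is not the MOD implementation (which has not been published here); it uses classical spectral features as a stand-in for the neural features mentioned above, and it assumes the candidate pedalboards have already been rendered to audio files with hypothetical names.

```python
# Minimal sketch of feature-based tone matching; not the MOD implementation.
# Classical spectral features stand in for the neural features mentioned above,
# and each candidate pedalboard is assumed to be rendered to a WAV file already.
import numpy as np
import librosa

def tone_features(path, sr=44100):
    """Summarise a recording as time-averaged spectral features."""
    y, _ = librosa.load(path, sr=sr, mono=True)
    mel_db = librosa.power_to_db(librosa.feature.melspectrogram(y=y, sr=sr, n_mels=64))
    mfcc = librosa.feature.mfcc(S=mel_db, n_mfcc=20)
    # Average over time so clips of different lengths are comparable.
    return np.concatenate([mel_db.mean(axis=1), mfcc.mean(axis=1)])

def rank_candidates(reference_wav, candidate_wavs):
    """Rank candidate renders by cosine similarity to the reference tone."""
    ref = tone_features(reference_wav)
    scores = []
    for path in candidate_wavs:
        cand = tone_features(path)
        sim = float(np.dot(ref, cand) / (np.linalg.norm(ref) * np.linalg.norm(cand)))
        scores.append((sim, path))
    return sorted(scores, reverse=True)

# Hypothetical usage; file names are placeholders:
# best = rank_candidates("reference_tone.wav", ["render_001.wav", "render_002.wav"])
```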

While there is still a lot of learning ahead, I believe we are on a very interesting and accelerating path.


@Luke101, be assured that most of our development resources go to short-term, effective improvements. Fixing the known timing issues you mentioned is actually in progress and planned for release with MOD OS 1.14, to support the Delay and Live Looping marketing specials we want to run later this year.

Not having projects like the AI Assistant would free up a bit of resources now, while making it harder to grow them moving forward. Besides the essential polishing of the user experience, we need fresh innovation to attract more funds. The 20k€ is a small but important proof point. We won't stop there.


Do you plan to also use the guitar input as part of the training data in some way?

I might be wrong (I haven't actually had time to try it out yet), but from what I saw when visiting the link, it looks like the guitar input is currently ignored.

Some examples are how hot the guitar's output is, its tonal EQ, which pickups are selected, etc.

It may make sense to have two separate stacks: one to try to match the guitar to a reference, and a second to turn the matched reference into the desired sound.

Different guitars and setups have very different tones and may not use the same pedal stack.

For example, I would love to be able to provide a reference track as well as a sample of me (trying to) play the same thing on my guitar, and ask it to match them as closely as possible so that my guitar sounds like the reference.
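To make the first of those two stages concrete, here is a hedged sketch of one classical way it could work: spectrum-matching EQ, where the long-term spectrum of my playing is compared against the reference and turned into a correction curve. This is only an illustration of the idea, not anything the MOD Assistant actually does; the file names and function names are hypothetical.

```python
# Hedged sketch of the "match my guitar to the reference" stage, using
# classical spectrum matching (not a MOD feature; file names are hypothetical).
import numpy as np
import librosa

def average_spectrum_db(path, sr=44100, n_fft=4096):
    """Long-term average magnitude spectrum of a recording, in dB."""
    y, _ = librosa.load(path, sr=sr, mono=True)
    S = np.abs(librosa.stft(y, n_fft=n_fft))
    return librosa.amplitude_to_db(S.mean(axis=1) + 1e-10)

def eq_match_curve(my_guitar_wav, reference_wav, max_boost_db=12.0):
    """dB gain per frequency bin that would move my guitar's spectrum
    toward the reference; clipped to avoid extreme corrections."""
    diff = average_spectrum_db(reference_wav) - average_spectrum_db(my_guitar_wav)
    return np.clip(diff, -max_boost_db, max_boost_db)

# The resulting curve could then seed the second stage: choosing and tuning
# pedals that shape the matched signal into the desired sound.
# curve = eq_match_curve("me_playing_the_riff.wav", "reference_track.wav")
```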


Hi @bcosta. I definitely agree that the guitar (pickup types, pickup position…) is a major factor in the final electric guitar tone you get.
To add to your point, another factor may also be the playing itself: fingerstyle sounds different from alternate picking (and picks themselves deliver a range of attack tones)… And so, two different guitars running through the same pedalboard will produce noticeably different tones.

When we were discussing this, different technical challenges came up around accounting for the guitar's input, the main issue being the amount of training data we would have to generate. However, we're considering ways of overcoming those issues in the future.
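To give a feel for why the volume of training data becomes the main issue, here is an illustrative sketch of rendering a single DI clip through a small grid of effect settings. It uses the open-source `pedalboard` Python library rather than MOD's own toolchain, and the DI file name is hypothetical; even this tiny parameter sweep multiplies quickly, and covering different guitars and pickups would multiply the dataset again.

```python
# Illustrative sketch (not MOD's pipeline) of why training data grows fast:
# rendering one DI guitar clip through a grid of effect settings using the
# open-source `pedalboard` library. The DI file name is hypothetical.
import itertools
import soundfile as sf
from pedalboard import Pedalboard, Distortion, Reverb

di, sr = sf.read("di_guitar_clip.wav")

drive_values = [5.0, 15.0, 25.0]   # dB of drive to sweep
room_sizes = [0.2, 0.5, 0.8]       # reverb room sizes to sweep

for i, (drive, room) in enumerate(itertools.product(drive_values, room_sizes)):
    board = Pedalboard([Distortion(drive_db=drive), Reverb(room_size=room)])
    rendered = board(di, sr)
    sf.write(f"render_{i:03d}.wav", rendered, sr)

# Just 2 effects with 3 values each already yields 9 renders; real pedalboards
# with many plugins and continuous parameters explode combinatorially.
```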

For now, the idea is that the assistant just listens to the tone you want to sound like and points you in a close direction. The generated pedalboard can then be tweaked by the user to apply the final touches and get the exact sound they want.

Thank you for your input!


Thanks for the info. Glad to hear you guys are working on timing and sync. When is this release due? (I've been out of the loop for a while.)