I've been playing around with the OP-XY, a music synthesizer and sequencer that just came out.
While uploading new presets to it, I noticed that the presets themselves are just .json files. The fields are fairly self-evident, though the devil is in the details and you'd have to consult the user guide to figure out certain enums.
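To give a sense of what these files look like, here's a rough sketch of the shape of a patch. The field names and values below are my own illustrative guesses, not the official schema, so don't treat them as gospel; the actual structure and the enum values are in the draft spec I link at the end of this post.

```json
{
  "type": "synth",
  "engine": "some-engine-name",
  "envelope": {
    "attack": 0.01,
    "decay": 0.4,
    "sustain": 0.6,
    "release": 0.3
  },
  "filter": {
    "cutoff": 0.7,
    "resonance": 0.2
  },
  "fx": {
    "reverb": 0.25,
    "delay": 0.0
  }
}
```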
I also spoon-fed the official User Guide into NotebookLM and kind of ping-ponged between it and ChatGPT o1 to write a technical spec for the JSON format.
I tested it out by giving it the transcript of this video https://www.youtube.com/watch?v=eIfssnYO6Hs (essentially a tutorial on a specific synth engine within the device) and asking it to come up with a JSON file similar to the patch described around the 1-2 minute mark. When I uploaded the result to my device and played it, it mimicked the bass in the video remarkably closely. That made my jaw drop.
I tried another prompt where I asked it:
```
please create a pluck patch. It should sound a bit like the plush in Kygo's Firestone. Almost like a steel drum a little bit. It should sound very tropical house. Give it to me as JSON for a preset.
```
It produced a valid patch file. It did indeed sound very steel-drum-like and would fit tropical house. It didn't really understand how to get the envelope of a pluck right, though, so it may need some fine-tuning, and it's not clear how much it really understands.
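For what it's worth, the envelope I was hoping for is the classic pluck shape: near-instant attack, short decay, no sustain, so each note rings and dies quickly like a struck steel drum. In the same illustrative (not official) JSON shape as above, that would look roughly like:

```json
{
  "envelope": {
    "attack": 0.0,
    "decay": 0.25,
    "sustain": 0.0,
    "release": 0.15
  }
}
```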
I burned all my ChatGPT o1 quota playing with this, but I'll mess with it more next week. I'm very curious whether you could give it enough information to actually design a whole ensemble of presets that play well together.
If you want to play with this yourself, I saved off the entire draft file specification plus some initial tips on how to sound-design a patch (very crude and needs work). You can copy/paste it into ChatGPT or Claude and experiment with it yourself: https://gist.github.com/kmorrill/898579c18df24ac5094f691cd411d741