
DrumBot: Real-Time ML Drummer - mariuz
https://magenta.tensorflow.org/drumbot
======
AndrewUnmuted
Some of my thoughts on this, as a fellow researcher in the space...

1) Who selected the particular drum samples to use, and what selection
methods were employed?

2) Can we assume that only Western approaches to time and rhythmic syncopation
will be applied by this "real-time ML drummer"?

3) What about gestural performances, and non-standard approaches to timing?
Can this detect and reconcile those kinds of use cases?

4) What does it consider to be "musical" input? Can non-musical input be used?
If so, is the accompaniment what we would expect it to be?

------
atum47
https://drumbot.glitch.me/ seems to be offline. =(

~~~
Ulagu
I think we killed it with traffic.

~~~
atum47
or... it's just a glitch!

Sorry, I'll show myself out.

------
riskycodes
This is awesome. Really looking forward to digging into the source code and
finding out about export / MIDI options.

------
n3k5
TL;DR: If drumbot.glitch.me doesn't work for you, try running it locally [2].
It's surprisingly effortless!

First attempt at trying this out: 502 Bad Gateway.

Second attempt: I thought I didn't want to mess with Node, so I downloaded
Magenta Studio [0] instead. When trying to load the Max patch, my potato
promptly lost its will to Live. (Ba-dum tss [1].) Maybe loading a 900MB .amxd
when you only have 1GB of free RAM isn't a great idea? Anyway, I'm not keen on
risking another reboot today, so I'm putting this off for now.

Third attempt: Localhosting DrumBot [2]. Five minutes ago, I didn't have Node
installed (I didn't even recall ever touching Node directly at all), but now
(npm install … added 247 packages from 284 contributors … found 4 moderate
severity vulnerabilities … npm start) the tensors are flowing already! Whoa,
Node is so easy, any idiot can use it — even musicians ;)

I peck in a simple melody:

    Fly me to the moon
    Let me play Fly me to

Oops, this started looping at an unexpected point. I guess I should have stuck
to a 4/4 time signature? DrumBot is equally confused and lethargically bangs
out wonky shit like it's attempting to cure carbon monoxide poisoning with
ketamine. Meet my new band: Tourette's Horse Tranquilizer.

I start over, this time with a proper MIDI keyboard plugged in. Initially this
fails: MIDI Off messages trigger notes in the same way as MIDI On messages.
This is easily fixed by routing the MIDI through a DAW. (Why? No idea.) I play
a standard Boogie riff, except slowed down to 100 BPM (because I can't really
play keys; I just have this controller for inputting notes). DrumBot analyses
this (100% correctly) as a Chicago Blues turnaround and, apparently knowing
that six and two is eight, immediately channels Casey Jones[3].
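The Note On/Off confusion above is a classic MIDI input gotcha: many controllers send "Note On with velocity 0" instead of a true Note Off (status 0x80), so a handler that keys only on the 0x90 status byte fires on both press and release. A hypothetical filter (not DrumBot's actual code) that tells the three cases apart:

```javascript
// msg is a raw MIDI message: [status, note, velocity].
// The low nibble of the status byte is the channel, so mask it off first.
function classify(msg) {
  const [status, note, velocity] = msg;
  const kind = status & 0xf0; // strip the channel nibble
  if (kind === 0x90 && velocity > 0) return "on";
  // Real Note Off (0x80) OR the common "Note On, velocity 0" shorthand
  if (kind === 0x80 || (kind === 0x90 && velocity === 0)) return "off";
  return "other"; // CC, pitch bend, etc.
}
```

Routing through a DAW "fixes" it because the DAW normalizes these shorthand messages into proper Note Offs before passing them on.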

Dear fellow scholars: what a time to be alive! [4]

Btw. @notwaldorf: I just had a go at routing DrumBot's output back into my DAW
and obtained good results by dragging parts of my custom drum kit onto the
MIDI notes that seemed most relevant. Then I changed the input and now DrumBot
is hitting notes I haven't mapped yet. I guess it's using the standard GM drum
map? Implementing that in its entirety would involve setting up over 40
separate instruments :/ But I'm not supposed to go quite that far, right? Is
there some documentation telling every monkey how to make DrumBot king of the
bongo bong? [5]
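Assuming the output really does follow the General MIDI percussion map (channel 10, notes 35–81), you don't need all ~47 instruments: mapping a core subset covers most drum-machine output. A sketch of such a fallback table (my own guess at a useful subset, not anything from DrumBot's docs):

```javascript
// Core General MIDI percussion notes - mapping just these covers most
// generated drum parts without building out all ~47 GM instruments.
const GM_DRUM_CORE = {
  35: "Acoustic Bass Drum",
  36: "Bass Drum 1",
  38: "Acoustic Snare",
  40: "Electric Snare",
  42: "Closed Hi-Hat",
  44: "Pedal Hi-Hat",
  46: "Open Hi-Hat",
  49: "Crash Cymbal 1",
  51: "Ride Cymbal 1",
};

// Report a name for mapped notes, flag the rest so you can sample them later.
function resolveDrum(note) {
  return GM_DRUM_CORE[note] ?? "unmapped";
}
```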

[0] https://magenta.tensorflow.org/studio/
[1] https://www.ableton.com/en/live/max-for-live/
[2] https://github.com/magenta/drumbot
[3] https://youtu.be/RSTF6GJ23LY
[4] https://www.youtube.com/playlist?list=PLujxSBD-JXgnqDD1n-V30pKtp6Q886x7e
[5] https://youtu.be/cfLIlP-GAmg

------
slowrabbit
npm build doesn't work for this on osx and neither does the website... what a
busted app. How did this make it to the front page of Hacker News?

~~~
notwaldorf
Sorry about that; the app is hosted on glitch and they seem to be having some
problems serving apps right now, so it’s not something I can fix :(

~~~
soperj
It's really not your fault. It's really okay if people don't get instant
gratification.

------
edna314
Seems like people are slowly running out of ideas for what to do with ML.

~~~
meowface
Are you kidding me? This is one of the things people have been hoping ML could
help with for decades. We're only going to see more music composition and
production-related ML applications in the future.

I think they may soon revolutionize workflows for electronic music producers.
This is an amazing first step in that direction.

~~~
edna314
Maybe I’m a bit oldschool, but until ML becomes self-aware it won’t be able to
replace a drummer, mixer or any other kind of musician. Everything related to
ML and music that I’ve seen so far has basically provoked the same two
reactions: “WTF? They can do that with ML? This is awesome!” or “Ok, to be
honest, I wouldn’t listen to this if I didn’t know it was produced by ML.”

~~~
meowface
Yes, there are many use cases here. If we're talking composing something
entirely by itself, start to finish, the technology is nowhere near there (at
least by my subjective music taste standards). We may not need self-awareness
to get there, but we'll probably need something closer to AGI.

But I could think of many ways existing ML could help producers and composers
while they're working. They can present possibilities and alternatives that
the composer can choose from; they can find interesting combinations and
permutations and patterns; they can potentially learn from your workflow and
save you time in the future, etc.

I have some ideas for how to develop applications like these. Maybe I should
pursue them, although I currently work in a totally unrelated field.

