
Interview with the Creator of Max/MSP and Pure Data - stevekrouse
https://futureofcoding.org/episodes/047.html
======
jcelerier
Had the chance to meet Miller as he was part of my PhD jury; as is often the
case with people of this caliber and with such a large impact, he was super
nice and humble.

Regarding

> There’s a problem in Max and in Pure Data, which is that you’re writing
> these things in terms of how they act. You’re making an instrument. But
> someone wants to store a score in the thing, and then do things that refer
> to objects in the score. So far, there’s no good design — or at least I
> haven’t seen a good design — that actually marries these two ideas in a
> single environment that works.

I'm working towards it in [https://ossia.io](https://ossia.io) - the gist of
the idea is to have a timeline with dataflow elements embedded directly into
it, so that parts of the score can connect to other parts that happen earlier
or later. This makes it easy to build instruments that evolve in time...
though most of its users use it to make scores instead of instruments.

(This leads to trying to answer the question: what happens when two connected
nodes of a dataflow graph execute over t = [0; 5 s] for the source node and
t = [3; 7 s] for the sink node?)

~~~
rectang
> _the gist of the idea is to have a timeline with dataflow elements embedded
> directly into it so that parts of the score can connect to other parts which
> happen earlier / after._

Why not allow nesting both ways? Why not allow timelines to be embedded within
dataflow graphs, and then triggered by events that occur at arbitrary times?
The outermost container can be a canvas embedding a single timeline if that's
your thing.

The constant-clockspeed model underlying all track-based sequencers and DAWs
is a straitjacket. Arbitrary pauses, jam breaks, and rubato sections make more
sense as time-free elements which exist between discrete timelines rather than
as modifications to timelines.

~~~
jcelerier
> Why not allow nesting both ways? Why not allow timelines to be embedded
> within dataflow graphs, and then triggered by events that occur at arbitrary
> times? The outermost container can be a canvas embedding a single timeline
> if that's your thing.

That is the case in ossia: timelines can embed flow graphs, and flow graphs
can embed timelines (and timelines can embed timelines, and flow graphs can
embed flow graphs... most things are composable). But, unlike Max / PD, cables
can be connected across hierarchy levels, so 99% of the time you don't need to
resort to a specific "subpatch" - you just connect things in the timeline
itself.

> and then triggered by events that occur at arbitrary times?

However, the timeline has a built-in mechanism for that - see "open the
floodgates" here:
[https://ossia.github.io/score/first_steps/time_approach/inde...](https://ossia.github.io/score/first_steps/time_approach/index.html)

The doc is pretty outdated, but this should give an idea :)

------
Intermernet
This is a beautiful interview. I've been a Max/MSP and Pd user for years and
I've spent many hours swearing at the creator of both. Now that I feel that I
know both the creator, and the mindset behind the programs a little better, I
think I'll spend less time swearing and more time discovering.

As an aside, Max4Live needs some serious love. It needs better interaction
with the Live world than is currently provided by the LOM. Things like "get
the file path of the current Live set" should be native, and there are a bunch
of similar things that should be simple, but are impossible without horrible
hacks. Now that Ableton owns Cycling '74, I hope this situation improves!

~~~
TheOtherHobbes
I'm a huge fan of Mr Puckette. I don't particularly like PD or Max (or the
limitations of the LOM) either, but I love the fact that he's aware of the
criticisms and even agrees with them, and has spent a lot of time thinking
about how to solve these problems.

Music turns out to be an incredibly hard problem. It's easy to make
superficial but limited tools that do maybe 50% of the job to make boring 50%
music. But the kind of general score management he talks about is still an
unsolved problem - far more challenging in its way than DSP audio synthesis.

~~~
PaulDavisThe1st
There's a question of goals there.

Is the goal to create tools that make not-boring music? Or is the goal to make
tools that enable you (or someone else) to make not-boring music? Automating
the creation of music seems to have very little upside. Enabling people to
more fully explore their own creativity has a lot, but that leaves people
faced with lots of "incredibly hard" stuff to do by themselves.

------
spiralganglion
Host of said interview here!

I had a blast interviewing Miller, even though I now have nightmarish visions
of imperative constructs rummaging around in my patches, and nasty text fields
sneaking up behind my objects, haha.

For future interviews, I'm eager to find more women, NB, trans, etc. folks to
bring on the show. If you know of anyone working on the frontiers of
programming tools, HCI, PLT, program visualization, etc etc, I would love
recommendations.

Cheers!

~~~
dfxm12
What is NB? Google suggests _nota bene_ or New Brunswick, but neither fits
this context.

~~~
spiralganglion
Non-binary. Many people also spell it "enby".

[https://en.wikipedia.org/wiki/Non-binary_gender](https://en.wikipedia.org/wiki/Non-binary_gender)

------
aldanor
Funny to see this on the front page as I've just started implementing a Rust
framework for writing Max/MSP externals :)

~~~
jcpst
I'm very interested in projects that use Rust to implement audio technology.
If there's anything out there yet, let us know!

~~~
aldanor
Who's us? :) as in, HN?

I'll be sure to post something once it works. (I already have a test extension
written manually in Rust and working; now I need to get serious with it.
There are all kinds of thread-safety issues etc. - challenging but fun.)

------
harrylepotter
Max/MSP, PD, and to a lesser extent Quartz Composer were such inspirations for
me in college. About 10 years ago I ended up putting together an app builder
that leveraged concepts similar to PD's 'control' signals to drive UX
interactions...
[https://www.youtube.com/watch?v=hIH5V2BMG_k](https://www.youtube.com/watch?v=hIH5V2BMG_k)

~~~
spiralganglion
You may enjoy: [http://github.com/ivanreese/visual-programming-codex](http://github.com/ivanreese/visual-programming-codex)

~~~
mycall
I'm surprised [https://processing.org](https://processing.org) is missing.
Awesome list though.

~~~
spiralganglion
As I see it, Processing isn't visual programming; it's a text language for
making visuals. Certainly, though, it's a superb project.

------
michelb
Max/MSP opened so many cool doors for me back in the day when I was studying
interaction design. The audio design students used it to rig up cool
instruments and audio performances. I connected tons of sensors and controlled
outputs over midi, to hardware or other software. It was so cool to put
everything together in a visual interface.

