
McCarthy's Ambiguous Operator (2005) - kristianp
http://www.randomhacks.net/2005/10/11/amb-operator/
======
ekidd
Author here. It's fun to see this post 14 years later!

If you like this, you might also enjoy:

\- Prolog: This is an entire language built around choice and backtracking.
[https://en.wikipedia.org/wiki/Prolog](https://en.wikipedia.org/wiki/Prolog)

\- _The Reasoned Schemer_ , which builds a Prolog-in-Scheme from first
principles. This book is excellent, or even mind-blowing if you've never seen
this before. [https://mitpress.mit.edu/books/reasoned-
schemer](https://mitpress.mit.edu/books/reasoned-schemer)

\- Oz/Mozart: This language takes the basic idea even further, adding
concurrency and constraint solving. There's an older book about this that's
nice:
[https://www.info.ucl.ac.be/~pvr/book.html](https://www.info.ucl.ac.be/~pvr/book.html)
It feels like Oz was an evolutionary dead end, but an interesting one.

\- Probability monads. You can extend this general idea to include probability
distributions over values. I have an older series of blog posts at
[http://www.randomhacks.net/probability-
monads/](http://www.randomhacks.net/probability-monads/) that explains this
idea piece-by-piece, but there's been a ton of research in the past decade on
probabilistic languages: [http://www.probabilistic-
programming.org/wiki/Home](http://www.probabilistic-programming.org/wiki/Home)
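
The core trick from that series can be sketched in a few lines of Python (a toy weighted-list monad for illustration only, not code from the posts; `unit` and `bind` are the usual monad vocabulary):

```python
# A toy probability monad as weighted lists: each value carries the
# probability of the path that produced it.
def unit(x):
    return [(x, 1.0)]

def bind(dist, f):
    # Thread a distribution through a function that returns
    # distributions, multiplying probabilities along each path.
    return [(y, p * q) for (x, p) in dist for (y, q) in f(x)]

coin = [("H", 0.5), ("T", 0.5)]
two_flips = bind(coin, lambda a: bind(coin, lambda b: unit(a + b)))
print(two_flips)
# [('HH', 0.25), ('HT', 0.25), ('TH', 0.25), ('TT', 0.25)]
```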

It's definitely a fun topic!

~~~
nn3
There are also Icon and TXR. I always liked the Icon approach, where
backtracking is fairly explicit but still tightly integrated into the
language. TXR is more for text analysis.

------
kristianp
The McCarthy '63 paper referred to seems to be here: [http://www-
formal.stanford.edu/jmc/basis1/basis1.html](http://www-
formal.stanford.edu/jmc/basis1/basis1.html)

~~~
cpach
A brilliant paper. IMHO one of the cornerstones of computer science. I can
warmly recommend it. 'pg has a good explanation of it here:
[http://www.paulgraham.com/rootsoflisp.html?viewfullsite=1](http://www.paulgraham.com/rootsoflisp.html?viewfullsite=1)
(Windows users might want to use an online converter to convert it from
Postscript to PDF.)

------
Patient0
I suppose this is a good time to point out the toy Lisp interpreter I wrote a
while ago:
[http://patient0.github.io/FirstClassLisp/](http://patient0.github.io/FirstClassLisp/)

It supports first class macros, continuations, and has the amb "operator"
(implemented using a macro of course!):
[https://github.com/Patient0/FirstClassLisp/blob/master/Lisp/...](https://github.com/Patient0/FirstClassLisp/blob/master/Lisp/LispEngine/Bootstrap/Library.lisp)

------
rfw
amb without side-effects can also be expressed in terms of the List monad,
e.g.:

    
    
        do x <- [1, 2, 3]
           y <- [4, 5, 6]
           if x * y == 8 then pure (x, y)
                         else []
    

which can be equivalently reworded as the list comprehension:

    
    
        [(x, y) | x <- [1, 2, 3], y <- [4, 5, 6], x * y == 8]
    

i.e., the underlying structure of amb without side-effects can also be
understood as just a search of the input space for values that fulfill some
criteria, rather than as backtracking via continuations.
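
For comparison, the same search reads almost identically in Python's comprehension syntax (a direct translation, nothing amb-specific):

```python
# Enumerate the input space and keep the values meeting the criterion;
# this is the whole of "amb without side-effects" viewed as a search.
pairs = [(x, y) for x in [1, 2, 3] for y in [4, 5, 6] if x * y == 8]
print(pairs)  # [(2, 4)]
```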

~~~
millstone
This analogy misses something fundamental. See how the assignments to x and y
are of a very different character from the predicate? Notice the return path
(`else []`). `amb` doesn't need this.

`amb` merges the assignments and the predicate - this is important because you
may not know your argument count statically, i.e. if your arguments are a
list. But beyond that, `amb` may backtrack an arbitrarily deep call stack, so
the error return does not need to be expressed explicitly. You may think that
is good or bad, but it is not the same as the Haskell solution.
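
The variable-argument-count point can be illustrated with a small sketch (the names `amb` and `predicate` here are invented for illustration, not the article's implementation):

```python
from itertools import product

def amb(choice_lists, predicate):
    # The number of choice points is only known at runtime: search the
    # cartesian product of however many lists we were handed.
    for combo in product(*choice_lists):
        if predicate(combo):
            return combo
    raise ValueError("amb: no consistent choice")

lists = [[1, 2, 3], [4, 5, 6]]  # could be any number of lists
print(amb(lists, lambda v: v[0] * v[1] == 8))  # (2, 4)
```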

~~~
rfw
We can implement it with a single `amb` function, too (I took some shortcuts
that might have hidden some of the nature of the implementation)!

    
    
        -- Lifts a list into Amb.
        amb :: [a] -> Amb a
    

If we assume Amb is just List, then:

    
    
        amb = id
    

If we write the example in the original article in desugared style, we get:

    
    
        amb [1, 2, 3] >>= \x ->
        amb [4, 5, 6] >>= \y ->
        (if x * y /= 8 then amb [] else pure ()) >>
        pure (x, y)
    

(we are forced to use an awkward condition with `pure ()` on the else branch
when calling `amb` because Haskell requires us to return values on all
branches. We can rewrite it equivalently in terms of `when` to hide that
detail:

    
    
        amb [1, 2, 3] >>= \x ->
        amb [4, 5, 6] >>= \y ->
        when (x * y /= 8) (amb []) >>
        pure (x, y)
    

It now looks more similar to the original example.)

It ends up looking like continuation-passing style: conceptually, if we
encounter `amb []` in the nested function, the nested computation ends and we
start examining the next value in the list being assigned to `x`. The
implementation given in the original post does end up using callcc, so with
some imagination you might be able to derive some kind of equivalence here :)

~~~
Dylan16807
That's not an amb function. That's a builtin type that's inherently amb-y.

You still get credit for solving the basic problem but let's not be
misleading.

Also don't forget to call head.

~~~
hopler
In Haskell, you define a monadic type, and then "bind" is the function that
does something specific for that type. "It's a type" because Haskell uses
static types and polymorphism to factor out common logic shared by different
functions.

Of course Haskell's list is a built-in type, but it's not magical except for
the special bracket syntax. You can make a sugar-free user-defined type

    
    
        data List a = Nil | Cons a (List a)
    

if you want.

~~~
Dylan16807
Yes, I know all of that. I wasn't saying it was magical, I was saying that in
that particular code "amb expressed in terms of the List monad" is an accurate
description, but "amb function" isn't really true. The monad is doing 100% of
the ambiguous operation and the "amb function" does 0%.

As an analogy, let's say I define "(" to write to stdout, and "print" as id.
Even though "print(4)" works as expected, my "print function" is a lie.

Or in C source code where adjacent strings get merged, I could [#define CONCAT
] so that ["ab" CONCAT "cd"] gets preprocessed to ["ab" "cd"] and parsed as
["abcd"]. But I didn't actually write a concatenation operator. The
concatenation happens because of something else, it would happen even if you
didn't use CONCAT, and you can sprinkle CONCAT all over your source code with
no effect.

------
phoe-krk
This actually sounds very Prologesque.

"Here, have some inputs. From those, find me the ones that satisfy this goal."

~~~
User23
AMB is a very handy primitive to implement prolog.

~~~
dragontamer
Hmmm... AMB probably is sufficient for a toy version of Prolog.

I feel like anyone actually implementing Prolog will need to fully understand
Horn logic. AMB is a useful operator when the specifics of the backtracking
search can be abstracted away.

But an "actual" Prolog implementation needs a well-defined depth-first search
over the Horn clauses. There are a lot of optimizations that make Horn-clause
solvers way faster.
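
For a sense of scale, a toy propositional backward-chainer is tiny (this sketch ignores unification entirely, which is where the real work lives):

```python
# Propositional Horn clauses as (head, [body goals]); facts have an
# empty body. prove() is a depth-first search over matching clauses,
# falling through to the next clause when a body goal fails.
rules = [
    ("mortal", ["human"]),
    ("human", ["greek"]),
    ("greek", []),
]

def prove(goal):
    return any(head == goal and all(prove(g) for g in body)
               for head, body in rules)

print(prove("mortal"))  # True
print(prove("robot"))   # False
```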

------
tempodox
I envy languages with continuations. They make implementing backtracking so
much easier.
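
Without first-class continuations you can still fake it with closures acting as explicit continuations; a rough Python sketch (all names here are invented):

```python
class Fail(Exception):
    """Raised to backtrack to the most recent choice point."""

def amb(*alternatives):
    # Returns a function taking the "rest of the computation" as a
    # closure; on Fail it backtracks and tries the next alternative.
    def choose(rest):
        for alt in alternatives:
            try:
                return rest(alt)
            except Fail:
                continue
        raise Fail
    return choose

def require(cond):
    if not cond:
        raise Fail

result = amb(1, 2, 3)(lambda x:
         amb(4, 5, 6)(lambda y:
         (require(x * y == 8), (x, y))[1]))
print(result)  # (2, 4)
```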

~~~
billsix
Continuations also allowed me to implement python’s yield/send in 16 lines of
Scheme code :-)

[http://billsix.github.io/bug.html#_make_generator](http://billsix.github.io/bug.html#_make_generator)

~~~
hopler
You can also implement scheme's call/cc with python's yield/send, right?

~~~
billsix
Incorrect. In C terms, you can think of callcc as a procedure which memcpys
the stack onto the heap for later application. When it is later applied, the
heap-allocated stack is memcpyed back into the stack region of memory,
destroying the current stack.

Python and C#'s yield are nothing like that. It's just syntactic sugar for
iterators.

~~~
dragonwriter
Python yield actually supports coroutines with bidirectional communication,
not just unidirectional iterators; while not as full featured as callcc, these
cover quite a lot of the use cases.
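
A quick illustration of the bidirectional part (plain Python, no tricks):

```python
def accumulator():
    # Communicates both ways: yields the running total out, and
    # receives the next increment back in via send().
    total = 0
    while True:
        increment = yield total
        total += increment

acc = accumulator()
next(acc)            # prime the generator; it yields 0
print(acc.send(10))  # 10
print(acc.send(5))   # 15
```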

~~~
billsix
Agreed regarding use cases. My Scheme implementation does both yield and send,
cleanly imho, because of call/cc.

------
User23
SICP spends many pages on this.

~~~
cosmez
Right, this is actually one of the topics where I stopped on my first reading.

[https://mitpress.mit.edu/sites/default/files/sicp/full-
text/...](https://mitpress.mit.edu/sites/default/files/sicp/full-
text/book/book-Z-H-28.html)

~~~
User23
Contextualizing this with Hoare's work on pre and post conditions helped me to
understand this chapter. Predication is fundamental to advanced computer
science.

------
_ZeD_
So it's a base operator for Prolog unification?

~~~
neel_k
No. From an implementation POV, Prolog adds two pieces to ordinary programming
languages: backtracking plus unification. This post talks about backtracking,
but doesn't touch upon unification at all.

------
kazinator
[https://www.rosettacode.org/wiki/Amb](https://www.rosettacode.org/wiki/Amb)

------
Sharlin
I had been wondering what the `amb` operator in Rx is named after. It takes
any number of reactive streams and evaluates to the one that first emits a
value, discarding the others. I presume the analogy here is that because it
doesn't block, it figuratively has to "look into the future" to figure out
which stream to return. But is there a deeper theoretical connection there?
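
The Rx behaviour is essentially a race, which can be mimicked with asyncio (a rough analogy only; it doesn't answer the theory question):

```python
import asyncio

async def source(delay, value):
    await asyncio.sleep(delay)
    return value

async def amb(*coros):
    # Run all sources, keep whichever finishes first and cancel the
    # rest, roughly what Rx's amb does with streams.
    tasks = [asyncio.ensure_future(c) for c in coros]
    done, pending = await asyncio.wait(tasks,
                                       return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()
    return done.pop().result()

print(asyncio.run(amb(source(0.2, "slow"), source(0.01, "fast"))))  # fast
```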

------
DarkWiiPlayer
Regarding the Ruby example, thinking about the performance of such a
(non-)feature makes my head spin. Not to mention the possibility of side-
effects wrecking the program state, and the complexity of debugging code like
that without making Ruby's error handling aware of such a construct.

All in all, this seems like a fun hack, but a catastrophically bad idea in
practice.

~~~
sacado2
Searching like this is NP-complete in general, so performance will be an issue anyway.

------
greencore
For me it looks like this in JavaScript (given a `permutations` helper that
yields all pairs):

    
    
        const [a, b] = permutations(array1, array2)
            .filter(([a, b]) => a * b === 8)[0];
    

------
sacado2
It sounds a lot like constraint programming.

