> A magical computer that guesses some data for you, of course, doesn't exist.
Prolog does something like this, no? You give it rules and then make a statement with a blank in it, and it'll fill in the blank given the rules you've described.
Yes, the cool thing here is using higher-order functions to provide the user with non-determinism seamlessly within Python, whereas in Prolog this is a built-in feature.
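A minimal sketch of what this looks like in plain Python (my own brute-force stand-in, not the library's actual API: the names `run_nondeterministic`, `Fail`, and `pythagorean` are made up for illustration). A "guess" is just a parameter, a failed constraint kills that branch, and we enumerate every universe:

```python
import itertools

class Fail(Exception):
    """Raised to abandon the current branch of the search."""
    pass

def run_nondeterministic(fn, *choice_lists):
    # Try fn on every combination of guesses, keeping the runs
    # that don't Fail -- brute force over all "universes".
    results = []
    for guesses in itertools.product(*choice_lists):
        try:
            results.append(fn(*guesses))
        except Fail:
            pass
    return results

def pythagorean(a, b, c):
    # "Guess" a right triangle; any wrong guess kills this universe.
    if a * a + b * b != c * c:
        raise Fail()
    return (a, b, c)

triples = run_nondeterministic(pythagorean, range(1, 16), range(1, 16), range(1, 16))
print(triples)  # includes (3, 4, 5) and (5, 12, 13)
```

A real implementation would backtrack incrementally rather than enumerate the full cross product, but the interface is the same: guess, constrain, collect survivors.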
The idea is very similar to the classical AMB ambiguous operator introduced by John McCarthy [0]. The implementation also seems similar to what I did a while ago, just in Lisp and using Lisp primitives rather than function decorators.
Yep, Screamer is the "enterprise-quality" version of nondeterministic programming: think efficient and optimized Prolog-esque primitives, but in Lisp.
My implementation of AMB is purposefully small (70 LOC, not counting docstrings, tests, and the system definition), so that it can fit on a page of paper and be easily understood.
Makes me think about how non-deterministic computing is basically what you get when you view lists as a context (as opposed to a container), in the sense of Monads in Haskell. Applicative functors form a minimal framework for applying functions to values in a non-deterministic context when working with lists.
Making this further relevant: perhaps the best-known example of monadic programming with lists is Python's list comprehensions. Which makes me a bit surprised to see that not explored/exploited more in this library.
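To make the connection concrete, here's a small sketch (my own, with a hypothetical `bind` helper): each `for` clause in a comprehension is a monadic bind, i.e. a nondeterministic choice, and the same computation can be written with bind made explicit:

```python
# Each `for` clause nondeterministically picks a value; the result
# collects every combination -- the list monad in disguise.
pairs = [(x, y) for x in [1, 2, 3] for y in "ab"]

def bind(xs, f):
    # List-monad bind: apply f to each choice and flatten the results.
    return [y for x in xs for y in f(x)]

# The same computation with the binds written out explicitly.
pairs2 = bind([1, 2, 3], lambda x: bind("ab", lambda y: [(x, y)]))

print(pairs == pairs2)  # True
```

Adding an `if` clause to the comprehension plays the role of the monadic guard: it prunes the branches where the condition fails.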
It's roughly the same, no? It emulates a non-deterministic Turing machine and gives the same results, just taking much longer. But runtime isn't everything! I think that if you're programming in a nondeterministic style, you're programming as if you have a NTM, whether or not you're actually running the code on one.
Also, surely NTMs are only physically unrealizable if P!=NP, and so whether or not NTMs could exist is an open question.
Useful to explain non-determinism to students. I saw a similar idea before at https://github.com/aeporreca/nondeterminism which uses fork() to (inefficiently) explore all possible guesses concurrently
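The fork() trick can be sketched in a few lines of Python (this is my own POSIX-only sketch, not the linked repo's code; `amb_fork` is a made-up name). Each child process continues with one guess and reports success or failure through its exit status:

```python
import os

def amb_fork(choices, accept):
    # Explore all choices in parallel OS processes: one fork'd child
    # per guess, each reporting via its exit status.
    pids = {}
    for c in choices:
        pid = os.fork()
        if pid == 0:                        # child "universe": try one guess
            os._exit(0 if accept(c) else 1)
        pids[pid] = c
    winners = []
    for pid, c in pids.items():             # parent: collect accepted guesses
        _, status = os.waitpid(pid, 0)
        if os.WEXITSTATUS(status) == 0:
            winners.append(c)
    return winners

# "Guess" the nontrivial divisors of 12 in parallel universes.
print(sorted(amb_fork(range(2, 12), lambda d: 12 % d == 0)))  # [2, 3, 4, 6]
```

In the real repo the child runs the rest of the computation rather than a single predicate, but the mechanism is the same: fork per guess, and let exit statuses report which universes accepted.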
It doesn't help that CS students will potentially hear the term "nondeterminism" used to mean different things in different contexts. In the context of the linked repo, it's used the way they'll probably encounter it when learning about Turing machines; but in less formal contexts it also gets used a lot to describe things like "Heisenbugs", where running something more than once doesn't necessarily produce the same result.
For me, the idea of nondeterminism centers on a computation tree whose branches correspond to possible states of the computation.
In the context of deterministic vs. nondeterministic finite automata, a DFA has a single linear computation path from the root to an accept/reject state, while an NFA branches wherever multiple transitions (including epsilon transitions) apply, continuing each potential computation (the "guesses") in so-called "parallel universes" until some branch hits an accept state.
This definition follows Introduction to the Theory of Computation, 3rd edition, by Sipser (would recommend).
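The "parallel universes" view translates directly into code: simulate an NFA by tracking the *set* of states it could be in, i.e. every branch at once. A minimal sketch (my own, not from the library under discussion; epsilon transitions are labeled with the empty string):

```python
def eps_closure(delta, states):
    # Follow epsilon ("" label) transitions until no new states appear.
    stack, seen = list(states), set(states)
    while stack:
        q = stack.pop()
        for q2 in delta.get((q, ""), ()):
            if q2 not in seen:
                seen.add(q2)
                stack.append(q2)
    return seen

def nfa_accepts(delta, start, accepting, s):
    # Track the set of all states the NFA could be in: every
    # "parallel universe" simultaneously, as in Sipser's treatment.
    states = eps_closure(delta, {start})
    for ch in s:
        states = eps_closure(delta, {q2 for q in states
                                     for q2 in delta.get((q, ch), ())})
    return bool(states & accepting)

# Example NFA: accepts binary strings ending in "01".
delta = {(0, "0"): {0, 1}, (0, "1"): {0}, (1, "1"): {2}}
print(nfa_accepts(delta, 0, {2}, "0101"))  # True
print(nfa_accepts(delta, 0, {2}, "10"))    # False
```

This is of course the subset construction in disguise, which is exactly why NFAs and DFAs recognize the same languages.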