
Simple, pure, and total functional language that generalizes Datalog - edwintorok
http://www.rntz.net/datafun/
======
djhaskin987
I totally thought the article was going to be about Prolog. Datalog was
formulated as a subset of Prolog, so Prolog already generalizes Datalog. So
generalizing a language that was the subset of a larger one in the first place
is something I find odd.

~~~
antisemiotic
Prolog isn't a total language, as a total language can't be Turing-complete.

~~~
galaxyLogic
What kind of sound does a Turing machine make?

------
iamwil
Has anyone had experience with Datalog or Datafun? What was the programming
experience like? What did you build with it? Did you find that you were more
productive or less? What sorts of problems was it a good fit for, and what
sorts of things was it a bad fit for?

~~~
YeGoblynQueenne
My research group (I'm a PhD student) is working on algorithms that learn
logic programs from examples and background knowledge (also given as logic
programs). The programs our algorithms learn are first-order definite datalog
programs, which means they are sets of first-order Horn clauses without
negation as failure and without function symbols of arity greater than zero
(i.e. constants only) as arguments of literals.

The reason for the restriction is completeness and efficiency. Definite
datalog programs are guaranteed to terminate (given a finite predicate and
constant signature), so the search for a hypothesis (the learned program)
cannot "go infinite" even when the target theory (what you are trying to
learn) is recursive, even when it has mutually recursive clauses. As a result
our algorithms can learn recursive programs (which is rather important). The
space of hypotheses doubles in size when negation as failure is allowed, which
improves the efficiency of the search for hypotheses.
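The termination guarantee above can be illustrated with a minimal Python sketch (this is illustrative only, not the group's actual Prolog-based systems): naive bottom-up evaluation of a definite Datalog program must reach a fixpoint, because with a finite constant signature and no function symbols only finitely many facts can ever be derived. The `parent`/`ancestor` program here is a made-up example.

```python
# Illustrative sketch: naive bottom-up evaluation of a definite Datalog
# program. With a finite constant signature and no function symbols, the set
# of derivable facts is finite, so the loop below must terminate -- even for
# recursive rules like ancestor/2.

# EDB: ground facts, as tuples (predicate, arg1, arg2, ...).
facts = {
    ("parent", "alice", "bob"),
    ("parent", "bob", "carol"),
}

# IDB: rules as (head, [body literals]); variables are capitalized strings.
rules = [
    (("ancestor", "X", "Y"), [("parent", "X", "Y")]),
    (("ancestor", "X", "Z"), [("parent", "X", "Y"), ("ancestor", "Y", "Z")]),
]

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def unify(literal, fact, env):
    """Match a body literal against a ground fact, extending env or failing."""
    if literal[0] != fact[0] or len(literal) != len(fact):
        return None
    env = dict(env)
    for t, v in zip(literal[1:], fact[1:]):
        if is_var(t):
            if env.get(t, v) != v:
                return None
            env[t] = v
        elif t != v:
            return None
    return env

def evaluate(facts, rules):
    derived = set(facts)
    while True:
        new = set()
        for head, body in rules:
            envs = [{}]
            for lit in body:
                envs = [e2 for e in envs for f in derived
                        if (e2 := unify(lit, f, e)) is not None]
            for e in envs:
                new.add((head[0],) + tuple(e.get(t, t) for t in head[1:]))
        if new <= derived:          # fixpoint reached: guaranteed, since the
            return derived          # Herbrand base is finite
        derived |= new

model = evaluate(facts, rules)
print(("ancestor", "alice", "carol") in model)  # True
```

Real systems use semi-naive evaluation (only joining against facts derived in the previous round), but the finiteness argument is the same.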

The problem of course is that some programs cannot be expressed in definite
datalog that can be expressed in Prolog with negation as failure and arbitrary
functions as arguments. So that's a bit of a limitation. For instance, one has
to jump through hoops to learn programs with "exceptions" (A if B except if
C), say like a program calculating leap years (which have exceptions for years
divisible by 100 and 400) or fizzbuzz.
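To make the "A if B except if C" pattern concrete, here is a small sketch of the leap-year example in Python. The Prolog-style rules in the comments use negation as failure (`\+`), which is exactly what definite Datalog lacks; everything else here is an illustrative assumption, not code from the thread.

```python
# The leap-year rule with its nested exceptions. In Prolog with negation as
# failure this is roughly:
#
#   leap(Y) :- divides(4, Y), \+ divides(100, Y).
#   leap(Y) :- divides(400, Y).
#
# Without negation, definite Datalog cannot say "divisible by 4 but NOT by
# 100", which is why such programs are awkward to learn in that fragment.

def leap(year: int) -> bool:
    if year % 400 == 0:
        return True           # exception to the exception
    if year % 100 == 0:
        return False          # exception: century years are not leap years...
    return year % 4 == 0      # ...unless divisible by 400 (handled above)

print([y for y in (1900, 1996, 2000, 2023) if leap(y)])  # [1996, 2000]
```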

This is not directly _programming_ with datalog - it's machine learning of
datalog programs from data. But I think it's similar to the experience you're
asking for.

My group's algorithms:

Metagol (a meta-interpretive learner for definite datalog programs):

[https://github.com/metagol/metagol](https://github.com/metagol/metagol)

Louise (a polynomial-time version of Metagol):

[https://github.com/stassa/louise](https://github.com/stassa/louise)

~~~
iamwil
So you're building a program that can learn to write datalog programs. Is it
because datalog programs are tedious enough that they should be automatable? Why
this particular research direction?

> The space of hypotheses doubles in size when negation as failure is allowed,
> which improves the efficiency of the search for hypotheses.

That line confused me. Shouldn't doubling the search space decrease the
efficiency of the search for the hypothesis?

~~~
srean
A whole lot easier to understand, compose, and guide than DNNs. That's reason
enough.

~~~
YeGoblynQueenne
Well, there's that. My intuition is that ILP (inductive logic programming)
should appeal to computer scientists more than statistical machine learning.
Unfortunately, it is not very widely known - we're a small field.

~~~
srean
Let's drink to its health/resurgence.

I am more in the applied maths camp than the CS camp, but I feel a practical
and effective merger of logic and probabilistic reasoning is sorely needed to
climb out of the rut we are in.

Neural Turing machines are all nice and dandy, but they seem a very
heavy-handed way of expressing/learning dependencies.

------
bcherny
I just listened to Michael’s episode on Future of Coding — highly recommended
for context!

~~~
alexchamberlain
Have you got a link to that? Is it this one [https://podcasts.apple.com/gb/podcast/future-of-coding/id126...](https://podcasts.apple.com/gb/podcast/future-of-coding/id1265527976)?

~~~
bcherny
Yup, that’s the one.

------
carterschonwald
I really enjoyed the original Datafun paper. Glad to hear there’s more
follow-on work!

