Great timing! I've lately been studying up on Prolog and miniKanren, and increasingly I've been wishing that I had one or the other embedded in my usual programming languages. I believe there is already at least one miniKanren implementation in each of JS and Python, but it's cool to see this pop up.
One thing I've wondered about with such systems is performance: I would be nervous about pathological cases leading to really bad performance or even DoS attacks. My understanding of Prolog in particular is that sometimes getting your program to run to completion requires a nontrivial understanding of the backtracking algorithm so you can judiciously place cuts, etc.
Inspired by Will Byrd's many presentations on quines, this leads me to wonder: what if you had a logic language that does not run at runtime, but that generates code with certain properties according to some set of rules? Is such a thing possible? Are there theoretical results on that subject?
> One thing I've wondered about with such systems is performance: I would be nervous about pathological cases leading to really bad performance or even DoS attacks. My understanding of Prolog in particular is that sometimes getting your program to run to completion requires a nontrivial understanding of the backtracking algorithm so you can judiciously place cuts, etc.
That depends on which constructs you are using. It's perfectly fine to write modern Prolog code without any cuts, but the sad thing is that those constructs (once/1, dif/2, if_/3, tabling, ...) aren't taught in most books out there (the exception is The Power of Prolog), which perpetuates these ideas about Prolog.
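For example, here is a minimal sketch (SWI-Prolog syntax assumed; the predicate name is just illustrative) of how dif/2 lets you state a "not equal" condition declaratively instead of committing with a cut:

    % not_member/2: X does not occur in the list, expressed with dif/2
    % rather than a cut, so the predicate stays sound in all modes.
    not_member(_, []).
    not_member(X, [Y|Ys]) :-
        dif(X, Y),
        not_member(X, Ys).

A query with unbound variables, e.g. not_member(a, [b, C]), succeeds with the residual constraint dif(C, a), where a cut-based version would have to commit prematurely.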
You would probably need to do some testing before heading to production, but the easiest insurance would probably be to wrap every query in a timeout and just cancel the query after a maximum duration. Most networked systems work this way anyway, and it gives you insurance against a slow query saturating your business logic.
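A minimal sketch of that idea, assuming SWI-Prolog's library(time) (the safe_query wrapper name is just illustrative):

    :- use_module(library(time)).

    % Run Goal under a hard time limit; treat an exceeded budget as
    % plain failure so a pathological search can't hang the caller.
    safe_query(Goal, MaxSeconds) :-
        catch(call_with_time_limit(MaxSeconds, Goal),
              time_limit_exceeded,
              fail).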
You'd be interested in Grammar-Guided Genetic Programming.
It's tangential, but along similar lines. You generate a grammar of what a successful program looks like, and then that grammar can spit out programs of that "shape" ad infinitum.
The main difference is that you don't have the same verifiable guarantees as with logic-based solutions, because the grammar is generated and evolved over time instead of being induced from a ruleset.
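To make the "grammar spits out programs of that shape" part concrete, here is a rough sketch (a plain Prolog DCG; the names are illustrative and not from any GGGP library) that enumerates candidate programs for a tiny prefix-notation expression language:

    % Grammar for a tiny expression language; backtracking through it
    % enumerates every program of the given shape.
    expr(X)          --> [X],  { member(X, [x, 1, 2]) }.
    expr(e(Op,L,R))  --> [Op], { member(Op, [+, *]) }, expr(L), expr(R).

    % Generate all programs whose token count is at most MaxTokens.
    generate(Program, MaxTokens) :-
        between(1, MaxTokens, N),
        length(Tokens, N),
        phrase(expr(Program), Tokens).

In actual GGGP the exhaustive enumeration would be replaced by sampling and evolving derivations against a fitness function, but the grammar plays the same role: it fixes the shape of every candidate program.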