Introduction to logic programming with Prolog (matchilling.com)
284 points by matchilling on Dec 5, 2017 | hide | past | favorite | 91 comments

When writing a Prolog program a while ago I stumbled across an old mailing list where someone posted the following joke. Q. "How many Prolog Programmers does it take to change a lightbulb?" A. No.

I can't resist adding https://en.wikipedia.org/wiki/Answer_set_programming to this.

Q. How many answer set programmers does it take to change a lightbulb? A. {1}, {1,2}, {1,2,3}, {2,1}, {2,1,3}, {2,3,1}

... {3,1}, {1,3}

LOL thank you

I guess the downvoter has never programmed in Prolog and so doesn't get the joke.

I agree with what Sylvain Soliman said about this on Reddit: It's a nice tutorial for the 80's parts of the language:


In modern Prolog systems, you would use more declarative features such as constraints to model combinatorial tasks like map coloring.

Make sure to check out modern Prolog features if you are interested in learning the language seriously!
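To make the map-colouring point concrete, here is a small sketch using SWI-Prolog's library(clpfd). The predicate name colouring/1 and the four-region map with its adjacencies are hypothetical, purely for illustration:

```prolog
:- use_module(library(clpfd)).

% Hypothetical map of four regions A..D; adjacent regions
% must get different colours, drawn from 1..3.
colouring([A,B,C,D]) :-
    [A,B,C,D] ins 1..3,
    A #\= B, A #\= C,
    B #\= C, B #\= D,
    C #\= D,
    label([A,B,C,D]).

% ?- colouring(Cs).
% Cs = [1, 2, 3, 1] is the first solution found.
```

Note that the whole program is just a statement of the problem; the search is left entirely to the solver.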

Are there any good materials (articles, books) for this kind of modern Prolog? I’m afraid my only experience with Prolog is those 80s features of Prolog and I’m not sure what you mean by constraints.

Please see my profile page: It contains links that I find relevant for learning modern Prolog features.

Every predicate you impose on the set of solutions can be regarded as a constraint, because it can at most restrict the set of solutions, never increase it.

So, in fact, every Prolog goal you invoke is a constraint. When reasoning over Herbrand terms, the only constraints are equality and disequality of terms, which are implemented by the predicates (=)/2 and dif/2, respectively.
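For example, here is how these two basic constraints behave (a sketch of queries as they work in systems providing dif/2, such as SWI-Prolog):

```prolog
?- X = f(a), X = f(Y).
% succeeds with Y = a (equality of terms)

?- dif(X, Y), X = a, Y = b.
% succeeds: the disequality holds

?- dif(X, Y), X = a, Y = a.
% fails: the disequality is violated
```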

In addition to these basic constraints over terms, modern Prolog systems also provide constraints over more specialized domains, such as constraints on integers, rational numbers and Boolean values. For combinatorial tasks such as map coloring, constraints over integers are especially useful.

For example, here is a constraint on integers, using GNU Prolog:

    | ?- 3 #= 1+Y.
    Y = 2
In this case, the system has correctly deduced that Y can only be 2 subject to the constraint that 3 is equal to the result of the arithmetic integer expression 1+Y, where Y is constrained to integral values.

Constraints implement relations between their arguments and can be used in all directions. This is the reason why Prolog predicates are typically more general than functions in other languages. However, to truly benefit from this, you must consistently use these comparatively new language features instead of lower-level ones. I say comparatively because these language features have been available for several decades by now in professional Prolog systems such as SICStus.

How does using constraints in Prolog relate to using a (finite domain) constraint solver?

The beauty of this is that a constraint solver (over finite domains, Boolean values, sets etc.) blends completely seamlessly into the way Prolog works.

From the perspective of Prolog and its users, a constraint solver is simply available just like any other predicate! All the internal reasoning a constraint solver performs is completely abstracted away. The only interface is typically a few Prolog predicates that let you access the features of the solver by stating what must hold about the involved variables.

So, to use a constraint solver in Prolog, you simply use the predicates it provides, just as you would use any other Prolog predicate. In the example above, I am using the constraint (#=)/2, which is a predicate that is true iff its arguments evaluate to the same integer.
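To illustrate the "all directions" point, the same constraint can compute any of its arguments. A sketch, again with GNU Prolog's (#=)/2 as in the example above:

```prolog
| ?- X #= 2 + 3.
X = 5

| ?- 5 #= 2 + X.
X = 3

| ?- X + Y #= 5, X #= 2.
X = 2
Y = 3
```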

From the perspective of implementors, Prolog is a great implementation language for constraint solvers due to its built-in search and backtracking mechanisms. It also allows you to use the standardized Prolog syntax that many users are already familiar with, instead of having to devise yet another modeling language on top of your solver.

Thus, I would describe the relation as a natural symbiosis: It is natural to use constraint solvers in Prolog, and natural to implement them in Prolog.

Is modern Prolog just a layer on top of the old semantics, or somehow an incompatible paradigm?

The OP is likely speaking of "Constraint Logic Programming" (CLP).

CLP languages are generally implemented as supersets of, or add-on packages to, a Prolog implementation. For example, SICStus Prolog and SWI-Prolog have CLP modules, e.g.:



and "Constraint Handling Rules" (CHR):


To the best of my knowledge, the modules are written in Prolog. A look at those links will give you an idea of how CLP is used. There are modules available for CLP(X) where X is one of:

B = boolean,

Z = integers,

Q = rational numbers,

R = real (floating-point) numbers,

FD = finite domains (see "CLP(FD) Constraint Logic Programming over Finite Domains" http://www.pathwayslms.com/swipltuts/clpfd/clpfd.html )


One can understand how constraining the domain of interest (reducing the search space) to, say, the integers might make search more efficient.

In a sense, yes, these modules are written in Prolog. However, that is not the full story: CLP requires special interface predicates that the underlying Prolog system must provide. You cannot implement CLP "on top" of just any Prolog system with reasonable performance and correctness.

Devising and implementing such suitable interface predicates is quite hard, and only a very small number of Prolog systems have succeeded with this so far.

For example, the SWI interface for CLP is too limited in practice, and its constraint solvers contain elementary mistakes, partly because of the limitations of the interface predicates it provides.

In contrast, SICStus Prolog provides a much more general interface to attributed variables that supports all widely used constraint solvers (finite domains, Boolean variables, rational numbers etc.) with good performance and no known mistakes.

The tutorial you link to has several rather severe shortcomings, and I cannot recommend it for learning CLP(FD). Please see my profile page if you are interested in Prolog resources.

In fact, the "old semantics" actually included language constructs that are only now becoming available (again) in Prolog systems.

An example of such a construct is dif/2, which is a constraint that was provided in even the very first Prolog system, sometimes called Prolog 0, several decades ago. However, it was not widely available in Prolog systems until much more recently, even though professional Prolog systems such as SICStus have provided it for longer. Also arithmetic constraints were available in early implementations such as Prolog IV. So, in a sense, we are now going "back to the roots" of Prolog by making very old features widely available.

In my personal view, calling modern Prolog an "incompatible paradigm" with the way that Prolog is still being taught is not completely fitting, yet still more right than wrong: The new features are not meant to be used "on top" of old features, but instead! When using modern Prolog features to their fullest extent, you can completely eschew those language constructs that traditionally cause the worst problems for beginners.

Moded arithmetic is a prime example of this; it can be replaced completely by arithmetic constraints in modern Prolog systems. dif/2 is also a good example: it lets you avoid (\+)/1 in many cases by using a more general predicate instead.
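A sketch of both replacements, as they behave in systems that provide CLP(FD) and dif/2 (e.g. SWI-Prolog with library(clpfd) loaded):

```prolog
% is/2 is moded: its right-hand side must be fully instantiated.
% ?- X is Y + 1.                throws an instantiation error
% The constraint version works in any direction:
% ?- X #= Y + 1, Y = 2.         binds X to 3

% dif/2 generalizes what (\+)/1 is often used for:
% ?- \+ X = Y.                  fails, although X and Y could differ
% ?- dif(X, Y), X = 1, Y = 2.   succeeds, as it should
```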

Rather limited constructs, whose operational semantics are very hard to explain and understand, often take up significant room in many Prolog courses. Replacing them with more modern alternatives will entail a paradigm shift towards better methods and also make room for new material that can be covered instead. This is because, on top of being more general, the new language constructs are often also easier for beginners to understand.

> The new features are not meant to be used "on top" of old features, but instead!

That is very powerful indeed. From incompetently playing around with stuff like that, my impression was that "constraints everywhere" is also extremely slow. That's not surprising, there's a lot of machinery working below the surface to deliver all that power. To recover performance whenever the full generality of the paradigm is not needed, I expect we'd need to work hard, say very smart static analysis, or JIT specialization, or...

But certainly my incompetence made me miss a lot. If you know how "constraints everywhere" can be made "fast everywhere", I'd love to hear about it:-)

A modern Prolog like SWI with CLP(FD) allows you to apply Prolog's semantics to arithmetic expressions in a way that you never could in classic Prolog. Instead of using `is/2`, you use `#=` and some other operators, and Prolog can find solutions for expressions that would be very difficult to handle in either classic Prolog or any other system.
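A small sketch of what this looks like with SWI-Prolog's library(clpfd): the solver enumerates the solutions of an equation that `is/2` could never "run backwards". The predicate name solutions/2 and the bound 0..100 are arbitrary choices for illustration:

```prolog
:- use_module(library(clpfd)).

% All small non-negative integer solutions of 2X + 3Y = 12.
solutions(X, Y) :-
    [X, Y] ins 0..100,
    2*X + 3*Y #= 12,
    label([X, Y]).

% ?- solutions(X, Y).
% yields X=0,Y=4 ; X=3,Y=2 ; X=6,Y=0.
```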

I find the section 5.3 from the paper "Out of the tar pit"[1] describes it very well:

> It is for this reason that Prolog falls short of the ideals of logic programming. Specifically it is necessary to be concerned with the operational interpretation of the program whilst writing the axioms.

I haven't heard of any approach that generalizes the goals of logic programming in a far better way. Is there any? (For specific use cases, having a sort of rule engine for running your "business rules" can indeed be an excellent approach.)

[1] https://github.com/papers-we-love/papers-we-love/blob/master...

Ah, that's a problem with many facets.

It is not that we can't come up with a logic language that is more declarative; it is just that telling the program everything about the universe is so damn dull. For example, the ordering of your body goals will often be based on your intuition about how many solutions each is going to have, e.g. `plus(A,B,C), solution(C,X).` You know that plus is injective, so you put it first, but from a logical perspective putting the solution predicate first is just as sensible. If the language allowed you to express the relative sizes of the predicates, then this could have been done automatically, but it would involve encoding the order of predicates somewhere else in the program!

You also need to get rid of cuts in Prolog, if you want to be closer to true declarative programming. You might like to then try Datalog (alas, not Turing complete).

Another approach is to separate concerns more clearly. The part of Prolog that encodes and solves a logical problem is often different from the one dealing with, for example, IO. Ideally, you want to be more declarative in the former domain and more imperative in the latter. This can be done (and probably has been done) with a monadic approach.

I made an online Datalog IDE with Lua extensions (Turing complete): http://ysangkok.github.io/mitre-datalog.js/wrapper.html

I do not get what you mean by plus being injective.

`plus(1, 2, C). C = 3.`

`plus(0, 3, C). C = 3.`

Isn't that a counterexample, showing how it is not injective?

Yes, I was thinking total with respect to _intuitive_ arguments, but wrote injective. Sorry.

What you're looking for is called answer set programming. It has pretty much replaced logic programming as a research focus, and a number of implementations exist.

The Wikipedia article for ASP is a very nice introduction: https://en.wikipedia.org/wiki/Answer_set_programming

Thank you - I didn't know about it (#^.^#); got some reading-up to do now....

Most programming languages fall short of some ideal. It doesn't mean you're not able to be productive or do interesting things.

For those looking for a gentle introduction to logic programming, Tarski's World was a very good book.



Might help, instead of jumping straight into Prolog.

Oh, you reminded me of a program we used in college with the same name. It was about model theory and valid logic clauses (blurry memories)...

Yep, showed up in a First Order Logic course for me. IIRC, it was bundled with a book?

Edit - this one: https://web.stanford.edu/group/cslipublications/cslipublicat...

I've tried to introduce my boy (12) to programming through Scratch, but still no big success. However, we recently solved two math homework exercises with Prolog (logic) and MetaPost (geometry), which caused some effect, so I'm thinking of going deeper into those languages, or maybe switching to Logo as an intermediate Lisp-like solution.

I know a young child who was also going through Scratch with no big success, but then the child was introduced to Racket and is now using it as a math companion tool, such as for visualizations and calculations.

The child was encouraged to build general solutions to homework, and suddenly a lot of schoolwork with tedious arithmetic seems boring to the child.

That is the best way to introduce computers to a kid: "let it do the hard work for you".

Logo used to be very popular for kids here in the '80s. I learnt Logo when I was 7, then I "progressed" over the years to Basic, C, Pascal, C++, Java, C#, Python and now, after about 28 years, Lisp. And now I realize Logo is close to Lisp in many ways and a far more powerful language than some of the ones I "progressed" to.

It's always been interesting to me how Logo, a Lisp dialect, was so successful as a language for teaching children to program, but so many people still go around with the idea that Lisp is really hard to wrap your head around.

> People still go around with the idea that Lisp is really hard to wrap your head around.

Usually, the people who genuinely had a "hard to wrap your head around" experience weren't actually using Lisp but a different language called Scheme, which is hard to use for anything practical if you stick to just the standard-defined dialect.

Chances are that when they used Scheme, it was a school setting, and they were complete or almost complete programming newbies, and were asked to do everything with recursion, and also given demanding homework, like interpreting Scheme in Scheme and whatnot.

>that Lisp is really hard to wrap your head around.

It deserves a blog post, but I'd say Lisp (at least in its mainstream Common Lisp incarnation) is easy to learn but extremely hard to master, since it allows many possibilities and paradigms, and because of the power of metaprogramming. So to master it, you need to be familiar with all the paradigms and know how to leverage metaprogramming. Also, since it allows improving performance by going down to low-level details (like observing the assembly output of your function), you had better know a bit of assembly/machine language if you want to be able to improve the performance of Lisp code to approach (or equal) C code. So previous experience with low-level programming, at least with C, would be a good prerequisite.

But yes, using it for simple stuff is easy, and in fact, while I consider Python a great "first language to learn", sometimes I feel that Lisp isn't much more difficult than Python for the same tasks. Python is much easier to master, but then it is much more limited.

Yes, my kid was impressed about solving the problem just by describing it.

Exactly in the same way my mind was blown the first time I tried Prolog!

I hated it in university and hate it to this very moment. I get what this language is doing and why it is great for special use cases. But I absolutely cannot read code written in Prolog. Staring at 3 lines of code and not knowing what the hell they do gives me nightmares to this day.

You can write a quicksort in a few lines. You can traverse a tree very easily. But understanding those few lines takes me more time than writing it in C, Java, PHP, etc. But I guess it is all training...

I've had the opposite experience. In university and later, every time I play with Prolog, I love it. Some things make so much more sense when you write them this way, and it's not just in a few very specific cases. When I learned about Prolog and had the first couple of classes using it, I was delighted! Finally a language that works the way my mind likes to think.

I am not saying that one way or language is better than the other. But I'm saying that, just as you are faster rewriting in C or Java, there are people who can rewrite or understand things faster the other way. :)

Programming is fun!

I love Prolog, to the point of having taken part in the Portuguese university national logic championship back in the 90's.

In our degree, we had three logic programming classes, starting from logic programming as pure math all the way to deep Prolog exercises.

I didn't know there were Logic championships using Prolog!

P.S.: You're Portuguese, a Lisper, and also a Prolog fan? You impress me more each day, pjmlp!

Thank you, but actually it is due to the quality of many of the teachers I had the opportunity to interact with at the 90's version of FCT/UNL. :)

I found that these kinds of languages make a LOT more sense after learning some logic [1], especially relational logic, to the extent that it's essentially the same thing.

[1] http://logic.stanford.edu/

> You can write a quicksort in a few lines. You can traverse over a tree very easily. But understanding these few lines takes me more time than writing it in C, Java, PHP etc

I'm still only a beginner at Prolog, but that's what I enjoy about it: it's an entirely different way of tackling problems that really stretches my brain, not just new syntax on the same old paradigm that I already know.

Something tells me that it would probably be easier if this was your first programming language. Actually I expect that some would find this declarative style a lot more intuitive and natural.

But when you've spent many years writing imperative code it feels so alien. It reminds me of the first time I attempted to write Verilog, it's hard to break the mold and change the way you think about the flow of a program and how you construct algorithms. Eventually it clicked for me and Verilog though, so I expect with enough dedication one could do the same with Prolog.

I don't think working with numbers is a strong side of Prolog. Quicksort in Haskell is also 3 lines, but it's totally readable. I can make use of functional programming in my daily life, but logic? Never. But think of the Zebra puzzle. In Prolog the solution is 100% declarative.

BTW, I hate Prolog too.

I agree that it is hard to think in Prolog; formulating the problem as logic rules is not natural for us after we've done imperative programming. I imagine this is true for the small problems that are popular in education, but in a real-world scenario a hard problem (one that is suitable for Prolog) would be much harder to do in C or another imperative language, and adding new rules would probably be easier in Prolog.

Some problems are harder to express in Prolog (than in a conventional imperative language), but then there are some problems that are far, far easier to solve in Prolog.


You can learn Prolog at more than one university in the world.

Sure, but some courses are taught in more or less enjoyable ways.

I've seen a few conference talks about logic programming and read a few articles, and aside from the intellectual curiosity itself, I can see that it is very elegant in certain circumstances (although I lack the hands-on experience to feel what those circumstances are).

Since doing a whole program in prolog isn't really practical for me, is there any way to use the logic programming paradigm in mainstream programming languages to deal with specific problems that fit it?

MiniKanren is an embeddable language for logic programming, with implementations in a wide variety of languages: http://minikanren.org/

The point of Prolog as a practical logic programming language really is that it is portable across Prolog implementations based on ISO Prolog (and maybe prolog-commons API compatibility). If you're using a Prologish tool (a Java framework, say) that is not quite Prolog, then you're losing that compatibility for very little gain, IMHO.

Applications domains I've used Prolog for: discrete planning, game logic, limited/rule-based NLP such as postal address checking, combinatorial test case generation, decision problems in formal language theory used for eg. deciding properties of business processes, other small problems in software engineering such as code generation from domain models with roundtripping, or O/R mapping.

> The point of Prolog as a practical logic programming language really is that it is portable across Prolog implementations based on ISO Prolog (and maybe prolog-commons API compatibility). If you're using a Prologish tool (a Java framework, say) that is not quite Prolog, then you're losing that compatibility for very little gain, IMHO.

I feel like I might be missing something obvious, but I'm not getting what the "compatibility" you refer to would give me in my specific situations. It looks like you're saying "if you write a Prolog program, it's compatible across Prolog platforms." Well, sure, but how is that a fundamentally more portable situation than (for example) "if I have a logic programming framework in ES16, it works in all browsers that run modern JavaScript"?

If I'm specifically trying to do something in the browser (again, for example) and have a problem within that context where logic programming might be useful, having a way to use logic programming in JS makes more sense, doesn't it?

I'm not disagreeing that if you're developing a Web app, embedding logic or constraint programming techniques into a JavaScript API could make sense. OTOH, there's an obvious way to exchange data between JavaScript and Prolog, in that JSON can be parsed by using Prolog's op/3 predicate and its operator-precedence parser for defining custom DSLs.

What would be the workflow in Prolog if I want to import a set of facts from an external source, like a database/JSON/CSV?

JSON could be parsed as a term by Prolog, with suitable priority/associativity definitions for the comma, colon, and curly brace operators.

For CSV, you could write a DCG grammar for the particular file format at hand.
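In that spirit, here is a minimal DCG sketch that splits one record on commas. The nonterminal names (record//1, field//1, rest//1) are made up for illustration, and there are no quoting or escaping rules:

```prolog
% Parse one CSV record (a list of character codes) into
% a list of field atoms. 44 is the character code of ','.
record([F|Fs]) --> field(Cs), { atom_codes(F, Cs) }, rest(Fs).

rest(Fs) --> [44], !, record(Fs).
rest([]) --> [].

field([C|Cs]) --> [C], { C =\= 44 }, field(Cs).
field([])     --> [].

% ?- phrase(record(R), `a,b,c`).
% R = [a, b, c].
```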

Integration with a relational DB, itself being based on logic, would be very different. Most Prologs have APIs to map tables to predicates.

If you replace Prolog with SQL - does it make more sense?

Yes, and I missed it because I was thinking in terms of "hmm, I wonder if there are any points in my client-side code-base that could be cleaned up by using logic programming?"

>Since doing a whole program in prolog isn't really practical for me, is there any way to use the logic programming paradigm in mainstream programming languages to deal with specific problems that fit it?

Yes; for example, it is easy to embed Prolog code in Common Lisp by using a suitable library. For example, the LispWorks implementation includes a full Prolog package. Example:


With this, you can use the logic paradigm within the same project, along with the other paradigms supported by Common Lisp: Procedural, functional, meta, imperative, dsl-based and OOP.

Racket should also allow very easy mixing of Prolog source with the rest of the project, since Racket was designed to support different languages at once.

Not 100% sure I'm answering your question, but I used Prolog mixed with Java a while ago as a high-level layer for problem representation. It's not brilliant, but at the time it did help me implement quite complex agent behavior within a multi-agent environment.

Racket has a Datalog too: https://docs.racket-lang.org/datalog/index.html You can mix languages in your Racket programs as you like; very powerful.

You can use embedded Prolog to implement type inference in your compiler. AFAIK that's how it's done in the Rust compiler.

SWI-Prolog has a Java API. You can also use Datalog to get started; it's the non-Turing-complete subset of Prolog (you cannot construct new compound terms in rules, for example). The best-known Datalog database is Datomic, which you can use in the Java ecosystem, but there is also pyDatalog for Python, which is really quite nice.

If Clojure is your thing, there is core.logic: https://github.com/clojure/core.logic

Check out https://github.com/rvirding/erlog with Erlang/Elixir

It seems we have come full circle (the Erlang VM was originally written in Prolog).

I can come up with the following possibilities, although I have no experience with either:

- Using OWL in Java (with OWL API). So you use Java in combination with XML

- Use a Prolog embedding in Haskell.

FYI, there's an updated version of the 99 Prolog problems by its original author here: https://sites.google.com/site/prologsite/prolog-problems

Thanks for the hint, I've updated the URL to point to the new version.

I don't know if this is just a coincidence, but this blog post is fantastically similar to another [1] on Prolog, down to the same quotes and the same example, though with different code.

[1] https://bernardopires.com/2013/10/try-logic-programming-a-ge...

Another excellent resource for learning Prolog is the book "The Art of Prolog." One of the finest language texts you will ever read.

I read it recently and I can't recommend it enough. Even if you aren't really interested in Prolog, it is a must read.

Aja Hammerly[0] gave a talk[1] at RubyConf this year titled "4 Programming Paradigms in 45 Minutes", and she used Prolog as her declarative language. She implements a simple cash register in OO (Ruby), functional (Racket), declarative (Prolog) and procedural (assembly) styles. The Prolog one turned out to be the shortest. It was really interesting!

[0] https://github.com/thagomizer [1] https://confreaks.tv/videos/rubyconf2017-4-programming-parad...

> Prolog has a straightforward syntax. You will pick up the few rules very quickly. Let us transform the Socrates example from earlier into an actual Prolog program and analyse what is happening inside the compiler.

Yeah, and no. This skips over the fact that this will not work in swipl, and you need to write those rules in a separate file, and compile/load them.

This is mentioned way after the entire introduction, after the reader has already given up on trying anything with Prolog.

Is anyone reading this aware of any work on logic programming in the large, i.e. efforts that harness the fact that answer set programming avoids committed choice (and so enables us to write reliable, tractable programs) but aim to make that possible in teams or in distributed settings?

I haven't come across anything on answer set programming in the large, as you put it. It is still pretty academic and mainly used for solving puzzle and optimization problems.

And yet declaring all business logic was once the way that all decisions would be encoded!

Looking at this from the more general perspective of the symbolic AI approach, I would say it has had its day. Data-driven systems are proving to be a lot more powerful and easier to build. I remember I used to spend ages trying to encode my problems in AnswerSet or AgentSpeak and come up with all the rules. It never worked!

Also, is there work that tackles industry concerns like testing your logic programs?

I know how to code in Prolog. How can I profit from that, apart from my academic (undergraduate) interests?

Can I build something in particular, can I look for specific jobs?

There are not that many companies using Prolog, but specialized AI companies still do. I've worked as a consultant programmer developing large decision-support systems for engineering projects, and the bulk of the domain logic was written in Prolog. Most operational stuff was still in C++ or Python, so even on that project there were as many jobs for Python programmers as there were for Prolog programmers.

You don't truly appreciate the declarative paradigm until you are able to describe (and find the best among) billions of complex possible configurations of nuts and bolts following all these engineering laws, practices and rules of thumb... in a hundred lines or less.

I love Python and I'll introduce myself as a Python programmer, but part of me will always have a deep respect for Prolog and how utterly awesome it is in the right context.

How does Prolog relate to ML systems? It seems they work in a similar way. Input some learning data and then the system produces output based on these.

Eh, they are quite different under the hood. To put it really simply (and probably quite sloppily): Prolog determines the output by applying logic. ML uses mathematics to analyze the input and "guesses" the output based on the training set.

As a corollary: Prolog will treat as false anything that can't be proven true from the rules you give it. An ML algorithm will always give you some answer, though the quality depends on the learning data.

Cool. Thanks! I am watching these trends from the outside and it's hard to keep up if you don't work directly with the tools.

I think he meant ML as in the family of languages descended from Meta Language?

He did say systems though so you are probably right

Also, the part about "input some learning data" makes me think of machine learning rather than Meta Language.

Btw, Prolog and ML might look like distant cousins: e.g. you use recursion and pattern matching in both. But under the hood there are big differences that arise from Prolog's logic programming paradigm: for instance, you can use append(L1, L2, L3) both to concatenate and to split a list, depending on which variables you set:

  append([1,2], [3,4], X)       % ==> X=[1,2,3,4]
  append([1,2], Y, [1,2,3,4])   % ==> Y=[3,4]

Back in 2006, I studied Prolog in college as a prep course for A.I. From my experience then, Prolog was easy for programming small tasks. I ended up building a card game with some intelligence as a final project. Even then, I spent more time designing how cards showed up in the console than on the actual coding of the gameplay/intelligence.

It could be fun to retry Prolog now.

Great tutorial! Made me want to go back to Prolog again. A little side note for those who run macOS with brew: you can install Prolog with "brew install swi-prolog" instead of using the binary version.

A Prolog interpreter in JavaScript, no server-side code: http://prolog.jldupont.com/

That Uni course is rubbish
