The birth of Prolog (1992) [pdf] (colmerauer.free.fr)
100 points by alokrai on Feb 23, 2021 | 45 comments



If curious, past threads:

Prolog and Logic Programming Historical Sources Archive - https://news.ycombinator.com/item?id=22658770 - March 2020 (33 comments)

The Birth of Prolog (1992) [pdf] - https://news.ycombinator.com/item?id=18178215 - Oct 2018 (39 comments)

A Tribute to Alain Colmerauer (2001) - https://news.ycombinator.com/item?id=15254369 - Sept 2017 (1 comment)

In Memoriam Alain Colmerauer: 1941-2017 - https://news.ycombinator.com/item?id=14399688 - May 2017 (5 comments)

Others? Prolog itself is probably too big a theme to list here, but Prolog origins?

(Edit: thanks to commenter who supplied now-deleted reference.)


The history of Erlang is also intimately related to Prolog, as explained by Joe Armstrong in this article:

https://www.cse.chalmers.se/edu/year/2009/course/TDA381_Conc...

It's quite amazing how one can implement Erlang or basically any other language prototype as a Prolog metainterpreter.
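
For anyone who hasn't seen it, the core of such a metainterpreter is tiny. Here is a minimal sketch of the classic "vanilla" interpreter (the predicate names are just convention; it handles only pure user-defined predicates, no built-ins or cut, and the interpreted predicates are declared dynamic so clause/2 may inspect them):

    % Vanilla meta-interpreter for pure Prolog goals.
    solve(true)   :- !.
    solve((A, B)) :- !, solve(A), solve(B).
    solve(Goal)   :- clause(Goal, Body), solve(Body).

    % Example program for it to run.
    :- dynamic(parent/2).
    :- dynamic(ancestor/2).
    parent(tom, bob).
    parent(bob, ann).
    ancestor(X, Y) :- parent(X, Y).
    ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).

    % ?- solve(ancestor(tom, ann)).
    % true.

Swapping the clause/2 lookup for your own rules about what a call means (spawning a process, sending a message, and so on) is the basic trick behind prototyping other languages this way.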


There's also this classic paper that actually contains Prolog code for (reconstructed versions of?) the first metainterpreters that became Erlang: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.34....


That’s amazing, thanks. Somehow I had never stumbled upon this before.


Strict logic programs (including Prolog) do not implement concurrency, which motivated the invention of Actors.

For more information see the following:

"Information Security Requires Strongly-Typed Actors and Theories"

https://papers.ssrn.com/abstract=3418003


When I did my degree, the compiler assignment could be done in any language except Lisp and Prolog as the implementation language.

The rationale? It would be child's play if we used them.


As Kowalski has acknowledged and as others have reported, Prolog was invented as a subset of Planner that only implemented backward chaining.

See the following for further information:

"Middle History of Logic Programming: Resolution, Planner, Prolog and the Japanese Fifth Generation Project" ArXiv 2009.

https://dev.arxiv.org/abs/0904.3036


> As Kowalski has acknowledged and as others have reported, Prolog was invented as a subset of Planner that only implemented backward chaining.

From the featured article: "It was at this time that we learned of the existence of Carl Hewitt’s programming language, Planner [20]. The lack of formalization of this language, our ignorance of Lisp and, above all, the fact that we were absolutely devoted to logic meant that this work had little influence on our later research."

Personally I think I'll believe Colmerauer on this one.


Thanks Tom!

Being a linguist of note, Colmerauer evidently didn't know (or didn't want to admit) that Kowalski had been carefully studying Planner and SHRDLU (Winograd's PhD thesis based on Planner) for quite a while in Edinburgh, where he felt quite in the minority defending pure mathematical logic against the onslaught of Popler (Pop2 Planner), etc.

    The formalization issue was addressed by the invention of
    Actors for all of digital computation including many
    computations that cannot be performed using pure logic
    programming (including Colmerauer's Prolog).
    Furthermore, many practical applications can be implemented
    using Actors hundreds of times faster than they can be 
    implemented using pure logic programming.
PS. Have you actually read the following article?

"Middle History of Logic Programming: Resolution, Planner, Prolog and the Japanese Fifth Generation Project"

ArXiv 2009.

https://dev.arxiv.org/abs/0904.3036


Kowalski himself says that he thought that Planner was similar to resolution and that the similarities were not well understood at MIT. The following is my transcription of Kowalski being interviewed for the thesearch.space podcast. I surround with square brackets places where I didn't understand or am unsure of what was said:

At MIT criticism of resolution which led to the development of the Planner language of Carl Hewitt and its use by Terry Winograd to implement a large portion of his PhD thesis which was extremely influential at the time... and this was promoted as an alternative to logic and as a procedural representation of knowledge as opposed to a declarative representation of knowledge. So this [muddied] the waters [so to speak] and it created enormous confusion. On the one hand you have resolution [...] logic and its applications to problem-solving, then you have the Planner language which is characterised as a procedural approach alternative to logic and yet it was clear to me that there where [similar to others] the Planner language had similarities to resolution which were not understood and it was trying to understand that relationship and that similarity that I would say gave rise to logic programming as I characterised it so far.

The podcast can be found here (I'm not affiliated with it in any way):

https://thesearch.space/

The passage above starts at around 29:53. My apologies for the poor quality of the transcription.

>> Planner and SHRDLU (Winograd's PhD thesis based on Planner)

Terry Winograd's SHRDLU remains pretty much unsurpassed as a system even today. Wikipedia quotes an excerpt from Winograd's thesis that lists a dialogue between SHRDLU and its user. There's a particular exchange that blows my mind every single time I read it:

    Person: Does the shortest thing the tallest pyramid's support supports support anything green?
    Computer: YES, THE GREEN PYRAMID.
https://en.wikipedia.org/wiki/SHRDLU

I mean, at that point he's just showing off! I can only imagine him snickering to himself imagining the faces of people reading that.

Still, for me personally resolution has proved to be the right way to go, historically. In particular, I'm prepared to bet that Terry Winograd would have saved himself a lot of pain had he been able to use Prolog's Definite Clause Grammars to write SHRDLU's natural English parser. In Prolog, a grammar, even a natural language grammar, is a theorem and parsing it is a proof, just like for every other program; in every other language a grammar is the input to a program that must be written specially for the task of parsing.
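
To make the DCG point concrete, here is a toy fragment (vocabulary made up for this comment, nothing to do with SHRDLU's actual grammar):

    % A toy DCG over a tiny blocks-world vocabulary.
    sentence(s(NP, VP))    --> noun_phrase(NP), verb_phrase(VP).
    noun_phrase(np(D, N))  --> det(D), noun(N).
    verb_phrase(vp(V, NP)) --> verb(V), noun_phrase(NP).

    det(d(the))       --> [the].
    noun(n(pyramid))  --> [pyramid].
    noun(n(block))    --> [block].
    verb(v(supports)) --> [supports].

    % ?- phrase(sentence(T), [the, pyramid, supports, the, block]).
    % T = s(np(d(the), n(pyramid)), vp(v(supports), np(d(the), n(block)))).

And because the rules are just Horn clauses, the same little grammar can generate sentences as well as parse them.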


Thanks YeGoblynQueenne!

My criticism at the time was about resolution uniform proof procedures. I was skeptical that a single procedure could prove typical complex mathematical theorems. Procedural embedding of knowledge was invented to overcome the limitations of resolution uniform proof procedures.

    Unfortunately, resolution requires flattening the structure of
    a mathematical proposition into clausal form, thereby
    losing all of its structure. An important innovation of
    Planner was *not to require conversion into clausal form.*
    Instead, each Planner backward-chaining procedure is invoked
    by a noncompound goal. Furthermore, each Planner forward-chaining
    procedure is invoked by a noncompound assertion. Prolog is a
    subset of Planner in which a Prolog procedure is invoked by a
    single noncompound goal. *Prolog does not have forward chaining.*
BTW, Kowalski and I continue to disagree about the fundamental nature of pure logic programming. My thesis is that a logic program is characterized by the requirement that each computational step must be completely justified by a rule of mathematical logic.


Thank you for your reply, professor. Can I ask, what do you mean by "a rule of mathematical logic"? To my mind, resolution should fit the bill, but obviously you're saying something else, since you're critical of resolution.

About flattening the structure of a mathematical proposition, I think I understand what you mean, but that should not really be a problem. It should be possible to express a mathematical (or any other) concept in different formalisms without losing its meaning. The structure itself is not that important, as long as the meaning remains the same. Horn logic happens to be a formalism that is both maximally expressive and semi-decidable (SLD-resolution is sound and complete for refutation, or with subsumption). Given that it can also be communicated to a computer without any further changes of formalism, it is also very useful.

In any case, thank you again for your contributions to this thread and your replies to my comment. Your perspective of logic programming is invaluable.


You are very welcome!

By "a rule of mathematical" logic, I meant a standard logical

inference rule such as Modus Ponens or Double Negation Elimination.

    No modern proof engine requires flattening a proposition into
    clauses, which often makes the proposition more difficult to
    understand and process. Just because the clauses are logically
    equivalent doesn't mean that they are always useful.
Thank you very much for your own contribution :-)


Ah, so you do mean inference rules? But resolution is an inference rule, so I'm guessing you mean _classical_ logic rules. The way I was taught it (informally), resolution suffices to replace all classic inference rules, but of course there is the restriction of applying it to clauses, which I understand you're not happy with. OK, I think I get it :)
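
(For anyone else following the exchange: in Prolog, every call is one resolution step. A tiny runnable illustration with made-up predicates:

    r(a).
    s(a).
    q(X) :- r(X).

    % ?- q(a), s(a).
    % One SLD step resolves the goal clause ¬q(a) ∨ ¬s(a) against the
    % program clause q(X) ∨ ¬r(X), with unifier X = a, leaving the
    % new goal ?- r(a), s(a).

That restricted form, SLD resolution on Horn clauses, is the inference rule at the core of a Prolog engine.)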


Good summary :-)

A resolution uniform proof procedure had the problem that once all the propositions had been flattened into clauses, the proof procedure did not have a good idea where to focus its efforts.


The link provided takes me to a "Test server", and an abstract titled "Inconsistency Robustness in Logic Programs". Clicking on the pdf download link returns "Not Found". Searching arxiv for "middle history of logic programming" produces no results.

I would really like to read more about Planner, but I have had no luck searching for this paper.



Thank you very much for helping me find this paper, but I am still somewhat confused.

When I couldn't find the paper on arXiv, it didn't occur to me at first to go back to the beginning and use a general search engine. When I did plug the title into DDG it took me to arXiv, which had the expected abstract and a working link to the pdf. But doing another search on arXiv returned "No Results", even though I had put in the exact title from the abstract I was just looking at.

A little more digging uncovered the fact that the paper in question has gone through 41 revisions, and somewhere along the line the title was changed from "Middle History of Logic Programming: Resolution, Planner, Edinburgh LCF, Prolog and the Japanese Fifth Generation Project" to "Inconsistency Robustness in Logic Programs", which is still rattling my brain a little bit.

edit: The working link I found is https://arxiv.org/abs/0904.3036v22


That's interesting, one slightly different link I found was

https://arxiv.org/pdf/0904.3036.pdf

Seems like your link is versioned (v22) and the one I found was unversioned... I wonder which one is the latest?


That's quite odd!


I was wondering if anyone knows of a resource in the same vein as [1] or [2] but for implementing a Prolog interpreter instead of a Lisp.

1: https://github.com/kanaka/mal

2: http://www.buildyourownlisp.com/


Traditionally the WAM[1] has been the route, but see Paul Tarau's "A Hitchhiker’s Guide to Reinventing a Prolog Machine"[2]. Check out also Nils M Holm's "PROLOG Control in Six Slides"[3]

[1] https://en.wikipedia.org/wiki/Warren_Abstract_Machine

[2] https://www.cse.unt.edu/~tarau/research/2017/eng.pdf

[3] https://www.t3x.org/bits/prolog6.html https://news.ycombinator.com/item?id=20591771

I don't know anything about it but there's "Prolog.c: a simple Prolog interpreter written in 200 LOC of C++ (2000) (cam.ac.uk)"

https://news.ycombinator.com/item?id=12193694

http://www.cl.cam.ac.uk/~am21/research/funnel/prolog.c


Perhaps not quite what you're looking for, but the free MIT book "Structure and Interpretation of Computer Programs" (https://mitpress.mit.edu/sites/default/files/sicp/full-text/...) has a section on interpreting logic programs (section 4.4) in Scheme. I think it's sufficient to understand the semantics of Prolog, but I'm not sure how much of the rest of the book you'd need to read to understand that chapter (it's an amazing book though!)

edit: to be clear, in the book you interpret a logic programming language with lisp-like syntax


You could study core.logic: https://github.com/clojure/core.logic/tree/master/src/main/c...

I swear I'd bookmarked a resource that was more analogous to #2, but you may want to have a look at The Little Prover: https://mitpress.mit.edu/books/little-prover


Yep! Or go for miniKanren if you want something like core.logic, but outside of clojure.

Will Byrd has a lot of documentation out there. (Scheme focus of course.)

I think the easiest way for a programmer with a vanilla background to “get it” is here:

https://codon.com/hello-declarative-world


There are details on how to implement Prolog (in Common Lisp) in Peter Norvig's Paradigms of Artificial Intelligence Programming: https://github.com/norvig/paip-lisp/blob/master/docs/chapter...

There's also Implementations of Prolog: https://archive.org/details/implementationso0000unse (limited preview)


Implementations of Prolog was published in 1984; its contents predate the Warren Abstract Machine.

Pages 87-92 contain the article "The world's shortest Prolog interpreter?" by M. Nilsson of Uppsala University. The article describes and provides source code for a simple Lisp implementation of Prolog.


It's not exactly what you're asking for, but there is "Racklog: Prolog-Style Logic Programming" https://docs.racket-lang.org/racklog/ which is "an embedding of Prolog-style logic programming in Racket". You could see how they added Prolog style logic functions to the Racket programming language.


I have not looked at your [1] or [2], but I seem to vaguely remember that either a PG book On Lisp or ANSI Common Lisp (wordplay accidental but retained about the "On", heh) or a Peter Norvig book (one of his two main ones) has a chapter or two on implementing Prolog interpreters.

Edit: Oops. After posting my comment, just saw DonaldFisk's comment here: https://news.ycombinator.com/item?id=26244021 So Norvig's PAIP may be the book where I saw it.

Edit 2: Also just saw gglitch's comment:

https://news.ycombinator.com/item?id=26242603

which confirms my first surmise about both the PG and Norvig books containing it.


The closest thing that I know of is "Warren's Abstract Machine: A Tutorial Reconstruction" by Hassan Aït-Kaci. It's out of print but available as a PDF at http://wambook.sourceforge.net



I'm skimming through it. I wonder if people prefer this or Kanren over good old Prolog.


I definitely prefer miniKanren, but (with a small sample size) find that non-logic programmers have an easier time reading simple Prolog than equally-simple miniKanren.


I believe several of the canonical Lisp books implement a simple Prolog as an exercise. I'm pretty sure Norvig's Paradigms of Artificial Intelligence Programming, pg's On Lisp, and Winston & Horn's Lisp all do.


The relevant section of PAIP: https://github.com/norvig/paip-lisp/blob/master/docs/chapter...

Also worth exploring The Reasoned Schemer which teaches minikanren, which can be (and has been) implemented in many programming languages. It explores much of the same space as Prolog. I wrote most of a minikanren in Common Lisp while working through the book (someone else had already published a version to quicklisp which was better than what I made).


The PAIP implementation seems more practical than the other textbook Prologs I've seen like SICP's, while still quite small and clear.


If you've never gone through PAIP, you're in for a treat. Most of the code is very practically written (I'd wager all, but it's been a while since I read/worked through it). Norvig writes good code, see also his website and many of the Python programs he's published over the years since he switched to it.



I was hoping there would be something like this as well; all of the Prolog interpreters out there are either too complex or do not explain how they work. I might have to write something myself, as implementing your own (toy) interpreter is a good way to fully understand the concepts behind the language.


You might like to check out Chapter 2 of "The Art of Prolog"; it has enough information to tell you how to make a bare-bones Prolog interpreter.


One of the debates I always enjoy when it shows up on HN is the comparison of lightweight serialization formats, especially, e.g., sexprs vs json. I don’t believe I’ve ever seen Prolog terms show up, but they seem apposite.


Basic JSON support using Prolog's built-in operator precedence parsing or pio lib should be as easy as

    :- op(600, xfy, :).
    
    MyJSON =
      { "this": "bla",
        "that": 100,
        "L": [ 1, "two", 3 ] }
provided your Prolog has curly braces (should come with DCGs).
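
Once a term like that has been read in, taking it apart is just unification. A small sketch (assuming SWI-Prolog's double-quoted strings; only illustrative, not a real JSON library):

    % {X} is sugar for '{}'(X), and  a, b, c  is  ','(a, ','(b, c)).
    ?- JSON = { "this": "bla", "that": 100, "L": [1, "two", 3] },
       JSON = {Pairs},
       Pairs = (Key:Value, _Rest).
    % Key = "this", Value = "bla".

String escapes, Unicode and number corner cases would of course still need real handling, as noted elsewhere in the thread.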


One of those three is not like the others. Prolog is homoiconic. JavaScript programs are not JSON. Parsing Prolog terms is easy. Parsing JSON correctly is very difficult: http://seriot.ch/parsing_json.php


> Parsing Prolog terms is easy. Parsing JSON correctly is very difficult: http://seriot.ch/parsing_json.php

Most of that page is about character escaping as well as Unicode and floating point corner cases. You'll need to handle the same corner cases if you want to write a robust Prolog parser (it's slightly simpler if you cheat by rejecting Unicode).


It seems to be conventional to use the homoiconicity of Tcl to serialize data out in a way intended to be interpreted back in again as code [0], but I don’t think I’ve seen this recommended by Lispers.

[0] - e.g. https://trs.jpl.nasa.gov/bitstream/handle/2014/7660/03-1728....



