Aside from being one of my favorite programming books ever (on par with SICP and PAIP, I think), the font they used is really beautiful. The PDF perhaps does not do it justice. On paper it's really crisp and pleasant to read.
I hope MIT Press opens up The Craft of Prolog too, a fantastic sequel to The Art of Prolog.
I wonder if the MIT Press — or the author — has the source files (presumably LaTeX) somewhere. It would be a shame to have a low-quality scan as the "definitive" archived version.
This book was set in TeX by Sarah Fliegelmann
at the Weizmann Institute of Science
and printed and bound by MIT Press
View the page in a desktop browser. Directly below the book's cover there's a "Resources" heading, and to its right are tabs labeled "Overview", "Author(s)", and "Open Access"; "Open Access" is the one you want.
It has been a while since I worked with Prolog professionally, but the memories are fond.
Good to see that the PDF is available for download.
I'm OK with ebooks (I actually prefer them), but my wife loves the feel of a real, physical book in her hands. Because of Covid-19 she has been reading more ebooks, but as soon as the library opened back up (with limitations) she went back to requesting as many books as she could.
Let's go with musician.
I have no complaints about the book, but I've been struggling to figure out an easy setup for "Prolog as a web app backend." Typical Prolog implementations are not designed to be used as concurrently-accessed databases. Datalog seems to be a nearby technology that is more pointed towards that use case, but even then, there seem to be scant few implementations of Datalog databases with transaction support, and near zero that are open source.
RDF/OWL databases seem to be another conceptually-near possibility, but they also seem to have few implementations that are prepared for high-scale concurrent access.
Overall, it seems that there is a decent amount of interest in "logic-type" query languages as a curiosity, but little attention to implementations that support use in OLTP business applications, which makes adoption difficult.
Does anybody have any advice?
On one hand, I think the paradigm really simplifies some things, but then on the other hand, everything else (aka routine programming tasks) seems much more convoluted and awkward.
I think PAIP may have been on to something, embedding Prolog in Lisp, but that's less elegant and more difficult in most other languages.
There are a couple of tutorials listed under Tutorials > Web applications on the website. Probably the place to start is:
Tutorial — Creating Web Applications in SWI-Prolog
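For the impatient, a minimal web server in SWI-Prolog looks roughly like the sketch below. It uses SWI-Prolog's documented HTTP libraries; the handler name, path, and port are arbitrary choices for illustration.

```prolog
:- use_module(library(http/thread_httpd)).
:- use_module(library(http/http_dispatch)).

% Map the /hello path to the say_hello/1 handler.
:- http_handler(root(hello), say_hello, []).

% Start the multi-threaded server, e.g. ?- server(8080).
server(Port) :-
    http_server(http_dispatch, [port(Port)]).

% A handler receives the parsed request and writes an HTTP response.
say_hello(_Request) :-
    format('Content-type: text/plain~n~n'),
    format('Hello from Prolog!~n').
```

The tutorial above covers the same libraries in much more depth (dispatching, parameters, sessions, and so on).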
But in a profession that revolves around getting Jira tickets into the Done column while my boss hovers over me with a stopwatch (so to speak), and then cramming leetcode for interviews when it's time to hop from one shitshow to the next ... I've come to the sad conclusion that, in this season of my life, I just can't justify spending the additional time on such books.
These books might make me "a better programmer" for some amorphous, platonic definition thereof, but in the year 2020 will these actually help me to materially progress career-wise?
Regarding just how it might make one a better programmer (while intentionally not answering anything about how it can "materially progress" one's career), I think that learning Prolog can make one a better problem solver because it forces you to specify the solutions to problems in terms of their truth conditions. This, in turn, makes you more likely to tackle problems by constructing mini proofs. I find this to be valuable no matter the language or language family I happen to be working in and it's a habit I definitely learned from writing Prolog.
So here's my personal experience. I first learned about Prolog in a second
year CS course at uni. I got hooked, much like other folks get hooked on Lisp
and Haskell etc. I stopped saying things like "nobody uses Prolog in the
gaming industry", like my classmates, and started asking questions like "wait,
if facts and rules are predicates, then what are queries? ". So I started
reading every textbook on the language that I could find in the university
library. Most of them were AI textbooks also, from the time when AI was
basically hand-crafted programs written in Lisp and Prolog. From that I got
interested in AI in general and, after working for a few years in the
industry, I started an MSc course on AI.
My MSc course was almost exclusively about statistical machine learning (I was
told they were planning to rename it to "data science"). I started the course
(part-time, while working) in 2014, right around the time when machine
learning started becoming a hot topic; a rather fortunate state of affairs. I
didn't go back to the industry with my hot new skillz though. Instead I
started a PhD on Inductive Logic Programming, which is basically machine
learning of logic programs, a subject that perfectly fits my background (in
logic programming and machine learning).
And that's where I am now: I'm in my final year and waiting to hear from the
editors of the Machine Learning Journal about my submitted paper that
describes a polynomial time ILP algorithm (exponentially growing hypothesis
spaces are a bitch, but you can avoid them with One Simple Trick). We'll see
how that goes; one does not simply publish to the MLJ. But if I get that
publication it will be a major step forward to getting my thesis accepted.
Er. Once I've actually written it.
Anyway, moral of the story: don't just read a bunch of books. Find a deep
interest in computer science instead. For me, my undergraduate wonderment at
Prolog led me down the path of becoming a computer scientist, growing my
knowledge and honing my skills. There's no way you can go wrong with that.
(I forgot to mention I'm doing my PhD at one of the top rated universities for
engineering research; not bad for someone who started as an unskilled
immigrant working at a warehouse :)
 Also predicates. Mind blown.
Sure, you can learn French and grammar in general without knowing Latin and most do, but it sure is easier if you do.
In Prolog programming (in contrast, perhaps, to life in general) our goal is to fail as quickly as possible.
"Let-it-crash" had to do with the idea that a wobbling process/service is harder to design around than one that's just definitively dead and also that most non-systematic errors in running programs are "Heisenbugs" (the problem is transient) and restarting the process/service will make it go away.
In Prolog the point of "fail-fast" is to determine as quickly as possible if some search branch can't possibly provide a True result, so that you don't spend unbounded time searching for every possible version of false.
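A sketch of that idea (the predicates `color/2` and `expensive_valuation/1` are made up for illustration): put the cheap, restrictive test first, so a doomed branch dies immediately instead of after doing expensive work.

```prolog
% Good: cheap guard first; if the car isn't red, the branch fails at once.
fast_pick(Car) :-
    color(Car, red),           % cheap test, prunes most candidates
    expensive_valuation(Car).  % only runs for the few red cars

% Bad: the expensive goal runs for every candidate before the guard.
slow_pick(Car) :-
    expensive_valuation(Car),
    color(Car, red).
```

Both versions compute the same relation; only the time spent in failing branches differs.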
Ideas are contagious.
Joe Armstrong and S.R. Virding wrote a case study chapter for the book, "Strand: New Concepts in Parallel Programming," and it has a section explaining the history of Erlang up to 1990.
The first Erlang prototype, written in Prolog, ran into performance problems with backtracking and emulated concurrency, so it was ported to a Parlog system (a parallel Prolog) written by S.R. Virding.
The Parlog implementation was still too slow, so a third implementation was created that compiled Erlang into Strand88, a commercial product from Strand Software Technologies, Inc. which claimed to be the first commercial parallel logic programming language. That implementation was finally fast enough that Erlang could be used outside of the Computer Science Lab at Ericsson.
Presumably at some point after that it was re-written at least one more time...
I learned Prolog in high school. It was a challenge at first -- I didn't get it right away. But when I did, I was amazed.
Yet I have an ongoing problem with Prolog that I've never quite put my finger on -- it's something about how side effects intermix with pure logic. I find it makes the flow of activity hard to follow.
Can anyone else relate to this and perhaps explain it better?
It builds on the concepts of Prolog in a purely functional, strongly typed way. The main compiler, PAKCS, actually targets Prolog. Hard to wrap your head around without knowledge of Haskell, but worth taking a look at.
DEBORDEMENT DE PILE ("stack overflow")
Now, a lot of people, when they hear the term "constraints", think of arithmetic inequalities. And indeed, there are "Constraint Logic Programming" extensions to Prolog that support such constraints. But that's not what I'm referring to. Even in the absence of such extensions, Prolog is a constraint language, but the constraints are structural rather than numerical.
Let's take a simple example, the Prolog code that defines list appending:
append([], X, X).
append([X|Y], Z, [X|W]) :- append(Y, Z, W).
Prolog allows any of those arguments to be inputs, and any to be outputs. The way this works is that variables in structures function as named holes to be filled in. So while you can use 'append' as a function to append two lists, like this:
?- append([3], [7], X).
X = [3, 7]
?- append([3], [7], [3, 7]).
true
?- append(X, [7], [4, 7]).
X = [4]
?- append(X, Y, [4, 7]).
X = [], Y = [4, 7] ;
X = [4], Y = [7] ;
X = [4, 7], Y = []
The catch, as others have mentioned, is that it does that using a rather naive depth-first search. Even though, at the lowest level, it tries each alternative very quickly, there can easily be a combinatorial explosion of alternatives. To use the language effectively requires the programmer to learn strategies to avoid these explosions. So while in small examples the language seems very declarative, and indeed a lot of code can be written without too much attention to its procedural behavior, eventually this breaks down and you have to think about it procedurally as well.
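A classic illustration of that breakdown (a sketch of my own, not from the example above) is permutation sort: declaratively impeccable, procedurally hopeless, because the generator runs before the test.

```prolog
% Declaratively correct: Sorted is a permutation of List that is ordered.
% Procedurally terrible: permutation/2 (a standard list-library predicate)
% creates up to n! choice points, and sorted/1 rejects almost all of them.
perm_sort(List, Sorted) :-
    permutation(List, Sorted),   % generator: combinatorial explosion
    sorted(Sorted).              % test: runs after the damage is done

sorted([]).
sorted([_]).
sorted([X, Y|Rest]) :- X =< Y, sorted([Y|Rest]).
```

Effective Prolog means restructuring such code so constraints prune the search before the generator fans out.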
Prolog was intended to be a way to program using logic. It did this by encoding facts in a manner not dissimilar to a relational database; by expressing logical statements about the world as Horn clauses (which you can think of as mathematical functions, but applied to logic); and through unification, which is what links all this together. Unification is computation via the idea that you have a set of data and a bunch of logical statements, and you want to find everything in the data that satisfies those statements.
If you know SQL, Horn clauses should sound a lot like materialised views.
As to why it never really went mainstream, it's a matter of wrong place, wrong time. It didn't help that the language needed a checkpointing mechanism (referred to as 'cuts', and represented by '!') because the unification mechanism could get confused and do much more work than it needed to, so developers needed to put cuts in place to tell the interpreter where it could avoid backtracking to try more solutions within a Horn clause.
If you want to get a taste for it, maybe give this a try: https://github.com/norswap/prolog-dry
Prolog is one of a small number of languages that people should learn even if they never use it in a practical way.
Prolog backtracks when there are more solutions to a goal (think of a goal as a query). Not because "unification gets confused". Unification doesn't have anything to do with backtracking, rather backtracking is a mechanism to represent the natural nondeterminism of logic theories, where multiple variable instantiations ("bindings") might satisfy a formula.
So, when you enter a query at the Prolog top-level (the console), Prolog will try to find a set of variable bindings that refute the negation of your query in the context of your loaded program. If it can't, it will report "false", i.e. the query cannot be satisfied in the context of the program. If it can, it will report "true", i.e. the query can be satisfied in the context of the program, and it will also report the variable bindings that make the query true. If more than one set of bindings makes the query true, you can ask for more solutions. At that point Prolog will backtrack and give you more solutions.
However, there is nothing forcing your program to be written so as to produce multiple solutions to a query. It's entirely up to you and how you write your program. You can write a program so that it's deterministic (succeeds or fails exactly once) or not. The cut (!/0) is one way to force a program to become deterministic, but again that's not because anything gets confused. It's because sometimes it's hard to write a program so as for it to be deterministic. And sometimes you just have no idea why your program backtracks and you pepper your code with cuts, hoping it will stop.
Which of course happens less and less as you get to understand how the language works (and why it's a bad idea to put cuts everywhere in the first place).
(It's a bad idea because it means you don't understand how your own program works. Nine times out of ten you don't need to use it when you think you do).
(But I still use the cut all the bloody time because it's convenient).
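To make the determinism point concrete, here's a toy sketch (my own, not from the discussion above):

```prolog
% max/3 without a cut: the answers are correct, but after the first
% clause succeeds a choice point remains, so callers see a spurious
% "maybe more solutions" prompt before the second clause fails.
max_nodet(X, Y, X) :- X >= Y.
max_nodet(X, Y, Y) :- Y > X.

% With a cut, the first solution commits: the predicate succeeds or
% fails exactly once and leaves no choice point behind.
max_det(X, Y, X) :- X >= Y, !.
max_det(_, Y, Y).
```

Both give the same answers; the cut only changes the procedural behavior (no leftover choice point).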
I know that, and I know the difference between a red and a green cut. However, I was targeting my explanation towards somebody who is completely unfamiliar with the language.
Prolog was (theoretically) a practical declarative programming language. You describe what you want done, and it "figures out" what to do, as opposed to imperative programming, where you simply tell the computer each step to perform.
That's something of an exaggeration in practice, and yet there was something to it. A typical program was actually doing depth-first search behind the scenes, so "try this, or else try this, or else try this" algorithms were very easy to express in the language.
As an aid to that, it had unification, which you can think of as pattern matching on steroids. You could write a rule like
something([A, [1, X]]) :- yada(A, X).
which would only match if the argument to 'something' was a list the second element of which was another list that started with 1, and so on. It's far more powerful than this, but it's been many years, and I've forgotten most of it.
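To show what that matching looks like in practice, here is the same rule with `yada/2` stubbed out (trivially, just so the queries can run; the stub is mine):

```prolog
% Stub so something/1 has a body to call.
yada(A, X) :- write(pair(A, X)), nl.

something([A, [1, X]]) :- yada(A, X).

% ?- something([foo, [1, bar]]).   % matches: A = foo, X = bar
% ?- something([foo, [2, bar]]).   % fails: inner list doesn't start with 1
% ?- something([foo, bar]).        % fails: second element isn't a list
```

Unification either succeeds and binds the variables, or fails and sends Prolog off to try the next clause.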
As for the cons, implementations of the day were relatively inefficient, although Quintus was fairly good.
The killer problem, in my mind, though is that it didn't really deal well with backtracking through huge data structures. If you were trying to write a bit-blitting algorithm on large arrays, for example, and expecting to backtrack a lot, it just plain wasn't going to work well.
Not sure that helps much. I miss Prolog a lot, but it's hard to see a path forward for it. Even somewhat similar languages like Haskell will probably never get any real traction. Or even Ocaml/F#, which are far more practical.
I suppose the closest we have today are really good Makefiles. Done well, this captures just a bit of Prolog, in a very limited domain.
Part of writing efficient Prolog is doing things to avoid such backtracking (by adding cuts or rearranging code).
Bit-blitting is just an example I made up for "large-scale manipulation of array data". There are such cases where backtracking would likely make more sense, though I'm not thinking of an obvious one right now.
The exception would be Turbo Prolog, but it was a rather restrictive subset of the language. Fast as hell, though.
Furthermore, it is missing many things that you come to expect in a modern programming language - arrays, good numerical performance, etc.
The way it works, though, allows you to do some things quite beautifully. For example, since every function you write can be run either forwards or backwards (solving for any one unknown amongst the arguments or result), you can often write one single piece of logic for parsing that can just be run in reverse to do serialization (or vice versa).
The main reason, though, that it (or its progeny, like Mercury) is not mainstream is really the same as the reason that Haskell and its derivatives are not. It's just too different from the more widely known Algol/Simula languages, and the library ecosystem is comparatively poor.
Those depend on the implementation. For example, GNU Prolog has full support for arrays. SWI-Prolog has good support for rational numbers. Constraint libraries like library(clpfd) in SWI-Prolog simplify numerical computations. Etc.
About numerical support in particular: the problem is that representing arithmetic in first-order logic is not straightforward, because FOL doesn't have a built-in concept of a "number", so what a "number" is must be defined before arithmetic can be done (and in fact, to some extent, this was the purpose of FOL in the first place). Additionally, Prolog slightly departs from first-order logic semantics in that it doesn't have a clearly defined concept of a "function" (by contrast, in FOL the "arguments" of predicates are variables, constants or functions). Instead, Prolog "cheats" and allows predicates to be passed as arguments to predicates. However, for performing arithmetic, Prolog defines a predicate is/2 that takes as arguments two "mathematical expressions", where an expression is either a number or a function (in the end just a function, because constants are functions with 0 arguments). For example, to add two numbers you would write:
?- A is 1 + 1.
A = 2.
Anyway it's not that there's no good support for numerical computations. It's just ...weird. Even in the context of, you know, Prolog.
def area(base, height):
    return (base * height) / 2

triangle_area_sides(Area, Base, Height) :-
    Area #= (Base * Height) // 2.
- Given a base and a height, what is the missing Area value? (Same as the Python version)
- Given an Area and a Base, what is the missing Height?
- Given an Area and a Height, what is the missing Base length?
- Given an area, a base and a height, is that a valid triangle?
- Given an Area, generate some / many Base and Height combinations which make a triangle of that area.
- Given a Base, generate some / many Area and Height combinations which make a triangle of that base length.
- Given a Height, generate some / many Area and Base combinations which make a triangle of that height.
- Given nothing, generate some valid triangle sizes.
- Given some values but not others, can it be solved at all (true/false)?
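To make the list above concrete, here is the same predicate with a few of those query modes spelled out (assuming SWI-Prolog with library(clpfd); the numbers and bounds are made up):

```prolog
:- use_module(library(clpfd)).

triangle_area_sides(Area, Base, Height) :-
    Area #= (Base * Height) // 2.

% Forward, like the Python version:
%   ?- triangle_area_sides(A, 10, 4).
% Backwards, solving for the missing height:
%   ?- triangle_area_sides(20, 10, H).
% Enumerating base/height pairs for a fixed area (bounded, then labeled):
%   ?- triangle_area_sides(20, B, H), [B, H] ins 1..100, label([B, H]).
```

One definition, many directions; which modes terminate usefully depends on how well the constraints bound the search.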
Pros: that's much more capable than the Python function of roughly the same code length. Multiple logic statements like this can make a lot happen in a little code.
Cons: it's very different from imperative programming, it makes IO and state interactions a bit more difficult, it's easy to get the Prolog engine stuck generating infinite combinations or searching enormous search spaces and never finishing, debugging what it's actually doing and what is going wrong is a very different skill from imperative languages.
And there are the years of college courses using Prolog as a "force them to deal with recursion" exercise, leaving people with the impression that Prolog is incapable of anything good, and that what it can do is slow and needlessly difficult.