Paradigms of Artificial Intelligence Programming (1992) (github.com/norvig)
283 points by RafelMri on Aug 14, 2022 | 82 comments



One of the top 5 programming books.

Old AI is today's bleeding-edge computer engineering. There are an enormous number of free lunches waiting for computer engineers and software startups in old-school artificial intelligence.

* Modern SAT solvers are impressively fast. They can solve huge problems.

* Writing a complex systems configurator with Prolog or Datalog can be like magic.

* Expert systems. There has never been so much use for them as there is today. Whenever you see expensive systems built on a complex mess of "business logic" and expensive consultants, you should know there is a better way.

(I use SAT solvers to partially initialize neural network parameters).


SAT for initializing NN parameters? Can you elaborate on this a bit? It seems interesting.


I can only guess:

In certain cases, you know what the neural network should do for certain inputs, and you have a fairly clear idea of how each component of the network would solve this. With a bit of work you could even construct the parameters by hand, such that the network works at least for non-noisy, constructed toy input data.

Actually, I think that for more complex tasks, having such an intuition would be a good idea anyway.

Now, you could use a SAT solver to do most of that work for you. You formulate some constructed inputs/outputs, maybe some other constraints, and let it solve for the parameters. That would give you a good starting point for the parameters on real-world data. And if the SAT solver fails to find any solution, maybe your neural network is actually not powerful enough.
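To make that concrete, here is a toy sketch in Common Lisp (my own illustration, not the parent commenter's actual setup). It brute-forces 0/1 weights and a threshold for a single unit so that a handful of constructed input/output pairs are satisfied; a real SAT solver would be handed the same constraints and search far more cleverly, and the satisfying assignment would then seed the trainable parameters.

  (defun unit-output (weights threshold input)
    "Return 1 if the weighted sum of INPUT reaches THRESHOLD, else 0."
    (if (>= (loop for w in weights for x in input sum (* w x)) threshold)
        1
        0))

  (defun find-boolean-weights (examples n-inputs)
    "Exhaustively search 0/1 weights and a threshold satisfying EXAMPLES."
    (loop for mask below (expt 2 n-inputs)
          for weights = (loop for i below n-inputs
                              collect (if (logbitp i mask) 1 0))
          do (loop for threshold from 0 to n-inputs
                   when (every (lambda (example)
                                 (= (unit-output weights threshold (car example))
                                    (cdr example)))
                               examples)
                     do (return-from find-boolean-weights
                          (values weights threshold)))))

  ;; An AND gate expressed as input/output constraints:
  ;; (find-boolean-weights '(((0 0) . 0) ((0 1) . 0) ((1 0) . 0) ((1 1) . 1)) 2)
  ;; => (1 1), 2

If no satisfying assignment exists at all, that is the "network not powerful enough" signal mentioned above.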


Also curious to hear more about this...


Are there any openly accessible examples for the second and third points? While I grasp the basic idea, I lack the imagination to see how this would look in a real-world system.


What are the other 4 top programming books?


Since I also put PAIP in my top 5 books, here are my 4 other ones:

- The Pragmatic Programmer

This changed my life when I started programming 20 years ago. Most of the practices are now common, but it was almost radical back then.

- Designing Data Intensive Applications

This is so well written, so elegantly fundamental. It's an absolute pleasure, even if I don't really refer to it in practice.

- Structure and Interpretation of Computer Programs

In many ways, the MIT version of PAIP. Its elegance complements the pragmatism of PAIP well.

- The Unicorn Project

My number 5 book varies often, because I haven't found many books that reach the writing quality of the other 4. This is a business novel, but it really solidified a lot of concepts for me: lean, queues, theory of constraints, what devops is about, working as a programmer in a business.


DDIA, SICP, and PAIP have been on my reading lists for years. Maybe I’ll finally get round to them after this glowing review :)


* Structure and Interpretation of Computer Programs: https://web.mit.edu/6.001/6.037/sicp.pdf

* Concepts, Techniques and Models of Computer Programming: https://www.info.ucl.ac.be/~pvr/book.html

* The Art of Prolog: https://www.dropbox.com/s/2umr9ouz0jdelio/1407.pdf?dl=1

* Probabilistic Models of Cognition: https://probmods.org

Something that merges logic, types and proofs, perhaps yet to be written. Some good preliminary material:

* Program = Proof: https://www.lix.polytechnique.fr/Labo/Samuel.Mimram/teaching...

* Concrete Semantics: http://concrete-semantics.org

* The Hitchhiker’s Guide to Logical Verification: https://raw.githubusercontent.com/blanchette/logical_verific...

* Programming Language Foundations in Agda: https://plfa.github.io

* Logic and Computation Intertwined: https://cs.uwaterloo.ca/~plragde/flaneries/LACI/

* Software Foundations: https://softwarefoundations.cis.upenn.edu/


I'm going to look at the Art of Prolog book.

Prolog is one of those things that has eluded me all this time. Mostly I don't think I've had an application for it, and, bluntly, at least for me, I need to have a "real" application to solve to best learn something. Seeing the "animal" program repeated over and over and over again was never any help.

In hindsight, maybe it would have been appropriate in an email messaging application I did long ago. Its message routing workflow was not inscrutable, but certainly difficult (and it didn't help that the route could split, sending the message to more than one place, each with its own workflow -- that was fun).

I've done a bunch with rule engines (and the message routing was done with an ad hoc one), but less so with inferencing.

Maybe this book will give me some insight to explore further. It's always one of those things that sort of nags the back of my brain that I don't quite grok it.


The Art of Prolog and PAIP are full of small Prolog use cases.

The Craft of Prolog is also very much worth looking into.

A related approach is ASP, which combines SAT with logic programming. These are the two canonical books:

* Answer Set Programming: https://www.cs.utexas.edu/users/vl/teaching/378/ASP.pdf

* Answer Set Solving in Practice: https://potassco.org/book


Do you mean "Dyalog" rather than "Dynalog"?


I meant Datalog.


Most likely he means "Datalog". "Dyalog" is an APL.


Does this book teach how to write SAT solvers?


I've got the impression that it is a breadth-first introduction to various interesting topics in symbolic computation. It doesn't go deep.


It just implements a computer algebra system, a Prolog, a Scheme, and a bunch of other stuff. Along the way it teaches advanced Lisp programming.


Yes, basic/toy versions.


For a book I find the description of the development process and the level of implementation to be quite good.

A great opportunity for the reader to expand them.

I've seen for example the Scheme implementation from PAIP expanded and integrated into a CMS.


I wrote a small Common Lisp book for Springer Verlag about the same time that Peter wrote this fantastic book, and I then met him shortly thereafter at a Lisp Users and Vendors conference in San Diego. After 30 years of following his writing, Python notebooks, etc., I think that he just has a higher-level view of software and algorithms.

There is some talk on this thread about good old fashioned symbolic AI. I have mixed feelings about this. I am currently working for an all-in Common Lisp + GOFAI company, but most of my best successes in my career involved neural networks, and later deep learning.

I think we need a new hybrid approach but it is above my skill level to know what that would be. I keep hoping to see some new research and new paradigms.


We’re building that paradigm right now, in the math community.

Homotopy type theory tells us that semantic information from a symbolic logic can also be represented by the topology of diagrams. But this is a two way relationship: the topological structure of diagrams also corresponds to some synthetic type theory.

The conjecture is that the topology of, eg, word2vec point clouds will correspond to some synthetic type theory describing the data — and this is bolstered by recent advancements in ML: Facebook translating by aligning embedding geometries, data covering models, etc.

I’m personally working on the problem of translating types into diagrams stored as matrices, in the hope that building one direction will give insights into the other. (Again, because equivalence relationships are symmetric.)


Off topic, but what does diagrams mean in this context?


Diagrams in the category theory sense.

https://en.wikipedia.org/wiki/Diagram_(category_theory)

And for some context, a brief talk by Michael Shulman.

https://www.youtube.com/watch?v=zUPBEQe4Ti8


thank you!


It's weird that it still feels like this is almost something I'm 'supposed' to know and at some point work my way through. (Maybe because it was on the brink of still being relevant when I got interested in programming in middle school.) Kind of a relief to realize that this for sure is no longer something you're expected to know whatsoever.


The engineering concepts elaborated in this book are very much relevant today, and you might very well be expected to know them. What this book does (and this is its real merit) is drill them into you very effectively (provided you follow along, of course).


Yes, check out the mostly non-AI, non-Lisp advice in Norvig's very brief summary in "What Lessons are in PAIP?", the next-to-last section in http://norvig.com/Lisp-retro.html


What changed?


When people talk about AI today they are usually talking about machine learning systems based in neural networks.

The kind of symbolic AI described in this book went through several cycles of hype and disappointment, to the point where many think it is obsolete. People often don't connect recent breakthroughs in SAT and SMT solvers with this history, and for that matter production rule engines are dramatically better than they were in the 1980s, but they've never made a breakthrough into general-purpose use.


Maybe if you define AI to be “technology that could plausibly lead to AGI”, but everything in this book is still terrifically relevant to the many practical cases of “how do I get my computer to solve this semi-open-ended search problem efficiently?”, which are far from uncommon.


> “technology that could plausibly lead to AGI”

My AI prof joked, I think, that it was "things that don't work yet" - clearly only a humanlike AI could do OCR... until it started working, etc.

But I actually think your hypothetically proposed definition fits the history even better.


Hi @Paul, I’m a newbie in this field. Are you saying that this branch of AI isn’t relevant when compared to machine learning that’s based on neural networks?


Norvig addresses this at the end of http://norvig.com/Lisp-retro.html. Basically these old AI techniques are now considered regular programming, and modern AI is focused on ML. Both are useful, but for different tasks.


That is a great retrospective! Still very relevant 20 years after it was updated.


Symbolic AI never managed to scale to significant and general real-world problems. Neither did machine learning, until the advent of fast processors and massive training datasets broke the scale barrier.

Rule-based expert systems (as opposed to SAT, etc) based on Symbolic AI also have the issue that for non-trivial problems, coding the rules themselves usually requires programming expertise in addition to the domain knowledge required to capture the business logic. Things have improved a lot since the 80s, but applications remain fairly niche.


I think ideas from symbolic AI are relevant here and there but they certainly aren't fashionable.

Almost any financial institution has a copy of IBM ILOG in there somewhere, implementing policy in terms of production rules.

Some of the most interesting systems today combine ideas from machine learning with ideas from AI search. For instance there are many game playing programs like AlphaGo that use

https://en.wikipedia.org/wiki/Monte_Carlo_tree_search

which runs a large number of games to the end rather than searching the next few moves exhaustively. Using a machine learning model to play the game during the playouts, while sampling a large number of moves with AI search, turns out to be a winning strategy.
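For a feel of the shape of that hybrid, here is a bare-bones flat-playout move chooser in Common Lisp. This is only a sketch, not AlphaGo and not full MCTS (which also keeps a tree of per-node visit statistics), and LEGAL-MOVES, APPLY-MOVE, TERMINAL-P and RESULT are assumed to be supplied by whatever game you plug in:

  (defun playout (state)
    "Play uniformly random moves to the end of the game; return the result."
    (if (terminal-p state)
        (result state)
        (let ((moves (legal-moves state)))
          (playout (apply-move state (nth (random (length moves)) moves))))))

  (defun choose-move (state &key (n-playouts 100))
    "Pick the legal move whose random playouts score best on average."
    (loop with best-move = nil and best-score = nil
          for move in (legal-moves state)
          for next = (apply-move state move)
          for score = (/ (loop repeat n-playouts sum (playout next))
                         n-playouts)
          when (or (null best-score) (> score best-score))
            do (setf best-move move best-score score)
          finally (return best-move)))

Replacing the uniform random choice inside PLAYOUT with a learned policy, and adding the tree statistics, is roughly where the machine-learning half of the hybrid comes in.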


The most relevant field this would apply to is AI for games.


Mostly not, no.


Hi all! I'm not the author, but I'm part of the effort to turn PAIP from a scanned pdf into a lot of markdown.


This is my favourite book on software engineering (as well as Common Lisp).


Mine too.


The Prolog implementation in the book is just 139 lines of Lisp code. Nevertheless, Prolog is a really powerful tool.
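For a flavour of why it can be that small, here is a minimal unifier in the same spirit. This is only a sketch of the core idea, not the book's actual code (which, among other things, adds an occurs check and a proper failure/no-bindings distinction):

  (defun variable-p (x)
    "Variables are symbols whose names begin with ?."
    (and (symbolp x) (char= (char (symbol-name x) 0) #\?)))

  (defun unify (x y &optional (bindings '()))
    "Return bindings that make X and Y equal, or :FAIL."
    (cond ((eq bindings :fail) :fail)
          ((eql x y) bindings)
          ((variable-p x) (unify-variable x y bindings))
          ((variable-p y) (unify-variable y x bindings))
          ((and (consp x) (consp y))
           (unify (cdr x) (cdr y) (unify (car x) (car y) bindings)))
          (t :fail)))

  (defun unify-variable (var val bindings)
    "Bind VAR to VAL, or unify VAL with VAR's existing binding."
    (let ((binding (assoc var bindings)))
      (if binding
          (unify (cdr binding) val bindings)
          (cons (cons var val) bindings))))

  ;; (unify '(likes ?x lisp) '(likes kim ?y))
  ;; => ((?Y . LISP) (?X . KIM))

Most of the rest of such an interpreter is a backtracking search over stored clauses plus variable renaming, which is how it stays so short.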


It is interesting how almost none of these paradigms/principles are relevant to what we think of as AI today. Lisp, a language specifically formulated for AI applications, is now mostly relevant in the context of programming language theory (and essentially irrelevant to the statistical programming/linear algebra toolkits that underpin modern AI applications). Instead, Fortran and its descendants are actually what power today’s AI programs. Nobody could have foreseen this in the 50s!


PAIP is full of practical programming techniques and advice that are not at all specific to AI or even Lisp. Other commenters here have cited Norvig's retrospective on his book at

http://norvig.com/Lisp-retro.html

The next-to-last section in this, "What Lessons are in PAIP?", comprises 52 numbered sentences of advice, cross-referenced to fuller explanations in the book. Many (most?) of them apply to any programming or software engineering, not specifically AI or Lisp.


Lisp was once at the forefront of high-level numerical computing systems:

https://en.wikipedia.org/wiki/Macsyma

Similar to the way Python is a high-level language used for numerical computing today. Lisp is certainly more than capable of this role, but the fallout from the AI winter killed Lisp's reputation and popularity, leading to other languages filling those niches.


I'm always surprised that the symbolic math paradigms aren't used more. I suppose most cost functions just aren't continuous?

That said, it seems most of what is important in today's code is the ability to drive a GPU. I don't know why that couldn't be done from a Lisp.


I'm not sure if anything has or will happen with this, but this seemed like an interesting symbolic extension of neural nets: https://syncedreview.com/2020/11/19/facebook-building-automa...

At least my take on it is: if you simulate a domain (physics being the most tractable), can you then train using differentials of the model itself, and not just by differentiating the internal neuron estimators?

Lots of open questions... like what's the driving force that asks for an effect that calls for differentiating the model? We kind of expect the training process to just recreate everything internally.

Now it has me thinking about "coprocessors"... a lot of the domain-specific math is given to the NN as input features, but what if the NN could select the inputs to those algorithms? That would give the NN the ability to speculate. Of course you'd need to also differentiate those algorithms.

Now that I'm thinking about it, I guess this is just another activation function, just one that is domain-specific. Are there approaches that use eclectic activation functions? Like, layers 1-10 use one activation function, 11-20 another, and then you finish off with tanh or relu or some other generic activation function.


Neural nets do use extremely long compositions of nonlinear functions whose gradients need to be computed over and over. I guess because the chain rule (automatic differentiation) works so well, most of the libraries in use are only "a little" symbolic, in that they can automatically figure out first derivatives for symbolic input and output variables.


I thought automatic differentiation is a little different? Will have to look again. I thought the norm for a while was to manually enter the gradient, if possible, or to use a calculated one at each step.


The norm is to use library functions like addition, multiply, power, etc., which have hardcoded derivatives. Then the derivative is computed "symbolically" by composing these base derivatives, which is essentially an exercise in implementing calculator logic. I use scare quotes because there are numerical problems with some functions that have to be hacked around with more manual derivatives, like the cross-entropy/softmax function. In those cases, people suggest using a different library function special-cased for the situation, which is not really "symbolic" in the typical mathematical sense.
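A tiny forward-mode illustration of that composition in Common Lisp (hypothetical code, not any particular library's API): each value carries its derivative along, every primitive hardcodes its own rule, and the chain rule falls out of the bookkeeping.

  (defstruct dual value deriv)

  (defun d+ (a b)
    (make-dual :value (+ (dual-value a) (dual-value b))
               :deriv (+ (dual-deriv a) (dual-deriv b))))

  (defun d* (a b)
    ;; the product rule, hardcoded once for this primitive
    (make-dual :value (* (dual-value a) (dual-value b))
               :deriv (+ (* (dual-deriv a) (dual-value b))
                         (* (dual-value a) (dual-deriv b)))))

  (defun dtanh (a)
    (let ((y (tanh (dual-value a))))
      (make-dual :value y
                 :deriv (* (- 1 (* y y)) (dual-deriv a)))))

  ;; d/dx of tanh(x * x) at x = 2.0:
  (let ((x (make-dual :value 2.0 :deriv 1.0)))
    (dual-deriv (dtanh (d* x x))))
  ;; => about 0.00536, i.e. (1 - tanh(4)^2) * 4

Reverse-mode backprop organizes the same per-primitive rules differently, but the "compose hardcoded base derivatives" principle is the same.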


I'd say this is because the term "AI" has been redefined, not because the same work is being done in a different way


Can anyone recommend a source for setting up an environment in Linux to run the code in this book?

Do I need to learn Emacs to get close to a “modern Lisp” programming environment?


Emacs is popular. I like vim, but people are also using VS Code. https://lispcookbook.github.io/cl-cookbook/editor-support.ht... and other pages on the wiki may help you. https://github.com/CodyReichert/awesome-cl#community has a list of community spots if you want to seek out other opinions.

The SBCL implementation is very good, consider getting a binary directly from their site if your distro's version is out of date http://www.sbcl.org/

I disagree with a sibling comment that this book expects you to be comfortable with Lisp; the first chapter is literally an introduction, and the next two chapters cover most of the basics a working programmer should expect to pick up quickly, with chapter 3 being a handy reference to look back on for a lot of things. If you're new to programming or find the intro too fast, sure, look at other resources, but it's not too bad to just dive in. The main supplement is to figure out, with your editor of choice, how to send blocks of Lisp code to the Lisp prompt, so that you can type and edit with an editor and not have to do everything directly at the prompt.


Download and run Portacle: https://portacle.github.io/

It's a distribution of Emacs together with a Lisp compiler, Emacs add-ons like SLIME, etc.


Bear in mind that this book explicitly expects readers to already be familiar with Lisp, so if you've not got a Lisp environment going already and need an introduction, you should really look elsewhere first. Touretzky's book is a great place to start. https://www.cs.cmu.edu/~dst/LispBook/


How familiar is enough?


I would say Emacs with SBCL is a good start. I haven't started PAIP yet, I'm working through the Touretzky Lisp book, "Common Lisp: A Gentle Introduction to Symbolic Computation" as a warmup before starting PAIP. I've found that Inferior Lisp mode in Emacs is sufficient for working through those examples. It seems like a lot of working programmers are using SLIME with Emacs though.


Emacs would be a fine choice, but it is not mandatory, if you can live with the weaker support for parens found in other editors. Emacs traditionally excels at working with Lispy languages and REPLs.

I started working through the book, but did not feel like using Common Lisp. Instead I used GNU Guile. I am not very far in the book yet, but so far I was able to translate between Common Lisp and Scheme easily.

So for me the way to set things up was to install GNU Guile. I did that via GNU Guix package manager. However, GNU Guix can also install SBCL, so that you could use Common Lisp as the book does. SBCL is also in many distros' repositories.


Lucky you, one of the chapters implements Scheme. Seriously though, use CL; you're not going to have a good time with Scheme with this book. It makes use of some CL-specific behaviours.


Thanks for the warning and look-ahead!


This is ridiculous for a first reading. It's the same as telling someone to use Common Lisp for SICP. Why? Maybe on a second or third reading for fun, but to digest this book, just use a well-supported language like Common Lisp, for which the book was written.


I can switch to Common Lisp at any time later in the book, if I feel that translating to Scheme is too difficult for me. I will need to understand CL anyway, to do the translation.

What I do not like about CL is probably one of the "wars" fought in computer programming: I do not like that I have to use funcall instead of simply wrapping with parens. I do not like that variables and functions are in separate namespaces. However, if going through the book using Scheme gets too complicated, these are not things I cannot get over.
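For anyone who has not hit this yet, the Lisp-2 point in miniature (plain Common Lisp, nothing book-specific):

  (defun apply-twice (f x)
    (funcall f (funcall f x)))   ; F holds a function value, so FUNCALL is needed

  (apply-twice #'1+ 3)           ; #' reaches into the separate function namespace
  ;; => 5

  ;; In Scheme (a Lisp-1) the equivalent is simply (define (apply-twice f x) (f (f x)))
  ;; because variables and functions share one namespace.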

EDIT: I probably also do not like the lesser emphasis on pure functions in CL. I think while solving some exercises I already changed some things so that I could solve them without mutating global state. But it's been a while since I last continued working through the book, so I am not so sure. I get that sometimes a simple side effect can be practical, but this is my free-time occupation and I get joy from doing things in clean ways. Nevertheless, Common Lisp is one of the languages I wish I could actually work in (like on the job, not in my free time), when I compare it to things like Python, JavaScript and so on. At least it is in the family of languages I enjoy using.


Do you think Elisp will work for this book? Or would one need to implement unique Common Lisp features in Elisp? Or will Elisp just be too slow?


You should use Common Lisp for this book, unless you specifically want a challenge.


Most of it should work in Elisp; note that Elisp and Common Lisp have different semantics for IF.

You might find some variances later on. Also, Elisp doesn't have as nice a debugging environment as SLIME, so I'd honestly recommend just setting up Common Lisp.


As far as I can tell, any Elisp if-form that is also a valid Common Lisp if-form has the same semantics; but Elisp allows more than one else-form in its syntax, with additional semantics.

It's something like:

  (defmacro elisp-if (expr then &rest elses)
    `(cond (,expr ,then) (t ,@elses)))
Whereas the regular if is like:

  (defmacro cl-if (expr then &optional else)
    `(cond (,expr ,then) (t ,else)))
In a book that uses Common Lisp, you shouldn't see an if-form that isn't Elisp-compatible.
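For instance, with the two sketches above:

  (elisp-if nil 'then 'e1 'e2)   ; => E2 -- the extra else forms act like an implicit PROGN
  (cl-if nil 'then 'e1)          ; => E1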


Oh yes you are 100% right, I just had a brain fart. Thanks for correcting


What does “IF” stand for in your comment?

Are you referring to the IF expression or something different?


ignore me, read the comments above


Emacs is the easiest, but you can try LispWorks Personal Edition or Allegro Common Lisp (they have a web IDE, IIRC).

There's Portacle, which combines Emacs with a Lisp environment.


Many people will fret at the mention of a commercial vendor, but IMHO (and YMMV) the systems to get going with the least hassle would be either LispWorks or Allegro, and both offer trial versions that won't get in your way throughout the book (even though I've been an Emacs and SLIME user for a long time). I'm interested in the user minimizing unrelated side topics and customization spirals (much as I love them) and focusing on the book.


Most popular Linux distributions include Common Lisp packages, for example "sbcl". If you have Debian/Ubuntu you should be able to install it like this: "sudo apt-get install sbcl".


vim + slimv

check out susam's guide


Hello! Thank you for mentioning my Vim + Slimv guide. I have, in fact, two guides to set up a Common Lisp programming environment from scratch:

For Vim: https://susam.net/blog/lisp-in-vim.html

For Emacs: https://github.com/susam/emacs4cl


Apparently Perlis devoted some time to making epigrams. I like #119 (and have amended it in line with #122 and recent developments in Canada, et al.): Programming is an unnatural act, but so far it is still mostly legal.

https://web.archive.org/web/19990117034445/http://www-pu.inf...

p.s.

Actually applying Perlis's epigrams to epigrams, we inevitably reach the conclusion that epigrams stifle thought yet #125 still holds:

#121 permits meta-epigrams.

proposed: Epigrams optimize expression.

#21: optimization hinders evolution

q.e.d. epigrams stifle evolution of thought.

[Alan Perlis: https://en.wikipedia.org/wiki/Alan_Perlis]


Wrong thread?


This is how the PAIP book [ch. 1] starts:

"You think you know when you learn, are more sure when you can write, even more when you can teach, but certain when you can program. —Alan Perlis "

So I was curious as to who Alan Perlis is. Sorry, I assumed others had also accessed the PDF :{}


I guess this is more of a historical curiosity rather than something that is relevant today?


It's actually just a really good book about programming, with both "high-level" and really practical advice (how to attack performance issues, how to debug). The examples are all really fun to build and extend. In fact I'm going through it again currently (I kickstarted a Datalog compiler off of the Prolog compiler chapter).


Norvig commented on this:

http://norvig.com/Lisp-retro.html

The last section, "What did PAIP Accomplish?", summarizes his view on PAIP's legacy.


No, this is one of the best programming books you can buy. It's just not about ML.


> No this is one of the best programming books you can buy.

It's even no-cost. The book is available for free in digital form.



