Very nicely done, thank you for sharing this!

A CLP(FD/ℤ) solution such as this one has two parts: first, the relevant constraints are posted; second, a search tries to find concrete solutions. In general, a search is necessary because the constraints by themselves are not sufficient to deduce the unique solution, as that would be computationally prohibitive.

You can therefore influence the speed of the logic program in two categorically distinct ways: first, you can post different constraints or try a different formulation altogether; second, you can try different search heuristics that find concrete solutions more efficiently.

Regarding constraint propagation, there is a clear trade-off between strength and efficiency of propagation. For instance, the all_distinct/1 constraint used in the sample solution propagates very strongly, and is therefore much slower than the weaker all_different/1 constraint, even though the two are completely equivalent from a semantic perspective: both are true if and only if the given integers are pairwise distinct. It depends on the problem at hand whether the stronger propagation pays off, or whether the weaker propagation is in fact sufficient to quickly find solutions. Especially for very simple tasks such as Sudoku, quick and weak propagation may well outperform strong and slow propagation.

In this concrete example, I easily achieved a 20-fold speedup by replacing the all_distinct/1 constraint with all_different/1, and using the ff labeling strategy to search for concrete solutions. ff means first-fail, and is often a very good strategy. To get this, I simply textually replaced all occurrences of all_distinct with all_different in the source code, and posted:

```
?- problem(1, Rows), sudoku(Rows), append(Rows, Vs),
   labeling([ff], Vs), maplist(portray_clause, Rows).
```

Also, in less than 5 minutes, I just produced all 88 solutions for problem 3, which is shown at the end of the article.
This is only due to using less powerful propagation and a better search heuristic.

And this is one of the key attractions of constraint logic programming (CLP): you can relatively easily try different search strategies while keeping the model the same.

In Hakan's very nice Picat solution, you see that a specific search heuristic is used, specified as:

```
solve([ffd,down], Vars).
```

In addition, Hakan's program uses all_different/1. For a fair comparison between the solutions, constraints with the same propagation strength and the same search heuristic must be used in both approaches.
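To make the model/search separation concrete, here is a minimal 4x4 Sudoku sketch (my own illustration, not the article's program), assuming SWI-Prolog's library(clpfd). The constraint model stays fixed while the labeling strategy can be swapped freely:

```prolog
:- use_module(library(clpfd)).

%% Minimal 4x4 Sudoku model: rows, columns, and 2x2 blocks
%% are pairwise distinct. Uses the weak all_different/1.
sudoku4(Rows) :-
        length(Rows, 4), maplist(same_length(Rows), Rows),
        append(Rows, Vs), Vs ins 1..4,
        maplist(all_different, Rows),
        transpose(Rows, Cols),
        maplist(all_different, Cols),
        Rows = [A,B,C,D],
        blocks(A, B), blocks(C, D).

%% Pair up two cells from each of two adjacent rows to form a 2x2 block.
blocks([], []).
blocks([A,B|As], [C,D|Cs]) :-
        all_different([A,B,C,D]),
        blocks(As, Cs).

%% Same model, different search strategies:
%% ?- sudoku4(Rows), append(Rows, Vs), labeling([ff], Vs).
%% ?- sudoku4(Rows), append(Rows, Vs), labeling([ffc,down], Vs).
```

Only the options list passed to labeling/2 changes between runs; the posted constraints are untouched.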

This comment was a joy to read right after finishing "Out of the Tar Pit" [1]. This bit is one of the critical design principles that article was advocating for:

> You can relatively easily try different search strategies while keeping the model the same.
Thank you for your explanation. However, as someone who has merely dabbled in Prolog (I've written less than 2000 lines of Prolog in total), I still don't understand the distinction between all_different/1 and all_distinct/1.

Even the documentation for all_different/1 (https://www.swi-prolog.org/pldoc/man?predicate=all_different...) asks the user to consider using all_distinct instead. I'm afraid I just don't have a mental model of how CLP(FD) works to understand what is meant by "stronger" or "weaker" propagation.
Some terminology: a propagator is an implementation of a constraint, and is used to remove values from variable domains by deducing that they are in no solution. In general, propagation strength is about how much a propagator can deduce, that is, how many values it can remove while still not removing a value that is in some solution to the constraint.

As an example, consider the variables

```
x=1, y in {2, 3}, z in {2, 3}, w in {1, 2, 3, 4}
```

with the constraint all_different(x, y, z, w) added (note that here I will use that name for the constraint, and refer to the propagation strength of a propagator).

With value propagation (sometimes called forward checking), a propagator would deduce the new domains

```
x=1, y in {2, 3}, z in {2, 3}, w in {2, 3, 4}
```

by simply removing assigned values from other variables' domains. However, there are stronger propagators for the constraint. In particular, there is a well-known and reasonably efficient (O(n^2.5)) propagator that is domain-consistent (also known as GAC or generalized arc consistent). That means it can always remove _all_ values from variables that are in no solution. On the above example, that propagator would deduce

```
x=1, y in {2, 3}, z in {2, 3}, w=4
```

The reasoning is based on something called Hall sets. In the above example, y and z form a Hall set of size 2, since they must take the values 2 and 3 in any solution. There are also other propagators with different propagation strengths. For all_different, there are also bounds-consistent propagators that can do more than value propagation, but only do advanced reasoning based on the bounds of variables.

I think that SWI-Prolog uses the all_different name for value propagation, and all_distinct for domain propagation.
In the CP system Gecode that I'm most familiar with, we use an argument to the distinct-constraint (same constraint, different name) to indicate the desired level of propagation (https://www.gecode.org/doc-latest/reference/group__TaskModel...).
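The difference on the example above can be observed directly in SWI-Prolog's library(clpfd) (a small sketch of my own; the predicate names weak/1 and strong/1 are just for illustration):

```prolog
:- use_module(library(clpfd)).

%% Value propagation: all_different/1 only removes the assigned
%% value 1 from w's domain, leaving w in 2..4.
weak(W) :-
        X = 1, Y in 2..3, Z in 2..3, W in 1..4,
        all_different([X, Y, Z, W]).

%% Domain-consistent propagation: all_distinct/1 detects the Hall
%% set {y, z} covering {2, 3} and deduces w = 4 without any search.
strong(W) :-
        X = 1, Y in 2..3, Z in 2..3, W in 1..4,
        all_distinct([X, Y, Z, W]).

%% ?- weak(W), fd_dom(W, Dom).     % Dom = 2..4
%% ?- strong(W).                   % W = 4
```

fd_dom/2 is used here to inspect the remaining domain of w after propagation, with no labeling involved.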
Thanks! This comment was super helpful. :) I made a couple of updates to the post with these findings.

I suspected something like the `ff` labeling approach would improve the solution, but I had no idea how much of a difference `all_different` made vs. `all_distinct`!
I recently went through a whole bunch of your Power of Prolog videos [0]; solving and generating such sudokus was the first thing I thought about when watching the original video of the guy solving the Miracle Sudoku.

Thanks for those videos!
