
Esoteric programming paradigms - SlyShy
http://www.ybrikman.com/writing/2014/04/09/six-programming-paradigms-that-will/
======
zmonx
I have two comments about the Prolog code:

First, the article claims: "the sudoku solver above does a brute force
search", but that is specifically _not_ the case. In fact, quite the
_opposite_ holds: The Prolog Sudoku formulation shown in the article uses a
powerful technique known as Constraint Logic Programming over Finite Domains,
which automatically performs _pruning_ before and also _during_ the search for
solutions. This effectively eliminates large portions of the search space in
practice, and degenerates to brute force _only_ in those cases where almost
nothing can be deduced from initial constraints. In the particular case of
Sudoku, the pruning is especially effective and can in many cases _eliminate
the search entirely_ , since the initial constraints (hints) basically
determine the whole solution.

Second, yes, it is easy to write an O(n!) search algorithm in Prolog. However,
it is almost as easy to implement _much more efficient_ search algorithms in
Prolog. For example, here is Quicksort in Prolog:

    
    
        quicksort([])     --> [].
        quicksort([L|Ls]) -->
                { partition(Ls, L, Smalls, Bigs) },
                quicksort(Smalls), [L], quicksort(Bigs).
    

Note how natural the declarative description of "first the smaller elements,
then the pivot element, then the bigger elements" is in Prolog. This only
requires a suitable definition of partition/4, which can be written in at most
7 lines of Prolog.
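
For comparison, the same decomposition can be sketched in Python (an
illustrative analogue, not a translation of the DCG version above; the list
comprehensions play the role of partition/4):

```python
def quicksort(xs):
    """Sort by the same decomposition as the Prolog version:
    first the smaller elements, then the pivot, then the bigger ones."""
    if not xs:
        return []
    pivot, rest = xs[0], xs[1:]
    smalls = [x for x in rest if x <= pivot]   # partition/4's Smalls
    bigs   = [x for x in rest if x >  pivot]   # partition/4's Bigs
    return quicksort(smalls) + [pivot] + quicksort(bigs)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```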

~~~
taeric
I think you are just using a different definition of brute force than the post
is using. It doesn't try all permutations of the digits, but it does do a
search over the tree to find the solutions.

Now, to your point, this is really the only way you can solve sudoku. There is
an odd belief that you can make an algorithm that "never guesses." And folks
think that that would be the definition of a non-brute force algorithm. In
reality, you either have a solver that is ready to backtrack, or you have one
that can't solve all puzzles.
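
To make "ready to backtrack" concrete, here is a minimal backtracking sudoku
solver sketched in Python (an illustration of the guess-and-backtrack
structure, not the article's Prolog/CLP(FD) approach):

```python
def solve(grid):
    """Backtracking sudoku solver: grid is a 9x9 list of lists, 0 = empty.
    Mutates grid in place; returns True if a solution was found."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for d in range(1, 10):
                    if ok(grid, r, c, d):
                        grid[r][c] = d        # guess
                        if solve(grid):
                            return True
                        grid[r][c] = 0        # guess failed: backtrack
                return False                  # no digit fits: force backtracking
    return True                               # no empty cells left

def ok(grid, r, c, d):
    """Check row, column, and 3x3 box constraints for placing d at (r, c)."""
    if d in grid[r]:
        return False
    if any(grid[i][c] == d for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != d for i in range(3) for j in range(3))
```

Unlike the CLP(FD) formulation, this does no propagation at all, so it leans
entirely on the tree search plus backtracking described above.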

~~~
Pamar
I am not sure about your backtracking point. Sudoku can be solved by Integer
Programming (i.e. a special subset of Linear Programming) and this does not
require any kind of backtracking.

~~~
taeric
While I wouldn't be shocked to learn that I'm wrong on requiring backtracking,
I am not sure how the integer programming claim refutes it. In particular,
those still look to be "search" solvers and almost certainly have to perform
some "guess" while searching.

Do you happen to have a recommended link on how this can be accomplished?
First few results in searching just show farming this out to a specialized
function in matlab. :)

~~~
Pamar
It is a bit difficult to explain Linear Programming in a forum post (so I will
have to default to Wikipedia:
[https://en.wikipedia.org/wiki/Linear_programming](https://en.wikipedia.org/wiki/Linear_programming)
or to provide a link to a nice MOOC: [https://www.edx.org/course/optimization-
methods-business-ana...](https://www.edx.org/course/optimization-methods-
business-analytics-mitx-15-053x)).

There is absolutely no "guessing" involved, though: you describe your sudoku
problem as a series of linear equations (which represent the edges of your
search space) and then the algorithm travels from one vertex to the next until
it finds the solution. There is no backtracking at all.

~~~
Pyxl101
While this may be true for linear programming, is it true for integer linear
programming or binary integer programming, which one would presumably use to
model Sudoku? The Wikipedia article you posted claims that integer programming
problems are NP-hard.

> If all of the unknown variables are required to be integers, then the
> problem is called an integer programming (IP) or integer linear programming
> (ILP) problem. In contrast to linear programming, which can be solved
> efficiently in the worst case, integer programming problems are in many
> practical situations (those with bounded variables) NP-hard. 0-1 integer
> programming or binary integer programming (BIP) is the special case of
> integer programming where variables are required to be 0 or 1 (rather than
> arbitrary integers). This problem is also classified as NP-hard, and in fact
> the decision version was one of Karp's 21 NP-complete problems.

Here is an article on the topic of "Binary and Mixed Integer Programming"
which explains part of the Balas Additive Algorithm (infeasibility pruning, I
believe) in terms of backtracking:

> At this point, both of the nodes just created are not eligible for further
> expansion, so we back up the tree, looking for a level at which one of the
> nodes is unexplored.

[http://www.sce.carleton.ca/faculty/chinneck/po/Chapter13.pdf](http://www.sce.carleton.ca/faculty/chinneck/po/Chapter13.pdf)

Another paper describes a method by Glover characterized as "A backtracking
procedure for implicit enumeration".

> A particularization of the procedure based on Balas' algorithm. In S2 we
> presented and justified a flexible and economical back-tracking procedure
> for finding an optimal feasible solution of (P) by implicit enumeration.

See Figure 1 on page 182 (or page 6 in the PDF):

[http://www.anderson.ucla.edu/faculty/art.geoffrion/home/docs...](http://www.anderson.ucla.edu/faculty/art.geoffrion/home/docs/e7.pdf)

This branch of mathematics is not my forte however, so if I've misunderstood
then I'd appreciate clarification. It seems like the algorithm is not
backtracking in the sense of generating possible solutions and checking them,
but is backtracking in the sense of fathoming which next cheapest (partial)
solution might be feasible, and abandoning it if proven to be definitely
infeasible, in favor of examining the (then) next cheapest potential solution.

Finally, see the following paper that compares and contrasts Constraint
Programming with Integer Programming, and characterizes both of them as
instances of Tree Search:

[http://preserve.lehigh.edu/cgi/viewcontent.cgi?article=1780&...](http://preserve.lehigh.edu/cgi/viewcontent.cgi?article=1780&context=etd)

~~~
Pamar
_Extremely Coarse Approximation Alert_

Try to imagine this: by manipulating an n-dimensional matrix in a certain way,
you get, at each step, a result which is guaranteed to be part of your
solution space. You also have a way to find out, at each step, if you have
found the optimal solution (in terms of your objective function) or if no
solution exists.

So the process goes like this: Put your constraints in the matrix.

    
    
      Loop -
        Manipulate Matrix
        Is this the optimal Solution?
          Yes - return solution
        Is the solution space empty/concave/unbound?
          Yes - return error
      End.
    

No backtracking in the sense of "try this... hmm... no, try this other".

I have used LP professionally in the past, and recently participated in the
MOOC I linked above (as a refresher), but I may well be missing something (I
studied the theory at university too many years ago; now I just use it as a
tool) or overgeneralizing. Any corrections are welcome.

~~~
Pyxl101
Could you clarify which algorithm you're referring to? Or name an example
algorithm that works in the way you describe? What I find in academic
literature seems to characterize Integer Programming as a search problem. See
the following article that compares Constraint Programming with Integer
Programming:

[http://preserve.lehigh.edu/cgi/viewcontent.cgi?article=1780&...](http://preserve.lehigh.edu/cgi/viewcontent.cgi?article=1780&context=etd)

> Since tree search is a basic solution technique employed in both constraint
> and integer programming, we begin with a generic overview of tree search as
> a technique for finding feasible solutions to mathematical models.

> Every tree search algorithm is defined by four elements: a node-processing
> procedure, pruning rules, branching rules, and search strategy. The
> processing step is applied at every node of the search tree beginning with
> the root node, and usually involves some attempt to further reduce the
> feasible set associated with the node by applying logical rules. [...]

> One particularly well-developed solution technique, called branch and bound,
> was introduced by Land and Doig. Branch and bound is frequently used to find
> solutions to integer programming problems; it is a tree search procedure in
> which a bound on the optimal value to each subproblem is obtained by solving
> a relaxation. In this thesis we assume the use of a linear programming
> relaxation in the branch and bound tree.

> The last piece of a branch and bound algorithm is a search strategy
> specifying the order in which to explore nodes of the tree. Depth-first
> search can be used to save memory in a branch and bound tree, since we must
> only remember the difference in solution between two successive nodes of the
> search tree.

Chapter nine of Applied Mathematical Programming from MIT classifies integer
program solvers into three major groups:

[http://web.mit.edu/15.053/www/AMP-Chapter-09.pdf](http://web.mit.edu/15.053/www/AMP-Chapter-09.pdf)

> Whereas the simplex method is effective for solving linear programs, there
> is no single technique for solving integer programs. Instead, a number of
> procedures have been developed, and the performance of any particular
> technique appears to be highly problem-dependent. Methods to date can be
> classified broadly as following one of three approaches:

> i) enumeration techniques, including the branch-and-bound procedure;

> ii) cutting-plane techniques; and

> iii) group-theoretic techniques. [...]

> Branch-and-bound is essentially a strategy of ‘‘divide and conquer.’’ The
> idea is to partition the feasible region into more manageable subdivisions
> and then, if required, to further partition the subdivisions. In general,
> there are a number of ways to divide the feasible region, and as a
> consequence there are a number of branch-and-bound algorithms.

> An integer linear program is a linear program further constrained by the
> integrality restrictions. Thus, in a maximization problem, the value of the
> objective function, at the linear-program optimum, will always be an upper
> bound on the optimal integer-programming objective. In addition, any integer
> feasible point is always a lower bound on the optimal linear-program
> objective value. The idea of branch-and-bound is to utilize these
> observations to systematically subdivide the linear programming feasible
> region and make assessments of the integer-programming problem based upon
> these subdivisions.

The book describes the "cutting-plane" approach, which does seem to work more
like how you're describing (a series of transformations), but also says:

> In practice, the branch-and-bound procedures almost always outperform the
> cutting-plane algorithm. Nevertheless, the algorithm has been important to
> the evolution of integer programming. Historically, it was the first
> algorithm developed for integer programming that could be proved to converge
> in a finite number of steps. In addition, even though the algorithm
> generally is considered to be very inefficient, it has provided insights
> into integer programming that have led to other, more efficient, algorithms.
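
As a concrete (if toy) illustration of the branch-and-bound strategy those
quotes describe, here is a sketch in Python for the 0-1 knapsack problem,
using the fractional (LP-style) relaxation as the bound; the function name and
problem choice are mine, not from any of the cited papers:

```python
def knapsack_bb(values, weights, capacity):
    """Branch and bound for 0-1 knapsack: branch on items in density order,
    prune any node whose relaxation bound can't beat the best so far."""
    n = len(values)
    # Sort by value density so the fractional relaxation is a greedy fill.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)

    def bound(i, value, room):
        # Fractional-knapsack relaxation: fill greedily, split the last item.
        for j in order[i:]:
            if weights[j] <= room:
                room -= weights[j]
                value += values[j]
            else:
                return value + values[j] * room / weights[j]
        return value

    best = 0
    def branch(i, value, room):
        nonlocal best
        if value > best:
            best = value                # every partial selection is feasible
        if i == len(order) or bound(i, value, room) <= best:
            return                      # pruned: relaxation can't improve
        j = order[i]
        if weights[j] <= room:          # branch: take item j ...
            branch(i + 1, value + values[j], room - weights[j])
        branch(i + 1, value, room)      # ... or leave it

    branch(0, 0, capacity)
    return best
```

This is exactly the "divide and conquer plus bounding" pattern from the quotes:
a depth-first tree search whose "back up the tree" step is the backtracking
being discussed.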

~~~
Pamar
As I mentioned elsewhere, it has been ages since I took an actual formal exam
on Linear Programming (and Integer Programming) - since then I have always
used dedicated software to solve LP (there is, for example, a nice,
self-contained LP solver in _Numerical Recipes in C_) and so I have always
focused more on how to describe the problems in terms of linear constraints.

"In my mind" it works like this: [https://www.quora.com/What-is-the-
difference-between-integer...](https://www.quora.com/What-is-the-difference-
between-integer-programming-and-linear-programming)

Having said this, while the computation is guaranteed to terminate even for
NP-hard instances, it may take ages. Integer Programming may very well
leverage the fact that the solution is restricted to integer values and apply
different, faster algorithms to exploit this property - including search
trees. But in the most general sense you do not need a search tree or
backtracking for Linear Programming.

------
sixo
'Concurrency-by-default' is similar to a notation I've been using to map out
async service calls. It's just this: lines are terminated with "," or ";". A
comma doesn't block and all comma-separated lines are executed in any order,
while a semicolon blocks. Names are only usable when a semicolon is reached,
and a semicolon unblocks flow when all preceding names are bound. Probably
code is scoped into { } blocks. So a lambda is like "pyth_distance = {x, y;
sqrt(x^2 + y^2)}". A series of async callbacks would be given by an inner
block {x = call1(), y = call2(); pyth_distance(x,y)}, allowing you to do any
manipulations you can do with normal code.

Might try making a toy language out of it eventually.
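
The comma/semicolon semantics above map fairly directly onto asyncio; a rough
Python approximation (call1/call2 are stand-ins for real async service calls):

```python
import asyncio

async def call1():
    await asyncio.sleep(0.01)   # stand-in for an async service call
    return 3.0

async def call2():
    await asyncio.sleep(0.01)
    return 4.0

def pyth_distance(x, y):
    return (x ** 2 + y ** 2) ** 0.5

async def main():
    # "x = call1(), y = call2();" -- the comma-separated calls run
    # concurrently; the semicolon blocks until both names are bound.
    x, y = await asyncio.gather(call1(), call2())
    return pyth_distance(x, y)

print(asyncio.run(main()))  # 5.0
```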

~~~
aaron-lebo
That's really cool.

Would you build something like this from scratch or base it on another tech
stack like the jvm?

~~~
achamayou
That's exactly what || (concurrent statements) and ; (sequential statements)
do in Esterel.

~~~
aaron-lebo
Thanks, was totally unaware of that. For the lazy and ignorant like me:

[https://en.wikipedia.org/wiki/Esterel](https://en.wikipedia.org/wiki/Esterel)

(if you'd like to throw a link in your comment, would be happy to delete this
one)

~~~
achamayou
No need :) But I'll happily add this one, which is a nice short intro if you
know nothing about it: [http://www.embedded.com/design/prototyping-and-
development/4...](http://www.embedded.com/design/prototyping-and-
development/4024644/An-Introduction-to-Esterel)

I came in contact with the language during my studies (one of my profs worked
on it), found it interesting, but haven't really had a chance to apply it to
anything I've done professionally since. It's a very neat language though, and
I'd happily reach for it if I got to do anything in the embedded field.

I notice that there is now an open source compiler, which is great. The only
implementation I was aware of so far was closed source.

------
tannhaeuser
I don't know if Prolog should be called esoteric. Prolog is an ISO-
standardized language after all, and its syntax has been used for 4+ decades
now in most papers on conceptual database system and query language design
I've read. Which isn't surprising since Prolog syntax, being based on
operator-precedence grammar concepts, is arguably as minimalistic as it gets.
There's definitely also a lot of interest lately in Datalog, not just as a
decidable logic fragment (which has been used for decades as well), but also
as a practical non-SQL database query language and proper Prolog syntax
subset.

~~~
fake-name
Considering they also mention SQL and Forth, I think this could probably
equally be called "things that confuse someone who only knows C".

~~~
andybak
I think both Forth and SQL are outliers in their strangeness. Most people only
have to deal with SQL on a very shallow level and barely regard it as
programming at all.

Forth is known by name and reputation but very few people have ever tried to
write anything in it - let alone delve into its strange bootstrapped nature.

~~~
DonHopkins
FORTH ?KNOW IF HONK ELSE FORTH LEARN THEN

------
pron
To this I would add synchronous programming[1], which is particularly suited
for interactive or concurrent programs and formal reasoning, and has had
success in industry in safety-critical realtime systems. Examples include
Esterel[2] and SCADE, and outside realtime, Céu[3] and Eve[4] (the latter is
based on Dedalus[5], which combines SP with logic programming).

As someone who loves formal methods and believes most mainstream software
systems today are mostly interactive and/or synchronous, I think this paradigm
has a lot of untapped potential, and I'm glad to see it slowly move out of
safety-critical systems into the mainstream, in languages like Eve.

[1]:
[https://en.wikipedia.org/wiki/Synchronous_programming_langua...](https://en.wikipedia.org/wiki/Synchronous_programming_language)

[2]:
[https://en.wikipedia.org/wiki/Esterel](https://en.wikipedia.org/wiki/Esterel)

[3]: [http://www.ceu-lang.org/](http://www.ceu-lang.org/)

[4]: [http://witheve.com/](http://witheve.com/)

[5]:
[https://www2.eecs.berkeley.edu/Pubs/TechRpts/2009/EECS-2009-...](https://www2.eecs.berkeley.edu/Pubs/TechRpts/2009/EECS-2009-173.pdf)

~~~
ibdknox
Yeah, there are a lot of interesting benefits to synchronous programming that
haven't been explored in a wider context and we're excited to be able to do
so. Figuring out how to actually implement Eve's semantics has been quite a
quest and unfortunately the implementations of most of those languages don't
really fit us. Fortunately, we've put some really interesting things together
lately that have produced some very surprising performance numbers for us, so
hopefully that's finally resolved and we can move on to how GALS and the like
apply in our world. :)

~~~
pron
GALS is indeed the obvious next step. That's how I'd do it.

------
robmccoll
ANI reminds me of HDLs [1] - I'm assuming that's the inspiration with
terminology like "latch"?

Hardware is also concurrent by default. Coding some hardware logic will also
change the way you approach coding. Anyone who's interested get an FPGA demo
board and write some verilog or VHDL - I highly recommend it.

1\.
[https://en.wikipedia.org/wiki/Hardware_description_language](https://en.wikipedia.org/wiki/Hardware_description_language)

~~~
verall
Yes, I can't really understand why he didn't mention HDLs. I had never seen
ANI before but both the syntax and terminology reminded me of VHDL.

~~~
snerbles
These layers of abstraction seem to lead software bloggers to re-discover
concepts known by hardware engineers for decades.

------
richard_shelton
Honestly, I don't think that "concatenative" is a good term here. I prefer to
call it a special case of combinator-oriented programming. You may see
examples of this approach in vector languages like APL and FP, and in the
functional world too (see Henderson's book, SICP and so on). The author of Joy
(the language which spawned the whole "concatenative" activity) was clearly
inspired by Backus's FP. Here we have stack combinators and we can emulate
them even in Python, thanks to closures:

    square = code(dup, op("*"))

Add a bit of syntactic sugar and you'll get a "concatenative", or better,
stack combinator-oriented language.
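
One plausible reading of `code(dup, op("*"))` (the helper names are the
comment's; this implementation is a guess at the intent): treat each word as a
function from stack to stack, and let `code` compose them left to right.

```python
import operator

def dup(stack):
    """Duplicate the top of the stack."""
    return stack + [stack[-1]]

def op(symbol):
    """Return a word that pops two values and pushes the result of symbol."""
    fn = {"*": operator.mul, "+": operator.add, "-": operator.sub}[symbol]
    def word(stack):
        b, a = stack[-1], stack[-2]
        return stack[:-2] + [fn(a, b)]
    return word

def code(*words):
    """Compose words left to right, like concatenation in Joy or Forth."""
    def program(stack):
        for w in words:
            stack = w(stack)
        return stack
    return program

square = code(dup, op("*"))
print(square([7]))  # [49]
```

As in Joy, juxtaposing words is function composition, which is what makes the
"concatenative" style work.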

As for Prolog, I really appreciate that the author is calling it "declarative"
not a "logic" language. It's more important to learn about backtracking and
unification (powerful variant of pattern matching) than about something like
Horn clauses. For anyone who wants to learn Prolog better (and it's worth it,
Prolog is one of most beautiful PLs around!) I recommend this article:
[http://www.amzi.com/articles/prolog_under_the_hood.htm](http://www.amzi.com/articles/prolog_under_the_hood.htm)

~~~
Others
I disagree with your second point. "Logic programming" is a much more accurate
term for programming with backtracking and unification than "declarative." For
example, Haskell is a very declarative language, but it doesn't use
backtracking at all!

~~~
richard_shelton
I prefer not to call it "logic programming" from the point of view of learning
the language. Too many textbooks start by explaining that "Socrates is a man"
or talk about Robinson's resolution. And, as a result, the student doesn't
know what to do with the language.

Declarative is a vague term, but for me it means mostly a way to compactly
describe your problem in domain-specific terms. There are some languages in
which you can produce compact code, like APL or J. But declarative for me
means readable too. And there are cases, when in Prolog one can write a more
compact and readable code than in Haskell. In Prolog you can describe just the
essence of the problem. Not always, of course.

Another interesting thing about Prolog is that it's a small language. It has
only a few internal parts that make it alive (the complexity of many Prolog
implementations is a result of fighting for performance). I really like small
languages (like Oberon or Forth, for example), because it's possible to learn
how they work internally. And the knowledge of this inner working helps to
understand the language better. There is nothing "logical" inside Prolog, just
a few powerful imperative constructs.

The author of "Prolog Under the Hood An Honest Look" says:

"Prolog, billed as "logic programming", is not really. You may be disappointed
if that's what you expected to find. On the other hand, having backtracking,
unification, and recursion inside one computer language leads to something
very powerful and special."

And I, personally, like Prolog terms very much!

------
asavinov
Would like to mention _concept-oriented programming_ [1] which is work in
progress but has highly ambitious goals of changing the way we think of
programming. The initial idea is to make _references_ active and customized
elements of the program which can intercept all accesses to the represented
objects. A new programming construct, called _concept_ (hence the name of the
approach), describes both behavior of references and behavior of objects.
Objects exist in a hierarchy modeled by the concept hierarchy. IS-IN relation
is used instead of the traditional IS-A. Also, concept-oriented programming
distinguishes between incoming and outgoing methods, that is, every method has
two versions: for external access and for internal access. More papers in [2].

[1] [https://arxiv.org/abs/1501.00720](https://arxiv.org/abs/1501.00720) [2]
[https://www.researchgate.net/profile/Alexandr_Savinov](https://www.researchgate.net/profile/Alexandr_Savinov)

Disclaimer: I am the author of concept-oriented programming and data model

~~~
DonbunEf7
How does your research interact with object-capability models of computation,
as in the E programming language?

~~~
asavinov
(I may be wrong, but) as far as I understand, E is based on objects and
messages as primary mechanisms, that is, we do not know where objects exist,
how they are managed, and how messages are transferred. Concept-oriented
programming (COP) is a references-first approach where objects are defined as
being _functions of references_, that is, an object is a derived (secondary)
element of the model, and the goal is to have full control over this mechanism
(custom heap, custom garbage collection, custom access control and security
model, etc.)

What I see similar in object-capability model and COP is that they both try to
treat _object access_ as a highly important part of a system behavior. In COP,
I used to write that it is more important what happens during access rather
than in the object itself. Hence, being able to make these intermediate
actions integral part of the program (and develop good PL for that purpose) is
very important.

------
anotheryou
Will Eve ever take off? I like the thoughts behind it.

[https://youtu.be/VZQoAKJPbh8](https://youtu.be/VZQoAKJPbh8) very good talk
about the background of eve. When he finally talks about eve you might want to
switch to a more recent talk about it.

~~~
ibdknox
Eve[1] is in the process of becoming a lot more real than it has been up to
this point. We've found a great model and discovered a way to build a high
performance implementation of it, which means much of the foundational
research is finally in place. Over the next couple of weeks, we'll be
revamping our website, docs, etc to help people get started building real
things with it. :)

There's going to be a lot of really exciting stuff coming over the next few
months. We've gathered a set of ideas, evidence, and implementations that have
certainly blown us away - we hope others will find it valuable too.

EDIT: I realized I didn't address your initial question. Fwiw, we just
recently crossed a really big milestone in terms of usage - more than 40,000
people have now played around with Eve on
[http://play.witheve.com](http://play.witheve.com) and we've learned a ton
from that experience. Part of the website revamp will be making that workflow
simpler and nicer. We have a surprisingly high conversion rate (> 30%), so
hopefully we can help smooth out that experience and begin to grow the
community more and more.

[1]: [http://witheve.com](http://witheve.com)

~~~
throwaway7645
I can't load the site on my phone. Mind pointing out what makes Eve so
special? I'm suspicious of us figuring out anything mind-bogglingly new at
this point and assumed Eve was all a gimmick (I hope to be proved wrong :)).
Something like Red makes a lot of sense to me as that "ahhh" language, as it
is tiny with no install, can be used for high/low-level coding, has excellent
DSLs such as GUI builders, and can compile to a native binary, to name a few
things that kind of shocked me. Eve seems to be more like an online Smalltalk?
It's always fascinating to hear new ideas and I wish the best for your
project!

~~~
cmontella
We've certainly taken a lot of inspiration from Smalltalk, but I think the
semantics we've arrived at make a really nice programming environment, with
some surprising properties you may not think are possible.

Eve has a similar philosophy to Red/Rebol - that programming is needlessly
complex, and by fixing the model we can make the whole ordeal a lot nicer. We
start with a uniform data model - everything in Eve is represented by records
(key value pairs attached to a unique ID). This keeps the language small, both
implementation-wise and in terms of the number of operators you need to learn.

Programs in Eve are made up of small blocks of code that compose
automatically. In each block you query records and then specify transforms on
those records. Blocks are short and declarative, and are reactive to changes
in data so you don't worry about callbacks, caching, routing, etc.

Due to this design, we've reduced the error footprint of the language -- there
are really only 3 kinds of errors you'll ever encounter, and those mostly
relate to data being missing or in a different shape than you expect. What's
more, we'll actually be able to catch most errors with the right tooling.
You'll never experience your app crashing or errors related to undefined/nil
values.

We've made sure your program is transparent and inspectable. If you want to
monitor a value in the system, you can just write a query that displays it, as
the program is running. I like to think of this as a "multimeter for code".
You can do this for variables, memory, HTTP requests, the code itself ...
since everything is represented as records, everything is inspectable.

Because at its core Eve is a database, we also store all the information
necessary to track the provenance of values. This is something most languages
can't do, because the information just isn't there. So for instance if a value
is wrong, you can find out exactly how it was generated.

There's a lot more work to do, but we have big plans going forward. We plan to
make the runtime distributed and topology-agnostic, so you can potentially
write a program that scales to millions of users without having to worry about
provisioning servers or how your code scales.

We're also planning on multiple interfaces to the runtime. Right now we have a
text syntax, but there's no reason we couldn't plug in a UI specific editor
that integrates with text code. We've explored something like this already.

Anyway, those are future directions, but what we have right now is a set of
semantics that allow for the expression of programs without a lot of the
boilerplate and hassle you have to go through in traditional languages, and
which provide some unique properties you won't find in any other language (at
least none that I've used).

~~~
lioeters
Thank you for these intriguing thoughts behind the development of Eve. This
project was for me the most valuable find in this whole discussion. There are
a number of fundamental design decisions in Eve that opened my eyes to a fresh
rethinking of the underlying assumptions in existing languages.

The code examples demonstrate surprising simplicity in achieving features that
would be complicated to implement in other languages. I'm convinced that Eve
will influence how programmers think (at least it did for me) and promote
development of languages/frameworks/libraries that adopt some of these ideas.
Great work, will be following with interest.

------
nialv7
Array oriented programming languages are also pretty interesting:

[https://en.wikipedia.org/wiki/APL_(programming_language)](https://en.wikipedia.org/wiki/APL_\(programming_language\))

[https://en.wikipedia.org/wiki/J_(programming_language)](https://en.wikipedia.org/wiki/J_\(programming_language\))

~~~
nickpsecurity
Especially in something like kdb.

~~~
throwaway7645
I hear this a lot, but am confused in that I can hardly find any examples
anywhere. It'd be nice if Kx Systems put up some videos of people querying
data and building charts...etc.

~~~
nickpsecurity
[http://tech.marksblogg.com/billion-nyc-taxi-
kdb.html](http://tech.marksblogg.com/billion-nyc-taxi-kdb.html)

~~~
throwaway7645
A nice demo, but still doesn't show examples of a customer going through an
example dataset and building dashboards. The kx site is good at talking about
all their customers without giving a whole lot of real info. I wish they had a
64-bit hobbyist license.

~~~
nickpsecurity
I don't know about 64-bit but people in prior HN discussions said you can
download a free one off their web site. You should be able to extrapolate some
things from that.

EDIT: [https://kx.com/download/](https://kx.com/download/)

~~~
throwaway7645
True, and thank you. I think it'd be nice if they had a video of a user on a
sample system running queries and building displays. Yea I can play with the
demo, but for that price I'd expect some promotional materials to be made.

------
dkersten
A few random comments:

Forth is a great concatenative language, since it's the pioneer in that area
(I think), but Factor is definitely worth mentioning too as a "modern" take on
the paradigm. It essentially tries to be a concatenative Lisp.

ANI was dead even in 2014 when this article was written (which the author
acknowledges: _"the language seems dead, but the concepts are pretty
interesting"_). It has some really interesting ideas, but since it never got
implemented, I'm not sure how much use there is in discussing it here amongst
_real_ languages. It would be useful as a discussion for possible future
languages for sure, but it's currently still just a concept, so I'm not sure
what _practical_ thing you can learn from it right now.

------
mcguire
For some extra dependently typed fun, check out ATS and Dafny.

ATS is aimed at system programming, and if you think Idris has a steep
learning curve, you'll need belaying for ATS. And, the language is horrible.
But it's really mind expanding to see it in action.

Dafny is a really basic language with support for Hoare/Dijkstra-style
verification. It's a completely different approach from the type-system-based
model.

------
oolongtea
ANI/Plaid reminded me of the LabVIEW visual dataflow language, which is quite
widely used in the branch of physics I used to work in, for data acquisition
and instrument control. While I've often longed for a text-based alternative
that plays better with modern version control and my favorite text editors, I
have to say that having everything laid out for you in a visual way does make
it easier to reason about the execution flow. That is, after a few years of
working with it --- initially, this paradigm shift was rather a painful
stretch of the nerves.

If every language has its own specific dark patterns and bottlenecks,
LabVIEW's is definitely the "brightly-colored spaghetti" structural breakdown
of an advanced beginner's code :-)

Incidentally, why do we, as programmers, tend to focus on a language's
bottlenecks so much, in such an emotionally charged way? Any psychology-of-
programming people out here? You might consider LabVIEW an excellent case
study in getting on people's nerves...

------
_pmf_
I'd recommend to everybody (but especially those involved in either embedded
systems or a lot of concurrent state) to try out HSM / state chart programming
(note: this has basically nothing to do with "flat" FSMs). It's as close to a
silver bullet as you'll ever get for these kinds of systems.

Stateflow or QP/QM; all other systems suck.
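The key difference from flat FSMs is that states nest: common transitions are
defined once on a parent state, and unhandled events bubble upward. Here's a
toy sketch in Python (hypothetical names throughout; this is nothing like the
actual Stateflow or QP/QM APIs):

```python
# Toy hierarchical state machine: unhandled events bubble to parent states,
# so shared transitions (e.g. "power_off") live once on a superstate.

class State:
    def __init__(self, name, parent=None, handlers=None):
        self.name = name
        self.parent = parent
        self.handlers = handlers or {}  # event -> name of target state

class HSM:
    def __init__(self, states, initial):
        self.states = {s.name: s for s in states}
        self.current = self.states[initial]

    def dispatch(self, event):
        s = self.current
        while s is not None:            # walk up the state hierarchy
            if event in s.handlers:
                self.current = self.states[s.handlers[event]]
                return True
            s = s.parent                # not handled here: bubble up
        return False                    # nobody handled the event

# "idle" and "playing" inherit the power_off transition from "on".
on      = State("on", handlers={"power_off": "off"})
idle    = State("idle", parent=on, handlers={"play": "playing"})
playing = State("playing", parent=on, handlers={"stop": "idle"})
off     = State("off", handlers={"power_on": "idle"})

hsm = HSM([on, idle, playing, off], initial="off")
hsm.dispatch("power_on")   # off -> idle
hsm.dispatch("play")       # idle -> playing
hsm.dispatch("power_off")  # handled by the "on" superstate -> off
```

Real statechart tools add entry/exit actions, history states, and orthogonal
regions on top of this bubbling mechanism, but the hierarchy is the core idea.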

~~~
petra
Is it useful for writing drivers, or mostly for application code?

And what are the drawbacks? Why isn't everybody using it?

~~~
_pmf_
It's used heavily in application code for automotive ECUs. It has code
generators that produce output with a really, really small footprint and zero
runtime overhead; thanks to optimization at a very high level, this is on par
with really good LTO.

Drawbacks: it's very un-agile; you really have to think the system through
completely. (The magic being that if you do this, it is very likely correct by
design.) It's not really feasible to specify a part of the system now and
leave other parts open for later refinement. The other drawback being that no
good non-commercial options exist.

~~~
petra
>> It's not really feasible to specify a part of the system now and leave
other parts open for later refinement.

Is there any work or ideas on how to solve that issue?

And so it's also hard to add features later, in subsequent versions?

~~~
nickpsecurity
Stuff like the following. It overlaps with model-driven development, where you
work at a higher level in a constrained way to knock out many issues. Then it
generates safe code, which you also check with tests or other tools.

[https://en.wikipedia.org/wiki/Stateflow](https://en.wikipedia.org/wiki/Stateflow)

[https://en.wikipedia.org/wiki/Simulink](https://en.wikipedia.org/wiki/Simulink)

Recent example from high-assurance security:

[https://www.umsec.umn.edu/sites/www.umsec.umn.edu/files/hard...](https://www.umsec.umn.edu/sites/www.umsec.umn.edu/files/hardin-
icfem09-proof.pdf)

------
lioeters
Thought-provoking. The examples are all programming languages, but the
paradigms themselves can apply on a smaller scale, i.e., for application
features, as design patterns or inspiration.

The section on "symbolic programming" still has me pondering its potential
implications. It makes me imagine something like a "visual" WYSIWYG editor,
but for concepts. Looking forward to digging deeper via the provided
references.

------
ramchip
It's strange to see ANI mentioned as if it were a real, working language.
AFAIK it was never able to compile even the samples:
[https://code.google.com/archive/p/anic/issues/7](https://code.google.com/archive/p/anic/issues/7)

------
dkarapetyan
Wait. What exactly is esoteric about these paradigms? All the books I've read
simply call them regular paradigms.

~~~
throwaway7645
You're correct that I'd not consider these esoteric, just not procedural or
OO. Esoteric would be something like Brainf**k, the music programming
language, or that art language (forget the name). Unusual but still commercial
and useful paradigms are logic, array, functional, and concatenative, to name
a few.

~~~
jfoutz
Piet, the Mondrian-inspired language, I'm guessing.

~~~
throwaway7645
Yes that was it.

------
minxomat
Curious. ANI seems to me like an abstract form of graph-parallel programming,
where the language itself is the scheduler.

There are some production-ready schedulers for GPP, like Intel's TBB[1] (C++),
but learning to be effective with this requires a major shift in thinking
about code - essentially thinking in graphs.

[1] - [https://www.threadingbuildingblocks.org/tutorial-intel-
tbb-f...](https://www.threadingbuildingblocks.org/tutorial-intel-tbb-flow-
graph)

~~~
edparcell
My team has been working on a Python library called Loman that represents
computations as graphs. We've open-sourced it [1][2]. One of our aims is to
make it as natural as possible to use graph-based programming, and within an
already-familiar programming language. I'd be interested to know what you
think.

[1]
[https://github.com/janusassetallocation/loman](https://github.com/janusassetallocation/loman)
[2]
[http://loman.readthedocs.io/en/latest/user/intro.html](http://loman.readthedocs.io/en/latest/user/intro.html)

~~~
minxomat
Can you demo your library with a more complex example, e.g. the Dining
Philosophers Problem? Here[1] is the solution using TBB, and here[2] a more
recent version, using a multi-output function node to optimize the flow.

[1] - [https://software.intel.com/en-us/blogs/2011/01/10/using-
the-...](https://software.intel.com/en-us/blogs/2011/01/10/using-the-intel-
threading-building-blocks-graph-community-preview-feature-an-implementation-
of-dining-philosophers)

[2] - [https://software.intel.com/en-us/blogs/2011/09/13/using-
inte...](https://software.intel.com/en-us/blogs/2011/09/13/using-intel-
tbb-40-features-to-simplify-dining-philosophers)

~~~
edparcell
Thanks for the links. I took a look, and I think that the intention is quite
different between the libraries. Our library would not directly apply to the
Dining Philosophers Problem. Both libraries use graphs to represent
dependencies between tasks, but they do so for different reasons, and to cover
different uses. The Intel library does it with the intention of scheduling a
given workload. Our library uses a directed acyclic graph to track state as
either the data or function for given nodes of the graph are exogenously
updated, either interactively during research, or from new incoming data in a
real-time system. We cover where we think our library is useful in more depth
in the introduction section of our documentation[1].

[1]
[http://loman.readthedocs.io/en/latest/user/intro.html](http://loman.readthedocs.io/en/latest/user/intro.html)
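The idea can be illustrated in a few lines of Python. This is my own minimal
sketch of a stale-propagating computation graph, not Loman's actual API:

```python
# Minimal computation graph: nodes are values or functions of other nodes.
# Setting an input marks everything downstream stale; reads recompute lazily.

class Graph:
    def __init__(self):
        self.funcs = {}    # node -> function
        self.deps = {}     # node -> list of input node names
        self.values = {}   # node -> cached value
        self.stale = set()

    def set_value(self, name, value):
        self.values[name] = value
        self._invalidate(name)

    def add_node(self, name, func, deps):
        self.funcs[name], self.deps[name] = func, deps
        self.stale.add(name)

    def _invalidate(self, name):
        for node, deps in self.deps.items():
            if name in deps and node not in self.stale:
                self.stale.add(node)
                self._invalidate(node)   # cascade downstream

    def value(self, name):
        if name in self.stale:
            args = [self.value(d) for d in self.deps[name]]
            self.values[name] = self.funcs[name](*args)
            self.stale.discard(name)
        return self.values[name]

g = Graph()
g.set_value("x", 3)
g.add_node("y", lambda x: x + 1, ["x"])
g.add_node("z", lambda y: y * 2, ["y"])
print(g.value("z"))   # 8
g.set_value("x", 10)  # exogenous update: y and z become stale
print(g.value("z"))   # 22
```

The point of the pattern is that an exogenous update (new data, or a redefined
function) only invalidates its downstream nodes, so recomputation is minimal.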

------
protomyth
Agent oriented-programming
[http://robotics.stanford.edu/~shoham/www%20papers/AgentOrien...](http://robotics.stanford.edu/~shoham/www%20papers/AgentOrientedProgrammingAIJ.pdf)
[PDF]

~~~
romaniv
Alan Kay talking about agent-oriented computing in 1990:

[https://youtu.be/275FQ9koAw8?t=7647](https://youtu.be/275FQ9koAw8?t=7647)

Very interesting views, especially considering how long ago that was and how
relevant they still are.

~~~
vram22
Some time after Java first came out, there was a product called Voyager from a
Java products company called Objectspace [1]. Voyager was a product for
creating and using mobile agents. I had downloaded the trial and tried it out
a bit. It was quite cool. IIRC, Graham Glass, who was involved in ObjectSpace,
was also the creator of Electric XML, an XML library, and was later CTO of
WebObjects. Recently he had/has been working on EDU 2.0 (EDU20.org), an
e-learning product company.

[1] They were also the creators of JGL, the Java Generics Library, which was
like a Java version of the C++ STL, and done before Java got generics
natively.

------
combatentropy

      > Dependent types
      > 
      > Example languages: Idris, Agda, Coq
      > 
      > You’re probably used to type systems in languages like C and Java,
      > where the compiler can check that a variable is an integer, list, or string.
      > But what if your compiler could check that a variable is “a positive integer”,
      > “a list of length 2”, or “a string that is a palindrome”?
    

This is what I love about SQL. You can even define your own types, like
"email", at least in PostgreSQL:

    
    
      create domain email as text
        check (value ~* '^[^@]+@[^@]+$');  -- illustrative format check

      create table contacts (
        name text not null,
        age int check (age >= 0),
        email email
      );
    

It already has a few of these special types built in, like for IP and MAC
addresses
([https://www.postgresql.org/docs/current/static/datatype.html](https://www.postgresql.org/docs/current/static/datatype.html)).

~~~
kornish
Out of interest, how is common ORM support for custom Postgres types?

~~~
siddboots
GeoAlchemy for PostGIS/SQLAlchemy is a good example of explicit support. But
typically, you can always "deal" with custom types in your ORM, even if that
means falling back to binary data.

------
afdsadf
A shame not to see Factor in the concatenative list; it addresses some of the
pain points there with locals.

~~~
protomyth
PostScript would also have been a fine addition, and it has its own methods of
dealing with locals.
[https://www.adobe.com/products/postscript/pdfs/PLRM.pdf](https://www.adobe.com/products/postscript/pdfs/PLRM.pdf)
[PDF]

~~~
DonHopkins
What a sad world we live in when Adobe publishes a paper about PostScript in
PDF (which is just PostScript without the programming language).

To help offset the irony and celebrate Turing completeness, here is a
PostScript interpreter implemented in PostScript, and an explanation of why.

[http://www.donhopkins.com/home/archive/cyber/litecyber/ps.ps...](http://www.donhopkins.com/home/archive/cyber/litecyber/ps.ps.txt)

[http://www.donhopkins.com/home/archive/cyber/litecyber/ps.ps...](http://www.donhopkins.com/home/archive/cyber/litecyber/ps.ps.reasons)

[http://www.donhopkins.com/drupal/node/97](http://www.donhopkins.com/drupal/node/97)

------
toolslive
I have also seen _persistent by default_, where every variable is by default
stored in a database and automatically initialized when you come back to that
piece of code. Useful for web development. (Sorry, I forgot the name of the
programming language.)

~~~
lioeters
I'd agree that "persistent by default" is an unusual and interesting approach
that would suit the list of paradigms that may "change the way you code".

It reminds me of a statement I read about managing application state, to treat
the state (in this case a Redux store) as an "in-memory database". Add a layer
to load/persist automatically - via LocalStorage, WebSocket, etc. - and it
would be persistent by default. I suppose you wouldn't want _everything_
persistent though, just a relevant slice of state.
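As a rough illustration of what "persistent by default" could mean (my own
sketch, not any particular language's mechanism): every assignment writes
through to a backing store, and a fresh scope rehydrates from it
automatically.

```python
import json
import os
import tempfile

class PersistentScope:
    """Attributes write through to a JSON file and survive re-creation."""

    def __init__(self, path):
        self.__dict__["_path"] = path
        if os.path.exists(path):          # rehydrate from the store
            with open(path) as f:
                self.__dict__["_vars"] = json.load(f)
        else:
            self.__dict__["_vars"] = {}

    def __getattr__(self, name):
        try:
            return self.__dict__["_vars"][name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        self._vars[name] = value
        with open(self._path, "w") as f:  # write-through on every assignment
            json.dump(self._vars, f)

path = os.path.join(tempfile.mkdtemp(), "scope.json")
s = PersistentScope(path)
s.counter = getattr(s, "counter", 0) + 1  # would survive across runs
```

A real persistent language would hide the store entirely; and as noted above,
you'd likely want only a chosen slice of state to behave this way.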

Here's an article about "persistent languages", which includes discussion on
related features.
[http://wiki.c2.com/?PersistentLanguage](http://wiki.c2.com/?PersistentLanguage)

------
leovonl
This classification of "paradigms" is a bit off.

First, declarative programming is a generic name which includes a broad range
of paradigms - from functional to logic programming. Logic programming is
something that deserves a special mention and discussion, because there are a
number of interesting and unique concepts that deserve a more in-depth
explanation.

Second, "dependent types" is better understood as a feature of a language (or
better yet, of a type system) than a paradigm by itself.

Some of the other "paradigms" also seem more like characteristics of
languages, and not really something that structures the way solutions are
expressed/understood.

~~~
mmalone
This. There are three (and arguably only three) common programming paradigms:
imperative programming, functional programming, and logic programming with
lineages back to Turing machines, lambda calculus, and formal logic / proof
search, respectively. Languages can be broken down along other dimensions, but
usually the term "paradigm" is reserved for the sort of irreducible foundation
of a language, and these seem to be the three useful ones.

Edit: to put a slightly finer point on it, this irreducible foundation mostly
has to do with how the language "computes." Imperative languages compute via
statements that modify program state. Functional languages compute via proof
reduction to normal form. Logic languages compute via proof search.

~~~
adwn
There's no reason why you couldn't integrate functional programming and
imperative programming into a single language – for example, an imperative
layer (for I/O) around a functional, lambda-calculus-based core (for
computations). Now what is the _"irreducible foundation of [this] language"_?

------
devrandomguy
There should be a category for analogy based languages. Rail or the Billiard
Ball Machine, which looks like Feynman diagrams on a pool table, are examples
of this.

[https://esolangs.org/wiki/Rail](https://esolangs.org/wiki/Rail)
[https://esolangs.org/wiki/Billiard_ball_machine](https://esolangs.org/wiki/Billiard_ball_machine)

------
dragonwriter
I'm kind of uncomfortable calling "declarative" a paradigm; it's a broad (and
not binary) feature. Prolog and SQL are, respectively, the leading exemplars
of the logic and relational paradigms. They are both fairly declarative (but
then, in the age of optimizing compilers, even C is somewhat declarative: your
code constrains the result but it doesn't dictate the how as much as it seems
to.)

------
rmidthun
I would add pictorial programming languages, such as ProGraph if you want a
really unusual way to program. Object oriented, but inherently dataflow. Loops
were very weird though.

[https://en.wikipedia.org/wiki/Prograph](https://en.wikipedia.org/wiki/Prograph)

------
tjalfi
ParaSail[0] is another implicitly parallel language. It has similar goals to
Rust and Spark Ada.

[0] [https://forge.open-do.org/plugins/moinmoin/parasail/](https://forge.open-
do.org/plugins/moinmoin/parasail/)

------
a_c
On concatenative languages, I would like to add Piet[1] as a contender. Plus,
a Piet program can look like 8-bit art (to me).

[1]
[http://www.dangermouse.net/esoteric/piet.html](http://www.dangermouse.net/esoteric/piet.html)

~~~
a_c
I once read about Pieter Hintjens' model-oriented programming [1], the idea of
which I have yet to understand. I hope someone can provide more insight on
this subject. HN discussion: [2]

[1] [https://github.com/imatix/gsl](https://github.com/imatix/gsl) [2]
[https://news.ycombinator.com/item?id=11558007](https://news.ycombinator.com/item?id=11558007)

------
westurner
Re: "Dependent Types"

In Python, PyContracts supports runtime type-checking and value
constraints/assertions (as @contract decorators, annotations, and docstrings).

[https://andreacensi.github.io/contracts/](https://andreacensi.github.io/contracts/)

Unfortunately, there's as yet no unifying syntax between PyContracts and the
newer Python type annotations, which MyPy checks statically.

[https://github.com/python/typeshed](https://github.com/python/typeshed)
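To give a flavor of runtime value contracts, here is a hand-rolled sketch
(this is not PyContracts' actual syntax, just the general idea): a decorator
that checks value-level predicates, a "poor man's dependent type", on
arguments at call time.

```python
import functools
import inspect

def contract(**checks):
    """checks maps argument names to predicates, e.g. n=lambda v: v >= 0."""
    def decorate(fn):
        sig = inspect.signature(fn)

        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            for name, pred in checks.items():
                if name in bound.arguments and not pred(bound.arguments[name]):
                    raise ValueError(f"contract violated for argument {name!r}")
            return fn(*args, **kwargs)
        return wrapper
    return decorate

@contract(n=lambda v: isinstance(v, int) and v >= 0)
def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)

factorial(5)     # fine
# factorial(-1)  # raises ValueError at call time
```

The contrast with dependent types is that here the "positive integer"
constraint is only checked when the call actually happens, not proven at
compile time.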

What does it mean for types to be "a first class member of" a programming
language?

------
Blackthorn
Concurrency by default feels a bit like the underlying processor of the
machine, what with superscalar architectures and all.

~~~
snerbles
Hardware Description Languages tend to be concurrent by default, with specific
syntax for sequential logic.

A parallel pair of AND gates is physically concurrent, let alone anything with
a clock.

------
garyclarke27
A great, thought-provoking article, with a good explanation of dependent
types. The title is misleading, though; they should be called particular or
specific types. Postgres has the same capability (domains); I'm always
surprised that other DBs, such as SQL Server, have never adopted this useful
feature.

------
oddmunds
There seems to be an old discussion over here:
[https://news.ycombinator.com/item?id=7565153](https://news.ycombinator.com/item?id=7565153)

------
Chris2048
Aurora looks interesting, but it seems to be .Net/windows only?

~~~
ibdknox
That must be a different Aurora. The one mentioned here was never really
released, only demo'd at Strange Loop. We subsequently went on to turn that
into Eve[1].

[1]: [http://witheve.com](http://witheve.com)

~~~
rmbeard
Thanks for confirming that. This looks very interesting. How active is
development of the mathematical side of things? Also, have graphics
capabilities been developed yet, or are they far away? The HN article
indicated that Aurora, now Eve, would have these capabilities. I'm interested
because of the similarities I see between Eve and other literate programming
approaches such as the Jupyter project, Mathematica, Sage notebooks, and R
notebooks. Eve looks like it will soon be joining that group.

~~~
cmontella
Recent development has been focused on shoring up the semantics of the
language, making the runtime more performant, and packaging our syntax and
editor to make it more usable for a wider audience (i.e. people outside our
office). Efforts relating to a UI that is approachable from a non-programmer
perspective have been waylaid until we produce something that is usable and
comfortable for at least the programmer crowd.

------
DonHopkins
I recently submitted a link about "Robust First Computing". It didn't get any
response, but I'll repeat the link and description here, since it's certainly
esoteric, but has some extremely important properties.

Robust-First Computing: Distributed City Generation
[https://www.youtube.com/watch?v=XkSXERxucPc](https://www.youtube.com/watch?v=XkSXERxucPc)

A Movable Feast Machine [1] is a "Robust First" asynchronous distributed fault
tolerant cellular-automata-like computer architecture.

The video "Distributed City Generation" [2] demonstrates how you can program a
set of Movable Feast Machine rules that build a self-healing city that fills
all available space with urban sprawl, and even repairs itself after
disasters!

The paper "Local Routing in a new Indefinitely Scalable Architecture" [3] by
Trent Small explains how those rules work, how the city streets adaptively
learn to route the cars to nearby buildings they desire to find, and
illustrates the advantages of "Robust First" computing:

Abstract: Local routing is a problem which most of us face on a daily basis as
we move around the cities we live in. This study proposes several routing
methods based on road signs in a procedurally generated city which does not
assume knowledge of global city structure and shows its overall efficiency in
a variety of dense city environments. We show that techniques such as
Intersection-Canalization allow for this method to be feasible for routing
information arbitrarily on an architecture with limited resources.

This talk "Robust-first computing: Demon Horde Sort" [4] by Dave Ackley
describes an inherently robust sorting machine, like "sorting gas",
implemented with the open source Movable Feast Machine simulator, available on
github [5].

A Movable Feast Machine is similar in many ways to traditional cellular
automata, except for a few important differences that are necessary for
indefinitely scalable, robust-first computing.

First, the rules are applied to cells in random order, instead of to all cells
at once in lockstep (which requires double buffering). Many rule-application
events may execute in parallel, as long as their "light cones", the cells
visible to the executing rules, do not overlap.

Second, the "light cone" of a rule, aka the "neighborhood" in cellular
automata terms, is larger than in typical cellular automata, so the rule can
see other cells several steps away.

Third, the rules have write access to all of the cells in the light cone, not
just the one in the center like cellular automata rules. So they can swap
cells around to enable mobile machines, which is quite difficult in cellular
automata rules like John von Neumann's classic 29 state CA. [6] [7]

Fourth, diffusion is built in. A rule may move the particle to another empty
cell, or swap it with another particle in a different cell. And most rules
automatically move the particle into a randomly chosen adjacent cell, by
default. So the particles behave like gas moving with brownian motion, unless
biased by "smart" rules like Maxwell's Demon, like the "sorting gas" described
in the Demon Horde Sort video.
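The flavor of this event model can be sketched in a few lines (a deliberately
simplified 1-D toy of my own, not the real MFM rule set): sites fire in random
order with no global clock, each event sees and may write a small
neighborhood, and the default rule diffuses particles like a gas.

```python
import random

def step(grid, events=100, rng=random):
    """Apply single-site events to a 1-D toroidal grid in random order."""
    n = len(grid)
    for _ in range(events):
        i = rng.randrange(n)                 # random site: no global clock
        if grid[i] is None:
            continue                         # empty site: nothing to do
        j = (i + rng.choice([-1, 1])) % n    # a neighbor in the light cone
        # The rule may write BOTH cells: swap the particle with its neighbor.
        grid[i], grid[j] = grid[j], grid[i]
    return grid

grid = ["A", "B"] + [None] * 6
step(grid, events=500)
# Particles wander (brownian-style) but are conserved: one A, one B remain.
```

A "smart" rule like the Demon Horde Sort replaces the unbiased swap with one
that prefers moves improving a local ordering criterion.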

In this video, "Robust-first computing: Announcing ULAM at ECAL 2015" [8],
David Ackley explains why "Robust First" computing and architectures like
Movable Feast Machines are so important for scaling up massively parallel
hardware.

I think this is incredibly important stuff in the long term, because we've hit
the wall with determinism, and the demos are so mind blowing and visually
breathtaking, that I want to try programming some of my own Movable Feast
Machine systems!

[1] [http://movablefeastmachine.org/](http://movablefeastmachine.org/)

[2]
[https://www.youtube.com/watch?v=XkSXERxucPc](https://www.youtube.com/watch?v=XkSXERxucPc)

[3]
[http://www.cs.unm.edu/~ackley/papers/paper_tsmall1_11_24.pdf](http://www.cs.unm.edu/~ackley/papers/paper_tsmall1_11_24.pdf)

[4]
[https://www.youtube.com/watch?v=helScS3coAE](https://www.youtube.com/watch?v=helScS3coAE)

[5] [https://github.com/DaveAckley/MFM](https://github.com/DaveAckley/MFM)

[6]
[https://en.wikipedia.org/wiki/Von_Neumann_cellular_automaton](https://en.wikipedia.org/wiki/Von_Neumann_cellular_automaton)

[7]
[https://en.wikipedia.org/wiki/Von_Neumann_universal_construc...](https://en.wikipedia.org/wiki/Von_Neumann_universal_constructor)

[8]
[https://www.youtube.com/watch?v=aR7o8GPgSLk](https://www.youtube.com/watch?v=aR7o8GPgSLk)

------
bryanrasmussen
Isn't Lua supposed to be an example of a concatenative language?

~~~
beagle3
Not at all.

The definition I like to use of concatenative language is:

If "X Y" is a legal program, then "X" is a list of tokens and a legal program
and "Y" is a list of tokens and a legal program, and semantically, executing
"X Y" is equivalent to executing "X" and then executing "Y".

Practical implementations often deviate a little from this ideal.
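That property is easy to demonstrate with a toy stack interpreter (an
illustrative Python sketch, not any particular concatenative language):

```python
def run(tokens, stack=None):
    """Interpret a tiny postfix language; the stack carries all state."""
    stack = [] if stack is None else stack
    for t in tokens:
        if t == "dup":
            stack.append(stack[-1])
        elif t in ("+", "*"):
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if t == "+" else a * b)
        else:
            stack.append(int(t))   # numeric literals push themselves
    return stack

X = ["2", "3", "+"]   # a legal program on its own: leaves [5]
Y = ["dup", "*"]      # also a legal program: squares the top of the stack
# Executing "X Y" is the same as executing X, then executing Y:
run(X + Y)            # [25]
```

Concatenation of programs is composition of their effects on the stack, which
is exactly the property in the definition above.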

~~~
bryanrasmussen
Not sure where I had it from, probably some other language and I got it mixed
up with Lua in my head.

------
pitaj
I believe the HN title convention is to remove the number of list items; e.g.,
this title should be just "Programming paradigms that will change how you
think about coding".

~~~
sctb
You're right, and “... that will change how you think about coding” is
clickbait. Luckily the article contains an adjective for that, which we've
used in the title. Thanks!

~~~
dasmoth
I realise moderation is tough and never pleases everyone, but I have to say
I'm disappointed here: in some circles, "esoteric" carries some quite strong
negative connotations, which I think are unwarranted here.

Some of these languages are definitely suitable for "serious" usage. And I'm
not sure I'd count SQL (or Prolog, for that matter) as esoteric at all!

~~~
sctb
That's a good point. We'll happily update the title again if someone can
suggest a better one still using the author's language.

~~~
fenomas
The suggestion from 'pitaj' that started this comment thread was perfect.

