Strand Programming Language (call-with-current-continuation.org)
117 points by QuinnWilton 8 months ago | 28 comments

Their implementation approach is interesting: they have their own custom Forth implementation, with a kernel written in assembly language (ARM, aarch64, x86_64 and ppc64le). Then Strand is written in a mixture of Forth and itself.

I think this is very indicative of Winkelmann's philosophies about software in general. On his site he writes a little bit about why he thinks Forth is so important, and I think it makes for a good explanation of why he probably implemented Strand like this:


A sample from their user manual[0]:

    % print even numbers between 1 and 10

    main :- count(1).

    count(N) :- N =< 10 | even(N), N2 is N + 1, count(N2).

    even(N) :- M is N \\ 2, even(M, N).

    even(0, N) :- writeln(N).
    even(_, _) :- otherwise | true.
[0] http://www.call-with-current-continuation.org/strand/MANUAL

What isn't clear from this snippet, and what I think is incredibly cool, is that all of the expressions within a function (here called processes) actually run in parallel.

In this snippet:

   count(N) :- N =< 10 | even(N), N2 is N + 1, count(N2).
count(N) is defined as a process that takes an argument named N; if N is less than or equal to 10, the process changes state to one of the processes on the right-hand side and is also forked to become the other two.

Where it gets weird is that the resulting three processes have data dependencies between them: if the third process, count(N2), were executed first, the dependency on N2 wouldn't yet be satisfied, so that process would suspend and another would be chosen for execution.

It's easy to look at this line of code and think that the three expressions will execute in the order they're written, but any interleaving of them is possible, with that sort of non-determinism being built into the language by design.
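To make that suspension behaviour concrete, here's a toy Python sketch of the idea (my own illustration, nothing to do with Strand's actual runtime): a round-robin scheduler in which a process that reads an unbound variable suspends and is retried later. The three callbacks loosely mirror the body of count(1) from the snippet above.

```python
# Toy round-robin scheduler illustrating dataflow suspension.
# This is an illustration of the concept only, not Strand's scheduler.

from collections import deque

UNBOUND = object()

class Var:
    """A single-assignment logic variable."""
    def __init__(self):
        self.value = UNBOUND

def run(goals):
    """Run goals round-robin; a goal that returns 'suspend' is
    requeued until the variable it depends on is bound."""
    queue = deque(goals)
    trace = []
    while queue:
        goal = queue.popleft()
        if goal(trace) == "suspend":
            queue.append(goal)      # try again later
    return trace

n2 = Var()

def even(trace):                    # roughly: even(N), with N = 1
    trace.append("even(1)")

def assign(trace):                  # roughly: N2 is N + 1
    n2.value = 1 + 1
    trace.append("N2 := 2")

def count(trace):                   # roughly: count(N2), needs N2 bound
    if n2.value is UNBOUND:
        return "suspend"            # data dependency not yet satisfied
    trace.append(f"count({n2.value})")

# Deliberately schedule count(N2) first: it suspends, and only runs
# after the assignment has produced a value for N2.
trace = run([count, assign, even])
print(trace)
```

Even though count was scheduled first, the trace shows it completing last, after its input became available.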

I'm currently reading the Strand book, and it more or less describes the language as being Prolog, but without unification and backtracking: instead treating dataflow as being the basis for the computational model.

It would be interesting if current computers could take advantage of all expressions running in parallel; back in the last century, this exposed more parallelism (and incurred more coordination cost) than was useful, which made coarser-grained approaches the better choice.

I guess that, this being a declarative language, it's the compiler's task to determine the cutoff between merely reordering tasks and actually spawning a thread, right?

This is effectively what happens when HDLs are compiled into an ASIC or FPGA.

Can you share your copy of the Strand book? A cursory Google search makes it seem somewhat elusive...

Unfortunately not: it's a physical copy and was a very generous gift from a friend yesterday, who knew I had been trying to track down a used copy for over a year.

That is a good friend; you should keep them.

Does the book explain why it doesn't use unification?

Not that I've seen yet, but the manual does include this passage:

> Strand provides pattern matching but no general unification, instead logic variables must be explicitly assigned, which simplifies the semantics considerably.

It can be useful to tackle unification in two steps, even in Prolog. By pattern-matching first (a la functional programming) we quickly get either a failure or a set of distinct variables with constraints, which the second 'proper unification' step can try to solve.

For example, unifying `plus(1, 2)` with `plus(X, X)` can start by pattern-matching, which turns the latter into `plus(X, Y)` with the constraint X = Y (to avoid variables occurring more than once). The match will succeed, producing the constraints X = 1 and Y = 2, along with the X = Y constraint from before. The second step will try to solve these constraints, and fail since they're inconsistent.
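That two-step flow can be sketched in a few lines of Python (an illustration of the idea only; the `match` and `solve` names, and the tuple encoding of terms, are my own). `match` linearises the pattern, giving each variable occurrence a fresh name and recording an equality constraint between occurrences; `solve` then checks the deferred constraints.

```python
# Two-step "match, then solve" sketch of unifying plus(1, 2) with
# plus(X, X). Illustrative only; not any particular Prolog's algorithm.
# Variables are strings, constants are ints, terms are argument tuples.

def match(pattern, term):
    """Step 1: linearise the pattern (fresh name per variable
    occurrence), record equalities between renamed occurrences,
    then pattern-match functional-programming style."""
    constraints = []
    bindings = {}
    seen = {}
    for i, (p, t) in enumerate(zip(pattern, term)):
        if isinstance(p, str):              # a variable occurrence
            fresh = f"{p}{i}"               # X -> X0, X1, ...
            if p in seen:
                constraints.append((seen[p], fresh))  # e.g. X0 = X1
            seen[p] = fresh
            bindings[fresh] = t
        elif p != t:
            return None                     # constructor/constant clash
    return bindings, constraints

def solve(bindings, constraints):
    """Step 2: try to satisfy the deferred equality constraints."""
    return all(bindings[a] == bindings[b] for a, b in constraints)

# plus(X, X) against plus(1, 2): the match succeeds...
result = match(("X", "X"), (1, 2))
print(result)
# ...but the deferred X0 = X1 constraint is inconsistent (1 != 2).
ok = solve(*result)
print(ok)
```

The match itself succeeds with distinct bindings, and the failure only surfaces in the second step, which is exactly the separation described above.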

I've not done much logic programming, but I've encountered this two-step approach as a way to implement co-inductive logic programming, which lets us handle infinite datastructures like streams.

Looks very similar to Erlang. The devil is in the details, for sure, but it's quite easy to read when you apply an Erlang line of thought to it.

Joe Armstrong and Robert Virding actually experimented with compiling Erlang to Strand. I'm not familiar with all of the details, but I believe they saw a factor of six speedup as compared to the Prolog implementation [0], but deemed the project a failure because of the complexity involved in restricting Strand's parallelism and failure to meet their target of a 70x speedup [1].

I'm actually sharing this in the first place because I managed to acquire a copy of "Strand: New Concepts in Parallel Programming" [2] yesterday, and it includes a case study about the Erlang -> Strand compiler, so I've been having fun trying to piece together the lineage.

[0] https://erlang.org/download/armstrong_thesis_2003.pdf

[1] http://erlang.org/pipermail/erlang-questions/2007-September/...

[2] https://www.amazon.com/Strand-New-Concepts-Parallel-Programm...

Erlang was originally built from Prolog, I believe.

Yup! The original Prolog implementation is described in "Use of Prolog for developing a new programming language" [0], and took place in 1986. After performance eventually became an issue, they cross-compiled Erlang to a few other concurrent languages (Strand included), before opting to directly implement Erlang in C.

[0] http://citeseerx.ist.psu.edu/viewdoc/summary?doi=

From the same author, and in the same family of languages, is FLENG: http://www.call-with-current-continuation.org/fleng/fleng.ht...

It lacks Strand's distributed features but includes Prolog's unification.

I wasn't involved, but I remember considerable interest in Strand from parallel programming people in the lab about the time it appeared. I don't think that lasted long, but unfortunately I don't remember why it failed (if I knew then).

SISAL was probably a better bet as a dataflow-ish language, but I don't remember anyone else looking at it, despite Manchester being "just" down the road.

No idea if related in any meaningful way, but https://github.com/rust-lang/chalk/tree/master/chalk-engine/... (a "PROLOG-like logic solver" for the Rust trait system) also makes use of "strands"

Oh thank you for sharing! I've never come across this, but there's some great references in the README I'll have to go through.

A good introduction to concurrent logic languages:

G. A. Ringwood, "Parlog86 and the dining logicians", Communications of the ACM, January 1988. https://doi.org/10.1145/35043.35044


I find this fascinating, if a bit incomprehensible, but what might an intended use case be?

A nice property of logic programming is that its semantics don't depend on "time": it just turns data into data (unlike "imperative" programming, like C, Java, etc. which rely on state changing over time). This is why logic programming is sometimes called "declarative".

Whilst concurrency and parallelism are a bit of a nightmare in imperative languages (e.g. multithreading), they turn out to be trivial in such "declarative" languages; in fact the difficulty is usually in reducing parallelism, to keep down the context-switching overheads.

Traditional logic programming languages like Prolog work in a single-threaded, depth-first manner. To execute something like `main` we look at its definition, which might have multiple "clauses" depending on the input data (a bit like doing a `switch` or `cond` to figure out what to return). Prolog will see if the first clause applies, and if so it will start calculating the corresponding definition, which will probably involve some mixture of calls to other functions; it will execute those functions in a similar way.

However, it will also "remember where it got up to" during the execution of `main`; if the definition fails (or if we've asked for more than one result!), it will "backtrack" and try the next clause. This is sort of like having a try/catch in each switch clause, and moving on to the next clause if there's an exception.
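Python generators give a cheap way to mimic that depth-first, resumable search (purely illustrative: the family facts and predicate names below are made up, and there's no real unification, just matching on ground terms).

```python
# Generator-based sketch of Prolog-style depth-first search with
# backtracking. Illustrative only; real Prolog unifies terms.

# parent(tom, bob). parent(tom, liz). parent(bob, ann). parent(bob, pat).
FACTS = [("tom", "bob"), ("tom", "liz"), ("bob", "ann"), ("bob", "pat")]

def children(person):
    """Each matching fact acts like one 'clause'; asking for the next
    solution resumes the generator, i.e. backtracks into it."""
    for parent, child in FACTS:
        if parent == person:
            yield child

def grandchildren(person):
    # grandchild(C, G) :- parent(G, P), parent(P, C).
    for p in children(person):      # try a binding for P...
        for c in children(p):       # ...then for C; exhausting this
            yield c                 # inner loop backtracks into the outer

first = next(grandchildren("tom"))  # first solution only
print(first)
# Asking for every result forces backtracking through all clauses;
# the dead end via liz (no children) fails silently along the way.
print(list(grandchildren("tom")))
```

The nested generators capture the "remember where it got up to" behaviour: each `yield` is a choice point that can be resumed later.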

Strand seems similar, but rather than trying one clause at a time, depth-first, it instead runs all of the clauses concurrently. This doesn't need to "remember" anything or backtrack: if a particular clause fails then its thread dies, leaving the others (if any) to keep going.

It seems like a remarkably simple way to achieve massive parallelism, especially for those already familiar with logic programming.

From the User's Manual:


    三個和尚沒水喝 (Chinese Proverb)

It is a "mojibake" of 三個和尚沒水喝[1].

[1] https://en.wikipedia.org/wiki/Three_Monks
