
Ask HN: Which people and groups are researching new approaches to programming? - maryrosecook
Which people and groups are researching new approaches to programming? I'm interested in groups who are thinking about the next thirty years, rather than the next five years.

Some projects I find interesting:

* Subtext by Jonathan Edwards: https://vimeo.com/106073134

* Apparatus by Toby Schachman: http://aprt.us

* Bret Victor's explorable explanation and introspectable programming demos: http://worrydream.com

* Eve by Kodowa: https://www.youtube.com/watch?v=VZQoAKJPbh8

* Scratch by the MIT Media Lab: https://scratch.mit.edu

* Mathematica by Wolfram Research: https://www.wolfram.com/mathematica

Why am I interested? I'm working on Code Lauren, a game programming environment for beginners: http://codelauren.com I want to learn as much as possible about work done by others in the same area.
======
GregBuchholz
I think you'd probably be interested in Viewpoints Research (and Alan Kay):

[http://vpri.org/index.html](http://vpri.org/index.html)

...and STEPS:

[http://www.vpri.org/pdf/tr2011004_steps11.pdf](http://www.vpri.org/pdf/tr2011004_steps11.pdf)

"We set a limit of 20,000 lines of code to express all of the “runnable
meaning” of personal computing (“from the end‑user down to the metal”) where
“runnable meaning” means that the system will run with just this code (but
could have added optimizations to make it run faster). One measure will be
what did get accomplished by the end of the project with the 20,000 lines
budget. Another measure will be typical lines of code ratios compared to
existing systems. We aim for large factors of 100, 1000, and more. How
understandable is it? Are the designs and their code clear as well as small?
Can the system be used as a live example of how to do this art? Is it clear
enough to evoke other, better, approaches?"

~~~
lebek
Alan Kay is also running the Communications Design Group:
[https://github.com/cdglabs](https://github.com/cdglabs)

~~~
msutherl
Just to clarify, Alan is only 'running CDG' insofar as he is supporting and
representing it as a sister lab to VPRI. The various research groups there are
completely autonomous and as far as I can tell not publicly identified.

~~~
leoc
A lot of the news coverage is claiming that he had a role in recruiting
specific people to CDG, and even names some of them (eg.
[http://www.bloomberg.com/news/articles/2015-01-29/sap-looks-...](http://www.bloomberg.com/news/articles/2015-01-29/sap-looks-to-xerox-for-r-d-inspiration-builds-idea-lab)), which would be more than just
lending his support even if he's not really running the place. I certainly
can't confirm if that's accurate though.

------
rwosync
Like most things, the kernel of tomorrow's ideas is already here. On the scale
of the next five years, these ideas will give rise to what the future of
programming will look like:

* Refinement types

Liquid Haskell: [https://ucsd-progsys.github.io/liquidhaskell-tutorial/02-log...](https://ucsd-progsys.github.io/liquidhaskell-tutorial/02-logic.html#/semantics)

* SMT Solver Language Integration

Cryptol:
[https://github.com/GaloisInc/cryptol](https://github.com/GaloisInc/cryptol)

* Session Types

Scribble: [http://www.scribble.org/](http://www.scribble.org/)

* Dependent Types

Agda:
[https://en.wikipedia.org/wiki/Agda_(programming_language)](https://en.wikipedia.org/wiki/Agda_\(programming_language\))

Idris: [http://www.idris-lang.org/](http://www.idris-lang.org/)

* Effect typing

Koka: [https://research.microsoft.com/en-us/um/people/daan/madoko/d...](https://research.microsoft.com/en-us/um/people/daan/madoko/doc/koka-effects-2014.html)

* Formal verification

Coq:
[https://www.cis.upenn.edu/~bcpierce/sf/current/index.html](https://www.cis.upenn.edu/~bcpierce/sf/current/index.html)

TLA+: [http://research.microsoft.com/en-us/um/people/lamport/tla/tl...](http://research.microsoft.com/en-us/um/people/lamport/tla/tla.html)

This is the general trend: more composable abstractions, and smarter compilers and languages that can reason about more of our programs for us.
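As a rough illustration of the refinement-types idea, here's a runtime analog in Python. Real systems like Liquid Haskell discharge these checks statically with an SMT solver, so no check ever runs; this sketch (all names invented) only conveys the shape of the idea at run time:

```python
# Runtime analog of a refinement type: a plain int, "refined" by a predicate.
# A real refinement-type checker would prove these conditions at compile time;
# here the predicate is simply enforced at the value's boundary.

def refine(predicate, description):
    """Return a checker admitting only values that satisfy `predicate`."""
    def check(value):
        if not predicate(value):
            raise ValueError(f"{value!r} is not a {description}")
        return value
    return check

NonNegInt = refine(lambda n: isinstance(n, int) and n >= 0, "non-negative int")

def safe_sqrt_floor(n):
    n = NonNegInt(n)          # the "type" is enforced at the boundary
    return int(n ** 0.5)

print(safe_sqrt_floor(16))    # 4
```

The point of the static versions is that `safe_sqrt_floor(-1)` would be rejected before the program ever runs.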

------
dentrado
Some more projects:

* Unison by Paul Chiusano: [http://unisonweb.org/2015-05-07/about.html](http://unisonweb.org/2015-05-07/about.html)

* APX by Sean McDirmid: [https://www.youtube.com/watch?v=YLrdhFEAiqo](https://www.youtube.com/watch?v=YLrdhFEAiqo)

* Awelon by David Barbour: [https://github.com/dmbarbour/awelon/blob/master/AwelonProjec...](https://github.com/dmbarbour/awelon/blob/master/AwelonProject.md)

And perhaps more in the "next five years" category:

* Om Next by David Nolen: [https://www.youtube.com/watch?v=MDZpSIngwm4](https://www.youtube.com/watch?v=MDZpSIngwm4)

* Elm by Evan Czaplicki: [http://elm-lang.org/](http://elm-lang.org/)

------
DonaldFisk
Me.

I'm working on Full Metal Jacket, a strongly-typed, visual, pure dataflow
language
([http://web.onetel.com/~hibou/fmj/FMJ.html](http://web.onetel.com/~hibou/fmj/FMJ.html))
with its own IDE.

Things have advanced a fair bit since I wrote those pages and published the recent paper, so I'll add to the tutorials very soon and announce this on Hacker News. Type definitions, macros, and a few other things have been added to the language.

A .303 shared-source release is approaching, but I don't do deadlines.

No battle plan survives contact with the enemy, but I have some ideas for future directions, including adding dependent types, running on a multi-core machine with speculative execution, and automatic programming (i.e. the user supplies just inputs and outputs). Very long-term ideas involve developing a variant of the language which enables programs to run backwards, to enable execution on a gated quantum computer.

~~~
kecks
Wow, that's oddly similar to an idea I've been playing with.

Who do you see as the core users of these tools?

~~~
mej10
A lot of people creating multimedia works use visual programming languages
that have dataflow models.

Some that are fairly mature projects:

vvvv - [https://vvvv.org/](https://vvvv.org/)

Puredata - [https://puredata.info/](https://puredata.info/)

Max MSP - [https://cycling74.com/products/max/](https://cycling74.com/products/max/)

~~~
qzxvwt
Am I wrong to think about the "graphical" aspect of visual programming
languages as another layer of abstraction? If this is the case, doesn't that
take some power away from these users, in the sense that their capabilities
are limited to the abstractions/interface given to them by the language's
developers?

Not exactly trying to question whether it's a good/bad thing (as I'm aware
that for many creators this might be exactly what they need), just trying to
understand.

~~~
mej10
I have done some programming in these and they are definitely more suited to
the applications that their creators had in mind.

I wouldn't want to do general purpose programming in them -- at least as they
exist currently.

~~~
DonaldFisk
Full Metal Jacket is general-purpose. Doing general-purpose programming in it has, however, required me to think very differently. At present, it's
still easier for me to program in Lisp, but that's mostly down to being a very
experienced Lisp programmer and a rookie FMJ programmer. Initial difficulties
are inevitable when you learn a new programming language different from the
languages you already know. I had similar issues with Prolog but mastered
that.

Different general-purpose languages are better adapted to different
programming tasks, however, and Full Metal Jacket may in time find its niche.
I wouldn't write a neural network in Prolog, or an expert system in C, for
example.

------
rntz
Sean McDirmid's work on Glitch is an interesting (and distinctly contra- the
current "FP all the things!" zeitgeist) approach to live programming:
[http://research.microsoft.com/en-us/people/smcdirm/](http://research.microsoft.com/en-us/people/smcdirm/)

Conal Elliott's work on Tangible FP was an interesting attempt to unify
functional and "visual" programming that has been mostly abandoned:
[http://conal.net/papers/Eros/](http://conal.net/papers/Eros/) Hopefully some
of its ideas may yet survive in other projects.

The Berkeley Orders of Magnitude project is somewhere at the intersection of
database and PL research, aimed at handling orders of magnitude more data with
orders of magnitude less code:
[http://boom.cs.berkeley.edu/](http://boom.cs.berkeley.edu/) The Dedalus
language in particular is interesting, as it integrates distributed- and
logic-programming:
[http://www.eecs.berkeley.edu/Pubs/TechRpts/2009/EECS-2009-17...](http://www.eecs.berkeley.edu/Pubs/TechRpts/2009/EECS-2009-173.html)

Joe Armstrong's thoughts on a module- or namespace-less programming
environment are interesting: [http://erlang.org/pipermail/erlang-questions/2011-May/058768...](http://erlang.org/pipermail/erlang-questions/2011-May/058768.html)

I've been meaning to write a blog post about the convergence of various ideas
of what the future of programming might look like for a while now, so I have a
bunch of notes on this topic. The OP & other folks have already mentioned most
of the other projects in my notes - in particular Unison, Subtext, Eve, & Bret
Victor's work.

My current line of work is on tackling a tiny little corner of what I see as
the future's problems - trying to find a better way to combine
database/relational programming and functional programming. My work is here
(but the docs are almost entirely in type-theory-jargon at the moment, sorry!
feel free to shoot me an email if you have questions):
[https://github.com/rntz/datafun](https://github.com/rntz/datafun)

~~~
gnocchi
Thanks for Datafun! Where can I start to learn more about the syntax you are using in the README.md?

~~~
rntz
Hm, which bits of syntax are you confused by in particular?

At the beginning, when defining what types, expressions, contexts, etc. look like, I use a bunch of BNF ([https://en.wikipedia.org/wiki/Backus%E2%80%93Naur_Form](https://en.wikipedia.org/wiki/Backus%E2%80%93Naur_Form)). I'm afraid I don't actually know a good introduction to BNF.
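For the flavor of it, though, here's a tiny made-up BNF grammar for arithmetic expressions (illustrative only, nothing to do with Datafun's actual grammar):

```
<expr>   ::= <term> | <expr> "+" <term>
<term>   ::= <factor> | <term> "*" <factor>
<factor> ::= <number> | "(" <expr> ")"
```

Each rule reads "an <expr> is either a <term>, or an <expr> followed by '+' and a <term>"; the recursion is what lets a few rules describe arbitrarily nested structure.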

The clearest account I know of the most critical part of my notation, namely _inference rules_ (those things with the big horizontal bars), is in
Frank Pfenning's notes for his course on Constructive Logic:
[http://www.cs.cmu.edu/~fp/courses/15317-f09/lectures/02-natd...](http://www.cs.cmu.edu/~fp/courses/15317-f09/lectures/02-natded.pdf)
(the full course of notes is at
[http://www.cs.cmu.edu/~fp/courses/15317-f09/schedule.html](http://www.cs.cmu.edu/~fp/courses/15317-f09/schedule.html),
but the one I've linked is most relevant)

A lot of the rest of my notation (types like A × B, A → B, expressions like
λx.e, the judgment Δ;Γ ⊢ e : A meaning "e has type A in context Δ;Γ", and the
inference rules themselves) is borrowed from fairly standard type-theory-of-
functional-languages stuff. Standard books to read here are Types and
Programming Languages by Benjamin Pierce
([https://www.cis.upenn.edu/~bcpierce/tapl/](https://www.cis.upenn.edu/~bcpierce/tapl/));
or Practical Foundations for Programming Languages by Bob Harper
([https://www.cs.cmu.edu/~rwh/plbook/book.pdf](https://www.cs.cmu.edu/~rwh/plbook/book.pdf)).
Those are pretty heavy books that cover a lot of ground, including stuff
that's mostly irrelevant to my work. If you're interested, though, they're a
great place to start learning about formal PL theory!

Was there anything else in particular you wanted to know about?

~~~
gnocchi
Thank you, this is perfect to get started. I've been programming for a long time, and on my own I couldn't extract enough meaning.

I'd heard about BNF notation, with the recent news of Peter Naur's death and before, but never took the time to get my head around it.

I think your description of Datafun and the references you gave me are enough to help me learn something new and understand your project.

------
tolmasky
We're working on [http://tonicdev.com](http://tonicdev.com). You can read a bit about it here: [http://blog.tonicdev.com/2015/09/10/time-traveling-in-node.j...](http://blog.tonicdev.com/2015/09/10/time-traveling-in-node.js-notebooks.html) or just try it yourself.

Some guiding principles:

1. So much of what hinders programmers is that the friction of using existing solutions is so high that they choose to redo the work themselves. In Tonic we've made every package for JavaScript immediately available (over 200,000), so that you can focus on putting existing pieces together. We keep adding them as they come in, as well as making notebooks accessible to each other. It's like having access to the global library.

2. There shouldn't be any snippet of code online that isn't runnable. Every example you read on the internet should be immediately runnable (regardless of whether it requires binary packages or what have you). The difference in understanding when you can tweak an example and see what happens, versus just cursorily reading over it and thinking you get it, is huge. Tonic embeds allow this: [https://tonicdev.com/docs/embed](https://tonicdev.com/docs/embed)

~~~
mej10
How are you handling sandboxing and other security/abuse related things?

------
ping010pong
[http://urbit.org/docs/theory/whitepaper](http://urbit.org/docs/theory/whitepaper)

~~~
chc4
Seconding Urbit, since I'm a giant fanboy. It's not so much a new programming language as a new computer, internet, everything. And it's a lot of fun to play with - Hoon isn't nearly as difficult as it looks :)

As insane as a lot of it seems, most of it is just the logical conclusion of a few base ideas: what would computers look like if you owned your own data, if it was all built on top of a purely functional VM backed by bignum binary trees, and if you could replace all the mistakes made over the years (nonversioned file systems, untyped shells, manually serializing data structures)?

------
_mhr_
My primary concern is with personal knowledge bases [1]. This intersects with
new approaches to programming, because programs and algorithms are an aspect
of and a way of interacting with knowledge.

It's mostly conceptual right now, but my idea is to represent knowledge as a
hypergraph with spatial and temporal dimensions on both edges and vertices.
This, I hope, could represent every kind of knowledge I can imagine. The
hypergraph would function as a filesystem and database, and you could
query/program the system. It would be Emacs for all media, and not just text.
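As a toy sketch of the data structure (invented names; a real system would need much richer spatial/temporal dimensions than the single timestamp here):

```python
# Toy hypergraph: an edge links any number of vertices, and both vertices
# and edges carry attributes (a timestamp stands in for the temporal
# dimension the idea calls for).

class Hypergraph:
    def __init__(self):
        self.vertices = {}   # id -> attributes
        self.edges = []      # each edge: (set of vertex ids, attributes)

    def add_vertex(self, vid, **attrs):
        self.vertices[vid] = attrs

    def add_edge(self, vids, **attrs):
        self.edges.append((set(vids), attrs))

    def edges_containing(self, vid):
        """Query: every edge that touches a given vertex."""
        return [attrs for vids, attrs in self.edges if vid in vids]

kb = Hypergraph()
kb.add_vertex("note1", kind="text")
kb.add_vertex("photo1", kind="image")
kb.add_vertex("song1", kind="audio")
kb.add_edge({"note1", "photo1", "song1"}, label="trip-2015", time="2015-06")
print(kb.edges_containing("photo1"))
```

The "query/program the system" part would then be a language over exactly these kinds of traversals.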

I want to augment the inherent power of the hypergraph with a mashup of the
OpenEndedGroup's Field, Xerox PARC's Smalltalk, Doug Engelbart's NLS,
Symbolics' Open Genera, org-mode, Vim, the Wolfram Language, Bell Labs' Plan
9, and Ted Nelson's ZigZag and Xanadu projects.

If anyone finds this interesting and wants to chat about this stuff, please
email me at the email address in my profile.

[1]
[https://en.wikipedia.org/wiki/Personal_knowledge_base](https://en.wikipedia.org/wiki/Personal_knowledge_base)

------
mamcx
I'm working on a relational language, like [http://www.try-alf.org/blog/2013-10-21-relations-as-first-cl...](http://www.try-alf.org/blog/2013-10-21-relations-as-first-class-citizen).

However, I'm a noob at building a language ;)

I have learned a lot of stuff. For example, columnar stores provide some nice opportunities for managing data in-memory and compressing values in columns.

The sorting, joining, and selecting on multiple arrays that are a necessity for OLAP queries translate well to the needs of normal programming.

SQL is an overcomplication and a bad "programming" language. Unfortunately, it's the only practical way to interface with most databases.

If my plan works, this could be a good way to build a relational store that lets you say: the name is a string, and I need to search on it. A normal engine would need to store it again in an index; I will say instead that the name column is the index, and the values are stored once.

Or something like that.

BTW: A relation has some symmetry with immutable functions, but I still don't know how to exploit this fact.
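A toy Python sketch of the "column is the index" idea (invented names, purely illustrative): the column is kept sorted alongside its row ids, so lookups are binary searches with no separate B-tree copy of the values.

```python
# One sorted copy of the column doubles as the index: no second structure
# duplicating the values, just the values plus their original row ids.
import bisect

class Column:
    def __init__(self, values):
        pairs = sorted((v, i) for i, v in enumerate(values))
        self.sorted_values = [v for v, _ in pairs]
        self.row_ids = [i for _, i in pairs]

    def find(self, value):
        """Return the row ids whose value equals `value` (binary search)."""
        lo = bisect.bisect_left(self.sorted_values, value)
        hi = bisect.bisect_right(self.sorted_values, value)
        return self.row_ids[lo:hi]

names = Column(["ana", "bob", "ana", "eve"])
print(names.find("ana"))   # [0, 2]
```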

~~~
mej10
Also check out Datalog.

Datomic is a database that uses it as its query language -- it is really nice to work with.
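For a taste of the style, a Datomic-flavored Datalog query looks roughly like this (illustrative only; the attribute names are made up):

```
;; Names of everyone who likes pizza.
[:find ?name
 :where [?e :person/name ?name]
        [?e :person/likes "pizza"]]
```

You state what must hold of the result rather than how to compute it; the engine picks the execution strategy.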

~~~
mamcx
Datalog doesn't "click" for me. It's weird to understand, and yet another uncommon programming model, and I think the idea of building a relational language already exhausts the limit of paradigm switching for my potential users. However, I wonder if it could be used as an internal engine, providing an efficient, easier backend for all the relational stuff?

~~~
OJFord
It's almost 'free' if you know even just basic Prolog.

I learnt it before Prolog, and personally really like it, much more than I do
SQL. It's a slightly different mindset - "what is a result" rather than "how
do I go about finding a result" - but one that I found more intuitive in an
Intro to DB course.

------
solomatov
JetBrains MPS:
[https://www.jetbrains.com/mps/](https://www.jetbrains.com/mps/)

------
PeCaN
I'm building a language for learning to program on smartphones. It's a stack-
based language designed for interactive editing (since typing code on a
smartphone is no fun).

It's in the very early stages at the moment.

[https://github.com/alpha123/vio](https://github.com/alpha123/vio)

------
jamii
Microsoft have done some great and totally ignored work with embedding visual languages into game engines for kids - [http://research.microsoft.com/en-us/projects/kodu/](http://research.microsoft.com/en-us/projects/kodu/) and [http://welcome.projectspark.com/](http://welcome.projectspark.com/)

Wrt Code Lauren, you may also be interested in [http://www.cs.cmu.edu/~NatProg/index.html](http://www.cs.cmu.edu/~NatProg/index.html) - an HCI approach to tooling (the WhyLine is my favourite).

Dog hasn't released much info yet, but it's an interesting concept - [http://dl.acm.org/citation.cfm?id=2502026](http://dl.acm.org/citation.cfm?id=2502026) [https://www.media.mit.edu/research/groups/social-computing](https://www.media.mit.edu/research/groups/social-computing)

Program synthesis is a really interesting area of research, e.g. [http://research.microsoft.com/en-us/um/people/sumitg/](http://research.microsoft.com/en-us/um/people/sumitg/)

VPRI has already been mentioned but I'd like to highlight their work on
Programming as Planning -
[http://www.vpri.org/pdf/m2009001_prog_as.pdf](http://www.vpri.org/pdf/m2009001_prog_as.pdf)

From the last FPW, there were a couple of projects that really stood out for
me:
[http://probcomp.csail.mit.edu/bayesdb/](http://probcomp.csail.mit.edu/bayesdb/)
[http://2015.splashcon.org/event/fpw2015-an-end-user-programm...](http://2015.splashcon.org/event/fpw2015-an-end-user-programming-environment-that-s-cell-based-copy-paste-friendly-with-a-flat-and-forward-execution-model)

Also not really a new language, but Sam Aaron's work on Sonic Pi is pedagogically interesting - [http://sonic-pi.net/](http://sonic-pi.net/)

------
zyxzevn
Unseen

A Functional and Logical Dataflow language.

[http://www.reddit.com/r/unseen_programming](http://www.reddit.com/r/unseen_programming)

Status: under development.

Functions are components, which can be used recursively. Arrows are logical
relations between these functions. Inspired by Scala and VHDL.

The logic and flow deal with the control and time aspect.

Testing and commenting are integrated into the graphical system as different layers. All graphical structures can be converted to (reasonably) simple text resembling Scala.

------
pklausler
Looking at hardware trends on servers, in the next ten years I expect more
pure functional programming on the CPU (implicitly concurrent, write-once data
structures) and more data-parallel array operations being offloaded to GPUs.
The CPU language of 2025 is probably something very much like Haskell with a
cleaned-up base library, but I'm not sure what the GPU component will look
like to the programmer.

~~~
chc4
It probably looks like APL. Data-oriented programming, with the only type being vectors/streams of data. Ideal parallelization across however many million CUDA cores, just by distributing the working set into partitions.

Really, it's how GLSL and other shader languages work already, just hidden behind C-like syntax and lies.
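The flavor of that style, sketched with a toy vector type in Python (illustrative only; a GPU backend would map each whole-vector operation onto a kernel launch):

```python
# A toy "array language": every operation works on whole vectors at once,
# so each step is trivially data-parallel -- no explicit element loop.
import math

class Vec:
    def __init__(self, data): self.data = list(data)
    def __mul__(self, k):  return Vec(x * k for x in self.data)
    def __add__(self, k):  return Vec(x + k for x in self.data)
    def sqrt(self):        return Vec(math.sqrt(x) for x in self.data)

xs = Vec(range(4))
ys = xs.sqrt() * 2.0 + 1.0   # whole-vector pipeline, shader-style
print(ys.data)
```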

------
westoncb
For a meditation on which aspects of our text-based programming tools derive from merits of the medium versus historical accident, see: [http://westoncb.blogspot.com/2015/06/why-programming-languag...](http://westoncb.blogspot.com/2015/06/why-programming-languages-dont-have.html) - there's also a second part (linked to within) that describes an alternate, general-purpose architecture for programming tools that lets us stay powerfully text-centric while moving away from operating on character sequences under the hood.

I wrote this non-traditional program editor:
[https://youtu.be/tztmgCcZaM4?t=1m32s](https://youtu.be/tztmgCcZaM4?t=1m32s)

And a new kind of visual debugger:
[https://www.youtube.com/watch?v=NvfMthDInwE](https://www.youtube.com/watch?v=NvfMthDInwE)

------
eecks
> * Eve by Kodowa:
> [https://www.youtube.com/watch?v=VZQoAKJPbh8](https://www.youtube.com/watch?v=VZQoAKJPbh8)

This guy seems kinda young. Does anyone think that the future of programming can come from people without extensive experience of current programming?

------
sarmad
I am. I am working on a new programming language designed for the next generations. The language re-imagines the role of the compiler: from a monolithic static black box that converts source to executable, into an open dynamic system that manages the language syntax and compilation process but leaves the final syntax and actual compilation to hot-pluggable libraries. Why? To make the language dynamically upgradable, on a per-project basis. This is the only way to make a language future-proof. More details: [http://alusus.net](http://alusus.net) [http://alusus.net/overview](http://alusus.net/overview)

P.S. The project is still at a very early stage, but a proof of concept is there.
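To caricature the pluggable-backend idea in a few lines of Python (all names invented here; this is not Alusus's actual design):

```python
# Toy "open compiler": the core only dispatches; what each construct
# compiles *to* is supplied by pluggable backend libraries.

backends = {}

def backend(keyword):
    """Register a code generator for one syntactic construct."""
    def register(fn):
        backends[keyword] = fn
        return fn
    return register

@backend("loop")
def gen_loop(arg):
    return f"for (int i = 0; i < {arg}; i++) {{ ... }}"   # e.g. emit C

@backend("kernel")
def gen_kernel(arg):
    return f"__global__ void k() {{ /* {arg} */ }}"       # or emit GPU code

def compile_line(line):
    keyword, _, arg = line.partition(" ")
    return backends[keyword](arg)     # backends are hot-swappable at runtime

print(compile_line("loop 10"))
```

A new target (a DB stored procedure, a quantum backend) would just be another registered generator, with no change to the core.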

~~~
teraflop
I read the links; your language looks interesting, but could you expand a bit
on what makes it a _new_ approach? User-programmable syntax is an old idea;
for instance, it's been a core feature of Scheme for 30+ years.

~~~
sarmad
My language is not only about user-defined syntax; it's also about user-defined code generation. For example, you can define your own syntax that compiles directly into a DB stored procedure, or into a GPU shader, or into whatever technology might be invented in the future, like, for example, quantum computing. In fact, it's less about user-defined syntax than it is about user-defined code generation, as I am trying to limit new syntax to certain patterns rather than leave it in the wild.

~~~
gnaritas
That's what macros are in Lisp and Scheme.

~~~
sarmad
Does scheme give you access to the compiler's internal data structures? Can
you for example write an extension that scans through the compiled code
looking for, say, for loops and replacing them with something else?

~~~
gnaritas
Macro expansion happens just before compilation; you have the chance to alter
the source code as it's being read by the reader just before compilation.

But I see what you're getting at; you mean global modifications to the compiler, to interpret the whole language differently. A macro allows you to make new syntax, but it doesn't change the meaning of other syntax elsewhere in the program; it only allows the programmer to add syntax to the language.

------
zitterbewegung
Everyone doing probabilistic programming. Can't really give a summary.

[https://en.wikipedia.org/wiki/Probabilistic_programming_lang...](https://en.wikipedia.org/wiki/Probabilistic_programming_language)

Formal verification using ACL2 or Coq or other tools also.

~~~
jakeogh
[https://news.ycombinator.com/item?id=9368443](https://news.ycombinator.com/item?id=9368443)

------
marknadal
I think "visual" programming is the incorrect approach; something I would like to see more of is "tactile-spatial" programming. Does anybody have an example of these? Most work I've seen is visual/flowchart, which is not optimal for touch devices or large projects.

~~~
hendekagon
This is what I'd like to see too. Let's get away from tiny motions and sitting
in a chair and make tools more like traditional wood/metalworking tools for
programming: benches with robotic appendages that we can manipulate ASTs with.
Let's use our kinesthetic and touch senses more.

------
xjay
Like many people, I think about this now and then, but I haven't done any work
in that area. Ultimately, I picture a live environment, but not like Blueprint
in UE4.

I expect we'll end up with a predominantly interactive approach to
programming, most likely visual drag-and-drop style programming, in a live
environment that knows common patterns, data structures, algorithms; giving
real-time advice, showing native code output as you go, etc. Basically, you're
molding the system while monitoring it under various contexts, and you're
programming against data models.

Got a new Arduino board? Just drag and drop the data sheet model into your
environment. It contains memory address information, read/write contexts,
access protocols, contexts, etc for every component, and how they're
connected. Now you design the rest of the logic.

"A programming language is a user interface as much as it is any other thing.
Why have multiple ones? They are all Turing equivalent."—Alan C. Kay (see the
talk for the context of the quote; starts at around 23:25)

"Rethinking CS Education | Alan Kay, CrossRoads 2015."
[https://www.youtube.com/watch?v=N9c7_8Gp7gI](https://www.youtube.com/watch?v=N9c7_8Gp7gI)

"Most computer people went into computing because they were disturbed by other
human beings."—Alan C. Kay

------
chenglou
Lamdu: [http://www.lamdu.org](http://www.lamdu.org)

------
nicklovescode
I'm working on a new approach to programming that is in the direction of Bret Victor's Inventing on Principle (focusing on the environment; all open source).

Because we're prelaunch (and working hard on getting it to all of you!) I can't talk about it much yet, but we're funded and looking to work with great people.

Email me at cammarata.nick@gmail.com if you're interested in this space, would
love to hear ideas and talk more.

------
david927
Kayia was presented at the Future of Programming at Strange Loop a little over
a year ago.

[http://kayia.org](http://kayia.org)

------
drostie
Viewpoints Research Institute, see e.g.
[http://vpri.org/html/writings.php](http://vpri.org/html/writings.php) which
has papers like "Checks and Balances - Constraint Solving without Surprises in
Object-Constraint Programming Languages" and Alessandro Warth's "Experimenting
With Programming Languages" (which led to OMeta/JS, which I think is on GitHub), as well as a ton of Alan Kay talks on fundamental new computing technologies: [http://vpri.org/html/words_links/links_ifnct.htm](http://vpri.org/html/words_links/links_ifnct.htm).

The way that Datomic uses Datalog is really interesting from a perspective of
"new approach to programming" (databases).

Erik Demaine's course on advanced data structures gives some interesting ideas for time-travel-based games: [https://courses.csail.mit.edu/6.851/spring14/](https://courses.csail.mit.edu/6.851/spring14/). His work also has applications to other fields, like creating an efficient in-app version control system: [http://www.cs.utexas.edu/~ecprice/papers/confluent_swat.pdf](http://www.cs.utexas.edu/~ecprice/papers/confluent_swat.pdf)

Lots of cool stuff on HaskellWiki; for example [https://wiki.haskell.org/Functional_Reactive_Programming](https://wiki.haskell.org/Functional_Reactive_Programming).

If you really want to jump into the deep end, there's a whole blog called Lambda the Ultimate about new approaches to programming: [http://lambda-the-ultimate.org/](http://lambda-the-ultimate.org/)

------
lmeyerov
Program synthesis: see work by Ras Bodik's team at Berkeley (now UW) and
descendants in Armando's MIT team and Sumit Gulwani's MSR team.

As concrete examples making industry waves, see Excel's new Flash Fill and Trifacta's ETL wrangling product.

Underneath, these use search (ML, SMT, ...) to allow non-traditional and sloppy coding: fill in the blanks, programming by demonstration, etc.

~~~
whaaswijk
For more on program synthesis and verification see also the Leon system by
Viktor Kuncak et al.[1]

[1] [http://lara.epfl.ch/w/](http://lara.epfl.ch/w/)

------
cycomachead
From Scratch you should check out Snap! at snap.berkeley.edu

John Maloney (co-inventor of Scratch) and Jens Moenig (who was on the Scratch Team and develops Snap!), along with Yoshiki Oshima (who may also have been on the Scratch team), are developing a new language, "GP" (for General Purpose), which is like "professional Scratch".

Here's a video of it: [https://www.youtube.com/watch?v=Cnvoz_5_YiI](https://www.youtube.com/watch?v=Cnvoz_5_YiI)

There are also lots of extensions to Scratch and Snap!, particularly around CS education, and I'd be happy to discuss those!

------
petra
STEPS by Alan Kay (building a full OS, including applications, in 20K LOC): [http://blog.regehr.org/archives/663](http://blog.regehr.org/archives/663)

Spiral [http://www.spiral.net/index.html](http://www.spiral.net/index.html) and SpiralGen - going straight from math to code optimized for multiple platforms: [http://www.spiralgen.com/](http://www.spiralgen.com/)

Automatically solving bugs using genetic algorithms: [http://dijkstra.cs.virginia.edu/genprog/](http://dijkstra.cs.virginia.edu/genprog/)

Automatic bug repair (2015): [http://news.mit.edu/2015/automatic-code-bug-repair-0629](http://news.mit.edu/2015/automatic-code-bug-repair-0629)

------
rl3
[https://treeline.io/](https://treeline.io/)

YC company run by Mike McNeil. Visual backend programming. Built off Node,
Sails and Machine Spec. The latter is particularly interesting:

[http://node-machine.org/](http://node-machine.org/)

~~~
mikermcneil
@rl3 Thanks for the mention :)

@maryrosecook I'm glad this post showed up in our Slack feed - really cool topic! As I'm sure you've noticed from the volume, variety, and impressiveness of the comments in this thread, improving approachability/usability for software
development is an exciting (albeit broad) effort, and there is a lot of
amazing work underway. In the area of declarative programming specifically,
there's a lot still to do, but as a community we're getting really close. That
said, there is far too much work to do here for any one team to do it by
themselves.

To be successful, or at least to expedite the process of democratizing
software development, I believe we need to establish a common set of semantics
for describing _what software is_. And the first step towards that is
deconstructing functions-- or perhaps more accurately: "subroutines".

The rest of the Sails.js core team and I have been working on this problem for
quite some time now, primarily as a reaction to having to build plugin system
after plugin system in Sails as the framework's user base grew, and getting
frustrated with every plugin system being a little bit different. After we had
our first working prototype of the tech that would become Treeline, we
extrapolated and open-sourced the machine specification and related utilities
as a way of exposing building blocks for other devs to reuse for their own
projects.

To add a little background, machines are stateless building blocks; nothing
fancy. They're more or less equivalent to asynchronous functions in
JavaScript, just highly specified. In particular, machines encapsulate
definitions of their inputs, their exits, and whether they are "cacheable",
just "idempotent", or neither (and therefore have side effects). Machines are
oftentimes implemented as JavaScript functions-- although part of the
project's philosophy is that what's important is the _usage_, not the
implementation.

From the perspective of a traditional "intro to programming", an example of a
machine is `moveTurtleDown()`:
[https://gist.github.com/mikermcneil/864b930a67b0d00d3d8b](https://gist.github.com/mikermcneil/864b930a67b0d00d3d8b)

As a more practical example, here's the documentation for a handful of
machines related to string manipulation: [http://node-machine.org/machinepack-
strings](http://node-machine.org/machinepack-strings) (To try any of them out,
just copy and paste the generated sample code into e.g. a Tonic sandbox:
[https://tonicdev.com/npm/machinepack-
strings](https://tonicdev.com/npm/machinepack-strings))

Each of a machine's expected inputs may be required or optional (with a
potential default value) and declare a type schema. Each of its exits may
optionally declare a return value, and if so declare a type schema for it. In
addition, machines have strict conventions for meta properties (e.g. the
machine `description`, which summarizes the machine's behavior when run, is
~80 characters or less of English, written in the imperative mood and in
sentence case with ending punctuation).

This can be used to generate documentation, code for interactive form
elements, or in the case of the default machine runner, runtime type
validations.
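
As a concrete (if simplified) illustration, here is a sketch of what a machine definition and runner might look like. It follows the general shape described above, but the `capitalize` machine and the toy runner are invented for the example; this is not the official `machine` runtime from NPM.

```javascript
// Sketch of a machine: a stateless building block with declared inputs and
// exits. The shape mirrors the description above; the runner below is a toy
// stand-in for illustration, not the official node-machine runtime.
const capitalize = {
  friendlyName: 'Capitalize',
  description: 'Capitalize the first letter of a string.',
  cacheable: true, // no side effects: same input, same output
  inputs: {
    string: { type: 'string', required: true }
  },
  exits: {
    success: { outputType: 'string' },
    empty: { description: 'The input string was empty.' }
  },
  fn: (inputs, exits) => {
    if (inputs.string.length === 0) return exits.empty();
    return exits.success(inputs.string[0].toUpperCase() + inputs.string.slice(1));
  }
};

// Toy runner: enforce the declared input schema at runtime, then invoke `fn`,
// routing the outcome through whichever exit handler the caller supplied.
function runMachine(machine, inputValues, exitHandlers) {
  for (const [name, def] of Object.entries(machine.inputs)) {
    if (def.required && !(name in inputValues)) {
      throw new Error('Missing required input: ' + name);
    }
    if (name in inputValues && def.type && typeof inputValues[name] !== def.type) {
      throw new Error('Input "' + name + '" must be a ' + def.type);
    }
  }
  const exits = {};
  for (const exitName of Object.keys(machine.exits)) {
    exits[exitName] = (output) => exitHandlers[exitName](output);
  }
  machine.fn(inputValues, exits);
}

runMachine(capitalize, { string: 'hello' }, {
  success: (out) => console.log(out), // prints "Hello"
  empty: () => console.log('(empty input)')
});
```

The point of all the ceremony is that everything a tool needs -- docs, forms, validation -- can be derived mechanically from the declaration, without reading `fn`.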

But most important of all, once a piece of code is written to the machine spec
and open-sourced, it can be used by any tool or platform which consumes that
spec (think USB).

As an example, in Treeline we use machines as draggable/droppable components
that you can compose into what we call a "circuit"; which can _itself_ be used
as the implementation of another "composite" machine with its own inputs and
exits (vs. an "analog" machine; i.e. a machine implemented with JavaScript
code). We are planning to publish the circuit spec and related tooling for
composite machines later this year. In the meantime, the analog machine spec
is stable and ready to use. Around 500 MIT-licensed machines are available
today on NPM, with generated documentation hosted on the Node Machine
registry.

Hopefully that was a helpful first look. If I can help explain further or if
you'd like to discuss the project roadmap, just hit me up on Twitter. It's
really awesome to see so much new interest and progress in this tech, and
particularly in the context of education. Excited to see where you go with
CodeLauren!

~~~
jononor
Just checked out node-machine.org and the JsDot presentation there. Many
parts of it have parallels to things I'm playing around with[1]. What are the
reasons for not going with established concepts for the machines? Like
contracts for specifying pre-conditions (input requirements) and post-
conditions? Or Promises for error vs. success scenarios?

1\.
[https://github.com/jonnor/projects/blob/master/introspectabl...](https://github.com/jonnor/projects/blob/master/introspectable-
computing/README.md#areas-of-research), maybe especially Agree
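
For what it's worth, the two error-handling models can be bridged mechanically: an exits-style interface (named outcomes as callbacks) collapses into a Promise's single resolve/reject pair, with the named failure modes preserved as a tag on the error. A sketch, using a hypothetical `findUser` machine-style function (not part of any real machinepack):

```javascript
// Sketch: bridging an "exits"-style interface (named outcomes as callbacks)
// to a Promise (one success path, one failure path). `findUser` is a
// hypothetical machine-style function, invented for this example.
function findUser(inputs, exits) {
  if (typeof inputs.id !== 'number') return exits.invalid('id must be a number');
  if (inputs.id === 42) return exits.success({ id: 42, name: 'Ada' });
  return exits.notFound();
}

// Adapter: "success" resolves; every other exit rejects with a tagged Error,
// so callers can still tell the named failure modes apart via `err.exit`.
function toPromise(machineFn, inputs, exitNames) {
  return new Promise((resolve, reject) => {
    const exits = {};
    for (const name of exitNames) {
      exits[name] = (output) => {
        if (name === 'success') return resolve(output);
        const err = new Error('Machine exited via "' + name + '"');
        err.exit = name;
        err.output = output;
        reject(err);
      };
    }
    machineFn(inputs, exits);
  });
}

toPromise(findUser, { id: 42 }, ['success', 'notFound', 'invalid'])
  .then((user) => console.log(user.name)) // prints "Ada"
  .catch((err) => console.log(err.exit));
```

The trade-off is visible in the adapter: Promises flatten every non-success exit into one rejection channel, which is exactly the distinction the machine spec tries to keep explicit.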

------
nickpsecurity
I'm doing several myself at an abstract level given lack of time or resources.
I feed what I learn in them to pro's for potentially seeing it built. Here's a
few:

1\. Modula-2-style systems language with macros, better typing, and typed
assembler to replace C with minimal hit on performance or ease of use.

2\. Relatively-simple, imperative language plus translators to equivalent
functionality in mainstream languages to leverage their static analysis or
testing tools if possible.

3\. Unified architecture to natively represent and integrate both imperative
(C or Modula style) and functional (Ocaml or Haskell style) languages.
Extensions for safety/security at hardware level for both.

4\. Hardware/software development system using Scheme or abstract, state
machines that can turn a high-level description into software, hardware, or
integrated combo of both.

5\. Defining TCB's for type systems, OS architectures, etc that minimize what
must be trusted (mathematically verified) to ensure invariants apply while
maximizing expressiveness. crash-safe.org and CertiKOS are more concrete,
useful, and professional versions of what I'm getting at with 5.

6\. Building on 5, a series of consistent ways of expressing one program that
capture sequential, concurrent, failure, integration, and covert channel
analysis in a clean way for formal analysis. As in, an all-in-one high
assurance toolkit that automates the common stuff a la static analysis or
reusable, composable proofs. And makes hard stuff easier for pro's.

7\. Occasional thoughts on automatic programming via heuristics, patterns,
rules, human-guided transforms, and so on. Got me started in advanced SW and
my mind occasionally goes back to possibilities for achieving it.

8\. A way to do any of that with the rapid iteration, safety, performance,
live updating, and debugging properties of Genera LISP machine. I mean, we
_still_ don't have all that in mainstream tooling? ;)

------
magicmouse
I am working on a new computer language called Beads, which is designed to
replace the current development stack for personal computers, mobile devices,
and the web. You can find out more about this project at e-dejong.com

The focus of my tool is creating graphical interactive software: iPhone and
Android apps, desktop apps, and things that run in the browser. The notation
is compact, readable, and straightforward. It has many innovative aspects: new
data structures, physical units (something FORTRAN just got after 30 years in
the in-basket), and deductive reasoning which dramatically reduces the length
of programs. It is not a design by graphics system, but a language. It isn't
that abstract, and is far more straightforward than Haskell or PROLOG. It is
not a LISP derivative.

------
philip142au
There are many different projects; however, I feel that Idris and Scala are
the places where innovation should happen. Although there are interesting
ideas in Subtext and the like, they should be implemented as libraries in
Idris or Scala to gain the maximum usage from developers.

------
reptation
Jordan Pollack's group is interesting (genetic algorithms, etc.):
[http://www.cs.brandeis.edu/~pollack/](http://www.cs.brandeis.edu/~pollack/)
The web page is hilariously archaic, as a bonus.

------
lebek
The Augmented Programming group:
[https://groups.google.com/forum/#!forum/augmented-
programmin...](https://groups.google.com/forum/#!forum/augmented-programming)

------
protomyth
I still think there is something to agent oriented programming. Yoav Shoham
[https://en.wikipedia.org/wiki/Yoav_Shoham](https://en.wikipedia.org/wiki/Yoav_Shoham)
[http://www.infor.uva.es/~cllamas/MAS/AOP-
Shoham.pdf](http://www.infor.uva.es/~cllamas/MAS/AOP-Shoham.pdf)
[http://robotics.stanford.edu/~shoham/](http://robotics.stanford.edu/~shoham/)

~~~
nickpsecurity
I used to do that back when I was investigating and toying with AI. Had a
whole book dedicated to all the ways one could apply it. One use case was to
send an agent over our slow, expensive connections to where the data was to do
work for a price and bring just the results back. Since then, our connections
and machines have gotten _fast_. Yet, HN posts show the concept lives on in
cloud services doing data mining and such for a new reason: pulling a lot of
data _out_ of the cloud costs a fortune vs. pulling just the results of
on-premises analysis.

Agent-oriented programming lives on today in a new form. Just dawned on me as
I saw your post. :)

~~~
protomyth
Weirdly, I'm not as interested in the AI aspect or the mobile code aspect[1].
I'm much more interested in it in an organization-and-modularization-of-
systems sense. I think it has obvious code parallelism and a communication
aspect for business users.

1) Well, in the traditional Telescript sense. I am a bit interested in it for
load balancing and redundancy.

~~~
nickpsecurity
That morphed into application and OS containers, basically. They're way better
than anything agent-oriented programming had back in the day. Also more
versatile. That's why you don't hear much about them anymore except in fringe
academia.

Things might have been different if Cyc or OpenMind had achieved anything.
They could have been the reference point for autonomous, mobile agents using
knowledge-based programming. Best just to create more high-level languages,
good libraries, and ways to package them up. It's not just good enough: it's
more predictable and reliable than agent or expert systems, even on their
intended use cases. Funny how that worked out, eh?

~~~
protomyth
> That morphed into application and OS containers, basically. They're way
> better than anything agent-oriented programming had back in the day. Also
> more versatile. That's why you don't hear much about them anymore except in
> fringe academia.

The applications and stuff that go into containers have to be written in some
language, and I think an agent-oriented language might be more suitable for
large programs than what we currently have. I really don't think containers
are better, just more in line with what we have now.

As I said, I'm not really interested in the expert system aspect (AI). I'm
looking at this as a language issue to build big systems. I think containers
are not a language answer but a coping mechanism for current languages and
practices. I've been doing research on a different path. I hope others are
looking beyond what we have now and what paths history didn't take because C
and UNIX won.

~~~
nickpsecurity
Most of what it takes to do that was just some interpreters and function
calls, originally established in distributed computing (Amoeba, MPP systems)
and, on the agent-oriented side, Obliq. It might help to look at distributed,
agent, and reactive languages to see what _language_ aspects you think would
make it easier. Then you'll be able to convey it better.

To me, it's just a VM (or source), ability to capture state, and one or more
function calls. Any language could do this.

Obliq just in case you didn't know about it:

[https://en.wikipedia.org/wiki/Obliq](https://en.wikipedia.org/wiki/Obliq)
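
To sketch the "any language could do this" point in plain JavaScript (all names here are invented for the illustration): the agent is just serializable state plus the name of a behavior, and "migration" is serializing the state, shipping the bytes, and resuming with the same behavior table on the other side.

```javascript
// Toy mobile-agent sketch. Behaviors stay installed at every site; only the
// agent's state travels, which is what made agents attractive on slow links.
const behaviors = {
  // Sum the numbers found at the current site into a running total.
  sum: (state, siteData) =>
    ({ ...state, total: state.total + siteData.reduce((a, b) => a + b, 0) })
};

// Stand-in for a network hop: what actually travels is just bytes.
function migrate(agent) {
  return JSON.parse(JSON.stringify(agent));
}

let agent = { behavior: 'sum', state: { total: 0 } };

// "Visit" two data sites, doing the work where the data lives and carrying
// only the accumulated result between hops.
for (const siteData of [[1, 2, 3], [10, 20]]) {
  agent = migrate(agent);
  agent.state = behaviors[agent.behavior](agent.state, siteData);
}
console.log(agent.state.total); // prints 36
```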

------
thangalin
Relational XPath Map (rxm) provides a syntactically terse domain-specific
language (DSL) to query relational databases and generate structured
documents.

[https://bitbucket.org/djarvis/rxm/](https://bitbucket.org/djarvis/rxm/)

[https://bitbucket.org/djarvis/rxm/wiki/Discussion](https://bitbucket.org/djarvis/rxm/wiki/Discussion)

It currently generates SQL code from the DSL, which can be run against a
database to produce an XML document.
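
As a rough, generic illustration of that last step (this is not rxm's DSL; see the wiki above for the real syntax), converting relational query rows into an XML document is roughly:

```javascript
// Generic sketch of the relational-rows-to-XML step: take rows as a SQL query
// would return them and nest them into an XML document. The helper names are
// invented for this example and are NOT part of rxm.
const escapeXml = (value) => String(value).replace(/[<>&'"]/g, (c) =>
  ({ '<': '&lt;', '>': '&gt;', '&': '&amp;', "'": '&apos;', '"': '&quot;' }[c]));

function rowsToXml(rootTag, rowTag, rows) {
  const body = rows.map((row) =>
    '  <' + rowTag + '>' +
    Object.entries(row)
      .map(([col, val]) => '<' + col + '>' + escapeXml(val) + '</' + col + '>')
      .join('') +
    '</' + rowTag + '>'
  ).join('\n');
  return '<' + rootTag + '>\n' + body + '\n</' + rootTag + '>';
}

// e.g. rows from: SELECT id, name FROM users;
console.log(rowsToXml('users', 'user', [
  { id: 1, name: 'Ada' },
  { id: 2, name: 'Grace & co.' }
]));
```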

------
magicmouse
I really love Jonathan Edwards's chart where he shows the big stack of
technologies one has to master to build apps today. The goal is to replace
that entire stack with one language. That would be the 10:1 improvement that
would really make a difference. The burning question, therefore, is: what can
replace that entire stack? Clearly you have to offer a way of storing and
structuring data that improves upon the table; otherwise you are back in
relational database hell.

------
d08ble
I'm also interested in new ways of doing minimal coding & debugging.

I'm working on the Animation CPU platform and ACPUL, a declarative
algorithmic language, for the same purposes.

[https://www.youtube.com/watch?v=ubNfWarTawI](https://www.youtube.com/watch?v=ubNfWarTawI)

[http://acpul.org/](http://acpul.org/)

Also LiveComment information tool:

[http://acpul.org/livecomment](http://acpul.org/livecomment)

------
noblethrasher
Alan Kay’s Viewpoints Research Institute: [http://vpri.org/](http://vpri.org/)
(with which Bret Victor is associated).

------
amirmc
Probabilistic programming might interest you.

e.g. [http://arxiv.org/abs/1507.00996](http://arxiv.org/abs/1507.00996)

------
crb002
I've been toying around with the idea of SPL, Species Programming Language.
See Brent Yorgey's thesis. Like APL, functions would be able to match where
they bind within a data structure. Instead of just multidimensional arrays it
would handle all combinatorial species; think lists, trees, graphs, cycles,
...

Still haven't found a syntax I like yet. Yorgey and Classen both have nice
Haskell libraries as a springboard.

------
jonathanedwards
Wow, these comments mention lots of work I wasn't aware of! We've been
building a community of people working in this area: the Future Programming
Workshop. [http://www.future-programming.org/](http://www.future-
programming.org/) We will all do better if we get together to exchange ideas
and criticism. Suggestions for improving FPW are welcome.

~~~
dukoid
Why not

\- Have a summary of what future-programming.org is about on the homepage?

\- Help to complete this table:
[https://docs.google.com/spreadsheets/d/1JJ14iwi-
UU4Zw2dc9XHY...](https://docs.google.com/spreadsheets/d/1JJ14iwi-
UU4Zw2dc9XHYwYZChOW6DcoVUAJS7ptKDjs/edit?usp=sharing) (Use the "comment only"
dropdown button to request edit access)

\- Create a Wiki organizing the information from this thread?

------
petke
Personally I think that in the next 30 years we will be programming with the
same languages popular today. They will evolve to handle multicore better,
though, with tasks, async, coroutines, actors, etc. Before too long we will
have a mix of thousands of cores: GPU, CPU, remote cores. We will have to
figure out how to spread our programs across all of them. We'll all be doing
supercomputing.

~~~
DonaldFisk
Before looking ahead, it's well worth looking back to see how far we've come.
Things have changed a lot over the last 30 years, and I'll be very
disappointed if they don't change much over the next 30. In 1986, C++, Java,
C#, Perl, Python, and JavaScript didn't exist. The web was a small project
unheard of outside of CERN. Much of computing was done in COBOL on mainframes.
Garbage collected languages weren't used much. Things haven't progressed
uniformly, and in some areas have actually regressed (I miss Lisp Machines),
but mainstream computing is well ahead of where it was then.

I also don't think you can simply graft supercomputing onto mainstream (i.e.
Algol-descended) languages and expect to keep all your cores equally busy. You
might get away with some SIMD parallelism, but MIMD parallelism, and quantum
computing, require completely different approaches, and completely different
programming mindsets.

------
Permit
I'm working on a live programming plugin for Visual Studio that supports C#:
[http://comealive.io](http://comealive.io)

Previously we'd developed a code exploration tool:
[http://codeconnect.io](http://codeconnect.io)

------
phodo
It might be worth checking out ethereum.org and the work they are doing on
their blockchain, as well as the Serpent and Solidity programming languages.
Also, check out IPFS. The distributed computation and storage model has broad
applicability, and it's a good lens through which to view the world.

------
auxi
[http://www.clafer.org/](http://www.clafer.org/) Lightweight modeling

[http://categoricaldata.net/fql.html](http://categoricaldata.net/fql.html) FQL
Functorial Query Languages

------
jng
Intentional Programming
([https://en.wikipedia.org/wiki/Intentional_programming](https://en.wikipedia.org/wiki/Intentional_programming)),
started by Charles Simonyi

------
ISNIT
Chris Granger seems to be doing some awesome work with 'Eve':
[https://www.youtube.com/watch?v=5V1ynVyud4M](https://www.youtube.com/watch?v=5V1ynVyud4M)

------
beefman
I spent a day looking at these back in August. Summary is here:

[https://gist.github.com/clumma/58253f3692e4c1c28087](https://gist.github.com/clumma/58253f3692e4c1c28087)

------
jakeogh
Sylph is interesting:
[https://news.ycombinator.com/item?id=9126772](https://news.ycombinator.com/item?id=9126772)

------
surfmike
CDG (Communications Design Group) at SAP is where Bret Victor and a lot of
other people work on research. I think a big focus is changing how we program
in the future.

------
jonbaer
Social Machines by Mark Stahl ...
[https://github.com/socialmachines](https://github.com/socialmachines)

------
dukoid
I am working on FlowGrid (Visual dataflow on Android):
[http://flowgrid.org](http://flowgrid.org)

------
erikj
Wolfram Mathematica is a pretty old project; has it changed much over the
last five years?

~~~
exDM69
The Wolfram Language has been promoted a lot recently. They're moving from a
computer algebra system to a more general-purpose programming environment for
scientific applications. There are neat new things like easier GUI features.

------
mikekchar
This may be a strange way of looking at it, but let's backtrack 30 years and
see what made the biggest differences. As I've been in the industry for about
thirty years, my impression is that nothing much has changed. That seems
strange, but it's still the same job I started with.

For me it is interesting that in my career, code base sizes grew to gigantic
proportions -- there are many applications that are 10s of millions of lines
of code. In the middle of my career I worked on a few. Interestingly, I'm
doing web development now and a lot of my colleagues think that 5000 lines is
unbearably big. I think the take-away here is that we have gotten slightly
better at abstracting things out and using development libraries (and dare I
say, frameworks).

OO was just becoming a big thing at the start of my career. Everybody did it
incredibly badly. Then Kent Beck and Ward Cunningham came along and told
people how to do it not-so-badly. I think the biggest thing that I saw in this
time frame was the breaking of the myths of OO being about code re-use, and
the movement away from huge brittle design structures. Good OO developers
moved back to the really basic ideals of dealing with coupling and cohesion in
design. We even started to have a language to be able to discuss design
intelligently. Of course, quite a huge number of people were oblivious to
this, but it always struck me how amazing it was that Beck and Cunningham were
really 15 years ahead of most of the rest of us.

Lately, functional programming is coming into vogue. For the second time in my
career I was surprised. People in the know are talking about idempotence, and
immutable structures. This was the stuff that the crazy people were talking
about in the 80's -- stuff that was "too slow", and "too memory intensive" to
take seriously. But now it's pretty obvious this is the way to go.

I think the other big thing that blew me away in the last 30 years was
testing. Probably some people will remember pre-conditions, post-conditions,
and class invariants. This was unfortunately forgotten by most, but the most
astonishing thing was unit testing. Especially the practice of Test Driven
Development that not only allowed you to document your code with executable
examples, but also forced you to decouple your objects/modules by the very
behaviour that creates the documentation. Very few people do this well (just
like most of the other things I've mentioned), but it is completely game
changing.

As for the future, what is coming up? I suggest you look at what has gone
before you for hints to that. In the last 30 years, apart from TDD (which came
completely out of the blue as far as I could tell), the major advancements
came from stuff we already knew about. It was the stuff that the "crazy"
people advocated in the 70's and 80's, but that seemed impractical. If I were
to guess, I suspect that we will see further progress on decoupling in
design. Immutable data structures will not just be common; they will be how
every professional designs code. As performance moves from single processing
to multi-processing, this will be important. Look at innovative ways of
processing, like flow-based processing, and other crazy ideas from bygone
years.

My last piece of advice: Don't look for large revolutionary changes. I think
those are unlikely. The programmer of 30 years from now will probably still be
doing the same job as I am today. The changes will be much more qualitative.
Also, expect that the vast majority of programmers will be as oblivious to the
advancements as most programmers are today.

------
Lofkin
Thoughts on Julia?

------
DrNuke
IMHO machine learning will automate coding to such an extent that all but a
few of us will be unemployed.

------
iheartmemcache
This is a really broad question, about on par with asking "which fashion
houses are putting out daring material, and what will Dior be making that's
popular 30 years from now?" Software is just like any other cargo-cult
industry where trends rise and fall almost like clockwork: from Rails, to
Angular, to React.

RE: People/Groups who are researching 'new approaches to programming' \- you
have the typical universities putting out papers. Conferences like POPL and
ICFP tend to be where most of the major academic work gets put out. From
within the industry, commercial entities aren't really doing much, bar
Microsoft Research Cambridge (UK, not MA). They're really pushing the
envelope with regard to strict PL research; see www.rise4fun.com for the
dozens of projects they're putting out. Oracle, too, is surprisingly doing
some interesting work.

30 years is a hard guess, but within 5 years you'll certainly see: 1) A lot
more emphasis on concurrency. At 14nm we're rapidly approaching the physical
limits of transistor density (which is why you're seeing 18-core Xeons).
Sharing memory is notoriously hard, so expect the move towards immutability
to continue, whether that's pass-by-const-reference in 'traditional'
languages like C++ or more esoteric languages like Haskell, and whether via
the actor model, STM, etc. 2) Cheap remote memory, especially with Intel's
Knights Landing. RDMA has been around for ages, but bringing it to Xeon means
the common man doesn't have to pay Infiniband prices for HBAs. RAM has been
getting cheaper, but imagine being able to deck out a 42U rack filled to the
brim with ~18 2U servers with half a TB of DDR4 RDIMM apiece that your main
application can access. 3) Disks, which used to be a huge thrashing
bottleneck (who here is old enough to remember thrashing with GCC 2.95 when
trying to compile E17?), are now posting amazing numbers, even for SSDs.

Effectively every level of computing that used to be a barrier (minus SRAM
and the CPU caches, which seem to have been stuck at the same spot for a
while in terms of capacity) has become, or within 6 months will be, consumer
accessible. I couldn't guess what's going to happen in 5 years. I can't even
guess what's going to happen in 5 months, and I've been at this nearly 20
years.

~~~
bjornsing
> From Rails, to Angular, to React.

That's where you lost me. ;)

(It's sort of like a physicist claiming force, mass and momentum to be the
hottest new research areas for the coming 30 years...)

~~~
sitkack
> force, mass and momentum

sounds like you think pretty highly of Rails, Angular and React.

------
joshkpeterson
Ramsey Nasser is developing a programming language written entirely in Arabic.
[http://nas.sr/%D9%82%D9%84%D8%A8/](http://nas.sr/%D9%82%D9%84%D8%A8/)

------
sklogic
My research is on hierarchies of composable domain-specific languages (see
github, account 'combinatorylogic').

~~~
jononor
If you want people to go there, provide a link
[https://github.com/combinatorylogic?tab=repositories](https://github.com/combinatorylogic?tab=repositories)
#internetageattentionspan

~~~
sklogic
I do not want to spam with links to my projects (and to get banned for this),
but sometimes it is relevant to mention them, as in this thread.

