
Why not program right? - Jtsummers
https://bertrandmeyer.com/2018/05/24/not-program-right/
======
tchaffee
I used to love these kinds of articles three decades ago. Then you get a
programming job with budgets and deadlines and even stupid decisions based on
politics, and you hate all that at first, until through experience you realize
that even poorly engineered cars can get product from point A to point B, and
do so all over the world.

Free markets eventually only have time for perfect solutions. And a perfect
solution, according to markets, is the solution that does the job for the
lowest cost over time. For a website that will support a two-week marketing
campaign, you don't need anything talked about in this article. In fact, the
only responsible decision is to ignore this type of approach and just build
the damn thing with someone who has a track record. And then throw it out and
move on.

Based on empirical evidence, there isn't really a problem at all with the way
people program. Markets have already mostly figured out the rare cases when
such robustness is really needed. And it's rare. The only "right" way to
program is to take as many stakeholder requirements into consideration as
possible. And those requirements are rarely about program correctness. So this
article is good (although I think you'd really be looking at functional
programming by this point in history), but first make sure program correctness
is a top priority before getting into the mode suggested by the article.

One final stakeholder requirement that's always a priority: you have to be
able to find qualified developers, and what developers learn is based on
popularity and fashion. It's a real-world constraint, even if it's distasteful
to the idealists and well-intentioned types who write these articles.

~~~
jacquesm
And then, five years after you've left the company and some system inevitably
collapses with nobody having a clue as to what went wrong you'd finally
realize the wisdom of all that.

But it's no longer your problem.

So, please don't take it personally, but 'the well-intentioned types who write
these articles' tend to be the people who then get called in to clean up the
mess.

And then - belatedly - the job gets done properly to keep the company in
business, assuming there is still time enough to do so.

Just this week I had a nice inside look at the kind of mess that gets left
behind when the original duct-tape-and-spit guy moves on and leaves his former
co-workers to clean up his mess. It isn't pretty, and a Chapter 11 isn't an
unlikely scenario, so forgive me if I take a harsher-than-usual look at the
attitude that causes this sort of thing.

Note that the 'free market' doesn't have a horizon much longer than the next
quarterly shareholder report, and that your typical software product lives a
multiple of that interval. So software made with short term goals in mind will
create long term headaches.

Your two week marketing campaign gets a pass. But your decade long backend
project does not, nor does your real time medical device controller, ECU,
database system or operating system.

~~~
tchaffee
> And then, five years after you've left the company and some system
> inevitably collapses with nobody having a clue as to what went wrong you'd
> finally realize the wisdom of all that. But it's no longer your problem.

If that were a problem in reality, the markets would be punishing companies
where it happens. The fact that it happens isn't a real problem. Management
pretends to be upset, but in reality it's not a huge deal. Entropy is normal
in apps and everything else. To continue the analogy from my original comment:
do companies really go into crisis mode when one of the many cars in their
fleet inevitably "collapses"? No. They build or buy a new one and life goes
on.

> nor does your real time medical device controller, ECU, database system or
> operating system.

Yep. Those are the rare cases I talked about. It's a tiny fraction of total
programmers building databases and stuff like that.

~~~
jacquesm
> If that were a problem in reality, the markets would be punishing companies
> where that happens.

Oh they do, I can show you plenty of examples. But it is never the problem of
the people that created the issue in the first place.

Think of these things as time-bombs of technical debt. They'll blow up sooner
or later, usually later, and that makes it that much harder to deal with the
fall-out.

Also, for all the lessons about economics offered here: I would happily argue
that doing things right is actually cheaper in the long run and _possibly_
also cheaper in the short term. By applying the techniques described, in
proper measure, you can save yourself a ton of headache.

But of course that would first require a basic understanding of what the
article is trying to put across, which if your time horizon is short and your
deadlines are looming likely isn't going to be on your agenda.

> It's a tiny fraction of total programmers building databases and stuff like
> that.

Software you build tends to live longer than you think, and tends to be
incorporated into places that you cannot foresee when you make it.

The 'tiny fraction of total programmers building databases' should include the
_huge_ fraction of programmers building embedded systems, APIs, operating
systems, libraries, and so on. All of those will have lifespans measured in
decades if they're done halfway right.

~~~
tchaffee
You seem to have a poor understanding of both entropy and markets. Even the
perfectly built program will soon become useless. Car companies are quite
profitable building cars that "collapse" far short of their actual potential,
which might be a car that lasts 50 years but can no longer pass emissions
tests... Zuck is about to surpass Buffett as the third richest person in the
world. On an app built with PHP! I don't think much more needs to be said to
support my original point.

~~~
catamorphismic
> You seem to have a poor understanding of both entropy and markets.

And you're a bit presumptuous and rude. Your argument also isn't as
bulletproof as you want to make it sound. What is the argument here, anyway?
There's no need to improve technique for the average programmer because an
outlier system (Facebook) is written in a language commonly associated with
poor programming practices, with some handwaving about markets and entropy
sprinkled on top?

~~~
tchaffee
Sorry if I came off that way. I was in a rush on the way to an event and I
thought I was just being honest about the weakness in his argument.

What's the argument here? That stakeholders have requirements that don't have
to do with robustness like budget and deadlines and that your software has a
shelf life and sometimes it's ok if it eventually breaks, just like cars and
even the laptop I'm typing this on will. Is that an unreasonable perspective?

And Facebook is an outlier? Really? Even when we add WordPress, Wikipedia,
Flickr, MailChimp, and a long list of the most successful websites in the
world to that list?

~~~
pdimitar
> _And Facebook is an outlier? Really? Even when we add WordPress, Wikipedia,
> Flickr, MailChimp, and a long list of the most successful websites in the
> world to that list?_

Yes, FB is an outlier -- one in a million companies. Only 5-10 companies out
of those millions have made this model work. So their existence and "success"
proves absolutely nothing.

You have a strange understanding of the word "successful".

Facebook is certainly not "successful" _because_ it neglects good tech. If
anything, they rewrote PHP itself so as not to have to rewrite their customer-
facing software. How is that for your "tech excellence is not important"
argument? They rewrote the damned runtime and even added a compiler.

So please define what "successful" means to you. "A lot of people using FB" is
a temporary metric, even if it lasts for decades. It's not sustainable per se.
It relies on hype and network effect. These fade away.

@jacquesm's points are better argued than yours. Throwing words like "free
market" and "entropy" does not immediately prove a point.

I will grant you the historical fact that there are many throwaway projects,
but he's also right that the fallout from the tech debt they incur is almost
never faced by the original author. Throw into the mix the fact that many
businessmen are oblivious to what exactly the techies do during their work
hours, and one can easily be misled into thinking that technical perfection is
not important. It seems that you were.

Final point: I am not arguing for 100% technical excellence. That would be
foolish. We would still be refining HTTP and the WWW in general even today and
internet at large would not exist. But the bean counters have been allowed to
negotiate down tech efforts to the bare minimum for far too long, and it shows
everywhere you look.

(An everyday example: the waiters' smartphone-like order-taking devices at my
favourite local restaurant are faulty to this day, because some idiot bought a
cheap consumer-grade router AND made the software non-fault-tolerant.)

~~~
tchaffee
> Only 5-10 companies out of those millions made this current model work

Stats? Evidence? I mean, hundreds of thousands of companies use PHP and other
forms of less-than-perfect tech.

Websites all over the world seem to get the job done even when JavaScript,
with all its warts, is used. I like JS, for the record, but it does have
warts.

> even if it lasts for decades.

You're saying the same thing I said: that stuff breaks, that companies come
and go in and out of fashion. I also think it's interesting that you're
calling FB an example of tech excellence while saying it's going to fade away.
Choose one?

> How is that for your "tech excellence is not important" argument?

I never made any such argument. Not even close. I only said quality is not the
only requirement and might sometimes not be a requirement at all.

Most of the code I write is high quality. I put a lot of effort into code
reviews too. I mentor more junior devs around quality. My original post is
actually much more nuanced than you are claiming.

> Final point: I am not arguing for 100% technical excellence. That would be
> foolish. We would still be refining HTTP and the WWW in general even today
> and internet at large would not exist.

Exactly. That's in the spirit of my original post. Maybe re-read it to see
that we mostly agree instead of making my position into something it really
isn't?

~~~
pdimitar
> _Stats? Evidence? I mean, hundreds of thousands of companies use PHP and
> other forms of less-than-perfect tech._

Oh, I meant companies at the scale of Facebook. There aren't too many of them,
would you not agree?

> _I also think it's interesting that you're calling FB an example of tech
> excellence but saying it's going to fade away. Choose one?_

FB runs a lot of open-source projects. Their devs are excellent. That doesn't
mean their main value proposition isn't built from code of the kind you speak
about. No need to choose one; both can coexist in a company as huge as FB.

> _I never made any such argument. Not even close. I only said quality is not
> the only requirement and might sometimes not be a requirement at all._

Well, alright then. I am not here to pick a fight, but you should be aware
that you came off a bit more extreme to me, and to a bunch of others, than you
claim. These things happen, though; I can't judge your intent from a few
comments, that's true.

The point that several others and I are making is that quality plays a bigger
part than you seem to claim. I have also known many devs who decided not to
ask for permission before taking the [slightly / much] longer road, and that
decision paid off many times over in the following months and years.

Sometimes businessmen simply must not be listened to. Sure, I can ship it next
week -- but only by skipping a few vital details, namely everything some
stupid micromanagement attempt to teach me how to do my job told me to drop
("nobody cares about this arcane thing you call an 'error logger' or 'fault
tolerance', just get on with it already!"). Such toxic workplaces should be
left to rot, but that is a separate problem.

------
nlawalker
I find that a lot of folks only think of programming in terms of variables and
executable instructions that modify them. The notion of an invariant is not
part of their toolbox, and the exercise of modeling a domain is not something
they really engage in: if the program doesn't do what they want it to do,
either the instructions are wrong, or there need to be more of them to handle
the cases that aren't currently handled.
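A minimal sketch of what that missing tool looks like in practice: a toy class (names are illustrative, not from any library) that re-checks its invariant after every mutation, instead of relying on each call site to "add more instructions" for the unhandled cases.

```python
class Account:
    """Toy account whose invariant (balance >= 0) is stated once and
    re-checked after every mutating operation."""

    def __init__(self, balance=0):
        self.balance = balance
        self._check_invariant()

    def _check_invariant(self):
        # The invariant lives in one place, not scattered across call sites.
        assert self.balance >= 0, "invariant violated: negative balance"

    def deposit(self, amount):
        self.balance += amount
        self._check_invariant()

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        self._check_invariant()
```

Any sequence of operations either preserves the invariant or fails loudly at the mutation that broke it, rather than corrupting state silently.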

~~~
vbtemp
I think this is correct.

It's understandable when it's a physicist or electrical engineer trying to
code something up to get something working. What really upsets me is when it's
professional software developers who never evolve out of that mindset.

I feel like at the end of the day the trick is to think algebraically -- Your
data types and structures are some domain and you define operations over the
elements of that domain in such a way that certain properties hold.
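A tiny Python sketch of that algebraic mindset (the names are illustrative): the domain is "sorted tuples of ints", and the one operation on it is defined so that the sortedness property holds after every application.

```python
import bisect

def is_sorted(xs):
    """The property that defines the domain."""
    return all(a <= b for a, b in zip(xs, xs[1:]))

def insert(xs, x):
    """Closed operation on the domain: sorted tuple in, sorted tuple out."""
    i = bisect.bisect_left(xs, x)
    return xs[:i] + (x,) + xs[i:]

xs = ()
for v in (3, 1, 2, 2):
    xs = insert(xs, v)
    assert is_sorted(xs)   # the property holds after every operation
```

Because the operation is closed over the domain and immutable, the property cannot be broken by any sequence of calls; there is nothing left for call sites to get wrong.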

~~~
candiodari
The big advantage of CS is the accessibility of the domain. You cannot take
that away and hope nothing else changes. That, of course, means any idiot or
8-year-old can just pick it up and run with it. They may become less of an
idiot after a while, but that may not be desirable or expedient, and it's
great that it's not much of a problem.

I mean, this is a bit like asking why every physicist doesn't do everything by
just coming up with a differential equation for their problem, then using
fixed points to deduce the long-form, non-recursive version and isolating out
the wanted variables. It is, after all, usually by far the simpler process,
especially since everyone can come up with a few differentials for any
situation. Coming up with the correct long form directly, however, is absurdly
hard. So if you simply learn to work with differentials, that's the way to
approach essentially any problem. With a tiny caveat... "learn to work with
them" is 6 months of study and intensive practice, and that's assuming you
already know a lot of math that isn't exactly high-school level either,
including a significant list of "tricks" that you just need to know by heart.
But what you can do with it is amazing.

But the level of knowledge and understanding required is just too high for
anything resembling general application.

~~~
haliax
This sounds cool, can you point me to a good primer on the technique?

~~~
candiodari
[https://www.youtube.com/watch?v=-_POEWfygmU](https://www.youtube.com/watch?v=-_POEWfygmU)

Don't look at other videos until you've internalized the first sentence. Think
long and hard about what that sentence means: differential equations allow you
to find any function that you can make enough "what happens when it moves"
observations about. Enough usually means one.

For instance, you can recover Newton's equations for falling bodies from the
statement that "falling things keep going linearly faster" (because they're
the simplest functions that satisfy that differential equation).

On the more complex side, Google's pagerank is also the solution to a
differential equation. Very technically it sort-of kind-of qualifies as a
first-order one, just not in the real number space.

There's a separate branch of "differential equations" (let's call it "the
physics branch") that studies how to work with discrete time intervals rather
than continuous ones, which is also interesting and useful.
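The falling-body example above can be checked numerically. This is a rough sketch (the step size and tolerance are arbitrary choices): from the single observation "falling things keep going linearly faster", i.e. dv/dt = g, forward-Euler integration recovers the closed form v(t) = g*t.

```python
# The only "observation": velocity grows at a constant rate g (dv/dt = g).
g = 9.81      # m/s^2
dt = 0.001    # step size for the discrete ("physics branch") version
steps = 2000  # integrate out to t = 2.0 s

v = 0.0
for _ in range(steps):
    v += g * dt          # "what happens when it moves"

t = steps * dt
closed_form = g * t       # the function the differential equation pins down
assert abs(v - closed_form) < 1e-6
```

One line of local behavior plus integration yields the global function, which is the point of the parent comment.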

------
Animats
I wish tools for proving object invariants were widely available. But such
tools tend to come from people pushing their latest type theory or functional
programming or something; they're too complex and abstract for most
programmers. I tried to fix this once, but it was too soon, in the early
1980s.[1] (That was before objects. We would have had object invariants if
objects had existed back then; we had function invariants and module
invariants.)

This stuff is useful when you have lots of modifiable data structures which
need to be consistent: window managers, operating systems, game engines,
database internals. Most of the things you need to prove are trivial. But you
need tools to check that invariant A is maintained by code far distant from
where invariant A is defined. Maintenance programmers will miss that.

[1]
[http://www.animats.com/papers/verifier/verifiermanual.pdf](http://www.animats.com/papers/verifier/verifiermanual.pdf)

~~~
gowld
Is there a viable solution that isn't type theory or functional programming or
something? If it were feasible to do so, wouldn't we see progress in
"everyday" common language?

~~~
Animats
There was for Modula-3. DEC SRC in Palo Alto did a nice verification system
for Modula-3. But Modula-3 went down with Digital Equipment Corporation.

C/C++ has too much undefined behavior. Ada died off. The scripting languages
don't need it as much. Rust had potential for proof work, but went off in a
different direction. There are modern proof systems, but they're rarely
integrated with the programming language.

~~~
nickpsecurity
"Ada died off. "

On the verification side, Ada has never had more users than it does now, as
far as I'm aware. SPARK and Frama-C are very active, compared to the almost
non-existent use of formal methods in industry decades ago. Rust could
similarly have a subset integrated with the Why3 platform to make formal
methods easier to use. Further, I've seen extraction done from Coq to C and
Rust. There's also one person modeling C in WhyML in order to write the
algorithms in the latter but extract to the former. Or something like that. It
could be done for Rust, too.

------
csours
[https://web.archive.org/web/20180705200232/https://bertrandmeyer.com/2018/05/24/not-program-right/](https://web.archive.org/web/20180705200232/https://bertrandmeyer.com/2018/05/24/not-program-right/)

~~~
csours
I guess my default answer is: because I have to solve some business problem to
get paid, and I never have enough information or time to do it right - or
phrased differently: If I took enough time to do it right, someone else would
have already done it wrong and moved on.

Is there a "better" answer than worse is better?

~~~
pmoriarty
Worse is better doesn't sound so bad for just a single (small) project.

Where it starts to go badly off the rails is when there's a company culture of
worse is better and one quickfix, bandaid solution is piled on top of another.

They then wind up with one or more Leaning Towers of Pisa made of bandaids,
hacks, and quick fixes, with everyone running around like headless chickens,
putting out fires and trying to bandaid all the failures as the towers are
constantly in the process of collapsing.

This leads to an ever-widening spiral of hacks upon hacks upon hacks, as
company culture, lack of manpower, and pennypinching never give them the
luxury of doing it right, and cutting the Gordian knot by scrapping everything
and doing it right from the start becomes ever more impractical.

~~~
gowld
The whole web is "worse is better". You can say "worse is worse", but that
implies you think the web is worse than something that... what, would have
sprung into existence if the current web had been suppressed?

~~~
icebraining
Project Xanadu?

------
PaulAJ
I was an Eiffel fanboy in the early 90s, when the only alternatives were
Smalltalk and C++. It was obvious to me that garbage collection was necessary
for modular systems, and that Eiffel assertions were the sweet spot in formal
specification: lightweight enough to be usable in the real world, but formal
enough to let you state and demonstrate useful things about your code.

However I was still always bothered by mutation. Everything was fine as long
as you didn't change any data structures, but as soon as you added mutation to
the mix everything fell apart. You could state an invariant, but if you shared
a reference to an internal structure then you had to trust everything else in
the system not to change anything.
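The failure mode described here is easy to reproduce in any mutable OO language. A Python sketch (the class and method names are made up for illustration): the invariant "items stay sorted" holds through the class's own methods, but a leaked reference to the internal structure lets any other code break it behind the object's back.

```python
class SortedBag:
    """Claims the invariant: self._items is always sorted."""

    def __init__(self):
        self._items = []

    def add(self, x):
        self._items.append(x)
        self._items.sort()        # invariant re-established by every method

    def items(self):
        return self._items        # leaks a reference to mutable internals!

bag = SortedBag()
bag.add(2)
bag.add(1)

leaked = bag.items()
leaked.append(0)                  # no method of SortedBag was called, yet...
print(leaked == sorted(leaked))   # False: the invariant is silently broken
```

Returning `tuple(self._items)` (or using immutable data throughout, as in the Haskell approach mentioned next) closes the hole, because nothing outside the class can mutate what it hands out.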

Then I discovered functional programming in the shape of Haskell. It's still
not perfect, but it's fundamentally better than OO.

~~~
webreac
I also learned Eiffel and read Meyer's book. The terrible standard library,
with its deep inheritance trees, was my main complaint about the language. All
his criticisms of C++ were very insightful. At that time almost nobody was
talking about immutability (GRAAL was called a language without variables).
Java was almost revolutionary with its immutable strings.

------
AnimalMuppet
> The only way to achieve demonstrable correctness is to rely on mathematical
> proofs performed mechanically.

This is guaranteed to fail. What are you going to prove with your mathematical
proof performed mechanically? That the program performs correctly? How do you
define "correctly"? At bottom, it is defined by an informal specification in
people's heads. You cannot mathematically prove correspondence to that, even
in principle.

At best, you can prove correspondence to the most-high-level _formal_
specification. But how do you prove that _that_ specification is "what the
program should do"?

The next problem with this approach is that it has costs. Having to write a
mathematically rigorous formal specification for all the behaviors of a
program takes non-trivial time and effort. As others have said, that effort
could have been put into other things, like more features. Would we rather
have more perfect software, or more feature-rich software? Above a certain
level of quality, we'd rather have more features. (Yes, software often drops
below that level of quality...)

~~~
tathougies
You can prove performance properties, memory properties, and formal
properties, or any combination of the above.

The question of 'what the program should do' is not a mathematical one; it's a
philosophical one in the most general sense, and most likely a business one.

------
RcouF1uZ4gsC
Contracts were just recently voted into C++20
([https://herbsutter.com/2018/07/02/trip-report-summer-iso-c-standards-meeting-rapperswil/](https://herbsutter.com/2018/07/02/trip-report-summer-iso-c-standards-meeting-rapperswil/))

With the addition of contracts, C++ will natively support this style of
programming without ASSERT macros.

------
mpweiher
> ...and examine the evidence.

And the evidence is that people write working, even reliable software without
Eiffel.

~~~
jacquesm
Well, yes. But they do that by replicating a lot of functionality that might
as well be pushed down a level. Because _every_ program with output that
people or other systems rely on needs that level of reliability, and yet only
very few provide it to the degree that the company behind it would accept
liability if it fails.

Usually 'working' and 'reliable' get redefined as 'working with what we've
tested it with' and 'reliable insofar as our statistics indicate'. Without
knowing for sure that you've really covered all your edge cases, you're a typo
away from some disaster. Fortunately, most software isn't that important. But
for software that is that important, these strategies, even if imposed from
the outside rather than embedded in the language, will pay off.

~~~
mpweiher
Oh, I am not saying there are no problems. And I don't deny a certain
emotional appeal to having safety features provided by the language.

However, great (quality) is delivered with those kinds of features and
without, and crap software is delivered with those kinds of features and
without. And more importantly, I have seen little to no _evidence_ that having
those sorts of features actually substantially changes the statistical
distribution of crap/quality software, no matter what we feel should be the
case.

People can use these safety features or not, and they can use them well or
not. Just like they can use non-linguistic safety mechanism, such as really
good test-suites...or not.

Elsewhere, he writes:

> This is where I stop understanding how the rest of the world can work at
> all.

And so you probably need to upgrade your understanding.

If the world doesn't conform to your understanding of it, the thing that's
lacking is almost certainly your understanding of the world. Because it does
work.

~~~
jacquesm
> And more importantly, I have seen little to no evidence that having those
> sorts of features actually substantially changes the statistical
> distribution of crap/quality software, no matter what we feel should be the
> case.

I have. Our company has done a fairly large number of studies on the internals
of companies producing software, and the better a company is at the tech, the
better it does in the long run.

Note that there is such a thing as 'good enough', and once that bar is cleared
I'm fine with cutting a corner here or there to meet a deadline. But I'm _not_
fine with categorically ignoring quality and security in favor of short term
wins.

~~~
mpweiher
Hmm...not sure we are disagreeing.

You:

"the better companies are at the tech..."

"categorically ignoring quality and security"

That is about _doing_ something about quality and security, and I agree
wholeheartedly.

Me:

"having those sorts of features [in the language]"

That's a claim about whether specific language features are what determine
_doing_ those things. I don't think they are.

------
ben509
> This is where I stop understanding how the rest of the world can work at
> all. Without some rigorous tools I just do not see how one can get such
> things right. Well, sure, spend weeks of trying out test cases, printing out
> the structures, manually check everything (in the testing world this is
> known as writing lots of “oracles”), try at great pains to find out the
> reason for wrong results, guess what program change will fix the problem,
> and start again. Stop when things look OK. When, as Tony Hoare once wrote,
> there are no obvious errors left.

Most businesses put their data in a DBMS, so you have integrity constraints
"to ensure database consistency with the business rules or, in other words,
faithful representation of the conceptual model of reality."[4] The relational
model is both complete and a lot easier to understand than using the predicate
calculus directly.

The other shortcoming of any language's invariants is that they live in your
source repo. As coders, we often forget that production data represents
contracts with customers, and you have to take the real data into account when
migrating your business rules. That's why the M is in DBMS.

More directly answering the question: most languages can employ property tests
[1][2][3] that, much as Eiffel does, allow you to specify the precise
invariant and validate it under randomly generated tests.
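For readers unfamiliar with the idea, the core of property testing fits in a few lines of Python. This is only a sketch (`check_property` and `random_list` are made-up names); the real tools in [1]-[3] add generation strategies, shrinking, and far better reporting.

```python
import random

def check_property(prop, gen, trials=200):
    """State the invariant once; validate it under many random inputs."""
    for _ in range(trials):
        case = gen()
        assert prop(case), f"property falsified by {case!r}"

def random_list():
    """A crude input generator: random-length lists of small ints."""
    return [random.randint(-100, 100) for _ in range(random.randint(0, 20))]

# Invariant: reversing a list twice is the identity.
check_property(lambda xs: list(reversed(list(reversed(xs)))) == xs,
               random_list)
```

A false property (say, "every list is sorted") gets falsified within a handful of trials, which is exactly the kind of edge-case hunting hand-written oracles tend to miss.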

But, honestly, a great deal of code, even math heavy code, doesn't have nice
general invariants. Real functions dealing with business logic and bugs are
piecewise and messy.

And then that only covers the handful of functionality that is even remotely
expressible in mathematical form. You still have the user interface to deal
with, dependency management, your entire deployment story, and all the
problems extant between chair and keyboard.

Technology can't substitute for the grunt work of testing, testing, actually
using your system and talking to your customers.

[1]: [https://hackage.haskell.org/package/QuickCheck](https://hackage.haskell.org/package/QuickCheck)
[2]: [https://hypothesis.readthedocs.io/en/latest/](https://hypothesis.readthedocs.io/en/latest/)
[3]: [https://github.com/HypothesisWorks/hypothesis-java](https://github.com/HypothesisWorks/hypothesis-java)
[4]: [http://www.dbdebunk.com/2017/06/what-meaning-means-business-rules_11.html](http://www.dbdebunk.com/2017/06/what-meaning-means-business-rules_11.html)

~~~
macintux
> But, honestly, a great deal of code, even math heavy code, doesn't have nice
> general invariants. Real functions dealing with business logic and bugs are
> piecewise and messy.

At the edges, the invariants are going to be messy, but this would seem to
give you ample excuse to define functions with strong, narrow invariants as
the core of your application, with input-transformation functions at the
edges.

~~~
jacquesm
And, very importantly: input validation at the edges. Way too many systems
assume that the data they are fed is consistent.
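A small Python sketch of that division of labor (the function names are hypothetical): all messy-input handling lives in one boundary function, and the core logic is written against the guaranteed invariant only.

```python
def parse_percentage(raw):
    """Edge: validate and normalize raw input; all mess is handled here."""
    value = float(raw)              # may raise ValueError -- fine at the edge
    if not 0.0 <= value <= 100.0:
        raise ValueError(f"percentage out of range: {value}")
    return value

def apply_discount(price, pct):
    """Core: may assume 0 <= pct <= 100; no defensive re-checking needed."""
    return price * (1.0 - pct / 100.0)

print(apply_discount(200.0, parse_percentage("25")))   # 150.0
```

The core function stays simple and its narrow invariant stays provable, because no unvalidated data can ever reach it.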

------
InclinedPlane
Why not? Because provably correct software is not necessarily correct
software. One of the biggest classes of defects in programs is missing details
in specifications, and provable correctness against the wrong specification
doesn't help with that problem at all.

Provable correctness is one technique in software development, it's not the
only technique nor even the most important technique nor even a _required_
technique.

~~~
mpweiher
Yep.

Provable correctness can be useful iff you have a formal specification, and
iff that formal specification is more likely to be a correct reflection of the
requirements than the code + some tests.

While there are times when that's the case, those are rare.

------
icebraining
I find contracts interesting, but it seems to me that they're only useful for
transformations that are easier to check than to implement, no?

------
syastrov
I have heard of property-based testing (e.g. QuickCheck). Is the main
innovation of Eiffel that the author is referring to the fact that these
invariants are written alongside the actual code AND are configurably checked
at runtime? Ordinarily, you might sprinkle assertions throughout your code,
but these are checked only at the point where you add them, rather than
between statements, as I assume Eiffel does. In that case, I see some benefit
in the form of documented logical assumptions which are actually validated
when the code is run (including during manual testing).

I could imagine this being implemented in other languages by hooking into
statement execution, much like a profiler or debugger, and executing the
checks. Does anyone know of such tools for other languages, e.g. Python or JS?

Does anyone have real-world experience with this development technique on an
actual project in production?
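Not an answer about statement-level hooks, but the runtime-toggled pre/postcondition part can be sketched with a plain decorator in Python (`contract` and `CHECKS_ENABLED` are made-up names, not a real library's API; dedicated design-by-contract packages for Python do exist).

```python
import functools

CHECKS_ENABLED = True   # flip to False to strip the checking cost, Eiffel-style

def contract(pre=None, post=None):
    """Attach optional pre/postconditions to a function."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if CHECKS_ENABLED and pre is not None:
                assert pre(*args, **kwargs), \
                    f"precondition of {fn.__name__} failed"
            result = fn(*args, **kwargs)
            if CHECKS_ENABLED and post is not None:
                assert post(result), \
                    f"postcondition of {fn.__name__} failed"
            return result
        return wrapper
    return deco

@contract(pre=lambda x: x >= 0, post=lambda r: r >= 0)
def isqrt(x):
    return int(x ** 0.5)
```

The contract is written alongside the code and serves as checked documentation; unlike Eiffel, though, nothing re-validates class invariants between arbitrary statements.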

~~~
PaulAJ
Eiffel dates back to the 1980s. Java was going to have Eiffel-style
invariants, but they got chopped out because Bill Joy had a deadline to meet.

------
carapace
"The future is here, it's just not evenly distributed yet." ~W. Gibson

We are like lumberjacks who so love our axes that we refuse to even consider
chainsaws. Many of us don't even know they exist, or if we do, we think of
them as esoteric and nearly magical. And then there's the fellow who says,
"I'm too busy chopping down trees to learn to use a chainsaw. I've got to get
these trees chopped today!" (So learn to use a chainsaw on the weekends,
maybe? They really aren't that hard.)

~~~
pdimitar
I agree with you; however, the chainsaws in your example are definitely hard
as hell... consensus algorithms, net splits -- and you already have much more
than a single programmer (most of us, anyway) could handle.

------
mhd
One thing I'd really like to see is a debate between Meyer and his predecessor
Niklaus Wirth. The latter was definitely willing to sacrifice some correctness
if that means making the system easier to implement and learn. Just compare
Oberon and Eiffel, or, heck, "Algorithms and Data Structures" and "Object-
Oriented Software Construction".

~~~
i_don_t_know
I'm not sure Wirth ever sacrificed correctness. He was willing to sacrifice
performance, though; that is, if a simpler solution was fast enough, he went
with it (e.g. a linked list instead of a hash table for symbol lookup in a
compiler). He was also not afraid to question established wisdom and practice
if it led to simpler implementations (e.g. the file API in Oberon).

I suspect Wirth would not object to codifying pre/post-conditions and
invariants per se. But I think he would object to using them as a band-aid
around complicated implementations. It's hard to convince yourself that your
implementation is correct when the pre/post-conditions and invariants are
themselves complicated and hard to understand.

------
tzs
OT: what is the definition of a "root" of a multigraph? In a regular graph,
the most common definition I've seen is that a node is a root iff you can
reach every other node starting from it. In the graph in the article, that is
true both of nodes 1 and 2, but he says 1 is the only root.

------
subjectsigma
Yes, I get that he thinks his language is the best, and maybe it really is.
Constraint-based programming sounds cool. Then again, so does logic-based
programming, but I'd wager most people have never written a single line of
Prolog, and for good reason: it's hard and slow to write, and to run, for
that matter.

Maybe his language would be more popular if he weren't such a snob about it.

------
Cobord
This looks like the algebraic query language AQL. Can anyone familiar with
both give a comparison?

------
rthille
If I had to develop software like that, I'd quit and dig ditches for a living.

