
Composition and Diamonds - setra
https://www.arrdem.com/2017/02/02/composition/
======
chriswarbo
Whilst streams, filter and cons/car/cdr are good examples, their context of
functional programming might be seen as already rather detached from the
realities of physical machines; however, another good example is Unix
programs. The abstractions offered by programs like grep, sed, find, cat, cp,
etc. are all quite small, and whilst bug-fixes, optimisations and even new
features may still be added to the implementations, they've been essentially
"finished" for decades.

The point is that these applications were not created to solve some real-world
problem. Instead, it is assumed that real-world problems will be solved by
scripts (or one-liners in a terminal), that users will be writing their own
scripts, that each problem will require a new script and such scripts will
churn as the requirements/practicalities change. Hence the job of Unix tools
is to make scripts simple and easy to write. By decoupling the tools from any
particular problem, they can provide small, reliable abstractions; if
requirements change such that an abstraction doesn't quite fit, the solution
is to _change the script to use something else_, rather than changing the
contract of the tools.
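As a hypothetical illustration of the kind of throwaway script being described (the data here is made up), a one-liner composed from long-"finished" tools might count error messages in a log:

```shell
# Compose small, stable tools into a disposable solution:
# keep only ERROR lines, then count occurrences of each message,
# most frequent first.
printf 'ERROR disk full\nINFO ok\nERROR disk full\nERROR net down\n' \
  | grep '^ERROR' \
  | sort \
  | uniq -c \
  | sort -rn
```

If the requirements change (say, the matching should become case-insensitive), the fix is to edit the pipeline (`grep -i '^error'`) rather than to ask grep itself to change its contract.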

This works well when the tools are small, and the user is assumed to have some
competence or willingness to learn in order to solve their problems. It
doesn't work so well when the tools are large (e.g. a modern Web browser) and
the users are assumed to only be capable of e.g. clicking a "Go" button.

~~~
dualogy
> It doesn't work so well when the tools are large (e.g. a modern Web browser)
> and the users are assumed to only be capable of e.g. clicking a "Go" button.

Well, then you just end up with middlemen "scripters", for better or worse;
hence... JavaScript.

------
jpt4
A trivial diamond of computation would be a custom-purpose machine, e.g. [0].

Inter-operation between decoupled "small abstractions" and their composed
descendants requires either a pre-existing mutual interchange form, or a set
of wrappers that provide the same. This is equivalent to simply building on
top of that common normal form, which exists, at varying levels of specificity
and accessibility, whether one explicitly considers it or not [1]. Moreover,
the ultimate common normal form is still natural language, because
computational semantics are ultimately evaluated in the context of the human
brain. This argues for intentionally clarifying and improving that least
common computational denominator, in the same way that, even if one entirely
lapses in one's recollection of higher synthetic geometry, "begin by assuming
a compass and straightedge" is at the least never incorrect.

[0]
[https://news.ycombinator.com/item?id=13669033](https://news.ycombinator.com/item?id=13669033)

[1] E.g. the cons example in the article is a small abstraction that composes
well, at the abstract level, with the cond abstraction for destructuring
quoted symbol lists when writing interpreters, but as noted by the author
leaks too much of the underlying machine primitives, and thus creates an
impedance mismatch when seeking performance optimizations, as in a compiler.

