
Parallels Between Math and Software Engineering - grfl
https://beta.oreilly.com/ideas/striking-parallels-between-mathematics-and-software-engineering
======
rathereasy
At the end of the article, the author mentions how we could possibly find
other designs of mathematics. Well, some people already have!

Some mathematicians did not like the law of excluded middle, which states that
for any proposition A, either A is true or A is false. So they invented
intuitionistic logic, which is normal logic without the excluded middle, and
started rewriting mathematical proofs in this new system. Turns out there's a
lot of stuff you can prove in intuitionistic logic.
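
A quick Curry-Howard illustration in Haskell (here Void plays the role of
falsity, and the names are mine): excluded middle itself has no constructive
proof, but its double negation does, a classic fact about intuitionistic
logic.

    import Data.Void (Void)

    -- "A or not A" has no total implementation:
    -- em :: Either a (a -> Void)   -- impossible to write

    -- ...but its double negation does:
    notNotEM :: (Either a (a -> Void) -> Void) -> Void
    notNotEM k = k (Right (\a -> k (Left a)))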

Some mathematicians did not like the axiom of choice. One of its consequences
(the well-ordering theorem) is that there is some ordering under which every
non-empty subset of the real numbers has a least element. Think about it: what
is the least element of {1/n : n >= 1}? Who knows! Some found this so weird
that they either replaced the axiom with a weaker one or with an outright
contradictory one (such as the axiom of determinacy).

There's even syntax arguments in mathematics! What's the derivative of a
function f? is it f'(x) or df/dx ? Is multiplication represented by a dot (.)
or a cross (x) or by a juxtaposition of expressions?

Sometimes we use big existing proofs in the middle of a proof to save time.
And sometimes we use the big proof to prove something far simpler than the
big proof itself. This creates a big dependency, and some people hate these
dependencies because the reader of the new proof will have trouble
understanding it completely. It's like dropping some magic into the middle of
the proof and saying: "if you want to understand this proof completely, go
read this other 50-page article." Sound familiar? Some mathematicians hate
this so much that they insist on proving things from the ground up whenever
possible, so that the proof is as comprehensible as possible. This is the
mathematical equivalent of dependency management.

~~~
zeroonetwothree
If you just reverse the typical ordering, then the least element of {1/n : n
>= 1} is 1, so this doesn't seem all that strange to me.

~~~
fmap
The point is that there is a single ordering under which every non-empty
subset of the real numbers has a minimum.

For the rational numbers this is not difficult, since we can enumerate them.
As the ordering we could then pick "a <= b" if the index of "a" in the
enumeration is at most the index of "b". The least element of any non-empty
set of rational numbers is then the rational number with the least index in
that set.
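
A minimal sketch of that enumeration trick in Haskell, for the positive
rationals only (using the Calkin-Wilf sequence, which lists every positive
rational exactly once; function names are mine):

    -- Enumerate all positive rationals exactly once (Calkin-Wilf).
    rationals :: [Rational]
    rationals = iterate next 1
      where next q = 1 / (2 * fromInteger (floor q) + 1 - q)

    -- Index of a positive rational in the enumeration; this search
    -- terminates because every positive rational occurs in the list.
    enumIndex :: Rational -> Int
    enumIndex q = length (takeWhile (/= q) rationals)

    -- "a <= b" iff a appears no later than b in the enumeration.
    leqByEnum :: Rational -> Rational -> Bool
    leqByEnum a b = enumIndex a <= enumIndex b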

However, the real numbers are uncountable. Tricks like this cannot work.

----

The real culprit is the axiom of excluded middle, which confounds the notions
of existence and embodiment. Under classical logic it could be proven that P =
NP without any algorithm ever being exhibited. This has already happened for
certain graph problems (see the graph minor theorem: moving from a minor-
closed graph property to its finite set of excluded minors is essentially
non-constructive).

Without excluded middle, the axiom of choice is exactly as trivial as it
sounds: Given a non-empty set, give me an element in that set. In order to
show that a set is non-empty you need to give enough information to construct
an element in that set. So the proof contains the information to validate the
"axiom".

With excluded middle, the proof that a set is non-empty doesn't contain enough
information to pick an element from that set. So adding both excluded middle
and choice takes you into an axiomatic system whose connection with the real
world is rather tenuous.

----

The article is spot on: not only is mathematics a designed system, but the
analogy between programming languages and logics is perfect in view of the
Curry-Howard correspondence. Elevating Zermelo-Fraenkel set theory above all
else is like insisting that your software project has to be built using
Turing machines; after all, everything else is just encoding. There is a huge
design space in logics, just as in programming languages. For instance, you
can do physics in the setting of synthetic differential geometry, work
without smoothness preconditions, and use theorems which are literally
impossible in ZFC (in that setting, every function is smooth).

A soundness proof for a new logic is, by the way, the logical equivalent of
building a compiler. :)

~~~
DominikPeters
Careful, the axiom of choice doesn't state that non-empty sets contain an
element (this is a triviality). Rather (an equivalent of) it states that the
Cartesian product of non-empty sets is non-empty. If we take a product of
_finitely_ many such sets, then we can prove it, but we need the axiom of
choice for the infinite case.

The reason is intuitively clear: in order to prove that the infinite product
is non-empty, we need to produce an infinitely long vector. Well, since the
individual sets are non-empty, we can just choose an element from each set and
put it in the vector. But this takes infinitely many steps, and that's not
allowed in mathematics: proofs must be finite. The axiom of choice is a way of
collapsing infinitely many steps into one invocation of the axiom (so one
step, so finitely many steps). It doesn't have much to do with excluded
middle.
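
In symbols, that product formulation reads:

    \Bigl(\forall i \in I : X_i \neq \emptyset\Bigr)
      \;\Longrightarrow\; \prod_{i \in I} X_i \neq \emptyset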

~~~
fmap
There are many classically equivalent formulations of the axiom of choice, the
one I gave is just the one I like best. :)

But you are right, the devil is in the details, so let me spell it out
precisely:

The axiom states that for every set X, there is a function

    \epsilon : \mathcal{P}(X) \setminus \{\emptyset\} \to X

If you want to restrict yourself to first-order logic, then you have to encode
the function \epsilon (e.g. as a functional relation, where relations are in
turn encoded as sets of ordered pairs, which are in turn encoded with
Kuratowski's construction). Set theorists like first-order logic, and this
encoding overhead is the reason why it's not the standard definition, even
though it's arguably what the axiom is trying to express(1,2). Apart from that
the statements are (classically) equivalent:

To go from the higher-order version to the cartesian product, note that the
cartesian product of an I-indexed family of sets is a set of functions f from
I to the union of all X_i's, such that f(i) \in X_i for all i. Let (X_i)_{i\in
I} be an I-indexed family of sets, all of which are non-empty; then (i ->
\epsilon(X_i)) is in the cartesian product of the X_i's, where \epsilon is the
choice function on the union of all the X_i's. The reason we couldn't build
that element without AC is precisely that we have no way of selecting exactly
one element from each X_i, even though we know that they are non-empty.
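
In symbols, the product used above is:

    \prod_{i \in I} X_i \;=\;
      \Bigl\{\, f : I \to \bigcup_{i \in I} X_i \;\Bigm|\;
        \forall i \in I : f(i) \in X_i \,\Bigr\}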

In the other direction we can use Zermelo's well-ordering theorem to construct
a well-order on X. We define \epsilon to pick out the minimum with respect to
this order.

(1) This really is what the axiom of choice was meant to express. If you can
read German, you can refer to Zermelo's own work:

    https://eudml.org/doc/158167

Under point 2 in this paper, Zermelo assumes that there is such a choice
function on a given set and from this he derives the well-ordering of that
set.

(2) Another example of the drawbacks of working with a first-order
axiomatization is the axiom of regularity. The idea we want to express is that
there are no infinitely descending element chains ("every set is well-
founded"), but expressing this in first-order logic leads to a convoluted
statement which obscures the simple idea... and is also not a correct encoding
of this idea in intuitionistic logic!

~~~
mm_aa
Has HoTT (homotopy type theory) had any effect on our thinking here? I've
heard these are exactly the kinds of problems it tries to solve.

------
ivan_ah
Very nicely put.

I'm a big fan of linear algebra because it's the best example of why learning
math is useful. Sure, knowing about equations and calculus comes in handy, but
linear algebra is pure modelling superpowers and a much more valuable tool
overall.

Related: An awesome LA introductory lecture by Prof. Strang:
[http://ocw.mit.edu/courses/mathematics/18-06-linear-
algebra-...](http://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-
spring-2010/video-lectures/lecture-1-the-geometry-of-linear-equations/)

Related 2: A short tutorial on LA that I wrote:
[http://minireference.com/static/tutorials/linear_algebra_in_...](http://minireference.com/static/tutorials/linear_algebra_in_4_pages.pdf)

~~~
andrepd
I don't think you can say of any topic in mathematics that it's the _best_
example of why learning math is useful. What about calculus? It's an immensely
powerful tool that, by harnessing the power of the infinite and the
infinitesimal, unlocks a massive body of practical applications in nearly
every field of quantitative knowledge. And what about discrete math? Without
it there would be no such thing as a computer. Differential equations, _the_
tool for modeling dynamical systems?

Mathematics is a vast subject that every quantitative discipline must
necessarily draw from. Each field of study will benefit more from certain
topics within it.

~~~
ivan_ah
> What about calculus [...and...] differential equations ?

I agree that calculus and diff. equations techniques are important, but if
you look closely, you'll see they are only "necessary" for engineers,
physicists, and other "hard" sciences, and not so essential for non-science
studies.

Imagine "pitching" the benefits of learning university-level math to a class
of Arts students. Can you honestly say their lives will be improved if they
were to learn about a bunch of techniques of integration?

On the other hand, I wouldn't hesitate to recommend learning LA to them.
Thinking about operators, vector spaces, dimensions, etc. is applicable much
more widely. I speak from experience here: I've been "finishing up" the
applications chapter in my LA book and I keep finding new applications to
cover: balancing chemical equations, electric currents and other network
flows, nutrition, least-squares approximations, linear programming,
cryptography, network coding, error-correcting codes, Fourier analysis,
probability theory, and quantum mechanics. The LA party just don't stop ;)
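
To make one of those applications concrete, here is a minimal least-squares
sketch in plain Haskell (no libraries; fitLine is an ad hoc name). It fits
y ~ a + b*x by solving the 2x2 normal equations with Cramer's rule:

    -- Fit y ~ a + b*x to data points by least squares, solving
    -- [n, sx; sx, sxx] [a; b] = [sy; sxy].
    fitLine :: [(Double, Double)] -> (Double, Double)
    fitLine pts = (a, b)
      where
        n   = fromIntegral (length pts)
        sx  = sum [x     | (x, _) <- pts]
        sy  = sum [y     | (_, y) <- pts]
        sxx = sum [x * x | (x, _) <- pts]
        sxy = sum [x * y | (x, y) <- pts]
        det = n * sxx - sx * sx
        a   = (sy * sxx - sx * sxy) / det   -- Cramer's rule
        b   = (n * sxy - sx * sy) / det

    -- fitLine [(0,1),(1,3),(2,5)]  ==>  (1.0, 2.0)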

~~~
mooreds
I remember that in my college linear algebra class (I think it was) my prof
made the joke that his wife was the projection of the vector representing all
the attributes of his ideal woman onto the space that represented the pool of
all women who would date him.

I thought it was hilarious. Almost made me want to be a math major (I was just
a physics major). Which makes me think of:
[https://www.xkcd.com/435/](https://www.xkcd.com/435/)

~~~
paulmd
I can't decide whether to go for the "null vector" joke or the "yo mamma's
magnitude is so great"...

------
rivalis
Mathematical truths and objects are real things with existence independent of
our minds that we "discover," not just designed things. The author seems to
believe that the language used to describe mathematics (which is indeed a
designed thing, just like software) is the only thing "there." She is probably
a formalist.

I think it is important to remember this, because mathematics, like a
computer, "fights back." You cannot simply dream up whatever structure you
want and have it mean what you want and behave how you want. See Gödel's
incompleteness theorems. No matter what you are doing, your mathematical
constructs (including your implicit Turing Machines in your computer programs)
must obey certain underlying constraints that are completely mind-independent.
These constraints are what mathematicians study, albeit through a glass,
darkly.

Regardless of ontological issues with the post, I like that it emphasizes the
designed nature of our mathematical tools. The space of possible tools is so
large that there is near-limitless room for human creativity and design in
mathematical research. It is a shame that most mathematics classes don't
really get that across.

edit: fixed misgendering, sorry, that was sexist.

~~~
hackinthebochs
>Mathematical truths and objects are real things with existence independent of
our minds that we "discover," not just designed things.

While it's almost certainly true that the content of mathematics is mind-
independent, it is far from obvious that these objects are "real things". The
real meat of the issue is how exactly the mind-independence is cashed out.
Different ideas paint vastly different pictures of mathematics and even the
universe: for example, platonism vs. nominalism. Let's not be so quick to put
forward as an obvious truth the critical issue in question.

~~~
eli_gottlieb
Can't a mathematical theory compress, generalize, and map out many relevant
empirical facts very well without _needing_ ontological commitments to the
generalizations themselves?

The real numbers seem to be a perfect example: if you work in physics at
scales where quantization doesn't noticeably apply, the only way to calculate
correct predictions is really to use real numbers and continuous (mostly
Euclidean) spaces. But that doesn't mean physical objects are ontological
shadows of our mathematical abstractions, as Plato's Allegory of the Cave
portrayed it. Quite the reverse: when you get down to a sufficiently small,
fundamental level, objects, space, and time _stop_ being continuous and
correct experimental predictions only come from using _discrete_ formalisms.

You can then proceed to ask, which one is _Platonically_ real, the continuous
mathematical spaces or the discrete physical ones? But I think the answer
there might be, "Who says anything is _Platonically_ real? The map is not the
territory, so shut up and calculate."

~~~
hackinthebochs
>Can't a mathematical theory compress, generalize, and map out many relevant
empirical facts very well without needing ontological commitments to the
generalizations themselves?

Maybe, but it's not obvious. The fact that the same generalizations are
multiply realizable in different processes/structures certainly says something
interesting. The consequences of this multiple realizability haven't been
fully investigated.

Your response seems to be arguing against my post which was mainly about mind-
independence, by arguing against platonism. I don't see that mind-independence
necessarily implies platonism. In fact, I find all forms of platonism
extremely distasteful.

>The map is not the territory, so shut up and calculate.

Right, but this in fact goes to the heart of the question of the philosophy of
mathematics. When someone says that mathematical objects are mind-independent,
they are _not_ talking about the notation itself (the map), but rather the
content of the notations (the territory), i.e. the structure revealed through
the notation. It should be pretty obvious that there are many interesting
questions about the mind-independent structure of the territory. "Shut up and
calculate" isn't an answer to this question, but rather the attitude that the
answer simply doesn't matter. For many fields the answer doesn't matter, but
the question is worth asking nonetheless.

------
mshron
Well said. Advanced math is mostly about working with properties higher up the
chain of abstraction, and then seeing what happens when you bring the insights
learned up there back down to more concrete examples.

From an OO point of view, the real numbers inherit almost every useful trait:
they're a field, they have a topology, they have a measure. Studying the
parent classes, so to speak, gives you abstract algebra, topology, and
analysis, respectively.
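
A loose sketch of that inheritance analogy in Haskell type classes (the class
and method names are illustrative, not from any real library):

    -- The reals "implement" several mathematical interfaces at once.
    class AddGroup a where
      zero :: a
      add  :: a -> a -> a
      neg  :: a -> a

    class AddGroup a => Field a where
      one    :: a
      mul    :: a -> a -> a
      recipF :: a -> a          -- partial at zero, as in math

    instance AddGroup Double where
      zero = 0; add = (+); neg = negate

    instance Field Double where
      one = 1; mul = (*); recipF = recip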

Once you get the basics of each, you can study how they interact. Then, once
that stuff is clear, they can be recombined in beautiful ways to give you new
objects to study.

~~~
kejaed
"Well said. Advanced math is mostly about working with properties higher up
the chain of abstraction, and then seeing what happens when you bring the
insights learned up there back down to more concrete examples."

I came here to say the same thing. I went through a lesser-known engineering
discipline, "Mathematics and Engineering" [0], and found that the type of
thinking one learns doing pure math proofs has served me well in my eventual
career in aerospace systems engineering. I find that the thought process in
considering a proof as a high level whole/black box or being able to drill
down to the finest detail while still keeping the big picture in mind has
translated quite well to my day to day traversal up and down the abstraction
ladder at work.

[0]
[http://www.mast.queensu.ca/meng/undergrad/info.php](http://www.mast.queensu.ca/meng/undergrad/info.php)

------
shockzzz
I have, on multiple occasions, looked at math equations in a CS paper and
been like, "WTF?"

But when I look at the implementation in code it's so obvious what's going on.

~~~
thachmai
Can you give us some examples?

I can't fathom how someone who cannot understand the math formula can
understand the code.

~~~
shockzzz
Context: I never did too well in statistics, and in general I was always
pretty bad at understanding simple Math notation like sigma notation, multi-
variate calc, or linear algebra.

Here's the Hokusai paper:
[http://www.auai.org/uai2012/papers/231.pdf](http://www.auai.org/uai2012/papers/231.pdf)

And here's an implementation:
[https://github.com/dgryski/hokusai](https://github.com/dgryski/hokusai)

For me to understand the paper, I had to go through the code to piece out the
Math jargon, even though the Math behind it is really quite simple. Another
way of saying this is, "I'm dumb," sure. Not gonna argue with anyone there.

Though I still think it's curious that it's easier for me to understand a
concept through code than through Math. It's sort of reminiscent of when you
first learned Algebra: your teacher would tell you to replace the variable
with an easy number and work through it to understand the mechanics of
equations.

I think code is an example of that - literally working through the mechanics
of algorithms to understand what they actually do and how they work.

~~~
eli_gottlieb
I think it's probably just that you have far more relevant experience with
reading code. Math is its own language, and math departments only really start
teaching how to read and write it in Real Analysis classes. As my
stepfather, a math professor, said his advisor once put it, "Everything
before that is just to keep the children from running in the halls."

~~~
paulkon
I've always wondered why Real Analysis isn't introduced earlier in math
education. It might make it easier to grasp concepts introduced in upper-level
math.

~~~
eggy
That's why so many people point to Spivak's Calculus book, since it has been
said that it should really be called an analysis text. Strang's Calculus,
like others, starts with real-world examples. Personally, I had to go the
Strang route and then come back to Spivak, since I had to admit to myself
that I don't have the ability to think 100% abstractly; I need motivation
from the real world.

------
ccvannorman
What a great article. Paraphrased "Math is a designed thing, for humans and by
humans, not an absolute truth." Also this post is the BEST introduction to
linear algebra that I have seen.

~~~
musername
To assume that math is itself the absolute truth, which actual mathematics
supposedly leads to, is a circular syllogism. It's saying that any kind of
math that doesn't reveal an absolute truth isn't real math.

------
ky3
You can play creatively in a particular nexus of math and software engineering
called Djinn [0], the Haskell program that writes your Haskell programs for
you.

1. An ancestor of Djinn is automated theorem proving. Why can't machines
prove math theorems for us? This quest goes back to the dawn of computing
science.

2. A more recent development is the Curry-Howard correspondence. Programming
in a (typed) FP language is like playing Tetris. Solving symbolic logic
problems [1] is also like playing Tetris. Djinn exposes the connection in a
REPL you can play with, and you can watch the computer play Tetris for you!
(A small illustration follows below the list.)

3. Don't want to install Djinn? No problem, just hop over to the Haskell IRC
[2]. Lambdabot has a working Djinn plugin.
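
As a small illustration of the Tetris analogy, here are the sorts of types
you can hand to Djinn (a sketch; the function names are mine):

    -- Each type is a propositional tautology read through
    -- Curry-Howard; each body is the proof Djinn would find.
    modusPonens :: (a -> b) -> a -> b
    modusPonens f x = f x

    sCombinator :: (a -> b -> c) -> (a -> b) -> a -> c
    sCombinator f g x = f x (g x)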

[0]
[https://hackage.haskell.org/package/djinn](https://hackage.haskell.org/package/djinn)

[1]
[https://www.coursera.org/course/intrologic](https://www.coursera.org/course/intrologic)

[2]
[https://wiki.haskell.org/IRC_channel](https://wiki.haskell.org/IRC_channel)

------
matheweis
I've had a similar thought to the author's, and often wondered: could we
develop alternative systems for intermediate-to-advanced mathematical
concepts that would make them easier to parse?

~~~
Yomammas_Lemma
Category theory?

~~~
charlieflowers
It has been called many things ("general abstract nonsense", etc.), but I
don't know if it has ever been called "an easier way to explain things to non-
mathematicians." :)

~~~
eli_gottlieb
Nobody thinks category theory is easy for nonmathematicians, but the point of
it is that it provides a common language for talking about common behaviors
across many different, _seemingly_ disparate, fields of mathematics, _for
mathematicians_.

~~~
charlieflowers
Fair enough.

------
abc_lisper
Very good article. Having studied computer science as my sole field, I am
starting to realize how much I have missed out by not getting an alternative
take on things.

~~~
wmichelin
I think the same could be said for almost any profession or field of study.

------
euske
The other day I realized that a man-made law is also a bit like mathematics or
computer software. It is carefully designed and constructed. Ideally, it is
intended to work like a machine with as little room for human discretion as
possible. And just like in mathematics, adding another "axiom" to the law
has far, far-reaching consequences.

~~~
johncolanduoni
> And just like in mathematics, adding another "axiom" to the law has far,
> far-reaching consequences.

Actually mathematicians virtually never do this. Almost all of mathematics
(from arithmetic to calculus to category theory) operates in the confines of
ZFC[1] and adds no further axioms. All of these fields may add _definitions_ ,
but these are just shorthand; they are conservative and have no actual
consequences. It is more like fixing Newton's laws and then experimenting with
all the machines you can build with them.

[1]:
[https://en.wikipedia.org/wiki/Zermelo%E2%80%93Fraenkel_set_t...](https://en.wikipedia.org/wiki/Zermelo%E2%80%93Fraenkel_set_theory)

~~~
eli_gottlieb
>Actually mathematicians virtually never do this. Almost all of mathematics
(from arithmetic to calculus to category theory) operates in the confines of
ZFC[1] and adds no further axioms.

Except for all the non-ZFC foundations, like type theory or category-theoretic
foundations.

~~~
cwzwarich
Relatively speaking, there are very few mathematicians using those systems to
discover new mathematics as opposed to studying them for their foundational
interest.

------
a3_nm
It is also interesting that there are many parallels between software
engineering and the design of mathematical proofs (or theoretical CS proofs,
which I am more familiar with).

In theoretical CS, people talk of catching and fixing "bugs" in proofs,
namely, mistakes that make the proof fail but can hopefully be fixed while
sticking to essentially the same idea.

One can "refactor" proofs, in superficial ways (e.g., renaming of concepts),
but in deeper ways also, e.g., extract part of a proof to make it an
independent lemma that you can reuse (or "invoke") from other parts of the
proof. One often tries to "decouple" large proofs into independent parts with
clearly defined "interfaces", that the reader can understand separately from
each other, though this usually implies a tradeoff (a more tightly integrated
proof requires more mental space but is usually shorter overall).

One can think of the statement of sub-results (lemmas) as providing an
"interface" to invoke them elsewhere, which you try to "decouple" from the
actual "implementation", namely, the way the lemmas are really proven. It
takes practice to find the right way to abstract away the essence of a result
to state it correctly, without burdening it with implementation details, but
without forgetting an important aspect of the result that will be necessary
later. As in software engineering, once a result is proven, you stop burdening
your mind with the implementation and mostly think about the statement (i.e.,
what the result is supposed to be doing) when using it.

In software engineering, one must decide which part of the code is
"responsible" for checking certain properties of the objects, and that code
may "assume" some preconditions on its inputs and must "guarantee" some
postconditions on its outputs. In the same way, in proofs, one often wonders
where certain conditions should be verified. Should they be part of the
definition of the object? Does this lemma enforce more conditions on the
object than what is guaranteed by its statement?
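
A small Haskell sketch of that question of "responsibility" (the names and
the sorted-list invariant are mine, purely for illustration):

    import Data.List (sort)

    -- One option: make the property part of the definition of the
    -- object. If the constructor is not exported, only mkSorted can
    -- build a value, so every Sorted value really is sorted.
    newtype Sorted a = Sorted [a]

    mkSorted :: Ord a => [a] -> Sorted a
    mkSorted = Sorted . sort

    -- Consumers may now "assume" the precondition without rechecking.
    minimumS :: Sorted a -> Maybe a
    minimumS (Sorted [])      = Nothing
    minimumS (Sorted (x : _)) = Just x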

The parallel is not perfect. In software engineering, you can rely on the
computer to check that your code is correct, and to execute it. In mathematics
you rely on other humans to do this and check that they are convinced by your
proofs. This means you can get away with appeals to human intuition which are
not fully formal, but on the other hand there is no safety net when you make
an error in your reasoning, no reality check that you can invoke to avoid
exploring erroneous consequences. Also, this does not apply to all types of
proofs; but it applies especially well to proofs that describe a construction,
i.e., a way to "build" a certain abstract object, often to justify that an
object with a certain desirable set of properties exists.

~~~
thinkpad20
If you are writing programs in Coq or another dependently typed programming
language, these parallels between math and programming are not just
incidental; they are one and the same. Theorems are stated as functions, whose
type signatures reflect the theorem being proven. Even the gap that you
mention at the end of your post is bridged, since the type checker will verify
that your proof is correct (modulo the trust in the metatheory of the language
itself, and its implementation). Things like refactoring, choosing more or
less specific statements of theorems (e.g. more or less abstraction in a
library), interfaces, etc, all become exactly analogous to the same kinds of
problems faced in software engineering. It's really quite remarkable.
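
A tiny illustration of "the type signature is the theorem" in Lean (Coq would
be analogous; the theorem name is ad hoc):

    -- The type is the statement, the term is the proof, and the
    -- kernel checks it; this is the "safety net" mentioned above.
    theorem and_comm' {p q : Prop} (h : p ∧ q) : q ∧ p :=
      ⟨h.2, h.1⟩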

(Disclaimer: I'm hardly an expert in these languages; I just dabble.)

~~~
asgard1024
You are, of course, correct. It's strange no one brought up
[https://en.wikipedia.org/wiki/Curry%E2%80%93Howard_correspon...](https://en.wikipedia.org/wiki/Curry%E2%80%93Howard_correspondence)
yet. In a very real sense, programmers are creating mathematical proofs!

------
rdlecler1
Understanding programming languages definitely helped my understanding of
Math. Smarter people than myself can do this the other way around, but I
always needed to understand the why before it could start to stick. I didn't
really understand programming languages until I could dig into the source code
and the standard libraries to see how and why everything was done. The problem
with modern math teaching is that it starts with fully baked axioms and
doesn't walk you through the process of discovery before it was all cleaned
up into a neat, terse explanation. One exception is a great book from the
40s called Mathematician's Delight. It was recommended to me by my Yale
professor and I highly recommend it.

------
currentoor
Interesting article. I always thought math felt like programming but in a
language far higher level than any of the available programming languages. So
like programming but with a lot less friction when going from thought to
symbols.

For example, creating new domain-specific control flows with Lisp macros
versus defining the Dirac delta function using limits and integrals. In
programming it's easy for bugs to seep in because there are more little/subtle
details and leaky abstractions. Math, on the other hand, feels much more
abstract and clean.

Perhaps this is just because dumb silicon boxes interpret our code and humans
interpret our math which gives us a much more sophisticated base language to
work with.

------
agumonkey
Not trying to evangelize, but FP was a great hint for that. Seeing 'tangible'
(things I can create, see, step through) incarnations of groups, monoids,
transitive relations, etc. gave an operational grounding to abstract algebra.
That's something some of us need before we can see the abstraction behind the
notation and understand it.
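
For instance, a monoid made tangible as something you can actually run (a
Haskell sketch; MaxInt is an ad hoc name):

    -- Int under max, with minBound as the identity element: an
    -- associative operation plus a unit, i.e. a monoid.
    newtype MaxInt = MaxInt Int deriving Show

    instance Semigroup MaxInt where
      MaxInt a <> MaxInt b = MaxInt (max a b)

    instance Monoid MaxInt where
      mempty = MaxInt minBound

    -- mconcat [MaxInt 3, MaxInt 7, MaxInt 5]  ==>  MaxInt 7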

~~~
vezzy-fnord
The perception of FP as being somehow more "math-y" is nothing but a bias.
There is no intrinsic magical property of "mathiness", it's just that the
operational semantics of functional languages are much more well-defined.
Logic programming is itself firmly based in axiomatic semantics and has a
similar _a priori_ system of reasoning to it, though distinct from FP.
Imperative languages can be modeled well on Hoare logic, too, and Dijkstra did
plenty of research on it, known as predicate transformer semantics. It is
comparatively understudied, though.

~~~
alexlarsson
Also, many purely mathematical models of computing, like Turing machines,
are definitely not functional.

~~~
Yomammas_Lemma
Well, nobody is proposing that Turing Machines be the foundation of
mathematics, whereas the sort of type theory used in FP is being taken pretty
seriously as a foundation.

------
vayarajesh
Any more recommendations for books on math? I am a web developer and I want
to learn math from the basics (I only did math up to 10th grade).

I have a keen interest in neural networks, and they require a good foundation
in math.

------
vezzy-fnord
[http://c2.com/cgi/wiki?DisciplineEnvy](http://c2.com/cgi/wiki?DisciplineEnvy)

------
Dramatize
I never got into programming growing up because I thought it involved complex
math. I wish I had known it's mostly logic rather than algebra.

------
louithethrid
Is there legacy code in mathematics?

------
throwaway593492
Created a throwaway just to post this.

I've never taken a college-level linear algebra course, and I've found
KhanAcademy's linear algebra course to be a good gentle introduction. I just
skip the parts I already know.

------
linky123
We only hire people with math degrees for our company (parallel supercomputing
field). CS people generally don't have the required skills.

------
brandonium21
Yay

