
Why there is no Hitchhiker’s Guide to Mathematics for Programmers - dominotw
http://jeremykun.com/2013/02/08/why-there-is-no-hitchhikers-guide-to-mathematics-for-programmers/
======
asolove
While there is no "royal road" to mathematics for programmers who don't care
about proofs, there is a programmer's road to proofs for those interested in
Math.

In fact, the book is even called "The Haskell Road to Logic, Maths and
Programming" [0]. It covers mathematical notation, proof construction, and
lots of interesting portions of discrete math that should be of interest to
programmers. And large portions of the results are demonstrated or used in
interesting Haskell programs.

[0] [http://www.amazon.com/Haskell-Programming-Second-Edition-Computing/dp/0954300696/ref=cm_cr_pr_product_top](http://www.amazon.com/Haskell-Programming-Second-Edition-Computing/dp/0954300696/ref=cm_cr_pr_product_top)

~~~
dysoco
That book is on my to-read list, yet I don't know what to think of a book that
starts like this:
[http://i.imgur.com/rSc0fpw.png](http://i.imgur.com/rSc0fpw.png)

~~~
ef4
The validity of that statement depends on your historical perspective.

To the modern eye, we see radically different type systems and conclude they
are as different as night and day.

But there was a time when the world was divided into people who thought
anything higher-level than C or FORTRAN was stupid and impractical, and those
who aspired to build more abstract languages. LISP was one of the rallying
cries of the latter group. Haskell clearly falls into that group too.

~~~
dysoco
Hm, that makes some sense, the book might be old, that's why.

~~~
andrewflnr
It can't be that old, since according to Wikipedia Haskell itself is only
about 23 years old.
[http://en.wikipedia.org/wiki/Haskell_(programming_language)](http://en.wikipedia.org/wiki/Haskell_\(programming_language\))

------
darkxanthos
I like how the author mentions having an executable mathematical language
might be helpful. Math is a non-standard non-machine-executable programming
language which means it's inherently buggy. On top of that, now I need to
interpret the "code" while I'm learning and it can be pretty dense.

Much like in programming, I'm sure expert mathematicians may not see why it's
so painful, but wait until a more robust language/system is developed for the
hobbyist. It will be extremely disruptive.

~~~
TelmoMenezes
> Math is a non-standard non-machine-executable programming language which
> means it's inherently buggy.

I don't understand what you mean by non-standard. I can't think of any
language more standard than mathematics.

Mathematical theorems are formally proven. That is a higher form of knowledge
than empirical validation (running programs against tests). Another case of
empirical validation is scientific experimentation. It validates theories by
increasing our assessment of their plausibility, but never up to 100%.
Mathematical proofs are the only completely objective form of knowledge.
Automatic theorem provers do exist and are employed by mathematicians, but
they are inherently limited, as shown by Gödel's incompleteness theorems.

~~~
yummyfajitas
_Mathematical theorems are formally proven._

It's formal compared to colloquial English, but completely vague compared to
computer programs.

There are automatic theorem provers, but it isn't Gödel's undecidability
theorem that limits them. It's the difficulty of formally writing down our
proofs in machine-readable format that limits them for practical purposes.

~~~
karamazov
Mathematical proofs are completely precise, that is to say, not vague in the
least. They are, however, in an extremely high-level language; it's left to
the reader to expand the notation enough to convince themselves of the
validity.

(And, of course, there can be bugs - that is, mistakes - but there is no
ambiguity.)

~~~
cwp
So the underlying ideas are precise, but they're expressed in imprecise
language? That's true of all communication.

When I ask my wife for "that thing by the door," I know exactly what I mean,
but she has to do a lot of work to decode it.

------
ivan_ah
> To learn mathematics from scratch. The working programmer simply doesn’t
> have time for that.

I disagree. Most of high school math and calculus can be learned fairly
quickly (like weeks). This is what I have been trying to do with my book "No
bullshit guide to math and physics." It has been very successful with coders:
[http://minireference.com/](http://minireference.com/)

Also on the topic of math ∩ code, here is an excellent talk by Gerald Sussman:
[http://www.infoq.com/presentations/Expression-of-Ideas](http://www.infoq.com/presentations/Expression-of-Ideas)

~~~
j2kun
The problem is that high school math and basic calculus are not much
mathematics at all, and they still have the problems of foreign notation and
ambiguity.

~~~
ivan_ah
HS math and calc are _some_ math, definitely not all, but a good chunk of
what I would call "applied math": the math that links directly to the real
world, meaning the concepts can be understood intuitively, e.g., through the
connections to physics.

The trick is to covertly throw in some formal proofs to prepare the reader for
the more abstract stuff. Going directly into proofs might be too much of a
jump for some people, though I agree that, ultimately, this is what
mathematics is about.

As for the foreign notation, I wrote an appendix which explains how to read
things:
[http://mcgillweb.alwaysdata.net/notation_appendix.pdf](http://mcgillweb.alwaysdata.net/notation_appendix.pdf)

~~~
wolfgke
> HS math and calc are some math, definitely not all, but a good chunk of
> what I would call "applied math": the math that links directly to the real
> world, meaning the concepts can be understood intuitively, e.g., through
> the connections to physics.

IMHO there's little mathematics that can't be applied to the real world rather
directly. I have even heard mathematicians say that the distinction between
pure and applied mathematics exists mostly for historical reasons. And I'd be
careful about treating connections to physics as a guarantee of intuitive
understandability: in string theory or quantum field theory, for example,
you'll find lots of highly complicated mathematics.

~~~
xyzzyz
_I even heard mathematicians say that the distinction between pure and applied
mathematics is mostly for historical reasons._

It must have been an applied mathematician, for I cannot even imagine a pure
mathematician uttering anything like this.

I mean, seriously, I know quite a lot of modern pure mathematics, and for most
of it I cannot picture even a far-fetched connection to anything existing in
the real world, let alone an actual application to solving some problem that
wasn't created only for this application.

Most of the time, in almost every field of mathematics, the way it works is
that out of an enormous amount of knowledge, an enormous amount of research
happening, and results and papers being published, only a very, very small
amount will actually get applied any time soon (soon as in the next 200
years). For some fields, like calculus, probability, or partial differential
equations, the amount of applicable material is larger (though mostly it's
just the old, one-or-two-centuries-old material anyway), and for other fields,
like homological algebra, algebraic topology, or descriptive set theory, the
applicable material is almost nonexistent. I'll buy a beer for anyone who can
point me to an application of descriptive set theory to any real-life problem.

Almost all existing mathematics is purely abstract and not applicable to the
real world, and I find it hard to imagine anyone trying to argue otherwise.
One can argue that what is now considered abstract and inapplicable may become
very useful for real-life problems in the future, and indeed that happens
quite often, but I don't think it will amount to much, since even today we
apply only a small fraction of 200-year-old mathematics to real-life problems
anyway.

~~~
j2kun
I don't think it's fair to say some idea is not applicable just because you
don't know of an application. Number theory was like that before the advent of
computers, and I'm a firm believer that there are thoughts that cannot be
thunk until the right framework comes along to allow them.

That being said, a lot of these pure subjects do have applications. For
example, algebraic geometry (the crown jewel of pure mathematics, my
colleagues would have me believe) has applications to tons of industrial
problems in the form of solving systems of polynomial equations (see homotopy
continuation). Algebraic geometry has also been applied to robot motion
planning, etc.

Descriptive set theory is applied in functional analysis and in ergodic
theory, which in turn is applied to statistical physics. Not to mention that
descriptive set theory is the pure-logic equivalent of computational
complexity theory, and that there is potential to connect the two fields and
resolve some big open problems (though it's doubtful that P vs NP will be
resolved this way).

And almost all of modern physics is based on more or less modern mathematics:
tensor analysis and other flavors of linear algebra, Lie theory, etc.
Algebraic topology is starting to find some traction in the subfield of
persistent homology, which aims to study high-dimensional data sets using the
tools of homological algebra. I even gave a talk earlier this year on the
concrete attempts people have made to apply persistent homology to real-world
problems [1]. It's still an extremely young field, but it shows some promise.

I'll give you that mathematics is extremely abstract, because I believe it.
But to say that it's not applicable, when it is being applied wherever
possible, is a bit naive. And to say that only 200-year-old mathematics gets
applied is to ignore the most applicable fields, which did not exist even a
hundred years ago: combinatorial optimization, mathematical computer science,
and modern statistics and probability theory.

[1] [http://jeremykun.com/2013/04/27/persistent-homology-talk-at-uic-slides/](http://jeremykun.com/2013/04/27/persistent-homology-talk-at-uic-slides/)

~~~
xyzzyz
I must not have communicated what I meant clearly, because you missed my
point.

Only a very, very small fraction of results in number theory are applicable.
Only a very, very small fraction of stuff done in algebraic topology is
applicable. While descriptive set theory is indeed sometimes applied in
functional analysis, the intersection between these two is not a significant
fraction of each one of them, and by the time you apply (a very small fraction
of) functional analysis to statistical physics, you're already too far from
descriptive set theory to even see it on the horizon.

I'm not saying that none of the mathematics is applicable, because this is
obviously not the case. What I'm saying is that the stuff that gets applied to
real life problems is surprisingly small, even more so when you're not a
mathematician.

When I first started to learn mathematics, I was completely overwhelmed when I
found out just how much knowledge is out there in this field of human
activity. The plains of mathematics are so vast that there's almost nothing
else in sight when you stand atop Mount Bourbaki. I realized that even if I
get a PhD in pure mathematics, I will still only be able to learn less than 1%
of the mathematics ever created in my whole lifetime. That's why, when I hear
people say that all math can be applied, I think they must not have realized
just how much of the stuff is out there.

I used to specialize in algebraic topology, and when I first learned about
persistent homology, I came across the home page of a professor at some US
university who specialized in applying algebraic topology to real life. The
first reaction of me and the classmates I showed his website to was not
appreciation of his results; we were totally amazed that this stuff could be
applied to anything _at all_. Yes, some of the applications were to problems
that are usually solved in a better way, and some problems were very contrived
and seemed to be tailored so that one could apply algebraic topology to them,
but these were still very fine and interesting results, and we were very
surprised by them.

The field of algebraic geometry actually makes an interesting example. Indeed,
it is considered by many to be one of the most abstract fields of mathematics,
if not the most abstract. Initially, at the beginning of the previous century,
people were mostly concerned with studying the sets of solutions of systems of
polynomial equations. David Hilbert, with his landmark results, the basis
theorem and the Nullstellensatz, laid the foundations of the field, and
because of an essential assumption in the Nullstellensatz, the theorem that
creates the bridge between algebra and geometry, for many decades most of the
results concerned only polynomials and sets of solutions over algebraically
closed fields; for anything else, the apparatus was simply lacking. Back then
algebraic geometry didn't have the reputation of an extremely abstract field,
especially since sets of solutions to systems of polynomial equations are
quite natural objects, and it's easy enough to imagine them occurring in
real-life problems.

In the 1950s and 1960s, though, the field was completely and utterly
revolutionized by Alexander Grothendieck and his school, and that's when it
gained its reputation for being very esoteric. Grothendieck's methods and
approach allowed algebraic geometers to tackle a vastly bigger range of
problems, and ultimately to efface the distinction between algebra and
geometry, at the price of making things much more abstract and distanced from
concrete considerations. That's when algebraic geometry expanded its reach to
encompass a large amount of research in abstract algebra and number theory.

In the meantime, another interesting thing happened in the field: the advent
of computational techniques. Things like Gröbner bases really pushed the field
forward and made a lot of theoretical machinery actually usable in practice.
This is mainly what made the things you mention in your post possible: while
many industry problems could be formulated in geometric terms earlier, only
the rise of computational methods actually allowed us to solve them.

The point here is this: algebraic geometry consists of two parts. There is the
older and more concrete part, which can be and (relatively) frequently is
applied, and the newer but more abstract part, of which very little is
applied, with notable exceptions such as elliptic curves over finite fields,
with their well-known application to cryptography, or Calabi-Yau manifolds,
which are intimately connected to string theory. Nevertheless, the applied
material constitutes only a small part of the field; the rest is just pure,
abstract mathematics without any applications whatsoever. It's still worth
noting that algebraic geometry does relatively very well in the amount of
applicable material; fields like algebraic topology, not to mention
descriptive set theory, fare much, much worse in that regard. To me, what's a
bit naive is to think that a big part of mathematics will be applied, when
there's so much of it.

~~~
j2kun
I'm aware of the categorical revolution, since I'm young enough to have been
(mathematically) raised on that perspective. Some of the more abstract
algebraic geometry is actually coming back to computational applications. See,
for example:
[http://www.researchgate.net/publication/226664601_From_Oil_F...](http://www.researchgate.net/publication/226664601_From_Oil_Fields_to_Hilbert_Schemes)

It's hard to argue that something is not applicable. Besides giving examples
of when it is applied (with you claiming it's still an unimaginably small
fraction of mathematics), all I can say is that understanding an object can
provide insights and applications in unexpected ways. Dynamical systems
inspire computer graphics, Möbius bands inspire conveyor belt design, category
theory inspires Haskell... I just don't think it's fair to ask for the
immediate applications of any given theorem, because the ultimate application
is understanding what's going on.

But thanks for the great discussion! :)

------
slurry
When I picture a Hitchhiker's Guide to Mathematics, I don't picture a book
that actually teaches you mathematics. I picture a book that will, on demand,
give you just enough information to get by in a particular area of mathematics
without blowing off a leg or something. You know, like how the Hitchhiker's
Guide to the Galaxy didn't actually make you an expert on a planet; it just
gave you some tips for having fun and just barely surviving should you find
yourself stuck there.

So yeah, a big book of formulas, algorithms and mathematical structures with
example applications complete with code snippets and exhaustive indexing. You
wouldn't learn anything worthwhile (not Real Math and not really even applied
mathematics) but it might get you out of a jam now and then.

It would take heroic effort and probably sell in the hundreds at best, and it
would be an immense challenge to keep the examples general enough while still
being useful, but there's no _a priori_ reason why it couldn't be done.

~~~
lutusp
This is a terrific idea -- a math book that avoids the pitfalls of most math
books aimed at nonmathematicians.

Such a book could cover many important mathematical ideas without necessarily
lapsing into equations and overly technical explanations. For example, it
should be possible to describe how compound interest works without falling
into an obscure technical explanation, and understanding compound interest is
very important in modern life.

Another example might explain why the stopping distance of a car is
proportional to the square of its speed. This is not well known, and it's
important for drivers to know, young ones especially.

Yet another example would explain why each member of the running sum of odd
numbers is a perfect square. Expressed in words, it's not obvious that it's
true or why it's true, but a picture conveys the reason immediately and
intuitively:
[http://arachnoid.com/example/index.html#Math_Example](http://arachnoid.com/example/index.html#Math_Example)
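The picture's claim is also easy to check mechanically; a short Python sketch:

```python
# Check numerically: the running sums of the odd numbers 1, 3, 5, ...
# are exactly the perfect squares 1, 4, 9, ...
import itertools

odds = (2 * k + 1 for k in itertools.count())
running_sums = list(itertools.accumulate(itertools.islice(odds, 10)))
# running_sums == [1, 4, 9, 16, 25, 36, 49, 64, 81, 100]
assert running_sums == [n * n for n in range(1, 11)]
```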

Just a few examples. I'm sure one could fill such a book with examples that
convey useful information, make math sound like fun, or both, without being
preachy or too technical.

~~~
wolfgke
> For example, it should be possible to describe how compound interest works
> without falling into an obscure technical explanation

I personally find these obscure technical (and highly abstract) explanations
often far easier to understand. I have often found "easy" explanations highly
illogical; not until I got the rather abstract explanations did I find them
acceptable (and even that was not always the case; almost always the
explanation for this phenomenon was that the definitions given in foundational
courses could be abstracted a lot).

How can this be explained? The reason is simple: in highly abstract
definitions, anything that is not necessary is omitted, so there is _less_ to
think about. Additionally, in this kind of definition there is a lot more
"internal logic". What does this mean? It is a little difficult to explain to
non-mathematicians, but you can be sure that everything in the definition has
a deep meaning; if something seems strange to you, it is usually because there
is something left to understand, and that something carries a deep meaning.
With "simple" definitions, on the other hand, you always have to worry
whether, when something sounds strange, it is because you haven't understood
it or because the "simple" explanation was simply bad.

Disclaimer: I'm a mathematician (as may be imagined). But I'm a computer
scientist, too. :-)

~~~
lutusp
> I personally find these obscure technical (and highly abstract) explanations
> often far more easy to understand.

I do, too, but I also know that nontechnical, nonmathematical people are
turned off by a quick immersion in mathematical reasoning. I have a theory
(not just mine, by any means) that if the beauty of mathematics could be
presented before the required discipline and attention to detail, we might
lose fewer potential future mathematicians. As things stand, the public level
of innumeracy is depressing.

> On the other hand: when using "simple" definitions, you always have to worry
> whether, if something sounds strange, it is because you haven't understood
> it or if the "simple" explanation was simply bad.

Yes, very true, one must be very careful to get it right while making it
simple. I personally think a persuasive layman's explanation of something
mathematical can go wrong in so many ways, and the more persuasive, the more
room for error. Consider all the crazy "explanations" of quantum theory out
there -- the more popular ones have no connection to reality.

------
polarix
The syntax blocker is the biggest one for me. Are there books full of classic
proofs and explanations of their syntax and shorthand? I'd be very interested.

~~~
tel
Yes!

Grab a copy of "Proofs from the Book". It's tremendous.

~~~
dagw
Proofs from the Book is awesome. Another book in a similar vein, but requiring
less of a mathematical background and targeted more at the curious layman is
"Journey through Genius". Journey through Genius also spends more time giving
general background about the players involved and telling the story leading up
to the proof.

------
beloch
When you tackle a new problem, the hardest part is often just figuring out
what you _don't_ know but need in order to solve the problem. You might find a
paper or book that solves a very similar problem (if you're lucky) but find
you just can't understand what you're reading. This is probably because the
author assumed his audience would know things you've never even been exposed
to. It would be rather hard to write much of anything if authors didn't do
this! However, it makes life rather difficult for "foreigners" to the
this! However, it makes life rather difficult for "foreigners" to the
discipline who don't have a good idea of the discipline's city layout and what
neighborhoods they need to hang out in to find help.

The ideal solution is to get a local guide to help you, but a map of knowledge
covering as much of the city as possible would be almost as helpful if it were
any good. Unfortunately, mapping cities of knowledge is harder than mapping
real cities!

It would be utterly fantastic if math (or physics, economics, etc.) books and
publications came with associated meta-data that would tell you what
dependencies are associated with what you're reading. Ideally, it should be
possible to trace the map from string theory right back to counting. This map
would be your guide to all the rabbit holes you dive into! It wouldn't perform
magic and explain quantum physics to you in a paragraph, but it would give you
an idea of how much you don't know and where you need to start.

------
kephra
I have to disagree with the basic assumption of the article.

It's often not necessary for a coder to understand why the math works, only
how to use its results. E.g.:

I've coded the FFT in 5 different languages and used it countless times.
Still, it took me ages to understand why the DFT works, and I have not yet
taken the step of understanding how the DFT leads to the FFT.

I know dozens of Second Life scripters (including me) who use quaternions
regularly, yet nobody could tell me why they work so well for 3D rotations.
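For what it's worth, here is a minimal Python sketch of how the trick is used in practice, without explaining why it works: a unit quaternion q rotates a vector v (treated as a pure quaternion) via q * v * conj(q). This is a hand-rolled Hamilton product, not any particular scripting API:

```python
import math

def qmul(a, b):
    # Hamilton product; quaternions are (w, x, y, z) tuples.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def qconj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def rotate(v, axis, angle):
    """Rotate 3-vector v about a unit axis by `angle` radians."""
    h = angle / 2
    q = (math.cos(h), *(math.sin(h) * c for c in axis))
    w, x, y, z = qmul(qmul(q, (0.0, *v)), qconj(q))
    return (x, y, z)

# Rotating the x-axis 90 degrees about z gives the y-axis:
print(rotate((1, 0, 0), (0, 0, 1), math.pi / 2))  # ≈ (0.0, 1.0, 0.0)
```

No gimbal lock, and composing rotations is just multiplying quaternions, which is why they're so handy even if the "why" stays mysterious.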

I coded my first Markov chain 35 years ago, at the age of 12, and Markov
chains were never mentioned in school at all.

I use dozens of machine learning methods on a daily basis, and have coded a
few myself, but I do not care to know why they work, only what their strong
points and limitations are.

It's not necessary to know how to design and build a car if you just need to
drive from A to B. It's not necessary to study biochemistry to be a good
cook.

~~~
j2kun
The article does not argue that anyone needs to do mathematics. It just
attempts to explain why mathematics tends to be hard for otherwise smart and
motivated programmers.

That being said, the FFT is a divide-and-conquer approach to multiplying the
DFT matrix by a vector, and it achieves n log n time by taking advantage of
the special structure of the DFT matrix.
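A hedged Python sketch of that structure (a radix-2 Cooley-Tukey FFT, assuming the input length is a power of two), checked against the naive O(n²) definition:

```python
import cmath

def fft(x):
    # Radix-2 Cooley-Tukey: one size-n DFT becomes two size-n/2 DFTs
    # plus O(n) "twiddle factor" work, giving n log n overall.
    n = len(x)  # assumed to be a power of two
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    twiddled = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + twiddled[k] for k in range(n // 2)] +
            [even[k] - twiddled[k] for k in range(n // 2)])

def naive_dft(x):
    # The O(n^2) definition, for comparison.
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n) for j in range(n))
            for k in range(n)]

x = [1.0, 2.0, 3.0, 4.0]
assert all(abs(a - b) < 1e-9 for a, b in zip(fft(x), naive_dft(x)))
```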

------
dysoco
So, could anyone suggest a nice book for a high school student (I know basic
algebra, but no calculus so far) to learn math from? The school system is not
very good here, so I have to do some of the learning by myself.

I have looked at Concrete Mathematics by Donald Knuth, would that be
complicated to understand?

~~~
ivan_ah
Check out my _No bullshit guide to math and physics_ :
[http://minireference.com/](http://minireference.com/) [$33], it covers all of
high school math, mechanics, derivatives, and integrals.

An excellent free alternative is _Calculus Made Easy_ by Silvanus P. Thompson,
which is very good and also funny
[http://www.gutenberg.org/ebooks/33283](http://www.gutenberg.org/ebooks/33283)

~~~
dysoco
That book looks amazing, I'd love to learn some Physics too, will see if I can
get it.

Read the first pages of Calculus Made Easy, looked nice so far.

------
cliveowen
If it weren't for mathematics I would have graduated this year; instead I'm
still stuck with algebra and calculus and whatnot. I don't know about you,
but every time I read a theorem it's like getting kicked in the nuts and then
punched in the stomach. The fact is, our brains are wired to excel at a
handful of things, and if math is not one of them you're out of luck. You
can't teach a person to paint, or to write, because you can't teach them
talent; the same is true with math. You just have to plough through and hope
to get better with time.

~~~
j2kun
It certainly took you many years to learn how to read and write, yet here you
are.

It sounds like you have a grudge against basic mathematics because you think
it isn't important, you're afraid of it, and it affected your life in an
adverse way. Unfortunately, that has little to do with a human's ability to
learn.

The truth is that talent means nothing in comparison with practice.

~~~
oblique63
> _It certainly took you many years to learn how to read and write, yet here
> you are._

Exactly. Most people don't realize how unnatural and difficult written
language actually is, yet here we are. There's a great book on the subject
called 'Proust and the Squid' [1]; it looks into the history, development and
neuroscience of reading, and it's quite eye-opening. There's really no reason
to believe that math/music/programming/etc. should be any different.

[1] [http://www.amazon.com/Proust-Squid-Story-Science-Reading/dp/0060933844/](http://www.amazon.com/Proust-Squid-Story-Science-Reading/dp/0060933844/)

~~~
cliveowen
If reading and writing were once unnatural and difficult skills but are now
easily picked up just because evolution has wired our brains to learn them
faster, maybe future generations will struggle less with math for the same
reason.

~~~
j2kun
I don't think you realize how much time you spent learning to read and write,
considering that every waking moment you're surrounded by things with text on
them.

Also, you have a pretty skewed idea of how evolution works. How long do you
think it took for evolution to "wire our brains" to learn to write quickly?

~~~
cliveowen
The fact that we're continually surrounded by written text only helps explain
why we don't have that much difficulty learning how to read it.

As for the time it took for evolution to rewire our brains, I'd say more than
a thousand years; it's not like writing was invented yesterday.

~~~
j2kun
What proportion of the world was literate a thousand years ago?

------
contingencies
This seems like a reasonable place to mention that years ago a guy called Tom
Henderson in Portland put up a Kickstarter project for a _Punk Mathematics_
book, which I funded, but he has taken $30,000, completely disrespected his
funders, and failed to produce anything at all. If you meet the guy, do
(verbally) thump him for me.
[http://www.kickstarter.com/projects/1541803748/punk-mathematics](http://www.kickstarter.com/projects/1541803748/punk-mathematics)

------
mtdewcmu
> My programs absolutely reeked of programming no-nos. Hundred-line functions
> and even thousand-line classes, magic numbers, unreachable blocks of code,
> ridiculous code comments, a complete disregard for sensible object
> orientation, negligence of nearly all logic, and type-coercion that would
> make your skin crawl.

It probably would have been easier to start on a language that's procedural
and maybe dynamically-typed. There are fewer traps to fall into.

------
shire
Great read. Khan Academy does solve a lot of the headaches that come with
math; thank you, Salman Khan! If it weren't for him I wouldn't have gotten a
3.8 in calculus.

For those who don't know math or want to learn, this will make your life
easier: [https://www.khanacademy.org/](https://www.khanacademy.org/)

~~~
dominotw
I am not a big fan of the 'black screen with voice-over' format. There are
much, much better alternatives for calculus these days; for example, the calc1
course on Coursera is excellent.

~~~
shire
On the contrary, I prefer his method of teaching to anything else; it can be
quite distracting when there are faces or hand movements on the screen.

------
skylan_q
Excellent read. Thanks for this.

------
Dewie
> It is an interesting pedagogical question in my mind whether there is a way
> to introduce proofs and the language of mature mathematics in a way that
> stays within a stone’s throw of computer programs. It seems like a
> worthwhile effort, but I can’t think of anyone who has sought to replace a
> classical mathematics education entirely with one based on computation.

...Curry–Howard correspondence?

~~~
j2kun
Is quite a large stone's throw away from both computer science and standard
mathematical proofs.

------
rfnslyr
_I was hit hard in the face by a segmentation fault. It took hundreds of test
cases and more than twenty hours of confusion before I found the error: I was
passing a reference when I should have been passing a pointer._

That code wouldn't compile; if a pointer is expected you can't just pass a
reference. Hello, static typing. Furthermore, you can't explicitly "pass a
reference" to begin with: there is no syntax at the call site to signify "this
shall be passed as a reference"; that is something in the signature of the
CALLEE. So his "error" literally makes no sense, as it cannot happen. If what
he means is that he wrote the callee wrong, that doesn't make sense either,
because the whole point is that you can't reseat a reference (make it point to
something else). So you would know you needed to use a pointer instead the
minute you tried to reseat a ref type.

Also, anyone who takes 24 hours to debug a segfault needs to pick another
profession and _quickly_.

~~~
humbledrone
If you have never taken 24 hours to debug a segfault, you haven't done enough
C/C++ programming.

E.g. a rare race condition causes some memory to be freed while there's still
a live pointer to it, but THAT doesn't cause a segfault immediately because
the memory is immediately reused for some other data structure (in a different
thread), so when the dangling pointer is accessed it points to allocated
memory that's being used for some other purpose. Some field in the data
structure the dangling pointer references is mutated, and then finally in
another thread that same memory is accessed via a different pointer (the one
that was recently allocated) and part of it is treated as a pointer, and
dereferencing THAT causes a segfault.

(Oh and by the way, in real life things can get much more complicated than
this before the program blows up with a segfault. In fact, you're lucky if
there's a segfault, instead of finding out months down the road that your data
has been subtly wrong the whole time.)

Ultimately the solution to this is some iterative combination of swapping in a
guarded malloc, then trying things in Valgrind, then trying various Valgrind
options, then setting some hardware watchpoints in gdb, then finding that the
race condition disappears in the debugger, then adding log statements all over
the place, then statically analyzing the code for hours while pulling your
hair out, then finally by accident noticing that padding the first data
structure with an extra field makes the problem go away, then setting the
right hardware watchpoint that actually helps, etc, etc, then piecing together
a real story for what's happening, and ultimately making a 2-line fix.

If you think you're somehow "above" this kind of horrible drudgery, and that
your immense programming skill will let you avoid it forever, then you are
just not very experienced (or you're not working on hard problems).

~~~
rfnslyr
These things are true. Things do quickly get complicated when you're trying to
replicate race conditions. What you described, though, is not trying to "fix a
segfault"; it's trying to "fix a race condition" where the fault is just a
symptom. In the example from the original article there was none of that. Even
assuming what he meant was that he accidentally mutated a pointer he passed by
reference instead of by value, which caused problems after he returned and
things fell out of scope or were freed, you could debug that in 5 minutes just
by setting a watch breakpoint on the pointer to see where it was last changed
before the errant access. An hour, tops, if that didn't occur to you to begin
with.

Tracking down multi-threaded memory corruption is tough, but those bugs are
going to be rare. Threading is a tough problem to begin with, but it's
certainly not a fault of C/C++. And whatever this guy was describing should
not have taken 'hundreds of test cases and over 24 hours'.

~~~
humbledrone
Why do you say "should not have taken?" Given the fact that he was a student
at the time, learning C++, I think it's perfectly understandable that he ran
into a "bang your head against the wall" type problem. Maybe it was his first
real segfault debugging session. I doubt you solved your first real segfault
in 5 minutes.

Sure, I'd expect someone with a couple years' experience to be able to debug a
simple segfault relatively quickly. But the only reason they'd be able to
debug it quickly is because they've _painstakingly done it before_, and can
draw from that experience.

