Sure, mathematical thinking can be useful, but it's only one of many types of logical thinking that can be applied to programming.
I've been programming so much for so long now that before I even start writing code my mind launches into an esoteric process of reasoning that I'm not confident would be considered "thinking in math" since I'm not formally skilled in mathematics. It's all just flashes of algorithms, data structures, potential modifications, moving pieces, how they all affect each other and what happens to the entire entangled web when you alter something. Fortunately, my colleagues are often pleased and sometimes even impressed with my code, and yet I'm not so sure I would consider my process "thinking in math."
So, this isn't necessarily a direct refutation of the article. In fact, maybe what I'm talking about is the same thing as what this article is talking about. But, anyway, my point is that I feel there are more ways to think about problems and solutions than pushing the agenda of applying formal mathematics.
As an aside, I noticed this part of the article:
"Notice that steps 1 and 2 are the ones that take most of our time, ability, and effort. At the same time, these steps don’t lend themselves to programming languages. That doesn’t stop programmers from attempting to solve them in their editor"
Is this really a common thing? How can you try to implement something without first having thought of the solution?
If the only aspect of mathematics that you bring into programming is logical deduction by mechanical rules, then I doubt it will help, except for rare cases where you prove or disprove the correctness of code. If, on the other hand, you bring over the aesthetic concern, the drive to make painfully difficult ideas more beautiful (ergonomic) for human brains, then it will help you make your code simpler, clearer, and easier for others to work with.
It's common, and as you can imagine, it doesn't lead to good outcomes. When people start by coding first, it's so much work they tend to stop at their first solution, no matter how ugly it is. When people start by solving the abstract problem first (at a whiteboard, say) they look at their first solution and think, "I bet I can make this simpler so it's easier to code." The difficulty of coding motivates a bad solution if you start with code and a good solution if you write the code last.
A lot of people with particular interest in one area -- say, mathematics -- don't realize that much of what is important is much more generally applicable.
It's not that these things are distinctly important for math. It's that they are important for thinking.
For example, in law or philosophy, repeating the same argument multiple times, adapted for different circumstances, can give it weight. In math and programming, the weight of repetition is dead weight that people strive to eliminate. In law and philosophy, arguments are built out of words and shared assumptions that change over time; in math, new definitions can be added, and terms can be confusingly overloaded, but old definitions remain accessible in a way that old cultural assumptions are not accessible to someone writing a legal argument.
In physics, the real world is a given, and we approximate it as best we can. In math and software, reality is chosen from the systems we are able to construct. Think of all the things in our society that would be different if they were not constrained by our ability to construct software. Traffic, for one — there would be no human drivers and almost zero traffic deaths.
Where programming differs from math is that math is limited only by human constraints. Running programs on real hardware imposes additional constraints that interact with the human ones.
There are kind of two ideas going on here (in this thread in general), I think.
One seems to be a mindset I'd describe as taking "thinking in math" to mean glomming onto specific knowledge, like knowing linear algebra.
The other seems to be thinking in interconnections, minimalist definitions, and those abstract concepts that exist in math (and all kinds of things) for connecting discrete ideas into composite ideas.
One thing that bugs me is code with overly specific semantics, where it reads like that’s the only problem the code could solve.
Whereas if it's broken into concepts and abstractions in the PLANNING stage, the code ends up less verbose, less wedded to the description of one human problem, and more useful for a variety of problems.
So instead of code to balance a checkbook, I’d write code to add/subtract numbers and input numbers from my checking account.
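Something like this toy sketch (all names invented for illustration):

    # Overly specific: reads as if balancing a checkbook is the only
    # problem this code could ever solve.
    def balance_checkbook(checkbook_entries):
        balance = 0
        for entry in checkbook_entries:
            balance += entry["amount"]
        return balance

    # Planned as concepts instead: a general running total, with the
    # checking account demoted to just another input source.
    def running_total(amounts):
        return sum(amounts)

    checking_cents = [150000, -4217, -30000]  # amounts in cents, read from the account
    print(running_total(checking_cents))      # 115783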
I see a whole lot of code with too much specific semantic meaning. And in practice it ends with us treating code in one system as highly specific to that system, which kills any effort to reuse it.
At least that’s been my experience at work. Ymmv
I agree with your gist: there are lots of things where studying the thing is virtuous beyond its direct application. But also, I'd contend that thought is the subject of mathematics, not just a virtuous side-effect.
And as programmers we work with mathematical objects called state spaces, which have vastly more than 100 dimensions.
That said, one can easily be a competent programmer without much formal mathematical knowledge, much like one can easily be a competent ball player without knowing differential calculus. However, just as modern ball players improve their games with computer-aided mathematical analysis of their swings and so on, a programmer can improve the quality of his output via mathematical analysis, in particular via the predicate calculus and what is, in my opinion, its most useful application: loop analysis.
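To make loop analysis concrete, here is a small sketch of my own: a loop invariant stated in predicate-calculus style and carried through each iteration.

    def max_of(xs):
        """Return the largest element of a non-empty list."""
        assert len(xs) > 0            # precondition
        best = xs[0]
        for i in range(1, len(xs)):
            # Invariant on entry: best == max(xs[0:i])
            if xs[i] > best:
                best = xs[i]
            # Invariant restored: best == max(xs[0:i + 1])
        # On exit the invariant gives best == max(xs), the postcondition.
        return best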
How did he solve it? Using probability theory and sets.
It's not just games, cryptography, finance, signal processing, compression, optimization, and AI that require mathematics; tons of ordinary programming does too. Most people just don't realize it and brute-force their way to a solution.
Lots of real-world problems can be solved with algebra, calculus, Boolean algebra, linear algebra, geometry, sets, graph theory, combinatorics, probability, and stats. What typically happens is most programmers are given a problem, and what do they do? They start thinking in code. How did we solve problems before computers?
Apply that kind of thinking, then solve the problem with mathematics. Your code will often be much smaller and denser. Sure, dealing with input and output doesn't require you to write mathematical code, but the core of your problem can often be solved with some mathematics.
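A tiny illustration of that shrinkage (my own example): "find the items common to two lists," brute-forced in code versus stated as set intersection.

    # Thinking in code: quadratic brute force.
    def common_bruteforce(a, b):
        result = []
        for x in a:
            for y in b:
                if x == y and x not in result:
                    result.append(x)
        return result

    # Thinking in math: the problem *is* set intersection.
    def common_sets(a, b):
        return set(a) & set(b)

    print(common_bruteforce([1, 2, 3], [2, 3, 4]))  # [2, 3]
    print(common_sets([1, 2, 3], [2, 3, 4]))        # {2, 3}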
Unfortunately, it's incredibly common.
The result is almost always a mess: functions that are never called, parameters that are never used (they discovered their mistake while coding but never went back and cleaned up what they no longer use), broken logic, poor performance, functions with a tangle of loops and if statements nested ten indents deep.
You can tell by looking at code if they were making it up as they were going versus implementing a solution they had thought through before starting coding. It's painfully obvious.
When you try to solve your problem by coding, I think you are forced to take a myopic view of only subsets of your solution, and it's near impossible to step back at that point and come up with a nicer, more abstract, and probably more concise solution. The solution comes out spiky.
Of course when doing it like this you write a lot of code which later is unused or bad. But I think that will always happen and it's just a matter of having the discipline to continuously clean up after yourself.
Nobody criticizes the sculptor for the clay that ends up on the floor, and clay is heavy. The bits we carve away have no mass and don't need to be swept up; all we have to do is cut them away, revealing the final program.
Programming may not be (all) math, but it's not art, either.
Maybe the kind of common 8-5 office programming around business logic is not, but designing any bigger project definitely is.
How do you make a statue of an elephant?
If you don't have a reasonably detailed idea of what you want and how to achieve it, you are unlikely to get it.
That is also a useless analogy. Do bridge builders get to test and re-test their bridges in the real, non-simulated world? Can they instantly make a copy of their bridge with a few critical differences and see how the two behave? Can they re-build their bridge in minutes?
Metaphors aside, I think history is ample evidence that "coding your way around a problem" rather than conceptualizing a solution first is a perfectly valid way to approach professional programming. It's not the only way, and it has drawbacks which others have pointed out here. So does the conceptualize-first approach: you might solve the wrong problem, make something inelastic in the face of changing requirements, or fall into the psychological trap of being attached to your mental model even when it turns out that you really didn't think of everything and have to make changes on the fly.
I'm really tired of people being dogmatic about either approach ("move fast and break things/pivot; anyone else isn't really interested in getting stuff done!", "you're just a messy code monkey unless you can hold the solution in your head before you start!"). It's almost always veiled arrogance rather than honest improvement-seeking, in my experience.
> I'm really tired of people being dogmatic about either approach
Exactly - and the implication that I am being dogmatic is a straw man. I am simply opposed to arguments that depend on poor analogies.
Furthermore, all of the bad things that you say can happen if you try to think ahead are at least as likely to happen if you don't, especially if you have gone in the wrong direction for some time (I know the latter is a manifestation of the sunk-cost fallacy, but it happens a lot on real projects).
Oh wait, that is actually how architects work. In fact, at my work we have multiple CAD designers (not architects, though) and it's not uncommon for them to completely throw away a design and start over. I think code should be mostly the same.
Of course, but the Apollo 11 lunar lander was created without the aid of ubiquitous desktop computers. I imagine the SpaceX guidance/control software was written in a way that resembles bridge-building and the Apollo 11 lunar lander less, and the organic processes we see elsewhere in the software industry more.
If Neo were to build a bridge in the Matrix, chances are his processes would bear little resemblance to those of the Army Corps of Engineers.
For the guidance/control systems, I bet you're wrong.
Software is a design practice/process. Not a building process. Any analogy should be to the design phase of other engineering disciplines.
The CAD designers absolutely test whether things work. Why do you think almost every engineering bureau has 3D printers?
Sure, but it is not the only one. You are allowed to think at other levels, and it can be quite useful, especially on larger systems.
The problem with this approach is that it does not scale to large systems. If you don't spend much time thinking in the abstract about how it will work and what might go wrong, then, by the time you have written enough code to find out, you may have gone a long way down the wrong path, and not all architectural-level mistakes and oversights can be patched over.
No-one does this perfectly -- even people using formal methods will overlook things -- but, on a big project, if you don't put much effort into thinking ahead about how it should work, and try to identify the problems before you have coded them, you are likely to end up where, in fact, many projects do find themselves: with something that is nominally close to completion but very far from working. Those that are not canceled end up looking like legacy code even when brand new.
Big projects should be cut into smaller pieces where each piece can be relatively easily rewritten.
To come up with the right smaller pieces, you have to think about how they will work together to achieve the big picture. That means interfaces and their contracts, and if you get them wrong, you end up with pieces that don't fit together, and do not, collectively, get the job done.
Big problems cannot be effectively solved in a bottom-up manner, and perhaps the most pervasive fallacy in software development today is the notion that the principle of modularity means you only have to think about code in small pieces.
What do you think other engineering disciplines do? They create a proof of concept, verify it works, and then create the real thing. That is why "real" engineering companies have hundreds of tools to test stuff.
I really don't understand why people want software to be different. If you're writing some shitty throwaway web app, then sure, go ahead and don't prototype anything; just hire a "software architect" who designs something and use that.
But if you want something that actually works, then that is completely useless. Prototype, verify, start over if necessary. That is the way to write quality software.
That's beside the point. The point is that coding is not the only way to verification, especially at the architectural level.
> I really don't understand why people want software to be different.
It seems to be you who wants to be different. Making prototypes is expensive and time-consuming, so engineers try to look ahead to anticipate problems. Prototyping in software is cheaper, but not so cheap (especially at the architectural level) that thinking ahead isn't beneficial.
If it's the former, then this is part of building it. An implementation without proper testing is incomplete. If it's the latter, I actually agree. Only the most sensitive of applications require that level of sophistication, though.
The prototype is generally a mess, but I throw that out anyway.
Code, after all, is cheap (and often totally worthless). More developers should adopt this view. I’ve seen engineers more times than I would care to admit get attached to some piece of code, as if it was some piece of themselves. Code is more akin to dogshit than the limb of a dog.
In my experience, the problem levels go differently than one could naively expect. Data structures, abstractions, module interfaces - all problems dealing directly with code - are best solved first on a whiteboard, where evaluating and iterating through them is cheap and effective. User interfaces, user experience, usefulness of a part of a program - things dealing with business and user needs - are best solved through prototypes, because you can't reasonably think through them on paper, you have to have a working thing to play with.
That's what doing math is like too - just substitute axioms, mathematical objects (whether numbers, sets, rings, or whatever is under discussion), potential lemmas and approaches, what bag of mathematical tools (theorems) you can use, and how much closer to a solution you get when you shift terms in your formulae around.
Then you write it all down (if you haven't already), simplify it, and clean it up before showing it to others, just like you would code.
Also, you can map programs to proofs and vice versa: https://en.wikipedia.org/wiki/Curry–Howard_correspondence
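A minimal taste of the correspondence, in Lean (my example, not from the linked article): the same line reads either as a proof or as a program.

    -- Read as logic: a proof that A ∧ B implies A.
    -- Read as a program: a function extracting the first component of a pair.
    example (A B : Prop) : A ∧ B → A := fun h => h.left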
All code boils down to operations that can be described mathematically. Software is applied mathematics (with a sprinkle of art, perhaps). I think the reason why some people feel that programming is not closely related to mathematics, is that programmers are thinking and working on top of so many layers of abstraction, it's almost like working with the "stuff of the mind" itself, with models, processes, flows, transformations, events, composing behaviors.
That said, I relate to what the grandparent commenter is saying. Software allows me to think with visible, malleable and "living" mathematics while building up a system, to ask questions and have a dialogue with it.
>> there are more ways to think about problems and solutions than..applying formal mathematics
I agree with this. Often a "looser" approach is needed to explore a problem space, and formal mathematics may not be the best medium for creative problem-solving. On the other hand, the qualities that are valued in software - types, functional programming, test-driven development, etc. - are all about proofs. Not necessarily mathematically rigorous, but the closer you get, the more reliable the logic.
Programming's friendlier to algorithmic thinking (versus equation/identity and proof). The former's really easy for me, and while on paper (aptitude test scores) one might think the latter would be too, it's very, very not. I've only relatively late in life realized I need to reframe any non-trivial math I encounter in terms of algorithms to have any hope of understanding it. It's probably why I bounce off—understand well enough, just strongly dislike—programming languages that try to make code look more like a math paper (more focus on equality/identity and proof-like structures).
And yeah algorithms are math, but lots of math's not really algorithms and when someone writes "think in math" that mostly means "think in proofs" to me. If they mean "think in algorithms" then that's close enough to programming—as I see it—already that it's a pretty fine distinction.
Whereas actually "mathematical thinking", like coming up with a proof, is an incredibly intuition-guided process, a parallel heuristic search in the solution space, a fundamentally creative endeavour. And as your intuition comes up with promising paths through the search space, you write them down, formalize them, probably discover some corner cases you have to handle, and either continue down that path or realize that it is a dead end and you have to backtrack.
At least to me, this process is incredibly similar to programming effort. You come up with subsolutions, formalize them, fix issues revealed by the formalization, carry on with the next subsolution or realize that approach can’t work after all, and come up with something else.
There appear to be two distinct kinds of programmers that are about equally effective: ones that think through the problem first and then write down the solution, and ones that start with something close and then iteratively refine it into the desired result.
When you’re doing things like writing documentation, this is important to remember as the two kinds of programmer will approach the documentation differently — important information needs to be put where both approaches will find it: http://sigdoc.acm.org/wp-content/uploads/2019/01/CDQ18002_Me...
They group these styles as opportunistic versus systematic approaches to programming. Paraphrasing below:
Opportunistic programmers develop solutions in an exploratory fashion, work in a more intuitive manner and seem to deliberately risk errors. They often try solutions without double-checking in the documentation whether the solutions were correct. They work in a highly task-driven manner; often do not take time to get a general overview of the API before starting; they start with example code from the documentation which they then modify and extend.
Systematic developers write code defensively and try to get a deeper understanding of a technology before using it. These developers took time to explore the API and to prepare the development environment before starting. Interestingly, they seemed to use a similar process to solve each task. Before starting a task, they would form hypotheses about the possible approach and (if necessary) clarify terms they did not fully understand.
Perhaps there is little correlation between those who excel at coding at a young age and those who go on to be good programmers when they get older. I just find it interesting that at this young age I see a correlation between coding skills and language skills more than math (really just arithmetic) skills.
Another observation was that we did the Hour of Code activity in December last year with Year 2 to Year 6 students (equivalent to Grade 1 to Grade 5 in the US). And in each group there were one or two students who really stood out. And every one of them was a girl. Small sample size of only about 100 students, so maybe I shouldn't be wondering what is going on here.
As the other comment above mentioned, I think this has to do with education of the teachers. Very few teachers know what math is either.
High-level math values logical and linguistic skills.
This is often a hard stopping point for many students who were good at high school computation like calculus.
This is the standard thinking of someone who's not deep into math but deep into programming.
The two are deeply interrelated and in actuality are one and the same. Knowing math provides a deeper understanding of programming. If you want to get better at programming in general, learning every new framework or specific technology is not the path to getting better. Learning math is the path.
I cannot show you the path for you to understand it, you'll have to walk it yourself to know.
Suffice it to say that there is an area of math that improves programming in a way you can understand: type checking. Type checking, which comes from math, proves that your program is type-correct. You know it, and probably use it all the time.
To extend this, there's this concept of dependent types which also come from math. Dependent types can prove your entire program correct.
That's right: with math you can write a single proof that is equivalent to billions of unit tests touching the entire domain of test cases, proving your program 100% correct. It's a powerful capability that comes from math. It's in the upper echelons of programming theory / mathematical theory and thus not trivial to learn. If you're interested, check out the languages Coq, Agda, or Idris.
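A minimal sketch of that idea in Lean (assuming Lean's omega tactic for linear arithmetic; Coq, Agda, and Idris can express the same thing): one theorem quantified over every natural number, where unit tests could only ever sample a few.

    def double (n : Nat) : Nat := n + n

    -- Instead of asserting double 1 = 2, double 2 = 4, and so on,
    -- one proof covers the entire input domain.
    theorem double_eq (n : Nat) : double n = 2 * n := by
      unfold double
      omega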
Could you elaborate on this? Mathematics is abstract/meta enough that I would consider any type of logical thinking as part of math.
> It's all just flashes of algorithms, data structures, potential modifications, moving pieces, how they all affect each other and what happens to the entire entangled web when you alter something.
That for example sounds very much like "think in math" to me.
I may be wrong but I believe the Curry-Howard correspondence disproves your claim. One can translate between the two and find that they are equivalent.
The key to solving hard problems is being able to think concretely in abstractions. The best language we have for abstraction is pure mathematics.
One of the bad patterns in the code was very complex nested boolean logic in places, often with the same condition in several branches.
So I started using K-maps to untangle these. A few of them became much easier to read, but with some of them... with some of them it was unclear whether all the cases were addressed. So I started putting big block comments above those, but we all know what happens to block comments over time.
Much later, I would just move big conditionals like that into a separate function, and then split 'em up to look like normal imperative code instead of like math.
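A made-up before/after in that spirit (domain and names invented): the simplification is exactly what the K-map hands you, and the result is then written as plain guard clauses instead of math.

    # Before: nested boolean soup, with `locked` tested in several branches.
    def can_edit(owner, admin, locked):
        if (admin and not locked) or (owner and not locked and not admin):
            return True
        return False

    # After: the K-map collapses this to (not locked) and (admin or owner),
    # which reads naturally as guard clauses.
    def can_edit_v2(owner, admin, locked):
        if locked:
            return False
        return admin or owner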
The first rule of teamwork is stop trying to be so goddamned clever all the time. It's like being a ball hog in basketball, football, soccer. Use that big brain to be wise instead. Find ways to make the code say what it means and mean what it says. Watch for human errors and think up ways to avoid them.
Math has very, very little to do with any of that. Psychology is probably a better place to spend your time.
I think I'd call this "thinking in programming", and it seems like a great way to do it.
> Is this really a common thing? How can you try to implement something without first having thought of the solution?
A distressingly large amount of work I've done has not been greenfield development but things that might be called "maintenance" or "integration". You're not trying to draw a picture on a blank sheet of paper - you've been handed an almost-completely-assembled jigsaw, the photo on the box, and a limitless box of random pieces. Your job is then to work out which of the already-assembled pieces is wrong and which of the spare pieces can be used to fill the hole.
In this context, disposable programs are very useful for finding information about what's going on, sketching possible solutions, and finding out which plausible ideas won't work for reasons outside your control.
(e.g. this week I wrote a disposable program to use libusb to extract HID descriptors; this duplicated a library we already had but didn't trust, and enabled me to pass a problem over to the team programming the other end of the USB link.)
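For flavor, roughly what such a throwaway looks like, assuming pyusb as the libusb wrapper and with placeholder vendor/product IDs (the real program isn't public):

    # Disposable: dump a device's HID report descriptor and exit.
    import usb.core

    VID, PID = 0x1234, 0x5678  # placeholder IDs
    dev = usb.core.find(idVendor=VID, idProduct=PID)
    if dev is None:
        raise SystemExit("device not found")

    # Standard GET_DESCRIPTOR request for the HID report descriptor (0x22)
    # on interface 0; 256 bytes is an arbitrary upper bound.
    data = dev.ctrl_transfer(
        bmRequestType=0x81,  # IN | standard | recipient: interface
        bRequest=0x06,       # GET_DESCRIPTOR
        wValue=0x22 << 8,    # descriptor type in the high byte
        wIndex=0,            # interface number
        data_or_wLength=256,
    )
    print(" ".join(f"{b:02x}" for b in data))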
Some of us actually think by programming. In that sense, a REPL or notebook is probably a better medium, but the thinking is going on concurrently with prototyping.
It isn’t so much like “we are solving the problem at the same time we are writing the code for the solution” but more like “we are writing (disposable) code to help us solve the problem.”
With respect, that tells us much more about you than about math or programming.
No Haskell expert, or formal methods expert, or complexity theory expert, would ever make a statement like that.
You may be right that math is quite a distance from day to day development, of course. (I don't think I'm being pedantic here, but perhaps.)
> it's only one of many types of logical thinking that can be applied to programming.
What do you have in mind? Design patterns and software development practices, or something else?
I think if you regard logic (in philosophy) and maths (as a huge broad field) and computing (specifically a sub-field of maths, to some people), it's pretty clear that logic and computing have a huge relationship.
I can think of lots of other subfields in maths which have huge inter-relationships. Applied maths, what's that got to do with probability? Well, it turns out that modelling complex systems uses Monte-Carlo methods... (a fictional example, I suspect; I know the Manhattan Project people dreamed MC up, but its modern applicability is unknown to me)
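(For the curious, the standard toy example of a Monte-Carlo method, mine and not part of the comment above: estimating pi by random sampling.)

    import random

    def estimate_pi(samples=1_000_000):
        # The fraction of random points in the unit square that land
        # inside the quarter circle approximates pi/4.
        inside = sum(
            1 for _ in range(samples)
            if random.random() ** 2 + random.random() ** 2 <= 1.0
        )
        return 4 * inside / samples

    print(estimate_pi())  # ~3.14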
You don't think maths informs programming, or it's over-stated? I guess that's true, in as much as poetry doesn't inform legal writing. But I observe that people who do enough poetry or writing to understand the difference between a simile and a metaphor and an allegory are really on-point communicators, and the law needs that concision and precision.
I think people with good groundings in maths (and logic) make awesome programmers, but it's not strictly necessary to be a mathematician to know how to "speak" in a programming language. What pitfalls you avoid from your knowledge, I cannot say. But I do know that huge pitfalls lie in naive programming: large loops iterating over un-initialized data structures, not understanding if-then-else logic or the side effects of expressions, tail recursion...
I think computing is a sub-field in maths. How much it matters depends on how much your code matters.
Completely agree with this. I did a Maths and Philosophy degree, and I reckon the Philosophy was more useful to my career in programming than the Maths was. Although this probably depends on what kind of programming you do.
My (heavily uninformed) guess would be the constant questioning of whether our assumptions are actually true.
I found it to be not the case. Upon reading the first chapters I started wondering how this could be useful for coding. So I jumped to one of the last chapters, where they show you practical applications. Upon reading those I thought: "I can do all this in code just fine without using linear algebra".
I never touched that book again.
About two years ago or so I started making little games for the pico-8 fantasy console. There's some math involved there, but you almost never use the math formulas as you would in a textbook. For example, for something simple like drawing a straight line or a circle, finding paths, or collisions, there are very specific algorithms, and they don't look anything like a math formula, even if they are derived from one.
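For instance (my sketch, in Python rather than pico-8's Lua): Bresenham's line algorithm is derived from the line equation, but the code is all integer bookkeeping and looks nothing like y = mx + b.

    def line_points(x0, y0, x1, y1):
        """Integer points on a line from (x0, y0) to (x1, y1), Bresenham-style."""
        points = []
        dx, dy = abs(x1 - x0), -abs(y1 - y0)
        sx = 1 if x0 < x1 else -1
        sy = 1 if y0 < y1 else -1
        err = dx + dy
        while True:
            points.append((x0, y0))
            if x0 == x1 and y0 == y1:
                break
            e2 = 2 * err
            if e2 >= dy:
                err += dy
                x0 += sx
            if e2 <= dx:
                err += dx
                y0 += sy
        return points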
Just my point of view.
I'm not sure what you mean by that. I'd describe making a 3D game (engine) with rendering, collision detection, etc as probably one of the math-heaviest areas of programming outside of scientific computing or algorithm R&D.
This is exactly what programming is.
1. Are you aware that complexity analysis isn't about being precise, but about being able to predict the running time for any given input from some sample? In my own experience it's more of an analytical exercise, about calculating worst-case scenarios and the computability of the process overall. Still, it has everything to do with actually predicting values, with the grain of salt that the method is relative rather than exact.
2. Are you actually aware that math isn't about being "precise" in the sense of numbers, but about relationships between abstract entities? Ever heard of category theory, or pretty much anything related to abstract algebra?
3. Is there anything other than math that helps with abstraction, in your opinion? From what I know, even a mediocre understanding of abstract algebra helps a lot. Please note that this question is totally non-ironic; I'd really like to know.
I suspect one of the reasons is that to a casual observer, there is no difference between someone who is thinking deeply about something, and someone who is just daydreaming. They both aren't interacting with the computer and may have their eyes closed. On the other hand, "coding" by constantly banging at the keyboard and mousing around looks productive.
I am someone who thinks deeply first, and have been told off about it because they thought I was sleeping or otherwise not working.
Well, any fool can write a loop. But to do the same thing in constant time instead one might need to use some math.
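The canonical example, for illustration: summing 1..n.

    def sum_loop(n):
        # Any fool can write a loop: O(n).
        total = 0
        for i in range(1, n + 1):
            total += i
        return total

    def sum_gauss(n):
        # A little math makes it O(1).
        return n * (n + 1) // 2

    assert sum_loop(1000) == sum_gauss(1000) == 500500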