I write software that runs on various jet engines. There is a lot of software that goes onto a modern commercial engine, but the general theme of the software I focus on is the modeling of the underlying engine dynamics using sensor data. It's very math- and physics-based. A thorough knowledge of linear algebra, signal processing, regressions, and clustering/neural network algorithms is essential, among other things.
It's not the kind of environment where typical startup aphorisms apply ("fail fast, fail often", "don't be afraid to pivot", etc). So every month or so when another "programmers don't need to know math" article comes out, usually written by a web programmer, I have an impulse to represent the other side of the divide, but I usually find so many misconceptions and poor assumptions in the original article that I conclude it's too much work. So I am glad that the author has done the job for me here.
What I have observed, again anecdotally, is that web programmers are generally more vocal about programming than systems programmers and scientific programmers. It makes sense in the historical context of the internet, but I don't think that this bias is properly accounted for when people on Twitter/HN/etc discuss "programming" and what it requires.
I don't think it should be controversial that programming is founded upon the study of formal languages, mathematics in particular. So even if you don't need math to do your programming work on a day to day basis, it's because a lot of very smart people have solved some very difficult math and language problems over the decades so that you have the luxury of ignoring the mathematics your code relies on. This is all ok. But the implicit hostility towards mathematics that a lot of these articles demonstrate really makes me concerned about the influence it will have on the next generation of programmers. Moore's Law has allowed a certain level of indifference to mathematics in the past few decades since you could always throw a newer processor and more memory at a problem, rather than solving it via a better algorithm. But that situation won't last forever.
Personally I have the impression that many web programmers, especially those who have never worked with anything other than the web, are ignorant of the existence of other branches of programming.
In many aspects programming for the web is simpler. For example, the quality requirements for a microwave oven's firmware are much higher. You can't easily deploy a hotfix to production after shipping 100k units. I have a feeling that most of the complexity in web programming is incidental. It's a result of how poorly the web is designed as a platform. Of course it's very successful commercially, but from a technical standpoint it isn't good.
john_b, could you give some advice on how to get into the aerospace industry? It always seemed very challenging and I'd love to have a chance to work there.
I've made numerous rebuttals (many right here on HN) to claims that software development is not engineering, citing aerospace examples to the contrary. Often the actual task of programming is as simple (or even simpler) than web programming, but there is a wealth of process that surrounds it, to, as you suggested, make sure that what goes out the door is as correct as possible the first time.
Having also done some web and mobile development on the side, I see software development across a spectrum: ranging from throwaway hacks to cheesy mobile entertainment software to quality mobile entertainment software ... etc ... to embedded medical software and flight control software and such.
While all of these activities are software development, it would be erroneous to classify them all as the same thing. Best practices in one region of the spectrum may well be ludicrous if tried in another region.
How to get into aerospace? Apply for a job? Anyone with a computer science degree or equivalent experience [though the degree may be "required", depending on the company] would qualify for an entry-level position in most aerospace fields that I am familiar with. Another approach, if you're more web savvy, might be to get a job in an aerospace IT department doing internal web development and such, and then transfer out into some other project later.
> I write software that runs on various jet engines. There is a lot of software that goes onto a modern commercial engine, but the general theme of the software I focus on is the modeling of the underlying engine dynamics using sensor data. It's very math- and physics-based. A thorough knowledge of linear algebra, signal processing, regressions, and clustering/neural network algorithms is essential, among other things.
> I've made numerous rebuttals (many right here on HN) to claims that software development is not engineering, citing aerospace examples to the contrary.
I don't get this pattern of argument. If the question is "is programming math" or "is programming engineering," it is not an answer to point out that some programming tasks require engineering or math skills. Doesn't this simply establish that certain problem domains require those skills while others don't? The same pattern of argument could be used, I would think, to "demonstrate" that writing is math or engineering: after all, if you're writing a textbook on engineering you sure need to know something about math and engineering. But surely nobody would accept that writing, therefore, IS either of those things. Or pick a social science: knowledge of math sure does help if you want to perform a quantitative sociological study, but I don't think that this makes sociology math.
One might make a somewhat stronger claim about programming: math and engineering skills will make anyone a better programmer. I think this is probably true, to some extent, and this is what is at issue in the articles' debate over the usefulness of mathematical measures of complexity. But this also does not establish that programming "is" either a branch of math or engineering.
Then again, I really have no idea what turns on any of these distinctions or, therefore, why we would be debating them for any reason beyond philosophical curiosity. Isn't it enough to say "no, programming isn't math [or engineering], but like many other disciplines, programming bears some conceptual similarities to math, and a grasp of mathematics [or engineering] will make you a better programmer"?
Well, somewhat notoriously, there has been a rolling debate for better than a century now over whether "higher" sciences simply are complex subdisciplines of "lower" sciences. E.g., are all biologists ultimately just physicists? How about psychologists? Economists? The proper meaning of "are" itself in that sentence also, rightly, is the subject of significant dispute. But the analogy isn't the best since I think that philosophers enjoy that question a lot more than actual scientists.
A more interesting example may be law and economics. There has been a concerted movement by certain economists/legal theorists to show either that law is simply applied (or misapplied) economics, or that it should be. (Here's a taste: http://www.law.berkeley.edu/faculty/cooterr/PDFpapers/stratc...) Again, though, the analogy is imperfect. I don't think that many would claim that law "is" economics, but rather that legal rules are or should be explicable in strictly economic terms. Though maybe this would satisfy some definitions of "is."
Pharmaceutical research and clinical trials are another field in which software development is more like proper engineering. I worked in a niche that was not very math heavy, but intensely process heavy (out of necessity). First you design a system that can never fail. Then you assume it will fail and design another system to make up for it. Repeat over and over until you reach the desired level of safety.
I think the dividing line between whether an industry practices programming or engineering is the question "Is it possible people might die as a direct consequence of system failure?" Eventually, someone will die and folks will search for answers. When the investigators/plaintiffs come around, you must be able to produce documentation explaining why you thought your system was robust enough to rely on for a safety-critical task.
Edit: Perhaps a good rule of thumb to distinguish between programming and engineering is this: are you productively working full-time and producing more than 20-50 lines of high-level language code per day? If yes, you are programming. If no, you are engineering.
> I think the dividing line between whether an industry practices programming or engineering is the question "Is it possible people might die as a direct consequence of system failure?"
Commercial aviation products are developed according to the requirements of DO-178B (or increasingly, DO-178C) which defines several levels of criticality at which a software component could exist. The most severe indicates that software failure could result in loss of life. The least severe indicates that a software failure wouldn't interrupt anything notable (often used for in-flight entertainment, for example). As you go up the chain, the amount and stringency of process increases.
> Edit: Perhaps a good rule of thumb to distinguish between programming and engineering is this: are you productively working full-time and producing more than 20-50 lines of high-level language code per day? If yes, you are programming. If no, you are engineering.
I like this. As a web developer I am often expected to write closer to an order of magnitude more lines per day than that. But maybe our high level languages aren't high level enough and I'm mostly writing bloat.
I can't believe that in 2014, web development is still getting this mindless, indiscriminate bashing. The kind of web development you might have seen, or done, may have been simpler, but please don't generalize it to everything. Every field has its low ends and its high ends.
I'm generally speaking a web developer, I work in bioinformatics and I can assure you that what I do is not "simpler" than most other programming, please get over yourself.
Your microwave example actually undermines the point you're trying to make. If microwaves are indeed so much more technically advanced, why then is it so hard to upgrade them? I know it's a stupid question, but so is the point that brought it up.
I've spent 15 years having to justify the worth and quality of my work to non-web developers, a lot of whom I wouldn't even trust with the most basic programming tasks. That included some of my teachers at University (!), who not only undermined me all the way through my dissertation, but actually proceeded to steal my work and use it as a teaching base for years after. This was after they admitted that they could not, in fact, fully understand my work, mostly because they disregarded it for the longest time.
I'm pretty damn proud of being a web developer, and even though I do or have done other kinds of programming, I'll stick to web development, if only to prove to people like you that we, too, are programmers.
> To me it sounds like you have a bioinfo library that you've integrated with a website.
??? This is the hostility the above poster commented about; why did you assume that they were using unsafe / uninventive practices? From what? How did you read their development of a complex piece of software as "you made glue code", without external information, or, perhaps, internal bias?
> If your bioinfo code is actually mixed in with your web code then this is exactly what real programmers talk about when we trash web programmers.
Why would you frame a hypothetical situation, unrelated to their comment, as evidence of your beliefs? How does this make sense?
It's pretty clear people use "web programmer" as a synonym for "poor programmer" or "lesser programmer".
Oh, you work on hard problems and complex software? I wouldn't call that web programming!
I promise there are enough hard problems and complexities in the "web" for any programmer in existence to choke on. HTTP/CSS is just a small part of the "web"; it doesn't speak to the complexities of the system behind the curtain.
Well, he has a point. From your description, it seems tenuous to classify what you do as 'web programming'. So why don't you describe how your bioinfo work relates to the web part, because all I can think of is that you have software for which you simply have a web frontend.
From his current job's description (Clinical Genomics Database Web Developer) [1,2]:
"The Clinical Genomics Database/Web Developer performs a variety of duties in the development of interactive websites, databases and interactive tools for the interpretation of genetic assays performed in a clinical setting. The Clinical Genomics Database/Web Developer also participates in the development of interactive tools for interpretation of genetic and genomic assays in support of developing high-throughput, clinically accredited pipelines to aid in identifying known variants in hereditary cancers, and in assisting in research towards the development of personalized treatments for other cancers.
Duties/Accountabilities:
1. Development of interactive websites to allow for sample submission, clinical report retrieval, and analysis interpretation.
2. Development of databases for tracking of patient samples, and preparation of reports for Clinical staff and physicians.
3. Development of interactive tools to allow for interpretation of results from genetic and genomic assays.
4. Quality-assurance testing for websites and databases.
5. Development and assessment of bioinformatic tools for detection of genetic variants implicated in oncogenesis."
Quite intense. Far from a mere "presentation layer", it seems in this case the work involves developing actual clinical research tools --which just happen to be web-based.
> Personally I have the impression that many web programmers, especially those who have never worked with anything other than the web, are ignorant of the existence of other branches of programming.
I see this again and again in the articles that show up here. Occasionally, mobile apps are acknowledged (although they usually mentioned when the author is trying to make a point about trivial social media apps).
> I have a feeling that most of the complexity in web programming is incidental. It's a result of how poorly the web is designed as a platform. Of course it's very successful commercially, but from a technical standpoint it isn't good.
It seems that most of the technical challenges that people run into in this field can be divided into two categories:
* Needing to accomplish something difficult by developing, implementing, or otherwise employing novel techniques, algorithms, data structures, automation, etc.
* Needing to hack around poorly designed or obsolete software, such as CSS hacks to get older versions of IE to render a page element correctly.
First, a disclaimer: the aviation industry is very large and surprisingly diverse. Different practices exist in different parts of it (engines, airframers, rotary-wing aircraft, etc), so it's hard to give general advice. Everything below is based on my personal experience with only the engine subset of the industry.
At the moment it seems like a pretty good time to get into the industry. Hiring is very cyclical, but the companies I follow are generally hiring now. If you know somebody at the company, or even a contractor with the company, that would help of course. With the larger companies, contractors often do a lot of the actual technical work while the primary company manages various projects and contractors, with some of the more ambitious research-y projects conducted in-house (often by M.S. or PhD holders).
Aerospace companies did not begin as software companies, so even though many of their competitive advantages today depend on the quality of their software, a lot of people at these companies--especially older ones who pursued a management path--will not be able to assess your specific technical capabilities (nor will they try). If you have any code on Github, you can mention it as a plus, but some of the people you talk to won't know what a Github is. Some people will understand how desperately their company needs competent software developers, while others will be more interested in how you solve problems in a general sense, communicate & interact with groups, and function in an environment with complex processes and high standards.
You will generally be expected to have a B.S. degree in some form of engineering, though physics and mathematics majors are also hired in smaller numbers, depending on the company. A graduate degree doesn't hurt either. At my company and the companies I work with, there is currently a bias towards more traditional engineering degrees: mechanical, electrical, obviously aerospace; but CS majors are hired as well. The latter are needed most urgently in my opinion, as the former group often ends up writing software that doesn't really require their skills or background, but which could be better written by someone with a CS background who had a strong interest in the application area.
You make a good point, and it worries me somewhat. It manifests itself not just in discussions about math, but also in discussions about fast runtimes, algorithms, and data structures. There is a lot of vocal self-reinforcement of the ideas that speed matters little, algorithms matter little, data structures matter little.
The situation reminds me of an aphorism. I am sure there would be equivalents of this story in other cultures. The story behind the aphorism goes like this:
A worldly-wise frog visits his friend, a frog living in a well. The frog of the well wants to know how big the world is, and proceeds to ask by jumping across increasing fractions of the well's diameter, asking each time, "Is it this big?" Around the point where he jumps from the side of the well to its center, the frog proclaims, "Now, if you say the world is bigger than this, you are just bullshitting me."
I see this a lot in CRUD programmers: just because they haven't found a way to exploit some algorithms, they think those algorithms are universally useless. Not only that, influenced by these oft-repeated lines, they do not even try to find a use that would make their code more efficient. Some are more radical and propose that these topics should not be taught in CS courses. Redis is proof that CRUD can benefit from all of these. Similarly, I have also seen many compute-oriented programmers think CRUD is trivial. In that case I would just slide the keyboard toward them and ask them to write something that scales, is performant in terms of speed and latency, and is robust and frugal with hardware resources.
> web programmers are generally more vocal about programming than systems programmers and scientific programmers
I'm not sure you would find systems programmers doing much more math than web programmers. Sure, they'll focus on optimization and a few algorithms, but much of their job involves plumbing bits from point A to point B.
There are programming jobs that involve math, but programming itself is not math. It is more like plumbing and architecture. You are right in that much of the math is already done for you, in well encapsulated components that you might want to know about but often can forget. But saying "programming is math" is like saying "biology is physics"; it might be true at some level, but it is not often very useful.
> "But saying "programming is math" is like saying "biology is physics"; it might be true at some level, but it is not often very useful."
I agree and actually thought about making this analogy. I think some people like to debate these things for philosophical reasons, and this kind of hierarchy of fields is of interest to such people. Others seem to approach it from a more pragmatic and anecdotal perspective and often take issue with the classification of their field as, e.g. "math-based" or "language-based", as though it reflects on their talents (required or actual) as an individual.
The reaction of the latter group reminds me of pg's essay:
As programmers, we use lots of techniques and wear many hats. We sometimes do engineering, design, and math; sometimes we are detectives (especially when debugging or coming up with requirements); sometimes we are writers and communicators.
That is why I react to "programming is math" so negatively; it belittles the other tasks that I have to do much more often than math.
The issue to me is more that our education system is so shitty these authors don't even know what math is or that they're doing it every work day. We combine abstract symbols and attempt to logically reason about how that will affect what our programs do. That's math. It's not separable from programming. The only other option is these authors are proud of the fact they engage in some weird cargo cult attempt at programming.
That is a definition of 'math' that is so broad as to be utterly useless. Other things that are 'math' by this standard include baking cookies from a recipe and writing poetry.
I'm unsure of why so many people have such an urgent need for programming to be math, but generations of people have written software, some of it quite substantial, who have no mathematical education at all. Redefining what math is to get these works under the canopy reeks of No True Scotsman and is pointless.
There's plenty of math in CSCI, and I will state unequivocally that any particular writer of software would only be improved by understanding more math than she currently does. In fact, I believe that about anyone, anywhere, in any field. But it doesn't mean that you're doing math while writing your Django app.
The fact that you do it by writing software is incidental. Obviously mathematical skill is necessary to write software that models and controls complex physical/mechanical systems. No one is saying it isn't. That's a strawman.
Many problem domains, however, are not mathematical in nature. I don't need to know anything about calculus, have accurate hand-computation skills, remember probability formulas, etc. to write a database application, for example.
But then obviously I would need these things to write and validate a financial modeling/risk assessment system.
I (and many others) would argue that when math becomes relevant to programming, it is because math is relevant to the problem domain in which you are working. Programming, by itself, has little to do with math (or, more precisely, skill in computation-based math classes and exams is not a proxy for programming skill.)
There is actually more to it. There are aspects of programming that are no different from procedures in mathematics even if the application area of the code is not 'mathematical'. As has been said in the comments [0], the process of debugging code is really not that different from solving/proving those Euclidean geometry exercises/riders we had in school. Now, the person debugging the code may not be explicitly conscious of this, but if the person is any good, he/she is using the same procedure: assume axioms; make hypothesis; prove or disprove using observations, deduction/induction; repeat till you have reached your goal.
>he/she is using the same procedure: assume axioms; make hypothesis; prove or disprove using observations, deduction/induction; repeat till you have reached your goal
Absolutely! This problem-solving process is very similar to what we were asked to do in physics and chemistry on a nightly/weekly basis. I've found chemistry problem sets to be very similar to programming.
However, what you describe is not math, not as it is taught anywhere between first grade and Calc III. Being "good at math" is exactly equal to being good at manipulating symbols (first numbers, then variables) according to a memorized set of rules and procedures. The higher you go, the weirder the rules get. Symbol manipulation and the memorization of probability and geometry formulas are also the entirety of the "skills" represented by standardized tests of mathematics.
Programming is about coming up with the rules and procedures by which the computer should manipulate symbols. It's really completely different.
Excellent mathematical skills would make me a good compiler. Not a writer of compilers, but a compiler. Of course, that might be useful if I wanted to work on one (I'd be able to check its work by hand), but in practice, I'm sure that sort of work is done billions of times faster and with much greater accuracy by computers than by humans anyway.
> not as it is taught anywhere between first grade and Calc III
This is the problem when most programmers talk about math. Everyone seems to think that math after Calculus is just more abstract versions of calculus, with more complicated symbols and more complicated rules for manipulating those symbols.
I once talked to an engineering student who, after finding out I'm a mathematician, boasted to me that he took both differential equations classes, so he was going to tell me some cool things about math. I was sitting with my friend who is getting her PhD in dynamical systems; the irony was rich and totally lost on him.
Anyone who says "being good at math is exactly equal to being good at manipulating symbols" is completely ignorant. I suggest you update your beliefs about mathematics, because right now you're the butt of the irony.
If, by arguing that math skills are important to programming, you mean "skills in post-Calculus math", then say so. If you say "math," people will continue to believe that you mean math in the way that they experienced it, i.e. math ending at or shortly after calculus.
Lots of mathematicians freely admit that they're terrible at computation. Which is precisely my point - the two represent different skills. Being good at symbol manipulation (i.e. "math" as represented by your high school transcript, SAT, ACT, and college transcript if not a math major) is not a determinant of your aptitude as a programmer.
Is it useful to study math beyond calculus? Of course! I know (abstractly) that things like the typed lambda calculus and set theory exist and are the foundation for much of what we do. However, it is ignorant to talk about these things like they're related to your aptitude for pre-Analysis math.
But it isn't rational to use performance in lower math classes to screen (computer science undergrad applicants|interns|programmers), or to encourage people to choose or not choose programming as a career path based on their performance in mathematics classes.
Your comment made it very unclear whether you believed what you were saying about mathematics or not. Rereading, it still seems like you think conjecture and proof is more like physics than math. And you claim that math is irrelevant to compiler design. It's not "abstractly" relevant, it's directly relevant. Parsing is formal language recognition. Register allocation is literally graph coloring. These things are not third-generation applications of mathematics. You seem to think that mathematics makes you good at computing, when in truth most mathematicians are bad at computing things by hand.
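The "register allocation is literally graph coloring" point can be made concrete with a toy sketch. This is illustrative only (real allocators use far smarter heuristics, e.g. Chaitin-Briggs); the function name and greedy ordering are my own choices, not from any particular compiler:

```python
# Illustrative sketch: register allocation as graph coloring.
# Nodes are variables; an edge means two variables are live at the
# same time and so cannot share a register. Greedy coloring gives
# each variable the lowest-numbered "register" unused by a neighbor.

def greedy_color(interference):
    """interference: dict mapping variable -> set of conflicting variables."""
    assignment = {}
    for var in sorted(interference):  # deterministic visit order
        taken = {assignment[n] for n in interference[var] if n in assignment}
        reg = 0
        while reg in taken:
            reg += 1
        assignment[var] = reg
    return assignment

# 'a' and 'b' conflict, 'b' and 'c' conflict, but 'a' and 'c'
# are never live together, so they can share a register:
colors = greedy_color({"a": {"b"}, "b": {"a", "c"}, "c": {"b"}})
```

Greedy coloring is not optimal in general (graph coloring is NP-hard), but it captures the structure of the problem: the interference graph is the mathematical object, and register assignment is a coloring of it.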
The root misunderstanding might be that most people wouldn't know math if it hit them in the head, but changing my definition of math to fit their misconceptions is certainly not going to fix the problem. I would even prefer it if people thought, "math? yeah I have no idea what goes on in that subject" to what you describe. Because distinguishing between "math" and "post-Calculus math" (the latter of which is almost all of math) won't help anyone.
You are basically claiming that everything is math, which is true, but useless. Programming is more like chemistry or biology on the purity axis... not "like them", but at a similar level.
Writing parsers, which I do a lot, requires very little parsing theory...ya, they make for good academic papers, but nothing beats good ole simple recursive descent in flexibility for error handling. And for register allocation, you might use - gasp - an algorithm, but that doesn't dominate your work - writing a compiler involves some math (among other tasks), but is not mathematical activity.
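For readers who haven't seen it, the "good ole simple recursive descent" being praised looks roughly like this. This is a toy evaluator for a hypothetical arithmetic grammar, not anyone's production code; precedence falls out of which function calls which:

```python
# Minimal recursive-descent parser/evaluator for the toy grammar:
#   expr   -> term (('+' | '-') term)*
#   term   -> factor (('*' | '/') factor)*
#   factor -> NUMBER | '(' expr ')'
# Each nonterminal becomes one function; error handling is just an
# exception raised exactly where the input stops making sense.

def parse(tokens):
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat(tok):
        nonlocal pos
        if peek() != tok:
            raise SyntaxError(f"expected {tok!r} at position {pos}, got {peek()!r}")
        pos += 1

    def factor():
        nonlocal pos
        t = peek()
        if t == '(':
            eat('('); v = expr(); eat(')')
            return v
        if isinstance(t, (int, float)):
            pos += 1
            return t
        raise SyntaxError(f"unexpected token {t!r} at position {pos}")

    def term():
        v = factor()
        while peek() in ('*', '/'):
            op = peek(); eat(op); rhs = factor()
            v = v * rhs if op == '*' else v / rhs
        return v

    def expr():
        v = term()
        while peek() in ('+', '-'):
            op = peek(); eat(op); rhs = term()
            v = v + rhs if op == '+' else v - rhs
        return v

    result = expr()
    if pos != len(tokens):
        raise SyntaxError("trailing input")
    return result

parse([2, '+', 3, '*', 4])  # → 14, with '*' binding tighter than '+'
```

The flexibility the commenter mentions is visible here: any parse function can inspect state and produce a tailored error message, which is harder to do from a generated table.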
Mathematicians can be bad or good at programming, just like musicians can be...there is no strong correlation to their aptitude based on their previous training.
> The root misunderstanding might be that most people wouldn't know math if it hit them in the head, but changing my definition of math to fit their misconceptions is certainly not going to fix the problem.
Math can be defined so broadly as to basically be a useless word. Is sociology math? Is chemistry math? Is physics math? Is engineering math? Is implementing an algorithm math? Meh, if so, then whatever, we haven't made any progress.
> Writing parsers, which I do a lot, requires very little parsing theory...
I am currently writing an Earley parser. When I'm done, you will indeed need little math to use it. However, I had to grasp several non-trivial mathematical insights to write this damn tool (most notably graph search). And I'm not even done: currently, my parser only handles grammars like this:
A -> B C D
A -> B 'x' E
B ->
etc.
I want to handle the full Backus Naur Form, however, so I will need to compile BNF grammars down to a set of production rules. This will involve things very close to lambda lifting, which is rooted in lambda calculus.
Math is the main activity in writing this parser. The code is merely a formalization of such math.
I believe the result will be worthwhile: the error handling should be just as good as manual recursive descent parsing (RDP). Parsing code using my framework should be about a tenth the length of RDP. It will not be limited to LL grammars, unlike RDP. And you will still need very little math to write a parser, just like with RDP. But that's because I will have swept the math under the rug.
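For context, the textbook Earley recognizer fits in a few dozen lines; this sketch is the standard algorithm, not the commenter's actual tool. The three operations (predict, scan, complete) are where the graph-search flavor comes from. One hedge: naive handling of empty rules like `B ->` is a known subtlety (Aycock and Horspool describe the full fix); the repeated pass over each item set below covers the simple cases:

```python
# Sketch of an Earley recognizer. A grammar maps each nonterminal to
# a list of right-hand sides (tuples of symbols). An item is
# (head, body, dot, origin): how far we've matched body, and where
# the match started.

def earley_recognize(grammar, start, words):
    chart = [set() for _ in range(len(words) + 1)]
    for body in grammar[start]:
        chart[0].add((start, body, 0, 0))
    for i in range(len(words) + 1):
        added = True
        while added:  # repeat until item set i stops growing
            added = False
            for head, body, dot, origin in list(chart[i]):
                if dot < len(body):
                    sym = body[dot]
                    if sym in grammar:  # predict: expand a nonterminal
                        for b in grammar[sym]:
                            if (sym, b, 0, i) not in chart[i]:
                                chart[i].add((sym, b, 0, i))
                                added = True
                    elif i < len(words) and words[i] == sym:  # scan
                        chart[i + 1].add((head, body, dot + 1, origin))
                else:  # complete: advance items waiting on this head
                    for h, b, d, o in list(chart[origin]):
                        if d < len(b) and b[d] == head:
                            new = (h, b, d + 1, o)
                            if new not in chart[i]:
                                chart[i].add(new)
                                added = True
    return any(h == start and d == len(b) and o == 0
               for h, b, d, o in chart[len(words)])

# e.g. the ambiguous grammar S -> S '+' S | 'n' :
grammar = {"S": [("S", "+", "S"), ("n",)]}
earley_recognize(grammar, "S", ["n", "+", "n"])  # → True
```

Unlike recursive descent, this handles left-recursive and ambiguous grammars directly, which is the trade the thread is arguing about.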
---
I don't know what kind of compilers you write, but I'm surprised to hear you say math is not the main activity. We're talking about semantic-preserving transformations here, how could it not be math?
Or, you already know all the math, and applying it doesn't feel like "math" any more.
I do Bret Victor style interactive programming environments. I've developed a set of tricks over the years and they are all quite simple. It really is programming in the classic sense and not so much Greek symbols on the whiteboard.
Also, what I do is not very well understood, so there is no good theory for it yet and a lot of open questions to investigate. So it's more "experiment and measure" than "find a proof that will tell you for sure the right thing to do".
> I've developed a set of tricks over the years and they are all quite simple.
Actually, so is Earley parsing. The more I study this little piece of CS, the more I see how simple this really is. This is why it feels so much like math to me: hard at the beginning, then something "clicks" and everything becomes simpler.
Your "set of tricks" are probably similar. Knowing nothing about them, I'd bet their simplicity is rooted in some deep, abstract, yet simple math, just waiting to be formalized:
> what I do is not very well understood, so there is no good theory for it yet and a lot of open questions to investigate.
And how do you plan to further your understanding, or find good theories? It can't be just psychology and cognitive science. I'm sure there will be some math involved, including proofs.
My set of tricks is more like: trace dependencies for work as it is done, put work on dirty list when dependency changes, redo work, + some tricks to optimize previous steps. It is really hard to interpret that as math, especially if the word "math" is to remain meaningful and useful. It is all math at some level, but so is everything.
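The trick described reads roughly like this as code. All names and structure are illustrative guesses, not the commenter's actual system, and dependencies here are declared explicitly rather than traced automatically as work runs:

```python
# Sketch of dirty-list recomputation: record which derived values
# depend on which inputs, mark dependents dirty when an input
# changes, and redo only the work on the dirty list.

class Incremental:
    def __init__(self):
        self.values = {}  # input name -> current value
        self.rules = {}   # derived name -> function of the inputs
        self.deps = {}    # input name -> set of derived names
        self.cache = {}   # derived name -> last computed value
        self.dirty = set()

    def set_input(self, name, value):
        self.values[name] = value
        self.dirty |= self.deps.get(name, set())  # invalidate dependents

    def define(self, name, fn, reads):
        self.rules[name] = fn
        for inp in reads:  # declared dependencies (traced, in the real thing)
            self.deps.setdefault(inp, set()).add(name)
        self.dirty.add(name)

    def get(self, name):
        if name in self.dirty or name not in self.cache:
            self.cache[name] = self.rules[name](self.values)  # redo work
            self.dirty.discard(name)
        return self.cache[name]

inc = Incremental()
inc.set_input("x", 2)
inc.define("double", lambda v: v["x"] * 2, reads=["x"])
inc.get("double")      # computes → 4
inc.set_input("x", 5)  # puts "double" on the dirty list
inc.get("double")      # recomputes → 10
```

As the commenter says, this is plain bookkeeping: conservative invalidation plus memoization, with no whiteboard math required to use it.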
Thank you. This is what I was trying to say, but better articulated.
I would say that math done by mathematicians would be more indicative of skill, but most people who are/are trying to be programmers haven't really tried that, so it's impossible to assess them based on it.
If you still write parsers as recursive descent parser by hand, then this is because NOT ENOUGH MATH IS APPLIED IN PRACTICE (by programmers who think they don't need it).
No, it's because incremental performance and error recovery are more important than raw performance and expressiveness. If you think Dr. Odersky doesn't know enough math... not to mention most production compilers out there, written by the best professionals in the field. Reality is a harsh mistress.
My parsers, by the way, do things only your parsers could dream of:
Yep. Well, see my managed time paper; basically we use no fancy algorithmic tricks and it all works out fine. There are two ways to do incremental parsing: the hard way, based on some fancy delta algorithm, and an easy way, based on simple memoization and conservative recomputation.
Academics especially often overthink these problems when the simple solution often works fine, and performance-wise you'd have to mess up pretty badly before parsing becomes a noticeable bottleneck.
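The "simple memoization and conservative recomputation" idea mentioned above can be illustrated with a packrat-style cache: results are memoized by (rule, position), and an edit conservatively throws away every cached result that could have seen the changed text. This is a toy reconstruction of the general idea, not the commenter's actual system; `Parser` and `digits` are made-up names.

```python
# Sketch of incremental parsing via memoization + conservative invalidation.
# Assumption: a rule only examines text between its start and end positions
# (real systems track the furthest position each rule actually looked at).

class Parser:
    def __init__(self, text):
        self.text = text
        self.memo = {}   # (rule name, start pos) -> (result, end pos)

    def apply(self, rule, pos):
        key = (rule.__name__, pos)
        if key not in self.memo:
            self.memo[key] = rule(self, pos)
        return self.memo[key]

    def edit(self, at, new_text):
        # Conservative recomputation: drop every cached result that starts
        # at or extends past the edit point; keep everything else.
        self.text = new_text
        self.memo = {k: v for k, v in self.memo.items()
                     if k[1] < at and v[1] < at}

def digits(p, pos):
    # Example rule: a run of one or more digits.
    end = pos
    while end < len(p.text) and p.text[end].isdigit():
        end += 1
    return (p.text[pos:end], end) if end > pos else (None, pos)

p = Parser("123abc")
print(p.apply(digits, 0))   # ('123', 3)
p.edit(1, "1x3abc")
print(p.apply(digits, 0))   # ('1', 1)
```

The cache does all the incremental work; nothing about the grammar or the edit algorithm needs to be clever, which is the "easy way" the comment describes.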
> I would even prefer it if people thought, "math? yeah I have no idea what goes on in that subject" to what you describe. Because distinguishing between "math" and "post-Calculus math" (the latter of which is almost all of math) won't help anyone.
Why not?
Most educated Americans did "math" for a minimum of 13 years of their lives. It would be immensely useful if people who control gates like employment and admission understood that the math relevant to computer science and the math they did/can measure in candidates are different things.
The fact that most mathematicians are bad at computing things by hand is the key takeaway. Hand computation skill doesn't mean you should be a mathematician or programmer. Lack of hand computation skill doesn't mean you shouldn't.
> However, what you describe is not math, [not as it is taught anywhere between first grade and Calc III]
I think this is where we disagree then. That is pretty much what Euclidean geometry is, and I would indeed file that in the 'math' cabinet. Euclidean geometry has very few axioms, and all of the theorems are just consequences of those axioms and of choosing and applying truth-preserving operations on them (in other words, reasoning, or as you called it, 'problem solving').
Say you are tasked with deciding whether the angle bisectors of every triangle meet at a point inside the triangle (in programming you may be asked to find out whether a piece of code will ever crash). You seek out theorems that you think would be useful and then try to prove or disprove them. It might turn out that a theorem wasn't useful after all; then you try to prove another theorem that you now think will be more useful. In debugging that is pretty much exactly what you do, and the same goes for ensuring properties of code: the code will not crash, the pointer will not be null, etc.
I was not schooled in the US but I would guess it is not that different here.
> Programming is about coming up with the rules and procedures by which the computer should manipulate symbols.
How do you think mathematicians come up with a set of axioms and how to operate on them? Think about how mathematicians came to use imaginary numbers; it is quintessentially the same process, i.e. "coming up with the rules and procedures ...[to]... manipulate symbols". Is it really as fundamentally different as you make it out to be? I am hesitant to say that programming and math are one and the same, but I would claim that their methods are the same: that the processes in programming, no matter what application you are programming, are at least a subset of the fundamental processes of mathematics.
Furthermore, what you are talking about is one aspect of programming, the synthesis part of it. The other is debugging, the deductive aspect, which is no less a part of programming, and again the methods are indeed the same. Whether you call them theorems or not, whether you write them with symbols on paper or not, when you are debugging you are indeed manipulating symbolic objects, and proving or disproving theorems based on rules, exactly like in math. For example: "if my assumption about the initial conditions and the function foo() is correct, then 'a' ought to be 42. If it is not, either my reasoning is wrong, or the function implementation or the initial condition is wrong. OK, so it was not 42..." And you keep going like this, manipulating your conjectures and observations using the rules of mathematics, more precisely logic with '&', '|', 'for all', 'there exists'.
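The debugging-as-deduction loop in the comment above can be made literal in code. Everything here is a hypothetical stand-in (`foo`, the initial condition of 20): the point is only that the assertion encodes the conjecture "if my assumptions hold, 'a' ought to be 42", and a failure refutes one of the premises.

```python
# Debugging as hypothesis testing: the assert is the conjecture, and a
# failed assert is a refutation telling you some assumption was wrong.

def foo(x):          # the implementation under suspicion (hypothetical)
    return x * 2 + 2

initial = 20         # assumed initial condition (hypothetical)
a = foo(initial)
assert a == 42, f"conjecture refuted (a == {a}): some assumption is wrong"
print("conjecture survives this test")
```

Each debugging step either strengthens confidence in the premises or pinpoints which one to revise, exactly the prove/disprove cycle the comment describes.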
Consider writing tests and choosing which tests to write; it's the same process. Consider drawing conclusions from the tests; consider using the type system to encode properties you desire in the code; again it's fundamentally the same deal. One may be aware of it, or one may be doing it implicitly without being aware of it, but regardless, it's still the same process.
I would argue the match with math is better than the match with sciences like Chemistry or Physics, because there the rules have been set by nature. In programming you choose the rules and try to do something interesting with those, much like in mathematics. For actual computers you do have to co-opt nature into executing those rules.
=============
EDIT: @superuser2 replying here as I dislike heavily nested threads with cramped widths.
> I have never been asked to do anything like this in a math class.
I upvoted your comment, and now I understand more of where you are coming from; it seems that there is a difference in the way math is taught in schools where you are from [I am guessing the US]. We would typically do this stuff in grade 7, and it is taught in school, so anyone with a school education would be aware of it (the quality of instruction varies, of course; in fact it varies wildly).
> Geometry is an exception, as you state. However, geometry is one year of many, and was extremely easy for me. I'm talking about Algebra, Algebra II/Trig, and Calculus.
I think this sheds more light. For me at least the most difficult homework and tests were in geometry, and also the most gratifying. Most of my schoolmates would agree. I think we had geometry for 3 years (I cannot remember exactly), very lightweight compared to Russian schools. I would however encourage you to think about solving simultaneous linear equations; it is again deduction at work, the only difference being that its scope is so narrow and we know the procedure so well that we can do it by rote if we want to. We also had coordinate geometry, which was about proving the same things but with algebra, but we had that much later, in grade 11.
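To make the point about simultaneous linear equations concrete: each elimination step is a truth-preserving deduction, the same move a geometry proof makes. A small worked example (the particular system is made up for illustration):

```python
# Solve the system
#   2x + 3y = 8
#    x -  y = -1
# by elimination, each line a deduction from the previous ones.

# Subtract 2 * (second equation) from the first to eliminate x:
# (2x + 3y) - 2*(x - y) = 8 - 2*(-1)   =>   5y = 10   =>   y = 2
y = (8 - 2 * (-1)) / (3 - 2 * (-1))

# Back-substitute into x - y = -1:
x = -1 + y

print(x, y)   # 1.0 2.0
```

That this can also be done "by rote" is exactly the comment's point: the procedure is so well understood that the deduction becomes mechanical.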
BTW post high-school 'analysis' is different though, it is not devoid of such logical reasoning but its focus is different.
...and thanks to you I now know a little bit more about the methods of Chemistry.
> and proving or disproving theorems based on rules, exactly like in math, example: "if my assumption about initial conditions and the function implementation is correct then 'a' ought to be 42. If it is not, either my reasoning is false or the function implementation or the initial condition is wrong. Ok so it was not 42..."
Chemistry is like this. "Cadmium would be consistent with the valence electrons required for the bond to work. If the mystery element is Cadmium, then the mass would be xxx, but it's yyy, so we can rule that out..." Of course I've forgotten the specifics, but the method is just like it is in programming.
I have never been asked to do anything like this in a math class. Equations are sometimes called theorems, but that's the extent to which this sort of analytical thinking ever factored into a math class up through Calculus. (There was a little of this in geometry, and a really fun two weeks of propositional logic, but geometry was still mostly formula-driven: numbers of vertices and such.) I gather that's the sort of thing Analysis is about, but I'm not talking about math for math majors, I'm talking about math as most people experience it.
You memorize some formulas. You're given some formulas. You recognize the information you're given and apply the procedure you were taught to convert it to the information you're asked for. That's it. (Geometry is an exception, as you state. However, geometry is one year of many, and was extremely easy for me. I'm talking about Algebra, Algebra II/Trig, and Calculus.)
Memorizing and drilling steps is not how you become good at debugging - thinking is. Holding the program in your head, tracing it, asking yourself how you would design it and then figuring out where the problems in the logic are. This is nothing close to recognizing the situation as belonging to a particular category and then blindly cranking the computation for that category of exercise to reach the answer.
> I have never been asked to do anything like this in a math class.
This is a tragic comment. It is also true. American children waste their time and their brains in American math classes for about 8 years each. They learn to multiply and maybe solve a quadratic equation. "Equations are sometimes called theorems," and that's it. And it's not getting better; as a culture, we don't understand or respect math so we don't support it.
Modern mathematicians don't memorize or drill steps. We don't just find a bigger number each day by adding one. We hold our problems, trace them, ask ourselves how the proof should be designed, and figure out where the problems in the logic are. If you've got 2n points in a line, how many ways can you draw curves from point to point such that you match them all up and none of the curves intersect? What about if they're on a circle: is the answer different? How do you characterize a straight line on a non-Euclidean surface? What's the right definition of straight: should it mean "the direction of the derivative lies along the line" or "shortest distance" or what? And if we're talking about points on a grid, how should distance be measured anyway? Do we have the right definitions to solve our problem? Distance from streetcorner to streetcorner by cab in Manhattan is different from distance as the crow flies, which is different from "distance" through time and space; how are the concepts consistent, then?
Formulas are just shorthand for relationships. We don't teach students the relationships or the thinking in the US. That's why my Romanian advisor would ask our graduate algebra class, "You should have learned this in high school, no? No... Hmph." American school math is like prison, a holding cell until release that merely increases the likelihood you'll do poorly in the future.
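The matching puzzle a few lines up has a classic answer: the number of ways to join 2n points with non-crossing curves is the nth Catalan number, and it is the same whether the points sit on a line or on a circle. A short sketch of the standard recurrence (the curve leaving the first point splits the remaining points into two independent sub-problems):

```python
# Count non-crossing perfect matchings of 2n points: the Catalan numbers.

from functools import lru_cache

@lru_cache(maxsize=None)
def catalan(n):
    if n == 0:
        return 1
    # Point 1 pairs with point 2k for some k, enclosing 2(k-1) points
    # on one side and leaving 2(n-k) points on the other.
    return sum(catalan(k - 1) * catalan(n - k) for k in range(1, n + 1))

print([catalan(n) for n in range(6)])   # [1, 1, 2, 5, 14, 42]
```

The recurrence itself is the "hold the problem and trace it" style of reasoning the comment describes, just written down executably.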
In general I agree. But I can't help but feel that while math is essential to your job, it might not be essential to programming in general. It's similar to if someone said that having medical knowledge is essential for programming because he was working in medical imaging software.
So I agree with your conclusion (math==good) but not necessarily with your evidence.
I didn't (and wouldn't) claim that programming "is" math in any literal sense. The foundations of software engineering are heavily derived from math, but a degree in software engineering isn't necessary for all forms of programming.
I only claimed that programmers rely on the math skills of other programmers, even if they are not aware of it. Somebody wrote the compression, search, and encryption tools that even ordinary computer users use every day. Game developers often rely on 3rd party game engines to do the heavy vector mathematics that they'd rather not reinvent.
So some programming is heavily based on math, and it is often exposed only as a library or service of some sort. As long as it sits quietly in the background doing its tasks, it won't be noticed much. Declaring that programmers don't need to know anything about math is, to me, equivalent to saying that abstractions can be counted on not to leak.
I don't think this is limited to programming. I've noticed that most engineers forget their college math and engineering theory within a few years of graduating. Many are adamant that math isn't necessary for an engineering career, and many have heard this message from older engineers.
Maybe there's a similar divide, where some engineering activities use more math & theory than others. In moments of cynicism or frustration, I label the two groups "engineers" and "designers," and complain to colleagues that a remarkable amount of design work is done by trial and error.
Is it the right way or the wrong way? I don't know. I'm biased by my preference and enjoyment of math & theory, and I volunteer for tasks that use those skills.
I think we should just do away with the term 'programmer'. It doesn't really mean much anymore. A 'programmer' could be anything from someone deep into the algorithmic side of things to someone who spends most of their time writing event hooks in a UI. I don't mean to imply any level of 'who's better' in that statement, just that they are very different concerns, often with little in common other than 'writing code'. Making any generalization of programmers is bound to be wrong.
A "programmer" is already somewhat the same as a "builder" is. A programmer programs, just like a builder builds.
A programmer can wear a lot of different hats. A hacker (white hat or black), computer scientist, developer, etc., all do different jobs. You wouldn't put a hacker into a job with medical or financial responsibility, where a minor bug can be fatal or extremely expensive.
A builder can wear loads of different hats as well. A carpenter doesn't do the same thing as an ironworker, a plumber, or an electrician.
The carpenter probably could do a lot of the same things, with about the same error frequency as a hacker in a financial job would have. That error frequency would also decline as he got better at wiring, like an electrician.
That's my view of what the "programmer" word means.
I'm curious about your experience with TDD and other hip programming approaches, because I've long been wondering if "web programmers are generally more vocal" has a lot to do with their popularity.
(Disclaimer: I love automated tests, particularly integration tests, and use them wherever it seems practical. But I cannot imagine developing the software I develop with a "unit tests first" strategy...)
My experience, while nowhere close to being representative of the aviation industry at large, has been that large industrial companies are somewhat schizophrenic in this regard.
I often find myself appreciating, for example, the care that somebody put into their automated build tools so that my unexpected corner case (which I originally expected to break things) actually worked the first time without a hiccup. Yet later in the same day I will be cursing the awful, terribly broken two decade old version control system that the same company is paying enormous sums of money to license.
TDD is one of the approaches that I advocate for often, though not the extreme versions that some people advocate for. Some people get it, while others think it's more efficient to wait until the last few days of a waterfall(ish) process before running comprehensive tests (because "testing costs money"). The panic that ensues when they discover a major problem days before a big deadline is never very pretty.
I'd kill for a TDD framework in the tools I use. While you label many things as "hip programming" my industry is several years behind the fast paced methods of more popular platforms like the web.
For the record, TDD was taught in my university courses as a pretty core testing methodology. The reasoning behind it is sound, as it really helps you program to contract and forces the programmer/engineer/manager/whatever to make critical-thinking decisions ahead of time. Working on the fly isn't always practical.
Hard to get over just how far behind automation is. The big SCADA platform we use is 'disruptive' (i.e. cheaper and better than the entrenched players), yet they think we should be happy that the brand new version finally supports half-assed, internal version control.
I worked at a company with both a big web application and a bunch of hardware and firmware development. We showed the hardware and firmware developers our integration tests using cucumber, and after about a year they were test driving big swaths of their process using cucumber. (And they were better at it than we were.) My point being: with some work it can be possible to integrate the "hip" tools into your process, and sometimes it even makes sense and works ok!
All of this discussion, quite possibly because so much of the initial conversation happened on twitter, embeds a lot of assumptions into the words used that seems to result in a lot of people talking past each other.
There's "programming" that's a single person essentially performing research-level mathematics; there's "programming" that's a thousand people writing to a spec without considering anything about the structure or complexity of their code. Programming is obviously not just math, nor is it no math.
If you become defensive when someone says "programming isn't math" or when someone asserts that it is, you're failing to think critically about both the words they're using and the assumptions you're making.
Disclaimer: I have a degree in mathematics, and I'm a programmer. I can't think of the last time I used anything from my college years directly in programming itself, but I attribute a lot about the way I code and think about problems in general to having a mathematical background.
Scott Hanselman's answer is pretty good, and based on that you could say that a Coder usually doesn't know math (and in the current job market they don't need to know math to do well).
The discrete version of the fast Fourier transform essentially does a matrix-vector multiply faster than usual. If you are not familiar with matrix-vector multiplies, then think lots of multiplications and additions. Now, what the fast Fourier transform does is take advantage of the fact that xy + xz = x(y+z), which we know from middle-school algebra. Note that the right-hand side has fewer operations to perform. The other thing that the FFT uses is symmetry. It recognizes that two things it needs to compute are really the same in value, so it computes them only once.
You can understand FFT without much deep math, it is just clever use of these two properties. Study it, you will really enjoy that cleverness. It is one of those algorithms that still makes me break out into a grin.
That said, the FFT can get more mathematically involved when you want to compute it on vectors whose lengths are not powers of two. Then it gets into primality, the Chinese remainder theorem, etc. However, for basic understanding and coding you need none of these.
EDIT:
I would say don't get lost in the details, those butterfly diagrams etc. Just focus on the key idea being exploited. There are essentially just two ideas: distributivity, xy + xz = x(y+z), and symmetry/associativity.
Maybe a good idea is to sit down and derive a different way to group the multiplies and adds, using those properties, to make the computation go fast. Don't worry if it does not beat the best algorithms out there. You will just get a better feel for the underlying mechanism.
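For the curious, here is a bare-bones radix-2 FFT built from exactly the two ideas above: factoring shared multiplications out (the xy + xz = x(y+z) trick) and reusing the two symmetric halves of the output. It is a teaching sketch, not a production routine, and it requires the input length to be a power of two.

```python
# Minimal recursive radix-2 FFT (Cooley-Tukey). Input length must be a
# power of two; output matches the usual DFT with the e^{-2*pi*i*k/n} sign.

import cmath

def fft(x):
    n = len(x)
    if n == 1:
        return x
    even = fft(x[0::2])             # DFT of the even-indexed samples
    odd = fft(x[1::2])              # DFT of the odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle            # shared product computed once...
        out[k + n // 2] = even[k] - twiddle   # ...and reused by symmetry
    return out

print(fft([1, 1, 1, 1]))   # [(4+0j), 0j, 0j, 0j]
```

Each level of recursion halves the problem and shares its twiddle products, which is where the n log n operation count comes from instead of the n^2 of the naive matrix-vector multiply.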
I did implement those, but the deeper understanding eludes me. Honestly I never gave them much thought; when it comes to choosing whether to take a week and learn something properly, I would always rather code or do something tangible. Except for linear algebra, which is something I might use in the future.
> because I don't have the background, and I wish I could. Differential equations and fft come to mind.
You should just resolve to learn these things and work through a section a day.
I have a math degree but never learned these things because my department was more pure than engineering-oriented (an ODE course wasn't required), and I wanted to learn both for my day job. Turns out they are both fairly simple, and there are tons of free books. Differential equations especially have this mysticism about them before you start, and it turns out they're actually very straightforward. Almost depressingly so, until you get pretty deep into the analysis behind them.
So... here's my encouragement to you -- just do it!
> You’re right, programming isn’t math. But when someone says this, chances are it’s a programmer misunderstanding mathematics
Or a mathematician misunderstanding what's generally known as "programming" in the real world. The vast majority of programmers write webpages in something like PHP or Rails. 10 years ago it was Excel macros and VB. These are often customer interaction tasks where programming is incidental, but the workers doing these jobs quickly called themselves Software Developers or Javascript Engineers.
> Sure, you don’t need to know how to build a car to drive it
Just as car building/repair and car driving have evolved into 2 separate tasks, so have math-based programming and non-math programming. The separation occurred because there were many people who wanted to program things but didn't want to learn math. The non-math programmers vastly outnumber the math-based ones.
>The non-math programmers vastly outnumber the math-based ones.
At some level I'd love to have a different word for what I do [1] that distinguishes it from web or other kinds of programming. When I talk to someone who has only encountered web developers and I say I'm a programmer, and they say "Yeah so is my uncle! He does HTML!", it feels like they really, really don't understand. Like I've said "I design Tesla electric cars" and they've responded "My uncle assists mechanics in a shop that repairs VWs!" [2]
But at some level it's just my ego talking, so it might be best to just take the ego hit as a lesson in humility. ;)
[1] I do native C/C++ code on multiple platforms including optimization for real-time performance, as well as high speed server optimization, system security analysis, and cross-platform video game development. Not math-heavy, precisely, but still far more skilled than basic web development. I also create the occasional high-performance web page, so I can compare directly.
[2] With no aspersions on assistant mechanics, but it's a VERY different job.
I got a programming job where I rely heavily on my mechanical engineering background. For much of the day, I am programming in Python and C doing user interface work, web programming, and optimizing numerical algorithms. However, I also get to study and use solid mechanics, the finite element method, fluid mechanics, etc.
In the engineering world, a lot of entry level jobs require just a pocket calculator and a book full of standards and codes. I would hate to have wasted all that time in school learning advanced math not to be able to use it! There are a lot of ways to apply math in your programming simply through how math makes you think. Studying math revitalizes the programming part of my brain, and allows me to attack problems from new angles.
This (highly opinionated) quote [1] from Zed Shaw inspired me to pursue a career that leverages applied math and physics, as well as programming:
> Programming as a profession is only moderately interesting. It can be a good job, but you could make about the same money and be happier running a fast food joint. You're much better off using code as your secret weapon in another profession.
> People who can code in the world of technology companies are a dime a dozen and get no respect. People who can code in biology, medicine, government, sociology, physics, history, and mathematics are respected and can do amazing things to advance those disciplines.
I'm also from a mechanical background. The reason I got involved in programming is that I wanted to build something like statistical process control (SPC). Now I'm building my own accounting system instead. Math is a must in an accounting system: the Rule of 78, calculating IRR, calculating depreciation, and so on.
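As an example of the accounting math mentioned above, here is a sketch of the standard Rule of 78 allocation: month k of an n-month loan is assigned a (n - k + 1) / (1 + 2 + ... + n) share of the total finance charge, front-loading the interest. The figures in the usage line are made up for illustration.

```python
# Rule of 78: allocate a fixed total finance charge across the months of a
# loan, weighting early months more heavily. For a 12-month loan the
# denominator is 1 + 2 + ... + 12 = 78, hence the name.

def rule_of_78(total_interest, months):
    denom = months * (months + 1) // 2   # 78 when months == 12
    return [total_interest * (months - k + 1) / denom
            for k in range(1, months + 1)]

schedule = rule_of_78(780.0, 12)
print(round(schedule[0], 2), round(schedule[-1], 2))   # 120.0 10.0
```

The first month carries 12/78 of the charge and the last only 1/78, with the shares summing back to the full amount, which is exactly the kind of invariant an accounting system has to get right.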
For me, debugging or perhaps more appropriately anti-bugging a program is so much like proving a theorem that the difference is one of terminology and notation rather than essence.
"One big reason that mathematics is much more like language than programming, is that doing mathematics involves resolving ambiguities. In programming you have a compiler/interpreter that just dictates how an ambiguity resolves. But in mathematics, as in real language, you have to resolve them yourself based on context."
Along this line, I recently noticed how similar legal contracts are to programming. One partner contract I had to read over looked very much like a huge if/else-if statement, and the definitions section looked a lot like setting up variables.
After reading this article it seems the similarities between legalese and math proofs are pretty strong as well. I wonder if anyone has ever tried to process legal documents or laws with mathematical formalisms to try to find loopholes.
A loophole is when you find something unforeseen by the law. I vaguely remember the opposite problem: different laws stating different things about the same situation.
That might be a problem with laws. As for math, there is the research topic of inconsistency robustness.
This whole debate is a mess of definitions and taxonomy. If you define math as algebra/geometry/calculus/etc., obviously programming isn't all that tightly associated with math. But programming is mathematical logic, which is a subset of math, so programming is math... right? Well, except logic itself is a subset of philosophy. So programming is philosophy now as much as it is math? And what about this argument that programming is language, it's got some good points, how does that fit into all this?
Incidentally, some visible members of the "programming isn't math" camp are discouraging others from emphasizing the connection between math and programming so as not to intimidate young women away from taking an interest in programming. Folks, this position only validates the stereotype that women are not good at or interested in math.
I'd feel much more comfortable with the idea that programming is math if my anecdotal experience of programs written by mathematicians didn't show that they have absolutely no clue how to write a good program. I'm talking about aspects of design like modularity, cohesion, separation of concerns, testability, sensible use of abstraction, etc. There's clearly something more to programming than math, or there wouldn't be such a strong correlation between being a good mathematician and being a garbage programmer. I suspect the whole debate is a sign that mathematicians simply dismiss these other aspects as unimportant, count only the mathematical parts, and then come to the "surprise" conclusion that "wow, programming is all math!".
Here is an interview with John Carmack where he shares his opinion on the importance of math in game engine programming, which is one of the more math-heavy areas of programming. It's the first question asked, so you don't need to skip ahead.
Interesting! He says he got by mostly with high school math. That analytical solutions are used more in physics simulations. But that in general if you time slice enough, simple approaches work well enough.
When I was 10, I had the opposite reaction. I started writing a game, but having a busy loop just constantly figuring out inputs and updating state seemed so ugly, I gave up, figuring I obviously lacked some knowledge. After all, I couldn't write a space invaders game without "brute forcing" it. Later when I found out that's actually how pros do it, I was really disappointed.
GW-BASIC in 89 or so. I don't think you could sleep. And I didn't have much of a concept of frame either. I just thought frames were arbitrary decisions in time to look at the exact state of the world and show it. Like, that you had a bunch of equations that told you where objects would be at any given time, and when they'd do something interesting (like collide). So naturally my little loop felt stupid and I figured once I went to high school or university they'd tell me what I was missing (hah!).
That everyone else was just making up approximations, and that a lot of things don't even have analytical solutions, would have been nice to know.
In those days your CPU would be doing mostly one thing, like running your game, so the concept of sleep wasn't needed; and actually, you usually didn't have time to sleep even if you could, since you needed every cycle for calculations. To make life a bit easier, by using C/asm you could offload something like game music by setting an interrupt vector so it would keep playing nicely without you doing all kinds of timing madness. I was using a lot more math back then than now; there are libraries for everything and they hide the math from you. You could find physics approximations (very rough! they only had to look plausible, not realistic) in books and on BBSs, BUT you could never just use them as-is, since you didn't have enough cycles for that; you needed to rewrite them to fit your purpose or your software would be unusably slow.
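The "busy loop" the thread keeps circling back to is still how games work, just dressed up as a fixed-timestep loop: poll input, step the simulation in constant-dt increments, render with whatever time is left. A skeleton only; `poll`, `update`, `render`, and the injectable clock are stand-ins, not any real engine's API.

```python
# Classic fixed-timestep game loop. The accumulator lets the simulation
# "catch up" in whole DT steps when a frame took longer than one step.

import time

DT = 1.0 / 60.0   # one simulation step, in seconds

def run(frames, poll, update, render, clock=time.perf_counter):
    acc = 0.0
    prev = clock()
    state = 0
    for _ in range(frames):
        now = clock()
        acc += now - prev            # real time elapsed since last frame
        prev = now
        poll()                       # read input once per frame
        while acc >= DT:             # catch up in fixed-size steps
            state = update(state, DT)
            acc -= DT
        render(state)                # draw the latest state
    return state
```

Driving `run` with a fake clock that advances a fixed amount per call shows the catch-up behaviour deterministically, without real-time waits; that injectable clock is also what makes the loop testable.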
I agree, programming is not mathematics; however, the author seems to have mixed up the act of calculation with mathematics.
Being a good human calculator is unlikely to result in stellar programs. Having a good grasp of how to arrange and communicate your ideas, as is required in mathematics, will help your programming hugely.
Yes, there's a large gap between what is acceptable in mathematics and what's acceptable in programming, and the types of problems you're likely to encounter, but fundamentally, you'll be communicating ideas to other people and to your compiler.
Source: Computer Science and Mathematics degree, now working as a researcher.
Programming is math, in that (at least at some level), a program is formally equivalent to a math problem.
Programming is not math, in that most of the time, when I'm programming, I'm not thinking about it at all like a math problem, even if I do some formal reasoning about some aspects of the program. (Maybe here's where it shows that I'm not a Haskell programmer.)
So: Yes? No? Depends on what level you're looking at it on.
This becomes one of those discussions that's really pointless because it turns on the definitions of the terms involved, and just winds up wasting peoples' time.
>>> I'm not thinking about it at all like a math problem
The thing is, not even a mathematician always thinks about math that way. In mathematics, one needs to distinguish between a phase of exploration, where you work along a fuzzy, intuitive process, and a phase of justification, where one tries to prove one's conjectures. The second part is math-as-a-formal-system, the cliché of mathematics. So it's not a problem of definition; it's a matter of accepting mathematics as a complex process, practiced in various ways, different for each mathematician.
Fuck no, programming is not math. I learned that the hard way when I tried to take college-level calculus as a refresher, after years of neglecting my basic algebra, and assumed that my interim programming experience would be sufficient to tide me over.
The notation is similar, but mathematics is much more rigid in terms of correct results, and is sprinkled with a completely different body of vocabulary, history, facts and trivia, which may selectively overlap and intersect with computer science, but often times not with perfect analogies or exact synonyms.
Not to mention that programming is often accomplished using products distributed by large for-profit companies that have been tailored to maximize convenience for the end-user, which in many cases might be a paying customer, or a large, vocal client community that is deeply intolerant of defects and anti-patterns. Math may be assisted by tools, but the tools are often times fraught with limitations that demand careful and sometimes creative application to a given problem.
> Fuck no, programming is not math. I learned that the hard way when I tried to take college-level calculus as a refresher, after years of neglecting my basic algebra, and assumed that my interim programming experience would be sufficient to tide me over.
Calculus is a discipline that many think is not very applicable to the art and practice of programming itself.
"Programming is math" does not mean to imply "programming is any and all kinds of math".
So, I definitely felt as if I were familiarized with some of the very basic, generalized concepts I would need to pass the mid-terms and final exams, but when push came to shove, the only real parallel that was a direct one-to-one translation was the piecewise-defined function. The rest of the class was still quite the steep climb.
"Programming is one of the most difficult branches of applied mathematics; the poorer mathematicians had better remain pure mathematicians." --Edsger W. Dijkstra
Old arguments are new again because twenty something kids think that they know everything and that the wisdom of programmers from decades past is old, so it doesn't count in today's world of Hadoop cluster Node.js AWS instance startups. This is as true of meta-theorizing about programming as it is about hammering out the details of various software implementations.
> You see, Mei’s fundamental misconception is that the kind of applications that we haven’t yet automated and modularized constitutes what programming is all about.
Please forgive me if I'm making the most heretical question possible, but once we reach the level of software sophistication the author hints at ("how to automate the translation of drawn sketches to good HTML and CSS") wouldn't software also be able to create and analyze mathematical theory ("conjecture, proof, and building theories")?
I think that, despite this being a good rebuttal, it shows exactly why the first article was written. In trying to justify why programming is math, it focuses on the kinds of tasks that benefit greatly from math, and dismisses "the kind of applications that we haven't yet automated".
In real life, even in math-heavy work, that dismissed kind of task is the dominant factor.
The point of people who say "please, programming is not math!" is to get rid of the elitist mindset which holds that if you are not good at math, you must be worse at programming, and which dismisses the importance of other kinds of knowledge (art, writing, other domains, etc). Math is a part, but it is not the whole. This is the message.
The only way out of this semantic quagmire is to realize that programming is math incarnate. Mathematics in the broadest sense is nothing but logic applied to axioms. You can come up with arbitrary axioms and apply logic to them, and that's math! Sure, the connotation of "math" is specific concepts that usually have some real-world application, but that's not the only kind of math. Many times someone came up with a crazy math idea long before anyone realized its real-world application (e.g. complex numbers).
Until computers came along, math was entirely a theoretical pursuit. The universe may be described by math, but it's impossible to know whether it is actually defined by math or if we just fit the math to the universe. But programming is different, because for the first time we can take a mathematical idea from inside our heads and express it in a physical computer which executes it according to the logic we've defined.
When you look at it this way you realize that it's exceedingly silly to try to separate programming that requires math knowledge from programming that doesn't. I mean no, programming usually does not require the infinitesimal slice of mathematics known as 9th grade algebra, but it does require logical thinking which is the basis of all mathematics.
It also includes far more writing and communication than it does mathematics, and yet we don't tend to talk about it as a part of Liberal Arts (or specifically Philosophy) much at all.
I feel like the biases come from the origin of programming. But to suggest that it is somehow exclusively mathematics because of its origin is puzzling. It is quite clearly mostly words, many of which are arranged into sentences based upon a strict grammar.
It's also odd in that the original article was written to suggest that programming isn't an insurmountable field, and that many more people could probably learn it if they were not intimidated. It's odd that the response seems to be "FEEL INTIMIDATED, MORTAL!". I feel like we're seeing gatekeepers feeling threatened rather than a particularly logical argument.
> Mathematics in the broadest sense is nothing but logic applied to axioms.
mmmm. Lots of people would disagree with this :-)
> Until computers came along, math was entirely a theoretical pursuit
Not at all; mathematics always had applications. Not all mathematics, but it's certainly not the case that math was "all theory" and without serious application before computing.
> because for the first time we can take a mathematical idea from inside our heads and express it in a physical computer which executes it according to the logic we've defined.
We've had calculation assistants in various forms for centuries. Also, ballistics and construction based on mathematics are more-or-less "taking a mathematical idea and expressing it in the physical world".
Well, "computer" originally referred to a person performing the task of computation. In that sense, "Until computers came along, math was entirely a theoretical pursuit" may be accurate - it is hard to make math practical without computing anything. It's certainly not the case that it was purely theoretical until we had mechanical or electronic computers.
I disagree. The term "computer" was typically reserved for people who exclusively did often pre-defined computations, and didn't contribute much other than the computation to the mathematics at hand. That is, someone else had come up with the novel method of computation, and someone else had formulated the problem. The human computer was applying the computation to problem that someone else handed them.
There was plenty of mathematics before mathematics got to the point where we needed dedicated humans who spend their whole day computing things in a prescribed manner.
But regardless, for some reason I think the original post was referring to electrical/mechanical computers :)
I think it's the same question as whether someone cleaning their own toilet is "a cleaner" for the duration, on which I could go either way.
"There was plenty of mathematics before mathematics got to the point where we needed dedicated humans who spend their whole day computing things in a prescribed manner."
More to the point, there was plenty of useful application of mathematics before then. Which I certainly agree with. My point was that most (and possibly all?) early application of mathematics required computation.
My comment, though, was mostly agreeing with you - just picking apart a technicality to get at some tangential interesting questions.
So, I tried to think of counter-examples to "most (and possibly all?) early application of mathematics required computation" just for the sake of discussion.
I think I have only one, which is the establishment of axioms, both philosophically (as a method) and specifically (e.g. in Elements).
A revisionist history might say that choosing axioms doesn't require any computation, just a keen sense of style and close observation of the world.
But actually, I'm sure that the choice of axioms was a long and drawn-out process informed mostly by computation and checks that the computed values/proven theorems matched with physical intuition. After all, that's kind-of how it's done today, even by people who have lots of experience with formal systems.
Now I really want to read pre-Euclidean mathematical philosophy to see if I'm correct :-)
Yeah, I thought it was interesting space for speculation. 'S why I tried nodding toward it. Don't really know enough of the history to get terribly concrete, but that roughly corresponds to my understanding. Both the actual history and what might be theoretically possible are fascinating.
Everyone will have a different take on this based on the kind of software they write. The bottom line is that math is pervasive at all levels of software engineering but it impacts developers in different ways.
If you write video games or any other graphics-related application, or if you work on jet engines, modeling, or industrial engineering, you are obviously immersed in math on a daily basis. If you write servlets on a Java backend, you are probably not required to be as versed in math, physics, or chemistry.
Still, no matter how close to mathematics the software you write is, the bottom line is that you are using a programming language to do so, and all languages are rooted to various extents in math. Some are very closely connected to, and sometimes based on, specific mathematical fields (e.g. Haskell and category theory), while others are more loosely based on such principles.
I find that just like you don't need to know how an engine works in order to drive a car, you don't need to know a lot of math to be a decent developer, but it certainly doesn't hurt to read up on some of the theoretical foundations that underlie all of computer science.
The popular understanding of "math" very much is "performing the correct sequence of calculations/manipulations to solve a given problem". This is because US math is taught this way, from addition of fractions to symbolic differentiation of functions in high school.
Most people never grok that math is really about building abstractions of various levels of sophistication and saying (proving) useful/interesting things about those abstractions.
Programming fits more with the latter notion. Even web programming: we have abstractions like user interface, color theme, database table, scripts, etc. The web programmer has to construct these abstractions and then do useful things with them.
I would agree that programming and math are cognitively very similar, and both very similar to using language. However, I also understand why programmers may like programming yet "hate math". It's because their understanding of math is derived from US standard math education, which is essentially about the memorization of symbolic procedures, with very little constructive abstraction happening.
Disclaimer: I prefer to eschew labels and categorizations as they lead to heated & mostly unproductive conversations. These conversations often serve to close minds by creating stereotypes and ossifying one's often incomplete perspective on the subject matter.
That being said, this article's main examples are programming applied to domains that rely on mathematics. Domains that don't rely on mathematics tend to have less mathematics.
Programming is (mostly) about creating automatable systems or automating processes. Depending on the process, mathematics may or may not be involved. It's easy to imagine a programmer going their entire career without encountering mathematics. I personally like to automate mathematics; however, it's not "necessary" to be a programmer. Mathematics will make you more "well rounded", just as a liberal arts education would.
Learning math is not just acquiring the knowledge of math. It's learning the abilities to reason, to think, to analyze, to simplify, to deduce, and to solve. Math knowledge helps programming, but the math abilities are what propel you to the next level.
Mathematics is the study of quantity and structure. Reasoning about how to mutate some state, defined as a set of values, according to some domain-specific logic, constrained by the execution environment and invariants, is most assuredly mathematics.
Indeed. Whether the likes of Sarah Mei agree or not, the Curry-Howard correspondence holds regardless: producing a computer program that meets some specification is theorem proving at a fundamental level...
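To make the correspondence concrete, here is a loose sketch in Python (the names are mine, and Python's type hints are only a faint shadow of the typed lambda calculi where Curry-Howard properly lives): a total function of type A -> B plays the role of a proof that A implies B, and composing two functions constructs the proof that implication is transitive.

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

def compose(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
    """Given 'proofs' of A -> B and B -> C, construct a 'proof' of A -> C."""
    return lambda a: g(f(a))

# str : int -> str and len : str -> int, so the composite maps int -> int.
digit_count = compose(str, len)
```

Here `digit_count(1234)` comes out to 4, but the point is the types, not the values: the composite exists exactly because the intermediate proposition lines up.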
Software engineering requires mathematics. Software development does not. Engineers and developers both bang out code. But the similarities end there.
A developer uses a library to make a product. An engineer uses mathematical knowledge, skill, and reasoning to find an efficient representation that elegantly solves a problem. A software engineer extends this idea, using the (mostly) discrete mathematics, type and category theory, and computer science at his or her disposal.
I have respect for developers -- they are an important part of the entire software ecosystem. I feel it necessary, however, to distinguish those who use mathematics to program and those who do not.
> Software engineering requires mathematics. Software development does not. Engineers and developers both bang out code. But the similarities end there.
I agree. For instance, in Haskell you cannot just sit down and hack something together like in Python or PHP. You need to think about the whole structure of the program. That's software engineering.
Interestingly (for me) I realized that it is possible to write software in Haskell in a mathematical way which I previously considered to be possible only in an imperative way. It required a shift of my thinking about software, and I am still learning that.
Software engineering is intriguing because it raises software quality to a new level.
Actually, in Haskell, you can write some python code to "program to think" first, and then when you know EXACTLY what you are going to do, you can write elegant Haskell code with its "think to program" requirement.
Software engineering is a title no more concrete than software developer. In either case, it's useless to talk in platitudes.
A mathematical background helps in programming. It is not absolutely necessary, but personally I do not know any programmer who was good at mathematics and failed really badly at programming.
As a web developer with degrees in both math and programming, I sometimes find direct applications of math, such as backing off from updating something on the page exponentially when the user goes AFK, or more commonly, when working with visualizations (see d3.js). For the most part, my peers without much of a math background could solve the same problems in other ways. For more graphics-oriented stuff it gets a little harder but often a library plus a tutorial blog post gets the job done.
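The exponential back-off mentioned above can be sketched in a few lines. This is a generic illustration; the function name and parameters are my own invention, not from any particular framework:

```python
def backoff_delays(base=1.0, factor=2.0, cap=60.0, n=6):
    """Yield successive polling delays (in seconds) that grow geometrically,
    clamped to a maximum, for throttling page updates while the user is AFK."""
    delay = base
    for _ in range(n):
        yield min(delay, cap)
        delay *= factor

# list(backoff_delays()) -> [1.0, 2.0, 4.0, 8.0, 16.0, 32.0]
```

Nothing deep, but it is a direct, everyday application of a geometric series to a web-frontend concern.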
It's also lacking an agreed-upon definition of programming. It's even lacking an agreed-upon definition of "is". (No, I'm not trying to go Bill Clinton on you. In this discussion, what do you mean by "is"? Do you mean "identical to, equals"? Or do you mean "part of, a subset of"? If you mean identical, do you mean identical in all respects, or just in some? If so, in which ones?)
i want to say that math just is logic (which, blown up into set theory, is the major way we understand language outside of full on anthropology and incidental history/statistics)... but my early allegiance to frege would be showing. alas, we are doomed to never find the logical foundation of mathematics--godel hits us on one side, wittgenstein on the other! at least frege was a better writer than either of those two.
anyway, i love the author's point (2). i would never argue that we don't need to bother wondering about the import of death or how we feel about things, but (2) not being widely known comes largely from A) people not studying philosophy, since B) they don't realize that philosophy isn't (always) about finding (capital M, almost-supernatural) Meaning, but rather is just being rigorous in your thought and language (as well as curious). math and programming aren't magic; they admit of rigorous thought just like anything else. you needn't be a savant, and you needn't go and vilify any particular normal human capacity or object of investigation to be allowed to not be a savant. sadly, i write like wittgenstein, not frege.
hope this rambling response struck you as topical enough.
"Large amounts of effort are spent on tedious tasks in industry for no reason other than that we haven’t figured out how to automate them yet. And novel automations of tedious tasks involve interesting mathematics by rule, not exception."
Not closely related to the article's subject, but nevertheless a good point; it's one of the reasons I consider academia to be a good place to be. At least for those at the 'top' of the pyramid...
From my personal experience they are related, but I can tell for sure they are very different.
I think this confusion exists because many famous computer scientists studied math or were also mathematicians.
From my personal experience, I passed all my CS courses in college without much effort, but I struggled with calculus, statistics, and algebra classes. So I think even the mindsets you need to understand them are very different.
No you don't need to be a mathematician to learn the basic syntax, but you could be a mathematician and still learn the basic syntax. You could also be a mathematician and use programs to build wonderful and new exciting things.
Programming has no limitations other than our imaginations.
I don't get why we humans have to be so definitive before things stop scaring us.
Before there were Computer Science university schools, programming was taught at Math, Physics and Engineering schools (like the mentioned FORTRAN), it wasn't taught (still isn't) in language schools. There's some relationship between math and programming, and in any case math can be seen as a language as well.
The truth lies in the happy medium: math is absolutely necessary for some programming applications and helpful in the others. Being able to distill a problem into easy-to-reason-about abstractions is helpful in any domain of programming, but a certain level of abstract thinking can be attained without formal mathematics.
I'm okay with the 4 down votes in 4 hours -- just hoping someone could clarify why. My comment asked:
"Does this have anything to do with the female vs male perspective on programming? Maybe programming can be actually approached with different mindsets -- and it's not a case of either/or...."
I didn't down vote you but my initial reaction to your comment was can we please have a conversation related to development without dragging sex into it? I would guess that many had the same reaction.
Discrimination is an issue in tech; it's not the only issue, nor an issue which relates to every other issue. It is an issue, singular.
That is the problem: programming should be math.
If we view programming as metaphysical, we can never hope to build good software. That will change in the future.
I don't believe that 50 years from now we will still be writing code to fly a plane or run a web app while questioning ourselves: Will it run? Will it crash? Does the code do what it is supposed to do? Do these two functions do the same thing?
Certainly, as time goes on, programming will ever more be a well-defined formal system, so much so that big parts of development will simply be auto-coded (by machine).
That is why functional languages are so good. They are very close to mathematics, and allow us to build formal systems on top of them, that remove any ambiguity that might exist.
And this is (or should be) one of the reasons why functional programming is so 'hip' now: it allows us to reason about program design using the Algebra of Programming.
Of course right now we are stuck with the methods we have to build software. And that means you can have a programming job without knowing much about math, or even using it day to day.
But that will change in the future.
(My humble thoughts on it, based on the views I got from a university class called 'program calculation')
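"Do these two functions do the same thing?" is exactly the question program calculation tries to answer by proof; short of a proof assistant, a property-based spot check is the everyday stand-in. A toy illustration, with functions of my own choosing because the closed form is easy to verify:

```python
import random

def sum_squares_loop(n):
    """1^2 + 2^2 + ... + n^2, computed by iteration."""
    total = 0
    for i in range(1, n + 1):
        total += i * i
    return total

def sum_squares_closed(n):
    """The same sum via the closed form n(n+1)(2n+1)/6."""
    return n * (n + 1) * (2 * n + 1) // 6

# Spot-check agreement on random inputs; a real proof would use induction.
for _ in range(100):
    n = random.randint(0, 1000)
    assert sum_squares_loop(n) == sum_squares_closed(n)
```

A random check like this is evidence, not proof; the calculational approach the parent describes would derive the second function from the first by equational reasoning.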
> That is why functional languages are so good. They are very close to mathematics
Wouldn't it be better to say that functional languages are close to our commonly used mathematical languages? The abstract underpinnings of mathematics are not expressible naturally. We've had to invent languages to be able to communicate the ideas.
And that is pretty much what this debate boils down to: Are the languages that allows you to express abstract ideas (programming languages in this case) math or language?
Mathematics is a specific application of logic. Programming in general is closer to logic than math. But there is certainly a large set of programming that is mathematics-dependent (see above threads of people doing engineering programming).
Let me copy part of my comment to another submission:
Different people view programming very differently. For example for me programming is mainly a way to understand things. I program things I want to learn about. My understanding of programming helps me understand computers and computing. Then I have a friend who couldn't care less about how exactly things work under the hood, or in theory. All he cares about is making a product, and programming is his tool.
I think this is one of the main distinctions. Some are motivated by curiosity and love of tinkering with the machine and see it all as a playfield with toys, while some others are motivated by the opportunities to build things and make money out of it, seeing it all as a tool and material. I am not saying this is the dichotomy or that one even exists, but it illustrates that "programmer" doesn't really say all that much. We're a broad and varying bunch!
You might not want to self-identify too much with those who "couldn't care less about how exactly things work under the hood". When you've spent the best part of a day carefully explaining how NOT (a OR b) is the same as (NOT a) AND (NOT b) to the "programmer" at the next desk who got his job because someone else of the same gender and ethnicity filled in for him at the aptitude test, you'll quickly curse how HR people turn a blind eye to closely checking such stuff so they can meet their hiring quotas.
It's not even inconceivable that someone like this has conned their way into running open source software projects, perhaps even project managing a "Top 50" programming language!
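For what it's worth, the identity in question is De Morgan's law, and since Booleans are finite it needn't be argued by hand; it can be checked exhaustively in a couple of lines:

```python
from itertools import product

# Verify De Morgan's law, not (a or b) == (not a) and (not b),
# over all four possible pairs of Boolean inputs.
assert all(
    (not (a or b)) == ((not a) and (not b))
    for a, b in product([False, True], repeat=2)
)
```

Four cases, one comprehension: the kind of spot where a minute of logic replaces a day of explanation.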