If you are not familiar with the author: Alexander Stepanov is the guy who basically figured out generic programming and was instrumental in the design of C++ templates and the C++ STL. C++ gets a lot of flak, but templates are very powerful (if you disregard the complexity); I think they are actually one of the main reasons why C++ is still relevant. I'm also starting to think that generic programming might be the most powerful paradigm out there (this is just a hunch). This book doesn't take the middle road: it covers only the low level (C++) and the extremely high level (abstract algebra), and cuts out the middle part (a.k.a. the boilerplate) entirely.
Funnily enough, this C++-like language actually translates very nicely to modern C++ successors like Swift and Rust (or so it seems; I'm still in the process of exploring this).
Has anyone here tried to explore the contents of this book in either Swift or Rust?
But bear in mind that this is not an easy book: I've met very smart people who told me they've read only part of it and are still wrapping their heads around it.
What is the origin of STL? Has STL been conceived to be what it is now, that is "the" C++ Standard Library, or does it come from some other project? Could you tell us a history of STL?
In 1976, still back in the USSR, I got a very serious case of food poisoning from eating raw fish. While in the hospital, in the state of delirium, I suddenly realized that the ability to add numbers in parallel depends on the fact that addition is associative. (So, putting it simply, STL is the result of a bacterial infection.) In other words, I realized that a parallel reduction algorithm is associated with a semigroup structure type.
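The observation is easy to demonstrate. Here's a small sketch (mine, not Stepanov's code) showing that chunking a reduction, the way a parallel scheduler would, preserves the answer exactly when the operation is associative:

```python
from functools import reduce

def chunked_reduce(op, xs, chunks=4):
    """Reduce each chunk independently (as parallel workers would),
    then combine the partial results. Correct only if `op` is associative."""
    n = max(1, len(xs) // chunks)
    parts = [reduce(op, xs[i:i + n]) for i in range(0, len(xs), n)]
    return reduce(op, parts)

xs = list(range(1, 101))

# Addition is associative: the chunked result matches the sequential fold.
assert chunked_reduce(lambda a, b: a + b, xs) == reduce(lambda a, b: a + b, xs)

# Subtraction is not associative: the chunked result generally differs.
assert chunked_reduce(lambda a, b: a - b, xs) != reduce(lambda a, b: a - b, xs)
```

That is the whole "semigroup structure" point in miniature: the algorithm's validity is a property of the operation's algebra, not of the data.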
The full interview can be found at:
[Edit] Interested in Rust for the same reason, but I'm not attracted to Swift. It looks so much like Kotlin that I'm not sure I understand the fuss over it, other than that it gives Apple devs a way out of ObjC.
Anyway, I am going all in on Android now, not in the hope of app money, but for the sheer number of devices out there, Google's backing, and the fact that it is more of a hacker's platform than iOS. I will most likely try Kotlin again, and this will prepare me for any possible switch to Swift.
(0) The interface of an abstract data type (set, heap, etc.) is the signature of a variety in the universal algebra sense. Implementations are concrete algebras of this variety.
(1) Homomorphisms of varieties can be used to bootstrap abstract data types from simpler ones in a generic way. Chapter 10 of Okasaki's book “Purely Functional Data Structures” is entirely dedicated to this technique, although he doesn't make the connection to universal algebra.
But there are also some differences:
(2) I'm using a functional programming language (Standard ML), rather than an imperative one (C++).
(3) I emphasize persistent data structures over ephemeral ones. I don't reject destructive updates that are compatible with persistence (e.g. laziness).
(4) I use algebraic data types to statically rule out unreachable code paths. Inexhaustive pattern matching is considered a bug. Raising an exception saying “this path was supposed to be unreachable” is also considered a bug. This would be at best very difficult to enforce in C++.
I know Alexander Stepanov personally, and Paul McJones through him, and they are both interested in seeing the ideas in the book implemented in other languages. Rust is a particularly compelling candidate, because it allows you to specify type constraints on generic functions through traits. This is similar in some ways to "concepts" which Alex has been hoping to see implemented in C++ for a very long time but which keeps getting kicked down the road.
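To illustrate the kind of constraint traits (and concepts) express, here's a rough sketch using Python's structural Protocols as a stand-in; all the names here are mine and purely illustrative:

```python
from typing import Protocol, TypeVar

class Monoid(Protocol):
    # The "trait": any type offering combine() satisfies the constraint.
    def combine(self, other: "Monoid") -> "Monoid": ...

T = TypeVar("T", bound=Monoid)

def concat(first: T, *rest: T) -> T:
    """Generic over any type providing `combine`. The bound on T plays
    the role of a Rust trait bound or a C++ concept."""
    result = first
    for x in rest:
        result = result.combine(x)
    return result

class Sum:
    def __init__(self, value: int) -> None:
        self.value = value
    def combine(self, other: "Sum") -> "Sum":
        return Sum(self.value + other.value)

assert concat(Sum(1), Sum(2), Sum(3)).value == 6
```

In Rust the same constraint would be checked at compile time; here a type checker like mypy approximates it, which is exactly the gap concepts were meant to close in C++.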
Also, a while back, I reinvented InputIterators, OutputIterators and `std::copy` in a more type-safe way, using algebraic data types: https://github.com/eduardoleon/shit . (Sorry for the repo's name!) Although I only provide Standard ML and Scala implementations, this is perfectly doable in Rust as well. (In fact, my code is linearly typed, not just affinely typed.) The only requirements for a port to another language are parametric polymorphism and algebraic data types.
Linear algebra is useful for computer graphics, but also for general "systems thinking" concepts like input spaces, output spaces, transformations, and properties of transformations.
Basic differential calculus, meeeh, but multivariable calculus—specifically optimization—is really important in many programming contexts (e.g. machine learning).
Of course, the most important and most basic of all is the notion of a function f(x), its definition, inputs, outputs, properties, etc.
It serves to review basic high school math, calculus, and even a bit of linear algebra. SymPy is very powerful stuff...
I work primarily on audio software. Linear algebra is most important, followed closely by signal processing. Those are sufficient to write good software. Great software requires statistics, calculus (mostly for optimization but also modeling), and discrete math (again mostly for optimization). Other specialties probably require more discrete math and less linear algebra and signal processing.
1: When I try to learn a new language, I try to work through some of the problems on Project Euler. I'm always impressed by the one-line math solutions some people post for problems that took me 50 lines of code.
2: I once tried to implement a timetable-scheduling web app (matching available teachers to courses). I could see that a solid understanding of set theory would have allowed me to (a) reason about the problem domain more easily and (b) find the best way to set up my SQL queries.
I recently realized that math is not necessary for programming... if you want to be stuck doing boring stuff like webdev or CRUD apps.
Also, there are many kinds of math; the Europeans even get this right over the Americans by making it plural ("maths"). Saying "math is necessary" is like saying "things are necessary": which thing? It really depends on what you are doing, and experience with one known field of math might not be useful for some problems (beyond the abstract reasoning you need to write code at all).
linear algebra | graphics, scientific computation, mostly graphics.
discrete probability | a whole shitload of perf solutions, understanding risk while planning
number theory, particularly factorization | modular arithmetic for crypto, a lot of very clever hacks for compressed representation of state spaces
predicate logic, basic set theory | I cannot count the number of times someone I work with has expressed something that required one elegant logical operation in a horribly convoluted way.
amortized analysis | just learn it
statistics | operational reasoning using performance metrics, how to test/alarm rationally, how to reason about your customers
calculus | marginal returns (important when managing a team and optimizing where to spend resources). also if you ever end up doing any kind of convex optimization, which you might, maybe. (I do, but I don't think that's super normal outside datascience).
TL;DR: there's a reason CSCI curricula look like they do.
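On the predicate-logic row above: a tiny made-up sketch of the convoluted-vs-elegant gap, with an exhaustive truth-table check that the two really are the same function:

```python
import itertools

def convoluted(a, b):
    # The kind of thing the parent comment laments: nested branches
    # spelling out what one logical operation already says.
    if not a:
        if not b:
            return False
        else:
            return True
    else:
        return True

def elegant(a, b):
    return a or b

# Two boolean inputs means only four cases: check them all.
for a, b in itertools.product([False, True], repeat=2):
    assert convoluted(a, b) == elegant(a, b)
```

A couple of identities (De Morgan's laws, distributivity) cover the vast majority of these simplifications in real code review.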
Would be great if you can write/blog about it or just provide some examples here...
There are some things you don't want to learn, and there is also an opportunity cost to learning/practicing A instead of B.
Good instincts require a breadth of knowledge, moreso than a depth in any particular domain. Precise logical reasoning requires practice and patience in any of a very wide range of fields, not exclusive to mathematics. In both cases, I think developing a careful knowledge of history or neuroscience or any rigorous intellectual discipline is effective practice, although I think a foundation in logic is irreplaceable.
Being an excellent programmer in some specific domains requires deep domain specific knowledge in applied math, which does have a steep opportunity cost. Absolutely. But I did not believe that was the question at hand.
From my experience:
Everything you publish in science/academia will be read by a wide, potentially hostile audience. It is very easy to be misinterpreted, and misinterpretation wastes a lot of time and effort.
Stating what you mean in clear, precise terms is far more work than writing things that make you sound 'smart'. Good writers make their work seem terribly obvious, but it comes at the labor of many, many drafts. Most of the hardest work in editing is helping an author subtract and simplify to get right to the point. The labor of being precise and explicit exposes errors that the illusion of understanding tends to conceal.
Young/inexperienced writers often have a gigantic blind spot when it comes to their own writing, and correcting that is a very painful process.
Also, sometimes the original author was right, and the only precise way to express something was really ugly and horrific and your attempts to make it simple did violence to a lot of careful thought.
However, the truth in many situations is that if person A spends an hour constructing a beautifully worded five-liner, and person B spends the same time producing several rambling pages, then person B's argument may well win out. Readers who have spent ten times as long wading through B's contribution are quite likely to have forgotten entirely about A's.
Definitely a lot of discrete math (e.g. combinatorics), some calculus, number theory (esp. for cryptography), coding theory (e.g. for error correction), information theory, computational complexity theory. The border between theoretical computer science and math in general is very blurry, pretty much anything that is CS pretty quickly touches on more "traditional" mathematics.
A lot depends on your area of programming, things like machine learning will definitely involve more linear algebra, calculus, statistics. If you're programming a 3d engine then again algebra, geometry, etc. Things like finite state machines have a mathematical underpinning.
EDIT: From what I've seen at a university level CS program, Calculus, Linear Algebra, Discrete Math, Group Theory, Logic and Statistics were (IIRC) the core required math courses. Doesn't mean everything you learn there is applicable to any kind of program you might write but at least that's what someone thought was important...
If you want just the maths and at a slightly slower pace, go with Concrete Mathematics. If that's a struggle, I found the first part of Discrete Mathematics and its applications by Kenneth Rosen was enough for me to make progress in Concrete Mathematics.
I also found that learning a bit of linear algebra and calculus changed how I think. I expect more fruit as I continue those studies.
Disclaimer: I know nothing about math but have been researching it recently as I want to go to university next year to study CS so I have about 8 months to 'learn' math (45 year old street programmer here :).
I would appreciate any feedback on if this post has any merit, and will be following this thread as it's relevant to my interests :)
When you say you know nothing about math, does that mean you have no Calculus? That was where I was starting. To prepare for Calculus, get solid in algebra and trig. Then do single-variable Calculus (Calc 1 & 2). My recommendation would be to do linear algebra before multivariate calculus. You can do Calc 2 and Linear Algebra at the same time.
If you have no Physics experience you might also consider studying it. Physics helps you learn by applying math concepts. Also, I think it is very good for teaching you to problem solve as an engineer.
Axler - Precalculus 2nd version
He works through every odd exercise solution, in full. The book also gives you a good intro to sets and series, summation notation, and the binomial theorem; all of this will come up later in discrete math. He assumes the reader knows nothing about trig as well.
Gilbert Strang's videos "Highlights of Calculus"
https://ocw.mit.edu/resources/res-18-005-highlights-of-calcu... just to get an overview of what calculus really is since I assume your university will throw you into an applied single variable class first year. MIT Open Courseware has their calculus (18.01/18.02) course lectures up if you want to watch them too to get an idea of what you'll run into.
"Elements of Mathematics: From Euclid to Gödel" gives quite a good explanation of Mathematics as a whole, like why we learn elementary math the way we do and how it applies to more advanced concepts, explains Rings/Fields and has a really good introduction to Logic. I wish this book existed 5 years ago when I started http://press.princeton.edu/titles/10697.html
"An Introduction to Mathematical Reasoning" by Eccles is short and excellent. You can also download the lecture notes here from CMU's course on proofs http://math.cmu.edu/~svasey/concepts-summer-1-2014/ and try some of the homework if you want but you probably won't see any of this until second year judging by most university calendars and recommended program course list I've seen.
CLRS (https://en.wikipedia.org/wiki/Introduction_to_Algorithms) has a great first part, "Foundations", that complements Knuth's "Mathematical Preliminaries" chapter in TAOCP Vol. 1. With Knuth's book you can really drill these concepts, though, thanks to the many, many proof-writing exercises. The skills I learned doing those exercises paid off in full years later when I started more advanced math, and even at work.
But I have no other books to compare them against.
Just look at the curricula for various undergrad math programs at some number of decent universities. They have a set of required courses and the course descriptions usually list the prerequisites, and sometimes this is even accompanied with a diagram of the DAG as a flowchart.
For any given course, there's usually a canonical text or set of roughly equivalent in quality texts.
You can do the same thing for basically any area of study that can be found as an undergrad major at most universities.
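If you want to play with the prerequisite-DAG idea directly, here's a quick sketch (the course graph is hypothetical) using Python's standard-library graphlib:

```python
from graphlib import TopologicalSorter

# Hypothetical prerequisite DAG: each course maps to its prerequisites.
prereqs = {
    "calc1": set(),
    "calc2": {"calc1"},
    "linear_algebra": {"calc1"},
    "multivariable": {"calc2", "linear_algebra"},
}

# A topological order of the DAG is a valid study order.
order = list(TopologicalSorter(prereqs).static_order())

# Sanity check: every course appears after all of its prerequisites.
for i, course in enumerate(order):
    assert all(order.index(p) < i for p in prereqs[course])
```

The flowcharts in course catalogs are exactly this: any topological sort of the prerequisite graph is a legal path through the program.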
Basic logic is also important: understanding symbolic logic and the methods of proof (direct inference, proof by contradiction, induction, etc.).
I have had to use basic trigonometry in the past as well, and linear algebra (for CSS rotations/translations driven by JS).
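For the curious, a CSS-style rotation is just a 2x2 matrix applied by hand; a minimal sketch:

```python
import math

def rotate(point, degrees):
    """Rotate a 2D point about the origin: the same matrix a CSS
    transform: rotate() applies, just written out explicitly."""
    x, y = point
    t = math.radians(degrees)
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t))

# Rotating the unit x-vector by 90 degrees lands on the unit y-vector.
x, y = rotate((1.0, 0.0), 90)
assert abs(x - 0.0) < 1e-9 and abs(y - 1.0) < 1e-9
```

Knowing this pays off the moment you need to compose rotations with translations or figure out why transform order matters.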
Algorithmic number theory, closely related to algebraic number theory, drives cryptography.
Harmonic analysis drives audio.
Algebraic geometry powers facial recognition software.
The list goes far deeper, but I have been away from the math world for 6 years & have forgotten much.
And on the more esoteric side of things, in quantum computing every logic gate is actually a matrix multiplication.
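For a concrete taste (a toy sketch with plain lists rather than a real linear-algebra library):

```python
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-component state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Pauli-X (the quantum NOT) flips |0> = [1, 0] to |1> = [0, 1].
X = [[0, 1], [1, 0]]
assert apply_gate(X, [1, 0]) == [0, 1]

# Hadamard puts |0> into an equal superposition of |0> and |1>.
s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]
a, b = apply_gate(H, [1, 0])
assert abs(a - b) < 1e-9  # equal amplitudes
```

A multi-qubit circuit is just more of the same: bigger matrices, composed by multiplication, which is why the field leans so heavily on linear algebra.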
It's cool to learn to recognize how simpler math concepts apply to computing, e.g. associativity allows for map/reduce.
Math isn't majorly needed and you can learn what you need on the fly. For instance, if you want to get into game development you need to understand some fairly complex math concepts but nothing is stopping you from attacking the problem and either a) working around it with your understanding of the computing devices you are using or b) by learning what you need on the fly.
If you want to practice in a field that operates on computers, don't learn another field. That's like saying "what kind of philosophy most correctly correlates to learning about horticulture?" Just start to learn horticulture and as you need philosophy it will be tied in.
I really, really hope that whatever you are doing does not involve cryptography.
Crypto is no more CS than it is EE. You can implement cryptographic elements in hardware, but you don't see the math people setting up shop in the EE labs at a college, so I don't know why you see them setting up shop in computer labs.
Honestly, I'm not sure I'd be able to handle a modern graphics programming job unless I studied and practiced constantly for at least six months straight. And I've worked on a bunch of 2D games before (and one 3D game).
You said "Have you tried interviewing at game development companies"
These are two separate things. It's like saying "Most people can learn how to fill out their tax forms on the fly. You don't need anything special you can just read up what to do when it comes time to." and then you coming back and asking "have you ever applied for a job at H&R Block? You need to be well versed in the tax system to be a full time tax filer"
This is not at all what I was referring to.
The question at hand was "Which areas of math are practical to programming/algorithms and why?" Would you say that game development is the representation of the entire programming field? No it's a small portion of our profession. If you want to do that professionally round the clock then you need to prepare for what you will be doing in that field but that doesn't mean that if you want to get into programming or algorithmic thinking that you should learn game development. That's just one small element.
> Honestly, I'm not sure I'd be able to handle a modern graphics programming job unless I studied and practiced constantly for at least six months straight
What you have to ask yourself is: what is it that I'd be practicing for this job?
You didn't learn the maths first, then say "you know, game development looks good, let me go there," and then sit down and start to learn how to program and think algorithmically. You sat down and said "I need to learn everything that directly applies to this field," which in most cases (for rendering) is vectors, frustum view angles, Euler and maybe polar coordinates, an understanding of how to use a quaternion, and what a matrix is. The remaining, and much more important, part of this equation is understanding how to interface with OGL/DX/Vulkan. This is what a graphics person does best, not math.
It boils down to what our field is: understanding, in great depth, the systems of computation and how to interface with them in a structured, maintainable, terse, and fast manner. Math is a cart, and programming pulls it to the goal post.
Set theory is, probably, the most fundamental. Everything could be defined as a set or as a function.
Lambda calculus, obviously.
Some combinators. Basics of linear algebra.
No category theory and other bullshit is needed. Sets will do.
It is actually a very important skill to avoid wasting time on obscure academic bullshit disconnected from reality, be it philosophy, physics, or math. Do not follow other people's hallucinations. Have your own.)
As a rule of thumb - you need just enough math to understand The Wizards Lectures and SICP.
Again, the substitution model, sets (for the notion of types and basic collections), and higher-order functions (the "domain and range" mantra) are enough.
For algorithms: the notion of being bounded by some function, and orders of growth.
While they're both useful, I think category theory encompasses a bit more than set theory and it's only a little more abstract. I also like the more functional approach that a lot of category theory requires you to take; if you're used to imperative programming it's a pretty different and useful way to think about things and it's great for adding to your critical thinking/general problem solving skills.
Knowing where to stop in a heuristic-guided search is the most difficult part. In my opinion, one should stop after realizing that the subject has become too abstract and too academic and has ceased to be a tool of clarification.
Let's say that it has something to do with the pragmatism of Ayn Rand, the Quality of Robert Pirsig, and the general scientific method of removing bullshit, dogmas, and nonsense in order to let the truth stand.
Now, the fact that it is so abstract does mean that your average programmer won't use it on a day-to-day basis. I'm certainly not going to argue that it is the most important math to understand. However, we are finding that there are uses for it. Not just in an "oh look, I can describe my code using abstract algebra, nifty" sort of way, but in a "knowing this was essential to finding the correct solution" sort of way.
For instance, the folks working on .NET's LINQ knew they were implementing monadic comprehensions; they just had the good taste to not use the word "monad".
Furthermore, it seems that having an understanding of abstract algebra is essential to making a distributed real-time analytics engine.
Monads make no sense in a non-lazy language.
Yep, monads :).
> Actually it is a canonical example of abstraction for the sake of having an abstraction, which only increases confusion.
I intentionally brought them up with a concrete, real-world example where they were useful.
> Monads make no sense in a non-lazy language.
A more accurate statement might be "monads make no sense in a non-lazy evaluation context". There are plenty of cases when working in an otherwise strict language you will compose a series of computations over a lazy data-source. For example: you want to process a potentially large series of rows coming from a database without blowing through RAM.
You can solve this problem on an ad-hoc basis, but as far as I understand it, the folks working on LINQ wanted to develop a declarative, readable, sql-like syntax for creating/composing such computations in a way that would work generically across data-sources. The question is, what is the pattern of operations and constraints that is common to all of the disparate data-sources that can be used in this way? The monadic functions + monadic laws is a precise (if arcane) description of those operations and constraints.
Using monads is not the only way to structure computation in the context of lazy evaluation, but it is one that is relatively well understood and worked well with the goal of being "sql-like" because SQL queries can be modeled as monadic computations.
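To make that concrete outside of C# or Haskell, here's a hedged sketch of monadic bind over Python generators, the analogue of LINQ's SelectMany over a lazy data source (the row source below is a stand-in, not a real database API):

```python
def bind(source, f):
    """Monadic bind for lazy sequences: LINQ's SelectMany, Haskell's >>=.
    Nothing is pulled from `source` until the result is consumed."""
    for x in source:
        yield from f(x)

def rows():
    # Stand-in for a large database cursor; in real code each yield
    # would fetch the next row, so RAM use stays constant.
    yield from ({"id": i} for i in range(3))

# Compose lazily: expand each row into (id, tag) pairs; still a generator.
pairs = bind(rows(), lambda r: ((r["id"], tag) for tag in ("a", "b")))
assert next(pairs) == (0, "a")  # only now does any work happen
```

The monad laws are what guarantee that nesting and chaining these `bind` calls in any grouping gives the same stream, which is what lets a query syntax be desugared into them mechanically.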
Your value judgments in this case actually seem dogmatic and nonsensical... which makes sense if it's derived from Ayn Rand.
Ayn Rand was a student of the classic Greek philosophy, nothing wrong with her.
However, her status as an intellectual heir of Aristotle is highly debatable. She's as much a disciple of Nietzsche as anyone.
Meanwhile, metaphysics is a subject that stems directly from Greek philosophy: specifically the subjects addressed in Aristotle's work Metaphysics (literally "the book that came after Physics"). Spinoza's theory of monism is itself a metaphysical theory in this sense.
Perhaps by metaphysics, you mean to certain metaphysical theories, such as the Idealism of the 18th century?
edit: missing punctuation
> Metaphysics is a branch of philosophy concerned with explaining the fundamental nature of being and the world that encompasses it. Metaphysics attempts to answer two basic questions in the broadest possible terms:
> Ultimately, what is there? What is it like?
Does category theory have anything to do with that? No.
> Does category theory have anything to do with that? No.
I could argue that category theory is claiming to answer those questions for mathematics. (So is the axiomatization based on set theory.)
This. So much this. I've lived in a few hacker houses and worked at hacker spaces in the last year. The most impressive programmers I met had picked up just one set of skills (mostly web dev or mobile dev) and while being good at just one thing is less glamorous, these people were actually finishing projects. It was really incredible to watch them create and transform a project into real life in a very short amount of time. The most frustrating people I met were ones trying to learn everything--Machine Learning, IoT, VR/AR, mobile, web full stack--and getting nothing done on a daily basis.
The math you need to learn is completely dependent on what programming you want to do. What will be most impressive is your ability to focus on one or two areas and not get distracted into trying to learn all of it, and not feel envy towards those who are experts in domains that you don't know.
Timeless classic. And nostalgic diving back into 80s. And MIT Scheme on a monochrome monitor with a clicky keyboard.
There are lots of practical uses. For example, the billion-dollar mistake (null references) is painfully obvious when viewed through the lens of type theory.
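A sketch of the contrast, using Python type hints as a stand-in for a real type system (it's the checker, not the runtime, that rules the bad paths out):

```python
from typing import Optional

def find_user(user_id: int) -> Optional[str]:
    """Absence is part of the return type, not an implicit null that
    escapes unnoticed; a checker forces callers to handle None."""
    users = {1: "alice", 2: "bob"}  # toy data, purely illustrative
    return users.get(user_id)

def greet(user_id: int) -> str:
    name = find_user(user_id)
    if name is None:  # a type checker rejects code that skips this branch
        return "hello, stranger"
    return "hello, " + name

assert greet(1) == "hello, alice"
assert greet(99) == "hello, stranger"
```

In a language with sum types (Option/Maybe), the "forgot to check for null" crash becomes a compile error instead of a production incident.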
Software such as Coq can be used to write verified (provably correct) software and can be used by mathematicians for proving theorems.
 - https://en.wikipedia.org/wiki/Formal_language
It is a huge help to be able to think in terms of recursive algorithms in many applications, and proofs by induction both give you the tools for thinking this way and the framework for demonstrating that recursive algorithms are correct.
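A minimal example of the correspondence, where the inductive proof is literally the comment sitting next to the recursion:

```python
def triangle(n):
    """Sum 1 + 2 + ... + n, defined recursively.
    The induction argument mirrors the code exactly:
      base case:      triangle(0) == 0 == 0 * (0 + 1) // 2
      inductive step: if triangle(n - 1) == (n - 1) * n // 2,
                      then triangle(n) == (n - 1) * n // 2 + n
                                       == n * (n + 1) // 2
    """
    if n == 0:
        return 0
    return triangle(n - 1) + n

# The closed form the induction proves:
assert all(triangle(n) == n * (n + 1) // 2 for n in range(50))
```

Once you see that a recursive definition and a proof by induction are the same shape, writing and verifying recursive algorithms stops feeling like guesswork.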
Understanding (that there is) a relationship between computability theory and just basic thought is one of the (if not the) most important takeaway from my (disastrous) college career.
Benson Mates' "Elementary Logic" is the canonical (succinct) textbook.
The basics: + - x /
Generally speaking, any vector or array can be thought of as a matrix (not that it is important).
Some problems and domains involve many operations on [multi-dimensional] matrices, where some math knowledge is advised.
Then there is the "graph", both theory and practice. It's usually classified as math but has little to do with the rest of mathematics; I'd rather consider it a separate domain with many practical and theoretical applications in other fields.
Forget about "crypto". Nothing of that is needed for programming. You'll use some algorithms but you'll never have to write or invent them.
In a theoretical sense, "functional [programming]" is a branch of mathematics.
Overall. Math is seriously overrated and has nothing to do with programming.
If you're into programming: math is simply not required to be a programmer. Don't worry about it.
There are some domains (maps = graphs, 2D/3D = geometry, various stuff = matrices) where programming regularly requires some math. The math part is always a very specific subset that can usually be learned on its own.
If you're into abstract mathematics: there is some theoretical stuff to work on that is very far from practical programming.
Here are some concrete examples from my career.
* Combinatorics and probability theory - Entropy analysis of authentication schemes, algorithms for enumeration and analysis of related combinations and permutations, etc.
* Calculus and probability theory - Implementation of bloom filter for space-efficient indexing, tests and analysis of false-positive rates, etc.
* Statistics - Adaptive authentication schemes, performance measurements and predictions, network event correlation, etc.
* Discrete mathematics and algorithms - Tree traversals, graph search algorithms, etc. were useful in a variety of situations, e.g. data retrieval in tree-based distributed databases.
* Asymptotic analysis, Big O notation, etc. - Data deduplication in distributed databases.
* Modular arithmetic - I didn't really implement any cryptography algorithms; I only used them. But the understanding of modular arithmetic was essential to understanding some of the limitations of cryptographic algorithms based on modular arithmetic, e.g. why the message size cannot exceed the size of the modulus, as well as for software engineering tasks like decoding certificates found in network traffic.
* Formal language theory, DFAs, etc. - Parser generators for network event parsing, SQL engine development and related query optimizations, etc.
So these are 7 examples from 7 years. There may be more examples but I cannot remember them right now.
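To make one of these concrete: the calculus-and-probability point about Bloom filters boils down to the standard false-positive approximation p ≈ (1 - e^(-kn/m))^k. A quick sketch (the parameters are arbitrary):

```python
import math

def bloom_false_positive_rate(n, m, k):
    """Approximate false-positive rate of a Bloom filter with n inserted
    items, m bits, and k hash functions: p ~ (1 - e^(-k*n/m))^k.
    Deriving and optimizing this is exactly where the calculus comes in."""
    return (1 - math.exp(-k * n / m)) ** k

# 1000 items in 10,000 bits with 7 hashes (near the optimal k = (m/n) ln 2):
p = bloom_false_positive_rate(1000, 10_000, 7)
assert p < 0.01  # under one percent
```

The analytical formula lets you size `m` and `k` up front instead of discovering an unacceptable false-positive rate in production.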
I have felt from my personal experience that although most of the work does not involve mathematics, once in a while an interesting problem comes up that can be solved efficiently with the knowledge of mathematics. It is not easy to predict when such a problem would come and it is not always easy to recognize whether the problem at hand can be solved or analyzed efficiently by known mathematical techniques. So it is good to have someone in the team or be the person who has a good breadth of knowledge in mathematics, so that when such problems do come up, they are addressed efficiently. Fortunately for me, I love mathematics and there were people around me who also loved mathematics, so we got a lot of interesting work done.
Graph theory is important for understanding data structures; I would also stress the math concepts of idempotency and isomorphism. We use these concepts a LOT in our work, but admittedly we're a database ( http://gun.js.org/ ), which is a very different line of work than building consumer apps.
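To show why idempotency matters for a database like that, here's a toy last-write-wins merge (a deliberately minimal CRDT-style sketch, not gun.js's actual algorithm):

```python
def merge(a, b):
    """Toy last-write-wins register merge over (timestamp, value) pairs.
    Idempotent, commutative, and associative, so replicas can exchange
    states in any order, any number of times, and still converge."""
    return max(a, b)

s1 = (3, "draft-2")  # (logical timestamp, value)
s2 = (5, "draft-3")

assert merge(s1, s1) == s1                                   # idempotent
assert merge(s1, s2) == merge(s2, s1)                        # commutative
assert merge(merge(s1, s2), s1) == merge(s1, merge(s2, s1))  # associative
```

Those three algebraic properties are what make retries, duplicated messages, and out-of-order delivery harmless: re-merging never changes the converged state.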
Other people already mentioned Big-O notation. Combinatorics, etc.
Also, algorithms that give inexact but pretty good results on giant data sets.
From what I know of topology, you use logic and algebra as tools to learn about it, just as you use logic and algebra in computing. But you probably wouldn't use topology to say anything about computing.
Number theory is relevant to crypto, but I would be hard pressed to say it is practical for programmers in general. You could probably have a pretty long and successful career in programming without knowing basic things like what a prime factorization is.
Calculus is pretty useful in computer graphics and vision; if you want to understand what is going on, there are integrals left and right. Audio processing and finance too.
But that's a different thing than saying that a particular subfield of math is "practical to programmers".
I would say that programmers need the foundations of logic, algebra, and probability, and a bit of calculus (less than I was taught). And THEN they can learn topology, number theory, non-Euclidean geometry or advanced calculus if they need to, for their specific domain. You never know what you're going to end up doing 10 years down the road.
And of course that seems to be what happens in a good undergrad CS education, so apparently people better than us have thought about it :) Indeed college is supposed to give you the ability to learn how to learn, when new things come up, as they always do in programming. Quantum computing is basically a ton of linear algebra as far as I can tell.
We employ some very talented people working on these sorts of problems for rail: http://biarrirail.com
That said, nobody in a STEM field should go without knowing at least univariate calculus. Without calculus, you can have only a poor intuition for situations involving rates of change, or for little deltas applied in one place producing other little deltas elsewhere. It's also necessary for stats, because you're dealing with integrals, such as the area under sections of a probability density function.
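For instance, the area under a density curve is a direct integral; a small sketch with the trapezoidal rule:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Standard normal probability density function."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def integrate(f, a, b, steps=10_000):
    """Trapezoidal rule: the 'area under the curve' made concrete."""
    h = (b - a) / steps
    total = (f(a) + f(b)) / 2 + sum(f(a + i * h) for i in range(1, steps))
    return total * h

# P(-1.96 < X < 1.96) for a standard normal is about 0.95,
# the familiar 95% confidence interval.
p = integrate(normal_pdf, -1.96, 1.96)
assert abs(p - 0.95) < 0.001
```

Every p-value, confidence interval, and percentile you consume is one of these areas; calculus is what lets you see that rather than treat the numbers as magic.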
Part of computer science is numerical analysis, too. Once upon a time, numerical analysis constituted the bulk of "computer science".
CS undergrads arguably need some exposure to numerical analysis, and in such a course, the knowledge from other math courses provides support.