I really appreciated the voice and tone. I am very surprised to see people who did not appreciate it and took the time to write how much they didn't like it. (Which is their right).
I am writing this only to let the original author know that there is an audience for this kind of "breezy" explanation.
This course started with Zeller's congruence as an example of an algorithm, then moved on to some graph theory, and then on to Dijkstra's algorithm. I never got to see what the rest of the course contained, but it really looked like a good course. Sadly, I was not able to persuade the person in charge to let me continue taking it.
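For anyone who hasn't met it, Zeller's congruence turns a Gregorian date into a day of the week with a handful of integer operations. A rough Python sketch (variable names are my own, not the course's):

```python
def zellers_day(year, month, day):
    """Day of week via Zeller's congruence (Gregorian calendar).

    January and February are treated as months 13 and 14 of the
    previous year. Returns 0=Saturday, 1=Sunday, ..., 6=Friday.
    """
    if month < 3:
        month += 12
        year -= 1
    k = year % 100        # year within the century
    j = year // 100       # zero-based century
    return (day + (13 * (month + 1)) // 5 + k + k // 4 + j // 4 + 5 * j) % 7

names = ["Saturday", "Sunday", "Monday", "Tuesday",
         "Wednesday", "Thursday", "Friday"]
print(names[zellers_day(2000, 1, 1)])   # 2000-01-01 was a Saturday
```

It makes a nice first example precisely because it is short, purely arithmetic, and checkable against a calendar.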
All the people on the course absolutely hated it from the start, thought I was mad to want to do it, and would go on to fail it; as a result, this was the last year the sixth form ran it.
Sixth form is the optional 16-18 stage of education in the UK.
In England and Wales, maybe in NI but definitely not here in Scotland, which has a completely different system.
It's only sort-of optional now - since 2015 you're required to remain in education or training until 18, although that includes vocational courses and apprenticeships.
Yes, I understand some people have the opposite view; I'm just trying to say that there are surely also a lot of people who find this kind of style extremely distracting and frustrating.
In other words, my favourite textbooks of all time are Landau and Lifschitz :^)
I get that I'm not the audience, but still this conversation style irks me. It just seems like noise.
edit: To be constructive instead of just complaining: I'd like articles this long to have a table of contents. I can't skim this article because there are too many pictures, so I have no way to summarize the content and find something useful.
On the positive side: relevant illustrations. I also liked that the code implementation is one that uses explicit edges. Too often in books it is either the adjacency matrix or the linked-list one. The former is inconvenient when you need to add a node, and the latter doesn't let you put weights on edges. So these "academic" implementations are not well suited to grokking concepts by playing with code.
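For what it's worth, by an explicit-edge implementation I mean something like this minimal sketch (names are my own, not the article's): nodes live in a set and edges are explicit tuples carrying a weight, so adding a node is trivial and edge weights come naturally:

```python
class Graph:
    """Minimal graph with an explicit edge list.

    Adding a node is O(1), and edges carry arbitrary data (here, a
    weight) -- unlike a plain adjacency matrix or an unlabelled
    adjacency list.
    """
    def __init__(self):
        self.nodes = set()
        self.edges = []          # list of (u, v, weight) tuples

    def add_node(self, u):
        self.nodes.add(u)

    def add_edge(self, u, v, weight=1):
        self.add_node(u)
        self.add_node(v)
        self.edges.append((u, v, weight))

    def neighbours(self, u):
        # linear scan: slow for big graphs, but transparent for learning
        return [(v, w) for (a, v, w) in self.edges if a == u]

g = Graph()
g.add_edge("a", "b", weight=3)
g.add_edge("b", "c", weight=5)
print(g.neighbours("b"))   # [('c', 5)]
```

The linear scan in `neighbours` is deliberately naive; the point is that the representation stays easy to poke at interactively.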
I work a lot with graphs, and there is big value in having a general enough graph implementation. Last week I implemented a trie class with it, and this week a transducer, both in two hours, debugging included. Other things such as serialization came for free with the library. Of course, it's not an optimal data structure, but it is fine for my use cases.
In your post, you don't use the fact that trees are graphs to avoid duplicating explanations; instead you explain it first for a tree, then detail the graph version. It should be the reverse: show how to use a graph implementation to create a tree, and only then, if that implementation doesn't fit the memory or performance requirements of the app, rewrite it as a more specialized data structure.
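To make that concrete, a tree is just a graph with a distinguished root and a no-cycles invariant, so a sketch along these lines (hypothetical API, not from the post) reuses the graph code instead of duplicating it:

```python
class Graph:
    """Bare-bones directed graph: node -> list of successors."""
    def __init__(self):
        self.edges = {}

    def add_edge(self, u, v):
        self.edges.setdefault(u, []).append(v)
        self.edges.setdefault(v, [])

class Tree(Graph):
    """A tree as a thin layer over the general graph: one
    distinguished root, and edges always point parent -> child."""
    def __init__(self, root):
        super().__init__()
        self.root = root
        self.edges[root] = []

    def add_child(self, parent, child):
        self.add_edge(parent, child)

t = Tree("root")
t.add_child("root", "left")
t.add_child("root", "right")
print(t.edges["root"])   # ['left', 'right']
```

Only once this hits a memory or speed wall would you bother writing a dedicated node-with-two-pointers structure.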
I recommend reading blog posts by authors you like who have a conversational style, with an editor's eye. How often do sentences not have any relevant information? How often do they actually try for a joke? It's often less than you might think. The style often comes more from tone and word choice. I think a great model is Julia Evans: https://jvns.ca/
This seems to be a great summary of the Pyramid Principle -- I hear the book is a bit of a slog after the first three chapters.
One tip if you plan to buy the book: get the older version from the '90s at a lower price. The newest version is quite pricey.
Some of us even question the value of a CS degree.
I take this blog post as an example of how I do not want to write.
It is too long and packs too much information into one piece: Euler to BSTs, with Airbnb, Twitter, and Facebook, plus a stop at linked lists, arrays, and the trade-offs between them. Don't even get me started on SpongeBob. All this targeted at junior developers?
I advise the author to read "Nobody Wants to Read Your Sh*t: Why That Is And What You Can Do About It".
Most of us simply skip over the stuff that doesn't interest us. That is clearly the default behavior. Therefore it is interesting why, on certain articles, people do feel the need to comment on the style.
Personally, I've been lucky, because the most vocal people have been the ones who like my writing. But not always. I wrote a book about startups, and some people loved that it had a lot of humor, whereas others really hated it. At least so far, the ones who hate it write to me privately, whereas the ones who love it have written the reviews on Amazon.
I'm lucky, but I'm also curious why this is. As a purely sociological question, why is it that sometimes people feel the need to be very vocal with their disapproval, whereas most of the time they keep such disapproval private?
The Nature of Computation - Moore, Mertens
What I got (graduate coursework):
Graph Theory - Reinhard Diestel
A more encyclopedic treatment (most big results, no proofs):
Handbook of Graph Theory - Gross, Yellen
What do you mean by 'Euler graphs'? As in, the problems described by him such as the bridge problem?
I would call anything with vertices and edges an 'Euler graph' as opposed to a 'Cartesian graph' (a chart).
He specifically mentions in the article that he found both terms and used the one used in the literature he refers to.
> I used “Euler path” instead of “Eulerean path” just to be consistent with the referenced book definition
Should be 'Eulerian'. It's not horribly confusing, just a little vague.
Everyone uses both kinds of reasoning but I know I have deeper understanding of the areas that I studied bottom-up. Even if it might have been a bit more difficult and time consuming, because I didn't get to the applicable stuff as fast.
Most people learn best from examples, which makes intuitive sense. Purely theoretical learning does not come easy to most people - that’s why maths is so scary to so many.
Given that the learning style myth has been mostly discredited, I also wonder about preferences such as night owl vs morning bird - it seems to me that people learn best in the morning, _unless_ they have poor sleep hygiene. Which would again make totally intuitive sense.
This just sounds like a morning person trying to imply that night owls are unhealthy.
I consistently go to bed around 2 and wake up at ~10. I have better 'sleep hygiene' than many people I know who are 'morning people' because they force themselves to strict wake-up times.
Even with a good amount of sleep at consistent times each night, I am still more intellectually productive in the evening than in the first couple of hours after waking up.
I wondered if anyone would interpret it like that, but I decided you would give me more credit, given I was speaking from a professional POV.
Before I did the teaching courses, I would have described myself as a night owl. I did then (and still do now, to a lesser extent) have poor sleep hygiene. However, since I became more aware of how illogical the dichotomy is, I have found I'm just as productive in the morning provided I've slept well.
In the morning, your brain is fresh from sleep and flushed of toxins. Your energy levels are higher and you've been hit with blue light indicating it's the start of the day. It makes sense that you would learn better in such a state.
What's the argument for night owls being better learners at night?
I would say that the argument is that many people report this to be the case and there are some studies pointing to the existence of such an effect (sometimes called chronotype). Even in the absence of rigorous studies, a reasonable default view is to say that such an effect may or may not exist and we just don't know for sure if it does.
I would argue that the burden of proof is on the person claiming that this effect doesn't exist, as that's a stronger statement about the world. It also contradicts a lot of anecdotes without offering proper evidence that casts doubt on their accuracy; the simplified model you mentioned, about sleep flushing toxins from your body, may not capture all the relevant aspects here (for example, genetic components to concentration ability).
Wikipedia references a few studies, but I'm not an expert and can't tell how conclusive or rigorous they are: https://en.wikipedia.org/wiki/Chronotype
That your 'brain toxins' and first blue light are irrelevant to many people's abilities to learn. Also, being a night owl doesn't mean you aren't getting blue light while you study.
I've been thinking about whether it has to do with the daylight/sunlight and my inability to "settle down" and concentrate when it's light outside. This "condition" is causing me to stay up late and lose hours of sleep per night. I've been thinking about going to see a psychologist or therapist to see wtf I can do to rectify the situation.
Now I'm also curious about the differences in free-form learning (which I guess I'm talking about) vs the more traditional classroom learning. Do you have any insight on this?
I have a feeling you might enjoy "The One World Schoolhouse" by Salman Khan (as in Khan Academy).
(There are also comments by those who disagree.)
I looked at a couple of instances to see how one of my favourite authors, Knuth, explains things. If you look at how he introduces BDDs (Binary Decision Diagrams), it is with an example first: http://www.cs.utsa.edu/~wagner/knuth/fasc1b.pdf#page=8
When he introduces an algorithm for generating all permutations (http://www.cs.utsa.edu/~wagner/knuth/fasc2b.pdf#page=5), he does not work through the entire algorithm on an example case. However, when he mentions “all permutations … in lexicographic order”, he immediately gives an example of what that means.
(In general Knuth is a big fan of the principle of saying everything twice, once informally and once formally; his literate programming paradigm is also an extension of this idea, where first you explain something informally and then write down the code which is supposed to be a precise version of it.)
Everyone learns better starting from examples. Humans are inductive learners. Makes sense, given our situation.
However, from our starting point some people meta-learn abstract reasoning i.e. mathematics, which doesn't come naturally. Maths is hard because it isn't the way we usually think.
Often those people claim to find purely theoretical approaches better for learning, although I'm personally sceptical from attending a decade of seminars and classes at many institutions; people presenting a purely theoretical introduction to a topic are immediately hit with requests for examples, and usually criticised for not starting with examples. But I do not usually attend lectures in maths departments (rather, in related subjects, e.g. CS).
I suspect it's really just a numbers game: Present the information enough different ways and you're more likely to use one which happens to click for a particular student temporarily receptive to a particular mental model.
If so, that means diversity of approaches is good even with no personalization.
Inductive: A,B,C, ... may imply Q (a prior probability distribution)
Abductive: the best possible explanation from Q1,Q2,... given a subset of A,B,C,...
For your question, I think those terms may not be so well-defined:
Deductive can be bottom-up (if you'd like) if you call the premises the bottom. But inductive can be too (if you'd like) if you use A,B,C,... as the data points on which you're basing Q.
Problem solvers get the job done
You’re a cryptographer and don’t use any game theory? Most cryptographic protocols are, or can be, phrased through game semantics. Game theory is very useful, what are you skeptical about?
The game theory I studied in economics (albeit in my undergrad) I'm very skeptical of, though. It was about exact payoffs and rational agents and loads of assumptions that really don't work in real life. I'd spend time literally analysing optimal strategies in board games. It all seemed kind of pointless.

People always said you can apply game theory to political science and wars, but that all seemed like common sense and logic, rather than the specific stuff I was actually studying. I remember solving integral equations to calculate Nash-equilibrium bidding strategies in third-price auctions with perfect information and a uniform distribution of buyers' values. The solutions had a lot of maths, but no bearing on reality whatsoever. What I studied was never really used to make any actual predictions in the real world. It seemed almost as if they were trying to make the subject mathematical to give it some legitimacy. A lot of it is common sense. One could argue that studying it helps you think logically, I guess, but so do lots of things.

Ariel Rubinstein is a famous economist and game theorist, and he gave an excellent critique of the whole space. He is a lot more eloquent than I am. You should listen to him and let me know what you think.
So game theory as a framework seems like it would be totally fine if we could only figure out a more psychologically plausible goal function, wouldn't you agree?
PS: Or maybe we could use a blockchain to be even more modern?
Because of this, algorithms on trees are often very different from general graph algorithms in practice. You could use the same tools, but it would be unnecessarily slow and/or complex.
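A small illustration of the point: on a tree you can recurse with no bookkeeping at all, while the same traversal on a general graph needs a visited set or it loops forever (toy adjacency dicts of my own):

```python
def tree_nodes(tree, node):
    """Count nodes in a tree: no visited set needed, because there
    are no cycles and no node has two parents."""
    return 1 + sum(tree_nodes(tree, c) for c in tree.get(node, []))

def graph_nodes(graph, node, seen=None):
    """Same traversal on a general graph: without `seen`, any cycle
    would make this recurse forever."""
    if seen is None:
        seen = set()
    if node in seen:
        return 0
    seen.add(node)
    return 1 + sum(graph_nodes(graph, n, seen) for n in graph.get(node, []))

tree = {"a": ["b", "c"], "b": ["d"]}
graph = {"a": ["b"], "b": ["c"], "c": ["a"]}   # contains a cycle
print(tree_nodes(tree, "a"))    # 4
print(graph_nodes(graph, "a"))  # 3
```

The extra set is cheap here, but the general pattern (cycle checks, visited flags, frontier queues) is exactly the overhead tree-specific algorithms get to skip.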
I'll give this article another shot later, but I'm more lost than before, because I managed to get a grasp of graph theory without knowing a lot about binary trees (I'm more a data analyst than a developer).
PS: Thx vardanator
In Europe most universities that teach computer science have at least one course fully dedicated to graph theory, algorithms for traversal, shortest path finding, cycle detection and so on. I didn't even know it was possible to work professionally as a software developer and not already know all of this.
I do, and I was forced to dedicate a course to it, yes
There are plenty of existing resources designed for learning graph theory. It's not good enough for a new one to simply be correct. If someone doesn't think their presentation of the content is going to be better, why bother? I'd sooner share my discrete math textbook with someone than send them this link.
Can all the authors out there just stop trying to dumb things down for a general audience and just start at the intermediate level? Just skip all the drawn out cliched intros and get on with it. Don't try to cutesify it, it just comes out condescending. This is also why I can't bear to read documentation from Google.
If you're a liberal-arts kind of person and/or have scars from prior experiences with mathematics, I recommend: https://www.amazon.com/Introduction-Graph-Theory-Dover-Mathe...
Further reading: https://terrytao.wordpress.com/career-advice/theres-more-to-...
It's a great way just to learn without losing face if you need to. The alternative sentiment is "I'm lost, please help" which may not be accurate.
There does need to be some way to express, "I am competent enough to understand this, yet I do not understand this, so something is going wrong and it isn't my stupidity."
Like how is it that so many people can effortlessly memorize the names of hundreds of actors and recognize them on sight, and why do they bother?
I've been humbled many times thinking I know something when in fact I really had a shallow understanding. Here are a few examples:
* Stable in-place sort. Merge sort requires an extra O(n) space, and Quicksort isn't stable. The solution is either Grailsort or Wikisort, both of which I still don't understand to this day.
* Gaussian elimination. Turns out Gaussian elimination might blow up exponentially in intermediate bit complexity (afaik it's still an open problem whether it actually does), and other methods have to be employed to guarantee a polynomial bit length.
* Fast Fourier Transform. How many bits do you need? When does using 64-bit floating point fail, and why? The bit complexity is polynomial and understood, but it turns out the most straightforward methods to figure this out use modular arithmetic to get a handle on the complexity. Also, what's the minimum known bit complexity? I'm not sure I know what's bleeding edge for this.
* Dynamic programming. Only recently has it been shown that, assuming SETH, O(n^2) is basically optimal for edit distance. Also, what if you're comparing similar strings? What are the best algorithms in terms of edit distance and other efficiencies? Ukkonen, Hirschberg, and checkpointing, but boy did it take me a while to get a good description of all of them, with checkpointing still unclear to me.
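For reference, the quadratic baseline behind that last bullet is the textbook Wagner-Fischer DP for edit distance; Ukkonen's, Hirschberg's, and checkpointing methods are all refinements of this table:

```python
def edit_distance(a, b):
    """Wagner-Fischer edit distance:
    O(len(a) * len(b)) time, one row of memory."""
    prev = list(range(len(b) + 1))   # distances from "" to prefixes of b
    for i, ca in enumerate(a, 1):
        curr = [i]                   # distance from a[:i] to ""
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,         # delete from a
                            curr[j - 1] + 1,     # insert into a
                            prev[j - 1] + cost)) # match / substitute
        prev = curr
    return prev[-1]

print(edit_distance("kitten", "sitting"))   # 3
```

Hirschberg trades the full table for O(min(m, n)) space while recovering the alignment, and Ukkonen's trick bounds the work by the distance itself, which is what makes "comparing similar strings" fast.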
I've heard, and agree, that 95% of programming doesn't require any deep CS knowledge. The flip side is that the other 5% of the time you will, and for those 1-in-20 problems that require theory, you're dead in the water unless you know how to identify them, how to solve them, or where to look for solutions.
These are just a few. The more I learn the more I realize how little I actually know.
Chee K. Yap, "Fundamental Problems in Algorithmic Algebra", Chapter 10, Linear Systems, 10.3 Matrix Inversion
Chee K. Yap, "Fundamental Problems in Algorithmic Algebra", Chapter 1, Arithmetic, 1.4 Modular Fast Fourier Transform
This is about a dynamic programming approach to the edit distance problem, not dynamic programming itself, which is a vague term (I think it's basically "use a bottom-up approach and avoid doing the same work twice").
For example, implementing VEB trees or even (countably distinct length) dictionary-based tries can speed up execution by factors on the order of log n.
That's all it took for the theory to fall into place.
> note that we are still dealing with pseudocode
He's got a funny idea about pseudocode!
This is very timely. I've been writing an app to draw polyhedra and it's slowly been dawning on me that there's a lot more graph theory involved than there is 3D geometry or vector maths.