I don’t understand Graph Theory (medium.com)
566 points by signa11 8 months ago | 138 comments



I have a graduate degree in computer science and know some graph theory, but I really liked this. I thought it was interesting, reminded me of a lot of theory, and taught me some more.

I really appreciated the voice and tone. I am very surprised to see people who did not appreciate it and took the time to write how much they didn't like it. (Which is their right).

I am writing this only to let the original author know that there is an audience for this kind of "breezy" explanation.


Thank you very much! I don't really get upset by criticism; it helps a lot in producing the next good work (noting previous complaints). However, hearing some really positive feedback is always motivating to spend more time on knowledge sharing. Thanks again!


Teaching is part skill and part craft, and like so many other creative areas, there are plenty of different styles that can appeal to different personalities and preferences. It's a constant tension, and there will always be room for imagination, creativity, and even art in teaching. Prime example: http://poignant.guide/


I'm likely the developer in your target audience: a self-taught junior in a feature factory with no mentorship and 'agile' cycles that have you always moving on to the next thing. These types of articles are super helpful in beginning to shore up the foundations. Thanks for the article.


Maybe I'm alone, but I never felt the graph theory pedagogy I experienced was very confusing or boring (I did think this about plenty of CS topics!). There are lots of motivating examples to pull in, and better/clearer diagrams than these in many places. This just seems like a not very lucid description of theory that is very formalized and that many lectures and textbooks deal with.


When I was 16 I changed sixth form[0] following a house move and was accidentally put onto a maths course that was basically algorithms without computers. I think the name was something like Decision and Discrete.

This course started with Zeller's congruence as an example of an algorithm, then moved on to some graph theory and then to Dijkstra's algorithm. I never got to see what the rest of the course contained, but it really looked like a good course. Sadly I was not able to persuade the person in charge to let me continue taking the course.
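For the curious, Zeller's congruence fits in a few lines. A quick Python sketch from memory (the common Gregorian form, with h = 0 meaning Saturday; an illustration rather than the exact version the course used):

    def zeller(day, month, year):
        """Zeller's congruence: 0 = Saturday, 1 = Sunday, ..., 6 = Friday."""
        if month < 3:                        # January and February count as
            month += 12                      # months 13 and 14 of the prior year
            year -= 1
        k, j = year % 100, year // 100       # year of century, zero-based century
        return (day + 13 * (month + 1) // 5 + k + k // 4 + j // 4 + 5 * j) % 7

    print(zeller(1, 1, 2000))                # 0: 1 January 2000 was a Saturday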

All the other people on the course absolutely hated it from the start, thought I was mad for wanting to do it, and went on to fail it; as a result, this was the last year the sixth form would run it.

[0] Sixth form is the 16-18 optional education in the UK.


"education in the UK"

In England and Wales, maybe in NI but definitely not here in Scotland, which has a completely different system.


>Sixth form is the 16-18 optional education in the UK.

It's only sort-of optional now - since 2015 you're required to remain in education or training until 18, although that includes vocational courses and apprenticeships.


You're talking about the D1 module, which was considered the easiest maths module at my school, because it doesn't really require much understanding; you just have to learn the material.


D-maths modules in the UK have changed a lot over the years; it seems like they started out pretty legit and got diluted over time. It depends on when they took it.


I took it 6 years ago. When were they introduced?


At least the early 90s, possibly earlier.


I hated that module. I found it boring because it was just memorising algorithms and applying them. The key aspect of studying algorithms (complexity and analysis of the algorithm itself) was missing.


Besides data structures, graph theory and its related curricula were my favorite and overall the most practical areas of the CS education I received.


Nothing covered in this article is stuff I'd consider very complicated, but I took a class on graph theory, and the kinds of algorithms and concepts covered in the second half of that course were much more complex. The fact that I hadn't taken a modern algebra class probably hurt me with some of them, not to mention that the class also covered the relation to combinatorics, which presents much more interesting, though more complicated, problems.


I'm just gonna go ahead and say it: this is the kind of style that makes me go "yawn, this is boring". Endlessly meandering and circling the subject, stopping every sentence to make an irrelevant digression or an attempt at humour (breaking my concentration in the process), stretching on and on... Give me a concise text that goes straight to the subject and doesn't attempt to take a conversational tone with me as if we were having a beer together; give me that any day.

Yes, I understand some people have the opposite view; I'm just trying to say that there are surely a lot of people who find this kind of style extremely distracting and frustrating.

In other words, my favourite textbooks of all time are Landau and Lifschitz :^)


I looked at the scroll thumb after the first two paragraphs when I saw a long list of disclaimers and was disappointed that I hadn't read anything useful yet. I closed the tab.

I get that I'm not the audience, but still this conversation style irks me. It just seems like noise.

edit: To be constructive instead of just complaining. I'd like articles this long to have a Table of Contents. I can't skim this article because there are too many pictures. So I have no way to summarize the content and find something useful.


That's a good point, gonna do it! Edit: done, looks nice :)


Although I didn't read it before, it does look good with the Table of Contents. You can follow the story or get to the point. Thanks!


Looked at the scrollbar after this comment. Nope.


Interesting notes, thanks. I struggled with books that compress a complex idea into a couple of sentences, so I always try to explain in as much detail as possible. Understanding that there is a broad audience who likes the opposite (succinct explanations), I wrote the disclaimers to cover myself just in case. So, read the disclaimers one more time; that's my only excuse for such long and boring discussions :)


I share some of the criticisms already stated in other comments. The biggest offender here, in my opinion, is that it jumps between too many subjects. Trees as graphs, yes, but they are a whole subject of their own. This post should be broken down into at least 4 or 5 posts. Moreover, it mixes vastly different concerns: byte-level optimization is an interesting topic too, but it has nothing to do with graphs per se.

On the positive side: relevant illustrations. I also liked that the code implementation is one that uses explicit edges. Too often in books, it is either the adjacency matrix or the linked-list one. The former is inconvenient when you need to add a node, and the latter doesn't let you put weights on edges. So these "academic" implementations are not well suited for grokking concepts by playing with code.
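To illustrate what I mean by explicit edges (my own toy sketch, not the article's code): each edge is its own object, so adding a node is trivial and weights have an obvious home.

    class Edge:
        def __init__(self, u, v, weight=1):
            self.u, self.v, self.weight = u, v, weight

    class Graph:
        def __init__(self):
            self.nodes = set()
            self.edges = []

        def add_edge(self, u, v, weight=1):
            self.nodes.update((u, v))              # adding nodes is cheap
            self.edges.append(Edge(u, v, weight))  # undirected edge object

        def neighbors(self, v):
            """O(E) scan: slow, but fine for playing with the concepts."""
            return [(e.v if e.u == v else e.u, e.weight)
                    for e in self.edges if v in (e.u, e.v)]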

I work a lot with graphs and there is big value in having a general enough graph implementation. Last week I implemented a trie class with it, and this week a transducer. Both in two hours, debugging included. Other things such as serialization came for free from the library. Of course, that's not an optimal data structure, but it is fine for my use cases.

In your post, you don't use the fact that trees are graphs to avoid duplicating code explanations; instead you explain it first for a tree, then detail the graph one. It should be the reverse: how to use a graph implementation to create a tree, and then, only if this implementation does not fit the memory or performance requirements of the app, rewrite a more specialized data structure.


Thanks!


Detailed explanations are good. As is a conversational style. But consider that all jokes and asides that are meant to be conversational can get in the way of a beginner understanding your point, because they are beside the point.

I recommend reading blog posts by authors you like who have a conversational style with an editor's eye. How often do sentences carry no relevant information? How often do they actually try for a joke? It's often less than you might think; the style often comes more from tone and word choice. I think a great model is Julia Evans: https://jvns.ca/


Thank you!


Yes, I have talked to people who much prefer a lighter, lengthier style, so I guess this is a matter of taste. If I may offer a suggestion though, the pictures are very distracting and often don't add to the discussion. Consider either limiting their use, or making them smaller and on the sides of the page.


Sure, will try for the next one. Thanks.


Have you heard of the Pyramid Principle book? I am working through the book now. I find the method very helpful in structuring how you present ideas in your writing.


Thanks for the heads-up on the Pyramid Principle. I've been studying storytelling, and the outlined principles jibe pretty closely with what I've picked up.

This seems to be a great summary of the Pyramid Principle [0] -- I hear the book is a bit of a slog after the first 3 chapters.

[0] https://medium.com/lessons-from-mckinsey/the-pyramid-princip...


Considering that the book applies the principle itself, it is well worth the read.

One tip I would suggest if you plan to buy the book: get the older version from the 90s at a lower price. The newest version is quite pricey.


It seems likely this is not the text for you -- especially if you already know graph theory. But I'm not sure why it's a topic for HN, given it's like a mandatory CS undergrad class.


Not everyone here has a CS degree.

Some of us even question the value of a CS degree.


Well, one of the benefits is you get exposed to topics like the above in a structured manner =)


I just love L&L :-)


and how many volumes of LL have you actually read?


Mechanics, Quantum Mechanics, and Fluid Dynamics. None completely, cover to cover, I admit.


if you have not read Mechanics from cover to cover and solved all the exercises, you have not lived


Curiously, if you have read Mechanics from cover to cover and solved all the exercises, you also have not lived.


Those disclaimers look like comments on bad code. Dear author, you are under no obligation to make such clarifications.

I take this blog post as an example of how I do not want to write. It is too long and packs too much information into one piece: Euler to BSTs, with Airbnb, Twitter, and Facebook along the way, plus a stop on linked lists, arrays, and the trade-offs between them. Don't even get me started on SpongeBob. All this targeted at junior developers?

I advise the author to read "Nobody Wants to Read Your Sh*t: Why That Is And What You Can Do About It".


And yet, it is at the top (#2 right now) of HN.


Because the HN voting system is a clear and democratic way of expressing value


:D noted, thanks. I just hate those part 1, part 2, part N articles, so I tried to put it all in one. However, I've done just 40% of the initial plan, so some new looong article will be published sometime in the future.


I find it strange that the folks on Hacker News sometimes have such strong reactions to the tone of an article, or sometimes just a paragraph in the article. I've gotten that too, when articles of mine become popular on HN. I personally liked this article about graph theory, including the conversational tone.


I didn't understand why at first, until I started reading dozens of books and papers suggested here that have precision in their writing; now I too can't stand an overwritten article or book. The Art of Computer Programming, vol. 1, contains an excellent introduction to graph theory where Knuth converts a program flow chart into a graph to discover free subtrees. The dry humor there doesn't try to be too obvious, and every sentence is carefully written for precision.


I get that any kind of writing can provoke strong reactions, as some people have strong preferences, and they have a right to those preferences. But I hope we can agree that being vocal about one's preferences is a different kind of activity? Every day on Hacker News there are dozens of articles that don't interest me, but can you imagine what people would think of me if, on every one of those articles, I posted a note saying that I didn't like it?

Most of us simply skip over the stuff that doesn't interest us. That is clearly the default behavior. It is therefore interesting to ask why, on certain articles, people do feel the need to comment on the style.

Personally, I've been lucky, because the most vocal people have been the people who like my writing. But not always. I wrote a book about startups, and some people loved that it had a lot of humor, whereas others really hated it. At least so far, the ones who hate it write to me privately, whereas the ones who love it have written the reviews on Amazon:

https://www.amazon.com/Destroy-Tech-Startup-Easy-Steps/dp/09...

I'm lucky, but I'm also curious why this is. As a purely sociological question, why is it that people sometimes feel the need to be very vocal with their disapproval, whereas most of the time they keep such disapproval private?


For people who don’t understand graphs, Euler graphs seem like such a weird place to start. There are much more relevant problems (social networks, followers and travel distances) with approachable graphs (undirected, directed, and weighted).


The place I wish I had started at:

The Nature of Computation - Moore, Mertens

What I got (graduate coursework):

Graph Theory - Reinhard Diestel

A more encyclopedic treatment (most big results, no proofs):

Handbook of Graph Theory - Gross, Yellen


The Diestel book is great if you already know graph theory. The proofs are exquisitely concise, but it's hard to read because you have to think really hard about the definitions and why they're given in precisely that way (hint: to make the proofs more concise!).


The Nature of Computation is really good.


edit: Never mind - I hadn't read far enough in. He means 'Eulerian graph', I think, as in a graph with an Eulerian trail. You know, it might be nice for people to get the terminology right if you are writing about this stuff...

What do you mean by 'Euler graphs'? As in, the problems described by him such as the bridge problem?

I would call anything with vertices and edges an 'Euler graph' as opposed to a 'Cartesian graph' (a chart).


> You know it might be nice for people to get the terminology right if you are writing about this stuff...

He specifically mentions in the article that he found both terms and used the one used in the literature he refers to.


Fair point. A minor detail is that he mistypes the standard term:

> I used “Euler path” instead of “Eulerean path” just to be consistent with the referenced book [1] definition

Should be 'Eulerian'. It's not horribly confusing, just a little vague.


Fixed, thanks!


Some people prefer learning from examples (inductive) while other people prefer learning the theory bottom-up (deductive). This article may be for the first kind of people.

Everyone uses both kinds of reasoning, but I know I have a deeper understanding of the areas that I studied bottom-up, even if it might have been a bit more difficult and time consuming, because I didn't get to the applicable stuff as fast.


Why not use a mixed approach? I find it works best for me. First, learn by examples to get a broader picture, then learn the theory bottom-up to get into the details. It's much easier to learn complex theories when you know where they lead and what purpose they serve than to build a mental model blindly.


IMO, deductive learning is limiting (we are limited by the span of our examples), but it has great recall value. Inductive learning is harder to commit to long-term memory, particularly when learning something you find scant use for in your day-to-day life. Again, I can see this is not general and could vary considerably across individuals.


I’ve done a few degree-level teaching courses, and what I gathered from them is that the notion of individual learning styles is a bit of a myth.

Most people learn best from examples, which makes intuitive sense. Purely theoretical learning does not come easy to most people - that’s why maths is so scary to so many.

Given that the learning-style myth has been mostly discredited, I also wonder about preferences such as night owl vs. early bird - it seems to me that people learn best in the morning, _unless_ they have poor sleep hygiene. Which would again make totally intuitive sense.


>they have poor sleep hygiene

This just sounds like a morning person trying to imply that night owls are unhealthy.

I consistently go to bed around 2 and wake up at ~10. I have better 'sleep hygiene' than many people I know who are 'morning people' because they force themselves to strict wake-up times.

Even with a good amount of sleep at consistent times each night, I am still more intellectually productive in the evening than in the first couple of hours after waking up.


> This just sounds like a morning person trying to imply that night owls are unhealthy.

I wondered if anyone would interpret it like that, but I decided you would give me more credit, given I was speaking from a professional POV.

Before I did the teaching courses, I would have described myself as a night owl. I did then (and still do now, to a lesser extent) have poor sleep hygiene. However, since I became more aware of how illogical the dichotomy is, I have found I'm just as productive in the morning, provided I've slept well.

In the morning, your brain is fresh from sleep and flushed of toxins. Your energy levels are higher and you've been hit with blue light indicating it's the start of the day. It makes sense that you would learn better in such a state.

What's the argument for night owls being better learners at night?


> What's the argument for night owls being better learners at night?

I would say that the argument is that many people report this to be the case and there are some studies pointing to the existence of such an effect (sometimes called chronotype). Even in the absence of rigorous studies, a reasonable default view is to say that such an effect may or may not exist and we just don't know for sure if it does.

I would argue that the burden of proof is on the person claiming that this effect doesn't exist, as that's a stronger statement about the world. It also seems to contradict a lot of anecdotes without offering proper evidence to cast doubt on their accuracy; the simplified model you mentioned, about sleep flushing toxins from your body, may not capture all the relevant aspects here (for example, genetic components to concentration ability).

Wikipedia references a few studies, but I'm not an expert and can't tell how conclusive or rigorous they are: https://en.wikipedia.org/wiki/Chronotype


I'm not asking for evidence, just a reasonable argument as to why people might learn better at night. Because the obvious counterpoint is that, until very recently, night time meant most humans simply went to bed.


I would say that variability in ability to concentrate might be reasonable to assume as a starting point, given how much people vary in other features as well, even if there are similar connections to how humans evolved. For example, food preferences are also in part a result of evolutionary processes, but vary a lot between people.


>What's the argument for night owls being better learners at night?

That your 'brain toxins' and first blue light are irrelevant to many people's abilities to learn. Also, being a night owl doesn't mean you aren't getting blue light while you study.


I'm struggling with some issues right now where I'm finding that I'm much more productive in the late afternoon/evening than during daylight hours. I'm a telecommuting software engineer, and it's somewhat important that I be available during core business hours (9AM-4PM), yet I find that I'm just unable to focus and get into my work during the day, only to leave much of it to be done during the evening. My work is very challenging and interesting, so I do not believe it's a boredom issue.

I've been thinking about whether it has to do with the daylight/sunlight and my inability to "settle down" and concentrate when it's light outside. This "condition" is causing me to stay up late and lose hours of sleep per night. I've been thinking about going to see a psychologist or therapist to see wtf I can do to rectify the situation.


Is it possible you're just being (or expecting to be) interrupted during those hours? That's a classic reason for delaying challenging work until people have gone home / signed off.


I'm curious about your last statement. I guess I'd be considered a night owl by most people, and certainly have poor sleep hygiene in the eyes of early birds, but I find myself most motivated to do really "intellectual" work about 2-3 hours after dinner. For the record, when I say "intellectual" work I don't specifically mean academic, but more broadly creative. Do you specifically mean something like learning in the classroom setting?

Now I'm also curious about the differences in free-form learning (which I guess I'm talking about) vs the more traditional classroom learning. Do you have any insight on this?


I'm referring to the ability to learn and retain material, doesn't have to be in a classroom. So I don't think that whether you're reading, being lectured, etc. would impact whether the time of day affects you. As for being creative, I imagine the same arguments for mornings being more productive would apply (better physiological state).

I have a feeling you might enjoy "The One World Schoolhouse" by Salman Khan (as in Khan Academy).


So would you say that people who do master complex mathematical concepts also learn from examples, rather than through a deductive approach of understanding pure theory?


Here's Timothy Gowers, a Fields Medalist, on the principle of giving examples first:

https://gowers.wordpress.com/2007/10/19/my-favourite-pedagog...

https://gowers.wordpress.com/2007/10/24/examples-first-ii/

(There are also comments by those who disagree.)

I looked at a couple of instances to see how one of my favourite authors, Knuth, explains things. If you look at how he introduces BDDs (Binary Decision Diagrams), it is with an example first: http://www.cs.utsa.edu/~wagner/knuth/fasc1b.pdf#page=8

When he introduces an algorithm for generating all permutations (http://www.cs.utsa.edu/~wagner/knuth/fasc2b.pdf#page=5), he does not work through the entire algorithm on an example case. However, when he mentions “all permutations … in lexicographic order”, he immediately gives an example of what that means.

(In general Knuth is a big fan of the principle of saying everything twice, once informally and once formally; his literate programming paradigm is also an extension of this idea, where first you explain something informally and then write down the code which is supposed to be a precise version of it.)


My answer would be "yes, but perhaps eventually they've bootstrapped themselves to where they can escape examples."

Everyone learns better starting from examples. Humans are inductive learners. Makes sense, given our situation.

However, from our starting point some people meta-learn abstract reasoning i.e. mathematics, which doesn't come naturally. Maths is hard because it isn't the way we usually think.

Often those people claim to find purely theoretical approaches better for learning, although I'm personally sceptical from attending a decade of seminars and classes at many institutions; people presenting a purely theoretical introduction to a topic are immediately hit with requests for examples, and usually criticised for not starting with examples. But I do not usually attend lectures in maths departments (rather, in related subjects, e.g. CS).


I remain sceptical of your scepticism, since I'm one of those people who gets fidgety when someone wants to explain something to me using examples. It's not that I don't appreciate examples too, but starting with a high-level, abstract explanation clicks much more easily, especially for CS/math topics.


> the notion of individual learning styles is a bit of a myth.

I suspect it's really just a numbers game: Present the information enough different ways and you're more likely to use one which happens to click for a particular student temporarily receptive to a particular mental model.

If so, that means diversity of approaches is good even with no personalization.


Isn't deductive top down and inductive bottom up?


Deductive: A,B,C, ... strictly implies Q (for all subsets of premises A,B,C,...)

Inductive: A,B,C, ... may imply Q (a prior probability distribution)

Abductive: the best explanation from Q1,Q2,... given a subset of A,B,C,...

For your question, I think those terms may not be so well-defined: deductive can be bottom-up (if you'd like) if you call the premises the bottom. But inductive can be too (if you'd like), if you use A,B,C,... as the data points on which you base Q.


Then there is reductive, conductive... This could be the makings of an xkcd.


and finally seductive


Alt-text: A warning that the last one can lead to reproductive.


And speaking in a higher uctive


People can "prefer" what they want.

Problem solvers get the job done.


Who's the problem solver: the teacher or the student? And what job? Learning? When is that 'done'?


So, if you do have data in an arbitrary graph structure and want to do a search like the Airbnb example - essentially, sort nodes by maximal total flow from a set of starting search nodes - clearly Hopcroft-Karp isn’t appropriate, but is there an algorithm that is good for this use case? The author punts to building indices, but is there a way to express that from a graph theory perspective?


My MSc dissertation topic was on graph theory. I studied a lot of crazy stuff: things like random trees, hypergraphs and partitions of graphs into other graphs. I remember once this internet company came to campus and was trying to encourage mathematicians to join them. This lady said: "One person has a PhD in graph theory but originally thought she wouldn't be needed here. Yet the internet is a graph!" I was thinking, yeah, sure, a lot of graph data structures and algorithms are pretty useful, but it was hard for me to see how some of the niche topics I was studying would really, actually, be useful. Luckily mathematics does tend to surprise us like that. (Saying that, I am now a cryptographer and don't use much graph theory, or any of the economics and game theory I did in my undergraduate :-) Game theory in particular I am highly skeptical of, but that's another story...)


> (Saying that, I am now a cryptographer and don't use much graph theory, or any of the economics and game theory I did in my undergraduate :-) Game theory in particular I am highly skeptical of, but that's another story...)

You're a cryptographer and don't use any game theory? Most cryptographic protocols are, or can be, phrased through game semantics. Game theory is very useful; what are you skeptical about?


It is true that you can analyse many cryptographic schemes by modelling them as games. In fact, my PhD was largely focused on this exact topic :-) Our discussion really depends on what we define as “game theory” and “graph theory” I suppose.

The game theory I studied in economics (albeit in my undergrad) I'm very skeptical of, though. It was about exact payoffs and rational agents and loads of assumptions that really don't work in real life. I'd spend time literally analysing optimal strategies in board games. It all seemed kind of pointless. People always said you can apply game theory to political science and wars, but that all seemed like common sense and logic, rather than the specific stuff I was actually studying. I remember solving integral equations to calculate Nash equilibrium bidding strategies in third-price auctions with perfect information and a uniform distribution of value to buyers. The solutions had a lot of maths, but had no bearing on reality whatsoever. What I studied was never really used to make any actual predictions in the real world. It seemed almost as if they were trying to make the subject mathematical to give it some legitimacy. A lot of it is common sense. One could argue that studying it helps you think logically, I guess, but so do lots of things. Ariel Rubinstein is a famous economist, actually one of the founders of game theory, and gave an excellent critique of the whole space. He is a lot more eloquent than I. You should listen to him and let me know what you think.

https://rwer.wordpress.com/2012/11/05/game-theory-a-critique...


Isn't the (only) problem with game theory that the actor optimization function is inappropriate? Rationality as an assumption is the problem, but everything else, all the derived higher-order structure above that, seems, in my eyes, to be pretty useful. There are other contexts, like evolutionary game theory, where the optimization function is just Darwinian fitness, and in that situation most professionals seem to agree that the model is perfectly useful.

So game theory as a framework seems like it would be totally fine if we could only figure out a more psychologically plausible goal function, wouldn't you agree?


Rubinstein makes the point (which I wholeheartedly agree with) that game theory is a lot like fables. They might help you interpret a certain situation in a certain way, but they have no value in actually predicting anything. I'm not sure how it will ever get there. If someone comes up with a goal function and actually makes predictions which work in the real world then yes, I would of course change my mind.


I used to be just as sceptical as you. (And I still am about traditional game theory.) But then I stumbled into evolutionary game theory, which seems much more robust. Have you read anything about it? What's your take? Apparently ESS has been used to model cancer very fruitfully.


Interesting, I’ll read more. Thanks for the perspective.


It seems like you'd have great background to work in the cryptocurrency space.


Sad I'm downvoted, but maybe folks thought I was being flip. I'm not a crypto utopian. I just think theoretical CS, cryptography, game theory, economics -- those are the building blocks.


I just lost track at the Airbnb example. What is the inherent advantage of using a graph in the context of computing scores over 4 million possibilities? Shouldn't a modern SQL database be enough?

PS : Or maybe we could use a blockchain to be even more modern?


The point of the example is to show that graphs (in the Airbnb case, trees) are a good fit. A modern SQL database should be enough, but only because it has strong indexing, which is mostly implemented using B-trees, which are again graphs. So the Airbnb example takes the reader to balanced binary trees (though it makes for long reading).
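To make the indexing point concrete, here is a toy sketch of my own (not from the article), with a sorted sequence standing in for the ordered keys a B-tree index maintains: a range query takes O(log n + k) work instead of scanning every record.

    import bisect

    # Toy "index": listing prices kept in sorted order, as a B-tree index would.
    prices = sorted([120, 80, 200, 95, 150, 60, 300])

    def listings_in_range(lo, hi):
        """Range query via binary search rather than a full scan."""
        i = bisect.bisect_left(prices, lo)
        j = bisect.bisect_right(prices, hi)
        return prices[i:j]

    print(listings_in_range(90, 160))  # [95, 120, 150]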


I thought it was odd to treat trees as graphs. They are, but they're different enough that you don't use the same tools on them.


Hmm? Trees are just acyclic and connected graphs in graph theory. Why wouldn't you use the same tools that apply to graphs in general?


Acyclicity usually makes it possible to come up with a specialized algorithm that works better on trees than an algorithm that needs to handle all possible graphs. For example, many NP-complete problems become fixed-parameter tractable if you fix the tree-width of the graph (essentially, how many nodes you have to bag together until the graph looks like a tree).

Because of this, algorithms on trees are often very different from general graph algorithms in practice. You could use the same tools, but it would be unnecessarily slow and/or complex.
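As a small illustration (a sketch of my own, not from the article), even plain traversal gets simpler on trees: the general graph version must track visited nodes to survive cycles, while acyclicity lets the tree version drop that bookkeeping entirely.

    def dfs_graph(adj, start):
        """General graph DFS: needs a visited set because of possible cycles."""
        visited, stack, order = set(), [start], []
        while stack:
            v = stack.pop()
            if v not in visited:
                visited.add(v)
                order.append(v)
                stack.extend(adj[v])
        return order

    def dfs_tree(children, root):
        """Tree DFS: no cycles, so no visited set is needed."""
        order = [root]
        for c in children[root]:
            order.extend(dfs_tree(children, c))
        return order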


Hold on, I get it. It's because that example was actually all about binary trees (which are often used under the hood by database engines for index scanning).

I'll give this article another shot later, but I'm more lost than before, because I managed to get a grasp of graph theory without knowing a lot about binary trees (I'm more a data analyst than a developer).

PS : Thx vardanator


Oh, thank you too, it's really great to hear feedback.


Or a Hadoop cluster. Oh, I'm showing my age!


The best way to understand graph theory is in a continuous setting. When you approximate a linear PDE with finite differences, you can write the discrete set of equations as a linear system with sparse matrices. Sparse matrices are just graphs. And the natural matrices associated with this graph (adjacency matrix, incidence matrix, transposed incidence matrix, etc.) correspond to the natural linear operators on your space (Laplacian, gradient, divergence, etc.).
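A small numpy sketch of the correspondence (my own example): the Laplacian L = D - A of a path graph is exactly the [-1, 2, -1] second-difference stencil of a 1-D finite-difference discretization, up to the boundary rows.

    import numpy as np

    n = 5
    A = np.zeros((n, n))                   # adjacency matrix of a path graph,
    for i in range(n - 1):                 # i.e. the 1-D finite-difference grid
        A[i, i + 1] = A[i + 1, i] = 1

    D = np.diag(A.sum(axis=1))             # degree matrix
    L = D - A                              # graph Laplacian

    print(L)  # interior rows are [-1, 2, -1]: the discrete -u'' operator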


If you already understand those other things I find it surprising that you wouldn't already understand graph theory.


I don't know, is it common in the USA to finish higher education in computer science and not learn about graphs at some point? Even the basics?

In Europe, most universities that teach computer science have at least one course fully dedicated to graph theory: algorithms for traversal, shortest-path finding, cycle detection, and so on. I didn't even know it was possible to work professionally as a software developer and not already know all of this.


Another thing I've noticed is the number of articles claiming that "college is overrated" or that "CS is not useful for the real world", and in general downplaying the importance of a formal education, and then... "I don't understand Graph Theory". I'm not saying the same people claim all these things, but surely there's some connection between deriding a formal CS education and not understanding a prime example of a CS topic like graph theory?


It's common everywhere software is made that the people who make it don't necessarily have a higher-ed degree in computer science.

I do, and yes, I was required to take a course dedicated to it.


There is a serious amount of cynicism and negativity in these comments. The article is an interesting introduction. The style is a little all over the place, but hardly worth closing the tab over. No one seems to be criticizing the correctness of the content, which is probably the most important part of a technical article to discuss.


>No one seems to be criticizing the correctness of the content, probably the most important part of a technical article to discuss.

shrug

There are plenty of existing resources designed for learning graph theory. It's not good enough for a new one to simply be correct. If someone doesn't think their presentation of the content is going to be better, why bother? I'd sooner share my discrete math textbook with someone than send them this link.


If someone wants to present a fresh take on a subject like this, why drag out the same examples that are always used? My eyes glazed over as I watched the bridges of Königsberg trotted out yet again. Sprinkling curlicues on a tired, overused cliché like this will never freshen it up.

Can all the authors out there just stop trying to dumb things down for a general audience and start at the intermediate level? Just skip all the drawn-out, clichéd intros and get on with it. Don't try to cutesify it; it just comes out condescending. This is also why I can't bear to read documentation from Google.


Thank you for sharing this. I'm doing a master's degree in computing and I think graphs are the most powerful modeling tool in the whole of computer science.


Graph theory is amazing. One of my favorite subjects.

If you're a liberal-arts kind of person and/or have scars from prior experiences with mathematics, I recommend https://www.amazon.com/Introduction-Graph-Theory-Dover-Mathe...


With the top few posts being somewhat negative, I was expecting a bad article. It is longer than I'd care for on a quick read, but it looks solid, and I would consider a hard copy, especially to share with my kids to help them learn. The added "color" really helps, I think.


I must also recommend Sarada Herke's series of videos on graph theory; they are excellent.

https://www.youtube.com/playlist?list=PLGxuz-nmYlQOiIOriTXME...


Here's a pro-tip: "I don't understand XYZ" is the humble-brag of academia. And the more basic XYZ is, the more points you win. Try the following: "I don't understand 1+1=2." Sit back and enjoy your collaborators as they stare in wonderment at your sage lack of understanding.


I don't think good mathematicians confuse rigorous understanding with intuitive understanding as you suggest; rather, the two complement each other.

Further reading: https://terrytao.wordpress.com/career-advice/theres-more-to-...


Saying that you intuitively do not understand XYZ often requires intellectual honesty as well as understanding it rigorously, even more so in academia. So saying that you intuitively do not understand XYZ can be used as a signal.


That's a great, yet easily forgotten, distinction.


This reminds me of a dinner party game described in a David Lodge book where highly competitive literature professors could only score points by naming books that they haven't read, but which everyone else has.


I have noticed that more often than not, when I say "I don't understand XYZ", I actually mean "Are you really, really sure that you understand XYZ? Because what you say makes no sense whatsoever."


Yes, it's a challenge to the person you're speaking to.

It's a great way just to learn without losing face if you need to. The alternative sentiment is "I'm lost, please help", which may not be accurate.

There does need to be some way to express, "I am competent enough to understand this, yet I do not understand this, so something is going wrong and it isn't my stupidity".


One hazard I am still learning to adapt to after many years is saying this to US-based colleagues. They tend to interpret it literally. I have to switch grammars in that context and speak much more directly.


It took a bit of work to understand what was going on here though: "From this proposition it will follow, when arithmetical addition has been defined, that 1 + 1 = 2." https://en.wikipedia.org/wiki/Principia_Mathematica#/media/F...
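For contrast, in a modern proof assistant the fact is immediate once the natural numbers and addition are defined; in Lean 4, for example, both sides of the equation compute to the same numeral, so reflexivity closes the goal:

    example : 1 + 1 = 2 := rfl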


It's not just academic topics. It also works if you express mystification of some aspect of popular culture that any "normal" person understands. Even if you are genuinely mystified by it.

Like how is it that so many people can effortlessly memorize the names of hundreds of actors and recognize them on sight, and why do they bother?


Can something be effortless and a bother at the same time?


I spent a minute trying to figure out whether this is a meta humble-brag.


Whatever do you mean, Socrates? Surely all men know the meaning of XYZ?


While I agree with your sentiment, the more I learn, the more I realize there are some profound subtleties that get glossed over both by the people teaching a subject and by the people claiming to understand it. I think it's a sign of maturity to realize that even seemingly simple and well-studied problems aren't very well understood by most people.

I've been humbled many times thinking I know something when in fact I really had a shallow understanding. Here are a few examples:

* Stable [1] in-place sort. Merge sort requires extra O(n) space; Quicksort isn't stable. The solution is either GrailSort [2] or WikiSort [3], both of which I still don't understand to this day.

* Gaussian elimination. It turns out Gaussian elimination might blow up exponentially in intermediate bit complexity (afaik, it's still an open problem whether it actually does), and other methods have to be employed to guarantee a polynomial bit length. [4]

* Fast Fourier Transform. How many bits do you need? When does using 64-bit floating point fail, and why? The bit complexity is polynomial and understood, but it turns out the most straightforward method to figure this out uses modular arithmetic to get a handle on the complexity. Also, what's the minimum known bit complexity? I'm not sure I know what's bleeding edge for this. [5]

* Dynamic programming. Only recently has it been shown that, unless P=NP, O(n^2) is basically optimal [6] (a textbook sketch of the classic quadratic algorithm follows this list). Also, what if you're comparing similar strings? What are the best algorithms in terms of edit distance and other efficiencies? Ukkonen [7], Hirschberg [8], and checkpointing [9], but boy did it take me a while to get a good description of all of them, with checkpointing still unclear to me.
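For reference, here is the classic quadratic table-filling algorithm that result is about (a textbook sketch, not taken from any of the references below):

    def edit_distance(a, b):
        """Classic O(len(a) * len(b)) dynamic program for Levenshtein distance."""
        m, n = len(a), len(b)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            dp[i][0] = i                             # delete all of a[:i]
        for j in range(n + 1):
            dp[0][j] = j                             # insert all of b[:j]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                cost = 0 if a[i - 1] == b[j - 1] else 1
                dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                               dp[i][j - 1] + 1,         # insertion
                               dp[i - 1][j - 1] + cost)  # substitution
        return dp[m][n]

    print(edit_distance("kitten", "sitting"))  # 3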

I've heard, and agree with, the claim that 95% of programming doesn't require any deep CS knowledge. The flip side is that 5% of the time you will, and for those 1-in-20 cases where you encounter a problem that requires theory, you're dead in the water unless you know how to identify it, how to solve it, or where to look for solutions.

These are just a few. The more I learn the more I realize how little I actually know.

[1] https://en.wikipedia.org/wiki/Sorting_algorithm#Stability

[2] https://github.com/Mrrl/GrailSort

[3] https://github.com/BonzaiThePenguin/WikiSort

[4] Chee K. Yap, "Fundamental Problems in Algorithmic Algebra", Chapter 10, Linear Systems, 10.3 Matrix Inversion

[5] Chee K. Yap, "Fundamental Problems in Algorithmic Algebra", Chapter 1, Arithmetic, 1.4 Modular Fast Fourier Transform

[6] https://rjlipton.wordpress.com/2015/06/01/puzzling-evidence/

[7] https://www.cs.helsinki.fi/u/ukkonen/InfCont85.PDF

[8] https://en.wikipedia.org/wiki/Hirschberg%27s_algorithm

[9] https://github.com/drpowell/sequence-alignment-checkpointing


> Dynamic programming. Only recently has it been shown that, unless P=NP, O(n^2) is basically optimal. [6]

This is about a dynamic programming approach to the edit distance problem. Not dynamic programming itself, which is a vague term (I think it's basically "use a bottom-up approach and avoid doing the same work twice")


Yep. It's a complete oversimplification to consider all DP problems O(n^2). Anybody familiar with modern algorithms will tell you that certain DP problems can even be O(n log log n)!

https://en.wikipedia.org/wiki/Van_Emde_Boas_tree

For example, implementing vEB trees or even (countably distinct length) dictionary-based tries can speed up execution by a factor on the order of n / log n.


For programmers, "understanding" graph theory begins with being able to implement a traversal and knowing how depth-first differs from breadth-first, what pre-order, in-order, and post-order are, what roles the stack and the queue play, and how it can be done recursively or with a loop.
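A minimal sketch of that starting point (Python, adjacency lists assumed): the only difference between breadth-first and depth-first here is whether the frontier behaves as a queue or as a stack.

    from collections import deque

    def traverse(adj, start, breadth_first=True):
        """BFS pops from the front (queue); DFS pops from the back (stack)."""
        frontier = deque([start])
        visited = {start}
        order = []
        while frontier:
            v = frontier.popleft() if breadth_first else frontier.pop()
            order.append(v)
            for w in adj[v]:
                if w not in visited:
                    visited.add(w)
                    frontier.append(w)
        return order

    g = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
    print(traverse(g, "a"))         # BFS: ['a', 'b', 'c', 'd']
    print(traverse(g, "a", False))  # DFS: ['a', 'c', 'd', 'b']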


For the longest time I couldn't understand electrical theory, until a YouTuber explained it very simply: it won't move unless it can see a path to ground, except you make it do all the work while it's attempting to reach ground.

That's all it took for the theory to fall into place.


Small quibble:

> note that we are still dealing with pseudocode

He's got a funny idea about pseudocode!

This is very timely. I've been writing an app to draw polyhedra and it's slowly been dawning on me that there's a lot more graph theory involved than there is 3D geometry or vector maths.


If anyone has trouble with graph theory or other basics of discrete mathematics, then the introductory works of Matoušek and Nešetřil will most likely help them very much.


I really think what this article needs is an editor. From a quick glance I can see quite a bit of info but the way it is presented irks me.


Don’t worry, it’s just a theory...


Was expecting this to be a pitch for neo4j.


Understood but forgot. Several times.


Thanks @signa11 for sharing this.


How long did this take to write? It is rather reminiscent of War and Peace, in that it's undoubtedly an impressive work, but there is no way I am going to read it.


That's a shame; really good long-form writing is often well worth your time. This particular piece had some grammar issues that reduced its quality, but the content was still enjoyable.


Nice work. I stopped reading about 2 volumes in, though.




