Applied Category Theory (mit.edu)
508 points by lelf 36 days ago | 106 comments



If anyone is looking to get into Category Theory as a programmer, I would recommend giving Bartosz Milewski's Category Theory for Programmers a shot[0]. Quite good, and you can read it as a blog post on that site or get the ebook/physical book to leaf through.

For a more theoretical/less applied approach that's still relevant to computer science/programming, check out Benjamin Pierce's Basic Category Theory for Computer Scientists[1]. It's a very good introduction to the parts of Category Theory that are useful for [theoretical] CS, and delves into semantics of programming languages, types, and concurrency a fair amount.

[0] https://bartoszmilewski.com/2014/10/28/category-theory-for-p...

[1] https://mitpress.mit.edu/books/basic-category-theory-compute...


For a first introduction to category theory, from the masters thereof, I would highly recommend the book by Lawvere and Schanuel, "Conceptual Mathematics: A First Introduction to Categories". It almost doesn't look like a mathematics book, but believe me, these guys know what they are talking about. They distill some very high-brow ideas down to something us mere mortals can grok.


I watched a little less than the first half of Bartosz's video lectures, and I still don't see what the fuss is about from the perspective of an engineer who is looking to get better at engineering.

I can't find something there that would help me implement better systems. What I do see is an attempt to shoehorn a mathematical model onto certain programming paradigms in order to be able to say that these paradigms are instances of the model. In short, a lot of talk about abstractions (or "abstract nonsense") but very little of engineering significance. Please let me know if I'm missing something by listing concrete examples of how category theory can help me with engineering problems.

Disclaimer: I'm not a mathematician.


I think that the most plausible way that Category Theory is going to improve your life as an engineer is that it will sneak into the way your tools are designed.

You don't personally need to know about the Yoneda lemma to benefit from a better designed programming language whose type system was informed by algebras of types, allowing it to leverage exhaustive pattern matching.

With that said, learning about techniques used in functional programming that have their origin in category theory (maps, folds, algebraic data types, monads, monoids, etc) can allow you to think about how to solve problems (even in non-functional languages) in ways that can be more "natural" to the problem, in that the abstractions used resemble the problem description more closely. That ability to decompose difficult problems in new ways can definitely make you a better engineer.
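
To make that concrete, here's a minimal Haskell sketch (my own toy example, not tied to any particular book or lecture):

    -- An algebraic data type: a Shape is exactly one of these constructors.
    data Shape
      = Circle Double           -- radius
      | Rectangle Double Double -- width, height

    -- Exhaustive pattern matching: the compiler can warn if a case is missing.
    area :: Shape -> Double
    area (Circle r)      = pi * r * r
    area (Rectangle w h) = w * h

    -- A map and a fold compose the small pieces into a bigger one.
    totalArea :: [Shape] -> Double
    totalArea = sum . map area

The point is just that the data declaration mirrors the problem description, and the compiler keeps the case analysis honest.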


I am a mathematician and generally bearish on category theory in programming. The best argument for it I've seen so far is that it can inform the design of your system to give it desirable properties, like how the DVCS Pijul [1] bases its data structures on patches and its merge operation on categorical pushouts [2] -- also explained at length in a series of blog posts [3]. I found the whole idea and reasoning behind the merge system very convincing, and am not sure it could have been arrived at or explained as clearly without category theory guiding the way.

[1] https://pijul.org/ [2] https://arxiv.org/abs/1311.3903 [3] https://jneem.github.io/merging/


Doesn't this post suggest you are "bullish" rather than "bearish?" Or am I misreading it?


Maybe I miswrote it. I’ve been very unimpressed with all crowbarrings of category theory into programming other than this one. Other attempts feel like language for its own sake, while Pijul really seems like a Grothendieck-style, compass-bearing use of category theory. The point of using it wasn’t just language; it really informed the design of the system.


I'd say that you can safely ignore CT until you find the need or desire to learn more. The field is still very young and it probably shines most within pure mathematics, for now at least.

I think of CT as the study of "composability" in its purest form. Mathematicians are finding that this language is really good at translating domain-specific knowledge between disparate fields, not unlike the babelfish from the Hitchhiker's Guide.

FWIW, some researchers are apparently finding interesting applications in the development of biological models. John Baez has talked about applications in this direction via control theory [0].

[0]:https://classes.golem.ph.utexas.edu/category/2015/04/categor...


Let me expand a bit on the origin of category theory in mathematics.

Mathematics is most commonly described in the language of set theory, where objects are constructed from basic building blocks to create bigger blocks. This is much in the same way as algorithms are described in something concrete, say a programming language.

At some point, certain high-level constructs started to appear, where the same high-level behaviour was exhibited in otherwise completely disconnected fields. Explaining what is going on then started to become messy; for example, take the following quote from Hatcher's Algebraic Topology concerning covering spaces: "This is often called the Galois correspondence because of its surprising similarity to another basic correspondence in the purely algebraic subject of Galois theory". The problem is that in order to understand this, the reader has to understand two fields which are not at all connected, and furthermore, the relationship looks nothing alike in the language of set theory.

Category theory was invented as a formal language which could describe such high-level behaviour. In this language we could now say that there is a "contravariant equivalence of categories", which helps if you've seen it before and doesn't help at all if you haven't.

So category theory is massively useful in understanding the similarities between fields, and can be used to transfer understanding from one field to another. It is also useful for understanding more abstract fields where the intuition must be gained by analogy with something more concrete. It doesn't do much in itself, and I don't think it makes much sense to study the mathematical aspects of category theory without any good examples to apply it to.

What does that have to do with engineering? I'm no expert here, and haven't yet read the linked book, but one could hope that certain similarities could be described in terms of this new language. Then concepts which now only live on as the expert experience of working practitioners might be written down and hence taught directly rather than indirectly. One could hope that certain problems have common solutions which can be re-used across fields.


Category theory studies the composition of (things that look like) functions. So each thing has a domain and a codomain, and if we compose A --f--> B with B --g--> C then the intermediate B goes away. This doesn't sit nicely with how parallel composition works, e.g. if I talk to Google, that doesn't prevent you from talking to Google.
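
To make the sequential flavour concrete (a tiny Haskell sketch of my own illustrating the point):

    f :: Int -> String      -- A --f--> B
    f = show

    g :: String -> Int      -- B --g--> C
    g = length

    h :: Int -> Int         -- A --g.f--> C; the intermediate String is gone
    h = g . f               -- h 123 == 3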

That's the essence of why category theory works nicely with pure functional programming, but beyond that simply fails to be useful. In particular, there has been virtually no progress in useful categorical models of parallel computation.



I don't see much / any category theory in the two papers linked. So I'm not sure what to say.


Category theory is a fantastic tool for understanding programming language design. As a user, you don't need to understand category theory, but you benefit when the language designers do.

As a software engineer, category theory is sometimes useful in API design as well; although more often than not, your API is constrained by the existing ecosystem too much for category theory to give you much benefit (and, frankly, simple enough that category theory is overkill).


A very abstract idea of its relevance, insofar as I understand it: engineering generally boils down to putting together available things to create some whole, and category theory gives a vocabulary for describing how things (considered very abstractly) can be put together.

So for instance, one of the big applications to CS is in type systems, which describe what kinds of expressions/values can be put together, and in what ways, to create valid programs.


Most engineering is based on mathematics because proofs are a powerful support for any idea you plan to implement. It's one thing to throw something together and have it work, and a totally different thing to prove it is optimal in some way.


That's seldom how ideas from category theory are used in practice. E.g., most Haskell programs aren't proven correct.

(But if you are into that kind of proving, then there are languages a bit like Haskell, like Agda for example, that revel in those proofs.)


The course is based on this free book http://math.mit.edu/~dspivak/teaching/sp18/7Sketches.pdf


I am currently reading it; it is very nicely written and accessible. Each chapter starts off gently, so you can skip to the next whenever you feel lost, only to come back later.

So far: introduction/preorders, monoidal preorders/wiring diagrams, categories/application to database schemas.

For those reading on an ebook reader, the book sources are available on its arXiv page https://arxiv.org/abs/1803.05316 so you can build a custom PDF for your device.


Note.

A mathematician's idea of "applied" and a normal person's idea of the same bear little resemblance.


It may be professional deformation speaking, but they seem to be using "applied category theory" in a coherent and natural sense: they mean the theory of monoidal categories, (a) viewed as a generalization of linear algebra, and (b) omitting logic and programming language semantics as applications.

Given the centrality of linear algebra to applied mathematics (even in places where you wouldn't first expect, like the theory of regular expressions and finite automata), this seems like a perfectly sensible coinage.

(Even though I must admit I get my back up at the omission of categorical proof theory, it's perfectly natural as a sales pitch -- type theorists don't need to be convinced of the utility of category theory!)


> Given the centrality of linear algebra to applied mathematics (even in places where you wouldn't first expect, like the theory of regular expressions and finite automata)

Where can I read more about this?


Stephen Dolan's pearl "Fun with semirings" might be relevant:

http://stedolan.net/research/semirings.pdf

But I would also like to know what other resources Neel has to recommend.


If you like Matlab more than Haskell, try https://www.goodreads.com/book/show/11768822-graph-algorithm...


"Applied Category Theory" almost seems a bit tongue-in-cheek to me, given Category Theory is designed to distill the concepts found in a bunch of different "applied" fields into one, theoretical, framework. (That was my impression from the intro lecture, at least)


They should call the class “Concrete Nonsense” (since category theory itself is sometimes described as “abstract nonsense”).


Which phrase, I note with profound glee, has its own Wikipedia page for precisely this usage:

https://en.wikipedia.org/wiki/Abstract_nonsense


Maybe “Brutalist Nonsense”


Generally, I tend to care more about math if it relates to my specific interests. I'm in ML and robotics so I've found it useful to understand algebra, topology, real analysis.

I've tried to get into Category theory a few times but couldn't keep my motivation high enough once things got too technical. Anyone know how category theory could be useful for someone in ML or robotics?


Recently, I've been into learning about CRDTs, a class of data structures which can be edited independently and merged without conflicts. They can be used to build experiences similar to Google Docs (although GDocs uses a different approach called OT).
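
To give a feel for the idea, here is a minimal grow-only counter (G-Counter) sketch in Haskell; the names and the Map representation are just my own choice of illustration:

    import qualified Data.Map.Strict as M

    -- Each replica increments only its own slot; merge takes the pointwise max.
    type ReplicaId = String
    newtype GCounter = GCounter (M.Map ReplicaId Int) deriving Show

    increment :: ReplicaId -> GCounter -> GCounter
    increment r (GCounter m) = GCounter (M.insertWith (+) r 1 m)

    merge :: GCounter -> GCounter -> GCounter
    merge (GCounter a) (GCounter b) = GCounter (M.unionWith max a b)

    value :: GCounter -> Int
    value (GCounter m) = sum (M.elems m)

Because merge is commutative, associative, and idempotent, replicas can exchange states in any order and still converge.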

I'm interested in learning Category/Set Theory to better understand the underlying math of how they work.

I have very little experience with ML/Robotics, but CRDTs might be useful for distributed learning, or coordination between independent robots.


I've been taking a more domain-theoretic approach to understanding CRDTs (focusing more on the semantics than the syntax), but it appears that domain theory does have some connections to category theory:

https://ncatlab.org/nlab/show/domain+theory


It's funny you mention them, I've been thinking a lot about CRDTs for online multiplayer games where you have a bunch of concurrent players changing a game state.


What class was this if I may ask?


When I said "class", I really just meant "type".

CRDTs are relatively new so there may not be many classes yet that teach them. If you want to learn about them, I recommend you first watch this video: https://www.youtube.com/watch?v=OOlnp2bZVRs&list=WL&index=56... and then read this paper: https://pages.lip6.fr/Marc.Shapiro/papers/RR-7687.pdf . The video explains a lot of the math in a very easy to digest way and will make the paper much easier to understand.


I’m sorry, I misread. I was aware of CRDTs and I thought that they were taught in some class.


My understanding is that category theory can be considered abstract function theory. If we can prove that a program transformation (e.g. one performed by a Haskell compiler) is equivalent to automatic differentiation, then according to category theory we can write plain Haskell code to train a neural network, instead of writing extra graph-building code in, say, TensorFlow. This paper basically shows this: http://conal.net/papers/essence-of-ad/essence-of-ad-icfp.pdf
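
The paper's construction is categorical, but you can get a feel for the idea with a tiny dual-number sketch (my own toy code, not the paper's):

    -- A dual number carries a value and its derivative.
    data Dual = Dual Double Double deriving Show

    instance Num Dual where
      Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
      Dual x dx * Dual y dy = Dual (x * y) (x * dy + dx * y)
      negate (Dual x dx)    = Dual (negate x) (negate dx)
      fromInteger n         = Dual (fromInteger n) 0
      abs    = undefined    -- not needed for this sketch
      signum = undefined

    -- Differentiate any function written against the Num class.
    diff :: (Dual -> Dual) -> Double -> Double
    diff f x = let Dual _ dx = f (Dual x 1) in dx

    -- diff (\x -> x * x + 3 * x) 2.0 == 7.0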


You may be interested in this paper too https://arxiv.org/abs/1711.10455

I think that, given a language with AD, you already don't need a separate language for graph building. My understanding is that's one of the goals for the Julia team, and I've seen other nice-looking packages for AD in other languages:

* https://fluxml.ai/2019/02/07/what-is-differentiable-programm...

* http://diffsharp.github.io/DiffSharp/examples-inversekinemat...

I'm also very excited about being able to design neural nets using ideas from functional programming. I love the Keras API but I don't think it's going to be the obvious API a couple of years down the line.


And for a completely different application of differentiation to programming, see http://strictlypositive.org/diff.pdf


The structure of various ML primitives can be considered under CT. Doubly so since in many cases you can in fact "go backwards" through an ML structure in a way normal functions can't, meaning that categorical duals might have more useful interpretations in your world.

So if anything, it might be MORE applicable for composing ML networks.


Yes. And other bits commonly associated with functional programming should also work together really well with ML. After all, most ML models are explicitly thought about as a composition of layers (i.e. functions).

But so far, the best ML libraries can be found in impure / imperative languages like Python.


Is there any paper or book you like that covers this topic in more depth?


Just look for introductions to duality in math and category theory. It's not the sort of subject that'll be broken out individually, but rather an assumed consequence.


I'm currently working on developing a double category of hybrid systems with applications to parallel composition and model reduction in robotics. We should have a math paper up on the arxiv within a month, and a robotics paper up shortly thereafter.


Looking forward to it! And if there's a preprint you're comfortable sharing please let me know!


Check out UMAP [1]. It's an alternative to t-SNE for dimension reduction that's grounded in topological category theory.

[1] https://arxiv.org/abs/1802.03426


Have you seen Robert Ghrist's book, "Elementary applied topology" ? There's plenty of category theory in there, even though it is mostly implicit. (And applications to ML and robotics.)

https://www.math.upenn.edu/~ghrist/notes.html


Yes! I skimmed it and really enjoyed it although I mostly remember seeing algebraic topology and not so much category theory, will give it a closer look.


David Spivak has a company, http://conexus.ai, that is applying category theory to data migration as described in 7 sketches. It will soon be expanding and soliciting proposals to fund applied category theory start-ups in many domains, as well as providing some example proposal ideas. Watch http://ventures.conexus.ai over the coming weeks for details.


What's category theory good for? It's a serious question, so please don't say "monads"; I understand monads just fine without knowing category theory. I'm down to put in the work if the reward is there.


First the joke answer. In mathematics, once you understand the proof of something, we say that it is trivial. Category theory is about making things trivially trivial.

Now the serious answer. Category theory is about abstracting away the details of mathematical domains so that fundamentally parallel proofs and constructions can be seen to be essentially the same. It also allows you to create constructions that allow problems in one field to be transformed into problems in an apparently unrelated one where you might find them easier to prove.

The result is that in a new field, you recognize the category and suddenly have a whole bunch of results, and some important constructions to look at. Such as the tensor product.

And in existing fields you can do things like use homotopy theory to turn topology problems into algebra problems, where they are often easier.


Category theory is about making trivial things trivially trivial.


Imagine if you took a mathematical construct and replaced all of the familiar terminology with gensyms. For example:

"A florb is brazzle of carnatious snoggles. A florb is itself a snoggle and so can be a fnizzle of another florb."

This is actually the definition of what is normally called (in English) a SET, under the following mapping of vocabulary:

florb = set

brazzle = collection

carnatious = distinct

snoggle = object

fnizzle = member

Category theory is about extracting the fundamental structure of mathematical definitions so that these kinds of equivalences can be studied with mathematical rigor.


I agree with others that this comment has increased my understanding greatly. Perhaps since this is a linguistic example, it resonates. Thank you for replying.

To add a bit of info, there is a system named Attempto Controlled English which uses a subset of Natural Language to express axioms. It is available here https://github.com/chrisdone/ace and here https://github.com/Attempto/APE

How would one associate patterns of axioms with appropriate relations in Category Theory ?


Not a clue. I don't actually know much at all about category theory, just enough to offer up that explanation of what it's about. And I had never heard of Attempto before you mentioned it.


Awesome, thanks for this.


Glad it was helpful.


I'm kind of surprised no one has said this, and I'm kind of a novice, so I'll probably get it wrong.

To me, the interesting thing about category theory is that it draws out the significance of relationships in a system. We often spend a lot of time thinking about the things, themselves, in a system.

- What are their properties?

- What do they mean?

- What is their identity?

In category theory, almost all the significance of individual things is erased. Instead, you often group things up into sets and other structures, and then call the whole set an object. Then you think about the existence of morphisms from one kind of object to another. A morphism is kind of like a function, in the sense that something goes in, and something else comes out. Except that we're not concerned about the properties of what's going in and coming out, so the function body doesn't really matter. A morphism is something more general than any specific function. It's more like a function declaration--specifying that something does exist without saying anything concrete about what it does.

So, you're operating on a level of abstraction that removes basically all unnecessary details about things and relationships, other than existence and some basic rules for how those relationships can be composed to form new ones. As it turns out, there's a whole lot you can say and prove about categories once you know a bit about their structure.
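
In Haskell terms (just my reading; the class below is the one from Control.Category, reproduced and instantiated here so the snippet stands alone), this "composition-only" view of morphisms is captured directly:

    import Prelude hiding (id, (.))

    -- All a category knows about its morphisms is identity and composition.
    class Category cat where
      id  :: cat a a
      (.) :: cat b c -> cat a b -> cat a c

    -- Ordinary functions are one instance; what any given function computes
    -- is invisible at this level, only how the arrows plug together.
    instance Category (->) where
      id x  = x
      g . f = \x -> g (f x)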

In my opinion, the real power is that any given system can often be "projected" into categories in a multitude of different ways, and depending on the structure of those categories, you can apply proofs from category theory to say interesting things about that system.

Going further beyond that, category theory concerns itself with how categories relate to each other, by considering categories themselves as objects and relationships between them as morphisms.

Why does this matter to programmers?

Well, we often want to design systems that are predictable, no matter what the actual identity of the data that traverses them. This is abstraction, and we do it on many levels, all the time. We also want to chain together our abstractions. This combination of abstraction and composition is something category theory is well suited to analyze.

I'm probably off here in some important ways, but this is my layperson's takeaway from spending some time trying to understand what all the fuss was about.


To really drive home the point about the level of abstraction, I like to think of a category of domino tiles.

It kind of makes a few things apparent.

Tiles would be the morphisms, and the objects would be the number of spots on the tile ends.

So the first insight: morphisms are the actual object of study. The “objects” of a category are just a consequence of how the morphisms may, or may not, compose.

My first hurdle when learning this was being distracted by trying to understand what objects were; sometimes they would be types, sometimes sets, and it was never really clear what their cardinality would be. As if the theory hid important details that you had to keep track of implicitly. But it doesn’t: objects really are inconsequential, they are just there to be a place for morphisms to meet, nothing more. Just like the number of dots on the domino tiles. We could have used colors instead, and the tiles would compose in exactly the same way.

Second intuition: there’s no computation or anything else going on, just as you say about the functions. The only thing of concern is the composability of those things called morphisms, be they functions or domino tiles.
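
A toy rendering of the domino picture (my own sketch; I use Maybe to model the fact that composition is only defined when the ends match):

    -- A tile is a morphism from its left pip count to its right pip count.
    data Tile = Tile { leftEnd :: Int, rightEnd :: Int } deriving Show

    -- Tiles compose only when the ends meet; the shared end disappears.
    compose :: Tile -> Tile -> Maybe Tile
    compose (Tile a b) (Tile b' c)
      | b == b'   = Just (Tile a c)
      | otherwise = Nothing

    -- compose (Tile 2 5) (Tile 5 3) == Just (Tile 2 3)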

In the lecture series linked a few places in this thread, Bartosz makes a fantastic point of this when deriving some properties of surjective and bijective functions using only composability.


Tai-Danae Bradley has a very nice write-up called "What is applied category theory?" -- https://arxiv.org/abs/1809.05923

Among the more readable category theory resources I'm currently reading.


Well, monads in programming are modeled on category-theoretic monads, so you do know a bit.

Monads, functors, arrows, etc are just patterns that show up a lot in functional programming that also happen to have analogues in category theory. This isn't too surprising, because category theory basically started from mathematicians trying to formalize common patterns in disparate areas of math. Categories pop up all over the place in math, and math of course is the main abstraction that we formally model just about anything in reality.

Given how ubiquitous they are, I think it's worth it to have a working, informal knowledge of category theory. It doesn't take serious time or study to do that.


I think the idea is that it abstracts away a bunch of the area-specific stuff from other fields, distilling it all down to generalized reusable concepts. For instance, I watched the first lecture and saw immediate connections to the compatibility matrices used in programming language analysis and type theory. I didn't "learn" anything new from the lecture (it was all introductory), but I'm sure that if I keep up with the future lectures I'll see more that I can actually apply to my work.


Example: the first question on the first problem set [1] shows how the ideas of the Least Common Multiple and the Greatest Common Divisor are analogous to ⊤ and ⊥ in PLT, or "any" and "never" in TypeScript. (Specifically, the Join and Meet operations)

[1]: https://ocw.mit.edu/courses/mathematics/18-s097-applied-cate...


LCM and GCD are join and meet (union supertype and intersection subtype in PLT), not top and bottom. Top/any would correspond to the product of all numbers (union of all factors), and bottom/never would correspond to 1 (intersection of all factors).
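
In Haskell terms (my own small sketch; `divides`, `meet`, and `join` are just names I picked for the divisibility ordering):

    -- Order numbers by divisibility: a <= b  means  a divides b.
    divides :: Integer -> Integer -> Bool
    divides a b = b `mod` a == 0

    -- Meet (greatest lower bound) and join (least upper bound) in this order.
    meet, join :: Integer -> Integer -> Integer
    meet = gcd   -- analogous to an intersection type
    join = lcm   -- analogous to a union type

    -- meet 12 18 == 6,  join 12 18 == 36,  1 `divides` n for every n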


I don’t know PLT very well but wouldn’t top/bottom also be considered terminal/initial objects in categorical terms?


For that you just need Lattice theory ...


Oh right, "Lattice" is the word I was looking for. It's been too long. Would that be considered a subset of category theory, a precursor to it, or another field that it somewhat "unifies"?


Lattice theory is its own field that exists outside of and before category theory. Every lattice forms a category, though not vice versa.


I don't know much about CT, but I believe that if you want to work with Higher Kinded Types, you'll keep going back to ideas related to CT (functors, endofunctors, natural transformations...).

I'm not sure if taking a course in CT is the most efficient way to learn how to apply such concepts in computer programming, though. Maybe someone with more experience can give their opinion.


One thing it can be used for is to prove certain propositions that might otherwise be hard to get traction on.

Its concepts are so abstract that many different things can map to them. So if you can prove that something is a group, or a monoid, or whatever, then by the rules of category theory, you can now prove certain characteristics of that thing. This could wind up being a vital step in a longer proof.

When this happens, it usually feels like you moved your proof forward without actually gaining any new understanding (because the concepts are so abstract). Therefore, this is often referred to as "general abstract nonsense," but in a tongue-in-cheek manner, because you actually did derive concrete value.

https://ipfs.io/ipfs/QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1m...
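
A hedged, everyday analogue from the programming side (my own example, a far cry from real proof work): once something is known to be a monoid, generic machinery applies to it with nothing further to show:

    import Data.Monoid (Sum(..))

    -- mconcat works for *any* monoid; establishing "this is a monoid" once
    -- buys every such generic result for free.
    totals :: [Sum Int] -> Sum Int
    totals = mconcat

    -- getSum (totals [Sum 1, Sum 2, Sum 3]) == 6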


Differentiation (of functions) is a product-preserving functor, which makes the Chain Rule the rule by which it respects function composition, which makes automatic differentiation a matter of writing out the syntax tree of functions and "folding it up" via a catamorphism (a generalized fold).

https://jtobin.io/ad-via-recursion-schemes

Yes, really. I wasn't making that up. Automatic differentiation is a page or two of code given enough categorical background.
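
To sketch the shape of it (an ordinary recursive fold of my own, rather than the recursion-schemes version in the post):

    -- A tiny expression language and a fold that evaluates value and
    -- derivative together, applying the sum/product rules at each node.
    data Expr = X | Const Double | Add Expr Expr | Mul Expr Expr

    evalD :: Double -> Expr -> (Double, Double)   -- (value, derivative) at x
    evalD x X         = (x, 1)
    evalD _ (Const c) = (c, 0)
    evalD x (Add a b) =
      let (u, du) = evalD x a
          (v, dv) = evalD x b
      in (u + v, du + dv)
    evalD x (Mul a b) =
      let (u, du) = evalD x a
          (v, dv) = evalD x b
      in (u * v, u * dv + du * v)

    -- evalD 2 (Mul X (Add X (Const 3))) == (10.0, 7.0)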


To a programmer, the real nuts and bolts of category theory aren't a ton of use. People have had some success reverse engineering some of these concepts into programming technologies, but it is a very long road to learn enough CT to either replicate or benefit from that work.

There's also the common misinterpretation that mathematical objects like monoids are well understood through CT. That's pretty false. There's the "intermediate level" of abstract algebra where you can study these objects and the world they live in much more directly and immediately. There's some real benefit from learning some parts of abstract algebra. Trying to learn CT in order to learn abstract algebra is complete overkill, though.

In both of these cases, my advice boils down to "just learn the things you're interested in directly as opposed to 'via' CT". That's because CT is really what happens when you boil and boil and boil away these different fields and are just left with some very universal ideas. Abstract nonsense.

If you're a mathematician and these concepts are your daily bread and butter... hey, you might still not want to learn CT because it's better to just work with what you're studying directly.

Okay, objections and disclaimers over. The thing you can learn from CT which is interesting and valuable is the very powerful, universal perspective of "if you want to understand something, look at how it relates to other things". Category theory is absolutely focused on this idea and most of its insight is that this idea, taken in its extreme, is extraordinarily far reaching.

Again, you can probably learn this more practically by studying where it shows up in something you actually care about. It won't be a theorem there, but instead just sort of "in the fabric" of the field, a major technique used in proofs. CT is nice because it unified those "folk theorems" from all over mathematics.

More relevantly, in programming the CT perspective lies close to the FP perspective where "transformations" of data and "pipelines" are given center stage as opposed to "objects" and "procedures". It's also embedded in ideas of domain specific languages and both concrete and abstract interpretations, thereof.

Most likely, you want to learn the actual techniques of interest and then maybe later just learn a bit of CT to get a sense for "how they all connect". I think there are opportunities for CT to influence your programming if you were to become familiar with it, but they're still kind of fuzzy today.

Finally, if you want to learn CT I recommend you learn abstract algebra first via Aluffi's "Algebra: Chapter 0". It's secretly a CT book, but is not trying to be so abstract. You can also learn a lot from Schanuel and Lawvere's "Conceptual Mathematics" which, I feel, is one of the better books for establishing the "category theoretic POV".


Second the Aluffi book. A brilliant treatment of Abstract Algebra from a categorical point of view.


What I usually do, and also did here, to evaluate if a course interests me is to look at some of the assignments.

Here they present three PDFs with a number of assignments of varying difficulty: https://ocw.mit.edu/courses/mathematics/18-s097-applied-cate...

If you find that you would like to be able to answer a high enough number of the assignments, you know that the course is for you. If not, the lectures may still be of interest, but that is much less likely the fewer of the assignments you think are worthwhile for you.

PS: Question 4 in problem set 3 is quite (in)famous :-) Just the headline:

> Question 4. A monad is a monoid in a category of endofunctors (harder/optional)

So, if you want to be one of the few people who actually understand that sentence when it is quoted as a joke (usually in programming forums whenever monads come up), this is your course.


This is a course. Not necessarily a good way to an understanding of that line.


What??? Exactly the opposite, obviously. A course is exactly what leads to understanding of that line.


The comment I was replying to said "this is your course".

I don't think the particular approach taken by the instructors is a good one, so I replied "this is a course" (not necessarily the course to take).

A lot of the applications are weird and a bit contrived. A more traditional approach would suit many people better.


From the course description:

Category theory is a relatively new branch of mathematics that has transformed much of pure math research. The technical advance is that category theory provides a framework in which to organize formal systems and by which to translate between them, allowing one to transfer knowledge from one field to another. But this same organizational framework also has many compelling examples outside of pure math. In this course, we will give seven sketches on real-world applications of category theory.




Category theory is like putting types around mathematics. Like how dimensional analysis in physics points the way toward what operation to perform, category theory helps to show what mathematics "makes sense". Another example: I've heard that Haskell's type system is so good that you can often find the function you need just by searching for the function signature (its type). Category theory does that for mathematics.
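
For example (a typical Hoogle query, nothing to do with this course in particular): searching for the signature

    (a -> b) -> [a] -> [b]

turns up `map` (and `fmap`) near the top, if I remember right.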


I found the first few videos of this series a good investment to answer that question: https://www.youtube.com/playlist?list=PLbgaMIhjbmEnaH_LTkxLI...

It is more focused on programming, but just enough to be relatable; it’s still mainly about CT.

(Actually enjoyed the whole set, and the follow up)


They say that "mathematicians are people who like giving the same name to different things". Category theory generalises and abstracts many different concepts and shows that they are actually the same thing in category-theoretic terms.


Philip Wadler's talk, Category Theory for the Working Hacker, is a nice little introduction: https://youtu.be/gui_SE8rJUM


The idea is that it's a language for abstracting out common type relationships and calculations. I haven't yet seen a software-development context where those abstractions carry their weight, though.


It is only useful in languages with sufficiently expressive type systems, like Haskell. Otherwise you quickly hit a wall where it's impossible or impractical to express your idea in the language's type system, the same complaint that Python programmers have when they try to write Java.


I'm not seeing the payoff even in Haskell.


As far as I understand it, it's something like generics for math theory.


Are you asking whether it has beneficial features that Set Theory doesn't have (it does)? Or are you asking for examples of useful applications (of which there are a number)?

Or are you asking for examples of useful applications such that category theory was absolutely necessary?


When I saw the title I suspected that this would be David Spivak, and happily it is. I am also pleased to learn that he now has a new co-conspirator in the form of Brendan Fong. I first encountered Spivak's work via Category Theory for Scientists [0], which inspired me to try to come up with ways to use it for documenting scientific protocols. My chief complaint at the time was that someone had not already implemented many of the ideas expressed in his work. Despite the usual treatment of category theory as something mystical and opaque, I find Spivak's work brings such clarity to communicating complex ideas. Those ideas are always going to be complex, but the way we communicate about them does not have to be. From a quick skim, Seven Sketches will provide further inspiration.

0. http://math.mit.edu/~dspivak/teaching/sp13/CT4S.pdf


I’ve appreciated his explanation of DB schemas as categories.


Is there any accessible work using Category Theory that looks at API composition? I’m thinking here about data intensive systems that are often composed of different elements (e.g. Kafka, Cassandra, bespoke microservices, ...) and are glued together in an application. Could Category Theory help in getting a grip on this complexity?


http://hackage.haskell.org/package/pipes is the first that comes to mind.

And Haskell programming in general involves composition. Effects are monads, and program composition is composition of Kleisli arrows.
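
A small illustration of that gluing with the standard (>=>) from Control.Monad (the lookup functions are made up for the example):

    import Control.Monad ((>=>))
    import qualified Data.Map.Strict as M

    -- Two effectful steps, each of type  a -> Maybe b  (Kleisli arrows).
    findUserId :: String -> Maybe Int
    findUserId name = M.lookup name (M.fromList [("ada", 1), ("alan", 2)])

    findEmail :: Int -> Maybe String
    findEmail uid = M.lookup uid (M.fromList [(1, "ada@example.com")])

    -- Kleisli composition glues them like ordinary function composition.
    emailFor :: String -> Maybe String
    emailFor = findUserId >=> findEmail

    -- emailFor "ada" == Just "ada@example.com";  emailFor "alan" == Nothing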


That looks very interesting and could be source of interesting ideas. Thank you.


Watched the first lecture, loved the format: start with a question that you can logically answer, then build up the math to formalize your intuition by the end of the lecture.


Sort of related... but is David Spivak related to Michael Spivak, the author of that famous Calculus textbook?


Nice, now I can have some video lectures to accompany my self-printed copy of “Seven Sketches in Compositionality”.


When I was in college, my wish when learning math was:

- Hey, wait! You don't tell me why we need to learn this concept

Yes, most of my lecturers used math to teach math. It's not useful to me, or I can't learn that way.

A wish is just a wish though.


What if the aim is to solve problems that have never been solved before? In that case, you don't know which tools you will need beforehand. You are stuck just learning a bunch of tools and hoping you learned the correct ones.

That's how life is. Some (most) times you don't know what tools you will need in the future.


All tools were created to solve at least one problem. Explaining what that problem is, and how the tool is useful for solving it, goes a long way towards making it easy to figure out the abstract template of tool application. Going a step further and explicitly explaining why the tool works and when it doesn't is the marrow of an actual education that can be used out of context.


Agreed, I wish professors were required to take a course in Basic Pedagogy for Math, as soon as someone invents it.


Looks like a good course, but nothing makes me cringe more than seeing the Massachusetts Institute of TECHNOLOGY still using chalkboards... I mean, come on now.


What better alternatives are there?

“Smart” boards — expensive, and high drawing latency makes writing on them uncomfortable. Also, text is hard to read from a distance. (They may be better for accessibility.)

Dry-erase boards — boards and markers are expensive. Markers generate a lot of waste.

Chalk — cheap, naturally occurring (and safe to dispose of) material. Easy to read at long distances.


Chalk is really nasty on your hands when you're the lecturer. And nobody ever said "like fingernails on a whiteboard" for a reason.


I don't know about you, but I always get marker on my hands when I use dry erase markers, and chalk is a hell of a lot easier to clean off than ink.

Also, don't put your fingernails on the chalkboard...?


But ink doesn't dry and erode your skin. I hate the feel of chalk.


You're probably thinking of green boards when you say chalk boards because black slate chalk boards are a pleasure to draw on.

If you have ever used one, you'd know they keep using them because they prefer it.

They are also easier to read from a distance compared to a dry-erase board.


Having used said chalkboards there - they are amazing. The chalk is really smooth to write with, and the tactile sense is much better to draw on than a white board.

The most amazing thing though, is that person or persons unknown goes round and cleans the boards at the end of every day. Even in the holidays.


Still the best solution for teaching complicated topics that have a compact notation.



