Basic Category Theory (arxiv.org)
248 points by adamnemecek on Jan 2, 2017 | 85 comments



Hi, I'm the author. Thanks for your interest in my book!

I agree, the words "relatively little background" are too vague. What I had in mind was that the book requires little background relative to many other introductions to category theory (such as the grand-daddy of them all, Mac Lane's Categories for the Working Mathematician). But I should have been more specific. If I update the arXiv submission, I'll fix that.

As cokernel points out, the level of knowledge assumed is roughly what you'd get from an undergraduate mathematics degree at an ordinary university in Britain (and probably many other countries too). I know this because I used it several times to teach a master's course at the University of Glasgow. Probably the most famous master's-level category theory course is the one that Cambridge runs in its Part III (master's) programme, which I've also taught. But this book covers much less than the Cambridge course, and assumes less background too.

If you've taken either (i) enough algebra that you're comfortable with rings, groups and vector spaces, or (ii) any kind of topology course, then you should be able to understand enough of the examples that you can get a good grip on the general concepts. If you haven't, then it might not be the right book for you. As others have pointed out, Lawvere and Schanuel's book Conceptual Mathematics assumes much less background than mine, and there are also texts oriented towards readers with a computer science background.


For an even more basic introduction I'd recommend starting with Lawvere and Schanuel's Conceptual Mathematics: A First Introduction to Categories. Requires less abstract algebra. If you have some mathematical intuition then this is the book to start with.

I'm surprised he cites Sets for Mathematics but doesn't include it in his further reading. If you have some university level abstract algebra (groups, rings, vector spaces, topology) then Sets is a quicker start and better reference guide for Category Theory. I started with this book before I had done any abstract algebra and I quickly got out of my depth. But when I did get more algebraic structures under my belt, I had quite a few aha! moments and referred back to this book quite frequently.

I haven't read Awodey's introductory Category Theory book so I can't comment there.

Speaking of Category Theory: Bill Lawvere is an interesting and exceptional 20th century mathematician who probably has less name recognition than he deserves. Lawvere's Wikipedia page: https://en.wikipedia.org/wiki/William_Lawvere


> For an even more basic introduction I'd recommend starting with Lawvere and Schanuel's Conceptual Mathematics: A First Introduction to Categories. Requires less abstract algebra. If you have some mathematical intuition then this is the book to start with.

It's a great book! Don't forget to do the exercises. It starts off very easy but if you don't do the exercises all of a sudden you'll be lost.


This is generally good advice for all maths study, of course: "mathematics is not a spectator sport".


The best intro I've ever seen is Bartosz Milewski's video series (intended mostly for coders): https://www.youtube.com/playlist?list=PLbgaMIhjbmEnaH_LTkxLI...

It assumes almost no prior math knowledge (okay, you should know what sets and functions are...).


Looks so promising. I haven't decided yet whether I'll watch through these whopping 15 hours or study the written (massive-book-length) equivalent chapters on his blog.

Sizing this up, my real question now becomes: as an active day-to-day-life programmer, by how much exactly will this propel my work if I already know and use monads, basic function composition and Haskell?

I'm sure there's some amount of skill/quality/productivity payoff if you evaluate this kind of time investment over the whole lifetime-of-work outstanding, but seems impossible to judge beforehand. Any, er, "testimonials" in terms of how this actually changed work and coding for you or others, other than "just neat geeky theory to digest and reason about endlessly"?

It's just SO MUCH material to dive into (in terms of sources: lectures, books, etc.; I'm sure once grasped, everyone could condense it into a single page, though fully comprehensible only to themselves, until for teaching others it's once again decompressed into 15 hours of lectures or a 26-chapter book-length text), and I'm still driven by this odd old urge to "just go pump out code and make programs happen" from my earliest Pascal/Basic days, I suppose.


I loved his videos, personally. I had some familiarity with basic category theory and functional programming, but Bartosz does a really good job of explaining everything without prior math knowledge. Compared to all the introductory books people recommend, he's an excellent teacher.

On the subject of it improving your coding, I'm not so sure how much you could measure that. I think it's usually preferable to have a strong understanding of the basis of what you're working with, however, as it gives you some confidence to think critically about things you would usually just accept as fact.


TBH, I don't feel my FP has improved much after learning why monads are monoids in the category of endofunctors. I did it for the math geek in me.


I suspected as much. Still gotta dig into all this more deeply eventually, but given your anecdote it's probably fine to stretch this out over half a year, a weekend here, a train ride there, etc., rather than attacking it full-on.


Me too...it's amazing how much I learned from it about creating composable code.


This is how you serve science. Thanks to Tom Leinster. "This electronic version is not only free; it is also freely editable. For instance, if you would like to teach a course using this book but some of the examples are unsuitable for your class, you can remove them or add your own. Similarly, if there is notation that you dislike, you can easily change it; or if you want to reformat the text for reading on a particular device, that is easy too."


"for readers with relatively little mathematical background."

A well put together guide, but note that 'relatively little' here means you're OK with some abstract algebra at least, as the second example of the introduction begins: "This example involves rings, which in this book are always taken to have a multiplicative identity, called 1. Similarly, homomorphisms of rings are understood to preserve multiplicative identities." ;-)


I actually laughed out loud when I got to that part. For the first page and a half it seemed like he really tried to explain it without jargon, and then just gave up.

To be fair, though, who is going to read an introduction to category theory that isn't familiar with abstract algebra?

Someone should write 'an introduction to introductions to category theory'


> Someone should write 'an introduction to introductions to category theory'

More like the "Prereqs for an Intro to Category Theory."

The author intended his remark that the work is "for readers with relatively little mathematical background" to mean that his readers aren't assumed to have previous exposure to category theory, not that they have zero math.

He's reiterating the Basic in the title "Basic Category Theory."

Category theory is like the refactoring of a lot of math to express commonalities with a view toward recognition and reuse.

The Gang-of-Four OOP patterns book would also seem abstract for anyone sans programming experience.


Me! I know some abstract algebra, but I'm not going to understand it without some hand-holding. Remind me what a ring is and how and why it relates to categories.


"There exist only two kinds of modern mathematics books: ones which you cannot read beyond the first page and ones which you cannot read beyond the first sentence." -- Chen Ning Yang


Since Yang's name is probably not familiar to most HNers, I'll just add that he's a theoretical quantum physicist; Nobel Prize in 1957, etc. etc.


Bob Coecke writes good introductions to category theory, you might find some of his stuff enjoyable. Although he is often also introducing quantum physics and linguistics at the same time.

This paper is fascinating:

https://arxiv.org/abs/1602.07618


The "note to the reader" clarifies things a bit -- he's aiming for requiring "no more mathematical knowledge than might be acquired from an undergraduate degree at an ordinary British university". Though he does not specify whether he has in mind a mathematics degree, I think this can be deduced from the fact that he indicates that the text developed out of a master's-level course.


The text developed from approximately six lectures' worth of the MMath-level 24-lecture Part III Introduction to Category Theory at Cambridge, I believe. (Source: I took that course last year, and was part of a small reading group studying Leinster's Basic Category Theory at that time. We found that book really, really helpful.)


Why do mathematicians use phrases such as "relatively little mathematical background" or "introduction to..." when they assume previous knowledge? I find that really annoying, and a turn-off from reading most math textbooks that are supposed to be "introductions". Is there an actual good book on category theory for someone who didn't complete a math degree?

Additionally, what are applications of category theory to computer programming, or computer science in general? I find this really interesting from an outsiders view and want to learn it. I'm looking for a truly basic and gentle approach to learning category theory, without removing the rigor.


The target audience of this book isn't the layman, it's people studying mathematics. So a phrase like "relatively little mathematical background" is meant in the context of academia.

I don't think there's a basic and gentle approach to learning category theory that doesn't remove the rigor. You can learn at a high level what some of the concepts in category theory mean and get a basic feel and intuition for them, if that's what you want. But if you want to learn category theory with rigor, you're going to have to first learn how to write a proof in mathematics. Really, though, you're going to want to have a decent grasp on some basic abstract algebra. Otherwise you're never going to be able to understand the examples in category theory.


The mathematical background assumed here is the content of the first couple weeks of the first upper-level course in an undergraduate mathematics degree. It really is "relatively little mathematical background".

That said, I second the recommendation of "Algebra: Chapter 0".


[flagged]


we're posting copyrighted material to HN now?


It's for a good cause.


I imagine the authors would disagree with you


I imagine the person putting up the link would disagree with them. What's your point?


You might be interested in "Algebra: Chapter 0" http://amzn.to/2iYpAn5

"The primary distinguishing feature of the book, compared to standard textbooks in algebra, is the early introduction of categories, used as a unifying theme in the presentation of the main topics."


> categories, used as a unifying theme in the presentation of the main topics

Categories are not a good way to be introduced to the "main topics" of abstract algebra. Yes, Category Theory can be seen as the "Chapter 0" of a course on abstract algebra, but for beginners a traditional presentation based on sets would be much easier to digest.

There is a similarity between using categories and using a functional programming style or language, as opposed to using an approach to mathematics based on sets and an imperative programming style or language: the latter is just more intuitive and therefore easier for a beginner to learn.


> for beginners a traditional presentation based on sets would be much easier to digest.

If you read the referenced book, you'd know the author does exactly this.


Do you have experience teaching both or are you just speculating?


Thanks, but is this another textbook that requires knowledge of abstract algebra prior to reading it, or makes any other sort of previous-knowledge assumptions?


Having attempted to get through it myself (and having a math degree), I would strongly recommend against Algebra: Chapter 0 as an introductory text on either category theory or abstract algebra.

If you either already have a math degree, or you're already at the point where you're ready to take an upper-level undergraduate math course, then it may be a decent book for you. Maybe. For example, if you:

- are already very comfortable with linear algebra

- have possibly been exposed to a bit of topology in an analysis class

- have already been exposed to abstract algebra a bit via a more theoretical approach to linear algebra (comfortable with proving things in linear algebra via vector spaces and fields)

Otherwise, I would absolutely steer clear of it. The writing style is also not terribly beginner-friendly, in my opinion.

The closest book I can think of to what you're asking for would be "Conceptual Mathematics: A First Introduction to Categories". Though obviously I don't know whether it's a good fit for your particular background and learning style.


I think the first chapter definitely has no prerequisites (it introduces sets and categories), but it's also a core mathematics textbook intended for upper-level undergraduate and early graduate students in pure mathematics. So it re-teaches all of abstract algebra, and quite a bit beyond, using category theory as a unifying principle. It's a book intended to give you a mature perspective and prime you for research.


Not sure I see the point in trying to learn category theory without algebra to motivate it.


You might be interested in this then https://news.ycombinator.com/item?id=13268335

The book doesn't really require much knowledge.


I'll check out this book then. Has great reviews.


I think it's easily done when someone knows a topic too well to teach it to beginners from outside the field. Rings, fields and identities may just be math's equivalents of string, operator, or variable, when similarly used by introductory programming books without any explanation.


This is correct: "ring" is a concept that will be taught to all maths undergraduates by the end of the second year at latest.


Well, everything has some amount of implied/customary prerequisites. You wouldn't complain if books titled "Introduction to Metaprogramming in Ruby" or "Introduction to C++ Templates" didn't teach you programming.

In math, many "general topics" like calculus, linear algebra, probability, etc. are sometimes studied a few times (e.g. the second course in calculus might start from scratch and carefully construct the real numbers, which the first course skipped), and I've noticed some authors use "Introduction to X" to mean "this textbook is meant to be a first course in X".


It feels like "relatively little" to many authors means "a complete set of undergrad math courses from a fairly math-heavy major" (which could be something like calculus including vectors, differential equations, linear algebra, real analysis, and maybe abstract algebra and discrete math). Maybe we need another term for "OK with logic, proofs, and calculations and doesn't mind learning notation, but hasn't had a lot of university-level courses, or in any event doesn't remember them".


I can't comment on applications of category theory to computer programming or computer science, but I believe that the book Conceptual Mathematics by Lawvere and Schanuel is a nice introduction to the basic ideas of category theory.

http://www.cambridge.org/catalogue/catalogue.asp?isbn=978052...

I believe this book has been discussed "recently" on HN, but I couldn't find a thread.

Mathematicians write for multiple audiences, but there are two major mathematical audiences for their works: specialists in the same field and researchers in other fields. Phrases like "relatively little mathematical background" normally signal that a work is intended to be accessible to non-specialists, but it's often safe to assume that it is aimed at research mathematicians. I think if a work actually requires relatively little mathematical background, a mathematician is more likely to say something like "no, really, you don't need to know mathematics to understand this!" even when it's not quite true.


I genuinely think you don't need a math background to get a general understanding of category theory, because it's so high level. A lot of it is just drawings, even, not complicated formulae and proofs. It's not category theory itself that's difficult to understand, it's the examples and vocabulary.

I think you really need no more than high school algebra to understand basically what category theory is about.


Great recommendation. Thank you!


I don't understand why this is down-voted. This is probably pretty perplexing to the lay-person. When a mathematician writes "relatively little mathematical background," what that constitutes from their perspective is probably radically different from what a lay-person considers to be "relatively little mathematical background".

When I see "minimal mathematical background" or the like, I usually think: "Understands basic set theory, the concept of a function, and has familiarity with basic proof structures"

When I see "mathematical maturity", I think: "Understands real analysis, abstract algebra, and topology."

Hope that helps.


Jargon like "homomorphism", "ring", and "multiplicative identity" is just a set of scary-sounding words for simple concepts, especially in this day and age with Wikipedia.

Now, that's not to say that proving things about groups and rings is simple, but just because a word has 5 syllables doesn't mean the concept is hard. Whereas in calculus, I think, even simple-looking integrals can be quite difficult to solve.


If I may ask, why is the HN community so interested in this particular topic? I studied Math and have basic understanding of Category Theory... could someone point me to a text relating Categories and... something related to computers? What am I missing?


Yeah, I also have a mathematics degree and don't really get the fascination that computer people have with category theory. I have never been particularly impressed with their claims of applicability. It seems to just complicate things for very little benefit. It's a pretty abstraction, but I don't see actual results from it. People use ugly, practical things like git, not beautiful categorical abstractions like pijul.


> Yeah, I also have a mathematics degree and don't really get the fascination that computer people have with category theory.

This point seems to get repeated with every new piece of mathematics.

> It's a pretty abstraction, but I don't see actual results from it.

It has applications in physics and is widely used in computation, particularly for reasoning about composition of programs with side effects. With respect, if you don't see any actual results, you haven't been looking.


> With respect, if you don't see any actual results, you haven't been looking.

With respect, if you think any category-theoretic results were necessary to author even 0.01% of the code executed in computation globally, you're willfully deluding yourself.


Good thing I never made that claim. I will however claim that, despite being unnecessary in principle, quite a bit more than 0.01% of code executed globally did make use of category theoretic abstractions because they are so useful (depending on how you measure this of course). Pretty much any program written for .NET and Scala makes use of monadic composition.

I will also claim that their use is only going to grow with Rust adoption and Java adopting lambdas and functional APIs.


> Good thing I never made that claim.

You said "It has applications in physics and is widely used in computation". Category theory is not widely used by those who program computers and thus produce computation in them.

> I will however claim that, despite being unnecessary in principle, quite a bit more than 0.01% of code executed globally did make use of category theoretic abstractions because they are so useful (depending on how you measure this of course). Pretty much any program written for .NET and Scala makes use of monadic composition.

And like clockwork, you provide the bog-standard argument for why knowing category theory is important: you point out how many people productively write software without knowing anything about category theory (or even abstract algebra).

Saying that people use category theory to write software without knowing it is like saying they use Maxwell's equations to write software: so reductive it loses all relevance to productive conversation. That's bad, unless your goal is to make the conversation unproductive.


> Category theory is not widely used by those who program computers and thus produce computation in them.

Computation is bigger than just programming in industry. "Computing" has always referred to the overall discipline of computer science.

> you point out how many people productively write software without knowing anything about category theory (or even abstract algebra).

That wasn't my point at all. Read again. They could be even more productive if they used category-theoretic abstractions.

> Saying that people use category theory to write software without knowing it is like saying they use Maxwell's Laws to write software

Good thing that's not what I said either. You're making a habit of this.

People can understand a structure, in that they can grasp its semantics and its application, without knowing its canonical name.


> It seems to just complicate things for very little benefit. It's a pretty abstraction, but I don't see actual results from it.

This is a curse which keeps affecting large groups of programmers, to read about some pretty looking abstraction and then wanting to use it everywhere regardless if it is suitable or not. I'm thinking of Design Patterns, CQRS, complex microservice architectures etc.


Have you never used a functional language?


Yes. I don't think knowing about category theory makes you more proficient in using or designing functional languages either. Lambda calculus has much more obvious utility. Even things similar to monads can be described more simply without abstract nonsense: state is just another variable that you are passing along in your functions. The best books on Haskell do not dwell on category theory.

There's actually a bit of harm too: the abstract nonsense seems to make it harder to reason about execution speed and makes it very easy to write very slow code. I know a lot of people have a hard time being able to predict the speed with which, say, Haskell code will run.
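The "state is just another variable you pass along" point can be sketched directly; this is hand-rolled state threading in Haskell, which is exactly the pattern the State monad abstracts (tick and twoTicks are my own illustrative names):

```haskell
-- Explicit state passing: a counter threaded through by hand.
-- Each call returns a value together with the updated state.
tick :: Int -> (Int, Int)
tick n = (n, n + 1)

-- Sequencing two stateful steps means plumbing the state manually.
twoTicks :: Int -> ((Int, Int), Int)
twoTicks s0 =
  let (a, s1) = tick s0
      (b, s2) = tick s1
  in ((a, b), s2)

main :: IO ()
main = print (twoTicks 0)  -- ((0,1),2)
```

The State monad just hides the s0/s1/s2 plumbing; whether the categorical framing of that abstraction helps you is, as the parent says, a matter of taste.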


I assume the functional language you used was a Lisp or something not statically typed?


Category Theory is to mathematics approximately what Urbit is to computing.


Category Theory organizes mathematical concepts, and a lot of the concepts there are applicable to Computer Science.

As an example, consider the theory of containers [0]. A container is an abstract mathematical model of a kind of data structure (such as lists, trees etc.). Like most mathematical structures, containers form a category. Furthermore, each container gives rise to an endofunctor Type → Type. In fact they form a full subcategory of such functors.

For instance, the list endofunctor L : Type → Type can be seen as a generic data structure which takes a type parameter, A, and gives the type of lists of elements of A, namely L(A). The functor structure is the generic map function, which to each f : A → B gives a function map f : L(A) → L(B). These kinds of generic maps are almost always natural transformations, which tells you a lot about their properties. Knowing these things makes it easier to reason about your code.
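To make this concrete, here is a minimal Haskell sketch (safeHead is my own illustrative example of a natural transformation from the list functor to Maybe, not something from the cited paper):

```haskell
-- safeHead is a natural transformation from the list functor to Maybe.
-- Naturality says: for every f, fmap f . safeHead == safeHead . map f,
-- i.e. mapping first and then taking the head agrees with the reverse order.
safeHead :: [a] -> Maybe a
safeHead []      = Nothing
safeHead (x : _) = Just x

main :: IO ()
main = do
  let f  = (* 2) :: Int -> Int
      xs = [1, 2, 3]
  print (fmap f (safeHead xs))  -- Just 2
  print (safeHead (map f xs))   -- Just 2: the naturality square commutes
```

Knowing that map-like generic functions satisfy this law for free is exactly the kind of reasoning aid the parent comment describes.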

These are of course very simple examples. For more elaborate applications of containers, see zippers, which involve an adjunction in the form of a differentiation structure. In layman's terms, zippers are data structures with holes in context.[1]

[0]: http://www.cs.nott.ac.uk/~psztxa/publ/cont-tcs.pdf

[1]: This master thesis has a readable introduction: https://www.duo.uio.no/bitstream/handle/10852/10740/thesisgy...


I really enjoyed learning about the so-called "combinatorial species" [1]. It is another good example of what you are describing. And the idea of differentiation as putting a hole in a structure just blows my mind every time I think about it.

https://en.wikipedia.org/wiki/Combinatorial_species


The short answer is category theoretic monads. Once upon a time, Eugenio Moggi realized that monads could be used as the formal basis for assembling sequential actions. For example, when you write

    print("hello ");
    print("world")
you expect the output to be "hello world" and not "worldhello " or something weirder. Moggi realized that the semicolon there was a monadic operation that arranged for the two outputs to be printed sequentially.

That might seem like a small thing, but it isn't. In theory, so far as I understand, specifically in denotational semantics, the next best alternative is continuation passing, which adds an extra parameter and return value to the actions and is generally awkward.

The big advantage, though, is practically, in lazy programming languages like Haskell. In that case, the alternative was world passing, where the entire (metaphorical) universe is passed to the action to allow normal parameter passing conventions to sequentially associate the actions. (In contrast, monadic IO does the same thing, but hides the world in the definition of the monad.) The result is sequential code that we all know and love.

Then it was discovered that other bases than the (metaphorical) IO world made for meaningful, useful monads: the maybe monad aborts a computation on the first error, for example. And other data structures like lists...well, go look for a paper with a title like "turn your failures into a list of successes" and we were off and running.
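As a small illustration of that abort-on-first-error behaviour, a Haskell sketch (safeDiv is a made-up helper for the example):

```haskell
-- In the Maybe monad, each step's >>= (hidden under do-notation)
-- aborts the whole computation at the first Nothing.
safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

calc :: Maybe Int
calc = do
  a <- safeDiv 10 2   -- Just 5
  b <- safeDiv a 0    -- Nothing: everything after this line is skipped
  safeDiv b 3

main :: IO ()
main = print calc  -- Nothing
```

No explicit error checks appear in calc; the monad's bind threads the failure for you, which is the "semicolon as monadic operation" idea in miniature.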

[Edit]

And I'm thinking of the wrong Wadler paper. Try some of the ones from here [1], starting at the bottom.

[1] http://homepages.inf.ed.ac.uk/wadler/topics/monads.html


CT concepts and relations between them are used to design (or have influenced) libraries/APIs and even some language features in several programming language communities, especially those using a 'functional' paradigm. Take a look at Haskell, Idris, Scala (with the scalaz library, or whichever library has taken its place today), ...


The big application I'm aware of is structured recursion schemes: http://maartenfokkinga.github.io/utwente/mmf91m.pdf ( implemented in Haskell by the recursion-schemes library ). Used carefully they allow a very lightweight/concise way of expressing transformations of tree-like structures, which is the essence of compilers (and of a lot of other programming, particularly when you start working in DSLs and regarding everything as a compiler).


Erik Meijer gave a talk called "Category Theory, The essence of interface-based design" https://m.youtube.com/watch?v=JMP6gI5mLHc

Fundamentally, category theory is seen as some sort of holy grail of programming. The hope is that learning about only one API will let you interact naturally and composably with any interface conforming to that design.



This book also has the [backing of Peter Smith][1], who is basically the king of logic-related reading materials (logicmatters.net). I was part of a student reading group studying BCT last year with him, and I found it extremely helpful in the context of people studying for the MMath. One of the most important parts of basic category theory is seeing how adjunctions, limits and universal properties play with each other; this book is about exactly that, and goes into it in a depth I haven't seen elsewhere.

[1]: http://www.logicmatters.net/2017/01/02/tom-leinsters-basic-c...


If you genuinely don't know much maths, maybe try Eugenia Cheng's "How to Bake Pi", also published as "Cakes, Custard and Category Theory".

I enjoyed it; it's not a textbook and there are no exercises. It's a popular overview of the subject, including a first half that mentions category theory only in passing.

It was brave to even attempt such a book, and I thought she pulled it off rather well. I hope we hear more from her, and that other mathematicians are similarly inspired.


This text definitely assumes more knowledge than basic abstract algebra: some key facts from set theory are also assumed (e.g. in a proof it is taken as known that if a composition of two maps is the identity map on one set, and the reverse composition is the identity on the other set, then the maps are bijections), and it assumes from abstract algebra that the composition of two homomorphisms is a homomorphism.

I have only looked in the first chapter, so I cannot speak further on that atm. That said, I appreciate that a text talks about the universal property - I did not encounter a definition of it in a text when in grad school. I only encountered it because lecturers at my grad program made sure to talk about it.
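For reference, a sketch of the set-theory fact in question (this is the standard argument, not something specific to this book):

```latex
% Given f : A -> B and g : B -> A with g . f = id_A and f . g = id_B:
%
% f is injective: if f(x) = f(y), apply g to both sides, so
%   x = g(f(x)) = g(f(y)) = y.
% f is surjective: for any b in B, put a = g(b); then
%   f(a) = f(g(b)) = b.
\[
g \circ f = \mathrm{id}_A \implies f \text{ injective},
\qquad
f \circ g = \mathrm{id}_B \implies f \text{ surjective},
\]
% hence f is a bijection, with inverse g.
```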


> In a proof, it is assumed to be known that if a compositiom of two maps is the identity map on one set, and the reverse composition is the identity on the other set,

This is something math and computer science students typically learn in the first two weeks of their mandatory math lectures, at least at German universities.


In the US, I don't think I saw it until a course in set theory, although it could get rolled up in a topology course.


I mean: it really makes sense to put it at the beginning, since otherwise it's hard to understand why a diffeomorphism is defined as it is:

You surely know that an isomorphism of sets (bijective function) has an inverse that is also an isomorphism of sets.

For differentiable functions a similar statement does not hold in general (just consider [-1,1] -> [-1,1]; x \mapsto x^3; its inverse is not differentiable everywhere on [-1,1], so the inverse exists as an isomorphism of sets, but not as a differentiable function). Since diffeomorphisms of R^d are introduced in the 2nd semester for math students (typically in the context of the inverse function theorem), one had better have already understood the basics before.
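Spelling out that example: the set-theoretic inverse exists, but its derivative fails to exist at the origin.

```latex
\[
f(x) = x^3, \qquad
f^{-1}(y) = \operatorname{sgn}(y)\,|y|^{1/3}, \qquad
\bigl(f^{-1}\bigr)'(y) = \tfrac{1}{3}\,|y|^{-2/3} \quad (y \neq 0),
\]
% The derivative of the inverse blows up as y -> 0, so f is a
% differentiable bijection of [-1,1] but not a diffeomorphism.
```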


I'm in the middle of writing https://arbital.com/p/universal_property_outline/ which is precisely an Intro to the Universal Property - comments, feedback and assistance welcome, if you care to help :)


Mathematical Physics by Robert Geroch was recommended on the haskell subreddit a while ago: https://www.amazon.com/Mathematical-Physics-Chicago-Lectures...


Tom Leinster wrote some things about open sourcing the book here:

https://golem.ph.utexas.edu/category/2017/01/basic_category_...


The "Category Theory in Context" book by Emily Riehl is also freely available at https://golem.ph.utexas.edu/category/2016/11/category_theory... (I haven't read it yet)...


Yet another excellent resource that I found helpful (and fun) is a series of short lectures by The Catsters: https://www.youtube.com/user/TheCatsters.


I enjoy "Make Category Theory Intuitive!" (2007), by Jocelyn Ireson-Paine

http://www.j-paine.org/make_category_theory_intuitive.html


Like so much of category theory writing, it lacks examples. Seriously, point me to examples where category theory is actually useful outside of specific parts of mathematics, and I'd be very happy. By useful, I mean that it allows you to prove or understand something that would otherwise not have been proved (or is much more difficult to prove).


The point isn't so much proving new results, as showing how different results (that you thought looked similar but were distinct from each other) are actually exactly the same result in different settings. One of the things category theory does is tell you whether a result is "deeply meaningful" or not.

For example, I still don't know how the Segre embedding of projective varieties is constructed (my algebraic geometry lecturer laboured hard to try and impart that knowledge, with the result that I thought it was difficult and abstruse); but I now know that it's a product in an appropriate category, so while the construction may be really nontrivial, the object itself is just the same old product which I already know and love. I now know that the difficulty lay in showing that this object exists and has the required properties, rather than the object itself being in some deep moral sense "hard to understand".


You might enjoy looking at Category Theory for the Sciences by David Spivak. There's a hardcover edition, but it's available online as well.

[html version] http://category-theory.mitpress.mit.edu

[old pdf version] http://math.mit.edu/~dspivak/CT4S.pdf


After reading that, will I be able to write a hello world in Haskell?


No. It's not a Haskell book.


I lost interest in math after calculus, when it stopped being about results and became more about abstract and complicated squiggles on the page. Elliptic functions are interesting because they seem calculus-like (such as elliptic integrals and the theta series), but this weird set/category theory stuff just doesn't do it for me.


"Stopped being about results"? I think you're referring to the building-up of the vocabulary required to attain results. As an analogy, you learnt to count (1,2,3,4,…), but then when addition and multiplication were introduced, it "stopped being about numbers, and more about abstract and complicated operations on numbers". Yes, fine, but addition and multiplication open the gateway to the study of the primes, from which most of the results of number theory follow.


I'd say addition and multiplication are more important than prime numbers, or number theory, for that matter.



