
Initially dismissed, representation theory is now central to much of mathematics - dnetesn
http://abstractions.nautil.us/article/571/the-useless-perspective-that-transformed-mathematics
======
Koshkin
Also discussed at
[https://news.ycombinator.com/item?id=23473919](https://news.ycombinator.com/item?id=23473919)

------
btilly
I have no idea how nautil manages to so consistently talk about mathematical
concepts in a way that is so hard to make heads or tails of.

Here is a simpler explanation.

A group is a collection of elements with a multiplication such that
multiplication is associative, there is an identity, and every element has an
inverse. Multiplication is generally NOT commutative.

The classic example is the set of all permutations of a set, using function
composition as multiplication. "Multiplication" is associative because (f o (g
o h))(x) and ((f o g) o h)(x) both work out to f(g(h(x))) for any x. The
identity permutation is "leave everything where it was". The inverse of a
permutation is simply "map everything in reverse".

They spend time talking about S3. That is just the permutations of 3 objects.
(Which they visualized as the corners of a triangle.)
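A minimal sketch of the above in Python, representing each permutation of {0, 1, 2} as a tuple p where p[i] is where element i gets sent (the names `compose`, `inverse`, etc. are just illustrative choices, not anything from the article):

```python
# Permutations of {0, 1, 2} as tuples: p[i] is the image of element i.
# Function composition plays the role of multiplication.

def compose(f, g):
    """(f o g)(x) = f(g(x)) -- apply g first, then f."""
    return tuple(f[g[i]] for i in range(len(g)))

def inverse(f):
    """Map everything in reverse."""
    inv = [0] * len(f)
    for i, j in enumerate(f):
        inv[j] = i
    return tuple(inv)

identity = (0, 1, 2)   # leave everything where it was
swap01   = (1, 0, 2)   # swap the first two elements
cycle    = (1, 2, 0)   # rotate 0 -> 1 -> 2 -> 0

# Associativity: f o (g o h) == (f o g) o h
assert compose(swap01, compose(cycle, cycle)) == \
       compose(compose(swap01, cycle), cycle)

# Every element has an inverse
assert compose(cycle, inverse(cycle)) == identity

# Multiplication is generally NOT commutative
assert compose(swap01, cycle) != compose(cycle, swap01)
```

These six tuples are exactly S3, the permutations of 3 objects discussed in the article.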

Another classic example is the set of invertible nxn matrices, using matrix
multiplication as the multiplication.

A representation is just a function from a group to some set of invertible
matrices such that F(x * y) = F(x) * F(y). Note that a representation does NOT
have to be one-to-one - you can always just map everything to the identity
matrix.
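Continuing the sketch above: one concrete representation of S3 sends each permutation to its "permutation matrix" (this particular construction is a standard one, though the article doesn't spell it out; `F`, `matmul`, and `compose` are illustrative names):

```python
# The permutation-matrix representation of S3: each permutation p maps to
# the 3x3 matrix F(p) whose j-th column has a 1 in row p[j], so F(p) sends
# the j-th basis vector to the p[j]-th basis vector.

def F(p):
    n = len(p)
    return [[1 if p[j] == i else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def compose(f, g):
    return tuple(f[g[i]] for i in range(len(g)))

swap01 = (1, 0, 2)
cycle  = (1, 2, 0)

# The defining property of a representation: F(x * y) = F(x) * F(y)
assert F(compose(swap01, cycle)) == matmul(F(swap01), F(cycle))
```

The trivial representation (everything to the 1x1 identity matrix) also satisfies this property, which is why a representation need not be one-to-one.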

OK, so representations exist. Why would we care? The answer is that each
representation of a group shows something about the structure of that group.
And says it in a language that we have a lot of tools to work with.
Admittedly, the identity representation doesn't say much useful. But the
others do. And representations have provided ways to reduce a lot of hard
problems to easier ones that we have a better chance to solve.

There are a lot of examples, but the most widely known example of a hard
problem that used this idea as part of the solution was Andrew Wiles' solution
of Fermat's Last Theorem.

~~~
wutbrodo
> I have no idea how nautil manages to so consistently talk about mathematical
> concepts in a way that is so hard to make heads or tails of

Because they're writing for laymen. I have a degree in math and work in a
subfield of CS that relies on math a lot, so I have a lot of practice
explaining concepts to some of the smartest people I know without a math
background. The tradeoff for your concise expression is that it uses jargon
that adds cognitive overhead for people unfamiliar with college-level or
higher math, even if they've got a high quantitative intelligence. A non-
exhaustive list that would trip up, say, my friend with a PhD in pathology:

1) associative and commutative are not commonly-known terms

2) "A group is a collection of elements 'with' multiplication" is
underdetermined. What does it mean to be "with"? The notion that
multiplication isn't commutative is unintuitive without understanding that the
definition of multiplication is flexible here, defined based on the type of
the element in the group.

3) Using o as a composition operator is something that a layman is guaranteed
never to have come across

4) What does invertible mean? It's not trivially obvious to a layman that some
matrices A have no matrix B such that AB = I.

Etc etc etc. Note that each of these gaps and confusing points (for laymen)
compound; I'll occasionally read a paper in a field I'm not an expert in (like
economics), and being bogged down every other sentence by confusion is a
severe impediment to understanding how these pieces are composed into
something novel. Hell, even when I understand each piece, sometimes the most
foreign concepts haven't marinated long enough for me to use them as building
blocks yet.

What's your complaint here? "I have a math degree, how dare nautilus target
anyone but me with their articles?"

~~~
tsimionescu
This may be very dependent on your education system, but commutativity and
associativity are terms I was taught around fifth grade, while function
composition, groups and matrix inverses were high school maths.

I've lost contact with many of these after finishing university, but they
are still pretty common terms around me.

~~~
btilly
Where and when I went to school, commutativity and associativity were taught
in elementary school, and again in high school because we had all forgotten
it. Matrix math was something I learned in university.

However that was a long time ago and standards have changed.
[http://www.corestandards.org/Math/Content/HSN/VM/C/9/](http://www.corestandards.org/Math/Content/HSN/VM/C/9/)
indicates that matrix math is now considered high school material across the
USA.

~~~
tsimionescu
So those should still be terms that are familiar to most people in programming
today, not some arcane post-grad math stuff like category theory.

~~~
btilly
In theory, yes.

But you know about theory vs practice.

------
cwzwarich
I'm not so sure about the historical accuracy of the claim that representation
theory was initially dismissed. They quote Burnside being skeptical of
representation theory in 1897 in the introduction of his book Theory of Groups
of Finite Order, but soon after this he reformulated Frobenius' work on
characters in a group/representation-theoretic framework. Burnside used
representation-theoretic methods to prove his eponymous theorem in 1904, and
representation theory appears in the second, 1911 edition of this book.

A lot of people don't realize that the concepts of abstract groups, vector
spaces, linear transformations, etc. were not particularly obvious to people,
and a lot of results were developed in specific contexts that were later
realized to be more general. After the development of abstract algebra, these
subjects morphed into something much closer to the standard algebra curriculum
we see today.

------
ad404b8a372f2b9
One thing I've been having trouble with as I'm learning math is coalescing
new concepts into a broader context. When learning things I usually assume
there's a tree of knowledge that I can travel back up, but with math it seems
like a tangled graph of concepts which can be grouped into many different
permutations to form different theories. Also, it's not entirely clear at what
level of abstraction one should stop digging. I'm beyond lost.

~~~
zomglings
Just want to offer some help. If you are interested in math, don't get
disheartened about how tangled things seem. Even if you can't find any
external references to a broader context, you will find that, given enough
time and effort, a broader context emerges in your mind.

Dig until you get tired or bored - because you could really keep digging
forever.

If you ever feel lost, read the masters - Euler and Dirichlet are two
favorites. Here is a great book: [https://www.amazon.com/Euler-Master-
Dolciani-Mathematical-Ex...](https://www.amazon.com/Euler-Master-Dolciani-
Mathematical-Expositions/dp/0883853280)

If you get tired of that, read _about_ the greats. MacTutor has a great
archive. I especially like their biographies: [https://mathshistory.st-
andrews.ac.uk/](https://mathshistory.st-andrews.ac.uk/)

The most important thing (only my opinion) is to not lose the sense of wonder
at the beauty of the universe as seen mathematically.

I don't know what your particular situation is - are you in university or
school and studying math or learning math as you juggle a career? Any more
specific advice depends on your circumstances and your objectives, but I
really hope I was able to help somewhat.

~~~
zomglings
Re: masters. Also Shelah:
[https://arxiv.org/abs/math/0211398](https://arxiv.org/abs/math/0211398)

