
The Tyranny of Trendy Ideas in Academia - jseliger
https://www.chronicle.com/interactives/20190701-trendyideas?key=mi0Bff1vaLHL09_no2Emg7sn4dWNuRhvVkoo39I05x-Q5CJZ4DU8VWHMRpHsHYr-NUpHdkhCN0pBenY4TG5GU04tSXRqbE92aENwQWVCdXJzWXZrVG42b1pvZw
======
hirundo
> As every freshman course in “critical thinking” reminds us, the dull,
> unhappy burden of the rational mind is to follow the evidence where it
> leads, not the bandwagon.

If you do that you frequently end up with a position that is "extremist" to
someone or other. This effect is so reliable that if you meet someone who
isn't extremist about anything, it's safe to say that they don't think for
themselves, rationally or not.

As a corollary if you agree with all of the mainstream positions of your
(aspirational or actual) social group, whether you realize it or not, you've
probably arrived there via social pressure rather than an open, honest,
rational method.

So I don't take an accusation of extremism as an insult, but more as a
prerequisite for someone capable of independent thought.

That doesn't mean such a person won't be wrong as much as anyone else, but
they're not necessarily more wrong than conventional wisdom, and they tend to
be right or wrong in interesting ways. Which is valuable.

~~~
13415
This is mostly wrong, because there is only one reality. As trivial as this
may sound, if you're interested in reality, then you will invariably get to
the same conclusion as many other people. Of course, there will always be some
shifts in knowledge and at that time a fringe position can be the right one,
but these are rare - very rare, indeed. Generally, the fact that there is one
reality means that scientific opinions will converge and they do so fairly
quickly. The majority of the work in science is adding a little bit here and
there and refining theories, not coming up with radically new theories that
contradict existing opinions.

I'd be wary about areas in which this is not the case. There are such areas,
particularly in the humanities, but these are the ones with major
methodological problems.

~~~
Der_Einzige
There are entire books that contradict every point you've brought up.

[https://en.wikipedia.org/wiki/The_Structure_of_Scientific_Re...](https://en.wikipedia.org/wiki/The_Structure_of_Scientific_Revolutions)

[https://en.wikipedia.org/wiki/Against_Method](https://en.wikipedia.org/wiki/Against_Method)

[https://en.wikipedia.org/wiki/Proofs_and_Refutations](https://en.wikipedia.org/wiki/Proofs_and_Refutations)

[https://en.wikipedia.org/wiki/Imre_Lakatos](https://en.wikipedia.org/wiki/Imre_Lakatos)

[https://en.wikipedia.org/wiki/The_Ego_and_Its_Own](https://en.wikipedia.org/wiki/The_Ego_and_Its_Own)
(his stuff on epistemology not the rest)

[https://en.wikipedia.org/wiki/Demarcation_problem](https://en.wikipedia.org/wiki/Demarcation_problem)

[https://en.wikipedia.org/wiki/Is%E2%80%93ought_problem](https://en.wikipedia.org/wiki/Is%E2%80%93ought_problem)

~~~
13415
I know most of these books and issues; I'm a philosopher myself. Most of them
do not contradict anything I've said, so you'd need to be a bit more specific.
Feyerabend and Kuhn won't win me over, though.

On a side note, what you're doing is an appeal to authority. Not every
successful author in philosophy is really an authority; philosophy doesn't
work that way.

------
autokad
A company was holding a competition for 'a great new idea to solve heart
disease' with a huge million-dollar prize. Our research group of 30+ people,
consisting mostly of faculty, wanted to win it, so we broke our group into 6
or so sub-groups to come up with ideas and submit them to the greater group.

My group wanted to submit 'tax on soda' as its idea. I was adamantly against
it. Ignoring the fact that I don't think it would work, it's not a 'NEW'
idea, and the company holding the competition was doing this for PR, so it
didn't seem like an idea they'd choose as a winner. However, when the groups
got back together, all of the other groups had submitted the same idea: tax
soda.

I was floored. There were over 300 years (3 centuries) of higher education
among the groups, and they were completely incapable of original thought.

~~~
btrettel
To comment on a tangential issue...

> there was over 300 years (3 centuries) of higher education among the groups

I don't think simply adding up the total number of years of education is
meaningful. Most of that education is redundant. If it were humanly possible
I'd be more impressed by 300 years of education for a single person as it
wouldn't have the same amount of redundancy.

You'll see this cumulative strategy appear in other areas as well, e.g., "our
lawyers have 200 years of experience!"

~~~
LanceH
As much as I would like to hate on lawyers, 200 years of trial or contract
experience is meaningful. They have seen countless variations on whatever
theme they practice, they know the jurisdiction, they know the judges, the
opposing counsel, etc...

~~~
btrettel
Fair points. That wasn't a comment on lawyers, just an (exaggerated) example
I've seen before.

------
ex3xu
As appealing as the title of the article is, I basically came away from this
piece feeling like I had just read a snarkier, less useful version of Paul
Graham's essay on moral fashion [0] back from 2004.

The problem with following fashion, as far as I understand, is that you accept
the tacit assumption that the value of an idea is correlated with its
popularity, and I'm all in favor of smiting the very dangerous contrapositive
to that assumption: that unpopular ideas therefore must also lack value. I
support this author as far as he is attempting to raise awareness of this trap
-- lest someone else have to go through the same fate as Ignaz Semmelweis,
Guglielmo Marconi, or James Allison.

But I felt like this article did little to present valid mechanisms to
mitigate the risks of fashion, and instead spent most of the text attacking
fashionable ideas simply for being fashionable, operating under the assumption
that fashion itself necessarily lacks substance and causes problems on its
own. I think this sort of hipster approach is as big of a risk as blindly
following fashion -- both premises assume that popularity is necessary and
sufficient to determine the value, or lack of value, of an idea.

But maybe I'm just a blind fashion follower myself, since I felt so personally
attacked when the author criticized MOOCs as just another fashionable fad --
despite the valid criticisms he linked to, I felt like I got a lot of value
out of them, after all.

[0]: [http://www.paulgraham.com/say.html](http://www.paulgraham.com/say.html)

~~~
paultopia
Oh good heavens, Graham's essay is wrongheaded. He encourages one to look for
"taboo" ideas, and declares that one should train oneself to consider them.
But he describes "taboo" ideas as an explicit result of a power struggle with
no relationship to truth:

The most conspiracy-theory-esque passages:

1.

_The statements that make people mad are the ones they worry might be
believed. I suspect the statements that make people maddest are those they
worry might be true._

2.

_So if you want to figure out what we can't say, look at the machinery of
fashion and try to predict what it would make unsayable. What groups are
powerful but nervous, and what ideas would they like to suppress? What ideas
were tarnished by association when they ended up on the losing side of a
recent struggle? If a self-consciously cool person wanted to differentiate
himself from preceding fashions (e.g. from his parents), which of their ideas
would he tend to reject? What are conventional-minded people afraid of
saying?_

Here's the problem with this: it totally ignores the possibility of social
learning (well, to be fair, he doesn't completely ignore that possibility, he
just dismisses it in a couple of sentences), and with it, social learning
about how some bad ideas can be particularly dangerous.

Here's an example. Anti-Semitism. Clearly a "taboo" idea in the sense that
people get punished for uttering it; moreover, part of the reason that people
get punished for uttering it is because those of us doing the punishing
(myself included) are worried that people might start believing it.

But we're not worried that people might start believing it because we're
worried it's true! I'm pretty sure I speak for essentially all other enforcers
of that particular taboo when I say that we know that the claims that anti-
Semites make are quite false indeed. (I'm assuming here that we're talking
about the parts of anti-Semitic thought that make claims to be the sort of
thing that can be true or false, like financial conspiracy theories.) Rather,
we're worried that people might start believing it because we know history! We
know that anti-Semitism is a belief system that is very easily exploited by
evil people, who use it to trick people who are casting about for someone to
blame for their misfortunes into finding an enemy and then giving the evil
people political power in order to attack them. Because we know that anti-
Semitism has that kind of mind-virus property (more like a mind-biological-
weapon), we come down hard on people who utter anti-Semitic things. It has
nothing to do with secretly worrying that they might be true.

The same is the case for lots of other taboo beliefs. (I can think of several
that I don't want to mention, because I know that doing so will derail the
subsequent conversation with deluded people saying "well, that COULD be true."
But, again, historically, those beliefs have contributed to massively evil
behaviors and I don't want to contribute to their propagation even
indirectly.)

~~~
wisty
> part of the reason that people get punished for uttering it is because those
> of us doing the punishing (myself included) are worried that people might
> start believing it

I think this is the irrational part people are detecting. Yes a few thousand
idiots on the internet believe some anti-Semitic conspiracy theory, and a few
more joke about it, but it's simply not something anyone remotely normal
accepts. People also say that gingers don't have souls, but even that's not
something that would be healthy to worry about.

~~~
Apocryphon
Those numbers are far underestimated, and grow exponentially when you consider
all of the other types of scapegoating conspiracy theories that attack other
groups of people on the internet.

------
Fede_V
I am as anti-pomo/critical-theory as it gets, but, sadly, this phenomenon is
very much present in the hard sciences as well.

For example, about 10-15 years ago, the big craze in biology was microarrays,
and the early papers made crazy promises about everything that they would
deliver. With hindsight, it turned out almost all the early papers were
fatally flawed and were drastically statistically underpowered: it turns out
that living systems have a huge amount of noise, making clean measurements is
difficult, and things are very complicated. The same exact thing played out
with microRNAs, epigenetics (ChIP-seq data is exceedingly noisy), metagenomics,
single cell genomics, etc.

Every single time you'd get 'trailblazers' who'd come into the field, do
shitty rushed science, hype their results like crazy, and by the time people
figured out that all the initial results were garbage, they'd have moved onto
the new hot thing.

I guess the advantage that hard sciences have over the humanities is that
eventually we'd figure it out and do things properly - but the amount of money
that has been wasted in shitty research that didn't add anything to human
knowledge is probably in the hundreds of billions in the last 15 years alone.

~~~
dannykwells
Hmmm, I'm not so sure about this. With things like microarrays and other tech,
sure there were promises that were too big. But the underlying _idea_ (gene
expression matters, there are lots of genes, we should measure them all) is
correct.

To wit, RNA-seq has transformed cancer biology. And Nanostring is now offering
a microarray-like technology with extremely impressive signal to noise. And we
still use lots of tools initially developed for microarrays (Limma, for
example). So I would argue the investment in microarray technology has
absolutely paid off.

I think you're throwing the baby out with the bathwater to say that, because
a few bad eggs are effectively salesmen and not scientists, we wasted
"hundreds of billions" of dollars (i.e., a large fraction of the NIH budget).
The fact is, most research doesn't hold up not because the researchers weren't
honest, or because they were salesmen, but fundamentally, because science is
hard and it's _really_ hard to be right for the right reason in science.

Alternatively, if you have a formula to determine, in real time, which results
are hype and won't stand up vs. which are real and will, well, that would be
something too.

~~~
dfeliej
In my experience, there's a disproportionate reward associated with fashions
in science. That is, the people at the forefront of a trend get rewarded
professionally (tenure, positions, promotion), even when behaving in a way
that is irresponsible theoretically or empirically. The skeptics, on the other
hand, who carefully lay out the problems with the new research, and eventually
point out the obvious flaws, get treated as sticks-in-the-mud, etc., _even
though they are eventually proven correct_.

There are gains to be made through innovation, and I'm not arguing against
that. Sometimes too, details are just minor details to work out. But sometimes
the details matter and point to bigger problems.

The problems with fads in science are that the focus in academia sometimes
seems to be on popularity per se rather than veridicality, and that there is
denialism about scientific fads as a phenomenon. This is pointed out in the
article, and is one of its better points.

Someone else criticized the article for not presenting solutions, which is
fair, but I think that's in part because effective solutions would probably
require a huge shift in academic culture. We'd need to return to more
stable professional positions rather than the cannibalistic postdoc system in
place now, with declining tenure; we'd need to focus more on groups of
researchers and findings rather than celebrity academics à la TED talks; we'd
need to put much more emphasis on a pattern of replications rather than single
studies; etc. etc. Even then, I don't think the problem would entirely go
away.

------
bitwize
The "furious exchange of proper nouns" made me lul.

"Derrida and Badiou at Tanagra!"

It's like this because, like the Tamarians, everyone in the discussion is
assumed to be familiar with all of the background material -- and if you are
not, then what the hell are you doing here? I saw this kind of elitism unfold
here on Hacker News once. Someone asked for an explanation of what
deconstruction was and was told, basically, "Have you read your Husserl,
Heidegger, Hegel, Nietzsche, Freud, and Levinas yet? No? Then shaddap, you are
obviously not prepared to even begin to understand deconstruction." Can you
imagine Feynman doing that? Like, "Have you read your Newton, Leibniz,
Maxwell, Einstein, Bohr, and Schrödinger yet? No? Then shaddap..."

Philosophy is like AI: as soon as we know it works and put it into practice,
it ceases to be philosophy. So people who majored in philosophy are put in the
uncomfortable position of having to justify the importance -- to others and
perhaps themselves -- of what they study, and it's much easier to _seem_
important by forming exclusive intellectual cliques and inventing ways to
consider people outside your clique to be stupid and wrong-headed than it is
to dig deep and find out why and how your work is important in the real world
-- or, worse yet, face the music and admit your work is not important at all
and you're better off doing something else.

~~~
Zarath
I wrote a long response to this and deleted it, so I'll just say this:
politics and philosophy are among the few subjects where everyone feels
qualified, no matter how pedestrian or unqualified they are. So you'll have to
forgive people for not bothering to listen to someone who may never have read
a book on politics or philosophy in their entire life, yet feels like their
regurgitated Fox News talking points should be taken seriously in a
discussion. You wouldn't listen to someone who hasn't passed physics 1 about
whether the bridge you are building is stable. This isn't elitism.

~~~
nkurz
> You wouldn't listen to someone who hasn't passed physics 1 about whether
> the bridge you are building is stable.

It depends what they were saying about it. If they were claiming that it's
obviously safe because it's made of steel, because steel is really strong,
then no --- not even if they had a PhD in Physics and knew the exact tensile
strength of the materials. But if they were mentioning that in their personal
observation this particular bridge sure sways a lot when it's windy, I'd
probably listen to them despite their lack of formal education:
[https://www.youtube.com/watch?v=qbOjxPCfaFk](https://www.youtube.com/watch?v=qbOjxPCfaFk).

~~~
Zarath
You're correct, but in the same way you'd trust the person to tell you about
the symptoms of the bridge swaying, you can trust a person to tell you about
the symptoms of everyday life. E.g. it's hard to find a job, minimum wage
isn't enough to live on, I feel isolated, etc. But leave the higher level
analysis to those more qualified, at least IMO.

------
username90
The social sciences have yet to find their scientific method. Natural science
was mostly bullshit before the scientific method, and it set them straight. So
in my opinion, listening to social scientists today is kinda like reading
Aristotle to learn about physics: a lot of thoughts for sure, and it sounds
smart, but it totally lacks any connection to reality.

~~~
pmyteh
My part of political science has been (attempting to) follow the scientific
method since the 1940s. Its results have been... underwhelming. The main
reason for this, bluntly, is that people are _really damned complicated_.
They're difficult to model, they behave irrationally, and even a minimal model
has way too many variables. Hell, they don't even vote using the same
heuristics that they did 50 years ago. There's a reason why abstract social
science (like model-based economics) can look like it has little connection to
reality, while the less abstract you get, the woolier and less scientifically
satisfying it generally becomes. It's because getting crisp, reproducible,
cumulative results is extraordinarily hard.

Faced with that, you have two choices: keep trying to get better at what we're
doing, or decide that these questions aren't worth answering (or that it isn't
possible to). I think that better understanding the behaviour of people
collectively organising themselves is vital, so I pursue the first. But I have
sympathy with those that would prefer to work on something else.

~~~
username90
Right, the scientific method is mostly useless for social science, which is
why it is in such a dire state. My point is that we need something of equal
significance for social science before it will start making noticeable
progress; until then it will be on the same level as Aristotelian physics.

~~~
fujvcijccfbj
Social sciences have their own methods, which are suited for the unique kinds
of data and questions they deal with. Read John Law’s After Method: Mess in
Social Science Research for a solid overview of why you can’t just map
natural science methods onto the social sciences, and which might also clue
you into why your assumptions regarding the value of social science research
are wrong.

------
btrettel
From the article:

> Mayr’s observation draws attention to the fact that fashion is never merely
> additive; it demands we abandon certain things, and those abandonments can
> be, at the very least, premature.

I've observed these premature abandonments in my own area (a particular
subfield of fluid dynamics), but I think its cause is more complicated than
"academics follow fads". I think fads are part of it, but the grant funding
cycle is another part. Perhaps grants are largely driven by fads. I wouldn't
really know, but I can recall calls for proposals on fad X as applied to Y. I
think part of it is that people rarely do truly comprehensive literature
reviews, so they don't have a good idea of what has and hasn't been done, and
they might assume something isn't as novel as it actually is. Or
their literature review might make them think that X is solved, but a little-
known paper showed that X is actually more complicated than is commonly
believed and not solved.

This dynamic means there's a lot of "low-hanging fruit" available that won't
look like low-hanging fruit to most people. I'm happy to grab such things
myself. A few years ago I made a large data compilation for a problem I'm
interested in (~1200 data points from ~50 studies) and have written two papers
from that already, with several more planned. Many people seem to implicitly
assume this work has already been done because it seems "obvious", but I think
large data compilations of this variety are rarely done. Many things
that are commonly believed are actually easily shown to be false with the
right data. I'm planning to make a career of this. It's not sexy right now but
that could change with time.

Edit: Data compilation is a commonly missed "last step" in a review. Most
reviews I've read seem to collect the conclusions of analyses of small amounts
of data. It's better to instead collect all the data, filter out the bad data,
and draw conclusions from the compilation instead. A collection of the
conclusions of small studies will have bad data, biases, and/or not generalize
as well.
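A minimal sketch of what this "compile the data, not the conclusions" approach might look like. The study names and numbers below are entirely invented for illustration; the point is the structure: pool the raw points from every study, filter out the flagged bad data, and fit on the compilation rather than averaging per-study conclusions.

```python
# Hypothetical example: each study reports (x, y) measurements plus a flag
# marking points its authors considered unreliable.
studies = {
    "smith1990": [(1.0, 2.1, False), (2.0, 3.9, False), (9.0, 1.0, True)],
    "jones2005": [(3.0, 6.2, False), (4.0, 7.8, False)],
    "lee2012":   [(5.0, 10.1, False), (6.0, 12.3, False), (0.5, 9.9, True)],
}

# Step 1: compile everything into one flat dataset, tagged by source,
# and (step 2) drop the flagged bad points along the way.
compiled = [
    (src, x, y)
    for src, points in studies.items()
    for (x, y, bad) in points
    if not bad
]

# Step 3: draw the conclusion from the compilation -- here a single
# least-squares slope over all pooled points, instead of averaging the
# slopes each small study would report on its own.
xs = [x for _, x, _ in compiled]
ys = [y for _, _, y in compiled]
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    / sum((x - mean_x) ** 2 for x in xs)
)

print(f"{n} points pooled from {len(studies)} studies, slope = {slope:.2f}")
```

Each study here contributes only two or three points, too few to support a conclusion on its own; the pooled fit uses all surviving points at once, which is the whole argument for compiling first and concluding second.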

------
nn3
It happens in academia, it happens in programming.

Why exactly are you using that trendy JavaScript framework again? Maybe it's
a little better (or not), but at least it's a lot cooler than yesterday's and
looks better on a resume.

I guess it's just how human societies work.

~~~
bitwize
Let me introduce you to a little something called Industry Best Practice,
which is pretty much just fashion as corporate policy.

You are using the latest JavaScript framework because your boss made you use
it. Your boss made you use it because it's popular -- had he chosen an
unpopular framework with a less active community and a smaller pool of
developers to hire from, _his_ ass would be grass because he would be
introducing unnecessary risk into the software project. If your boss decides
to be a maverick, exercise his own judgement in development tool choice, and
pick Haskell or Erlang or something, he had better offer a perfect guarantee
that the project will not fail or overrun its budget because if it does, the
language/tool/framework will be blamed -- because it differs from Industry
Best Practice.

~~~
Apocryphon
It's cargo cults all the way up.

------
dredmorbius
My view of fads is that they are endemic to domains in which the underlying
structures are complex and deep, but in which one must be able to rapidly
assess tribal affiliation (IFF[1]): evident, costly, and at the least not
trivially donned-or-doffed signifiers are required.

I've noted the similarities previously across a number of domains: clothing,
music, food and diet, fitness, art. To a certain extent architecture.
Automotive and other equipment or product design. Fandoms around books, or
films, or comics. Theories of political economy. And management fads. See:

[https://old.reddit.com/r/dredmorbius/comments/62uroa/clothin...](https://old.reddit.com/r/dredmorbius/comments/62uroa/clothing_music_diets_art_management_theory_fad_as/)

Add to that software languages, development methodologies, and academic
fashions.

The underlying reasons are information-theoretic as well as tribal. We're
working in complex multidimensional dynamic spaces with diverse worldviews and
incomplete, inconsistent, variably understood-and-applied vocabularies (and
other symbolic references), scrabbling for a small pie and limited prestige
slots. Simply understanding a common language (or choosing which to adopt) is
a high-risk decision made under highly imperfect knowledge.

________________________________

Notes:

1\. Identify friend or foe:
[https://en.wikipedia.org/wiki/Identification_friend_or_foe](https://en.wikipedia.org/wiki/Identification_friend_or_foe)

------
chiefalchemist
Humans are humans. Some industries (e.g., academia, journalism, science)
have mythical branding that positions them as above being human. They are not.
History is pretty clear about this.

Such a flaw in and of itself isn't a bad thing; it is what it is. The danger
comes when others put too much (blind) faith in the myth, and not enough in
the lessons history has already taught.

------
Der_Einzige
I like attacking academics for careerism as much as the next guy (and
especially in the context of the strange embrace of postmodernist authors),
but they straight up aren't even talking about the worst form of this yet.

The French authors listed at least had a semblance of value in their works.
Hell, I'd highly recommend everything Arendt wrote, and Derrida wrote at least
one good book on the Marx/Stirner rivalry. Plus, if you're getting a philosophy
degree you basically need to read everyone on that list (okay, maybe exclude
Foucault cus he says some dumb and anti-historical shit about leprosy).

But this piece failed to really get at the heart of Charlatanism. What really
scares me are the "academics" who unironically believe in psychoanalysis
today. It's not just for fashion - it's pure lunacy. These are your
"lacanians". Try reading Lacan, or Deleuze and Guattari and you will see a
whole level of bullshit one wouldn't even imagine possible. Seriously the
story of Jacques Lacan and his popularity today among academics should _scare_
you.

~~~
kaycebasques
Philosophize This just did a series on Foucault [0] and Deleuze [1] which I
got a lot of value out of. Particularly the Deleuze discussion about how post-
modernism doesn’t always have to devolve into deconstruction. That was eye-
opening and hugely helpful for me. And Foucault’s idea of the panopticon is
probably very relevant to anyone interested in surveillance capitalism.

I studied History at Cal so I understand the contempt for the philosophers and
their academic followers. They can be personally obnoxious and seem to be
intentionally obscure. Post-modernism seems to be creating more problems on
campuses than it's solving. But when I hear a good, plain-English account of
their arguments, like in Philosophize This, the ideas help me understand the
world, and I’m thankful to them for that.

[0]
[https://open.spotify.com/episode/1tcYLy2ILoRoDddvR56Kxy?si=O...](https://open.spotify.com/episode/1tcYLy2ILoRoDddvR56Kxy?si=OCDqkr12TbOqWc1gEdHeSQ)

[1]
[https://open.spotify.com/episode/0E80qwTOz41UruHfyZMc2P?si=n...](https://open.spotify.com/episode/0E80qwTOz41UruHfyZMc2P?si=nqrGIXBWSISvqBsGtf9jQQ)

~~~
Der_Einzige
Foucault's only half-decent book was Discipline and Punish (where the
panopticon chapter is), but if you actually read him (or open up any of DnG's
Anti-Oedipus) you will see very rapidly why you had to have a secondary source
to understand them.

It's because modern people don't read charlatan authors as much as Foucault
did. Foucault is a terrible historiographer, and no amount of writing against
"metanarratives" or "totalizing descriptions of history" saves him from this.

"Deconstruction" is not what I critique. I critique people who unironically
still believe in the powers of psychoanalysis. DnG don't get a pass for
inventing schizoanalysis - it's way worse!

~~~
kaycebasques
I agree that secondary sources do a better job at explaining the ideas than
the originals. I did read Discipline And Punish though and got through it
fine. My understanding of the text mapped well to the Philosophize This
podcast on the topic. It would be exhausting to read his whole body of work,
though. Same goes for any of the post-modernists. I guess they might have
argued that they were venturing into new intellectual territory and that
language wasn’t well-suited to what they were trying to explain, but if that’s
the case then that’s just elitist BS in my eyes. The secondary sources prove
that you can explain these ideas in plain language just fine.

Didn’t mean to imply that you were critiquing deconstruction. I bring it up
because in my experience the major complaint around these authors is that they
seem to just tear down the “grand narratives” of the past without showing the
way to a better future. I think this is the main problem on campuses right
now. I for one can definitely say that was my college experience. I left with
the clear impression that a lot of ideas I had grown up with were BS,
meanwhile my ideas on how to build a better future were vague at best.

~~~
0815test
> the major complaint around these authors is that they seem to just tear down
> the “grand narratives” of the past without showing the way to a better
> future.

IME the main complaint is that they seemingly operate from a standpoint that
the work of "tearing down the grand narratives" can itself be a "grand
narrative", a "theory" of its own. We shouldn't be surprised when this doesn't
work very well! The fact that they aren't "showing the way to a better future"
(unlike, e.g., pragmatists who can be just as skeptical as postmodernists when
it comes to all-encompassing grand stories!) is really a symptom, not the main
issue.

------
samirillian
Let me excerpt an email from a friend:

> ...[H]ermeneutics is fundamentally about power, as Nietzsche described. But
> in Nietzschean terms it is not like Foucaultan terms, for Nietzsche includes
> "being a slave" or being incorporated into the power of another, as a
> potentially positive or potentially negative thing. For Foucault it is never
> neutral and rather black and white. And for Hegel. Which is why Nietzsche
> and Derrida are both superior to Foucault and Hegel whenever hermeneutics
> and the hermeneutic circle are issues of earnest import.

Now, I think this paragraph is entirely readable and gives a pretty good
breakdown of the difference between, say, a Nietzschean and a Foucauldian
understanding of power, freedom, and interpretation. Personalities here serve
both as metaphors and as reminders that the world we live in is a _human_
world.

I'm sure the people this writer is talking about are insufferable, but this
writer is also kind of an ignoramus.

------
aalleavitch
Kuhn said it first:
[https://bertie.ccsu.edu/naturesci/PhilSci/Kuhn.html](https://bertie.ccsu.edu/naturesci/PhilSci/Kuhn.html)

------
whoevercares
I thought this was going to be about deep learning lol.

------
bigbadgoose
It's OK, even the 2nd, 3rd, and 4th deviations have their "normies"

------
api
One peeve of mine is these sorts of folks complaining about the popularity of
people like Jordan Peterson. Sure I agree that Peterson is confused, pseudo-
profound, and misogynistic, but maybe if academics churned out more philosophy
that spoke to the concerns of real people Peterson would have less of an
audience. People like him are popular because academia produces only
impenetrable walls of irrelevant jargon.

~~~
wayoutthere
People don’t want philosophy; if they did, there are plenty of authors
from Aristotle through the modern day. Existentialism is basically “how to
survive in an arbitrary world that makes no sense”. It’s a fucking guidebook
for the modern world, built in the years before and after WW2.

People en masse want pop-philosophy that confirms their existing beliefs.
Nobody reads JP to become a better person, they do it to feel better about
being a shit person.

~~~
sho
> Nobody reads JP to become a better person, they do it to feel better about
> being a shit person

Without wishing to get into yet another discussion about Peterson, I think
that's a pretty textbook example of an intellectually dishonest attempt to
simply pigeonhole someone you probably don't know all that much about into the
"enemies" basket.

I've read and listened to JP to some extent - believe it or not, yes, as part
of some kind of greater effort to better myself, as is much of my reading. I
agree with some of what he thinks, and disagree with others, but I don't
regret the time spent. If that makes me a "shit person" then frankly I think
it's your worldview that is the problematic one.

