
How To Do Philosophy (2007) - HugoMelo
http://paulgraham.com/philosophy.html
======
jackson1372
(As a current Philosophy PhD Student...)

PG's essays are consistently great. Although in this case, I disagree with the
central thesis that the history of philosophy is merely a history of confused
talk. Certainly, a lot of philosophizing (especially OLD philosophy) commits
just this sin. But that's not to say that philosophy as a methodology is
inherently broken, that one has to be caught up in the muddle of confused
language. In fact, my own study of philosophy has taught me that philosophy,
at its best, clearly lays bear the ways in which language confuses us.
Philosophy, then, allows us to transcend the little confusions that pervade
our everyday language.

And what PG ends up saying, that philosophy should change the way we go about
doing things, is essentially a restatement of the Pragmatists' ideal. Peirce,
James, and Dewey said the same thing a hundred years ago. PG is right to
stress this idea, but if his claim is that philosophy is not sensitive to it,
he's just wrong.

~~~
petegrif
I feel you fall into the very trap of which he speaks. Even a Philosophy
Ph.D. student, wise in the ways of the Ancients and eager to speak plainly,
finds it impossible not to be 'caught up in the muddle of confused language.'
Your study has apparently taught you that philosophy 'lays bear' the ways
language confuses us. And that statement did indeed confuse me mightily.
Presumably it is a subtler articulation of underlying intent than the everyday
process of 'laying bare.' But perhaps I am wrong in this?

~~~
aptwebapps
That's the wordiest typo nitpick I've seen in a while.

~~~
petegrif
Precisely. LOL.

------
adrianhoward
I don't know if this is a UK/US thing - but judging by the essay, the
undergrad philosophy course pg took didn't seem to touch upon things that
were happening in modern philosophy.

I did a contextual course in cognitive philosophy as an undergrad back in the
late eighties (roughly equivalent to a minor in US universities) - so only
about two or three years after pg graduated.

Even in that - a minor in a narrow chunk of philosophy - we were exposed to
modern philosophers - people like Daniel Dennett
(<http://en.wikipedia.org/wiki/Daniel_Dennett>), Andy Clark
(<http://en.wikipedia.org/wiki/Andy_Clark>) and many more. All of whom were
looking at work coming from the sciences to ground their thinking - and indeed
suggesting courses for further research (look at Dan Dennett's influence on
cognitive psychologists, for example).

There's a _whole branch_ of Philosophy (Experimental Philosophy -
<http://en.wikipedia.org/wiki/Experimental_philosophy>) that's been going
since 2000 with antecedents long before that.

At the time pg wrote this essay experimental philosopher Joshua Knobe
(<http://en.wikipedia.org/wiki/Joshua_Knobe>) was being written about in the
NY Times
(<http://www.nytimes.com/2007/12/09/magazine/09wwln-idealab-t.html?_r=0>).

(Freaky PS... in double checking roughly when pg was born to see when he would
have been doing his undergrad philosophy course - I see he was born in
Weymouth, Dorset, UK.... about half an hour from where I live now ;-)

------
gnosis
It's a grave injustice to the vast field of philosophy to judge it by one
undergraduate's impression of Plato, Aristotle, Bertrand Russell, and
Wittgenstein.

By his own admission, he didn't get much out of his experience. Should the
entire field of computer science be judged by one unenthusiastic
undergraduate's impressions of Java?

~~~
pjscott
That argument applies to more than just philosophy. And yet, if you accept
that not all religions can be true, and that most of the major ones have long
scholastic traditions, then there must be _at least_ one field of study which
is vast, majestic, and wrong.

So, what method would you propose for determining which academic fields are
worthwhile, in their current state?

(I think I just did some philosophy. Oops.)

~~~
gnosis
For a method of determining "which academic fields are worthwhile", it might
be wise to start by becoming thoroughly familiar with the fields in question
-- something that the author of the above essay did not seem to have taken
the trouble to do.

And how much less familiar with philosophy are most of the readers of this
essay? I would venture to guess that they are not even philosophy majors, but
rather computer science majors or self-taught computer programmers. What a
shame it would be if they thought the essay's author knew what he was talking
about and dismissed all of philosophy out of hand.

All of philosophy might be "wrong", but if that's so, it's going to take a lot
more than a vague, ill-informed five-page essay by a disaffected undergraduate
to convince me.

------
rogueleaderr
If you like this essay, check out <http://lesswrong.com/>

They seem focused on what PG is proposing.

~~~
pjscott
Particularly on-topic is a series of articles by Luke Muehlhauser about "how
to do philosophy when you take cognitive science seriously."

<http://wiki.lesswrong.com/wiki/Rationality_and_Philosophy>

------
martingoodson
Nice article but isn't this mostly a restatement of Wittgenstein's thoughts on
philosophy? pg gives the impression that Wittgenstein had an obvious idea that
only he bothered to put into writing. Like many great ideas, Wittgenstein's
later philosophy only seems kind of obvious after you encounter it. And it's
probably not fair to suggest that Wittgenstein just shut down philosophy
instead of studying it as 'an example of reason gone wrong'. 'Studying
philosophy as an example of reason gone wrong' is basically a description of
Wittgenstein's later thought.

Similarly, it's difficult to answer the question 'has (western) philosophy
just been a complete waste of time?' because we don't know how much of our
thinking
has absorbed wisdom that came from philosophy. I suppose you could compare our
society to another technologically-advanced and successful society which
hasn't been as exposed to it. Like China, as pg suggests. But they have that
whole totalitarian-state thing going on so perhaps they are not the best
exemplar.

------
dschiptsov
One such useful general idea is knowing when to stop.

Piling up meaningless words (or Java classes, or CL macros) is not just a
waste of time; it creates even more confusion and messes everything up.

When the very few people who try to capture the essence of a phenomenon (so
to speak) succeed, they end up with something really clever, like Plan 9,
Scheme (or Arc), vi, Emacs, etc.

So there is a simple heuristic: whenever you see piles upon piles of crap
(J2EE, NodeJS, Clojure, everything that comes from MS or SAP), just avoid it.

"Perfection", as we know, is achieved not when there is nothing more to add,
but when there is nothing left to remove. This means, for instance, that we
need fewer special forms (but more small macros) and fewer special characters,
used consistently. I could write a brochure about why using ~ instead of , in
Clojure's macros is not just a gratuitous break with consistency and
familiarity, but also a lack of taste: , visually "matches" with ` while ~
does not.

Most of the time, even a single glance at a text or source code is enough to
form a correct intuition.

------
pfortuny
If you think philosophy needs "fixing", it is because you see it as something
external, a problem which is outside yourself.

The fact is philosophy is the reaction of each individual to his confrontation
with reality, language, morals and people: you cannot "fix" individuality.

And the "muddle" is as old as Descartes... don't get me started on this.

Trying to "fix" philosophy with "science" makes me laugh: which one, maths?
physics? string theory? behavioral sciences??? COME ON.

Just think for yourself and accept that you may not understand other people's
thoughts before trying to 'fix' them.

------
kijin
I usually like PG's essays, but this one is nowhere near as good as his essays
usually are. His argument has more holes in it than aerogel.

> _Most philosophical debates are not merely afflicted by but driven by
> confusions over words. Do we have free will? Depends what you mean by
> "free." Do abstract ideas exist? Depends what you mean by "exist."_

Isn't that exactly why (analytic) philosophers make a great deal of effort to
clarify the sense in which they're using a word? Philosophy professors often
tell students not to quote dictionaries in their essays, because typical
definitions of English words are nearly useless when it comes to modern
philosophy. Yes, ancient philosophers got confused by words. But people do
learn from their predecessors' mistakes! Read any good paper from the last
century that deals with free will, and it won't take long before you come
across words like "P-free", "Q-free", "T-free", etc., each with its own
precise definition. Nobody cares whether we have "free will" in a fuzzy sense.
What actually matters is whether we have "ABC-free DEF-will", where "ABC-free"
and "DEF-will" each have very precise definitions.

Has America become a communist country? Depends on what you mean by
"communist". But this doesn't mean that there aren't any interesting questions
to be asked about shifts in America's ideological makeup.

> _Instead of trying to answer the question: What are the most general truths?
> let's try to answer the question: Of all the useful things we can say, which
> are the most general? The test of utility I propose is whether we cause
> people who read what we've written to do anything differently afterward._

I'm afraid a lot of people who are attracted to philosophy actually want very
much to tackle the first question, and find the second question rather
uninteresting. If you're someone who is deeply attracted to the P=NP problem
and similar topics, would you find it satisfactory to spend your life creating
the next Instagram instead?

Seriously, I want to know whether there exist any general truths. Using a
suitably precise definition of "exist", "general" and "truth", of course. I
don't give a damn whether those truths (if they exist) have any impact on
human life, although I don't deny that it would be neat if they did.

Given what's being proposed here, the title of the article should be "What to
do instead of philosophy", or at best "How to do this special kind of
philosophy", not "How to do philosophy (in general)". The same applies to the
lesswrong crowd. I don't have anything against what they're trying to achieve,
but let's call a spade a spade. Failure to do so only exacerbates the
confusion-of-words problem that PG rightly points out.

> _Knowing we have to give definite (if implicit) advice will keep us from
> straying beyond the resolution of the words we're using._

If the resolution of English words is not enough, why not try to increase the
resolution? Why give up already?

Steve Jobs gave us the Retina Display while everyone else was playing with
anti-aliasing. The pseudo-words I mentioned above are one way to do to human
languages what Steve Jobs did to mainstream computer displays. (Okay, I'm
oversimplifying. The history of high-res screens is much more complicated. But
anyway.)

Also, we already use words like "#fdfdfd" and "#fcfcfc" to increase the
resolution of words like "white".

Disclaimer: I have a Ph.D. in philosophy, so I'm probably biased.

Edit: Increased precision in some of the sentences.

~~~
mattmanser
I did Philosophy as a degree, and pg's essay, which I read a while ago, was a
moment of crystallization, a total clarification of something that had been
bugging me for years.

I personally believe that Philosophy is the place where amateur thinkers
reside until a discipline matures and breaks off. Philosophy is thinking
without the effort of doing. First it was Physics, Law, Chemistry, Biology,
and Medicine; now we are seeing Psychology and Sociology departing Philosophy,
taking morality and personal identity into the realms of actual science
instead of thought experiments. Soon Philosophy will be left with its history
and debating the meaning of words. CS and Maths have taken all the
interesting logic questions away from Philosophy, for example.

I believe what pg advocates at the end has already happened, and is always
happening, to Philosophy. The general truths philosophy helped discover are
the sciences; it's just that as we get close to answering the general truths,
the philosophers step aside for people who actually do.

Most previously interesting philosophical questions are now interesting
scientific questions. There's not much left for philosophers to actually talk
about.

~~~
kijin
I think you're right about mature disciplines breaking off from philosophy,
but I don't see morality and personal identity departing the realm of
philosophy any time soon.

Sociologists only tell us how human societies behave, not how they are
_supposed_ to behave. You might hold the opinion that we can somehow derive
the _ought_ from the _is_ , but if that's the view that you're trying to
advocate, then you're already doing metaethics -- a branch of philosophy.

Psychologists and neuroscientists might be able to tell us how children
acquire a sense of identity, or how patients identify themselves after massive
brain surgery. But I don't think their job description includes figuring out
what the essence of a person is, or whether there is such a thing as an
essence of a person in the first place.

As for law, it is unlikely that you will be able to settle the question, "Why
in hell should anyone obey the law?" or "If the law is unjust, should I still
obey it?" until you've settled some questions about morality. Just because a
law is (un)constitutional doesn't mean that it is (im)moral.

Guess what, you can even combine questions about morality with questions about
personal identity, and throw in some legal theory as well, to end up with some
really interesting philosophy [1].

[1] <http://plato.stanford.edu/entries/identity-ethics/>

~~~
thirdtruck
We have neurological evidence pointing to how our very sense of "self" exists
as a construction of the brain, associated with specific regions.

Drugs that induce "a sense of oneness with the universe" appear to just turn
off that "self illusion" part of the brain. Compare it to when I kill my
Gnome/KDE daemon and see the shell underneath.

In short, neuroscientists have already discovered "the essence of a person",
regardless of job description.

~~~
kijin
If identifying the brain region or drug that produces the sensation of
identity counts as discovering the essence of identity, does identifying the
brain region or drug that produces the illusion of being a cat count as
discovering the essence of feline nature? Would the illusion provide us with
accurate rules to distinguish felines from other animals?

This is exactly the kind of equivocation that contemporary philosophers
complain about nearly every time an arrogant scientist announces the end of
philosophy. Not because the scientist's answers are uncomfortable, but because
they don't even answer the same question. (Arrogant philosophers, in turn,
would say in Latin: _ignoratio elenchi_.)

Yes, we already know which binaries produce the KDE Plasma Desktop when
executed from the bash shell. We also know exactly which bits produce which
parts of the GUI. But what we really wanted to know was the conceptual,
historical, and perhaps even ethical significance of the design decisions and
system architecture that make KDE what it is. (If we replaced Qt with
something else, would it still be KDE?) You can't answer that question by
showing us a million lines of C++.

~~~
thirdtruck
_Would the illusion provide us with accurate rules to distinguish felines from
other animals?_

"Felines" as a group exist only as a cognitive shortcut. Looking at the
genetic and evolutionary data, we find a continuum where felines and other
families bleed together in a very fine gradient. We just happened to draw
boundaries in the interest of productive discussion. We will never find "the
essence of feline nature" because there is no such concrete thing as a
"feline" which would have such.

For a human example: at what point do we stop being conscious when falling
asleep?

~~~
memla
He didn't ask you whether there are accurate rules for distinguishing between
felines and other families, he specifically and precisely asked you if you can
_deduce_ those rules from the illusion! It's the same inference you were
trying to make about the nature of the self. Read more carefully.

~~~
thirdtruck
Short answer: no. Long answer: no, because we derive these distinctions from
the real world instead, changing them as necessary to better reflect new
knowledge. See "birds are dinosaurs" and "the brontosaurus never existed".

------
swansong
Dilettante.

<http://www.quickmeme.com/meme/3s458i/>

Just picking one of the many hilariously patronising statements in the essay,
the author says "we may be able to do better" than the great philosophers by,
in effect, making a big list of things that are true in general, and cause
people to act differently.

How do we measure truth? Who decides? What happens when the people who decide
become political? What if someone is wrong? What if an action that you thought
was true causes someone to do something morally wrong? What is morally wrong?

What if others want to talk about things that aren't on this list? Do you stop
them? Are they wasting their time if they ask other questions?

Maybe people in philosophy are looking at things now that will only make sense
to others in decades to come. That happens to be my opinion. It is worth
studying anything somebody gives their life to _very carefully_ before writing
it off as a "swamp of abstractions".

------
tylerneylon
Some commenters are defending modern philosophy, but I think there is
something legitimately wonky in the discipline. I can't figure out the best
way to describe it, but, to state it poorly, studying philosophy often feels
like filtering out a ton of BS.

I'm not saying it's all BS (it's not), but that there's too much.

I don't think it has to be that way. Example ideas:

* Writing in philosophy could be improved (<http://tylerneylon.com/a/writing-about-philosophy/>)

* The attitude toward 'what is philosophy?' could be improved (<http://www.richardprice.io/post/35542139118/one-hypothesis-about-the-role-of-philosophy>)

------
goldfeld
There is one thing that I could do differently that's enormously significant:
freely undertake teleportation and mind uploading, given that I live long
enough to be able to do it; and it's in turn what's driving me just now into
Philosophy (I'm even learning German). Surprisingly for me, Graham nails
that, but may not quite explore the ensuing possibilities, when he talks
about sectioning one's brain in half, putting one half in a clone body, and
trying to figure out which person you are now.

My working line of reason is that consciousness is, as he puts it, merely an
abstract concept we get attached to and have the illusion of having. There is
no consciousness, only the awareness of a single moment--a point in time. What
we take for consciousness is nothing more than a succession of these single
moments, much as a computer's processor clocks at so many Gigahertz. With each
moment of awareness creating new memory and storing that memory back in the
brain, the subsequent moments are able to access it.

You could think that the current moment of awareness is all that ever really
"exists", because for all the previous ones you only have the memory; you
have no concrete evidence that they ever existed. There is no past awareness,
there is only a current awareness of what we have registered as the past.
With that decoupled thinking, you can start to realize that any given moment
of awareness that I have is as different from (and as similar to) any given
moment you have as it is from a moment I had 5 minutes ago, or ever.

So who I "am" now is, in essence, as different from who I was just a second
ago as it would be from a clone that could be made of me.
There is nothing in my flesh ensuring I am a continuous consciousness. There
is only the memory. The reason I am closer to my 5-minute younger self than I
am to another human being is that the former and I share all the memory up to
5 minutes ago. We have a base commit to our memories in common. But a clone of
me, made now, would be exactly as close to my 5-minute younger self.

The bottom line is that I should not think that being teleported should kill
me, as that would imply that I'm dying after every moment of awareness,
birthing a new person who happens to share all my memories. Indeed I may well
be, and we have the illusion that we have a continuous life since we have all
these vivid memories of what just happened a second ago, even if it's not us
who lived through it.

And though I'm finding this train of thought increasingly alluring, I know
nothing of how philosophy ever brushed it, or it may well have tackled it head
on and I wouldn't know it.

~~~
GreyZephyr
I presume you are acquainted with Buddhist thought on this matter? One of the
central tenets of Buddhist philosophy is the idea that there is no such thing
as the self, but rather a linked series of experiences. The Chan and Zen
traditions make something of a big deal about this; the main goal of the
religious and esoteric practices associated with them is to come to the
realisation that you do not in some sense exist and are 'selfless'. At this
point you are said to be enlightened. (As an aside, one of the core rules for
Buddhist monastic communities is that you can't claim to be enlightened,
since if you have achieved enlightenment there is no self to make the claim.)
European philosophers came to the idea much later; historically this would
seem to be a side effect of the notion of a soul, and in particular its
indivisible nature.

~~~
goldfeld
Buddhist philosophy did shape much of my thinking in the years I read and
meditated on the Vipassana variety. But I hadn't been aware that other
traditions took the notion of detachment as extremely--and down to its
essence--as I'm now taking it; thanks for pointing it out. I hear all of the
hype around Zen, but the fetishism of Japanese culture keeps me at a
distance; Chan, though, sounds like a wonderful indulgence to go with my
Mandarin studies. I'll look into it.

------
knb
I was puzzled by the statement that he didn't learn much from the classes he
took in formal logic.

> _Consider: "I don't know if I learned anything from them." And footnote 1:
> "In practice formal logic is not much use,..."_

This guy has programmed massive amounts of LISP code, and written books about
LISP.

How can he say that his training in formal logic didn't take him anywhere?

------
powertower
I've always enjoyed reading this (by Mikhail Naimy, from _The Book of
Mirdad_)...

"Logic is immaturity weaving its nets of gossamer wherewith it aims to catch
the behemoth of knowledge. Logic is a crutch for the cripple, but a burden for
the swift of foot and a greater burden still for the wise."

------
Tycho
My guide to doing philosophy: study philosophy (along with history and other
related subjects) until you're in your early twenties, then choose which
philosophy you think is right and live by it.

~~~
kijin
In fact, this is what most "professional" philosophers do. The theories they
prefer are more or less set in stone by the time they do their Master's (age
22-24). After that, it's mostly refinement and minor adjustments, and arguing
with everyone who disagrees. Of course, those minor adjustments make all the
difference between a convincing theory and a ridiculous theory.

Some people are exceptions; that's when you hear them being referred to as
"the early Tycho", "the late Tycho", etc.

------
indubitably
Does he _ever_ get any less arrogant?

