Using deconstruction as a method, independent of and decoupled from its own genesis, does not make sense and will yield little of use. The whole point of deconstruction is to oppose another strand of Western philosophy that, depending on the country, has reigned for centuries: idealism.
Now, I realise it's cool to "show" how much more empirical and superior engineering is, but idealism is at the root of a great deal. It has fundamentally shaped our ideas of the university, of statehood, nation, authorship, race and the sciences. Many of the subsequent movements, starting with Marxism, followed by Nietzsche, the New Left, and deconstruction (which, really, is called poststructuralism), are an undertaking to undo some of the damage idealism caused during its reign.
Deconstruction aims to undermine the very hierarchies idealist thought has erected: the nation as the culture of a given, demarcated and distinct people; the university as a structure for educating state servants; authorship as the translation of Geist into matter; science as the study of matter, leading to the ultimate victory of ratio and empiricism.
If people talk about advanced programming techniques without knowing anything about underlying compiler design, algorithms etc., they are ridiculed and put in their place. If people talk about the history of thought without having any idea what they are talking about, they are applauded while the audience cheers for their (and its own) ignorance. Why is that?
Deconstruction without its history is worthless. That, however, also suggests that for a history of thought, for an understanding of our present, deconstruction is indeed relevant.
Say what you want about Wikipedia's internal processes (if you must), but it does an overwhelmingly good job at saying what things are, including schools of thought and beliefs. Even through the edit wars, it can explain the most controversial tenets of religions, where the stakes are much higher than poststructuralism and stuff.
But the Wikipedia article http://en.wikipedia.org/wiki/Deconstruction has been an incoherent disaster for over a decade. It's not even much of an edit war, just a stagnant mess. Everyone agrees that it is bad. Nobody agrees on what to do about it.
The disagreement is not between literary theorists and sciencey types. The disagreement is between literary theorists and other literary theorists, whose only unified view is that sciencey types should not be writing the article. Occasionally one of them will wholly rewrite the article, at which point everyone else says, basically, "wtf did you just write", and the list of problems gets longer.
My personal conclusion is that, while you criticize people who "talk about history of thought without having any idea what they are talking about", the people who do have an idea what they are talking about on this topic don't exist.
It's still dense and a bit jargon-y, but relatively easy to follow linearly.
The second you get into "X thinks Y about Deconstruction" or "X's work relates to Deconstruction because Y" (which is most of the Wikipedia article), the whole thing becomes unapproachable unless you're an expert on the entire canon of Western philosophy. This is unreasonable for something that's really a pretty simple concept.
This sort of argument is very common (in my experience) in contemporary philosophy: "Your work on X is invalid because you didn't consider how it relates to Y." Even though that is a fallacy, covering all the "it relates to Ys" seems to be a primary goal of the Wikipedia article.
> But with Derrida, we know now, the foundation is not a unified self but a divisible limit between myself and myself as an other (auto-affection as hetero-affection: “origin-heterogeneous”).
That's bullshit intended to baffle.
Translation: "Maybe the self is not the unitary thing we normally consider it to be. But how can that be possible? For example, if it has multiple parts, couldn't we just say that only one of those parts is the self? Where does the multi-faceted nature come from? Is it merely the grammatical division between subject and object -- are there two selves in the sentence 'I affect myself'?" Etc.
I agree that the writing is a little silly (that parenthetical should be its own sentence), but it's partly because the ideas are just pretty hard to think about.
No, there aren't, though I don't see why that has anything to do with the paragraph in question.
Are you saying that the foundation of Deconstruction itself rests on a violation of the use-mention distinction as obvious and pithy as "there is no 'I' in 'team'"?
I agree with you that it should be more accessible, though articles in the Stanford Encyclopedia of Philosophy are often fairly technical. But it really takes a brilliant writer (which the author of the SEP article clearly is not) to make something complicated and arcane accessible. I just don't want you to draw the conclusion that because this guy couldn't do it in this particular sentence, a whole field of thought is crap. Or unfairly ascribe bad intentions to the author.
The parenthetical is complete and utter bullshit, as if somebody said "this is good, but you should jargon it up a bit."
Deconstruction is not a "thing". Neither is "empiricism". They are modes of enquiry. Both position you in a certain mode towards your surrounding environment, be it political, social or scientific. They both open certain modes of enquiry while excluding others. One conclusion that one can draw from deconstruction (if we "apply" it onto itself) is that no answer one can ever give will be without bias or will completely exclude any other option.
Niklas Luhmann developed, in his Systemtheorie, the notion of the nth-order observer, where every observer may observe another observer's specific bias and blind spot, yet each additional level of observation always introduces its own, new blind spot.
When Derrida talks about archi-écriture, what he says is that if there ever was an all-encompassing entirety of meaning that had only one single way of interpretation, then this unity is a mythical one which cannot be recovered. Every utterance, every decision has the inherent option of an opposition.
Derrida is Hegel upside down. If Hegel sees Geist as the unity of matter and spirit, then for Derrida, this unity might have existed in an archaic past but has since been lost and cannot be put back together. Hence, whatever we do will not empty out all possible interpretations and views of life; there will be no single answer to the meaning of life or some such. Instead, all that remains is the fundamental idea that every utterance and every position carries within itself already the means to disprove it.
This is not fundamentally different to the scientific method. The atom was thought to be the smallest, inseparable unit (hence the name) before it could be found. However, in finding the atom, we realised that it is, indeed, made up of smaller parts, which in turn are again made up of smaller parts. What was taken as an absolute minimum was, in fact, only the temporary minimum before we could conceive smaller units. The same, just in reverse, goes for the solar system, galaxies etc.
In current times the term "thing" also applies to non-physical entities. As such, a mode of enquiry is a thing.
Otherwise, I like the rest of your explanation.
To the extent this is meaningful at all, it implies that it is not true in the slightest, which makes it difficult to take seriously.
> This is not fundamentally different to the scientific method. The atom was thought to be the smallest, inseparable unit (hence the name) before it could be found. However, in finding the atom, we realised that it is, indeed, made up of smaller parts, which in turn are again made up of smaller parts.
Until that process bottomed out, of course, and bottomed out based on evidence, not philosophy.
1) If one can construct something, one can deconstruct it.
2) Derrida was a guy who existed.
If you have an idea for a single Wikipedia page that can resolve these two impossible contradictions, then you are a person who has an idea.
(edit: The grandparent poster's error was in making sweeping generalizations about capital D versus little D without mentioning the difference.)
(second edit: the Wikipedia page on Deconstruction is the quintessential case of original research?)
I don't know what you're trying to say, but "Derrida was a guy who existed" is not, in reality, an "impossible contradiction". It's just true. Your first statement, on the other hand, sounds either meaningless or false.
I think I agree with you, though, that the Wikipedia page on Deconstruction is "the quintessential case of original research".
Sometimes it is! (http://www.amazon.com/Fashionable-Nonsense-Postmodern-Intell...)
I'm reminded of David Hume's justification for the continued use of induction despite no solid theoretical justification for its correctness: we are creatures of empiricism and it is in our very nature to believe in that which works. For all the criticism, I do not often find people criticizing the effectiveness of empiricism.
The problem with anyone who majored in philosophy is that they will switch between big-D and little-D while expecting everyone around them to code switch at the same pace.
This is also true for any given smartass who never gave a flying fuck about Derrida.
It's useful as an intellectual exercise to analyze the theoretical merits and shortcomings of deconstruction, but because I'm a basically shallow person I'd like to point out that in terms of hipness and relevance we may as well be analyzing the music of Hall & Oates.
I will even suppose that something useful and important is being done in those fields--nevertheless, at the end of the day, I can point to my computer or bike and say "Engineering and hard science made this possible", and they can't point to a damned thing anyone cares about.
Here's a simple analogy: I am a teacher. My student is failing to learn something. There may be several factors in play, but the first thing I should do is check if I could do better as a teacher. Postmodern academics have mostly failed to be truly sensitive to how their ideas are coming across to others at all, so they aren't even at the point where they could figure out if they need to revamp their teaching approach.
First of all, Deconstruction should be credited to Heidegger, not the French. The French of the 50s and 60s read Heidegger, stole and (further) obfuscated some good ideas, then wrote books without citing him for political reasons (he was a member of the Nazi party throughout the war, in good standing at the beginning and in bad standing at the end, but he never rebelled or retracted publicly -- complicated person...) Then the students and readers of these French guys think the (derivative) French guys invented the idea.
There are two ideas of deconstruction I find useful.
The first is basically Nietzsche, and I paraphrase it thus: Every metaphysics hides an implicit morality of the universe, and every morality hides an implicit and very down-to-earth will to power, and this will to power is a good thing and the drive of Life itself manifesting in human ideas. N's big example is Judeo-Christianity, and how by inventing a selfless morality a bunch of folks took over Western Civilization in very concrete ways. So when the Libertarians have an "objective" system that shows that markets are best in an objective way, or the counselors/psychologists have an "objective" system that shows that depth psychology is best, they are all just protecting their jobs.
I extend this to our modern science loving era as: any supposedly scientific system that explains things but can't be rigorously tested just becomes wishful thinking to support the vested interests of a social group. This is basically all the social sciences, from econ to psych to anthro, and in all cases I think you can point directly to the group whose jobs or power the supposed "science" supports. (Owners in the first case, counselors who want more billable hours, locals fighting off outsider meddling, etc). And, yes I know about multivariate regression and survey data, and all I can say is what a joke.
Now that I have rambled too long, idea 2: I think one of the best things about Heidegger (who is fucking brilliant if you can get through his ridiculous prose) is that he asks us to question not whether something is an X or a Y, but rather to get "meta" and start examining what makes it possible for us to go through life assigning things to ontological categories (X or Y) in the first place. Because the mechanism of ontology -- that everything has a "being", that it "is" something -- is not a given, even though it comes so naturally linguistically.
Anyway, I should get on with my life now....
This is all too typical of the anti-scientific, anti-intellectual nonsense that holier-than-thou techies exhibit towards fields which they have very little knowledge of, and barely understand.
All of these fields are intensely focused on empirical evidence, statistical testing, and sturdy experimental design. Yes, sometimes poor studies are published (especially in economics and sociology; psychology and anthropology tend to be far more rigorous), but there are poor published studies in all fields, and generally authors comment on and make note of the flaws in their studies, and present them at face value. To decry the validity of thousands of scientists is not only arrogant, it is stupid.
And I am very pro-science, which is why I am anti social science. It would be more intellectually honest for the latter to say "really, we would be lucky if we were in the same place as the alchemists were in the 12th century". Instead they fill journals with ideological crap that (in)directly supports their power struggles.
> All of these fields are intensely focused on empirical evidence, statistical testing, and sturdy experimental design.
Yeah whatever. The use of regression as the end-all be-all scientific magic that justifies a theory is the basic problem. What most sociologists don't understand is the difference, in real science (Physics, say), between a linear and a non-linear system. Most systems of any consequence, including orbiting planets, are nonlinear; if we could actually model human interaction, it would be nonlinear too. So a regression method might be able to describe a very local effect, but as a complete (global) model it has as much validity as a straight line does in describing the elliptical movement of the planets -- i.e., none.
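To make that straight-line-vs-ellipse point concrete, here's a toy sketch (my own illustration, not from any actual study): fit a line to a short arc of an ellipse and it works fine; fit the same model to the whole orbit and the error is enormous, because no line describes the global shape.

```python
import numpy as np

# Points on an ellipse (a toy "orbit"): x = 2*cos(t), y = sin(t).
t = np.linspace(0, 2 * np.pi, 400)
x, y = 2 * np.cos(t), np.sin(t)

# Fit a line y = a*x + b on a short arc near the top of the ellipse,
# where the curve is locally almost straight ...
local = (t > 1.2) & (t < 1.6)
a_loc, b_loc = np.polyfit(x[local], y[local], 1)

# ... and on the whole orbit, pretending the line is a global model.
a_glob, b_glob = np.polyfit(x, y, 1)

def rmse(a, b, xs, ys):
    """Root-mean-square error of the line y = a*x + b on the given points."""
    return float(np.sqrt(np.mean((a * xs + b - ys) ** 2)))

# The local fit describes its own arc well; the global fit is hopeless,
# since the best line through a full ellipse is roughly y = 0.
print("local arc error :", rmse(a_loc, b_loc, x[local], y[local]))
print("full orbit error:", rmse(a_glob, b_glob, x, y))
```

The local error is tiny while the global one is on the order of the ellipse's height, which is exactly the "locally valid, globally meaningless" complaint above.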
Furthermore, we have no real way yet to test social theories, which have too many moving parts to get a clear signal in an experiment, much less consistent results across repeated experiments. Without repeatable experiments, we don't have science. Post-hoc regression is not the same as an experiment. No offense, but most of the "science" in these fields is, in my estimation, nothing more than a Roman soothsayer making sure to get the incantations and the colors of his robes just right. We just have no fucking clue why stuff happens, for the most part, just like them.
So we aren't led into error by well-known cognitive biases and/or outright fraud, both of which are known to exist as they can be demonstrated through repeatable experiments.
Or are you saying something else?
Social fields lead to people questioning local power systems, like states. These power systems happen to also fund research.
Take economics for instance. A rational curriculum would help people invent more advanced economic systems. Textbooks might try to enumerate many possible kinds of economic systems (encouraging students to do better), and we might spend at least billions experimenting with radically different alternatives. We'd encourage diverse researchers, particularly from groups suffering under the current economic regime. (As they'd naturally have interesting critiques.) The only thing like this I've seen is Z's "economic vision instructional", which is outside the university system.
But unfortunately, we're in a universe where economists basically work within a religious priesthood, to constrain solutions. Universities may tolerate certain ideological rivals to state religions, but within bounds. (So capitalist states may be fine with top-down forms of Marxism.)
Even the hard sciences are under attack. Take (http://disciplinedminds.com/). However, states see the sciences as important to their economies. And the problem of the soft social "sciences" is that science doesn't easily apply to complex systems like psychology. There are other modes of inquiry than just science. But social pseudoscientists (like economists) naturally want the hard-won respect accorded to hard sciences.
He said that Christianity (not Judeo-Christianity, iirc) is a slave morality. It preaches equality, N. says, and only a slave or an underling would preach equality; they do so because they have less than their masters. So in that way it is a self-interested morality (if you accept it was advocated by an underclass). He contrasts this with a master morality (might is right, basically).
What is ethics or morality? It's what values you hold and project onto the world, correct? So I guess N. is saying that these values are subject to your socio-economic place in the world. That's quite materialist -- like Marx, who said more or less the same thing except that Marx said that the material conditions of your (as a class) existence shape your ideology. I think the text to look at for N. is "On The Genealogy of Morals" and the text for Marx is "The German Ideology".
Regarding the provenance of deconstruction. I'm not about to read "Of Grammatology" any time soon, but you are the first person I've heard assert that Heidegger should be given credit, so you need to, you know, cite your source please.
Derrida read a lot of Heidegger, and even wrote some essays on him before Gramma-whatever. Heidegger used the German term "Destruktion" for the dismantling of the basic ontology of Western society. Go look that up too.
The only bit of Heidegger I'm going to read is the part where he talks about technology. Other than that I'm going to give him a wide berth because I don't like his writing style. He can be as profound as he likes and he can destruct whatever ontologies he's into, but after having flipped through a bit of Being and Time I have to say no thanks.
And in fairness destruktion is not de_construction_. You still haven't told me where to look it up though. Chapter and verse and all that, no?
edit: ok, so FreakLegion hereabouts has some pointers and I used my Google-fu and got this which looks helpful. But even that says that Derrida got inspiration but that the terms are not used in the same sense.
I don't have time to read all of B+T because (a) I'm not into the style, and (b) it's a major work that is not related to my studies. So I just want the part that talks about destruktion but it's ok FreakLegion told me so you don't have to.
This is exactly where anyone coming to Heidegger for the first time ought to start. Unfortunately, for many people it's also infuriatingly difficult to unpack, due to Heidegger's style. For example:
"The discussion of Being as just that -- Being -- still speaks an inadequate language, insofar as, in our perpetual references to Being itself, it is addressed with a name that continues to talk past Being as such. In making this remark, we are voicing the assumption that Being -- thought as such -- can no longer be called 'Being.' Being as such is other than itself, so decisively other that it even 'is' not."
This obscure ditty is from Nietzsche, Vol. IV. The key, as you point out, is to focus on the 'is' and, equally importantly, the 'as'. Replace "Being" with "the hammer" or some other object and things start to clear up, at least a little bit. The insight is that that thing in the carpenter's hand exists 'as' a hammer (for the carpenter, for you, for me), but it 'is' not a hammer; the 'is' of a thing is simply an 'as', a way of being in a world. A world or a way of being in it can change, and then the hammer 'is' something else.
A seemingly small insight, to be sure, but it's Heidegger's point of departure, and has a wide range of practical applications in later thinkers, even ones working on the sciences (e.g. Bruno Latour and others doing "sociology of science"). Deconstruction -- destruktion is Heidegger's word -- is on balance the practice of resisting what is given, of not falling "into the grip of ideas already decided beforehand". The clearest statement of this idea in Being and Time is I think in §32, which starts on H149. This is the starting line for basically all contemporary theories of interpretation, from Gadamer to Ronald Dworkin.
1. Of course there are reasons for Heidegger's writing the way he does. Whether or not they're good ones is a whole 'nother discussion.
2. "Language", from Poetry, Language, Thought.
I have learned to see that these very terms were bound to lead immediately and inevitably into error. For the terms and the conceptual language corresponding to them were not rethought by readers from the matter particularly to be thought; rather, the matter was conceived according to the established terminology in its customary meaning.
In short, he tried to say something new using all the old terms, and that allowed readers to assume something about the meaning of those terms rather than work it out for themselves. So, to get his point across, Heidegger starts looking for a new way of saying, one without shortcuts, one that forces readers to think things through.
Did it work? In a sense. Of course it also ensured that a lot of people will never bother reading Heidegger. For me, I simply treated Heidegger as a foreign language and learned to speak it. Once you're over the hump, he's not wildly difficult, and the reward, I think, is not unlike what you get out of learning, say, a functional programming language. It gives you new eyes, rather than just showing you some new thing.
1. 'Term' is often used with a special meaning in Heidegger. From On the Way to Language (with a few bits omitted): "Everybody sees and hears words. They are; they can be like things, palpable to the senses. To offer a crude example, we only need to open a dictionary. It is full of printed things. Indeed, all kinds of things. Plenty of terms, and not a single word. Because a dictionary can neither grasp nor keep the word by which the terms become words and speak as words."
Here is my take on deconstruction, based on an MA in anthropology from a good university and some further interest and reading.
Here is my take, which is a "meta" idea about institutions rather than a direct comment on the merits of deconstruction; I'm writing this when I should be writing my dissertation in English Lit, but so it goes:
Since at least the 18th Century, writers of various sorts have been systematically (key word) asking fundamental questions about what words mean and how they mean them. Dr. Johnson is a good place to start with this. We eventually start calling such people "critics." In the 19th C this habit gets a big boost from the Romantics and then from people like Matthew Arnold.
Many of the debates about what things mean and why have inherent tensions, like: "Should you consider the author's time period or point in history?" or "Can art be inherently aesthetic, or must it be political?" One can formulate others. Different answers predominate in different periods.
In the 20th C, critics start getting caught up in academia; the idea of the research university gets started in Germany, then spreads to the U.S. through Johns Hopkins in the 19th C. Professors of English start getting appointed. In research universities, professors need to produce "original research" to qualify for hiring, tenure, and promotion.
So English professors move from being primarily philological in nature towards being critics. The first people to really ratchet up the original-research game are the New Critics, starting in the 1930s. They predominate until the 1950s, when Structuralists seize the high ground. In the 1970s, Deconstructionists (otherwise known as Post-structuralists) show up.
In each case, newly-minted professors need to get PhDs, get hired by departments (often though not always English), and earn tenure by producing original research. One way to produce original research is to denounce the methods and ideas of your predecessors as horse shit and then set up a new set of methods and ideas, which can also be called "assumptions."
But a funny thing happened to the critical-industrial complex in universities starting around 1975: the baby boomers finished college. The number of students plummeted. Colleges had all these tenured professors who can't be gotten rid of. So they stopped hiring (see Menand in particular). And they never really did so en masse again.
At the same time, in the 1980s and 1990s court decisions struck down mandatory retirement (http://www.slate.com/articles/life/silver_lining/2011/04/ple...). Instead of getting a gold watch (or whatever academics gave), professors could continue being full profs well into their 70s or even 80s. Life expectancies lengthened too.
Consequently, the personnel churn that used to produce dominant new ideologies in academia ground to a halt. The relatively few new faculty slots went to people who believed in Deconstructionist ideals.
Deconstructionism itself has some interesting ideas, like this one: "he asks us to question not whether something is an X or a Y, but rather to get 'meta' and start examining what makes it possible for us to go through life assigning things to ontological categories (X or Y) in the first place" and others, like those pointing out that a work of art can mean two opposing things simultaneously.
The problem, however, is that its sillier adherents—who are all over universities—take a misreading of Saussure and Deconstruction to mean that nothing means anything, except that everything means that men, white people, and Western imperialists oppress women and everyone else, and also capitalism is evil as long as we're at it. History also means nothing because nothing means anything, or everything means nothing, or nothing means everything.
There was some blowback against this (Paglia, Falck, Windschuttle), but the sillier parts of Deconstructionist / Post-structuralist nonsense basically won, and the institutional forces within academia mean that that victory has been depressingly permanent.
The people who would normally produce intellectual churn have mostly been shut out of the job market, or have moved to the healthier world of ideas online or in journalism (http://chronicle.com/article/The-New-Economy-of-Letters/1412...), or have been marginalized as much as possible (Paglia).
The smart ones also go for MFAs, where the goal is to produce art that someone else might actually want to read.
Few important thinkers have emerged from the humanities in the last 25 or so years. Many have in the sciences (http://edge.org/).
Works not cited but from which this reply draws:
Menand, Louis. The Marketplace of Ideas: Reform and Resistance in the American University. New York: W.W. Norton, 2010.
Paglia, Camille. “Junk Bonds and Corporate Raiders: Academe in the Hour of the Wolf.” Arion Third Series 1.2 (1991/04/01): 139-212.
Paglia, Camille. Sex, Art, and American Culture: Essays. 1 ed. New York: Vintage, 1992.
Falck, Colin. Myth, Truth and Literature: Towards a True Post-modernism. 2 ed. New York: Cambridge University Press, 1994.
Windschuttle, Keith. The Killing of History: How Literary Critics and Social Theorists are Murdering Our Past. 1st Free Press Ed., 1997 ed. New York: Free Press, 1997.
Star, Alexander. Quick Studies: The Best of Lingua Franca. 1st ed. Farrar, Straus and Giroux, 2002.
Cusset, Francois. French Theory: How Foucault, Derrida, Deleuze, & Co. Transformed the Intellectual Life of the United States. Trans. Jeff Fort. Minneapolis: University Of Minnesota Press, 2008.
Surely the point is that the meaning of the text is dependent on the context supplied by the reader. So for an analysis of a text to mean something (rather than nothing), it must not only define the subset of readers it is confined to, but also offer some justification of why it assumes those readers supply a common context (and what that common context is).
Without this context an analysis of a text is a largely a polemic unless the analysis is so absurd (or dissident) that it becomes a satire. Assuming a wide definition of "a text" then the majority of the media have been effectively "saying nothing" for the last 30+ years and academia have been satirising them.
I think that's a key point about "postmodernism". It spends a lot of time speaking of "text" as power play, but its continued orthodoxy in academia is itself a deliberate power play. As a result there's a lot of group-think and closing ranks.
He's a major influence on post-structuralism and is generally accepted as such.
That may be, but it makes one hell of a useful working assumption.
(seriously) Can you rephrase your question after skimming this Wikipedia article?
edit: I find it almost impossible to have longer conversations about this topic without knowing a little bit more about the order in which the reader first encountered each writer.
I think that is an important point, actually. Both f'ing brilliant.
He was not just another member of the National Socialist German Workers' Party, he was Rector of Freiburg University while Hitler was gambolling about Europe. Is "complicated person" what we're calling them these days? I can think of other words...
There are two ideas of deconstruction I find useful.
The first is basically Nietzsche,...
If it's from Nietzsche, then it isn't from deconstruction at all, so why attribute it to them? If instead you are saying that these are all ideas of deconstruction, then you're being arbitrary, I think. Walter Kaufmann, N's greatest interpreter and translator, attacked academics who did what I think you're doing here. They would perform exegeses of N's texts and then, oh happy day, find that N. supported their ideas in a preliminary way, then cite him as an authority and attach him to their ideology.
So when the Libertarians have an "objective" system that shows that markets are best in an objective way... they are all just protecting their jobs.
Umm, what jobs? The fraction of libertarians who get remunerated for their political beliefs is damn small. Since libertarian views are outside the mainstream, in some environments there is actually a real danger of being punished for refusing to conform to the norms of whatever sub-group the libertarian finds himself in. It's much easier in life to run with the herd. You don't end up being lunch as often.
The deeper problem with your comment is that you seem to be implying that N. was a relativist. He wasn't. See John T. Wilcox, Truth and Value in Nietzsche: A Study of His Metaethics and Epistemology, for a solid refutation of this (common) claim.
Every metaphysics hides an implicit morality of the universe, and every morality hides an implicit and very down-to-earth will to power, and this will to power is a good thing and the drive of Life itself manifesting in human ideas. N's big example is Judeo-Christianity, and how by inventing a selfless morality a bunch of folks took over Western Civilization in very concrete ways.
I could "deconstruct" this all night long, but I'll keep this short.
1) N's ideas on the will to power were never fully worked out and settled, so it's difficult to say with certainty what he believed in many cases regarding W2P. That said, he seemed to believe that...
2) W2P isn't good or a bad thing, it just is, like gravity. Our moralities are reflections of some inner drives, all of which are manifestations of W2P. Those moralities that derive from an inner sense of fullness and overflowing N. judges to be good. Those moralities that derive from an inner sense of deficiency, self-loathing, and especially revenge, N. judges to be bad.
3) The early Jews and Christians were heavily persecuted by (variously) the Pharaohs and the Romans. The "slave moralities" they invented were quite different from the "master moralities" of the aristocratic Greeks. Slave moralities are motivated, at their core, by a drive for revenge against the people who brutalize them. Since they are slaves, they don't have power in this life to take their revenge, so they invent another life (the afterlife, the Thousand Year Reich, the Age of New Socialist Man, the Dar al Islam) in which they will enjoy pure happiness and 76 virgins and the "evil ones" (Pharaohs, Romans, Jews, Capitalist Exploiters, Infidels, the One Percent) will suffer (the Fires of Hell, Zyklon B, beheadings, property nationalization, starvation by Maoists (my wife had fun with this one), execution by the Khmer Rouge, etc.)
It's fascinating to watch all of this play out in America today, with Obama explicitly calling last October for his supporters (the core of which are the descendants of slaves who were brutalized by their owners) to "vote for revenge", the coming glorious age of Hope and Change, the coming destruction of the evil One Percenters (as soon as Obama and his fellow One Percenters finish their golf games on Martha's Vineyard), etc. [I suppose we should cheer up. I mean it's not like he's running a massive surveillance state and murdering American citizens and foreign innocents via drone strikes without judicial review while his propaganda media manufactures consent for his policies.]
And now I too should get on with my life...
He was rector for a year and then resigned. He kept his party membership current though. He spent the last years of the war digging trenches, so he wasn't inner circle in the least.
Be sure also to read his and F. Randall Farmer's paper on the subject, and related collections of anecdotes, if you're at all interested in the history of virtual worlds.
I've studied history, art history and computing science at graduate level, and have to say I agree 100% with this engineer's statement: "The quality of the actual analysis of various literary works varies tremendously and must be judged on a case-by-case basis, but I find most of it highly questionable. Buried in the muck, however, are a set of important and interesting ideas [...]"
As a trolling aside, I found the computing science degree far easier than the arts ones...
This seems to be the opposite of the book's spirit. What's interesting is the authors they leave alone...
Ian Hacking has some engineering cred too given that he's written several great books about statistics that don't skimp on the math too much.
Picking out bad analogies and inapposite asides is not a good use of anyone's time.
- "Their talk" of science is not merely inaccurate, it is _disingenuous_
- This disingenuousness raises questions about the authors' sincerity (which the authors of Intellectual Impostures are at pains to limit to the scientific areas, but could well be extended)
- They do not "dismiss the work" of the authors
- They are at pains to try and work out what the authors might have meant (including unpicking "bad analogies" or "inapposite asides")
Once again I'm required to ask for evidence of this from someone likely not to have actually read the book closely.
But the point still stands, as you put it, that "the authors of Intellectual Impostures are at pains to limit [their inquiry] to the scientific areas."
What is the use of that? I mean, sure, it's fine for the Bruno Latour chapter, because Latour's work is actually on science. But what's the point of demonstrating that Irigaray mangles fluid mechanics and Lacan mangles topology, in books that aren't about fluid mechanics or topology? Who cares? Pointing it out isn't wrong per se. But it also, despite Bricmont's extensive handwaving, doesn't prove anything worthwhile.
This is not my supposition. I had these very arguments with literary theorists at university. They preferred to believe Lacan's mathematical credentials over mine.
How to publish a lot of gibberish in one of the "important" philosophical journals.
Not that this means deconstruction is 'stupid'. Only that one has to take it with a grain of salt. At least one.
If you've ever had the misfortune of having to review submissions for non-major CS conferences, you would know that there is an amazing amount of bullshit even in technical fields.
The only thing I've found shocking about the Sokal affair, or this article (and the many others like it) is the obsession with 'proving' the humanities to be bullshit. It is an incredible and intellectually lazy pursuit of the idea that: If I don't know it, it's not worth knowing.
Moreover, Sokal didn't target "the humanities", he targeted postmodernist "critical theorists". As a scientist it pissed him off to see careerist intellectuals appropriate the terminology of science (like the term "theory") in an effort to lend weight and credibility to their work while utterly vacating it of its scientific principles. You give way too much credit to postmodernism in broadening it to all of the humanities.
Retellings by Sokal fans make it seem as if they received it breathlessly as great work and rushed to put it into publication. That is not what happened. The Sokal Affair is less evidence for the fuzzy-headedness of litcrit and more evidence that you can often get what you want just by lying, being a dick, and having some kind of finished product for people to work with.
And Sokal's submission wasn't some fluke, it was in the ballpark of a lot of the heady self-referential fluff in that field.
Only if you're very insensitive to tone, style, vocabulary and methodology does the Sokal piece resemble any legit article in Social Text. From the descriptions of it you'd think Social Text was full of maniacal gibbering about how nothing is true, everything is permitted, and quantum mechanics is a tool of the phallologocracy. But in fact, the quality of work is far more pedestrian, if a bit wordy. Here's the table of contents for the latest issue (articles behind paywall, abstracts are free):
Is it self-referential nonsense? Maybe! Does it bear a reasonable surface resemblance to "Transgressing the Boundaries", if we're not being deliberately obtuse? Not to me.
the terminology of science (like the term "theory")
The term theory pre-dates empirical science by a good bit, and the litcrit usage is not really orthogonal to the scientific one anyway.
That is how disconnected they were from anything resembling a "methodology" that could detect basic falsehoods nestled amidst pretentious language and reverential citations of suitably fashionable authorities.
Scientific theories generally yield testable hypotheses. Name one testable hypothesis that's ever come out of "critical theory".
People don't like to admit they've been conned. Anyway, sincerity is a separate issue from quality. The editors believed the paper was sincere, just not very good.
Alternatively, they thought it was "a little hokey" but assumed sincerity, and ran it because they had an opening and were too lazy to find another scientist willing to write for a humanities journal.
It's not that critical theory is necessarily good. It's that Sokal's prank proves more about the efficacy of social engineering hacks than the badness of critical theory.
Also, I always found the Sokal affair a bit stacked against the offending journal. The issue was "Science Wars", that is, confronting the scientific vs. the postmodernist view. If they were to edit what were presumably contributions from the side of science, it would defeat the purpose somewhat.
Your sentence "Not that this means deconstruction is 'stupid'." could be interpreted:
"One has to take at least one grain of salt when certain critical techniques are used."
Depending on how one reads this entire post, that may or may not be considered a charitable interpretation.
When Wittgenstein finished deconstructing the entirety of existence in his Tractatus, he reserved the last sentence to tell himself to shut up.
First, from the standpoint of intellectual history, it's probably most useful to cast conversations like this in terms of the split between analytic and continental philosophy (esp. postwar developments). The fact that he doesn't really mention the split and only talks of literary criticism is a very odd omission.
Second, I applaud the author for taking the tradition that he's poking fun at as seriously as he does. There are scads of philosophy professors who simply sneer at those from the opposing tradition. This is a shame.
Third, and related to the point above, it's a breath of fresh air to read someone who can genuinely straddle the two traditions. Richard Rorty is probably the best known philosopher that this can be said for.
Fourth, anyone interested in the ideas that underpin deconstruction should take a look at Saussure's "Course in General Linguistics". It's an utterly fascinating read that forms the basis for a huge amount of work in both contemporary linguistics and postwar French Continental philosophy.
Fifth, anyone who dismisses the techniques of one of the world's greatest mathematician/logicians as a "cheap trick" without so much as a winking emoticon is immediately suspect in my book. ;)
It isn't on deconstruction specifically but on why literary criticism has become so important in general.
"Redemptive truth would not consist in theories about how things interact causally, but instead would fulfill the need that religion and philosophy have attempted to satisfy. This is the need to fit everything--every thing, person, event, idea and poem--into a single context, a context which will somehow reveal itself as natural, destined, and unique. It would be the only context that would matter for purposes of shaping our lives, because it would be the only one in which those lives appear as they truly are."
He appears to think this is possible; I think it's a dangerous illusion. I don't think there is any one, single context in which our lives "appear as they truly are". Our lives are too complex for that.
In fact, physics is too, although Rorty doesn't seem to realize that. He says that scientific inquiry could conceivably terminate; apparently he thinks that physicists who talk about a "theory of everything" mean that once we've found it, we won't need to do science any more. That's a dangerous illusion too. No matter how much we discover about the universe, there will always be more to discover, and that's what scientific inquiry is.
Furthermore, even with the stuff we do know in science, there is always the possibility of finding new ways to look at it. Feynman once said that every theoretical physicist who is any good knows multiple theoretical representations for exactly the same physics. Each representation is useful in its own way, and no one of them reflects "the way things really are". Even putting them all together doesn't necessarily reflect "the way things really are". All of our knowledge is incomplete, even in physics.
And if that's the case for physics, it's the case even more so for ourselves and our societies. We should not expect to ever be able to understand ourselves using one single context, any more than we can understand physics using one single context. And as Rorty describes it, the search for this one single context by intellectuals (which he defines, reading between the lines, as people who can afford to waste time in such pursuits because they don't have to do any productive work to make a living) is, in my opinion, a sad waste of human talents that could be put to better uses.
I think you have completely misunderstood him. He is arguing that this is not possible. The whole point of the article is that there is no redemptive truth at all. The whole point of promoting the literary over the philosophic is to remove our need for redemptive truth. He argues that philosophy (and science) suffer the same mistake as religion in that they believe they are searching for Truth. Literature doesn't suffer from that mistake.
Perhaps you only read the first 2 or 3 pages?
"[A] culture which has substituted literature for both religion and philosophy finds redemption neither in a non-cognitive relation to a non-human person nor in a cognitive relation to propositions, but in non-cognitive relations to other human beings... This sort of culture drops a presupposition common to religion and philosophy—that redemption must come from one’s relation to something that is not just one more human creation."
But if the point of this is to remove the need for redemptive truth, it doesn't work; the search for redemption through literature still tries to fulfill "the need that religion and philosophy have attempted to satisfy", it just tries to fulfill it without appealing to non-human entities:
"For the Socratic idea of self-examination and self-knowledge, the literary intellectual substitutes the idea of enlarging the self by becoming acquainted with still more ways of being human. For the religious idea that a certain book or tradition might connect you up with a supremely powerful or supremely lovable non-human person, the literary intellectual substitutes the Bloomian thought that the more books you read, the more ways of being human you have considered, the more human you become..."
Even at the one point where he does explicitly seem to say there is no redemptive truth...
"To give up the idea that there is an intrinsic nature of reality to be discovered either by the priests, or the philosophers, or the scientists, is to disjoin the need for redemption from the search for universal agreement. It is to give up the search for an accurate account of human nature, and thus for a recipe for leading The Good Life for Man. Once these searches are given up, expanding the limits of the human imagination steps forward to assume the role that obedience to the divine will played in a religious culture, and the role that discovery of what is really real played in a philosophical culture."
...he follows up immediately with something that contradicts what he just said:
"But this substitution is no reason to give up the search for a single utopian form of political life--the Good Global Society."
This is why I said that, although I understand where he's coming from, I don't agree with him. He claims to have "deconstructed" redemptive truth and put something better in its place; but all he's really done is fasten on to his own particular brand of redemptive truth. His belief in the Good Global Society meets the definition of redemptive truth that I quoted, and he appears to think that is possible; so whether he realizes it or not, he does think a form of redemptive truth, as he defines it, is possible. And he's wrong for the reason I gave: our lives are too complex for a "single utopian form of political life" to work.
Again, I would argue you haven't understood him. One of Rorty's main claims was Ironism:
I suppose you might not find it funny or clever that after 20+ pages of arguing there is no redemptive truth that he would share his opinions on a good life but don't think for a second he didn't realize it himself. It was the subtle wink at Nietzsche's "Thus Spake Zarathustra". Why doesn't Rorty get to share his own literary opinion?
However, as to the "utopia" that he describes in that article - I very much share your doubts and I feel he is unduly optimistic. However, the general idea of having a multiplicity of options and the liberty to pursue them is something I can agree with.
Realize what? That he was being funny or clever? Certainly, he can be both. That doesn't change the fact that he made a serious claim, and one which, whether or not he was being funny or clever, I believe he meant to be taken seriously (and you appear to believe that too--but see below).
Or are you saying that he realized that he was, in fact, advocating a new kind of redemptive truth, after he'd just said there was no such thing? In that case, I disagree: I don't think he realized that, because if he had, that part of the article would have been different; at least, it would have been if he is intellectually honest (and I have no reason to believe he isn't--but see below). Either he would have taken out the claim that there's no such thing as redemptive truth, or he would have taken out the claim about utopia, or he would have made some argument attempting to reconcile the apparent contradiction between the two.
You could, I suppose, be saying that he did realize the apparent contradiction, but it was OK because he didn't mean the claim about utopia to be taken seriously. If that's what he meant, then he would not, IMO, be intellectually honest, because he clearly believes that "having a multiplicity of options and the liberty to pursue them" is a desirable state for society.
Yes. That is the irony of ironism. The final point of Ironism is the self-awareness of authors to their own final vocabularies.
I think it is more than fair for Rorty to state his preference for social orders and explain his rationale for his beliefs. And he has as much right as anyone to expect his ideas to be taken seriously. But to suggest that the man believed he was relaying the Truth is to miss literally everything he stood for.
That is, I can say that man having liberty to pursue diverse lifestyles is a good state of social affairs without having to believe I am stating a redemptive truth.
I don't think he believed that; I think he genuinely thought he was "deconstructing" the idea of Truth and putting forward a way of dealing with the world without needing such a notion. I just think he was in error in believing that. See below.
I can say that man having liberty to pursue diverse lifestyles is a good state of social affairs without having to believe I am stating a redemptive truth.
Either you mean your statement to be taken seriously or not. If you don't, then yes, you're not stating a redemptive truth, because you're not stating anything. So I can just ignore what you say. (I assume that wasn't your intention, but of course I may be wrong, just as I may be wrong in thinking that Rorty meant his utopia claim to be taken seriously.)
If you do mean your statement to be taken seriously, then, whether you realize it or not, you are committing yourself to truth claims. Perhaps you would not describe them as "redemptive" truth claims, but that's quibbling over words. If you're committing yourself to truth claims, then you haven't deconstructed the idea of Truth, because we all live in the same world, so all of our truth claims about it have to be consistent.
Similarly, if Rorty means his claims about utopia to be taken seriously, then he's committing himself to truth claims (not least about the possibilities allowed by human nature). To say that, well, it isn't "redemptive" truth because he's allowing different people to have different worldviews, is, once again, quibbling over words. Everybody has to coexist in the same world, and his claim about utopia amounts to claiming that there is a One True Way to organize society that will somehow magically let everyone coexist without conflict. I think, as I said, that that's a dangerous illusion, but whether it is or not, it isn't something one can believe without having a notion of Truth.
After writing them down, I'm noticing they're all analytic philosophers who lean a bit continental. I wonder, what are some examples in the other direction?
Engineering and the sciences have, to a greater degree, been spared this isolation and genetic drift because of crass commercial necessity.
Unsubstantiated bullshit. Most theoretical backbones of math and science were discovered/formalized for anything but commercial reasons.
It is clear to me that the humanities are not going to emerge from the jungle on their own. I think that the task of outreach is left to those of us who retain some connection, however tenuous, to what we laughingly call reality.
You could start with your own field(s), which are full of mediocre paper-pushers milling around other people's theory, publishing uninteresting observations which, as long as they are "logical" and "correct", are shit into a godawful paper that no one will ever care about or read. "A Novel Method for Applying a Trivial Modification of an Already-Known Algorithm to Some Type of Specific Data" could be the title of about 35% of the CS papers in existence.
And if they end up serving no practical use they end up discarded and ignored by scientists and engineers (if maybe not pure mathematicians). The author's point is that there is no such selective pressure on lit crit.
Also, poststructuralist lit crit is not the be all and end all of "the humanities".
Litcrit theory is actually a pretty big commercial success. Derrida et al. sell quite well for academic books, and a Sexy Postmodernism Sex Theory Made Accessible to Undergraduates type course is a reliably decent draw for English departments, who need the enrollment.
Furthermore, if you consume entertainment products with shifting, non-linear narrative like The Simpsons, Coen Brothers movies, or Inception, you're exerting selective pressure in favor of postmodernism.
Practical use ≠ commercial use! Also, I have no idea where you got the idea of "discarding" research, especially in regards to pure mathematicians. What is "practical" to you?
No, it's not, and yet when I deconstructed the author's paper (I crack me up), I found that he made a generalization about all of the humanities following his limited experience with the given topic.
That's like reading a scathing Yelp review of a dodgy restaurant and saying "he hates food!"
Your analogy is like doing meth in a bathtub with a small chimpanzee playing a xylophone to the tune of, "Pachelbel's Canon" while shoving a sharpened pencil into my urethra to wash away the pain of dealing with ridiculous comments on Hacker News.
How does that contradict what the author said? His argument is that any field closed over itself will, over time, tend to distance itself from reality - obviously, the initial impetus of a field happens far too early to experience such effects.
I am saying that whether or not science and engineering are tied to reality has little to do with being relevant to the commercial world.
My claim: influence of commercial world is extremely unimportant for most of the important parts of Field X, ergo, commercial influence has had almost no bearing on keeping Field X "useful", regardless of whether or not Field X is actually useful.
Statement to support claim: the vast majority of what we consider 'science' (lumped in with engineering by the author) consists of a theoretical backbone tested against reality, which has nothing to do directly with commercial interests in theory, and often in practice. Examples include mechanics, EM theory, quantum mechanics, relativity, genetics, number theory, integral calculus. You would have a hard time proving that these things, which all have a basis in reality, are that way because of "crass commercial necessity", as the author claimed.
> A Novel Method for Applying a Trivial Modification of an Already-Known Algorithm to Some Type of Specific Data
Papers sharing empirical findings on applications of existing research can be very useful to those of us also, you know, looking to apply said research. Don't underestimate the amount of value (and legwork!) involved in figuring out how to adapt and apply theoretical work to real problems in some CS-related fields.
Obviously that kind of thing isn't accepted at the top conferences, so you know where not to look if you're not interested in it.
This is how science builds on itself. Every well-known paper has citations to 'lesser' works which laid the foundations.
Insert 3 credits and try again, dumbfuck.
* kicks back and waits for impending ban *
EDIT: Nice job editing your comment to remove the attack, you fucking dipshit.
If you're so much smarter than me, why do you resort to insults?
> Nice job editing your comment to remove the attack, you fucking dipshit.
I can edit to remove insults, but you can only edit to add them. What does that say about you?
And now you're making fun of molestation victims to score points on a website.
You have no idea what kind of history the people here have with molestation, and you're potentially dragging up some very painful memories just to score points on me. Why? People commit suicide over this stuff.
Here's the abstract:
"This is an investigation into some aspects of the way the C programming language creates meaning. In any formalized language, meaning is created by a tension between a community of speakers and the language's formal definition. In the case of C, this community preceded and presided over the formal definition in such a way that the formal definition itself embodies this tension. Because of this, C has a relatively unique view on how programming languages work, and how language in general should work. Specifically, I will argue that the 1989 ANSI C standard introduces the concept of abstraction by ambiguity into formal language specifications. Paradoxically, this ambiguity allows the knowledgeable programmer to be more specific than would otherwise be possible, while retaining the extensional benefits of abstraction. This has implications for philosophy of language in general, which I will briefly address.
"This work is situated in the emerging field of critical code studies (Marino). Although there has been related work dating all the way back into the 80s and 90s (Knuth; Winograd; Landow; Kittler), most studies that self-consciously look at code itself from a perspective that goes beyond computer science are a very recent phenomenon (Fuller; Chun). If much of my investigation seems overly broad, then, that's to be expected: just as a Polaroid photograph develops with broad splotches of color, only acquiring precision at the end of the process, likewise a new investigation must be satisfied with the faith that its clumsiness will be turned into precision with time. Many things herein are assertions with little corresponding argument, yet assertions which nevertheless to me seemed interesting enough to present for consideration in the hope that they might function as depth-soundings for future navigation."
Doesn't it just say "I am going to fire off some claims without any evidence supporting it, but since the claims sound good somebody should go and check if they're true just in case"?
Engineers get paid for not being wrong.
Ah, so much time wasted on my English Ph.D. (studying under Richard Rorty, at one point, starting the year before this was written). Could have just read Culler (which, actually, I did), instead of the entire tradition of continental philosophy going back to Kant.
I've also read Gödel, by the way. But really, that was all I had to do -- mathematics is all the same shit anyway.
I'm an English professor who occasionally teaches in the CS department. I am fully capable of detailing the absurdities in computer science (as well as great virtues).
Forgive the vent (which isn't directed at you); the article irritated the hell out of me. I have to say I expected a lot of "Yeah, literary study is garbage" in this thread. I was pleasantly surprised to see so much thoughtful engagement.
Funny thing, there is no way an average English Lit. major could tell if your average research-level maths paper is nonsense or not because it would take, you know, six years or more to get up to speed. Why it is that science and engineering types think that all they need to do is read a couple of primers and they're good to go is beyond me. I think it speaks to the arrogance and elitism of the scientific method as The One True Path To Truth.
I'll tell you the truth about truth. How long you got?
Vent more I say, this crap has real-world consequences -- humanities budgets are under fire. Whole flippin philosophy departments are being closed down because, you know, who needs philosophy anyway. Grumble grumble.
A great article, but the basic ideas have been reiterated many times within academia itself. The date is important - it's currently almost impossible to find an institution with a viable postmodern/deconstruction focus (outside of individual courses framed as a historical perspective and some oddball schools that aren't taken seriously in the field).
We've effectively 'moved on,' in many cases along with the critiques found in the article, although we've ignored some. New tools have been developed, yielding sometimes greater and sometimes lesser successes.
In the UK at least Derrida couldn't be more unfashionable, pure toxic waste if you want to publish and/or have a career as a philosopher of this school.
I say all this because the writer fails to mention that the tradition he is having a go at is essentially continental philosophy. Contrary to forkandwait's assertion on this comments page, deconstruction is attributed to Derrida, not to Heidegger. Derrida was influenced by Heidegger (and says as much himself, I believe) but Derrida is credited for this whole deconstruction lark. Literary critics working in the continental philosophical tradition will employ stuff like this. Others won't; it is worth pointing this out.
Analytic philosophers have a hard time decoding continental philosophers. So it is no surprise that people coming from even further afield have difficulty. I had a hard time initially, it took me a couple of years.
My take on deconstruction is that it is the technique of finding an internal contradiction in the text you want to refute and basically letting the text undermine itself. If it is anything more than this I'd like to know. I generally would never use the word deconstruct as it is too trendy and has become too intellectually charged for my liking. I prefer plain language. But I will use words like ontological when they need to be used but only then, if you see what I mean.
Finally, onto the claim of bogosity. I think maybe yes at times by certain windbags and lesser practitioners, but mostly I think no. In Derrida's case I would say no. I say this on the back of having read "Plato's Pharmacy" by Derrida, and let me tell you it blows a lot of other philosophy out of the water. It is telling our man read a secondary source on Derrida, and did not drink straight from the fountain.
It irritates me immensely that someone believes they can read a couple of secondary texts and then claim to know enough to rubbish an entire swath of thinkers in the history of ideas. Continental philosophy gets most of the brunt of this because of the prolixity and verbosity of their texts. But I ask you, why has our chap not read Derrida for himself? Hmm? Because then he would have to read all of Derrida's influences. And be aware of the currents of thought in which Derrida was swimming. And so on, back and back until you reach the ancient Greek thinkers. I mean analytic philosophers (less now than before) have reacted in very adverse ways to philosophical texts from continental philosophers so Chip has company. But you know, try harder I say.
One little final point. I think the reason the word "problematic" is used as a noun is because it is used as a noun in French. Correct me if I'm wrong.
I'm actually getting pretty tired of the condescending attitude from the science/engineering crowd. Some bits of philosophy are formal in the way that logic or maths is so it will never be empirical. And. So. What? Big deal that natural language is ambiguous but that's part of its beauty as well. I say if you approach what I do in such bad faith so frequently then it is your fields that are suspect, not mine.
It is, but this says it comes from a translation of problematische Urteile in Kant. At any rate, it wasn't used as a noun in French before 1936.
This is what I thought. Story goes. Key French texts are translated into English. They used "problematic" as a noun. Diligent students read texts, learn to use "problematic" as a noun and so it goes. Language changes, get used to it. I've seen tons of papers where someone argues about the problematics of this or that. I wouldn't use it so myself. But it may inevitably be on the way to assuming this new mantle.
My pet peeve is "foregrounds". Grr. I've a few more, I've always meant to make a list.
Describing the incompleteness theorem as "a cheap trick to frighten mathematicians" is hilarious as well.