Some cultures being known as more warlike signals, in my opinion, only their relative cultural poverty in other areas.
We're opposed to large organizations that empty our pockets before we start our careers and that hold out the empty promise of the life of the mind, finally revealing that it exists only for people whose parents are professors.
The modern criticisms of higher ed are similar to those that came up in the 1960s, but pecuniary issues are much more important now because higher ed has moved away from being subsidized by the government toward being financed by private loans.
It's just like every hotel moving up to the five-star level: unnecessary and expensive.
The notion of disruption is not inherently anti-intellectual. To say that whatever society or structure is under discussion has reached a local maximum, and must be torn down somewhat to be built back up, is not to say that the society is worthless or wrong at its fundament, but rather that sometimes change is not possible from within.
It is the opposite of conservatism (with the little "c").
Additionally, in the world of the internet, where a thousand flowers bloom and theories and movements are spawned and die every day, I put relatively little stock in each of the common wisdoms that spring up for their 15 minutes of fame and then fade into the abyss. Fads like the notion that experts are unnecessary are ridiculous and largely have no legs. And even to the extent that they do reflect some deep-seated feeling, I would more likely attribute them to an antipathy toward credentialism rather than to the notion that expertise is worthless.
And if you want to put a political spin on this, posts like these are exactly the sort of false equivalences that make moral argumentation impossible. Sure, geekdom has its share of charlatans and know-nothings pretending to be masters of insight, but that is different from demands of faith and fanaticism, where if you disagree with orthodoxy, not only are you wrong but you are forever damned and, in some cases, shunned by your friends and family.
I've listened to David Barton (the "historian"; yes, the scare quotes should indicate to you that he's not an actual historian) make the claim that the Bible lays out specific prescriptions on tax policy, and that if you vote for a party that raises taxes, you are not on the right side with God.
These are not the same caliber of argument, and I am extremely frustrated to hear them equated.
I'm not really sure what you mean here. Tearing down what? Books? Schools of thought?
For what it's worth, the conflation of some extreme "post-modern" views, such as "Experts do not deserve any special role in declaring what is known; knowledge is now democratically determined, as it should be," with some quite distinct, and by no means dependent, views questioning the value of forms of rote memorization was to me quite insulting. There is a world of possible nuances between views like these.
I am aware that the intention of that particular section was to provoke (clumsily, in my opinion). However, refusing to engage with the issues with intellectual honesty seems to me as anti-Enlightenment as anything he ascribes to the anti-intellectual movement he apparently sees.
That is not to say that reading "War and Peace" is wrong, or that it has no value, but that we should reassess whether reading "War and Peace" is somehow a metric of serious thought. This is the same sort of bullshit that people doing pop-culture research have had to deal with for years. Whether it's research into comic art (and yes, there are research libraries for cartoons; I spent three years working in one), contemporary art, or journalists using Twitter, their existence is not (inherently) a zero-sum game with the old order. It is a threat to orthodoxy, but it is not a threat to academic inquiry or knowledge.
It'd be like claiming that Martin Luther was a great threat to religion and faith because he sparked the Reformation.
I call myself an "Internet Knowledge Organizer." I started Wikipedia.org, Citizendium.org, and WatchKnow.org, among others. Now I am lucky enough to be able to work full-time on creating free materials for early education, which I am using with my two little boys and sharing with you.
Now, that all sounds pretty interesting. But does it sound like traditional academia? What are his formal credentials in Internet Knowledge Organization? What are his formal credentials in early childhood education? What does he publish in peer-reviewed journals in those areas? It sounds almost as if, despite no formal training in sociology, new media, online communities, pedagogy, etc. (all of which do have established academic fields, which he could've studied if he wished), he just brazenly went out there and started some projects. As far as I can tell, he didn't even read what had been written about those areas. That's cool, but why is he then all negative on the idea in general? If we wanted to apply some strict standard of expertise, Larry Sanger should be publishing in philosophy journals (an area he has formal training in), and staying out of online communities, education, and other areas in which he lacks academic expertise.
In any case, I'm a computer science academic, and in our field I don't see it as a new sentiment: the idea that you can be a brilliant garage hacker with no degree is decades old. I don't think it's that negative a relationship overall, either. It's not necessarily idyllic, but plenty of non-academics are interested in the work of academic research (algorithms, PLs, operating systems, AI). Even in industry, researchy stuff, like what comes out of Google Research and MSR, gets a lot of interest. It might help that some respect is given in the opposite direction as well: the heroes of plenty of academics include non-degree-holding garage hackers.
His premise isn't that making knowledge more accessible is bad. It is that the growing attitude comprising (among other things) "If it's on Wikipedia, I don't actually need to know it" is harmful.
What seems to be the consensus is that academia has not produced any revolutionary innovation for decades. Whenever practical innovation came, it came from characters as far from the 'standard intellectual stereotype' as could possibly be. As a result there is a haughty "keep debating while we get the job done" attitude, which may be mistaken for anti-intellectualism. In reality it is not that geeks dislike experts and intellectuals; it is that they've met too many would-be experts who failed their most trivial relevancy tests, so the term 'expert' raises a BullshitIncomingException almost immediately in any geek's mind.
But let's be clear: this is the case everywhere. Geeks have the same clan politics. The difference is rarely the politics; it's where you sit in the pecking order that changes your perspective.
I actually think the problem with academia in CS today is that it's too applied. They should NOT be focused on today's problems. The Cloud is not something one should do research on in academia. Industry will tackle the cloud from a million different angles. But what's after the cloud? What's after touch and Kinect? Precise mind control and HUDs in glasses/contacts really should be the mainstream in academic research today, but they're actually a fringe.
I think academia should be way out there more than it is. The geeks will protest that they do nothing of relevance, but that's how it should be. :-)
Generally folks who have seen far recognize the shoulders upon which they stand, but those who idolize them seem to see them floating in air.
What counts as "innovation" these days is what, LOLcats?
Very convenient. Also not far from the same-styled groups in the political spheres these days.
What most geeks are against, in terms of 'anti-intellectualism', comes down to the difference between X.400 and SMTP. You can build a somewhat functioning SMTP server/client in a day; try doing that with X.400.
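For illustration, here is roughly what the SMTP side of that comparison looks like with Python's standard library (the hostname and addresses below are made-up placeholders, not a real server):

    import smtplib

    # Speaking SMTP is a short plain-text dialogue; smtplib wraps it in a few calls.
    # "mail.example.com" and both addresses are placeholders for this sketch.
    with smtplib.SMTP("mail.example.com") as server:
        server.sendmail(
            "alice@example.com",
            ["bob@example.com"],
            "Subject: hello\r\n\r\nSMTP really is this simple.",
        )

There is no comparably short sketch for X.400, which is exactly the point.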
Those spouting anti-intellectual drivel often forget the well-worn, time-tested shoulders they are standing on.
Regarding Tim Berners-Lee, he is IMHO anything but an 'average theoretical physicist', which further proves my point.
The first is Marxist and postmodernist critiques of the classics. "Dead White Men" and all that jazz.
The second is the demoticism (not the democratisation) of the internet.
These are unrelated, in my view. The PoMx/PoMo criticism is more political than what it purports to critique. The change in demoticism is itself not a fundamental change but a rhyming riff on what has happened in the past.
Today the classics have never been more accessible. But just as when Penguin Classics was introduced, most people want penny dreadfuls, potboilers, comic books and a generous helping of porn.
The classics have their place as art and as sources of prestige, but if you're looking to learn things there are usually better places to look.
Not that I disagree with everything Mr. Sanger said, but the truth is a complicated beast.
Some people studying the history of science read the originals, but that's different.
I would take issue with that. The way kids are taught calculus in high school ignores everything Newton did, because it's not the most efficient way to teach kids how to solve AP problems. But without going back and learning about the questions that Newton was trying to answer, you don't actually understand how calculus works, even if you can do it by rote. So at best kids learn it for the test and then forget it because they don't actually understand how it works; more likely, they never really gain any ability to begin with.
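To make that concrete: the question Newton was actually asking (how to compute an instantaneous rate of change) is baked into the limit definition of the derivative, which AP-style drill lets you skip entirely:

$$ f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} $$

Memorizing the power rule gets you answers on the test; this definition is what makes those answers mean something.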
Sanger's essay is in praise of school and formal education, but in fact what's taught in school is more often than not just a showy pretense, a false economy. It's only when you leave the classroom and have time to really engage with the classics and other substantive works on your own that you can start to understand how the world actually works. Don't get me wrong: some form of formal education would be the best way to learn if it actually delivered what people said it did, but it doesn't. This whole essay is intellectually dishonest because high schools and colleges don't deliver anything remotely close to what Sanger is pretending, at least not for the vast majority of people who go through them. But then again, Sanger has been pretty much full of shit for the better part of a decade. If you actually go through and read his original writing on this topic, it's clear that what he really wants isn't a more intellectual culture, but rather a free pass to make up facts without citing any sources because he has a Ph.D.: http://www.kuro5hin.org/story/2004/12/30/142458/25
It depends, of course, on what kinds of things you are trying to learn. And that is precisely what is at stake here.
You definitely don't go to the classics to learn science or technology. But is that all there is that is worth knowing?
And don't be too hasty in your division of "fields that make progress" and "fields that don't." Our knowledge of Ancient Greece, for example, has grown substantially in the past century, and there has definitely been progress in historiography. But that does not eliminate the need (or, more to the point, the desirability) of reading the primary texts.
Edit: clarity. I never intended to say anything about the classics' value, only their past and future popularity.
(in hindsight, the former could easily have been an interpretation of the post)
The lessons learned from the classics are constantly quoted, and it's beneficial for people to read and learn them.
Also I need to hit him up for some pick-up advice.
they both have a wise man (Prof/Jubal),
someone with ridiculous power to get them what they need (Mike PC and the Frenchman/Jubal and Mike Smith),
someone rallying to the cause (Wyoh/Ben),
and someone who is central to the cause but is initially reluctant (Manuel/Jill)
and it's a fight for rights of the powerless over the incumbent rulers. They both have a rebellious feel, though "Moon" is more blunt about its political statement (anti-war, small government).
My argument was that we aren't destroying the silos because we hate the information, we're destroying them because we hate the silos. And if that's beneficial, who cares if most people, who wouldn't have otherwise read Russian literature or listened to Mozart, still don't?
I mean, Google is obviously supplanting the manual for a lot of things, but the point is much the same. If people read the fucking manual, they'll have a lot fewer problems. And a lot of the classics are pretty damn good manuals for life.
It's important to know of the past, but many who use this argument are simply afraid of the present, and are attempting to denigrate new ideas and discoveries.
So, I do think it is an accurate phrase, but misunderstood.
But here's a partial response, focused strictly on the phenomenon of college education:
It would be easier to embrace academia if the inflation-adjusted cost of going to school were stable or decreasing, rather than growing at 6.8% per year between 1987 and 2009:
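(A quick back-of-the-envelope on what that rate compounds to; this is just arithmetic on the figures above, assuming steady annual growth:)

    # 6.8% annual growth sustained from 1987 to 2009 (22 years)
    print(1.068 ** (2009 - 1987))  # ~4.25

So in real terms the cost roughly quadrupled over those 22 years.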
With a few exceptions (gods bless MIT OpenCourseWare and the Khan Academy), the academy does not scale, nor does it even try very hard. The Internet gets more inclusive every year. The academy does not. As economies grow around the world and the web gets built out, we add more and more people to our public discourse, and the percentage of those people who've gone to Harvard just gets smaller and smaller.
It would be easier to embrace academia if a young academic had more than a snowball's chance in hell of finding an academic job. Nostalgic baby boomers can look back at a time when new colleges were going up at a great rate and they were all hiring staff. Not so much anymore. I know lots and lots of people who would love to do laboratory research, for example... in a world where it didn't involve an enormous pay cut, enormous career risk, incredible devotion, high stress, difficult politics, and potentially the sacrifice of one's family life.
It would be easier to embrace academia if academic research, much of which is paid for with public funds, were published in open journals that all taxpayers could read for free, instead of in expensive private journals that only other academics can access for "free" via their university libraries.
And a lot more people would be touting the virtues of college degrees if the unemployment rate for college graduates weren't at its highest point since 1970:
It may not be entirely the intellectuals' fault that college education costs more than before and can take longer than ever to pay off, but you can't expect people not to complain about it.
AFAIK college education does pay off, handsomely, even now. But if tuition keeps growing and wages do not, shouldn't we feel free to wonder, out loud, how much longer this can remain true? Isn't that what a numerate intellectual might wonder?
In principle I agree with this point; in practice it is moot. Any member of the public can visit a local university or large public library and get access to any book or journal article they desire. No, they can't read the articles without leaving their office, but, yes, they can get all of the research articles and other scholarly publications for free. Libraries exist for a reason; they still provide the service they advertise. The publishing system is flawed and broken, but this particular criticism of it is false.
> I think this essay's definition of anti-intellectualism is incoherent -- must I disrespect knowledge itself, disrespect all education beyond grade school, disrespect War and Peace, or merely disrespect Nicholas Carr, in order to be labeled an anti-intellectual? -- so it's hard to make a definitive response.
As in a discussion of religion, it depends entirely on how you scope the word "intellectual." It could and does mean a variety of things. I believe the author of the original article was leaving it open to many meanings. This is not good writing. He should have said what he meant, instead of merely gesturing at it.
> in a world where it didn't involve an enormous pay cut, enormous career risk, incredible devotion, high stress, difficult politics, and potentially the sacrifice of one's family life.
Those are all fair statements on the cost of pursuing a life in academic research. However, only the pay cut really differentiates academia from any other competitive field. People engage in academia because they love it, not because they get rich.
> As economies grow around the world and the web gets built out, we add more and more people to our public discourse
Interesting factoid: the amount of academic publishing has pretty much only gone up in the last century. So even discounting the Internet, public discourse (and I do believe academic publications are public) has been growing.
I think it's a bit sad that I have to drive all the way from Buenos Aires to Texas if I want to read a journal from a university in the US, even if it wasn't my tax money that paid for it. Especially considering how positive and easy it would be to make this knowledge publicly available.
When I was doing more research I did this occasionally and it worked out well, especially for newer papers.
Cornell's libraries are gradually disappearing; they quit buying CS conference proceedings in 2008, and 95% of the books in the physical sciences and engineering will be going into long-term storage, where you can only get them with a library card.
Recent literature will only be available from a computer connected to the school's internal network, and there may come a time when all of those computers are behind a locked door and need a password.
As for "people engage in academia because they love it" I've seen at elite schools that 80%+ of the researchers are people whose parents were also researchers who started drilling into them with a young age that their life would be complete failures if they did anything else. Often they'd run so ragged publishing and perishing that it's hard to believe that they enjoy anything at all.
This has not been entirely my experience. Unfortunately, I only have a half-remembered anecdote rather than data, but what I've found is that journals in some university libraries are becoming less accessible in paper form. Instead, one is directed to a JSTOR subscription or such. Of course, in order to access the electronic version, one must authenticate to the network with a university ID. This erodes the effect you describe.
Can anyone else confirm or refute this trend? Am I merely imagining it?
Have you ever visited tvtropes.org? Don't go if you don't have the rest of the day to waste. It is seriously addictive, mostly because it's a sterling example of hypertext in action, maybe even better than Wikipedia. Every article is linked to the others. Every citation has a link. You want to check a reference, you click. You want to surf a chain of references N generations back, you click N times. You can't not follow the links on tvtropes.org, unless you simply hate all pop culture.
It is the height of irony that HTTP was originally invented to link up academic publications, because when last I surfed academic journals, six years or so ago, it didn't work that way. You would click a journal article, maybe click through a couple of crappy login screens, maybe navigate up and down some menus on some journal's CMS, and finally find the article after a bunch of surfing. It would have footnotes. The footnotes were almost never links. Instead you had to manually type their key information into your local library's website in some other window, and then hope that your local library had the journal, then surf through some more screens and up and down some more delightful CMS cruft before you found the article.
Smart folks would download PDFs of everything as they went and then file them in some personal database like EndNote. But they still wouldn't be cross-linked. You had to spend time organizing them, or use search a lot.
I assume that this workflow has improved at least a bit since then. Maybe EndNote or its competitors are smarter, maybe journals have finally started publishing hyperlinks. Maybe it has even improved to the point that academic literature is half as easy to surf as, say, tvtropes.org. In which case I take back much of what I said, and we can go back to estimating how many people can actually be expected to access the blessed terminal where all of this can happen. But I bet it hasn't improved that much. I bet it's still impossible to construct a hyperlink from Nature directly to Science, mostly because they've got different and incompatible DRM.
Mind you, things are heaven compared to when I was in grad school, when following references from a paper meant going into the stacks, laboriously collecting one heavy paper volume per footnote, physically hauling them back to a photocopier, and copying the articles two pages at a time. Oh, the hours I spent. Life is much better now. But, you know, young people don't know or care about how hard I had it. All they can see is that it's fifty times easier to research the plot of any episode of Neon Genesis Evangelion than it is to follow a chain of citations from a Nobel-winning publication back two generations, and that a popular article about a videogame from 1980 can link to its sources but a popular article about a publication from last Wednesday's Nature can only link to a paywall.
Science is always going to be a hard sell compared to the rest of our culture, so why do we make it even harder?
It is. Try Google Scholar, PubMed, or academic.research.microsoft.com for starters, and Mendeley for building a personal "library" (note: Mendeley isn't perfect, but it is a lot better than not using a paper manager). Things aren't great, but they are better.
> I bet it's still impossible to construct a hyperlink from Nature directly to Science, mostly because they've got different and incompatible DRM.
I have no idea; I don't read Nature or Science. The computer science literature I read is all easily searchable these days.
> Mind you, things are heaven compared to when I was in grad school, when following references from a paper meant going into the stacks, laboriously collecting one heavy paper volume per footnote, physically hauling them back to a photocopier, and copying the articles two pages at a time.
I still do this (but for Religious Studies research in my spare time) and don't really mind it, although I try to avoid photocopying when possible as it is a pain.
> All they can see is that it's fifty times easier to research the plot of any episode of Neon Genesis Evangelion than it is to follow a chain of citations from a Nobel-winning publication back two generations, and that a popular article about a videogame from 1980 can link to its sources but a popular article about a publication from last Wednesday's Nature can only link to a paywall.
I am usually against the idea of progress, but in this instance I do believe this will not be the case forever. Already we are starting to see changes in the publishing industry. Search tools are improving, archiving tools are improving, library collection tools are improving. Yes, it is slow, but the problems they are solving are much harder than the problem of generating new content. New content can be put in the day's leading format (papyrus, scrolls, codices, books, pamphlets/newspapers/magazines, websites/blogs/wikis) at zero extra cost. Old content, however, costs a lot to update. Libraries and academic journals have a lot of content that existed before the web, before the PDF, and before the wiki. The format transition takes some time, but if the past is any guide, the old formats eventually become historical curiosities rather than knowledge repositories.
Unfortunately, I was never a CS guy, I was a cancer researcher. And if you can't read the Nature journals you're not going to keep up with medical or genetic research -- everyone in the field openly lusts after Nature publications; Nature gets first crack at every manuscript -- and Nature sits firmly behind a paywall.
Let's try your helpful suggestions on the legendary paper "Viable offspring derived from fetal and adult mammalian cells", a.k.a. "the Dolly-the-sheep paper", Nature 385, 810-813 (27 February 1997). Surely I can write a blog post in which I link directly to the methods section of this famous work, which contains the first-ever working recipe for cloning an entire mammal? I mean, this is from 1997. We even had the web in 1997. Don't try to tell me that this has to be painstakingly transcribed from some faded papyrus.
Pubmed has the abstract, of course -- they have abstracts for everything:
And Microsoft provides lots of cute metadata about this paper, but no actual contents:
...so if I want to talk about the methods section, it looks like I'll have to tell my blog readers to click through to, e.g., the Nature Publishing Group and pay thirty-two dollars for the privilege:
Or will I? Google Scholar to the rescue:
The first link is to a reprint of most of the article, in a book. And they've managed to include every page. Today, anyway. For me, this time. (Other pages of the book are "not included" in the preview, so maybe others won't be so lucky?)
Oops, except the figures and tables aren't there. Sure hope there wasn't any data in those figures and tables.
So I'm saved, sort of, thanks to Google's big scanning budget and even bigger legal budget, provided I want to cite the words of Wilmut et al, and not their data. Or their slightly-more-obscure references, or any of their many citations -- for those, I will have to repeat this dance, and hope I still get lucky.
I know I've got a bit of an idée fixe here, but I cannot get over how utterly lame this is. My god, it is lame. Much of the best work of modern civilization, the pinnacle of hundreds of years of scientific advance, work in which our society invests billions of dollars, is hidden inside this pitifully obsolete rent-seeking system while we use state-of-the-art infotech to exchange pop-music lyrics and annotate famous games of Magic: The Gathering.
No, that is not thirty-two dollars per year, or even per month. That's one paper. You're probably better off springing for a subscription ($199), although that only gets you Nature: all the other journals in the Nature family, like Nature Medicine, are extra, and of course every other journal is also extra.
And the most infuriating thing: I'm already paying for most of the public research with my taxes.
At my local university library the online journal databases are only available to students and faculty. Members of the public can only read the paper journals.
To: "Experts do not deserve any special role in declaring what is known"?
I recently attended a paid training course run by the authors of some well known toolkit. They are, without discussion, the experts on their own software.
Nobody in the course got to 'hello world' before 2 PM, the instructors constantly bickered with each other about which tools the students should be using, and further discussion was limited to 'check our kitchen-sink app for more demo code'. Everyone on the course...
This is because having knowledge of a topic and being able to teach it to others are separate skills. Both are required; however, academics (people who spend their lives in academia and are used to learning things for the sake of learning, whereas most others are goal-oriented) tend to lack the latter skill, despite their expertise in the former.
This is not anti-intellectualism. This is about requiring teachers to have the ability to teach.
"Everyone on the course...." later revealed they were either glad they didn't pay for it themselves or regretted that they did.
As for the lightning speed of modern information technology, I must admit to being alarmed at that: I feel that my attention span has deteriorated so much in recent years that I find it very difficult to finish reading books. I haven't even finished reading this article (though I shall do so later). Certainly shortened attention spans + greater access to information -> absorption and retention of only a shallow subset of knowledge would be a great danger.
Frankly, I'd like to see lifehacker-type articles that propose ideas to rebuild attention spans and teach memory techniques, sort of blunting the anti-intellectual effects of prolonged exposure to the internet. I'd like to think that proper guidance, discipline, and training are the cure for anti-intellectualism. Perhaps that is something Confucius would have approved of.
How can there be a monopoly when there are hundreds to thousands of individual institutions competing with each other? Of course, college isn't free or unencumbered by some silly rules, but neither is buying an iPhone.
I personally feel the perception that they are, or are trying to cultivate, a monopoly is misguided, based on jealousy, resentment, etc.
Think of it as applied Pareto: find the top 20% who do 80% of the work, and learn from them so you can move into their ranks.
One of the most absolutely valuable pieces of 'research' I have done as part of my Master's is learning about systems that were designed and built before 1980, then learning how many of those ideas were recycled, or forgotten in the trash heap behind Unix's house.
It's the whole idea that someone thinks their answer is right because someone dropped two hundred grand on an education they could have gotten for a dollar fifty in late fees at the public library.
I hear it all the time: "Is he a good engineer?" "Yeah, he went to <who fucking cares>."
If that kind of thinking is anti-intellectual, then sign me up for the anti-intellectual camp.
Then you're doing it wrong. Like many things in life, what you get out is proportional to what you put in.
I'm currently an undergraduate and I am having an exceptional experience. Two summers ago, I worked with a professor one-on-one to build a compiler for a subset of Scheme. From last summer through to the fall of 2010, I was working at Intuit which is by no means the epitome of excellent software engineering, but I got to live in San Diego, experience a whole new culture, learn to sail, and experience software engineering in the "real world".
Finally, this summer and fall, I get to work with Olin Shivers (yes, him) and some erudite, poignant grad students.
Even in classes, you can take advantage of having an expert to ask questions of, and believe me, I take full advantage of it. I've spent hours inquiring about the LHC, Mandarin idioms, and the history behind and intricacies of English grammar rules during my time here.
The school system is an opt-in opportunity machine.
carpe diem, quam minimum credula postero ("seize the day, trusting as little as possible in tomorrow")
You might say that this is a learning experience on its own, how to cope with a heavy workload. But:
a) when I look at myself and at others, it's clear that at some point you start studying for the sake of studying, not understanding
b) I could (arguably) create more value for my life, and possibly the world, as well as learn to cope with pressure, by investing that much energy in other endeavors
Why would I read a book about a programming language when there is a set of APIs, tutorials, and articles that can convey the information in 1/10th the time? Why should I read a dozen chapters about the history leading to a new mathematical concept when I can just quickly read the research paper about that specific concept? Why do I have to read a hundred asides about what the author does on his days off, how he loves his wife and his dog, of what sort of stupid questions his students asked of him? I'm just here for information, and unfortunately books are rarely the most ideal way of getting it.
That's not to say that all books are bad and should be deprecated as a medium. I have always found, and will always find, good books enjoyable. However, to insist that they are an irreplaceable feature of the quest for knowledge is to ignore the reality of how the quest for knowledge has changed. In the end it's not about long, difficult texts; just difficult texts will often suffice.
Intellectualism is alive and well on the Internet; in fact, this blog post and the reaction to it are evidence. It's just that the traditional expert model doesn't scale into real economies. Many forces continue to work against it. As with other capital, intellectual capital will flow, like liquid, through the path of least resistance. If one seeks influence and participation in intellectual topics, why spend many years going through the motions of attaining credibility and standing when one can establish a presence much more quickly along other routes?
All of the objects and roles referenced in the article (books, experts, etc.) have continued value, and they won't disappear; they will just be augmented by technology.
Funnily enough, this seems to come more often from people who either never went through college or failed at it for whatever reason. Which can be fine, for people who are dedicated enough to actually learn on the job, instead of just acquiring habits and repeating them over and over.
What isn't fine is this praise for superficial knowledge. What irks me the most is people praising their ignorance of what academic knowledge really offers, and how it can change some of your solutions (many times unknowingly) in the "real world".
Practice does not replace fundamental knowledge. Practice only teaches you the solutions to the problems you encounter, not the solutions to the problems other people have encountered over time, nor the underlying reasons for the solutions you are using.
In this particular case there isn't so much "anti-intellectualism" as a justification for ignorance.
I firmly reject the notion that a book should be read because it is a "classic". But any book that's good should be read for its intrinsic quality as exemplary writing.
To that end, Ethan Frome can get fucked, while Moby-Dick (minus the incredibly inaccurate whale-biology chapters) will be around forever.
Guess which «geek anti-intellectual» said that. Mark Twain.
> 1. Experts do not deserve any special role in declaring what is known. Knowledge is now democratically determined, as it should be.
Who said that, and where? "Experts" are not excluded from Wikipedia or, in fact, any other online community. In a way, Quora, StackOverflow and friends recognize the value of the expert. The only difference is that now you don't have to pay thousands of dollars and travel to another country to ask these experts questions. You don't even have to attend university! It's about inclusion, not exclusion. Is the OP bitter about the fact that any teenager with a web browser can now learn how to write a radix sort whereas he had to go to university to learn it? Or is he bitter about the fact that said teenager has already learned a good chunk of important CompSci material even before graduating from high school and thus feels that a formal CS degree might be overkill? Sure, you can't learn philosophy at home, and you sure as hell can't build a particle accelerator at home, but anything that does not require special infrastructure that only a university can provide is up for grabs. Think maths, compsci, even design.
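And to underline how low that bar really is: a radix sort is genuinely afternoon material for that teenager. A minimal sketch of an LSD radix sort for non-negative integers (illustrative Python, not production code):

    def radix_sort(nums, base=10):
        """LSD radix sort for non-negative integers."""
        if not nums:
            return nums
        max_val = max(nums)
        exp = 1
        while max_val // exp > 0:
            # Bucket by the current digit, then flatten; stability makes this work.
            buckets = [[] for _ in range(base)]
            for n in nums:
                buckets[(n // exp) % base].append(n)
            nums = [n for bucket in buckets for n in bucket]
            exp *= base
        return nums

    print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
    # [2, 24, 45, 66, 75, 90, 170, 802]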
> 2. Books are an outmoded medium because they involve a single person speaking from authority. In the future, information will be developed and propagated collaboratively, something like what we already do with the combination of Twitter, Facebook, blogs, Wikipedia, and various other websites.
Once again, I don't know where he's getting this from. Does he know about the Kindle? Or iBooks? People still buy books when they need in-depth knowledge. The only time I look at Wikipedia instead of a book is when I'm doing initial research on a topic or when I feel I only need surface knowledge of the topic at hand. Sometimes I will buy a book because it was cited on Wikipedia. Does the OP know about the books people publish online? Real World Haskell, Dive Into Python, The Architecture of Open Source Applications, Clever Algorithms to mention a few CS-related books. Then there are a bunch of books on analog and digital electronics, math and statistics, all published on the web. A book does not have to be in the form of printed pages to convey knowledge. And remember: the people writing these books are still "experts", they still speak from "authority", they still have an "individual voice". Social media merely aggregates the work of individuals.
> 3. The classics, being books, are also outmoded. They are outmoded because they are often long and hard to read, so those of us raised around the distractions of technology can’t be bothered to follow them; and besides, they concern foreign worlds, dominated by dead white guys with totally antiquated ideas and attitudes. In short, they are boring and irrelevant.
What "classics" are we talking about here? Are we talking about philosophical texts? If yes, I don't think any geek questions the value of the work of Socrates. OTOH, the value of fiction is debatable. I personally love reading classics because I want to learn about people from another era. Some people prefer to read contemporary authors, and I don't think they're any worse off. David Foster Wallace has as much to say as Tolstoy.
> 4. The digitization of information means that we don’t have to memorize nearly as much. We can upload our memories to our devices and to Internet communities. We can answer most general questions with a quick search.
This is an age-old debate. What is more important: understanding why the Roman empire fell or knowing when it fell? Besides, if something is worth memorizing, you almost always end up memorizing it involuntarily. For example, many developers know regex syntax by heart. Same goes for human language: after looking up the same thing 20 times, you end up remembering it. The argument geeks are making is: why remember inconsequential data when you can be analyzing it instead?
> 5. The paragon of success is a popular website or well-used software, and for that, you just have to be a bright, creative geek. You don’t have to go to college, which is overpriced and so reserved to the elite anyway.
So going to college is the paragon of success? I'm sorry, but building and running a business is a far more difficult task than getting through college. IMO, of course. I do not judge people by the degrees they have. My opinions might be a bit jaded, though; college in India produces graduates who are pretty much unemployable.
Looks like the OP met a crackpot on IRC and painted every geek with the same broad brush.
1) The difference is that the experts' opinions are not being amplified anymore, as there is no reliable "authority index" on the web yet. Should we trust Wikipedia to build, say, the LHC?
2) People actually read fewer books: http://www.newyorker.com/arts/critics/atlarge/2007/12/24/071... I guess he's referring to works of fiction and philosophy that require long linear reading, not reference material.
3) Classics are called "classics" because they convey fundamental ideas that shape our society. I would argue, though, that the problem is that "classics", as in seminal books, are not being written anymore. BTW, Socrates never wrote anything.
4) True; I don't see what's bad about foregoing memorization of facts that are not relevant to one's pursuits.
5) Getting through college is just the start for academics. Making a profound discovery is the goal, and it's much more difficult than making a moderately successful business. [My theory is that the role of luck is much bigger in entrepreneurship than in academia.]
Classics are only classic in retrospect. It's not easy to say exactly what will be remembered from the last 30 years, but I'll bet the answer isn't "nothing".
This doesn't make education unnecessary; it just means there needs to be a way of making it more relevant to people, problems, and choices.
I think this is the core of the article.
Firstly, geeks are not opposed to knowledge as such. There's knowledge, and there's the ability to apply it, and geeks value both. We don't all value both equally; there's a bit of diversity here. But "put up or shut up" is seen as a valid call. You can't just claim to know better.
> You contemptuously dismiss experts who have it
We dismiss experts who flash their "expert card". We are deeply suspicious of experts of the esoteric, unless they can show they are also proficient in the familiar.
> you claim that books are outmoded, including classics, which contain the most significant knowledge generated by humankind thus far
Here is where we start deeply mistrusting experts of the esoteric. Notice how the author refuses to actually specify which old books are important? Criticize Plato and he will say he is talking about Goethe. Or Victor Hugo. Or Sun Zi. Or Spinoza.
> you want to memorize as little as possible, and you want to upload what you have memorized to the net as soon as possible
I advocate memorizing as much as possible. Cognitive psychology has a lot of evidence that low-level knowledge is the foundation of higher-order skills. A lot of modern educationalists dismiss this, but they don't have any evidence, just credentials of esoteric expertise.
> you want to upload what you have memorized to the net as soon as possible; you don’t want schools to make students memorize anything; and you discourage most people from going to college
Accessible knowledge is a good thing. Would the author have opposed the Gutenberg press? Probably.
As I've stated, I do want students to memorize as much as possible. Any geeks who think differently should read "Why Don't Students Like School?", which presents a fair amount of cognitive psychology in a very accessible format.
Finally, we get to the real point. Because of our virulent anti-intellectualism, we discourage students from attending university. Bullshit. We discourage some students from going to university for three reasons. First, we don't think it's necessarily the best way for them to educate themselves, mostly because class sizes are so big you might as well just watch a better lecturer on YouTube. Second, it's not necessarily worth it from an economic point of view. Finally, universities need a bit of competition.
However, I disagree with at least the analogy. The word "geek", in this context, refers to a computer programmer, not an intellectual.
But then, if we're seriously talking about "the classics", then "everyone" is already classic-illiterate, aren't they? I mean, I took two semesters of Great Books and have done a good bit of further reading on my own. I'd be willing to bet that puts me in the top 1% of the population. But I'm still fairly classic-illiterate by the standards of 100 years ago...
Since 1942, inflation-adjusted cost per pupil has increased by 700%. Some of that is surely inefficiency, but most is because we spend more on education. This means better teachers and better materials.
And this is in a country with one of the largest GDPs in 1900. You can pretty safely assume our own education system has seen far less drastic changes than those of the rest of the world.
I'm sure Thomas Jefferson could destroy you in classics knowledge. However, he was one of the few of his age who had the means and passion to spend much of his life reading them.
But this all adds up. It leads to people reading War and Peace, Jane Eyre, and The Great Gatsby before they leave mandatory secondary school. And while not every country is at this level, technology and humanism have given people all over the world a much better chance to explore these types of works.
In 1900, colleges decided what it was important for an upper-middle-class person to know. One set of writers and texts was in the canon; the rest were not. Sometimes the selection was based on quality but it was often political as well. It wasn't actually possible to have "all knowledge" but one could get all the "important" knowledge in a few years of study. Any thinking person in 2011 realizes that this is no longer true (if it ever was). There's a million times more genuinely valuable knowledge out there than anyone can take in in a human life. Also, the authorities once trusted to decide what was important have been disrobed by technology; the increased power of technology has allowed us to discern that they aren't much smarter than the rest of us.
Anyway, the result is that a lot of people get really insecure and overwhelmed when they realize how much there is that they know nothing about, and there's a tendency among some people, as a defense mechanism, simply to declare large sectors of knowledge useless, unimportant, or outmoded.
It's not just the less intelligent who have this attitude either. I've heard an esteemed computer scientist (someone with a name, but I'll withhold it here) argue that the only useful literature is science fiction because everything else is "just the seven deadly sins, over and over again".
It's not anti-intellectual to require someone to back up their statements from authority; it's science.
Anti-intellectualism is when people start making ridiculous statements like "classical literature is boring and irrelevant" and "college is a waste of time" (rather than merely overpriced).
1) The more information we consume, the more likely we are to come up with a useful theory, tool, etc. Our brains are designed to process and recombine stimuli and come up with innovative things, so the more combinations we make, the more likely we are to succeed. This is sort of a Darwinian theory of science and technology. There is also an anti-specialization trend among geeks.
2) People have an equal opportunity to create new and significant knowledge.
While (1) is debatable, (2) seems to be outright wrong. Some of our greatest scientists, from Newton to Einstein, were "lucky" enough to come up with great ideas more than once in their lives. They also had a narrow field of focus, and it's doubtful they could keep up with Twitter today.
I have never read anybody who has mastered the English language as completely as he has.