On the other hand, if you actually enjoy what you're studying and have any reasonable amount of skill in something else (as many humanities majors who read HN probably do), then graduate school can be fantastic. You want to use computers to solve problems, but unless you're a low-level systems programmer or a CS researcher, you'll need some specific thing to make websites or computer programs about. School is great for that.
Last year I finished my MA in classical Chinese literature. The problems of OCR'ing, transcribing, and translating Chinese are enormously interesting. Even simply trying to present digitized versions of ancient Chinese texts is difficult (no one told Confucius that only so many characters would be in the Unicode standard). I got a lot of attention (and should have productized it) by making free information available in a more convenient way with a simple Rails app. There's a huge gap in the field for a young grad student who wanted to digitize information, present it attractively, and sell it back to libraries or individual researchers. And these gaps exist in most traditional humanities fields.
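To make the Unicode point concrete: many rare ideographs in classical texts sit outside the Basic Multilingual Plane, so tools that assume one 16-bit unit per character mangle them. A minimal Python sketch, using a character from CJK Unified Ideographs Extension B as an illustrative example:

```python
# A character from CJK Unified Ideographs Extension B (plane 2),
# the kind of rare form that turns up in classical texts.
ch = "\U00020000"

# One code point, but it takes 4 bytes in UTF-8 and a surrogate
# pair (2 code units) in UTF-16 -- software that equates one
# 16-bit unit with one character will break on it.
print(len(ch))                           # 1 code point
print(len(ch.encode("utf-8")))           # 4 bytes in UTF-8
print(len(ch.encode("utf-16-le")) // 2)  # 2 UTF-16 code units
```

And that only covers characters Unicode has encoded at all; variant glyphs with no code point need workarounds like Ideographic Description Sequences or private-use-area assignments.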
Aside: I went to UW, which has the most rigorous Chinese lit. program outside of China. Obviously don't go to graduate school in the humanities at a school that teaches only theory--you're probably smart and coherent enough to make your way through it without really learning anything--go somewhere with a difficult program where you learn linguistics, philology, or serious history. I also got all of my tuition, a nice stipend, and optional health insurance by doing some PHP programming for a lab at the school--don't go into serious debt for a masters in the humanities.
IMO, getting a PhD is fairly close to real work. So, if you approach it as a job, it's not that bad. If you approach it like your Xth year of college, you will probably fail.
- A mapping tool that searches for Chinese placenames through about 2500 years of history and shows you when/where that name existed, what level of the administration it was in, etc.: http://www.digitalsinologist.com/places
- Beginnings of a Cocoa-based critical text editor. http://www.digitalsinologist.com/blog/index.php?id=40
- One attempt to format a critical version of a text online: http://digitalsinologist.com/texts/XieLingyun/text.html
There is no longer a "standard formula" to follow, and no more being satisfied with just getting classwork done. You have to find your own ideas, your own schedule, and your own plans.
However, the Taulbee survey (http://www.cra.org/statistics/) seems to indicate a large increase in the number of PhDs over the last decade (1800 a year? that's a lot to get sucked into Microsoft and Google on a yearly basis). Several of my cohort had to take postdocs or lectureship jobs (at $3K per class or something else ridiculous) to stay in academia, so it might be a bit of a CS PhD bubble as well. Falling undergrad enrollments over the last few years might have bottomed out, as last year there was an uptick in BS enrollments.
The Taulbee data indicates that a large portion of CS PhD types (~60%) wind up in industry positions of one type or another.
I would probably encourage talented hackers with the interest to try grad school, but to not feel compelled to go at all if they can pick up challenging work at a startup or even bigger company.
Be damn careful about winding up an "IT professional" though. If you don't control that process, you'll wind up a jack of all trades in a job market that tends to reward specialization and scarcity of knowledge rather than general "get it done" skills.
Of course, if you start your own company, you can ignore all the above advice.
Humanities tend to have fewer direct applications in business, though. History, at least, is seen as having some value in certain types of analysis, and historians are often used as consultants in some fields. On the other hand, perhaps I just run in the wrong circles, but I cannot think of any direct use for a PhD in, say, English literature other than teaching.
So you have to actually be exceptional and have published something interesting to even have a good chance at making it as a professor.
You could do other things as well--write books or work for various foundations (teach high school!)--but you can do those things without a PhD in medieval English literature.
You are also spending 8+ years not getting paid or putting yourself into debt, and you still might end up working at Arby's (or what have you).
He left out that the pay for a starting professor sucks.
(Considering that you just spent 8 years of your life getting the degree).
Computer science is a bit of a different ballgame in that it is applicable (well, vaguely) to software development. So you might be able to get a job that is orthogonally relevant to whatever it was that you passionately studied in graduate school.
The sad truth, however, is that the majority of software development does not come anywhere close to requiring a PhD. And you still come out up to your eyeballs in debt or flat broke.
The good thing about it is that it looks pretty good on a resume. (Although weighing 8 years writing software against 8 years in grad school, I might go with the guy who has work experience... it's kind of a toss-up.)
The google/ibm/microsoft research positions are few and far between, in any event.
The vast majority of current software development is GUIs for databases and can be done with relatively little training, not even requiring a BS in CS, much less a PhD. But there is more advanced development where the more theoretical backing is at least useful.
> (Although weighing 8 years writing software against 8 years in grad school, I might go with the guy who has work experience... is kind of a toss up).
I largely agree, but it depends on what kind of software the programmer was writing. Doing major kernel hacking for a Linux distro is highly impressive for instance, doing maintenance patches for yet another database front end is not.
Also remember that the two are not an either/or proposition.
It requires a somewhat flexible day job and it may take longer, but you can work on a graduate degree while holding down a day job. I worked a help desk part time for most of my undergrad degree, and I'm working as a DBA/Sr. Programmer while working on my masters right now, for instance. On the flip side, you can go to class by day and contribute to open source or do contracting at night.
Another nice thing about the world of software development is that you can actually get involved in many of the more (or most) interesting projects without any degree at all. There are some areas where you'd benefit greatly from a PhD, but you aren't at all relegated to "GUIs for databases" as a programmer with something less than a PhD.
If you already have a career and are working on your masters on the side, then you need to focus on your job and do the masters as time permits. For instance, I expect my masters to take 7 semesters (3 1/2 calendar years) and it wouldn't bother me too much if it took 4 years. I have a job I like with a salary that pays the bills. I am working on my masters primarily for personal growth and hoping for some career advancement edges down the road.
But this may just be the situation at my alma mater, a land grant university that attracts lots of research dollars but doesn't quite have the prestige to attract lots of talented students. I wonder if students at "name brand" schools don't end up with more debt because competition for funding is more intense.
Ultimately the less glamorous a job/career and the higher the barrier to entry the better the conditions.
(This is one of the reasons why businesses are often the best way to make money; it's an unpleasant and uncertain slog which few are willing to take on and navigate).
I often wonder how much of the 'exciting' part is actually 'math was too hard so any hard sciences are out of the question'. Some students know for sure what they love and want, but some are in the undecided major for 2 years and then are forced to pick. The decision they make sometimes is influenced by their level of proficiency in the subject not by the love of the subject itself.
The answer is of course obvious. Roll the dice and start your own business in the hopes that you can become the exploiter instead of the exploited.
The owners in a business are the only ones who cannot be outsourced.
The bigger question is "how to not be useless?" Useful people will do well in the market whether they're an employee or an entrepreneur. Useless people, by definition, fail whether they're an employee or an entrepreneur. If you can figure out how to build skills & assets that other people want - which is much harder than simply starting a business - then you'll always be in demand.
If you make society about being useful to the top 5% of the population, then the rest of the population is always going to be left in the dust unless you turn to genetically modified workers a la Gattaca.
The question is, how do you create a society where everyone can make a living?
It is ingrained in modern society's thinking that, in order to make a living, you need to be useful to someone else who has the money, power, and natural resources that you don't have, simply because of age, luck, or someone being naturally smarter.
But what if everyone were guaranteed land to live on and produce their own sustenance? In a different type of society, or perhaps far into the future, this may be a possibility given population control.
In this manner, people are guaranteed to at least make a living. The tendency for riots will be greatly diminished when basic needs are met. Everyone else who wants to create more wealth and enter the monetary exchange will be free to do so.
Q: What do you trade? A: Things that others find useful.
Even if you give people land they will need tools, supplies, medicine, housing, plumbing, fuel, electricity, roads, computers, phone/internet, etc. They will need to trade, they will need to be useful.
One can easily move to the middle of nowhere in Montana/North Dakota/Alaska and live off the land with a little bit of planning. Most don't want to.
There have been studies documenting this. Here is a short story from Tim Ferriss which sums this up where he realizes that a Mexican fisherman really has a very high quality of life.
Most of the arable land has been concentrated in the hands of a few, and the technology needed to generate harvests sufficient to weather downturns such as drought or flooding is out of reach for everyone in 3rd world countries.
That's why you can't look at 3rd world country farmers and just say that no one would be able to live "off the grid" comfortably.
In the 20th century an American of average ability could sustain themselves doing office work or industrial labor.
Beginning sometime in the late 20th century, the average American seems (to me at least) to be less and less needed in the modern American economy. Jobs are automated or off-shored.
What do we as a society do about/with those "useless" people? It's a lot of people.
Or am I too pessimistic?
Of course, if the products are also exported, eventually the exploited people will figure out that they can cut out the "useful" middleman. Then we useful and useless Americans are all in the same boat.
That's the subject of the whole novel, and it's a really great one.
That doesn't sound too bad at all.
Presumably some percentage of doctorate holders do not want "tenure-track positions." Some might be good at things other than academia for which their academic expertise is useful. Some are probably already wealthy or old and do not want to work full time. Some are just not very good at being academics. You have that in every field, regardless of training.
Being a history researcher is probably a less wonderful career path than law or engineering, but I assume the students know this going in and prefer history anyway.
Besides, I know several PhDs working as academics with comfy six-figure jobs that I would never hire for anything.
This article seems to be assuming that all PhD candidates are of the highest "quality" and that even the bottom 10% would be flying high anywhere else and are wasting their talents in academia. That's just not true. I'm sure that many are. These get their cushy professor jobs or do something else that they like.
Duplicate thread? I strongly support beefing up duplicate detection.
The situation is looking very ripe for a disruptive business model offering the same quality of education online for a much lower price.
This is true if you're saying that universities sell credentials among other things but false if you're implying that universities _only_ sell credentials. They also sell a rich learning and networking environment; _structured_ knowledge that you're presumably acquiring from people who've already mastered a field and thus can bring you through it faster than you could on your own; motivation in the form of deadlines and so forth, which is often difficult for most people; editing / mentoring relationships that help you dialectically develop your skills; and a way to guide figuring out what you might be interested in.
None of that is to deny that universities sell credentials too, but if that were their only function, they wouldn't be essential to our society.
That is worth anything only if it will help achieve future professional goals. You are probably thinking of a rich start-up culture. It can be awesome for engineering and business majors. But for humanities, I am not so sure.
> motivation in the form of deadlines and so forth, which is often difficult for most people;
There is some value in that. Universities used to play the 'in loco parentis' role in the past (http://en.wikipedia.org/wiki/In_loco_parentis). In other words, they provide an environment where discipline and certain norms are enforced. I am just not so sure it is worth $120k of debt at the end.
Good point. Hence the diploma-mill market. All in all, it is just another bubble. The only thing that keeps the bubble going is that employers still screen based on degrees, so degrees are perceived to have value. Now, I would argue for not even working for any employer that screens heavily based on degrees instead of extensively checking a candidate's knowledge and personality fit... but that's just me.
> ... they would have been put out of business by public libraries long ago.
And a lot of universities already publish quality course materials on the web.
I am doing a communications degree (the easiest on campus). And you are right: if all I gained from it was material from my COMM classes, it would be pretty crap. But each semester, at least 50% of my classes are in topics I am really interested in at the b-school.
Personally, I like doing my communications degree because I can easily get the degree part of it taken care of and then pick classes all over the place that interest me. I came into school hoping to go to b-school. But the intro classes and the prerequisites at the b-school are so exhausting and boring that I gave up.
You might argue I am a special case. Probably. But here's one way you can make humanities work in your favor. They give you the much-hyped "college degree" and they let you pursue your other passions. For example, my COMM classes are full of basketball players:)
There's nothing new about PhDs leaving school to find nothing, except that there are more of them chasing fewer opportunities. Don't expect colleges to clue you into these realities: it would be bad for business.
Sad, but accurate. On one hand, some humanities academics are directly responsible for this; the attitude of many academicians that research was the "real work" and teaching was just commoditized grunt work ended up hosing the humanities. Physicists can afford to cop that attitude, because if they're great researchers the university will put up with poor/no teaching, but those in the humanities can't, because the transfer of culture to rising generations (e.g. education) is the raison d'etre of humanities departments.
On the other hand, the corporatization of the university and research world in general has been an unmitigated disaster, and it'd be better for all of us if the trend reversed.
If by "some humanities academics", you mean the deans and heads of departments, you might be right but otherwise you're blaming the soldiers for the large-scale situation.
A Ph.D in CS is not, as I understand it, a prerequisite for success in a technological field, but PG was able to make use of it to start Viaweb. Could a Humanities Ph.D do anything like that?
I found this article by Rands very relevant when I read it:
In my experience it's not that humanities Ph.D.s can't do anything with their degree - in general, these are smart, analytic people - but that they don't want to.
Those who study engineering learn to ask "how does that work?".
Those who study the sciences learn to ask "why does that work?"
Those who study accounting learn to ask "how much will that cost?"
Those who study liberal arts learn to ask "do you want fries with that?"
This is a bit self-serving, it having come from an engineering school (where I was a CS student), but there's some truth in there.
Of course--but that same person could do it without the Ph.D.