Time to talk about why so many postgrads have poor mental health (nature.com)
414 points by amelius on March 31, 2018 | 239 comments



The whys are obvious. What is not obvious is why we allow bodies like the NSF to continue advocating nonspecifically for careers in STEM when there is no domestic market for most PhDs. It's also befuddling why most faculty are able to sleep at night knowing they are at the top of a pyramid scheme.

It's worse than that: not only do they encourage young people to pursue a brutal career, they require what little money is allotted to research to be partially spent on making this problem worse.

But most readers of Nature wouldn't like to hear those things.


What makes you think 'most' faculty appreciate the pyramid scheme? Many of us spend a lot of time worrying about our postdocs' future, in no small part because we recognize that the struggle for funding never actually ends. Tenure is not the end of the rainbow, it's reassurance that your salary can't go all the way to zero - but your lab, your team, even your office space absolutely can go to zero. At least in CS there's a career path to success that skips indentured servitude. In biology, unless you want to be a lifelong lab tech, good luck trying to get somewhere in biotech without toiling in the academic trenches. I promise, even those of us a bit closer to the top of the pyramid wish it were otherwise.


Unless you're amazingly talented at nurturing your postdocs, something like half of them will not have lasting careers in academia. I applaud your attention to their futures. It would help others in your position to understand what you envision for these non-academic-future postdocs and how you prepare them for their futures. What are you doing for them?


This is the right question to ask - not what NSF or NIH can do. I'm afraid I don't have a very satisfying answer, I'm still trying to figure it out myself. The first thing I tell my postdocs is that they can't put their life on hold and wait for adulthood to start after they finish this training (or more training). We need to be discussing career paths now, not 18 months from now. The second thing I tell them is that in my field (medicine) it's likely they can make a greater contribution on the industry side than in academia, because as important as fundamental (basic) research can be, industry is where most new treatments come from. For some I encourage entrepreneurship - harder in bio than CS but do-able for the right person. And some /can/ stay in academia as staff scientists - not responsible for running their own labs or generating their own funding, but acting mostly autonomously.


It's great to think about what you can do personally, but actually the NSF/Congress did create this problem by choosing the financial structure for science in the US in the 40s-50s. That was done centrally and was not an organic extension of how science was previously funded. Many alternatives are imaginable, such as direct student funding as in Canada (advisors who are advisors, not feudal lords). The best fit depends a lot on field. For example, I'd say medical research is some of the most marketable, and most medicine faculty choose academia for the right reasons.

Unfortunately, in my benighted corner of science things are going the wrong way. The NSF actually just created a master's program for managing a sort of scientific software development for which there is no market. It did this because saying scientists care about education and commercialization "sounds good" on a grant proposal. This will do nothing but waste $50 million and 40 highly skilled person-years. It's laughable to anyone in the field, but such are the incentives at NSF.


Very satisfying, I think. By all means, keep working at it! And this is great already.


It's also worth noting how the educational industrial complex plays into this. In my college, the university takes a whopping 50% of grant money in administrative fees, leaving so much less for actual hiring of postdocs and graduate students.


> In my college, the university takes a whopping 50% of grant money in administrative fees, leaving so much less for actual hiring of postdocs and graduate students.

OMG


This is effectively your research team's rent and taxes -- it goes to pay for things like the accountants, the heating bills, building repairs, instructional support, upgrading the building's electrical grid to support your new equipment, all of the library (books, journals, computers, librarians), campus security, student health, interest on debt taken out to build the building research takes place in, depreciation of buildings and equipment, etc. 50 percent is maybe on the high end, but the low end would be maybe 30 percent.

Many grants come with a stipulation that no more than X percent be allocated to administrative fees. Which basically means the non-stipulated grants have to pay an even higher rate. I believe I've seen some people argue that the Ivy League's rather high indirect cost rate with the NSF, which was negotiated / grandfathered in, is an unfair advantage.
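For concreteness, here is a minimal sketch of that overhead arithmetic (a hypothetical award and rate, assuming the common US convention that the indirect rate is charged as a percentage of direct costs). Note the ambiguity in "takes 50% of grant money": a 50% rate on direct costs leaves overhead at a third of the total award, whereas losing half the total award would imply a rate of 100% on direct costs.

  # Hypothetical split of a grant award under an indirect-cost rate that is
  # charged as a percentage of direct costs (the usual US convention).
  def split_award(total_award, indirect_rate):
      direct = total_award / (1 + indirect_rate)
      overhead = total_award - direct
      return direct, overhead

  direct, overhead = split_award(1_000_000, 0.50)  # assumed $1M award, 50% rate
  print(f"Direct costs: ${direct:,.0f}")   # $666,667 for people and equipment
  print(f"Overhead:     ${overhead:,.0f}")   # $333,333 to the institution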


a) I'm a faculty member :P b) I meant you no personal affront; there are good and bad actors, but indisputably the career incentives are firmly directed at irresponsible, ruinous growth in student/postdoc populations. At the very least, professors should be removed from papers to which they contributed less than 10% of the effort.


Look at what someone does. Not what they say.

You say you worry. But you cash your cheques and continue acting in your capacity as if university degrees mean something.


University is where all the shiny tech is created that people like Peter Thiel use to make enough money to be able to tell people not to go to university.


I feel that could be said of higher edu in general. That is, it's being promoted and recommended to too many who just aren't the right fit. God forbid a parent with a degree have a kid without a degree. As if 25+ yrs ago is the same as now.

It's not much different than the housing crisis / crash. The unaware are fueling the fire, only to find out they're stuck with a bill beyond their means.

It's only a matter of time before higher edu implodes. It's not sustainable. It's not practical.


"If you think education is expensive, try ignorance".

Not saying that people without higher degrees are ignorant, of course there are many ways to learn. But I think many people wouldn't find the motivation to learn without the social pressure to get a degree. So that pressure effectively achieves a society with less ignorance.

In the US you should fix the problem of the crazy tuition fees that send people into bankruptcy, though. In countries like Germany, tuition is free, i.e., the whole society pays for your degree. I find this reasonable: as a member of society, it benefits me that e.g. we have competent doctors for when I get sick, and that they are the best possible (i.e., selected by knowledge and skill, and not by being rich enough to pay for a degree).


> "But I think many people wouldn't find the motivation to learn without the social pressure to get a degree."

On one hand, I agree.

On the other, you hit the nail on the head. This is a socialization issue. As much as the USA gives a nod to individualism, the fact is we are taught not to take that too far.

The status quo feeds the higher edu beast. But there are too many blindly chasing the promise, and only piling up tons of debt. The wrath of Higher Edu Industrial Complex continues. No questions asked.

Yes. We can fix the tuition issue. But that also means saying "Sorry Charlie, you don't qualify." This latter bit, no one wants to discuss.

In addition, the truth is, cost is a function of demand. Edu prices - like housing prices - were driven up by demand; demand that was increased by cheap loans and easy availability.

Edu's charge more? Because. They. Can.

But everyone gets to go. Some use the opportunity wisely. Some do not.

But if the pay model changes so must the who gets to go where model.

All that said, edu is not a gateway to success. And the edu model is broken. It's going to implode. When AI replaces the vast majority of jobs (including white collar) then there will be an excess of edu'ed people with too much free time on their hands (and too much debt with no viable way to pay).

Perhaps higher edu isn't the only thing headed for a reckoning?


>In countries like Germany, tuition is free, i.e., the whole society pays for your degree.

The missing part of the equation is that Germany also has high-quality, well-regarded vocational education. In the US, university is widely regarded as the only reasonable route to a secure middle-class lifestyle. In Germany, about half of school-leavers go into the Duale Ausbildung system, spending part of their time working as a paid apprentice and part of their time studying at college. The role of universities in society is completely different when there's a good alternative.


The difference is that getting a degree in Germany is much harder.

I’d be okay with publicly funded education (all levels) if the admissions requirements were a lot stricter.


That sort of defeats the point of education as socialization (that is, developing the common language that holds society together). The only fallback we have remaining is pop culture, and if we let that be our main means of socialization, people will start letting actors and reality TV stars make all their decisions.


To the extent society needs education as civic socialization, it seems like it should be done at the pre-college level.

In the US today, it is happening neither in high school nor in college (at least for most people at most high schools and colleges). The former is mostly about keeping youth supervised while parents work and enabling sufficient standardized test scores to get into college; the latter about getting the piece of paper that allows you to get a livable-wage job.


As someone who graduated high school with a whopping 1.8 GPA I totally disagree :). So far the industry seems to like my work.


>> In countries like Germany, tuition is free, i.e., the whole society pays for your degree.

The same is true for most of the EU, except, notably, the UK, which is the other developed country, besides the US, where higher education is a proper industry and where students often get some serious debt to go to university.

I did all of my studies in the UK, though I'm from Greece. From the start I was struck by how much the public discussion around education is focused on money. It's not so much that university studies are aimed at getting you a (better) job, it's more that UK society has lost its ability to measure the need for education with any metric other than how much more money you're making afterwards.

With the debate focusing on money like that, I think it will be pretty hard for the US or the UK to do anything about their "crazy fees" like you say. They now see those fees as a kind of investment: you pay some money up front to get a return on your investment later.

That is certainly a broken way to do education, higher or otherwise, but the solution is not to tear down higher education altogether. The obvious (to me) solution is to make it publicly funded again, as it used to be in both countries in the past, so that younger generations can benefit as their parents and grandparents did, without taking on extravagant debt.


> In the US you should fix the problem of the crazy tuition fees that send people into bankrupcy, though. In countries like Germany, tuition is free, i.e., the whole society pays for your degree.

It is not always free, but at least at public universities it is really cheap compared to US universities.


>I find this reasonable: as a member of society, it benefits me that e.g. we have competent doctors for when I get sick

Problem with this is, in the list of degrees, useless degrees far outnumber useful degrees.


Education is unnecessarily expensive, but if it were otherwise, I'd say send everyone, pass or fail, for the benefit of what they glean trying. However, I'd also say: see the participation as a benefit in itself, and credit those that do.


Totally agree. And I would like to add that I don't even think a degree is the best way to learn even for many who excel at them.


i agree with you. and most of the teaching in universities is absolutely pathetic. students primarily learn hopelessly alone through poor but overpriced books. the curriculum and professors are uninspiring, save for a few. it’s amazing what a good professor can do, but those are unfortunately rare.

as someone who keeps thinking about whether i should go back to finish my ph.d., i can’t help but think of how dogmatic most graduate programs are, in addition to being logistical nightmares. there’s basically no time or space for one to explore one’s own thoughts and program unless it specifically matches up with what someone else (namely an advisor) wants to do. there’s so much hand holding through required coursework (and homework) that one can hardly think deeply about anything actually original.

the book disciplined minds touches upon this. that those born in institutions will carry that institutional conservatism with them. i have felt this. i went to talk about some of my ideas to a well known and respected researcher. before i could even finish my sentence that began with “i’m interested in...” he cut me off shaking his head and, literally, said “i don’t believe in it...”. goddamnit.


> there’s so much hand holding through required coursework (and homework) that one can hardly think deeply about anything actually original.

Take an actual graduate course in pure mathematics, very little hand holding.

>there’s basically no time or space for one to explore one’s own thoughts and program unless it specifically matches up with what someone else (namely an advisor) wants to do.

Not necessarily true. Find an adviser that will support YOUR interest (they exist). Sounds like you have some biases against PhD programs without actual experience of being in a good one.


sounds like you like making assumptions.

> Take an actual graduate course in pure mathematics, very little hand holding.

i have taken many. and you missed my point. i didn’t mean having one’s hand held through the actual material. that isn’t true at all of course, in part due to poor teaching and in part due to the difficulty of the material. i meant that one is forced to take x amount of courses. i believe, past a certain level, that this becomes extremely limiting. you spend all this time and effort doing coursework and going to class rather than researching. learning within a context is much better than learning without context, aka most courses. and many schools hardly accept any coursework transfers. there aren’t many good reasons for this.

and of course i have a bias because that’s what personal experience yields. but there’s no reason for you to insult me. i would wager my bias is shared by many who have been through ph.d. programs. and of course it’s not necessarily true. there are excellent advisors. but they are rare.


To offer another point of contrast - while it's true that you often need to spend a year taking courses, in most graduate programs they're very very flexible on what courses to take. And those courses themselves are very flexible on how much work you want to do.

I don't find this too troubling - it's important to have a shared fundamental knowledge so you can communicate effectively and not reinvent the wheel, and I don't think most programs are giving you busy work (that's really against the whole ethos).


>> you spend all this time and time effort doing coursework and going to class rather than researching

Is it possible the problem is particular to the US postgraduate system? In UK PhD programmes you are not required to follow any courses. That said, at least at my university, as a PhD student you have the right to follow any courses you want for the duration of your PhD (obviously at no extra fee), which I'm absolutely taking advantage of.

I should also point out that a PhD is not just a time to do research- it's an opportunity to become an expert in your chosen field. And you don't do that just by inventing new knowledge. You need to also become familiar with the work others have contributed before you- and by "familiar" I mean "learn it very, very well". Perhaps the US universities are just trying to make sure you don't spend all your time with your nose in your own research, while ignoring everything others are doing around you?


just as a point of optimistic contrast, i've taken only three required courses over my PhD. some of my peers (who came in with better undergrad prereqs) have taken fewer, and in a few cases, none


You know I’d actually pay the crazy fees if only I could find a place that’d reteach me advanced maths “in context”. I find often that adding context means subtracting the “advanced” part :)


Study Physics. I'd take the math prerequisites and not understand what this is useful for, and then the first week of physics class we'd review the math and suddenly it would become quite practical.


Solid advice. I think finance massively recruits maths aptitude from physics for precisely this ability to connect hard-core maths to models of the world.

I wouldn’t expect a pure mathematician to be good at this nor to care about it either.

Did you ever read “How to Solve it?” Polya says he became a mathematician because he was too clever to be a philosopher and not clever enough to be a physicist :)


Not all math has applications; some of it is just interesting for its own sake.


Come to Europe. Or at least to some European countries. In mine, Spain, Ph.D. programs now have no courses at all (let alone homework). They are exclusively research-oriented, your goal is to produce good research (papers) and then your thesis. Also, the duration is capped at 4 years, and the tuition fee at my university is around €250/year. Other European countries have similar systems.


That is not the whole picture. Most Ph.D. programs require you to successfully complete 2 years in a research master's program which is based on classes and homework assignments. It's not easy.

After that you enter the Ph.D. for 3 years (max 4) and you are right, you can almost do whatever you want. Just bear in mind that either you publish in top journals or your academic career ends here.


This is true, but in practice admission in most Ph.D. programs in Spain is not very selective as there are more slots than applicants, so any master's degree will do, even if it's not research-oriented. And for foreign students, many Ph.D. programs have a rule that "if you can prove that your degree gives access to Ph.D. programs in your own country, then you can enter" so it's even possible to be admitted with e.g. a US bachelor's degree (YMMV per university though).


You don’t need a masters in the UK. Three years bachelors and then straight into a PhD which is just research - no classes or teaching. And we produce more than our share of top science so it must work.


You'll have to have a first in order to be able to skip the masters, and since entry to funded programs is competitive, you have to be an exceptional candidate for this to actually happen – it's quite rare.

Then there's the program itself. DTCs (doctoral training centres) now mostly run 3 + 1 programmes, where your first year is a master's, involving taught classes and a research project. This is to get everyone up to speed on how to actually do independent research, and to build core skills. While it's true that teaching doesn't always form a mandatory part of a doctoral degree, working as a teaching assistant for at least one semester is often mandatory.

In short, you've mischaracterised doctoral education in England and Wales (Scotland may be different) to quite an extraordinary degree. I see no reason to comment upon the "more than our top share" claim, for reasons which are, I hope, obvious.


> You'll have to have a first in order to be able to skip the masters

Well I'm not sure that many people without a first would seriously consider doing a PhD, so this isn't relevant in most cases.

> DTCs (doctoral training centres) now mostly run 3 + 1 programmes

Right, which is a big disadvantage of them. DTCs are just one way to do a PhD and they're not the traditional approach in the UK.

> In short, you've mischaracterised doctoral education in England and Wales

I don't agree! This matches what I see of students going into and coming out of the system today.


Worse. A degree is no proxy for drive, ambition, creative thinking, problem solving, etc.

Too many have come to believe a degree is the magic. In some cases it is. In plenty of others it's a curse. The time and money could have been better spent elsewhere.


It's certainly no proxy, but it is a standard metric to measure said qualities against. I hope we see a rise in autodidact accreditation and acceptance of non-traditional paths, along with more decentralized and public research taking place.


"it's not sustainable"

Unfortunately it is, in a grim sort of way, with grads' wages being siphoned off for decades to pay for debt and interest. Over a 15-year time scale even the largest educational debt loads are reducible, but they may be very uncomfortable to reduce for someone making, say, $60,000 a year.
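As a rough sanity check on that claim, the standard amortization formula gives a feel for the numbers. This is a sketch with assumed figures: only the $60,000 salary comes from the comment; the $100K principal and 6% rate are illustrative.

  # Back-of-the-envelope amortization over the 15-year time scale above.
  def monthly_payment(principal, annual_rate, years):
      r = annual_rate / 12                      # monthly interest rate
      n = years * 12                            # number of payments
      return principal * r / (1 - (1 + r) ** -n)

  pay = monthly_payment(100_000, 0.06, 15)      # assumed $100K debt at 6%
  print(f"Monthly payment: ${pay:,.2f}")          # ~$843.86
  print(f"Share of $60K gross: {pay * 12 / 60_000:.0%}")  # ~17% of income

Uncomfortable, as the comment says, but arithmetically reducible.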


To solve future problems and avoid war we need more educated people, not fewer.


The 'education' in this context is a system of indoctrination in a pseudo-cult-like atmosphere ("there is nothing more glorious than doing pure science in an accredited faculty and if you disagree you are an idiot"). The atmosphere is not due to malfeasance, but a combination of professional management mixed with a borderline-Asperger short-sightedness and deafness to human needs as psychological beings of flesh.

This specific system has very little to do with war or peace - its only causal link with society at large is its funding: how much, by whom, and on which terms.

This is not to say all academia is problematic - but large areas of it, as claimed by this article, clearly are.


I'm not saying all education systems are perfect; I'm sure they can be improved. And problems are not solved by education alone: you need an engineering/scientific mindset, which you only get from higher education or acquire by self-study and inquisitiveness. The opposite is ignorance, which can be bliss, but it will not help solve conflicts and problems.


There are no formal prerequisites to maintaining peace and avoiding war, and I don't think a formal education or a "scientific mindset" is of much help here. Further, the opposite of a scientific mindset is not ignorance, for the opposite of ignorance is awareness, and awareness is not synonymous with the ambiguous 'scientific mindset'.

I believe one could argue rather successfully that we need more poets, painters, cinematographers, musicians, and artists of all shades and stripes over scientists, if peace is the ultimate goal.


>> you need a engineering/scientific mindset

Another citation required. Liberal arts and philosophy majors aren't useful in spreading world peace? Ancient historians would strenuously disagree.


Ancient liberal arts and philosophy majors have generally considered war one of the highest, noblest human callings.


Communist dictators throughout history have been highly educated engineers. I don't think you can say engineering degree = peace.


Not true. Not if they are educated to be ignorant.

The US was / is involved in 5 to 7 "conflicts." Edu isn't helping to stop that. In fact, anecdotally, I see educated people part of the problem. They have become less open minded, less willing to change position because their info was incomplete.

We need thought, and people willing and able to use it. Not education.


From James Loewen's "Lies my Teacher Told Me": http://loewen.homestead.com/

"Over ten years I have asked more than a thousand undergraduates and several hundred non-students their beliefs about what kind of adults, by educational level, supported the war in Vietnam. ... By an overwhelming margin - almost 10 to 1 - my audiences believe that college-educated persons were more dovish. ... However, the truth is quite different. Educated people disproportionately supported the Vietnam War. ... These results surprise even some professional social scientists. If you look at other polls taken throughout the course of the Vietnam War, you'll see that the grade-school educated were ALWAYS the most dovish, the college-educated ALWAYS the most hawkish. ...

My audiences are keen to learn why educated Americans were more hawkish. Two social processes, each tied to schooling, can account for educated Americans' support of the Vietnam War. The first can be summarized by the term allegiance. Educated adults tend to be successful and earn high incomes -- partly because schooling leads to better jobs and higher incomes, but mainly because high parental incomes lead to more education for their offspring. Also, parents transmit affluence and education directly to their children. ... The other process causing educated adults to be more likely to support the Vietnam War can be summarized under the rubric socialization. ... Education as socialization influences students simply to accept the rightness of our society. American history textbooks overtly tell us to be proud of America. The more schooling, the more socialization, and the more likely the individual will conclude that America is good. ...

Both the allegiance and socialization processes cause the educated to believe that what America does is right. Public opinion polls show the non-thinking results. In late spring 1966, just before we began bombing Hanoi and Haiphong in North Vietnam, Americans split 50/50 as to whether we should bomb these targets. After the bombing began, 85 percent favored the bombing while only 15 percent opposed. The sudden shift was the result, not the cause, of the government's decision to bomb. ...

We like to think of education as a mix of thoughtful learning processes. Allegiance and socialization, however, are intrinsic to the role of schooling in our society or any hierarchical society. Socialist leaders such as Fidel Castro and Mao Tse-tung vastly extended schooling in Cuba and China in part because they knew that an educated people is a socialized populace and a bulwark of allegiance. Education works the same way here: it encourages students not to think about society but merely to trust that it is good. To the degree that American history in particular is celebratory, it offers no way to understand any problem -- such as the Vietnam War, poverty, inequality, international haves and have-nots, environmental degradation, or changing sex roles -- that has historical roots. Therefore we might expect that the more traditional schooling in history that Americans have, the less they will understand Vietnam or any other historically based problem. This is why educated people were more hawkish on the Vietnam War. ... Students who have taken more mathematics courses are more proficient at math than other students. The same is true in English, foreign language and almost every other subject. Only in history is stupidity the result of more, not less, schooling.

Why do students buy into the mindless "analysis" they encounter in American history courses? For some students, it is in their ideological interest. Upper-middle-class students are comforted by a view of society that emphasizes schooling as the solution to intolerance, poverty, even perhaps war. Such a rosy view of education and its effects lets them avoid considering the need to make major changes in other institutions. To the degree that this view permeates our society, students automatically think well of education and expect the educated to have seen through the Vietnam War. ..."

See also John Taylor Gatto's "The Underground History of American Education" and Howard Zinn's "A People's History of the United States".


Having a paper is not the only means to get educated these days.

Oh, and your assumption that education somewhat prevents wars is completely false. Societies like pre-WW2 Germany and Japan were full of well-educated folks.


A quick search gave plenty of articles saying higher education prevents war. For example "Among independents, education is highly related to support for the war. Those with only a high school education give clear majority support, but support declines as educational level increases."

http://news.gallup.com/poll/7768/war-support-education-gap.a...

"The results provide evidence for both the grievance and stability arguments, providing strong support for the pacifying effects of education on civil war"

http://www.uky.edu/~clthyn2/thyne-ISQ-06.pdf


Very poor data to support your point.

> There is a modest decline in war support as educational levels increase from high school or less, to some college, to college graduate (with no postgraduate education). However, the largest gap occurs at the highest level, between those with a postgraduate education and those without. Only among the postgraduate group is the majority opposed to war, 56% to 40%, while the other three categories all show majority support.

Postgraduates almost don't count since there are so few of them in a civil society. So virtually no difference between high school level and college graduates, which does not support your point.


Do citizens vote for war? All of our leaders are very well educated, by American standards, yet war and police action persists.


> Societies like pre ww2 Germany and Japan were full of well educated folks

Yes, and these educated folks opposed the Nazis. The parts of [Weimar culture](https://en.wikipedia.org/wiki/Weimar_culture) that were the centre of art, music, literature, poetry, mathematics, and science were the parts that did not become Nazi.

Many of their Nobel Prize winners were Jewish. Among the places that participated most in that culture was Berlin (famous for the [Berlin Circle](https://en.wikipedia.org/wiki/Berlin_Circle)), which was also the area that voted least for the Nazis.

The people who were cultured were less likely to be Nazis. The gay clubs in Berlin were later shut down by Nazis. Nazi conservatism was to some extent a backlash against the permissive and flourishing culture - and they associated Jews, gays and Marxists with that culture.

I'm not saying cultured people can't also be barbaric - they can. But it's also not true that somehow the most cultured place in the world was hiding a dark secret - the parts that were cultured were not the parts that instigated the backlash.


You seem to conveniently ignore the mass of German intellectuals who supported the National Socialist regime, as well as industrialists and artists. Even outside Germany the Third Reich had its share of supporters among elites. Let's not falsify history by pretending only stupid people were Nazi supporters.


There were intellectuals who supported the Nazis, there are intellectuals who currently support the alt-right and believe in establishing ethno-states. But it's negatively correlated.


> But it's negatively correlated.

Negatively correlated? We'd need hard and reliable data to compare and make proper judgment here, and it is going to be difficult since almost everyone involved is not neutral.

The problem is that as humans, we tend to believe that those who support different parties/ideologies have low IQs compared to supporters of our own parties/ideologies.

It is far from being obvious because intelligence and education is usually widespread among various groups and currents of thoughts.

As individuals it is actually very dangerous to think in this way, because it leads us to underestimate our adversaries wherever they are.


What is an educated person? How do you measure education? Why do you think universities should have a monopoly of higher education?


Citation needed that we need more "educated" people (for what definition of educated?) to do those things, and that the American system is even remotely decent at doing those things. Given the trajectory of our country's political efforts combined with ever-higher enrollment in college, the correlation at least doesn't look so good.


The identity politics shoved down the throats of students at universities across the country are going to create wars, not prevent them.


> It's also befuddling why most faculty are able to sleep at night knowing they are at the top of a pyramid scheme.

Maybe it's a selection thing -- the ones who can't sleep at night generally leave before they reach that point, as the realization dawns on them about what's expected of them. The process selects for the people who can maintain it.


An anecdote is not a datapoint, but I left my PhD because I saw the pyramid scheme and knew that I didn't want to (or even couldn't) maintain the effort required to climb up the pyramid. Now I'm a research assistant; I still do science, but I no longer face the insane pressure of academia.


More generously, the people giving you advice are self-selected from the group of people who succeeded by taking that advice. I expect that the faculty are well-meaning, but perhaps not able to accurately gauge what will work for others.


In my world I have not met people who can accurately gauge what will work for others.

Even your peers in study might create pressure that having a PhD is something cool. I dropped my master's because I had a job opportunity; some of my friends assume that I finished the master's and that's why I have a good job. When I clarify that I did not finish it, they are somewhat disappointed.


I recall during undergrad that the stated goal of the professors was to 'unteach' us everything we learned in high school, and twenty years later I was thankfully finally able to unlearn everything I learned in undergrad. People that stay in a cult are unlikely to be fully honest with themselves and others about its shortcomings.

Graduate school was a non-issue as I went through that as a means to an end and virtually all of the professors weren't wrapped up in academia.


What were you studying? How do you know that your professors were wrong (that what they taught should be unlearned)?


Undergraduate in Art, I think they were right to unteach us what we learned in high school but that they were wrong in only teaching Art as it relates to the 'Art world' and that bias is a result of how colleges/universities hire Art professors.


> It's also befuddling why most faculty are able to sleep at night knowing they are at the top of a pyramid scheme.

I just tell prospective students the truth. That the probability of being able to stay in academia is very low. That they are going to learn and do cool things, but if they end up in industry, this is not going to be valued much (generally true in my country, it's better in others). Those that stay know what they should expect.


Here is an explanation from 1994 by Dr. David Goodstein of Caltech, who testified to Congress on this back then, whose "The Big Crunch" essay concludes: https://www.its.caltech.edu/~dg/crunch_art.html "Let me finish by summarizing what I've been trying to tell you. We stand at an historic juncture in the history of science. The long era of exponential expansion ended decades ago, but we have not yet reconciled ourselves to that fact. The present social structure of science, by which I mean institutions, education, funding, publications and so on all evolved during the period of exponential expansion, before The Big Crunch. They are not suited to the unknown future we face. Today's scientific leaders, in the universities, government, industry and the scientific societies are mostly people who came of age during the golden era, 1950 - 1970. I am myself part of that generation. We think those were normal times and expect them to return. But we are wrong. Nothing like it will ever happen again. It is by no means certain that science will even survive, much less flourish, in the difficult times we face. Before it can survive, those of us who have gained so much from the era of scientific elites and scientific illiterates must learn to face reality, and admit that those days are gone forever."

And see also "Disciplined Minds" from 2000 about some other consequences: http://disciplinedminds.tripod.com/ "In this riveting book about the world of professional work, Jeff Schmidt demonstrates that the workplace is a battleground for the very identity of the individual, as is graduate school, where professionals are trained. He shows that professional work is inherently political, and that professionals are hired to subordinate their own vision and maintain strict "ideological discipline." The hidden root of much career dissatisfaction, argues Schmidt, is the professional's lack of control over the political component of his or her creative work. Many professionals set out to make a contribution to society and add meaning to their lives. Yet our system of professional education and employment abusively inculcates an acceptance of politically subordinate roles in which professionals typically do not make a significant difference, undermining the creative potential of individuals, organizations and even democracy. Schmidt details the battle one must fight to be an independent thinker and to pursue one's own social vision in today's corporate society."

Or Philip Greenspun from 2006: http://philip.greenspun.com/careers/women-in-science "This is how things are likely to go for the smartest kid you sat next to in college. He got into Stanford for graduate school. He got a postdoc at MIT. His experiment worked out and he was therefore fortunate to land a job at University of California, Irvine. But at the end of the day, his research wasn't quite interesting or topical enough that the university wanted to commit to paying him a salary for the rest of his life. He is now 44 years old, with a family to feed, and looking for job with a "second rate has-been" label on his forehead. Why then, does anyone think that science is a sufficiently good career that people should debate who is privileged enough to work at it? Sample bias."

Or the Village Voice from 2004 about how it is even worse in the humanities than sci/tech grad school: https://web.archive.org/web/20130115173649/http://www.villag... "Here's an exciting career opportunity you won't see in the classified ads. For the first six to 10 years, it pays less than $20,000 and demands superhuman levels of commitment in a Dickensian environment. Forget about marriage, a mortgage, or even Thanksgiving dinners, as the focus of your entire life narrows to the production, to exacting specifications, of a 300-page document less than a dozen people will read. Then it's time for advancement: Apply to 50 far-flung, undesirable locations, with a 30 to 40 percent chance of being offered any position at all. You may end up living 100 miles from your spouse and commuting to three different work locations a week. You may end up $50,000 in debt, with no health insurance, feeding your kids with food stamps. If you are the luckiest out of every five entrants, you may win the profession's ultimate prize: A comfortable middle-class job, for the rest of your life, with summers off. Welcome to the world of the humanities Ph.D. student, 2004, where promises mean little and revolt is in the air."

The odds of success are probably even lower now with the expanding use of adjuncts to replace tenured faculty.

Of course, the irony is that US society now has more than enough wealth so that anyone who wanted to could live like a graduate student researching whatever they wanted on a basic income.


> Of course, the irony is that US society now has more than enough wealth so that anyone who wanted to could live like a graduate student researching whatever they wanted on a basic income.

Which is more or less my plan. Some details here:

https://news.ycombinator.com/item?id=16610362

> My plan right now is to save money, retire early, and then do whatever research I want that fits my budget. This avoids many of the problems with the current system, but is not possible for many.

> This would allow me to pursue more risky research (in the sense that the research may fail to produce useful results) than an assistant professor trying to get tenure could. I also wouldn't have to raise funds, so I could focus on projects I believe are important, not just what can get funded.

I wish this option were more widely known and accepted. Some people seem to think I'm insane to intentionally pursue this, but as far as I can tell they don't see that the ship they are on (academia/government research/etc.) is sinking. It would be nice to talk with other independent researchers of this variety, exchange best practices, etc.
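For anyone weighing a similar plan, the usual back-of-the-envelope feasibility check looks like this. Every number below is an assumption for illustration, not the commenter's actual budget, and the 4% figure is just the common safe-withdrawal heuristic, not a guarantee.

  # Rough feasibility check for self-funding research from savings.
  savings = 600_000         # hypothetical nest egg
  withdrawal_rate = 0.04    # common "4% rule" heuristic, not a guarantee
  annual_expenses = 24_000  # hypothetical grad-student-style budget

  draw = savings * withdrawal_rate
  print(f"Sustainable draw: ${draw:,.0f}/yr")         # $24,000
  print("Covers expenses:", draw >= annual_expenses)  # True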


I wish you the best, but there's no one to talk to because no one has succeeded doing it. No science is done by independent researchers. You'd be limited to a very tiny sliver of research that doesn't require staff or expensive equipment. You'd be doing research without the support network of peers in a department, or colleagues at a conference to talk to. We should not advertise this as an option since it might make the gullible think this is feasible to do.


That depends on what you mean by "independent." There is a space between standard faculty and hermit researcher.

Depending upon the field, you can do research, publish, go to conferences, have a network of peers, without being standard faculty. I know research professors who have very few of the standard faculty obligations, for example. I also know people who do research entirely on private funding, but this almost always requires significantly more than just savings from retiring early, and they still stay part of the academic community.

I agree that it is not something that should be advertised as an option, because it is very rare, but it does exist.


> I wish you the best, but there's no one to talk to because no one has succeeded doing it.

There are many examples. Charles Darwin is the most famous. In my field (fluid dynamics) Robert Kraichnan is also well known and influential.

https://en.wikipedia.org/wiki/Independent_scientist

Of course, plenty of cranks go this route too. It's an option, not a panacea.

> You'd be limited to a very tiny sliver of research that doesn't require staff or expensive equipment.

Not a problem for me as a theorist. And I believe that expensive equipment is overused in my field anyway.

I also disagree that this is a "tiny sliver" of the research. Computation and theory is roughly half the research in my field, and I suspect this is true for many fields.

If this option doesn't work for you, don't do it.

> You'd be doing research without the support network of peers in a department, or colleagues at a conference to talk to.

I disagree. Collaboration does not require "official" status, and neither does attending a conference. In my field, no one cares what your affiliation is. At worst I could start a consulting company, which I'd probably do anyway. Plenty of consultants attend conferences in my field and collaborate with researchers in government, industry, and academia.


> If this option doesn't work for you, don't do it.

No need to get overly defensive. I'm responding to your argument that this should be more widely known and accepted. Science gets harder to do every year as the lowest hanging fruit keeps getting plucked. Your examples aren't convincing -- Charles Darwin was born in 1809 and things were different back then. Your other example, according to Wikipedia, is someone who got a PhD at MIT, and applied for grants and held faculty positions at a number of universities.

But hey, if you end up doing this successfully, come back and tell us how it went. In the meantime, I'm going to continue disagreeing with you that this is a reasonable route to productively conduct research.


Low hanging fruit is one great example of where independent research can shine. The incentives are different, so what is considered "low hanging fruit" is also different. An independent researcher can focus on projects where the timescale is longer or the project is unlikely to be funded. With independent research being rare, I can see a lot of low hanging fruit for independent researchers which traditional academics would not touch.

I could provide an example of research which is disincentivized in traditional academia but incentivized in independent research from my own PhD research if you're interested.

> Your other example according to Wikipedia is someone who got a PhD at MIT, and applied for grants and held faculty positions at a number of universities.

Kraichnan was an independent researcher from 1962 to 2003, the majority of his career. Yes, he was affiliated with a university at multiple points of his career, but he spent 4 decades as an independent researcher and produced some of his most important work during that time.


> I could provide an example of research which is disincentivized in traditional academia but incentivized in independent research from my own PhD research if you're interested.

I'd be interested, if possible.


Sorry for the length.

Traditional academics have the "publish or perish" incentive. In practice this means that they prioritize "quick wins" over "slow wins", e.g., given a choice between publishing 1 paper after 1 year (quick win) or publishing 5 papers after 5 years (with no publications before then) (slow win), they'll choose 1 paper after 1 year. If an academic goes too long without a publication, that will be counted against them. The low hanging fruit for quick wins has been taken due to this incentive, but I see no shortage of slow wins. (The scenario I describe is an extreme case, but the same incentive still exists in less extreme cases.)

There's also the problem that what can get funded is not necessarily what's most important. Norbert Wiener discusses this at length in his 1950s book "Invention". Wiener notes that despite the obvious political differences between the USSR and US, research funding is allocated similarly: by people too far removed from the actual research, who are often not in a good position to evaluate its merit. It doesn't matter if these people are managers, bureaucrats, or fellow scientists. There's generally an asymmetry in information between the scientists requesting funding and those able to provide it. (Having more time to learn about each proposal could help, but the trend I imagine is that time available to review each proposal has decreased over the years.) This ignores the lottery-like nature of the entire funding process.

To get more specific, both of these problems would disincentivize a traditional academic from publishing this paper I recently submitted to a conference:

https://engrxiv.org/35u7g

In principle, a traditional academic could have written this paper. It's possible, but I think less likely because of "publish or perish" incentives. (To be clear, I am a PhD student right now, and for most of the time I was doing the research in the paper, I was a TA. I don't have the "publish or perish" incentives that make this research less likely. If I stayed in academia longer I would.)

My advisor and I tried to get funding for this project, but our grant proposal (I wrote the vast majority of it) was rejected for reasons beyond our control (which I have no problem with). We received positive comments on the proposal, and it served as a draft my PhD proposal.

Without going into detail, the paper develops a simple mathematical model of a certain physical process. The theory and its validation would not have been possible unless I did two things that traditional academics seem to think are a waste of time:

1. Very comprehensive literature review.

2. Very comprehensive data compilation.

Now, I think most people would believe these two are just what academics do. But apparently not. Traditional academics are incentivized to do the bare minimum to get another publication. There's an epidemic of copying citations and merely paraphrasing the review sections of papers without reading the originals, and I think this is caused partly by these incentives.

The literature review I did (not all of which made it into the short conference paper) was considerably more comprehensive than any I've seen published in the field before, and I was able to synthesize past theories and improve upon them by recognizing some of their flaws.

How do I know I was more comprehensive? One way is by the excellent papers I found which few seem to be aware of. In the paper I mention, papers 3 through 8 have very few citations. Some of them have not been cited at all in the past 40 years to the best of my knowledge. Someone could say that these papers are just unimportant, but they're not. In my view they're "sleeping beauties":

https://www.nature.com/news/sleeping-beauty-papers-slumber-f...

Further, I spent a year or two alone digging deeper and deeper into the literature in this problem. There were several times when I thought I probably had at least touched everything, but a few weeks later I found yet another area that I had missed. Being comprehensive is difficult and time consuming. If you just want the minimum to publish, you won't bother.

I also benefited from certain heuristics which allowed me to identify important neglected research. For example, I spent a lot of time tracking down foreign language papers and books because I recognized that this research was avoided because it was written in a different language, not because it was bad. The entry costs to foreign language literature have dropped greatly over the past decade with options like Google Translate. I've translated around 10 full papers into English right now, and produced many more partial translations. These papers have provided critical insights that were necessary towards writing this paper. At this point some people I know use the fact that I like reading foreign language papers as a joke. Traditional academics think this is absurd, but I see that there's value, just that it takes time to be realized.

It was through my comprehensive literature review that I got the idea behind my data compilation. By taking advantage of the properties of a special case, I was able to get information that most researchers in this field seem to believe is extremely difficult and expensive to obtain. I did not come up with the idea myself. I was translating a 1960s Russian language paper into English when I realized based on what was written in one paragraph that I could use the properties of a special case to get some hard to obtain information. The author was actually leading into this. The next paragraph explicitly said the author was taking advantage of the properties of a special case. So it wasn't very original on my part. The 1960s Russian researcher didn't have a lot of data to use, but there's a lot now 50 years later.

So I started compiling data. I get the impression that few academics would have compiled even half as much as I did, or have been even half as careful as I have about it. I was very careful to select only the least ambiguous data sources. Out of over 100 candidate data sources, there were only around 20 which were acceptable. I then took the time to carefully transcribe all of the relevant data from these sources, and develop a computational framework to handle this data (based on Python and Pandas). It was probably at least 6 months of work, but I can produce several papers based on it, so it's worthwhile in my view. My advisor was not initially enthusiastic about compiling this data, by the way. He's a successful traditional academic, however, and his intuitions are calibrated differently than mine are.
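To make the compilation step concrete, here is a minimal sketch of what such a Pandas-based screening workflow might look like. The file name, column names, and filter criterion are hypothetical; the commenter does not describe their actual framework in this detail.

  # Hypothetical Pandas screening workflow: one row per transcribed
  # measurement, tagged with its source and an ambiguity judgment.
  import pandas as pd

  records = pd.read_csv("transcribed_measurements.csv")  # assumed file

  # Keep only measurements from sources judged unambiguous.
  clean = records[records["ambiguity_flag"] == "none"]

  # Per-source summary of what survives the screening.
  summary = clean.groupby("source_id").agg(
      n_points=("value", "size"),
      year=("publication_year", "first"),
  )
  print(f"{records['source_id'].nunique()} candidate sources, "
        f"{len(summary)} acceptable")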


At the very least, you need someone to bounce ideas off of. I used to think I was an introvert, until I went to grad school and tried living in my own head for months at a time.

I clicked on that link and I noticed that Darwin and Kraichnan both had institutional affiliations.

Maybe possible if you're paying your own way. Lots of newly-minted professors need those small grants to get their research going. Find one who can see you as a colleague and not as an ATM.


Almost all good independent researchers had some sort of institutional support at some point during their careers. What makes them mainly independent researchers is that they spent a large fraction of their careers without institutional support, not that they never had any.


Increasingly, you can simply contract out the physical end of experimentation to dedicated commercial labs (at least in the biological sciences). It takes a bit of money (well, a lot, even) but it's much friendlier to small-scale enterprises, among its other advantages.


> tiny sliver of research that doesn't require staff or expensive equipment

Like anyone who presents at tech conferences.


I wasn't being dismissive of tech conferences, I genuinely meant there's a lot of valuable research done by small teams. Sure, it's not work with the large hadron collider, but assuming the only thing that advances human knowledge is big budget research is just plainly false.

You're on Hacker News for Knuth's sake. The entire premise baked into the word hacker is that enterprising smart people can change the world with just a little determination and hustle.

Even the hackers out there not making the next breakthrough technology but emulating and testing some arcane ICS, it's crazy to warn people away from that as risky and possibly not useful.

Every career is possibly not groundbreaking, every path has that risk. People should be realistic, sure, but if someone wants to support themselves while they do interesting research, I have no objections to that.

Do what you love, this isn't someone whose hopes and dreams are contingent on starring in the next Hollywood blockbuster, it's someone who wants to geek out on their own dime. Great! Amazing! Tell HN how it goes and most of us will love it, world changing or even just something fascinating to you, we have your back.


I have a similar plan!

I think the opportunities in tech are promising towards this end, and would be interested in getting in contact with other people with similar plans.

I'm in the "build skills, save money" phase currently, and probably will be for > 5 years to come. So that is my main focus right now.


Sounds like we're at a similar place. Feel free to contact me. My email address is in my profile here.


I feel like there's an opportunity for a web platform for the collaboration of (independent) researchers. Some place where I could propose an idea and others could tell me why that idea doesn't work or if it's already been done. And if the idea doesn't have any flaws and hasn't already been explored, then other researchers could help me perform the research or "fork" the project to go down different paths.

I think some platforms come close. There are places to ask questions (e.g. StackExchange) but they don't seem to like "what if" questions and if the idea is good there's no good way to follow up over months or years. Github works for some fields where the idea is tightly coupled with the implementation (e.g. some computer systems fields) but not for others.

I guess I would just like a place to discuss ideas. In academia it's common to discuss nascent ideas within your lab, but larger collaboration happens at things like conferences where you're presenting only the ideas that worked out. I think such a platform could make independent research like you're describing much more effective and attractive.


When I mentioned your comment to my wife, she suggested that Reddit has some subreddits a bit like that. A starting point: http://www.refinethemind.com/reddit-mind-44-smart-subreddits...

That said, you have a great insight here with the idea that some places on the web could replace (or at least supplement) the traditional advisor/advisee relationship with a more peer-to-peer approach -- especially for independent researchers.

That issue of how to follow up over months or years seems key as a difference from more casual one-off interactions. Well-run mailing lists or forums may provide some of that continuity -- but maybe there is a social or technical way to go further in that direction?

I'm not sure if there ever would be just one place to discuss ideas, but you might want to bring up this general idea of some new platform with Michel Bauwens of the P2P Foundation: https://en.wikipedia.org/wiki/Michel_Bauwens


My whole life, I've wanted to be a "gentleman scientist". Just went back to university to study Physics, with Bio on the side. I'm going to run out of money in 1-2 years unless the trading system works out, or I get another real jorb. Good luck. Keep the dream alive!

P.S. 42, with ADHD. Started University at 15, started high school at 11. Great at getting jobs. Not so great at keeping them. :P


What do you think of having your research funded through platforms like https://experiment.com/?


Browsing the projects on the site, I'm amazed by the fraction which have full backing. Much higher than I would have expected. But the total amounts are mostly too small for someone intending to live off them, unless they have a large number of projects at once.

From a diversification standpoint I think this should only be one source of one's funding. The bulk of my planned funding is going to come from savings from a job. I have the most control over that, and it's a much larger source. I've also considered working as a lecturer from time to time as engineering lecturers seem to be reasonably well paid (at my university, ~$10K per class). Might as well take advantage of the rise of adjuncts. You can travel to different universities regularly for collaboration this way too. Some more permanent lecturer positions are fairly decently paid from what I understand and may be a decent way to save money while also having opportunities for research. The research is the goal, not the title of "independent researcher".

Also, having to solicit funding regularly is something I'd rather not do. It takes away time from research, and I'd like to focus on research which is not so easily funded. To go back to the "low hanging fruit" point mentioned elsewhere in this comment tree, I think there are many research topics which don't sound good to a third party but are actually good. It can be hard to convince people of this. The easiest way to move forward on these research topics is to risk your own money. And with the most easily funded ideas taking priority, I can see many examples of "low hanging fruit" in my own field.

The Patreon model could get around the "research not sounding good enough to fund" problem. Pay an individual to do work in general, not specific work. But aside from someone working on topics of popular interest (e.g., gwern), I don't think this would work.


This is basically what I do and recommend. After saving up around €100K and continuing to freelance 2 months per year, I spend most of my time on research projects.

It's too bad there isn't more of an independent academic community, but it sure beats pressure to publish and writing grant applications.


This is something I used to think about too, but how do you do research without learning about prior work, and how do you learn about prior work without an advisor to guide you?


Follow the paper trail. Find a research topic you are interested in and find an interesting paper on Google Scholar.

Then look up some of the references, the papers that cite it, or other work from the same author. Keep applying this one tactic and you'll easily find years' worth of reading to do.

Beyond the abstracts, you'll need access to the full papers. This used to be an issue, but luckily now we have sci-hub.tw and paperdownloader.cf.
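If you want to automate that tactic, here's a minimal sketch in Python of a breadth-first walk over the citation graph. It assumes the public Semantic Scholar Graph API (the endpoint path, field names, and response shape here are my assumption from memory - check the current docs before relying on them), and the seed paper ID is hypothetical:

    import requests
    from collections import deque

    API = "https://api.semanticscholar.org/graph/v1/paper"

    def references(paper_id, limit=20):
        # Fetch up to `limit` references of one paper (assumed endpoint/shape).
        r = requests.get(f"{API}/{paper_id}/references",
                         params={"fields": "title,year,paperId", "limit": limit})
        r.raise_for_status()
        return [row["citedPaper"] for row in r.json().get("data", [])]

    def paper_trail(seed_id, max_papers=50):
        # Breadth-first reading list starting from one interesting paper.
        seen, queue, reading_list = {seed_id}, deque([seed_id]), []
        while queue and len(reading_list) < max_papers:
            for ref in references(queue.popleft()):
                if len(reading_list) >= max_papers:
                    break
                pid = ref.get("paperId")
                if pid and pid not in seen:
                    seen.add(pid)
                    queue.append(pid)
                    reading_list.append((ref.get("year"), ref.get("title")))
        return reading_list

    # print(paper_trail("HYPOTHETICAL_PAPER_ID"))

Swap in whatever metadata source you prefer; the point is just that references form a graph you can walk mechanically, leaving your attention for the actual reading.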


All disciplines have journals, and many are either public access or available on sci-hub. If all else fails you can get a limited JSTOR subscription for only a moderately outrageous sum.

Mostly you do research by being interested in something. If you need someone to tell you what to be interested in, a PhD may not be for you.

A good supervisor will tell you to be interested in things you may not have considered. But empirically, most supervisors will steer you in the direction of their own interests.

These may or may not match your own interests. The mismatch is at least as likely to be a bad thing as a good one.

In my own experience, I decided to work independently instead of starting a PhD. There are only a couple of directly relevant journals, and I literally skimmed every issue, reading and taking notes on the papers that counted as prior art.

Those papers often cited other papers outside the immediate domain, so I followed them up - and that’s how you start.

I have a pretty good idea of the directions I’d be steered in if I was being supervised, and near certainty that those are not the directions I want to explore.


> Mostly you do research by being interested in something. If you need someone to tell you what to be interested in, a PhD may not be for you.

Maybe I'm misinterpreting but this comment seems a bit condescending.

The thing is, it's not really research unless it's something new, and you can't know if something is new unless you know what already exists.

> All disciplines have journals, and many are either public access or available on sci-hub. If all else fails you can get a limited JSTOR subscription for only a moderately outrageous sum.

Sure, you can have access to journals, but how do you even know what you should search for? I suppose this is sufficient if what you want to research is something that is currently mainstream?


> Sure, you can have access to journals, but how do you even know what you should search for? I suppose this is sufficient if what you want to research is something that is currently mainstream?

I "research" stuff all the time, do a google search for some term and find an interesting paper then go through the references that seem interesting. Works better than you'd imagine, I was looking up CESK machines and down through the rabbit hole I eventually found the original paper with the non-obvious name of The Calculi of λ-v-CS Conversion: A Syntactic Theory of Control and State in Imperative Higher-Order Programming Languages. Honestly, those math-heavy CS papers tend to make my head hurt though.

I don't think many people would claim lambda calculus is mainstream, but I can spend hours upon hours finding stuff to read, and most (or all) of the papers are easily downloaded from the authors' websites.


> If you are the luckiest out of every five entrants, you may win the profession's ultimate prize: A comfortable middle-class job, for the rest of your life, with summers off.

That's pretty good, actually. Your chances are far less than 10% in physics, and that's for postdocs, not students - and that's assuming you're from a top university with a good publication history in the top of the top journal(s).

Edit: For all the replies that focus on "summers off": I'm not saying you have summers off; that's part of something I quoted. I'm well aware, from experience, that you don't get summers off. Heck, you can't even take time off after a baby unless you want to jeopardize your chances at tenure. If you read the portion that I wrote, you can see that my reply is about job security (i.e., tenure).


As a former professor, I can tell you summers are the busiest work times as that is when you are trying to catch up on all the research activities you put off during the semester.

It is nearly impossible to find time to actually take a vacation. I had more than 6 months banked when I quit.

Edit: To add to your edit, tenure is far less secure than it appears from outside. It allows you to get away with being a moderate pain, but if you get too bad, what they do is make your position redundant in the next departmental reshuffle. Since these occur about every 2 to 3 years, tenure is more illusory than real.

You not only have to worry about your own position being made redundant, but about your whole department. It is easier for admin to get rid of a few troublemakers by killing the whole department than by going after the troublemakers directly. A fair amount of my admin/politics time was spent defending the entire department rather than worrying about my own position directly.


As a current one, I can also tell you the same thing. Where did you get the idea that I am talking about summers being off?


I know you were quoting the OP; I was just adding to your post that professors don't get the summer off, based on my personal experience.


Interesting, do tell more please.

How much of a professor's life is showing that they are doing versus actually doing?

What aspects are better and what worse? And which are popularly misunderstood (like summer vacation)?


Professors have three full-time jobs: administrator, researcher (well, supervisor of the people actually doing the research, but this is very time-consuming if done right), and teacher. Each job is more than 35 hours a week (the time I was nominally paid for).

I was deliberately terrible at the administration side and outright refused to do anything unless it meant more money for my department (the one good thing about tenure is you can be a pain in the backside and not get fired), but even spending less than 10 hours a week on admin, my work week was rarely less than 80 to 90 hours.

One of the major reasons why I quit academia and went back to my startup was to have more time to spend with my family.


> One of the major reasons why I quit academia and went back to my startup was to have more time to spend with my family.

That alone seems like a rather damning indictment of academia.


Yep. I only work around 60 hours a week now.


Can you explain why you made a point to say "supervise students"? From your statement it looks like there is no expectation to do original/independent research. Is that the standard practice too?


I didn't say "supervise students", so it was not a point I made??

I of course did supervise students. This does vary by field, but done right, each student needs about 5 hours a week of one-on-one direct engagement (more at the beginning of their degree and less at the end). Less time than this and the student will struggle.

You can of course do what some of the big empire builders do and just let the student sink or swim on their own, but this is not good for the student. Good supervision is very time consuming, but probably the most rewarding aspect of being an academic.


I was referring to this:

> researcher (well supervision of people actually doing the research

You explicitly distinguished the two.

And thank you for your answers.


I am not danieltillett, but I am familiar with the difference between doing research and supervising research. A tenure-track professor is expected to advise students who do research. The professor's involvement in the day-to-day of the research itself will vary depending on the student's experience, how large the professor's group is, and what the professor wants. While I do know of some professors who did almost an equal amount of day-to-day research work as their student on a project, that is rare, and such professors can typically only have one or two students at a time.

Instead, a professor usually spends more time supervising research. This is typically because a professor has multiple students, and as danieltillett explained, they already have many other time commitments. But this is also because the point of the entire process is for the students to become independent researchers. That is, the research supervision is teaching their students how to be researchers. In my experience, professors in science and engineering tend towards being managers rather than being in the lab or at the keyboard.

I presume danieltillett made this distinction because there is a difference between supervising and doing the research.


Also your summers aren't off, necessarily. If you're at a research institute, it's not like the group of PhD students and postdocs you manage vanishes.


Seems like the [VC funded] startup grind. Working college grads to death with the hope of being Mark Zuckerberg (or maybe not anymore). But mostly to make billionaires even richer.

The irony is that US society now has more than enough wealth so that anyone who wanted to could live like a startup "founder" making whatever app they wanted on a basic income.


I think your use of the term "startup" would be more accurate if you said "VC funded startup".


Agreed. I made the edit.


I mean, it sounds like people have been confidently predicting the demise of academia for 25 years, and it has yet to happen. And I don’t know about you, but it certainly seems like things have still been growing exponentially since '94.


Insightful. Congress probably saw that as a blueprint for how to oppress nerds.


You give Congress far too much credit. They’re just pushing policies that sound good on the campaign trail, same as ever. And nerds probably cheered them on. These kinds of effects are hard to understand even after they happen, let alone to see coming.


> You may end up $50,000 in debt

Remember when $50,000 sounded like some kind of scary worst case scenario for student debt?


Thank you for this: the most telling aspect for me is the history of it all. This is not a new problem. I should like to look at the follow-ups / aftermaths of each of these sources. I wonder what excuse(s) academia decided on.


You're welcome. I agree -- it would be great to see followups on each.

I don't think there was much response to any of them though.

As Upton Sinclair wrote about a century ago: "It is difficult to get a man to understand something, when his salary depends on his not understanding it."

How does that apply to the exploitation of starry-eyed graduate students? There may be vast amounts of self-serving denial of the pyramid-scheme aspect of much of academia.

Like George Orwell wrote in 1946: "The point is that we are all capable of believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right. Intellectually, it is possible to carry on this process for an indefinite time: the only check on it is that sooner or later a false belief bumps up against solid reality, usually on a battlefield. ..."

I'd guess the outcome will continue to be mainly gradually increasing pain for all involved. Human systems seem to be able to tolerate a large amount of needless suffering when there is no obvious credible alternative and there are still some positive aspects of the current system. Related: https://www2.ucsc.edu/whorulesamerica/change/science_freshst...

Likely things will keep going on a downward trend until some significant shock causes a massive reorientation of resources. Alternatively, the shock may just be the crossing of various trend lines like increasing student debt versus decreasing graduate opportunities to the point where no one could justify the cost as an investment.

See also a few different theories of social collapse which could be applied to understanding possibilities for academia: https://en.wikipedia.org/wiki/Societal_collapse#Theories

And: https://en.wikipedia.org/wiki/Dark_Age_Ahead "Dark Age Ahead is a 2004 book by Jane Jacobs describing what she sees as the decay of five key "pillars" in "North America": community and family, higher education, science and technology, taxes and government responsiveness to citizen's needs, and self-regulation by the learned professions. She argues that this decay threatens to create a dark age unless the trends are reversed. Jacobs characterizes a dark age as a "mass amnesia" where even the memory of what was lost is lost."

And we are seeing that sort of amnesia in the USA in academia and other places -- where fewer people remember what academia used to be like decades ago.

Just like there is a growing amnesia where fewer people remember what it was like to go to school in the USA back in the 1960s when kids were taught how awful the USSR was because it kept its citizens under constant surveillance...

But we still might hope for a gradual transition to other ways of organizing research and discussion like via the internet (such as Hacker News) -- but people still need to somehow get enough available time to participate in productive ways.

And the original Nature article is an example of an attempt at self-correction.

Here are a couple of recent satires on academia, both from 2013:

From Amazon: "Option Three: A Novel about the University by Joel Shatzky -- When Acting Visiting Assistant Professor L. Circassian is fired and rehired in the same week (with a 35 percent pay cut), he is only at the beginning of a cycle of abuse and professional debasement at the university. Joel Shatzky has created an hilarious novel about the corporatization of higher education - a book filled with blowhard deans, corrupt politicians, grasping CEOs, inept union officials, inappropriately dressed students, and scholars in donkey ears."

https://www.lawrenceswittner.com/books/whats-going-uaardvark "What’s Going On at UAardvark? by Lawrence S. Wittner -- What’s Going On at UAardvark? is a fast-paced political satire about how an increasingly corporatized, modern American university becomes the site of a rambunctious rebellion that turns the nation’s campus life upside down."

Both relate to this Atlantic essay from 2000: https://www.theatlantic.com/magazine/archive/2000/03/the-kep... "The Kept University: Commercially sponsored research is putting at risk the paramount value of higher education—disinterested inquiry. Even more alarming, the authors argue, universities themselves are behaving more and more like for-profit companies."

Here is an essay I wrote mostly around 2001 on one way to fix one negative aspect of much of modern academia and other not-for-profits supposedly dedicated to working in the public interest: http://pdfernhout.net/open-letter-to-grantmakers-and-donors-... "Foundations, other grantmaking agencies handling public tax-exempt dollars, and charitable donors need to consider the implications for their grantmaking or donation policies if they use a now obsolete charitable model of subsidizing proprietary publishing and proprietary research. In order to improve the effectiveness and collaborativeness of the non-profit sector overall, it is suggested these grantmaking organizations and donors move to requiring grantees to make any resulting copyrighted digital materials freely available on the internet, including free licenses granting the right for others to make and redistribute new derivative works without further permission. It is also suggested patents resulting from charitably subsidized research also be made freely available for general use. The alternative of allowing charitable dollars to result in proprietary copyrights and proprietary patents is corrupting the non-profit sector as it results in a conflict of interest between a non-profit's primary mission of helping humanity through freely sharing knowledge (made possible at little cost by the internet) and a desire to maximize short term revenues through charging licensing fees for access to patents and copyrights. In essence, with the change of publishing and communication economics made possible by the wide spread use of the internet, tax-exempt non-profits have become, perhaps unwittingly, caught up in a new form of "self-dealing", and it is up to donors and grantmakers (and eventually lawmakers) to prevent this by requiring free licensing of results as a condition of their grants and donations."

And here is a book-length essay by me from 2008 on how to rethink Princeton University as a mental-health-promoting post-scarcity institution: "Post-Scarcity Princeton, or, Reading between the lines of PAW for prospective Princeton students, or, the Health Risks of Heart Disease" http://pdfernhout.net/reading-between-the-lines.html "The fundamental issue considered in this essay is how an emerging post-scarcity society affects the mythology by which Princeton University defines its "brand", both as an educational institution and as an alumni community. ... We can, and should, ask how we can create institutions that help everyone in them become healthier, more loving, more charitable, more hopeful, more caring..."

So essentially, if we want the better parts of old academia from the US 1950s-1970s back, there will need to be some radical changes. As G.K. Chesterton wrote in 1908: http://www.ccel.org/ccel/chesterton/orthodoxy.x.html "We have remarked that one reason offered for being a progressive is that things naturally tend to grow better. But the only real reason for being a progressive is that things naturally tend to grow worse. The corruption in things is not only the best argument for being progressive; it is also the only argument against being conservative. The conservative theory would really be quite sweeping and unanswerable if it were not for this one fact. But all conservatism is based upon the idea that if you leave things alone you leave them as they are. But you do not. If you leave a thing alone you leave it to a torrent of change. If you leave a white post alone it will soon be a black post. If you particularly want it to be white you must be always painting it again; that is, you must be always having a revolution. Briefly, if you want the old white post you must have a new white post. But this which is true even of inanimate things is in a quite special and terrible sense true of all human things. An almost unnatural vigilance is really required of the citizen because of the horrible rapidity with which human institutions grow old. It is the custom in passing romance and journalism to talk of men suffering under old tyrannies. But, as a fact, men have almost always suffered under new tyrannies; under tyrannies that had been public liberties hardly twenty years before. Thus England went mad with joy over the patriotic monarchy of Elizabeth; and then (almost immediately afterwards) went mad with rage in the trap of the tyranny of Charles the First. So, again, in France the monarchy became intolerable, not just after it had been tolerated, but just after it had been adored. The son of Louis the well-beloved was Louis the guillotined. So in the same way in England in the nineteenth century the Radical manufacturer was entirely trusted as a mere tribune of the people, until suddenly we heard the cry of the Socialist that he was a tyrant eating the people like bread. So again, we have almost up to the last instant trusted the newspapers as organs of public opinion. Just recently some of us have seen (not slowly, but with a start) that they are obviously nothing of the kind. They are, by the nature of the case, the hobbies of a few rich men. We have not any need to rebel against antiquity; we have to rebel against novelty. ..."

Or for a more modern take on that, from 1963, as John W. Gardner said in "Self-Renewal: The Individual and the Innovative Society", every generation needs to relearn for itself what the words carved into the stone monuments mean. He says essentially that fundamental values are not some long-ago-filled-but-now-running-out reservoir from previous generations but a reservoir that must be refilled anew by each generation in its own way.

Without necessarily approving of their specific actions, value re-creation is something that people like the late Aaron Swartz (taking on JSTOR and MIT with his efforts) and Alexandra Elbakyan (taking on Elsevier with Sci-Hub) were and are trying to do. Richard Stallman with the GPL and GNU Manifesto from 1985 as a response to proprietary software agreements in academia is another less-controversial example because he worked within the existing copyright laws. So are -- also less controversially -- Wikipedia, Hacker News, Reddit (Swartz again), Slashdot, Archive.org, GitHub, and many other internet-mediated venues -- which are creating ways to have discussions and learn about sci/tech/humanities topics outside of formal academia. They are all essentially treating formal academic systems as-we-know-them-in-practice as damage and routing around them.


> Jeff Schmidt demonstrates that the workplace is a battleground for the very identity of the individual, as is graduate school, where professionals are trained. He shows that professional work is inherently political, and that professionals are hired to subordinate their own vision and maintain strict "ideological discipline."

I'll need to read that book, thanks. In the meantime, I was also reminded of things Erich Fromm wrote, or said, in this case:

> Our main way of relating ourselves to others is like things relate themselves to things on the market. We want to exchange our own personality, or as one says sometimes, our "personality package", for something. Now, this is not so true for the manual workers. The manual worker does not have to sell his personality. He doesn't have to sell his smile. But what you might call the "symbol-pushers", that is to say, all the people who deal with figures, with paper, with men, who manipulate - to use a better, or nicer, word - manipulate men and signs and words, all those today have not only to sell their service but in the bargain they have to sell their personality, more or less. There are exceptions.

-- Erich Fromm in an interview, https://www.youtube.com/watch?v=Cu-7UDT0Xe4&t=1m34s


Thanks for the interesting link to the Erich Fromm interview and the quote from the first half. The second half, about pursuing happiness through engagement in society (rather than leaving it to "experts"), is also quite interesting.


You're most welcome. I found this too late to edit the comment, but here's the full interview and a transcript:

https://www.youtube.com/watch?v=OTu0qJG0NfU

http://www.hrc.utexas.edu/multimedia/video/2008/wallace/from...


Does this remind others of other enterprises, like beginning video game programmers? When you're ready to leave because of the abuse, there is always another new person happy to take over.

The supply of new labor in the academic market seems endless. I have spoken to ~200 incoming grad students about careers, and never once did it sink in. My opinion is that it must be corrected earlier up the chain, in high school or early college.


That's like asking why there was an aristocratic court in Heian Japan when their skills in prosody and haiku were clearly not viable as jobs. I don't think every degree has to land a job, and education in itself must be a strong goal for our nation, not just economic feasibility. Don't be ruled by the Almighty Dollar.


> education in itself must be a strong goal for our nation

Too bad it isn't. Education is being used as a yardstick to isolate the poor from the economy. Don't have a BS? Guess you aren't qualified to work at McDonalds.

And for those that do get those degrees, like you said, haikus aren't a viable job. A lot of the history and literature majors of the world were spoon-fed what their parents thought - "get a degree, any degree, and you are ahead in the world" - only to find out that "ahead" didn't mean comfortable living. It just meant you had higher priority for the limited pie of work than those who didn't have that generic piece of paper.

If you want people not to be ruled by the dollar, change society so your ability to be safe and secure in your day to day survival isn't bound so tightly to the pursuit and amassment of said dollar.


Decoupling Safety and Security from the Almighty Dollar is a noble pursuit.


I agree, but if we are going to agree to not be ruled by the almighty dollar, we have to stop giving so many almighty dollars to universities for degrees that will produce no almighty dollars after graduation.

Let people study fine arts with a minor in Russian literature. But we shouldn't subsidize huge student loans so they can get those degrees.


> “The arts are essen­tial to any com­plete national life. The State owes it to itself to sus­tain and encour­age them…. Ill fares the race which fails to salute the arts with the rev­er­ence and delight which are their due.” -- Churchill, 1938

We could debate how much subsidy should be allocated for arts vs. science, but I'd prefer that the wealthiest countries didn't restrict arts. They/we can afford it.


The arts won't go away if the US stops subsidizing them. Universities will just charge less for them.


Sure, but staying in school until your 30s is a bit ridiculous. Only so many people can do it. Think about it: you're living off of somebody else's dime until almost 40.


If you are actually trying for academia (and are considered to have a good chance of making it), rather than being in academic wrappings around a professional education, you should have consistent TA/RA/fellowships. And while it can be argued that, in living off research funding, you're living off someone else's dime, aren't you likely going to be living off that same dime for the rest of your life?


Strongly agree. I got my MSc in Biological Oceanography, and it was mostly a waste of time and money. I’d imagine ~90% of my graduating class feel the same way.

I transitioned into data analytics, then data engineering, and now customer facing software development, and love my career, but the MSc has contributed very little to it.


I wonder if things are just as brutal when you take industry into account as a place to wind up post-PhD. There are research labs in industry that are interested in hiring PhDs. But maybe this is just the rose coloured glasses we have in CS.


> when there is no domestic market for most PHDs.

Isn't this common knowledge, though? It's not like the NSF is telling people not to google their career prospects before embarking on the journey.


I have a feeling most people don't really accept it applies to them, if the general atmosphere differs from the stats.


Just to clarify, I am faculty. I just don't take students ;P


Isn't teaching part of the definition of faculty? In other words, isn't it impossible to be 'faculty' if you don't take 'students' (i.e., 'teach')? At least anywhere I've ever heard of, this is true; someone who doesn't have students is considered 'staff'.


> Isn't teaching part of the definition of faculty?

No - I'm not sure where you got that definition from. You can be a researcher and be on the faculty of a department. It just means as opposed to the support members of staff.


I'm guessing you're either tenured or emeritus, or you're a research professor. You don't have that luxury on tenure track almost anywhere.


TT faculty at SLACs don't take graduate students (by and large) and still get tenure...


So, as tenured faculty, I don't know about the NSF advocating for careers in STEM. Maybe that's a good thing, maybe it's a bad thing.

What I do see driving this ultimately is something broader and more systemic, which is universities being underfunded, administrator heavy, and being run as for-profit businesses.

In many ways, this all reduces to how grants are awarded and funded, and the history of grant funding.

Through the 80s and 90s, and to some extent the early 2000s, federal grant funding increased. There was sort of a heyday in that time, depending on the field, where there were good funding opportunities, in general, for faculty.

At the same time, state funding to universities started decreasing, and costs started increasing due to the higher-ed bubble. Because of indirect costs on grants, which allow universities to charge the federal government a blanket percentage of a grant, universities were able to skim off a profit almost automatically. It seemed like a win-win: faculty got to fund their research, and universities got a cut to offset decreased funding.

The problems are many, though. First, federal funding went down after a peak, so this money pool dried up. Second, this system sort of led to the pyramid scheme you mention, where grad students get recruited to maintain grants and research programs with nowhere to go, lest universities and faculty lose indirect funding. Third, it prioritizes research based on grant receipts rather than utility, which distorts research priorities in a perverse way (especially given nepotism in grant culture, which has been written about extensively elsewhere). Finally, because of the history of all of this, you kind of end up with a system in which those running the system are either beneficiaries of the 90s and 2000s, and don't realize what it is like now, or came up through that system and have a kind of survivorship bias.

Agencies like NSF and NIH are basically just maintaining any semblance of research in the US at the moment, making up for all the deficiencies that are occurring at the state level.

There are problems with NSF and NIH, but they're not with advocating for STEM grads. They're with how grants are funded. Basically, review panels need to randomly rotate in some kind of jury pool system almost; grants need to be awarded based on some kind of lottery system (proposed by former heads of NIH); indirect cost charges need to be eliminated or made line-item justified; some grants need to be awarded on the basis of publication record without regard to a specific proposal (ala Hungary); universities need to be adequately funded outside of the grant system; and tenure needs to be protected (I say this as someone who is probably going to step down from my tenured position soon).
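For the lottery idea in particular, the mechanics are simple enough to sketch. Here is a toy illustration in Python of a "modified lottery" (threshold review, then a random draw among everything above the bar) - this is my reading of the proposal, not any agency's actual procedure, and the field names and threshold are made up:

    import random

    def award_grants(proposals, slots, quality_bar=0.7):
        # Fund a uniform random draw from all proposals judged good enough.
        eligible = [p for p in proposals if p["review_score"] >= quality_bar]
        return random.sample(eligible, min(slots, len(eligible)))

    # e.g. award_grants([{"id": 1, "review_score": 0.8}, ...], slots=10)

The appeal is that it removes the incentive to game the fine-grained ranking: reviewers only have to decide "fundable or not", and luck does the rest openly instead of covertly.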

The problems facing grad students and postdocs are just the tip of the iceberg. They don't end, even when you get a tenure-track position, and even when you get tenure. The system is broken, it's a mess that favors hype and popularity over rigor. What you end up with is the current reproducibility crisis, academic fraud, people manipulating the system to overclaim credit, and many more problems that society is unaware of because they have a stereotyped idea of what happens in science--the sort of "great man in the lab with a eureka moment" which is a false model of scientific process.


I think you're not giving enough weight to the massive increase (as a ratio) in people seeking postgraduate education. Today about 12% of people have at least a master's. And that number continues to increase as education, even in fields without meaningful career prospects, is continually evangelized. By contrast, from 1975-1995 the percentage of people aged 25-29 who had even a bachelor's degree was only in the range of 21-25%. That number today is 36%, and again - growing. I doubt the ratio of gross 'revenue' (including grants) to total faculty is lower today than it was in the 'heyday.' Given the massive increases in tuition, I would be quite surprised if it's not vastly higher.

I completely and absolutely agree with you about the perversion of research directions, but I think this can be pinpointed more precisely to a single point you made: universities are increasingly being run as for-profit businesses. That is really the root of all of these problems.


Yes, and the massive increase in people seeking postgrad education was preceded by a massive increase in people offering postgrad education. AND at the same time, you remind me of working in a college where a project was started to launch a grad program, and the justification made in committee meetings was, "we need to get some grad students around here to teach the classes and free up our time. Otherwise we'll never get ahead."


You are correct about how the current system is the result of a draw-down of past heavy funding. The timing of the draw-down was different in different fields. An NSF official explained to me in 1990 that, in the 1960's, 90% of grant applications were funded. By 1990, 90% were rejected. The same is true for hiring: in 1945-1970, with the G.I. Bill and the explosion of new universities (e.g., in California), almost everyone with a Ph.D. was hired for a tenure-track position, and many tenure-track positions were held by professors without Ph.D.'s. As you say, faculty only very slowly learn that the systems they came up in are gone, and so they over-prepare young people for a world that no longer exists. The future may be a combination of the older tradition that professors start with inherited wealth and a new pattern in which an academic career begins with 10 years building wealth in business.


I guess I sleep fine at night and I am faculty. It's only a pyramid scheme if you think everyone wants to become faculty.


Talk about being in the bubble - Exhibit A.

People at the top of pyramid schemes get there by harming the people at the bottom and others outside the structure. One of the hardest lessons to learn in grad school was that those at the top can be incredibly nice, compassionate, affable people, but fantastically destructive.

If this topic doesn't concern you, I feel incredibly bad for your students and postdocs. It isn't a problem because it pops up on occasion. It's a problem because it's everywhere, and people with the power to change it soothe their consciences with straw men and pedantic aphorisms of tenuous relevance. The bias that helps you sleep at night allows the problems to fester.


Not shocking at all. One of the stunning things to me about my grad school friends' experiences is how many had stories of dealing with deranged, abusive faculty. Shit that would have me changing jobs in a heartbeat, but they couldn't really change PhD programs, and their departments would never do a thing beyond a pained look and a shrug of shoulders.

As an aside, if you want to be able to spot abusive behavior better, I strongly recommend this book:

https://www.amazon.com/Why-Does-He-That-Controlling/dp/04251...

It's the single most astute book I've ever read. The author spent years as a counselor for abusers, mostly there because a court ordered them there. It's clear that after listening to years of abuser bullshit he said, "I'm going to write it down." It's nominally targeted at women in abusive relationships with men, as that's what he dealt with most. But a lot of the lessons transcend the context. It helped me spot an abusive boss, for example. And the details on abuser motivations and how abuse cycles benefit the abuser have been very helpful to me in a work context as well.


Agreed. I think academia is due for its #metoo moment - not sexual abuse, but emotional abuse. Many professors, particularly the most prestigious ones (man or woman), are colossal a-holes.

Though most would say I’m flourishing in the ivory tower now as a professor at a top-tier med school, my training experiences were terrible. For example, I’d be rich if I had a dollar for every time I heard my former boss say “I don’t care about this {thing that you spent 5+ years on because I didn’t really allow you to work on anything else}”.


It's happening, off and on:

http://www.businessinsider.com/christian-ott-caltech-profess...

https://www.insidehighered.com/news/2017/12/15/uc-davis-prof...

http://www.slate.com/articles/health_and_science/science/201...

I think a lot of people in academia would be happy if the rampant sexual abuse would get some attention. I think we know that attention on the psychological and physical abuse is a pipe dream. It would be super good to get the rapists out, though.


I think there is a common economic structure in organizations that creates a-hole cultures.

Actors, PhD students, etc. enter into a superstar economy where many people compete for a very small number of positions with prestige, money, power, or fame (or whatever the measure of success is in the field).

Each position of value has a large number of potential candidates who are all good enough. It's difficult or impossible to rank the top applicants with high accuracy and objectivity. Those who are selected are receiving something of great value from their mentors for subjective or arbitrary reasons.

Normally you would use an auction or some other mechanism to sell the position, but actors or PhD students don't have that kind of money, and selling a position may be seen as ethically questionable because you are supposed to select the best.

The abusive culture is the result of this extra value being given out for subjective reasons. Movie actors can be expected to sell sex to cover the gap between their innate value and the position they are getting; PhD students can be used as slave labor for years.

To correct the situation, incentives should be aligned. If a PhD student works on something for 5+ years, it should cost something for the boss. A student dropping out of a PhD program should cost something for those who are in the position to select who gets in.
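The "impossible to rank the top applicants" part is easy to make concrete with a toy simulation (the distributions and noise level below are illustrative assumptions, not data): when evaluation noise is comparable to the real ability differences among the top candidates, the committee's pick is mostly luck.

    import random

    def p_best_candidate_wins(n=200, eval_noise=1.0, trials=2000):
        hits = 0
        for _ in range(trials):
            ability = [random.gauss(0, 1) for _ in range(n)]
            # The committee only sees ability through a noisy evaluation.
            score = [a + random.gauss(0, eval_noise) for a in ability]
            picked = max(range(n), key=lambda i: score[i])
            truly_best = max(range(n), key=lambda i: ability[i])
            hits += (picked == truly_best)
        return hits / trials

    print(p_best_candidate_wins())  # a small fraction: the winner is largely arbitrary

None of which means the winners are bad - everyone near the top is above the bar - only that treating the outcome as a pure merit ranking is hard to justify.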


Interesting that you focus only on actors and PhD students. What you describe is true of just about any field. The superstars are, by definition, rare. Sometimes they get there due to incredible skills, sometimes by luck, most often both. It's also often the case that the superstars are not necessarily the best.

This is true for superstar programmers, lawyers, basketball players, managers, machinists, and just about everyone.

It's difficult to rank anyone for almost anything, the best aren't always chosen, and people often win for talents unrelated to their core competency.

To sum it up... life isn't fair.


The difference is that in some fields there is a huge gap between the superstars and everyone else. Someone who almost became a star actor probably has a second job as a waiter. Someone who almost became a tenured professor is probably making peanuts as an adjunct. Someone who almost became a superstar programmer is probably still making six figures.


You are either overgeneralizing or focusing on the wrong aspects of superstars.

A functioning job market is an antidote to subjectivity. Basketball and many other sports have selection processes, auctions, and player markets. Multiple teams can evaluate the candidates, and there are binding contracts where money is exchanged.

(Sports in US colleges may be a form of abusing cheap labor, though.)


Every selection process or auction contains subjectivity. Ball players are not judged just on their stats, but on their pedigree, coaches, and endorsements, the compilation of which is subjective. Fine art, which is often sold at open auctions, is valued completely subjectively. Brands, connections, styles, and attitude are all attributes that are bought and sold, and are almost entirely subjective.

Attempting to remove subjectivity from the job application or valuation process is a worthy goal, but has not historically been terribly successful.


The mechanism I described does not arise from the subjectivity alone.


Academia will never have a #metoo moment. The power structure is basically feudal, and there’s no role inside professional academia for free agents. You owe your adviser fealty as a grad student, then your principal investigator as a postdoc. If you speak out before you’re a PI yourself, good luck with a career in academia. And even once you are a PI, that horrible, abusive asshole? They know a lot of people. They sit on editorial boards and grant committees. If you don’t end their career, they’ll at minimum severely damage yours. Film is a project-based economy with many relatively well-defined projects. Academia has very, very poorly defined ones. The economies of power and money are totally different.

And metoo worked because people care about sex. No one cares about harrowing psychological abuse. Stars being assholes to their assistants and others isn’t an open secret. It’s common knowledge. It’s not a scandal. Everybody knows everybody knows.


Just because what you say is historically true doesn’t mean it will always be true. One would have thought the #metoo movement should have happened decades ago with the women’s lib movement, but here it is, 2018.

Information is massively more knowable now, and bad behavior isn’t hiding very well anymore. I’m putting my money on a-holes not winning like they used to.


And to add: the literal gaslighting I have seen. Just manipulating students into thinking they are going nuts whenever they question norms that make NO sense.


So many problems, and probably too much good press for the post-graduate qualifications and the system as a whole.

My somewhat anecdotal recollection (from a good university) was that most PhD students stayed on by default as a way to continue in education with the campus lifestyle - except inevitably in a diminished way. If it wasn't this then it was to fulfill some naive ambition about making their academic prowess have some direct impact on the world.

So I think most of these problems are almost inevitable. A student goes from undergrad, where they are learning close to the limit of human understanding in their subject at a breakneck pace, to perhaps (e.g. a physicist) spending 3 years studying what shape of foil best reflects microwaves in some particular manner.

Socially most of their peers have moved on.

Progress in science has been increasingly pushed beyond the reach of often otherwise brilliant individuals.

They have a project of often dubious value which is a completely different challenge, and a completely different value proposition to what they were doing in undergrad.

These problems are probably not well communicated, because everyone has a vested interest in continuing the lie. The PhDs generally don't want to admit failure and a poor life choice, the 'system' doesn't want to lose prestige/funding and the public want to maintain an unrealistic image about the practice and progress in the sciences. Most of the actors involved have essentially good intentions, yet it's leading to wasted talent and severe mental health problems.


Hmm, hopefully that stuff doesn't turn out that way for me. I'm going to school to learn how to do research in scientific computing. I want to get apprenticed to do a specific and competitive job, not prove that I'm smart or be an academic glutton or whatever else. I've had a real life job and I don't really care about all that extra stuff, I just want to do applied mathematics for a living. I'm probably one of the top applicants to my program so we will see what happens. I feel like I'll be content to enjoy office politics as well should I fail out and end up in another large corporate job, but that I'll be unable to focus if I don't give a serious shot at academic stuff.


I just think 'academic stuff' has a danger of being too broad and nebulous a goal. Just don't do things for the wrong reasons, and recognize that the end of undergraduate studies is a big life change regardless of what you do.

I'm quite far removed from it all anyway. It was just clear to me 15 years ago that there were problems even then. I would tend to believe that computing is going to suffer less, as there has been more tangible progress in the field. Although I would add, from an outsider's perspective, that we seem to get very little out of computing academia - by which I mean usable new languages, libraries, hardware, etc. I know this is not often the primary goal of research, but it feels like it should exist more substantially as a by-product of all the activity in computing academia.


My experience was that I absolutely didn't have anyone around me who was able to provide good, realistic advice about the achievability of a PhD for myself, or how to evaluate a lab's and supervisor's compatibility. (For context, my research career was in chemistry.)

In my case this led to eventually realising, after 5 years, that I should submit a master's and get the hell out. Now, working as a Linux sysadmin, I'm making more money than my wife, who did get her PhD (the fortune being that throughout that time I tried hard to stay abreast of IT and that skillset - I guess a part of me was correctly suspicious).


Are you perhaps European? I recognize your sentiment of people doing a PhD to continue being a student, but it seems the majority of HN doesn't. This suggests this kind of thing doesn't happen in the US.

From what I know, a PhD in the US is rather different from one in the EU. Though things also differ between EU countries.


I work an administrative job at a US university, and I definitely recognize the trope of people pursuing their PhD just because they like the student lifestyle and/or don't know what else to do.


Yeah, UK (or should that be "not sure").

Seems to be the same result by a different mechanism? I'm not entirely sure what the alternative view is of how this situation has come about. Why are students willingly entering into this deal? Is it being mis-sold? Shouldn't there be a lack of demand?


From what I know, British PhDs are positions that pay a wage, whereas in the US you need to find your own funding.

Thus, in the US there is more pressure on a PhD student, because they need to secure their own pay.


British PhDs are somewhat unusual as you are still explicitly a student - with all the tax benefits that entails. Most positions are funded by Research Councils UK (STFC, EPSRC, NERC and so on) which gives a reasonable tax-free salary. You get council tax exemption, no student loan repayments, no income tax/national insurance. You also get 8 weeks holiday entitlement as standard.

We have tuition-only grants for EU students, but those don't cover living costs. Vanishingly few universities provide full funding for foreign students - the exception is positions which have project funding.

In most of Europe the situation is different: you're employed by the university. This has ups and downs. On the upside, it doesn't matter where you're from, you still get paid. On the downside you're a salaried employee with all the tax burdens, crappy holiday and HR requirements that brings.

The pressure in the UK is time. Universities are now under serious pressure to churn students and 4 years is a hard deadline. Typically you need special permission and have to pay a fee to submit after 4 years. In most cases you're also only funded for 3 years, occasionally 3.5.

In other countries, where you have an employment contract, it's less urgent provided your group has funding. It's common for EU PhDs to take 4-5 years. This is again good and bad. The British system encourages rapid specialisation at the expense of breadth of study and number of publications. Realistically in the UK if you get one journal publication, you've done well. Abroad, it's expected you'll get at least 2-3 and in some places you need 4-5 to form a thesis.


In the US, your funding primarily comes from your PI. You may have to supplement it by TAing some semesters, depending on your group's resources. Even if you are lucky enough to get something like an NSF GRFP, that won't fund your full five or more years as a student.


Duration is another difference, 5 year PhDs are rare in Europe. The Netherlands mostly has 4 year programs, Germany mostly seems to have 3 years. At the same time, PhDs here require a MSc (2 years usually), which in turn requires a BSc (3 years usually). So in the end, total time from start to PhD doesn't differ too much.


Many good comments here pointing out the reasons for this. One overarching problem I see is that academia (just like many/most institutions) is a bubble. People who are immersed in it are only able to see the way things work from the limited perspective of that academic bubble. There are few senior members of the bubble (in this case professors) who can really teach you to think in terms of the bigger picture of life and society as a whole. And even if they exist, there isn't a class for that.

When a starry-eyed high school graduate with little knowledge of how the world works, but with a strong interest in and ability to do science, goes to a top research university, he gets a lot of that academic bubble viewpoint. He's not taught to think independently or to identify and question the fundamental assumptions of that bubble. So he just takes it all in and does "the right thing" to go on to get a PhD, without necessarily thinking about all the implications for how it determines the rest of his life.

Doubly so if he comes from a cultural background that puts academic achievement and degrees on a pedestal. A lot of Asian cultures, for instance.

I don't know if there's a top-down solution for this. Any institution will be blind to the fact that it isn't able to look at the big picture of reality with minimal bias. It is really up to those of us who are aware of these issues to talk about this on the internet, and hope that as more people use the internet, curious individuals can learn enough to make a good decision.

BTW, the above description is based on myself. I am Asian American, had a sheltered suburban upbringing, really excelled at math and science, got almost full scores on college entrance exams, went to college at a top research university, etc. This university was really geared toward pushing people into scientific research. Fortunately I wasn't good at research but was great at building things, and realized early enough that I should go into industry. I've learned so much more about the world since then through my experiences outside of academia.


In my opinion, there's a much worse issue than universities being an ivory tower bubble. They are extremely corrupt. I don't say this lightly. I've spent more than a decade as a grad student in several top institutions. It's hard not to become insane. I've seen many of my fellow students and postdocs going mad, literally.

I've seen people holding papers they had been sent for review, copying and testing the ideas, and then rejecting them so they could submit the same stuff for review themselves. Once you become reasonably well cited within a subfield, everything gets sent to you for review. You become the gatekeeper. It's a winner-takes-all game.

I've sent a paper to Nature, which got held up because our reviewer was about to send the same stuff to Cell. After a year of intentionally slow reviews, our paper lost novelty and went into a different, but still good, journal within the Nature Group. This kind of stuff can wreck your career.

My previous principal investigator got expelled from the university for serious misuse of funds. I reported him because he used all my funds for travelling to all sorts of exotic locations instead of on the experiments we had to do. It took years of reports from other people and extremely serious incidents to get him kicked out. Most of the time, political leverage and interconnections give principal investigators more or less carte blanche.

I could be writing all night, but I would rather not because I will go to bed upset. All biomedical departments, where tons of money is handled, are full of charlatans and scammers. It's sad. My prior probability of decency is only high for pure math and less flashy fields. CS is much better too.


>"I've sent a paper to Nature, which got held up because our reviewer was about to send the same stuff to Cell."

I suspect this reflects a much deeper anti-scientific attitude that I have seen in academia. If it was the exact same methods it would be great to see independent groups publish their results. Using different approaches to study the same thing is also great.

The only way I could see a problem is if they literally copied your results without running their own experiments/analyses.


I counter your anecdote with mine. I've only observed highly ethical behavior from my colleagues.


Not to detract from all the valid points raised in this thread, but the elephant in the room is that an individual’s level of education is generally associated with increased intelligence, and higher intelligence is associated with much higher rates of almost all psych diagnoses.

The same is also true in most other areas of higher personal achievement: lawyers, CEOs, politicians, etc are all disproportionately on various spectrums of depression, substance abuse, sociopathy, narcissism, and so forth.

At least part of the problem is how we view mental conditions as a society, which leads people to think that this is abnormal.


High intelligence is also associated with the following:

1) not fitting in socially;

2) higher awareness of things.

Neither of these are good for anyone's mental health, but they're not problems directly caused by intelligence itself. Their combination is especially bad, as one could be aware of potential problems while others aren't, generating all sorts of friction.

I believe higher intelligence simply requires better personal mental health management, but there isn't really anywhere for people to learn that and I would say our current philosophical assumptions do not facilitate it.


> higher intelligence is associated with higher rates of almost all psych diagnoses.

Here's a recent thread about that: https://news.ycombinator.com/item?id=15886128

Yeah, it's possible that people with higher intelligence are more susceptible to some of the mental health issues that academia seems to induce. I know I struggled with some issues the last ~2 years of my PhD, but the issues were probably already there at the outset.


Couldn't it be that a high level of education/intelligence is also a sign of high socioeconomic status which would imply better access to healthcare and more psych diagnoses?

This might be wrong but it seems like an obvious superficial criticism.


This would mean that you wouldn't find a difference between poorly educated and highly educated Canadians. I don't have data on this, but I doubt it.


You might be right but it looks like this specific article is USA specific (at least the paper it's referencing is).


Just like a more powerful car is more likely to catch fire.


I actually disagree with the idea that academia doesn't teach you much about "how the world works."

In my life, my time in academia was probably the most "real world" experience of my career. I find many people who crave a more straightforward relationship between effort and recognition to be slightly naive about just how much of society is dedicated to essentially make-work tasks.

In a country where most basic needs (food, clothing, shelter) can be provided relatively cheaply by a small number of workers....well, what the heck is everyone else supposed to do?

I think it's an awesome privilege that middle class people have the opportunity to basically be paid by the government to pursue their own academic interests for a few years as a young adult, even when the return on this "investment" is highly uncertain.

The reason that the money gets dumped into STEM seems to be mainly just a coincidence of what our political culture is most likely to agree on (libs don't want to pay for theology degrees, and cons don't want to pay for gender studies or whatever).

But there's never been a better time to be alive if your interest is in mathematics or the natural sciences.


I'm one of these postgrads who decided that if I were going to face the same politics in academia as I would in industry, I should be better paid.

Surprisingly, there's much less politics where I landed. Hard work, though.

The biggest issue I've both experienced myself and seen in about 90%-95% of the PhDs I mentor in industry is that crushing feeling of inadequacy at being the "expert" in the room. Imposter syndrome is real.

My primary bit of advice is to seek mentors that invest in you. This is true whether you are experiencing Imposter Syndrome or not.


I'm many years out from my PhD, but still get impostor syndrome even working in the corporate world. Like a constant nagging feeling that I have no idea what I'm doing, and other people definitely know what they're doing. As long as I'm able to portray on the outside that I know what I'm doing, I'm able to achieve most of the objective measures of "success", but I still have the impending fear that at some point, I'm going to be found out...


You'll probably be found out as successful.

For me, just talking about it with trusted peers helps. Inside your company, if you completely trust some coworkers, or outside the company, if you don't.


Why? Let's start with:

1. More students than advisors, so a built-in pyramid scheme unless academia expands exponentially forever.

2. Fully backloaded reward system, so a built-in lock-in until the end, and the evaluation/reward isn't under your control.

3. For some, you spend years doing work of which only a small piece is relevant to the reward.

4. For some, zero actual training for the eventual academic job, if they get it.

5. It's a job where you do all the work and take all the risk, but you are paid like shit and treated as if a favor is being done to "train" you.

In other words, the whole thing is set up to destroy the self worth of a normal person and take away their agency and independent identity. Without coping strategies, I would expect them to develop mental issues.


How is this different from the GENERAL employment reality in every late-stage capitalist society? What you just said is applicable to pretty much any job/occupation in any industry, unless a person is at the top of the food chain (CEO, tenured faculty, etc).


The difference is, postgraduates in academia are near the top of the food chain and provide significant value to the economy, as seen by their market rates outside academia. They are usually the cream of the crop every step of the way, and on paper they are each independently training to be at the top of the food chain once they finish. This results in a severe cognitive dissonance and isolation that don't happen in your "other job/occupation in any industry". The same is seen with medical residents, where depression is frequent. Your regular wage-slave may also feel terrible but they have already accepted their fate.


This isn't new. It's a system to furnish high-quality and cheap laborers. Grad students and postdocs in big-name institutions are like all the hopeful would-be film actors hanging around LA. There are dozens of jobs and tens of thousands of applicants. But it's worse in the academy, because there's a plausible, but false, case to be made that hard work can change the outcome.

For every Nobel Prize, there are ten thousand broken dreams.


My most depressed time wasn't during my PhD program. It was when I was on an H-1B and my job was insecure. I was very worried about losing my job to a project cut and having to leave. In 2015, I fell into a depression that was triggered by job insecurity at first, but that was only the first week. After that, I couldn't sleep for about 2 months. I was trapped in a 3-day loop: for the first 2 days I simply couldn't sleep for a minute, and by the 3rd day I would become so tired that I could sleep for only a few hours.

At the time, I didn't know it was depression. I thought it was just insomnia.


Mental health issues are everywhere, not just postgrads. Every week it's a new article about some easy to label sub-segment of the population.

It's time to talk about why so many (Americans) have poor mental health.

It's time to talk about the "hidden" costs (i.e., effects on physical health) of poor mental health.

It's time to talk about this symptom and what it says about the broader culture / society.


When I was a PhD student I would sometimes get so sad I couldn't work any more that day, so I'd go home (and eat a family block of chocolate). I thought I just had emotional issues and needed to work on my mental health.

Turns out the problem was just that I was doing a PhD. I got bad burnout and left it behind. I only wish I'd done it sooner!


Trying to win your parents love by working yourself to death is a recipe/indicator of poor mental health.


That's an odd way of looking at it. I'd say it's an indicator of poor upbringing (whether parental or social), and it results in poor mental health.


A lot of hard work, very little financial certainty. I don't know that I'd word that as poor mental health, any more than I'd say someone who doesn't have enough food has poor physical health. When we can narrow the term down to something specific, we should.


> A lot of hard work, very little financial certainty.

Yes. It's exhausting. I'm several years past postgrad, have had a succession of one-year contracts, doing work I love and believe in and that I genuinely believe has lasting value, but the lack of any kind of security sucks. I'm rapidly approaching a wall where I have to either go for tenure track/something more managerial/teaching, or leave for something hopefully more lucrative in the private sector.

Sadly, continuing with my present (rewarding, interesting) work is not an option, even though after I leave, the work will still exist and will still have to be done by someone. But the academic system, such as it is, is a conveyor belt, and you gotta move on or get off.

And I got lucky too. My postgrad experience was short, went relatively well, and I'm well paid by academic standards now. I'm constantly surprised I made it out ok given the amount of stress it generated. I know many people who had much more miserable experiences in postgrad.


Not excusing the system, but for 99% of the population, rewarding, interesting work is an oxymoron.


Yes, to be clear, I'm certainly grateful I find (some aspects) rewarding -- if I didn't, it would be very irrational to continue, because it has very little else going for it.

Academia is so dysfunctional right now. It's parasitic off the drive of young people who just want to make the world a little better. It's very unfortunate it's ended up this way.


Startups can be the same way.


> Not excusing the system, but for 99% of the population, rewarding, interesting work is an oxymoron.

This is one of the reasons why I often ask myself why the suicide rate is so low in society.


I think a lot of it is context. For example when I lived in Russia, I didn't know kids could be nice to each other. It was a doggy-dog world where I grew up. When I came to Canada, it took me 4-5 years to stop mad-dogging people and relax.

Had I stayed there, being on alert your whole life and dropping dead at 55 would've been very normal indeed.

This is why people who get someplace and then lose it are the ones who tend to kill themselves, or folks who have a hard time fitting in from the get-go. For most people, they don't know things could be different, so they're fine with what is.


> It was a doggy-dog world

You misheard that phrase or learned it from someone who did. The correct version would be, "It was a dog-eat-dog world." The idiom makes a lot more sense in its correct form as it describes an environment of brutal competition.

I'm glad you find Canada to be a kinder place. I like it here too.


Yes. Personal mental state is a poor indicator of how well things are actually going, because people seem to mentally get used to whatever they grew up with.

It cuts the other way too: having some awareness of the situation is actively harmful for the same reason, because it makes the situation less acceptable mentally.


Dog eat dog


It might feel that way, but it's really not true. Plenty of people work jobs they'd rather not do, but for most there's at least something they like in it. Even menial labour can be satisfying when the impact of the work is immediate and obvious. When people really hate working somewhere, there's often some specific piece making them miserable, like the hours, their boss or their commute.

Life is a mixture of good and bad, and it's easy to fixate on the bad, but don't forget about the good. It matters too.


> someone who doesn't have enough food, to be having poor physical health.

If you can get all the food you want but you choose to go down a path where you are likely to starve... what does that say about your mental health?


> If you can get all the food you want but you choose to go down a path where you are likely to starve... what does that say about your mental health?

To me it says that the respective person is an idealist deep in his/her heart.


If that is the case then do you think there is even a problem?

I personally don't think it is the case.

I think the rate of people dropping out of academia is a good indicator that something is not quite right. Especially as these postgrads are dropping out well short of "making a difference".

That so many people continue to go into the system despite this shows a clear lack of judgement.


> That so many people continue to go into the system despite this shows a clear lack of judgement.

"Let me use this moment to feel better than those other people."

What is "lack of judgment", exactly?


> "Let me use this moment to feel better than those other people."

Or how about let's make sure at risk people are fully informed?

> What is "lack of judgment", exactly?

Feeling like you have wasted 5-10 years of your life - some of your best years.

The HN community thinks that intelligent people make highly informed decisions - and for many it's simply not true.


> "The HN community thinks that..."

As one HN member to another, please don't do this. HN members are not some monolithic block; they hold diverse opinions. You yourself are a member of this community. Just as you disagree with the opinions of some members, they may disagree with yours. And on other opinions you may even agree!

Just present your reasons without assuming or generalizing the opinions of others. Your statement works equally well as "Some think that intelligent people make highly informed decisions...."


> As one HN member to another, please don't do this.

There is nothing wrong with generalizing - in fact it is normal and healthy behavior.

> Your statement works equally well as "Some think that intelligent people make highly informed decisions...."

No it doesn't. That's not at all what I was trying to say - I was summarizing the general sentiment, in my experience, of the current HN community.

Look I could be wrong - but in that case please debate the point.


> "in that case please debate the point"

You're the one making an argument that your generalization is warranted: it's on you to back it up. I'm not sure how you'd be able to do so. It's not 100%, unless you yourself hold this opinion. Everyone except you? I have a hard time believing that. I've seen plenty of comments (perhaps not in this thread) about irrationality and bias in human behavior regardless of intelligence (some I've made myself). Most? How do you quantify this? What percentage? How are you going to support it? If you do have some kind of quantitative data, I'd very much like to see it. It would certainly be interesting and a good starting point for a substantive discussion.

Such generalizations do no one any good, other than reducing people to some caricature that weakens your own point. What you describe as "normal and healthy behavior" is what supports the tribalistic behavior that leads to polarization. That's certainly not healthy.


> it's on you to back it up.

I'm giving an experience report - if you do disagree with my point, let me know and I can happily expand - but it feels like you disagree with my making a generalization rather than with the particulars.

> It's not 100%, unless you yourself hold this opinion. Everyone except you? I have a hard time believing that.

I'm not sure you understand what a generalization is.

> Such generalizations do no one any good

That's a generalization and it's not a very accurate one.

We rely upon generalizations all the time - it's a healthy and necessary behaviour.

> other that reducing people to some caricature that weakens your own point

Another unsubstantiated generalization.

Reducing to a caricature helps focus the conversation on the caricature.

> is what supports the tribalistic behavior that leads to polarization.

How? It doesn't even make sense most of the time - for example I'm not defining or helping create an in-group vs out-group with my comments.

Secondly, what makes polarization or even tribalistic behavior inherently bad/unhealthy?

The vast majority of group behaviour is not easily quantified and is based on thousands of generalizations. It's not the problem you make it out to be.


> Or how about let's make sure at risk people are fully informed?

You didn't say they were poorly informed, you said they lack judgment. That's something usually said about a group of people to blame them for what happened and write the situation off as "not a problem, those people just suck".

So I'm not sure how you intended for that to go over as "those people are poorly informed". Of course they aren't, but that's not "poor judgment", it's a completely different statement to make, effectively an opposite one.

> Feeling like you have wasted 5-10 years of your life - some of your best years.

Why does a feeling indicate "poor judgment"? Do people who develop depression have poor judgment? Do people with midlife crisis? Do people with PTSD? Do failed athletes?


> You didn't say they were poorly informed, you said they lack judgment.

Correct. And?

One way to fix a lack of judgement is to inform.

> That's something usually said about a group of people to blame them for what happened and write the situation off as "not a problem, those people just suck"

Perhaps in your neck of the woods - where I come from it doesn't mean any such thing. In fact you would never use it like that.

> Why does a feeling indicate "poor judgment"?

Is the feeling reasonable and does it reflect reality?


I'm not sympathetic to the gripe that there's no "proper career guidance" in grad school. If you want a ready-made career track, you should not be in a PhD program.

Yes, academia is competitive, and unforgiving, and harbors a lot of eccentric and even ugly personalities. But the fact that it is "all about the discipline" is part of the appeal. And society cannot afford to sustain young people at such a high level of intellectual freedom with anything more than a "basic income" at best. Most of the things that make academia difficult are part and parcel of what makes it special.

The biggest change I would potentially advocate is an email to all first-year students letting them know that if they just want to take classes for a few years, get a cushy job, and drive a Lexus, they should be in medical school instead.


If people who got their PhDs in Physics from places like CalTech feel like they're at a complete loss as to how someone like me, with only a BS in Physics and Electrical Engineering, manages to make 1.5x-4.0x what they make in my first job out of college, then there is something seriously wrong with the system. I've worked alongside enough PhD physicists/engineers to know this isn't an uncommon theme.

I've also had my mentors strongly discourage me from doing a PhD in Physics, despite my ability, because of how oversaturated academia is with physicists, and because I would have lost money on the order of half a million or more while returning to the same exact salary and salary cap after the PhD is finished.


PhDs were not designed to maximize return on investment. They were designed to encourage students into research with a trophy of a doctorate and societal respect.

Anyone who thinks that a PhD is the fastest way to riches needs to have their head examined (by a PhD).


>how someone like me who only has a BS in Physics and Electrical Engineering manages to make 1.5x-4.0x what they make my first job out of college, then there is something seriously wrong with the system.

This is largely what I would expect. A PhD (actually, anything after the Master's) isn't really about acquiring new technical skills. It's about learning to put into practice the basic values that sustain the discipline, which in science consist largely of 1) a fanatical obsession with figuring out what the data is really telling you, 2) relentlessly questioning your hypotheses and entertaining alternative explanations, and 3) rooting out the various cognitive nooks and crannies wherein lie the temptations to fool yourself or others.

For most technical jobs, these kinds of habits are not really germane to what you are being asked to do. Hence many businesses are quite reluctant to hire people with PhDs at all. The ones they do pick up will tend to agree that their PhD was a "waste of time."


A PhD is meant as training to be a scientist. Even people who complain about salaries knew up front that it is not something designed to earn you a lot of money. They are people who value science and academic success more than money, at least when they are starting out.


This varies by field, and it varies very closely in proportion to the employability of a master's degree in that field. A master's degree in biology does not have the job prospects of a master's degree in computer science, so biology PIs have far more leverage to keep their poorly-paid grad students (and postdocs) working for them.


What do all the sane people do?

Good mental health includes things like confidence, self-actualization, self-determination, a true perception of reality, autonomy, and a positive view of one's self and one's ability to effect change.

I don't have a PhD, but my experience working and living with people who do is that there was a gap between their self-esteem and their actual self-confidence that caused many of them to suffer.

If good mental health is not something you are directly rewarded for, success in that field is going to be biased toward people who find a way to sacrifice it as leverage. From a base rate perspective, this suggests good mental health in any outlier of success is going to be spotty.

In many cases some fields must create opportunity for people with poor mental health, because the sanity barrier in other fields makes these ones more viable. When you look at the stereotypes about the professions (lawyers, doctors, and professors in particular), the ones with low attrition and weak disciplinary systems acquire the worst reputations; as a counterexample, many fiduciaries seem even-keeled to the point of dullness.

The solutions likely involve teaching good mental health in undergrad or before; the trouble is, the question of what "good" means will likely attract extreme opposition.


The study to which the blog links is about a survey, and it finds that among those who chose to respond to the survey, about 40% have moderate-to-severe depression, and 40% have moderate-to-severe anxiety, according to a certain clinically validated scale. Also, among those who reported their work-life balance as "healthy", the percentages were about 23%, while for those reporting "unhealthy", 55%.

I take issue with the notion of "work-life balance". Metaphors do matter. This one contraposes "work" and "life": they are now mutually exclusive, "life" has a well-known meaning, and living your life is what you certainly want to do.

And so, if you take the metaphor for granted, then when you're in the lab, or next to a blackboard, during your best years, during the sunniest hours of the day, you are not living your life. It is very easy to feel depressed in these circumstances. There is a famous Russian fable about a girl who invented a monster and is now afraid of it.

My point, a platitude when spelled out like this, is that this kind of work is a part of life, not a mere staircase to some dubious place where you have "made it". I'd conjecture that a large fraction of these depressions are inner conflicts resulting from buying into the narrative that there must be some end-game to this "work", while the only thing that does not require an end-game is "life", which allegedly consists of cuddling with your loved ones, raising children, watching TV, and climbing Kilimanjaro.

Nothing wrong with caring about those; they are essential, as is financial stability. But if you're in research due to your curiosity, it's part of your life. It's also an opportunity. And although society wouldn't generally perceive it as being as sexy as Kilimanjaro, it's still way better.

The only caveat is that, as with everything in your life, it requires taking responsibility. If the attitude is "I've been in the system for all these years, did all they asked me to do, and what have I got for it?", that's a problem.


One factor must be a selection bias towards people who are way too obedient and receptive to criticism. I think most people decide that it's weird to slave away like that. I'm speaking generally; of course many people have wonderful experiences.


We are wise animals, but still animals. I am guessing, and it's a guess since I don't have any facts to substantiate the claim, that we perhaps were not meant to be so techy or to take on so much mental stress, from stressful college educations to careers in finance. I am still at a loss as to why humanity moved in this direction. Quality of life is a delusion; humans adjust to their given environment. In recent times all this progress has only led to even more stress. Do we really need so much progress so fast?


I wrote about this in my blog https://invertedpassion.com/librarians-make-more-money-than-...

It’s simple economics. There’s an oversupply of scientists, and the world doesn’t demand/value the research output in most domains (although it does benefit when something valuable comes out).


Stresses of a Ph.D. program? Yup.

(1) My wife was high school Valedictorian and in college Summa Cum Laude, Phi Beta Kappa, and Woodrow Wilson and NSF Fellow.

For her Ph.D., she did fine in her coursework. But starting research, she struggled terribly. For her, the research itself was easy. Her problem was that, unlike with coursework, the realities of research (what was good, good enough, too good, what was really wanted) were totally unclear, leaving her terrified of doing something wrong.

The stuff about professors being unhelpful, giving poor advice, and finally being too critical was real and devastating.

She struggled on, with the anxiety causing depression. That actually hurt her speed, for which she got criticism, the first ever in her academic work, and then more stress and depression and finally real clinical depression. She was in severe clinical depression the day she got her Ph.D. She never recovered. She was visiting her family farm trying to recover, and went missing. Her body was found in a lake.

Yes, the OP is correct: Grad school can be a very destructive place.

(2) As a high school senior, I took the SAT twice and got over 750 both times. At my high-end, college-preparatory high school, among the top three scores on the Math SAT, I was #2. So, eventually it became clear I had some math talent.

Well, I figured out I had some math talent in the 9th grade -- I ignored the class and taught myself from the book and then made As on the tests. I continued that way in high school math classes.

The extreme case was plane geometry -- I loved the subject. The teacher was the nastiest teacher I ever had, so I slept in her class and refused to admit doing any homework. But in fact, I solved every nontrivial problem in the text, including the more difficult supplementary ones. On the state test, I was #2 in the class; #1 was the guy who later beat me by a little on the SAT Math.

I was a horrible student. Why? In grade school, my brother was ahead of me and a good student. The teachers kept saying, "Oh you are Joe's [not his real name] brother!". Well, from that start, anything I could do would be lower than expectations. Moreover, I was a boy; all the teachers were women; I had no social insight on how to please those women; my handwriting was awful; ...; and I just gave up trying to please the teachers. The teachers talked to each other so much that I was labeled as a bad student with a new teacher before I even started her class. That's why I was a horrible student -- in grade school, for me to please the teachers seemed just hopeless.

Well, that poor student approach continued in college: I nearly ignored every subject but math and physics, was a math major, largely taught myself, did some more advanced reading independently outside of class, took a reading course from a high end, pure math text where I did all the talking, lectures and exercises, and a prof just listened, and did write a math honors paper.

Then I worked for a while in applied math and computing on US national security problems and later at FedEx. Learned a lot of math, job related and also just good pure math.

Then I went to grad school. I already knew about enough for the qualifying exams.

But I still preferred to work directly from some of the best texts. Class time was mostly just a writing exercise, copying from the board. The real learning was studying the text or class notes.

Eventually I saw a problem not solved in class; I didn't find a solution in various relevant texts or in the more famous relevant papers in the library, so I took a "reading course" to investigate, maybe just to write an expository paper on the problem. Well, it was a fun problem. I had some ideas and worked them out sitting with my wife on our bed while she watched TV. Some of my ideas were good, and I turned them into solid theorems and proofs. In two weeks, I was done. Fast reading course! The credit I got was the last I needed for an MS.

But it was clear that what I'd done was publishable. Later I did publish in JOTA. One prof walked up to me in the hall and said "I heard about your theorem: It says that for Brownian motion ...." Right. So, with that work, original, publishable, done with no faculty direction, presto, bingo, I had a halo.

Before grad school, I'd seen a problem. On an airplane flight I worked out an intuitive solution. Some coursework in my first grad school year gave me enough pure math background to turn my intuitive solution into solid math, and I did that independently in my first summer; I walked out of the library with a 50-page manuscript. That was the original research for my dissertation. Later I wrote some illustrative software, polished my math, and that was it: there was no real faculty direction. I stood for an oral exam defense and got my Ph.D.

Lessons:

(1) For a Ph.D., the three most important things are research, research, and research. If you get some good research done early on, then you have a halo and a solid shield against criticism.

(2) It helps a lot to be able to be a poor student in courses. This way you are used to criticism and don't take it seriously. Also you don't wear yourself out trying to get a Ph.D. just from working hard in the courses or pleasing the profs teaching the courses. Bluntly, you can't get a Ph.D. just from doing well in courses. Indeed, at the high-end grad school I went to, officially there was no coursework requirement for a Ph.D. Instead, the main requirement was just research -- "An original contribution to knowledge worthy of publication." So, (A) I didn't burn myself out doing coursework. If I didn't like a course or thought it wouldn't do much for me, then I didn't do much in the course or just walked out. (B) But I DID work hard learning the good material, and for that I worked largely independently, much as I had done back to the 9th grade.

(3) Work independently. Don't ask or expect the faculty to do much or anything to suggest problems, references, or approaches to solutions. Find your own problem, maybe even before going to grad school. Start research on your problem ASAP, in some sense in your first year or first summer. Before you and the school agree you are working on your dissertation, have it essentially all done, all but maybe some illustrative software and some polishing of the work, e.g., maybe more references.

(4) For financial support, expect free tuition but no stipend. Have your own source of food, clothing, shelter, etc.

(5) Stay close to math so that you don't have to do lab work for some prof or for your research.

(6) Pick a practical problem, get a good applied math solution building mostly on some pure math, and call the work engineering. The usual criteria for research are that the work be "new, correct, and significant." Well, you can get the "significant" part from the importance of the real problem. You can get the "new" part from the particular math, somewhat new, for the new solution to the particular practical problem. The new, practical problem can stimulate and justify some new math that otherwise might not be regarded as of value or interest. For "correct", have the core of the work be theorems and proofs, which can be seen to be correct.

So, where my wife wore herself out trying to get praise from the profs, I was willing to be a poor student and largely forget about trying to get praise. Where my wife wore herself out wondering whether her research was good enough, I tried to do good research as I understood it, research that also looked good enough for the criterion of "worthy of publication". Looking for praise, my wife was reluctant to do something original. I was ready to do something original whether I would get praise or not. Indeed, back in plane geometry, I'd taken a problem from outside of class, found a solution, and after school showed that nasty teacher, and she said "You can't do that". I didn't believe her, and later I discovered that she was wrong: I'd reinvented a classic technique, similitude, and I showed her. If you are going to work on things that are new, realize that you will likely get a lot of people, hopefully not quite everyone, saying that the work is no good.

What I did can sound like Ph.D. hacking. Well, not really: IIRC, at least at one time the Web site of the Princeton math department just stated: (1) No courses are given for preparation for the qualifying exams. Students are expected to prepare for the qualifying exams on their own. (2) Courses are introductions to research taught by experts in their fields. (3) Students should have some research work in progress as soon as possible, hopefully in their first year.

Well, that's basically what I did: I was already nearly prepared for the qualifying exams before I got there. I had a research problem started before I got there. The courses I worked hard on were good for my background and for my research. The course where I got the problem that led to my paper in JOTA was advanced enough that I could see a good unsolved problem.


I recognize many of your thoughts and observations (but not your achievements) as my own, and agree on most points.

A corollary you do not make explicit: the university is to students/diplomas as the prestigious publisher is to papers.

The student/researcher is expected to pay tuition, self-educate, and make an intellectual contribution (a thesis in the case of a university, or a paper in the case of a publisher), for which, if you pay the right price (enough years of tuition as a student or underpaid labour as a PhD, in the case of a university; the review and editing costs, in the case of a publisher), they are willing to issue a paper branded with their certification mark or prestige.

Your examples illustrate that we might effectively cut out the middleman of universities as well.

Especially in the case of mathematics, it would seem that correctness could be mechanically verified by a proof verifier, novelty by the absence of the theorem and proof from some decentralized system (which only admits verified proofs), and significance I will leave open to the reader's personal preference for establishing.
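To make the proof-verifier point concrete, here is a minimal sketch in Lean 4 (my own illustration; the theorem name is made up, while Nat.add_comm is a standard library lemma). The kernel either accepts the proof term or rejects it, so the "correct" criterion needs no human referee:

    -- Lean's kernel mechanically checks that this proof term
    -- actually proves the stated theorem; acceptance by the
    -- checker is the entire verification step.
    theorem my_add_comm (m n : Nat) : m + n = n + m :=
      Nat.add_comm m n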

Perhaps significance could be established by challenge? I.e., before publishing your proof, you publish the theorem, and as time passes the reward increases; so if the problem is hard, nobody else finds a solution, and your tradeoff is low-yield impatience versus higher-yield patience. If one or more people submit a solution (or its hash), then there is a deadline (say a month, or perhaps a year) at which point everybody is expected to reveal their solution (original claimant last, so if people bluff they can't force you to prematurely release the solution). If multiple parties found a valid proof, the rewards are split by some predetermined method; a sketch of the commit-reveal step follows below. Just thinking out loud about how at least pure math progress could be decentralized... The rewards you get could have a financial value, or you could use the crypto-value to reward people for formalizing supposed proofs from a text (if you have loads of texts, that might interest you, if only you had the assurance that the proofs check out; i.e., perhaps a substantial number of papers are bogus results!).
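As a rough Python sketch of that commit-reveal step under those assumptions (SHA-256 commitments with a random salt; the function names are hypothetical, mine, not part of any existing system):

    import hashlib
    import os

    def commit(solution: bytes) -> tuple[bytes, bytes]:
        # Publish only the digest before the deadline; keep the salt
        # and the solution itself private until the reveal.
        salt = os.urandom(32)
        return hashlib.sha256(salt + solution).digest(), salt

    def check_reveal(digest: bytes, salt: bytes, solution: bytes) -> bool:
        # After the reveal, anyone can confirm that the revealed
        # solution matches the commitment published earlier.
        return hashlib.sha256(salt + solution).digest() == digest

    proof = b"proof term already accepted by the verifier"
    digest, salt = commit(proof)              # published before the deadline
    assert check_reveal(digest, salt, proof)  # checked after the reveal

The random salt is what keeps a bluffer from brute-forcing a short solution back out of the public hash.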

My condolences regarding your wife; that is a very sad story. And a perfect example of how universities don't do what they are supposed to do: foster and develop an individual's skills...


Take someone with intelligence and options, have them spend 10+ years of their life working toward something, and then give them a dead-end job where it’s difficult to switch careers because of over-education and lack of experience. No surprise here. Make these PhD programs three years at most. Don’t eat up someone’s best years.


The elephant in the room is that science is hard. You should expect failure to be the dominant mode of operation, and this is antithetical to a healthy mental state in general. As science progresses, the low-hanging fruit gets picked and this becomes increasingly worse.


I'd like to see a twin study on this. Maybe the direction of causality is that academia attracts depressive people.


Did a postgrad do this study? :-)


This is like asking why programmers in the game industry have poor mental health, or new doctors.


Why is Nature bringing this up only now, and why at all?


This was intended as a question of motives, not the value of the discussion.


Now's better than never. It's an important topic.


When you ask for time off, you risk being replaced by someone younger and more energetic. When a research company loses funding, you are told you're faking the need for emergency medical leave. No one is to blame; money is the devil, and it's the only thing our government is good at controlling.



