A PhD is not required to do research, nor does research necessarily lead to a PhD. A PhD is ultimately like any other degree: a school programme where, at the end, some institution is staking part of its reputation on giving the world a promise that you have some minimum competence in some field. Anything you learn in school, you can in principle learn on your own. (I have a PhD myself and I encourage everyone who has the opportunity to undergo formal schooling, but I'm well aware that not everybody wants to.)
There is one part of the article that is a bit confusing:
> I was also told by one professor that it’d be “expected” for me to join faculty full-time after getting a PhD, which made me uncomfortable, because I knew I didn’t want to work in academia long-term.
Maybe some context is missing, but I'm a bit incredulous that anyone would believe this statement. A professor may train dozens of PhDs over his career, and if all were expected to join faculty, then the resulting exponential growth would cause the working population to be pretty much entirely university faculty.
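To put rough numbers on it (purely illustrative): if each professor supervised, say, 10 PhD students over a career and every one of them joined the faculty, the professoriate would grow roughly tenfold per academic generation. In anything like a steady-state job market, only about one student per professor can actually end up replacing them.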
As far as I see, a PhD isn't really about studying or schooling - unlike all the previous school programmes, it's essentially an "apprentice researcher" job, where you work on research (possibly studying whatever you need for it), you receive some supervision from a more experienced researcher, and once you demonstrate that you're capable of independent research by doing some and writing it up, you get a degree certificate. And it's not necessarily about some institution staking its reputation on it; the main evaluation of your research is by external peer reviewers looking at your publications, not by your institution alone.
A PhD is not required to do research, but research is (or should be) required to get a PhD, and research done outside a university program would generally be sufficient to get one - there are paths to a PhD degree based mostly on existing publications, or on a 'monograph' such as the research-based book that the OP has written. Learning and schooling alone, on the other hand, is not sufficient for a proper PhD; if you've learned everything that's known and published on some topic, that's not enough - you need to make at least some novel contribution.
> A PhD is ultimately like any other degree: a school programme where, at the end, some institution is staking part of its reputation on giving the world a promise that you have some minimum competence in some field. Anything you learn in school, you can in principle learn on your own.
No, this neglects perhaps the most important aspect of a PhD: it is an apprenticeship with a professional researcher. Learning through apprenticeship is the most ancient and still often the most effective method of learning complicated skills. It is reasonable to dispute the value of accreditation from a university -- that's a practical question, and indeed strict university accreditation is relatively modern -- but it's not reasonable to think that anything you can learn in "school" can be learned on your own.
Precisely - that would be an unsustainable pyramid.
The supply of Ph.D. graduates is much larger than the number of faculty job openings, which can attract hundreds of applicants for a single position. Winning the faculty lottery is extremely unlikely.
And nevertheless, plenty of faculty preach the belief that if you don't stay in academia, you are a failure.
At least that's the case in mathematics. People who leave academia are spoken of as if they are dead, which might as well be the case: most stop doing mathematics, attending conferences, keeping connections with peers, etc.
This attitude has only begun to change in the last decade.
I have plenty of love for academia (PhD here) but it does, like most large institutions and corporations, show a tendency to become a huge feedback loop that interfaces only with itself and excludes society at large, apart from siphoning money from it.
We should make efforts to overcome the friction of opening up and reaching out no matter where we work. That keeps organizations healthy and sane.
This must be heavily field-dependent. I've honestly never seen that among economists. It's no doubt most common that someone in a PhD program will want to get an academic job, but places like Amazon and the Federal Reserve hire tons of PhDs.
I had a student request to work with me because he thought that gave him the best chance of getting a nonacademic job, and I didn't see that as unusual. I have a student at this very moment going through the interview process with one of the big SV companies. For that matter, I would be happy to talk if a nonacademic employer were interested in me.
Edit: Take a look at Paul Romer's bio. He worked his way up to full professor at Stanford, quit to start a business, and last year won the Nobel for his early academic work.
The paradox with mathematics is that the vast majority of math PhDs end up in industry, but they are taught as if getting a tenured position were their only way to succeed in life and attain fulfillment.
There's no such stigma in Computer Science, for example (in my observations).
But isn't it actually the case that if you don't stay in academia, then any hope of contributing to mathematical research has pretty much died?
Being in academia is the main way people can get paid to work on research. If you're not in academia and not independently wealthy, chances are you're not doing research any more - some industries have big R&D labs doing fundamental research, but usually people going into industry end up just applying the research there, providing no research output that drives the field forward.
It's the case most often now, but has not been historically.
Consider that Fermat was a lawyer by occupation, and more recently, Fourier did a lot of things other than math.
An academic job is not all research: the amount of time and effort spent on teaching and other duties is huge. So why can't people employed in industry continue contributing to science?
The problem is mostly social, I think, and it's twofold:
1) It should become the norm to not work 5 days a week in technical fields (with a pay cut, if needs be). Not "work from home"; the mental space is scarce, and switching is hard.
2) There should be no friction in being involved with academia if you are not a part of it. People not officially affiliated with an academic institution do face friction.
Ronin Institute[1] is a virtual organization that aims to address (2). As for (1), people should be prepared to enter industry from the very start, so that if they do, they would be able to confidently demand the conditions that would allow them to continue with research.
I have a PhD too, and I felt I was expected to join academia after finishing my PhD. My supervisor treats me a bit like a "wasted talent" for getting a job in industry. Professors get reputation points when students become professors, in addition to expanding their influence in the field, as junior professors will still seek them out for advice and will occupy strategic positions in academia. Some of my friends have told me they feel the same, and know a PhD student who is being discouraged from applying to teaching positions when she finishes, even though she feels that's her calling. There's a sense that teaching positions in academia rank even lower than a research position in industry.
> Anything you learn in school, you can in principle learn on your own.
I think this statement is missing some nuance. We can rewrite it as:
> Anything you learn in school, you can in principle learn on your own, but it can take significantly more time and there's no guarantee of consistency (compared to someone learning with a teacher).
> some institution is staking part of its reputation on giving the world a promise that you have some minimum competence in some field.
Yes, and part of this process is having your work peer-reviewed as novel by active researchers in the field of interest (and getting advice/corrections from them along the way to the finish). I see this as an integral part of the process.
> Anything you learn in school, you can in principle learn on your own, but it can take significantly more time and there's no guarantee of consistency (compared to someone learning with a teacher).
So, honest question, did you guys really learn anything directly from a teacher? I've gone through high school (obviously) and university but everything I've learned has been at home by reading about it (or simply practicing to become fast enough). The lectures tended to only touch the absolute basic concepts, and the actual learning you had to do at home. Maybe it had to do with the lectures being in giant halls with little to no interaction with the prof in most cases, and that probably changes if you're doing a PhD (or just go to a different university), but even during high school I pretty much never learned anything of note directly from a teacher. So if anything, it would have been significantly faster and more efficient for me to just get a list of topics to learn instead of sitting in class.
As for consistency, my friends from a different university learned in some cases completely different things. If we compare strictly what was discussed in class or required to pass tests, there would be surprisingly little overlap. (And neither overlapped very much with actual programming.) Even courses with essentially the same topic would often differ greatly in content, as the professor usually decided which particular things to focus on. Which is completely fine, I mean you can't go in-depth into everything, but this notion of consistency is kind of funny to me when the same degree from different universities (sometimes even the same university just a few years apart with different professors) can mean completely different skill sets. And then of course if you have a CS degree and want to work as a developer, from my experience you have to learn the actual programming pretty much 95% on your own anyway, as most courses focus on purely theoretical topics, and programming simply requires a lot of practice.
>I pretty much never learned anything of note directly from a teacher.
I've learned things after being corrected by a teacher, and then I practised on my own until the next mistake, at which point I was corrected again, and so on. It's this interaction that I find valuable, not just stating facts on a blackboard. Having taught mathematics myself at the university level, I found that students don't really need me to read them the facts; I was more there to align their understanding.
As for consistency, I meant that, unlike when learning on your own, you don't get to cherry-pick the topics that you like (or solve only the problem sets that you find simple), which is a natural thing to do, by the way. If you have your own curriculum and you stick to it, that's great, but I have found that when I allowed students to pick their own problems for homework, they grew weaker in some areas and stronger in others. This also didn't give me enough signal on their understanding in general, which meant that it deprived them of useful feedback.
> but this notion of consistency is kind of funny to me when the same degree from different universities (sometimes even the same university just a few years apart with different professors) can mean completely different skill sets.
Absolutely! I didn't mean consistency in terms of pushing out duplicates of the same thing, but consistency in terms of attacking a variety of problems in some course, allowing you to become well-rounded in your understanding. Once you reach that level, you can fill in the gaps and be comparable to a colleague who maybe had a slightly different curriculum.
>I've learned things after being corrected by a teacher, and then I practised on my own until the next mistake, at which point I was corrected again, and so on.
Yeah, that actually makes sense. I suppose with programming (or anything CS-related) I had this feedback loop much more readily available on forums, IRC, Stack Overflow, etc., so I never really appreciated having it from a teacher. But outside of CS topics it might not be that easy.
That is really the most basic part of CS however. Anyone can fumble through with trial and error to get things to work. Design is the hard part, and you can trick yourself into thinking you designed something well just because it compiles and spits out what looks right.
"Does it compile" is indeed a low bar to clear, but you can also get fairly quick data on "how fast is it?" by benchmarking, and sometimes even "how well does it work?" (ML, compilers, etc).
It can also take significantly less time. I'm not sure the traditional lecture in front of a blackboard approach is relevant nowadays.
>> there's no guarantee of consistency
There's no guarantee of that in any case.
I think the main value of the PhD is being in general proximity to, and collaborating with people interested in the same field. I'm not sure, however, if college is strictly speaking necessary for that in the third decade of the 21st century.
Strong opinion, but I think that if you want to learn facts, you can learn them faster on your own. If you want to learn a new skill, you need somebody to check your ideas until you get to a level where you can self-correct.
> I'm not sure the traditional lecture in front of a blackboard approach is relevant nowadays.
Sure, and I agree, most of the time I would prefer to study on my own instead of going to the lecture. However, it was incredibly useful to have an expert who could align my understanding whenever I was heading down the wrong path; I think those were the opportunities for learning, not reading facts off the board.
> There's no guarantee of that in any case.
Let's say that it's significantly more likely that you have practised on a variety of easy to hard problems in your field of interest if you have taken good courses from a university versus doing your own work. I believe that when you pick your own homework, it's natural to cherry-pick problems that seem simple.
> I think the main value of the PhD is being in general proximity to, and collaborating with people interested in the same field.
That would (and does) describe research divisions in industry too.
It's not such a strong opinion. There are definitely people who learn a lot from lectures, I've seen a lot of them, but there are also a lot of people who don't like lectures, feel they learn nothing in them, and learn quite fast alone at home. The latter is mostly me. (For context, I have a PhD in AI.)
Completely agree. Intelligence alone is not sufficient (perhaps not even always necessary) to learn a new field. People really need experts to let them know when they're wrong until they develop the discipline to recognize it for themselves.
This is very domain-specific. It makes total sense in STEM, but is more debatable in the arts and humanities - where it seems you learn whatever specific style of writing, practice, and critical analysis is popular in academia at the time, and you're going to have a bad time if you try to step out of that.
Even there, if your goal is to learn to write in that currently preferred style, your odds are much better if you have someone who can point out when you deviate from it.
I'm the same way and have always been. The only time I got anything from lectures in college was when I had an extraordinarily gifted professor (inventor of the EMP bomb) teach electromagnetic field theory. Awesome lecture, beautiful math, too bad I used approximately zero of it later on. But at least I enjoyed it.
Maybe some context on my end is missing, but isn't the last paragraph basically describing the current problem with academia? That's exactly what people are being trained for; there is just not enough demand for PIs / professors, so bright-eyed optimistic researchers end up getting left up sh*t creek without a paddle after four to six years.
The problem is multifaceted, but it's not the case that PhD students are trained to become professors. They are trained to become researchers. The problem is also not generally that PhD grads believe they will become professors and are completely lost when that does not happen - in many fields, a PhD gives you perfectly good job opportunities outside academia, and in some it is almost mandatory.
The problems are:
* People who do become professors have spent years training to become good researchers, and very little (if any) time training to become good teachers. They're kind of expected to pick it up along the way, which may or may not work well.
* Some people in academia still insist on the traditional idea that university should teach academic research and nothing else, and resist including anything that aims to better prepare students (and especially PhD students) for working in industry jobs, even though the vast majority of them will.
* There are some fields where there really are few job prospects outside academia, and logically those fields should have very few PhD students, or at least make it very clear to students that they need to have a Plan B that is really more of a Plan A. But professors need PhD students and postdocs as cheap, qualified labor. Some unscrupulous professors try to attract students by giving them an unrealistic view of their prospects in academia. They probably don't even think they're lying: see the previous point. This kind of thing is probably most common in fields like literature, which combine bad job prospects with being very popular.
Can't speak for everyone, but for my PhD I was trained to do research. What else would it be? That kind of research is (presumably) also useful in R&D positions in industry, and I certainly see most of the PhD graduates here eagerly hired by companies with advanced products. Actually, what I was not directly trained for is exactly what you only need in academia: writing grants, managing small teams, supervising students, and so on.
The thing that killed the appeal of academic research for me was that I found I was playing the "publish or perish" game too well, and it was actually making me (as someone observed at the time) "hyper cynical".
I like building stuff that people use, not writing papers about how to build things that people will almost certainly never use.
It probably depends a lot on the specifics of the situation. For my PhD, the funding was secure up front. This is not unusual in Europe, but I hear the USA is sometimes different. I was given a lot of freedom, and ended up spending a lot of time on creating generally usable software artifacts (specifically: I do PL and compilers research, and I documented and made available the implementation of my compiler).
It probably depends on the field, and also on the supervisor and how much clout or willpower the student employs to force through their own vision. I made it very clear during my PhD that I was in effect paying a significant sum to be a PhD student (comparing my PhD salary to what I could go and earn in industry), so my tolerance for constraints was low.
What about working on stuff that no one will care about in the immediate future for various reasons, but that you definitely consider important to study?
Don't get me wrong - I loved the area I worked in. I think I just saw a bit too much of how the sausage is made in "big science" projects, got rather disillusioned by the approach that seemed to be required to succeed at a "management" level in academia, and decided that as a long-term career goal it didn't interest me at all.
NB It probably didn't help that I spent 6 years working on symbolic AI at the start of the '90s, as it was becoming increasingly apparent to me (correctly, as it turned out) that this fundamental approach didn't really work, no matter how fascinating it was.
To a first approximation that describes most research on literature. I’m sure there are other fields where that’s true but most academic press books have a first printing run of under a thousand and don’t have a second edition.
I’m 6 months in and have already become a bit too cynical for my own good. Your last paragraph exactly describes a feeling I have had for months. Having a paper written in 10 days accepted in the best conference of the field increased both my imposter syndrome, and the feeling that we are writing useless crap.
So, in the end, what did you do? Are you able to build products you care about now?
My experiences were a while back (I was in academic research from '89 to '95) - I left to co-found a startup in '95 and did OK.
I've never regretted leaving academia for a single moment - that's not to say I don't have fond memories, and I learned a lot there, but I have always liked building stuff, and ultimately that's only worth it if people get value from what you build. I knew a lot of great people in academia - but I was conscious that there was very much a game to be played (as is the case everywhere), and from what I could see it wasn't a game that interested me.
> Having a paper written in 10 days accepted in the best conference of the field increased both my imposter syndrome, and the feeling that we are writing useless crap.
Are you sure you're not being too harsh on yourself? :) Maybe it was a good paper. If you are so worried, you can always ask the conference reviewers for comments; maybe they thought it was excellent.
> useless crap
You get enough "useless crap" down as knowledge and, who knows, one day you get to flying cars or the proof of Goldbach's conjecture.
My non-PhD experience is totally the opposite. When I was studying physics, while there was interesting stuff and I enjoyed trying to build things, my professors and senior students hammered me with all the petty stuff needed to organize projects and get money.
Suffice it to say, I dropped out of college and chased a quick buck in IT. At my university, at least, it was expected that you'd go into academia, and of my friends, those who didn't go down another career path are still waiting for a place in academia.
I think it depends on the field, I did my PhD in a biotech/drug-discovery lab where there was absolutely no expectation to continue in academia, in fact a post-doc in academia was viewed as pretty much the last option.
I can’t imagine any circumstance other than a PhD program where someone spends 5+ years studying some tiny area of science in depth and practicing research. And without doing that, I have trouble believing almost anyone will do serious research. There are a few people who have done it, but to me that seems extraordinarily unlikely.
Agreed, those people are few and far between. Even if your job is fairly stable, you'll more likely work 'around' a problem than do a deep dive into a very specific one.
I agree, this isn’t true (especially in CS!). During my PhD, students in my cohort frequently had opportunities to talk to (and work with) people in industry. It’s not as disconnected as it may seem.
A lot of industries are like that though, "up or out" being the usual description - not everyone in law becomes a partner at a decent firm but there is no shortage of lawyers.
MS is the new BS and PhD is the new MS in the eyes of employers. These days you need a PhD, or even a PhD/MBA combo, for certain jobs that used to require just an MS.