The average trajectory for a successful scientist is the following:
* age 18-22: paying high tuition fees at an undergraduate college
* age 22-30: graduate school, possibly with a bit of work, living on a stipend of $1800 per month
* age 30-35: working as a post-doc for $30,000 to $35,000 per year
* age 36-43: professor at a good, but not great, university for $65,000 per year
* age 44: with (if lucky) young children at home, fired by the university ("denied tenure" is the more polite term for the folks that universities discard), begins searching for a job in a market where employers primarily wish to hire folks in their early 30s
This is how things are likely to go for the smartest kid you sat next to in college. He got into Stanford for graduate school. He got a postdoc at MIT. His experiment worked out and he was therefore fortunate to land a job at University of California, Irvine. But at the end of the day, his research wasn't quite interesting or topical enough that the university wanted to commit to paying him a salary for the rest of his life. He is now 44 years old, with a family to feed, and looking for a job with a 'second-rate has-been' label on his forehead.
Of course, that doesn't take into account the fact that as an R&D person you'll be viewed as a cost center and a target for cost-cutting if the time comes.
I have friends in the bio/medical side (PhDs, not MDs) who are following that trajectory too.
The (sorry) data on that is thin.
And it would hardly be the first time that a hugely promising career field has gone bust.
Over my lifetime, I've seen the technical "hot jobs" go from:
⚫ Highway and civil engineering (1960s - early 1970s)
⚫ Nuclear engineering (1970s to March 28, 1979)
⚫ Defense and aerospace engineering (1980s through 1992)
⚫ Doctor / medicine (through the mid 1980s and the rise of HMOs / "managed healthcare")
⚫ Petroleum engineering (1960s - 1980, 1999 - present, big empty hole 1980 - 1999)
⚫ Genetic engineering (mostly false starts: late 1970s / early 1980s, again early 1990s, again post dot-com bust -- but it's never really launched)
⚫ "Data Scientist" (2009 - present)
⚫ "Webmaster" (1997 - 2001)
⚫ "Mobile Apps Developer" (2007 - ~ nowish, starting to fade)
Just because you're getting your time in the sun now doesn't mean the glow won't fade.
I'd also point out that most of the jobs on your list are specific roles in specific industries - data science as a concept will be more resilient because it's an evolving set of skills (as is programming/software engineering) that span basically the whole economy. The day people no longer have a reason to build systems using computers is the day these jobs will go away.
Sooo, not too long after some clever person figures out how to automate 80% of the grunt work of building an analytics pipeline and their colleagues dumb down the 20% left so that the people inhabiting the C-suite can work it on their iPads...?
I know at least three startups working that space, right now. And there are probably another 300 on the way.
Of course there aren't. The best way is to know exactly the causal process generating the data, thus enabling you to predict your variable-of-interest perfectly from the limited data.
This isn't currently possible, of course, but there's one best way to interpret data, not an infinity.
For your tortured analogy to bear any fruit you have to assume that we're just waiting to find a dead simple set of equations that explains all human endeavor, and after that finite memory and computation power won't be an issue because, hey, cause and effect.
As far as I'm aware doing probability and statistics correctly is all about framing the problem correctly. You can, for example, accidentally assume that two events are independent when they're really dependent. Is it even possible to detect that kind of mis-framing automatically?
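To partially answer that question: some mis-framings do leave statistical fingerprints that can be caught automatically, at least when you have the joint data. A toy sketch (entirely synthetic data, NumPy only): two binary events are generated with a hidden dependence, and a hand-rolled chi-squared test on the 2x2 contingency table flags that the independence assumption is wrong.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Two binary events that are secretly dependent: B is more likely when A occurs.
a = rng.random(n) < 0.3
b = rng.random(n) < np.where(a, 0.6, 0.2)  # P(B|A)=0.6, P(B|~A)=0.2

# Build the 2x2 contingency table of observed counts.
table = np.array([[np.sum(a & b), np.sum(a & ~b)],
                  [np.sum(~a & b), np.sum(~a & ~b)]], dtype=float)

# Expected counts under the independence assumption, then the chi-squared statistic.
expected = table.sum(1, keepdims=True) * table.sum(0, keepdims=True) / table.sum()
chi2 = ((table - expected) ** 2 / expected).sum()

# df = 1 for a 2x2 table; 3.84 is the 5% critical value.
print(chi2 > 3.84)  # → True (dependence detected)
```

This only catches the mis-framing when the dependent variables are both in your dataset; an unmeasured confounder, or a dependence you never thought to tabulate, still slips through, which is arguably the commenter's point.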
That said, the business model for actuarial work typically hasn't been mining every last shred of individual and personal information (caveats, below) for retail advertising. Instead it's been measuring, tracking, and modeling risks within insurance, and many of the business lines don't concern individuals. You've got business, shipping, and industrial insurance, though yes, also life, healthcare, auto, home, and fire.
In particular, insurance is a fairly regulated market with exemptions from anti-trust collusion for the specific purpose of facilitating underwriting boards -- these are multi-company associations which pool risk data. It tends to make for a slightly less cut-throat competitive environment than IT. Also that personal data isn't bandied about quite so casually.
Though as I said, that's changing. I've been informed by industry insiders that a common practice now in automobile insurance is to purchase "smog reports" from states. These include a tremendous amount of data (most of it specific to vehicle performance), of which the salient element is the miles driven since the previous check. Turns out that your auto mileage is a significant risk predictor.
Though the fact that data gathered for one stated purpose (clean air) is being used for another (rate adjustment and fraud detection) is troubling.
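For what it's worth, the mileage-as-risk-predictor point is easy to illustrate with a toy simulation (synthetic drivers and an assumed Poisson exposure model; these numbers come from nowhere near a real insurer):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate 5,000 drivers whose expected claim count scales with miles driven.
miles = rng.uniform(2_000, 30_000, size=5_000)   # annual mileage
claims = rng.poisson(miles / 20_000)              # assumed exposure model

# Even a plain correlation surfaces the signal an underwriter would price on.
r = np.corrcoef(miles, claims)[0, 1]
print(round(r, 2))
```

In this setup the correlation lands well above zero, which is the whole underwriting argument: miles driven is exposure, and exposure drives expected claims.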
An actuary is someone who does actuarial work. Most people join the profession right out of college. They have no piece of paper other than their degree. They are still actuaries.
A credentialed actuary is someone who has gone through the entire educational system, which are a series of exams that are typically taken while working as an actuary. Credentials and the practice rights that come with them permit an actuary to do exactly one new thing. Only a credentialed actuary can opine on the adequacy of an insurance company's loss reserves (to make sure that the company has set aside enough money for all of the claims it will eventually have to pay).
Not many actuaries issue formal statements of actuarial opinion. All of the other work that we do can be legally done by anyone. There is nothing stopping an insurance company from hiring a large number of statisticians or whatever to do the day-to-day work, only bringing in a credentialed actuary to do the reserve opinion once a year. Companies generally don't do this because actuaries are more than just statisticians; they're business professionals with very deep domain knowledge. Credentialed actuaries command a premium in the marketplace not because of any "regulatory moats", but because the specialized education itself is very valuable. If that ever stops being the case then nothing is stopping the market from correcting itself.
Why does it matter that actuaries are certified? It's a professional role which requires dealing with some pretty heavy concepts. Employers presumably want to be sure that candidates meet some level of capabilities.
That's part of what I was getting at.
But the other was that "Webmaster" was a really hot ticket for a few years. It's faded markedly.
I can't see my parent comment at the moment, but I saw a recent "ask HN" post where the question concerned where all the Ruby-on-Rails jobs had gone. What had been a really hot ticket for the past few years is starting to fade. A lot of kids are starting to learn their first talent-shift lessons.
If you're using it in the big data, industry buzzword sense, then I doubt the OP would consider it "hard science research."
It's starting to read like "webmaster" salaries in the late 90s...
But there was a long dry spell from the late 1980s through the 1990s where those jobs, especially in the US, were very, very hard to come by.
That said, I'd love to check my statements against historical salary data for the field.
What about for people outside of SF/NY/Finance?
Here's an example: you can become a principal consultant at the best and most well-known infosec firms in the world without a degree. By and large, they don't care. Even the ones that say CS degree on the job post - email a decisionmaker, tweet them, whatever - it's not true (some care, most don't).
The salary band for a principal consultant would be around $200-$250k, sometimes before bonuses and incentives to develop tools or do research. Fully realizable goal.
Now, that's my experience in that industry. But someone will also come along and tell you about software engineers who work at AmaGooBookSoft with no degree too. And if you're a top performer there - $250k is a fully realizable goal.
I'm deeply interested in how someone outside of a hotspot like SF/NYC could achieve a $250k salary. I was under the impression that $120k was pretty good for someone joining an infosec firm.
How would you go from an entry-level infosec position (an entry-level security consultant) to $250k/yr without switching companies? Also, what would be a reasonable salary for an experienced dev with no prior infosec experience who is joining an infosec firm? (Essentially someone who is switching fields to infosec, but who is still a very good programmer, and who rapidly picked up on the fundamentals of infosec and applies them in a valuable way.)
(That reminds me, I should check into working in NY or SF.)
1. >> I mean, you're in NYC. Are you referring to someone outside that area?
Yes I'm in NY (Westchester, actually). I'm thinking of people in NYC and San Francisco, so no, I suppose it doesn't answer your criteria of being outside those two areas. That said, I also know someone who makes this much and works remotely for a company in San Francisco, so he gets the best of both worlds.
2. >> How would you go from an entry-level infosec position to $250k/yr?
For someone who decided to make a career of infosec and make that sort of salary (or the associated industry prestige/research you generally end up with at that level), it would take 6-10 years of exceptional work, original research and tool development. If you don't want to do all three, pick two, but you better make those two amazing.
Let me break down rough salary bands for you (keep in mind these will change based on a specific firm):
a. Associate - This is a bottom-rung appsec engineer. If you join Matasano or a similar firm with promising intuitions about security and a background in development, you'll end up here. Great! $70-90k salary (there are outliers above this, and you'll see salaries of 50k - 60k, but if you know what you're doing with job searches and it's a good firm, that's the range). You usually stay here for a few years, training under the wing of mentors, until you are competent in at least 3 different disciplines of information security (crypto, mobile, web app, network, incident response, reversing, malware analysis, etc., and there are overlaps). You'll also stay here until you are ready to supervise someone else running a penetration test and can handle clients on your own.
70k - 90k
b. Consultant - You should be competent in 3 to 5 different disciplines of information security (or ridiculously stellar and potentially famous-in-the-future in 1-2). At this point, you probably haven't presented at a conference yet (and many go their entire careers without doing this), or done original technical research, but you're probably developing security tools or thinking about it. Any good firm is going to give a performance bonus for this. It's most likely taken you 1 - 2 years to reach this level, depending on innate talent, motivation and practice.
90k - 130k
c. Senior - Similar growth progression, now you have 5-7 disciplines under your belt, expert in 1-2. You're either doing original technical research or considering it, and you develop new tools as part of your research. You might present your research at conferences, you might not (most people who do meaningful research tend to submit). You now lead teams and projects for large engagements with "important clients" in the consultancy. Probably took you another 1 - 2 years to reach this, maybe 3.
130k - 180k
d. Principal - You're probably known in circles that know your consultancy, because you're one of the most important consultants at your firm now. You might even be industry-famous, if you have done particularly good research or released a tool that made pen testing easier for everyone. You're expert in 3-4 areas and it's no longer worthwhile counting how many disciplines you could technically operate in because the overlap is superfluous. It's likely taken you 1 - 2 years to reach this from Senior, if you are motivated and keep working.
180k - 250k
A few notes:
1. None of the definitions here are "hard" - don't take how many disciplines and similar such criteria as hard rules, it's just a rule of thumb.
2. These salaries are before bonuses such as for extra travel in a quarter, extra work in a quarter, tool development, research or conference presentation (you bonus for all those).
3. You'll notice salary ranges increase as you go up, because the higher you go the more nuanced the skill levels become.
4. There is a level or two above principal, generally "partner", "distinguished consultant/engineer" or something similar. You're basically at the top of your career as a consultant here and can found and manage consultancy firms. The salary here is $250k+.
5. You can skip levels, and the years are rough averages at best.
Are you absolutely sure about those salary bands? In particular, less than $100k minimum for a developer with ~ten years of experience seems extremely low. They could go get that much in a starting position at a trading firm, so why go infosec? If you can do anything, why do that?
Those salaries seem reasonable for ~2006, but not 2015. They say to reinvent yourself every ten years or so, but starting out at those salaries would be prohibitive.
If you have 10 years of dev experience and you're a strong enough case to get a yes from an infosec firm, they're going to work with you on salary to get to your level. As 'tptacek has explained quite a bit on here, a good firm is receptive to negotiation and starts at their floor, not yours :)
But your title will still be Associate because you need to learn the ropes. They could also fast track you too so you're a consultant within a year if you're ready.
This is a one-size-fits-all chart, for mid-career switchers and new grads alike.
If you have any more questions, feel free to email me. I didn't anticipate this comment would be so popular!
We hired basically two kinds of "software developer" (ie. people who could code but didn't have experience with software security):
* Developers with minimal professional experience in software --- a year or two out of college, say, or someone who had graduated from devops to software.
* Senior developers with impactful specializations and/or a well-cultivated side interest in software security. Impactful specializations would have included kernel-level systems programming, compiler theory, maths, maybe some kinds of networking or finance. Really: it's kernel stuff that got our attention most of all.
The former kind of developer fits into the ladder the way you're writing here.
The latter kind is for all intents and purposes in the door as a senior consultant.
We weren't in the business of asking people to take pay cuts.
Yeah, everything you're saying here is obviously true.
I tried to make the salary bands encompass most of the consultancies that operate in NY and SF (Matasano, iSec, Accuvant, Cigital, Neohapsis, Include, and lots more). As such, it doesn't square exactly with what Matasano offers.
I also want to underscore what you said about pay cuts - good firms and good managers are going to be receptive to your needs. Once you get to 'Yes', they want you, and the value you will deliver as a consultant at one of these shops will eclipse your fully loaded cost.
Finally, I want to share another piece of advice for anyone looking to go into infosec - like anything else, there are good firms and not so good firms. I won't speak ill of any specific company, but firms like Accuvant, Matasano and iSec should usually be your first attempt to break in. They are receptive to developers with no security experience.
Pick up The Web Application Hacker's Handbook (showing its age a little now, but still the only book you need to start in web app security) and read through it.
Other firms sometimes have you do ridiculous amounts of travel (75%+) without informing you of this upfront, or they are firms without good hiring standards (or firing standards) and operate mostly as point and click vulnerability scanners where you won't learn much and you'll be underpaid.
It really, really helps to reach out to people directly and ask them how they like where they work and try to get a referral to interview through them.
They might also send you, or you'd want to read, the following:
- Gray Hat Python (debugging and fuzzing with Python)
- The Tangled Web (web browser security and insecurity)
- The Art of Software Security Assessment - the bible, which you really want to read as it's a foundational text. Very dense, but you'll come out of it knowing how to attack memory very well.
We didn't get to salary negotiations but if I had seen the starting salaries posted above I would have coughed up my spleen. My assumption was that a junior security researcher is a senior developer and therefore the salary would have been at least (senior dev salary + σ).
Matasano didn't have an "associate" rung on its ladder. We talked for 10 years about having a "junior consultant" role, but never did. So you can discard those numbers entirely.
I could obviously be much more specific about how the numbers in the rest of the ladder don't square with that comment, but I'm not going to.
You could get a PhD in a hard science, work for McKinsey and if you make it to partner be pulling in $1M+. Even if you don't make it to partner it's not hard to make >$250K after 2-3 years.
If you want to do technical work that has the compensation dynamics of McKinsey, you need to (a) hyper-specialize in something valuable and (b) own your own practice.
Call them specialists, experts, consultants, contractors, or whatever you want, but not engineers without an actual engineering degree.
It goes further than that actually. The term 'engineer' is often legally protected, and a degree is often a prerequisite, but isn't the only one. This becomes important when e.g. you're building bridges.
for example http://en.wikipedia.org/wiki/Chartered_Engineer_(UK)
Unless you are talking about 'engineer' in the same way some people talk about 'architect' as in software architect.
I don't know the specific pay in these fields, but it is definitely higher than you would get staying in academics.
Most of the folks I knew who were chemists have moved onto other roles/industries.
I met a very respected chemist at AACR last year who said all of his postdocs were scrambling for work, and that some were being realistic and considering moving to China for the opportunities.
And, as the article states:
> If Margaret had left science entirely…for, say, a consulting job…she’d have earned about $100,000 a year or more...but she stayed in science.
>>> Consider taking the same high IQ and work ethic, going into business, and being put on the fast track at a company such as General Electric.
My concern is that "work ethic" is not an independent, immutable trait, but is instead situational. If so, then a high work ethic, transplanted to General Electric, could become a low work ethic. Some folks may need their motivation to come from what they are doing, more than others. Those folks aren't helped much by a simple list of what jobs pay the most.
Truth is that so long as somebody is above you, it's very possible to have to work on things imposed on you, and in research that can mean working on topics you disagree with, along with all the office politics that come with it.
I think someone who has done both industry and research can comment, but my impression has always been that they're mostly the same thing if you're between 20 and 40.
I also got lucky with my choice of advisor and thesis project. While the topic of my thesis was an esoteric matter of academic interest, the _execution_ was an honest to goodness engineering project involving optics, electronics, and programming. So I had something to talk about in an interview.
I actually don't mind having work imposed on me. Among the many reasons why I left academia, one is that I simply didn't have my own research idea. I've even told my boss that he can assign anything to me, so long as he is confident that it's urgent, meaty work.
Now, I'd have a tough time sitting in an office, manipulating pivot tables and "dashboards," and bikeshedding at stand-up meetings.
Another myth or bias among a lot of grad students is that they have to be loyal to their specialty. I was fortunate to be fairly opportunistic about my technical interests.
I should also note that I'm reading all of this with a less than arms length interest: Both of my kids are potentially interested in science careers.
- Because you get paid to investigate some of the most interesting things in the world, in the company of mostly great people.
Science is awesome. Scientific careers right now are shitty, but that's because the current system is broken, not because of anything intrinsic to science. We should try to fix it.
I've tried quite HARD with Onarbor, https://onarbor.com, a crowdfunding-for-science site, but it's not exactly setting the world on fire. Would love it if you have any specific suggestions.
I would expect people to end up funding the department which can produce the best promotional video. Then I expect departments to start spending ridiculous man hours producing promotional youtube videos.
Funding, in particular, is a sacrosanct concept in the sciences. I think your effort is a good one, because researchers as a whole are well adapted to making a dollar go as far as possible. However, since funding is so central, there is a byzantine set of rules at each institution for its acceptance and use. Perhaps you should start at the administrative level?
I love the idea btw.
The site almost looks like an internal VPN search that I shouldn't be accessing, but you should be making it irresistible. :) good luck!
The colored, Google-like letters are cute, but not enough. Try making it look more like Kickstarter or Indiegogo.
Also the "I'm feeling Generous" button returns an error if I don't fill in anything! Make it do something in case I really am feeling generous and want to fund something. Suggest something, etc.
If one wants to stay in science, then there are many other options than being a prof, including but not limited to industrial or government labs, research scientist positions at universities, etc.
Also the majority of assistant professors actually do get tenure and I feel this article is implying that they don't. And salaries are definitely higher than 65K a year. Though again, a CS perspective.
On the other hand:
* I think if you consider "tech, finance, data scientist", some reflection reveals a discouraging fact: we're usually not talking about science or even engineering at all, but essentially support functions allowing new kinds of productivity or scale for mundane business functions. So there are careers out there for us educated in these quantitative/technological arts... but it's generally not a career in science.
I don't know much about research related to tenure achievement and salaries, but I found these:
My takeaway from skimming is that (a) faculty salaries vary, those in areas with the industrial options we've talked about seem to be higher, but an average of $65-70k seems credible (b) data on tenure is hard to come by but everyone agrees it's very difficult to get at high status institutions.
The concept entirely predates modern ideas of science and research, so it is no real surprise that it does not serve that purpose.
As a quant, you can be doing ML to predict the stock market or hard core network coding to make placing orders faster.
In a data scientist job you'd be doing data mining to gather insights about whatever area your company works in.
At a company like Google or MS, you could be developing/implementing interesting algos for search, ad targeting, etc.
This is not even to mention Google/FB research, and even more so, MSR where you could be doing even more researchy stuff.
Now it's possible that the above jobs are much scarcer than the support function jobs but I mean most of the world does mundane jobs and earns way less than you. A PhD gives a much higher chance of getting a less mundane job.
I think he was pointing out that what he is doing is not that relevant to advancing human knowledge. Proprietary trading strategies and optimizations that will most likely never be made public are not any better.
All of your examples are the same until you hit Google/FB research where you might actually be working on something that isn't directly promoting the bottom line of your employer.
64% get tenure according to http://www.sciencemag.org.libproxy.mit.edu/content/335/6070/... at least. Median time to exit the tenure track? 7 to 10 years depending on the area.
So now imagine this. You work as a PhD student for 6 years, as a postdoc for 2-3, as a faculty member for 7 years. And at the end (you're just about 40) someone flips a coin. Either you get tenure, or you don't and walk away with nothing. By this point your industry peers have outearned you by hundreds of thousands of dollars.
So yes. The situation is that bleak and the article is not overstating its point.
Along the way you have to keep in mind that things are only getting worse. Grant funding is much harder to get today than it was 10 years ago. Tenure rates are much lower than they were 10 years ago. So if you're planning on a career in science you have to factor in the fact that the situation you will face when you come up for tenure will likely be far worse than the already-poor statistics above indicate.
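The arithmetic behind the "outearned by hundreds of thousands" claim is easy to sketch using figures quoted elsewhere in this thread (the $1,800/month stipend, the $30-35k postdoc, the $65k professorship, and the roughly $100k consulting alternative the article mentions). Toy numbers only: no raises, taxes, benefits, or cost of living.

```python
# Cumulative pre-tenure-decision earnings: academic track vs. a plain
# industry job, over the same 16 years. Figures are the rough ones
# quoted in this thread; everything else is deliberately ignored.
grad     = [1_800 * 12] * 6   # 6 years of PhD stipend
postdoc  = [32_500] * 3       # ~3 years of postdoc at $30-35k
faculty  = [65_000] * 7       # 7 years on the tenure track
industry = [100_000] * 16     # the ~$100k consulting figure, flat

academic_total = sum(grad + postdoc + faculty)
gap = sum(industry) - academic_total
print(f"${gap:,}")  # → $917,900
```

Even with these deliberately conservative, flat numbers the gap is closer to a million than to "hundreds of thousands," and a realistic industry trajectory with raises would widen it further.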
The point is that with almost any Science or Engineering degree you can get a great job in industry but that in the world of hard science research, a place people go because of their ideas, not because of the profits, you will struggle for little chance of serious reward.
Perhaps this is good for an economy. Most people will be flushed into practical things and those who are driven to be artists, teachers, scientists and performers will be carried there by their passions despite the hardship.
The biggest problem with this is that we are probably herding our most talented young people straight into the open maw of industry and away from things that lay the foundation for human advancement.
I reserve the right to later redefine this threshold, but for now, call it 85%.
Sadly the articles describing a lack of STEM graduates and the ones describing how shit being a scientist is both paint all science with one brush.
S is different from TEM in STEM...and CS leans more towards TEM than S.
Even if you had been successful for 16 years, it could still turn bad.
I'm a chem PhD driving for Lyft while I'm running my nonprofit science outfit. Some of my passengers have told me there are others out there.
"What did one New York City cabbie say to another? So, what did you get your PhD in?"
... was a well known joke when I was a grad student in the late 80's.
> * age 30-35: working as a post-doc for $30,000 to $35,000 per year
You can earn more than that (and younger) if you're prepared to travel outside the US, though it's still not great. I'm hesitant to share personal details, but I'm personally in my mid-twenties and just starting a "postdoc" (in a sense; still waiting to defend my thesis), and I'm earning quite a bit more than that, albeit in an expensive city and with a large tax/student loan burden.
I'd like to do a postdoc in the US next year, but I'd have to take a large pay cut to do it, which is something I'm on the fence about–I figure you're only young once, and money isn't everything, but I can't help but feel I'd be being taken advantage of just because I happen to enjoy doing science, and the additional opportunity cost would be high too. Is it worth it? I'm not sure.
Beyond that... I wrote a much longer comment but deleted it. The fact is, academia is in an unhealthy place right now, and there's quite so much wrong with it that it's very difficult to articulate. Where do you begin? Whether to even stay in academia or leave is a question that seems to be discussed a lot lately, certainly among my colleagues, and I expect to see a lot more threads like this in the future.
Edit: Actually, I should add that despite being in a decent place now, I had to unexpectedly work without pay for a time last year due to a (ahem) 'funding hiccup.' This experience was so very terrible at the time that I'm surprised I didn't just give up on the PhD right then.

To explain: in the UK, STEM funding is typically 3 years, and when that runs out you're in trouble: you can't go and get another job because you need to finish your thesis to get your PhD. Most people can't finish a PhD in 3 years (unsurprisingly). Some people manage to get extra funding from somewhere or start postdocs while still technically writing up, but most people don't. Some people are told there will be extra funding which never materialises, or doesn't arrive on time. Several of my friends ended up going back home to live with their parents to write up (and are still there); this is not uncommon. I know someone else who ended up living in a hostel. Once you've started the PhD you're invested, and the longer you do it the harder it is to walk away. The real kicker to me is that a lot of people accept this and think it's okay.
The best thing you can do for your career is decide, right this minute, what you want to do. Do it as soon as humanly possible.
The reason, I suspect, that this issue is cropping up is that graduate schools are incredibly bad at career development, and it will take at least an entire generation before the culture changes completely. Administrators in general are doing their absolute best and that's a great step. But it's the lab leaders/PIs that are culturally stuck in the past.
Source: recent PhD in biological sciences, spent years improving career development at school, have Real Job now
1. NIH stipend levels are pretty low. When you talk to people on NSF or DoD funding, they're like "You pay your postdocs what!?"
2. There are fewer "eject" options. Physics, for example, has a long and established pipeline into finance. The group I'm with does lots of work on complex networks - that has applications in tech. Bio work, for the most part, leaves you qualified to do bio work. So there's both less salary pressure for "We have to pay you or you'll go work for tech instead" and the feeling of your education being a sunk cost.
Podunk U in the Midwest gets filled with MIT/Harvard-degreed profs, who couldn't wait for existing profs to die off at their alma maters. It also captures the Harvard rejects with 4.0 GPAs, 2300 SATs, 12 AP courses. Extreme selectivity at the elite institutions improves the quality of the non-elite schools. Better profs, better students. Does Podunk U have the potential to churn out some great students then, even if its prestige isn't as good as an elite's?
Besides, one has to consider the other options available. Would the person in your example be happier had she taken a different path? Who knows.
I wouldn't discourage people from trying to work in that field if that's what they want to do.
Basically, you have to play it smart, find a niche, and you can beat the averages.
At my particular institution, I have known several who have transitioned (all older than me, and most who had done postdoc first). Basically what is required to transition is at least one of, preferably both of: A) 1-3 top-tier first-author papers (Nature/Science), or B) getting several small grants or one large grant adding up to at least 150K/yr. Politics can also play a role, obviously.
But I don't know for sure yet whether I will want to transition. No question, tenure-track is more prestige and pay, and research associate/staff scientist was originally created as a position for people who were too old to be postdocs and not willing or able to become faculty. I think the position is changing somewhat to be simply a "middle ground" between postdoc and faculty, less dependent on age.
But I joined this profession to do great research, not write grants all day, and I am given a great deal of freedom to do that, so I'm not seeing right now much benefit in becoming faculty other than pay. Maybe my perspective on the importance of pay will change if I have a family.
So, to answer the question directly, right now I am thinking I will become a PI iff I have to do that to push forward my research adequately. If, on the other hand, I can find a PI who is willing to do all the boring grant-writing work, pay me, and give me a lot of freedom, maybe I won't. Right now I have such a PI, which is awesome. After being this spoiled, I could never settle for the normal "you will do these experiments, serf" relationship a lot of PIs have with their underlings.
Someone else's grant funds my paycheck, but in practice, staff scientist is often a position which is intermediate in freedom between postdocs (who virtually always pursue someone else's research goals) and faculty (who, in theory, always pursue their own, once they're done writing grants and doing paperwork). So, I work "under" no one, but I do have someone who expects me to spend roughly 30% of my time on their research goals, and people would refer to me as a member of the X lab (where Dr. X is the faculty member paying my salary).
Staff scientist typically connotes a relatively senior, but non-tenure track position. So, someone who either did not want, or could not cut it in, the cutthroat world of tenure-track positions. The hours often are lighter than for professors. It is typically filled by someone in their late 30s or early 40s and who has done a postdoc.
In my case, and at my institution, it is different. I got this position because I and the person hiring me felt I deserved better salary, benefits, and prestige than a postdoc, whose salaries are fairly rigidly set by the NIH in the mid 40Ks. I can still become a professor later (the main way to do that would be to successfully write some grants). So overall, for me, this position is a step up from the likely alternative of postdoc, but for someone in their 40s whose competing alternative is faculty member, it would not be nearly so appealing.
A related position, slightly higher up the food chain but more definitively non tenure-track, is that of "core director". That means you run a "core" (a facility that all the researchers at an institution can use for a particular kind of experiment or analysis). A bioinformatics "core director" would then take datasets from researchers who come to them and perform analyses on them, and by playing their cards right, would get lots and lots of coauthorships.
Sounds to me like this is the problem. It's a lot to ask for lifetime employment, a pension, etc. The tenure system causes this. A more fluid employment system would provide more opportunity, since recruitment wouldn't be this overly neurotic thing.
That said, I know people in this exact situation. They bounce from university to university. Most of them eventually ended up somewhere, with a couple of stragglers who always came off as hugely unlikeable trust-fund kiddies, so I'm not surprised they can't convince anyone to keep them, or that they don't feel motivated to work hard.
The people I mentioned above were mostly U of C grads. Salaries there:
>The average salary for a full-time professor at the U of C during the 2007-08 academic year was $170,800—the fifth highest in the nation
170k 5 years ago is nice scratch. Even in "poor" public schools like the University of Illinois, you're looking at 110-120k for entry level professors. Oh, and usually that's without working any of the summer months.
I'm not sure why there's so much hand-wringing over these jobs, many of which pay a decent wage. Yet there's no hand-wringing over the millions who are just as smart and talented in the private sector who have no tenure protections, no publishing rights, no patent co-ownership rights, no royalty rights, and fear layoffs all the time.
Hah! This must be one of the strangest and most pervasive myths about academia. Typically you only get the summer off ... from teaching. It's the prime time to do that research that motivated you to get into the business in the first place...
And conveniently, the University doesn't pay you a salary for this time. You can elect to pay yourself... out of your research budget. At the opportunity cost of students, postdocs, staff, equipment, travel, etc.
Sad fact: there is no will in the general population for these programs. I'm currently studying for a master's degree in applied statistics, and my entire cohort is six people.
There has to be a better way. I liked the mention of DIY scientists. If people could do hobby medical research the same way they can 3D print, start a SaaS company, or build a home automation setup with an Arduino and some parts from eBay; if doing DIY lab research were as easy as launching a mini DeLorean-style quadrocopter (search on YouTube), I'm sure a lot of the community here on HN would be working on the next cure for cancer.
Perhaps there's room for a 21st century form of patronage instead.
Then if you do get a good research position somewhere, you get your base pay and all, but a lot of scientists reach out for grants and can pay themselves from them, padding their base. If you can get one of those cherry positions, it's potentially way nicer than what most engineers make. If you work on things with industrial applications you can get royalties and such, all while working at a university. A number of these places seem to have real pension-type retirement programs as well.
The haves in science are doing great, but the vast majority are have-nots, and they live near poverty in a lot of positions. My opinion, looking in from the outside (my wife is a scientist), is that there are two main problems: 1) There is little understanding and appreciation for real science; it's thankless, there are tons of failures for usually incremental progress, and the level of meticulousness is something else. 2) They've got their own culture running it now, and there is a history of paying dues and lots of unwritten traditions about how you can grow and evolve a career. Postdocs don't get paid much; that's just how it is, even if a lab can afford more. Postdocs don't often get jobs where they did their postdoc, even if they're experts in the lab's area of work, because that's just not how it's done. It's kind of like how MDs are stereotypically ground up by insane shifts during residency because that's what the doctors before them had to do, almost like hazing.
There are also a lot of unwritten traditions on such basic matters as how to write and format a scientific paper. The standards are quite meticulous, and in many fields, nobody has actually bothered to write them down or teach them explicitly as course material to students.
Most MD/PhD positions are doing research directly applicable to medical treatments and/or clinical trials. And it's definitely possible to be an MD/PhD and not be paid a tremendous amount.
Some of the schools around here (Boston) have stipends which work out to be about 18,000 per year.
1500-1800 was common at the schools in the mid atlantic that I knew of recently.
You rarely talk to anyone who says "I want to make a lot of money so I'm going to be a researcher!"
Normally people (want to) become scientists because they are passionate about science.
The problem is that science is not highly valued, especially academically. We live in a society that values trading commodities, and science isn't easily commoditized. The closest thing we can come to a commodity in science is papers, and maybe patents in the commercial world.
The shame is that science is a public benefit. In our whole history, nearly everything is ephemeral, but the things we have learned and understood and documented have persisted. Technology can be good or bad, but (well founded) science is knowledge, and it's only ever good as it lets us better understand.
We live inefficiently as long as we focus on letting people live comfortable lives only when they have something they can restrict access to and barter in exchange for that opportunity.
There are people out there who would like to do science, who would like to further our understanding of the world and universe we live in, who don't do so only because they need to help produce something that people will pay for.
In my opinion, we should pay for education and a fair income for any person who wants to practice academics. Sure, you can save luxurious salaries and budgets for the superstars, but make it such that everyone who has the capability and desire to do so can work in science, or possibly other academic pursuits, that they will have a wage that will let them have a home, healthy food on the table, support a family and live a comfortable, if modest life. The condition being that all of your results are made available to the public for free and you might get called to teach.
I think of it kind of like being in the military. Enough money gets spent on the military, TONS of money gets spent on the military, and for much of it, the public sees minor benefit. But you know that when you join, that at least while you're serving, you are going to be guaranteed to have a place to stay, food to eat, health care, education options. Nearly anyone who is fit can join.
If you could do the same for science, I think that would be great. Our governance would need to change first though, and our culture. I don't trust that government employed scientists give good results over lies to promote an agenda. At least in our current private economy you have conflicting interests that can act as oversight.
But as automation and technology supply more of our needs with less labour, we're losing more and more jobs. This leads to people without jobs. If there are people in that group that are smart enough to be scientists, there's always room for more science.
When I was doing my PhD, that is all I was really asking for. Honestly, I would have been okay with making the $40k salary per year for the rest of my life if I got to do the work I loved. But even that was not even remotely guaranteed.
Fortunately I had some good software skills from my bio phd so I could land a job.
It has nothing to do with our society per se. All capitalist societies work this way. Externalities distort pricing in very predictable ways. The fact that science is undervalued and commodities trading overvalued comes right down to how externalities are produced and/or captured by these activities.
Unlike the Soviet Union, where no personality cult of Stalin and Lenin existed. </sarcasm>
North Korea is as good an example of a totalitarian dictatorship as any Communist country (including my own).
Maybe the point that you are making is that communism is worse because if capitalism can also exist, it will take more resources and make communism miserable. Or, that if communism were widespread that life would be more miserable for everybody because we wouldn't have the capitalistic systems to make life more luxurious. Anyways, there's been plenty of debate on this topic, I just don't think there is any way to make a fair comparison in today's world. Even though these countries have similar ethnic makeup and size, I'd still say it's more of an apples-to-oranges comparison.
I'm not a fan of generational hate, but for the generations that were the architects of this corporatized society, an academic job meant that you wouldn't be rich, but that money wouldn't really be a problem for you. You'd have a lifestyle comparable to about $120k in the Midwest, adjusted for cost-of-living (but coastal property also wasn't as skewed). The humiliating day-to-day struggles of the poor are hostile to the life of the mind, and the earlier generations knew it.
Hence, you have Boomer professors, even with the best intentions, championing the academic career because it has been good to them. If college advisors weren't picked from the most successful 1-2% of those who attempt PhDs, you wouldn't see the smartest of every generation shoehorned into these programs. (And yes, my experience was that almost all of the top students took an academic path at first, though some, like me, left as early as one year into a PhD program. Sure, there are a few who start Facebooks or are hand-picked to be proteges of hedge fund managers, but the other 95% of top talent veers academic-- until their illusions pop.)
So... while it's true that no one went into academia expecting to be rich, they also expected lives where money wasn't really a problem: they'd be able to buy a house, raise kids, travel abroad now and then, and because college profs respected reciprocity in admissions, beat college admissions without a fancy prep school (those being at a price which most professors would still find out of reach, even in the better times). And they got totally fucked, relative to that promise.
Yes, well, academic training remains the best way to acquire truly cutting-edge skills and training. Or, in many cases, almost the only way (other than self-study of academic materials) to know that cutting-edge research exists at all, to know where the research frontier between the not-yet-possible and the possible actually is.
Of course smart people want to spend at least some time in academia: smart people don't want to give up before they've found the frontier of the possible. The smart and devoted people want to go beyond the impossible.
On the other hand, consider categorical imperative. We need someone to study science and advance the field. We even need medieval historians (albeit, perhaps not so many of them). If everyone decides to become a bullshitting rainmaker of zero net value to society because that's where the money is, then we all lose.
My mother is nagging me to join a masters degree so I can get a job as professor in a public university.
She does not think it is a good job; she thinks it is an easy job. Her words were: "I think you are smart enough to pass the tests to get the position."
That is because in a public university, after some time tenured, you can't ever be fired unless you commit a serious crime, so it is a "good job" in the sense that you will never be unemployed again...
At least this is the situation in Brazil.
However, I do speak a little Portuguese.
It is a calling.
Which is why the insincere find it so easy to take them for a ride.
This is in contrast to the doorman example, or in my case, being a Lyft driver (which pays more than being a postdoc). Every night, I help society out by providing a service that someone wants, and if you want to be more abstract, by keeping drunk people off the road.
I was able to raise $56,000 for an experiment in anticancer research. That doesn't seem like much (it isn't) but in retrospect it's about right. I asked for money for one experiment, and that's what I got. It doesn't pay my salary, but I probably don't deserve it (yet) until I've proven myself at least at one stage. Drug development is risky, why should society pay much more than the bare minimum to get it done?
Science, specifically academic pursuits, is not something that can be quantized in the now. The scientific work that enables us to live our lives, that enables that very Lyft app to function, wasn't made last month or even last year. These technologies are based on fundamental theories developed 30 or more years ago (centuries if you want to count the basic electronic theory that's enabled circuits).
Scientific research has practically $0 value in the now, but it is nigh impossible to say what value any specific scientist's work will have in 50 years. It is for that reason that science and its practitioners need to be seen as a public service/benefit, not a business commodity. They aren't even playing the same game as CEOs.
There is a demand for science. People like contributing to something bigger than themselves. It's just that the demand may not be as big as we hope it to be. But "get it done now" might not be the best approach to some parts of science. Oftentimes discoveries that were a total schlep to get through become nearly trivialized, shortly after discovery, by an orthogonal set of technical enablements.
Are you talking about NIH funding here, or crowdfunding? The accountability for government funds is vigilant (some may even say restrictive), as anyone who's applied for a grant or sat on a study section will tell you.
I disagree with the accountability aspect from experience, but I concur about cronyism, which is a major bias that the system is not set up to handle. It takes time to get good at winning grants, thus older applicants (tenured professors) have a much greater advantage over new ones (postdocs). There is a cultural sentiment that the best years of one's research career are near the end after amassing knowledge (or influence). This leads to many PIs shunning retirement and dampening enthusiasm for any new recruits.
The government is somewhat vigilant about fraud, but incredibly not-vigilant about bad ideas (Arsenic life comes to mind). But the accountability I refer to pretty clearly in my screed is long-term, 50-200 year accountability. You don't go back in time and rescind the grant of some scientist whose work was less than marginally relevant. And there is plenty of that stuff (Nanoputians come to mind).
But the vast majority of science is not the best. And most scientists will spend their whole careers trying to make a big dent in the unknown--but fail. Most make a very small dent, or none at all.
From a long-term social perspective, we need the best and brightest to rise to the top of science and push it higher. But that's not the same thing as providing a working wage and middle class+ lifestyle for everyone who wishes to work as a scientist.
"And, indeed, you are lucky! After a hundred years or so, your idea (along with a bunch of other ideas) leads to the development of aquarium air pumps, an essential tool in the rapidly growing field of research on artificial goldfish habitats. Yay!"
 - https://www.quora.com/What-do-grad-students-in-math-do-all-d...
Science is an exploratory discipline. You don't know what's most valuable until you've got the benefit of often 20-50 years of hindsight.
It is saying that you can't expect all the players on the football team to collect the same salary as the star players, which incidentally is exactly what happens.
A researcher's work is largely dependent not just on advances in related fields, but on their colleagues incrementally advancing their own ideas. Eventually an all-star may envision and pull off a previously unseen way of combining all those advances, but it still requires those advances to have been made in the first place.
That's how science advances, not by individual genius, but by the collective ability of the field.
Even if 70% of the field aren't doing groundbreaking work, the work they are doing allows the other 30% to focus on the groundbreaking things.
Probably 50% of that non-groundbreaking work you mention is completely worthless, and the field grew in spite of those wasted resources, as dnautics says.
Even then, if the system was honest with individuals about their science abilities (not everyone can be a superstar), then there would be much benefit in preventing these people from advancing on their own in a broken system, and giving them the opportunity to have an actual job in the groundbreaking areas as a tech or such.
It's more like they are able to pull off what they do in spite of all the other researchers. (that are spawning off bad hypotheses, producing fraud, using up precious grant money, etc.)
In general, my observation has been that the students who quit after a Master's tend to do better financially than those who continue on. That's not to say that Master's graduates are hitting the big time; they just tend to end up getting their market value.
Clearly an MS or BS level science student can absolutely eventually do the same things as a PhD or postdoc. The latter just have scientific approaches that have usually been subject to repeated and rigorous challenge by their peers, which means they are typically (though by no means always) more rigorous out of the box. In my time in industry I've found that in many cases that kind of critical challenge doesn't happen as much as in academia.
In the end I think people in hiring positions use the PhD letters as a screening tool for how strong a scientist the person is, without necessarily really evaluating the individual.
I would agree in the context of staying within the R&D field. However, I know more than a few people who used the extra years not spent doing PhD post-doc to develop their careers in a different direction (sales, biz-dev) that allowed them to stay in their field and shattered those glass ceilings. You can still do these things with a PhD, but it will be more difficult (people pigeon-hole you, and you're playing catch-up time-wise).
In the end I think people in hiring positions use the PhD letters as a screening tool for how strong a scientist the person is, without necessarily really evaluating the individual.
This is key. I've actually found that a PhD. is a relatively poor predictor of how well a person might function in an industry R&D role. Please note that I'm not saying it's a negative predictor. I've just found there's little to no correlation. I've worked with incredibly smart and capable people with PhD's and others who could not function outside of a low-consequence lab setting.
One piece of advice that stuck with me the most was from the director of career development, who had previously been a research manager at IBM. He told us that if we wanted to go into industry just skip graduate school and focus on getting lab experience. He told us if we really wanted to do graduate work to just get a MS and then go into industry. He told us to only do a PhD if you wanted to go into academia because not only was it a waste of time compared to what you could learn and earn in industry, it would pigeon hole you into a very specific field.
He was very adamant that a key skill in business was being flexible and PhD programs most certainly aren't.
In R&D, it's a matter of career path. A PhD is expected to become a lead, someone to guide a research program. A Master's degree holder will be closer to the action and likely develop technical skills that make them an asset lower on the org chart.
The problem, I think, stems from the fact that PhDs are also expected to master their techniques and to perform well as a requirement for advancement. They have a career path that extends to management, but are using mostly technical skills at the beginning. The Master's degree holder does not have this difference.
I think "PhD is a poor predictor of how well a person might function in industry R&D" is spot on. PhD is a poor predictor of most things that hiring managers are looking for, I might say!
Schools are producing too many Biology Phds. Or Forensic Science undergrads. Such is life. There's no way to predict 100% what the market will be like, so we make do. There's no intrinsic right to a job in the field that one chooses to study, and most people work in fields outside of their major.
It's also true that your choice of first (or second or third) job is no guarantee. GM used to be a job for life. Now it isn't. As individuals we make our best choices, and then have a small safety net to fall back on if we're wrong.
The only thing we should push for schools to do is show transparency on where the alums wind up, because unfortunately their incentives in producing Phds (and JDs and Russian literature degrees) differ from ours in receiving them.
I and the other people in my graduate program didn't have that option... I studied music technology, and the degree was pretty expensive: no stipends, you literally pay for the degree, BA-style, but you are not going into a six-figure business field.
These are the choices you make in life, and being presented with an opt-out like that is, as they say, "a good problem to have." It would be cool if every field could make intelligent people rich as they improved the world, but, you know... it's life.
If I had chosen a different field maybe I would be making 6-figures instead of doing boring enterprise work (decent money but most of my research peers were less programming focused, didn't even have this type of out), but I also would be even further from my goals. I wouldn't have the foundation of knowledge that I'm hoping to return to / build from in the future.
I guess the point of this comment is that the author should be thankful that people in science are valued by industry (even many programmers are only considered "labor", basically). There are lots of fields people devote themselves to that have no industrial use & they are relegated to lower-paying dayjobs regardless of who they are willing to sign on with. As for whether or not academia should be more lucrative, well.... should people be paid to learn? is academia the most efficient way of learning? is it an antiquated social construct? many open-ended questions.... got my masters but didn't go back for PhD, waiting to see whether or not there is a better way to make it happen out on the pavement.
I think it's the years that really hurt. A student is delaying entering the workforce, for a lousy-paying job, by 5-8 additional years, and now a postdoc or two on top of that? Wouldn't you want some more money at the end?
But I completely agree with your diagnosis: accountability is a great way to open some folks' eyes. It is my wish, in fact, to be able to reach biology/chemistry/genetics/biochemistry/etc. students early in their undergraduate career and convince them NOT to consider graduate school. I think the application process with GREs etc. makes it difficult to accurately weigh the opportunity cost of going for a Ph.D., and by the time I talk to applicants on interview day it's too late.
I think Academia is not paying that well. I think a lot of these post-docs could make a lot more in industry, no? I think if you are willing to sell your soul to the devil, most STEM pays pretty dang well...
If you are in a programming heavy field, yes, else no. If you did biochemistry, developmental biology, etc, you are SOL.
For example: a lot of Biochemistry and Dev Bio these days relies at least in part on so-called next-generation sequencing. If you happened to work on a project where you worked with that technology, there are lots of well paying jobs out there for you. If not, it might be harder to find one, but again it's very person-project-location dependent.
Of my graduate school (PhD Biochemistry) cohort, I don't know anyone who doesn't have a job they're happy in. Some of them went postdoc, some went to industry (maybe 50-50 at this stage), and I have no doubt those who stayed in academia would have no problem finding a job in industry. That said, I went to a strong program in an area where there's a lot of Biotech, so that helps.
As for me, I left academia and am now a hybrid data scientist/biochemist; I'm fortunate to have a decent salary and a job I love. Was the PhD worth it? I wouldn't have this job without it, but I don't know. If it wasn't, it's mostly because the PhD was a huge opportunity cost for me, I think.
What area is that?
I never understood it to mean a path with a defined cost to equal a defined salary. This scientific equation of years of school to salary just feels foolish.
Being a scientist offers nearly none of those things.
But today's young scientists and researchers aren't guaranteed anything -- they're lowly paid and most likely will not get any decent lifelong faculty position. And settling for being, say, an adjunct lecturer means you'll be paid less than a high school teacher in many cases (with no pension and very minimal benefits to boot).
Oh yeah, sure, while I'm young and learning maybe it's okay but at some point you realize you're just being scammed out of money
Through various internships and part time jobs, I worked both in industrial and academic labs as an undergrad, and while I enjoyed the work it helped me realize that it was more about dedication to the love of a subject than actually forging a balanced, lucrative career.
An informal poll of the students in my PhD program showed that a large fraction (~90%) came directly out of undergrad. Without setting foot in the real world, I believe one can hold on to this distorted reality. And in the economic "long term" (salary per additional year of schooling) most PhDs are probably better off than their Bachelor's counterparts, but this belies the opportunity costs.
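The opportunity-cost point can be made concrete with a toy back-of-envelope calculation. Every number below (starting salaries, stipend, postdoc pay, raise rate, time horizon) is an illustrative assumption, not data from this thread; the point is the shape of the comparison, not the exact figures.

```python
# Hypothetical cumulative-earnings comparison over the 20 years after a
# bachelor's degree. All dollar amounts and the 3% raise rate are made up
# for illustration.

YEARS = 20

# BS path: start at $60k with 3% annual raises.
bs = [60_000 * 1.03**y for y in range(YEARS)]

# PhD path: a $22k stipend for 6 years of grad school, a $45k postdoc for
# 4 years, then an industry/faculty job at $100k with 3% raises.
phd = [22_000] * 6 + [45_000] * 4 + [100_000 * 1.03**y for y in range(YEARS - 10)]

bs_total = sum(bs)
phd_total = sum(phd)

print(f"BS path, cumulative after {YEARS} years:  ${bs_total:,.0f}")
print(f"PhD path, cumulative after {YEARS} years: ${phd_total:,.0f}")
print(f"Gap: ${bs_total - phd_total:,.0f}")
```

Under these made-up numbers, the BS path is still ahead cumulatively at year 20, even though the PhD path earns more per year once the training ends; tweak the assumptions and the crossover point moves, which is exactly why the "salary per additional year of schooling" framing hides so much.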
Even MORE depressing, once our young female scientist has done a good job, built her lab, and is now in her mid-forties, if she really wants to improve her pay at that point she will have such attractive options as "department chair", which adds a lot of non-research administrative work to an already overworked person. What fun!
If I'm reading it correctly, this article seems to suggest that the advice that pursuing work that you find fulfilling is itself responsible for lowering wages, which may be true in as much as people are willing to accept lower wages if they enjoy the work despite having better options. But what would you propose as an alternative? Pursuing work that you find less fulfilling in exchange for more money? That's not a sacrifice I'm willing to make, and I've been privileged with enough opportunity and financial security to have a choice. Of course everyone has different objective functions they're seeking to optimize, so I see why others would make different choices, which is all good by me.
To put it another way: do you think you are paid enough for the quality of work you do? And at what salary would you rather work on something less interesting? I always thought part of the problem was that salaries are mostly set--or that's what researchers think--and therefore they accept whatever the system assigns to them.
That's because way, way too many people want to be scientists. Lots of people want to be actors, too, and most of them end up working as waiters for the same reason.
I've never run across such a smart group of people who are so dumb. Even if you win the lottery and get that coveted tenured position you're not going to be doing much in the way of research - you're going to spend all your time filling out grant applications and managing grad students.
You may as well get an MBA instead.
Like, I get why a naïve opinion might be "the more engineers, the better" on the simple but attractive theory that more engineers = more stuff = better for everyone.
But what's the even simplistic theory that leads to "the more lawyers the better"?
And surely even a simple theory says "once you have enough doctors to provide healthcare to everyone, you're kinda done with doctors."
We already have things that aren't scarce and by assuming scarcity is a factor of value, we irrationally devalue them. Post-scarcity economic systems attempt to resolve that.
Look at the struggle the entertainment and journalism industries have had since the value they provide was divorced from distributing scarce print/recording media, etc. Disregarding subjective human behavior, the rational behavior under capitalism used to be to buy the thing you wanted; now it's to try to be a free rider, so that's what most people are doing, despite valuing creativity and journalism no less.
So, instead of being able to participate in an economic system that solves the problem of compensating the creators of non-scarce information, they turn to DRM and paywalls and other ineffective solutions to try to force their bits to be effectively scarce for the majority of consumers, and therefore remain valuable and prevent the free riders.
That said, a post-scarce economic system also shouldn't aim to be producing a glut of any one variety of professional, so it certainly isn't a good argument within the original thread.
If anyone on HN has any questions/doubts/need details, AMA. I'm a Life Sciences Post-Doc at a major US university. I'll try my best to answer (no personal details please).
My views of STEM and the liberal arts have begun to merge. To excel academically in either requires devotion, and you have to study for ten years just to build enough base knowledge to start making headway into new territory. The 10,000-hour path to expertise doesn't fit inside a college education.
Going to college used to be an affair for the intellectual elite, something you did when your family could already cover your living expenses. You pursued knowledge because you wanted to, and there often weren't jobs connected to degrees. That is different now: thanks to available credit and a golden carrot, people from all corners can sit in a university and get an education. The costs are deferred, hidden, for their future selves to contend with.
Knowing what you know now, would you again go into Life Sciences with the thought of making a living in it? Or would you go again because you love Life Sciences--it's your hobby, your passion--but do it on the side and get a 'regular job'? Maybe you could finish your PhD, decide that the joy of discovery was worth the time and cost, but then go into a more lucrative, perhaps unrelated, field.
Whatever happens, the time spent learning everything I assume you have learned has value, whatever the price. I took a different path, but I loved my two years of community college, which I did as a hobby; learning about history, art, and geology has brought richness to my life.
You're on HN, you have a hacker mind I assume, you see the startups. You have a valuable asset in your knowledge that most (top 20% easy) don't have. Look at the world, find a problem, solve it, do it on the side like programmers bootstrap. You can make the future what you want, but an education is very valuable, and a really cool thing to have done.
You can make the future what you want, but an education is very valuable, and a really cool thing to have done.
Yes, as is having composed a symphony, painted a masterpiece, or written a magnum opus. But none of those require an investment of youth (time) along with an opportunity cost that is near-impossible to recoup, in more ways than one.
I understand the essence of what you are conveying in terms of value, and I was motivated primarily by the same ideals and thoughts before I decided to dedicate my life to science and research, discarding a tried-and-true (by social standards) career as a medical doctor (and before anyone asks, no, I can't go back; it's too late for that).
The romance of science is one thing, paying the bills is another. And watching your fellow college-mates make (undeserved, imo) high salaries with far less education than you, makes you question many things, including that pesky thing called your career choice.
I have been a long term (~6 years) HN user (lurker). Here, in front of my own eyes, I have seen Web 1.0 implode, HN explode and the birth of Web 2.0 as well as it being raised, milked and put to pasture. I have seen HN legends, both companies and people, come and go.
Somewhere, within me, lies a dreamer, the same dude who lured me into the romance of science, whispering the possibilities that may lie in life-sciences+software entrepreneurship. But that implies taking huge risks, not easily possible with a wife in the same science-boat and a very young kid. No real savings, coz life science doesn't really pay much in research. Can't go the ramen route, am almost (back) on it as a Post-Doc!
So, what are my possibilities? Anything that can open the doors to entrepreneurship, draw upon my polymath training (biology + software) and a deeply diverse skill-set (molecules-of-life + mostly python coding + systems administration). Hello HN! Any takers?
I'd like to think that all these gripes about science are reaching some sort of boiling point, and that some solution is right around the corner-- except that I'm here in the trenches, so I know there is no such thing happening. People suffer through it, grumbling once in a while, but refusing to attempt to better their own circumstances in any way other than more work.
Many of them lead outwardly lonely or empty lives spent slaving away at their lab benches or in tissue culture rooms. They cannot afford to replace their old clothes, phones, or bicycles. They live in houses with 4-6 other people, even into their 30s. When they publish a paper, it is their supervisor's name that gets noticed. The bitterness and beaten-down demeanor they express suffuse many of their non-work conversations.
Science is a pretty bad life in academia. Industry scientists still need to muddle around in academia for at least some period of time, but on the whole they seem much better off... except that most people from the academic side don't consider them to be scientists at all.
Are they publishing their work in a way that other people can use? Or are they grinding out the meat of patents and trade secrecy?
My professors pushed me so hard to get on the grad school track.
I was too smart to do something so stupid.
Professors are no longer the smartest people in society because only idiots go into academia. Grad students are often not the best or brightest but people with some kind of mental illness that prevents them from making rational long-term decisions.
There is no way that an intelligent person would ever try to get into academia in this climate. The people who try are on par with the pothead guitar player starting a garage band to get rich--they're not smart.
Oh, my, yes. One of the biggest problems with being smart is that it makes one especially prone to listening to and trusting in words, because doing so was the rewarded and reinforced behavior throughout one's entire life--which, remember, has consisted of little but schooling for such people up to that point. It takes additional training or practice to break out of that. After a lifetime of words about how wonderful this career path is, how suited to it they are, and how it's their natural goal, it cuts down to their personal identity to realize that the words were not true, and that's a hard thing to go through. (I am not being even slightly sarcastic. That can literally drive people to suicidal levels of emotional pain.)
But I do get your point and have lived it and seen it first hand.
Having said that.. I truly believe that academia is now SELECTING for these types of people: naive, foolish, simple-minded, trusting, innocent, gullible.
Only people with those traits are dumb enough to get into it.
The smartest people, who have not just intelligence but foresight, self-control, cunning, and guile, are found at Wall Street, in med school, or as high-level executives in major corporations.
They aren't doing science because they are too smart for science. Science is for dunces.
If you can't spot a dishonest salespitch from an academic, doesn't that make you gullible?
Smart people make decisions based on evidence. Gullible people trust their elders.
I can't see any better solution to human gullibility than to herd all the gullible people into a reproductive dead end. This means gullible men should be put into a situation of poverty so they can't find mates, and gullible women should be made to work in careers until they turn 38 and their likelihood of reproduction drops into the single digits.
If we can't make gullible people smart, we can at least make sure their genes leave the gene pool.
I am of the opinion that having a building full of head-in-the-clouds brainiacs chipping away at hard problems in medicine, nanotech, industrial chemistry, etc. will do more good for society than having a building full of the meanest, savviest businessmen brainstorming innovative new ways to trick people into buying comparatively poorly priced insurance / advertising / savings plans. Due to the non-excludability of scientific progress, we have an economic system designed to incentivize the latter at the expense of the former. That's a bug, not a feature.
Markets are very good at sniffing out and rewarding some kinds of value creation. They're not so good at sniffing out and rewarding other kinds of value creation and sometimes they create perverse incentives that are downright terrible. There are enough instances of this happening that I don't think it should be controversial to demand that "markets know best" philosophies should only be taken seriously if they come with a string of conditions for (at the bare minimum) identifying dysfunctional markets and rejecting their conclusions in such cases (e.g. Enron's rolling blackouts, pay-for-service health care incentivizing longer wait times, the Chinese Businessman strategy, Ponzi schemes, etc). "Markets know best" should never be taken as an article of faith, especially if you identify as a libertarian.
Libertarian philosophers who have won respect from the academic community (self-test: name three, excluding Ayn Rand; otherwise you have homework to do) know how to qualify their arguments. "Markets know best" is a conclusion to be made (or not) under specific circumstances for specific reasons. There is no branch of libertarianism I am familiar with that has both survived the competitive back-and-forth of the philosophical community and managed to uphold "markets know best" as a tautology. If you're taking "markets know best" as an article of faith, I would recommend examining this foundation for cracks. The most popular one (and the one I "fell" for during my libertarian phase) has to do with using a transaction-centric model for value creation. There are others. Perpetual vigilance is the only defense.
Sorry if I've read too much into your opinion, but it seems very similar to one that I used to hold and I'd be remiss if I didn't point that out.
Do you know what else Galois did in his free time? He became a political radical, served a nine month jail sentence, and had an affair with the prison medic's daughter ultimately resulting in his death at the age of 20.