Nice reminder of some general principles. I wonder how well it applies to young scientists given today's academic career prospects, though.
> If you want to make progress in any area, you need to be willing to give up your best ideas from time to time. [...] Medawar notes that he twice spent two whole years trying to corroborate groundless hypotheses.
Unfortunately, being willing to scrap your idea is only one part of the equation. Securing funding after 2 "failed" post-docs is an entirely different matter.
This is why you should never dedicate yourself to a single project, no matter how much you believe in it. At a minimum, you should have one backup project that will generate publishable results no matter what happens.
Ouch. Your point is tough to stomach, but there is another side to it they don't teach you in school: tenure-track research isn't the only path. In fact, tenure-track research may actually be the worst path for the vast, vast majority of PhD graduates.
Exactly. It's foolish to follow this path. There's a ton of VC money out there if you need millions or billions to see your idea through. If not, just make your own money. Honestly, it feels less stressful and less hypocritical than the academic Ponzi scheme.
Few academic disciplines lend themselves to monetarisation via startups. Indeed, many academic disciplines don't lend themselves to any kind of monetarisation, except academic funding.
However, building a business is a different problem from doing research. When one accepts VC funding, the funding comes with the expectation that it will lead to a high-growth business. This is fine when you have an idea that has the potential to “take off” from a business standpoint.
However, there are many research problems where there is no obvious or immediate business application. The aim of such research is different from the aim of investors. This requires a different funding source, one that is willing to embrace the risks that come with research and is willing to do work solely for the advancement of science, with productization being a nice side effect rather than an expectation.
Of course, obtaining such funding is not easy. Part of what makes modern academia such a rat race is how competitive it is to procure research funding from funding agencies such as the NSF (disclaimer: this is a US point of view; I’m not very familiar with the situation abroad). My advisor works hard applying for grants, and sometimes they get rejected. I’d love a Genius Grant ($125,000 a year for five years with no strings attached) to work on whatever research I want without any pressure from the funding agency or from managers, but only a handful are awarded each year.
There is also the matter of research freedom, in the sense of being free from the pressures of short-term thinking and the "publish or perish" mentality. I am reminded of Alan Kay's observations (http://worrydream.com/2017-12-30-alan/) about short-term research. I'm also reminded of what the discoverer of the electron, J.J. Thomson, said in a 1916 speech that resonates with me whenever I think about research:
"If you pay a man a salary for doing research, he and you will want to have something to point to at the end of the year to show that the money has not been wasted. In promising work of the highest class, however, results do not come in this regular fashion, in fact years may pass without any tangible result being obtained, and the position of the paid worker would be very embarrassing and he would naturally take to work on a lower, or at any rate a different plane where he could be sure of getting year by year tangible results which would justify his salary. The position is this: You want one kind of research, but, if you pay a man to do it, it will drive him to research of a different kind. The only thing to do is to pay him for doing something else and give him enough leisure to do research for the love of it."
For me, my dream is to start and grow a non-research lifestyle business that pays the bills, so that way I can spend the rest of my time on research, which is what I'm passionate about.
I am on the same path: make the money and fund my own research. All other models have existed long enough to be perverted, and the culture around them is now antithetical to the true motivation of the endeavor.
The only fields not amenable to this model are perhaps high-energy physics and things of that sort, but I'm sure some resourceful mind will come up with ideas!
Exactly. Most academics really can't win. It's how the system is designed.
This is why I sold out after my MD/PhD and became just a regular physician. Maybe in a few decades I'll have enough in savings to fulfill my dream of becoming a mad scientist...
Having been guilty of "can we try another algorithm", which is only one remove from p-hacking, I think we need to encourage more publication of negative results. We need to incentivise and reward it.
There should be a way to get tenure on it, at least partially.
The problem is that good negative results are also a lot of effort to achieve. You need to figure out if you have a real negative result, and not just some error or mistake in your experiments. And spending that additional effort when the result by itself is not interesting might not be justified.
It is too easy to churn out negative results though. If someone were to hit this at an incentives level, I think it might be fruitful to require that anyone getting tenure has to have had a major study replicated by an independent researcher. That'd have complicated ramifications and probably encourage a very insular community.
That would improve incentives for those trying to get tenure, but what are the incentives for the replicators? That seems to be the biggest problem. It costs money, time, and with the current incentive structure, reputation.
I think some lower-pressure medium for publication could work. Publishing every attempt that didn't work would be tedious, but a database of attempts, implementations, and results would let others avoid walking into the same bog.
"Dogmas" are pretty much never disproven. (Technically, outside of mathematics, they can't be.)
Science is a long process of lots of people looking at lots of data and coming up with different explanations to make sense of it all. Some of these explanations make more sense than others, but it's pretty much guaranteed that even the experts won't be able to agree on which ones. As new data and new generations of researchers emerge, the explanations evolve, consolidate, and sometimes are discarded again. In short: science is a mess of opinions, gradually moving along.
If you want to get a feel for what that looks like in practice, have a look at this blog post about diverging opinions in ecology: https://dynamicecology.wordpress.com/2018/04/30/poll-results... (Most of the hypotheses he explores are current "textbook knowledge"!)
A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it. . . . An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out, and that the growing generation is familiarized with the ideas from the beginning: another instance of the fact that the future lies with the youth.
— Max Planck, Scientific autobiography, 1950, p. 33, 97
This page is full of junk statements. Picking some extracts and adding some commentary made by someone without experience in the field isn’t helpful.
> Science proceeds because researchers do all they can to disprove their hypotheses rather than prove them right.
Obviously written by someone who has never set foot in a lab.
> When there’s an urgent need, we learn faster and avoid unnecessary learning.
There is no such thing as unnecessary learning, as the benefits of something learned can pay off in unexpected ways later on; that is how some great discoveries were made. I agree that reading too much is a form of procrastination, but advocating ignorance is no good either.
> This page is full of junk statements. Picking some extracts and adding some commentary made by someone without experience in the field isn’t helpful.
IMHO FS is a “modern gospel” echo chamber with little to no new ideas. Why people link it here so often is a mystery to me
Am I the only one who is getting kind of ill from the number of blogs out there (Farnam Street is just an example) whose business is selling ideas/advice to people?
Maybe I've been listening to too many podcasts and I'm jaded, but when people are selling ideas and their entire business model is constantly pushing out new ideas and interviewing thought leaders, at some point the quality of the content drops off and the pressure to publish probably makes it suspect. Not sure I'm describing this right, but it's just something I've been noticing lately.
You sure aren't. One of my buddies worked with an older woman who owned a bunch of self-help-y type business books, they were talking one day and she said, "You know, I noticed we all owned a bunch of these and it wasn't changing the way anyone worked." I thought that was pretty salient. The demand for ideas seems to be much more based on the satisfaction of novelty and being "in the know" and much less on actually utilizing any of it. These publishers try to feed that demand continuously, which is just kind of a ridiculous quirk of the human brain, and like any other irrational demand it has no inherent limit. So like you said they start scraping the bottom of the barrel.
Yeah, I had similar feelings about the self-help/personal development/business "literature" space, and it drove me towards learning more about philosophy. A big thing that frustrated me was a lack of critical thinking, especially attempts to understand or reconcile why seemingly contradicting principles each seem reasonable and effective in similar situations. (Given seemingly the same situation, book A says X is the correct approach to take, but book B says (seemingly) not-X is the correct approach.)
I think there should be a sort of academic-esque literature (in the sense of journals and critique) that analyzes these things.
It seems to me, self-help is mostly just people spouting ideas with little scientific backing anyway. People get caught up in famous person X wrote this book. It seems like sometimes the person is actually knowledgeable like Jordan Peterson given their credentials for instance, but then they just draw conclusions that don't follow, make logical leaps, write platitudes, state common sense, etc.
Also it requires a lot of conscious effort to change the patterns that make up yourself. The first step is to learn that skill. It seems like that is where everyone mostly fails.
Ironically, I could write a self-help book/blog using these observations and present myself as an authority, but I realize these are just my opinions.
It's likely that self-help books do help some people, but only some. The disconnect people have is probably a result of different self-help books being suitable for different people in a way that we have a hard time defining. E.g., imagine a self-help book catered towards introverts. Extroverts might not find it all that helpful and be unable to articulate why, yet their introvert friend swears it was enormously helpful. Some (small) group of people feel that the book was helpful to them; they got that kick from it that started their engine. Another group might get it from another book.
Combine this with people liking new things and I think it explains a significant part of the self-help book industry. And why do people write them? Money, but also the same reason we write comments here. It somehow feels good to share your opinion, regardless whether you say you're an authority or are just sharing an opinion you're not completely sure of.
As problematic as academia is, that's kinda the idea behind tenure - to make the pursuit of research independent of economic incentives. Obviously this is not the case in practice with grants, the allure of industry jobs, low pay, etc., but I wonder if something similar could be achieved in a more contemporary context with platforms like Patreon.
Well, you seem to be tacitly assuming that those individuals have more interesting thoughts to contribute, but the market is somehow corrupting them, or misguiding them. It's also possible that each person only has so much to give, and is thereafter a spent force; tenure/patreon/other wouldn't change that.
Not sure what individuals you're referring to specifically, but I would agree that we are shaped by economic forces - not necessarily "corrupted" or "misguided", simply shaped. Because we have limited time and energy, in general we're more likely to do what has the least resistance (e.g. what's popular or easier) to make money rather than what we want/what interests us. Tenure/Patreon/? loosens this up a bit, allowing people to focus on more niche interests & research. Sure, some of this will be garbage, but some of it may not.
I think this is just the nature of the "buffet of information" we currently enjoy. As in a restaurant, if you keep eating, no matter how good the food might be, you'll get sick of it eventually.
You don't have to read them all. At some point (after much learning and preparation of course) we all have to decide to be our own guide; to trust our own intuition and primary research to know what's best for us.
Each of us have incredibly unique backgrounds, experiences, goals, and values. Until we start to take action on the things that we really want to achieve, articles like these will eventually come to sound like the same tired platitudes. Check out Emerson's essay Self-Reliance if this resonates with you. (And yes I recognize the irony :) of my comment.)
exactly - the kind of people who these posts admire don't actually read these posts. I called this the Metacreator Ceiling https://www.swyx.io/meta-creator-ceiling/
I couldn't be bothered to read your post so I took it upon myself to write a blog post I like to call "Self-awareness and the meta-meta-creator ceiling"
And I couldn't be bothered to read your post about the first post and decided that I would write a comment about the meta-commentator-creator ceiling. Now I just need a way to monetize my comments...
>To be creative, scientists need libraries and laboratories and the company of other scientists; certainly a quiet and untroubled life is a help
This is contradicted by Richard Hamming in his lecture on creativity. He points to a famous, tranquil, well-equipped environment, viz. the Institute for Advanced Study, where only very few breakthroughs have been made.
People in general: Understand basic statistics and risk estimates. Know how to balance a checkbook. Have an open mind and be careful when dealing with "experts". Such people are not the same as "knowledgeable people"...
Young scientists: I assume this means PhD in a scientific or biomedical field. Those with the latter could go into medical work, which requires empathy and an ability to handle stress. Either of these could go into academic work, which requires luck, a publication record and the ability to secure funding.
For those interested in "working", (reasonably) strong computer chops are becoming a necessity. One either works with professional developers, handles the output from such, or becomes a professional developer oneself. Appreciating the data that is created, its limitations, and how to participate in and run a project (people skills, too) is critical. There are a number of scientific developers reaching retirement, and the codes they maintain and expand need new people, or we need new alternatives.
Finally, if you're a rock star interested in scientific/biomedical works, get enough background to understand the jargon and nomenclature of these fields. Trust me, dealing with chemists without knowing proper nomenclature makes them think you're an idiot. Yeah, there are translators in the middle but better to not need one. Remember Egan's Rule - the number of failures before the project ignores you is 1.
My favourite: “I cannot give any scientist of any age better advice than this: the intensity of the conviction that a hypothesis is true has no bearing on whether it is true or not.”
I'd love to hear a mature scientist who was familiar with this advice from the outset of their career tell us how reality lined up. Things like '“A scientist will normally have contractual obligations to his employer and has always a special and unconditionally binding obligation to the truth.”' seem a little idealistic in a contemporary setting.
It's very apt. The only issue I take is the point to work on something important rather than something interesting. Sure, you can find interest at sufficient depth in nearly any topic if you're a good scientist, but you'll miss out on the truly groundbreaking stuff--the long tail stuff.
For example (obvs YMMV), when I started graduate school in the mid-aughts, my interest was piqued by a little-studied RNA modification called pseudouridine. I actually had another prof in the department ask why I didn't want to work on something "important". Turns out it was pretty important for improving human health and reducing disease in unforeseen ways. If I hadn't followed my gut interest that said "hmm?" when I first learned about it, I would have missed out on something grand.
Maybe what I would add is the pursuit of a PhD, IMHO, is a privilege. It's a time to drift and wander around a topic of interest; not a time to bust your ass. It's a time for pious interrogation of the universe. You should not be gunning for money or fame or success, but rather focused on making connections with other scientists to come up with answers to interesting questions about how things work.
So who can spend so many of their prime years possibly not accomplishing anything of value? Someone who is privileged. I am all, all, all for opening up doors to everyone, but the risk is that "getting a PhD" turns into the pursuit in and of itself. We are seeing this now more with people assuming the higher degree is simply the next step after a masters or bachelors degree. It wasn't designed for that.
But with all this being said, I love how humanity evolves and I'm down for enabling anyone who hustles to get ahead. If this is what society thinks a PhD should become, then let's try it out. But people need to make sure they realize the way things used to play out after the degree are no longer the rule but more of the exception.
It's worked out well, but it would have been fine regardless because my personality is tilted toward getting off on learning new shit and connecting disparate dots. My absolute favorite part of science is the tools: learning how to use every tool available and inventing new ones. Publishing papers is fun, but it's a slog (really for everyone). I much prefer reviewing and editing manuscripts to writing them, and I get to do that now on a plurality of topics.
I'd say science is itself an ideal, like the law, but with the application of rigour. The moment you stop seeking truth, it's no longer science but akin to alchemy or astrology.
But then we get to the slippery slope of lying outright vs "cooking the math" to suit your hypothesis. It's still not clear where the line is drawn.
The only thing that's hard to reconcile for me is advice to "work on something important". That's like asking someone to just get better from depression. If we can all truly agree on what's important and what's not that's half the problem solved right there.
Other issue is that if a problem is important then a lot of folks are already on it. Is it worth everyone's time for one more person to come and compete for the same goal? This only makes sense if you know that you can bring a new perspective to that topic (which unfortunately every scientist believes they do).
If this were written today, most of the advice would be tips and tricks for getting your papers into high-profile journals and writing grants. Which in turn boils down to having a good pedigree from your PhD and postdoc advisors.
FYI for other readers, Mises is an Austrian Economics think tank with Ron Paul and Andrew Napolitano on the board [0].
It's true that food shortages don't necessarily lead to famine. However, the potato harvest did drop by 80-85% [1], and the social structure of Ireland turned that stressor into a famine. You could thus say that the system in Ireland was overly dependent on the potato; the article makes the same point, but it focuses on the faults of the system rather than on the food shortage itself.
That said, I also disagree with this article. It is true that the famine was not the direct killer, but blaming the deaths during the potato famine on disease caused by workhouses is very suspect. From Wikipedia [2]:
> The diseases that badly affected the population fell into two categories: famine-induced diseases and diseases of nutritional deficiency.
> The malnourished are very vulnerable to infections; therefore, these were more severe when they occurred. Measles, diphtheria, diarrhoea, tuberculosis, most respiratory infections, whooping cough, many intestinal parasites, and cholera were all strongly conditioned by nutritional status.
The trouble was that there wasn't enough food in Ireland, and it wasn't getting to everyone. I don't know enough about the tariffs and corn laws to comment, but it seems suspicious to me that the article mentions tenant farming and conversion of cropland to pasture, and blames the issues on tariffs? Then, they say that private charity would have made up the difference in food production? I don't doubt that the workhouses were worthless when the problem was lack of food, but I would suspect that land redistribution would have better prevented conversion to pasture than tariff/tax reform.
I like the general tone of this advice towards action over consumption. I have found that to be a good way to approach my interest in deep learning research.