
"if you pay a man a salary for doing research, he and you will want to have something to point to at the end of the year to show that the money has not been wasted. In promising work of the highest class, however, results do not come in this regular fashion, in fact years may pass without any tangible result being obtained, and the position of the paid worker would be very embarrassing and he would naturally take to work on a lower, or at any rate a different plane where he could be sure of getting year by year tangible results which would justify his salary. The position is this: You want one kind of research, but, if you pay a man to do it, it will drive him to research of a different kind. The only thing to do is to pay him for doing something else and give him enough leisure to do research for the love of it. "

-- J. J. Thomson




I've seen this used to explain the odd dovetailing of research and lecturing.

There was a point where this made inherent sense - no one but research experts could educate new students, and students were attempting to quickly reach the state of the art. That's long past - a complex analysis expert with no training in education is a terrible candidate to teach basic multivariable calculus.

And yet the whole thing makes a sick sort of sense, because paying and tenuring a professor to teach classes justifies their presence without demanding publications. They'll muddle through intro classes without causing any real problems, and can go back to being brilliant on no particular schedule in between lectures.

Or, on the flip side, set up something like the Institute for Advanced Study. It admitted only already-proven brilliance, so it could justify paying people for years without demanding any breakthroughs.


"When I was at Princeton in the 1940s I could see what happened to those great minds at the Institute for Advanced Study, who had been specially selected for their tremendous brains and were now given this opportunity to sit in this lovely house by the woods there, with no classes to teach, with no obligations whatsoever. These poor bastards could now sit and think clearly all by themselves, OK? So they don't get any ideas for a while: They have every opportunity to do something, and they are not getting any ideas. I believe that in a situation like this a kind of guilt or depression worms inside of you, and you begin to worry about not getting any ideas. And nothing happens. Still no ideas come.

Nothing happens because there's not enough real activity and challenge: You're not in contact with the experimental guys. You don't have to think how to answer questions from the students. Nothing!"

Richard Feynman, "The Dignified Professor"

http://www.pitt.edu/~druzdzel/feynman.html


This is absolutely a risk. I was thinking of Feynman when I brought it up, thanks for finding the quote!

The tenure-and-teaching model is better than it has ever been given credit for; the seemingly irrational pairing of elementary teaching with cutting-edge research has proven a powerful one.

Even so, I think the IAS model is better than a lot of what we're seeing today. Now, introductory classes have been offloaded onto (poorly paid) lecturers, and professors are expected to publish consistently, even without funding. The results are hideous: not just inaction but counter-productive action.

The replication crisis is, to a real degree, a function of this paradigm. "No results" is not an acceptable justification for "no publications", so we see people using statistical trickery that has been recognized as bad practice for decades to keep up their publication counts. I'd really like to see them go back to teaching and researching, but even a move to sinecures that don't demand any results at all would be progress.


This doesn't really describe life at the IAS. Maybe in Feynman's day, but it wasn't my experience. If you're at IAS, you're continually surrounded by lots of very bright people who are trying to solve problems and understand things. Lots of conferences, lots of visitors. There's always someone to talk to or something new to learn. It's probably the most stimulating environment I've ever encountered.


Ideas that generate a breakthrough are the exception, not the rule. Sometimes, as in Einstein's case, all the ideas appear in one or two years and then nothing more. Newton with calculus was another such leap.

Expecting ideas to appear no matter what you do is not a good plan. One difficulty is that it is very hard to estimate the probability of rare events. Perhaps having more free time increases the probability of having new, deep ideas, but that is not easy to measure. Furthermore, another key ingredient is communication with people full of energy, like P. Erdős; perhaps some people can act as hubs of ideas.

Edit: A little more money on the table, not in the hands of politicians but directed to those who can do real research, would be a real innovation.


Einstein spent ~8 years developing general relativity. Certainly not "all the ideas in one or two years".


I think this is half-true. There's historically been a lot of cross-pollination between IAS and Princeton, which has resulted in some spectacular achievements such as one of the first stored-program electronic computers (the IAS machine; built a little bit after Feynman left, I believe). IAS also has a lot of scholars who are working "remotely" at other institutions/colleges, iirc. My general impression is that while there are a lot of slackers, there are also some really active and committed researchers, much like the rest of academia, so I'm not sure we can reject the null hypothesis here so quickly.


He sounds bitter.


Not so - he turned down the IAS long before writing that!

The full context there is that he spent a semester exempt from teaching to have more time for research. He produced very little and was deeply embarrassed, until he eventually realized that many of his insights came from rehearsing fundamentals and talking to people in other subfields.

So he returned to teaching, and when the IAS came knocking he refused on the grounds that it was the worst environment for him to work. Certainly I think he's overgeneralizing (the IAS has seen some pretty good results since that time), but this wasn't about jealousy!


Proving something wrong for the first time is not valued as it should be, and neither is blunt honesty. The honesty of "the results were inconclusive" is still murderous for a scientific career in some fields.


I think that was what tenure was supposed to be for.


Somewhat. Sinecures come in many forms.

Tenure's more important because it means you can say naughty but true things about the powerful without losing your job. As structured right now, in the physical sciences, a base salary won't support research.


Yes and no. In some fields, such as mathematics (applied & pure), one can squeak by without a grant, and many do. This has to do with the funding model in different fields -- math departments tend to have enough TA positions to support all their grad students, which lightens the pressure to secure funding. Of course the trade-off is time spent teaching lower-division courses vs actually doing research.

As for tenure, it qualifies one for all kinds of service. So job security is nice, no questions about it, but one has even less time for serious work (unless you are willing to say no to requests to serve -- and some of these committees do matter).


Like I said: in the physical sciences, you can't do it.

Math is not a physical science -- and you can get stunning breakthroughs without anything except a quiet room, a pad of paper, a pencil, and a trash can. (Old joke: s/math/philosophy/ and remove the trash can!)

This is not true if you're a microbiologist.


The funding requirements for theoretical physics aren't really any different from mathematics (and like mathematics, there's usually more than enough TA positions to go around).


Yes, that's true -- math is not a physical science in that sense. Though as another poster pointed out, theoretical physics is not so different from mathematics in this way, so perhaps the distinction is more between experimenters and theorists. (However, I've no idea what life is like for theoretical computer scientists without any kind of external funding support.)


You can do original microbiology research on the cheap. For an extreme case, Einstein did useful theoretical physics without a lab.

Money vastly expands the scope of work you can do, but breaking new ground is often less expensive than making minor refinements to well understood areas.

PS: The real risk is that you will likely go 20+ years without finding anything, but if you have tenure that should not be a problem.


You've never tried to buy a good microscope, have you?

Kitting out a lab to do publishable work in a lot of fields right now is just jaw-droppingly expensive.


Good optical microscopes are really not that expensive and last a long time (i.e., under $1,000). Equipment costs can quickly go up, with new electron microscopes running $500k-$6M, etc. But restricting yourself to the minimum equipment one person needs for a specific kind of research is significantly cheaper than outfitting a more general lab.

Further, used equipment can often be vastly cheaper.

Honestly, time and consumables really are the biggest issues.


Any decent fluorescence scope + base + optics + laser + dichroic will run you well north of $250K. Just the objectives are $10-20K. Two-photon, more like $750K. You want to do funny stuff like FRET or PALM or STED? You'll spend 700 grand on parts, and a year building the thing yourself, because nobody sells them.

You will not publish anything good enough to get tenure if all you have is widefield and a lamp in basically any biomedical field.


I am in no way saying this stuff is not really useful across a wide range of areas. It's just not required for every type of research.

Further, the assumption is that this is someone who has tenure but not funding and is thus able to take long-shot risks. Think designing an artificial intestine for microbiome research, not yet another paper using E. coli.


Where are you shopping?

Fluorescence microscopes are in the low six figures, and others are much more expensive. A confocal microscope typically costs several hundred thousand dollars, potentially approaching $1M with all the bells and whistles. Super-res and high-throughput imaging can also consume essentially infinite amounts of money.

You could probably get a usable brightfield microscope for around $1,000, but realistically you're not publishing much with that alone.


It absolutely is (at least in the sciences)... the problem is that the path to tenure does not value this type of work. Naturally, people could switch their focus once they do have tenure, but that assumes people are not creatures of habit. The people likely to succeed on the path to tenure in its current form are those focused on production. Asking them (i.e., giving them the freedom) to suddenly shift their focus to big, world-changing problems once they are 'free to do so' is unrealistic. Being taught through grad school, postdocs, and your years as an assistant professor that this is how we measure value makes it your worldview.

Just ask Peter Higgs[0]

[0]http://www.sciencealert.com/peter-higgs-says-he-wouldn-t-hav...


Sorry if I'm missing the point of this quote, but it seems wrong to me that you "can't point to something to prove the money wasn't wasted".[1]

You can always do some variant of "we tried this, it didn't work; in the future, you no longer have to spend money to see if this will work." Right?

[1] Removing the triple negative: it should always be possible to prove the money accomplished something.


“To arrive at the simplest truth, as Newton knew and practiced, requires years of contemplation. Not activity. Not reasoning. Not calculating. Not busy behaviour of any kind. Not reading. Not talking. Not making an effort. Not thinking. Simply bearing in mind what it is one needs to know. And yet those with the courage to tread this path to real discovery are not only offered practically no guidance on how to do so, they are actively discouraged and have to set about it in secret, pretending meanwhile to be diligently engaged in the frantic diversions and to conform with the deadening personal opinions which are continually being thrust upon them.” –George Spencer-Brown in Laws of Form, 1969

[Of course, some may disagree with this assessment; just throwing it in for your consideration.]


From a business perspective, you're right enough. Academic research is not business-oriented -- and Thomson was saying that it should not be.

When quaternions were discovered, there was nothing particularly monetizable about them, and they were not found as a consequence of any business enterprise looking for new tools -- but Hamilton didn't starve to death because he failed to capitalize on them. Eighty years later, we suddenly rediscover that they solve gimbal lock and can easily describe quantum states.


The quote is referring to proof of progress, not proof of profitability, which is a separate issue.


Sadly, the much-longed-for Journal of Negative Results does not exist, giving it an impact factor of 0.



At the least you could provide video recordings of your activities.


The maxim of the software company manager: "If it takes more than an hour to do, it isn't worth doing".



