Hacker News

I agree that academia selects for conformists, which is bad for curiosity-driven research.

Jeff Schmidt's book "Disciplined Minds" makes a similar argument. I have not read it all, though, so perhaps I should pick it up again.

The book has a good description of the problem, but in my view the author views the process as more political than it necessarily is. I also think the book's solutions are unlikely to help. But I still recommend the book as one of the few on the subject.




Disciplined Minds is a great read.

Early on, the book discusses data indicating that more highly educated people tend to be more politically supportive of the government than the population in general. (E.g., during the Vietnam War, lawyers defending anti-war activists in the US were trying to figure out how to pack juries with people who would be favorable to their clients, and discovered that, roughly, the more educated someone was, the more likely they were to support the government and disapprove of anti-war activism.)

The book argues that gaining professional credentials is less about gaining knowledge/skills and more about demonstrating that one can be trusted to work in an organisation and conform to the assigned goals/ideology of that organisation.

One rough argument made in the book is that for workers performing roles whose work can be completely specified by superiors, it's less necessary for the workers to demonstrate they can conform to how their employer wants them to think, provided they do the work. For professional roles involving more intellectual work, where an employee's day-to-day tasks cannot be completely specified in advance, it's important for such employees to demonstrate they can think how the organisation wishes them to think, so an important part of training for these roles is essentially political training.


> One rough argument made in the book is that for workers performing roles whose work can be completely specified by superiors, it's less necessary for the workers to demonstrate they can conform to how their employer wants them to think, provided they do the work. For professional roles involving more intellectual work, where an employee's day-to-day tasks cannot be completely specified in advance, it's important for such employees to demonstrate they can think how the organisation wishes them to think, so an important part of training for these roles is essentially political training.

Great summary. I guess my qualm is that this is not necessarily political. I'll admit it often is, but in my field I think people are often dedicated to bad ideas because of what I see as institutional inertia, the sunk cost fallacy, or not liking math, not just politics.


> Jeff Schmidt's book "Disciplined Minds" makes a similar argument. I have not read it all, though, so perhaps I should pick it up again.

Having never heard of this book, I googled it after your mention and found this:

http://disciplinedminds.tripod.com

which immediately convinced me to order a copy. Sounds like fascinating stuff. Thanks for bringing this up!


Oof. Funny you mention that. I just bought it, but haven’t read it yet; only flipped through a little. It’s clearly a dangerous topic for me and if I can find the discipline in myself, I’ll return it before it’s too late.

Thanks for the thoughts, and the other comments here are all very lucid. These points aren’t lost on me, and I expect plenty of truth to them.


Norbert Wiener's book "Invention" is another on a similar topic. He argues that research should be driven more by the interests of individual researchers rather than managers and bureaucrats.


Research will always be driven by the interests of whoever is financing it. It just happens that it's almost never the researchers themselves.

And one thing to consider: a researcher's mind is an inquisitive mind that will "dig" even in directions nobody really cares about, that don't help humanity, etc.

The one upside of someone else giving the direction is that you can be sure that there will be some concrete benefit from it. The downside is that it's mostly about money and power...


> The one upside of someone else giving the direction is that you can be sure that there will be some concrete benefit from it.

Wiener argues that this often is not true, because the people with the money often have a poor understanding of what's actually important.

To give an example, in my own area of research, my impression is that most industrial folks hate math. They consistently deny the benefit of it. As a theorist who tries hard to do practical work, I find it very irritating for my work to be written off by industry for a bad reason. I went to a conference a few weeks ago, and it was clear that industry folks won't attend my talks, but when I spoke with some of them in private they seemed much more interested in what I knew. It may be mostly an advertising problem, but I still have the math barrier to break through. My mantra now is "math is cheap", which seems to resonate with people who are used to paying tons of money for new experiments which usually don't give you useful information. At the very least I can tell them what sort of experiments would be valuable based on my theoretical understanding.

Ultimately, we need a balance of input from those with money and those with more detailed knowledge. This is what I strive for. At the moment I'd say the vast majority of research is directed from above, so I'd agree with Wiener that we need to move in an individualistic direction.


I won't claim to understand the topic better than this author, but one thing I know: "important" is an incredibly subjective concept. So I will quote the perspective of another writer. In Yuval Noah Harari's Sapiens: A Brief History of Humankind there's a chapter dedicated to research (the Sugar Daddy of Science). The short version: you have limited resources, and you can't invest in every research project, so you are more likely to invest in the ones that increase those resources or that try to solve a pressing issue of the time.

There is no scientific answer to the questions "Which project to fund? What is good? What is important?", only political, economic, or religious answers. Science itself studies for the sake of expanding human knowledge and satisfying curiosity.

What's important for you, the scientist? Finding a cure for Alzheimer's or developing a new semiconductor? And what's more important to the person funding it?

So I will argue that "having a poor understanding of what's important" is a generally valid conclusion about anyone. And unfortunately I am acutely aware that a scientist might sometimes be just curious enough to spend money on studies that bring no palpable benefit to anyone but his own curiosity. While this is an admirable academic exercise, is it better than any other study that produces, in the end, a more palpable result, like money?

I might be too cynical or pragmatic, but sometimes there's no getting around it. Just recently I read about a new archaeological dig that uncovered a Viking toilet and could finally describe their approach to human waste over the centuries. While this definitely increased the sum of human knowledge, can you imagine a more practical way of spending that money? I can assure you someone with a poor understanding of what's actually important could :). Or at the very least you can expect that they will be able to identify the "importance" based on the financial and profitability aspects.


I agree that the problem of selecting which research to fund is difficult and (usually but not always) subjective. While a very interesting question, it's not actually relevant to what I was arguing. I'm getting the impression that you didn't understand one of my points, so I'll restate it in more detail.

> While this is an admirable academic exercise, is it better than any other study that produces in the end a more palpable result, like money?

You seem to believe that research generates the most revenue when managers and bureaucrats are in control, but I disagree. Wiener's argument is that managers and bureaucrats often don't even accomplish their stated goals (which should be fairly objective) due to their lack of subject knowledge, e.g., in your example, how to increase revenue. The required subject knowledge makes the right path forward invisible to most people. Many people are fond of efficient-market-type arguments suggesting these things are unlikely, but if the number of eyes that can spot the problem is small, efficient-market reasoning doesn't apply.

Jacob Rabinow, a prolific inventor, had a list of "laws", one of which basically summarizes the problem:

> When a purchaser, who doesn't know the difference between good technology and garbage, orders "good technology," he will always get garbage.

If someone can't tell the difference between good quality research and bad quality research, they'll likely optimize on cost or some other axis and make a poor decision.

I try hard to provide value to industry, but I find that industry folks avoid the sort of theoretical engineering work I do because they don't like math. Again, some of this is a marketing failure on my part, but this is only a fraction of the problem in my view. You can lead a horse to water, but you can't make him drink.

Some of this seems to stem from managers and bureaucrats not trusting researchers. Yes, many researchers would waste the money, but this to me is similar to managers and bureaucrats wasting money because they don't know what they're doing. I see no reason to be "acutely aware" of researchers' faults but not the faults of managers and bureaucrats. Ultimately we need a hybrid approach, not the largely top-down manager and bureaucrat controlled approach we have right now.


> When a purchaser, who doesn't know the difference between good technology and garbage, orders "good technology," he will always get garbage.

That may be perfectly right but I think we're both right only when taking it to an extreme. A purchaser might as well understand the technology very well. Just as a researcher could be a manager at some point.

The problem lies with "pure" researchers and bureaucrats. They will always see just their own way, so there's no middle ground. The perfect situation happens only by accident, when the researcher's curiosity happens to intersect with the bureaucrat's ideal money-maker. And much of this comes down to how each one communicates expectations: the scientist doesn't know how to state his goal in terms of benefits the bureaucrat can understand, and the bureaucrat doesn't know how to ask for things in terms the scientist would understand.

But take the military for example. Their currency is power. And they are perfectly able to drive research in that direction even without being driven by scientists.

And yes, bureaucrats don't trust the researchers with their money. It's because a manager investing in research is making a bet with long odds, where you don't know exactly what you're betting on, and the bookie is a gambler himself. Scientists get guaranteed money without guaranteeing a result. And "worse", a success for the scientist doesn't really mean success for the manager. You are happy you have an interesting result, or a confirmation. The manager is happy if his investment was worth it in terms of time and money.

A more practical example would be if you paid someone to build you a manor and after years of work he comes with a hangar. Unless you're able to monetize on that it's a failure.

Every one of my former colleagues learned to sell their knowledge in terms of benefits a manager understands. It's easier for you to understand their language than for them to understand yours.


> And yes, bureaucrats don't trust the researchers with their money. It's because a manager investing in research is like a bet with long odds, where you don't know what exactly you're betting on, and the bookie is a gambler himself. [...] The manager is happy if his investment was worth it in terms of time and money.

Wiener has an entire chapter on this. Research is a gamble by its nature and will rarely ever appear to be a good investment to a bureaucrat or manager. As I recall, Wiener seemed to want to move away from this model towards funding individual researchers with good track records, for this reason and also because management often does not understand what's important. While I do want to provide value to industry, I don't expect them to fund my research because they seem to be allergic to math in my field.

You also seem to be confusing research with development. If someone is asking for a manor, that's not research in my view. If someone is asking for a structure with certain properties and the answer is not immediately obvious, that's research. Research often returns unexpected answers even when done intellectually honestly.

> Every one of my former colleagues learned to sell their knowledge in terms of benefits a manager understand. It's easier for you to understand their language than for them to understand yours.

I try, but it's frustrating when most industry folks lose interest once you mention basically any math. As you've said, it's a learned skill, and I'm still learning. There are no guarantees, unfortunately.



