
The next thought should be: why doesn't neo-liberal capitalism fix this problem? And: is my characterization of the problem correct? Why not start a new firm that better compensates researchers (and tool makers) for their valuable work? It seems like big tech (especially Google, and perhaps Microsoft) comes in from the commercial side and invests in R&D at reasonable rates for just this purpose! But surely if workers are systematically undercompensated, there is room for a disruptive firm to come in and take the best talent and still make a profit.

Perhaps the characterization is wrong and the EV (expected value) of this work is far lower than you think (this seems likely), and/or there are externalities like regulation, or the leverage of prestige that traditional orgs (e.g. universities and publishers) wield, that warp the profit incentive. Or (and this is my cynical view) pure science was always best left to the hobbyists. Historically the most important discoveries have come only rarely and to those who loved doing science in their free time or, more rarely, when a talented individual found a patron. Building a science factory and hiring science factory workers not only sounds distasteful, but it doesn't seem to work very well. (The exceptions being those very capital intensive projects like the LHC which require a large pool of professional scientists and engineers to do the experiment.)




"If it always worked, it would be business. Let's go to the pub." -- Me, consoling a grad student after experiment failure #24.

More seriously, if you're in basic science, your skills are valuable for transforming the work into something more useful down the line. In your science factory model, you have created a reusable widget that other people can build on. The science factory model does work; you can see its results in things like MIAME (https://www.nature.com/articles/ng1201-365), where large pooled datasets are used to get insights otherwise impossible.

There's not a ton of low-hanging fruit in some fields; as time has gone on, the edges have gotten harder and more expensive to reach, so staying at the cutting edge costs more. Ex: you spend $2M on a microscope that does a cool thing, and two years later the new model is all that, a bag of chips, and a soda for the low price of $750k. You hope you have a good enough relationship with the vendor that they will either mod or upgrade your system, or that those two years were enough for you to get ahead. It probably wasn't. And now you have a not-as-fast Ferrari that cost more than the fast Ferrari.

There is a massive glut of international students willing to work for basically nothing, beholden to their PI by their visas. I say this not out of xenophobia, but I was the only working-class American (my parents do not have degrees) in the department. All the students/postdocs I worked with were from other countries, or if they were American, their families were doctors or faculty members. More generally, the kind of people that might own horses :D.

No firm would take this work on, as the profits are not clear, and the time scales for success range from two years to never. In this case success is "great job publishing, we'll give your lab another 2-3y of funding." After which, you better get good at writing books and eating pasta.


I would also say, and I'm surprised this needs to be said in a community that is so connected to the Open Source and startup cultures, that just because something is valuable doesn't mean it's possible to make a business out of it.

Imagine research into a technique for getting better blood pressure readings from people who are so nervous around medical settings that their blood pressure spikes (or more basic research into the mechanisms of blood pressure and anxiety). This is a valuable thing to society (more accurate data informing treatment decisions for individuals, screening for physically demanding jobs, life insurance, forecasting medical spending for Medicare and the like), but it's not worth a lot to anyone in particular.

For the field you described originally, complex imaging devices, there are only so many users of that research so it's conceivable that work could be taken up by a corporate R&D department.

There are all kinds of other very useful research topics that are very valuable to humanity as a whole but it's not clear exactly who should pay for it (I'm not saying you aren't aware of this BTW, hopefully I'm adding support to your argument). In those cases it makes a lot of sense to take a fraction of a cent from everyone and pay for it that way, as we currently do.


It's very difficult to tell what will become valuable in the basic research world and what will remain a curiosity. A classic example in biotech is the study of sex in bacteria - at the time it seemed about as useful as studying the sexual reproduction of ferns. Bacteria generally replicate themselves clonally, but the discovery that they were also exchanging genetic material via plasmids (essentially, mating with each other) eventually opened the door to things like cloning the human insulin gene, inserting it into a plasmid, getting a bacterium to take up the plasmid, and then, voila, human insulin could be grown in vats in bulk. That was the first real biotech business that I know of, and from there it just exploded.

The problem with universities pushing research that clearly has some short-term financial reward (due solely to patents and exclusive licenses under the 1980 Bayh-Dole Act) is that they neglect basic research and so close the door to the potential of truly fundamental discoveries like that. This is generally known as the corporatization of the American academic system, and it has really been a disaster for basic technological advances.


Do you think the decline of large corporate R&D efforts is cause or effect here (or is this a false premise)?

I am wondering whether we've seen the reverse of the idea I was originally challenging (if research was valuable it would be a business), where universities captured a profitable business because it was being neglected by the business community (and were distracted from basic research).


The original concept was that universities were places of basic research, and more translational (read: monetizable) research was thought to be done at corporations.

That theme changed after ~2008, when NIH was flat-funded and most universities were gazed upon by the Eye of Sauron for funding. A lot of places that were basic-science focused, let's say at the level of studying a set of proteins in mitochondria, had to figure out how to connect the dots to disease or therapeutics. Not everyone made it.

Also, universities got into the game of stacking patents to license. I don't know the arc of that, but I know for sure after 2008 my Office of Technology Transfer was really into it.

Ex before: "We study apoptosis signalling in mitochondria, to understand how mitochondria are related to cell death." After: "We study apoptosis during heart attacks, and how mitochondria contribute to cell death in ischemic conditions."

Something along those lines.


Totally! Most of our best equipment was stolen and modded from materials science imaging or manufacturing automation. There was a budding industry for automated fluorescence imaging, but they were still finding their legs.

We had a couple electron microscopes that we modernized from film, and the companies we contracted with mostly dealt with materials people.


> surely if workers are systematically undercompensated, there is room for a disruptive firm to come in and take the best talent and still make a profit.

Other good replies here, but this part of the comment reveals some assumptions that need better definition. Having been both an academic and an industry programmer, I can comfortably say that academics aren't "workers" in the same way that industry programmers are "workers". The parent comment is not correct about the norm; programming for research projects is not usually sold for profit later to industry. It happens occasionally, but most academic work stays academic. Sometimes when it does happen, it's in the form of a spinoff company that brings along the original authors of the software, and so they end up getting some of the proceeds… when the business survives.

Also, the top comment didn't say 'undercompensated' - in business this has a clinical meaning: someone is being paid below market rate. We know that academia pays less, but we do not know that it pays below market rates for academics. It's plenty true within industry alone that you can make a lot of money at Facebook or very little money at a small indie game dev shop. That doesn't mean the indie game devs are undercompensated; it means they're in a different market.

Starting firms to compensate researchers better is what pharmaceuticals (for example) are. The problem with your suggestion is that the need for income can undermine the ability to do research that is unbiased, risky, controversial, critical, or just free of agenda. If you pay researchers in line with what Bay Area programmers get, it will put an enormous burden on the PIs to make many multiples more money than their peers, while competing with them using a small fraction of the headcount.


I'd guess that low expected commercial value is the norm, and discoveries that make millions are relatively rare, just as in every other context. However, the second half of your second paragraph is where my mind went first, because what the GP says happens does happen, albeit at a normal (low) rate. The motivation of people working in science is different, as it is in, say, the games business. Game developers have historically been paid less except at a tiny handful of companies. Not 33 cents on the dollar, but maybe 50 to 70 (bearing in mind that FAANG/unicorn salaries are not the norm either).


> The next thought should be: why doesn't neo-liberal capitalism fix this problem?

You are the vehicle by which neo-liberal capitalism fixes the problem. By leaving academia to work for a firm directly, you are responding to a price signal sent by the industry, relaying that price signal to the academic labs.

You might object that this is slower than most price signals! That's because the university environment is somewhat insulated from the ordinary pressures of capitalism (and thus better able to exploit young research programmers).


> you are responding to a price signal sent by the industry, relaying that price signal to the academic labs.

Which means absolutely nothing unless a ton of other people do it as well. A handful of people here and there can be replaced.


The expected value theory is very plausible. There are a lot of R&D projects that basically produce zero output for decades. High risk, high reward.
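
To make the expected-value point concrete, here's a toy back-of-envelope sketch in Python. Every number below is made up purely for illustration; the only point is how easily the sign of the net EV flips with the hit probability:

    # Toy sketch, hypothetical numbers only: a portfolio of high-risk
    # projects where almost all produce nothing and a rare one pays off big.
    p_hit = 0.01               # assumed chance a given project ever pays off
    payoff = 50_000_000        # assumed commercial value of a hit
    cost_per_project = 1_500_000
    n_projects = 100

    expected_gross = n_projects * p_hit * payoff    # 50,000,000
    total_cost = n_projects * cost_per_project      # 150,000,000
    print(expected_gross - total_cost)              # -100,000,000: negative EV here

With these assumptions the portfolio loses money in expectation; nudge p_hit up a few fold and it becomes wildly profitable, which is exactly why it's so hard to price.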


>why doesn't neo-liberal capitalism fix this problem?

The whole point of academia is to subsidize research before it gets to an application phase. How can a private firm compete with academia, which benefits from government funding and is tax exempt? Trying to pin this problem on "capitalism" is just lazy.


No, lazy would be straw-manning a stranger's argument for no good reason to elicit an emotional reaction. It's a style of communication that seeks conflict rather than understanding, and there is plenty of it on twitter and reddit, but not here.


There are plenty of firms that sell software to academia, and many of them make a ton of money. I bet there are great opportunities in that space. I guess the issue is that most business-educated/oriented people are too removed from both engineering and science, so competition is rare.


>The next thought should be: why doesn't neo-liberal capitalism fix this problem?

Neo-liberal capitalism fixes problems?!


Why should anyone pay when the government is keeping it all alive today?



