The decline of unfettered research (1995) (umn.edu)
197 points by KKKKkkkk1 80 days ago | 111 comments



Coming out of Uni with a PhD in the UK in 2003, I went looking for some "curiosity driven" research and didn't find any. The dotcom bust had sucked budgets dry and no one was hiring. I've dipped my toe back into the market a few times and not found any. I'd have to move to the States, and even then there was no guarantee of doing unfettered research. I carried on in my spare time, but it's not the same as being immersed in a melting pot of like-minded (yet different-subject) people.


I got a PhD in 2004 and could not find any non-academic research positions in the US. (I was pretty fed up with academia at the time, so I didn't look at those.) The closest I got was a job offer from a group at Telcordia Research, but when I went up for an in-person interview, I found the building mostly empty, most of the people there very bitter, and the only group that was hiring doing product development, not research. (And it was in New Jersey.) I declined the offer and went back to contracting.

At IBM. That wasn't a smart move either.


Telcordia is a disaster. When your primary customers are fixed-line telecoms, you don't have to be that great. IBM isn't good either.


If you try making your curiosity a bit more military related I'm sure you'll find something pretty quick haha.


Not if you score low on the "authoritarian" psychometric trait. ;-)


The military just follows the orders of the authoritarian government that funds and controls them. They are just an extension.


My PhD was part funded by the military and even then - nope.


Sometimes a good manager lets you work on ideas you believe in, as long as you continue to pay lip service to "the official project" so that it can move forward as well.

The term "under-the-radar R&D" is not unheard of in many corporate research labs.

But what worries me is that there are reports that the corporate R&D lab as an institution is in decline. I cannot judge whether this is true, since I recently switched back to academia to have a bit more autonomy after a decade in industry R&D.


In semiconductor-industry-related companies, I've seen the departments continue to exist, but the expected time-to-market for R&D decrease from 10 to 5 to 3 years. With an expected product launch in 3 years, it's effectively development, without any research.


I'm a researcher in Biotech, and honestly, working at a curiosity-driven research institute is all I've ever wanted from a career. I have the credentials and have proven myself academically and in industry, but this kind of position is just so hard to find! I really just need time and a small amount of resources to work on ideas. I usually struggle to get more than 20% of my time working on my own ideas.

I've spoken with colleagues over the years, and when I bring up this desire, so many of them feel the same way. Many of these people are incredibly accomplished and come from top institutions. What a waste not to give that magnitude of creativity an outlet! The only path I see to this type of life now is independent wealth. It just strikes me as such an opportunity to create a place for these people. If you build it, they will come!


> It just strikes me as such an opportunity to create a place for these people

The problem is finding and filtering the applicants who come to such a place. How do you tell the difference between somebody who just phones it in and somebody who would actually produce some valid research breakthroughs?

Unless you have unlimited resources to spend, this is a difficult problem to solve.


Very good question, and a hard problem in itself.

"Success leaves clues" affords some possible hope, though there's also a long history suggesting progress and individual attribution are limited and difficult to assess.

There's a history of major research institutions, public and private, including those of Edison, GE, DuPont, AT&T Bell Labs, Xerox, IBM, Microsoft, Google and other companies, as well as the Manhattan Project, WWII signals and computing research (see especially Norbert Wiener's account), the space race and Apollo project, the Santa Fe Institute, and more. These should afford guidance.

Often one clear trend is that results follow something of a sigmoid curve --- slow initially, a period of strong progress, followed by a decline. Much engineering trends toward an optimum, not some unlimited potential. Even Moore's Law likely ultimately follows this path.
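For concreteness, the simplest model of that shape is the logistic curve (notation mine, not anything from the essay: L is the attainable optimum, k a growth rate, t0 the midpoint):

    f(t) = L / (1 + exp(-k * (t - t0)))

Cumulative results climb slowly at first, fastest near t0, and flatten as they approach L; the rate of progress, f'(t) = k * f(t) * (1 - f(t)/L), rises and then declines, which is the pattern described above.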


I think this is such an interesting question! Perhaps a new science of creative management. I guess there were a few places that seemed to do this successfully (e.g. Bell Labs :)). There is probably a 51% problem somewhere in there, where the institution would need to make sure it isn't captured by those who would 'phone it in'. Maybe you could pay people a healthy sum to quit at any time.

Managing a process that has no defined outcome, but whose local quality and direction can be measured, seems like an interesting research path in itself. I think the institute would have to be bootstrapped, and a lot of learning would have to take place to make a sustainable version of it. Also, as you say, hiring would be challenging (though maybe no more challenging than hiring professors). However, I'll say that I can reliably recognize at least one phenotype of researcher who would thrive there.


I was lucky to take a grad-level math course in error correcting codes from Dr. Andrew Odlyzko, the author of the essay.

I read a lot of his papers ( http://www.dtc.umn.edu/~odlyzko/ ) in the hopes that I would learn something to improve my exam scores, but he has a knack for asking questions so fundamental that they have almost never even been properly formulated before.

If you have some time, I recommend reading a few of his papers. He completely changed my view of mathematics.


Wow, what a diverse body of work. Could you give an example of a paper that you feel fits your description of having "almost never even been properly formulated before"?


> what a diverse body of work

Indeed, for example this page [1] is absolutely fantastic; it hits a lot of the right buttons in my case (history of railways, history of finance/economics, and a combination of the history of railways and the history of finance, which is even more interesting).

[1] http://www.dtc.umn.edu/~odlyzko/doc/bubbles.html


I was also in need of some 1990's WWW nostalgia tonight, which this website provided.


I remember coming across his essay on Newton. Very interesting.

https://physicstoday.scitation.org/doi/pdf/10.1063/PT.3.4521


A lot of the time, people are afraid of wasting money by giving research grants where it isn't clearly specified what they are to be used for. What is missed is that there is a huge amount of waste when researchers are forced to work on dead-end ideas they know won't work out, just because that's what they promised to do while it was still looking promising. Scientists shouldn't be penalized for pivoting to something new. Unfettered research doesn't just let people work on whatever they want; it also lets them drop anything they don't believe in.


As someone who sat on a committee handing out (small) research grants in the past (usually as no-strings-attached donations), I can say the concern was never really that you would not deliver exactly what was promised. The concern was that you would not do anything at all (or very little). I know it might not even occur to an honest person that this would be a problem, but you would be surprised.

If you no longer believe what you proposed is a useful thing to do... send an e-mail. This may be easier to resolve than you think.


> If you no longer believe what you proposed is a useful thing to do... send an e-mail. This may be easier to resolve than you think.

Eh... I know of one similar instance (the student had found that what they were looking for had been proven false by another team); the advisor basically said "okay, we're stopping the PhD there". Two years down the drain.


What should someone do if they are a PhD student who believes their funded project is a waste of resources and their advisor agrees, but their advisor refuses to contact the funder about that?


You continue to perpetrate the academic fraud in exchange for your PhD.


Meaning, do the bare minimum for the project and spend your glorious time getting distracted by what matters.

Seriously, a lot more phd students just need to embrace their ADHD. Even if you are given a path, there is no path.


Yeah, so, I ended up believing that my academic research was completely useless, but tried to continue with the PhD anyway. Not only was it demoralizing, but my anxiety and ADHD just came pouring out and I hated even thinking about my research. It grew worse and worse. Eventually, my advisor left, I had no funding, and I withdrew with nothing to show for the past 3.5 years except student loans.

Fun times.

(What was the research? Basically taking Heckman's 70s-era selection correction and applying it to nonlinear models. Big fuckin' whoop. My advisor had already written multiple useless papers riffing on the idea anyway. He's probably still doing it to this day.)


That's hitting close to home. I'm nearing the end of my funding and visa time, with not enough publications. The three years have mostly been being told no to whatever I proposed and not being given worthwhile stuff to do instead. And when I was, it was things already done 3-4 decades ago. My advisor even told me the other day things that made it clear he was aware of the kind of stuff I want to incorporate in my work, yet he didn't seek any middle ground.

On the other side, I have multiple projects, the biggest one already presented at a conference. I've met multiple people at my university and other big ones who are very enthusiastic about either that project or the official one (which got no such enthusiasm in my own lab), so it's not like what I'm doing is totally dumb.

I'm not sure what to do now. I have 3 more years to write the thesis, but I would need to find a job to stay in the country. At least I used web technologies I can market on my CV, but I'm a bit sour about the whole thing (and life in general).


> Seriously, a lot more phd students just need to embrace their ADHD.

This hits so close to home that I think I just heard a knock on my front door.

I was diagnosed with ADHD about a month before I started a PhD (in my mid 30s). In my 3rd year, I finally took the required course on theory of computation and almost failed it because I spent all my time reading SEP entries about the foundations of mathematics and the origins of computing. I even started reading the Homotopy Type Theory book, despite a woefully inadequate formal mathematics background.

Meanwhile, I’ve got two papers in submission on a topic that I think is a dead end and a waste of time, and I’m hoping that my advisor won’t be too upset if I just refuse to work on any of the follow-up papers that he is considering. I prefer to hang out with math PhDs rather than my lab coworkers, so I can pester them with questions about category theory. And I’m supposed to scrape together a dissertation in, like, 18 months.


But it’s your career and future against what, integrity?


Yeah, kind of crazy that we’ll put that kind of pressure on students when this is in many ways out of their control. If someone happened to pick or be given a path that’s a dead end…


Wouldn't doing that just prompt the funder to stop or reduce funding? They might have a great project they want to fund for someone else, but not enough funds available for it. Your contacting them about the research being a waste of resources would be very welcome, no doubt!


I have no doubt there is a lot of fraud going on! I just don't think making researchers write 5-year plans for what they intend to do is the way to combat it.


HHMI grants are unusual in this respect. The grant is to an investigator over seven years, not to a specific project. They have to have demonstrated significant research in the usual funding system to be eligible, though.

https://www.hhmi.org/programs/biomedical-research/investigat...


There are a number of programs like this - the NIH MIRA is another.

They also have their problems, but it's good to have some of them in the mix.


You could have a compromise, so you don't accidentally fund eugenics or something.

There could be a whole range of topics that could be worked on, and you could allow the researcher to move freely between them.


The thing that is needed is trust. It's by far the best way to do it; it's just that going on trust sounds like the worst and most irresponsible way of doing it, so people don't dare do it, and they don't dare argue for it.


“Erwin Griswold, who had been the Dean of Harvard Law School, had the theory that he knew which people were geniuses. If he approved of them, they would certainly do good work over time, and therefore they had to write nothing.”

https://volokh.com/2011/10/02/justice-breyer-on-tenure-stand...

“The highest form which civilization can reach is a seamless web of deserved trust. Not much procedure, just totally reliable people correctly trusting one another. That’s the way an operating room works at the Mayo Clinic.”

https://fs.blog/munger-operating-system/


I've often given thought to exactly this problem: trust in all American institutions has been declining precipitously and continues to do so. I cannot name a single case of trust increasing anywhere within the last decade: religion and the church (catholic/Catholic, et al.), government at all levels (SF on up to NASA), academia and the sciences and seemingly all forms of education from elementary school on up to university, unions, the medical establishment from local providers to the CDC/FDA/NIH, the US military, the police (although somehow the Democratic party went from regarding the FBI as Hoover's institute to regarding it as one safeguarding democracy), Facebook (the Big Tech everyman), which went from a beacon of democracy under the election of Obama to whatever it is considered now, corporations and capitalism, the courts, and even the founding of the country itself. For a more niche subject of personal interest, our furniture has seemingly purposefully declined in quality as we outsourced manufacturing to China via IKEA and forced everyone else to do so.

I'd guess that the polarization of our society is self-reinforcing, and that as it increases, our institutions follow along for one reason or another, whether it be some form of "good business" or internal capture. It's probably some Internet Law that any institution not specifically devoted to staying out of it will eventually succumb, but even then you have the ACLU... and there is non-culture-war decline in trust too: something more like the financialization of everything, or the injection of metrics into all parts of life that can be metered...

We don't value competence and accountability first; maybe this was always the case and only our narrative changed, but that's hard to imagine when we used to do things. At this point, why not Research, too?


My view of this is that the pursuit of money is leading America (and the whole of Western society) to a dead end. Everything now is corrupted by money in a very shocking way (it was always like that, but not in the open). We're back to the '20s of the previous century, when a few people had enormous amounts of money and most people barely survived, even in the richest country in the world.


Interesting observation. Maybe it's a natural evolution of society that it partly resembles the evolution of a person: it starts out young, full of ideals and energy, trusting the people and institutions around it. Then it gets older, loses energy, and as the perception of corruption blows up, trust becomes rarer. This does not have to be the end of that society like it is for the person, but society probably needs some radical revolution to get things "back on track" again.

In research, abolishing the grant system and the business model of universities selling credentials, in favour of direct financing of learning and research institutions to teach the best students, should be attempted. It probably won't come from academia; it has to come from the state.


Shelby Steele does a good job of diagnosing the malady in his book White Guilt: when institutional America had to own up to its sins in the era of civil rights, it surrendered moral authority. Every generation rebels, but usually the kids come back to the same principles that their parents held once they experience the real world for a time. The civil rights era broke this pattern: the adults were wrong, and admitted it, and so an entire generation (Boomers) came of age thinking they had the justified authority to question and possibly overturn any of their elders’ institutions.


"You could have a compromise, so you don't accidentally fund eugenics or something."

The thing about basic research is that you usually cannot see the consequences down the line.

Few of the original researchers who discovered ionizing radiation could anticipate the enormous destructive power of nuclear weapons. And yet their contributions were crucial.


Starting to think that the most honorable thing to do is live a simple, modest life and not have children.


There's already a lot of published work in that area. Are you looking at confirming earlier work, or is there a particular focus that introduces novelty?


I guess the practical execution varies a little in each time, place, individual, and circumstance. Look at all these new temptations and tools we have developed! Maybe it will be different this time! Maybe not.


Here’s a novel idea: Try to have the same child with two different partners.

May require extensive trial and error before a good result is uncovered. Don’t let anyone tell you it’s cherry-picking..


Funny thing is that people often choose simple, modest lives specifically because it allows them to have children and be decent parents.


Nothing wrong with having children. It’s an honorable, self-sacrificing choice.


Part of this is just the professionalization (bureaucratization) of research. Maybe in the future more independently wealthy people will simply do science for fun.


So I recently came across an interesting idea that I think applies here. The idea is that "bureaucratization" and "professionalization" are actually opposites. This is a semantic argument, but the ideas themselves are interesting enough and we need to attach words to them either way, so let's just roll with it.

A professional is someone you trust to do their job. Like a doctor or a lawyer. If you have a medical or a legal problem, you go to a professional and the professional uses their professional judgment to make a decision and works on your behalf to try and solve that problem. And they take personal responsibility for their professional decisions. For instance, when a professional engineer signs off on building plans, he is saying, "this building is not going to collapse and kill people, and if it does, I will take personal responsibility".

A bureaucratic environment is an environment where processes and controls have supreme authority and there are no professionals. You have to jump through hoop A, fill out form B, and have everything reviewed by committee C to do anything because you are not a professional and your judgment isn't trusted. To some degree, this means doctors aren't fully professional anymore.


This is a nice story but I don't think it makes much sense.

As you observed, by this definition surgeons are not professionals but the teenager who's the sole employee at a lemonade stand is a professional. The former is extremely constrained by bureaucracy while the latter can do pretty much whatever they want as long as nothing burns down or gets too many people too sick.

Bureaucracy and professionalism are largely orthogonal. There are horrible bureaucracies where some individuals have immense power. In fact, that's probably way more common than not. There are also relative anarchies where no one has any real power because the whole org is completely beholden to the market in every aspect of its operation; e.g., most corner pubs on a crowded business street. Even the owners have at best marginal control over their employees and rented space.

A professional is just someone who does the same sort of skilled work year over year for pay. Most blue collar workers think of themselves as professionals, and you'll see plenty of discussion of "professionalism" in any trades training program.

Historically, the connotative notion of a "professional" that you're using here -- basically, upper-middle-class professions with a certain amount of social esteem -- covered the most bureaucratic occupations of their day. Have they gotten even more bureaucratic with time? Sure. But they were always more bureaucratic than other occupations of their time (mostly farming). Medicine or law being more bureaucratic than farming is not a new thing.


I'm not actually sure if medicine was more bureaucratic than farming 100 years ago. Most of the medical bureaucracy I can think of developed over the past century. In the 19th century medicine was, most people believe, unregulated to a fault.


Going full circle, then, back to the old days when this was almost always the case. Sadly, no matter whether it's the old way or the new way of doing research, politics, ego, and personal disputes will always play a big part.


We're going to have a relatively large number of children/inheritors of billion-dollar-scale fortunes within a decade or two. These individuals will have more money than they could ever need to use; however, they will lack for prestige and impact.

I wouldn't be surprised if some of them choose to create university positions for themselves, or otherwise "self-fund" their own prestige projects.


Yes, and for better and for worse. “Science is friendship” isn’t far from the mark.


A return to the glorious days of the gentleman scientist! All you need is wealth and nothing to do.

I wonder what such people currently do with their time? Actually, HN is a great place to ask this, as lots of them are here! wait


One potential saving grace is the falling cost of tools from technological advancement, but science is a very broad subject where material demands vary greatly. Theoretical physics may have minimal material needs. A citizen scientist might theoretically be able to do something with CRISPR to, say, modify E. coli to start producing carbon nanotubes, or try to evolve plastic-eating bacteria. But not make their own Large Hadron Collider.

One would need a very complete picture to be able to accurately generalize in such an absurdly broad area.


> Theoretical physics may have minimal material needs.

I don't buy that because it's very expensive to experimentally prove the difference between various theories.


Exactly true. Physics is an empirical science. The separation into "theoretical" and "experimental" (and "phenomenological" in between) is a result of some haywire marketing that has somehow divided the scientific method up into specialized professions. The field of Physics, though, is grounded in experimentation and observation. The lack of this grounding is called mathematics, philosophy, or in the extreme case: religion.

Best regards, A purebred experimentalist from the theoretical institute of physics at Blegdamsvej 17 (Niels Bohr’s Institute).


> The field of Physics, though, is grounded in experimentation and observation. The lack of this grounding is called mathematics

Right, and the mathematical field where you use axioms matching physics experiments is called theoretical physics. There is a lot of work to do in that domain, as we currently don't have any theories that match well with all the experimental data we have. We have small-scale theories and big-scale theories, but someone still needs to work out theories and formulas for combining the two. You might call that mathematics; others call it theoretical physics. Either way, it will help further the field of physics and science in general.


A huge contributor to this imo is the grant system and how it works particularly at very large research institutions and universities. Typically the only people with the means and time to do "just for curiosity" / fundamental research are people in long-term professor / research positions where they are allowed to pursue whatever they want (as long as they are frequently published). Grants work against this, as they create an incentive to work on specific, often more short term projects / applications rather than fundamental questions. In this way, injecting money into academia via grants actually reduces the amount of fundamental research being done, because a majority of researchers are going to chase the grants aka the short term interests of corporations and governments rather than do less financially rewarding fundamental research.


Without grants, those people would just not have a job in the field.


That's really not true. My advisor in undergrad publishes 5-6 papers a year and has never bothered to seek grants in his entire career, because he works at an institution that actually emphasizes teaching and compensates its professors accordingly.


I find this article a bucolic tale of tech and research.

Right after WWII, with the planet in shambles, living in the "winning" country, you're working on computers and find unfettered access to funds and investment? No surprise.

GE in 1956? To leave out the massive macroeconomic power of GE in that day and age is shortsighted. Same with Bell Labs et al. This was an age when military spending rose from 1% of GDP to 10%. It was military spending and military might that bought you that "unfettered research". Yes, society should be better at allocating for the long term regarding research and tech -- but 1956 GE was not some sort of utopia.

We're just in a lower part of the cycle right now. Unfortunately, the only reliable "reset" button society has found seems to be war. Hopefully modern financial markets will be able to create those cycles without as much bloodshed.


War only pushes the reset button in a positive way when it's WWII and you're America. See the economic consequences of the war in Vietnam for a more typical outcome of putting military spending to use.


Parent did note "living in the 'winning' country" as a precondition, so it seems you are probably in agreement to some extent.


Okay, economic consequences of successfully invading Afghanistan. Or Iraq.


I'll let you know, as soon as we do either of those things successfully. :)


Both countries were successfully invaded, but they were not successfully (defined in the way a Victorian English gentleman would define success) occupied.


Do Pyrrhic victories count? It seems the data model is not clear-cut.


Is the success of the drone industry an outcome of these wars?


"success" of the drone "industry"


Successful drone industry: China making quadcopters.

Expensive drone industry: US government buying unmanned air vehicles, totally unrelated to quadcopters.


Did North Vietnam lose the war?


Military spending as a percent of national income rose a huge amount in the 1940s, but so did spending on pretty much everything else. Over the period from 1930 to 1950, the United States (like many other Western countries) transformed itself from a society with pretty low taxes that spent the bulk of its (small) revenue on defense into a higher-tax society that spent (a lot more) revenue on defense, education, all kinds of scientific research, etc. In fact, while defense spending rose a lot during this period, these other categories of spending rose significantly more (as percentages of national income), because prior to the early twentieth century they weren't really considered core functions of the state. That is the bigger story (rather than World War II).

> Unfortunately, the only reliable "reset" button society has found seems to be war.

In my opinion this claim needs significantly more justification even though it is frequently tossed around.


If you're talking about America, yes the only "reset" button it knows is war.


I've got to say, that 4th paragraph is, or at least should be, the modus operandi of every scientist in any related field.

>In this style of work, the researcher is allowed, and even required, to select problems for investigation, without having to justify their relevance for the institution, and without negotiating a set of objectives with management. The value of the research is determined by other scientists, again without looking for its immediate effect on the bottom line of the employer. The assumption that justifies such a policy is that "scientific progress on a broad front results from the free play of free intellects, working on subjects of their own choice, in the manner dictated by their curiosity."

It's sad that the current state of science doesn't appreciate work done purely through curiosity, and instead wants to milk professionals for other means and agendas. The paragraph sheds light on what science really is, and what's kept fueling it for millennia: curiosity. Some of the greatest scientific discoveries have come from curiosity, from answering burning questions. Yes, we still have some great discoveries, but not as many now, I would think. Most of what science today seems to be is just proving or disproving agendas with clear incentives. There is some work that seems to be produced organically, but it's hard to know, because who knows the incentives and agendas behind the scenes.


Needs a (1995) in the title.


Not that it's gotten any better in the intervening 26 years.


Science Mart by Philip Mirowski does a good job of laying out, for those in research but also for those who are not, the political-economic changes that are responsible for this.


I do industry R&D. I don't think I am all that fettered. I do accept that there is a certain "social contract" involved, though! I understand my employer's business, and my work is done with the ambition to advance the company's long-term position.

Practically speaking, this means that my work is restricted in that it must deliver something resembling a product or service within a couple of years, even if only the thinnest of MVPs of a concept. String enough of those together and you get pretty close to something resembling "unfettered".

I suppose my point is that industry R&D is not all that bad right now, you just need to be a reasonably responsible corporate citizen.


> Practically speaking this means that my work is restricted in that it deliver something resembling a product or service within a couple of years.

So fettered, as per the article's definition of the word.

It really just sounds like you don't like the word fettered for some reason. Maybe it has a negative connotation to you?


"It is widely acknowledged that science made this transformation possible." Widely acknowledged by scientists, but not necessarily true. In the generation since this essay was published, could anyone argue that innovation has withered?


Only one past tiny thread:

The Decline of Unfettered Research (1995) - https://news.ycombinator.com/item?id=2952423 - Sept 2011 (2 comments)


> It will be a long time (if it ever happens) before Netscape earns enough profit to justify its initial stock market valuation.

Now that a long time has passed, I would be interested in hearing an analysis of this exactly as it is phrased: financially, with respect to profits vs valuation, and not in terms of historical impact.


Netscape was bought by AOL for $10B in 1999.[0] According to [1], Netscape was valued at $3B after its IPO. So in that sense its early investors did well (not so much those who bought at the peak price later in 1995). However, the natural follow-on question is whether AOL actually got $10B worth of value out of owning Netscape. I would guess not, but I don't know how you'd prove it.

[0]: https://web.archive.org/web/20171107021707/http://news.morni...

[1]: https://www.fool.com/investing/general/2013/08/09/the-ipo-th...


> However the natural follow-on question is did AOL actually get $10B worth of value out of owning Netscape.

I suppose that at this point it is lost in the digital accounting noise, but I bet there are some people left who saw it happen and have a perspective on it.


I'd say AOL got $10B out of owning Netscape, and then some. They bought Netscape in 1998 and, riding the hype wave, were able to buy Time Warner in 2000 — it was valued at $182 billion, but they paid nothing but AOL shares, which soon demonstrated their near-worthlessness.


From the perspective of stability-loving wealth, unfettered research is opening Pandora's box.

At last, the internet, opium of the masses, finally neutralizes the intelligent classes.

Wealth will not care for genuine research until two alien races war bitterly over some unique natural resource we take for granted. We will not recover.


He’s a little too casually dismissive of the end of Cold War military budgets as a driving force. He basically waves it off without explanation.

(I’m always amused that the Lambda-the-Ultimate papers were paid for by the Office of Naval Research.)


One of the reasons I took the position I did was that, fairly rare for my field, which is heavily grant-funded, there was enough hard money to do some curiosity-driven research and methods development.


A lot of the blame on the university side goes to the Bayh-Dole Act of 1980, which allowed patents from publicly financed research to be effectively transferred to the private sector via the mechanism of the exclusive license. This is going on right now with vaccines and antivirals as well, but it's been a steady theme for over three decades.

This could be fixed with the commonsense solution of requiring public access to publicly financed research and patents without burdensome fees. This would have allowed numerous private entities to begin their own development and manufacturing processes without worrying about patents and IP lawsuits.

Unfortunately, the end result was the rise in power of the academic Intellectual Property Office, which oversees the licensing (and gathers in the percentages). The UC and MIT systems were most notorious for pushing this approach in the 1990s, but it seems to be everywhere these days.

Effectively, the administrators put researchers with patent-generation potential at the front of the food line, and also pushed hard for public-private partnerships with mega-corps (UC Berkeley and BP, Stanford and Exxon, etc.). As part of that mentality, short-term profitability became the guiding light, not basic blue-skies research into whatever the researchers wanted to look at. Since much basic research generates nothing of immediate commercial interest, it was viewed as less important and even a drag on the bottom line.

Running with this mentality equates to killing the goose that laid the golden eggs, as groundbreaking discoveries leading to truly new technologies then become much less likely.

On the business side, the large private research centers of the post-WWII era seem much diminished. Giving corporations a tax exemption for increasing their R&D spending while also raising their taxes to 1960 levels might be an efficient way to reverse that trend. Elon Musk could then avoid the tax bill by putting all his money into a SpaceX R&D facility to rival the old Bell Labs, which is kind of a good idea anyway, isn't it?


What he described in the beginning of that essay is how Microsoft Research worked just 15 years ago. Maybe it still does, I wouldn't know. Yes, you are judged by your results. But beyond that and beyond some flimsy constraints dictated by what "team" you were on, there were no limitations whatsoever. This was the most productive and enjoyable time of my entire 25+ year career, and a staggering contrast to how the rest of Microsoft works. And the people were by far the smartest I have ever met, and this was not the only lab I worked at. Most of them weren't genius level (although they were still unusually smart), but some very clearly were, to the point where I'd sit in some meeting and lecture and think "what the fuck am I even doing here". Too bad most of the stuff they do never ends up in products - the rest of Microsoft can't tell a gradient from a hole in the ground, with very few exceptions (basically just Bing and some parts of Ads).


(1995)


Added. Thanks!


I recently proposed an idea to create a new data transfer protocol in which drones carry data on a storage medium, with a prong attached to them.

When the drone lands on a platform atop a building, the prong connects to a computer that is attached to the platform and sitting inside the building, triggering a mount action.

The data gets transferred to the computer. Now, whoever needs to transfer data from this building to another will use the same method to upload it.

I do not think this will be the best way to move data, but if a protocol is in place it could be used as a basis for future intra-campus data movement.

The proposal was shot down. A schematic of the idea is drawn here: https://github.com/ketancmaheshwari/datadrone/blob/main/sche...
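To make the proposed flow concrete, here is a minimal runnable sketch of one hop as I described it above; every name here (Phase, Drone, BuildingHost, handle_landing) is invented for illustration, not anything from the actual proposal:

    # One hop of the hypothetical drone data-transfer protocol.
    from enum import Enum, auto

    class Phase(Enum):
        IN_FLIGHT = auto()
        DOCKED = auto()       # prong seated in the rooftop receptacle
        MOUNTED = auto()      # building computer has mounted the drone's medium
        TRANSFERRED = auto()  # payload copied off; drone free to take off

    class Drone:
        def __init__(self, payload: bytes):
            self.payload = payload        # data carried on the onboard medium
            self.phase = Phase.IN_FLIGHT

    class BuildingHost:
        def __init__(self):
            self.received = b""

        def handle_landing(self, drone: Drone) -> None:
            drone.phase = Phase.DOCKED      # prong makes contact on landing
            drone.phase = Phase.MOUNTED     # contact triggers the mount action
            self.received += drone.payload  # copy data onto the building computer
            drone.payload = b""             # medium freed for the next upload
            drone.phase = Phase.TRANSFERRED

    drone = Drone(b"results for the physics building")
    host = BuildingHost()
    host.handle_landing(drone)
    print(host.received, drone.phase)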


Is this a joke I'm too tired to understand?


This is not meant to be a joke.


It sounds like you're proposing an improvement to RFC 2549 to reduce latency and packet loss.


I would suggest using RFC 6214 instead.

But really, for practical applications I would probably try to use UUCP.



Build it for transporting beer and just start attaching SSDs to the bottles.


So it is like Sneakernet but with wings. A winged sneakernet, if you will. Name it after some mythological handmaiden to Athena and you can't go wrong.


What's the maximum altitude of these drones? It might help you get funding if you could pitch it as Actual Cloud Data.


One could make a good security case for establishing ad-hoc point-to-point communications mechanisms.


But why?


I took a networks class during college, and there was a homework question from the textbook about a scenario like this. It had you compare transferring a large amount of data over the Internet versus loading it onto a disk and driving a physical distance to load it onto the other computer. The answer depended on the available bandwidth against the distance to drive.

And for other practical applications related to this idea: https://aws.amazon.com/snowmobile/
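For a rough sense of how that comparison plays out, here's a minimal back-of-the-envelope sketch in Python; the data size, link speed, distance, and driving speed are all made-up assumptions for illustration:

    # Back-of-the-envelope: network transfer vs. driving a disk over.
    # All numbers below are illustrative assumptions, not measurements.

    def network_hours(data_tb: float, mbps: float) -> float:
        """Hours to push data_tb terabytes over a link of mbps megabits/s."""
        bits = data_tb * 8e12              # 1 TB = 8e12 bits (decimal TB)
        return bits / (mbps * 1e6) / 3600

    def drive_hours(distance_km: float, speed_kmh: float = 80.0) -> float:
        """Hours to physically carry the disk distance_km at speed_kmh."""
        return distance_km / speed_kmh

    data_tb, link_mbps, distance_km = 100.0, 1000.0, 400.0
    net = network_hours(data_tb, link_mbps)   # ~222 hours on a 1 Gbps link
    car = drive_hours(distance_km)            # 5 hours of driving
    print(f"network: {net:.1f} h, car: {car:.1f} h")
    print("drive the disk" if car < net else "use the network")

On those assumed numbers the car wins by a wide margin (about 5 hours versus roughly 222), which was the textbook's point: past a certain data volume, physical transport beats the wire.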


A few reasons:

-- Cost: Drones are getting faster, more accurate, more reliable, and cheaper, while storage devices are getting lighter and denser. If a protocol is in place, vast amounts of data could be moved relatively cheaply at a faster rate.

-- Auxiliary Medium: If a campus network is down due to security threats, an assessment, or some other reason, this protocol may be used to pass critical data around.

-- Remote, inaccessible (edge) locations: places where a conventional network is difficult to set up due to their temporary nature, hazardous conditions, etc.


Use cases for this are easy to imagine. For example: data collected by instruments in remote locations that do not have high-capacity or cost-effective network connections. When collection has to be done by a human it could require days or weeks, perhaps more, whereas an automated system could perform the task more frequently.


I'll put this old witticism forward...

Never underestimate the bandwidth of a station wagon full of hard drives.


That's what the guys from the Internet Archive and CiteSeer also tend to say.


To actually hear your data buzzing around?



