And see also "Disciplined Minds" from 2000 about some other consequences: http://disciplinedminds.tripod.com/ "In this riveting book about the world of professional work, Jeff Schmidt demonstrates that the workplace is a battleground for the very identity of the individual, as is graduate school, where professionals are trained. He shows that professional work is inherently political, and that professionals are hired to subordinate their own vision and maintain strict "ideological discipline." The hidden root of much career dissatisfaction, argues Schmidt, is the professional's lack of control over the political component of his or her creative work. Many professionals set out to make a contribution to society and add meaning to their lives. Yet our system of professional education and employment abusively inculcates an acceptance of politically subordinate roles in which professionals typically do not make a significant difference, undermining the creative potential of individuals, organizations and even democracy. Schmidt details the battle one must fight to be an independent thinker and to pursue one's own social vision in today's corporate society."
Or Philip Greenspun from 2006: http://philip.greenspun.com/careers/women-in-science
"This is how things are likely to go for the smartest kid you sat next to in college. He got into Stanford for graduate school. He got a postdoc at MIT. His experiment worked out and he was therefore fortunate to land a job at University of California, Irvine. But at the end of the day, his research wasn't quite interesting or topical enough that the university wanted to commit to paying him a salary for the rest of his life. He is now 44 years old, with a family to feed, and looking for a job with a "second rate has-been" label on his forehead. Why then, does anyone think that science is a sufficiently good career that people should debate who is privileged enough to work at it? Sample bias."
Or the Village Voice from 2004 about how it is even worse in the humanities than sci/tech grad school:
"Here's an exciting career opportunity you won't see in the classified ads. For the first six to 10 years, it pays less than $20,000 and demands superhuman levels of commitment in a Dickensian environment. Forget about marriage, a mortgage, or even Thanksgiving dinners, as the focus of your entire life narrows to the production, to exacting specifications, of a 300-page document less than a dozen people will read. Then it's time for advancement: Apply to 50 far-flung, undesirable locations, with a 30 to 40 percent chance of being offered any position at all. You may end up living 100 miles from your spouse and commuting to three different work locations a week. You may end up $50,000 in debt, with no health insurance, feeding your kids with food stamps. If you are the luckiest out of every five entrants, you may win the profession's ultimate prize: A comfortable middle-class job, for the rest of your life, with summers off. Welcome to the world of the humanities Ph.D. student, 2004, where promises mean little and revolt is in the air."
The odds of success are probably even lower now with the expanding use of adjuncts to replace tenured faculty.
Of course, the irony is that US society now has more than enough wealth so that anyone who wanted to could live like a graduate student researching whatever they wanted on a basic income.
Which is more or less my plan. Some details here:
> My plan right now is to save money, retire early, and then do whatever research I want that fits my budget. This avoids many of the problems with the current system, but is not possible for many.
> This would allow me to pursue more risky research (in the sense that the research may fail to produce useful results) than an assistant professor trying to get tenure could. I also wouldn't have to raise funds, so I could focus on projects I believe are important, not just what can get funded.
I wish this option were better known and accepted. Some people seem to think I'm insane to intentionally pursue this, but as far as I can tell they don't see that the ship they are on (academia/government research/etc.) is sinking. It would be nice to talk with other independent researchers of this variety, exchange best practices, etc.
Depending upon the field, you can do research, publish, go to conferences, have a network of peers, without being standard faculty. I know research professors who have very few of the standard faculty obligations, for example. I also know people who do research entirely on private funding, but this almost always requires significantly more than just savings from retiring early, and they still stay part of the academic community.
I agree that it is not something that should be advertised as an option, because it is very rare, but it does exist.
There are many examples. Charles Darwin is the most famous. In my field (fluid dynamics) Robert Kraichnan is also well known and influential.
Of course, plenty of cranks go this route too. It's an option, not a panacea.
> You'd be limited to a very tiny sliver of research that doesn't require staff or expensive equipment.
Not a problem for me as a theorist. And I believe that expensive equipment is overused in my field anyway.
I also disagree that this is a "tiny sliver" of the research. Computation and theory is roughly half the research in my field, and I suspect this is true for many fields.
If this option doesn't work for you, don't do it.
> You'd be doing research without the support network of peers in a department, or colleagues at a conference to talk to.
I disagree. Collaboration does not require "official" status, and neither does attending a conference. In my field, no one cares what your affiliation is. At worst I could start a consulting company, which I'd probably do anyway. Plenty of consultants attend conferences in my field and collaborate with researchers in government, industry, and academia.
No need to get overly defensive. I'm responding to your argument that this should be more widely known and accepted. Science gets harder to do every year as the lowest-hanging fruit keeps getting plucked. Your examples aren't convincing -- Charles Darwin was born in 1809 and things were different back then. Your other example according to Wikipedia is someone who got a PhD at MIT, and applied for grants and held faculty positions at a number of universities.
But hey, if you end up doing this successfully, come back and tell us how it went. But in the meantime, I'm going to continue disagreeing with you that this is a reasonable route to productively conduct research.
I could provide an example of research which is disincentivized in traditional academia but incentivized in independent research from my own PhD research if you're interested.
> Your other example according to Wikipedia is someone who got a PhD at MIT, and applied for grants and held faculty positions at a number of universities.
Kraichnan was an independent researcher from 1962 to 2003, the majority of his career. Yes, he was affiliated with a university at multiple points of his career, but he spent 4 decades as an independent researcher and produced some of his most important work during that time.
I'd be interested, if possible.
Traditional academics have the "publish or perish" incentive. In practice this means that they prioritize "quick wins" over "slow wins", e.g., given a choice between publishing 1 paper after 1 year (quick win) or publishing 5 papers after 5 years (with no publications before then) (slow win), they'll choose 1 paper after 1 year. If an academic goes too long without a publication, that will be counted against them. The low hanging fruit for quick wins has been taken due to this incentive, but I see no shortage of slow wins. (The scenario I describe is an extreme case, but the same incentive still exists in less extreme cases.)
There's also the problem that what can get funded is not necessarily what's most important. Norbert Wiener discusses this at length in his 1950s book "Invention". Wiener notes that despite the obvious political differences between the USSR and US, research funding is allocated similarly: by people too far removed from the actual research, who are often not in a good position to evaluate its merit. It doesn't matter if these people are managers, bureaucrats, or fellow scientists. There's generally an asymmetry in information between the scientists requesting funding and those able to provide it. (Having more time to learn about each proposal could help, but the trend I imagine is that time available to review each proposal has decreased over the years.) This ignores the lottery-like nature of the entire funding process.
To get more specific, both of these problems would disincentivize a traditional academic from publishing this paper I recently submitted to a conference:
In principle, a traditional academic could have written this paper. It's possible, but I think less likely because of "publish or perish" incentives. (To be clear, I am a PhD student right now, and most of the time I was doing the research in the paper I was a TA. I don't have the "publish or perish" incentives that make this research less likely. If I stayed in academia longer I would.)
My advisor and I tried to get funding for this project, but our grant proposal (I wrote the vast majority of it) was rejected for reasons beyond our control (which I have no problem with). We received positive comments on the proposal, and it served as a draft of my PhD proposal.
Without going into detail, the paper develops a simple mathematical model of a certain physical process. The theory and its validation would not have been possible unless I did two things that traditional academics seem to think are a waste of time:
1. Very comprehensive literature review.
2. Very comprehensive data compilation.
Now, I think most people would believe these two are just what academics do. But apparently not. Traditional academics are incentivized to do the bare minimum to get another publication. There's an epidemic of copying citations and merely paraphrasing the review sections of papers without reading the original papers, and I think this is caused partly by these incentives.
The literature review I did (not all of which made it into the short conference paper) was considerably more comprehensive than any I've seen published in the field before, and I was able to synthesize past theories and improve upon them by recognizing some of their flaws.
How do I know I was more comprehensive? One way is by the excellent papers I found which few seem to be aware of. In the paper I mention, papers 3 through 8 have very few citations. Some of them have not been cited at all in the past 40 years to the best of my knowledge. Someone could say that these papers are just unimportant, but they're not. In my view they're "sleeping beauties":
Further, I spent a year or two alone digging deeper and deeper into the literature in this problem. There were several times when I thought I probably had at least touched everything, but a few weeks later I found yet another area that I had missed. Being comprehensive is difficult and time consuming. If you just want the minimum to publish, you won't bother.
I also benefited from certain heuristics which allowed me to identify important neglected research. For example, I spent a lot of time tracking down foreign language papers and books because I recognized that this research was avoided because it was written in a different language, not because it was bad. The entry costs to foreign language literature have dropped greatly over the past decade with options like Google Translate. I've translated around 10 full papers into English right now, and produced many more partial translations. These papers have provided critical insights that were necessary towards writing this paper. At this point some people I know use the fact that I like reading foreign language papers as a joke. Traditional academics think this is absurd, but I see that there's value, just that it takes time to be realized.
It was through my comprehensive literature review that I got the idea behind my data compilation. By taking advantage of the properties of a special case, I was able to get information that most researchers in this field seem to believe is extremely difficult and expensive to obtain. I did not come up with the idea myself. I was translating a 1960s Russian language paper into English when I realized based on what was written in one paragraph that I could use the properties of a special case to get some hard to obtain information. The author was actually leading into this. The next paragraph explicitly said the author was taking advantage of the properties of a special case. So it wasn't very original on my part. The 1960s Russian researcher didn't have a lot of data to use, but there's a lot now 50 years later.
So I started compiling data. I get the impression that few academics would have compiled even half as much as I did, or have been even half as careful as I have about it. I was very careful to select only the least ambiguous data sources. Out of over 100 candidate data sources, there were only around 20 which were acceptable. I then took the time to carefully transcribe all of the relevant data from these sources, and develop a computational framework to handle this data (based on Python and Pandas). It was probably at least 6 months of work, but I can produce several papers based on it, so it's worthwhile in my view. My advisor was not initially enthusiastic about compiling this data, by the way. He's a successful traditional academic, however, and his intuitions are calibrated differently than mine are.
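The framework itself can be quite simple. Here is a minimal sketch of what such a pandas-based pipeline might look like; the source names, columns, and acceptance criteria below are invented for illustration and are not the actual ones from my compilation. The idea is to catalog candidate sources, filter down to the unambiguous ones, and merge the transcribed measurements into one table tagged by source:

```python
import pandas as pd

# Hypothetical catalog of candidate data sources. In practice this would
# hold 100+ entries, each vetted by hand against the original publication.
sources = pd.DataFrame([
    {"source_id": "ivanov1965", "year": 1965,
     "ambiguous_units": False, "complete_conditions": True},
    {"source_id": "smith1972", "year": 1972,
     "ambiguous_units": True, "complete_conditions": True},
    {"source_id": "lee1980", "year": 1980,
     "ambiguous_units": False, "complete_conditions": False},
])

# Keep only the least ambiguous sources: units must be unambiguous and
# the experimental conditions fully reported.
acceptable = sources[~sources["ambiguous_units"]
                     & sources["complete_conditions"]]

# Transcribed measurements, one small table per accepted source,
# combined into a single frame tagged by source_id so later analysis
# can trace every point back to its origin.
measurements = {
    "ivanov1965": pd.DataFrame({"x": [0.1, 0.2], "y": [1.1, 2.3]}),
}
accepted_ids = set(acceptable["source_id"])
combined = pd.concat(
    [df.assign(source_id=sid)
     for sid, df in measurements.items() if sid in accepted_ids],
    ignore_index=True,
)
```

The payoff of keeping the filtering criteria as explicit columns is that the acceptance decisions are auditable and re-runnable, rather than buried in a spreadsheet's edit history.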
I clicked on that link and I noticed that Darwin and Kraichnan both had institutional affiliations.
Maybe possible if you're paying your own way. Lots of newly-minted professors need those small grants to get their research going. Find one who can see you as a colleague and not as an ATM.
Like anyone who presents at tech conferences.
You're on Hacker News for Knuth's sake. The entire premise baked into the word hacker is that enterprising smart people can change the world with just a little determination and hustle.
Even the hackers out there not making the next breakthrough technology but emulating and testing some arcane ICS, it's crazy to warn people away from that as risky and possibly not useful.
Every career is possibly not groundbreaking, every path has that risk. People should be realistic, sure, but if someone wants to support themselves while they do interesting research, I have no objections to that.
Do what you love, this isn't someone whose hopes and dreams are contingent on starring in the next Hollywood blockbuster, it's someone who wants to geek out on their own dime. Great! Amazing! Tell HN how it goes and most of us will love it, world changing or even just something fascinating to you, we have your back.
I think the opportunities in tech are promising towards this end, and would be interested in getting in contact with other people with similar plans.
I'm in the "build skills, save money" phase currently, and probably will be for > 5 years to come. So that is my main focus right now.
I think some platforms come close. There are places to ask questions (e.g. StackExchange) but they don't seem to like "what if" questions and if the idea is good there's no good way to follow up over months or years. Github works for some fields where the idea is tightly coupled with the implementation (e.g. some computer systems fields) but not for others.
I guess I would just like a place to discuss ideas. In academia it's common to discuss nascent ideas within your lab, but larger collaboration happens at things like conferences where you're presenting only the ideas that worked out. I think such a platform could make independent research like you're describing much more effective and attractive.
That said, you have a great insight here with the idea that some places on the web could replace (or at least supplement) the traditional advisor/advisee relationship with a more peer-to-peer approach -- especially for independent researchers.
That issue of how to follow up over months or years seems key as a difference from more casual one-off interactions. Well-run mailing lists or forums may provide some of that continuity -- but maybe there is a social or technical way to go further in that direction?
I'm not sure if there ever would be just one place to discuss ideas, but you might want to bring up this general idea of some new platform with Michel Bauwens of the P2P Foundation: https://en.wikipedia.org/wiki/Michel_Bauwens
P.S. 42, with ADHD. Started University at 15, started high school at 11. Great at getting jobs. Not so great at keeping them. :P
From a diversification standpoint I think this should only be one source of one's funding. The bulk of my planned funding is going to come from savings from a job. I have the most control over that, and it's a much larger source. I've also considered working as a lecturer from time to time as engineering lecturers seem to be reasonably well paid (at my university, ~$10K per class). Might as well take advantage of the rise of adjuncts. You can travel to different universities regularly for collaboration this way too. Some more permanent lecturer positions are fairly decently paid from what I understand and may be a decent way to save money while also having opportunities for research. The research is the goal, not the title of "independent researcher".
Also, having to solicit funding regularly is something I'd rather not do. It takes away time from research, and I'd like to focus on research which is not so easily funded. To go back to the "low hanging fruit" point mentioned elsewhere in this comment tree, I think there are many research topics which don't sound good to a third party but are actually good. It can be hard to convince people of this. The easiest way to move forward on these research topics is to risk your own money. And with the most easily funded ideas taking priority, I can see many examples of "low hanging fruit" in my own field.
The Patreon model could get around the "research not sounding good enough to fund" problem. Pay an individual to do work in general, not specific work. But aside from someone working on topics of popular interest (e.g., gwern), I don't think this would work.
It's too bad there isn't more of an independent academic community, but it sure beats pressure to publish and writing grant applications.
Then look up some of the references and referencees or stuff from the same author. Keep doing this one tactic and you'll easily find years worth of reading to do.
Besides finding the abstracts you'll need access to the research. This used to be an issue but luckily now we have sci-hub.tw and paperdownloader.cf
Mostly you do research by being interested in something. If you need someone to tell you what to be interested in, a PhD may not be for you.
A good supervisor will tell you to be interested in things you may not have considered. But empirically, most supervisors will steer you in the direction of their own interests.
These may or may not match your own interests. The mismatch is at least as likely to be a bad thing as a good one.
In my own experience, I decided to work independently instead of starting a PhD. There are only a couple of directly relevant journals, and I literally skimmed every issue, reading and taking notes on the papers that counted as prior art.
Those papers often quoted other papers outside the immediate domain, so I followed them up - and that’s how you start.
I have a pretty good idea of the directions I’d be steered in if I was being supervised, and near certainty that those are not the directions I want to explore.
Maybe I'm misinterpreting but this comment seems a bit condescending.
The thing is, it's not really research unless it's something new, and you can't know if something is new unless you know what already exists.
> All disciplines have journals, and many are either public access or available on sci-hub. If all else fails you can get a limited JSTOR subscription for only a moderately outrageous sum.
Sure you can have access to journals but how do you even know what you should search for? I suppose this is sufficient if what you want to do research in is something that is currently mainstream?
I "research" stuff all the time, do a google search for some term and find an interesting paper then go through the references that seem interesting. Works better than you'd imagine, I was looking up CESK machines and down through the rabbit hole I eventually found the original paper with the non-obvious name of The Calculi of λ-v-CS Conversion: A Syntactic Theory of Control and State in Imperative Higher-Order Programming Languages. Honestly, those math-heavy CS papers tend to make my head hurt though.
Don't think there's a whole lot of people who would claim lambda calculus is mainstream but I can spend hours upon hours finding stuff to read about and most of (or all) the papers are easily downloaded off the author's website.
That's pretty good actually. Your chances are far less than 10% in physics and that's for postdocs not students, and that's assuming you're from a top university with a good publication history in the top of the top journal(s).
Edit: For all the replies that focus on "summers off", I'm not saying you have summers off, that's a part of something I quoted. I'm well aware you don't get summers off from experience. Heck you can't take off time even after a baby unless you want to jeopardize your chances in tenure. If you read the portion that I wrote, you can see that my reply is about the job security (=tenure).
It is actually nearly impossible to find time to actually take a vacation. I had more than 6 months banked when I quit.
Edit. To add to your edit, tenure is far less secure than it appears from outside. It allows you to get away with being a moderate pain, but if you get too bad what they do is make your position redundant in the next departmental reshuffle. Since these occur about every 2 to 3 years, tenure is more illusory than real.
You not only have to worry about your position being made redundant, but your whole department. It is easier for admin to get rid of a few troublemakers by killing the whole department than just going after the troublemakers directly. A fair amount of my admin/politics time was spent defending the entire department rather than worrying about my own position directly.
How much of a professor's life is spent showing that they are doing work vs. actually doing it?
What aspects are better and what worse? And which are popularly misunderstood (like summer vacation) ?
I was deliberately terrible at the administration side and outright refused to do anything unless it meant more money for my department (the one good thing about tenure is you can be a pain in the backside and not get fired), but even spending less than 10 hours a week on admin, my work week was rarely less than 80 to 90 hours.
One of the major reasons why I quit academia and went back to my startup was to have more time to spend with my family.
That alone seems like a rather damning indictment of academia.
I of course did supervise students. This does vary by field, but done right each student needs about 5 hours a week of one-on-one direct engagement (more at the beginning of their degree and less at the end). Less time than this and the student will struggle.
You can of course do what some of the big empire builders do and just let the student sink or swim on their own, but this is not good for the student. Good supervision is very time consuming, but probably the most rewarding aspect of being an academic.
> researcher (well supervision of people actually doing the research
You explicitly distinguished the two.
And thank you for your answers.
Instead, a professor is usually spending more time supervising research. This is typically because a professor has multiple students, and as danieltillett explained, they already have many other time commitments. But this is also because the point of the entire process is for the students to become independent researchers. That is, the research supervision is teaching their students how to be researchers. In my experiences, professors in science and engineering tend towards being managers rather than being in the lab or at the keyboard.
I presume danieltillett made this distinction because there is a difference between supervising and doing the research.
The irony is that US society now has more than enough wealth so that anyone who wanted to could live like a startup "founder" making whatever app they wanted on a basic income.
Remember when $50,000 sounded like some kind of scary worst case scenario for student debt?
I don't think there was much response to any of them though.
As Upton Sinclair wrote about a century ago: "It is difficult to get a man to understand something, when his salary depends on his not understanding it."
How does that apply to the exploitation of the stars in the eyes of graduate students? There may be vast amounts of self-serving denial of the pyramid scheme aspect of much of academia.
Like George Orwell wrote in 1946: "The point is that we are all capable of believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right. Intellectually, it is possible to carry on this process for an indefinite time: the only check on it is that sooner or later a false belief bumps up against solid reality, usually on a battlefield. ..."
I'd guess the outcome will continue to be mainly gradually increasing pain for all involved. Human systems seem to be able to tolerate a large amount of needless suffering when there is no obvious credible alternative and there are still some positive aspects of the current system. Related: https://www2.ucsc.edu/whorulesamerica/change/science_freshst...
Likely things will keep going on a downward trend until some significant shock causes a massive reorientation of resources. Alternatively, the shock may just be the crossing of various trend lines like increasing student debt versus decreasing graduate opportunities to the point where no one could justify the cost as an investment.
See also a few different theories of social collapse which could be applied to understanding possibilities for academia: https://en.wikipedia.org/wiki/Societal_collapse#Theories
And: https://en.wikipedia.org/wiki/Dark_Age_Ahead "Dark Age Ahead is a 2004 book by Jane Jacobs describing what she sees as the decay of five key "pillars" in "North America": community and family, higher education, science and technology, taxes and government responsiveness to citizens' needs, and self-regulation by the learned professions. She argues that this decay threatens to create a dark age unless the trends are reversed. Jacobs characterizes a dark age as a "mass amnesia" where even the memory of what was lost is lost."
And we are seeing that sort of amnesia in the USA in academia and other places -- where fewer people remember what academia used to be like decades ago.
Just like there is a growing amnesia where fewer people remember what it was like to go to school in the USA back in the 1960s when kids were taught how awful the USSR was because it kept its citizens under constant surveillance...
But we still might hope for a gradual transition to other ways of organizing research and discussion like via the internet (such as Hacker News) -- but people still need to somehow get enough available time to participate in productive ways.
And the original Nature article is an example of an attempt at self-correction.
Here are a couple recent satires on academia both from 2013:
From Amazon: "Option Three: A Novel about the University by Joel Shatzky -- When Acting Visiting Assistant Professor L. Circassian is fired and rehired in the same week (with a 35 percent pay cut), he is only at the beginning of a cycle of abuse and professional debasement at the university. Joel Shatzky has created an hilarious novel about the corporatization of higher education - a book filled with blowhard deans, corrupt politicians, grasping CEOs, inept union officials, inappropriately dressed students, and scholars in donkey ears."
"What’s Going On at UAardvark? by Lawrence S. Wittner -- What’s Going On at UAardvark? is a fast-paced political satire about how an increasingly corporatized, modern American university becomes the site of a rambunctious rebellion that turns the nation’s campus life upside down."
Both relate to this Atlantic essay from 2000: https://www.theatlantic.com/magazine/archive/2000/03/the-kep...
"The Kept University: Commercially sponsored research is putting at risk the paramount value of higher education—disinterested inquiry. Even more alarming, the authors argue, universities themselves are behaving more and more like for-profit companies."
Here is an essay I wrote mostly around 2001 on one way to fix one negative aspect of much of modern academia and other not-for-profits supposedly dedicated to working in the public interest:
"Foundations, other grantmaking agencies handling public tax-exempt dollars, and charitable donors need to consider the implications for their grantmaking or donation policies if they use a now obsolete charitable model of subsidizing proprietary publishing and proprietary research. In order to improve the effectiveness and collaborativeness of the non-profit sector overall, it is suggested these grantmaking organizations and donors move to requiring grantees to make any resulting copyrighted digital materials freely available on the internet, including free licenses granting the right for others to make and redistribute new derivative works without further permission. It is also suggested patents resulting from charitably subsidized research also be made freely available for general use. The alternative of allowing charitable dollars to result in proprietary copyrights and proprietary patents is corrupting the non-profit sector as it results in a conflict of interest between a non-profit's primary mission of helping humanity through freely sharing knowledge (made possible at little cost by the internet) and a desire to maximize short term revenues through charging licensing fees for access to patents and copyrights. In essence, with the change of publishing and communication economics made possible by the widespread use of the internet, tax-exempt non-profits have become, perhaps unwittingly, caught up in a new form of "self-dealing", and it is up to donors and grantmakers (and eventually lawmakers) to prevent this by requiring free licensing of results as a condition of their grants and donations."
And here is a book-length essay by me from 2008 on how to rethink Princeton University as a mental-health-promoting post-scarcity institution: "Post-Scarcity Princeton, or, Reading between the lines of PAW for prospective Princeton students, or, the Health Risks of Heart Disease"
"The fundamental issue considered in this essay is how an emerging post-scarcity society affects the mythology by which Princeton University defines its "brand", both as an educational institution and as an alumni community. ... We can, and should, ask how we can create institutions that help everyone in them become healthier, more loving, more charitable, more hopeful, more caring..."
So essentially, if we want the better parts of old academia from the US 1950s-1970s back, there will need to be some radical changes. As G.K. Chesterton wrote in 1908:
"We have remarked that one reason offered for being a progressive is that things naturally tend to grow better. But the only real reason for being a progressive is that things naturally tend to grow worse. The corruption in things is not only the best argument for being progressive; it is also the only argument against being conservative. The conservative theory would really be quite sweeping and unanswerable if it were not for this one fact. But all conservatism is based upon the idea that if you leave things alone you leave them as they are. But you do not. If you leave a thing alone you leave it to a torrent of change. If you leave a white post alone it will soon be a black post. If you particularly want it to be white you must be always painting it again; that is, you must be always having a revolution. Briefly, if you want the old white post you must have a new white post. But this which is true even of inanimate things is in a quite special and terrible sense true of all human things. An almost unnatural vigilance is really required of the citizen because of the horrible rapidity with which human institutions grow old. It is the custom in passing romance and journalism to talk of men suffering under old tyrannies. But, as a fact, men have almost always suffered under new tyrannies; under tyrannies that had been public liberties hardly twenty years before. Thus England went mad with joy over the patriotic monarchy of Elizabeth; and then (almost immediately afterwards) went mad with rage in the trap of the tyranny of Charles the First. So, again, in France the monarchy became intolerable, not just after it had been tolerated, but just after it had been adored. The son of Louis the well-beloved was Louis the guillotined. So in the same way in England in the nineteenth century the Radical manufacturer was entirely trusted as a mere tribune of the people, until suddenly we heard the cry of the Socialist that he was a tyrant eating the people like bread. 
So again, we have almost up to the last instant trusted the newspapers as organs of public opinion. Just recently some of us have seen (not slowly, but with a start) that they are obviously nothing of the kind. They are, by the nature of the case, the hobbies of a few rich men. We have not any need to rebel against antiquity; we have to rebel against novelty. ..."
Or, for a more modern take on that from 1963, John W. Gardner argued in "Self-Renewal: The Individual and the Innovative Society" that every generation needs to relearn for itself what the words carved into the stone monuments mean. Fundamental values, he says, are not a reservoir filled long ago by previous generations and now running out, but a reservoir that must be refilled anew by each generation in its own way.
Without necessarily approving of their specific actions, value re-creation is what people like the late Aaron Swartz (taking on JSTOR and MIT) and Alexandra Elbakyan (taking on Elsevier with Sci-Hub) were and are trying to do. Richard Stallman, with the GPL and the 1985 GNU Manifesto written in response to proprietary software agreements in academia, is a less controversial example because he worked within existing copyright law. So -- also less controversially -- are Wikipedia, Hacker News, Reddit (Swartz again), Slashdot, Archive.org, GitHub, and many other internet-mediated venues, which create ways to discuss and learn about sci/tech/humanities topics outside of formal academia. They are all essentially treating formal academic systems as-we-know-them-in-practice as damage and routing around them.
I'll need to read that book, thanks. In the meantime, I was also reminded of things Erich Fromm wrote, or said, in this case:
> Our main way of relating ourselves to others is like things relate themselves to things on the market. We want to exchange our own personality, or as one says sometimes, our "personality package", for something. Now, this is not so true for the manual workers. The manual worker does not have to sell his personality. He doesn't have to sell his smile. But what you might call the "symbol pushers", that is to say, all the people who deal with figures, with paper, with men, who manipulate - to use a better, or nicer, word - manipulate men and signs and words, all those today have not only to sell their service but in the bargain they have to sell their personality, more or less. There are exceptions.
-- Erich Fromm in an interview, https://www.youtube.com/watch?v=Cu-7UDT0Xe4&t=1m34s