The Ends of Knowledge (aeon.co)
44 points by drdee on Oct 5, 2023 | 24 comments



This is an amusing example of how pervasive censorship, coupled with the corporate takeover, has created an atmosphere of fear within the academic community in the USA:

> "If the utmost end of the university is or should be the advancement and distribution of knowledge – an increasingly open question in some quarters..."

They just can't come out and say it openly - but the corporatization of the American academic system has resulted in a new primary goal for academic institutions: the generation of revenue streams. A secondary but closely related goal is to not damage any existing revenue streams (by, for example, demonstrating some negative side effects of a newly patented and lucrative drug / technology / etc.).

You won't find any career academics in the US system openly discussing this; that would be about as damaging to their careers as criticizing Trofim Lysenko's plant-breeding ideology in the USSR would have been for a Soviet scientist (although getting fired is less draconian than being exiled to Siberia, I suppose). Some independent journalists have tackled it, however:

https://muse.jhu.edu/article/195083

> "In the process, the university's culture of openness and collaboration on scientific discovery has eroded; and growing conflicts of interest, as well as the demands of start-up businesses operating out of faculty research labs, have the potential to negatively impact the professional development of future generations of scientists. The result of this trend poses what Washburn calls "the single greatest threat to the future of higher education"."

It's not surprising that the authors of this piece don't dare mention this issue explicitly, though it is a bit pathetic.


> They just can't come out and say it openly - but the corporatization of the American academic system has resulted in a new primary goal for academic institutions: the generation of revenue streams. A secondary but closely related goal is to not damage any existing revenue streams (by, for example, demonstrating some negative side effects of a newly patented and lucrative drug / technology / etc.).

Then add to that the fact that much of their revenue comes from government funding, and you have a really toxic mix.


There are times when a complete rewrite is warranted; prose, code or otherwise. Yet explicitly putting a due date on knowledge and its pursuits ignores the fact that novel insights may arrive, and last, well beyond the 'end date' we put on such endeavours.

Firth's 1957 dictum, "you shall know a word by the company it keeps", and the question of whether meaning resides 'in the text' or in the mind of the reader, prevailed in academia at the time and seemed fairly settled (halfway in between, contextually entangled)[0].

Yet this very premise became the basis of modern NLP half a century later, where yes, a semblance of meaning could be found in just the text itself; you just needed a lot of it to be able to make out the interrelations.
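That premise can be sketched in a few lines. What follows is a toy illustration of the distributional idea only, not any particular NLP model: the corpus, the window size, and the function names are all invented for the example. Words are represented purely by counts of their neighbours, and similarity falls out of the counts alone.

```python
from collections import Counter, defaultdict
from math import sqrt

# A toy corpus, far too small for real NLP, but enough to show that
# co-occurrence counts alone encode a semblance of meaning.
corpus = [
    "the cat chased the mouse",
    "the dog chased the cat",
    "the cat ate the fish",
    "the dog ate the bone",
    "the king ruled the land",
    "the queen ruled the land",
]

def cooccurrence_vectors(sentences, window=2):
    """Represent each word by the company it keeps: counts of nearby words."""
    vectors = defaultdict(Counter)
    for sentence in sentences:
        words = sentence.split()
        for i, w in enumerate(words):
            for j in range(max(0, i - window), min(len(words), i + window + 1)):
                if i != j:
                    vectors[w][words[j]] += 1
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

vectors = cooccurrence_vectors(corpus)
# 'king' and 'queen' share contexts ('ruled', 'the'), so they come out
# more similar to each other than either is to 'fish'.
print(cosine(vectors["king"], vectors["queen"]))
print(cosine(vectors["king"], vectors["fish"]))
```

With six sentences the effect is crude; the parent's point is precisely that you need *a lot* of text before the interrelations become reliable, but the mechanism is the same.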

That is to say: 'settled knowledge' today [ed. if there is such a thing] can have a renewed impact tomorrow, and it seems odd to self-impose an expiry date on its pursuit because, at this moment in time, there seems to be nothing left to uncover.

[0]: https://en.m.wikipedia.org/wiki/Différance


> settled knowledge today can have a renewed impact tomorrow

We need to move away from the idea that there is such a thing as 'settled knowledge', because as you suggest there is always something else to uncover. Our knowledge is fallible, never settled, there is always the possibility to find and correct errors.


Perversely, this "settled knowledge" nonsense actively inhibits the growth of knowledge and just leads to the "justified true belief" swamp...

All we have is our current best explanations/understanding. We use them to make progress until we find a problem with one – something it fails to account for, something it makes an incorrect prediction about, etc.

Then we attempt to conjecture new explanations which solve the new problem while still accounting for all the useful aspects of the old theory.

That is knowledge creation, and also progress.



> All we have is our current best explanations/understanding.

And if our forefathers/mothers had run away every time they encountered a "swamp", we wouldn't have that. If you ask me, the philosophical accomplishments of this era are minimal, if not negative... let's hope there's no penalty for that, for us or our children (say, the effects of climate change on a species that is unable to act).


Thanks for the reminder, I was about to edit: "settled knowledge (if there is such a thing [...])"


A fitting title. The end of my reading, “As news outlets disappear, extreme political movements question the concept of objectivity and the scientific process.” Yuck.


What makes it worse is that it _completely_ misses the point.

Humanities is _interesting_ to many people.


The End of Knowledge is usually just experienced as grief.

I don't think academia ends; it goes dark. We probably need a white spot of apprehending new concepts inside the vaulted, empty halls of academia.

Underwater basket weaving and gender studies are a good cover for ideas better kept secret.

Secrets in general are basically mandatory. Electronics has persecuted 'knowledge creators', enough.


An epistemologically flawed article.


I assume you mean because it conflates modern day scholasticism, science, empiricism, and making shit up under one broad heading of "knowledge creation"?


It's just blind prophecy and therefore a fiction.

There is no way to predict the growth of future knowledge. If there was it would be equivalent to already having said knowledge.

Whether people do or don't create some specific piece of knowledge comes down solely to what problems they have, and whether we choose (individually or as a species) to attempt to solve them or not.

Our fallible knowledge is the byproduct of us solving problems.


I agree specific knowledge 'products' are difficult to predict. That science would grow beyond the capacity of any individual researcher to keep up with its field however, has long been contemplated [0].

Yet whether a field is plateauing or saturated is difficult to say, and I don't find it a compelling argument for closing up shop and calling it a day, as the article suggests.

[0]: https://en.m.wikipedia.org/wiki/Little_Science,_Big_Science


Speaking of epistemological flaws, stating one's opinions as facts is the easiest way to commit one.


I see what you did there. :)


Nice catch, I like to leave Easter eggs sometimes.


> There is no way to predict the growth of future knowledge. If there was it would be equivalent to already having said knowledge.

Exactly. Known unknowns vs. unknown unknowns.


Liberalism and the capitalist project at large do a good job of effectively disappearing into the natural world. In that, we have much to thank them for in what they have generated. But that doesn't mean we can't still point it out in conversations like this, and start to reconsider whether we really have two things with compatible ends anymore.

Kant, quite famously, set out on his critical project only after reading Newton's Principia. Today, it's easy to read the Critiques in a kind of philosophical vacuum: something abstract happened, the Enlightenment, and the idealists are all just responding to each other in this context.

But the way it played out was much more political (to put it, admittedly, crudely): the discovery/assertion of natural physical laws demanded a reassessment of subjectivity, because suddenly things that had up until then been attributed to rational minds had become seemingly decentered, now a property of the world itself! It is a short path from there to the idea of the modern individual, filled with interiority, subjectivity, agency. The same kind of subjects we still reckon over in the humanities. It is also the kind of subject that can thrive in capitalism: the modular individual that can be bargained with others on the labor market, rather than a person who is first and foremost an appendage of a community.

But this whole event also helps create the "natural world" that we now take for granted in scientific pursuits. To glean the world as a bundle of mathematical laws tends to render it static, an underlying canvas on top of which we can paint our experiments. This, too, is not just sufficient but necessary for the boon of capitalist progress we all enjoy today: it is only with an ever-regenerating nature that we can even begin to think about scaling the world into an industrial global monolith coordinated with adept, beautiful precision.

I am not sure where I was going with this, I just think even if you really love capitalism and don't see it ever ever going away, you still do yourself an injustice not bringing it into this kind of conversation. It just feels inarguable that it is the ground we stand on, almost literally. If it seems hard or strange to think about ends, or if things maybe are going past their end, it is maybe at least related to the fact that our entire world is in a situation these days where it literally can't slow down, can't even stop accelerating, without crashing.


It is becoming offensively dopey, the unrelenting attack I've experienced my whole life on subjective knowledge structures, coming primarily from Americans (and a little bit from Germans).

This dopey attack is an absolute masterclass in throwing out the baby with the bathwater.

Modern electronics as an entire category is hamstrung by its love of (and power of) objective knowledge structures.

It would be much more pleasant if every single discussion about the future of electronics were not led by more-of-the-same Harvard types who cannot know higher structures than a coral reef.

This persecution of higher ways of knowing (and categorical imperatives) leaves me extremely standoffish and anxious about the sharing of conscious and invisible ideas, with the Western public at large.

America and CyberSecurity are doing their darndest (in effect, not intent), to drag the highest into the lowest.

In essence, the West has tilted itself towards permanent close-minded-ness.

Maybe that's good, I don't know (I don't want to know). The future will not be with those who persecute higher knowledge, that is guaranteed.


Bold first fkn sentence


Odd how it doesn't address the replicability crisis in the first paragraph. If half of the "knowledge" produced is fiction, the knowledge mill needs to end...


And doesn't even question the root cause of this decline in "knowledge".

This collapse of knowledge is akin to when engineering leadership pushes for faster and faster product releases, with less and less time for thinking about code, and then blames the team because the product shipped in a broken state.

You can pretty much tie the replicability crisis to the funding-centered push for greater and greater amounts of peer review. Economic pressure to prove knowledge's "worth" has led to increasingly faulty knowledge.

I would expect this worldview from the WSJ, but it's a bit surprising coming from Aeon... oh, they have a book to sell about it, what an interesting coincidence. Further highlighting that in our late-capitalist world, knowledge that cannot be commodified should be eliminated.


It might be his last essay.



