Ask HN: How many AGI researchers have studied or are studying philosophy seriously?
15 points by chromoblob 4 months ago | 21 comments
Just asking. Philosophy by itself just seems to be THE important thing to study, and maybe research, on the road to creating an understanding artificial mind. And "deep learning", the "free energy principle"... were these produced systematically, with the help of philosophically informed theories?

Seriously. The 'existential' (epistemological + influential) power of AGI by definition (sorry, no definition, read it as "in my opinion") encompasses "all of science", or at least some "analog" of science (working somewhat differently than "human science") in which there will nonetheless be analogs of all the human sciences. I want to put forth my opinion that it's foolish to search for some numb mathematical formalism, one that isn't informed by philosophical theories, in the hope that it will generate all of science or something comparable. That's too primitive. Remember that scientists from different fields have different cognitive styles, and different sciences have different philosophies, methodologies, paradigms, "styles of mental content". AGI by definition (or in my opinion) will present a unified understanding of all of them from a "sufficiently philosophically powerful/abstract" philosophical foundation. So why not work on that theory?




As someone with a background in philosophy: very few researchers seem to care about philosophy, unfortunately. (At least from the outside-looking-in: I don't actually work in the AI field, so perhaps all of the philosophically-inclined researchers simply don't talk much.) This has two major negative consequences:

1. Sloppy, unclear thinking. I see this constantly in discussions about AGI, superintelligence, etc.: unclear definitions, bad arguments, speculation about a future religion "worshipping" AIs, sci-fi scenarios treated as somehow indicative of the field's future progress, on and on. It makes me long for the early-to-mid 20th century, when scientists and technicians were both technically and philosophically educated.

2. The complete and utter lack of ethical knowledge, which in practice means AI companies adopt whatever flavor-of-the-day ideology is being touted as "ethical." Today, that seems to be DEI, although it seems to have peaked. Tomorrow, it'll be something else. For most researchers, the depth of "AI ethics" or "AI safety" seems to depend entirely on whatever society at large currently finds unpleasant.

I have been kicking around the idea of starting a blog/Substack about the philosophy of AI and technology, mostly because of this exact issue. My only hesitation is that I'm unclear about what the monetization model would be, and I already have enough work to do and bills to pay. If anyone would find this interesting, please let me know.


Hello… I would find your proposed blog about the philosophy of AI interesting. I never studied philosophy formally beyond my first year of uni, and now I study psychology, but I have a strong interest in philosophy, and philosophy of mind more specifically. I'm about to start a master's thesis, and my topic is (roughly) something to do with the psychology of human-computer interaction. Philosophical thought around AI/AGI would offer another way of thinking about some of this stuff for me personally.

I'd love to hear some perspectives that don't come almost entirely from the tech industry. It takes such a narrow view.


Thanks, that’s reassuring. I definitely agree the range of viewpoints is limited.


If they did, we wouldn't have the bubble we have today, NVidia stock wouldn't be overpriced, etc.

In other words, here as in many areas there is no incentive to dig deep, while there are plenty of incentives to stay on the surface and tell scary stories about AGI doomsday to journalists who barely have writing skills, let alone any philosophical or logical foundation.


I am more interested in meta-cognition, advanced cognitive architectures, and learning about how humans learn, then figuring out how we get machines to do that better and faster.


Philosophy is just thoughts and thoughts about thoughts. To really understand consciousness, researchers should study meditation (for direct experience) and traditions such as Buddhism that have been studying consciousness for millennia.

See, for example, the Buddhist descriptions of the jhanas: progressive levels of consciousness in which meditators peel back the layers of their personality and ordinary human awareness and end up in pure awareness and beyond. It's hard to read about (and experience, albeit only the initial stage in my case) such things and still hold that consciousness derives from thought, as philosophers like to believe (no, Descartes, you are not just because you think).

It's for this reason I don't buy the AGI hype. Maybe after fundamental breakthroughs in computation and storage allow better simulations, but not any time soon since these traditions tell us consciousness isn't emergent. Most AGI researchers are barking up the wrong tree. Still, the hype boosts valuations so perhaps it's in their best interests anyway.

Philosophers can get so wrapped up in thought that they say nonsense like "I can't comprehend not having an internal monologue", something you can experience any time you watch a film, listen to music, etc. Anyone with even the smallest experience of meditation shouldn't fall into such thought traps.


Modern philosophy, as taught in philosophy classes, might be, but it was originally meant to be truth-seeking. Pragmatic philosophy has simply evolved into math: you have math, you prove the logic.

Everything outside philosophy and science is religion. Not to say it's necessarily wrong. But with religion you start with a dogma, and you have to prove everything according to that dogma; with philosophy you start with none. If the dogma seems incorrect, then the assumptions that lead to it must be wrong, so you get sects forming around which parts of the dogma are most correct, and so on.

Buddhism is, to my understanding, something where the dogma is easily proven or disproven by doing the meditation, but doing that much meditation is not accessible to everyone. And so it lies in the realm of religion where you trust in the monks who have done it and build off that.


> Buddhism is, to my understanding, something where the dogma is easily proven or disproven by doing the meditation

This makes sense, because it is trivial to empirically measure whether a person has achieved the cessation of dukkha and escaped the wheel of rebirth.


>Philosophy is just thoughts and thoughts about thoughts.

This is my beef with the field in its entirety as well. Stupid philosophers.

>... and traditions such as Buddhism that have been studying consciousness for millennia.

As opposed to say, philosophy?


No, because their insights come through firsthand direct experience. Of course, they may just be experiencing convincing simulations, but then everything might be anyway.


How is consciousness relevant to creating AGI?

> Philosophy is just thoughts and thoughts about thoughts.

I think these thoughts contain (in an ideal variant of philosophy which doesn't exist yet) enough "philosophical content" to directly help create AGI. Why do you think philosophy is so irrelevant here?

> Philosophers can get so wrapped up in thoughts they say nonsense

If someone says nonsense, that should not discredit the general ideas/foundation.


If someone demeans the importance of philosophy, it teaches you a lot about the person doing so. In the Western tradition, early philosophy and what we now call math were more or less the same. Look at the original meaning of the Greek word logos (logic) for illustration. The world as we know it couldn't exist without it.


I just don't believe it's possible to reason without consciousness. Combined with intelligence, memory, etc., it leads to actual understanding rather than just imitation, which is where AI is today.


> I just don't believe it's possible to reason without consciousness.

This is called an argument. It's what philosophers make.

Your attitude is a pretty typical one: "Philosophy is stupid, but this philosophical viewpoint that I have is special and somehow not bound to the things that apply to philosophy." It's a very common and predictable move from those that lack knowledge of the subject; the New Atheists and the philosophy of religion are a very prominent example.


> I just don't believe

Is it possible to compute without consciousness? (Yes.)

Is it possible to perform calculation over ideas without consciousness? Apparently so.

> understanding

You should prove that a "formal" form of understanding would be insufficient for reasoning.


So what is that "etc."?


I've got a background in philosophy and I'm constantly asking myself these questions too. There seems to be a two-way failure: first, the ML folks failing to engage with the extant literature, and second, academic philosophy's failure to produce anything remotely resembling concrete, practical, or really even relevant philosophical work. The former is pretty much par for the course (in academic philosophy you get used to being ignored early on), but it's the latter I find egregious, especially given (as others here have pointed out) the almost universally lazy and magical thinking in the ML space. For example, there's much hand-wringing about the benefits and perils of "AGI" with barely any attempt to establish that "AGI" is even a coherent concept. I'm skeptical that it is, but I'd be happy to entertain arguments to the contrary, if there were any! "AI" has become a marketing term for increasingly sophisticated statistical methods for approximating functions. I think some sober discussion about whether such brute-force induction is the right sort of thing to warrant the term "AI" would be a welcome addition.


Doing real stuff vs. just talk. The universe does not cohere with your thought-experiment results from philosophy, e.g. quantum mechanics, black holes, etc. Scientific discoveries have given people evidence to pursue science, but what about philosophy? What evidence do you have to prove your value?


Actually, it's about approaching problems after thorough consideration as opposed to narrow consideration. The value is in "having thought about it", and it does not need a long theorem.


Refer to Wikipedia:

> In its most common sense, methodology is the study of research methods. However, the term can also refer to the methods themselves or to the philosophical discussion of associated background assumptions.

So I consider methodology philosophical in character, and I propose researching methodology from a unified philosophical foundation.


George Boole



