
I’m not talking about self harm. I’m talking about the experience of pain—which most everyone has had! These are different things.



I'm not quite following your argument on pain. Ability to feel pain is not sentience.


It is if you take “sentience” to mean “the ability to feel,” which is what my dictionary just told me. I think this category really is the most basic differentiating one. Higher-level stuff like self-awareness all depends on it. The most basic difference between a computer and a human (or even a dog…) is, in my opinion, the ability to feel.


> It is if you take “sentience” to mean “the ability to feel,”

I don't like this definition much because "feel" is a fuzzy word. In this context it should be "feel" as in experience. I can build a machine that can sense heat and react to it, but I can't build one that can experience heat, or can I?

You need to figure out what having the capability "to experience" means, and you'll be one step closer to defining sentience. Even so, I've never experienced anyone coming up with a succinct definition encapsulating how I experience sentience. I believe it can't be done. And if it can't be done, any discussion about whether or not someone or something is sentient is moot. If it can't be put into words, we also cannot know how others experience it: if they say this machine is just as sentient as I am, we'll have to take their word for it.

So the meaning of sentience is subjective, so there can't be an objective definition acceptable to everyone and everything claiming to be sentient.

There's my argument for why sentience cannot be defined. Feel free to prove me wrong by pulling it off.


> but I can't build one that can experience heat, or can I?

It would need to have a planner that can detach from reality to hunt for new long-term plans, plus a hardcoded function that draws it back to the present by replacing the top planning goal with "avoid that!" whenever the heat sensor activation has crossed a threshold.
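
A minimal sketch of that shape in Python (all names here, like Planner and HEAT_THRESHOLD, are hypothetical, just to make the mechanism concrete):

    HEAT_THRESHOLD = 0.8  # assumed normalized sensor activation level

    class Planner:
        """Hunts for long-term plans, detached from the current sensor state."""
        def __init__(self):
            self.goal_stack = ["explore"]  # top of stack = current goal

        def deliberate(self):
            # Placeholder for an offline search over imagined futures.
            return "plan for goal: " + self.goal_stack[-1]

    def heat_interrupt(planner, heat_activation):
        # Hardcoded reflex: replace the top planning goal whenever
        # the heat sensor activation crosses the threshold.
        if heat_activation > HEAT_THRESHOLD:
            planner.goal_stack[-1] = "avoid that!"

    planner = Planner()
    heat_interrupt(planner, heat_activation=0.93)
    print(planner.deliberate())  # -> plan for goal: avoid that!

The point is only the structure: deliberation runs detached from the sensors, and a fixed reflex can preempt whatever it was daydreaming about.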


> So the meaning of sentience is subjective, so there can't be an objective definition acceptable to everyone and everything claiming to be sentient.

It feels like you're begging the question here; I don't think this follows from any of your arguments. Except maybe where you state you believe sentience can't be defined, which, again, begs the question.

Though admittedly I don't see much of a traditional argument here. Your conclusion is interesting; could you try supporting it?


The first "So" at the beginning of that sentence is a typo. It indeed doesn't follow.

You can quickly spot what makes sentience subjective when you follow the explanations. Once unpacked, they all either turn out to be utter gibberish, lead to the conclusion that my computer is sentient (fine by me, but I don't think that's what we wanted?), are rooted in other terms with subjective meanings, or are circular. Let's look at that last kind, which Wikipedia illustrates well:

> Sentience: Sentience is the capacity to experience feelings and sensations [...]

> Experience: Experience refers to conscious events in general [...]

> Conscious: Consciousness, at its simplest, is sentience [...]

Back to where we started.

To break this circle, one needs to substitute one of the terms with how they intrinsically and subjectively understand it. Therefore the meaning of sentience is subjective. I realize you can expand this to argue that everything is then subjective, but to me that is a sliding scale.

The challenge I posed could be rephrased: come up with a definition that is concise and not circular. It would have to be rooted only in objectively definable terms.


> I can build a machine that can sense heat and react to it, but I can't build one that can experience heat, or can I?

Agents can imagine the future and its expected positive and negative rewards; this is an important part of selecting actions. Thinking about future rewards is "experiencing" the present emotionally.
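
As a rough illustration, a toy sketch in Python (not any particular RL library; the names and numbers are made up):

    def expected_return(rewards, discount=0.9):
        # Discounted sum of imagined future rewards for one action.
        return sum(r * discount**t for t, r in enumerate(rewards))

    def select_action(imagined):
        # Pick the action whose imagined future looks best.
        return max(imagined, key=lambda a: expected_return(imagined[a]))

    # Touching the stove is imagined to hurt immediately, so it loses
    # despite a later payoff: the imagined pain shapes the present choice.
    imagined = {
        "touch_stove": [-10.0, 0.0, 5.0],
        "step_back": [0.0, 1.0, 2.0],
    }
    print(select_action(imagined))  # -> step_back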


I guess it is hard to define because it’s such a basic, essential thing. So does it matter that it’s hard to define? Even babies and puppy dogs experience pain and pleasure. They are feeling creatures. We don’t have any evidence that non-biological beings have pain, pleasure, fear, excitement… and so on.


Dictionary definitions are of limited utility in philosophical discussions because they often make very broad assumptions. For example, computers can certainly sense things: they can detect inputs and make decisions based on those inputs. What is the difference between feeling and sensing, though?

In this case, by ‘feel’ we might implicitly assume various capabilities of the subject experiencing the feeling, like self-awareness. If we’re being precise, we can’t just say feeling is enough; we need to state the assumptions the dictionary leaves unstated.


When we say a camera can see—or a computer can sense—we are using an anthropocentric metaphor.





