> A robot (and the software that runs it) has no inherent rights, as it's a non-sentient creation.

That's not why. We don't know what test could be performed to determine whether some future (or present) AI is or isn't sentient; we just assert it and get on with our lives.

No, the reason robots and software have no inherent rights is because the law says so.

As a demonstration that this is a purely legal status that has nothing to do with underlying nature: one of the co-inventors of the automobile did not have many of the rights afforded to her husband: https://en.wikipedia.org/wiki/Bertha_Benz

And conversely many such rights do exist for corporations, which are not themselves sentient.




> No, the reason robots and software have no inherent rights is because the law says so.

This is based on a limited theory of rights as delivered by a state as sovereign rather than inherent to every being. There are other valid ways of interpreting rights beyond "because the law says so."

https://en.wikipedia.org/wiki/Negative_and_positive_rights


Those other methods are great in a philosophical debate, but have no power until implemented as law.

I'm not saying you're wrong; those kinds of debates are great at telling you what the law should be. But the practical reality is that a right you can't enforce is not useful.


A view that a sentient entity has no rights but what the law gives is incompatible with that entity's agency. Enumeration of rights is for the purpose of limiting the laws that govern any entity's behavior; it is not a specification for what the entity may do.


> have no power until implemented as law.

Influence is power. For a stark example, consider Sharia.


It's worth pointing out that the main reason people are invested in robots, AI, or corporations is that you can own them as capital, either for direct production or for secondhand economic benefits, often replacing human labor, which you cannot own.

This status difference is rooted in significant recognized differences and philosophical beliefs about the value of individuals, and in the idea that the law and society should exist to support that value. It's pretty far from "purely legal", and the fact that we're going to let people own robots should be only one of the many clues that there's something different about them.


A couple of hundred years ago, we could own other people.

All that philosophical stuff only mattered because it resulted in a change in the law; in the case of the US, differing philosophies led to a civil war over rejecting or enforcing that law.


I used that more in the context of the "speciesist" accusation; I don't consider robots to be a species, particularly not a species worthy of forcing a theater to allow their entry on their own merits.

You're right, though, that using sapience is a bad way to identify what is and is not a species.


Fair — "robot" is as vague a term at this point as, oh, "fish", I guess? Even sentience (or sapience) aside, a Roomba, an Optimus, and a vending machine are all importantly different kinds of robot, and a cinema would be relaxed, cross, and confused in that order by finding them in attendance during a screening.



