
Wouldn't such a society be predicated on the idea that everyone has a basic income? These fully automated factories with no (state?) ownership would be relegated to producing the basic consumables that come along with a society providing everyone the basics of survival, hygiene, and ideally health (mental and physical).

As such, a society would find value in the skills of the populace that produce things with a "human" value, prized for having been created by a human.

Further, it would seem that the overall population would drop significantly. Especially as automation technology iterates, machines will care for basic human needs and will handle the maintenance and production of other machines to keep the system going.

Will AI manage the overall resource supply chain?

It would be interesting to see a critically thought-out matrix of all the roles that could be done by machine/automation/AI versus those that must be done by a human.

What about the "soft" skills required to run a civilization: politics and law, for example?

While politics is fundamentally required to ensure the stability of an economy and society, so that humans can survive in an ordered world, it is also clearly exploitable. Shouldn't we be attempting to remove as much human cruft from that process as possible, while ensuring that human empathy remains, since machines cannot have empathy? (At what point do we trust "programmed empathy" in an AI?)

---

While there are all these efforts in ML to get machines to "see", say, cats in a picture, are there any efforts to teach an AI to discern emotion in a given scenario?

Then ultimately, an AI could use all of this to interpret intent...
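As a toy illustration of what "discerning emotion" might mean at its very simplest, here is a minimal lexicon-based sketch. Everything here (the word list, the function name) is hypothetical; real emotion-recognition systems are trained models over labeled corpora, not word lookups.

```python
# Toy lexicon-based emotion scorer -- a hypothetical illustration only.
# Real systems use trained models; this just counts emotion-laden words.
from collections import Counter

# Hypothetical mini-lexicon mapping words to coarse emotion labels.
EMOTION_LEXICON = {
    "happy": "joy", "delighted": "joy", "great": "joy",
    "angry": "anger", "furious": "anger", "hate": "anger",
    "sad": "sadness", "crying": "sadness", "grieving": "sadness",
    "afraid": "fear", "scared": "fear", "worried": "fear",
}

def detect_emotion(text: str) -> str:
    """Return the most frequent emotion evoked by words in `text`,
    or 'neutral' if no lexicon word appears."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    counts = Counter(EMOTION_LEXICON[w] for w in words if w in EMOTION_LEXICON)
    if not counts:
        return "neutral"
    return counts.most_common(1)[0][0]

print(detect_emotion("I am so happy, this is great"))    # joy
print(detect_emotion("The weather report for tomorrow")) # neutral
```

The gap between this sketch and interpreting *intent* is exactly the hard part: intent requires context, history, and theory of mind, none of which a word count captures.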



