
100% Unemployment: on keeping busy when the robots take over - ggkrneqgpu
http://hyponymo.us/2013/01/29/100-percent-unemployment/
======
treelovinhippie
At my last job, something I tried to push unsuccessfully was a company-wide
automation initiative.

Imagine a company-wide incentive structure for automation whereby if you
automate 90-100% of your job, you receive massive bonuses. If you completely
automate your job, the bonus could be the entirety of your annual salary as a
single payout. Or perhaps a combination of bonus and equity.

So now you're out of a job. But if you can automate other jobs in the
organisation, you also get bonuses.

Eventually the majority of the company becomes automated, hopefully generating
the same or greater revenue and value for customers, with lower overhead costs.
Which is great for any stakeholders who can continue to receive revenues, but
terrible for employees. So it would be good to include some kind of equity
and/or revenue sharing model. The sell could then be "hey, let's all work
together to automate this company, we'll get bonuses for doing so, and when
the whole thing is automated we can continue to receive income while sitting
on the beach".
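A sketch of what such a payout rule might look like (the 90% threshold and the bonus-proportional-to-fraction rule are illustrative assumptions, not anything from an actual policy):

```python
def automation_bonus(annual_salary, automated_fraction):
    """Toy payout rule: no bonus below 90% automation; from 90% up,
    a bonus proportional to how much of the job was automated,
    reaching a full year's salary at 100%."""
    if automated_fraction < 0.9:
        return 0.0
    return annual_salary * automated_fraction

print(automation_bonus(80_000, 0.5))  # below the threshold -> 0.0
print(automation_bonus(80_000, 1.0))  # fully automated -> 80000.0
```

The equity/revenue-sharing part would sit on top of this one-off payout, and is much harder to reduce to a formula.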

~~~
TeMPOraL
This sounds like a cool idea - if and only if everyone coordinates on it
fairly, which seems extremely hard to achieve. I can see tons of caveats,
e.g.

- If you automate away your non-tech co-worker (someone who can't contribute
much to automating jobs other than his own), will he get the bonus? If so,
how do you ensure the people doing the automating won't feel treated unfairly
when the non-tech guy just got a bonus for doing nothing?

- If someone automates you away, will you get anything? Will you be angry?

- If you keep automating away your co-workers, who's gonna be left to help
you automate yourself away? (Assume for a moment that you have the most
difficult-to-automate job in the company.)

I mean... this could work if you'd manage to set up incentives so that
everyone in the company works together and not against each other. But I'm not
sure how to do this, especially in a way which doesn't leave anybody feeling
he's being treated unfairly.

~~~
redeemer
> If you automate away your non-tech co-worker ... will he get the bonus?

Yes, and you get nothing. It's fair; just don't automate away other people's
jobs, automate away your own.

> If someone automates you away, will you get anything?

Yes, 100% of the bonus. It's not a choice, you must leave. I still think
that's fair.

> If you keep automating away your co-workers, who's gonna be left to help you
> automate yourself away?

You don't automate away your co-workers unless you like doing free work.
Problem solved. If you have the most difficult-to-automate job, you entered
into it knowingly. The market will ensure those jobs are paid more, by supply
and demand.

This model doesn't require cooperation to work, just a relatively clear set of
job responsibilities and a clear definition of what it means for you to be
automated away.

I actually really love this idea, from an employer's perspective.

~~~
TeMPOraL
Ok, so how about when you're an accountant? Now your employer has a strong
incentive to pay someone extra to eliminate you, and you have no way of
automating yourself because you don't know squat about programming.

That's the kind of situation that needs to be handled.

~~~
dragonwriter
> and you have no way of automating yourself because you don't know squat
> about programming.

Automating existing work is more than just programming; a big part of it is
expertise provided by people knowledgeable about the existing workflow. None
of the projects I've been on that involved automating existing work involved
only programmers and no one with experience doing the work.

~~~
TeMPOraL
Yes, I know, and I acknowledged that fact in the post upthread. The point here
is, if you go with the suggestion of 'redeemer that "if you get automated
away, you get 0", said accountant will have no incentive to help with the
automation, and every incentive to make it as hard as possible - which I think
is not the result we want. To get a happy ending for everyone, we need
coordination, not competition.

~~~
redeemer
> "if you get automated away, you get 0"

No, I actually said the opposite.

If you get automated away, by yourself or by anyone else, you instantly get
the bonus and you're instantly fired.

The accountant has every incentive to either be an amazing accountant so that
any automation would be a pale imitation (everyone wins), or he can choose to
automate himself away, move to the next job, automate himself away and pick up
the bonus there, and so on (everyone gets their due).

This is competition, and I think it's a happy ending for meritocracy. The only
people who lose are those who desire to continue to be paid while not
generating value.

~~~
TeMPOraL
Oh, I'm sorry. After re-reading your comment for the third time I realized you
said exactly the opposite of what I thought.

------
norswap
> Instead, my job will be outsourced to a datacenter. Everything I do
> professionally, from reading and giving feedback on specs to estimating
> development schedules to architecting and typing in the code and writing
> unit tests to reviewing others' code, every one of these things will be
> subject to automation. We can't do it now, of course, not at any price.

The whole thing derives from flawed premises. Machines are incredibly dumb.
Even these machine-learning wizz-things. They can do statistics, inferences,
correlations. That's very, very far from enough, especially for something like
programming, which we can view as turning a spec into code. First, someone
still has to write the spec (so the "programming" isn't totally gone). Second,
it means you essentially need to have "solved" natural language processing, as
well as to have assigned semantic values to linguistic constructs. Good luck
with that.

At this point, I feel compelled to point out that humans don't agree on the
meanings of what they say. Most of the time, humans can't write software that
meets a spec. That's because of ambiguity, and machines are way way worse at
dealing with that than humans are.

I think the error is simply to think the current rate of progress will keep
going (and actually even the idea that there's been an acceleration of
progress in AI recently is an illusion -- what we're seeing is increased
application). It's the same error the fathers of AI made when they assumed
everything would be solved in a matter of years. That's where the expression
"AI winter" comes from.

~~~
TeMPOraL
First of all,

> _Machines are incredibly dumb. Even these machine-learning wizz-things. They
> can do statistics, inferences, correlations._

Do you think the human brain does something more? All _we_ can do is
statistics, inferences and correlations, and we do it poorly.

Moreover, there is an even bigger problem than 100% automation of everything -
namely, the automation of 50% of jobs. What are we going to do with half of
society unable to find any job? And that most likely includes your or my
parents, siblings and friends, who will no doubt come to us techies for
assistance. And if current trends continue, it's 10 years away, not 100.

~~~
dm2
Other technologies will make the need for jobs obsolete.

It's called a post-scarcity society and it must be achieved at the same time
or slightly before most jobs become automated.

A molecular re-arranger would be the holy grail of a post-scarcity society.

Also, you're not giving enough credit to the human brain. Whatever it does, we
can't even get close to replicating it with current computer technology.

Yes, researchers can emulate a bee or an ant with computers, but those
creatures don't have the ability to reason, come up with ideas, or dream (as
far as we know and can emulate).

------
api
If we dealt rationally and benevolently with it, 100% unemployment would be an
age of abundance... almost a post-scarcity society. I do not expect to see it
soon, but I think it's possible. If technology continues to advance at the
same pace it has for the past 50 years and if we can engineer our way out from
under the impending fossil fuel energy crunch, I'd say it's _probable_ within
the century.

Even before things get as extreme as effectively "100% unemployment," I could
certainly see a massive drop in the need for labor. We may already be seeing
the leading edge of this.

It wouldn't mean nobody would do any work, but it would mean that basically
everyone would be a trust fund brat and able to work on whatever they want.

Truly intelligent beings would do that.

But so far we aren't known for being that intelligent. We will probably
continue with ends-justify-the-means market fundamentalism, maybe with a bit
of ham-fisted wealth redistribution, done in an unfair and inefficient way,
thrown in to keep the masses from rioting too much.

~~~
TeMPOraL
Indeed. Our real problem is that of coordination, and that there is no easy
path from here to there - if we keep just following the market's lead hoping
it will magically solve everything, we're not going to get to that age of
abundance.

~~~
bryanlarsen
I think there is a (relatively) easy path from here to there. Let's presume
that we want to create a Star Trek post-scarcity future.

One theory I like for Star Trek economics is that they've got an _extremely_
high basic income, the equivalent of $10M/year or so. Energy costs about the
same (~10c/kwh), and is the limiting factor in replicator economics.

Some stuff is really expensive, like land and human labour. Replicator-made
material goods are incredibly cheap, unless you need a huge amount -- you
couldn't afford to buy a copy of the Enterprise.

Obviously you don't need to work, and most don't. A few are VR addicted, but
most are just dilettantes, they spend their time doing hobbies, art and
travel, much like the able bodied retired population in the western world. A
large minority get "real" jobs, but mainly for purpose and prestige. And some
do it to get rich -- they want to buy a planet or a starship or something.

The nice thing about this theory is that the path is quite clear: start with a
very small (possibly inadequate) basic income, and increase it regularly as
circumstances permit.
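The "increase it regularly" path can be put into rough numbers (every figure here is an assumption picked purely for illustration): a basic income starting at $12k/year and growing 7% per year compounds up to the $10M/year "Star Trek" level in roughly a century.

```python
import math

start = 12_000       # assumed starting basic income, $/year
target = 10_000_000  # the "Star Trek" level from above, $/year
growth = 1.07        # assumed 7% real growth per year

# Years until start * growth**years == target, from compound growth.
years = math.log(target / start) / math.log(growth)
print(f"about {years:.0f} years")  # roughly a century
```

Pick a different growth rate and the timeline shifts a lot; the point is only that a steady compounding increase gets there within a human-comprehensible timeframe.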

~~~
api
That's the only cogent explanation of how the Trek universe's economy might
actually function I've ever heard.

So huge goods (e.g. a starship) would be pricey, but everything else except
labor feels "free." That definitely describes how that society seems to
operate, and I could imagine it working if you had replicators,
matter-antimatter power generation, and other kinds of mega-technology.

~~~
TeMPOraL
I always thought it was the agreed-upon explanation ;). I.e. Star Trek's post-
scarcity society is enabled mostly thanks to the replicator technology. So
food, clothing and basic construction materials are essentially free and
unlimited. People do whatever they want - most civilians just go and live and
further their passions; some do work for others (e.g. Sisko's dad and his
restaurant), but they do this because they want to, not because they have to.
Many join Starfleet, which basically offers people grand goals and adventure,
and enforces structure and obedience based solely on non-monetary incentives.

Things like starships are pricey partially because they're huge (and thus
reaching the limit of civilization's energy output) and partially because some
things apparently just can't be replicated with their-era replicator
technology.

------
NhanH
Historically speaking, most philosophical questions never got answered by
philosophy. They just tend to become obsolete as science advances -- and I
consider any question regarding strong AI to be half philosophical by nature
(100% unemployment can only happen with strong AI; otherwise we'd spend the
last 0.0001% or whatever it is focusing on developing strong AI).

With that in mind, one note: from what I've gathered from actual machine
learning and AI researchers, anything that resembles strong AI is not just far
away, but ridiculously far away.

On a less tangential point: a lot of our economic system and its assumptions
are based on a scarcity of resources (human labor being one of those
scarcities). By the time automation could really take hold and realistically
take over most human work, it seems to me there are many things to be
concerned about other than employment, and they depend on whether the model of
the economy is one of scarcity or one of abundance, whether it's actually
sustainable, etc.

~~~
dm2
AI comes up a lot but in my opinion there is no guarantee that it will ever
actually exist.

AI is by definition artificial, and the lines between artificial and non-
artificial will be blurred very soon as BCIs come around.

If we take all deceased people's brains and throw them in a pool of goop that
allows them to talk to each other and connect them to a BCI so that they can
communicate with the rest of the world, is that Artificial Intelligence?

I predict that we will have ultra-intelligent humans/transhumans and advanced
AI might or might not ever come into existence.

As expected, DARPA is leading most of the BCI research programs at the moment.
I'm not sure how I feel about Super Soldiers and modifying humans, that's a
huge topic in itself.

------
leereeves
For anyone who hasn't seen this story:

[http://marshallbrain.com/manna1.htm](http://marshallbrain.com/manna1.htm)

~~~
blacksmith_tb
Also Iain Banks' Culture novels have this question as an occasional theme (and
the main themes are also very interesting, of course).

~~~
teraflop
There's an interesting little passage where the protagonist of _Use of
Weapons_, experiencing the Culture's society for the first time, meets a
sociologist who works part-time cleaning tables at a cafeteria -- even though
the job could easily be automated.

“I could try composing wonderful musical works, or day-long entertainment
epics, but what would that do? Give people pleasure? My wiping this table
gives me pleasure. And people come to a clean table, which gives them
pleasure. And anyway" - the man laughed - "people die; stars die; universes
die. What is any achievement, however great it was, once time itself is dead?
Of course, if all I did was wipe tables, then of course it would seem a mean
and despicable waste of my huge intellectual potential. But because I choose
to do it, it gives me pleasure. And," the man said with a smile, "it's a good
way of meeting people. So where are you from, anyway?”

------
6t6t6
That same concern existed 150 years ago, and the truth is that we are still
working the same number of hours as before. We just don't do the same things
for a living.

There will always be jobs to do. Most manufacturing and repetitive jobs can be
automated, but there will always be jobs that only humans can do. For
instance, food can be processed by a machine, but I prefer to pay to watch a
sushi master prepare each nigiri for me. Or, would someone prefer a massage
chair over getting that massage from a human? Music? Art? Design? All those
things can be done by machines, but we will always prefer to get them from a
human. Our lifestyle will change. Objects will lose value because they will be
extremely cheap to produce, and our jobs will be more focused on services to
other people.

What concerns me most is the inequality, and how the wealth produced by
machines will flow through society. With that level of automation in industry,
it will be really easy for a few people to control the production of all the
goods.

But this is something that has to be solved by politics.

------
hodwik
The human mind, while having less data storage and being drastically slower
for some types of problems, will stay significantly faster than computers at
other types of problems.

We merely must learn to better teach those brains (matrix upload?), extend
them (artificial memory enhancements and built in calculators) and utilize
them for the problem sets they excel at.

~~~
Osmium
> The human mind ... will stay significantly faster than computers at other
> types of problems

This is controversial. Semantic tagging of photographs used to be something
that was done much better by the human brain, but all of a sudden it seems
like computers are catching up fast. I hesitate to say there's any one domain
that we won't be able to tackle algorithmically/via machine learning in some
way.

> extend them (artificial memory enhancements and built in calculators)

My phone is already my artificial memory enhancement. Communication is via a
high latency, unreliable interface, but it's (for all intents and purposes) an
extension of my brain ;)

~~~
hodwik
Absolutely, technology of some kind will be able to replicate all human
domains of intelligence at some distant point in the future, but our own minds
will be continually extended (with lower and lower latency, and increasingly
reliable interfaces).

The question becomes moot as humans and machine coalesce.

------
javajosh
I'm happy to see people thinking about this sort of stuff.

I foresee a Constitutional Amendment limiting the amount of computational
power any private person or entity can own. Each person, upon birth, would be
granted a certain allotment of computational capacity. This computational
capacity can be hired out, but you collect its paycheck. Taken to the logical
extreme, everyone could own the robot that replaces them at their job, and
live out their life collecting its paycheck.

Check out Marshall Brain's "Manna", where the concentration-of-wealth
nightmare scenario is taken to its logical extreme. (I don't find the scenario
very convincing, but the thought experiment that gets it there is useful.)

[http://marshallbrain.com/manna1.htm](http://marshallbrain.com/manna1.htm)

------
patmcc
I really can't imagine 100% unemployment, because I think there will always be
a demand for actual human people in some roles: hairdresser, waiter, singer,
prostitute, masseuse, athlete, actor, croupier, etc.

Will robots/AI be good enough to do all these jobs? Sure, they might even be
now. But as long as some people want to be sung to / served by / entertained
by an actual flesh-and-blood human, there will still be employment in those
sectors. In fact I wouldn't be surprised in the far future if those service
positions were the most valuable in society, since that's where real scarcity
will be.

~~~
pornel
Maybe some of these jobs won't be replaced with a robot, but will become
completely irrelevant/unnecessary?

Instead of having a robot hairdresser, what if cutting hair wasn't even a
thing? e.g. maybe we'll all get holo-wigs that let us change hairstyles as
easily as profile pictures.

Similarly, we don't talk to switchboard operators to enter a URL in the
browser address bar. We've got AI that does it better and doesn't judge :)

~~~
TeMPOraL
Exactly. And yes, in principle we could keep inventing completely random and
absurd jobs just to keep people busy earning a living - but then, what's the
point? Nowadays we have to work _because_ resources are scarce. If we reach
the point of post-scarcity, forcing people to slave away their lives for food
and shelter seems just evil.

~~~
patmcc
I wasn't suggesting we invent jobs to keep people busy - I think we should
strive towards employment being required for no one. Once food and material
goods are post-scarcity, absolutely they should be provided to all.

But as long as anything is scarce (and experiences are, broadly, and I don't
think there's any way around that), we'll have something like "employment". It
might start at "I'll sing you a song if you massage my back" barter-style, but
that evolves into currency eventually.

~~~
TeMPOraL
Ok, maybe you weren't suggesting that, but I see this tossed around quite
often - that people will always find new jobs and _therefore_ we can keep
requiring everyone to work for sustenance. And we're inventing many bullshit
jobs right now - to not look too far, half the work done in web development
probably contributes to nothing but the zero-sum game of companies trying to
out-advertise one another.

I agree - as long as resources required for basic survival are scarce, there
will be employment. But the point is, we shouldn't try to maintain the status
quo as long as possible; we should jump to letting people live without a job
the first moment this becomes feasible.

------
netvisao
Sometimes it is forgotten that what is of 'value' is what is inherently
scarcer. Today, it just happens that there is a shortage of automating agents
(programmers, AI scientists, etc.); once these opportunities (STEM) become
common, they won't hold much value, and humans will value whatever the next
scarce thing is (handicrafts? the arts? The future will tell).

We also need to remind ourselves that the rich cannot feel rich without an
army of poor underneath them. I can guarantee that having the very rich
shielded from the rest is not actually an evolution of our race.

------
Havoc
This whole thing - I can see it coming and am fully convinced that it will
happen. I just can't work out how to leverage that insight for personal
benefit (not millions, just a modest advantage). It feels a little weird: for
once I'm ahead of the curve and recognise it as such, but I still can't seem
to leverage it. My profession isn't anywhere near AI or even tech, so it's a
little difficult.

~~~
TeMPOraL
That's exactly what I feel too. Having this knowledge lets me do little more
than tell people that the end is nigh. I don't see a way to either leverage
that knowledge to safeguard myself and my friends/family, or (preferably) do
something to safeguard society at large. It's easy for me now to notice when
people around me, or even I myself, do Moloch's bidding, but I can't see a way
to contribute to the fight against it.

------
lowbloodsugar
"You and I and everybody else, if the present trends continue, will be selling
what we do to the highest bidder."

That's how it's supposed to work. The problem here is not "selling what we do
to the highest bidder", it's the "You and I and everyone else". If you are
competing against "everyone else", then the highest bid is going to be pretty
low.

------
slagfart
The article alludes to one core bastion of economics - "the invisible hand",
without mentioning another - "as humans, we are creatures of infinite wants".

With that in mind, a computer can never truly replace a human, as there would
never be a purpose in creating a machine that 'wants' things different from
what its owner wants. Along the same lines, as robots fulfil each current
human want, new wants will simply emerge.

A trite example - a robot that cleans my floor satisfies a current want, but I
want a robot that cleans my whole house - every surface. After that's
fulfilled, I'm likely to want a robot that can reconfigure the house based on
my predicted needs, and after that, a robot that works in conjunction with
others to make sure my preferred dwelling configuration is available in the
right place at the right time every night. I'm likely to keep working my job
to have this, and my career will no doubt evolve to reflect the collective
need people have for these robots.

The selfless robot in Interstellar is a realistic scenario - incredibly
helpful, but programmed to want what its owners want. Imagine endless
configurations of that - I think it's a likely future.

'Stupid' machines have been replacing humans since the industrial revolution,
but humans have just taken a position higher up the chain. The article claims
that 'smart' machines will end this process - I disagree. The greed of human
nature will mean we always have to work.

Edited for phrasing.

~~~
oconnore
> I'm likely to keep working in my job to have this

You are assuming that humans will remain better at developing towards the
remaining non-automated desires of humans. That is unlikely. When a robot is
able to reconfigure your house to meet predicted desires, you will probably
(and most people will certainly) no longer be relevant in any job that you (or
they) are capable of performing.

Anyway, that doesn't matter. What matters is that even if you are a super
elite engineer who can compete a few years longer than the average Joe, that
doesn't fix the problem for the >50% of the population that is already being
rapidly made obsolete.

------
sparaker
I think there has to be some kind of massive leap before we get to that state;
regardless, we'll get creative freedom to do whatever we please and to excel
in things we might not have been interested in earlier because they had no
"economic value".

Obviously it's going to be a journey before the masses get used to not
working, having worked all their lives.

------
moe
Obligatory CGP Gray (15 min video):
[https://www.youtube.com/watch?v=7Pq-S557XQU](https://www.youtube.com/watch?v=7Pq-S557XQU)

------
jtwebman
I have dreamed of building a program that learns to program. Maybe I need to
look into it again.

~~~
TeMPOraL
There's that old joke about a Lisp programmer who's looking for a job - "will
write code that writes code that writes code that writes code for money" ;).
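In the same spirit, a toy illustration of "code that writes code" (in Python rather than Lisp, and purely for flavor): generate the source of a function as text, then compile it into a real callable.

```python
# Generate the source text for a simple adder function...
def make_adder_source(n):
    return f"def add_{n}(x):\n    return x + {n}\n"

# ...then turn the generated text into an actual, callable function.
namespace = {}
exec(make_adder_source(5), namespace)  # defines add_5 in `namespace`
print(namespace["add_5"](10))          # -> 15
```

A Lisp macro does this more elegantly, of course - the joke only works because in Lisp, code-generating code is the normal way of working.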

------
cranklin
Too much credit given to "robots". I see this coming from Hollywood, not
Hacker News.

~~~
TeMPOraL
Google has cars that are about to kill the entire transportation industry.
Amazon has already reduced the workers in some of its warehouses to
biomechanical grabbing hands; all higher-order functions are performed by
robots. No, I don't think we're giving too much credit to robots.

~~~
cranklin
Autonomous vehicles and biomechanical grabbing hands are completely different
from engineering software. The aforementioned tasks, while impressive feats
(especially in computer vision), are programmable tasks. Creating "robots"
that can engineer new software without a software engineer is not going to
happen in this lifetime.

