Imagine a company-wide incentive structure for automation whereby if you automate 90-100% of your job, you receive massive bonuses. If you completely automate your job, the bonus could be the entirety of your annual salary as a single payout. Or perhaps a combination of bonus and equity.
So now you're out of a job. But if you can automate other jobs in the organisation, you also get bonuses.
Eventually the majority of the company becomes automated, hopefully generating the same or greater revenue and value for customers, with lower overhead. That's great for any stakeholders who can continue to receive revenues, but terrible for employees. So it would be good to include some kind of equity and/or revenue-sharing model. The sell could then be "hey, let's all work together to automate this company, we'll get bonuses for doing so, and when the whole thing is automated we can continue to receive income while sitting on the beach".
- If you automate away your non-tech co-worker (someone who can't contribute much to the automation of jobs other than his own), will he get the bonus? If so, how do you ensure the people doing the automating won't feel treated unfairly, since the non-tech guy just got a bonus for doing nothing?
- If someone automates you away, will you get anything? Will you be angry?
- If you keep automating away your co-workers, who's gonna be left to help you automate yourself away? (assume for a moment that you have the most difficult to automate job in the company)
I mean... this could work if you managed to set up incentives so that everyone in the company works together and not against each other. But I'm not sure how to do this, especially in a way that doesn't leave anybody feeling they're being treated unfairly.
Yes, and you get nothing. It's fair; just don't automate away other people's jobs, automate away your own.
> If someone automates you away, will you get anything?
Yes, 100% of the bonus. It's not a choice, you must leave. I still think that's fair.
> If you keep automating away your co-workers, who's gonna be left to help you automate yourself away?
You don't automate away your co-workers unless you like doing free work. Problem solved. If you have the most difficult to automate away job, you entered into it knowingly. The market will ensure that those jobs will be higher paid by supply and demand.
This model doesn't require cooperation to work, just a relatively clear set of job responsibilities and a clear definition of what it means for you to be automated away.
I actually really love this idea, from an employer's perspective.
This is why IT is considered white collar (especially software developers and engineers)... It's our job to automate people out of jobs... which in turn churns out the need for jobs developing ever more complex non-human systems. Often badly written ones.
We'll never reach greater than, say, 60% unemployment... there's just too much crap to shovel (even in automation).
That's the kind of situation that needs to be handled.
Automating existing work is more than just programming; a big part of it is expertise provided by people knowledgeable about the existing workflow. None of the projects I've been on that involved automating existing work involved only programmers and no one with experience doing the work.
If you go with a principle that anyone involved in automating a job away gets some reward, the accountant has some incentive to help. That's especially true if the automation is likely to happen anyway, perhaps with lesser quality but still a net cost savings to the company. In that case, not only would the company get less than if the accountant were actively involved, the accountant would get less as well.
No, I actually said the opposite.
If you get automated away, by yourself or by anyone else, you instantly get the bonus and you're instantly fired.
The accountant has every incentive to either be an amazing accountant so that any automation would be a pale imitation (everyone wins), or he can choose to automate himself away, move to the next job, automate himself away and pick up the bonus there, and so on (everyone gets their due).
This is competition, and I think it's a happy ending for meritocracy. The only people who lose are those who desire to continue to be paid while not generating value.
Of course this does mean a lot of people out of jobs, which is the crux of the article, but I don't think it's a fault or weak link in this employment model.
The basic, not fully thought out idea, was that if you automated your own job you got the full bonus. If you automate another person's job, you both split the bonus 50/50.
I think with most jobs, automation would definitely involve programming and tech skills. But there would be a lot of non-tech tasks as well, such as thoroughly understanding their workflow and processes and identifying more efficient practices.
"Automation" can also be a combination of computer and human outsourcing too.
In the small web dev company I was working for (launching internal startups), the "creative process" of designing website mockups was incredibly time consuming. I suggested we collect all of the customer requirements and collateral, and automatically outsource the mockups via API to something like MobileWorks/Freelancer/Elance/99Designs/DesignCrowd, etc.
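A sketch of what that pipeline might look like. Note that the client function, endpoint, and field names here are all hypothetical, invented purely for illustration; none of the named services' actual APIs are shown:

```python
# Hypothetical sketch: bundle collected customer requirements into a
# structured design brief and submit it to an outsourcing marketplace.
# All names (build_design_brief, submit_brief, the endpoint path and
# payload fields) are invented for illustration only.

def build_design_brief(customer, requirements, collateral_urls, budget_usd):
    """Collect everything a designer needs into one structured brief."""
    return {
        "title": f"Website mockup for {customer}",
        "description": "\n".join(f"- {r}" for r in requirements),
        "attachments": list(collateral_urls),
        "budget_usd": budget_usd,
        "deliverable": "homepage + 2 inner-page mockups",
    }

def submit_brief(brief, api_post=None):
    """Submit the brief via an injected transport (e.g. an HTTP POST
    wrapper for whichever marketplace). Defaults to a dry run."""
    if api_post is None:
        return {"status": "dry-run", "brief": brief}
    return api_post("/v1/design-briefs", brief)  # hypothetical endpoint
```

The point of the injected `api_post` is that the same brief-building step could fan out to several marketplaces and take the first or best result.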
How do you compensate those who lose their jobs as a result? What if they were also trying, but just happened to not be first to succeed?
It's a really cool idea though. (As a manufacturing engineer in a fairly old-fashioned company, I am currently trying to automate as much as possible; no idea yet about individual bonuses since I'm <1 year on the job)
The original thought was that if you fully automate your own job you get your salary as a bonus. But if you fully automate someone else's job, you both split their salary 50/50 as a bonus.
Now both of you are free to work on automating other jobs in the org. So you'd end up with various teams working together to automate jobs in the org.
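The rules sketched above fit in a few lines. A minimal sketch, assuming the commenter's hypothetical terms (100% of salary for automating your own job, a 50/50 split of the target's salary otherwise):

```python
# Hypothetical bonus rules from the comment above -- not a real policy:
# - automate your own job: you receive 100% of your salary as a bonus
# - automate a co-worker's job: you and they split their salary 50/50

def automation_bonus(automator, target, target_salary):
    """Return {person: bonus} for one fully automated job."""
    if automator == target:
        return {automator: target_salary}
    return {automator: target_salary / 2, target: target_salary / 2}
```

Writing it out makes the incentive gap obvious: automating your own job pays exactly twice as much per salary-dollar as automating someone else's.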
As a byproduct, it would also free up a lot of human capital in an organisation to focus on pure R&D and intrapreneurship.
And then hire you back on a $5,000 1099 contract to automate other people's jobs?
You seem to be trying to come up with some sort of "fair" system, as if that were how compensation works now or ever will work.
I know you meant this as a joke, but I want to emphasize a general point: one of the things that totally dumbs down conversations about the economy is people grouping ideas under labels like "communist" or "libertarian" or whatever and evaluating them in batches.
If done in good faith, I think it's legitimate to start with approximate labels, like "libertarian", and the arguing parties can then proceed to make their position clearer if necessary. Hopefully, this saves time.
Also, I think it's fair to describe someone who believes that "workers should own the means of production" as a Marxist (at least it's more accurate than socialist or communist).
The founders, directors and investors could still own 51% of the company.
Heck, if they wanted they could offer only the bonus payout incentive structure with no equity or revenue share. In essence that would still encourage employees to automate the company. Employees work themselves out of a job, and the owners reap all of the automated revenues.
Or you could not be a dick.
At a company I worked for, after they realized that things couldn't go on anymore without massive automation, their first approach was to introduce it into the infrastructure lifecycle and require all new infrastructure to be heavily automated. That allowed the legacy stuff to die slowly without much disturbance.
An entirely different idea would be to spin off a subsidiary with those ideals and slowly migrate the work there. Perhaps too much work, but if the situation is desperate enough...
But there must be a particular type of company, at a particular scale, with a particular ethos out there that would be up for giving an automation initiative a shot.
If it worked, it would make for an impressive news story which could spark others to follow suit.
Imagine an economy where everyone's job was to automate their job...
The whole thing derives from these flawed premises. Machines are incredibly dumb. Even these machine-learning whiz-things. They can do statistics, inferences, correlations. That's very, very far from enough, especially for something like programming, which we can see as turning a spec into code. First, one still has to write a spec (so the "programming" isn't totally gone). Second, it means you essentially need to have "solved" natural language processing, as well as have assigned semantic values to linguistic constructs. Good luck with that.
At this point, I feel compelled to point out that humans don't agree on the meanings of what they say. Most of the time, humans can't write software that meets a spec. That's because of ambiguity, and machines are way way worse at dealing with that than humans are.
I think the error is simply to think the current rate of progress will keep going (and actually even the idea that there's been an acceleration of progress in AI recently is an illusion -- what we're seeing is increased application). It's the same error the fathers of AI made when they assumed everything would be solved in a matter of years. That's where the expression "AI winter" comes from.
> Machines are incredibly dumb. Even these machine-learning wizz-things. They can do statistics, inferences, correlations.
Do you think the human brain does something more? All we can do is statistics, inferences and correlations, and we do it poorly.
Moreover, there is an even bigger problem than 100% automation of everything - namely, the automation of 50% of jobs. What are we going to do with half of society being unable to find any job? And that most likely includes your or my parents, siblings and friends, who will no doubt come to us techies for assistance. And if current trends continue, it's 10 years away, not 100.
It's called a post-scarcity society and it must be achieved at the same time or slightly before most jobs become automated.
A molecular re-arranger would be the holy grail of a post-scarcity society.
Also, you're not giving enough credit to the human brain. Whatever it does, we can't even get close to replicating it with current computer technology.
Yes, we can emulate a bee or an ant with computers, but those creatures don't have the ability to reason, come up with ideas, or dream (as far as we know).
I think you can also state that machines generally lack the ability to discern expansive context unless explicitly trained to do so.
Granted, it will happen. True AI requires these things. At present all ML needs a push from a human to become more "intelligent."
The bigger question is can we as a species evolve into a more altruistic species and move beyond capital. Some days I don't think we can...
Languages, debugging tools, version control systems and bug trackers are all rapidly reducing the number of developer-hours required to implement a given program, which we should expect to reduce the size of the programmer workforce eventually if we can't find new things for them to do.
It is possible that we will find enough new problems for programmers to solve to counteract increased productivity, but for many applications, most of the general problems that one would hire a programmer to do already have reasonable-quality solutions. I think we'll see smaller and smaller teams of hyper-productive developers working on a shrinking set of increasingly esoteric projects, or things with a single specialized use (like calculating taxes for that particular year).
Even before things get as extreme as effectively "100% unemployment," I could certainly see a massive drop in the need for labor. We may already be seeing the leading edge of this.
It wouldn't mean nobody would do any work, but it would mean that basically everyone would be a trust fund brat and able to work on whatever they want.
Truly intelligent beings would do that.
But so far we aren't known for being that intelligent. We will probably continue with means-justifies-the-end market fundamentalism, maybe with a bit of ham-fisted wealth redistribution done in an unfair and inefficient way thrown in to keep the masses from rioting too much.
One theory I like for Star Trek economics is that they've got an extremely high basic income, the equivalent of $10M/year or so. Energy costs about the same as today (~10c/kWh), and is the limiting factor in replicator economics.
Some stuff is really expensive, like land and human labour. Replicator-made material goods are incredibly cheap, unless you need a huge amount -- you couldn't afford to buy a copy of the Enterprise.
Obviously you don't need to work, and most don't. A few are VR addicted, but most are just dilettantes, they spend their time doing hobbies, art and travel, much like the able bodied retired population in the western world. A large minority get "real" jobs, but mainly for purpose and prestige. And some do it to get rich -- they want to buy a planet or a starship or something.
The nice thing about this theory is that the path is quite clear: start with a very small (possibly inadequate) basic income, and increase it regularly as circumstances permit.
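For a sense of scale, here's the back-of-envelope arithmetic implied by the figures above (the commenter's own $10M/year and ~10c/kWh, nothing canonical):

```python
# Back-of-envelope for the "extremely high basic income" theory above,
# using the commenter's figures: $10M/year income, energy at ~10c/kWh.

BASIC_INCOME_USD = 10_000_000
PRICE_PER_KWH = 0.10
HOURS_PER_YEAR = 365 * 24  # 8760

kwh_per_year = BASIC_INCOME_USD / PRICE_PER_KWH  # energy budget per person
avg_power_kw = kwh_per_year / HOURS_PER_YEAR     # continuous draw if spent evenly
```

That works out to 100 million kWh per year, or roughly 11 MW of continuous draw per citizen -- huge by today's standards, yet still small enough that replicating something starship-sized stays out of reach, which matches the "huge goods are pricey" intuition.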
So huge goods (e.g. a starship) would be pricey, but everything else except labor feels "free." Definitely describes how that society seems to operate, and I could imagine it working if you had replicators, matter-antimatter power generation, and other kinds of mega-technology.
Things like starships are pricey partially because they're huge (and thus reaching the limit of civilization's energy output) and partially because some things apparently just can't be replicated with their-era replicator technology.
I don't think it works that way. We get the future we construct. If we just trust the means to produce the ends, we will just get some random future selected for us by impersonal and amoral evolutionary and economic forces that don't give a damn about our well being.
With that in mind, one note: from what I've gathered from actual machine learning and AI researchers, anything that resembles strong AI is not just far away, but ridiculously far away.
On a less tangential point: a lot of our economic system and its assumptions are based on scarcity of resources (human labor being one of them). By the time automation could really take hold and realistically take over most human work, it seems to me there will be many things to be concerned about other than employment, and they depend on whether the economy runs on a model of scarcity or abundance, whether it's actually sustainable, etc.
AI is by definition artificial, and the lines between artificial and non-artificial will be blurred very soon as BCIs come around.
If we take all deceased people's brains and throw them in a pool of goop that allows them to talk to each other and connect them to a BCI so that they can communicate with the rest of the world, is that Artificial Intelligence?
I predict that we will have ultra-intelligent humans/transhumans and advanced AI might or might not ever come into existence.
As expected, DARPA is leading most of the BCI research programs at the moment. I'm not sure how I feel about Super Soldiers and modifying humans, that's a huge topic in itself.
“I could try composing wonderful musical works, or day-long entertainment epics, but what would that do? Give people pleasure? My wiping this table gives me pleasure. And people come to a clean table, which gives them pleasure. And anyway" - the man laughed - "people die; stars die; universes die. What is any achievement, however great it was, once time itself is dead? Of course, if all I did was wipe tables, then of course it would seem a mean and despicable waste of my huge intellectual potential. But because I choose to do it, it gives me pleasure. And," the man said with a smile, "it's a good way of meeting people. So where are you from, anyway?”
There will always be jobs to do. Most manufacturing and repetitive jobs can be automated, but there will always be jobs that only humans can do. For instance, food can be processed by a machine, but I prefer to pay to see how a sushi master prepares each nigiri for me. Or, would someone prefer a massage chair rather than getting that massage from a human? Music? Arts? Design? All those things can be done by machines, but we will always prefer to get them from a human. Our lifestyle will change. Objects will lose value because they will be extremely cheap to produce, and our jobs will be more focused on services to other people.
What concerns me most is the inequality and how the wealth produced by machines will flow through society. With that level of automation in industry, it will be really easy for a few people to control the production of all the goods.
But this is something that has to be solved by politics.
We merely must learn to better teach those brains (matrix upload?), extend them (artificial memory enhancements and built in calculators) and utilize them for the problem sets they excel at.
This is controversial. Semantic tagging of photographs used to be something that was done much better by the human brain, but all of a sudden it seems like computers are catching up fast. I hesitate to say there's any one domain that we won't be able to tackle algorithmically/via machine learning in some way.
> extend them (artificial memory enhancements and built in calculators)
My phone is already my artificial memory enhancement. Communication is via a high latency, unreliable interface, but it's (for all intents and purposes) an extension of my brain ;)
The question becomes moot as humans and machine coalesce.
I foresee a Constitutional Amendment limiting the amount of computational power any private person or entity can own. Each person, upon birth, would be granted a certain allotment of computational capacity. This computational capacity can be hired out, but you collect its paycheck. If you take this to the logical extreme, everyone could own the robot that replaces them at their job, and live out their life collecting its paycheck.
Check out Marshall Brain's "Manna", where the concentration-of-wealth nightmare scenario is taken to its logical extreme. (I don't find the scenario very convincing, but the thought experiment to get there was useful.)
Will robots/AI be good enough to do all these jobs? Sure, they might even be now. But as long as some people want to be sung to / served by / entertained by an actual flesh-and-blood human, there will still be employment in those sectors. In fact I wouldn't be surprised in the far future if those service positions were the most valuable in society, since that's where real scarcity will be.
Instead of having a robot hairdresser, what if cutting hair wasn't even a thing? e.g. maybe we'll all get holo-wigs that let us change hairstyles as easily as profile pictures.
Similarly we don't talk to switchboard operators to enter URL in the browser addressbar. We've got AI that does it better and doesn't judge :)
But as long as anything is scarce (and experiences are, broadly, and I don't think there's any way around that), we'll have something like "employment". It might start at "I'll sing you a song if you massage my back" barter-style, but that evolves into currency eventually.
I agree - as long as resources required for basic survival are scarce, there will be employment. But the point is, we shouldn't try to maintain the status quo as long as possible; we should jump to letting people live without a job the first moment this becomes feasible.
If labor isn't scarce (e.g., if there are easy substitutes that provide the utility provided by labor), then there's no real basis for employment even if there is scarcity in goods.
Scarcity guarantees that people will have an incentive to sacrifice to get scarce goods, but if labor isn't scarce, then sacrificing time in the form of labor isn't likely to be an option to do that.
But I think the future for humans looks somewhat similar to that of horses today; a small number of us will still be "working", but more because of individual preferences than any kind of need.
prostitute: soon (Fleshlight + Oculus)
actor: there are tons of CGI actors in movies
croupier: slot machines, video poker, etc
Take croupiers; we have very good video poker, automated roulette tables, virtual blackjack, but still lots of humans dealing out cards (especially in the high roller rooms). I was at a casino in Las Vegas two weeks ago; the robot roulette gave strictly better payouts (36 to 1) than the human-run table (35 to 1) for single number bets, but there were still people at the tables.
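To make that payout gap concrete, here's the expected-value arithmetic, assuming an American double-zero wheel (38 slots), which is standard in Las Vegas:

```python
# Expected value per 1-unit straight-up (single number) roulette bet.
# Assumes an American double-zero wheel: 38 slots, so win probability 1/38.

def single_number_ev(payout_to_one, slots=38):
    """EV of a 1-unit bet: a win pays `payout_to_one`, otherwise
    the unit is lost."""
    win_p = 1 / slots
    return win_p * payout_to_one - (1 - win_p) * 1

human_table_ev = single_number_ev(35)  # 35-to-1 payout
robot_table_ev = single_number_ev(36)  # 36-to-1 payout
```

At 35:1 the player loses 2/38 of each bet on average (about 5.3%), while at 36:1 the loss is only 1/38 (about 2.6%) -- the robot table literally halves the house edge, and people still chose the human dealer.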
We also need to remind ourselves that the rich cannot feel rich without an army of poor underneath them. I can guarantee that having the very rich shielded from the rest is not actually an evolution of our race.
That's how it's supposed to work. The problem here is not "selling what we do to the highest bidder", it's the "you and I and everyone else". If you are competing against "everyone else", then the highest bid is going to be pretty low.
With that in mind, a computer can never truly replace a human, as there would never be a purpose in creating a machine that 'wants' things that are different to its owner. Along the same lines, as robots fulfil each current human want, new wants will simply emerge.
A trite example - A robot that cleans my floor satisfies a current want, but I want a robot that cleans my whole house - every surface. After that's fulfilled, I'm likely to want a robot that can reconfigure the house based on my predicted needs, and after that, a robot that works in conjunction with others to make sure my preferred dwelling config is available in the right place and the right time every night. I'm likely to keep working in my job to have this, and my career will no doubt evolve to reflect the group need that people have for these robots.
The selfless robot in Interstellar is a realistic scenario - incredibly helpful, but programmed to want what its owners want. Imagine endless configurations of that - I think it's a likely future.
'Stupid' machines have been replacing humans since the industrial revolution, but humans have just taken a position higher up the chain. The article claims that 'smart' machines will end this process - I disagree. The greed of human nature will mean we always have to work.
Edited for phrasing.
You are assuming that humans will remain better at developing towards the remaining non-automated desires of humans. That is unlikely. When a robot is able to reconfigure your house to meet predicted desires, you will probably (and most people will certainly) no longer be relevant in any job that you (or they) are capable of performing.
Anyway, that doesn't matter. What matters is that even if you are a super elite engineer that can compete a few years longer than the average joe, that doesn't fix the problem for some % >50 of the population that are already being rapidly obsoleted.
The question is what happens when we approach the end of that chain.
Obviously its going to be a journey before the masses will get used to not working as they have been all their lives.
Also, a lot of those jobs disappeared forever in the third world and never came back. Ag jobs were never really replaced with manufacturing jobs. Only now are some third-world countries getting manufacturing jobs. If those are replaced with automated manufacturing, the coding, engineering, and management jobs will go to the first world.
So maybe it works out for America and Europe. But someplace like Indonesia and India get permafucked yet again by a new industrial revolution.
The automation that is happening now has never happened in the history of the human race. You simply cannot extrapolate from past events and say it'll be like that again.
This is a serious problem, because the more physical wealth can be provided by robots, the more those status games necessarily become about obtaining power over other humans. This puts capitalism, as an enabler of this kind of status games, on a path to escalating conflict not just with democracy but with human rights.
Of course, many would say that this is already happening. For now, the conflict is simply at a level where human rights violations and everything that goes with it can still be exported to third world countries.
For example, you will find countless people to help dig various archeological sites, help process the various museum collections still hidden in boxes, ...
Also, "historically" is misleading. Historically, the industrial revolution was one of the worst periods to live through; the conversion of those 90% to industry was not a happy story. And the post-war golden age came after the worst wars ever.
We are also dealing with some hard limits this time. On the production side, we are hitting Earth's physical limits on just about everything. Even on the consumer side, we are hitting limits: even for virtual activities, there are only 24 hours in a day for you to consume.
History is good for reminding you to be optimistic. However, history is not a solution: some guy in 100 years may look back at this period as another great time in human history, but it's down to us to find the details of how to achieve it.
Of course, it does this gradually. So you see low-skilled jobs disappearing, and no useful jobs being created in their place. As this process continues, I expect we'll be facing a growing number of people trapped in poverty who have no means or ability to retrain themselves for jobs with higher skill requirements.