Will Robots Take My Job? (willrobotstakemyjob.com)
79 points by iisbum on May 30, 2017 | 52 comments



Software Developers is at 4%, but Computer Programmers is more than 10 times higher. Given that these titles happen to be used interchangeably, I take those numbers with a grain of salt.


A brief dig through different occupations reveals that much if not all of this is made up garbage not worthy of rational consideration.

Technical writers are 89%, maids and housekeepers are 69%, but "commercial pilots" are 55% and "airline pilots, copilots" are only 18%? Hah. Sure.

30% for "Zoologists and Wildlife Biologists"?

20% for "Funeral Service Managers, Directors, Morticians, and Undertakers"?

34% for "Detectives and Criminal Investigators"?

28% for "Athletes and Sports Competitors"?

What?


Musical Instrument Repairers and Tuners are 91% "You are doomed". Off-the-shelf instruments are notoriously badly set up, and that'd probably be the kind of "good enough" level that robots could get to. For anything more organic you'd need a human luthier or whatever. After all a human is going to play the instrument, not a robot, given musicians themselves are at a 4% "totally safe" level.

Barbers are at 80%. The amount of R&D that would be needed couldn't be justified by the low-waged worker the robot would replace ($24k avg, apparently). There's just so much hidden complexity in all these fine-motor-skill jobs. It reminds me of those old physics problems: "assuming a perfectly spherical head...". Nice idea, but in practice the incentives to create a robot barber just aren't there.


> Musical Instrument Repairers and Tuners are 91%

I guess it's a question of how to interpret the numbers. Does it mean that there is a 91% chance that no one will work as a Musical Instrument Repairer and Tuner, or that we will only need 9% of the current Musical Instrument Repairers and Tuners? The former sounds very unlikely, while the latter sounds quite reasonable.


1.2% for Pharmacists, 95% for Landscaping and Groundskeeping Workers.


Only 0.4% for physicians?


Reading the paper it's based on (Carl Benedikt Frey and Michael A. Osborne, Oxford Martin School) is pretty funny.

It starts with them fawning over Siri, claiming that this is somehow evidence that in the future we're going to stop wanting to talk to actual humans when we call up Comcast to complain about their billing mistakes:

"Advances in user interfaces also enable computers to respond directly to a wider range of human requests, thus augmenting the work of highly skilled labour, while allowing some types of jobs to become fully automated. For example, Apple’s Siri and Google Now rely on natural user interfaces to recognise spoken words, interpret their meanings, and act on them accordingly. Moreover, a company called SmartAction now pro-vides call computerisation solutions that use ml technology and advanced speech recognition to improve upon conventional interactive voice response systems, realising cost savings of 60 to 80 percent over an outsourced call center."

Then they show the same amnesiac fawning admiration the financial press does when reporting what Foxconn says:

"Foxconn, a Chinese contract manufacturer that employs 1.2 million workers, is now investing in robots to assemble products such as the Apple iPhone."

Yes, in 2011 they started making threats to their employees about replacing a million or so workers within 3 years with some kind of super-robot:

http://spectrum.ieee.org/automaton/robotics/industrial-robot...

The bizarre employee union dance party threat didn't exactly pan out. They employ 100,000 more people now than they did then.

The core of this paper's methodology is "we asked a bunch of ML researchers how good they thought AI was and they replied "really, really good":

"First, together with a group of ml researchers, we subjectively hand-labelled 70 occupations, assigning 1 if automatable, and 0 if not."

Real sciencey.


It seems to be using BLS categories, which split what you would call a "Software Developer" into three categories: "Software Developers, Systems Software", "Software Developers, Applications" and "Computer Programmers". The "Computer Programmers" category includes only the most routine of the tasks that software developers would complete, so it makes sense that it would be more likely to be automated. It is also only 20% as common as the other two categories, so you may never even have met someone who is classified as a "Computer Programmer".


They are strange, but the numbers come straight from the original paper. See the appendix starting on pg 62: http://www.oxfordmartin.ox.ac.uk/downloads/academic/future-o...


The original paper (http://www.oxfordmartin.ox.ac.uk/downloads/academic/future-o...) derives these numbers in the following way:

1) A selection of 70 occupations is hand-labeled as "automatable" or "not automatable" by the authors and a group of ML researchers.

2) Three statistical models are trained using nine variables (Finger Dexterity, Manual Dexterity, Cramped Work Space/Awkward Positions, Originality, Fine Arts, Social Perceptiveness, Negotiation, Persuasion, and Assisting and Caring for Others) taken from O*NET (https://www.onetonline.org) survey data for each of 702 occupations. The models are graded by how well they match the hand labels from step 1. The best model of the three is chosen. (A rough code sketch of this pipeline follows the list.)

3) The numbers generated by this model are then reported for all 702 occupations. This includes the ones that were labeled to begin with: "We implicitly assumed that our hand label, y, is a noise-corrupted version of the unknown true label, z. Our motivation is that our hand-labels of computerisability must necessarily be treated as such noisy measurements. We thus acknowledge that it is by no means certain that a job is computerisable given our labelling."
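For anyone curious what that pipeline looks like as code, here is a minimal sketch. It is not the authors' code: the feature values and hand labels below are invented, and while the paper fits a Gaussian process classifier to O*NET features, the particular kernel and settings here are my own assumptions.

    # Sketch of the paper's pipeline on made-up data (not the authors' code).
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessClassifier
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    # 702 occupations x 9 O*NET variables (dexterity, originality, ...), invented here
    X_all = rng.random((702, 9))

    # Step 1: 70 occupations hand-labelled 1 = automatable, 0 = not
    labelled = rng.choice(702, size=70, replace=False)
    y_hand = rng.integers(0, 2, size=70)

    # Step 2: fit a classifier to the hand labels (the paper compares three
    # models; a GP classifier with a squared-exponential kernel stands in here)
    clf = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
    clf.fit(X_all[labelled], y_hand)

    # Step 3: report P(automatable) for all 702 occupations, including the 70
    # hand-labelled ones -- the labels are treated as noisy, so the model's
    # probability can disagree with them
    p_automatable = clf.predict_proba(X_all)[:, 1]
    print(p_automatable[:5])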

This methodology doesn't make a lot of sense to me. If you allow yourself to make up (or "hand-label") 70 numbers, why not make up all 702? Why not trust your own labels in the final data? What is the statistical model even telling you when it's being trained on labels you don't trust?


I find it sociologically interesting how layers of math and publishing indirection can be used to convince people that someone's opinions and biases (hand labels) might in fact be statistically significant facts about the world.

To their credit, at least the original authors acknowledge this problem.


It's even more interesting to consider why economists might have these inherent biases. What incentives might they be responding to?


> This methodology doesn't make a lot of sense to me. If you allow yourself to make up (or "hand-label") 70 numbers, why not make up all 702?

Well, what is the science in that?? See, the headline will tell you that there is an X% chance that job Y will be automated, and we know we have to take that with a grain of salt.

However, the statistical models do carry some extra information, such as which of the 9 variables carries more information about how to predict the labels, and so on.

Also, some models can handle somewhat "bad" data, since even though it is crappy, some signal is still there.
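To make that concrete, here is a toy illustration, on random made-up data, of the kind of "extra information" a model can give back: which of the nine variables it ends up leaning on. The paper's actual model is a Gaussian process classifier, whose feature relevance is less direct, so the logistic regression below is only a stand-in for the idea.

    # Toy example: which of the 9 O*NET variables does a simple model weight?
    # Random data stands in for the real survey values and hand labels.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    features = ["Finger Dexterity", "Manual Dexterity", "Cramped Work Space",
                "Originality", "Fine Arts", "Social Perceptiveness",
                "Negotiation", "Persuasion", "Assisting and Caring for Others"]

    X = rng.random((70, 9))          # 70 hand-labelled occupations
    y = rng.integers(0, 2, size=70)  # noisy 0/1 hand labels

    model = LogisticRegression().fit(X, y)

    # Larger |coefficient| means the variable moves the predicted probability more
    for name, coef in sorted(zip(features, model.coef_[0]), key=lambda t: -abs(t[1])):
        print(f"{name:32s} {coef:+.2f}")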

All that said, we still need to take the prediction with a grain of salt... :D


28% chance of 'Athletes and Sports Competitors' being replaced by robots. I for one am looking forward to a robot wars olympics.

https://willrobotstakemyjob.com/27-2021-athletes-and-sports-...


But 0.71% for athletic trainers. Someone's gotta keep the bots oiled.

https://willrobotstakemyjob.com/29-9091-athletic-trainers


Reality here

I'm partially a dishwasher, along with other kitchen duties like cooking and preparing food; that is what pays my bills.

My job is to organize the plates (computer vision), then I spray the food remnants off them before sliding them into the "automated" washer... then I stack them and put them away where they go throughout the restaurant.

Yeah, I definitely agree that this can and should be automated. I do wonder how long it will take, how much of an investment it would take to pay for it, and how likely it is to break down. Then there's the rest of the site being automated as well.


A relevant question for dishwashers is also how much the number of human dishwashers decreased once automated ones became available. If you are the only one in your kitchen, where previously there would have had to be three for the same workload, your job is already 66% automated.


It might seem racist, but a good portion of the workers are primarily Spanish-speaking... I might be assuming things as well, but advancement seems to rely on being able to use English and work with others cohesively.

Even as a fumbling developer I'm glad to at least have some clue as to what can be... even if I may not reach it... Dunning-Kruger effect, haha... I wonder what will happen to those people. If I don't escape, I too will be in the same boat of the low-skilled, replaceable laborer, which is what I am right now.

edit: haha, "have a clue as to what can be", as if nobody else wishes life were better or wants to be what they see in the news. At least I have the ability to know what to look for and hopefully learn and improve my situation in life.


It strikes me as a little naive to only consider job security from the perspective of "can a machine do my job". Once AI and automation start taking people out of the workforce at scale, it seems likely there will be a lot more people competing for the jobs that are left.


Interesting; the best I could find was lawyer. Politician wasn't available. I suspect that is the safest job of all: if it should ever become threatened, they will simply pass a law that makes the threat illegal.


Your reasoning works if you think of "robotic takeover" as some isolated event that threatens existing politicians. I see it as gradual change.

It could start with politicians leaning on AI to make better-informed economic decisions, or to predict voters' responses to proposed policies. That would force other parties/governments to either do the same or fade into obscurity. This change would be spread out over a longer period, but I don't see politics as a safe haven.


I really don't see 'vote for algorithm x' as an option in any democracy.


I looked up lawyers too (https://willrobotstakemyjob.com/23-1011-lawyers). Given the fact that a lot of the legal research and discovery work that lawyers do is already being done by software instead of junior level humans, I think that number is completely wrong. A lot of legal work outside of negotiation is essentially data transformation and interpretation which is exactly what machine learning is good for. Law is probably second to accounting for massive disruption in the next few years, but it's a close call.


Being a practicing lawyer is steeped in tradition and all kinds of archaic things (wigs?). What I meant by 'lawyer' is not paralegal or research but being admitted to the bar and allowed to practice law. I really don't see a computer getting a pass on that anytime soon simply because the bar association is made up of lawyers.

That research can be aided and may be done by a computer is beyond dispute at this point, but that job is called being a 'paralegal'.

Being a paralegal is usually a step to becoming a lawyer so I can see a bit of a continuity problem there.


The continuity problem will crop up across lots of disciplines, though. When we automate, we'll probably start out at the lower ends of every discipline, crowding out juniors where they'd normally onboard. That won't be too threatening to senior stakeholders in the discipline, either, so they may let it happen. And then automation will creep its way up the value chain; there'll be strong pressure for it to do so, because with automation making the onramp harder to get onto, there will be an increasing scarcity (and thus increasing profit motive) to climb the value chain.


We, at Doctrine (doctrine.fr), are "automating" search, discovery, and legal monitoring. So we are disrupting the low-level groundwork so lawyers can focus on higher-level tasks.


This was what I was telling my friend: management, legal research, and medical diagnosis are places where machine learning will flourish in the coming years, as (I hope I'm not oversimplifying this) those processes depend on looking up specific cases and inferring a condition from them.


Try teachers. 'Teachers and Instructors, All Other' has 0.85%, although teachers' assistants don't do so well.


According to this data, the safest is Recreational Therapists. However, I suspect you are correct about Politicians...


Sales Engineer was the lowest I could see - 0.4% risk of automation.

Much better than 3.5% for lawyers and 13% for software developers.


How are these numbers getting calculated? Actuaries have a 21% probability of automation vs. programmers at 48%?


Yeah, I always type in "computer programmer" first to see if they put any real thought into the dataset. 48%... whatever that means, interpreted as "start worrying".

The history of computer programming is the history of people trying to automate computer programming. All that ends up ever happening is that the nature of computer programming changes.

Just think of all those assembly-language programmer jobs that were "automated away" by the invention of Fortran. Or all of those C++ programmer jobs that were "automated away" by Java, Ruby, SQL, whatever.


But with the changing nature of computer programming, arguably programming becomes more efficient and needs less manpower. It's easy to see that programming a web app today takes fewer man-hours than it would have if you were to try to do it 30 years ago using C.

So far it seems that these efficiency gains have been offset by growing demands in what software can do, making the industry continue to grow. I wonder if at some point in the future, these demands will stabilise and continued improvements in efficiency will necessitate a reduction in the workforce.

Will automation make a computer programmer's job disappear? No, probably not. But will it allow one person to do the job of ten? Maybe.


You'd have had a hard time coding the web 30 years ago, given that it hadn't been invented yet; but 20 years ago, Perl was about as productive as something like Python or Ruby today[1]; a web server, after all, just provides stateless RPC. What was missing was a lot of frameworks that could permit a developer ecosystem to grow, a developer ecosystem being something that lets you more easily reuse things other people wrote. Developer ecosystems with critical mass are what increase productivity, rather than technological improvements.

[1] Technically, I think common web frameworks aren't particularly productive, because they are built around a very low common denominator. I worked on a CRUD app framework in the early 2000s that would still blow something like Rails out of the water in terms of productivity.


The numbers came from a report published in 2013 (linked on our about page): http://www.oxfordmartin.ox.ac.uk/publications/view/1314


Well, if you look at the description of a computer programmer, it says that it's the person that just implements the specifications written by a software developer. So they see the software developer as the one that does some thinking too, and the programmer as the one that only translates specifications into computer language, which indeed will be easy to automate in the future.


We already have a tool for translating the specifications into computer language: a compiler.


You missed "Software Developers, Systems Software" and "Software Developers, Applications" at 13% and 4% respectively. Combined they are five times as common as "Computer Programmers".


Don't worry, we programmers are safe - just select "software developers, applications" and it's 4.2% ;)

https://willrobotstakemyjob.com/15-1132-software-developers-...


Hmm, the BBC already built a front-end for this a while ago. See http://www.bbc.com/news/technology-34066941


Seems to say journalists are pretty safe:

https://willrobotstakemyjob.com/27-3022-reporters-and-corres...

However, I'm not sure I'd agree too much with that. Yes, some reporters and correspondents are safe, especially the ones writing opinion pieces or doing investigative journalism.

But a lot of lower level journalism is not particularly novel or interesting, with a lot of people in gaming, tech, media or sports journalism basically just writing about current events the same way as every other individual in the same industry.

I feel an AI could probably automatically write articles based on the events in a sports match (heck, I'm sure that's actually being done right now in some cases), or figure out an automated way to pick out information from a trailer or E3/Nintendo Direct event broadcast.


> an AI could probably automatically write articles based on the events in a sports match (heck, I'm sure that's actually being done right now

That's one of the areas Narrative Science (https://www.narrativescience.com/) operates in: http://www.bbc.co.uk/news/technology-34204052


> (heck, I'm sure that's actually being done right now in some cases)

It absolutely is, and has been for quite some time. https://www.theguardian.com/media/2016/oct/18/press-associat...


I was curious about how thieves will be doing after the robot uprising. The first occupation matching "thief" was "Chief Executives" http://imgur.com/a/hDIeZ


I am a developer, but based on a recent article I read, data scientists are less likely to lose their jobs to automation, so I tried it and ended up here - https://willrobotstakemyjob.com/15-1111-computer-and-informa...

I also work in the insurance industry, and the rumor was that underwriters are more at risk of losing their jobs to automation - https://willrobotstakemyjob.com/13-2053-insurance-underwrite...


I entered Machine Learning Engineer and it gave me a list of job titles. No automation ? (╯°□°)╯︵ ┻━┻


4.2% probability for "Software Developers", but 48% probability for "Computer Programmers"?


Models show 98% [1].

If I were Kate Upton or Bar Refaeli, I'd start looking for a new job pronto!

[1] https://willrobotstakemyjob.com/41-9012-models


Chicken Deboners: 94%

Chicken Sexers: 41%

I'm thinking I should go on that Chicken Sexing course.


A fun side project I worked on with a friend.

Based on a report published in 2013, find out how susceptible over 700 jobs are to automation.


Ah, the numbers looked familiar. Some of them are pretty bogus, like "computer programmer" at 48%.


That data has been floating around for a long time; I built https://www.replacedbyrobot.info/ over one or two years ago, I think. ;)



