A year ago, TLF wrote this about MOOCs:
Nothing has more potential to lift more people out of poverty — by providing them an affordable education to get a job or improve in the job they have. Nothing has more potential to unlock a billion more brains to solve the world’s biggest problems. And nothing has more potential to enable us to reimagine higher education than the massive open online course, or MOOC, platforms that are being developed by the likes of Stanford and the Massachusetts Institute of Technology and companies like Coursera and Udacity.
Then Sebastian Thrun admitted that Udacity had a "lousy product" in this interview:
Friedman is a cheerleader for employers. Employees are warned that average isn't good enough. One stupendously glaring omission in this kind of talk is that good managers are a dime a dozen. If anyone should be warned, it should be managers--if there is no reason to hire employees displaced by automation, then there is absolutely no excuse not to hire the very best managers.
Basing an entire industry's future on the shortcomings of one man's idea and expectations is a bit silly.
Having the best professors use video (or radio) to openly broaden the audience of their abstract teachings is a concept that has been widely tried since, well, I’d say 1530 with the Collège de France, but the recordings were a bit different then — anyway: it’s the principle of almost every high-brow show. Those tend to have less of an audience than we’d like, or need considerable adaptation — but Udacity was one attempt that was remarkable in mainly one way: how little the class was adapted for anything but highly intelligent and motivated twenty-somethings with fresh brains.
Look at what Dan Ariely is doing for a great example of how to preserve internet support and achieve engagement comparable to a physical class.
As to Friedman… Well, this is HN, a polite society, so let me skip that part.
It has occurred to me that management is basically an information processing task. Why shouldn't it be automated? I think a reason is that managers are kept around to manage information that can't be committed to a stored record. An example would be details of the decision making process for who gets laid off during a downturn.
We know something about the hypothetical manager's decision making process: the typical manager would include the constraint that he should be excluded from the collection of employees to be laid off. An automated improvement on this algorithm would unlock tremendous benefits to principals, and in the case of publicly owned companies, to shareholders.
NEW YORK TIMES COLUMNIST THOMAS FRIEDMAN
THEN: "Let's start with one simple fact: Iraq is a black box that has been sealed shut since Saddam came to dominate Iraqi politics in the late 1960s. Therefore, one needs to have a great deal of humility when it comes to predicting what sorts of bats and demons may fly out if the U. S. and its allies remove the lid. Think of it this way: If and when we take the lid off Iraq, we will find an envelope inside. It will tell us what we have won, and it will say one of two things.
"It could say, 'Congratulations! You've just won the Arab Germany--a country with enormous human talent, enormous natural resources, but with an evil dictator, whom you've just removed.'
"Or the envelope could say, 'You've just won the Arab Yugoslavia--an artificial country congenitally divided among Kurds, Shiites, Sunnis, Nasserites, leftists, and a host of tribes and clans that can only be held together with a Saddam-like iron fist. Congratulations, you're the new Saddam.'
"In the first scenario, Iraq is the way it is today because Saddam is the way he is. In the second scenario, Saddam is the way he is because Iraq is what it is. Those are two very different problems. And we will know which we've won only when we take off the lid. The conservatives and neocons, who have been pounding the table for war, should be a lot more humble about this question, because they don't know, either."
--The New York Times, January 26, 2003
It's true that automation has a disproportionate effect on different levels of society. But that's more about humans than it is about machines. There are a lot of people working on technical solutions now, and sooner or later people will get to work on social solutions that address some of Friedman's concerns. We're a long way from a fully automated society, and we have time to prepare for the inevitable changes.
Sure they do. Many founders would like their companies to pay them at least a working wage.
I would argue, having lived in countries that lead automation rates, that we are already at a point where large groups of the population cannot find work with their qualifications, and never will: farmers, miners and forest workers had to convert to become factory workers, but most of those jobs are now outsourced or automated; at the moment, there isn’t a job in Western countries for someone who has difficulty learning how to read.
Some people genuinely can’t grasp any abstract concepts: I taught in jail for a while, to people who couldn’t for the life of them generalise “Two apples plus three apples equals five apples; two sheets of paper plus three…” and remained baffled by ‘addition’ as an abstraction. Keep in mind: some of those guys fashioned working radios and distilleries from scratch.
Similarly, I have talked to taxi drivers who couldn’t understand that it was in their interest to switch to Uber, or how the trust mechanism worked with the app (they thought a dispatcher on the phone could better tell if the client was shifty), to say nothing of choosing a path based on a congestion map. I love my Roomba and my washing machine, but without the cleaning jobs, what are immigrants to Western countries going to do? Take care of elderly people to whom they can’t speak?
I fully agree with your point that automation is good for all, but we might, before the end of our careers, have to face a point where a vast majority of the population has to be considered handicapped — “mentally”, for lack of a better word — out of the workforce and entirely subsidised.
At the moment, even a hard-working coder of average skill cannot find a job where I am: I know a handful of guys who can make functional, useful apps all by themselves (say: scanning a barcode, querying a database to tell if a product is ethical, handling a shopping cart, all with great design; or sports tracking). They make a hundred euros from those in the first month, but that’s it: I tell them to treat those as portfolios to get interviews, and they do. I have no idea what goes wrong during those interviews, but their repeated failure to sign anything points to more than one HR person being in a bad mood.
What I see is a world in 15-20 years where 3% of the population designs things, 12% have menial jobs (mainly manning security and healthcare bot-based systems), and 85% cannot work. Spain, or even Berlin, already has a modern economy with 30% unemployment: subsidies, black-market service jobs, low rents… it looks somewhat stable, if shocking.
Clay Shirky had an interesting point about that in his essay “Gin, Television, and Social Surplus” — where he assumed there is a sacrificed generation, and the next learns. I have a hard time seeing millennials be anything but creatives with a YouTube channel. Maybe that’s worth it.
Automation is what I love most about programming. Most of the time, when you encounter a tedious, dull process, you can come up with creative ways to make your machine do the work.
It's one of the few occupations in which boredom represents opportunity.
Friedman has a talent for writing as well as glueing together disparate concepts to support his arguments. It's fair to say that in some articles, he says very little, very well.
But he has a point. We are on the verge of some new golden age, perhaps a new machine age. I tend to think we're seeing the beginnings of a transition to a machine-lifeform-dominated Earth. It will probably take 200-400 years to fully play out. People aren't just going to become obsolete -- they're going to become irrelevant and non-existent. Seeing pictures of an old-fashioned plain human like us, in another 400 years, will be as quaint to future man as seeing pictures of Neanderthals in a science magazine is to us.
The interesting question is just how awake the average person is to the amount of change happening. When the Industrial Age happened, there were huge factories, great cities being built, and all sorts of physically giant things to look at and note that yes, something new was happening. With the new age, we're looking at changes in silicon, machine learning, maybe robotics. There aren't going to be giant robots leveling cities in the near future (thankfully!) or anti-gravity machines taking us to Mars in a few hours (sadly!). Instead we're seeing massive changes, on a tiny scale, in what it means to be human.
This article is for the cheerleaders. HN is for the cheerleaders, like us. But I wonder if the average person is even going to notice.
Okay, but why not bring a gun to a chess match against a human? That would be an excellent winning strategy (cf. the Gordian Knot).
During an economic depression, a growing fraction of the population is removed from the "real" economic system and market; they start to use Tide detergent, lotto tickets, drugs, and cases of cola as their new economic system and currency.
In a similar way, as the fraction of the population still involved in the interchange of information decreases, despite all the article's hand-wringing, it's pretty simple: they'll just drop out. It's hard, maybe impossible, to say what the new system and currency will be. People will just stop. The future of the "knowledge worker" looks a lot like ... travel agent.
So, put it together, and in a post-knowledge-worker economy, there'll always be a small number of elite knowledge workers, and always some jobs that are not financially viable to automate away (not many, of course), but overall society will just wander on over to something else and look at knowledge work / service work as something from the past... that's what grandpa used to do before they all got downsized, or whatever.
The article is much ado about nothing, in the long run. Currently there exists a service / knowledge economy, at least for a small segment of the population, and it's going away, just like every other segment eventually mostly goes away. OK.
That, and watching the political ax-grinding was moderately entertaining. "If we all just held hands around the campfire and sang, then ..."
Yet they consistently don't choose it. Much as in the following case...
> All this data means we can instantly discover and analyze patterns, instantly replicate what is working on a global scale and instantly improve what isn’t working — whether it is eye surgery techniques...
That's an excellent example of another thing that won't happen, mainly because it already doesn't happen; medicine is a field with many, many examples of hard data demonstrating better techniques, well known, that nonetheless don't spread. Atul Gawande discusses it, amongst other interesting things, in his book "Better"; I name-checked that book mainly because, as well as discussing this, it's quite a fun read and a good starting point for principles that carry across all fields, but it's by no means the only source.
> ...lowering taxes on human labor to make it cheaper relative to digital labor
> ...guaranteeing every American a basic income
What all these say to me is an unwillingness to actually advance the social dynamic that has been in place for hundreds of years: the hierarchy of serfdom.
Why, when you have the possibility of removing drudge-like work with automation, would you actively want to retain it?
Why, when a society of plenty is possible, would you want to limit its benefits to a few and give only a basic income to the rest?
Why would you try to define someone by a menial task when you could aim to help them bring more worth to society and themselves through a more fulfilling and beneficial activity?
I'm not saying there are easy answers to any of these questions, but trying to maintain the status quo is not an approach that benefits everyone.
Most likely that will eventually play out with doctors, probably economically enforced. Sure, doc, do whatever your medical judgment thinks best, if you want to work for free... but we're only paying on claims if your medical judgment matches "Watson". So overproduce new docs for a while to crater salaries, only pay claims if their work matches a computer model, and what we used to call a "nurse" will be the new "doctor". This also helps with doc-in-a-box facilities, which could now literally be a doc-in-a-box instead of a nurse-in-a-box or PA as implemented now.
Eventually, being a "pharmacist" will mean being the minimum wage drone who stocks a semi-smart vending machine, or maybe it'll mean being the minimum wage drone answering the 800-number support line outsourced to the Philippines.
I could see the same thing happening with accountants and HR and perhaps marketing.
If you think about why you can't replace programmers with software written by experts who know nothing about the domain (or even by those who do), it will tell you why the professions you've listed would be very hard to automate.
I think software and the hardware it controls are going to be pervasive and take over a lot of formerly human functions in our society in the next century, but the professions you've listed above, along with programming, will be some of the last work to be completely automated — after manual labour, farming, cleaning, data entry, call centres, technical support, etc. It'd require strong AI to replace those professions; expert systems don't come close. And if you think your profession is uniquely immune to these changes, why is that?
1) I do think software engineers are in danger of suffering the same fate, just later than some. We won't even be the last profession to go, that'll be comedians probably.
2) People often underestimate the impact of automation by looking at automation as a binary state where the problem comes when you automate away everyone, but that's not a great way to look at it. Partial automation on a quick enough timeline is more than enough to destabilize the economy of a particular industry.
We are a long way from computers replacing all lawyers or doctors or software engineers, but if they can replace just 25% of them by allowing the remaining 75% to be as productive as the entire 100% was previously, that's already a large economic impact. If automation can replace 75% of them, that's a massive world-altering shift, economically.
We really should be worrying about the economic impact of all this automation now, because at least in the USA (where I live) we are a very long way away from (sociopolitically) being able to deal with this quickly approaching new reality of having way more people than are needed for doing what would traditionally be considered "productive" work.
Oh, I'd disagree with that... If you'd like an example, note how "Excel" is the industry-wide corporate standard database management system. It's buzzword-compliant, being NoSQL, and scales to tens of thousands of records, which despite claims to the contrary is almost always enough for most real-world problems. The world of the future is not a DBA and a CRUD developer cooperating to write your shopping list; it's one untrained dude with a spreadsheet.
Look at how many graphics artists using paper and pencil have been replaced by one marketing guy with Photoshop. It's already happened. It's not bravado but observation.
Rows and rows of desks of junior accountants replaced by one accountant with QuickBooks. Another observation, not bravado.
And what, exactly, are pharmacists doing 99% of the time beyond being a very high-item-value / high-risk vending machine? Bear in mind, my mom worked as a pharm tech part-time when I was a kid, so you're not going to fool me... Very highly priced customer service and team lead, and that's about it.
The mechanical steam shovel never quite eliminated the use or sales of hand shovels. It's just that "digging ditches" isn't a viable career path for almost anyone anymore. Nor is "manual metal lathe operator" or any number of other tasks. I'm not saying there will never be another accountant or doctor or lawyer hired. Just a whole heck of a lot fewer of them. At least one, two, maybe three orders of magnitude fewer.
Much like in the first industrial revolution... there are people right now, out there, being paid to use a hand shovel. The story is, it's now probably hundreds of people at this moment, not 5% of the population or whatever like in 1850.
Replace it totally? Doubtful. There are still inherent limits to computation that we have not resolved and, if our current hunches are correct, likely won't.
COBOL was meant to make programming a negligible and easily doable task with its rigid English-like syntax, but how did that turn out?
Sloppy code generators that can pop out some generic, half-useful spaghetti abomination have been around for decades, but if you want anything just above total crap, you have to do it yourself.
Things like databases? Sure. CRUD apps? Absolutely. Even if you don't have a GUI to generate them, there will likely be some extremely abstracted library where it's a matter of pasting in a few procedures.
Yet a lot of things are still too vast and niche. That, and reaching bug-free equilibrium. Removing state and side effects deals with a wide class of bugs, but it obviously does not get rid of them entirely. There will still be some just barely competent (or, by hypothetical far-future standards, wizardlike) people in charge of maintenance, and those will be our programmers.
While programmers are certainly a peculiarity in that doing your job well means furthering your own extinction, I don't think it will be anywhere near as drastic as it's made out to be. An ever-breaking technological dystopia sounds much more likely than some clockwork ubiquitous computing society.
The exponential curve has no leaps. This looks like pedantry, but it's important to keep in mind. Exponentials are completely smooth and boring all the way up.
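A minimal sketch of that smoothness claim (illustrative Python; the base and values are arbitrary): for an exponential f(t) = a·rᵗ, the ratio between consecutive values is the same everywhere on the curve, so there is never a point where growth suddenly "kicks in" — only steady compounding.

```python
# Exponential growth: f(t) = a * r**t.
# The ratio f(t+1)/f(t) equals r at every t, so the curve
# has no kinks or leaps -- it compounds at the same relative
# rate near the start as it does far up the curve.

def f(t, a=1.0, r=2.0):
    return a * r**t

ratios = [f(t + 1) / f(t) for t in range(50)]
assert all(abs(x - 2.0) < 1e-9 for x in ratios)  # constant growth factor
```

The "leaps" we perceive come from crossing thresholds that matter to us, not from any discontinuity in the curve itself.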
Excel is not a database (I don't mean that as a trivial correction; it's quite distinct). Many of the jobs done with it could conceivably have been done with databases but were not, so in a way it has opened up a whole market: storing and manipulating more data than ever before, on a small scale, in businesses. And I suppose you could conceivably replace dbs with it in some cases. However, I don't see it replacing databases in the businesses I work with — in my experience, at least, I see more data moving from Excel over to databases and apps (with internal dbs) as it becomes complex or people want to share it, than I see moving the other way. For example, imagine a CRM built in Excel, scaling up to thousands of customers and several users — eventually there comes a point where it is far easier to use a db and front-end than to try to manage that sort of data in Excel by hand and share it amongst several people.
Data manipulation is an excellent example of the complex changes wrought by technical progress — there is an argument for saying that as technology progresses, we discover more work we can do with it, and data and analysis become more complex, not less — more jobs are created which simply didn't exist before. Automation doesn't always lead to jobs becoming simpler or humans becoming redundant, quite the reverse in our recent experience.
> Look at how many graphics artists using paper and pencil have been replaced by one marketing guy with photoshop.
Answer: zero. Graphic artists have become quicker, can do more work, and can produce work of higher quality in terms of resolution, type, etc. than before, but it's still quite a distinct job from marketing. One related example of a profession which has gone is the typesetter, though — they've been entirely supplanted by technology. Just to be clear, I'm not saying technology never supplants anyone, but the professions you listed are the farthest from being supplanted and are not even on the horizon in most cases.
Customer service is still really important in many jobs, interfacing with humans is extremely hard, and expert systems are not an adequate replacement for most humans yet, esp. in creative or customer facing roles — that will require a qualitative change in the nature of our software. Perhaps that will happen eventually (in some ways I hope it will), but we simply haven't made that leap yet.
It most certainly is used as one in the corporate world. That doesn't mean it's a very good one, in the sense that a hammer isn't a very good substitute for a screwdriver.
Nonetheless, where a businessman would have called IT to store a couple hundred records in 1970, now the businessman uses a spreadsheet as a database.
I agree. I LOL when I see people doing by hand the equivalent of an SQL JOIN. To some extent, the older I get, the more I see people using computers as a somewhat different form of manual labor, rather than a replacement for manual labor. Rather than typing this form by hand on a manual typewriter, you use Word and call it progress.
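For what it's worth, the "by hand" work in question is usually just an inner join: matching rows from two sheets on a shared key. A minimal sketch in Python (the table and column names are made up for illustration):

```python
# Two "sheets": customers and orders, related by customer_id --
# the lookup people often do row by row in a spreadsheet.
customers = [
    {"customer_id": 1, "name": "Acme"},
    {"customer_id": 2, "name": "Globex"},
]
orders = [
    {"order_id": 10, "customer_id": 1, "total": 99.0},
    {"order_id": 11, "customer_id": 2, "total": 42.0},
    {"order_id": 12, "customer_id": 1, "total": 7.5},
]

# Equivalent of: SELECT * FROM orders JOIN customers USING (customer_id)
by_id = {c["customer_id"]: c for c in customers}
joined = [{**o, "name": by_id[o["customer_id"]]["name"]}
          for o in orders if o["customer_id"] in by_id]
# joined pairs each order with its customer's name in one pass
```

One line of lookup logic replaces the copy-paste-match loop a person would otherwise perform manually for every row.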
In 1970 the businessman would probably have used a rather more old-fashioned method: a piece of paper and an assistant to write down the figures on a page which looked very much like an Excel spreadsheet!
Now, my doctor has forty years of experience, and if she orders an MRI it's because she already knows an x-ray won't return any useful information and will just waste time, money, and her patient's radiation exposure limit. But, statistically speaking, the carrier in question has obviously determined that they'll save money if they force providers to route through x-rays regardless. In other words, my provider's judgment was overridden by some Excel spreadsheet somewhere -- not even a Watson or Big Blue.
Meanwhile, doctors are being forced to handle more data entry functions, something they understandably hate, and which has probably turned more providers against the Affordable Care Act than anything else. (The provisions are, IMHO, unduly onerous even as I understand the anti-fraud rationale behind their introduction, but that's another issue for another time.)
Having said that, given the stringent requirements FDA places on med-tech, I don't anticipate AI providing more than a support role over the next few decades. FDA is always going to want a credentialed human in the critical chain between patient and procedure.
When the cost-risk equation comes out in favor of the computers.
Put all these advances together, say the authors, and you can see that our generation will have more power to improve (or destroy) the world than any before ...