So I think the apocalypse is double-edged: while automation kicks a bunch of current workers out by making them immediately redundant, it also freezes out the next generation by removing entry-level jobs and not really replacing them with anything equivalent. Meanwhile, universities and vocational ed programs won't get this memo for another ten years, so they will continue to happily propel waves of students toward a set of closed and locked doors.
I've been on the college recruiting circuit to help my company hire, and I'm often severely disappointed by a good portion of those I meet. There's occasionally a standout who really impresses me, but then I think they're never going to want to stick around at my place. With this thought, I'm sort of in agreement with the commenter who said that we need college to train people at a higher level, since there's almost no time on the job to ease into it.
Maybe companies need to stop thinking in terms of employees sticking around for a really long time, and get used to the idea of employees going from place to place when they get too bored or want to do something different. It seems insane to me, the idea of expecting an extremely intelligent, high-performing person to want to come to the same workplace day after day, for years or decades, doing mostly the same work.
This is the paradox of hiring today, and why tech hiring is broken. In fact, there's a story titled "Hiring Is Broken" on the HN front page right now, not far below the OP. So it's getting harder to recruit people who do make the cut technically and communication-wise, while at the same time it's getting harder to retain them because, let's face it, so many startups don't have a compelling value proposition or profit model.
Most of the students I interview fail miserably at this. They can hack together a working application by copy/pasting from examples and SO posts and making small modifications, but they have no fundamental understanding at all of what the computer is actually doing with the code they write.
Some things I see that give me pause:
1) No project work. Lack of interest in building things on their own, researching frameworks, building out small apps, etc. I'd like to see even a small attempt at learning the tooling and frameworks used on real-world projects (it doesn't even have to be close to the tooling we're using, just anything).
2) Lack of reading the technology-centric, software-engineering-centric internet sites. Even attending a Meetup or two to start seeing what's going on outside academia (again, I live in a tech hub, so lots of opportunities).
3) Just a general feeling of "hey, I got this CS degree, I'm ready to work," without really showing much enthusiasm that they actually want to be software engineers as a career (it's a tough career that requires a lot of self-driven learning and curiosity to do it well). Believe it or not, I've seen students come through summer internships and decide they actually don't like real-world day-to-day software development as a career.
I don't really have any preferences about what you're reading. Just the fact that there's some part of technology that fascinates you enough to read further, on your own, in a self-directed way. It shows curiosity and initiative. I always like this part of my conversations because I often get to learn something new myself.
That's your biggest asset. A perfect GPA will help you get past the useless HR gatekeepers; a demonstrated ability to build something will mean a lot more to the technical interviewers. Bonus points if those projects happen to use common industry things like an SQL database and contain unit tests.
Most graduates, if you sat them down with some fairly simple requirements and said "build this with whatever tech stack you're familiar with" would have no idea where to start.
Even relatively rote software development that involves embedding business logic into a software system is likely safe as long as the cost of continuing to develop that software is close to the cost of switching to some other framework. It's when the cost of development is much greater than the cost to try another framework out, or when a company wants to expand something and doing so on the development side would be cost prohibitive, that someone's job could be on the line.
There's a whole set of fairly basic tasks that can be done that can get a developer to the point of submitting a PR on their first day. Many, many companies screw that up.
(Disclaimer: I work for a company that has been in the 5% club since 2013)
The Cloud companies are also hiring armies of support people, and it's a great way to kickstart your career in any of these companies while getting company provided training in the tech.
Of course, the service provider may need some amount of people figuring the failures out, but that amount is smaller than if customers had to debug their own problems, and is more susceptible to centralization, and to "fixing by replacing".
Agreed entirely. I would be surprised if we still have two-year technical degrees in a decade.
> But instead of five backend developers and three ops people and a DBA to keep the lights on for your line-of-business app, now you maybe need two people total.
All nine of those people would probably have been CS graduates, or at least many of the backend developers would be (and perhaps the DBA). Or they would be people that thought of themselves as "developers" and not "IT" for whatever that distinction is worth now.
I think this is a big error people make. Some colleges are great. Others are horrible. It's very hard to say, "College is a waste" or "college is great".
In general, interviewers just have unrealistic expectations for entry-level candidates. They forget how incompetent they were at the same age. Or they have ridiculous notions that everyone should know how to write a quicksort algorithm or whatever, when some students may have focused their studies on other (but equally challenging) topics.
I find that many people who can install Ubuntu and run the canned commands they download (curl foo.sh | sudo bash, or docker/k8s scripts) think they know what they are doing.
It's getting to the point where someone who can install Windows is more technical than someone who can install Linux.
That’s because it trains for the job, not for the field. While you’d probably get up and running more easily with an IT degree, since you know the current tooling, you’d be worse off than someone with the conceptual knowledge that comes with a more general CS degree, and you’d therefore have a tougher time adapting to whatever new technology didn’t exist in your IT program but was touched on conceptually in the CS coursework.
As someone who's taken a fair number of IT degree classes, I'd say there's a fair bit of conceptual knowledge involved. And in the case of networking, for example, most of the standards and protocols you're being taught how to work with have been around since the mid-80s, and aren't showing significant signs of going away any time soon.
I'd say the CS vs. IT split would probably surprise you. I have gone up to the bachelor's level in a game programming degree, and it was amazing how people who were proficient in writing C++ couldn't handle basic PC troubleshooting; it's a different skill set entirely.
I've also run internships with CompSci graduates and have to say they're basically unemployable when they graduate; they might know some theory, but they can't build anything. Community College teaches you to build things, so you come out with skills relevant to the workplace, and you can fill in the CompSci stuff later.
My son is currently enrolled in CS, and his first 2 years of school are filled with humanities, history, and a few more irrelevant courses, all to keep some profs employed. His next two years will be filled with more useless courses, and by the time it's all over it will have cost $50k+ (he lives at home and goes to a state school).
I feel like he could have taken a 6-week Java/Python/whatever course and gotten more out of it. Add a CCNA/CCNP for the networking knowledge, a Linux cert, a security cert from SANS, and some self-study, and he would know more than a 4-year degree would teach him and be better prepared for the working world.
Universities in the US are all about making money, supporting football and athletics, tenure for the profs, and finally accreditation. All for $50k+?
Meanwhile, tens of thousands of H-1Bs are needed because our kids know nothing and are being taught shit.
Two areas that need major change and disruption: education and healthcare. Everything else can wait.
Yeah god forbid he enrich his mind and develop lateral thinking skills, empathy, perspective and wisdom instead of focusing exclusively on how he can best serve capital.
1. If you want a very specific skillset to do a very specific job, there are more opportunities and options today than ever before. Self-study, MOOC, bootcamps, certs, etc. Pros: fast, efficient, focused, practical, immediate. Cons: the specific/narrow focus may leave you with gaps you won't even be able to appreciate until too late.
2. If you want a more general education, Universities are there to provide. You'll get not just immediate hands-on-keyboard skills, but math and CS-theory background, and also communication skills, discipline, diligence, social networking, perspective to be a team lead one day, etc.
Now, I do believe universities have a LOT of optimizations to make; a student's life tends to be sucky in many ways it doesn't need to be. I've repeatedly found and heard of the difference in attitude between the college/bootcamp stance of "You're the paying customer, we'll provide knowledge" and the university attitude of "you are irrelevant, be grateful, and jump through the hoops for the privilege" - whether from the ever-increasing admin/bureaucracy cohort (sometimes helpful, often power-blinded), the obscure rules and difficult processes, or some of the tenured professors. But again, the information is out there, the choices are available - and overall there's never ever been a better and easier time to acquire knowledge.
Full transparency: I hear this from nearly every 1st-year college student every fall semester - almost always from the CS, CHEM, or Engineering students. "Why do I have to take English, I'm just going to work with [chemicals] [computers] [software] [roads] [whatever else]"
Not sure how this opinion will fly on this site, but I'm not sure what you expected. It sounds like you have an ax to grind with a specific institution and needed to do more research about the university system overall. Universities were and are designed to make a modern version of a renaissance wo/man - the idea being someone good at, or knowledgeable about, everything. Making citizens who are more than just one-skill cogs. Teaching critical thinking and higher-order thought processing. They were not, and are not, job placement agencies.
If you were looking for nothing but the certs and technical skills, you should've sent him to a technical/trade school or community college. That's why those exist.
There is a massive difference between being job-task ready, like just finishing the certs would make you, and being life ready, like a liberal education makes you in theory. Giving a student a liberal education is literally what universities were designed for. Why was that a surprise?
I genuinely don't understand why being good at things that are outside of your expertise, or at least knowing enough about them to sound like an educated person in conversation, is 'irrelevant'. I don't get it and never have.
>Meanwhile, tens of thousands of H-1Bs are needed because our kids know nothing and are being taught shit.
My experience has taught me that whenever an employer says "we can't find the workers" and uses H-1Bs, what they really mean is "we can't find the workers at the wage we're willing to pay." Those are two different things.
THAT BEING SAID, the costs of education are out of hand. Living inside the beast, I can tell you that many administrators are just flat blind to the storm coming.
In the 90's, the message was go to college, go to college, go to college - relying on the previous 40 years of experience in which, if you went to college, everything else just sort of fell into place.
Well, now we have so many 'extra' services students expect, so many expenses, and fewer state/federal dollars. So students pay for it.
NOW the message is that you need to go to college only if it furthers your career goals. They're working in kindergarten with my child on that. It's frightening, honestly.
I think the pendulum will swing the other direction and we'll see a glut of skilled trades-people in the next 10 years.
@forrestbrazeal is implying that many of those certs are going to be obsolete really soon. At least, that is what I took from the article.
I agree that American universities are kind of insane. We Canadians are looking in and shaking our heads.
I suspect, if you aren't just being hyperbolic, you mean, “because he chose to seek a degree from a liberal arts institution rather than an engineering one (which would have some, but less, general ed) or a vocational certificate program or career-focussed bootcamp.”
We need a more decentralized education system, top to bottom, one that isn't tightly coupled to the government.
The same thing happens to programming positions.
But I think the good news is missing from this article—IT jobs are, overall, sticking around or increasing in number. (According to the Bureau of Labor Statistics, the jobs are growing “faster than average for all occupations”). You do have to keep updating your skill set, but it’s not like manufacturing, where efficiencies eliminate jobs altogether or move them to completely different sectors. And there is that ageism to worry about, and uncertainty.
I’m personally more worried about some of the other remaining white-collar office jobs, like the accountants, paralegals, HR, various banking positions, etc.
I can't stress how important this is. Folks going to things like boot camps or other educational outlets that focus on one language will utterly kill their career if they aren't aware of how fast things move. If you don't learn the underlying abstractions and paradigms that take various forms in different languages, you will get left in the dust in a matter of a few years.
The best programmers I've ever worked with got excited about programming patterns and paradigms, not frameworks and syntactic sugar. Those are also the ones I paid the most attention to.
Bottom line for both software and IT engineers: you learn to learn, not just to do.
I think this is (at least partially) a side effect of the pedagogy of computer science in schools changing throughout the decades. When I was in school, just about the entire program stressed OOP, with very little focus on imperative/procedural programming. Newly minted programmers fresh out of school don't have the exposure/mindset to jump straight into one of these (C maybe being an exception). The old guard that called these languages home is a dying breed, and the salaries paid to program in them these days bear out the scarcity that results.
The Cloud greatly diminishes and in some cases completely eliminates that work. The only thing left is actual software development.
The problem is that junior sys admins aren't as useful as before to most startups. I still think they'll figure it out, but the industry is changing.
This trend results in less competent people flooding the job market.
Today? Not happening.
SRE or "DevOps" roles new roles that are similar to system administration but the biggest difference is the use of cloud technologies, automation and most importantly those in need to be or at least understand software develop and code.
It takes a hell of a lot of work to take a company's entire infrastructure and migrate it to Kubernetes and the Cloud, and then monitor and manage it. It's not trivial.
So the key is seeing your job as solving a business problem with computers, not "I administer Oracle version X.y.z running on Redhat Linux".
The mid-range jobs have always been vanishing. Many times, I'm the guy automating them out of existence. It's always replaced by something else.
It depends on the circumstances, of course, but there is often more work after the automation than before. It's just different work. It requires reskilling.
The article mentions some new product AWS is coming out with. No matter how "simple" it makes things, someone is going to end up being an expert at using it, and will probably be paid well to do so.
Really, the toughest and most crucial part of this career has been keeping up. The work stays steady, though.
But this isn’t true, outside of webdev.
If your skill was “DB2” or “Oracle” or “Cisco” or “C++” you could have had a 30-40 year career in that, easily. There are plenty of others. Java has been around commercially since about 1995, there will definitely be plenty of Java jobs in 2025.
Whereas in webdev you’re basically starting from scratch every 2 years and competing with new entrants to the market because they literally have as much experience of the hot new framework as you do.
I don't think so. Mobile phones were already a significant thing before 2008 and the morph to smartphones was already in train (first iPhone was 2007).
VR still has to emerge from a relatively small set of niche use cases. Augmented / Mixed reality is more likely, given the ubiquity of good cameras on smartphones. Arguably AR/MR will let the phone manufacturers keep pushing device upgrades for longer, as the smartphone market saturates.
Sure we'll see slowdowns and some dramatic shifts in skill sets. We just need to stay ahead of the curve.
The trick is to write your resume so it gets past HR's filter, but without putting bullshit on it. It's unfortunate but I consider this kind of thing a critical skill for anyone applying to technical jobs.
I'd argue that skill applies even after you get hired. Fudging details to get around wasteful trivialities without outright bullshitting the person asking for the requirement is truly an art, and it's very hard to navigate this field without that skill.
Any tips on how to do that?
The hiring manager will then make sure that your skill set is a good addition to the team.
As a rule of thumb, if you are applying to a job and you meet all of the listed requirements as written, you are overqualified and should apply to a higher level position.
Definitely brush up on the things listed, though. Walking into an interview without at least a cursory recognition of what a listed language/framework does sucks, as it wastes your time and the interviewers'.
Also, be wary of bad recruiters. If they drink their own kool-aid and actually enforce the arbitrary x # of years in foo language blurb, you might be screwed (as well as the manager that put out the request for a hire in the first place).
The theory here is that programs that automatically scan resumes for keywords will always find the ones they are looking for.
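Purely as an illustration of that theory (not any real ATS product), here's the kind of naive keyword matching I imagine these scanners doing, sketched in Python with hypothetical keywords:

    # Toy sketch of a resume keyword scanner; keywords and resume text are hypothetical.
    import re

    REQUIRED_KEYWORDS = {"python", "aws", "kubernetes", "sql"}

    def scan_resume(resume_text):
        """Return which required keywords appear anywhere in the resume."""
        words = set(re.findall(r"[a-z0-9+#]+", resume_text.lower()))
        return REQUIRED_KEYWORDS & words

    resume = "Built ETL pipelines in Python on AWS; wrote SQL reporting jobs."
    print(scan_resume(resume))  # finds python/aws/sql; no 'kubernetes', so this resume gets filtered

If the filter demands every keyword, a resume that never spells out the magic words loses, regardless of actual skill; hence writing to get past it.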
I can’t say I have tried this or know how well it would work in practice.
Last year I witnessed the sad story of a general manager who promised to migrate away from an AS/400 in six months.
This guy (and several others) doesn't fully understand that a system (that is, software plus hardware plus infrastructure) has its life determined by the returns it gives to its mother organization. If the system works, the organization won't pay or dare to replace it. Core systems are the hardest, and that's where all those mainframes and C and Fortran and, nowadays, legacy Java systems are still alive and kicking.
PS: I love those systems, btw; if you have one that needs love and care, I'd like to hear about it :D
"I’ve spent my career in tech, almost a decade at this point, running about a step-and-a-half ahead of the automation reaper."
I mean, yes, that is the entire job description. If you are in IT, your responsibility is to learn the best technologies, and be continually re-evaluating what to keep of your organization's current and what to improve or replace.
That is why I wouldn't want to do anything else. I love learning new things and no profession offers more opportunities to learn new things than computer technology.
Automation replaces repetitive work with tooling and work that's more complex. Abstraction allows one to delegate to another for details, which may include choosing from a palette of pre-made options. Consolidation will come about as fewer independent players can sustain themselves in the market. Some will be out-competed by economies of scale, some will be starved by restrictions on intellectual property and lack of access to expertise.
This process has already played out for "small business websites", yet there's still lots and lots of web developers and web designers employed or freelancing. The current wave of WYSIWYG website generators is actually very good, and they have add-ons and integrations that make sense for their target market. But plenty of clients don't want to mess around in it, so they'd rather hire someone. This could be the maker of the generator, or it could be an outside consultant. In either case, the person brings judgement, experience, and creativity to tailor the deliverable to the needs of the client. These are skills resistant to automation, but not immune to abstraction and consolidation.
In the end, the antidote is the same as it always was: be adaptable, be personable, be resilient, and be resourceful. These are especially important if one is in a comfortable job shielded from most competitive pressure, because those people will be the most surprised and unprepared if their current employment is made redundant.
Manufacturers in the USA don't make cheap coffee cups. We don't make underwear. We don't make car fenders. We make warheads. We make gyroscopes. We make electro-mechanical assemblies that China or Malaysia or Mexico would screw up. We specialize in quality over quantity, and we specialize in cutting-edge tolerances and specifications. We make export-controlled things for enterprise contracts and the government. Things that require certifications to produce, and government regulatory compliance, and tight tolerances. Nobody here is making the 100,000,000 wrenches you can buy at Wal-Mart.
We don't make 100,000 of anything either. We make 100 gyroscopes for General Dynamics, or 5 jet engines for General Electric. We make US military grade munitions and weapons for the government. The author obviously doesn't realize that the company making the wafers for Raytheon ISN'T ALLOWED TO USE THE CLOUD. All that great automation that helps AirBNB function with no infrastructure is meaningless when you have to protect your IP from nation state actors. To probably >50% of American manufacturing the Cloud is useless. It's a consolidated attack vector that WILL be compromised in the future and lead to liability. Sure, you can put a NIST 800-171 or DFARS compliant business in the Cloud, but it costs extra and it's not worth the risk. You hear about misconfigured buckets leaking data almost daily. Nobody doing government manufacturing work wants to deal with that headache. In fact, I've been in this industry for 10 years and I have NEVER seen a DFARS compliant supplier with outsourced IT infrastructure. I've visited hundreds of companies over the years. What you're describing doesn't interest American manufacturers one bit.
This is probably going to change. People like you said the same thing about health data, and student data. The savings were so tantalizing that the regulators and stakeholders figured out how to make it work. What do you think GovCloud is for? C2S and "Secret cloud"?
Our university had a 3-4 person dedicated Exchange team. When "Google Apps" came out, people wanted us to switch to that from our old mail server stuff. Go figure, why would you keep using pine and squirrelmail when you could use gmail? "It can't hold student data" the IT team said, "it isn't certified for FERPA or ITAR." Okay, true. Fast forward two years, now Google's "Apps for Education" can deal with both. The switch was sudden and brutal and the university no longer has a 3-4 person dedicated Exchange team or an Exchange deployment of any kind.
There is a pervasive myth that servers run by private organizations are more secure than those run by the public cloud providers, and the opposite is actually true. Does your organization receive embargoed information from Intel to mitigate side-channel 0-days before they are publicly announced?
Lobbyists and fools can get past most logical objections and cloudify anything.
This is a blanket statement and it's wrong. Most cheap manufacturing is done overseas, but the US still has a large manufacturing sector that makes all sorts of crap.
Sterilite boxes are also made in the US -- cost a bit more than imported, but again, much higher quality.
It's a shame there isn't a "made in the US, and slightly more expensive but a lot higher quality" option for everything I buy, cause I'd do that in a heartbeat. I hate buying shit that breaks; waste of my time to even have to think about that stuff.
There are still people making nails in the US. Fertilizer. Food gets exported. Then there is all the stuff that's too expensive to ship: lumber, aluminum sheeting, cement ... lots of non-precision stuff is still made locally. Not every US factory makes munitions.
And some stuff is made locally not because of 'better' manufacturing ability but for speed. The fashion industry has to react quickly, quicker than overseas shipping can manage. I just ordered a small electronics assembly from a Canadian manufacturer not because they are the most skilled or precise but because they can chat with me on the phone and ship a small-run (5) faster than any Asian manufacturer. (It's a device for measuring laser energy at specific wavelengths but I have some specific needs re how the data is collected/displayed. It only took a 10-minute call to explain my issues and get a deal together.)
Almost everything in the fashion industry is made in Asia.
And few consumer goods (if anything) are "too expensive to ship".
>I just ordered a small electronics assembly from a Canadian manufacturer not because they are the most skilled or precise but because they can chat with me on the phone and ship a small-run (5) faster than any Asian manufacturer
Yes, but for anything at scale, they won't be the most competitive option.
Do you find this a bit strange, given that the Pentagon is making a massive push toward (presumably private) cloud infrastructure?
What you’ve seen in 10 years was reality during that time. That says little to nothing about the future.
Absolutely - if something is repetitive, it's a candidate for automation. This is true across all disciplines. Only the as-yet unautomatable human judgement, insight and communication is safely valuable.
On the other hand, "go away or I will replace you with a very small shell script" has been a BOFH joke since the 90s.
Literally had a business call to see if they could run another instance of $LOB_SOFTWARE because they had a person clicking a button for 40 hours a week.
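For what it's worth, the "very small shell script" replacement is about this small; a toy Python sketch, assuming the button always appears at a fixed screen position (pyautogui, hypothetical coordinates):

    # Toy sketch: replace a human clicking one button for 40 hours a week.
    # Assumes pyautogui (pip install pyautogui) and a hypothetical fixed button position.
    import time
    import pyautogui

    BUTTON_POS = (640, 480)  # hypothetical screen coordinates of the button

    while True:
        pyautogui.click(*BUTTON_POS)  # click the button, just like the human did
        time.sleep(5)                 # roughly the human's cadence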
Often it's not easy to make those decisions through code. That's because the person making those click decisions has a lot of tribal knowledge of the business situation at hand, which triggers only when the situation presents itself.
But if a human is basically a meat-robot doing programmable tasks, that is not even an automation issue. It's really more of a management and planning problem. You are supposed to fire the people running the business in those cases.
That's pretty much the business model of the company I work for. We have what is essentially a call center using custom CRM/workflow software doing work that used to be done by many times more people with much more training. Instead of 5 licensed and well paid people we do the same work with one lower paid and less trained person assisted with software.
Humans are still involved to make decisions when needed and for communication, but the simple and clear stuff is automated so we can hire pretty much anyone.
It's good for our clients since we are cheaper and more efficient, but less good for the workers we're replacing who will need to find new marketable skills or fall down the ladder.
The problem isn't the technology, it's the complexity of the customer's business requirements, and their nearly complete inability to transfer those requirements into software without complex implementations that they could never hope to implement themselves. I would love to see more tooling to help with this. I have been waiting for 25 years. It gets better, but not nearly what can be described as an apocalypse.
I wouldn't mind if they even had "those requirements". That would be a huge step up. Oftentimes the requirements are not written down and are stuck in tribal knowledge. And woe betide you if the tribe is an outsourcer or offshore team. About half the time, they're unintentionally leaving knowledge in the heads of their meat-robots, and getting the information transferred out of those heads is painful and time-consuming, because "set of procedures for meat-robots" is effectively what they're hired for.
If the procedure is periodically performed, I often get pretty good results just asking for copies of the resultant emails reporting completion, and working backwards from there. Automation at this layer of staff work is considered exotic developer-realm, needs-a-budget-and-a-project-manager effort, even for what most HN readers would consider relatively trivial multi-hour or multi-day scripting work.
The abruptness and agony of automation sweeping through these layers in the upcoming years, as tooling emerges to discover, capture, distill, and maintain these requirements in tight synchronization with the software teams maintaining the code behind the automation, is going to be politically challenging, because a lot of these people have zero notion that what they're doing can be automated away, even as they consume the results of ML in their daily lives.
And I'm rather glum about teaching these people how to perform the automation themselves. The reception I've gotten to my offers to help them get on the track to learning programming and automation has been very underwhelming. Even if someone doesn't "get it" about coding, just the exposure to the thinking patterns would help me enormously cut down on unnecessary meeting time, as there are still way too many people whose conception of automation is closer to "can't they/you just...[magic/mind-read]?"
I've been using a fullstack low code development tool for several years now, and when it comes to developing CRUD apps or data reporting apps (with charts, interactive, drill down reports, etc. etc.), it's astonishing how quickly you can stand-up a secure, fully responsive web-app, complete with authentication, authorization schemes, report subscriptions, etc., without writing any code at all.
And, when you bump up against the limits of the declarative/low-code aspect of the framework, you can toggle over to JavaScript, your own CSS, SQL, etc., so it's not like you paint yourself into a corner.
So, I agree, if Amazon creates something like this, and it is as good as some of the existing low-code tools out there, it's going to have a big impact over the long term.
In my earlier consulting role, and now as a CDO, it's my go-to tool for CRUD and data presentation apps.
(I don't work for Oracle.)
I wish I had the time to spare to build a PostgreSQL/Python clone of it.
Though SQL is certainly a form of code; it's just nice restricted domain-specific code that people without the title "engineer" or "developer" often know.
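For example, the whole "program" a report-builder user writes is often one declarative query; a self-contained sketch using Python's built-in sqlite3 and a hypothetical orders table:

    # Minimal sketch of report-style SQL, using a hypothetical in-memory orders table.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [("EU", 120.0), ("EU", 80.0), ("US", 200.0)])

    # The only "code" the report user actually writes: one declarative query.
    for region, total in conn.execute(
            "SELECT region, SUM(amount) FROM orders GROUP BY region"):
        print(region, total)  # EU 200.0, then US 200.0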
If there are IT jobs developing for smaller organizations, maybe those will go away, but... I think a lot of that disappeared already.
I'm close to retiring (from this job, anyhow), so it's not a personal issue for me. I just haven't seen it happening as described.
Let's say 10% of these working class people just need a push to discover they're actually decent at coding/devops/admin/etc. 3,500,000 truck drivers means an additional 350,000 people competing for IT jobs. Now factor in all the other people employed in these sorts of jobs. There's going to be a huge glut of incoming "cheap" labor when these sectors become more automated. I don't expect them to be competing with people like Linus, but they're sure as hell going to be competing with the neighbor's kid who went to a public 4-year school for a CS degree because the job prospects were good.
A huge number of "tech workers" today are from the same families that would be working class, factory workers, and so on just a generation ago.
There's no magical racial difference between working class people and tech workers.
Working class to me means Walmart, McDonald's, and Jack Generic Construction LLC.
Working class to most people just means that they do manual labor. They work outside and in factories building and moving physical things be it burgers or buildings or balloon animals. The key seems to be that they don't generally sit at a desk with a computer on it for 8hrs a day.
Well, neither do most tech workers. Why’s there another high profile breach seemingly every week? Why is software generally full of bugs? Because the devs lack basic competence at their jobs. Maybe they should have been flipping burgers instead. But RoR, Node.JS et al dumbed things down so much we got brogrammers...
This was the fear with "India" in the 1990s, and how it was going to kill all of our wages.
I make a lot more money now than I did then.
The reality is, you're going to have a million monkeys hitting a million keyboards, and very few will be producing Shakespeare. All of that crap will be consuming lots and lots of AWS/Azure/etc bill.
You'll need way more IT people to rationalize it. There are tens of thousands of people in the United States whose purpose for the last decade has been re-implementing the 90s version of this in formal IT systems. You will have churn as we purge the legacy staff, especially windows click to admin types.
In web development this is most apparent (to me) in SAAS application development, where many/most of the underlying pieces of building a CRUD application that can scale to thousands of users, and be really functional are now provided by other SAAS apps which provide a _better_ service than the average developer can scrape together themselves.
Billing -> stripe.com over writing against the gateways directly (see the sketch after this list)
Database/Hosting -> Heroku PostGres/Redis and compute
Email -> Sendgrid, Mandrill, ActiveCampaign
Or even just SAAS frameworks like BulletTrain (Rails) or Laravel Spark which dramatically cut down on the boilerplate and integration code you'd have to write.
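To make the billing example concrete, a minimal sketch of what "Stripe over writing against the gateways directly" buys you, assuming Stripe's Python library and a test-mode key (the parameters here are illustrative):

    # Minimal sketch of the Stripe case above; assumes `pip install stripe`
    # and a test-mode API key. One call replaces a hand-rolled gateway integration.
    import stripe

    stripe.api_key = "sk_test_..."  # placeholder test key

    intent = stripe.PaymentIntent.create(
        amount=2000,        # $20.00, in cents
        currency="usd",
        payment_method_types=["card"],
    )
    print(intent.id, intent.status)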
Sure, some of them will still have positions in the same or similar roles, but there will be a crunch. The large outsourcers (Wipro, Infosys, etc.) will be hit overseas, but it will also impact administrators at medium-to-large-sized businesses in typical American cities, as Forrest mentioned. The worst part of all of this is that too many colleges, and especially technical colleges, are still teaching networking and Linux or Windows administration as if graduates will be able to have a lifelong career in it. That is no longer true.
I don't want to imagine what it'll be like for those students who graduate, get good jobs (now), a mortgage and start to raise their family only to find themselves unemployed in the middle of their lives. I don't expect much sympathy from the largely meritocratic tech industry or anyone else.
As for myself, I already work for one of the big three and am a part of many "cloud" migrations. I should be okay, but at the same time I am somewhat conflicted. Am I going to need to go back to school for Computer Science and become a fully-fledged actual software developer? I mean, it's fine, there's still enough time (I don't think we will really feel the burn for at least another 4-6 years), but is it reasonable or realistic that everyone needs to be a rockstar developer?
I don't think we're anywhere near an "IT apocalypse". I think we're more likely to put a ton of machine learning engineers out of work long before companies start needing fewer help desk technicians and sysadmins. I think a lot of people have moved to the cloud only to discover they needed just as many people to help manage their cloud presence as they needed to manage their on-prem hardware.
The author does touch on this, though, by highlighting that you will need fewer and fewer people. As more services move into 'cloud' solutions, it frees up time for those people, and they'll step into those new spaces.
Yes, someone still needs to manage all of this, but you need far fewer people. Or you ship these "trade jobs" to low-cost areas like India.
The main thing we moved to the cloud where I work has us receiving and handling the same number of support tickets as when it was on-prem. The difference is, now there are some tickets we can't fix and have to wait on the cloud provider for. Service is worse, and it doesn't really save us any time.
A lot of cloud solutions offer an on-prem option. The tools are the same, it's just a matter of it running itself in the building or running itself somewhere else. A lot of times, running something on-prem means spinning up literally the same software you could have them host for you.
(Also: Windows Updates also aren't some crazy painful manual process that Intune fixed. You can just tell WSUS to approve everything automatically if you want, and just as similarly, you can manage Intune more granularly which takes up your IT staff's time and effort.)
If you're simply running EC2 instances with your same off the shelf software, you're not doing it right, but you'll still eliminate your entire datacenter physical facilities team and server install/rack & stack/replace failed disks team.
If you do it properly, it's incredible what you can do. I have a client with applications running in Ireland, Frankfurt, Singapore, Tokyo, and the US, totaling around 50 EC2 instances running containerized workloads that automatically heal, APIs that are accessible globally and won't go down unless 6 AWS regions simultaneously fail, about 30 static websites, DNS hosting for a dozen domains, monitoring, auditing, and log analytics for all of the above. I set it up in about 3 months as a single engineer and manage it all with about 8 hours a week of total effort. The cost to my client is basically the same as hiring a single senior engineer, but they're running infrastructure that would have taken a team of 3 shifts of IT professionals without the cloud.
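As a flavor of how one engineer keeps tabs on a fleet like that, a minimal sketch assuming boto3 and already-configured AWS credentials (the region list mirrors the deployment above):

    # Minimal sketch: enumerate EC2 instances across the regions the client runs in.
    # Assumes boto3 (pip install boto3) and AWS credentials already configured.
    import boto3

    REGIONS = ["eu-west-1", "eu-central-1", "ap-southeast-1",
               "ap-northeast-1", "us-east-1"]  # Ireland, Frankfurt, Singapore, Tokyo, US

    for region in REGIONS:
        ec2 = boto3.client("ec2", region_name=region)
        reservations = ec2.describe_instances()["Reservations"]
        count = sum(len(r["Instances"]) for r in reservations)
        print(f"{region}: {count} instances")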
Haven't worked on Workday and the like yet to understand how "Cloudy" they truly are.
That's going to be a long time being automated.
For instance, if you model insurance data in the EU, you cannot use gender as a factor in pricing (even though it's effective).
In general, the modelling/ML pipeline is the easy bit, the hard part is the data cleaning and figuring out how to translate a business problem into one that can be solved by data.
tl;dr as many people have said about Comp Sci over the years, learn the fundamentals (statistics and experimental design) and you'll be in a much better position.
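To illustrate the "pipeline is the easy bit" point, a minimal sketch assuming scikit-learn: once the data is clean, the modelling step is two lines, and everything upstream of X and y is where the real work lives.

    # Minimal sketch: once data is clean, the modelling step is trivial.
    # Assumes scikit-learn and numpy; the "cleaned" data here is synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    X = np.random.rand(100, 5)       # stand-in for a painstakingly cleaned feature matrix
    y = (X[:, 0] > 0.5).astype(int)  # stand-in for labels

    model = LogisticRegression().fit(X, y)  # the "easy bit"
    print(model.score(X, y))                # the hard part was everything before this file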
I would be very careful before dismissing AWS’ no code/low code project. Mulesoft, Microsoft Flow, and Zapier have a combined revenue of hundreds of millions of dollars a year in serving a market for establishing business logic workflow without code. I am only surprised it took AWS so long to move into the space considering the cross marketing opportunity to existing customers and their compliance capabilities.
What exactly do you mean by this:
'“senior application developers” who munge JSON in C#'
Why did you choose JSON and C# in particular here?
I really hate the current pattern of taking some interesting topic and nice data about it and burying it in 20 pages of storytelling journalism.
The real problem with these ready-made plumb-and-plug modules is that sooner or later they are either too slow, or expensive, or just a pain to refactor/redo. Eventually you come back and realize you need more granular control over things, and anything you are likely to come up with resembles a programming language.
I had this moment of realization myself while having to change a complicated graph in Pentaho Kettle a few months back. The graph looks bonkers hard and brittle; changing anything requires redoing all the dependent elements of the graph, and if you have a graph complicated enough, you will be forced to rewrite it. The real trouble is that there is no functional/unit testing with these things. And then you realize you are just better off with a full-fledged ETL language/programming language. The second problem I faced was running into performance issues. Want to change the sort algorithm? Running into heap space issues? Want better logging? Want a better threading model? All the best. Nothing is possible.
This is above and beyond the need for meta-programming facilities. At that point whatever GUI graph you draw is worse than any verbose code you will write.
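Contrast that with a plain-code transform, where a unit test is one assert; a minimal sketch with a hypothetical record shape:

    # Minimal sketch of why plain-code ETL is testable where a GUI graph isn't:
    # the transform is just a function. The record shape is hypothetical.
    def transform(record):
        """Normalize one raw record into the reporting schema."""
        return {
            "customer": record["name"].strip().title(),
            "total_cents": round(float(record["total"]) * 100),
        }

    def test_transform():
        assert transform({"name": "  ada lovelace ", "total": "19.99"}) == {
            "customer": "Ada Lovelace",
            "total_cents": 1999,
        }

    test_transform()  # would normally run under pytest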
Regarding programmable tools, we already have those. Vim, Emacs, and Microsoft Excel all give you a degree of meta control over the tool and what you want to do with it. But that's that, and it is often hard to bend these tools to your command.
These are just a few reasons why there won't be an apocalypse soon.
The job market will close up a bit, but right now tech is looking like the California gold rush, where 4-5 years ago any bootcamp grad could jump right into a web dev job (at least in my job market in the Midwest). I think if you continuously learn and remain marketable as the times change, then as a worker you will be fine. I also like the comment that mentions that you may just end up working for the cloud provider rather than the business application company.
I manage a machine learning team and I also think that at least partially automated data curation and modeling will reduce the number of people required in my field. It might take 5 or 10 years, but I think it will happen.
I think you are spot on that IT and devops will take a hit. I look more at Heroku’s model than AWS and GCP as the future. That said, AWS and GCP will keep getting more ‘Heroku-like’.
It’s a complete blind spot for most engineering minded people because they never realized how flexible the platform was, and with Bret Taylor running the show now, it’s miles away from just being a clunky Sales CRM.
A couple of examples of recent developments:
I was at a Money Transfer company and we couldn't move our Transaction Monitoring staff over to SF without hitting limits
I mean, if it doesn't, what the hell are you even doing?
The whole point of technology and modern capitalism is to increase automation, increase the amount produced by the same number of workers, and increase the overall amount of wealth in the world and improve overall living conditions for everyone (setting aside very important questions of distribution). I just find it odd people in the computer technology industry find this shocking or especially worrying.
"I look more at Heroku’s model that AWS and GCP as the future."
Google's App Engine was much closer to the Heroku approach, and the AWS approach won. So I will be pretty surprised if the Heroku approach wins out.
Better automation has been "reducing the number of people required to deliver technical solutions" for ages.
Local Area Networks replaced many mainframe computers in the 80's. Optimized C compilers took the jobs of countless Assembly programmers. WordPress, Joomla and better web frameworks (Django, Rails) took the jobs of many Perl/Web developers. Python enabled a lot of people to do what FORTRAN/Java/C++ programmers were able to do before.
"Apocalypse" is just the normal state of affairs.
Companies are always going to follow the latest trends and it's always going to take smart people to follow them. I'm not worried about my ability to make a living. I just can't wait to see what comes.
A couple years ago I made the switch to full time development. I now do most of the DevOps stuff for my teams, but from a developer role, instead of a sysadmin/cloudops role.
I'm certain that's going to be the future. Look at Google's requirements for SREs. They are full-fledged software engineers.
Services that run services is the way to go. Lose a box from the fleet? No problem, the operator service bounced it and got it running again. I’ll admit that timelines often mean you don’t get to build the grand vision out of the gate, but I absolutely agree with you that bash scripting isn’t how we should be running services these days.
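The core of such an operator is surprisingly small; a toy Python sketch, with a hypothetical health endpoint and restart command:

    # Toy sketch of an operator loop: health-check a service and bounce it when
    # it stops responding. The URL and restart command are hypothetical.
    import subprocess
    import time
    import urllib.request

    HEALTH_URL = "http://localhost:8080/healthz"
    RESTART_CMD = ["systemctl", "restart", "myservice"]

    def healthy():
        try:
            with urllib.request.urlopen(HEALTH_URL, timeout=5) as resp:
                return resp.status == 200
        except OSError:
            return False

    while True:
        if not healthy():
            subprocess.run(RESTART_CMD, check=False)  # bounce it, like the operator would
        time.sleep(30)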
People who are already on the cloud aren't going to ditch their service providers so they can take time away from their actual business to manage something they don't understand to save a few hundred dollars a quarter.
Finally, programming hasn't changed substantially in the last 15 years for most people. I know that might sound shocking to people on HN, but most developers are working in dingy cubicles on archaic systems without continuous integration or release management. They're deploying to production and then jiggling the handle until things work. The systems they produce are just good enough to keep other business units crunching along, and their saving grace is two-fold: they don't cost enough to warrant real scrutiny, and the potential utility of an efficient IT department is non-obvious to most business managers.
Bash scripts are programs.
There is something to think about. The folks who are running things are plentiful right now but as this churns we will run out of folks who understand the lower layers well enough to manage it.
As I interview people, I run into this for things like SRE and DevOps already.
Having seen the inside of so many places though, it also continually surprises me how many companies are all struggling with the exact same issues, to the point I'm highly tempted to try to pitch a whole IT solution I've been thinking on for years at the next YC.
One thing I've noticed on HN is that too many people tend to think of all companies as SV software startups, when there is a huge swath of companies in-between coasts that don't have a single dev or programmer and are just doing their business. Much of the "devops is killing sysadmin" hysteria is overblown due to this filter bubble.
Selfishly, it also makes things harder for me, as somebody who is a generalist's generalist, because people legitimately do not know what to do with somebody who's built and shipped mobile apps, can drop into a new piece of backend software and rapidly get up to speed, will architect, implement, and manage your cloud environment, and has hard-won opinions about rack cabling techniques. But I do OK regardless.
There is also the problem that if IT is far away, the smallest problem needs "tickets" and days to solve, and it will never be possible to automate something with a self-made script, because you're just a user and have no rights.
(Yes, I'm a bit frustrated because a lot of these things are happening at work at the moment...)
Technically minded, astute engineers are great at solving problems. In an ideal world, we'd be using our ability to solve complex real-world problems. I'd much rather the Golang-engineer equivalent 20 years from now solve critical water supply issues for a village than write APIs with flame graphs. Full disclosure: CS engineer by love and training.
It will take some time, but I think these providers will basically displace advanced ML knowledge workers, especially at large fortune 500 companies. IMO to stay ahead of the curve, data scientists need to pick up more business knowledge and move into a business/financial analyst role.
This is absolutely spot on. It's our company's bread and butter to automate repetitive human operations. I can tell that many companies are looking for solutions that would automate as much human work as possible. Repetitive routines are the typical candidates to replace with software or less skilled personnel.
To be fair, this is sometimes trickier because a lot of "devops engineers" are actually mouse-driven system administrators, so the means and medians often look odd. That somebody who does what I do, and somebody primarily doing things hand-o-matically, would have the same job title throws a wrench into one-to-one comparisons. But you can figure it out.
In fact, the steady increase in both the use and the diversity of technology means we'll need more people than ever in IT. The job title might change, but there will always be a department dedicated to fixing shitty technology and helping users use it.
Comparing this to factories shutting down is hyperbolic. I think what we’re about to see is more in line with a job market that rewards things other than specific application knowledge.
However, I can’t say much in defense of the mid-level IT Pro, because just this past month I went from considering a server running an OS and a MySQL db somewhere, which someone would have to maintain, to using AWS DynamoDB with API Gateway and automated calls to Lambda functions - I can completely get rid of our server and stop paying a mid-level IT Pro to maintain it. For our usage it’ll be practically free now, where I feel like I would have been paying a lot of “enterprise IT” overhead before.
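For a sense of scale, the whole "server" in that setup collapses to something like this Lambda handler; a minimal sketch, where the table and field names are hypothetical (boto3 ships in the Lambda runtime):

    # Minimal sketch of the serverless setup described above: an API
    # Gateway-invoked Lambda reading from DynamoDB. Names are hypothetical.
    import json
    import boto3

    table = boto3.resource("dynamodb").Table("items")  # hypothetical table

    def handler(event, context):
        item_id = event["pathParameters"]["id"]  # API Gateway proxy event shape
        result = table.get_item(Key={"id": item_id})
        return {
            "statusCode": 200 if "Item" in result else 404,
            "body": json.dumps(result.get("Item", {}), default=str),  # default=str handles Decimal
        }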
That doesn’t mean that guy is going to starve, he will have to either adapt as factory workers have or be valuable in a different way than “I’m good at something people don’t really need anymore”. This is as old as time.
Now that isn't the case. The traditional system admin will need to adapt, but adaptation looks like going back to school for software development.