The creeping IT apocalypse (forrestbrazeal.com)
324 points by forrestbrazeal on Jan 17, 2019 | 277 comments



This kind of change has also impacted me. It shows up when I'm trying to give students advice about starting their careers, and I realize that the first jobs I had (system administrator, SOC worker) have been replaced by robots. Especially in the SOC, I was a "Tier 1" analyst who would do monitoring (watching a bank of green lights waiting for one to turn red) and first-level triage and analysis. This has been replaced by ML-driven data processing systems.

So I think the apocalypse is double-bladed: while automation kicks a bunch of current workers out by making them immediately redundant, it also freezes out the next generation by removing entry level jobs and not really replacing them with anything equivalent. Meanwhile, universities and vocational ed programs won't get this memo for another ten years so they will continue to happily propel waves of students onto a set of closed and locked doors.


The pessimists' view is that automation will deprecate a heap of jobs in the tech industry that will never return. The optimists' view is that automation simply allows companies to do more stuff: things they couldn't afford to do before, and soon, things they have to do in order to stay competitive. For the optimists, the number employed in the tech industry stays the same or increases, but the proportion of different roles changes (i.e., no more green-light watchers).


New college grads today aren't prepared to do anything more complex than watch the green lights. If we automate out all of the entry-level work, that means we have to train workers to a higher level before they enter the work force, which is clearly untenable with today's college tuition (at least in the US).


That's demonstrably false. Hell, you can just look to startups created by new grads and dropouts to confirm your viewpoint is inane. Today's college grads are better equipped to do a wider breadth of work than at any other time in history, precisely because of the efficiency gains of automation.


A minority exits college ready to run a significant startup, but the majority exits college ready to struggle finding a mediocre job. In other words, the automators and the automated away.


Sure, there are outliers. But not all college grads are capable of operating at that level, which is why companies have a lot of regimented hiring practices and tests to filter out all those not making the cut.

I've been on the college recruiting circuit to help my company hire, and I'm often severely disappointed by a good portion of those I meet. There's occasionally the standout who really impresses me, but then I think they're never going to want to stick around at my place. With this thought, I'm sort of in agreement with the commenter who said that we need college to train people at a higher level, since there's almost no time on the job to ease into it.


>There's occasionally the standout who really impresses me, but then I think they're never going to want to stick around at my place.

Maybe companies need to stop thinking in terms of employees sticking around for a really long time, and get used to the idea of employees going from place to place when they get too bored or want to do something different. It seems insane to me, the idea of expecting an extremely intelligent, high-performing person to want to come to the same workplace day after day, for years or decades, doing mostly the same work.


> There's occasionally the standout who really impresses me, but then I think they're never going to want to stick around at my place

This is the paradox of hiring today, and why tech hiring is broken. In fact, there's a story right now titled "Hiring Is Broken" on the HN front page, not far below the OP. So it's getting harder to recruit people who do make the cut technically and communication-wise, while at the same time it's getting harder to retain them, because, let's face it, so many startups don't have a compelling value proposition or profit model.


From your perspective, what is the issue with a lot of the current college graduates? I'm a college student studying computer science, and I feel like I'd be ready to start work really soon. I'm also the sort of CS student who reads Hacker News, participates in CTFs, maintains a perfect GPA, has side projects, etc.


To be a good developer, you have to understand how computers work. You have to understand data structures. When you interview and you're asked to write a breadth-first search, or what the big-O complexity of accessing a hash map is, it's not because the company is going to have you writing your own custom hash maps right out of school. It's because you have to be aware of the general characteristics of the tools you're using to be able to select the right tool for the job.

Most of the students I interview fail miserably at this. They can hack together a working application by copy/pasting from examples and SO posts and making small modifications, but they have no fundamental understanding at all of what the computer is actually doing with the code they write.
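
For the curious: a minimal sketch of the kind of answer I'm probing for (my own illustration in Python, assuming an adjacency-list graph, not anything we actually ask candidates to type out):

    from collections import deque

    def bfs(graph, start):
        """Breadth-first search over an adjacency-list graph (dict: node -> list of neighbors).

        Visits each node and edge once, so it runs in O(V + E) time. The deque gives
        O(1) pops from the front; a plain list with pop(0) would degrade toward O(V^2).
        """
        visited = {start}      # a hash set: O(1) average-case membership checks
        order = []
        queue = deque([start])
        while queue:
            node = queue.popleft()
            order.append(node)
            for neighbor in graph.get(node, []):
                if neighbor not in visited:
                    visited.add(neighbor)
                    queue.append(neighbor)
        return order

    # Example: a small diamond-shaped graph.
    graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
    assert bfs(graph, "a") == ["a", "b", "c", "d"]

The point isn't the code itself; it's being able to say why the queue and the set are the right tools here.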


I feel I'll be fine, this is the stuff I love. I'm incredibly interested in understanding how things work, to the point that I've considered changing my degree to maths/physics/compsci. With zero revision I could explain the hash map, its big-O, how to implement one, use cases, etc. I figure that once you understand something it simply makes sense. I haven't looked into it but I'm sure there are some interesting statistics on the probability of collisions, optimisation, etc.
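
A quick back-of-the-envelope on the collision point, just to illustrate the kind of thing I mean (a Python sketch; the numbers are arbitrary):

    import math

    def collision_probability(n_keys: int, n_buckets: int) -> float:
        """Approximate probability that at least two of n_keys uniformly hashed
        keys land in the same bucket: P ~= 1 - exp(-n(n-1) / 2m)."""
        return 1 - math.exp(-n_keys * (n_keys - 1) / (2 * n_buckets))

    # Even 1,000 keys spread over 1,000,000 buckets gives roughly a 39% chance
    # of at least one collision, which is why hash maps chain or probe rather
    # than assume keys never collide.
    print(round(collision_probability(1000, 1_000_000), 3))  # ~0.393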


It sounds like you're on the right path, combined with a decent personal passion project and some summer internships.

Some things I see that give me pause: 1) No project work. Lack of interest in building things on their own, researching frameworks, building out small apps, etc. I'd like to see even a small attempt at learning tooling and frameworks used on real-world projects (it doesn't even have to be close to the tooling we're using, just anything).

2) Lack of reading technology-centric, software-engineering-centric internet sites. Even attending a Meetup or two to start seeing what's going on outside academia (again, I live in a tech hub, so lots of opportunities).

3) Just a general feeling of "hey, I got this CS degree, I'm ready to work," but not really showing much enthusiasm that they actually want to be software engineers as a career (it's a tough career that requires a lot of self-driven learning and curiosity to do it well). Believe it or not, I've seen students come through summer internships and decide they actually don't like real-world day-to-day software development as a career.


I do actually have projects; I mention that in my post. Which websites would you recommend, though?


I wouldn't recommend any in particular; let your interests guide you. If you're into front-end development, there are probably some bloggers who write about your favorite frameworks, or your favorite language; if you're interested in enterprise dev, you might follow InfoQ or Fowler's blog; with AWS you might read Cloud Guru's weekly updates on AWS, or Werner Vogels' blog; you might be interested in startups, so you read one of the many VC blogs, etc.

I don't really have any preferences with what you're reading. Just the fact that there's some part of technology that fascinates you enough to read further, on your own in a self-directed way. It shows curiosity and initiative. I always like this part of my conversations because I often get to learn something new myself.


> has side projects

That's your biggest asset. The perfect GPA will help you get past the useless HR gatekeepers, but a demonstrated ability to build something will mean a lot more to the technical interviewers. Bonus points if those projects happen to use common industry things like an SQL database and contain unit tests.

Most graduates, if you sat them down with some fairly simple requirements and said "build this with whatever tech stack you're familiar with" would have no idea where to start.


What you're describing is _training_, and the act of training employees has long since gone out of vogue in Western markets.


I've worked in IT-type jobs for over two decades. In some sense training would help, but to be frank, the code and systems IT folks have built are of such low quality that it's better for everyone if they're handed over to more competent teams that can cost-effectively maintain them long term.


One reason why training is out of fashion is because of poaching. Companies don't want to invest a lot of money in training their employees only to have their closest competitors become the beneficiaries. That is a myopic view, to be sure, but it's a common one.


I think they do train people, it's just not formal. The other issue is that companies have no clue what they're doing most of the time anyway, so you just end up with IT people on the market with all sorts of nonsense opinions.


If you increase competition for jobs you can push the cost of training onto employees.


Why train employees when you can pay just a little bit more and get pretrained ones?


For when you can't find any more pretrained ones. Or to get someone for a lower salary and grow them, which builds some loyalty and helps solve the problem of not having enough pretrained people in the first place.


There are internships.


I was watching David Bull talk about historical Japanese wood carving. He described how the introduction of the printing press to Japan killed the entry-level apprenticeship positions in printing.


Hello fellow David Bull fan.


I am worried about the latter aspect as well: in order to keep automation going, very advanced developers have to be involved; if the whole "easy job" ecosystem disappears, there won't be any reasonable way to keep developers progressing, with the best in the competition filling up the spots as "cognitive automators".


In the absence of on-the-job experience/growth for less-experienced developers, it forces them back into more academic training programs. I personally think this is likely to lead to exacerbated "degree inflation", where MS degrees will be the new minimum expectation for these new "entry-level" (read, Tier 2+) jobs.


A significant number of companies that can benefit from automation likely won't need to continue to automate indefinitely. I work in software development/automation, and in my industry it's more about finding and configuring a framework that enables business users to configure software systems than about automating everything possible.

Even relatively rote software development that involves embedding business logic into a software system is likely safe as long as the cost of continuing to develop that software is close to the cost of switching to some other framework. It's when the cost of development is much greater than the cost to try another framework out, or when a company wants to expand something and doing so on the development side would be cost prohibitive, that someone's job could be on the line.


Maybe we'll see a more apprenticeship type approach, where junior personnel are instead assigned to and trained by seniors. This would probably be a net good, but who knows how things will shake out.


I've thought about doing this at work, but it's hard to figure out how to make it attractive to my company. If I ask to hire a junior dev for the primary purpose of training them, it seems likely that I could get them full time for 50% of my salary. They're going to be a little bit productive, but they're also going to take up a lot of my time. At best, I think you end up with 150% of the labor costs for the exact same amount of work being done. That's a really hard sell to the business side of the company.


In the UK we have the 5% club [0], whose aim is "to make at least 5% of its employees apprentices within a 5-year period". It does take commitment from the company, and it's perhaps not a surprise that it's more common in companies that already invest in graduate recruitment. 280 companies have signed up so far.

(Disclaimer: I work for a company that has been in the 5% club since 2013)

[0] https://en.wikipedia.org/wiki/The_5%25_Club


I mean, this is already an acknowledged good approach. Companies don't want to make that investment, though (hiring both a senior and a junior who won't be immediately productive), and keep chasing the mythical senior who will work for a mid-level salary.


A lot of companies hire interns, and usually the typical intern doesn't provide a ton of bottom-line value. Aside from being good for the culture and mentorship practice for the more senior folks, they usually need a lot of handholding. They're basically doing paid apprenticeships, which tells me that companies _are_ okay with the concept of apprentice-style training.


The whole internship thing for STEM careers is to onboard you early while you are still cheap. Even if you don't work for the company you interned with, the progress is still there, so on day 1 you aren't totally lost. No one wants to pay a salary to someone who hasn't worked a day in their life, or even spend the money on the hiring process, because they have zero clue how well that person gets along in a work environment vs. an academic one. From an HR and hiring-manager viewpoint, even if you have a lot of experience with a certain tool or field but no work experience, you are going to look a lot less desirable than someone who has maybe a little bit of experience in that field or with that tool but mostly has unrelated internships. There are lots of smart people out there who can't get along with anyone, so they suffer in their careers.


It's cute that companies think juniors or seniors will be immediately productive. There is a ramp-up period. Hell, the first day is probably just paperwork. Then three months of getting integrated.


I'm not sure how many companies really think this, not that I'm arguing the point. I do know that successful companies realize that developer ramp to productivity time is an important metric and attempt to optimize that metric.

There's a whole set of fairly basic tasks that can be done that can get a developer to the point of submitting a PR on their first day. Many, many companies screw that up.


That's shitty of the company, then. Senior engineers work way more efficiently if they have a junior working under them to crank out busywork. The junior then knows exactly how to succeed in a senior-level position from working intimately with the senior engineer.


Companies being shitty and prioritizing quarterly margins over long-term success is like the entire basis of the modern economy


Why does watching a bank of green lights, waiting for one to turn red, and doing first-level triage and analysis require ML? Isn't this just a bunch of rules?


I was over-simplifying to communicate the repetitive and dull nature of a job I'm very happy to no longer be doing. Really the game was more about "anomaly detection" and the sensor indicators/measurements were far more continuous than categorical, and their outputs had to be weighed against past experience and context of the monitored components.


Well, to start with, you have to use ML to train the computer vision system to recognize green and red lights. After that you update your resume with the ML experience you gained. :)


I want to ask a serious question about the ML, but your answer is great! My experience of anomaly detection, even with deep networks, is that recognition of the important anomalies can be done for some of them, which means you can raise a big flag for those, but someone still needs to sift through everything else.
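
For a toy illustration of that split (flag the obvious outliers automatically, leave the rest for a human), even a dumb rolling z-score does the "big flag" part. A sketch in Python with made-up sensor data, nothing like a real SOC pipeline:

    import statistics

    def flag_anomalies(readings, window=20, threshold=3.0):
        """Flag readings more than `threshold` standard deviations away from
        the mean of the preceding `window` readings. Flagged readings get
        escalated automatically; everything else still needs human triage."""
        flagged = []
        for i in range(window, len(readings)):
            baseline = readings[i - window:i]
            mean = statistics.mean(baseline)
            stdev = statistics.stdev(baseline)
            if stdev > 0 and abs(readings[i] - mean) / stdev > threshold:
                flagged.append((i, readings[i]))
        return flagged

    # A flat sensor with one obvious spike: only the spike is flagged.
    sensor = [10.0, 10.2, 9.9, 10.1] * 10 + [42.0] + [10.0] * 5
    print(flag_anomalies(sensor))  # [(40, 42.0)]

The gap is exactly what you describe: the borderline stuff that isn't three sigmas out still lands in someone's queue.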


The one entry level IT job that is not going away is tech support. Sure, some parts of it can be outsourced, but beyond a certain point you need a person on-site to figure out why the Internet is broken.

The Cloud companies are also hiring armies of support people, and it's a great way to kickstart your career in any of these companies while getting company-provided training in the tech.


Turn your products into services, hardware ownership into leasing, and suddenly you don't need to offer tech support. SLAs will establish the new "laws of physics" for users, where failure is a binary state: either it works, or it doesn't. When it doesn't, someone will come in a couple of hours/days, trash the broken black box, replace it with a working black box, and things will be back to normal.

Of course, the service provider may need some number of people figuring the failures out, but that number is smaller than if customers had to debug their own problems, and the work is more susceptible to centralization and to "fixing by replacing".


Not with AWS Outposts, and you can bet Google and Microsoft won't be far behind.


"So I think the apocalypse is double-bladed: while automation kicks a bunch of current workers out by making them immediately redundant, it also freezes out the next generation by removing entry level jobs and not really replacing them with anything equivalent."

Agreed entirely. I would be surprised if we still have two-year technical degrees in a decade.


I can't find the article any more, but I think Toyota has its engineers build a handful of cars by hand as part of its kaizen process. This would act as an entry point for the next generation of engineers.


How many solid universities still have IT programs? At UIUC as of 2016, the IT program was nonexistent.


Two points: first, even if "solid" universities are right to run away from this (and IMO they don't run away from it, they just move the IT program into the business school where you don't see it any more), there are still many other universities pumping out students into a dead career field, which should be concerning. Second, this is not just about IT programs. Computer science programs are impacted too. From the OP:

> But instead of five backend developers and three ops people and a DBA to keep the lights on for your line-of-business app, now you maybe need two people total.

All nine of those people would probably have been CS graduates, or at least many of the backend developers would be (and perhaps the DBA). Or they would be people that thought of themselves as "developers" and not "IT" for whatever that distinction is worth now.


Except that there are more developers than ever developing more software than ever. Instead of one business app with five backend developers and three ops people and a DBA you have five business apps each with two developers.


Interviewing college graduates is depressing. No way the current education system will create people competent enough to keep this going.


Don't generalize. Are you interviewing graduates from the top 10-15 programs or lesser-tier?

I think this is a big error people make. Some colleges are great. Others are horrible. It's very hard to say, "College is a waste" or "college is great".


It's not even a matter of tiers. There are plenty of great candidates coming out of lower rated schools, although the hit rate is lower.

In general interviewers just have unrealistic expectations for entry level candidates. They forget how incompetent they were at the same age. Or they have ridiculous notions that everyone should know how to write a quicksort algorithm or whatever, when some students may have focused their studies on other (but equally challenging) topics.


I expect you to know fundamentals about things like TCP networking and how to install, configure, and manage Linux distributions.

I find that many people who can install Ubuntu and run the canned commands, curl foo.sh | sudo bash, or the docker/k8s scripts they download think they know what they are doing.

They don't.

It's getting to the point where someone who can install Windows is more technical than someone who can install Linux.


Your expectations are unrealistic and counterproductive. Bachelor's degree programs shouldn't be teaching students how to install Linux. That's just job training, not education.


Agree, installing Linux is IT, not normally in the background of CS. It's not that hard; devs could figure it out. Yes, I built my own machines because that was a lot cheaper, but no one taught me that in my CS program.


When the job is managing Linux systems and colleges aren't producing people who know how, there is a problem.


Bachelor's degree programs aren't intended to provide job training on narrow technical skills that will probably be obsolete in a few years anyway. If you want entry level Linux sysadmins then hire college graduates with the right aptitude and train them, or recruit from community colleges which do provide technical job training.


I did the interviews in Silicon Valley. MIT, UCLA, Stanford, and Berkeley are some of the schools whose grads I interviewed.


What do you mean by an "IT program"?


My community college has two different departments: CIS and CIT. CIS is more akin to "computer science", and heavily covers various programming languages and has a game development sub-program. CIT includes hardware troubleshooting and repair, certification classes for Microsoft, Cisco, and VMware, network administration, and has a subprogram in cybersecurity. Much like in the business world, academics now do treat development and IT as two separate fields.


The community college also has an automotive repair program. You'd be hard-pressed to find a decent school with an IT program, or any other of these vocational programs, at its main campus.

That’s because it trains for the job, not for the field. While you’d probably get up and running easier with an IT degree since you know the current tooling, you’d be worse off than someone with the conceptual knowledge that comes with a more general CS degree, and you’d therefore have a tougher time adapting to whatever new technology that didn’t exist in your IT program but was touched on conceptually in the CS coursework.


My community college is extremely "decent", thank you. In most cases, other than needing to check off the "have a bachelor's degree" box for job application purposes, most people will probably get more bang for their buck in a community college than they ever will in a fancier school. Depending on what your local community college offers, there's a good chance that for a fraction of the cost, you can pick up nearly anything you'd want to know (or just like to learn, at that price).

As someone who's taken a fair number of IT degree classes, I'd say there's a fair bit of conceptual knowledge involved. And in the case of networking, for example, most of the standards and protocols you're being taught how to work with have been around since the mid-80s and aren't showing significant signs of going away any time soon.

I'd say the CS vs. IT split would probably surprise you. I have gone up to the bachelor's level in a game programming degree, and it was amazing how people who were proficient in writing C++ couldn't handle basic PC troubleshooting; it's a different skill set entirely.


I went through community college over a decade ago, and I've spent a depressing amount of time since teaching people with CompSci degrees about CompSci concepts.

I've also run internships with CompSci graduates and have to say they're basically unemployable when they graduate: they might know some theory, but they can't build anything. Community college teaches you to build things, so you come out with skills relevant to the workplace, and you can fill in the CompSci stuff later.


Europe does. ;)


My bad, I should've clarified "IT" in the NA sense.


I have not found one yet.

My son currently is enrolled in CS and his first 2 years of school are filled with humanities, history and a few more irrelevant courses all to keep some profs employed. His next two years will be filled with more useless courses, and by the time it's all over it will have cost $50k+ (he lives at home and goes to a state school).

I feel like he could have taken a 6-week Java/Python/whatever course and gotten more out of it. Add a CCNA/CCNP for the networking knowledge, a Linux cert, a security cert from SANS, and some self-study, and he would know more than a 4-year degree teaches and be better prepared for the working world.

Universities in the US are all about making money, supporting football and athletics, tenure for the profs, and, finally, accreditation, all for $50k+.

Meanwhile, tens of thousands of H-1Bs are needed because our kids know nothing and are being taught shit.

Two areas that need major change and disruption, Education and Healthcare, everything else can wait.


I went to a state school. Their curriculum is easily and readily available online. It's nothing like what you describe. On top of that, you could have seen that and known it was the case before you sent him there. I'd really like to see the curriculum for the school you describe, I have a hard time believing it could be quite that different.

http://www.cse.uconn.edu/wp-content/uploads/2017/04/Selectio...


I understand some of your concerns, but at the same time, as you point out yourself, that's why there are choices available:

1. If you want a very specific skillset to do a very specific job, there are more opportunities and options today than ever before. Self-study, MOOC, bootcamps, certs, etc. Pros: fast, efficient, focused, practical, immediate. Cons: the specific/narrow focus may leave you with gaps you won't even be able to appreciate until too late.

2. If you want a more general education, universities are there to provide it. You'll get not just immediate hands-on-keyboard skills, but a math and CS-theory background, and also communication skills, discipline, diligence, social networking, the perspective to be a team lead one day, etc.

Now, I do believe universities have a LOT of optimizations to make; a student's life tends to be sucky in many ways it doesn't need to be. I've repeatedly found and heard of the difference in attitude between a college/bootcamp's "You're the paying customer, we'll provide knowledge" and a university's "you are irrelevant, be grateful, and jump through the hoops for the privilege", whether from the ever-increasing admin/bureaucracy cohort (sometimes helpful, often power-blinded), the obscure rules and difficult processes, or some of the tenured professors. But again, the information is out there, the choices are available, and overall there's never ever been a better and easier time to acquire knowledge.


>humanities, history and a few more irrelevant courses all to keep some profs employed

Full transparency, I hear this from nearly every 1st year college student every fall semester - either the CS, CHEM, or Engineering students exclusively. "Why do I have to take English, I'm just going to work with [chemicals] [computers] [software] [roads] [whatever else]"

Not sure how this opinion will fly on this site, but I'm not sure what you expected. It sounds like you have an ax to grind with a specific institution, and you needed to do more research about the system of universities overall. They were and are designed to make a modern version of a renaissance wo/man, the idea being someone good at or knowledgeable about everything: making citizens who are more than just one-skill cogs, teaching critical thinking and higher-order thought processing. They were not, and are not, job placement agencies.

If you were looking for nothing but the certs and technical skills, you should've sent him to a technical/trade school or community college. That's why those exist.

There is a massive difference between being job-task ready, like just finishing the certs would make you, and being life ready, like a liberal education makes you in theory. Giving a student a liberal education is literally why universities were designed. Why was that a surprise?

I genuinely don't understand why being good at things that are outside of your expertise, or at least knowing enough about them to sound like an educated person in conversation, is 'irrelevant'. I don't get it and never have.

>Meanwhile, tens of thousands of H-1Bs are needed because our kids know nothing and are being taught shit.

My experience has taught me that whenever an employer says "we can't find the workers" and uses H-1Bs, what they really mean is "we can't find the workers at the wage we're willing to pay". Those are two different things.

THAT BEING SAID, the costs of education are out of hand. Living inside the beast, I can tell you that many administrators are just flat blind to the storm coming.

In the 90's, the message was go to college, go to college, go to college, relying on the past 40 years of experience in which, if you went to college, everything else just sort of fell into place.

Well, now we have so many 'extra' services students expect, so many expenses, and less state/federal dollars. So students pay for it.

NOW the message is that you need to go to college only if it furthers your career goals. They're working in kindergarten with my child on that. It's frightening, honestly.

I think the pendulum will swing the other direction and we'll see a glut of skilled trades-people in the next 10 years.


> My son currently is enrolled in CS and his first 2 years of school are filled with humanities, history and a few more irrelevant courses all to keep some profs employed.

I suspect, if you aren't just being hyperbolic, you mean, “because he chose to seek a degree from a liberal arts institution rather than an engineering one (which would have some, but less, general ed) or a vocational certificate program or career-focused bootcamp.”


I think that is precisely the opposite of what the original article was talking about.

@forrestbrazeal is implying that many of those certs are going to be obsolete really soon. At least, that is what I took from the article.

I agree that American Universities are kind of insane. We, Canadians, are looking in and shaking our heads.


> My son currently is enrolled in CS and his first 2 years of school are filled with humanities, history and a few more irrelevant courses

Yeah god forbid he enrich his mind and develop lateral thinking skills, empathy, perspective and wisdom instead of focusing exclusively on how he can best serve capital.


Who says he shouldn't be doing these things? That's what secondary education is for, in most of the developed world. Apparently, college is the new high school - in a quite literal sense!


OP, the guy I'm replying to, seems to be saying that? That's why I replied to him?


Education innovation is definitely a massive problem to solve, especially because of the weird mix of control and influence universities have: dictating K-12, shaping expectations of job fulfillment vs. skill, and running a government-sanctioned student debt vehicle that is unavoidable by bankruptcy.

We need a more decentralized education system top-down that isn't tightly coupled to the gov.


Absolutely not. Look at colleges like DeVry or the University of Phoenix. Little to no regulation in curriculum, incredibly expensive, etc.


This has been happening since the 1970s. Or earlier. I don’t really think of it as an apocalypse. IT skills have never had a long shelf life. Any time you are a technology expert at your company, in IT, the technology landscape will shift under you. This is the Red Queen Hypothesis in action. People who ran mainframes in the 1980s became trusted experts and then most of the jobs evaporated. Same thing happened to people running critical VAX or Unix systems. Your skills are only valuable as long as the related technology is.

The same thing happens to programming positions.

But I think the good news is missing from this article—IT jobs are, overall, sticking around or increasing in number. (According to the Bureau of Labor Statistics, the jobs are growing “faster than average for all occupations”). You do have to keep updating your skill set, but it’s not like manufacturing, where efficiencies eliminate jobs altogether or move them to completely different sectors. And there is that ageism to worry about, and uncertainty.

I’m personally more worried about some of the other remaining white-collar office jobs, like the accountants, paralegals, HR, various banking positions, etc.


> The same thing happens to programming positions.

I can't stress how important this is. Folks going to things like boot camps or other educational outlets that focus on one language will utterly kill their career if they aren't aware of how fast things move. If you don't learn the underlying abstractions and paradigms that take various forms in different languages, you will get left in the dust in a matter of a few years.

The best programmers I've ever worked with got excited about programming patterns and paradigms, not frameworks and syntactic sugar. Those are also the ones I paid the most attention to.

Bottom line for both software and IT engineers: you learn to learn, not just to do.


Sure, there are places to learn hip new tech that might not be around for long, and you'll have to continuously learn new languages/frameworks/etc. But there are also things like FORTRAN, COBOL, and C, which are all in demand in their own little niches, aren't going away any time soon, and where you can still make a quite good career with only knowing one of them.


I would say the difficulty with these niches is that the number of experts and the demand follow different curves. For COBOL, it seems that you have the initial period where supply and demand are high, and you have the final period where the few experts remaining command high salaries. But between them you have a glut where demand is dropping and the experts are moving on or getting "reorganized". Surviving that middle period is not something to take for granted.


> can still make a quite good career with only knowing one of them.

I think this is (at least partially) a side effect of the pedagogy of computer science in schools changing throughout the decades. When I was in school, just about the entire program stressed OOP, with very little focus on imperative/procedural programming. Newly minted programmers fresh out of school don't have the exposure/mindset to jump straight into one of these (C maybe being an exception). The old guard that called these languages home is a dying breed, and the salaries paid to program in them these days bear out the scarcity that results.


The key skill is knowing how to continuously acquire new skills.


There are hundreds of thousands of people, System Administrators, who have made lifelong careers out of managing networks, server farms, and Windows and Linux systems since the early 90s. It's not as fancy as software development, and it doesn't drive business value, but it's work that needs to be done.

The Cloud greatly diminishes and in some cases completely eliminates that work. The only thing left is actual software development.


If you've tried to hire a good sys admin in North America you'd be shocked at how high their salaries are getting. Amazon is hiring them by the boatload. Shopify is moving to all cloud because they're unable to staff.

The problem is that junior sys admins aren't as useful as before to most startups. I still think they'll figure it out, but the industry is changing.


But isn't Amazon hiring similar people as Google SREs? So they are not really looking for sysadmins but more for full-fledged programmers / DevOps people who can write software, not just bash scripts.


This. Competence per capita is declining. The market is saturated with IT people who don't know what they are doing.


Average competence is NOT declining. What is declining is demand for mediocre (and low) competence.

This trend results in less competent people flooding the job market.


But if demand for low competence people goes down, those people aren't learning on the job, therefore decreasing the average competence of the workforce.


...said every generation ever about the next generation.


Sysadmins used to be able to compile a kernel and some could even hack a kernel module together.

Today? Not happening.


Another way to look at the trend - a lot of companies are moving to the cloud because they can't get the people to do it on their own.


Doing it on the cloud is now cheaper than doing it on their own. That could be because doing it on their own has got more expensive, but I suspect that the cloud becoming cheaper is a bigger factor.


So AWS is really Amazon's play to corner the market on sysadmins? That sounds remarkably plausible.


The author is referring to traditional system administration: Windows, SCCM, SCOM, Exchange, SharePoint, VMware virtualization, Citrix, networking, etc.

SRE or "DevOps" roles are new roles that are similar to system administration, but the biggest differences are the use of cloud technologies and automation and, most importantly, that the people in them need to be able to develop software and write code, or at least understand it.


"The Cloud greatly diminishes and in some cases completely eliminates that work."

It takes a hell of a lot of work to take a company's entire infrastructure and migrate it to Kubernetes and the Cloud, and then monitor and manage it. It's not trivial.

So the key is seeing your job as solving a business problem with computers, not "I administer Oracle version X.y.z running on Redhat Linux".


Not really. We just call systems administrators "devops engineers" now.


When my company puts out a req for developers, we get 300 applicants. When we try to get a devops, we get 3. It's tough.


Just as webdevs are “full stack engineers” now


This. Check out my username, this is what I do, and I've been doing it for far longer than a decade.

The mid-range jobs have always been vanishing. Many times, I'm the guy automating them out of existence. It's always replaced by something else.

It depends on the circumstances, of course, but there is often more work after the automation than before. It's just different work. It requires reskilling.

The article mentions some new product AWS is coming out with. No matter how "simple" it makes things, someone is going to end up being an expert at using it, and will probably be paid well to do so.

Really, the toughest and most crucial part of this career has been keeping up. The work stays steady, though.


I gave myself the title “IT Janitor” because for years all I did was clean up other people’s $4!7. I automated everything I touched, and told people it was so I would have more time to surf the Internet. Now I’m over 50, in the Innovation group, and dreading ever having to find a new job. But I’m having fun: blockchain, robotics, mobile app dev, and now NLP.


The problem is most companies want someone with 3-5 years of experience in that new skill set. So unless you jump on that particular bandwagon early, you're out of luck.


In my experience, companies say 3-5 years but this requirement is written by someone in HR translating what the hiring manager said. The hiring manager is more likely to hire someone who has less than 3-5 years of experience, or even hire someone with zero experience, as long as the candidate demonstrates aptitude.

The trick is to write your resume so it gets past HR's filter, but without putting bullshit on it. It's unfortunate but I consider this kind of thing a critical skill for anyone applying to technical jobs.


> The trick is to write your resume so it gets past HR's filter, but without putting bullshit on it. It's unfortunate but I consider this kind of thing a critical skill for anyone applying to technical jobs.

I'd argue that skill applies even after you get hired. Fudging details to get around wasteful trivialities without outright bullshitting the person asking for the requirement is truly an art, and it's very hard to navigate this field without that skill.


> The trick is to write your resume so it gets past HR's filter

Any tips on how to do that?


Let's say the job listing says "3+ years experience with JavaScript," among other requirements. You only used JavaScript for 6 months, so you roll JavaScript up as part of your web development skills. You say, "5 years writing web apps, using Java, .NET, C#, and JavaScript." This is true, the hiring manager understands that you have not necessarily spent all 5 years writing JavaScript, and it gets past HR's filter.

The hiring manager will then make sure that your skill set is a good addition to the team.

Keep in mind that the hiring manager may have asked for someone "with some JavaScript experience" and HR might have translated that into "3+ years". In some cases, when the hiring manager sees the job posting, they're surprised by the requirements and say, "So, this is why we haven't been getting any applicants!" This is, yes, somewhat dysfunctional but it is a common dysfunction. Yes, you are essentially second-guessing the job listing and this is not the way the world should work.

As a rule of thumb, if you are applying to a job and you meet all of the listed requirements as written, you are overqualified and should apply to a higher level position.


It's often easy to spot when there is a recruiter/manager disconnect. When the language is overly gregarious/buddy-buddy, or uses the stereotypical bad job offer language ("rockstar ninja programmer", "we have ping-pong tables and free snacks!", etc...), you know the requirements can be taken with a grain of salt.

Definitely brush up on the things listed, though. Walking into an interview without at least a cursory recognition of what a listed language/framework does sucks, as it wastes your time and the interviewers'.

Also, be wary of bad recruiters. If they drink their own kool-aid and actually enforce the arbitrary x # of years in foo language blurb, you might be screwed (as well as the manager that put out the request for a hire in the first place).


One idea I heard recently was to include a copy of the job description that the HR person wrote in your resume. You include it in such a way that it can’t be seen visibly. For example, make the font size really small and the colour the same as the background colour.

The theory here is that programs that automatically scan resumes for keywords will always find the ones they are looking for.

I can’t say I have tried this or know how well it would work in practise.


Then, instead of SEO, you can put REO (Resume Engine Optimization) on your resume as well.


This problem has existed at least since the 90s, probably longer than that. I remember the joke being job descriptions asking for 3-5 years Java experience in 1996.


Mainframe still rules many companies. Especially banks and insurance.


And they are not going away.

Last year I witnessed the sad story of a general manager who promised to migrate away from an AS/400 in six months.

This guy (and several others) doesn't fully understand that a system, that is, software and hardware and infrastructure, has its life determined by the returns it gives to its mother organization. If the system works, the organization won't pay or dare to replace it. Core systems are the hardest, and that's where all those mainframes and C and Fortran and, nowadays, Java legacy systems are still alive and kicking.

PS: I love those systems btw; if you have one that needs love and care, I'd like to hear about it :D.


Agreed.

"I’ve spent my career in tech, almost a decade at this point, running about a step-and-a-half ahead of the automation reaper."

I mean, yes, that is the entire job description. If you are in IT, your responsibility is to learn the best technologies, and be continually re-evaluating what to keep of your organization's current and what to improve or replace.

That is why I wouldn't want to do anything else. I love learning new things and no profession offers more opportunities to learn new things than computer technology.


> IT skills have never had a long shelf life.

But this isn’t true, outside of webdev.

If your skill was “DB2” or “Oracle” or “Cisco” or “C++” you could have had a 30-40 year career in that, easily. There are plenty of others. Java has been around commercially since about 1995, there will definitely be plenty of Java jobs in 2025.


True, but on the other hand those technologies have evolved more or less drastically as well. E.g. the Java or C++ skills from 1995 won't get you that far today, both in terms of the language itself and the framework/library ecosystem.


Thank God! The whole reason I have been working as a programmer for decades is because I always get to learn new things, and never get bored.


Yep. I've done both FORTRAN and COBOL in the late 80s, and today's versions only have a vague similarity to what I recall.


Sure, but it's a gradual, iterative process, and each new feature you learn builds on what you already know; there's no sudden cliff of obsolescence. When Oracle 9i came out there was still plenty of work for Oracle 7 and 8i guys, and they had time, an easy migration path, and a big community to help them. Same with Java. There will be guys retiring in 2035 who have done Java for their entire 40-year careers! Even today a C++98 guy could find work maintaining something while he or she got up to speed; many sites aren't even fully on C++14 yet and won't be for a few more years.

Whereas in webdev you’re basically starting from scratch every 2 years and competing with new entrants to the market because they literally have as much experience of the hot new framework as you do.


Another worry is recession and the popping of the tech bubble. Sooner or later the bubble has to pop, and that's going to cause a lot of pain throughout the tech industry. Should be interesting to see how the industry rebounds and in what form. Will VR be the new "hot thing" like smartphones were after the 2008 recession?


But smartphones are useful because now you have a computer in your pocket (though, just like a PC, it's mostly being used for social media); if I look around my train in the morning, many people are staring into their phones. I doubt VR will ever reach that. But AR might, say a heads-up display that will, for example, show you the next step of a recipe, or which screw to undo next while fixing your car.


I'm no Pollyanna, but I doubt we'll ever see another tech bubble collapse like the post-2000 one again, unless society itself collapses. And believe me, 2002-03 were the most painful years of my working life.

Sure we'll see slowdowns and some dramatic shifts in skill sets. We just need to stay ahead of the curve.


> Will VR be the new "hot thing" like smartphones were after the 2008 recession

I don't think so. Mobile phones were already a significant thing before 2008 and the morph to smartphones was already in train (first iPhone was 2007).

VR still has to emerge from a relatively small set of niche use cases. Augmented / Mixed reality is more likely, given the ubiquity of good cameras on smartphones. Arguably AR/MR will let the phone manufacturers keep pushing device upgrades for longer, as the smartphone market saturates.


I agree with the points raised in the writing, but it mixes automation, abstraction, and industry consolidation as if they weren't separate processes. As such, the transformation being described isn't an impending cliff, but an ever-present pressure of economic forces that affects all businesses all the time, and one that it is wise to watch for.

Automation replaces repetitive work with tooling and work that's more complex. Abstraction allows one to delegate to another for details, which may include choosing from a palette of pre-made options. Consolidation will come about as fewer independent players can sustain themselves in the market. Some will be out-competed by economies of scale, some will be starved by restrictions on intellectual property and lack of access to expertise.

This process has already played out for "small business websites", yet there are still lots and lots of web developers and web designers employed or freelancing. The current wave of WYSIWYG website generators is actually very good, and they have add-ons and integrations that make sense for their target market. But plenty of clients don't want to mess around in it, so they'd rather hire someone. This could be the maker of the generator, or it could be an outside consultant. In either case, the person brings judgement, experience, and creativity to tailor the deliverable to the needs of the client. These are skills resistant to automation, but not immune to abstraction and consolidation.

In the end, the antidote is the same as it always was: be adaptable, be personable, be resilient, and be resourceful. These are especially important if one is in a comfortable job shielded from most competitive pressure, because such people will be the most surprised and unprepared if their current employment is made redundant.


saving this comment...


I keep seeing these kinds of articles where the author has drunk the Cloud kool-aid themselves, forgotten how to function without it, and insists that it's impossible to function without it. This guy even drags manufacturing into the mix, and obviously has no idea about manufacturing in the United States.

Manufacturers in the USA don't make cheap coffee cups. We don't make underwear. We don't make car fenders. We make warheads. We make gyroscopes. We make electro-mechanical assemblies that China or Malaysia or Mexico would screw up. We specialize in quality over quantity, and we specialize in cutting edge tolerances and specifications. We make export controlled things for enterprise contracts and the government. Things that require certifications to produce, and government regulatory compliance, and tight tolerances. Nobody here is making the 100,000,000 wrenches you can buy at Wal-Mart.

We don't make 100,000 of anything either. We make 100 gyroscopes for General Dynamics, or 5 jet engines for General Electric. We make US military grade munitions and weapons for the government. The author obviously doesn't realize that the company making the wafers for Raytheon ISN'T ALLOWED TO USE THE CLOUD. All that great automation that helps AirBNB function with no infrastructure is meaningless when you have to protect your IP from nation state actors. To probably >50% of American manufacturing the Cloud is useless. It's a consolidated attack vector that WILL be compromised in the future and lead to liability. Sure you can put a NIST 800-171 or DFARS compliant business in the Cloud, but it costs extra and it's not worth the risk. You hear about misconfigured buckets leaking data almost daily. Nobody doing government manufacturing work wants to deal with that headache. In fact, I've been in this industry for 10 years and I have NEVER seen a DFARS compliant supplier with outsourced IT infrastructure. I've visited hundreds of companies over the years. What you're describing doesn't interest American manufacturers one bit.


> The author obviously doesn't realize that the company making the wafers for Raytheon ISN'T ALLOWED TO USE THE CLOUD.

This is probably going to change. People like you said the same thing about health data, and student data. The savings were so tantalizing that the regulators and stakeholders figured out how to make it work. What do you think GovCloud is for? C2S and "Secret cloud"?

Our university had a 3-4 person dedicated Exchange team. When "Google Apps" came out, people wanted us to switch to that from our old mail server stuff. Go figure, why would you keep using pine and squirrelmail when you could use gmail? "It can't hold student data" the IT team said, "it isn't certified for FERPA or ITAR." Okay, true. Fast forward two years, now Google's "Apps for Education" can deal with both. The switch was sudden and brutal and the university no longer has a 3-4 person dedicated Exchange team or an Exchange deployment of any kind.


And at some point, AWS or Azure might be considered more secure than servers configured and administered by an organization that doesn't have that as its core competency.


This. Who beat every single private organization with Spectre/Meltdown mitigation? Amazon Web Services.

There is a pervasive myth that servers run by private organizations are more secure than those run by the public cloud providers, and the opposite is actually true. Does your organization receive embargoed information from Intel to mitigate side-channel 0-days before they are publicly announced?


Fast forward two years: after Google promised they would not spy on the students like they do on the general population in order to get the contracts, they did exactly that.

Lobbyists and fools can get past most logical objections and cloudify anything.


> Manufacturers in the USA don't make cheap coffee cups. We don't make underwear. We don't make car fenders. We make warheads. We make gyroscopes. We make electro-mechanical assemblies that China or Malaysia or Mexico would screw up. We specialize in quality over quantity, and we specialize in cutting edge tolerances and specifications. We make export controlled things for enterprise contracts and the government. Things that require certifications to produce, and government regulatory compliance, and tight tolerances. Nobody here is making the 100,000,000 wrenches you can buy at Wal-Mart.

This is a blanket statement and is wrong. Most cheap manufacturing is done overseas, but the US still has a large manufacturing sector that makes all sorts of crap.


We sure do, and our stuff is generally a little more expensive than the Chinese-made import but a lot higher quality. My go-to example is brooms and mops. Libman makes their brooms and mops in the US, and I'll never buy another brand. They break like any sort of cleaning product does, but a lot less often than the imported ones (IMO).

Sterilite boxes are also made in the US -- cost a bit more than imported, but again, much higher quality.

It's a shame there isn't a "made in the US, and slightly more expensive but a lot higher quality" option for everything I buy, cause I'd do that in a heartbeat. I hate buying shit that breaks; waste of my time to even have to think about that stuff.


>> Nobody here is making the 100,000,000 wrenches you can buy at Wal-Mart.

There are still people making nails in the US. Fertilizer. Food gets exported. Then there is all the stuff that's too expensive to ship. Lumber, aluminum sheeting, cement ... lots of non-precision stuff is still made locally. Not every US factory makes munitions.

And some stuff is made locally not because of 'better' manufacturing ability but for speed. The fashion industry has to react quickly, quicker than overseas shipping can manage. I just ordered a small electronics assembly from a Canadian manufacturer not because they are the most skilled or precise but because they can chat with me on the phone and ship a small-run (5) faster than any Asian manufacturer. (It's a device for measuring laser energy at specific wavelengths but I have some specific needs re how the data is collected/displayed. It only took a 10-minute call to explain my issues and get a deal together.)


>And some stuff is made locally not because of 'better' manufacturing ability but for speed. The fashion industry has to react quickly, quicker than overseas shipping can manage.

Almost everything in the fashion industry is made in Asia.

And few consumer goods (if anything) are "too expensive to ship".

>I just ordered a small electronics assembly from a Canadian manufacturer not because they are the most skilled or precise but because they can chat with me on the phone and ship a small-run (5) faster than any Asian manufacturer

Yes, but for anything at scale, they won't be the most competitive option.


Clothing industry != fashion industry.


> ISN'T ALLOWED TO USE THE CLOUD

Do you find this a bit strange, given that the Pentagon is making a massive push toward (presumably private) cloud infrastructure?

https://www.reuters.com/article/us-usa-pentagon-cloud-idUSKB...


They probably are allowed to use the cloud, it just requires a lot of red tape and paperwork: filling out forms, waiting, and filling out more forms. Pretty sure AWS GovCloud exists for a reason, and that reason isn't because it has no customers.


They may not be allowed to use the cloud today, but that will likely change in the near future. FWIW, I can imagine your post in my head as an argument in favor of horses over automobiles.

What you've seen in 10 years was reality during that time. That speaks little to the future.


We don't need a lot of people to do all those things.


Good points, but there are other American-dominant industries aside from defense manufacturing, and they are definitely taking a keen interest in the cloud. Also, defense is moving to the cloud too, albeit more slowly than AirBnB, say.


The article is not talking about IT at Raytheon but "[...] anonymous Windows administrators and point-and-click DBAs and “senior application developers” who munge JSON in C#".


If anything, it will be a private cloud setup on an isolated company owned data center.


> Repetition is a sure warning sign. If you’re building the same integrations, patching the same servers over and over again every day, congratulations – you’ve already become a robot. It’s only a matter of time before a small shell script makes it official.

Absolutely - if something is repetitive, it's a candidate for automation. This is true across all disciplines. Only as-yet-unautomatable human judgement, insight, and communication remain safely valuable.

On the other hand, "go away or I will replace you with a very small shell script" has been a BOFH joke since the 90s.
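
For illustration, a minimal Python sketch of the kind of "very small script" that ends up doing the daily patching round (the hostnames are a made-up inventory, passwordless SSH and apt-based hosts are assumed):

    import subprocess

    HOSTS = ["app01", "app02", "db01"]  # hypothetical inventory

    for host in HOSTS:
        # Debian/Ubuntu hosts assumed; swap in yum/dnf as needed
        result = subprocess.run(
            ["ssh", host, "sudo apt-get update && sudo apt-get -y upgrade"],
            capture_output=True,
            text=True,
        )
        print(host, "patched" if result.returncode == 0 else result.stderr.strip())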


I have actually replaced a person with a 40-line C# program


I replaced a person with a one-line code change (performance tuning) - they got her another job though.

Literally had a business call to see if they could run another instance of $LOB_SOFTWARE because they had a person clicking a button for 40 hours a week.


We replaced our entire overnight testing and on-call operations department with SSDs and a little bit of SQL optimization. Jobs that used to kick off at 5pm would run until close to noon the next day. When I stopped interfacing with that unit, those jobs completed by 9pm. Went from a department of 25 operators to a team of 15 or so. We weren't able to reduce further due to vacations, on call, sick time etc.


Yeah, I see the biggest problem here being that existing code which ran fine never got the maintenance it needed to scale, so they throw more bodies at the issue instead of paying to fix the code.

Good work.


That kind of work is easy to automate, until you reach a situation where it has to be decided when a person has to click the button, and with what data.

Often it's not easy to make those decisions through code. That is because the person making those click decisions has a lot of tribal knowledge of the business situation at hand, which triggers only when the situation presents itself.

But if a human is basically a meat-robot doing programmable tasks, that is not even an automation issue. It's really more of a management and planning problem. In those cases you are supposed to fire the people running the business.


There's hybrid cases though, where you can use automation/business logic to lower the skill floor required for some work.

That's pretty much the business model of the company I work for. We have what is essentially a call center using custom CRM/workflow software doing work that used to be done by many times more people with much more training. Instead of 5 licensed and well paid people we do the same work with one lower paid and less trained person assisted with software.

Humans are still involved to make decisions when needed and for communication, but the simple and clear stuff is automated so we can hire pretty much anyone.

It's good for our clients since we are cheaper and more efficient, but less good for the workers we're replacing who will need to find new marketable skills or fall down the ladder.



His wife and kids don't even know.


> His wife and kids don't even know.

:-)


What did the program do?


Pulling info from standardized Word documents.
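
Not the actual program, but roughly the idea in Python (assuming the python-docx package and a fixed label/value table layout, both of which are my assumptions), just to show how little code this kind of extraction takes:

    from docx import Document  # pip install python-docx

    def extract_fields(path):
        """Pull label/value pairs out of a standardized Word form."""
        doc = Document(path)
        fields = {}
        for table in doc.tables:
            for row in table.rows:
                cells = [cell.text.strip() for cell in row.cells]
                if len(cells) >= 2 and cells[0]:
                    fields[cells[0]] = cells[1]  # "Label | Value" rows assumed
        return fields

    print(extract_fields("standard_form.docx"))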


While I think this essay has some good points, it ignores the problem that I always see with this idea - people. I really wish the business people I deal with on a daily basis could have their business requirements met by such automation, cause it's the least fun part of my job. But they don't, because I don't care how awesome your cloud provider tools are, they always come to me with some weird requirement or platform and I'm back to "munging JSON in C#".

The problem isn't the technology, it's the complexity of the customer's business requirements, and their nearly complete inability to transfer those requirements into software without complex implementations that they could never hope to implement themselves. I would love to see more tooling to help with this. I have been waiting for 25 years. It gets better, but nowhere near what could be described as an apocalypse.


> ...their nearly complete inability to transfer those requirements into software without complex implementations that they could never hope to implement themselves.

I wouldn't mind if they even had "those requirements". That would be a huge step up. Oftentimes the requirements are not written down and are stuck in tribal knowledge. And woe betide you if the tribe is an outsourcer or offshore team. About half the time, they're unintentionally leaving knowledge in the heads of their meat-robots, and getting the information transferred out of those heads is painful and time-consuming, because "set of procedures for meat-robots" is effectively what they're hired for.

If the procedure is periodically performed, I often get pretty good results just asking for copies of the resultant emails reporting completion, and working backwards from there. Automation at this layer of staff work is considered exotic developer-realm, needs-a-budget-and-a-project-manager effort, even for what most HN readers would consider relatively trivial multi-hour or multi-day scripting work.

The abruptness and agony of automation sweeping through these layers in the upcoming years, as tooling arrives to discover, capture, distill, and maintain these requirements in tight synchronization with the software teams maintaining the code behind the automation, are going to be politically challenging, as a lot of these people have zero notion that what they're doing can be automated away, even as they consume the results of ML in their daily lives.

And I'm rather glum about teaching these people how to perform the automation themselves. The reception I've gotten to my offers to help them get on the track to learning programming and automation has been very underwhelming. Even if someone doesn't "get it" about coding, just the exposure to the thinking patterns would help me enormously cut down on unnecessary meeting times, as there are still way too many people whose conception of automation is closer to "can't they/you just...[magic/mind-read]?"


A point that often gets missed with "low-code" tools is that it's not so much that they enable "non-coders" to build applications, but that they enable experienced developers to go so much faster.

I've been using a fullstack low code development tool for several years now, and when it comes to developing CRUD apps or data reporting apps (with charts, interactive, drill down reports, etc. etc.), it's astonishing how quickly you can stand-up a secure, fully responsive web-app, complete with authentication, authorization schemes, report subscriptions, etc., without writing any code at all.

And, when you bump up against the limits of the declarative/low-code aspect of the framework, you can toggle over to JavaScript, your own CSS, SQL, etc., so it's not like you paint yourself into a corner.

So, I agree, if Amazon creates something like this, and it is as good as some of the existing low-code tools out there, it's going to have a big impact over the long term.

edit: typo


I'm doing amazingly more involved things much faster now than 20 years ago. However, user demands and expectations are higher than ever.


Which tool?


I hesitate to say it on HN as everyone seems to hate Oracle (and for good reason), but the tool is Oracle Application Express. (https://apex.oracle.com/en/)

In my earlier consulting role, and now as a CDO, it's my go-to tool for CRUD and data presentation apps.

(I don't work for Oracle.)


I used that for small internal LOB apps at a huge bank when it was still called HTMLDB. It saved a huge amount of time, despite its architecture making experienced developers wince.

I wish I had the time to spare to build a PostgreSQL/Python clone of it.

Though SQL is certainly a form of code; it's just nice restricted domain-specific code that people without the title "engineer" or "developer" often know.


As a senior employee in an organization that has migrated much functionality to SaaS and the cloud... I'm doubtful. In my experience so far, what we do changes, but the need for IT employees hasn't gone down. Most of what we did was figure out how to solve business problems with IT, and that continues, cloud or no. SaaS offerings are sophisticated, but hard to use out of the box when you have significant regulatory (and other) requirements.

If there are IT jobs developing for smaller organizations, maybe those will go away, but... I think a lot of that disappeared already.

I'm close to retiring (from this job, anyhow), so it's not a personal issue for me. I just haven't seen it happening as described.


I think for most mid-level guys the threat isn't AI doing their job. I think it's the influx of people who will undoubtedly join the "coding" workforce once automation takes away many trivial jobs like driving trucks or cabs, or flipping hamburgers at McDonald's. All these people will be told to retrain themselves and go into the tech sector, and we'll get a huge influx of cheap tech people.


I don't know if you know any working class people, but a lot of them simply don't have the intellectual capacity for tech work - which is often why they flip burgers.


And a lot of them also just didn't have the opportunity to learn it but are easily smart enough to pick it up when their ability to make a living depends on it.

Let's say 10% of these working class people just need a push to discover they're actually decent at coding/devops/admin/etc. 3,500,000 truck drivers means an additional 350,000 people competing for IT jobs. Now factor in all the other people employed in these sorts of jobs. There's going to be a huge glut of incoming "cheap" labor when these sectors become more automated. I don't expect them to be competing with people like Linus, but they're sure as hell going to be competing with the neighbor's kid who went to a public four-year school for a CS degree because the job prospects were good.


There are no doubt already 300k job openings for different kinds of tech work in the US. There's probably 30k jobs just in Seattle (endless local npr series about jobs at https://www.kuow.org/series/region-of-boom). But not everyone will make it as a programmer; we need jobs for so many different skill sets.


Totally agree. People will do what they have to to not starve.


You'd be surprised.

A huge number of "tech workers" today are from the same families that would be working class, factory workers, and so on just a generation ago.

There's no magical racial difference between working class people and tech workers.


You're being rather reductive. Just because someone is in a 'working class job' doesn't diminish their intellect. I've encountered many working class folks with a lot of knowledge in the fields they were in (electricians, plumbers, mechanics, etc).


Those aren't working class, they're tradesmen. At least IMO.

Working class to me means Walmart, Mcdonalds, and Jack Generic Construction LLC.


Why they're hardly even Scotsmen at all! Less humorously I'd say your definition of working class differs from that of the public. If you ask those tradesmen if they're working class, I'd wager upwards of 90% of them will say yes.

Working class to most people just means that they do manual labor. They work outside and in factories building and moving physical things be it burgers or buildings or balloon animals. The key seems to be that they don't generally sit at a desk with a computer on it for 8hrs a day.


It appears you don't know any working class people either.


Lambda School seems to be doing quite well training a lot of them to program and taking a cut of their initial wages. This guy specifically mentions fast food as a previous job: https://twitter.com/AustenAllred/status/1083994585557127168


a lot of them simply don't have the intellectual capacity for tech work

Well, neither do most tech workers. Why’s there another high profile breach seemingly every week? Why is software generally full of bugs? Because the devs lack basic competence at their jobs. Maybe they should have been flipping burgers instead. But RoR, Node.JS et al dumbed things down so much we got brogrammers...


"All these people will be told to retrain themselves and go into the tech sector, and we'll get a huge influx of cheap tech people."

This was the fear with "India" in the 1990s, and how it was going to kill all of our wages.

I make a lot more money now than I did then.


The limiting factor there though is probably that it is currently 4am in India and they can't typically be brought into a US office to be trained by your core team.


We've been through this before. It was called Windows NT. The democratized tools were Access and Lotus Notes. Amazon is doing this for the same reason Microsoft did -- it's revenue by a thousand cuts. People spent $5,000 in 1995 to give diesel mechanics PCs so they could update work orders -- they will do the same for little apps.

The reality is, you're going to have a million monkeys hitting a million keyboards, and very few will be producing Shakespeare. All of that crap will be running up lots and lots of AWS/Azure/etc. bills.

You'll need way more IT people to rationalize it. There are tens of thousands of people in the United States whose purpose for the last decade has been re-implementing the 90s version of this in formal IT systems. You will have churn as we purge the legacy staff, especially Windows click-to-admin types.


Automation hits white collar professions as a massive productivity improvement for the top performers, displacing everyone else working in the field (think the top 10% of people in your position doing 100% of the work).

In web development this is most apparent (to me) in SAAS application development, where many/most of the underlying pieces of building a CRUD application that can scale to thousands of users, and be really functional are now provided by other SAAS apps which provide a _better_ service than the average developer can scrape together themselves.

Billing -> stripe.com over writing against the gateways directly

Database/Hosting -> Heroku PostGres/Redis and compute

Email -> Sendgrid, Mandrill, ActiveCampaign

Or even just SAAS frameworks like BulletTrain (Rails) or Laravel Spark which dramatically cut down on the boilerplate and integration code you'd have to write.
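
To the billing example above, a minimal sketch (assuming the official stripe Python library and a test API key; the amount and plan name are made up) of what "use Stripe" looks like versus writing against the gateways directly:

    import stripe  # pip install stripe

    stripe.api_key = "sk_test_..."  # test key; the real one lives in config

    # One call stands in for the card handling, retries, and gateway plumbing
    # an average team would otherwise write and maintain themselves.
    intent = stripe.PaymentIntent.create(
        amount=4900,                        # $49.00, in cents
        currency="usd",
        metadata={"plan": "pro-monthly"},   # made-up plan name
    )
    print(intent.id, intent.status)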


As someone who's been in IT over a decade, I am concerned that so many IT folks are going to be blindsided.

Sure, some of them will still have positions in the same or similar roles, but there will be a crunch. The large outsourcers will be hit overseas (Wipro, Infosys, etc.), but it will also impact administrators at medium-to-large-sized businesses in typical American cities, as Forrest mentioned. The worst part out of all of this is that too many colleges, and especially technical colleges, are still teaching networking, Linux, or Windows administration as if students will be able to have a lifelong career in them. That is no longer true.

I don't want to imagine what it'll be like for those students who graduate, get good jobs (now), a mortgage and start to raise their family only to find themselves unemployed in the middle of their lives. I don't expect much sympathy from the largely meritocratic tech industry or anyone else.

As for myself, I already work for one of the big three and am a part of many "cloud" migrations. I should be okay, but at the same time I am somewhat conflicted. Am I going to need to go back to school for Computer Science and become a fully fledged, actual software developer? I mean, it's fine, there's still enough time (I don't think we will really feel the burn for at least another 4-6 years), but is it reasonable or realistic that everyone needs to be a rockstar developer?


I've always kept my coding skills up a bit as a hobby, but I don't realistically see this IT apocalypse coming. Everyone in Silicon Valley thinks they're going to automate away everyone's problems, but nobody there has managed to prevent people from needing the same type of support they needed two decades ago: Why doesn't my printer work, and how do I know if this email is real? (If you believe Gmail or Office 365 can do the last one, you're wrong, FYI.)

I don't think we're anywhere near an "IT apocalypse". I think we're more likely to put a ton of machine learning engineers out of work long before companies start needing less help desk technicians and sysadmins. I think a lot of people have moved to the cloud only to discover they needed just as many people to help manage their cloud presence as they needed to manage their on-prem hardware.


> but nobody there has managed to prevent people from needing the same type of support they needed two decades ago: Why doesn't my printer work, and how do I know if this email is real?

The author does touch on this though by highlighting that you will need less and less people. As more services move into 'cloud' solutions it can free up time for those and they'll step into those spaces.


I don't see "less people" as an upcoming IT problem, as most places I look are getting more IT people, as more parts of their business become dependent on computers. I haven't seen any evidence of that changing, and believe it or not, the average business still has plenty of migration to digital means left ahead of them. (Where I work, manual punch card time clocks are still used.)


Look at the cloudification or automation in Windows Products - Windows Updates are pushed through Intune over the internet, Exchange, SharePoint are all through Office 365. Entire On-Premise Datacenters are pushed to Azure.

Yes, someone still needs to manage all of this but you need a lot less people. Or you ship these "trade jobs" to low cost areas like India.


People keep stating "you need less people" with the cloud like it's an objective truth, when there's really no evidence of it, and I'd argue it's patently false. You end up paying both your own IT and the cloud providers' IT, for a product that also doesn't work when your internet is slow or down.

The main thing moved to the cloud where I work leads us receiving and handling the same number of support tickets as when it was on-prem. The difference is, now some tickets we can't fix, and have to wait for the cloud provider. Service is worse, and it doesn't really save us any time.

A lot of cloud solutions offer an on-prem option. The tools are the same, it's just a matter of it running itself in the building or running itself somewhere else. A lot of times, running something on-prem means spinning up literally the same software you could have them host for you.

(Also: Windows Updates also aren't some crazy painful manual process that Intune fixed. You can just tell WSUS to approve everything automatically if you want, and just as similarly, you can manage Intune more granularly which takes up your IT staff's time and effort.)


"You need less people with cloud" is objectively true, as long as you do it properly.

If you're simply running EC2 instances with your same off the shelf software, you're not doing it right, but you'll still eliminate your entire datacenter physical facilities team and server install/rack & stack/replace failed disks team.

If you do it properly, it's incredible what you can do. I have a client with applications running in Ireland, Frankfurt, Singapore, Tokyo, and the US, totaling around 50 EC2 instances running containerized workloads that automatically heal, APIs that are accessible globally and won't go down unless 6 AWS regions simultaneously fail, about 30 static websites, DNS hosting for a dozen domains, monitoring, auditing, and log analytics for all of the above. I set it up in about 3 months as a single engineer and manage it all with about 8 hours a week of total effort. The cost to my client is basically the same as hiring a single senior engineer, but they're running infrastructure that would have taken a team of 3 shifts of IT professionals without the cloud.


Nice. I wouldn't mind joining you for a contract


Author here - I'm serious when I say that I'm happy to be a sounding board if you're at a career crossroads. Twitter DMs are open @forrestbrazeal.


I've been in IT (mostly the "Legacy" kind - Oracle/PeopleSoft ERP) for about 20 years now, and I fully empathize with the article. I've subconsciously, then consciously, had to keep "moving up" on the value scale, partially due to outsourcing and partially due to automation. From sysadmin type to infra architect, but then I made a curious curve that still catches me by surprise into tech lead, and now I find myself in basically management. I struggle to understand what will give me better skills and a career for the future: developing people and org knowledge and experience, as I do in my current role, which just feels generic and not particularly useful to somebody who used to gobble technical manuals for fun (but apparently the market finds value in technically knowledgeable people who can manage and communicate); or go back to my love of the technical and get hands-on again... understanding it's an uphill battle at my age and may be a losing one with the amount of automation and "AI being around the corner (forever;)".


If you have a decent understanding of ERP systems and projects, it might be easier and more productive to transfer those skills to more modern cloud-hosted ERP products than to go back to more generic development skills?


Thx; agreed, though I find many "cloud-hosted ERP products" are actually very traditional ERP products, just hosted by the vendor (I remember the "lightbulb" moment when I realized the much-vaunted Oracle HCM Cloud is just the Oracle Fusion tech stack underneath, that somebody else manages; in that case, I don't know if it's necessarily _fewer_ people that work on it, they just all work at the Oracle datacentre instead of being dispersed among the clients :)

Haven't worked on Workday and the like yet to understand how "cloudy" they truly are.


Cloud-based ERP/CRM systems do solve a lot of infrastructure issues - but they can also introduce a lot of issues (or opportunities, depending on your perspective) - some integration and data-extract tasks suddenly become a whole lot more difficult.


I've just started my career in ML, and I already sort of feel that a majority of the work that people around me are doing can be done by an automated pipeline (like Python's AutoML) and throwing enough compute power at the problem. It's quite worrying.


Learn statistics and how to communicate them.

That's going to be a long time being automated.

For instance, if (for example) you model insurance data in the EU, you cannot use gender as a factor in pricing (even though it's effective).

In general, the modelling/ML pipeline is the easy bit, the hard part is the data cleaning and figuring out how to translate a business problem into one that can be solved by data.
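
To make the "easy bit" concrete, here is a minimal scikit-learn sketch (on a bundled toy dataset, chosen purely for illustration) - the modelling step is only a few lines, while the cleaning, feature decisions, and regulatory constraints above are where the real work lives:

    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # The "modelling" part: scale the features, fit a logistic regression.
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))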

tl;dr as many people have said about Comp Sci over the years, learn the fundamentals (statistics and experimental design) and you'll be in a much better position.


AutoML is pretty horrible if you're doing anything complicated; even at Google-scale compute power, SotA DNNs are hand-crafted, not auto-found. Feature engineering is also quite important for a lot of data types, and that has too large of a phase space to probe with AutoML. At the same time, if your problems are mostly solved by logistic regression/random forest/etc. with simple features (i.e. you have well-defined categories and/or states, or your task is well solved in the literature), then ML is not your value proposition; it's more data/business analytics (and AutoML should enable you to provide your value faster).


I don’t have twitter so I hope you don’t mind me engaging here.

I would be very careful before dismissing AWS’ no code/low code project. Mulesoft, Microsoft Flow, and Zapier have a combined revenue of hundreds of millions of dollars a year in serving a market for establishing business logic workflow without code. I am only surprised it took AWS so long to move into the space considering the cross marketing opportunity to existing customers and their compliance capabilities.


This is exceptional writing. It's concise, clear and engaging with a really clear call to action but not defeatist like so much writing regarding this issue.


Thank you, that's very kind!


Ok, question:

What exactly do you mean by this:

'“senior application developers” who munge JSON in C#'

Why did you choose JSON and C# in particular here?


Because it's quite trendy to hate on all things Microsoft, especially in Silicon Valley startup culture. I'm a C# developer and have had other developers cringe in my face when I tell them I'm a C#/.NET developer.


I use VSCode and develop exclusively on Windows 10. I also have Mac and Linux machines. I think it's better to be agnostic when it comes to language and OS.


No particular reason, just trying to provide an example of low-value integration work that is likely to be subsumed by a service at some point.


I enjoyed reading it too, to the point and full of information.

I really hate the current pattern of taking some interesting topic and nice data about it and burying it in 20 pages of storytelling journalism.


Programmers have been predicted to be going out of jobs since the days of COBOL. There is a reason that is not happening (yet); otherwise hardware companies would all be shipping pluggable chips, and we'd just config-connect them and be done. The reason is in the word itself: `Software`.

The real problem with these ready-made plumb-and-plug modules is that sooner or later they are either too slow, or expensive, or just a pain to refactor/redo. Eventually you just come back and realize you need more granular control over things, and anything you are likely to come up with resembles a programming language.

I had this moment of realization myself while having to change a complicated graph in Pentaho Kettle a few months back. The graph looks bonkers hard and brittle; changing anything requires redoing all the dependent elements of the graph, and if you have a graph complicated enough you will be forced to rewrite it. The real trouble is there is no functional/unit testing with these things. And then you realize you are just better off with a full-fledged ETL language/programming language. The second problem I faced was running into performance issues. Want to change the sort algorithm? Running into heap space issues? Want better logging? Want a better threading model? All the best. Nothing is possible.
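
To illustrate the testing point, a transform step written as a plain function (the field names here are invented) is trivially unit-testable in a way a GUI graph node isn't:

    def normalize_order(row: dict) -> dict:
        """One ETL step: clean a raw order record before loading it."""
        return {
            "order_id": int(row["OrderID"]),
            "customer": row["Customer"].strip().upper(),
            "amount_cents": round(float(row["Amount"]) * 100),
        }

    def test_normalize_order():
        raw = {"OrderID": "42", "Customer": " acme ", "Amount": "19.99"}
        assert normalize_order(raw) == {
            "order_id": 42,
            "customer": "ACME",
            "amount_cents": 1999,
        }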

This is above and beyond the need for meta-programming facilities. At that point whatever GUI graph you draw is worse than any verbose code you will write.

Regarding programmable tools, we already have those. Vim, Emacs, and Microsoft Excel all give you a degree of meta control over the tool and what you want to do with it. But that's that, and it is often hard to bend these tools to your command.

These are just a few reasons why there won't be an apocalypse soon.


I empathize with this sentiment, and will add that it may take fewer developers/IT folks to get a product out the door (MVP), but a company that depends on the product will have more specific business requirements and will eventually onboard more people to deliver solutions specific to those needs.

The job market will close up a bit, but right now tech is looking like the California gold rush, where 4-5 years ago any bootcamp grad could jump right into a web dev job (at least in my job market in the Midwest). I think if you continuously learn and remain marketable as the times change, then as a worker you will be fine. I also like the comment that mentions that you may just end up working for the cloud provider rather than the business application company.


Nice blog, keep writing.

I manage a machine learning team and I also think that at least partially automated data curation and modeling will reduce the number of people required in my field. It might take 5 or 10 years, but I think it will happen.

I think you are spot on that IT and devops will take a hit. I look more at Heroku's model than AWS and GCP as the future. That said, AWS and GCP will keep getting more "Heroku-like".


Heroku and its parent company, Salesforce. Salesforce databases can be manipulated with code and without, should already carry most of the business and customer data, and offer several different off-the-shelf products for plugging other data sources in. For most companies, it could cover the majority of what their legacy IT department does, and for many companies in the Bay Area it handles everything except their product and what's in JIRA. For others, it is woven into the product.

It’s a complete blind spot for most engineering-minded people because they never realized how flexible the platform was, and with Bret Taylor running the show now, it’s miles away from just being a clunky Sales CRM.

Couple of examples of recent developments:

https://developer.salesforce.com/blogs/2018/12/introducing-l...

https://lightningdesignsystem.com

https://developer.salesforce.com/platform/dx


My experience with Salesforce (mostly with Marketing Cloud) has been that the developer experience is truly awful. Mediocre documentation, no debugging tools to speak of and systems that never quite do everything you want. Maybe other parts of the ecosystem are better but I'd be very reluctant to take another job where I had to work with it.


Actual Salesforce is just expensive, complicated, cumbersome, opaque, legacy-encumbered, and programmed in a unique language and UI framework. Marketing Cloud is all that, plus what you said.


Yeah, Marketing Cloud is a collection of acquired products that is not really on the actual Salesforce platform. It’s very much its own thing, kinda like Heroku is its own thing.


I was a SQL/Tableau analyst and an SF admin before I turned to software engineering. SF is truly a fantastic piece of shit to do any proper development for. I am so glad I've stepped away from it. It has been built with the idea of selling to managers who will not be handling the implementations, and with the idea that non-programming staff can configure it. The front end is decent and most people can make it do... something. But custom integrations are just a pain. Salesforce imposes lots of fair-usage policies on upserts, for example.

I was at a Money Transfer company and we couldn't move our Transaction Monitoring staff over to SF without hitting limits


I had some cursory experience with SAP, which is another platform that can do anything. From that I got the impression that while these things are extremely flexible, you're often better off just hiring normal software engineers to write normal software for your problem instead of throwing millions at $PLATFORM expert consultants that know the arcane platform.


"I manage a machine learning team and I also think that at least partially automated data curation and modeling will reduce the number of people required in my field."

I mean, if it doesn't, what the hell are you even doing?

The whole point of technology and modern capitalism is to increase automation, increase the amount produced by the same number of workers, and increase the overall amount of wealth in the world and improve overall living conditions for everyone (setting aside very important questions of distribution). I just find it odd people in the computer technology industry find this shocking or especially worrying.

"I look more at Heroku’s model that AWS and GCP as the future."

Google's App Engine was much closer to the Heroku approach, and the AWS approach won. So I will be pretty surprised if the Heroku approach wins out.


20+ year technology consultant here. I have done work for dozens of clients across just about every major industry out there. The number of (relatively well paid) people I've seen at clients in this neverland between Business and IT whose jobs revolve around pulling data from one system, munging it offline, and then loading it into another system, or similar tasks that should have been automated with a script a decade or more ago, is absolutely staggering.
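
For what it's worth, a rough sketch of the script those roles usually boil down to (the export file, endpoint, and field names are invented for illustration): export from system A, munge, load into system B, on a schedule instead of a salary.

    import csv

    import requests  # pip install requests

    SOURCE_CSV = "export_from_system_a.csv"                   # made-up nightly export
    TARGET_URL = "https://system-b.example.com/api/records"   # made-up target API

    with open(SOURCE_CSV, newline="") as f:
        for row in csv.DictReader(f):
            payload = {
                "id": row["ID"],
                "name": row["Name"].title(),
                "amount": float(row["Amount"] or 0),
            }
            # Load the cleaned record into the target system.
            requests.post(TARGET_URL, json=payload, timeout=10).raise_for_status()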


I object to the word "apocalypse", it is just business as usual.

Better automation has been "reducing the number of people required to deliver technical solutions" for ages.

Local Area Networks replaced many mainframe computers in the 80's. Optimized C compilers took the jobs of countless Assembly programmers. WordPress, Joomla and better web frameworks (Django, Rails) took the jobs of many Perl/Web developers. Python enabled a lot of people to do what FORTRAN/Java/C++ programmers were able to do before.

"Apocalypse" is just the normal state of affairs.


I'm advertising for 2 roles at the moment: a Senior Backend engineer and a Junior Frontend (in London, UK). Almost impossible to find someone for the backend role, but I've had to turn off the advertising as I've had 71 people apply for the junior frontend role. The mix is fascinating, a lot of ex-bootcampers, some CS grads, some self-taught people, but all of them are desperate for a shot to get into our industry. I've found this quite worrying as a signal for what's happening in the wider economy.


Could be a lot of things, but my guess - maybe seniors don't find your offering compelling enough? Try avoiding "young teams" and "beers" and put more "healthy snacks", "work life balance" in the job offer and see what happens. Seniors generally have better options than juniors, you're gonna have to work hard to get and retain them.


I read an article regarding this recently... With junior engineers, they are selling themselves. With senior engineers, the company has to sell itself to them instead.


My comment was about comparing and contrasting the amount of cold interest (/desperation). I wasn't passing comment on how hard it was to recruit - of course it's harder to recruit experienced people.


Yes, the web front-end is perceived as a low-effort entrance to the software development field. Purely commercial bootcamps, people from unrelated fields, people without a clue about networks and browsers, people hanging on to anything which will allow them to land a job - it makes it trivial for Google and Facebook to basically take absolute control over the web with their technologies, solutions, and stacks.


I'm waiting for it to happen to web dev so that companies can find something else to fixate on so I can go do that.

Companies are always going to follow the latest trends and it's always going to take smart people to follow them. I'm not worried about my ability to make a living. I just can't wait to see what comes.


don't hold your breath, this beast won't go to sleep soon.


The author and I share nearly identical work histories. I've been in IT for about 10 years, starting with AWS and database administration, then turned more DevOps with a focus on CI/CD. Over the last few years, it's become very obvious that the DevOps role is requiring more and more development skills. Simple bash scripting is not going to cut it in the modern tech company.

A couple years ago I made the switch to full time development. I now do most of the DevOps stuff for my teams, but from a developer role, instead of a sysadmin/cloudops role.

I'm certain that's going to be the future. Look at Google's requirements for SREs. They are full-fledged software engineers.


I’ve gotten to a point where if we write a bash script for something I consider it somewhat of a failure. I want operational tooling to be easily maintainable by a large team of software engineers, and bash scripts just aren’t that.

Services that run services is the way to go. Lose a box from the fleet? No problem, the operator service bounced it and got it running again. I’ll admit that timelines often mean you don’t get to build the grand vision out of the gate, but I absolutely agree with you that bash scripting isn’t how we should be running services these days.
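
As a toy sketch of the "operator service" idea (the health endpoint and restart command are placeholders, not a real deployment): poll the service and bounce it when it stops answering, instead of paging a human to do the same thing.

    import subprocess
    import time

    import requests  # pip install requests

    HEALTH_URL = "http://localhost:8080/healthz"          # placeholder endpoint
    RESTART_CMD = ["systemctl", "restart", "myservice"]   # placeholder unit name

    while True:
        try:
            requests.get(HEALTH_URL, timeout=2).raise_for_status()
        except requests.RequestException:
            # The service stopped answering: bounce it instead of paging a human.
            subprocess.run(RESTART_CMD, check=False)
        time.sleep(30)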


Is there another side to this story? I've spoken with friends running the engineering teams at big analytics companies and they've gone back to self-managed, bare-metal servers because they couldn't get the cost/performance they needed from "one size fits all" cloud solutions.


Well AWS is expensive.


And if AWS becomes one of the few games in town, it's going to get more and more expensive.


Conspiracy aside, AWS is most likely profiling all of its customers' code bases and coming up with patterns as to what domains their applications are being used for, etc. In order to capture that low-code/no-code type of market, they don't have to do any market research when the research is already sitting with them. It is pretty genius from a business point of view, but yes, it can become very concerning for mid-tier programmers, as the article states.


Most enterprise IT is change management. Migrating that database cluster takes 3-4 man hours. Planning the migration might take a half dozen meetings, vendor calls, dry runs, calculating transfer duration, etc. Enterprise IT isn't going anywhere. The cost of doing it right 100 times is greatly outweighed by the potential loss in doing it wrong once.

People who are already on the cloud aren't going to ditch their service providers so they can take time away from their actual business to manage something they don't understand to save a few hundred dollars a quarter.

Finally, programming hasn't changed substantially in the last 15 years for most people. I know that might sound shocking to people on HN, but most developers are working in dingy cubicles on archaic systems without continuous integration or release management. They're deploying to production and then jiggling the handle until things work. The systems they produce are just good enough to keep other business units crunching along, and their saving grace is two-fold: they don't cost enough to warrant real scrutiny, and the potential utility of an efficient IT department is non-obvious to most business managers.


Cubicles! Where do I apply?


I am a developer on HP/Nonstop. Can someone please explain what automation means with respect to software? Is it bash scripts? I can see bash scripts doing routine tasks, but not complicated business logic, which I think can only be done by writing programs. I lost my job due to outsourcing, not automation.


"I can see bash scripts doing routine tasks but not complicated business logic which I think can only be done by writing programs."

Bash scripts are programs.


Bash scripts have existed since the 1970s, when Unix was used. That did not result in a loss of jobs, though.


Completely agree with this.

There is something to think about. The folks who are running things are plentiful right now but as this churns we will run out of folks who understand the lower layers well enough to manage it.

As I interview people I run into this for things like SRE and DevOps already


Exactly this. I've been a sysadmin since I got out of the Corps, with a recent break, and have now found a really exciting startup that is automating something else. This is the first time I've really forced myself to stop allowing technical debt and lack of staffing to dictate how I automate, so I am working very closely with the devops/SRE team, and I am continually flabbergasted at how much they don't know of the underside of the systems they use, or just the fundamentals of many things. (I even had one guy yesterday try to tell me I should stop using emacs and start using pycharm because "it's a real IDE", yet this is one of the guys who's going to "automate all the things"!?) I'm having to learn to automate better, but they don't seem to be learning how to admin better. I also consider that part of the sysadmin's job... just as devops and SRE need to be "evangelists" in order to get C-suite buy-in, I think the sysadmins need to make sure the lower-level knowledge isn't being lost, or worse, discarded falsely as not relevant.

Having seen the inside of so many places though, it also continually surprises me how many companies are all struggling with the exact same issues, to the point I'm highly tempted to try to pitch a whole IT solution I've been thinking on for years at the next YC.

One thing I've noticed on HN is that too many people tend to think of all companies as SV software startups, when there is a huge swath of companies in-between coasts that don't have a single dev or programmer and are just doing their business. Much of the "devops is killing sysadmin" hysteria is overblown due to this filter bubble.


There's definitely an issue in the devops/SRE world around understanding the totality of their stack. But I would submit that this is endemic to tech in general, not just devops/SRE folks. The combination of siloing and incuriosity isn't an SRE problem, at least not exclusively.

Selfishly, it also makes things harder for me, as somebody who is a generalist's generalist, because people legitimately do not know what to do with somebody who's built and shipped mobile apps, can drop into a new piece of backend software and rapidly get up to speed, will architect, implement, and manage your cloud environment, and has hard-won opinions about rack cabling techniques. But I do OK regardless.


And most companies see IT as just a cost factor, so they don't really invest in automation. They mostly just invest in creating even more data for marketing and other funny guys.

There is also the problem that if IT is far away, the smallest problem needs "tickets" and days to solve, and it will never be possible to automate something with a self-made script, because you're just a user and have no rights.

(Yes, I'm a bit frustrated because a lot of these things are happening at work at the moment...)


This is a real thing for certain areas like packaging and deploying software, or database administration to a degree. But I think that since we continue to create more and more software, there will still be plenty of demand, until AI becomes a significant aspect of software development. That may not happen until at or near the time that we achieve truly general artificial intelligence. This is pure speculation, but in my opinion the first "real" AGI systems are less than two years away. Within 5 to 10 years of that happening, I expect unaugmented human programmers to be obsolete.


I submit that engineers should NOT be working on anything technical except at the highest abstraction layers. We should NOT be coding, or scaling servers.

Technically minded, astute engineers are great at solving problems. In an ideal world, we'd be using our ability to solve complex real-world problems. I'd much rather the Golang engineer equivalent 20 years from now solve critical water-supply issues for a village than write APIs with flame graphs. Full disclosure: CS engineer by love and training.


The same thing is happening in data science, just at an earlier stage. There are software/service providers like H2O and RapidMiner that automate the whole model selection/optimization/productionization process.

It will take some time, but I think these providers will basically displace advanced ML knowledge workers, especially at large fortune 500 companies. IMO to stay ahead of the curve, data scientists need to pick up more business knowledge and move into a business/financial analyst role.


>Repetition is a sure warning sign. If you’re building the same integrations, patching the same servers over and over again every day, congratulations – you’ve already become a robot.

This is absolutely spot on. It's our company's bread and butter to automate repetitive human operations. I can tell that many companies are looking for solutions that would automate as much human work as possible. Repetitive routines are the typical candidates to replace with software or less skilled personnel.


This article is very prescient. This is happening now and the pace is speeding up considerably. I work in healthcare and we hit a tipping point about a year ago where HIEs and hospitals are now comfortable with their vendors going to AWS instead of insisting on on-premises infrastructure. In the last year, my company has basically turned on a dime and we no longer use virtually any of our on-premises hardware. If you are in IT (non-developer), you should be concerned.


At the same time, as a software engineering manager, I would have given my left eye for a highly competent AWS DevOps engineer. It took months to find someone with the skills we were looking for.


You might be an exception, but food for thought: the number one reason why my clients, when I was running a consulting business, couldn't find in-house DevOps folks was an unwillingness to pay. At best, most places would pay line engineer rates--the same thing they'd pay for a Java or a Rails developer--for a skillset that's significantly broader than most (certainly not all, but most) line engineers at a given level of seniority and is significantly more in-demand. In the Boston area as an example, I regularly saw companies trying to get senior devops engineers at rates more appropriate for mid-level, and principals for hilariously under rate. (We're talking $140K offers for a senior devops engineer when the next place would offer $180K, and $160K for principals who could get $220K elsewhere.)

To be fair, this is sometimes trickier because a lot of "devops engineers" are actually mouse-driven system administrators, so the means and medians often look odd. That somebody who does what I do, and somebody primarily doing things hand-o-matically, would have the same job title throws a wrench into one-to-one comparisons. But you can figure it out.


If IT were going to become obsolete, technology wouldn't suck. But it does. Virtually all technology that either has or uses a microprocessor sucks. And it probably always will, as long as humans are creating technology.

In fact, the steady increase in both the use and the diversity of technology means we'll need more people than ever in IT. The job title might change, but there will always be a department dedicated to fixing shitty technology and helping users use it.


Apocalypse, no. A great filter, yeah, I think so.

Comparing this to factories shutting down is hyperbolic. I think what we're about to see is more in line with a job market that rewards things other than specific application knowledge.

However, I can't say much in defense of the mid-level IT pro, because just this past month I went from considering a server running an OS and a MySQL db somewhere that someone would have to maintain, to using AWS DynamoDB with API Gateway and automated calls to Lambda functions - I can completely get rid of our server and stop paying a mid-level IT pro to maintain it. For our usage it'll be practically free now, where I feel like I would have been paying a lot of "enterprise IT" overhead before.

That doesn’t mean that guy is going to starve, he will have to either adapt as factory workers have or be valuable in a different way than “I’m good at something people don’t really need anymore”. This is as old as time.
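
For a sense of scale, a sketch of the kind of Lambda handler that stands in for the old server (the table and field handling are invented for illustration): API Gateway invokes it, DynamoDB stores the data, and there's no OS or MySQL install left to maintain.

    import json

    import boto3

    table = boto3.resource("dynamodb").Table("Orders")  # made-up table name

    def handler(event, context):
        """API Gateway proxy event in, item written to DynamoDB, JSON out."""
        item = json.loads(event["body"])
        table.put_item(Item=item)
        return {"statusCode": 200, "body": json.dumps({"ok": True})}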


Yes, this is as old as time, but it's just that now we've started eating into positions where, with a moderate education investment, you could have a decent technical career.

Now that isn't the case. The traditional system admin will need to adapt, but adapting looks like going back to school for software development.


The flipside of this is, in a hypothetical world where Amazon / Google is trust busted, this problem is delayed somewhat because only Google / Amazon have the necessary scale to create the IT apocalypse. I mean there is Salesforce and Heroku but the major IT automation stuff is from the FAANGs really.


They used to say a jack of all trades is a master of none; these days SMEs feel like they are walking on quicksand. SMEs in core competencies are still useful, but more and more are becoming jacks of all trades by default because of how quickly the industry evolves. Survival of the agile-ist.


Until now, the destruction of jobs through progress has never reduced the overall amount of work we needed. For instance, think about how the development of transportation actually created dozens of new jobs, and brought us in many regards onto new levels that wouldn't even have been imaginable before. I wouldn't worry too much about something that might not be purely bad, and would instead try to figure out what can be done with the changes that we already see. E.g. all this nice virtualization and automation means it's so much easier to share code and changes and ideas, and helps us communicate over borders, language barriers, etc.


I feel this is a very late article and that it has already happened, but not in a crisis way. Meaning, this has allowed work to shift toward programming tasks rather than infrastructure, and hence there is more competition to get features out the door quicker, which creates more IT work for programmers. I haven't seen a shortage of work, in other words, rather more of it. In terms of code automation for non-programmers, these tools will always fail unless they become AI-integrated with some agency and can figure out the minutiae of demands real-world apps actually have.


I am seeing this happen with game developers as well. Unity has made it super easy to release a game which is fantastic. However a lot of newer game developers don't necessarily know as many fundamentals.


> Forrest Brazeal is a cloud architect, writer, speaker, and cartoonist, currently based in Forest, Virginia. He is also an AWS Serverless Hero

The author's bio indicates extreme bias :p


FTA:

>> But they’re less expensive than a bunch of humans with health benefits

Whenever someone weighs human health and fatigue against robots, he basically makes humans less human and more robot-like, hence justifying the superiority of robots. It also makes it seem almost logical that humans are a PITA to work with and therefore are overrated, that is, too expensive.

The only way out is to reaffirm the value of humans, instead of underlining their weaknesses.


Ok. While idealistic, this isn't what businesses do.

Businesses are required to sell, to pay salaries. Prices come down over time. Competition mounts, it gets bloody. Consolidation and then suppression of margins.

Humans ARE machines, just a different type; it doesn't "Feel right" to call us that, but we are meat machines.

There is no way to expect the trend of automation, the removal of human tasks for a "better, faster, cheaper" environment, to stop. It won't happen.


Give it 50 years and we'll all be "tech priests" from Warhammer 40k, praying to the "machine god" Alexa in hopes she stops telling us to take our meds and instead just turns the darn TV back on.


Most people working closely with any sort of computer-related technology should already know they've signed up for continuous change. If you want to learn a good-paying skill for the next 20-30 years, try these:

Plumber, Electrician, Auto mechanic, Janitor, Police officer ...


It would be interesting to know the growth of employees at the cloud providers relative to the number of IT employees at non-cloud-provider companies over time.

I wonder how much of this is simply a consolidation of talent and resources?


On the other hand, as an individual working in IT I can achieve so much more than I could before... team sizes seem like they’re a lot smaller now as a result. That’s the upside, surely?


None of this is new. "Learn to code, or become irrelevant" has been a massive talking point for at least 10 years, maybe more, and this is only becoming more true by the day.


> but that they (cloud providers) are straight-up reducing the number of people required to deliver technical solutions.

I would like to see some evidence for this. Has anyone had such an experience?


I thought this was going to be about how our shoestring, bubblegum, and tape made cloud applications become self aware and kill themselves out of shame and disgust...


>No, the real trend to watch here is not that the cloud providers are making it easier for non-technical people to code (although they are), but that they are straight-up reducing the number of people required to deliver technical solutions.

Prehistoric era: Agriculture and domestication are straight-up reducing the number of hunter-gatherers

1800s: Cotton mills reducing the number of manual textile manufacturers

1908: Internal combustion engine straight-up reducing the number of horse stablehands

-Vacuum tubes reducing the number of human-computers

-Cell phone reducing the number of switchboard operators

Not seeing the problem here.


The use of the word "apocalypse" seems hyperbolic. How are these changes any different than previous technological advances?


If you are a point-and-click DBA you should be worried about your job even without this or any future products from any vendor.


It's kinda tough figuring out a whole new direction if you're in IT. But if you're looking to get into web development, it's a pretty good time. Start off with HTML, CSS, JavaScript and AJAX. That's the foundation, and from there you can learn Vue/basic HTML/CSS from my tutorials: https://codeorc.com


The site is broken if you have JS disabled. The top banner covers the entire first page of content.


Eh, I've been a sysadmin/programmer since 1997 or so; I'd argue the slow automation of our profession has been going on forever. It used to be that sysadmin started as the kid who swaps the tapes and does the reboots, and went up to the person automating, and above that, the person troubleshooting your backtrace when the kernel panics. We were all sysadmins, just of different levels.

But... the job, if you were a mid to senior sysadmin, was always to automate away your job. There was always this race against yourself and the other sysadmins to keep your skills ahead of the scripts, and there were always people who fell behind during the crashes and couldn't get back in.

Really, I think the radical change that cloud brought us sysadmins was that employers care so much less now about your ability to go from a kernel backtrace to bad hardware. Knowledge of hardware in general has lost most of its value; that's a lot of my knowledge and experience that isn't worth much anymore, and a lot of people who will have to find new roles. In my early 20s, I paid expenses while trying to start a company by being the on-call datacenter guy for a few small companies. I mean, I'd charge a minimum number of hours, and they'd call me up or, depending on the customer, my systems would page me when something needed fiddling with at the datacenter. This job... almost doesn't exist anymore, because most of the customers are on AWS. (I mean, the job does exist; prgmr.com still even has co-lo customers, though I don't think they are accepting more... but it's super rare.)

Now, it seems like we want to split the sysadmin role into different titles:

SRE - production sysadmins: you have a high bar for programming skill and Linux internals knowledge, but a very weak requirement to have experience with standard configuration management tools (often SRE implies you will be using custom configuration management tools).

Then you have the DevOps title for smaller-scale cloud sysadmins, with a low bar for Linux knowledge, a medium bar for programming knowledge, and a high bar for knowledge of standardized configuration management tools.

The sysadmin title remains for people who hit SRE standards for Linux knowledge but don't meet the programming bar of SRE, or don't necessarily have the experience with the standard configuration tools of DevOps.

Generally, both in terms of pay and in terms of the number of machines you manage, SRE > DevOps > Sysadmin - of course, that's modified by level, location, and company.

(Note, both SRE and DevOps involve systems design in ways that I don't understand well enough to speak about. I mean, I can pass SRE interviews at 3rd tier companies no problem, but at the first tier, I'm not even sure why I'm failing, which means I'm a long ways from passing. I mean, I also need to level up my programming to be a SRE at a first-tier company, but I have a solid idea of where that bar is and what I need to do to get there. I don't understand the systems design questions well enough to even really know what I have to learn.)

Also note, I personally think that SRE is probably a more solid career path than DevOps going forward, if you are in an operations career. Programming basics has been the thing that has changed the least during my career; the skill that is most portable. You don't want to marry a particular tool any more than you have to, 'cause that tool... might be out of favor next year.


The other thing I point out here is that "the cloud" is suuuper expensive, and while that is still way cheaper than hiring a me or two to handle your hardware while you are small, there comes a point pretty quickly on Amazon where buying hardware and hiring a few people like me becomes cheaper for base load in the long term.
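To make that "base load" math concrete, here's a rough back-of-the-envelope sketch in Python. Every number in it is a made-up placeholder (instance price, server price, colo rent, extra staffing), not anything from a real bill; swap in your own quotes before drawing conclusions:

    # Back-of-the-envelope: steady base load on always-on cloud instances
    # vs. owned hardware in a colo facility. All numbers are hypothetical.
    YEARS = 3
    INSTANCES = 100              # always-on cloud instances the base load needs

    # Cloud side (placeholder, reserved-ish hourly rate per instance)
    instance_hourly = 0.20
    cloud_total = instance_hourly * 24 * 365 * INSTANCES * YEARS

    # Owned-hardware side (placeholders)
    vms_per_server = 10          # comparable VMs one physical box can host
    servers = INSTANCES / vms_per_server
    server_price = 10000         # per box, one-time, amortized over YEARS
    colo_monthly = 2500          # rack space, power, bandwidth for the fleet
    extra_headcount = 0.5        # hardware-capable admins beyond the cloud setup
    admin_salary = 140000        # fully loaded, per year

    owned_total = (
        servers * server_price
        + colo_monthly * 12 * YEARS
        + extra_headcount * admin_salary * YEARS
    )

    print(f"Cloud over {YEARS} years: ${cloud_total:,.0f}")
    print(f"Owned over {YEARS} years: ${owned_total:,.0f}")

With these invented numbers the owned fleet comes out a fair bit cheaper over three years, and the gap widens with scale or a longer amortization window; spiky or short-lived load flips the result back toward the cloud.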

I believe that companies are staying on Amazon longer than it is in their interest to do so, and I further believe that this will change when people start looking for savings during the next downturn.

For that matter, I believe that people will overreact and move in the opposite direction; you will see a fad of people buying their own hardware before they really have enough load to justify the person with the requisite hardware abilities. That's just how these in-sourcing/outsourcing fads work.


As someone who straddles SRE and SWE (with a classic CS background), I would agree with this analysis - the market is changing, and prioritizing programming skills heavily.


The problem extends outside the IT field, as you mentioned with factory workers. I think the main problem is that there's no coordinated effort to manage these inevitable shifts. Workers' unions tried protecting workers' rights, but capitalism clearly won, and if there's no profit, you have no security. IMHO the advice in this article is very good: don't put all your eggs in one basket, don't be afraid to let go of outdated knowledge, and prepare for the change that is going to happen. I'll just add that this is as much a political issue, since unemployment is and will be a huge problem; I can only hope our public sector will become proactive instead of reactive.


Capitalism won and in return we got Trump.

That doesn't sound like winning.


I really thought this would be about something else, given the title, but oh well.

As others have pointed out, this has been going on since... the beginning of IT.

I got my start in the '90s, working for an employer that sold tech software on AIX, Digital UNIX, Ultrix, SunOS, Solaris and IRIX. SunOS and Ultrix were on their way out, but still supported. Windows and Linux were relatively new additions. 64-bit was just starting to become a thing, so we needed to support both 32- and 64-bit versions.

Supporting dev, stage and prod environments for all this was a huge job, and took a fairly high level of technical skill as well as a huge amount of domain-specific knowledge. Really - far more domain-specific knowledge than technical skill, on balance.

It did require building a lot of software, and autoconf was a (new) blessing for cross-platform builds.

My CS degree was helpful, but honestly there just wasn't much programming needed, outside of shell scripts.

At the time, every few years someone would say how all of this "IT management stuff" would be automated away. I distinctly recall Sun executives banging on about it in the media a bunch just as I was starting to work full time after graduation.

And I distinctly remember thinking - they are completely wrong. And they were.

Commoditization was a big shift - now all those UNIXes are gone and we're left with just Linux.

"The stuff" that cannot be automated was shifting then, has shifted a lot since then, and continues to shift.

At the time, lots of knowledge and skill was needed to build sendmail for every OS, configure and install it everywhere. Now we've basically just got Linux, and just about every package you can think of is available via apt and yum.

Configuration is done through a DSL like Ansible, Chef or Puppet.

And now we're shifting such that we'll just use SES or some other cloud service for sending, and we won't manage mail servers at all - or any other commoditizable service: SQL databases, NoSQL, NFS, block storage, etc.

Or we will, we just won't be tweaking many knobs and buttons on it - and we'll still be managing it primarily with a DSL rather than bash scripts.

And perhaps writing a fair bit more "real" code as well.
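As a sketch of what that "real" code can look like, here's roughly what sending a message through SES from Python with boto3 involves. The region and addresses below are placeholders, and SES requires the sender address to be verified before this will actually work:

    # Minimal sketch: send mail via AWS SES instead of running your own MTA.
    # Placeholders: region and addresses; the Source address must be verified in SES.
    import boto3

    ses = boto3.client("ses", region_name="us-east-1")

    response = ses.send_email(
        Source="alerts@example.com",
        Destination={"ToAddresses": ["ops@example.com"]},
        Message={
            "Subject": {"Data": "Disk usage warning"},
            "Body": {"Text": {"Data": "/var is at 91% on web-03."}},
        },
    )
    print(response["MessageId"])  # SES returns a message ID on success

No sendmail.cf to hand-tune, no queue to babysit; the knobs you give up are traded for a per-message bill and one more vendor dependency.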

But - somebody's got to stitch all that together. As the article says, the key is providing value that is specific to the company/product/service. It always has been!

Creeping - sure.

But is it an apocalypse if a bunch of DBAs and Windows Administrators have to learn some new skills, or retire, or lose their jobs? People who basically have had to be continually learning and adapting all along?

Was it an apocalypse when I "lost" the career value of all the skill and knowledge I had related to IRIX, Digital UNIX, Solaris, etc?

There is a REAL creeping apocalypse, but it isn't this. It is security. Software is eating the world, and for every line of code written, X new security bugs are introduced. In this, I'm including social engineering bugs.

That is creeping.

And the apocalypse will be when some combination of those bugs leads to something truly horrific. If it hasn't already - like the end of democracy.


Pop the ad bubble and the IT apocalypse described in the article won't be a viable scenario.


What clickbait.


If this article is accurate then we should end the H1B program immediately, or at least increase the minimum income required on it to $100,000.


His thing about the 'fox' at the bottom makes me hope any job he attempts is immediately automated, what a dunce



How do I downvote this comment into oblivion?



