
Software will eat software in a remote-first world - sidhanthp
https://themargins.substack.com/p/software-will-eat-software-in-a-remote
======
brokencode
The premise that we are on the verge of some breakthroughs in software
development that will significantly reduce the need for engineers is really
weak, and is something people have been saying for decades (see the failed
fifth generation of programming languages from the 1980s for an example).

In my experience, software engineering is endless, complicated decision making
about how something should work and how to make changes without breaking
something else rather than the nuts and bolts of programming. It’s all about
figuring out what users want and how to give it to them in a way that is
feasible.

The idea that we will have some abstraction that will someday (in the
foreseeable future) save us from all of this difficult work sounds very far
fetched to me, and I can’t imagine how that would work.

Even the example of hosting complexity being replaced by cloud companies seems
kind of silly to me. Maybe that’s saving very small companies a sizable
fraction of their engineering resources, but I really doubt it for medium or
larger companies.

The cloud saves us from some complexity, but it doesn’t just magically design
and run a backend for your application. You still need people who understand
things like Docker, Kubernetes, the endless different database options,
sharding, indexing, failover, backups, message queues, etc. Even if the pieces
are now more integrated and easier to put together, the task of figuring out
how the pieces will interact and which pieces you even need is still
outrageously complicated.

~~~
hinkley
I spend a lot of time thinking about how to make development easier, or at
least less error prone.

Every once in a while I have a moment of clarity.

I remember that the other part of our job is extracting requirements out of
people who don't _really_ understand computers, even the ones who are
ostensibly paid to do so (and if we're honest, about 20% of our fellow
programmers). The more you talk to them, the more you realize they don't
really understand _themselves_ either, which is why shadowing works.

If building the software gets too easy, we'll just spend all of our time doing
the gathering part of the job.

And then I will do just about anything to forget that thought and go back to
thinking about less horrific concepts.

~~~
panta
It’s even stronger than that: the other part of the job is extracting
requirements from people who don’t understand the problem they want to solve -
even when the problem is not technological. There is no silver bullet (AGI
would be it, but we are far from achieving it imho).

~~~
brodo
Exactly. Most of the time, the problem is not to find out what people want and
put it into software. The problem is to help people in the process of
discovering what they want and what can be done. After that, development can
begin.

~~~
groestl
> After that, development can begin.

And trigger another cycle, as (successful) software inevitably changes the way
people understand their own problem.

------
JMTQp8lwXL
This sounds like it was written by someone who hasn't worked with ERP systems,
to give an example of where software will never eat software. "Can we make A
work with B?" -- a lot of businesses tie together Salesforce + 8 other
systems, and that's how their business ticks. And a whole cottage industry
forms around it: consultancies, etc. I need to see a clear, non-hand-waving
explanation for how all of that complexity melts away.

The industry has already tried commoditizing by off-shoring. What we learned
was high-performance teams require psychological safety and trust. The human
factors involved in creating software are why engineers are not plug-n-play.
Because that reduction of the problem doesn't describe how the software is
actually made: product solicits customer interviews/data to recommend new
features, architects brainstorm a high-level solution, and the IC engineers
implement the vision. Human factors, through and through.

~~~
sailfast
In your specific Salesforce scenario, they built Salesforce to stop people
from coding CRM systems, and then people continue to make money building
connected abstractions / apps on top of other Salesforce-compatible systems,
so you get an app ecosystem that abstracts away all of these integrations. The
result is less code.

I agree that there is a lot of complexity, specifically in "mission critical"
or "last mile" systems, that will not be addressed by the mainstream
abstractions for many organizations, but I don't think Salesforce is
necessarily the best example. I see the author's hypothesis as freeing up time
to do a lot more things within organizations that are otherwise on the back
burner because you can't get to that feature set, and/or pivoting to solve
either a) complex problems that are not yet solved or b) specializing in a
layer that is now "platform". Somebody builds AWS, and Azure, and GCP.
Somebody has to create, build, and maintain the next platform / abstraction
too.

~~~
0xB31B1B
I think your fallacy is in the "less code" assumption when you say "The result
is less code". I'd argue that empirically we've seen this to be false. The
result isn't less code, at least in a global sense; it's more productivity,
more features, more customization, and more specificity at a lower cost per
feature. Software has really interesting economics where, as the cost/feature
decreases by some factor, the set of features that can be profitably worked on
expands by something like 10x. So, paradoxically, as cost/feature decreases,
it makes sense to hire more engineers and expand R&D budgets.

~~~
Apocryphon
I think ultimately, the question is whether this trend will result in "fewer
programmers needed", which is the most important by-product of "less code" in
the author's thesis.

~~~
0xB31B1B
Did we slash R&D budgets once we standardized on the x86 instruction set, thus
needing fewer compiler devs? Did we slash R&D budgets when we moved from
on-prem to cloud hosting? We have seen this happen many times before; we know
the economics. Decreasing cost/feature is synonymous with increasing
productivity, and we know that an increase in productivity results in a very
large increase in the number of features that become feasible.

There isn’t some fixed factor here that causes it all to collapse.
Productivity increases are plowed into growing the market 10x and building the
business, not reducing eng budgets. At some point in the future this will slow
down, but that is so far from happening, like many decades from now, maybe
never in a non-theoretical sense.

------
cm2187
The problem is that while it is economical to hire a team of developers to
automate a simple process executed by a lot of people, there is a huge amount
of complex processes executed by a small number of people in companies. And
you are never going to be able to justify a team of 3 developers or more to
automate and maintain the code of the job of just one guy.

Machine learning won't be the answer either. Machine learning is just another
kind of software: you still need to set it up and maintain it. And you need
data to train it, which for these complex processes often doesn't exist.

The solution really is for non-developers to write code to automate their
tasks themselves. Here, simple code with a simple platform to run it is the
only solution. But we are heading in the opposite direction. Newer generations
are becoming increasingly removed from how computers work (teenagers seem to
struggle even with a file system), and platforms are increasingly locked down
(in both consumer and corporate environments). And I dispute the claim that
software is getting easier. I mostly work in a .NET environment, and I think
the platform is becoming increasingly messy and complicated; we are moving
away from simple things. Same with technologies: every time I go back into
the Azure portal website, I feel lost among the hundreds of products with
evasive names.

What we need is the power of the almost "draggy and droppy" features of VBA,
something end users can play with. It is shocking how much office processes
rely on such an antiquated and neglected technology.

~~~
MattGaiser
> Machine learning won't be the answer either. Machine learning is just
> another kind of software, you still need to set it up and maintain it. And
> you need data to train it, which for these complex processes often there is
> none.

Machine learning will learn to set up, maintain, and train itself. /s

I understand the desire to have non-developers write code for themselves, but
the problem is that the quality and reliability of that code can be utterly
terrible, and they don't have the expertise for the edge cases. So there would
still need to be at least an intermediate developer overseeing these 20
part-time, very junior people, simply because some of those processes would
eventually go haywire and do something dangerous or destroy some data.

~~~
schwartzworld
> I understand the desire to have non-developers write code for themselves,
> but the problem is that the quality and reliability of that code can be
> utterly terrible and they don't have the expertise for the edge cases,

People should write code for themselves, and not all code needs to be good
quality or have all edge cases covered. There's nothing wrong with someone
making a tool for their job and handling the edge cases as they occur.

~~~
sokoloff
You just about perfectly described 75-95% of Excel sheets that run an enormous
amount of business processes around the world.

~~~
slfnflctd
Microsoft Access could be a better tool for a lot of those spreadsheets, I
might wager! It would teach user interface design, simple relational database
concepts, data types (!), and more. I really wish MS hadn't turned this
product into a dead end; it seems like a lost opportunity to give aspiring
devs a path to learning more capable systems.

------
safog
I don't buy the argument that we will have such a leap in software development
productivity that we need way fewer people to solve all the things that need
technical solutions. You can unravel the abstractions we build on all the way
down to the bits and bytes, and 90% of software is just gluing libraries
together. In fact, you could probably describe the entirety of several big
tech cos as "just" gluing libraries together.

Anyway, the other argument about salaries is more interesting. Most people
seem to agree that there's a huge untapped crowd of qualified developers in
small / mid sized US cities who would love to join $BIGCO but the only reason
they aren't is because it involves relocation. As an example, a Sr Dev in
Orlando, FL makes $100-120k in total comp while one in SF / NYC makes $350k+.
I limit my search to Sr Devs because I assume college kids are happy to move
to exciting cities like NYC / SF / Seattle on fat relocation checks.

My suspicion is that supply and demand have already converged and big tech has
mined out the supply of talented devs in the US. The other datapoint
here is that companies have made it as easy as possible for folks to move by
opening dev centers wherever there's talent - NYC as a tech hub wasn't a thing
in 2012, but it's huge now for all the people who don't want to leave the east
coast. Boston is pretty big. Colorado, Austin as well.

The only way supply of devs is increasing here is if:

* Sr Devs who did not move to tech hubs because they preferred to stay where they are. (Personally think this is unlikely)

* Qualified Bootcamp graduates

* CS Enrollments hitting pretty high numbers, so maybe we'll start graduating lots of CS folks.

* Immigration reform / Outsourcing

* Interviewing changes so we skip the algo problem-solving shenanigans.

I personally think if big tech wants to hire in the US and still pay lower $
than they currently do, the only lever they have left to pull is the
interviewing format / bar.

~~~
partyboat1586
> Sr Devs who did not move to tech hubs because they preferred to stay where
> they are. (Personally think this is unlikely)

You would be surprised at how many people value their hometown or the place
where they have settled. Being technical only equates to high aspiration in
SF. There are smaller, slower, more steady tech companies (probably using the
Microsoft stack) outside of the tech hubs that offer stable jobs with decent
pay and good work-life balance. Being a software engineer in SF means
constantly learning new tech and 'keeping up', but if you're not building a
massively scalable consumer-facing product, that doesn't matter so much. In SF
even B2B SaaS is built like this, but it doesn't have to be.

~~~
exhaze
> Being a software engineer in SF means constantly learning new tech and
> 'keeping up'

No, it does not.

> You would be surprised at how many people value their hometown or where they
> have settled.

I would love to see actual data about this rather than articles from hometown
newspapers and posts by hometown residents, enthusiastically praising their
way of life. It's easy to argue the counterpoint as well, right?

1. There are many jobs that have to be done in person.

2. Many people _prefer_ to live in large cities and accept the downsides in
order to get the benefits.

So, next time you want to make a claim like this, can you share anything
objective about this? Thanks.

~~~
olioven
I know this is more anecdata, but I'm from Ohio and worked on a joint project
with a couple of New York devs at #{famous company}. They were very good, and
I learned a lot from them, but I could definitely hold my own with them
technically (as could another senior dev from my company). We're both family
men in the Midwest with no desire to move. Even if you offered me $350,000 or
whatever I'm still staying here. But I would obviously take a remote job with
#{famous company} if it paid 60-70% of that and I felt like I wouldn't be a
second-class citizen as a remote dev.

~~~
mgoblu3
Chiming in as a Midwest engineering manager here (Michigan). There's no lack
of talent in the Midwest, although it's certainly a different calculus to try
and match hiring to the supply/demand of engineers here and not everyone does
so appropriately.

------
programmarchy
The most salient part of this essay was the last section. Remote-first is
shaping up as a way to break tech labor power. Employees will be less able to
bond, and their personal conversations will be easier to spy upon. This will
drastically inhibit collectivization. WFH may have some upside, but there’s a
huge downside looming.

> Remote-First, Collective-Last

> Lastly, let’s talk about the impact of remote-first on labor. Many months
> before the virus hit, a CEO friend of mine “jokingly” told me that he
> believed all the “remote” buzz was as much about reducing the collective
> power of the employees as it was about saving a dime on salaries. He
> personally did not want his company to go full-remote but was under some
> pressure from investors to consider it. We were both hammered at the time,
> and I didn’t put too much thought into it, but it feels right the more I
> think about it.

~~~
JSavageOne
> Employees will be less able to bond, and their personal conversations will
> be easier to spy upon.

How does the freedom to work remotely make one more prone to being "spied on"
than being in an office?

Also, when did tech have any "labor" power? Last time I checked, any talk of
"unionization" would get you fired at the drop of a hat.

~~~
programmarchy
Because the majority of remote work communication will be electronic rather
than face to face. It will be monitored, stored, and searchable (Slack logs,
GSuite email, etc.). That’s much more difficult to achieve at the proverbial
water cooler.

Not sure which rock you’re living under, but tech is probably the last bastion
of labor power in the US. Sure, they aren’t well organized, but Google
employees recently canned a DoD contract [1]. Saying no to Uncle Sam is a huge
flex, and you can bet that pissed off some overlords.

Blue collar workers in the US have lost almost all of their labor power due to
offshore workers or immigration (i.e. scabs). Tech has had it easy for a while
now, but it’s the next target; the immigration debate is already shifting from
“jobs Americans won’t do” to “merit-based”. Gig economy is another false
liberation ploy being used to weaken collective bargaining power. Remote-first
is yet another.

[1] [https://www.washingtonpost.com/news/the-switch/wp/2018/06/01...](https://www.washingtonpost.com/news/the-switch/wp/2018/06/01/google-to-drop-pentagon-ai-contract-after-employees-called-it-the-business-of-war/)

~~~
JSavageOne
> Because the majority of remote work communication will be electronic rather
> than face to face. It will be monitored, stored and searchable (Slack logs,
> GSuite email, etc.) That’s much more difficult to achieve at the proverbial
> water cooler.

The majority of communication is already electronic. I've worked in offices
and remote, there is practically no difference on that front. Most discussion
takes place in Slack, Jira, GitHub, and email either way. Most companies
already have team members distributed across various offices, remote work
isn't new. If you want to avoid monitoring, then spin up a Google Hangouts or
Zoom meeting, nobody is monitoring that.

The difference is that in an office, 1. your physical presence is also
monitored (no joke, I worked at an office once where every day an office
assistant would secretly record what time everyone showed up to their desks),
and 2. your internet traffic is also monitored, which is extremely creepy, and
I'm surprised this doesn't get more attention.

When I work remotely, there are no IT managers able to spy on my internet
traffic. Sure, there may be some dystopian corporations who try to force their
remote workers to log into "company VPNs" and install "monitoring software" --
I have absolutely no interest in working for any such companies and have never
had to deal with that.

As a software engineer, I've been fighting for the freedom to work remotely
ever since I joined this industry, so having the freedom to live wherever I
want and work whenever I want is a huge win. I've already been working
remotely since pre-COVID, so I'm happy that other workers will also be able to
enjoy this freedom.

~~~
novok
If you work at a corporation beyond a certain size, usually big enough to have
an IT department rather than one 'IT guy', there is definitely some sort of
monitoring software installed on the computers given to you.

It just lives in kernel modules or as an OS config, and they do not tell you
as an employee. It's done for 'compliance' or 'security' reasons, so it's not
obvious what it is when you look at the list of processes in the task manager
on your computer.

Some popular ones in the Bay Area for macOS at startups are CrowdStrike,
Carbon Black, Jamf, OpenVPN, Umbrella, CrashPlan, Munki, etc. Not to mention
the OS management configuration stuff like MDM profiles for macOS and Active
Directory configurations for Windows. A lot of brand-name corps you might
think are 'good guys' use this, like Lyft or Dropbox. Similarly with FANG
companies, where it might be custom software.

~~~
disgruntledphd2
Yeah, like the one FB open-sourced (and uses internally still, I believe):
[https://osquery.io/](https://osquery.io/)

I think this is a super good tool, except for all the privacy implications.

~~~
zwass
Osquery core developer, consultant, and technical committee member here...

We've always trodden carefully around privacy concerns as a project. This is
why you don't see tables that access information such as browser history. If
you join our Slack channels you will see open discussion of privacy
implications for changes and improvements.

There's a balance to be struck between visibility and privacy. If a security
team has no visibility into a system, they can't secure that system. This
doesn't mean they need to be able to look at your family photos and read your
messages.

I work with folks who care deeply about privacy and trust with their users.
These folks ensure that osquery configurations are available to users, so that
they can see what exactly is being monitored.

If you are curious what osquery can do, I encourage you to check out
[https://osquery.io/schema/4.3.0/](https://osquery.io/schema/4.3.0/).
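For readers who haven't used it: osquery exposes operating-system state as virtual SQL tables, so monitoring is expressed as queries. A small illustrative example, using table and column names from the osquery schema linked above:

```sql
-- Which processes are listening on network ports, and who owns them?
SELECT p.pid, p.name, u.username, l.port
FROM listening_ports AS l
JOIN processes AS p USING (pid)
JOIN users AS u ON p.uid = u.uid;
```

Note what's absent from the schema: there is no browser-history table, which is the kind of deliberate omission described above.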

~~~
novok
I think the real issue, which has nothing to do with osquery or any specific
piece of software, is the corp being able to push arbitrary software onto
workers' computers and spy on its employees secretly. I call it the stalker IT
employee problem, or the psychopathic manager / lawyer problem.

I think the solution will ultimately be legal, as in some European countries
that don't let you do this kind of surveillance on your employees. And if you
want to look at an employee's work emails, you do it in front of them with
their lawyer present.

I'm glad you guys are trying to keep some semblance of privacy, though.

------
aschla
"Say software one more goddamn time!"

I don't really agree with the point that because we're creating things that
allow us to use less or no code, there will be less code to write. We make
things easier for ourselves so we can then build upon it and then write more
complex things to solve more complex problems. It's the continual layering of
abstraction that's been going on for decades.

~~~
trimbo
I think the argument is that things like Wix put web developers out of
business. Instead of real estate agents hiring a local webdev to make their
listings site, or whatever, they can just use Wix.

Those webdevs are out of luck if they can't flex into something else. But,
generally, I think you're right. When those jobs go away, in their stead we do
other things with software. So instead of building websites for local real
estate agents, SWEs are building systems to make virtual RE agents using ML.

~~~
paxys
I also find that sites like Wix are rarely used as a replacement for an in-
house dev team or professional consulting, but rather by businesses that
wouldn't have had an online presence in the first place.

~~~
coldtea
And yet, businesses that now use Wix used to pay good money for an "online
presence" in 1997-2005.

Even your local real estate agent or plumber paid for a website...

~~~
paxys
No business I know paid "good money" for that. At most they would get a
neighbor's kid (one of them being me) to throw together a few html pages.
Fancy ones had a PHP visitors counter.

~~~
coldtea
> _No business I know paid "good money" for that._

You'd be surprised. Back in the day web devs couldn't decline enough jobs, and
you could make a month's salary for what was a single static webpage...

------
jaredklewis
I thought this piece was really interesting.

His thesis seems plausible, but I’m far from convinced. It seems equally
plausible to me that as software gets easier to write, we just write more
software. Industrialization of clothing meant the amount of labor required to
create clothing dropped drastically. However, more people are employed in
clothing and fashion related industries than ever, because we now just own and
wear more clothes than ever before.

Won’t we just write more software?

~~~
adrianN
Eventually the market for software will be much more saturated than today. It
has already happened for consumer desktop applications. The rate of new
commercially successful desktop applications has dropped in the last thirty
years. Sooner or later software for businesses will be "done" too.

~~~
harpratap
This is like saying that since diesel cars are dying, the entire
transportation industry is coming to an end.

~~~
adrianN
No, it's saying that since most people in developed countries who want a car
already have a car, the market for cars is markedly different from the first
half of the 20th century.

------
Joeri
“No code” platforms aren’t going to replace software developers. I’ve tried
out several of the major ones and came away with the conclusion that they are
this generation’s Microsoft Access. Someone who isn’t a programmer might be
able to build something in them, but their IT department won’t want to support
those solutions, and they will definitely need support. I expect a cottage
industry of no-code consultants, who will be programmers under a different
name.

I think something like Airtable or Microsoft Lists is going to replace a lot
more custom software. They are this generation’s Microsoft Excel.

------
benjaminjosephw
When someone posts about the inevitable commodification of software
development, the response on HN is usually one of denial. I'm wondering if
some of these responses are starting to sound a little angry too - maybe we're
going through the five stages of grief as a community!

Lowering the bar to software development and opening up the employment market
to a global one is going to have an undeniable impact. It certainly won't take
away from the fact that problem solvers are needed to create solutions and it
also doesn't mean that there won't be a place for the highly skilled people
with specialist technical skills. But it _will_ change the landscape
significantly.

Software engineers will lose a lot of the influence and power they have
enjoyed over the last decade. In my view, this is an important and necessary
step. The ability to build software that solves meaningful problems shouldn't
be concentrated in the hands of a few people (even a few hundred thousand) who
have a narrow view of the kinds of problems software should be put to work
solving.

The article expresses concern that this displaced power _could_ end up in the
hands of big tech. I'm more of an optimist and believe that we could go in the
other direction altogether. The ability to create problem-solving software
can and will be democratized to a much broader degree than ever before and
will lead to more meaningful problems being solved across the world. As a
software engineer myself, that's the future I want to help build towards.

------
young_unixer
> We are coming to a point where software is developing so fast and the
> abstractions getting better that soon we will have more software written by
> a smaller number of people.

I think software is just a big pile of garbage. The thing is: our civilization
depends on this pile of garbage. It's remarkable that the whole thing somehow
chugs along.

Over time, the pile will just keep getting bigger and bigger and more people
will be needed to keep it from collapsing.

~~~
downerending
Yeah, pretty much. But maybe more like a compost heap. Occasionally something
nice sprouts from all of that shit. Not too often, though.

------
gitgud
It's a false dichotomy that because software is getting easier to write,
fewer people will be writing software.

Yes, it's true that software in the future will be easier to write, but we
will _not_ be solving the same problems as today.

In the past (20 years ago), developing a TODO app was relatively difficult and
could have been a valid startup idea. Today, however, it's a trivial
application to develop and is the equivalent of a "hello world" program.

The reality is: _writing software gets easier, while the problems software
solves get harder..._

~~~
typon
This is the right take. Same thing with 2D games to VR/AR, infotainment
systems to self-driving, RealPlayer buffering to Netflix. And the list goes
on. You can't get this progress simply by making better abstractions. You can
only paper over the complexities of "solved" problems.

------
jmull
Not sure if this is actually a serious (but misguided) take, or just the kind
of thing you have to write when you need to blog something but don't have a
good idea this week?

Software is getting more complex, not less.

No-code, no matter how magically great, won't change the fact that you have to
somehow understand and codify your requirements and figure out how to achieve
them through the available building blocks. (A no-code solution is just a
tradeoff in building blocks -- the more one does for you, the less flexible it
becomes. The lack of flexibility has a cost: to the degree a solution doesn't
exactly fit your system, it pushes complexity elsewhere, blunting or even
reversing the value.)

The longer-term effects of a remote-first world are a much more interesting
topic, at least (though I think the author still completely whiffs; the ideas
on why remote work will be bad for employees are very weak).

------
stephc_int13
In my opinion, the author is a bit disconnected from the reality of software.

We've clearly learned a few things during the last five decades, our
abstractions are better today but progress has been mostly slow, painful, and
linear.

A lot of the gains we're seeing are built on top of the tremendous progress
made by the hardware guys; we've been free-riding for a long time.

I think that a few bullshit jobs might be exposed/compromised by the
transition to remote work, and that includes managers, but not people doing
creative work.

------
FearNotDaniel
I'm not buying it. Just as MS Office didn't reduce admin work, but instead
created whole new categories of busywork for executives to do, so new software
frameworks are mostly just increasing the complexity of the overengineered
solutions we create to address what are really just simple CRUD interface
problems in the majority of cases. If only classic ASP in VBScript were still
being security-patched, I guarantee you the majority of web apps could be
written in it and be so much faster and more responsive than any of these
"modern JavaScript" front ends that have to make 78 different API calls across
a fleet of microservices just to show part of a database table on a page...
after the user has sat looking at some dumb spinning beach ball for 5-8
seconds. Naturally, you could maintain that shit with only one or two
developers instead of the rooms full of fullstack devops rockstars that it
takes to build even the most basic thing these days. Without a kubernetes
cluster in sight. But as long as our entire industry continues to throw away
basic architectural and performance principles in favour of embracing the new
hotness, your jobs are safe. Enjoy the busywork, folks, and let's continue
creating additional layers of abstraction and indirection to ensure the next
generation requires even more armies of developers just to say "Hello, world."

------
crazygringo
This is no different from saying that economic progress, in general, will eat
itself until hardly anyone is employed, which people worried about, for
example, in the early 1900s as farm employment fell dramatically due to
gains in efficiency.

Spoiler alert: it didn't happen.

What people always seem to miss is that as things become more commodified and
more efficient, we _move onto solving harder problems_ that aren't commodified
or efficient.

As programming CRUD applications or managing databases becomes quicker, we
start working on better product ideas, cleverer scalable architectures, and
whole new product categories.

Innovation never eats itself. This is a myth that people seem to fall for
decade after decade. (Unless we somehow reach the "singularity", but that's
purely in the realm of science fiction for now, and certainly not worth
worrying about until we solve AGI.)

------
MattGaiser
1. We have no-code already. It is called WordPress and its hundreds of
thousands of plugins. Is WordPress eating everything? Nope. Is it destroying
web development? Nope.

2. This entire post does not align with the existence of SAP, IBM, and
Oracle. Vast, vast, vast amounts of code are highly customized for a specific
domain. These companies would already have lapped up these synergies to boost
their own profits. I really want to know how no-code solves the processing of
custom contract objects.

------
JSavageOne
Software automation has been going on for a long time - there's no denying
that. I don't see what that has to do with remote work, though. It seems like
the author just stuck the remote work buzzword in to get some more clicks,
despite it being a completely separate topic.

> a CEO friend of mine “jokingly” told me that he believed all the “remote”
> buzz was as much about reducing the collective power of the employees as it
> was about saving a dime on salaries.

I've spent the entirety of my ~6-year software engineering career wondering
why a profession where you're in front of a computer screen all day was so
hostile to remote work. At every job I'd try to persuade managers to be more
open to flexible hours and wfh, ultimately realizing it was impossible to
change corporate office culture and giving up in favor of working exclusively
at remote-friendly companies.

So after years of fighting this fight for increased acceptance of remote
work, it's hilarious to hear some CEO telling people that remote work is a
ruse intended to decrease our labor power. That's like hearing a CEO say
"we've decided to increase the salaries of our engineers above market rate in
order to put them in 'golden handcuffs' so they won't leave." Sure, a
conspiracy theorist can always theorize about some possible hidden motive, but
I'll take the raise, thanks.

Remote work might be damaging to the mediocre SF engineer whose inflated
salary until now has been protected by companies'/investors' insistence on
engineers being in SF, but it's a boon to everyone else who never wanted to be
tied down to SF and living on some office campus.

------
buboard
> We are coming to a point where software is developing so fast and the
> abstractions getting better that soon we will have more software written by
> a smaller number of people

Maybe I'm on the wrong planet, because all the software I've seen in the past
~10 years is an overcomplicated, unmaintainable, slow mess that needs more and
more people to keep it from imploding.

~~~
deckard1
Look no further than Docker, which is the tech equivalent of giving up on
Earth and moving everyone to Mars because we've screwed up so bad. We
transpile one version of JavaScript to another and our stack requires webpack
now. Remember when the browser could just load JavaScript without half a dozen
middlemen in the way? Back when you didn't have to compile one interpreted
language to another, not even gaining a higher level language in the process.

It's rather laughable that software will solve the mess that software creates
any time soon.

~~~
msla
Docker is to deployment what an API is to development: Hide the complexity
inside some barrier and present a cleaner interface. The complexity can be
well-managed or not; the point is, the outside world shouldn't have to care
what you're doing in there as long as it works and is reasonably efficient.

~~~
msla
Hm. People don't like when you give an actual rationale for Docker.

------
amasad
I think it's absolutely true that remote means less solidarity and connection
between workers. Also, hiring of international remote workers is usually done
on a consulting basis, with fewer regulatory protections. I certainly see a
world where software engineers become more fungible, and with the reduced
messiness of human interactions, managers will be ruthless.

There are also some interesting arbitrage opportunities. For example, American
companies would want to hire from countries with socialized medicine so that
they don't pay for health benefits.

It could be more efficient, and it could lead to reduced poverty in some
places in the world, but on net I don't think it's desirable.

Ironically, the people who champion remote work the most do it from the
perspective of the benefit to the worker. But as this essay suggests, and I
agree, if you play it out, it will work out to the benefits of the companies
and executives.

To be clear I'm not against remote, and could see the benefit to everyone, but
it's naive not to play it through and think of the second-order effects of a
fully remote industry.

------
yaur
Software started as flipping switches, then moved on to tape and punchcards
and then to keyboards and the ability to write in words instead of bare
machine logic. Soon we were in the print phase where "print" meant putting
physical words on physical paper and sometime later you were able to actually
view your program on a screen without needing to reprint it. A generation
later we get code generation from a WSDL and squiggly lines under typos. A
generation later we get AI that tries to predict if I'm more likely to buy the
Smooshtek or Prodex version of the exact same item on Amazon but can't come
close to solving unknown business problems independently.

I guess what I'm trying to say is that the first and last job of software is
to eat itself. That's the environment, and if you want to make it in software
you have to learn how to use the new impossible thing that software can do,
and not take it as the brand new end of the world.

~~~
Drakar1903
We could draw parallels to warfare, the one industry that historically warps
everything around itself. First we fought with spears, swords and shields,
then with primitive muskets. Suddenly tanks and planes and helicopters
join the fray. Then carriers, submarines, missiles and targeting systems.
Radars. Space satellites. I'd be surprised if there aren't secret projects
that simulate the same potential conflict a million times to decide the chance
of success given new parameters.

And every time the new advancement in technology blows the previous ones out
of the water, but it is open to its own exploitable faults. Tanks beat
infantry in open field, but they're less suited for urban warfare if you don't
want to leave the city in ruin.

------
midrus
The last few companies I've worked for were using what would be considered the
latest trends in software development: SPAs, React, Redux, BFFs, monorepos,
kubernetes... I can tell you, we're not reducing the need for developers at
all... if anything we're building things so complex and overcomplicated that
they will require generations and generations of developers to maintain and
understand. So we're pretty safe for the time being...

Yes, there are things that would really cut down the number of people working
in the field, such as Heroku, App Engine, Firebase... but so few companies use
them that, compared to the messes we're creating elsewhere, I don't see them
causing any collapse of the job market.

------
notsag-hack
Considering how companies struggle to find qualified people to work for
them, even when offering good salaries, it seems to me a huge exaggeration to
say that going remote will represent a disadvantage for software engineers.
I've been a software engineer for 10 years now. If you saw the struggle we
sometimes have with dumb topics like timezone issues or ridiculous security
issues everywhere, you'd understand that software is really far away from
eating software.

Also, saying that it's easier for the company to get rid of people may be
somewhat true, but it's easier for people to pivot and find better
opportunities as well. To me it will increase job rotation, which in my
particular experience has helped me A LOT to improve my skills. This will help
improve job conditions as well: companies in search of more stability will
have to be even more creative to keep smart people around. They don't seem to
have understood this yet, unfortunately; good job conditions are not a ping
pong table and a bloody PS4 to play with.

------
joejerryronnie
It’s not the software engineers that will be automated out of a job, it’s
everyone else. We will need to deal with the fact that jobs which require very
little creativity and ad-hoc problem solving will largely be automated away
much sooner than software will be writing itself.

------
jupp0r
I think the author is a little out of touch with reality. In reality, state
government unemployment websites that are really simple CRUD apps collapse
under a few thousand daily users. Lots of simple office work can't happen
remotely because people are still using paper files. There is plenty of
disruption left in the world that can be solved by software.

------
AriaMinaei
> ... wrote a persuasive piece on how companies should pay people based on
> what value they add, not pay them differently based on where they want to
> live.

I learned the concept of the "golden handcuff" the hard way. We paid a remote
worker a fraction of the salary they'd make in a major European city. But that
salary was still a multiple of what they'd make at a local job.

This turned into a golden handcuff, as the employee started to lose interest
in the job, yet would try hard to convince themselves that they're still
interested. That was likely because any other job for them would mean a 66%
pay cut at the least, and a significant downgrade in their standard of living
(unless they'd find another employer as inexperienced as me).

Lesson learned: Negotiate close to market rate, and let the fulfillment of the
job be the talent's biggest motivator.

------
pknopf
> software is developing so fast and the abstractions getting better

Software won't eat software because this isn't true, yet.

IMO, the last 10 years have been an explosion of new frameworks and patterns
that are overly complicated and bloated. Because of this, there isn't easy
cohesion between off-the-shelf solutions.

Think of how eye-opening UNIX was when it first came on the scene, with the
ability to pipe together commands to solve new problems. But think of how hard
that would have been if each tool read ASCII differently, or piped text
differently. Imagine if grep and awk were combined into one tool with a
complicated set of switches.

I think eventually, working with off-the-shelf software to build solutions to
tough problems will be as easy as gluing together 3-5 UNIX commands in a bash
script to process some text files. But we aren't there yet.
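The pipeline idea can be sketched in Python. These are hypothetical stand-ins for grep and `sort | uniq -c` (not the real tools), written only to show why a shared plain-text interface makes gluing trivial; the log lines are made up:

```python
# Small "tools" that all speak the same interface: a list of text lines.
from collections import Counter

def grep(pattern, lines):
    """Keep only lines containing the pattern (like `grep`)."""
    return [line for line in lines if pattern in line]

def uniq_c(lines):
    """Count occurrences of each distinct line (like `sort | uniq -c`)."""
    counts = Counter(lines)
    return [f"{n} {line}" for line, n in sorted(counts.items())]

log = [
    "INFO started",
    "ERROR disk full",
    "ERROR disk full",
    "INFO stopped",
]

# Because every stage consumes and produces plain lines, the stages glue
# together as easily as the shell pipeline `grep ERROR | sort | uniq -c`.
result = uniq_c(grep("ERROR", log))
print(result)  # ['2 ERROR disk full']
```

Each stage composes with any other in any order, which is exactly the property the comment argues today's off-the-shelf solutions lack.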

------
blauditore
I would argue this can never be true. If software gets to a point where
today's problems can be solved by a 5-minute specification, engineers will be
able to do more high-level engineering, connecting dots with those building
blocks.

One example is memory management: At some point, we started having languages
and powerful-enough hardware that engineers didn't have to worry about memory
anymore (for certain classes of applications), so they could spend more time
thinking of other things. It's not like switching from C to Java would cause a
software company to get rid of 40% of its devs because there's now less work -
it's just that one hindrance is gone. But there's always plenty to do; the
backlog never gets empty.

------
loosetypes
Is the success of the no-code trend really a given?

Insofar as I can tell, no-code addresses configuration management of known
entities. It's not a DSL and it's not a visual programming environment - it's,
generally speaking, a graphical layer on top of one or more YAML/XML/
whatever file(s) with a finite set of options.

That’s good enough in many, many cases. But it’s not a panacea. And it’s
certainly not “no-code means no coders.”

~~~
MattGaiser
We already have no-code for the web. It is called WordPress. We have no code
for calculation. It is called Excel.

I can see it having applications, but only for programs that are worth
perhaps $1000 to the business.

~~~
msla
> We have no code for calculation. It is called Excel.

By that metric, Python is "no-code" for practically everything: It's code, but
it isn't called that because it isn't written by coders, it's written by
people who write software to do their jobs, see? "Formulas" and "scripts" and,
in a previous era, "macros" are what we call software written by people who
are being paid to do something else.

~~~
MattGaiser
Excel comes with most people's computers, and they can do it all by clicking
buttons and drag and drop. If you are someone who spends all day in VBA, you
are at least a software engineer lite, if not a software engineer.

For someone without programming knowledge, doing a table of calculations in
Python is much more difficult than doing it in Excel.
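For contrast, here is what a small table of calculations looks like in Python - a minimal sketch with made-up figures, illustrating how the spreadsheet's clicks become explicit code:

```python
# A tiny "spreadsheet" in Python: a column of monthly figures (made up),
# a SUM cell, and a percentage column. In Excel this is a few clicks and
# a fill-down; here each step has to be spelled out as code.
sales = {"Jan": 120.0, "Feb": 95.5, "Mar": 143.2}

total = sum(sales.values())          # the =SUM(...) cell
for month, amount in sales.items():  # the filled-down percentage formula
    print(f"{month}  {amount:8.2f}  {amount / total:6.1%}")
print(f"Total  {total:6.2f}")
```

Nothing here is beyond a beginner, but it illustrates the point of the comment: the Python version visibly *is* programming, while the Excel version doesn't feel like it.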

~~~
pessimizer
Learning the abstractions that Excel uses, how to write formulas, and where to
click, drag, and drop _is programming._

Python is no harder if you spend as much time with it as the average Excel
expert does in Excel.

------
carapace
Counterpoint: If this was going to happen it would have already.

Here's RMS talking about secretaries extending Emacs with Lisp:

> programming new editing commands was so convenient that even the secretaries
> in his [Bernie Greenberg] office started learning how to use it. They used a
> manual someone had written which showed how to extend Emacs, but didn't say
> it was a programming. So the secretaries, who believed they couldn't do
> programming, weren't scared off. They read the manual, discovered they could
> do useful things and they learned to program.

[https://www.gnu.org/gnu/rms-lisp.html](https://www.gnu.org/gnu/rms-lisp.html)

~~~
neolefty
Yes, and I can't count the number of programmers I have met who started off
doing the same thing with VBA macros. Very empowering, for all its warts.

~~~
carapace
Right! Like how MS Excel is the most successful IDE in the world, eh?

------
nevster
I've been hearing this same thing since the 80's when my friend's dad
questioned why I would want to become a programmer when eventually computers
would be able to program themselves.

The complexity just keeps ratcheting up.

~~~
lopis
A developer's work is 50% solving problems and 50% creating enough problems to
ensure their job security.

------
brooklyn_ashey
“Anyone who’s spent a few months at a sizable tech company can tell you that a
lot of software seems to exist primarily because companies have hired people
to write and maintain them. In some ways, the software serves not the
business, but the people who have written it, and then those who need to
maintain it. This is stupid, but also very, very true.”

This is exactly what happened to higher education. It seems to be a kind of
entropy that signals that a system is in major need of complete overhaul so it
can get back to doing what it was actually designed to do.

------
codegladiator
> we will have more software written by a smaller number of people

This was the past, IMO. The future will have a larger number of people writing
even more software.

I wonder if the author has met people working in non-software industries;
most software is still unwritten. What we have now is mostly software for
software developers.

~~~
chii
> Future will have larger number of people writing even more software.

When the majority of the population became literate, there wasn't a huge jump
to everybody writing novels. They merely started writing smaller things -
letters, notes, and short-form, one-off stuff.

It's the same with software skills. There is going to be an increase, but in
the form of small scripts and one-offs for tasks, rather than more software
"products".

~~~
novok
When the entire world became literate, the demand for good writing skyrocketed
in comparison to the illiterate world.

------
z3t4
No-code is nice; there is only one problem, though: computers are not getting
faster. Sure, they can handle more parallel load with more cores, and in the
cloud you can scale horizontally, but building parallel systems is hard, and
the higher up the abstraction chain you go, the harder it becomes. So the
software might be built by some drag-n-drop tool, and it will look nice, but
performance will be terrible, uptime will be terrible, and there will be weird
issues once the software gets many users.

------
whytaka
Mo software, mo problems. Either coders will become the people managing all
the different "no-code" solutions or companies will pay dearly for when,
inevitably, their no-code solution meets a limit that was never considered.
The problem with all the no-code solutions, though, is that it becomes
mind-bogglingly difficult to remember how to navigate the graphical interfaces
of each platform, while the coder can focus on and enjoy the superior
ergonomics of mere text editing.

------
zaro
There is no way that software will soon reduce the need for engineers.

One way to look at it is AWS, for example. It removed the need for hardware
and sysadmin skills. But the cost of this is that now instead of sysadmins you
need devops, and probably in bigger numbers.

The way I see it, software creates problems at a higher rate than it solves
them, hence the need to constantly change and evolve technologies, so that at
least some of the technical debt created can be written off.

------
pvsukale3
I feel that every no-code tool that gets sophisticated enough to replace
manual human coding work will have a significant learning curve. So much so
that we will need trained professionals to operate these sophisticated no-code
tools.

Yes, people will be more productive using no-code tools over manual
coding/deployment, but we will still need these skilled no-code professionals
to use these things and ship faster.

------
1024core
My problem with "everyone remote" is the following scenario.

Suppose I'm sitting at my desk, and I need Bob's approval to check in some
piece of code. In the current (office) scenario, I send Bob the request. If I
see him lounging back in his chair, playing some game on his phone, I can walk
over to him and gently prod him to approve my request. He can't deny his game-
playing, so he has no option but to act right away.

In the remote world, I have no idea what's keeping Bob busy and why he's not
approving my request. He _could_ be playing games or out for a walk with his
dog... or he could be fighting a (metaphorical) fire.

What we need is some sort of a "presence" signal. Unfortunately, we can't have
one without some privacy issues.

~~~
crazygringo
I dunno. Even in all my in-person office experience, the person I need to get
an approval from is not within eyesight 99% of the time.

If it's urgent, you ping them over chat, then send them an e-mail if they
don't immediately respond. If it's less urgent, just an e-mail.

It's none of my business what's keeping Bob busy. He'll see my requests soon
enough. If it's _incredibly_ urgent, I'll call. If he doesn't pick up, that's
on him.

------
bborud
Software doesn't actually get easier to write. Yes, you have more building
blocks to choose from and sometimes (very rarely) improvements in
abstractions. But this also means that your software sits atop an ever larger
mountain of code that you do not understand. This means that it is easy to do
things half way, but it still isn't any easier to do things well.

Much software also has to model the world around us. Which doesn't really get
any simpler as we try to solve more problems in systems that need software.
Like self driving cars, or even something as everyday as your mobile phone.

The piece was too long.

------
tmellon
I have exposure to software in medical devices such as MRI machines. As far
as I can tell, in these expensive devices at least, the predictions of this
article ('Software will eat Software') do not apply, unless you are talking
about complete stagnation of new machine development. Apart from this, it
requires programmers to understand MR physics to even begin any sort of real
development; it takes a minimum of 3 years to come up to speed after an
advanced degree, IMHO.

------
douglaswlance
There is a simple truth about the modern world that many don't seem to grasp:

Automation --> Wealth --> Opportunities --> Automation...

The job of the software engineer is to automate their own job. Most people
stop their reasoning there and don't continue thinking about the downstream
effects of that action.

The wealth generated from automation creates new opportunities that will be
filled by the software engineer themselves because automation never ends.
Software is never done.

------
zelos
>In my first job at a small start-up, we had tons of physical servers. Now,
it’s hard to imagine any "webby” tech company ever interacting with any
hardware at all.

My record on predicting tech trends is _terrible_, but wasn't the outcome of
that switch to cloud computing a massive _increase_ in the demand for software
developers? For now, at least, it seems that things that reduce the amount of
work developers need to do actually increase their value.

~~~
osdiab
I think that while it's true that new tools allow fewer people to do more
work, they also expand the pool of tasks engineers can work on - more fields
come to value software, work gets deeper in the fields that already did,
competition gets fiercer, and the market as a whole expands. This probably
can't go on forever, but I don't see that trend ending any time soon.

------
sumanthvepa
I suspect that more people will be writing software as part of their job. They
just won't call themselves developers. As the underlying software becomes more
powerful (I'm thinking of Microsoft's recent demo of AI writing software), you
will need less skill to build, relatively easily, something we currently
consider a complex app. Many professional developers will find themselves
migrating to solving harder, more advanced problems.

~~~
chii
> Many professional developers will find themselves migrating to solving
> harder, more advanced problems

It may be wishful thinking that this is gonna just happen without major
investment from some sort of entity in those developers. And I would also
argue that advanced problems are fewer and farther between than the "standard"
problems being solved today by the majority of developers.

History has shown that displaced workers don't get to re-educate for free.
Either the workers themselves will have to pay for such re-education (or
advance their existing education to higher level, and thus able to do a more
difficult job that hasn't been "automated" or trivialized via a tool), or the
gov't pays to mass-educate.

If you currently work for a big-tech, and enjoy a high salary, it would be
very prudent to try to invest your salary as much as possible to build a
future stream of income for protection, for the eventual scenario where you
are made redundant due to advances.

~~~
sumanthvepa
> If you currently work for a big-tech, and enjoy a high salary, it would be
> very prudent to try to invest your salary as much as possible to build a
> future stream of income for protection, for the eventual scenario where you
> are made redundant due to advances.

Good advice. That has always been true for programmers. We constantly have to
invest in learning new kinds of things.

------
adatavizguy
This is like saying that building an abstraction over assembly language, like
C, will put all the software developers out of a job. Rather, it creates more
opportunities.

------
alexashka
A lot of speculating, nothing of value.

Software is a _tool_ to get things done.

The only way software eats software, is by having intelligent people wanting
better tools and abstractions to get shit done.

Building better tools and abstractions is hard work. For people to be
motivated to do hard work, there need to be goals worth sacrificing for and a
support system in place, to help and motivate the hard workers to keep going.
Society has to _want_ change and treat those on the front lines as _champions_
for their cause.

That's how real progress gets made, and not only in software, but in all areas
of life.

Americans seem to have forgotten, or maybe never knew, that this is how things
work. This tendency to dumb everything down, to never offend anyone and never
let anyone even have a chance to make a wrong move, has made everyone into a
bit of a complacent dumb-ass, writing blog posts like these that say
absolutely nothing.

If you want a bit of a shocking example of how deep this goes - you can't
customize, group, rank, save, or do _anything_ on the number one photo-sharing
app in the world. You can only mindlessly scroll, like, comment and share.

I mean, we've reduced an entire creative medium to a single algorithm of
behaviour, optimized for maximum time spent zombie-scrolling.

When people wake up to what zombified lives they've been living and that it
hasn't been an accident but happened by design - it will not be software
eating software, it will be software users eating software companies and then
some for what they've done to humanity in the name of quarterly profits.

------
coldtea
> _Put another way, the technology industry will soon get a taste of what has
> been going on in other industries._

Well, we can always "learn to code"... oh, wait...

------
swiley
As someone who has spent the last month fighting ansible and puppet:
automation in software does not mean less work.

IMO: there’s an inverse relationship between maintainability and ease of use.
Config files are extremely easy for _large groups_ of people to maintain but
it sucks making changes to them. As an individual I’ve found it’s often easier
to rewrite the application myself to do what I need than to edit complex
config files.

------
TulliusCicero
> We are coming to a point where software is developing so fast and the
> abstractions getting better that soon we will have more software written by
> a smaller number of people.

This seems unlikely to me, or at least unsupported by current observations.
Software tooling and abstractions have improved tremendously since the early
days of programming. Going from machine code on punch cards to modern high-
level languages with assorted libraries and frameworks and debuggers
represents at least a few orders of magnitude productivity increase per
programmer.

But has that incredible efficiency gain resulted in fewer programmer jobs?
Nope, just the opposite: the better the tooling is, the more useful each
programmer is, so the more it makes sense to hire. This makes sense when you
consider how open-ended programming is as a field: it lets you manipulate
data, and it turns out manipulating data has a seemingly endless number of
uses. It's not a field like, say, plumbing, where you need a set amount of
plumbers, where going beyond that number doesn't really accomplish anything.

------
mcv
I don't see much evidence for the argument that software will destroy software
development jobs. It's true, we have automated a lot of tasks and develop at a
higher level now than we did in the past, but there are also more people
working in software now than in the past.

You can do impressive stuff with a small team or even alone now, but the past
also had small teams doing impressive stuff. Look at how Apple was founded by
two guys, how many startups started with a small team. In the 1980s, many very
successful games were written by one or two people. We make more sophisticated
games and systems now because we build on the abstractions created then.

The demand for more sophisticated software will continue to grow, and we're
not remotely near the point where every problem will have been solved. The
focus of software development will very likely shift, as it has always done.
But software engineers will continue to be needed as long as software is
needed.

------
bjorn2k
Until we solve all "problems", we need software. For at least some of them,
part of the solution will require software. The whole no-code thing is BS in
my opinion, because it is just another layer of abstraction. Maybe we make
"making software" easier and more people can do it, but the need to solve
problems will not go away.

------
cosmodisk
The low-code/no-code trumpet again. I work with it (Salesforce), and while it
can do pretty cool things, as soon as something more complex comes up, I have
to write good old code. I understand these things are improving a lot, but it
will take decades to get to the point where they're wiping thousands of devs
out of the job market.

------
x3haloed
I have no evidence to back this up, but as a counter-point, I think that the
construction of new things is always a platform upon which more new things can
be made, which requires people. If we reach some precipice where much more can
be built with much less, then I think that not only will the diversity
(breadth) of software increase, but also its complexity and power (depth). All
of that still requires workers. As software becomes commoditized, its ubiquity
and availability should provide a new base level on which to build. The only
scenarios I can see where we lose the opportunity for common growth are if we
build machines that are better at innovating and making things for humans than
we are, or if the power of the tools we build ends up concentrated in a few
powerful hands. The latter is why I believe FOSS is so important.

------
LordHumungous
SWE's have been the ones pushing for remote work for years, now here come
dozens of blog posts claiming that it's our undoing. Ok sure. Trading some TC
for lower housing costs, lower COL, less traffic, and a better quality of life
outside of Silicon Valley seems like a decent trade.

------
friendlybus
Microservices make it easy to create a big ball of software that covers as
much featurespace or ground as possible. That's not the end of software
development. You need a lot of people to gut these microservices and
reconfigure them to be part of your monolith and to wire in new workflows.

As soon as precision and performance become critical, you're back to dredging
through the code, because the microservices were built on assumptions that
make your ball of software execute poorly.

This move for collective action attempts to make the independent wfh type
irrelevant and then suck them back into the fold via 'collective bargaining'
and other union tricks. I fail to see how this will play out any differently
from historical unions. Blaming the current president is a basic move for
bringing about your utopian collective.

------
zubairq
As the author of a no-code tool, I disagree with this article. I think that we
will need even more developers to manage the apps churned out by no-code
tools. Tools may go up the abstraction ladder, but developers will always need
to debug ALL layers of the abstraction stack.

------
helen___keller
I agree with the author that software eats software. It took a sophisticated,
expensive team just to launch an ecommerce line a couple decades ago, now you
need no engineers at all. This trend will continue.

I disagree that this will result in net decrease for engineer demand in the
foreseeable future. There appears to be no evidence for this at this time.
Investor demand for new software products is currently endless.

I agree that a trend towards remote work could be a downward pressure on
engineer salaries and power, particularly for silicon valley workers. On the
other hand, it could instead be an upward pressure for all us engineers
outside of silicon valley. It could be both, or maybe the remote trend will
fizzle out in a year. Time will tell.

------
doktrin
My take :

At some point in the mid-to-distant future, most human labor (certainly white
collar labor) will potentially be automated away, with remaining blue collar
labor not far behind (the physical interface, e.g. general purpose robots that
are superior to humans in non-static environments will probably come last).

Until that point, the demand for a technically literate workforce will
continue to grow, simply because the number of potential areas of application
will also continue to grow. With every new digital encroachment on our
analogue lives, we find new business verticals that previously simply didn't
exist. I don't see that trend abating anytime soon. Put differently, it's not
like software development ends with webapps. For every domain that gets
commoditized, there will almost guaranteed be new ones in need of hands on
skilled labor. There will of course be some developers who aren't able to
reposition themselves as effectively for new verticals (we've all
probably heard some anecdote of a C programmer who simply couldn't grok OOP,
etc.), but by and large it's difficult for me to see the demand for human-
generated bespoke software development dissipating until we get close to that
mythical land of general AI.

Now, on the payscale side I do agree with the author that the push towards
remote work will definitely have an impact on many of our salaries that are
currently tightly bound to geography. That seems fairly evident, to be honest.
I live in northern Europe where our salaries are quite good relative to other
parts of the continent, but you can still find high quality engineers for
<$100k USD - many of whom are on par with FAANG developers being paid >$200k
in SFBA or Seattle and the like. The further east / south you go, the lower
the current salaries are. It seems logical that decentralization of work will
lead to a flattening of the income curve - raising some incomes, lowering
others - as we increasingly all enter the same digital playing field. That
said, wages - like housing prices - are to use econ parlance 'sticky
downwards' so if I were still a SV developer I wouldn't start losing sleep
yet.

------
ilaksh
Salaries will go down not because of less collective action (there was never
really any to speak of among high-salary people) but because now the (remote)
applicant pool has many people living in relatively low-cost areas, many even
in countries/localities where the cost of living may be 70 or 80% less than in
the big US cities.

In such a situation, it is very easy to question the idea of hiring _anyone_
who lives in a big city and therefore requires a large salary.

As far as software eating software, it seems obvious to me that it is
occurring, but may not be obvious to everyone because as that has been
happening, we have also seen a rapid increase in software use in business and
industry. And we have seen software projects become more mainstream, where in
some places, almost every random person seems to think they need to build
their own mobile app or web-based service, for example.

I think we are nowhere near the limit of automation in software development.
Here is one thing that may be on the horizon: chat-, voice-, and
UI-mockup-enabled software development bots that take advantage of large
knowledge-bases of application templates and components, and use deep learning
to smartly combine and configure them according to natural language
instructions and intuitive UI interactions. This could also involve generating
design assets with AI based on combining large libraries of starting points
and styles.

I also personally believe that more advanced AI for programming is going to
sneak up on everyone, including programmers, faster than most expect.
Especially when you think of tackling specific types of software at a time.
What the deep learning people are working on is making neural networks learn
better-factored and more accurate models of the world. I believe this is
probably feasible to do using mostly existing algorithms, by building up
models (and networks) gradually with a progressive curriculum approach. Using
that type of capability and very large datasets of specifications, design
interactions, sample assets, templates, and final configurations, it will be
possible to train AI to build many different types of software.

------
burlesona
Most of the comments so far are about his take on there potentially being less
need for people to write software in the future. The more interesting part of
the article IMO is the bit about salaries:

> Most people would like to believe salaries are determined by a cost-plus
> model, where you get a tiny bit less than the value you add to the company.
> However, in reality, they are really determined by the competition.
> Companies are forced to pay as much as possible to keep the talent from
> leaving. In a competitive labor market, this is often a good thing for the
> employees.

He goes on to concisely explain why people get paid so much for working in the
Bay Area, and why they won’t get paid like that elsewhere.

I think this is a keen insight, and should be sobering to the large cohort of
HN that is clamoring for all the tech giants to go full remote.

As we saw already this week, when Facebook decides they don’t actually need
you to live near Menlo Park anymore, that also means they don’t need to pay
you Menlo Park wages any longer.

I think there’s actually a significant risk to Silicon Valley here, which is
the following potential vicious cycle:

\- Big Tech decides all/mostly remote is the future.

\- Big Tech mostly stops hiring in the Bay Area because why pay more for
talent when they no longer “have to.”

\- The COVID reset layoffs continue, resulting in a lot of Bay Area engineers
looking for work at the same time.

\- Supply and demand mean that engineers wages start dropping as smaller
companies no longer have to compete with the giants (as a hiring manager in SF
I’ve already seen this start happening).

\- Engineers who can no longer command crazy salaries to justify the rent
start leaving the area (this is also already happening).

\- Rents start to drop, and the housing market softens a little.

\- Tech workers who have relatively recently bought homes get nervous and look
to exit before they get underwater on million+ dollar loans.

\- The housing market softens further...

At this point most of these things sound like a much needed reprieve for the
insane local housing market, but the funny thing about bubbles is the way they
get hotter and hotter for a long time, and then tend to pop relatively
quickly. Sometimes these things can lead to vicious cycles that take quite a
while to come back from, where a bear market feeds on itself.

I love working in software. I loved it before I moved to the Bay Area and
quadrupled my earnings. I’ll love it even if it all comes back down to earth
and is just a “normal job” again.

But I think that could really happen, and I think Big Tech embracing remote is
a great way to pop the balloon in the Bay Area with the result not being a
utopia where people get to make Bay Area wages and then live on the beach in
Cancun, but rather they get to make Omaha wages and live in... Omaha.

I haven’t really seen this community wake up to that possibility yet, which is
surprising to me considering it’s a fairly obvious conclusion to the top
paying companies all opening up the floodgates of worker supply by truly
embracing remote work.

~~~
alexitosrv
I do agree with your remarks, but I wonder what it is that reinforces the
denialism about that new reality among most commenters here. I doubt that
companies want to go the offshore routes of yore, but I do expect this WFH
trend will create a lot of new competition within the IT workforce, in the
USA at least.

------
dragosmocrii
I wonder: instead of completely eliminating the software developers/engineers
who write the code, what if we had specification-writing engineers? Just like
tests, the specifications would describe what is needed, and then we'd let the
machine figure out a way to make things work given all the constraints. Very
much like machine learning, I guess. Of course, those specs/tests would need
to cover almost every possible scenario. Today you write both the code and the
tests; in this case you'd only write the tests, and the code would be figured
out by the machine. Does this all make any sense?

------
kennu
I agree with the writer. Microservices have already made it possible to
commoditize many of the services that we used to re-implement over and over
again in the past. We can now buy them as cloud services. But that is just the
beginning. Our development work is still mostly non-business-logic effort and
smart companies will figure out new ways to eliminate more and more of it.

The important point is that non-business-logic development work is never
eliminated completely. But if you eliminate 50% of the work, you need only
half of the previous human effort. Then the other half needs something else to
do; preferably, enjoying the increased productivity as increased free time.

------
carapace
> Companies are forced to pay as much as possible to keep the talent from
> leaving. In a competitive labor market, this is often a good thing for the
> employees.

In practice they conspire to rip off the talent: "Apple and Google's wage-
fixing cartel involved dozens more companies, over one million employees"

[https://pando.com/2014/03/22/revealed-apple-and-googles-
wage...](https://pando.com/2014/03/22/revealed-apple-and-googles-wage-fixing-
cartel-involved-dozens-more-companies-over-one-million-employees/)

------
ezoe
Slack is a direct successor of IRC. Those who are familiar with IRC get used
to it quickly. But when it's used by people without IRC experience, the
results usually suffer, as they don't understand the IRC culture and can't use
it efficiently. I've heard of a company that failed badly at using Slack: only
the sysadmins had the privilege of creating public channels, resulting in just
a handful of generic channels, most of them flooded with noise. To reduce the
noise, they banned non-essential posts to Slack, nullifying Slack as a
communication tool.

------
icheishvili
Feels as though this essay was written by Ned Ludd himself. Software has been
constantly commodified and yet the hurdle the market expects you to clear gets
higher at the same time.

------
yters
Kolmogorov complexity means programs cannot generate more information than
they are provided with. However, I do think it's possible someone will invent
a much more efficient coding workflow that can create an assembly-line sort of
system. E.g. a very experienced dev can decompose a problem into a lot of
subtasks and interfaces to outsource and complete in parallel. Once completed,
boilerplate software automates the integration of completed tasks.
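A hypothetical sketch of that assembly line (the interfaces, implementations,
and pipeline below are all invented for illustration): the senior dev
publishes only the interfaces, the subtasks are implemented independently in
parallel, and the integration glue depends on nothing but the interfaces.

```typescript
// The decomposition: interfaces the senior dev hands out as subtasks.
interface Tokenizer {
  tokenize(text: string): string[];
}

interface Counter {
  count(tokens: string[]): Map<string, number>;
}

// Outsourced implementations, written in parallel against the interfaces.
const whitespaceTokenizer: Tokenizer = {
  tokenize: (text) => text.split(/\s+/).filter((t) => t.length > 0),
};

const freqCounter: Counter = {
  count: (tokens) => {
    const freq = new Map<string, number>();
    for (const t of tokens) freq.set(t, (freq.get(t) ?? 0) + 1);
    return freq;
  },
};

// Boilerplate integration: it only knows the interfaces, so any completed
// subtask can be slotted in without touching this glue.
function pipeline(tok: Tokenizer, cnt: Counter, text: string): Map<string, number> {
  return cnt.count(tok.tokenize(text));
}

console.log(pipeline(whitespaceTokenizer, freqCounter, "a b a").get("a")); // 2
```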

------
zaptheimpaler
Scary and true IMO. The difference is clear in the head counts & philosophies
of newer companies vs. old ones. To do anything in software, you used to have
to be a software-first company. Now that that part is easier, we see many more
software-enabled companies instead: the business is front and center, only a
few programmers are needed, and they do a lot of software plumbing kind of
work, not much software building.

------
carapace
If you could write a thing that actually did replace yourself as a programmer,
would you?

Do we protect and perpetuate the priesthood or bring the fire down the
mountain?

~~~
keenmaster
It’s not like anyone has a choice in the matter. There’s a huge profit
incentive to create the first ubiquitous no-code framework. Like some
commenters above though, I think that no-code will only increase the aggregate
demand for software, much of which will need software engineers.

------
sircastor
I feel like I've been worrying about this for more than a decade and it still
isn't true, and it's not going to be true in a decade.

------
k__
Just a reminder: Remote isn't as special as it seems.

Teams were always working remotely with respect to other teams.

The only thing that changed is what teams do internally.

------
kanakiyajay
I wrote about my predictions for the technology trends that will make the
biggest impact in 2021; of these, no-code will have a definite impact

[https://jay.kanakiya.in/blog/what-i-am-excited-about-
in-2020...](https://jay.kanakiya.in/blog/what-i-am-excited-about-
in-2020-and-2021/)

------
caogecym
I just don’t get why more developers working remote would impact the amount
of code needed to be written for their business.

------
naveen99
People who think software will get simpler due to software should look at
genetics. Complexity is infinite.

------
rglover
If you're a developer, realize that this happening is going to make you filthy
rich if you know how to play it correctly.

This is going to take a _boatload_ of competition out of the industry and
means that truly competent, experienced software developers will become rare.

The faster this happens, the better.

------
smdz
I'd argue that as abstractions and automation improve, they will bring in more
complex demands to be met, and therefore will increase the demand for
programmers/engineers. And the cycle continues.

Software will continue to eat the world until it creates an authoritarian
utopia for humans.

------
woah
“No-code” is great, and I love some products in the category (Airtable). But
the idea that it is going to fundamentally transform anything is hyperbole,
similar to Soylent’s marketing when they were saying that nobody was going to
eat food again.

------
cntlzw
Software development is a bit like fashion or music. There is no direction
like forward or backwards. Good and bad are often just personal opinions. Some
things go out of fashion, just to be rediscovered years later. Others are
staples.

------
SteveMorin
Asana with its inbox workflow will help all-remote companies. I didn’t believe
it at first, but after using it for onboarding all-remote, I like how it
enables async work. Disclaimer: I am a recent Asana hire who onboarded
remotely.

------
kirubakaran
Software development, imho, is mostly complexity management. If you believe
that too, and believe the premise of this article, can you describe how those
two fit together in your mind?

------
jolux
This seems hugely overblown verging on nonsense to me. Since when is it easier
for your employer to surveil you outside their office than within it? There
are more and better encrypted messaging tools than ever before. No-code tools
are a long way off from being able to replace most software. We’ve been
hearing this about the technology for at least 30 years at this point, and
_now_ it makes “most” software development irrelevant? In a lot of ways
software engineering is better than it’s ever been, but there are huge swathes
that don’t follow the “best practices” that newsboard technologists are known
for promoting (myself included) and their software is going to need work
forever.

------
imvetri
In progress. Support my project. [https://github.com/imvetri/ui-
editor](https://github.com/imvetri/ui-editor).

------
blackrock
LOL. Nothing I’ve seen so far about Artificial Intelligence is intelligent at
all.

And anyone that tells you that A.I. is intelligent, might as well be selling
you some snake oil.

------
iandanforth
Anyone who brings it up and doesn't take a moment to laud the success in
dismantling Dragonfly earns a black mark in my book.

------
PeterStuer
There's lots of different points in the article, most of them tangential at
best to 'remote first'.

The remarkable thing about 'software eating itself' is that it is true, in the
sense that the _potential_ of what a small team of good people can accomplish
has gone up by orders of magnitude over the decades. We stand on the shoulders
of gian(t code base)s, but at the same time the dysfunctionality seems to have
increased, and not just in corporations but also in smaller outfits. Just as
with the efficiency paradox observed in the infamous 'Bullshit Jobs' article,
it is as if there are ubiquitous hordes of 'overhead' cadre hiding in the
wings, relentlessly trying to bootstrap themselves at the slightest
opportunity.

I'll give an anecdote, but it is fairly typical for many situations I
encounter. This job was putting a daily webpage dashboard on some existing
data and integrating an existing service that delivers new data-points. There
were very few uncertainties. It was scoped as a one-week effort for a team of
two self-managed senior engineers, with a second week of polish and iteration.
This is not what happened. First a manager was appointed for the project,
whose first 'hire' was another line manager to keep track of meetings and
progress. He hired a business analyst to write a spec, soon joined by two
other analysts to read the spec and write more spec. Then a 'data scientist'
consultant was brought on, who (rightly) ignored most of the gibberish in the
analysis document but managed to instate a full enterprise cloud BI stack
(hey, we're dealing with data and need to draw a chart), and instead of just
reading the data from the nightly reports decides to integrate directly with
the data-warehouse, now upping the need for enterprise security, etc. Two
months into the project this is still going, and the end result will still
just be a chart on a webpage.

Second anecdote. This one in reverse, but putting light on the same principle.
This concerns a large organization that for years was pondering to go digital
with its mailing system (this org deals with thousands of pieces of physical
mail coming in and thousands of paper pieces going out every day, often
requiring multiple official approvals and signatures, etc.). This was seen as
a complex and large project, taking a year and probably running into multiple
millions of euros. COVID-19 hit, and the organization was forced to go fully
remote. A pragmatic team of 3.5 FTE was called in to 'just make this work
ASAP', as the mail-flow is vital to the org. They finished the job in 3
(hectic) days. The solution has now been running for months.

So there is this 'paradox of efficiency' in software as well. Software eats
itself, but as it does it has the tendency to become morbidly obese (to stay
within the metaphor).

------
29athrowaway
Website generators have been around since the 90s, but in 2020 people still
have full-time jobs writing websites by hand.

------
Apocryphon
A lot of people in the discussion disagree with the article's first point
about no-code/less-code. While I do think the argument is a little fatalistic,
the scoffing at the idea in the face of him bringing up evidence such as AWS
and other IaaS/PaaS reducing the number of in-house server engineers that
companies need, and how the proliferation of open-source libraries/frameworks
has made creating software easier than ever, reminds me of economists pooh-
poohing the notion that modern forms of automation will lead to unprecedented
levels of job loss just because previous waves of automation didn't. Even in
the face of self-driving cars and automated receptionist/call support agents
that James Watt couldn't even dream of.

But whether no-code/less-code is inevitable, I think his first argument can
still be substantiated from a different approach, at least in the near future:
we're coming at the end of a tech bubble, the money will start receding, and
organizations have already been realizing they don't need so many coders (and
other staff) after all. Did Uber really need to build their own in-house
version of Slack? Did Airbnb really need to pour so many resources towards
adopting, even contributing heavily, to React Native, only to do an about-face
and have to rewrite their mobile apps in their native platform? Did either
company, like so many gig/sharing companies during this bubble, have to invest
so much in hyper-growth and expanding to so many markets before they were
ready? Did so many of these companies have to fall prey to Not Invented Here
syndrome and waste so much time in engineering boondoggles?

To some extent, yes, it's what the investors wanted, or what corporate
leadership thought would make the investors happy. And so this boom has led to
massive hiring on a lot of busywork that doesn't actually have tangible
economic benefit to these organizations, and may have even led to worse
outcomes due to unsustainable or reckless behavior.

Going back to the article, whether the first point can be explained by
inevitable no-code/low-code eating the world, or by the fact that a lot of the
software generated was due to the frenzy of a bubble, it still leads to the
same point:

> _Anyone who’s spent a few months at a sizable tech company can tell you that
> a lot of software seems to exist primarily because companies have hired
> people to write and maintain them. In some ways, the software serves not the
> business, but the people who have written it, and then those who need to
> maintain it. This is stupid, but also very, very true._

And the article's subsequent conclusion about how widespread WFH will lead to
cold, calculated culling of a lot of unnecessary or redundant personnel, aided
by the emotional detachment of not having to see the faces of the people
to be let go, still follows.

~~~
novok
AirBNB had fewer than a few thousand employees at their HQ; I think their
engineer count was under 1000. In the scheme of things that is not a lot of
engineers.

And Uber did not write their own Slack; they set up a Mattermost instance and
put some logos on it. It's not much more complex than running an Exchange
server in a corporation.

Most of the money burned by these companies was in the operations, marketing
and incentive side, not on the engineering side.

My estimate is that the end of this will be a second tech bubble, due to way
more money being printed and given to bankers again, even lower interest
rates, and, surprise surprise, software being one of the few things giving a
return yet again.

~~~
Apocryphon
Airbnb had around 2k engineers, nearly as many people as the entire size of
Stripe (~2.5k):

[https://news.ycombinator.com/item?id=22809778](https://news.ycombinator.com/item?id=22809778)

I'm not saying that engineering is necessarily the cost center at these
companies currently experiencing major layoffs (though the high salaries
probably don't help), but it does call into question how much of the software
being written is actually beneficial to these businesses, if they're able to
shed engineers just like that.

~~~
novok
The reason these two companies are having major layoffs is that they are in
the travel sector, and COVID is a custom-made disaster for any company in the
travel sector. I think they are hurting themselves in the long run: they are
choosing survival, cutting off new, incomplete initiatives in favor of the
more proven cash cows, which hurts their future.

Also, software in many companies managing large, complicated, multibillion
businesses is a bit of an iceberg. There is a lot of internal software that
you are not exposed to as an end-user customer. I call it the Google search
effect: why so many servers and engineers for a 2-page website?!

~~~
Apocryphon
Airbnb and Uber (and Lyft) are far from the only tech companies doing layoffs
during this period, though.

------
artsyca
You are what you code

------
trhway
I work only with software that eats other software.

------
29athrowaway
To get started in software development, people no longer need to learn about
memory management, concurrency, networking, how numbers are represented in
computers, serialization, compression, encryption, etc. And in some cases,
they do not need to know about data structures and algorithms or object
oriented programming either.

If you use something like node, garbage collection will handle memory for you,
the event loop will make things fast enough, you will use JSON serialization,
you will use REST, every number is a float... and you get the idea. That
reduced amount of knowledge will get you very far.
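To make that concrete, a small sketch (runnable under node; the `record`
object is invented for illustration) of two of those hidden defaults: every
number is an IEEE-754 double, and JSON round-trips quietly reshape your data.

```typescript
// "Every number is a float": fine for business logic, until precision runs out.
console.log(0.1 + 0.2 === 0.3);        // false
console.log(2 ** 53 + 1 === 2 ** 53);  // true (past Number.MAX_SAFE_INTEGER,
                                       // integers silently collide)

// "You will use JSON serialization": it just works, and silently drops or
// rewrites what it can't represent.
const record = { id: 1, when: new Date(0), cb: () => {} };
const roundTripped = JSON.parse(JSON.stringify(record));
console.log(roundTripped); // cb is gone entirely; when is now a plain string
```

None of this blocks you from shipping, which is exactly the comment's point:
the deeper understanding only becomes necessary once one of these defaults
bites.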

Of course, eventually, you will run into a problem that will require some
deeper level of understanding. But by that time you will be already employed.

The way I see it, wage depression is imminent.

~~~
ratww
This stopped being the reality far longer ago than you assume.

For instance: in the late 90s, before the 2000s tech boom, Delphi and Visual
Basic dominated business programming and most coders had no idea about any of
those things. Same for Java and .NET later. On the web side, Perl/CGI was
king. After that we had PHP, Java, Ruby, Python, C#.

It never really got simpler. If anything, things are more complicated today:
mobile development, frontend frameworks, functional programming, Docker and
Kubernetes, more complex design patterns. Just a simple "hello world" today
with NPM, React, backend MVC, K8s and deployed in AWS is ten times more
complicated than anything most of us ever did in the late 90s or early 2000s.

It was mostly embedded programmers who had to worry about the things you
mention, and most of the ones I know from the early 2000s think our current
web stacks are way more complex than what they do.

