Ask HN: How has software engineering changed since you started your career? - alxmdev
======
nostrademons
Started programming professionally in 1998 at the tail-end of the Windows
monopoly / beginning of the web boom.

Your choice of languages is much broader. When I started, your options were
basically C++, Java if you wanted to be hip, and Visual Basic if you wanted
to be lazy. Pascal/Ada/COBOL/C/assembly were approaching obsolescence, and
Python/Perl/Ruby/JavaScript existed but very few people used them. Now you
have a huge selection of languages to choose from on the server, and some of
them actually incorporate academic research from the last 10 years.

Almost everything was in-process. Programming meant managing the CPU and
memory of a single box. There was this hot new thing called "CORBA" that was
supposed to enable distributed programming, but nobody used it successfully.
Now everything is external databases, distributed computing, SOA,
microservices, cloud-computing, and other multi-box approaches.

Similarly, networking & distributed computing are critical skills today.

The job description of "software engineer" has specialized. When I started,
there was no such thing as "front-end" or "back-end"; you were expected to do
everything from end-to-end, including the UI, algorithms, networking code,
etc. Everybody was full-stack. Since Windows had a monopoly, that was your
client code; there was none of this division into Android vs. iOS vs. web on
the frontend. And since programs weren't distributed, you usually didn't have
a frontend vs. backend distinction. Occasionally some devs might specialize in
the persistence layer or the UI, but you all worked in the same code.

There was, however, a professional divide between PC vs. workstation vs.
mainframe programming. The people _we_ would've been asking this question of,
20 years ago, were mainframe programmers, and they used a totally different
software stack from consumer Windows apps.

Distribution, packaging & marketing were much more difficult. Your software had
a "ship date" when a finished binary had to be sent off to your publisher for
packaging on a CD-ROM, and there was a "code freeze" a couple weeks before
then, at which point no new features could be added or changed, and only
bugfixes could make it in. You had to spend time writing InstallShield scripts
so you'd have a nice installer wizard that'd dump a whole bunch of shit on the
user's computer. Once you shipped, that was it - you couldn't update other than
by releasing a new version. No continuous deployment, and everything within
the organization was synchronized around hitting the ship date. To sell it,
you needed either face-to-face sales or advertising, and you needed
distribution deals with retailers. Virality was a thing with this newfangled
web stuff, but it didn't really exist on desktop software; the closest you got
was shareware, with things like WinZip and WinAmp. (I'm going a few years
before my first job here; when I started programming, the Internet was
available, and you usually distributed software by dumping a .exe file on your
website for registered users to download. Most of my coworkers remembered this
stage, though, and a lot of our engineering practices were built on the
assumption of cutting a physical product.)

Markets are much bigger. For a sense of the scale - the Apple II series sold
roughly 5-6 million units in its 17-year production run. By contrast, the Apple
Watch sold that many units _in its first quarter_ and is widely considered a
failure. A product like WhatsApp can get 300M active users in 4 years now,
while it took _30 years_ for the total PC market to reach 150M units/year.

Some things that _haven't_ changed - your skills would still go obsolete
every 5 years. You still needed intense concentration to build anything new.
You could still make a lot of money by owning software that lots of people
used. Software was still a security mess - the threat model has changed from
viruses & trojans to data breaches and botnets, but the overall security
situation is probably about the same. You still had kibitzers who would look
at your code and declare you incompetent. You still didn't have nearly enough
time to add all the features you wanted.

------
makecheck
(I’m thinking across a span of about 20 years.)

We have moved up one or two levels of abstraction. It is now not only
_possible_ to perform complex tasks in scripting/shell environments but _easy_
to do so (huge standard libraries, etc.), and code written at a high level is
now “fast enough” in most cases. Also, the nature of those tasks has gone up a
level or two; networking, for instance, is no longer a neat feature and is
more like a core competency for a language.
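
As a minimal sketch of what "easy" looks like now (assuming Python 3 and a
hypothetical JSON endpoint), a networked task like this is a few lines of
standard-library code:

    # Fetch a JSON document over HTTPS and count its records,
    # using nothing but the standard library.
    import json
    import urllib.request

    # Hypothetical URL, assumed to return a JSON array.
    with urllib.request.urlopen("https://example.com/data.json") as resp:
        records = json.load(resp)

    print(len(records), "records fetched")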

We’ve spent a long time on parallel processing and finally have some pretty
neat constructs for doing so (in _mainstream_ languages and not just side
projects). Well overdue, and it leads to more natural coding in a lot of
cases.
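
For instance, a rough sketch of one such mainstream construct (Python's
concurrent.futures thread pool, with hypothetical URLs):

    # Fetch several pages in parallel with a thread pool.
    from concurrent.futures import ThreadPoolExecutor
    import urllib.request

    urls = ["https://example.com/a", "https://example.com/b"]  # hypothetical

    def page_size(url):
        with urllib.request.urlopen(url) as resp:
            return len(resp.read())

    with ThreadPoolExecutor(max_workers=4) as pool:
        sizes = list(pool.map(page_size, urls))

    print(sizes)

Twenty years ago, the equivalent generally meant hand-rolled threads, locks,
and a lot of boilerplate.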

We have a more “international” coding environment. It is far more common now
to see at least a _preference_ for stuff like UTF-8, if not full support for
it.
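
As one illustration of that shift (using Python 3 as the example), text is
Unicode by default and the on-disk encoding is something you state explicitly:

    # Strings are Unicode; bytes on disk get an explicit encoding.
    greeting = "こんにちは"
    with open("greeting.txt", "w", encoding="utf-8") as f:
        f.write(greeting)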

We’ve successfully open-sourced certain tools that are critical to
development. For instance, nowadays you don’t _really_ expect to pay money for
a compiler, and you _probably_ are using an open-source revision control
system (though some proprietary ones still exist). Generally, there is a
greater expectation that a community of some sort will exist around crucial
infrastructure tools.

Hardware, obviously, is way better. Unfortunately, while this has clearly
allowed for some incredible advances, it has also enabled extremely sloppy
coding: some modern programs allocate massive amounts of memory, among other
“features”, and work only because so many people have over a gigabyte of RAM
to make up for it.

