
Ask HN: What has changed for developers in the last 20 years - mukundmohan
I'd love to know your thoughts on what's changed for developers in the last 20 years.
1. We moved from client apps to web apps to mobile apps.
2. Client-server apps were for work, and now there are a lot more "consumers" using apps.
3. Hence, design and simplicity have become important.
4. We went from waterfall to agile programming.
5. There's a huge proliferation of "stacks" and "fragmentation" of languages for specialized tasks.

What other trends do you see?
======
codeonfire
Software development has been turned into a ridiculous contest rather than a
job or profession. Twenty years ago you would go to college, work on some
projects, meet with some companies in your town and get a job and maybe work
on fighter jets or banking software. Now you go to a three-week web dev boot
camp, read 'Cracking the Coding Interview'-style books, and compete with
hundreds of other people from around the world to solve a math/programming
riddle for the chance to make $150k/yr.

It's not about the work or the profession any more. It's about people trying
to optimize the shortest path to $150k/yr, and the industry has a certain game
show feel to it. No one has any idea what they would be doing at the job. Many places
don't even put you on a team immediately, and there is a sense that nothing
else matters in the workplace/contest other than those skills used to 'crack'
the interview. Degrees don't matter, and there's no industry or work to be
done outside of web or mobile app development either.

Imagine if mechanical engineering today were focused not on designing, but on
making specific types of flanges. People would go to a flange boot camp, then
read 'Cracking the Flange-Making Interview.' They would interview where
they would sit at a lathe and make an honest-to-god flange. If it were a good
flange you'd get a call back and a $100k salary with 50k in stock over three
years to make flanges. You'd start making flanges on Monday. One day you ask
yourself "why are we making flanges again? Aren't machinists supposed to be
making the parts? Why are we not developing new CFD techniques or new packages
for structural stress analysis?" Oh yeah, venture capital is only funding
flange making.

------
adventured
I've been building for the Web since 1995.

Databases for web services used to really suck back in the 1995/2000 time
frame. They were either difficult to use, or very expensive. MySQL gradually
killed the bottom two-thirds of the commercial database market between roughly
1998 and 2005.

Simple interactivity on sites is incredibly easy now. Applets and Shockwave
were horrible, due to performance issues (systems and browsers), general
bugginess, and lack of consistent support.

RAM and storage used to be really, really expensive. When Excite wanted to
test out how well the first version of their search engine could scale, they
paid $10,000 for a 10gb hard drive (purchased by Vinod Khosla).

Using, controlling and embedding media on sites is almost an afterthought now,
it's so easy. Real Media was satan.

Search was really bad until Google put AltaVista & Co out of their misery.
Search spam ruled the day. The notion of just punching in a problem and
finding a solution on Stack Overflow, ha.

Ten years ago, 20-50kb still mattered when loading a site. Developers no
longer need to obsess about the size of their site assets (the focus now is
more on how many there are, and on latency), unless they're getting really
crazy about it.

Language options are so much better today than in 1995/2000. You can now
choose from numerous good options, pick whichever one works best for you, and
you can be confident that so long as you use it optimally, you're unlikely to
run into big problems unless you're dealing with hyper scale services.

Collaboration capabilities are... well, _drastically_ better now. Github,
Slack, really easy and cheap video conferencing, large file storage & sharing,
sites like Stack Exchange, social networks, and so on.

Bandwidth is dirt cheap; that doesn't really need an elaboration.

The countless cloud services have made prototyping / testing super cheap.

The rise of the internet service API, following fairly standardized
approaches, has made interfacing with vast amounts of data very easy, very
precise and very cheap.
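
To make that concrete, here's roughly what consuming one of those standardized JSON-over-HTTP APIs looks like today, in a few lines of stdlib Python. The endpoint and field names below are invented for illustration:

    import json
    import urllib.request

    # Hypothetical endpoint -- any REST-style JSON service works the same way.
    URL = "https://api.example.com/v1/repos?language=python"

    with urllib.request.urlopen(URL) as resp:
        repos = json.load(resp)  # fetch + parse, no hand-rolled wire format

    for repo in repos:
        print(repo["name"], repo["stars"])

Compare that to 1998, when you'd have been hand-parsing whatever ad-hoc format the other side decided to emit.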

~~~
richardbrevig
For databases: I remember all the "create dynamic websites with PHP and MySQL"
tutorials around 2000. Prior to that, on the low end (being a high schooler
with no money), my applications used flat-file databases.
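
(For anyone who never saw one: a "flat file database" was often just delimited lines on disk, parsed by hand. A minimal Python sketch of the idea, with an invented field layout:

    # users.txt holds one record per line: id|name|email
    def load_users(path="users.txt"):
        users = []
        with open(path) as f:
            for line in f:
                uid, name, email = line.rstrip("\n").split("|")
                users.append({"id": uid, "name": name, "email": email})
        return users

    def add_user(user, path="users.txt"):
        # an "INSERT" is an appended line: no indexes, no locking, no queries
        with open(path, "a") as f:
            f.write("{id}|{name}|{email}\n".format(**user))

No concurrency, no integrity, and a full rewrite for updates, which is exactly why those PHP/MySQL tutorials felt like magic.)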

> Real Media was satan.

I remember Real Player wanting $2,000+ per server license to be able to stream
content. It was a perk in different hosting accounts; some featured the
ability to stream.

------
kls
One of the biggest changes is the dissemination of information about
programming. Ask any developer from the '80s or '90s about the trip to Borders
or Barnes & Noble and you will see a smile of nostalgia cross their face.
Books used to be how one learned to program or learned new technologies. I
remember owning a well-worn JavaScript Bible that was used by 8 or 9
programmers in the office as a reference back when JS was new. Many developers
spent several hours on the weekend at the bookstore poring over programming
books. Now, with high-speed Internet, video tutorials have trumped books for
learning new tech. I really miss that experience.

~~~
richardbrevig
Learning to script back in 1997, this was completely my experience. I learned
a little by looking at other people's code, but also had to buy a book to
learn Perl. Among other things, this was a main reason why I switched to PHP
around 2000: the online manual made it easier to find out how to do things.

------
MrTonyD
I think the biggest change has been the loss of autonomy in development work.
When I started programming, programmers were viewed as "professionals" who
made decisions about when, how, and what they did. It was much more
egalitarian, and not just for managers. Now I think that programmers need the
same protections that blue collar workers used to need - hours, overtime,
vacations, equipment, and an effective way to give feedback into a system
which no longer treats them as true professionals.

------
MalcolmDiggs
I can't speak to the whole 20 years (I've only been in the industry for 12
years). But, at any rate:

The largest change I've noticed is the level of accessibility that programming
has now. When I got started there were "Web Designers" who worked mostly in
Photoshop and static HTML, and there were "programmers" who did _something_
else, on special machines, with special training, but it was completely beyond
the grasp of us mere mortals.

Nowadays programming is not something you _need_ a degree or certification
for, it's not something you _need_ rigorous training for, or special equipment
for, it's just something you can pick up if you feel like it (and get a taste
for with little commitment, money, or time spent). That's a huge change to
me, and I think it's for the best. In this decade, we're seeing the definition
of "literacy" expand, to include things like reading and writing programming
languages...which is awesome!

~~~
hcarvalhoalves
> Nowadays programming is not something you need a degree or certification
> for, it's not something you need rigorous training for, or special equipment
> for, it's just something you can pick up if you feel like it

I believe this has been true since the '80s, with cheap PCs running BASIC,
Pascal, etc.

~~~
nostrademons
What's changed is the huge corpus of well-documented open-source software now
available.

Back in the 80s and 90s, knowledge about how to become a programmer was easily
available as long as you were willing to spring for books and a computer, but
the only building blocks you had available were whatever was in the Mac
Toolbox ROM, the Win32 API, or the POSIX standard. Beyond that, you were
basically working directly with the bits and bytes on the machine. So yes,
anyone could become a programmer, but you didn't get very far unless you could
plan and execute an immensely complex web of algorithms and data structures.

There are still some problems like that (e.g. if you work for Dropbox, Google
Search, or Galois), but most programming these days involves assembling
building blocks. You have scripting languages like Python or JS to take the
pain away from memory allocation and working with raw memory layouts. You have
GUI frameworks like web browsers, Android, Cocoa, or .NET to build UIs without
worrying about vector graphics. You have serialization protocols like JSON,
protobufs, Cap'n Proto, MessagePack, Thrift, etc. so you can just dump your
data structures to the wire; at my first programming job in 2000, I remember a
significant amount of the effort being to figure out which bytes would go over
the wire and how they would be formatted. You have package managers like NPM
or PyPI, with tens of thousands of packages there for the picking.
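
To see how much the serialization story changed, compare hand-specifying the bytes (what that 2000-era work looked like in spirit) with just dumping the structure. Both snippets are illustrative Python, not any particular protocol:

    import json
    import struct

    record = {"user_id": 42, "score": 17, "name": "ada"}

    # Then: you chose the endianness, field widths, and length prefixes yourself.
    name_bytes = record["name"].encode("ascii")
    wire = struct.pack("<IIB", record["user_id"], record["score"], len(name_bytes))
    wire += name_bytes

    # Now: one call, and the format worries about representation for you.
    wire_json = json.dumps(record).encode("utf-8")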

That's changed the job description of a programmer significantly. It involves
a lot more memorization these days, a lot more lookup skills, and a lot less
mathematical & logical ability.

~~~
sharemywin
Visual Basic was really easy to use, as long as you didn't want to program a
complex game.

~~~
nostrademons
My memory is hazy, but didn't VB come out only in the mid/late 90s? That was
the era when "build it all yourself" began to die... Other systems from that
vintage included COM, DHTML, Java, MFC, GNOME, KDE, wxWidgets, etc.

~~~
bayonetz
VB6 was probably at peak popularity in the mid/late 90s, which is why you
remember it as just coming out even though it had been out for a while. That's
one thing that's certainly different: the rate of growth and hype for new
languages and technologies. The literal and figurative network effects today
obviously let things get hot so much faster.

------
ridiculous_fish
I got started seriously around 1998, when I bought (bought!) a C++ compiler,
the CodeWarrior Discover Programming Starter Kit [1], for the low low price of
$79. Metrowerks achieved this amazing price by forbidding distribution of any
software built with this compiler.

Two years later I was rocking with Project Builder, which came with OS X
Public Beta - at no extra cost! I credit the release of OS X, and the huge
usability advances in Linux, for making the GNU toolset accessible to
hobbyists and students. This barrier lowering is what enabled the Cambrian
Explosion of programming languages we enjoy today.

1:
[http://www.macworld.com/article/1014313/19reviewscodewarrior.html](http://www.macworld.com/article/1014313/19reviewscodewarrior.html)

------
matt_s
The big changes are in the frameworks/libraries, collaboration tools,
information availability, and pure computational power. All the other stuff is
really the same: agile and waterfall are still being used; client, mobile, and
web are still in use; mainframe too.

From my perspective it is the collaboration and depth of answers you can find
online now. When I was programming 20 years ago I was in college and "the web"
really didn't exist - there was email and networks, etc. but web browsing
hadn't really begun mainstream yet. When I needed to learn something I had to
find a book on it or actually attend a class - like programming EJB's in J2EE.

An example about frameworks/libraries: building web applications was very
different since you had Netscape and IE (v4) supporting different and
overlapping HTML, CSS and JS features. MVC and all the frameworks out now were
really in their infancy or didn't exist 20 years ago. I was doing CGI scripts
with Perl for some apps and there was a lot of heavy lifting compared to
today.

Today you can probably have a scaffold/basic application (thinking in Rails)
that is OS- and device-independent with something like Bootstrap, running on a
VM, on the internet in about an hour. So it's much quicker to get to the point
where you start adding value rather than spending enormous amounts of time on
basic infrastructure.
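
As an illustration (in Python, with Flask standing in for Rails here; a real scaffold also gives you models, migrations, and views), a minimal app that serves a page is about a dozen lines:

    # pip install flask
    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def index():
        return "<h1>Up and running</h1>"

    if __name__ == "__main__":
        # Put this on a cheap VM behind a proxy and you're on the internet.
        app.run(host="0.0.0.0", port=8000)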

------
RogerL
I disagree with the waterfall->agile thing. There are a large number of
development paradigms, and back then I was mostly doing some sort of Rapid
development (not sure the term was in use right then, but the concept sure
was).

This may not matter to a lot of you, but computation has made a huge
difference. I can run nonlinear solvers in seconds instead of hours. I have
ready access to things like NumPy and SciPy to very quickly explore data in a
way that used to require a supercomputer.
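
For instance, a toy nonlinear system that would once have justified a batch job now solves interactively (the equations are invented, just to show the shape of the workflow):

    import numpy as np
    from scipy.optimize import fsolve

    # Toy system: x^2 + y^2 = 4 and e^x + y = 1
    def equations(v):
        x, y = v
        return [x**2 + y**2 - 4, np.exp(x) + y - 1]

    root = fsolve(equations, x0=[1.0, -1.0])
    print(root)  # converges in milliseconds on a laptop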

Hardware: this is partly covered by 'computation', but the kind of work I do
would have been impossible back then. We generate 10+ terabytes of data a day,
and that is laughably small to some of you. Some more from the hardware front:

Bits. Anyone remember trying to fit things in 32K? You know, because integers
were 16 bits long, and so were pointers. It was a major chore just to sort or
otherwise manipulate data, and I made some significant wins by being clever
about this sort of thing rather than just saying 'eff it, use a database'
(which is what my competitors did).

Five-minute compiles for rather small applications.

Sneakernet.

Minimal version control. A lot of people didn't know what it was.

Off the hardware front: you mostly had the ability to understand your 'stack'
(that word wasn't used that way in those days). You could more or less know
the Windows API, inspect the assembly coming off of your C compiler, read the
standard library, and pretty much have a picture of what is going on. These
days it is much harder.

To put some of this in context. My first job, in 1988, consisted of computing
cancer statistics. I'd carefully write a batch job. It'd go off to a
supercomputer center. There, operators would (eventually) pull tapes off the
racks and insert them into the tape reader. The job would run (eventually).
The job would print to printers. A van would make runs several times during
the day, delivering printouts. I'd go check the bins, and eventually my
printout would show up. And so on. Part of my work was converting that to the PC, but
we still had to do a lot of batch runs to massage the "huge" data inputs that
the PC couldn't realistically handle. By '95 the situation re data handling
was better, but still quite limited both in storage and computational
capacity.

These days we can trivially handle pretty big problems on a PC. And if we
can't handle a problem, well, there is AWS. It is just a different world in
this regard.

------
imakesnowflakes
The ability to run multiple virtual machines, a working hibernation feature in
commonly used operating systems, and the popularization of distributed version
control systems...

