
The Programmers Before Us Were Better - JeremyMorgan
http://www.jeremymorgan.com/blog/programming/the-programmers-before-us-were-better/
======
confluence
That's complete crap and merely a manifestation of "Rosy Retrospection"
(<http://en.wikipedia.org/wiki/Rosy_retrospection>).

Monty Python made fun of this "in my time, you see" bullshit with the Four
Yorkshiremen skit (<http://www.youtube.com/watch?v=Xe1a1wHxTyo>). Woody
Allen's movie Midnight in Paris did the same
(<http://en.wikipedia.org/wiki/Midnight_in_Paris>).

The programmers before us were the exact same as the programmers of today, and
they will be the exact same as the ones that come after us. The only
difference between them is whose shoulders they stood upon and what they were
able to do using the tools at hand.

Programmers of the 60s were happy just to get things to add up. The guys in
the 80s squealed with glee at GUIs. The guys in the 90s made the internet what
it is today. And the guys of the 2010s will bring AI and robotics into the
mainstream.

Everyone glues things together - it's how creation works
(<http://en.wikipedia.org/wiki/Steven_Johnson_(author)#Where_Good_Ideas_Come_From>).
The only difference between generations is the things you have available to
glue together. Sometimes you glue together a stick and a rock to make a spear,
and other times you glue together a machine gun, a camera, and a computer.

Either way - same people, different situations.

~~~
jakejake
I do agree with you that people tend to glorify the past. I also agree with
the author that it was way, way more difficult not even that long ago, when you
couldn't get answers by just typing into Google. I think that did weed out
most people from the occupation, because you had to do a ton of reading (not to
mention buying those expensive programming books!).

On the other hand you could say that I wasted a lot of time thinking about
things that were inconsequential. Now I'm able to focus my attention on the
true work rather than getting hung up on the minor details.

There are still problems to solve; they're just on a different layer for a lot
of us. You might even say that, because of all of our frameworks and libraries,
our problems are even more complicated.

------
thaumaturgy
I'm not sure what I think of this. (I started programming around 1985.)

I don't think the older generation of programmers were any better than the
younger generation in terms of skill. I think they were, generally, more
_practical_ -- they don't generally engage in language wars or programming
evangelism, but when they do, they speak directly from a massive body of
experience that's hard to argue with.

I think that the problems that they spent most of their time on were generally
simpler. The most challenging bit of magic I ever worked on a mainframe was
convincing COBOL to do variable interpolation on what was essentially
punchcard-data-on-disk. (COBOL has no string operators or variable types, for
those that are unfamiliar.) That was challenging, but it was a tiny problem in
terms of scope compared to, say, building a scalable web application with a
database backend.
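For anyone who never met punchcard-format data, here is a rough modern sketch of the kind of fixed-width record handling involved; the field names and widths are invented for illustration, and the point is that the final re-interpolation step is one line in Python but was a real fight in a language with no string operators:

```python
# Fixed-width records have no delimiters: every field is identified
# purely by its column positions, as on an 80-column punched card.
# (Field names and widths here are invented for illustration.)
record = "SMITH     JOHN      19840117"

fields = {
    "surname": record[0:10].rstrip(),
    "given":   record[10:20].rstrip(),
    "dob":     record[20:28],
}

# The "variable interpolation" step: splicing values back into the
# fixed-width layout.
rebuilt = "{surname:<10}{given:<10}{dob}".format(**fields)
print(rebuilt == record)  # True
```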

I've worked with and hung out with greybeards, and generally prefer their
company to that of hyperactive younger programmers. (For one thing, they don't
talk about programming much!) They built things the same way we do -- by
thinking about it and then banging it out one line at a time. They just used
different tools.

Documentation was better back when. That's one thing I really don't appreciate
about the Google Era; now, if I need information on some function or language
feature, I usually end up reading some incomplete community-submitted
documentation and a stack of opinions from people who may or may not know what
they're talking about. For anybody that got to use those lovely old massive
3-ring binders (they came in sets!) with professionally-written and compiled
in-depth documentation covering every single aspect of whatever you were
using, there's just no modern replacement.

Otherwise, I don't really think the field of computer programming has
fundamentally changed in all these years. The people are about the same.
(Greenblatt was a pretty annoying, hyperactive young programmer once.) Some
things are different. Some things aren't.

~~~
gordonguthrie
1979 calling.

> Documentation was better back when.

When I started programming properly ('84, '85?) it began with being issued the
full shelf of IBM 3060 manuals - and when I say shelf, I mean 3 feet. All my
reference material was for the version I was using - relevant, complete,
practical. We used to say RTFM and we meant it; you had an M to RTF.

Lots of people hate on the Erlang documentation <http://www.erlang.org/doc/>
but I love 'em, proper shelf, proper docos.

I am trying to write some HTML5/Javascript sound stuff, and finding help is
like picking overcooked vegetables out of Google Soup - ech!

~~~
eckyptang
This times a million. Microsoft, OpenBSD and Golang are the only things I've
found that produce good quality extensive reference documentation these days
that I can take offline.

Python documentation, while extensive, is crappy.

I still miss my Sun Ultra Enterprise 2. That was documented to bits.

~~~
josephlord
To add to your list I like the PostgreSQL manuals. They always seem detailed
and clear.

I find the Rails documentation awful: it lacks a lot of detail, with an
expectation that you read the source, which in places is fairly hard to follow
through several layers. And even if you find the relevant part, without good
documentation you can't tell in some cases whether something is a bug or
intended behaviour.

~~~
eckyptang
Yes I probably should have added postgres. I like the fact that they keep the
documentation nicely versioned as well. The quality is fantastic too.

------
luu
_There are a lot of mediocre and flat bad programmers out there because we’re
knocking down the barriers and letting everyone in, plus they have all sorts
of crutches they can lean on now._

This is an incredibly good sign. Jeremy's statement translates to two things.

1. Factor productivity in programming is so high that even "mediocre and bad
programmers" can make a net positive contribution to the economy.

2. Barriers to entry are low.

Would we want to change either of these things? Surely no one is going to
argue that we should prefer lower factor productivity.

So, what about barriers to entry? If you look at industries that have '1' but
not '2', you have a small set of very rich people keeping people out of the
industry in order to protect their riches. This makes everyone else in the
world poorer, because the service is artificially expensive, and it's
especially bad for the "bad and mediocre" people who could have otherwise made
it in the industry. The main benefit of creating high barriers to entry would
be to make people who are already rich by the standard of the one of the
richest countries in the world, in the most prosperous time in human history,
even richer. Why should we want that?

It's great that some people think of programming as a craft and continually do
everything they can to improve their skills; I can completely understand the
attitude, since it's one I have myself. But, not everyone is like that. Some
people would rather spend time doing other things. What's really amazing is
that we're so rich and productive that someone can put almost no effort into
learning how to program and still be a productive member of society. I love it
that we live in a country where people can work 1/100th as hard as an employee
in a Foxconn factory and produce more value. I hope that my kids will be able
to be ten times richer than me while working one-tenth as hard. I pray that
they'll choose to work harder than that, but I want it to be a choice.

It is a wondrous and amazing thing that total factor productivity[1] is so
high in the U.S. that unskilled Mexican laborers become three times more
productive when they cross the border; keep in mind that Mexico is, on a
global scale, one of the more productive nations in the world.

[1] <http://en.wikipedia.org/wiki/Total_factor_productivity>

~~~
reinhardt
> Would we want to change either of these things? Surely no one is going to
> argue that we should prefer lower factor productivity.

Try imagining these things happening for, say, doctors, civil engineers or
something else that matters. I'm sure many are going to argue that we should
prefer lower factor productivity.

~~~
anonymous
That's already true of doctors. The barriers to entry to being a GP are lower
than those to being a neurosurgeon, just as building Rails websites has lower
barriers to entry than writing code for the Mars rovers. In fact, if you so
much desire, you can set up a homoeopathy practice without a degree in
medicine.

~~~
javajosh
JPL doesn't have particularly brilliant programmers, but they do have
methodical, conservative, reliable programmers who test even the simplest
systems extensively.

~~~
jrajav
So what would the brilliant programmers do?

~~~
javajosh
In my mind, brilliance is associated with (successful) innovation. One can
innovate in two basic ways, in methods or products. For example, Rails is an
innovation in method. The webapp itself (however it was written) is an
innovation in product. Both kinds of innovation do occur at JPL (they've
created a substantial toolset around Eclipse, actually, and are pushing into
Cloud computing in a serious way), but the nature of the space-based/rover
projects ( _years_ between "qa" and "production") means that those programmers
are prized for their high skill and low tolerance for risk.

------
majormajor
"Do you really need to use a whole giant framework for some simple site with a
few CRUD operations? Probably not."

I'd rephrase this as "should you really spend any more time than necessary on
a simple site with a few CRUD operations?" The answer is still "probably not"
but now implies something completely different.

I also think that the fact that "just in it for the money" programmers can
continue to exist shows that there's more demand for programming work than
could be done by only the best and brightest programmers. So if there's work
to be done, and work that doesn't need the best and brightest, why dwell on
it? I agree with this part of the conclusion: "we should have some
appreciation for what we have these days, because whether we want to admit it
or not development is pretty easy" but disagree with much of the tone of most
of the article leading up to that. And I disagree that we can turn the trend
around (or that we'd even want to).

~~~
bryanlarsen
Sorry, mis-clicked and downvoted.

"Do you really need to use a whole giant framework for some simple site with a
few CRUD operations? Probably not."

A simple site with a few CRUD operations can be created in about 5 minutes
using a giant framework. (<http://hobocentral.net/>)

Spending days to do 5 minutes of work sounds like a bad thing, not a good one.
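For a sense of scale, the "few CRUD operations" in question really are tiny. A minimal sketch using only Python's stdlib sqlite3 (the table and columns are invented for illustration):

```python
# A "few CRUD operations" in their entirety, using only the stdlib.
# (The posts table and its columns are invented for illustration.)
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")

# Create
db.execute("INSERT INTO posts (title) VALUES (?)", ("Hello",))
# Read
rows = db.execute("SELECT id, title FROM posts").fetchall()
# Update
db.execute("UPDATE posts SET title = ? WHERE id = ?", ("Hi!", 1))
# Delete
db.execute("DELETE FROM posts WHERE id = ?", (1,))

print(rows)  # [(1, 'Hello')]
```

What a framework scaffolds on top of these four statements is the routing, forms and validation, which is exactly why it takes minutes rather than days.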

------
fusiongyro
One thing I think is important to keep in mind is that standards were also a
lot lower back then. Seeing a Lisp Machine in person, I was struck that when
you hit the bottom of the interactive terminal it would just start overwriting
the top of the screen. It was a surprising reminder that a lot of things we
_really_ take for granted today were considered superfluous in the past.

For as much as things are easier today, what non-programmers consider
acceptable ("the bar," if you will) has been raised more. The idea of full-
time user experience people in the 1980s would have been ridiculous. Today we
have "front end engineers" who are mainly there to inform and realize the
interactions designed by full-time user experience people. For a significant
amount of time, networking meant network filesystems and email. Today it means
multi-million lines of code browsers running thousands to hundreds of
thousands of lines of front-end code with just as much or more in the backend.

Programmers before us did more with less, true. But they also were more
willing to accept compromises. I think things would probably be a lot better
today if people's expectations were lower. But Plan 9 and Haiku are evidence
that people actually care quite a bit more about usability than stability.
It's a bummer. But I'm glad we enthusiasts can at least continue these
traditions outside the mainstream.

~~~
derleth
> Today it means multi-million lines of code browsers running thousands to
> hundreds of thousands of lines of front-end code with just as much or more
> in the backend.

And people using all of that to call everything beyond email 'bloated'
because...

Actually, I don't know why. One wonders if they also call a Ferrari bloated
because of the Model-T.

~~~
fusiongyro
I'm not exactly making a value judgement here. I like Plan 9, but not enough
to use it daily, in part because I like to do things like leave comments on HN
and do flashcards on memrise. I'm just pointing out that the capacity of
computers isn't the only thing that changed since this golden era. Standards
have too, and not really in the direction people were predicting. Dijkstra did
a lot of hand-wringing over the Software Crisis, which amounted to "our
software is still shit." Well, it's 2012, and it's still shit unless it's
touching something nuclear, but nobody seems to be willing to make the
tradeoff under normal circumstances.

On the one hand, both a Ferrari and a Model-T are really fundamentally about
getting you from point A to point B. The majority of the progress between the
Model-T and the Ferrari has to do with unnecessary optimization (going a
bazillion miles per hour) and tangential/incidental value-adds (being red,
having air conditioning, etc.)

On the other hand, an inability to discern qualitative differences is exactly
why every discussion about Apple on Slashdot turns into "My Rio did everything
the iPod did and more! It isn't fair! Wake up sheeple!" Maybe everything we do
with Google Plus we could have done with email in the 80s, but does it matter
if all the end user wanted to be able to do is put their signature in lilac
Comic Sans?

Where we are today (in any field, on any topic) is a mixture of good
decisions, practical solutions and random selections. Every pundit (and
everyone on HN is a pundit) thinks they know which things arose intentionally
and which things were just noise. I hesitatingly suggest that a large portion
of the intentional arose from the desire for good-looking stuff. I double-plus
hesitatingly suggest that an alternate world in which Plan 9's view
(everything is a filesystem, every user is command-line fluent) or
Smalltalk's view (every user is a programmer, everything is live and can be
live-debugged) had prevailed would be interesting and offer certain benefits. Nobody will
ever really know what that alternate world would be like—besides that it would
probably look like absolute _shit_ —and I think it's unfortunate, in the
bittersweet way that unrealized potentials always are.

But I take heart knowing that at least in this world, with very few
exceptions, technologies can become _marginalized_ but they can't really be
_killed_ anymore.

------
jandrewrogers
I am going to characterize the situation a little differently.

20 years ago you had to be fluent in the deep arcana of low-level details to
be a good programmer. You also could not lean on deep, rich libraries to get
the job done, because such libraries were few and far between. There was
nothing good about this per se; it was a fact of reality. It also made it
difficult and slow to get work done.

Today, development environments are rich in both tools and libraries beyond
the wildest dreams of programmers two decades ago. You can implement
incredibly complex applications without touching any of the low-level
supporting code and expect reasonable results. That is the banner of progress.

However, this characterization of current software development is only true at
the median. At the high-end, you still require a deep understanding of low-
level systems, have to build your own libraries, and be able to wrangle
esoteric computer science to be a good programmer. Many years ago, there were
a great many programmers with these skills because you needed them to get the
most rudimentary jobs done. Today, I see many more software engineers that
completely lack these skills because they never needed to learn them to become
a good, marketable programmer.

That is an important distinction. It is not to say that earlier generations of
programmers had more intrinsic ability, but they did have skills, relevant
today to designing software at the very high end, that are much less
frequently learned now. To a significant extent that was a product of
environment. Ironically, many of the primitive skills that were relevant many
years ago have become cornerstones of writing high-performance and scalable
software systems today.

------
tagawa
Looks like rose-tinted glasses to me.

> Bad programmers will still build bad software tomorrow too, but I think
> there are more of them now than there were then and that’s the real problem.

Of course there are more bad programmers now - because there are many more
programmers now. But this doesn't make the great programmers any less great
than their predecessors.

And things like higher barriers and lack of powerful search engines may have
been true, but do they really improve a programmer's ability? I'd argue they
simply affect a programmer's productivity, whether the programmer's skilled or
not.

Human greatness (or lack of) in a particular industry doesn't vary over time
in my opinion, especially over just a few decades. I don't think the
achievements of the past are any better or worse than the achievements of
today. It's just much easier to recognise great achievements (and achievers)
in retrospect - as time passes, the signal-to-noise ratio improves.

------
nathan_long
I work in Ruby on Rails, and have a developer on my team who started way back
in the punch card days.

His take is that it's harder now. We have better tools, but the things we're
building are more complex. We're programming at higher levels of abstraction
for broader code re-use.

We're also making lots of code libraries work together, coordinating
application code and caching layers and multiple kinds of databases. We're
writing client-side code that talks to our server-side APIs. And we've got a
whole system of tests and a system of version control which, though they make
our lives easier overall, are also part of the learning curve.

New day, different challenges.

~~~
ExcitedByNoise
I agree with this. Even though I started writing code in the 90's, 15 years
has changed the landscape tremendously. I think there is a disparity between
the fundamentals of programming and the fundamentals of being a modern
programmer in the work place. Fundamentals are often not needed everyday,
because the scope is much larger.

Still, it is a craft, and like any craft you should strive to always be a
better craftsman.

It's like a story a friend of mine once told me. When he was in culinary
school he was asked, "If a cook and a chef can produce the same meal, what is
the difference?" The answer: a chef knows why he (or she) does what he does.

------
kevhsu
As a CompE in my final semester of undergrad, I resent this characterization.
I've seen what the tests and MPs from ten years ago looked like, and they're a
joke compared to what we see nowadays. Even when you consider the number of
cheaters and copy-pasta-ers (who were much more difficult to catch a few
years ago), I think the standards for LEGITIMATE programmers are constantly
improving.

There will always be people who skirt by on the bare minimum level of work,
and due to the increasing number of people in this industry, the number of
slackers is increasing as well.

Or perhaps I'm overly optimistic about our current generation.

------
manaskarekar
How difficult is it to understand the concept of building on past human
achievements? Standing on the shoulders of giants?

If people don't use slide rules as much today, is that a bad thing? Are those
problems no longer getting solved?

Do you still need/want to learn how to build wheels if you want a car?

As you develop better tools, you go on solving increasingly difficult
problems.

------
zafriedman
In the past few weeks, I don't know why, but I've taken particular note of how
easy application development has become. The fact is that someone probably
could learn how to write a web application in Ruby on Rails in three months
without much hindrance. Production quality, probably not, but a REST API
persisting to a database and implementing some business logic, absolutely.

It scares me a little (a very little) in that this trend is ostensibly a
precursor, or leading indicator if you will, to the increasing commoditization
of our profession. But here is the thing. There are steps that we can actively
take to mitigate the complacency that recent technology has afforded to us.
This is likely where I agree with the author the most. If your skill set is
narrower than you wish it to be, engage problems in a different domain, and
acquire the knowledge to do so beforehand. Meanwhile, if you're one of the
people, like me, who feel beyond fortunate to wake up every day doing something
that you love, then continue to do that, stay hungry to learn new things all
the time, and hope that that will be enough. That's pretty much all you can
do.

~~~
TuringTest
Am I the only one who sees commoditization of programming as a _good thing_?
I long for a world where everybody could define simple-to-moderately-complex
automated behaviors without needing to pursue a career in programming.

Tools like Apple's Automator or Android's Locale provide low barriers to
scripting for the masses, but they fall short of providing a good, easy to use
abstraction mechanism; in the end they amount to classic imperative languages,
which are difficult to master.

On the other hand we have the Spreadsheet, the only widely successful End-
User-Development tool, ever. This one provides a really good mechanism for
building abstract data models and workflows - I've seen it used by people
without any programming understanding to develop complete form-like
applications, collaboration tools and storage repositories. Unfortunately,
using those required a lot of repetitive actions. The spreadsheet model does
provide a good abstraction mechanism but does not support automation
capabilities; you still need a classic scripting language to automate
behaviors.

I hope the recent _live programming_ fad initiated by Bret Victor's "Inventing
on Principle" will finally produce widely used reactive environments; those
are a good basis for non-programmers to begin programming without a steep
learning curve and only up to the point that they really need.
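As a toy illustration of the core idea being praised here, a hedged Python sketch of spreadsheet-style reactivity: formulas over named cells that always reflect the current inputs. The `Sheet` class and cell names are invented, and real spreadsheets use dependency tracking rather than re-evaluating formulas on every read.

```python
# Toy sketch of the spreadsheet model: cells hold values or formulas,
# and dependent cells always reflect the current inputs.
class Sheet:
    def __init__(self):
        self.values = {}    # cell name -> raw value
        self.formulas = {}  # cell name -> function of the sheet

    def set(self, name, value):
        self.values[name] = value

    def define(self, name, fn):
        self.formulas[name] = fn

    def get(self, name):
        # Formulas are re-evaluated on every read, so results stay
        # consistent with the inputs -- the "reactive" property.
        if name in self.formulas:
            return self.formulas[name](self)
        return self.values[name]

sheet = Sheet()
sheet.set("A1", 10)
sheet.set("A2", 32)
sheet.define("A3", lambda s: s.get("A1") + s.get("A2"))
print(sheet.get("A3"))  # 42
sheet.set("A1", 100)
print(sheet.get("A3"))  # 132 -- no explicit "recompute" step needed
```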

------
sodafountan
I like this piece because I agree with a lot of what was said, but I don't
necessarily think that this is a bad thing. There are always going to be
people who are hardcore computer scientists, the ones who started programming
in their bedroom at a very young age. But I'm glad that there's room in this
industry for those who have lives outside of computer science as well.

As the world progressively becomes more automated (which I undoubtedly believe
will happen) and personal computers become even more integrated into our lives
(smartphones are perfect examples), we'll need more average programmers to do
the "gruntwork", so to speak. As you all well know, computers are only good at
following instructions line by line, and computers will always need those
instructions to be written, whether that takes a master CS guru who's been
doing it for 20+ years or someone who's fresh out of college and kind of
drifted through. There will always be room for innovation for those of us who
are dedicated to our craft, and I'm glad to see that there will be room for
those who aren't as dedicated.

------
flyinglizard
Writing code today is easier because of better tools and more powerful
hardware, but the requirements and scope of projects have gone up by an absurd
factor.

A good programmer today is more knowledgeable and more versatile than ever
before. Problems are more complicated, systems are much grander in scope and
there's simply more technology to deal with and learn from. Google is a plus,
not a minus. All this code sharing makes you learn new stuff all the time.

Seeing new approaches and methods makes me a better programmer than someone
who's ultra focused on their niche, like back in the day.

And if you look outside the HN web-centric scope, at low level programming,
consider this: a reference manual for a simple 8-bit microprocessor (let's say
8051 core, very popular) was 300-400 pages long. Many products were designed
around it in the 80's and 90's, but as demands from products skyrocketed, this
hardware was no longer enough. For comparison, the reference manual of a TI
OMAP4460 processor, like the one in the Galaxy Nexus phone and Pandaboard dev
kit, is around 5200 pages long with the most complicated parts (e.g. 3D)
blanked out for confidentiality reasons.

Or look at it another way: a $3 microcontroller today has most of the
capabilities of a Pentium PC from 15 years ago (it has audio codecs, USB,
Flash storage and a lot of other stuff not found on the PC, and pretty close
computational power too). Someone has to write software to use all that. This
software is complicated. Your car has dozens of processors, and there was an
article around here estimating the code in a new luxury car at several million
lines. Someone has to write that, too. This someone is very likely (or so I
hope) more knowledgeable and adept at their work than their predecessor, who
only had to deal with one or two simple ECU chips.

Sure, the barriers were higher back then, but someone who's truly good today
is orders of magnitude better and more versatile than in the past - there's no
contest, really.

------
calinet6
Three words: "Shoulders of Giants." I'm sorry you had to forge the way, but
the present is still the frontier. We would do well to learn from the past as
always.

------
grn
A similar claim was made by Alan Kay in his lecture _Normal Considered
Harmful_ [1]. However, he didn't complain about lower barriers to entry.
Instead he pointed out that the people of the 90s ignored the work and
experience of past generations of programmers. He especially criticized the
Web.

[1] <http://www.youtube.com/watch?v=FvmTSpJU-Xc>

------
S_A_P
The point he makes is kind of obvious, IMO. Anytime you lower the barriers to
entry, less skilled persons can take part. He also seems to think of a
programmer's ability as relatively static in nature. People can learn, grow
and get better. Low barriers to entry mean that you can have poor developers
write and release code into the wild. Eventually, these people will either dig
in and learn, or wither and die on the vine. Even if they stay around, who are
they really hurting? Bad code needs to be maintained or replaced, so it keeps
good devs employed :)

This just struck me as a curmudgeonly rant akin to "Back in my day, we had to
walk uphill both ways through the snow and fight mountain lions just to get
to our unheated one-room school house."

That said, certainly the first couple generations of developers solved some
very hard problems for those of us here, and that should not be taken lightly.
But we live in a great time right now, there is still much work to do and
there is a pretty danged nice environment to do it in.

------
sunkencity
We just had a post on how to build your own private Google to facilitate
looking shit up: <http://news.ycombinator.com/item?id=4580537>

Machines are more powerful these days! The Unix kernel used to fit in a book.
How many books would it take to fit the Linux kernel?

These days we happen to have in place an efficient communications system,
built by these very giants of the past, that actually facilitates getting
advice from your peers.

Why do you want to use a framework? Well, it's not just about not writing the
code; it's about having code that has been well tried in action and honed. Any
eventual problem is probably documented somewhere on the net. Write your own
core stuff and you'll need to manage the cost of stabilizing the software.

~~~
oofabz
The Linux kernel has about as many chars as one to two Encyclopedia
Britannicae. C does not have as many chars per line as English, so printing
the kernel would require three or four Britannicae.
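The arithmetic roughly checks out. A back-of-envelope sketch, where every figure is an approximation (circa-2012 kernel size, commonly cited Britannica word count), not a measurement:

```python
# Back-of-envelope version of the comparison above.
# All figures are rough approximations, circa 2012.
kernel_lines = 15_000_000       # Linux kernel source, in lines
chars_per_code_line = 30        # typical C line, indentation included
britannica_words = 44_000_000   # approximate word count of one set
chars_per_word = 6              # average English word plus a space

kernel_chars = kernel_lines * chars_per_code_line
britannica_chars = britannica_words * chars_per_word

ratio = kernel_chars / britannica_chars
print(round(ratio, 1))  # in the "one to two Britannicae" range
```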

------
benzor
I find the title to be a little misleading, given what he goes on to explain
in his article:

" _Don’t get me wrong here, the class of 2012 CS grads are full of brains and
talent, they’re just mixed in with more mediocre people than the class of 1972
was. We aren’t getting less talented or smart folks, its our signal to noise
ratio that’s suffering._ "

More clearly stated as above, I quite agree with the point that we now have a
whole lot of "fake programmers" coasting through school with the help of
"copypasta," as the author likes to put it. But then again this is just the
diffusion of innovation in action [1], applied to careers instead of
technology.

[1] <http://en.wikipedia.org/wiki/Diffusion_of_innovations>

------
masterponomo
The copypasta crowd was alive and well back in the old days. I got my junior
college programming degree in 1985. Grading of programs was based on correct
output (still relevant today) and number of compiles. Yes, you could get it
100% right but fail anyway if you took more than 5 compiles. No penalty for
printing, though, so desk-checking of listings was our main activity. Once you
marked up a listing and made your changes (sometimes via terminal, sometimes
via punched card) you tossed the old listing. The folks who couldn't grasp the
concepts would grab listings out of the trash and pirate code from them. Good
programmers kept close track of their punched card decks too--the pirates
would happily swipe an unguarded deck.

------
rdale
I've been a professional programmer since 1978 and the article seems 'off-
beam' to me. He thinks that because programming was harder 30 years ago, there
was a higher barrier to entry and therefore only really good programmers would
get programming jobs. That idea is just completely wrong. The ratio of people
who could really program to those who weren't very good was the same then as
it is today.

Most programming goes on in your head in my opinion, and it is very much the
same today as it was in the 1970s. You need to be able to run programs in your
head if you are writing a COBOL program on coding sheets, that are sent away
to be turned into a deck of punched cards, and once the cards are punched you
can get them compiled into a program at most twice a day. If you want to be a
really good programmer today, you need to be able to visualize what you are
working on in your head. To me that is the ability that separates average
programmers from really good ones - a good programmer doesn't need a computer
to program, although having one is obviously handy.

For me there have been two big changes in how you conduct a programming
career. The first was the introduction of personal computers that were cheap
enough for a programmer to buy and use them to learn new programming skills in
their spare time. The second big change was Free Software where you could
publish your own code, and collaborate with people over the internet. I bought
a Macintosh in 1984 and used it to learn Pascal with MacPascal. Then I got a
programming job using Pascal. More recently I did a Ruby/C++ Free Software
project and got jobs as a Ruby programmer and as a C++ programmer as a result.

Only really good programmers can write Free Software and handle open reviews
of their work, and so if the article was about 'better self selection of
programmers', then people who write software in public today are way better
than the average programmer of 30 years ago. The very best programmers of
today are much the same as the very best programmers in the 1970s, it is just
that they are easier to find today. That is assuming you think people like
Linus Torvalds or Rails DHH are examples of the best programmers in the 21st
century.

------
VeejayRampay
I could spin that the other way: Sure we're not as good, but who cares, the
tooling and the methodology are much better, so as a whole, we're probably
producing better quality on average.

I'm sure you can cherry-pick a few great names from the past and tell us how
massive their shadows still loom over today's programming, but the average back
in the day was probably not very good, especially since everyone was using
tools and languages that were by nature very unforgiving.

------
capkutay
Are the musicians before us better? Are the artists before us better? Who's to
say?

Better questions are: has the best song already been written? Has the best
painting already been created?

Sure, we now have more tools and a lower barrier to entry, but that should
only increase the likelihood of the best talent coming into this field. I'm
excited to see what can be created as a result.

~~~
Zak
A lot of people do say the musicians of years before were better. I think it's
more likely that the ones we still remember today were the best of their time,
and that the best of our time will be remembered the same way by future
generations.

------
eloisant
Well, the expectations in terms of deadlines and scope grow to match the ease
you gain from having better tools.

Sure, 30 years ago you could waste hours on a trivial problem because you
didn't have Google and Stack Overflow, but your boss was happy if you spent a
month on something you'd have a week to do nowadays.

------
RobAley
The good programmers today are just as good as the good programmers of
yesterday. The challenges they face are just different, because the tasks they
tackle are different.

The other difference is that there are enough computers about that the bad
programmers can get jobs as well!

------
aroberge
tl;dr: I would argue, based on experience, that today's programming languages
make it much easier to get the job done than used to be the case back then. As
a result, programmers today don't have to be as good (i.e. there is a lower
barrier to entry) to write programs as programmers of the past had to be. So,
it is less of an elite profession ... and that is not necessarily a bad thing.

==========

I first encountered programming in the late 70s, when I learned Fortran in a
college level course. I did not have much opportunity to use it then, but I
found it fairly straightforward. I hardly used it, nor did I do any other
programming, until the late 80s.

In the late 80s, I had to relearn Fortran to do numerical work, which I did
for a couple of years. Again, I found it very straightforward.

After I moved to a new job (solitary university prof with no research budget)
I needed to do some numerical work and could not get a (free) Fortran
compiler. I learned C and could easily translate (in my brain) algorithms that
I knew how to program in Fortran; it was not quite as straightforward ... but
it was fairly easy. (In other words: C was not the best tool for the job.)
Then, I got a copy of Numerical Recipes in C ... and it was (at the time) as
though it was written in a completely different language, one that was
definitely not suited for its purpose. Thankfully, I got the results I needed
and could put that behind me.

In the mid 90s, I learned Java to write applets for teaching concepts to
students. Using Java was not pleasant, as I found it rather verbose and a bit
like having a straitjacket on.

I did not program for 10 years. Then, around 2004 I stumbled upon this
language called Python ... and found that programming in it was incredibly
straightforward. Fortran made numerical code easy to write (by design); Python
made everything easy to write. It was so much fun that I picked up
programming as a hobby.

Since then, I've learned a few other programming languages and found that
having the right tool for the job makes the job easier (duh!) ... and that
there are a lot more useful tools (read: programming languages) today than
back then. So, you don't have to be as good today as you had to be back then
to write programs that actually do stuff.

------
senorcastro
Falls under the "no shit" category.

------
digitalWestie
All I heard was: "Kids these days!" - a phrase that is doomed to repeat itself
forever.

------
zwieback
I was the programmer before "us" and I was not better but I'm better now.

------
derleth
Compiler design has an interesting example of how you don't need to be
wizardly to do interesting things anymore.

Jack Crenshaw's series of articles is an interesting example of this from an
era that's already bygone (his example was a compiler written in Turbo Pascal
directly generating Motorola 68000 assembly) but is close to our own time in
many ways that count. For example, he emphasizes the fast turnaround time you
get when your compiler is running on a PC instead of a mainframe across
campus.

Specifically, his little digression in the eighth part is on-topic here:

<http://compilers.iecc.com/crenshaw/tutor8.txt>

He specifically mentions the multi-pass design common to early compilers (63
passes for a compiler written for the IBM 1401!) as being down to having to
fit the compiler into limited RAM. Similarly, Crenshaw's compiler relies on
the existence of a fairly capacious stack to allow the parser to be recursive;
on a 1950s-era mainframe without a stack, or a 1960s-era minicomputer where
procedure calls were not re-entrant, you'd be left to do a lot more
bookkeeping by hand.
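
The shape of such a recursive-descent parser can be sketched in a few lines.
This is an illustrative sketch in Python, not Crenshaw's actual code (his was
Turbo Pascal emitting 68000 assembly, and it generated code rather than
evaluating); the point is that each grammar rule becomes a function, and
nested expressions simply ride the host language's call stack - the very
feature that stackless 1950s machines lacked:

```python
import re

def tokenize(src):
    # Numbers, parentheses, and the four arithmetic operators.
    return re.findall(r"\d+|[()+*/-]", src)

class Parser:
    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def advance(self):
        tok = self.peek()
        self.pos += 1
        return tok

    # expr := term (('+' | '-') term)*
    def expr(self):
        value = self.term()
        while self.peek() in ('+', '-'):
            if self.advance() == '+':
                value += self.term()
            else:
                value -= self.term()
        return value

    # term := factor (('*' | '/') factor)*
    def term(self):
        value = self.factor()
        while self.peek() in ('*', '/'):
            if self.advance() == '*':
                value *= self.factor()
            else:
                value //= self.factor()
        return value

    # factor := NUMBER | '(' expr ')'
    def factor(self):
        tok = self.advance()
        if tok == '(':
            value = self.expr()   # recursion handled by the call stack
            assert self.advance() == ')', "expected closing parenthesis"
            return value
        return int(tok)

print(Parser(tokenize("2*(3+4)-5")).expr())  # 9
```

Without a re-entrant call stack, each of those mutually recursive functions
would need an explicit, hand-managed stack of saved state - the extra
bookkeeping mentioned above.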

Then you have the massive emphasis on error detection and reporting forced on
programmers by the slow turnaround inherent in batch systems: You can't just
die at the first error with a one-line message if it'll take two weeks to run
the program again; soldiering on and verbosely reporting every step was the
only way to get the most out of your precious computer time.

And, of course, less-optimized code is easier (and faster!) to generate, but
that can only be justified if your machine is fast enough that the
optimizations aren't actually needed. tcc, the tiny C compiler, is a modern
incarnation of this
idea; it compiles fast enough you can throw the result of the compilation away
and run the compiler every time, like you do in Perl or Ruby.

And he mentions standardization: His compiler can only handle grammars you can
express cleanly in BNF, which is okay because those kinds of grammars are
dominant now anyway. Kinda hard to predict that ahead of time, though.

<http://compilers.iecc.com/crenshaw/>

