
Coding Horror: The Magpie Developer (2008) - jalan
http://www.codinghorror.com/blog/2008/01/the-magpie-developer.html
======
wwweston
If I had to pick a single pain point in front-end development right now, it
would be this. The new features in HTML, CSS, and native browser APIs, plus
the variations and limitations of each browser (plus the special and not
particularly settled world of mobile/tablets), are hard enough to keep up with.

But we're now at the stage where there are a dozen frameworks out there,
probably classifiable into at least three distinct paradigms, and then we have
the languages that target the browser. And I suspect we're a long way from
shaking this out into a semi-stable point.

The thing that I like least about this treadmill is that time invested in the
ephemeral arcana of a stack/platform is time that isn't invested in skills
that will transfer elsewhere and help you become a better general problem
solver.

~~~
hcarvalhoalves
The Cambrian explosion of frontend frameworks you mean is mostly about
patching up the web browser to do something it wasn't born to do, though. The
endless cycle of solving things that are already solved elsewhere won't stop
as long as the browser keeps being pushed as an application platform.

~~~
matthewmacleod
That's a rather negative take on it. I'd argue instead that the explosion in
frameworks is a result of a broad interest in developing and experimenting
with new approaches to providing applications. Web applications are typically
network-aware, real-time, cross-platform and easily-updated; the basic
technologies (HTML, CSS and Javascript) are easy to start using, and in some
ways extraordinarily flexible.

I don't think that these are actually solved problems, and the web as an
application platform can play a part in changing that.

Further, the whole concept of "patching up" a platform to "do something it
wasn't born to do" is misleading. Every computing device you use has its
roots in older software that was developed without being intended to perform
the tasks it currently does - we stand on the shoulders of giants.

The web as an application platform is still in its infancy, and we're
currently playing around with a lot of different approaches - some will live,
and some will die. It's how we make progress, after all.

~~~
hcarvalhoalves
I agree that's a rather negative take on it, but I can't help it. Working in
web development for almost 15 years has washed away my sugarcoating ;)

Frontend web development is _still_ absolutely painful and ugly. It's based on
a broken paradigm (hypertext vs. interactive interfaces); relies on hacks for
fundamentals (XMLHttpRequest is probably the best example); suffers from
multiple standards pushed by organizations with special interests (remember
having to encode video in 4 different formats?); is labour- and
time-intensive; and its tooling is still catching up with the '80s.

None of those, in isolation, are big problems though. The _big_ problem is
that, because you can always patch everything up with some javascript, there's
no drive to have a platform that rests on top of more sound fundamentals -
condemning ourselves to an endless cycle of frameworks.

------
habosa
A related notion is the Blub Paradox:
[http://c2.com/cgi/wiki?BlubParadox](http://c2.com/cgi/wiki?BlubParadox)

"As long as our hypothetical Blub programmer is looking down the power
continuum, he knows he's looking down. Languages less powerful than Blub are
obviously less powerful, because they're missing some feature he's used to.
But when our hypothetical Blub programmer looks in the other direction, up the
power continuum, he doesn't realize he's looking up. What he sees are merely
weird languages. He probably considers them about equivalent in power to Blub,
but with all this other hairy stuff thrown in as well. Blub is good enough for
him, because he thinks in Blub."

I, personally, am very slow to adopt new languages or frameworks for serious
projects. Still haven't found anything I can't do with Java and/or Ruby on
Rails. I do try to keep up with the news so I'm not totally caught off guard,
and I make a point of building toy projects in other languages like Python,
Haskell, Scala, Clojure, etc. It's important not to be a Magpie, but also not
to get caught up as a Blubber and end up looking for a COBOL to Objective-C
cross compiler so you can make an iPhone app.

~~~
yohanatan
Ruby is closer to Blub than Java. I'm not sure why you would even mention them
both in the same breath. You know that for Paul Graham (and anyone else who
is informed) Blub essentially means LISP, yeah?

~~~
TheZenPsycho
Uh, what? if "Blub" meant any specific language, or any specific kind of
language, why anonymise it? Are we suddenly afraid of embarrassing java or
ruby programmers for getting stuck on terrible languages?

Isn't it more likely that "Blub" simply means _whatever language it is you
happen to be using_ ?

~~~
yohanatan
He doesn't anonymize it at all. Read 'Revenge of the Nerds'. He specifically
mentions Java as the language the pointy-haired boss prefers.

Also, the post I responded to (and my subsequent post) was a bit confused
about what it referred to as Blub. Blub is in the middle and LISP is at the
top of the power continuum.

~~~
habosa
You clearly don't understand the Blub paradox. Blub is whatever language you
are using.

~~~
yohanatan
No, I understand it very well. Blub is whatever language you are using _that
isn't at the top of the power continuum_. As of this point in human history,
LISP sits at that end of the spectrum (all of this is directly from PG -- if
you take issue with it, please read more of his essays). I've already
mentioned two essays, but he has written books on it as well.

In short, PG makes it clear in his writings that LISP is the most powerful
language we have as of yet and he even puts forward the idea that it might
theoretically be the most powerful language _possible_ (due primarily to its
syntax being a human-readable [and morphable] representation of an AST).
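The "syntax as a human-readable AST" claim can be illustrated outside of LISP itself. Below is a minimal sketch in Python (all names are invented for illustration): programs are represented as nested lists, so "code" can be evaluated, inspected, and rewritten with ordinary data operations - the property usually called homoiconicity.

```python
# Minimal s-expression evaluator: a program is just a nested Python list,
# i.e. an AST you can read and manipulate directly.

def evaluate(expr, env=None):
    env = env or {}
    if isinstance(expr, (int, float)):
        return expr
    if isinstance(expr, str):            # a bare string is a variable lookup
        return env[expr]
    op, *args = expr                     # a list is an operator application
    if op == "+":
        return sum(evaluate(a, env) for a in args)
    if op == "*":
        result = 1
        for a in args:
            result *= evaluate(a, env)
        return result
    raise ValueError(f"unknown operator: {op}")

program = ["+", 1, ["*", 2, 3]]
print(evaluate(program))                 # 7

# Because code is data, a "macro" is just a function that rewrites the
# program list before evaluation.
def double_literals(expr):
    if isinstance(expr, list):
        return [double_literals(e) for e in expr]
    return expr * 2 if isinstance(expr, int) else expr

print(evaluate(double_literals(program)))  # 2 + (4 * 6) = 26
```

This is only a toy, but it shows the mechanism PG's argument rests on: when the program's surface syntax and its AST are the same structure, program-transforming programs (macros) become ordinary code.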

------
fit2rule
One thing that we seem to constantly need to be reminded of, is the fact that
the world gets a new set of human minds, every year, looking at the scene.
Children are among us. While some of us may have had decades to dissect the
polity of the world, new minds are today discovering the basics. Thus,
there really isn't any 'news' as such - just 'data relevant to those available
to view it'. The consumerist ideal of 'the new new' is a fallacy; in fact
things need to be at least 6 months old before they become 'the new thing', in
most realms of human cultural interaction.

This fact of 'where new comes from' (sex, basically) is true of developers, as
it is true of any other human responsibility that can be taken. Developers,
new to the scene, who do not know what was there when they arrived (for
various reasons), end up building new things. Those new things do in fact
represent progress for the human species, in that they can be as broken or as
brilliant as anything else, but won't - likely - be exactly the same as
anything else out there. Difference drives us forward.

But calling people out specifically and associating them with animals is,
alas, not a new thing. It has been going on forever, it seems. Is it not
tiresome to a developer to be instantiating fallacies like 'magpie disorder'
on other human beings so easily?

Not that I agree with the position that the 'always-new widget must be used'
specifically; more that 'new ways to discriminate' isn't something this
hacker, personally, wants to read about ..

------
motter
I agree with the main conclusion here, but it's a stretch to reduce technology
choices to a simple "new or not" dichotomy.

Let's say you're writing a new web service in Java, because it has features
aplenty and is also the language your team is most familiar with. You're
confident the JVM is a platform you want to build on.

Now you need to:

1. Choose a set of libraries or a framework. Do you go for Spring or Java EE,
or for something newer like Play or Dropwizard?

2. Choose a build tool. Maven? Ant? Gradle? Maybe we'll write some Scala, so
SBT?

3. Choose tools for deployment, config management, etc.

4. A database.

5. And so on.

All of these tools have different trade-offs. There are so many trade-offs
that I don't think blog post comparisons (or whatever) cut it. And so you have
the "magpies" who try and figure out some of these trade-offs for themselves
by experimentation. (That is what, in my opinion, hack days and 20% time are
for, not your new production system.)

But don't listen to me, we wrote our new web service in Go ;)

More seriously, it was a major decision and I couldn't possibly write a few
hundred words on my blog to justify it. I may write a few thousand, though.

~~~
kyle_t
"All of these tools have different trade-offs. There are so many trade-offs
that simple blog post comparisons don't cut it. And so you have the "magpies"
who try and figure out some of these trade-offs for themselves by
experimentation. (That is what, in my opinion, hack days and 20% time are for,
not your new production system.)"

Absolutely. Someone has to be the designated pseudo-magpie in order to
architect the stack. Doing so effectively, though, requires a dev who can look
past the buzzwords and elevator pitches to really get to the core of it.
Essentially they have to be magpie and anti-magpie at the same time: does this
new technology really offer me any benefit, or is it the same end result
wrapped in new clothing?

------
mildtrepidation
It's become funny to me how people who aren't professional web developers
assume that the "pace of technology" means you must obviously be learning and
using the newest cutting-edge tech at all times.

Aside from such a thing being irresponsible for those of us with clients who
need fundamentally sound and stable groundwork, it's an incredible waste of
resources. Unless it's actually your job to evaluate new languages and
frameworks (which, by the way, would be awesome!), it rarely makes sense to be
riding the latest craze just because it _is_ the latest craze.

It's good to have options. It's not necessarily good to try and use all of
them.

------
interstitial
I'm glad database technology, with its deep (and provable) mathematical
foundation like SQL, is free from these distractions. I mean, what if there
was this huge push for object-relational databases in the 1990s and 2000s,
including XML-native databases? What if there was this massive claim that
normalization is just an old man's fetish, and a bunch of geniuses figured out
you could store things in memory and flat files like in the punch-card days?
And then they created this huge buzz around hash tables and key-value stores,
where some magic serialized JSON sat ankle-deep in linked key/value stores,
and scalability and all such things were claimed but seldom proved. And then
the mathematics of normalization hit them in the face, so they had to invent a
new buzzword for normalization while still pretending SQL is an old man's
fetish. More acronyms, I say!

~~~
loudmax
You do realize that there are successful companies storing massive amounts of
data in NoSQL applications, yes? I guess you don't have to call key value
stores and such "databases" if it bothers you.

If you're making the case that traditional relational databases are still
relevant today, I don't think you'd find many who would disagree. Databases of
the sort described by Codd are as valuable today as they ever were. NoSQL land
has lots of competing technologies to draw the magpies. MongoDB was hot
before. Now it's not. So it goes.

But if you're trying to make the case that SQL is the only way to store data,
you lack exposure to the variety of data out there. There are situations in
which using a traditional relational database simply doesn't make sense. Would
you really want to run, say, an instant messaging application with millions of
users on Oracle?

~~~
derefr
Both of you are correct.

Around 2000 or so, we started to see companies that

1. had Big Data;

2. spread it across data centers spanning different continents; and

3. needed to display and update it in real time.

These companies (Facebook being the modern example) basically _needed_ to
throw out normalization (i.e. to choose AP over CP) in order to get an
acceptable UX for people interacting in different parts of the world. And
these companies were prestigious.

But these two facts combined meant that everyone was quick to adopt these
"pragmatic solutions to Big Data problems" in order to try to signal some of
the prestige involved with _having_ "Big Data problems."

But, since their Data actually wasn't Big enough for the real pragmatic
solutions to be more helpful than harmful, the prestige-seekers sought to
simplify the "pragmatic solutions" -- keeping all the pain involved with
non-relational access, while shucking anything that could potentially operate
at scale. Thus were "consumer" non-relational databases (e.g. Mongo) born.
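"Throwing out normalization" has a concrete shape worth sketching (data and names below are invented for illustration): duplicate a fact into every record that needs it, so each read is a single local lookup with no cross-datacenter join - at the cost of turning every update into a fan-out write the application must apply everywhere.

```python
# Denormalization trade-off: the author's name is copied into each post.
# Reads never join, but renaming the author must touch every copy --
# miss one and the copies silently disagree.

posts = [
    {"id": 1, "author_id": 7, "author_name": "Ada", "text": "hello"},
    {"id": 2, "author_id": 7, "author_name": "Ada", "text": "again"},
]

def rename_author(posts, author_id, new_name):
    # The fan-out write the application now owns.
    for post in posts:
        if post["author_id"] == author_id:
            post["author_name"] = new_name

rename_author(posts, 7, "Grace")
print([p["author_name"] for p in posts])  # ['Grace', 'Grace']
```

This is the AP-flavored bargain described above: stale or briefly inconsistent copies are tolerated so that every region can read and write locally.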

------
RankingMember
Dear god yes, agreed. For someone still learning the ropes, the absolute
torrent of new buzzwords/languages/dbs is really fatiguing.

~~~
kyle_t
It truly is, and even for experienced developers it can be overwhelming. It is
simply impossible for any one person to be knowledgeable about every new
technology and/or programming language.

I used to constantly fall into the trap of shallow but wide breadth of
knowledge. Something new would come out and I would drop everything and dive
right in. It feels rewarding at the time but really, in the end, has very
little benefit (unless of course you're a tech reporter).

My advice is to build a narrow and deep knowledge base. Pick a few
technologies you feel passionate about and really concentrate on mastering
those. It helps to pick ones with a large number of related job openings if
it's your livelihood. Mastering Java may seem old school, but the number of
job postings I still see for Java developers is amazing.

~~~
danielweber
It is incredibly fatiguing. Oh, you were working at a company for two years on
technology _X_? Sorry, we need to hire someone who knows _Y_ which you
couldn't because you were working on _X_. No, I don't care that you could
learn it in 3 days, our HR procedure says we can't give you 3 days.

Then I go download some C source code from 15 years ago and it still compiles
just fine and I smile a little bit.

------
TheZenPsycho
Hacker News declares new things are dead. Along with Web Pages, Blogs, Net
Neutrality and PSD to HTML.

The only things not dead are Edward Snowden and languages that compile to
javascript. … And declaring things dead.

------
michaelwww
The Magpie Developer is easy to shoot down, but as this thread [1] makes
clear, we usually switch roles over our careers. If you're not a Magpie once
in a while, you're not trying new things enough.

[1] [https://news.ycombinator.com/item?id=6938645](https://news.ycombinator.com/item?id=6938645)

------
danso
After spending a couple days pitching in on a Wordpress redesign...I'm
reminded how lucky I was to have a little magpie in me. PHP was my first web
scripting language and I was even able to build web apps from it. Later on, my
employer switched to Rails, and I had never even heard of Ruby. But going back
to PHP years later, I'm surprised at how much the variety of experience I've
had - just understanding different patterns - makes it easy to return, and
even to understand things in PHP that I had never understood before.

I agree that newness is too fetishized, but trying out a fad can be a great
way to unexpectedly learn and grow.

~~~
scarecrowbob
I agree-- I don't adopt every tool I play with, but how are you ever going to
learn better patterns for working if you never look at other ways people do
things?

------
badman_ting
Hell, a lot of us are just now absorbing ideas that came about in the 60s and
70s.

There's good new stuff of course, but a lot of it is redoing an existing idea
in a slightly different context, with new and exciting bugs waiting for you to
discover them when you'd most prefer not to.

------
jorgeleo
"Users don't care whether you use J2EE, Cobol, or a pair of magic rocks. They
want their credit card authorization to process correctly and their inventory
reports to print. You help them discover what they really need and jointly
imagine a system."

Damn right!

------
tel
This is definitely a reach, but the reason I like learning math more than CS
is that it's been around long enough to inspire confidence that it will
continue to be around.

Likewise, this is why I would study something like HoTT -- reasonable
certainty that the things I'm learning there will form the basis of the final
programming language.

I don't mind change, but I dislike putting weight in fashion.

------
mkhattab
I'm reminded of Peter Norvig's "Teach Yourself Programming in Ten Years."[0]
How can we develop intuition about new tools if their half-life is measured in
days while intuition takes perhaps years to develop?

[0]: [http://norvig.com/21-days.html](http://norvig.com/21-days.html)

------
lmm
But progress is real. The people who moved to Ruby weren't just following
fashion (at least, not all of them) - the language genuinely improved on what
had gone before. And now that they're moving on again, it's not (just) to try
and get away from all those losers who're writing Ruby nowadays - it's because
lessons have been learned, problems have been solved, and there are some
improvements that you can't make without writing a new language.

Sure, it's possible to move too fast. But the dangers of stagnation are worse,
IMO.

------
_random_
"...the vast majority of programmers have yet to experience a dynamic language
of any kind..." - We've tried them now, can you please take them back? At
least JS.

------
buckbova
I love shiny new things and will keep on collecting them.

Thanks to this article, I may have found some more. It pointed to the 2007
list; well, here's the latest, Scott Hanselman's 2014 Ultimate Developer and
Power Users Tool List for Windows:

[http://www.hanselman.com/blog/ScottHanselmans2014UltimateDev...](http://www.hanselman.com/blog/ScottHanselmans2014UltimateDeveloperAndPowerUsersToolListForWindows.aspx)

------
aikii
I'm quite amazed that I almost never get any answer to the question: "What is
the problem you're trying to solve?" We should make placebo software. Zero
features, 100% marketing, but it gives us the occasion to look at the problems
we really have. Maybe have a consultant for the vaporware acting as a
therapist. Oh wait, I have a strange feeling of déjà vu.

------
vitd
Does anyone else find it odd that the majority of links in his articles are to
his own articles?

~~~
ben336
Nope. He's written a lot within a clearly defined scope "the human side of
software development", so his articles are often related. Beyond that, self-
linking is a great form of self-promotion and getting readers engaged in a
blog.

------
curmudgeoned
I believe these sorts of people are also referred to as warez d00dz.

