
Advice to my younger self: become allergic to the churn - galfarragem
https://lambdaisland.com/blog/2019-08-07-advice-to-younger-self
======
umvi
The piece reads like a Unix pastor preaching from the pulpit:

"Great nutritious technologies to use: Make, Emacs, Lisp, CLI"

"Bad unwholesome technologies to use: JavaScript, Ruby, IDEs, Graphical User
Interfaces"

I personally hate Make, it's burned me too many times. Now I use CMake, and I
haven't been burned in years.

And is it candy or an olive that I like VSCode and not Emacs (not that I've
ever tried Emacs, I just don't feel like investing in that ecosystem)?

I agree the front-end web framework churn is out of control; hopefully it will
stabilize over time. But come on, don't just rip on random scripting languages
that have been around for 25 years.

~~~
rxhernandez
I mean, I haven't been a professional for multiple decades, but I've written
code professionally across a very wide range of fields in C, Matlab, Java,
Python, Verilog, Assembly, Fortran, and JavaScript, and no language I've seen
produces as much hair-pulling, braindead, "I just learned to program
yesterday" style code as JavaScript.

(I've worked intimately with decades-old scientific programming codebases
where the scientists didn't give a single damn about writing cleanly, and even
that was an order of magnitude more readable than the willy-nilly,
consistency-be-damned crap I've been seeing in JavaScript.)

(I feel like if I were to describe a small portion of my own hell, it would
include JavaScript.)

~~~
tomgp
I have a similar set of experiences, but I have a feeling it's at least
partly down to the sheer number of people writing JavaScript, and the very
public nature of much of that code, that so much of it is a mess. I don't
think there's anything inherent in JS that makes it produce messy code, other
than the fact that it's easy for amateurs to get results (thanks to the fact
that it runs in a browser and there are lots of easily available code samples).

~~~
tenaciousDaniel
I'm a JS dev and I love it, but JavaScript is to programming languages what
the electric guitar is to musical instruments.

It's the thing that attracts a lot of new beginners who may or may not be
interested in the more serious theory that makes programming what it is.

~~~
rxhernandez
Have you worked in other languages for substantial amounts of time?

~~~
tenaciousDaniel
I did Objective C for about 1-2 years, and a bit of java over the years. But
no, not really.

~~~
rxhernandez
Fair enough. Thank you for taking the time to answer a question that, in
retrospect, sounds antagonistic (apologies for that).

~~~
tenaciousDaniel
Nah, I didn't take it that way

------
misterman0
>> The Churn is losing a day debugging because a transitive dependency changed
a function signature. The Churn is spending a week just to get a project you
wrote a year ago to even run. The Churn is rewriting your front-end because a
shiny new thing came around.

If you think the churn is that then you're lucky. To me, the churn is working
for the man. Everything I ever do is for the man.

If I could ever reach a point where I could say, fuck you, the man, well,
that'd be the day, wouldn't it? That'd be The End Of The Churn that The Man
invented.

~~~
conception
There are many, many organizations and institutions that aren't the man. You
don't have to work for a soulless corporation. You can do work that helps the
greater good of humanity.

~~~
henryfjordan
Under the current version of American Corporatism, no, you cannot work for a
company that isn't soulless. A Corporation is legally required to work only to
increase the value of a share, nothing more.

Maybe you can work for a non-profit if you're lucky, but how many are there
out there that pay market rate?

~~~
johan_larson
> A Corporation is legally required to work only to increase the value of a
> share, nothing more.

No. That's a misconception. Corporate directors are required to act in the
interests of the shareholders, but they have a lot of discretion in
determining what those interests are and how they are to be served.

Here's a reference from a decent law school:
[https://www.lawschool.cornell.edu/academics/clarke_business_law_institute/corporations-and-society/Common-Misunderstandings-About-Corporations.cfm](https://www.lawschool.cornell.edu/academics/clarke_business_law_institute/corporations-and-society/Common-Misunderstandings-About-Corporations.cfm)

~~~
henryfjordan
That reference cites the Hobby Lobby case where they got to opt out of
providing insurance for contraception and a Facebook video of an expert
explaining how the current corporate system produces sociopathic entities
(corporations). I don't believe either fundamentally debunks the idea that the
agents of corporations are required to attempt to increase the value of a
share.

I agree that there's still a lot of leeway in that: giving an employee a
bonus might hurt shareholders directly in the short term, but you can claim it
increases productivity and so is a good decision. You still have to justify
all decisions in terms of value to the shareholders, though. If you (as CEO)
decide to just stop all work and spend every day at Disneyland until the
coffers are empty, you can be sure you'll lose that suit. And when every single
decision has to be viewed through that lens, you aren't able to directly do
real good for the world, just indirectly.

------
red_admiral
I recently cut a technology out of my life that was causing a lot of churn,
and I'm much happier as a result.

That technology was TeX, even though it's on the author's "good" list.

It somehow got to the point where every time two people got together to work
on a document, we had three different incompatible header files; package X
doesn't work with version Y of package Z; the order you import packages in
matters on my computer but not on a colleague's (still not sure why); the
order of macro expansion and character class redefinition in some packages
causes hard-to-track-down bugs ...

80% of what I need to do, I can do in Markdown and then run through one of
many document-generating tools. Currently for some projects I'm using
gitbook-cli, and I'm both more productive and much happier. I even wonder
whether I want to make a fork of gitbook-cli; it's something I'd trust myself
to be able to contribute to.

For the other 20%, I use Word. Yes, it's a WYSIWYG interface, but for short
documents (max 20 pages) where layout and design are important to get right,
I've found it saves me time and frustration compared to TeX.

Editor-wise, I use vim or VSCode depending on the task. I've tried Emacs; we
two are not really compatible.

~~~
609venezia
Have you tried ShareLaTeX/Overleaf? I have had better luck maintaining header
compatibility with it.

[https://en.wikipedia.org/wiki/ShareLaTeX](https://en.wikipedia.org/wiki/ShareLaTeX)

[https://www.overleaf.com](https://www.overleaf.com)

------
daenz
Accept the churn or be left behind, imo.

The best thing you can do to insulate yourself from the pain described in the
post (trying to remember how code worked, etc) is to brain dump important
things you learn along the way. I've recently begun keeping a TiddlyWiki[0]
for every major project I undertake. In it, I keep unexpected things I
learned, cheat sheet items, command-line snippets, and longer form entries
about structure.

The best part is it was all written by me, so the communication barrier is as
low as it can possibly be. Reading one of these TWs allows me to pick up a
project again extremely quickly. It's also useful on large projects where
different areas are like their own projects unto themselves.

Tools, not closing yourself off, help you overcome your limitations.

0\. [https://tiddlywiki.com/](https://tiddlywiki.com/)

~~~
rglover
The more experience I gain, the more dubious I grow of the "being left behind"
concept. It's rookie bait. With experience, you instinctively know when it's
time to move on from a technology (it no longer serves its purpose well or at
all) vs. jumping on the hype train to get some breeze in your hair.

~~~
daenz
There's a difference between jumping on the hype train and being stubbornly
stagnant. You will definitely be left behind if you refuse to adapt to new
"industry standard" tech, even if all of that new tech is 90% old concepts
with slight changes, repackaged.

That's not to say you should jump on every new half-baked
hackathon-originating JavaScript framework. But it does mean you should have
your finger on the general area of the pulse of where the industry is moving,
and start learning that tech. For example, pick up Rust for systems
programming instead of C.

I know a web developer who refuses to learn or use any web tech besides PHP.
He finds cloud-based hosting confusing as well. "I can do anything I need in
PHP." Everything, except find a good-paying job that isn't maintaining gnarly
legacy codebases. PHP is his bread and butter, and unless he decides to
learn some of the newer (stable) webdev tech, he's going to find himself with
no marketable skills in the future.

------
jrochkind1
Agreed. And I'm a rubyist!

I think ruby has actually gotten a LOT better at minimizing the churn (both
core/stdlib, and the ecosystem, specifically including Rails itself), as a
result of people learning from the experience. The ecosystem still doesn't
prioritize it as much as I'd like.

I also think this points out the benefit of sticking with a platform/ecosystem
for a while. Many people didn't realize the danger of backwards-incompat churn
until they saw it through experience over years.

You notice it when you work with the same code for a while -- if you are
always abandoning a codebase and coming to a new one, you abandon before it
gets painful, and either have a new one that isn't yet painful or a legacy one
where you can blame the pain on your predecessors making bad decisions.

If you are always abandoning a platform for a new one, the dangers of churn
aren't apparent in the less mature platform; you never pay the price at
version 1.0, only after it's been around a while. But you _will_ pay it if
the platform lasts long enough and you stick with it long enough. If, when it
starts hurting, you abandon it for some other new thing, thinking the new
thing will be better and not realizing it's better in that respect only
because it's _newer_ ... you never learn.

~~~
collyw
Agreed. Working on my own codebase for over 4 years taught me more than
anything else about how to design and write decent code. It's easy to blame
someone else for crap design / code, but when it's your own, it forces you to
think about how it could be done better.

------
nkozyra
Unfortunately, doing this tends to lead to being unemployable, because too
many people associate being immersed in churn with "productivity." This
spreads philosophically, and suddenly the people doing the hiring want these
kinds of people who can "solve" these kinds of "problems."

At the same time, finding something that works and sticking with it
perpetually sounds wonderful - perfect, even. But breaking things and doing
hard things also leads to innovation and new ideas.

So ultimately I think there needs to be a balance and everyone should embrace
_a little_ churn while eschewing it in the broad form.

~~~
Ididntdothis
“Unfortunately, doing this tends to lead to being unemployable, because too
many people associate being immersed in churn with "productivity." This
spreads philosophically, and suddenly the people doing the hiring want these
kinds of people who can "solve" these kinds of "problems."”

Totally agree. Sticking to things that work can seriously hurt your job
prospects. You quickly become an “outdated dinosaur”.

------
breatheoften
Thinking you can label the churn as something avoidable by a single upfront
choice about dependencies isn't realistic enough to be useful. Churn is not
just about the dependencies, but also about the project ...

Projects which are updated often have a forcing function on them - they need
to become updateable ... that can mean tests, that can mean reasonable build
tools, and that can mean dropping individual dependencies that are too painful
to update.

Obviously some ecosystems make all these things easier than others: typed
languages with reasonably consistent build processes help more than anything;
good codebases that isolate and wrap usage of most dependencies come next;
and reasonable automated tests are probably next most useful.

It is definitely the case that some codebases are easier to update than
others, even given the same ecosystem/dependencies. When is the pain of
"churn" most reasonably considered my fault instead of yours? Those are the
situations that can be fixed, and fixing them can get you pretty far, much
further than "ahh, this codebase has javascript and webpack, I shall rewrite
with make and closure" can get you on average ...

~~~
marcosdumay
Malleability is a very important concept, but this article is not about it.

This is not about your code not breaking while you change it. It's about
things breaking when you change nothing. I learned that lesson long ago using
PHP software; some ecosystems just break much more often than others.

------
cryptozeus
“Examples: Clojure, Common Lisp, HTML, Make

Counter-examples: JavaScript, Ruby, React-Preact-Vue-Angular-“

OP does not give any reason why these are churn or why we should stop using
them. Basically, by OP's logic, we should never adopt new technology because
it's just a new shiny thing. That makes no sense.

~~~
tracer4201
He didn’t elaborate, but I agree with him. I’ve seen too many examples of
junior devs who, given some small feature or enhancement to work on, decide
their real task is rewriting the entire codebase in whatever the framework of
the month happens to be. Yes, there are plenty of reasons to adopt new
technology, but you have to balance the pros and cons against keeping your
business running and, most importantly, not disrupting your customers.

At the end of the day, honestly, how many frameworks do you need to build
webpages? And 95%+ of us are certainly not operating at Google or Amazon
scale.

On a side note, I recently saw a dev spend a few days wrestling with
dependency issues: he was trying to wire Spring into a Java application and
running into configuration issues with decorators. At the end of the day,
this was a tiny, tiny application that periodically ingests messages from a
topic and forwards their payload to an email service. These are 5-10 messages
a day, and the emails that are generated go to internal business customers as
a courtesy, not as a mission-critical notification. It’s worth asking what
the business value was versus the cost.

The takeaway isn’t that you should never use new technology; rather,
understand that new technology, or change in general, comes at a cost. Your
job is to balance that cost considering several factors: code quality (cost
of time spent), flexibility and extensibility, features (the stuff valuable
to whatever function is responsible for your paycheck), and cost of ongoing
support and maintenance (operational load). Disregarding those factors and
over-indexing on the latest and greatest is usually the wrong approach.

~~~
cryptozeus
Agree with everything you said... but remember, all the issues you listed are
not new to junior devs, nor unique to any particular technology. The same
debugging issues were being worked through in C and C++. Your observation is
correct, but it applies not only to all technology but to life in general.

------
kevinconroy
Another name we often use is "Business-As-Usual" (BAU) or "Technical Debt".
Although some business-as-usual work and technical debt is a necessary evil,
and in some cases a good thing, I have also lost many hours to tasks that
create little to no value.

Another way to spot "The Churn": When you're done, what new value did your
effort create or unlock? Are users better off? Is the system more resilient
and reliable? Or did you just get it back to the way it was working before
everything went sideways?

Find more ways to create and unlock value and you'll find you move forward
much faster.

------
dnautics
> These are the olives of technology. Olives aren’t candy, and tasting them
> the first time isn’t always pleasant, but soon you will develop a taste for
> them. They have been here since ancient times, and they will remain for
> centuries to come. They are good for you, solid, reliable, nutritious. Eat
> less candy and more olives.

For what it's worth, olives are basically inedible from the tree and require a
considerable amount of processing before they are actually edible.

------
abathur
Churn can be all of the things the post describes, and I think it's good to be
skeptical of churn--always looking for ways to avoid it.

But it's also hard to tell the difference between churn and _maintenance_ ,
and I think one of our modern world's blind-spots is a de-prioritization of
maintenance.

------
danesparza
I'm sorry -- you lost me at UNIX, LISP, The Web, Emacs, TeX. There is a reason
UNIX and Linux are so fragmented: they were the thing that was churning for so
long! In some respects, they continue to churn (I'm looking at you, Linus
Torvalds).

~~~
bgorman
What are you referring to? Linux never intentionally breaks userspace code.

~~~
FigBug
Linux as narrowly defined as the kernel is very good at not breaking things.
Linux as used in common speech to mean a Linux distribution and associated
libraries undergoes constant churn.

I maintain a cross platform desktop app for Windows, macOS and Linux.

Windows is the best: 32-bit versions going back 15 years still work with no
issues. macOS is next: 32-bit versions no longer work, but 64-bit versions
still work going back 5+ years. Ubuntu is by far the worst: some library I
depend on changes its API pretty much every year, and the old version is
removed, breaking my app.

The solution appears to be Flatpak, which bundles up the app with all its
required libraries. However, I'm not sure how to make this work for plugins.
Would each plugin need to be in its own Flatpak? It's insane.

------
dustingetz
Clojure's reduced churn is a consequence of sitting at a local maximum in
programming language design. If it were not a local maximum, there would be
churn as we search for one.

UI has no such consensus on a local maximum and is very much not solved, which
is why React-Preact-Vue-Angular is churning while the humanity hivemind
iterates toward a solution.

Here is the relevant Rich Hickey quote: [http://www.dustingetz.com/:rich-hickey-web-frameworks/](http://www.dustingetz.com/:rich-hickey-web-frameworks/)

------
_carl_jung
Relevant (arguably overblown) talk by Jonathan Blow:
[https://www.youtube.com/watch?v=pW-SOdj4Kkk](https://www.youtube.com/watch?v=pW-SOdj4Kkk)

~~~
shurcooL
Thank you for sharing, I really enjoyed this talk. I somehow missed it when it
happened.

~~~
_carl_jung
No problem, I loved it too.

------
hrktb
> The Churn is spending a week just to get a project you wrote a year ago to
> even run.

How much effort is it worth to avoid spending a week every year?

Sure, the weeks can add up, but so does the time spent on low-level
abstractions, and on refusing to adopt better tools when the whole environment
is changing around you (e.g. there is no mention of native mobile
environments. Would it be churn to use Swift instead of non-ARC Objective-C?)

~~~
FigBug
I find Swift very bad for churn. It has had 5 versions in 5 years, with
breaking API changes each time.

Every Stack Overflow question about Swift has several answers, one for each
API version. Any time you grab some Swift code from the web or an older
project, it's not going to work.

Avoiding the churn isn't an option, since new Xcode versions drop support for
old Swift versions, and only the two latest Xcodes will run on the latest
macOS. They even drop support for the conversion tools. So if I go back to an
old Swift project now, it won't compile in my Xcode, nor will my Xcode help
convert the code to the modern dialect. My only option is to run an older
version of Xcode in a VM to convert the code.

If I'm writing a library I want other people to use or share between projects,
I'll still do it in Objective-C. Apps I do in Swift but I find it annoying.

I still have 15-year-old non-ARC Objective-C libraries. Why spend the time
updating them when they are debugged and work fine?

Every time I have to do a Swift version update I introduce bugs.

~~~
hrktb
That’s part of my point.

Cutting yourself off from new frameworks and hardware features just to cut
churn would be a horrible tradeoff in most cases.

There are some niches where churn can be mostly avoided, but I think churn is
usually a fact of life we should just embrace at a healthy pace.

From the opposite angle, a field with extremely low churn would seem
suspicious to me. For instance I would expect any language with no significant
update in the last 10 years to have abysmal unicode support.
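As a small illustration of that point: languages that have kept updating tend
to handle Unicode natively these days. In Python 3, for instance, strings are
sequences of code points rather than bytes:

```python
# Python 3 strings are code points, not bytes, so basic text
# operations work beyond ASCII out of the box.
s = "naïve café"
print(len(s))            # counts code points: 10
print(s.upper())         # NAÏVE CAFÉ
print("ß".casefold())    # full case folding: ss
```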

------
letstrynvm
Important code needs love: it needs to be improved, made more robust, have
security issues handled, be consolidated if it starts to bloat, and be cleaned
and kept legible, even down to whitespace, so you or the next guy can easily
see and continue to look after all the moving parts.

If it's important, it needs and deserves all those things; they are not
"churn" but maintenance.

~~~
marcosdumay
Important code needs to become stable so it can be improved instead of
fighting constant degradation.

Churn kills important code, making it unfit for purpose.

------
codingdave
I'm not sure his list of technologies is accurate to the general coding
population. Nor would anyone else's list be accurate. The more general idea is
to learn the tech you use... learn it well and deep enough to avoid the kind
of churn the article describes. If you can do that with modern tools, fine. If
you can't and want to stick to older tools, that is also fine.

But draw your own line in the sand as to which tech you are going to use in
your production systems, and move to new tech only when both your skills are
sufficient, and there is a matching business need driving the change.

------
1-6
I frequently bite into olives only to find pits that did not get machined out.

~~~
snazz
Unless they’re stuffed olives, I’d recommend just buying them with pits in.
There’s quite a lot more variety that way at my local grocery store.

(I know you didn’t mean that literally)

------
SteveSmith16384
Just this morning I've spent an hour trying to get a well-known HTTP library
to work in my project, cos the developers keep completely refactoring all the
classes and methods. Stop it!!

~~~
danesparza
The problem isn't the libraries. The problem is you don't have CI (with a
dashboard your team is paying attention to). Check-in -> automatic build ->
automatic unit test execution -> signals if the build is ready

~~~
Ididntdothis
Until your CI pipeline causes churn because something doesn’t work anymore....

~~~
umvi
We use GitLab CI, and it's been working maintenance-free for 3 years now. We
configured the test runners in 2016 and haven't touched them since, and they
have never randomly broken, even across GitLab updates (of course, YMMV
depending on whether you like to be on the bleeding edge all the time; we are
always one cycle behind it).
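For reference, the configuration side of that setup can be tiny. A minimal
`.gitlab-ci.yml` looks something like this (the stage and script names here
are an illustrative sketch, not our exact config):

```yaml
# Illustrative minimal pipeline: build then test on every push
stages:
  - build
  - test

build-job:
  stage: build
  script:
    - make build

test-job:
  stage: test
  script:
    - make test
```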

~~~
pritianka
What an awesome compliment for our team. Thank you!

------
peterholcomb
The first thing that came to mind when I read the opening paragraphs regarding
churn is developing mobile applications. I find myself losing a day or two
just to get my apps to compile if I ever go a month or two without spending
any time on them. It's brutal how quickly things can break and dependencies
need to be updated.

Is it possible to do modern mobile dev without churn (coming from a situation
where my clients often go months without requesting features)?

~~~
ElFitz
I only know of iOS, but here's what I try to do:

\- Reduce the use of dependencies to the minimum (I'm bad at this)

\- Set version compatibility in my Cartfile / Podfile

\- When new breaking versions of Swift come out, suggest to my client a 2 / 3
mission solely focused on upgrading their app and its dependencies (otherwise
it will probably make their next last-minute super-urgent-right-now feature
needlessly long and complex to develop)

But I'd also willingly take any advice on this

Edit: also, I semi-solved this for my JS work using automated dependency
update services like dependabot, coupled with unit / integration tests, but I
still haven't found a similar service for Swift
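For anyone unfamiliar, the Podfile version pinning mentioned above looks
roughly like this (the pod names and versions are made up for illustration):

```ruby
# Podfile: pin dependency versions (pod names/versions are hypothetical)
platform :ios, '12.0'

target 'MyApp' do
  use_frameworks!

  # "~> 4.1" allows updates up to (but not including) 5.0
  pod 'SomeNetworkingPod', '~> 4.1'

  # An exact pin never changes until you update it by hand
  pod 'SomeUIPod', '5.0.2'
end
```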

------
March_f6
I mean, sure. But at a certain point, Lisp was the JavaScript of its day in
terms of bleeding edge, no? As someone who has built multiple things using
React Native, I can absolutely relate to the churn, but each time I willfully
wade into it, and each time I get better at handling all the pains that come
along. I also can't think of a better way to improve debugging skills than to
use a tech stack that gives rise to issues not easily solved via
StackOverflow.

~~~
rout39574
Lisp was theory before it was code.

Javascript was a fashionable imitation of another popular language dialect.

~~~
waterhouse
> In 1995, Netscape Communications recruited Brendan Eich with the goal of
> embedding the Scheme programming language into its Netscape Navigator.[16]
> Before he could get started, Netscape Communications collaborated with Sun
> Microsystems to include Sun's more static programming language, Java, in
> Netscape Navigator so as to compete with Microsoft for user adoption of Web
> technologies and platforms.[17] Netscape Communications then decided that
> the scripting language they wanted to create would complement Java and
> should have a similar syntax, which excluded adopting other languages such
> as Perl, Python, TCL, or Scheme. To defend the idea of JavaScript against
> competing proposals, the company needed a prototype. Eich wrote one in 10
> days, in May 1995.

~~~
BrendanEich
Perl, Python, Lua were not ready for embedding (unsafe FFIs, OS-dependent
APIs) and would have been flash-frozen in early bad states. See
[https://news.ycombinator.com/item?id=1905155](https://news.ycombinator.com/item?id=1905155).
Better that JS got the early bad state freezing and thawing (ES3 helped; ES5
was constructive and ES6 made JS pretty good).

In early 1996, John Ousterhout stopped by to pitch Tcl/Tk, but it was too
late. VBScript was coming, but JS in 1995 Netscape betas got on first and
saved us from that dystopia.

------
namelosw
> Examples: Clojure, Common Lisp, HTML, Make

> Examples: UNIX, LISP, The Web, Emacs, TeX

1\. Or just computer science.

2\. Churn is not a problem. If you keep accepting the churn, you learn new
things faster and faster, until you can scan the docs and already know most of
it, if it really is churn. If it is not, it's something genuinely new (e.g.
Coq), and then you have the ability to identify 'new but not churn' instead of
just learning 'old and not churn' from half a century ago (e.g. Lisp).

3\. Among the churn, there is 'meta churn': Haskell, Lisp, Erlang, and Rust.
Countless languages have stolen the monad from Haskell, and Kubernetes
patterns are very similar to Erlang OTP patterns.

4\. Editors and IDEs are really irrelevant. I've used Emacs a lot and I'm very
fluent with Vi, but for some languages I prefer IntelliJ and VSCode. Just use
the tools you feel productive with.

------
wil421
In The Churn this week: a colleague wanted to improve a query for a widget on
a dashboard. The query was perfectly fine, but now I’m stuck with the wrong
results, debugging everything I can find.

~~~
Cthulhu_
If it can be made faster, then fine, BUT that colleague needs to write a test
first if there wasn't one yet (and I suspect there wasn't, given how you're
fixing wrong results / debugging).

------
marmaduke
This is the main reason I haven't picked up the new shiny languages after
using Python for a decade. OK, the GIL is not cool, but I'm OK with
multiprocessing (one example among many).

I look at Julia (for HPC) and think: sure, if I were a grad student and had
time to burn. But now I need to go from an idea to figures, live-coded, in the
space of minutes, and NumPy and matplotlib are boring and just fine.

Research is one domain where churn is very much a daily thing. Careers are
made on churn in research.

------
ravenstine
> The Churn is losing a day debugging because a transitive dependency changed
> a function signature. The Churn is spending a week just to get a project you
> wrote a year ago to even run. The Churn is rewriting your front-end because
> a shiny new thing came around.

These forms of churn are hardly related, other than that they are often
self-inflicted. The first two examples are really just technical debt, and
perhaps they are better referred to as the "grind" rather than the churn.

Rewriting your code base to use the _framework of the now_ may happen because
of the industry changing to a point where using old-reliable.js is making it
difficult to hire new talent, whereas elon-musk.js is the hot new thing that
tons of programmers are interested in. Companies can feel obligated to follow
this trend because they think they'll become obsolete if they don't. The
company I currently work for is going through this at the moment, actually.

The churn that this causes is more novel than the grind because, as an
engineer, you are learning to do the same job in a different way. Avoiding the
grind isn't likely to fundamentally change the nature of your job, as writing
documentation and reducing the number of dependencies aren't very radical
ideas. But taking someone from one language and having them learn another, or
having them transition from one framework to another, can effectively demote
an engineer to junior grade until they've had experience with their new tools.
With the churn, your expertise can lose its meaning.

Even if you learn to solve the grind, solving the churn can be difficult even
if your loyalties remain to a single tech stack. With the exception of purely
personal projects, the industry will continue to shift its own loyalties to
different tools, so you've got a few different games to play when looking for
jobs:

\- Be the one who knows the legacy tech and can make sense of other people's
horrible legacy code. (Which the company will inevitably decide to have
rewritten in something like React out of the belief that all their problems
are being caused by the old technology.)

\- Be the one who knows the hot new thing and can write code that, whether or
not the code under the hood is atrocious, will make the bosses believe that
they can be like the Googles and the Facebooks.

The vast majority probably pick the latter. But if one wants to avoid the
churn as much as possible, they probably need to not only stick to tried and
true tools but also find a good company and stay there indefinitely, rather
than hopping companies every few years. Of course, that may come with a harsh
penalty down the road.

------
ahuth
“Churn” is definitely an issue. And JavaScript does have its own complexities.
It feels weird singling out Ruby and JavaScript (and any other individual
technologies), though.

I wonder if part of this is a human instinct to “other” people who are
different. Someone feels superior because they use Lisp, and looks down on
people using JavaScript. Or vice versa.

~~~
Gibbon1
I think it's less about languages and tools than the culture around those
languages and tools. I think it's centered on the answer to the question: how
many customers and developers are we willing to toss to make this change? For
the ruthless and stodgy Microsoft, the answer was zero. For Linux, close to
zero.

------
lunias
Remember the last time you were supposed to fix a bug, but the code was so
"bad" that you rewrote the entire application?

IMO this is why churn is not going away. It's much easier to achieve flow
state from zero than to work up to it in a codebase that you're unfamiliar
with.

------
DannPuglo
Watching people fight Javascript never gets old. Keep yelling at those clouds,
fogies.

~~~
pepper_sauce
What properties do you think make modern JavaScript a good language, rather
than a (now, finally) barely passable language being actively ruined by its
own ecosystem?

------
rhacker
I'd have told my future self, moments after he said the word "churn" \- wait
wait wait, you're from the future and you have a time machine and you're
giving me programming advice? You realize that if you give me stock picks,
none of that will be relevant, right?

------
oreglio
Like all industries, we can't just stay still. Curiosity and creativity are
important. The "olives" are the fruit of some older technology that has
matured. Maybe this technology was the "shiny new thing" some time back.
Some people try to fix the shortcomings of yesterday's technology with
their own ideas and view of things, and nobody can prevent this. This is just
basic human creativity at work.

Maybe what's important is not making people believe that yesterday's technology
is obsolete and should be buried or abandoned. The shiny new thing is just
another tool in the toolbox, and it's not because I created a new shape of screw
that all the other screws are worthless. There are still plenty of screws all
over the world that need to be unscrewed; we need those tools to do that,
people who know how to work with them, and some of these screws still need to
be made. Maybe people don't talk about these screws as much? They don't make
the news anymore because they've been there for so long?

Maybe the point here is that social media thrives on the shiny new thing; it's
the core concept of it. An old newspaper is worthless; no one ever sold an
old newspaper. So they need to grab some attention to live, and the shiny
new thing is a way to accomplish that.

But the fact that there are lots of articles about all those new frameworks
doesn't mean the older ones are worthless. We need both. Maybe for some
projects we won't choose an architecture based on ten-year-old concepts and
frameworks, while other projects need stability and a whole tested, stable
toolchain and documentation for the years to come.

What kind of work do you want to accomplish? Small projects where you work
alone? Where you just serve a few people? Or big projects with a huge
infrastructure that needs to deliver to millions, with great economic impact?

I bet you don't use only a hammer when you want to build kitchen furniture.
And the tools needed in a factory that builds kitchen furniture are not the
same as the ones needed at home. The factory needs a whole supply chain, plus
testing and quality checking, whereas the craftsman in his own workshop will
get by with fewer tools ...

We need all kinds of software and technology. Maybe what's hard is identifying
the right ones and defining precisely what we want and where we think we're
going. People are always going to want to try the shiny new thing. We all need
novelty, and we all want to improve on the shortcomings of our current tools.

~~~
yogthos
The reality is that there is very little actual innovation happening in
programming. Most of the ideas in use today were discovered decades ago.
The big reason for churn is that people don't bother learning about what's
been done before, and keep reinventing the wheel.

You don't see this in other disciplines such as physics or chemistry, where
people spend years learning about existing research before actually starting
to contribute.

With programming, the barrier to starting to write code is much lower than the
barrier to learning the background research. People don't bother looking at
what's been done already, and just start "inventing" things.

More often than not this results in half-baked ideas, because the authors of
projects don't really think about the full scope of the problem. Then once the
project starts hitting limits in practice, people start kludging things on top
of it, and eventually it becomes unwieldy to use. Then somebody comes by and
does the same thing simplistically again, and the cycle repeats. Nothing new
is learned in this process, and you get churn for the sake of churn.

~~~
Ididntdothis
I totally agree. But how do you balance this with the need for looking good on
the job market? If you want to stay employable you are almost forced to
participate in this craziness.

~~~
yogthos
I've been working with Clojure for the past decade, and haven't had to deal
with any of the craziness.

The job market is smaller, but so is the pool of developers. Companies tend to
be more flexible because of that and are often open to remote work. I'd much
rather work in a sane niche market than deal with the mainstream churn.

~~~
cutler
Isn't it fair to say, though, that you're not the average Clojure dev? I mean
you have a book and a web framework to your name so that puts you way ahead of
the pack. As an average Clojure dev I've found it very difficult to find work.

~~~
yogthos
That's just a result of having worked with the same tech for a long time. When
I was starting out with Clojure it was a lot more niche than it is now, and
finding jobs was much harder. The whole reason I published a book was the lack
of beginner resources available. So I don't think there's anything special
about me; it's just that I was stubborn about wanting to work with Clojure and
didn't get dissuaded until I made that a reality.

We have a local Clojure meetup in town, and when I first started going there
pretty much everybody was using Clojure as a hobby. Today, we have a bunch of
companies using it in production, and all of them are actively hiring. The
last three co-op students I had all ended up getting Clojure jobs. I imagine
this varies based on where you live, of course, but another option is to
simply introduce Clojure at a place that's using something else. At the end of
the day, that's where Clojure jobs come from in the first place.

------
ChuckMcM
Damn, if I were making a time machine to go back to 1999, I would have told my
younger self, "Sell all your tech stocks, right now." :-)

------
mbrodersen
You are the problem. Not the tools.

------
centrinoblue
JavaScript is an amazing language; get over yourselves

~~~
DannPuglo
HN is full of this stuff. These people don't even understand what modern JS
is.

------
DevKoala
> Counter-examples: JavaScript, Ruby, React-Preact-Vue-Angular-…

Great programmers can find virtues in all tools. Hate the player, don't hate
the game.

