Advice to my younger self: become allergic to the churn (lambdaisland.com)
231 points by galfarragem 67 days ago | 209 comments

The piece reads like a Unix pastor preaching from the pulpit:

"Great nutritious technologies to use: Make, Emacs, Lisp, CLI"

"Bad unwholesome technologies to use: JavaScript, Ruby, IDEs, Graphical User Interfaces"

I personally hate Make, it's burned me too many times. Now I use CMake, and I haven't been burned in years.

And is it candy or an olive that I like VSCode and not Emacs (not that I've ever tried Emacs, I just don't feel like investing in that ecosystem)?

I agree the front-end web framework churn is out of control; hopefully it will stabilize over time. But come on, don't just rip on random scripting languages that have been around for 25 years.

I mean, I haven't been a professional for multiple decades, but I've written professionally across a very wide range of fields in C, Matlab, Java, Python, Verilog, Assembly, Fortran, and JavaScript, and no language I've seen has nearly as much hair-pulling, braindead, "I just learned to program yesterday" code as JavaScript.

(I've worked intimately with decades-old scientific programming codebases where the scientists didn't give a single damn about writing cleanly, and even that was an order of magnitude more readable than the willy-nilly, consistency-be-damned crap I've been seeing in JavaScript.)

(I feel like I could describe a small portion of hell for me and it would include Javascript)

It's all about who is writing it and why they are writing it, not the language. Well ok JavaScript has some legacy warts, but ECMAScript 2019 enables a variety of very well written code in a variety of styles, and TypeScript is icing.

As for me, I've only recently entered the JavaScript game, but the php I've seen has been much, much worse. I find the quality of JavaScript I typically encounter is on-par with typical Python code, and a fair bit better than the Frankenstein monsters of dev-ops code that often has a different convention for each line of bash (with attendant third-party applications) mixed in with inscrutable environment states as you shift in and out of various orchestration layers that have zero sense of application hierarchy or even running processes.

I've seen scientific codebases too, but the thing you're getting there is an authorship that is often small-team (1 or 2) and very slow to shift, with highly enculturated domain context. Of course JavaScript code is going to be a mess in organizations where you have devs coming from vastly different contexts and staying a max of 2 years with a team where all you care about is getting the pretty graphic on the screen.

> It's all about who is writing it and why they are writing it, not the language.

But who is writing it and why they are writing it are inextricably linked with the language. Every language has a culture and a reason for existing, and those are pressures that bear on the person writing the code and on the code produced, just as much as the person writing the code bears on the code and the language.

Whether or not, say, modern Java encourages good code is neither here nor there, as there is not only resistance to learning new things but also resistance to using the new things properly. Java has a history of bloated, inefficient code for a reason, and when you use Java, the abstractions you use will alter the code that you write. It's impossible to get away from 'bad code' without rewriting things, and then you're not coding your new project, you're rewriting someone else's. It's also 'good practice' to use x, y, z language idioms, but there are cases where such idioms make the code much harder to read and much less clean.

As an example from C: I like POSIX, but dirname and basename are two ridiculously shit functions. If I choose to use them, the rot starts infesting my code, and everything that deals with them has to account for that, which itself causes a bias towards more obtuse code.

>It's all about who is writing it and why they are writing it, not the language. Well ok JavaScript has some legacy warts, but ECMAScript 2019 enables a variety of very well written code in a variety of styles, and TypeScript is icing.

Good languages kinda enforce some kind of basic rules, and prevent developers from making silly mistakes - because we all make them all the time. Those languages usually trade off performance for safety, though.

We should pick the safest language that prevents errors and mistakes, as long as we stay within the allowed performance bounds.

JavaScript can be performant, but most of the time it isn't, and it isn't safe at all. At least we have typed variants of JavaScript.

JavaScript by design was meant to fail gracefully - in the background - so the site keeps working. Sadly, sites nowadays are so JavaScript-heavy that this isn't helping at all.

I generally prefer statically typed languages just because there is less ambiguity, and function definitions can help you greatly - you know what you put in, and what you get back.

With dynamically typed languages, unless the documentation is pristine (and in the JavaScript ecosystem it varies a lot from package to package), you are out of luck unless you read the implementation.

bash and devops code have their own special place in hell. bash has only one advantage - it works on any Linux distro out of the box, and that's it. It is an unreadable, untyped mess.

Seconded on php. I have used php, I can use php, I just choose not to any more if I can help it.

Smart choice. I opted to stop accepting php jobs several years ago and life has never been easier.

I have a similar set of experiences, but I have a feeling that so much of that code is a mess at least partly because of the sheer number of people writing JavaScript and the very public nature of much of that code. I don't think there's anything inherent in JS that makes it produce messy code, other than the fact that it's easy for amateurs to get results (thanks to the fact that it runs in a browser and there are lots of easily available code samples).

I'm a JS dev and I love it, but JavaScript is to programming languages what the electric guitar is to musical instruments.

It's the thing that attracts a lot of new beginners who may or may not be interested in the more serious theory that makes programming what it is.

Agreed. I've gotten to where I don't like it as much for really large projects - TypeScript helps a bunch but is inherently limited by aspects of JS itself - but there is absolutely no language I'd rather sketch something out in.

The core data types actually strike a really excellent balance between representing the major distinct paradigms (named collections, unnamed collections, strings, numbers, booleans) while giving you maximum flexibility within them. Its standard library - array functions in particular - has become very mature. FP and OOP concepts are both quite well-supported, and doing async stuff (client or server or otherwise) with promises just smashes every other language I've used in terms of ergonomics.
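A quick sketch of what that balance looks like in day-to-day code (the names and data here are made up):

```javascript
// Plain object/array literals plus the mature array methods cover a
// surprising amount of everyday work:
const users = [
  { name: 'ada', active: true, logins: 12 },
  { name: 'bob', active: false, logins: 3 },
  { name: 'eve', active: true, logins: 7 },
];

// Named collections (objects), unnamed collections (arrays), and the
// standard array methods compose cleanly:
const activeNames = users
  .filter((u) => u.active)
  .map((u) => u.name);

const totalLogins = users.reduce((sum, u) => sum + u.logins, 0);

console.log(activeNames);  // ['ada', 'eve']
console.log(totalLogins);  // 22
```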

Much of the above could be said about Python too, but regardless.

Yeah definitely, it's amazing for scratching together ideas. Though at a certain scale, it starts to feel pretty fragile. I don't have much experience with other languages but the explicit interface/implementation paradigm that Objective C used was so amazing to me (as someone who started out in JS). I find myself missing it every now and then, especially for large projects.

Have you worked in other languages for substantial amounts of time?

Yep. Python for many years, Java+TurboPascal+C# for a couple of years, C+Matlab for a bit under a year, and a few other ones that I wouldn't list simply due to how little work I've done in those.

TypeScript and Python (with typehints) are the current favorites. Golang seems to be getting up there for me too, but I don't have enough experience with it to actually be able to say that.

I used Kotlin, Golang, and Ruby for multiple years. Not a huge sample of languages, but I still like ES6 JavaScript the best.

I did Objective C for about 1-2 years, and a bit of java over the years. But no, not really.

Fair enough. Thank you for taking the time to answer a question that, in retrospect, sounds antagonistic (apologies for that).

Nah, I didn't take it that way

Also, it's used with high levels of added distortion.

I agree that there's nothing inherent that makes it produce messy code, but it also doesn't really promote good code either.

There's half a dozen different syntaxes for importing/exporting functions, creating functions, creating objects, etc., and it's all fair game. Then there's the asynchronous nature, so sometimes it's async/await, and sometimes it's promises, and sometimes it's callbacks, sometimes all happening in the same function.
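To illustrate just the function-creation part, here are four spellings of the same two-argument function, all of which you'll meet in the wild:

```javascript
// Four ways to create the same function, often in the same file:
function add1(a, b) { return a + b; }             // declaration (hoisted)
const add2 = function (a, b) { return a + b; };   // function expression
const add3 = (a, b) => a + b;                     // arrow function
const add4 = new Function('a', 'b', 'return a + b'); // runtime eval (avoid)

console.log([add1, add2, add3, add4].every((f) => f(2, 3) === 5)); // true
```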

And then on top of that there's all the front-end web frameworks which are like, totally un-opinionated, man, except for the parts where they're implicitly super opinionated.

I have made a few Wordpress sites for customers, which required me to write JavaScript and PHP code as I learned them, for quick features and fixes. In other words, I had to write JavaScript and PHP code before I even considered myself a programmer; I basically copy-pasted and edited until it worked.

I imagine that accounts for a lot of bad JavaScript and PHP code. By the time I wrote any C++ or Python professionally, I had a CS degree and a decade of programming experience.

What killed JavaScript for me (and I think many others) is that there is no idiomatic way of writing JavaScript. There is no "JavaScript-y" way of doing things (unless you count callback hell).

I'm not saying that a language should only offer one approach to a problem, but there should be general guardrails and guidelines. A shared vocabulary. None of that exists in modern JS

The same criticism (which is a valid one) is doubly true of Lisp, but Lisp doesn't take half the flak that JavaScript does.

"There are only two kinds of languages: the ones people complain about and the ones nobody uses."

- Bjarne Stroustrup

That's because there are three or four orders of magnitude more JS code out in the wild than there is Lisp code.

The problem is that the original paradigms were pretty terrible but JS has to keep (as much as reasonably possible) backwards compatibility to not break the web. If you forked JS and made changes to clean it up that weren't backwards compatible, you'd have a hard time convincing browsers to take it, making the whole exercise sort of moot.

The fact that we have things like the Wayback Machine is a testament to how this approach, while terrible for developers starting out or working in a legacy codebase, has enabled us to preserve what is now a significant portion of contemporary cultural history. AFAIK there is no similar archive of programs for, say, Java or C.

"Standard" [1] and "Airbnb" [2] seem to capture some amount of sensible guidelines. Adding Prettier [3] gets you something akin to gofmt, and adopting standard eslint rules [4] can help too.

[1] https://standardjs.com/ [2] https://github.com/airbnb/javascript [3] https://prettier.io/ [4] https://eslint.org/

You're talking about style, I'm talking about idioms. Those aren't the same thing!

I think idioms and style have some level of overlap.

Semicolons could be considered idiomatic or stylistic, for example. Requiring strict equality operators feels more of an idiomatic way to avoid bugs, rather than simply whitespace. Eslint certainly has a number of rules that are more idiomatic than stylistic.
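The strict-equality rule is a good example, since it's about correctness rather than whitespace. A few of the coercion cases that eslint's eqeqeq rule exists to protect you from:

```javascript
// Loose equality (==) coerces operands before comparing; strict
// equality (===) does not. The coercion rules are where bugs hide:
console.log(0 == '');            // true  -- '' coerces to 0
console.log(0 == '0');           // true  -- '0' coerces to 0
console.log('' == '0');          // false -- loose equality isn't transitive
console.log(null == undefined);  // true

console.log(0 === '');           // false
console.log(null === undefined); // false
```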

If you're touching on the lack of consistency in, say, open source javascript, I agree, code is all over the place. Also, I think you're looking at a language that has had dramatically more evolution (especially in recent time) than most other popularly used languages. Old code without promises and suffering from callback hell feels archaic.

However, I think if you look at, say, TypeScript repos (which is cheating, I know, it's not javascript), there is more homogeneity, at least from what I've seen.

>> language that has had dramatically more evolution

Of course it evolved. It’s easy to evolve if it was literally nothing in the first place.

The only thing JavaScript was ever even barely good for was that which it was designed to do, be the programmable interface for a browser. Why anyone would want to do anything else with it simply boggles my mind.

It's like if someone took the BASIC my calculator offered back in the day and turned it into a general purpose programming language.

But JavaScript's got asynchronous!

A race to the "promised async hell".

I wonder how many devs are struggling to debug and fix bad code written using Promise and async features. With care you can use these features safely and productively, but it is also very easy to make subtle mistakes that cause intermittent faults or strange corner case errors.
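One of the most common subtle mistakes is mixing an async callback with forEach, which silently drops the returned promises. A small sketch (saveAll and brokenSaveAll are made-up names):

```javascript
// A classic subtle mistake: Array.prototype.forEach ignores the
// promises returned by an async callback, so nothing is awaited and
// rejections go unhandled.
async function brokenSaveAll(items, save) {
  items.forEach(async (item) => {
    await save(item); // fire-and-forget: may finish after we return
  });
  // resolves before any save has necessarily completed
}

// The fix: collect the promises and wait on all of them.
async function saveAll(items, save) {
  await Promise.all(items.map((item) => save(item)));
}
```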

The comment is a reference to the phrase "It's got electrolytes!" from the movie Idiocracy.

So was Windows 95.

For me, R probably has a worse rate of churn than javascript in the case of breaking changes. Javascript is full of hyped frameworks, but R is the only language I've used where a minor update to a package caused an entire project to break (because the package author had changed its api)

> there is no language that I've seen that has more hair pulling, braindead I just learned to program yesterday type programming nearly as much as I've seen in Javascript.

I would argue that anything for phones is similarly braindead.

While the languages are okay, the APIs, UI idioms, and general ecosystem seem to have a half-life of about 6 months. If you step away from programming phones for 18 months, it's like you're learning everything from scratch.

(Side note: All right, Groovy should be taken out and shot and the idiot who foisted it on the Android build system needs to be taken to the woodshed and beaten severely.)

> Groovy should be taken out and shot and the idiot who foisted it on the Android build system

I think your beef there should really be with Gradle, or at least, how it is using Groovy. The way Groovy is implemented as a DSL for Gradle is, in my view, toxic. Almost everything is implicit behavior, crucial aspects rely on totally incidental "magical" features that just make things happen when you invoke some cryptic incantation that has an unobvious connection to its context.

Groovy itself is pretty good and IMHO much better than Java for most purposes.

Idk - there is definitely a difference between learning about React and a completely different way to do programming.

If you were any good 12 months ago - you will pick up any of the trends quite easily in a week or so (partially because you have also predicted the future).

It's such a shame! JS could be a concise little language with a lot of power (in terms of being able to ignore types until they matter), but instead you end up transpiling unusable garbage with 10MB string manipulation libraries that release breaking changes as patch versions. Python is what JS is supposed to be, although I do feel like Bart Simpson in front of the blackboard every time I write `lambda x:`.

Being a small language would be irrelevant to JS. The web could benefit from a sound type system built into browsers (i.e., TypeScript in strict mode), but it's not like anybody complains about Python lacking a type system, right?

What the web really needs is a way to write highly modular apps and let the browser download only the relevant modules, with caching, prefetching and so on. None of the attempts have worked so far. If you look at any app today, there's a main module with static deps on other modules that depend on other modules, and so on. When everything is linked together, we get a 50MB binary. That's not a big deal for native apps or services, but it is a big deal for web apps that have to download all this mess and maybe actually use 20% of it.

There is no convenient way today to write a web app that pulls in only the necessary modules at runtime. It's not just that the language is missing some keywords; the entire web world is designed around static linking: we'd need to redo existing modules and frameworks, upgrade to HTTP/2 to batch requests for lots of small modules, and so on. That's a lot of work that needs a lot of expensive devs, and the returns on that work are nonexistent: there's no profit to be made there. That's why it'll stay this way: bloated web apps with thousands of unused deps.

> anybody complains about python lacking the type system.

Well, Python does have strong typing. It's just dynamic, not static, and I've seen enough criticism about it to say that it isn't as unimportant as you make it out to be. JavaScript gets a lot of criticism for having dynamic and weak typing. And I think both forms of criticism are valid, though not everybody agrees, which is fine too.

What makes webapps bloated is the require-at-the-top pattern. Type system makes no difference here. What most devs don't realize is that if their precious app written in holy C plus plus or Java got compiled into a standalone binary that could run on any device inside a browser, it would be a 150 MB mess that wouldn't even boot. All these nice frameworks, libs and utils would need to be downloaded before the app can start. We'll see this soon with wasm, btw.
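For what it's worth, the language primitive for the alternative does exist: dynamic import() defers loading to the call site, and bundlers can split it into a separately downloadable chunk. A sketch, using Node's built-in 'path' module as a stand-in for a heavy dependency:

```javascript
// Static linking: the dependency is parsed and loaded at startup,
// whether or not this code path ever runs.
// const path = require('path');

// Dynamic loading: import() returns a promise and fetches the module
// only when the call actually executes. 'path' stands in here for a
// heavy dependency you'd rather not ship in the main bundle.
async function prettyName(filePath) {
  const path = await import('path');   // loaded on first use only
  return path.basename(filePath, path.extname(filePath));
}
```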

You may hate Make and it is certainly far from ideal (tab-based syntax was a mistake) but the thing is, Make has very simple rules that haven't changed for decades and Makefiles generally keep working (and when they break it isn't the fault of Make itself but of something else that isn't as stable/backwards compatible as Make).

Common make fail #1

The project you landed on has been using some souped-up supermake with a ton of features (which were used with wild abandon) and some subtly different semantics for (say) macro expansion and variable assignment (and these are really difficult to determine the dependency scope of). The build system has grown to a recursively gnarly Turing-complete mess.

So when you go to update the tooling, you find out that this supermake is (a) a commercial project, (b) the company has gone out of business, or adopted a predatory per-seat licensing scheme, or no longer supports your platform, or has changed the product so significantly that you might as well port everything to a different version of make anyway.

Never happened to me.

Common make fail #2

I was regularly dealing with bugs in multi-million-line makefiles. Make, you see, has no debugger; you can't step through things to find out what's going on. To debug an issue you insert print (echo) statements and look at the output, and/or gronk through many, many lines of make's internal structures. Why didn't file A compile, and why did file B compile twice? It could take a long time to figure out.

Common make fail #3

"Builds are too slow, we need to speed them up."

So you go parallel. Hey, this is easy. Except for those libraries there (they need to be done first). And those headers are generated, so those need to come first, but first first. And everything depends on these libraries getting built, so need to wait for the linker here, here and here. Pretty soon you have a dependency hairball that nobody understands and that people are afraid to change because things always break. Eventually you get to a point where "yeah, building twice usually seems to fix that issue..." and that's when you know the project has grown too damned big for its shell again.

Common make fail #4

You die a little each time that you get screwed by tab-versus-space. And cross-platform . . . umm, what year is it? Then someone smugly mentions some XML-based tool to you, and you snap because that tool is worse in every single measurable dimension and a few more besides but the developer in question doesn't question things because the tool uses Holy XML and Blessed Java (well, some version of Java, anyway, you can probably still download it from somewhere) and who cares if it's just the same set of problems with a different color paint slapped on top and the usual Reddit community dedicated to the elimination of all heretical views. Support? Who needs that, just read the source.

Seriously, I haven't seen any build tools in the past 30 years that have done a great job and that were a joy to use. CMake, Ant, Gradle, make-of-the-month-club, whizzy vendor-locked tools, bespoke in-house constructions, they have all been terrible. I see little hope, entropy has won this one.

Common make fail #0

Not starting by reading the manual, and then writing an idiomatic makefile.

Once you understand how to use auto-generated .d files, you'll never have a multi-million-line makefile.

Once you understand the built in lex and yacc rules, you can generalize them to whatever code generator you are using.

The remaining thing (not spelled out in the manual) is to avoid recursive sub-make and to use the include directive instead. "Recursive Make Considered Harmful" is a helpful paper, but it should have had the subtitle "a practical alternative", because people cite the title without reading the whole paper.
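For anyone curious, the .d-file idiom looks roughly like this (a sketch assuming GCC/Clang; recipes must be indented with tabs):

```make
# The compiler emits a .d dependency fragment per object as a side
# effect of compilation, and make includes them, so header
# dependencies are never written (or forgotten) by hand.
SRCS := $(wildcard src/*.c)
OBJS := $(SRCS:.c=.o)
DEPS := $(OBJS:.o=.d)

CFLAGS += -MMD -MP        # write foo.d next to foo.o

app: $(OBJS)
	$(CC) $(LDFLAGS) -o $@ $^

# One non-recursive include instead of recursive sub-make invocations.
# The leading '-' keeps the first build (no .d files yet) from failing.
-include $(DEPS)

clean:
	$(RM) app $(OBJS) $(DEPS)
```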

>I was regularly dealing with bugs in multi-million-line makefiles.

I've personally only used make for a few projects, and might not be "in the know" about how to best use make, but isn't a multi-million (!?) line makefile a sign you should probably break it into pieces, or use a different solution?

Multi-million just seems insane to me

It sounds like something that was auto-generated by a brain-dead script at one time, and then later hand-modified to 'get things to work', which meant there eventually was no way to regenerate it from scratch.

Personal feeling is you should have a way of regenerating makefiles.

Sorry, I do not understand - is this post a joke? I mean it literally, because of the "never happened to me": the multi-million-line makefile doesn't sound real, but then you write things that someone who genuinely has issues with makefiles would say (build order, though common, isn't make's fault - any more than it's a programming language's fault that a program has bugs). So I'm not sure how to take this comment.

The multi-million line makefile is real. I wish it hadn't been. That was a terrible time.

I guess my point is that make doesn't scale, and unavoidably gets messy over time as each project solves the same problems over and over in different, buggy ways.

make solves problems you get early in a project ("let's get this handful of files automatically built") and doesn't even try to address issues of modern software development, like build parallelism and dependency management. Its facilities for debugging problems are laughable.

> make solves problems you get early in a project ("let's get this handful of files automatically built") and doesn't even try to address issues of modern software development, like build parallelism and dependency management. Its facilities for debugging problems are laughable.

Go look up plan9's mk. It's a remake of the program written by the designers a few decades later with everything they learned from it. Not only are the variables idiomatic, and the spaces problem fixed, but you can ask for output at each stage and (IIRC) view the graph that it builds.

> The multi-million line makefile is real

Of course you quickly get to million-line Makefiles in bigger projects. In C++, if you depend on stuff like Boost, a single file can easily depend on thousands of header files, and every single one must be tracked by Make.

But of course, nobody in his right mind would write these dependencies by hand. At my dayjob, the core Makefile is 700 lines. These 700 lines do cross-compilation with gcc/clang for various platforms, unit testing, valgrind, cppcheck, clang-tidy, coverage, and several other things. It is completely non-recursive, runs in parallel without problems, tracks all dependencies, and also works on Windows.

Of course this stuff is complicated. Make is very low-level compared to a build-generator like CMake. But if you want a bespoke build system for your project which lets you control every last detail of your build, it is still a good choice.

Do you know any resources on how to use Make in a sensible/sane way?

As far as rants go, this one was a joy to read. Then I wept for the state of affairs in build tools. I'm not even close to familiar with so many as you've listed, but I never experienced one that didn't eventually turn into a Rube Goldberg contraption (if it didn't already start out that way).

> grown to a recursively gnarly Turing-complete mess

> gronk through many, many lines of make

Thank you for my new word of the day. It describes exactly how I feel sometimes, dealing with technical debt accumulated over years, Frankenstein's systems where parts are always churning - often for no good reason at all, new but worse in most aspects - it can be exhausting gronking through the layers of complexity.

Once upon a time, .net code would just build. You had a nice list of project dependencies and references, and you could throw it all into a nice MSI installer that anyone off the street could use without being told how. It would plop debug builds in one directory and release builds in another.

Then Microsoft got rid of it because InstallShield gave them money. Now you need the super-extra-professional edition of Visual Studio to have anything but the most basic installer, and you need to hire a guy to write InstallShield scripts while he's taking a break from contemplating just how tough it would be to get by on pogey alone, and God forbid you need to change anything because the guy is on a different floor because he's not a real developer, etc. etc.

> Seriously, I haven't seen any build tools in the past 30 years that have done a great job and that were a joy to use. CMake, Ant, Gradle, make-of-the-month-club, whizzy vendor-locked tools, bespoke in-house constructions, they have all been terrible. I see little hope, entropy has won this one.

Have a look at redo[1].


Well, your make failure #1 is about not using make, isn't it?

Also, #3 is what make excels at. I have yet to see another tool that is a better fit for that kind of thing.

At the end of the day, I don't like make. It's for reason #5: there's no project introspection; you can't group tasks into high-level values, update your graph on the fly, or add anything to your target except at the source level. I see this one didn't make it into your list. This is one of the reasons those multi-million-line makefiles exist.

I'd rather deal with a small shell script, or a larger Python one if it can't be small. Yet, autotools are great for system programming.

It's true that you can screw up the use of make in multiple ways, but that is true of any programming system. I find make to be a useful tool.

The most common mistake, by far, is trying to use make recursively (where make calls other makes down a directory hierarchy). That's typically where "builds are too slow" comes from, because you end up with a hairball of complex workarounds, since no one make invocation actually has the correct dependency data.

I just create a single makefile and treat it like any other program: Add carefully, ensure that the text is clear, etc.

It feels like most every build system is either super over engineered or just arcane. I like that Go keeps its build system integrated with the language and that its build system is overall pretty simple and easy to grok.

Go's build system is pragmatic and simple and hasn't let me down yet, but I'm still working on projects at fairly small scale, and the Go build system doesn't do much for other languages, so I didn't mention it.

I think that we're going to see more languages with built-in build systems from early on in the language's development, but that is a double-edged sword (works great for your version of WHIZBOL, doesn't work great if you need to interop with other things).

Make is also very easy to grok, though; at its core it's just a series of "I want this file, it depends on these files, use these shell commands to produce it".
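That whole mental model fits in a few lines (a sketch; recipes must be indented with tabs):

```make
# target: prerequisites
#	recipe (shell commands) that produces the target
hello: hello.o util.o
	$(CC) -o hello hello.o util.o

hello.o: hello.c util.h
	$(CC) -c hello.c

util.o: util.c util.h
	$(CC) -c util.c
```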

I’ve seen a lot more monstrosities created with Make, but if it’s controlled then yes, it’s not too complex.

Honestly, more than anything, this reminds me of the time Uber flounced off PostgreSQL because its performance sucked when used incorrectly.

When you (the general "you", please) use a tool wrong, you can look like a bit of a fool for blaming the tool.

Make is the worst build system... except for all the others.

> multi-million-line makefiles

Is this a copypasta?


Rake, the Ruby version of Make, is rock-solid, dependable, and crazy-powerful. Don't want a Turing-complete language in your build automation? Don't use it, the rule syntax works just fine. But it's there if you want it, and oh god is it great.

> Rake, the Ruby version of Make

That's a common misconception, but people often don't understand what make is. Contrary to intuition, make is _not_ a build system (like, say, CMake), although build systems can be (and often are) implemented on top of make; nor is it a task runner (like Rake).

At its core, make is a declarative expert system (with a very sane design and a very quirky rule syntax). Its area of expertise is updating files. I urge every user of make who hasn't read "Recursive Make Considered Harmful"[1] to do so.

[1]: http://aegis.sourceforge.net/auug97.pdf

Yup, and a common modern use case of make is to call into other language-specific build systems.

There is something extremely weird about the idea of having a build system for a particular language. It is not uncommon for a project to contain multiple languages, and it is also possible that some of the source code in one language is generated by an executable written in another. Because of this, and because of compatibility issues, the correct number of build systems to exist is one. If there is more than one in a single project, sometimes more will be rebuilt than necessary.

Yes I wouldn't ever want more than one build system. I use Rake to build Crystal, and haven't even bothered to investigate Crystal's offering.

Rake isn't for Ruby any more than Make is for C. It comes from that ecosystem, but you can use it for anything. If you've got a system ruby installed, you've probably also got rake.

Exactly. Remember in 2010 before grunt a lot of us were using maven to build single page JavaScript apps? That was pretty funny in retrospect :-) The creator of angular created the precursor to the karma test runner, called jstestdriver, which was a maven plugin ;-) It worked though! We built a single page app with tdd using it at Autodesk.

You don't have to throw the baby out with the bath water to have sanity. Clojure targets both the JVM and Js runtimes, and you can leverage all the existing ecosystem while using a much better language and tooling. You can even use VS Code for Clojure development, it works great. Here's what this all looks like in practice


Having started with the MS stack way back in the old days of their Hegemony, I always felt like make was doing something that I shouldn't have to think about.

I'm curious about VSCode - why that, and not Visual Studio or another full-featured IDE? I've gone from IDE to vim and back, and I have to admit with great sadness that the various little gimmicks an IDE provides (without hand-configuring the damn thing!) make it worth it, particularly since they all allow for vim keybindings to one degree or another.

I have developed C++ with vim and Visual Studio for a couple of decades. I recently made the switch to VSCode from vim and I like it even better than Studio. It's snappier, and it integrates with my vim workflow well, so I don't have to pick one or the other.

I don't see the benefit of switching from Sublime to VSCode. What am I missing? Does VSCode offer something different?

Better git and debugging integration, mainly. I use VSCode with the Sublime Text extension[1] and honestly it feels almost exactly like sublime, but with better git and debugging integration.

[1] https://marketplace.visualstudio.com/items?itemName=ms-vscod...

I'm not a sublime user, so I can't say why, but a couple of JS devs in my office switched from sublime to VS Code. I'm a little surprised by this to be honest, and I haven't interrogated them too thoroughly about why, since I'm not especially interested in either; you might want to give it a try, however.

> don't just rip on random scripting languages that have been around for 25 years

Where do you see that?

My point being: I imagine you could avoid this just fine with javascript. I don’t see this as a technical issue so much as a cultural one—we gave up on vetting our dep trees.

>I personally hate Make, it's burned me too many times.

I’d be interested in hearing about the problems Make has caused you— Whenever I’ve needed to write or edit a makefile, it’s been a pain looking up the relevant details in the manual, but nothing as severe as what you seem to be implying. Have I just been lucky, working with less complicated projects, or something else?

> I’d be interested in hearing about the problems Make has caused you

The main problem that kept burning me was that I'd have a "dirty" build when I thought I had a clean one or an incorrect one when I thought I had a correct one. It was super hard to get the dependency graph correct because developers were creating files left and right but would only update the Makefile just enough to get it building (and forget to add new header files to the list of headers, etc) so I had little confidence in incremental builds and was always building from the ground up.

CMake's generated build systems always have correct dependency graphs (i.e. it will only rebuild the bare minimum when you touch a given file) and I've always had complete confidence that my build is clean (since I just created a fresh directory) and that my incremental builds are correct.

> and forget to add new header files to the list of headers

This is why gcc/clang have the option to generate Makefiles for the header dependencies for you. Maintaining these yourself is a hopeless effort.
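
With GNU Make and gcc or clang, that auto-generation pattern looks roughly like this (a sketch with made-up file names; `-MMD -MP` ask the compiler to write a `.d` dependency fragment next to each object file):

```make
# Sketch: let the compiler track header dependencies instead of
# maintaining a header list by hand.
SRC := $(wildcard *.c)
OBJ := $(SRC:.c=.o)
DEP := $(OBJ:.o=.d)

CFLAGS += -MMD -MP        # emit foo.d alongside foo.o

prog: $(OBJ)
	$(CC) $(LDFLAGS) -o $@ $(OBJ)

# Pull in the generated rules; the leading '-' keeps a fresh
# build from failing before any .d files exist.
-include $(DEP)
```

With this in place, adding a new `#include` to a source file is enough for the next `make` to rebuild the right objects; nobody has to remember to update the Makefile.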

That makes a lot of sense. Setting up automatic dependency detection is what finally made Make click for me, and it’s not the most straightforward process to get right. How does CMake detect the files that a given product depends on; does it have language-specific dependency detection or something else?

> How does CMake detect the files that a given product depends on; does it have language-specific dependency detection or something else?

Well actually, CMake is a build system generator - and the default build system it generates is Unix Makefiles. So I assume CMake generates Makefiles with the appropriate automatic dependency detection baked in, I'm not sure.

The important part was that it eliminated human error on our team and we haven't had dependency graph problems since.

CMake has its own dependency scanner for C/C++. Look for a file 'depend.make' and you'll see the header dependencies that CMake has detected, minus the headers that CMake thinks are system headers.

Sometimes you can make mistakes which delete files, but you should keep development files backed up in some capacity. Still, spot the bug here:

    AS_SRC = main.S util.s
    OBJ = $(AS_SRC:.S=.o)
    clean:
        rm -f $(OBJ)

The source files with a lower-case extension aren’t getting their file names translated and are passed straight through into OBJ unchanged.

Granted, knowing that there’s an error here helped greatly in spotting it; there’s a good chance I would have run the incorrect makefile at least once before investigating.

Whatever the bug, git reset is the solution.

Not the original parent, but the problem I've noticed most recently with make is when people use it to orchestrate building/running docker containers. I've seen several cases where containers will be pulled/rebuilt on every invocation (even if they don't need to be) because there's no file based dependency to indicate they're actually up to date. While docker's layer caching helps with this, it still can add a fair bit of time in some cases.

I've worked around that in the past by creating empty "target files" that are created when a given target is built and which dependencies rely on, but it's not a perfect solution.
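
For what it's worth, that stamp-file workaround tends to look something like this (image name and prerequisites are made up for illustration):

```make
# The empty stamp file records when the image was last built, giving
# make a file timestamp to compare against the Dockerfile and sources.
.image.stamp: Dockerfile $(wildcard src/*)
	docker build -t myapp:dev .
	touch $@

run: .image.stamp
	docker run --rm myapp:dev
```

It works, but the stamp can drift out of sync with docker's own state (e.g. if someone removes the image out of band), which is why it never feels like a perfect solution.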

I've looked around for a make alternative, but although I like rake, I don't really use Ruby these days, and so dealing with RVM/rbenv and whatnot is sort of a pain and not worth the effort over make.

> I personally hate Make, it's burned me too many times. Now I use CMake, and I haven't been burned in years.

Could you elaborate on how you've been burned by make? Because my experience is the exact opposite of yours. I shudder to remember some of the cmake I've had to deal with.

Hm, I'm not a fan of CMake. I like how simple and clean Make is. In my experience it works more often than CMake, and if not, I can usually debug it. CMake seems like an overengineered beast to me. But I probably haven't 'grokked' it yet.

I'm not a huge fan of CMake's syntax, but the thing that sold me (i.e. the "killer feature") was how easy it was to setup out-of-source builds[1].

The ability to have the entire build/build artifacts in a directory completely standalone from the source directories was a huge win.

Because for me, that meant I could have 3 separate build directories - one for my optimized ARM builds, one for my debug (-g -O0) ARM builds, and one for my x86 unit test/coverage builds.

If I modified one file, I didn't need to re-spin 3 clean builds - I just could hop into the one or more build directory I was interested in and incrementally update the builds. Plus, out-of-source builds make building clean as easy as `rm -rf build/` instead of hoping that `make clean` has all the right pattern matching and subdirectory listings to truly scrub the build artifacts.

[1] https://cgold.readthedocs.io/en/latest/tutorials/out-of-sour...
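
In case it's useful, the out-of-source workflow needs nothing special from the project itself. A hypothetical wrapper Makefile for two of those configurations might look like this (assuming CMake 3.13+ for the `-S`/`-B` flags; cross-compilation toolchain flags omitted):

```make
# Hypothetical convenience wrapper: one out-of-source CMake build tree
# per configuration, so "clean" is just deleting a directory.
arm-release:
	cmake -S . -B build/arm-release -DCMAKE_BUILD_TYPE=Release
	cmake --build build/arm-release

arm-debug:
	cmake -S . -B build/arm-debug -DCMAKE_BUILD_TYPE=Debug
	cmake --build build/arm-debug

clean:
	rm -rf build/

.PHONY: arm-release arm-debug clean
```

Touching one source file and re-running only the configuration you care about gives you the incremental-update behavior described above.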

VSCode (and Atom) feels like an Emacs with JavaScript instead of Lisp anyway.

>> The Churn is losing a day debugging because a transitive dependency changed a function signature. The Churn is spending a week just to get a project you wrote a year ago to even run. The Churn is rewriting your front-end because a shiny new thing came around.

If you think the churn is that then you're lucky. To me, the churn is working for the man. Everything I ever do is for the man.

If I could ever reach a point where I could say, fuck you, the man, well, that'd be the day, wouldn't it? That'd be The End Of The Churn that The Man invented.

There are many, many organizations and institutes that aren't the man. You don't have to work for a soulless corporation. You can do work that helps the greater good of humanity.

The company I work for certainly does overall good, and treats its employees well, but it's still a larger for-profit company, and the work I do isn't exactly my choice. I figure working for anyone other than myself will always have this air of being for "the man".

It seems like you independently came to the conclusion of Naval's thesis.


Maybe you are looking for sovereignty, where you do things not because you "have" to but because you "choose" to.

The idea you're touching upon might be bigger than just your "day job": it's about how you spend your time and what decides that.

> You can do work that helps the greater good of humanity

There are many groups that say they do this work, but very few that actually do. Then you have to maneuver yourself internally until you are actually doing work that serves the greater good.

What is the greater good to begin with? The flexibility of human understanding and the breadth of types of companies with "good intentions" available means that the search for such a mythical group is something that can take a lifetime and still not yield any fruit.

The thing is, you always end up working for the man since they own part of your income.

Under the current version of American Corporatism, no, you cannot work for a company that isn't soulless. A Corporation is legally required to work only to increase the value of a share, nothing more.

Maybe you can work for a non-profit if you're lucky, but how many are there out there that pay market rate?

> A Corporation is legally required to work only to increase the value of a share, nothing more.

No. That's a misconception. Corporate directors are required to act in the interests of the shareholders, but they have a lot of discretion in determining what those interests are and how they are to be served.

Here's a reference from a decent law school: https://www.lawschool.cornell.edu/academics/clarke_business_...

That reference cites the Hobby Lobby case where they got to opt out of providing insurance for contraception and a Facebook video of an expert explaining how the current corporate system produces sociopathic entities (corporations). I don't believe either fundamentally debunks the idea that the agents of corporations are required to attempt to increase the value of a share.

I agree that there's still a lot of leeway in that: giving an employee a bonus might hurt shareholders directly in the short term, but you can claim it increases productivity and so is a good decision. You still have to justify all decisions in terms of value to the shareholders, though. If you (as CEO) decide to just stop all work and spend every day at Disneyland until the coffers are empty, you can be sure you'll lose that suit. And when every single decision has to be viewed through that lens, you aren't able to directly do real good for the world, just indirectly.

>A Corporation is legally required to work only to increase the value of a share, nothing more.

That's just the excuse the CEO's mouthpieces trot out when the company is doing stuff that is shitty and sociopathic. You'll notice the CEO has no problem getting the company to do things that enrich himself at the expense of the shareholders.

I'd say the common connotations of the word 'churn' better fit the patterns in the parent 'Advice...' article. What you're describing deserves another name, something more like 'grind' or 'yoke' or 'rat-race'.

> If I could ever reach a point where I could say, fuck you, the man, well, that'd be the day, wouldn't it? That'd be The End Of The Churn that The Man invented.

You probably already could if you wanted to. Ten years ago I quit my software engineering job and spent 2 years driving from Alaska to Argentina. Recently I quit again and spent 3 years driving around Africa.

When you spend less money, you have a lot more time than you think, and you don't have to work for the man.

I think this is a good place to point out that the "churn" could be a variety of things and really depends on the person "churning". The advice is still sound, though: take steps in your life to avoid dealing with the churn. The first step is identifying it; maybe the next is figuring out how to overcome it. Not always an easy thing to do, but I do believe everyone has that one thing holding them back.

The words churn and churning have now lost all of their meaning due to the fact they've been used too much man, you've overused them, c'mon!

But I do think pessimism is the correct state of mind, given how far the tentacles of the man have reached and how little we do to resist his influence.

Who or what is the man? Only a Morpheus can tell you such a thing. I'm not him.

I recently cut a technology out of my life that was causing a lot of churn, and I'm much happier as a result.

That technology was TeX, even though it's on the author's "good" list.

It's somehow got to a point where every time two people got together to work on a document, we had three different incompatible header files; package X doesn't work with version Y of package Z; what order you import packages in matters on my computer but not on a colleague's (still not sure why); the order of macro expansion and character class redefinition in some packages causes hard to track down bugs ...

80% of what I need to do, I can do in markdown and then run through one of many document-generating tools. Currently for some projects I'm using gitbook-cli and I'm both more productive and much happier. I even wonder whether I want to make a fork of gitbook-cli, it's something I'd trust myself to be able to contribute to.

For the other 20%, I use Word. Yes, it's a WYSIWYG interface, but for short documents (max 20 pages) where layout and design are important to get right, I've found I can save time and frustration compared to TeX.

Editor-wise, I use vim or VSCode depending on the task. I've tried Emacs; we two are not really compatible.

Have you tried ShareLaTeX/Overleaf? I have had better luck maintaining header compatibility with it.



Accept the churn or be left behind, imo.

The best thing you can do to insulate yourself from the pain described in the post (trying to remember how code worked, etc) is to brain dump important things you learn along the way. I've recently begun keeping a TiddlyWiki[0] for every major project I undertake. In it, I keep unexpected things I learned, cheat sheet items, command-line snippets, and longer form entries about structure.

The best part is it was all written by me, so the communication barrier is as low as it can possibly be. Reading one of these TWs allows me to pick up a project again extremely quickly. It's also useful on large projects where different areas are like their own projects unto themselves.

Tools, not closing yourself off, help you overcome your limitations.

0. https://tiddlywiki.com/

I get what you are saying, and on a certain practical level you are right, you can't be that guy who never moved past COBOL...but I think it needs to be super clear that you can accept the churn (as necessary) but you sure as hell don't embrace it. Embracing it, IMO, is how JavaScript dev got to be what it is. Broadly speaking, a lack of nuance and temperance is the great blight on development.

The more experience I gain, the more dubious I grow of the "being left behind" concept. It's rookie bait. With experience, you instinctively know when it's time to move on from a technology (it no longer serves its purpose well or at all) vs. jumping on the hype train to get some breeze in your hair.

There's a difference between jumping on the hype train and being stubbornly stagnant. You will definitely be left behind if you refuse to adapt to new "industry standard" tech, even if all of that new tech is 90% old concepts with slight changes, repackaged.

That's not to say you should jump on every new half-baked hackathon-originating javascript framework. But it does mean you should have your finger on the general area of the pulse of where the industry is moving, and start learning that tech. For example, pick up Rust for systems programming, instead of C.

I know a web developer who refuses to learn or use any web tech besides PHP. Finds cloud-based hosting confusing as well. "I can do anything I need in PHP." Everything, except find a good-paying job that isn't maintaining gnarly legacy codebases. Doing PHP is his bread and butter, and unless he decides to learn some of the newer (stable) webdev tech, he's going to find himself with no marketable skills in the future.

Agreed. And I'm a rubyist!

I think ruby has actually gotten a LOT better at minimizing the churn (both core/stdlib, and the ecosystem, specifically including Rails itself), as a result of people learning from the experience. The ecosystem still doesn't prioritize it as much as I'd like.

I also think this points out the benefit of sticking with a platform/ecosystem for a while. Many people didn't realize the danger of backwards-incompat churn until they saw it through experience over years.

You notice it when you work with the same code for a while -- if you are always abandoning a codebase and coming to a new one, you abandon it before it gets painful, and either have a new one that isn't yet painful or a legacy one where you can blame the pain on your predecessors making bad decisions.

If you are always abandoning a platform for a new one, the dangers of 'churn' aren't apparent in a less mature platform -- you never pay the price at version 1.0, only after it's been around a while. But they will be, if it lasts long enough and you stick with it long enough. If, when it starts hurting, you abandon it for some other new thing, thinking the other new thing will be better and not realizing it's better in that respect only because it's newer... you never learn.

Agreed. Working on my own codebase for over 4 years taught me more than anything else about how to design and write decent code. It's easy to blame someone else for crap design/code, but when it's your own, it forces you to think about how it could be done better.

Unfortunately doing this tends to lead to being unemployable because too many people associate being immersed in churn as "productivity." This spreads philosophically and suddenly people doing the hiring want these kinds of people who can "solve" these kinds of "problems."

At the same time, finding something that works and sticking with it perpetually sounds wonderful - perfect, even. But breaking things and doing hard things also leads to innovation and new ideas.

So ultimately I think there needs to be a balance and everyone should embrace a little churn while eschewing it in the broad form.

“Unfortunately doing this tends to lead to being unemployable because too many people associate being immersed in churn as "productivity." This spreads philosophically and suddenly people doing the hiring want these kinds of people who can "solve" these kinds of "problems."”

Totally agree. Sticking to things that work can hurt your job prospects seriously. You are quickly an “outdated dinosaur”.

I've been happily doing just this for a decade now, and haven't had any problems finding employment. People tend to focus on the absolute size of a community, but what really matters is the supply and demand ratio.

Sure, a mainstream language like Js will have a lot more jobs than Clojure, but it also has a huge pool of developers making it hard for any individual developer to stand out. Meanwhile, Clojure has a much smaller market, but a growing one meaning that the demand currently outpaces supply of Clojure developers making it much easier to actually get an interview. Another side effect of this is that companies tend to be a lot more open to remote work.

The best way to stay employed is to pick something new. Start writing Medium posts about how it's so much better than what they are using. Get hired. Once the problems start mounting, repeat the process with a new language.

Thinking you can label the churn as something that's possible to avoid with a single upfront choice about dependencies is not realistic enough to be useful. Churn is not just about the dependencies, but also about the project ...

Projects which are updated often have a forcing function on them: they need to become updateable. That can mean tests, that can mean reasonable build tools, and that can mean dropping individual dependencies that are too painful to update.

Obviously some ecosystems make all these things easier than others — typed languages with reasonably consistent build processes help more than anything. Good codebases that isolate and wrap usage of most dependencies come next, and reasonable automated tests are probably the next most useful.

It is definitely the case that some codebases are easier to update than others, even given the same ecosystem/dependencies. When is the pain of "churn" most reasonably considered my fault instead of yours? Those are the situations that can be fixed, and they can get you pretty far — much further than "ahh, this codebase has javascript and webpack, I shall rewrite with make and closure" can get you on average ...

Malleability is a very important concept, but this article is not about it.

This is not about your code not breaking while you change it. It's about things breaking when you change nothing. I've learned the lesson long ago by using PHP software, some ecosystems just break much more often than others.

“Examples: Clojure, Common Lisp, HTML, Make

Counter-examples: JavaScript, Ruby, React-Preact-Vue-Angular-“

OP does not give any reason as to why these are churn or why we should stop using them. Basically, by OP's logic, we should just never adopt new technology because it's the new shiny thing. This makes no sense at all.

He didn't elaborate, but I agree with him. I've seen too many examples of junior devs tasked with some small feature or enhancement whose idea becomes rewriting the entire code base in whatever the new framework of the month happens to be. Yes, there are plenty of reasons to adopt new technology, but you have to balance the pros and cons with keeping your business running and, most importantly, not disrupting your customers.

At the end of the day, honestly, how many frameworks do you need to build webpages? And 95%+ certainly are not operating at a Google or Amazon scale.

On a side note, I recently saw a dev spend a few days wrestling with dependency issues — he was trying to wire Spring into a Java application and was running into configuration issues with annotations. At the end of the day, this was a tiny, tiny application that periodically ingests messages from a topic and forwards their payload to an email service. These are 5-10 messages a day, and the emails that are generated go to internal business customers as a courtesy, not as mission-critical notifications. It's worth thinking about: what was the business value vs. the cost?

The takeaway isn't that you should never use new technology; rather, understand that new technology, or change in general, comes at a cost. Your job is to balance that cost considering several factors: code quality (cost of time spent), flexibility and extensibility, features (the stuff valuable to whatever function is responsible for your paycheck), and cost of ongoing support and maintenance (operational load). Disregarding those factors and over-indexing on the latest and greatest is usually the wrong approach.

Agree with everything you said... but remember, all the issues you listed are not unique to junior devs, nor to any particular technology. The same debugging issues were being worked on in C and C++. Your observation is correct, but it applies not only to all technology but to life in general.

I'd put JavaScript in the "Examples" section as, like HTML, any change it gets is fully backwards compatible so you do not have to play catch-up with the latest stuff just to keep your existing code working.

I do not know enough about Ruby or about React-Preact-Vue-Angular to judge, though based on what I've heard, the latter pile does sound like it's made up of things that expect you to waste your time ensuring your software keeps working.

Plain JS, yes. The modern JS ecosystem, however, has an enormous amount of churn. The hot way to do things in JS world five years ago is hopelessly obsolete now.

Well, the list says "JavaScript", so I took it as plain JavaScript. If you care about avoiding churn, you can stick with stuff that has proven to remain backwards compatible, and plain JavaScript fits the bill.

It fits the bill for compatibility, but building anything useful in plain JavaScript is a jaunt through all seven layers of hell.

Clojure is a really interesting example because it's newer than things in the "churn" list. It's an existence proof that "newness" and "churniness" don't go hand in hand. What causes "churn" isn't the tech, it's how the ecosystem around it works.

Another name we often use is "Business-As-Usual" (BAU) or "Technical Debt". Although some business-as-usual and tech-debt work is a necessary evil and, in some cases, a good thing, I have also wasted many hours on tasks that create little to no value.

Another way to spot "The Churn": When you're done, what new value did your effort create or unlock? Are users better off? Is the system more resilient and reliable? Or did you just get it back to the way it was working before everything went sideways?

Find more ways to create and unlock value and you'll find you move forward much faster.

> These are the olives of technology. Olives aren’t candy, and tasting them the first time isn’t always pleasant, but soon you will develop a taste for them. They have been here since ancient times, and they will remain for centuries to come. They are good for you, solid, reliable, nutritious. Eat less candy and more olives.

For what it's worth, olives are basically inedible from the tree and require a considerable amount of processing before they are actually edible.

Churn can be all of the things the post describes, and I think it's good to be skeptical of churn--always looking for ways to avoid it.

But it's also hard to tell the difference between churn and maintenance, and I think one of our modern world's blind-spots is a de-prioritization of maintenance.

I'm sorry -- you lost me at UNIX, LISP, The Web, Emacs, TeX. There is a reason UNIX and Linux are so fragmented: they were the things that were churning for so long! In some respects, Linux continues to churn (I'm looking at you, Linus Torvalds).

What are you referring to? Linux never intentionally breaks userspace code.

Linux as narrowly defined as the kernel is very good at not breaking things. Linux as used in common speech to mean a Linux distribution and associated libraries undergoes constant churn.

I maintain a cross platform desktop app for Windows, macOS and Linux.

Windows is the best: 32-bit versions going back 15 years still work, no issues. macOS is next: 32-bit versions no longer work, but 64-bit versions going back 5+ years still work. Ubuntu is by far the worst: some library I depend on changes its API pretty much every year, and the old version is removed, breaking my app.

The solution appears to be Flatpak, which bundles up the app with all its required libraries. However, I'm not sure how to make this work for plugins. Would each plugin need to be in its own Flatpak? It's insane.

What, do you own your own time-sharing mainframe?

Clojure's reduced churn is a consequence of sitting at a local maximum in programming language design. If it were not a local maximum, there would be churn as we search for it.

UI has no such minority consensus on a local maximum and is very much not solved, which is why React-Preact-Vue-Angular is churning while the humanity hivemind iterates towards a solution.

Here is the relevant Rich Hickey quote: http://www.dustingetz.com/:rich-hickey-web-frameworks/

Relevant (arguably overblown) talk by Jonathan Blow: https://www.youtube.com/watch?v=pW-SOdj4Kkk

Thank you for sharing, I really enjoyed this talk. I somehow missed it when it happened.

No problem, I loved it too.

> The Churn is spending a week just to get a project you wrote a year ago to even run.

How much effort is it worth to avoid spending a week every year?

Sure, the weeks can add up, but so does the time spent on low-level abstractions, or on refusing to adopt better tools when the whole environment is changing around you (e.g. there is no mention of native mobile environments — would it be churn to use Swift instead of non-ARC Objective-C?)

I find Swift very bad for churn. Swift has had 5 versions in 5 years, with breaking API changes each time.

Every question on Stack Overflow about Swift has several answers, one for each API version. Any time you grab some Swift code from the web or an older project, it's not going to work.

Avoiding the churn isn't an option, since new Xcode versions drop support for old Swift versions, and only the two latest Xcodes will run on the latest macOS. They even drop support for the conversion tools. So if I go back to an old Swift project now, it won't compile in my Xcode, nor will my Xcode help convert the code to the modern syntax. My only option is to run an older version of Xcode in a VM to convert the code.

If I'm writing a library I want other people to use or share between projects, I'll still do it in Objective-C. Apps I do in Swift but I find it annoying.

I still have 15-year-old non-ARC Objective-C libraries. Why spend the time updating them when they are debugged and work fine?

Every time I have to do a Swift version update I introduce bugs.

That’s part of my point.

Cutting yourself from new frameworks and hardware features just to cut churn would be a horrible tradeoff in most cases.

There are some niches where churn can be mostly avoided, but I think churn is usually a fact of life we could just embrace at a healthy pace.

From the opposite angle, a field with extremely low churn would seem suspicious to me. For instance I would expect any language with no significant update in the last 10 years to have abysmal unicode support.

Important code needs love, it needs to be improved, made more robust, have security issues handled, consolidated if it starts to bloat, to be cleaned and kept legible even for whitespace so you or the next guy can easily see and continue to look after all the moving parts.

If it's important, it needs and deserves all those things; they are not "churn" but maintenance.

Important code needs to become stable so it can be improved instead of fighting constant degradation.

Churn kills important code, making it unfit for purpose.

I'm not sure his list of technologies is accurate for the general coding population. Nor would anyone else's list be. The more general idea is to learn the tech you use... learn it well and deeply enough to avoid the kind of churn the article describes. If you can do that with modern tools, fine. If you can't and want to stick to older tools, that is also fine.

But draw your own line in the sand as to which tech you are going to use in your production systems, and move to new tech only when both your skills are sufficient, and there is a matching business need driving the change.

I frequently bite into olives only to find pits that did not get machined out.

Unless they’re stuffed olives, I’d recommend just buying them with pits in. There’s quite a lot more variety that way at my local grocery store.

(I know you didn’t mean that literally)

Just this morning I spent an hour trying to get a well-known HTTP library to work in my project, because the developers keep completely refactoring all the classes and methods. Stop it!!

A breaking change in the API should mean a major version bump. You're either not using the public API, blindly using the 'latest' version, or the library is, pardon my French, poop.

The problem isn't the libraries. The problem is you don't have CI (with a dashboard your team is paying attention to). Check-in -> automatic build -> automatic unit test execution -> signals if the build is ready

> The problem is you don't have CI

Having CI won't stop the developer of the dependency changing the interface.

Why are developers mindlessly upgrading major versions of dependencies and expecting everything to be okay?

Until your CI pipeline causes churn because something doesn’t work anymore....

We use GitLab CI, and it's been working maintenance free for 3 years now. We configured the test runners in 2016, and haven't touched them since and they have never randomly broken - even across GitLab updates (of course, YMMV depending on if you like to be on the bleeding edge all the time; we are always 1 cycle behind the bleeding edge).

What an awesome compliment for our team. Thank you!

Your CI pipeline should be simple, relatively unchanging and therefore trusted. When shit breaks (and it will) you should get a signal much earlier. This signal should strongly relate to your most recent changes -- and your changes should be small because you commit early and often, right?

Some of us deal with the fact that the people maintaining the CI solution keep breaking our pipelines without any code changes within our scope of responsibility. This becomes the churn as you push empty commits trying to troubleshoot why your builds don't work, and it takes >1 day to get things working again. Most hosted CI solutions are entirely inappropriate for some organizations' business goals and requirements, so you end up dealing with a B- or C-tier service in an organization that is changing.

I would suggest that there are definitely some organizations where you can't trust the pipeline to work reliably, and the cost of figuring out what went wrong becomes the churn.

But how do you want to avoid that? Staying with old libraries forever? In some areas this is possible but in others you have to use newer stuff. I always envy people who work on stuff that’s mainly their own code and not cobbled together from other systems with their questionable APIs.

> But how do you want to avoid that? Staying with old libraries forever?

Well, one way would be for those libraries to have backwards-compatible APIs, so that your code keeps working while still using the same APIs. Some actually do try that (or at least they claim to), e.g. curl.

Are you able to program against an interface rather than the classes themselves or is the entire API changing?

The first thing that came to mind when I read the opening paragraphs regarding churn is developing mobile applications. I find myself losing a day or two just to get my apps to compile if I ever go a month or two without spending any time on them. It's brutal how quickly things can break and dependencies need to be updated.

Is it possible to do modern mobile dev without churn (coming from a situation where my clients often go months without requesting features)?

I only know of iOS, but here's what I try to do

- Reduce the use of dependencies to the minimum (I'm bad at this)

- Set version compatibility in my Cartfile / Podfile

- When new breaking versions of Swift come out, suggest to my client a 2 / 3 mission solely focused on upgrading their app and its dependencies (otherwise it will probably make their next last-minute super-urgent-right-now feature needlessly long and complex to develop)

But I'd also willingly take any advice on this

Edit: also, semi-solved this for my JS work using automated dependency update services, like dependabot, coupled with unit / integration tests, but still haven't found a similar service for Swift

Cutting out dependencies helps. If you just have to worry about Google and Apple the changes are usually slower and documented.

No, not really. However, the solution here is to account for that time upfront, saying "oh, it's gonna take me some time to get back up to speed with your project." That's the reasonable thing to do anyway.

I mean, sure. But at a certain point, LISP was Javascript in terms of bleeding edge no? As someone who has built multiple things using React Native I can absolutely relate to the churn, but each time I willfully wade into it and each time I get better at handling all the pains that come along. I also can't think of a better way to improve debugging skills than to use a tech stack that gives rise to issues not easily solved via StackOverflow.

Keep in mind Lisp predates the Internet by many decades.

The Lisp tradition is to leverage the expressiveness of the language to quickly create functionality that would be included as libraries in other languages. If you need an algorithm implemented, just do it and if it turns out you need it hyper-optimized and able to handle the pathological cases you look for someone who has something like that.

The JavaScript tradition (once NPM became a thing) is almost the opposite, look for a library first, and if you can't find one then consider making it yourself and posting it for others.

Lisp was theory before it was code.

Javascript was a fashionable imitation of another popular language dialect.

> In 1995, Netscape Communications recruited Brendan Eich with the goal of embedding the Scheme programming language into its Netscape Navigator.[16] Before he could get started, Netscape Communications collaborated with Sun Microsystems to include Sun's more static programming language, Java, in Netscape Navigator so as to compete with Microsoft for user adoption of Web technologies and platforms.[17] Netscape Communications then decided that the scripting language they wanted to create would complement Java and should have a similar syntax, which excluded adopting other languages such as Perl, Python, TCL, or Scheme. To defend the idea of JavaScript against competing proposals, the company needed a prototype. Eich wrote one in 10 days, in May 1995.

Perl, Python, Lua were not ready for embedding (unsafe FFIs, OS-dependent APIs) and would have been flash-frozen in early bad states. See https://news.ycombinator.com/item?id=1905155. Better that JS got the early bad state freezing and thawing (ES3 helped; ES5 was constructive and ES6 made JS pretty good).

In early 1996, John Ousterhout stopped by to pitch Tcl/Tk, but it was too late. VBScript was coming, but JS in 1995 Netscape betas got on first and saved us from that dystopia.

> Examples: Clojure, Common Lisp, HTML, Make

> Examples: UNIX, LISP, The Web, Emacs, TeX

1. Or just computer science.

2. Churn is not a problem. If you keep accepting the churn, you learn each new thing faster and faster, until you can scan the docs and already know most of it — if it really is churn. If it's not, it's something genuinely new (e.g. Coq), and you gain the ability to identify 'new but not churn', instead of only learning 'old and not churn' from a half-century ago (e.g. Lisp).

3. Among the churn, there is 'meta churn', like Haskell, Lisp, Erlang, and Rust. Countless languages have stolen monads from Haskell, and Kubernetes patterns are very similar to Erlang OTP patterns.

4. Editors and IDEs are really irrelevant. I used Emacs a lot and I'm very fluent with Vi, but for some languages I prefer IntelliJ and VSCode. Just use the tools you feel productive with.

In The Churn this week, a colleague wanted to improve a query for a widget on a dashboard. The query was perfectly fine but now I’m stuck getting the wrong results debugging everything I can find.

If it can be made faster then it's fine, BUT that colleague needs to write a test first if it wasn't there yet (and I think it was given how you're fixing wrong results / debugging).

This is the main reason I haven't picked up the new shiny languages after using Python for a decade. OK, the GIL is not cool, but I'm OK with multiprocessing (one example among many).

I look at Julia (for HPC) and think: sure, if I were a grad student with time to burn. But now I need to go from an idea to figures, live-coded, in the space of minutes, and NumPy and matplotlib are boring and just fine.

Research is one domain where churn is a very much a daily thing. Careers are made on churn, in research.

> The Churn is losing a day debugging because a transitive dependency changed a function signature. The Churn is spending a week just to get a project you wrote a year ago to even run. The Churn is rewriting your front-end because a shiny new thing came around.

These forms of churn are hardly related besides that they are often self-inflicted. The first 2 examples are really just technical debt, and perhaps they can be referred to as the "grind" rather than churn.

Rewriting your code base to use the framework of the now may happen because of the industry changing to a point where using old-reliable.js is making it difficult to hire new talent, whereas elon-musk.js is the hot new thing that tons of programmers are interested in. Companies can feel obligated to follow this trend because they think they'll become obsolete if they don't. The company I currently work for is going through this at the moment, actually.

The churn that this causes is more novel than the grind because, as an engineer, you are learning to do the same job in a different way. Avoiding the grind isn't likely to fundamentally change the nature of your job, as writing documentation and reducing the number of dependencies aren't very radical ideas. But taking someone from one language and having them learn another, or having them transition from one framework to another, can effectively demote an engineer to junior grade until they've had experience with their new tools. With the churn, your expertise can lose its meaning.

Even if you learn to solve the grind, solving the churn can be difficult even if your loyalties remain to a single tech stack. With the exception of purely personal projects, the industry will continue to shift its own loyalties to different tools, so you've got a few different games to play when looking for jobs:

- Be the one who knows the legacy tech and can make sense of other people's horrible legacy code. (Which the company will inevitably decide to have rewritten in something like React out of the belief that all their problems are being caused by the old technology.)

- Be the one who knows the hot new thing and can write code that, whether or not the code under the hood is atrocious, will make the bosses believe that they can be like the Googles and the Facebooks.

The vast majority probably pick the latter. But if one wants to avoid the churn as much as possible, they probably need not only to stick to tried-and-true tools but also to find a good company and stay there indefinitely, rather than hopping companies every few years. Of course, that may come with a harsh penalty down the road.

“Churn” is definitely an issue. And JavaScript does have its own complexities. It feels weird singling out Ruby and JavaScript (and any other individual technologies), though.

I wonder if part of this is a human instinct to “other” people who are different. Someone feels superior because they use Lisp, and looks down on people using JavaScript. Or vice versa.

I think it's less about languages and tools than the culture around those languages and tools. I think it's centered around the answer to the question: how many customers and developers are we willing to toss aside to make this change? For the ruthless and stodgy Microsoft the answer was zero. For Linux, close to zero.

Remember the last time you were supposed to fix a bug, but the code was so "bad" that you rewrote the entire application?

IMO this is why churn is not going away. It's much easier to achieve flow-state from zero than working up to it in a codebase that you're unfamiliar with.

Watching people fight Javascript never gets old. Keep yelling at those clouds, fogies.

What properties do you think make modern JavaScript a good language, rather than a (now, finally) barely passable language being actively ruined by its own ecosystem?

I'd have told my future self, moments after he said the word "churn" - wait wait wait, you're from the future and you have a time machine and you are giving me programming advice. You realize if you give me stock picks none of that will be relevant right?

Like all industries, we can't just stand still. Curiosity and creativity are important. The "olives" are the fruit of some older technology that has matured. Maybe this technology was the "shiny new thing" some time back. Some people try to fix the shortcomings of yesterday's technology with their own ideas and view of things, and nobody can prevent this. This is just basic human creativity at work.

Maybe what's important is not making people believe that yesterday's technology is obsolete and should be buried or abandoned. The shiny new thing is just another tool in the toolbox, and my inventing a new shape of screw doesn't make all the other screws worthless. There are still plenty of screws all over the world that need to be unscrewed; we need tools to do that, people who know how to work with them, and some of these screws still need to be made. Maybe people don't talk about these screws as much? They don't make the news anymore because they've been around for so long.

Maybe the point here is that social media thrives on the shiny new thing; it's the core concept of it. An old newspaper is worthless — no one ever sold yesterday's newspaper. So they need to grab some attention to survive, and the shiny new thing is a way to accomplish that.

But the flood of articles about all those new frameworks doesn't make the older ones worthless. We need both. For some projects we won't choose an architecture based on ten-year-old concepts and frameworks, while other projects need stability and a fully tested, stable toolchain with documentation for the years to come.

What kind of work do you want to accomplish? Small projects where you work alone? Where you just serve a few people? Or big projects with a huge infrastructure that needs to deliver to millions, with great economic impact?

I bet you don't use only a hammer when you want to build kitchen furniture. And the tools needed in a factory that builds kitchen furniture are not the same as the ones used at home. The factory needs a whole supply toolchain, with testing and quality checking, whereas the craftsman in his own workshop is going to have fewer tools...

We need all kinds of software and technology. Maybe what's hard is identifying the right ones and defining precisely what we want and where we think we're going. People are always going to want to try the shiny new thing. We all need novelty, and to fix the shortcomings of our current tools.

The reality is that there is very little actual innovation happening in programming. Most of the ideas in use today have been discovered decades ago. The big reason for churn is that people don't bother learning about what's been done before, and keep reinventing the wheel.

You don't see this in other disciplines such as physics or chemistry, where people spend years learning about existing research before actually starting to contribute.

With programming the barrier to starting to write code is much lower than to actually learning the background research. People don't bother looking at what's been done already, and just start "inventing" things.

More often than not this results in half-baked ideas, because the authors of projects don't really think about the full scope of the problem. Then once the project starts hitting limits in practice, people start kludging things on top of it, and eventually it becomes unwieldy to use. Then somebody comes by and does the same thing simplistically again, and the cycle repeats. Nothing new is learned in this process, and you get churn for the sake of churn.

> Most of the ideas in use today have been discovered decades ago. The big reason for churn is that people don't bother learning about what's been done before, and keep reinventing the wheel.

But on the other side of the coin, do you really want to be locked into a decades-old solution to a "solved problem" forever? Or are there still improvements that can be made to reduce friction and human error?

"Machine/human readable data exchange format? Yeah, that's a solved problem - we use XML for that. What's that? You want to reinvent the wheel? JavaScript Object Notation, are you insane? Didn't you bother seeing which exchange formats are already out there? XML can already do everything JSON can do and more! Plus, JavaScript is the exact opposite of what we value here, get that crap out of my face; like I said, XML is the one true solution to data exchange and forever will be."

XML and JSON are both examples of the problem, actually. S-expressions have existed since the 50s, and they solve all the same problems with a much saner syntax. In fact, if Mozilla marketing execs hadn't insisted on Java-style syntax for JS, and it had kept a Scheme-style syntax, we wouldn't even need HTML and CSS today. We'd have a single syntax that could cleanly express code, styling, and layout.

S-expressions do not have typed standards for serializing dates and booleans like JSON. They also can't distinguish numbers from text.

S-expressions also have no standard for comments, and can't distinguish maps from lists.

An s-expression-based serialization standard would be a bit cleaner than JSON, but not enough of an improvement to be worth redoing everything.
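The typing gap is concrete: JSON's grammar carries scalar types through a round trip, while a bare s-expression needs an out-of-band convention to recover them. A small sketch (the s-expression in the comments is just an illustration, not any particular standard):

```javascript
// JSON's grammar types its scalars: numbers, booleans, and strings
// all arrive from the parser with their types intact.
const parsed = JSON.parse('{"count": 3, "active": true, "label": "3"}');
console.log(typeof parsed.count);  // "number"
console.log(typeof parsed.active); // "boolean"
console.log(typeof parsed.label);  // "string" -- 3 and "3" stay distinct

// A bare s-expression like (count 3 active true label "3") has no
// standard rule saying whether 3 is a number or a symbol, or whether
// the whole form is a map or a list -- which is exactly the kind of
// convention formats like EDN layer on top of s-expression syntax.
```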

All of that and more is available with s-expressions in Clojure https://github.com/edn-format/edn

Your contrived example falls flat because JSON is flat out better for most use-cases and anyone having a discussion about it will probably fit into those use-cases.

People don't discuss solutions in a vacuum. Ideally they'd think about the potential drawbacks of XML alternatives before jumping into a new tech. Which they will, unless they're really, really new.

The funny thing is there are developers[1][2][3] out there who would like to turn JSON into XML.

[1] https://json-schema.org [2] http://www.jsoniq.org [3] https://www.p6r.com/articles/2008/05/06/xslt-and-xpath-for-j...

> Your contrived example falls flat because JSON is flat out better for most use-cases and anyone having a discussion about it will probably fit into those use-cases.

Easy to say with hindsight now that it is massively popular and has a huge ecosystem of parsers for every conceivable language. At the time it was invented though, it was yet another data exchange format and I'm sure there were a lot of grey beards pooh-poohing it.

Back in the day, XML was almost universally hated by the people you describe as "greybeards", mostly because it was a convoluted "one size fits all" approach, hijacked by business, that ate bandwidth for overhead with none of its promised benefits actually materializing (because everyone created their own quasi-proprietary XML Schema).
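The bandwidth complaint is easy to demonstrate with a toy record serialized both ways (the field names here are made up for illustration):

```javascript
// The same hypothetical record in XML and JSON: XML spells every tag
// name out twice, so the envelope dwarfs the payload.
const xml = "<user><name>olive</name><age>7</age></user>";
const json = JSON.stringify({ name: "olive", age: 7 });
console.log(xml.length, json.length); // the XML form is nearly twice as long
```

On a record this small the ratio is roughly 1.8x; nested real-world documents with long element names fared considerably worse.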

I remember a .NET developer using it in place of JSON. That was the slowest system I've ever seen.

I would say JSON is a little bit better than XML in some areas, but also worse in some others. XML could easily have been extended with a few attributes; there was no real need for doing something completely different.

JSON is worse than XML if your use case requires namespaces.

Tangentially, this is one of the reasons that I am a fan of EDN[1] over JSON.

[1]: https://github.com/edn-format/edn

I totally agree. But how do you balance this with the need for looking good on the job market? If you want to stay employable you are almost forced to participate in this craziness.

+1. Where I live, the jobs with the highest pay grade are for fancy frameworks. If I've been doing PHP for 5 years on a 10-year-old framework in a company that didn't feel the need to upgrade to anything, then the day I want to find another job I'm going to be confronted with opportunities that require mastering the latest technologies, and I won't find something within my pay range because of this. That is the reality where I live. Furthermore, how am I supposed to learn and practice all this new stuff when my company doesn't allow time to learn anything? Are home projects enough to fill professional requirements for mastering a technology? It's not easy to stay away.

Aren’t personal projects how most people learn new technologies?

I used to do that. But right now I am in a high pressure job that sucks up a lot of energy but I don't learn anything new. When I come home these days the last thing I want to do is to spend more time on the computer. I need to do something active or just sleep to maintain my health.

I've been working with Clojure for the past decade, and haven't had to deal with any of the craziness.

The job market is smaller, but so is the pool of developers. Companies tend to be more flexible because of that and are often open to remote work. I'd much rather work in a sane niche market than deal with the mainstream churn.

Isn't it fair to say, though, that you're not the average Clojure dev? I mean you have a book and a web framework to your name so that puts you way ahead of the pack. As an average Clojure dev I've found it very difficult to find work.

That's just a result of me having been working with the same tech for a long time. When I was starting out with Clojure it was a lot more niche than it is now, and finding jobs was much harder. The whole reason I published a book was due to lack of beginner resources being available. So I don't think there's anything special about me, it's just that I was stubborn about wanting to work with Clojure and didn't get dissuaded until I made that a reality.

We have local Clojure meetup in town, and when I first started going there pretty much everybody was using Clojure as a hobby. Today, we have a bunch of companies using it in production, and all of them are actively hiring. The last three coop students I had all ended up getting Clojure jobs. I imagine this varies based on where you live of course, but another option is to simply introduce Clojure at a place that's using something else. That's where Clojure jobs come from in the first place at the end of the day.

It's nice to have found a niche you like. I thought I had found one with the medical device I work on, but it turns out the work is rather unpleasant and the hiring situation is not great either.

Sadly, I agree as well. Instead of mastering the tools we have, we make half-arsed versions of everything because we don't have mastery of this year's trend.

Damn, if I was making a time machine to go back to 1999 I would have told my younger self, "Sell all your Tech stocks, right now." :-)

You are the problem. Not the tools.

JavaScript is an amazing language; get over yourselves.

HN is full of this stuff. These people don't even understand what modern JS is.

> Counter-examples: JavaScript, Ruby, React-Preact-Vue-Angular-…

Great programmers can find virtues in all tools. Hate the player, don't hate the game.
