
Killing ants with nuclear weapons - colinprince
https://drewdevault.com/2017/09/08/Complicated.html
======
weeksie
As an old-ish guy who doesn't like people on my lawn, I agree with the
sentiment.

A big struggle when balancing how to use the knowledge gained over decades of
experience in today's context is trying to figure out which tools are
needlessly complex and which bring real value to the table. What are the
tradeoffs?

Makefiles aren't the glimmering beacon of simplicity I would lean on for an
example. Sure, they're less byzantine than webpack, but you still have to learn
the syntax and other foibles along the way. Hell, I used CMake back in the day
when I was working on a big Lua/C/C++ app, so that gives you an idea of my
feels toward Make.

Far more common is the knee-jerk reflex to import a library for every damned
thing. For stuff like a graphics library or mapping? Sure. Library for a data
structure? Maybe? How much customization are you expecting to do to it? Will
it work out of the box? Library for plumbing, like auto-generating Redux
action creators? Please don't do that. You're importing something that can
probably be generalized for your case with a 10-line function.
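To make the "10-line function" claim concrete, here is one possible sketch of that kind of plumbing; the helper name and action types are made up for illustration, not taken from any real library:

```javascript
// Hypothetical ~10-line replacement for an action-creator-generator library.
// Turns a type like "ADD_TODO" into a creator named addTodo.
function makeActionCreators(types) {
  const creators = {};
  for (const type of types) {
    // "ADD_TODO" -> "addTodo"
    const name = type.toLowerCase().replace(/_(\w)/g, (_, c) => c.toUpperCase());
    creators[name] = (payload) => ({ type, payload });
  }
  return creators;
}

const actions = makeActionCreators(["ADD_TODO", "REMOVE_TODO"]);
actions.addTodo("buy milk"); // -> { type: "ADD_TODO", payload: "buy milk" }
```

If you later need something the generator can't express, you edit ten lines you own instead of fighting a dependency.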

~~~
codazoda
I agree and I'll single out npm.

It's a great tool but it must be used with even greater care. You can `npm
install` a package and get dozens of packages requiring packages. Before you
know it, your little app has hundreds of dependencies.

On the flip side, I use `npm run` instead of writing simple bash scripts. Why?
Because everyone I work with runs lots of things this way. Doing so ensures
they'll know where to look for all the stuff they might be able to run.
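A minimal sketch of what that looks like in a `package.json` (the script names and commands here are illustrative, not from any particular project):

```json
{
  "scripts": {
    "build": "webpack --mode production",
    "test": "jest",
    "deploy": "./scripts/deploy.sh"
  }
}
```

Running `npm run` with no arguments lists every available script, which is exactly the discoverability benefit described above.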

------
marcosdumay
This reads way too much like a "get off my lawn" post, where old complexity
is good, but replacing it with newer complexity is bad.

Yet, I am asking myself if Puppet has ever saved me more time than it cost.
It would help if it were stable, but it keeps changing and outdating my rules.

I would say it has better maintainability than scripts over ssh, mostly
because error handling is implicit and things are descriptive. But I would
take not having to maintain the rules over maintainability any day.

~~~
subway
I think Puppet makes for an interesting example.

Having spent the last decade of my life mired in config management systems
from CFEngine to Ansible, it seems like the pattern is for a complex system to
be replaced with a "simple" system, the "simple" system quickly gains
complexity, and then we see a shift back to a simple alternative.

I'm starting to strongly suspect the root of the problem (at least within this
context of config mgmt systems) is that everyone wants to write libraries,
nobody wants to write a config.

By this, I mean that if a Puppet user wants to install Apache and serve a
simple single site, instead of a simple manifest or internal cookbook with
just a package resource and a template, they frequently reach for an Apache
module from the Forge. Similarly, if an Apache module doesn't exist in the
Forge, they will try to publish one. This results in vast dependency graphs of
questionably maintained modules, and new users to the ecosystem saying, "Wow,
this is way too complex. Oh hey, Chef looks simple." A short while later you
get the Supermarket, and the cycle continues.
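For comparison, the "simple manifest with just a package resource and a template" approach is only a few lines. A sketch (the package name, paths, and template are illustrative, not from any real module):

```puppet
package { 'apache2':
  ensure => installed,
}

file { '/etc/apache2/sites-available/mysite.conf':
  ensure  => file,
  content => template('profile/vhost.conf.erb'),
  require => Package['apache2'],
  notify  => Service['apache2'],
}

service { 'apache2':
  ensure => running,
  enable => true,
}
```

Three basic resources, no third-party dependencies, and nothing to go unmaintained underneath you.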

I'd argue centralized repositories of config management modules/cookbooks lead
to the decline of any given config management system, and the best way to
avoid that is to use a real config management system (Chef/Puppet/whatever)
but stick to the basic resources and avoid 3rd-party modules at all costs --
they will _quickly_ go unmaintained. And seriously, do you think you'll need
to deal with that obscure setting for Apache on IRIX that the popular
community module supports?

~~~
marcosdumay
I do avoid 3rd-party modules. Not at all costs, but enough that they are less
of a problem than the Puppet updates themselves. Not to mention how it went
from a simple "write your configurations here" repository into something that
requires you to think about testing and staging environments for your configs.
I am still not sure if that's a net positive either.

There is an inherent problem in the complexity itself. The piles of scripts I
had before had their complexity right there, in your face. They were clearly
hard to manage, but it was bounded. By using Puppet I exchanged that for a lot
of hidden complexity, which is ever increasing and once in a while shows its
face (always at a bad time).

Or, rephrasing the rant: why does software have to keep growing? If it didn't,
I could evaluate what is a good trade-off and keep it.

------
iandanforth
I agree with the sentiment of this post and have felt this pain.
Interestingly, I find that notation and syntax are the first, and most
powerful, justification for new tools. Honest-to-god blockers to adoption I've
witnessed:

1. Makefiles and/or bash scripts are not written in JavaScript

2. Man pages are not web-based

3. The dense syntax of command-line tool options is intimidating

I also have strong sympathy with the position that UX matters at all levels
of tool building, and it's hard to argue that older tools are as approachable,
literate, or aesthetically appealing as newer tools.

The real tragedy is that it is so difficult to update the UX, syntax and
aesthetics of older tools while maintaining the hard-won stability and
separation of concerns they embody.

~~~
Florin_Andrei
> _Makefiles and/or bash scripts are not written in javascript_

Why should you rewrite all that stuff every time there's a new fashion out
there?

By the same token, a rewrite would have been required when Perl was all the
rage, or Python.

> _The dense syntax of command line tool options is intimidating_

Anything you don't understand very well can be intimidating.

~~~
smilekzs
Make was not originally designed to be a general-purpose language, but it
eventually grew features and pretends to be one. If a proper general-purpose
language with a well-designed library (possibly plus a general-purpose,
data-only configuration language) is capable of performing the core function
of an "interpreted DSL", even at some cost of terseness, I would strongly
prefer it to the DSL. While having multiple libraries in different languages
might contribute to fragmentation, being able to write the build scripts in
the same language as the rest of your project provides value.

> > The dense syntax of command line tool options is intimidating

> Anything you don't understand very well can be intimidating.

IMHO what's intimidating is not the CLI itself, but possibly:

1. lack of consistent convention between different tools (only PowerShell
attempted to address this)

2. poor discoverability of options, which is often exacerbated by poor
quality of documentation

3. naming of things (which is getting better with more modern CLI tools)

4. ...

------
panarky
_> We have heaps and heaps of complicated, fragile abstractions to dismantle_

Nobody has time to dismantle the complicated, fragile abstractions because
they're too busy building new complicated, fragile abstractions to layer on
top of the old ones.

------
pavel_lishin
> _When I look at a tool like Gulp, I wonder if its success is largely
> attributable to people not bothering to learn how Makefiles work._

> _This complexity cost shows itself when the system breaks (and it will - all
> systems break) and you have to dive into these overengineered tools. Don’t
> forget that dependencies are fallible, and never add a dependency you
> wouldn’t feel comfortable debugging._

I don't have _a lot_ of experience with Makefiles, but I refuse to believe
that they're being cited as an example of something that's _not_ complex.

It seems like this blog post is basically the author saying, "I understand how
Makefiles work, but don't understand how Gulp works", and extrapolating that
everyone else shares his skillset.

~~~
megaman22
They are something that can be complicated, but don't have to be.

Gulp and Grunt and Webpack, or whatever the new hot shit is, usually aren't
used for anything _that_ complex. Grab some files, pipe them through a tool or
series of tools, write the output to a destination. This is exactly what Make
was built for.
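That grab-pipe-write pattern takes only a few lines of Make. A sketch, assuming a hypothetical project layout and a babel invocation that are purely illustrative:

```make
# Rebuild dist/%.js whenever the matching src/%.js changes.
SRC := $(wildcard src/*.js)
OUT := $(SRC:src/%.js=dist/%.js)

all: $(OUT)

dist/%.js: src/%.js
	@mkdir -p dist
	npx babel $< -o $@
```

The pattern rule gives you incremental rebuilds for free: only files whose sources changed get reprocessed.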

~~~
pavel_lishin
Is Gulp itself complex? If it's just a series of steps, does it really matter
whether you're using Make or Gulp? The author knows Make, so it's easier for
him to debug it. Someone else might know JavaScript, so it's easier for them
to debug Gulp.

------
kkapelon
>When I look at a tool like Gulp, I wonder if its success is largely
attributable to people not bothering to learn how Makefiles work

I have. And then I learned that I also need autoconf/automake/m4 (which the
author does not mention). So there is complexity already out there, even for
Makefiles.

>server applications ship entire operating systems in glorified chroots;

If this is about Docker, I don't think the author truly knows how Docker
works.

~~~
blueflow
>And then I learned that I also need autoconf/automake/m4

Make works well on its own, especially for everything that is not portable C.

>server applications ship entire operating systems in glorified chroots;

- Docker containers run chroot'ed.

- Containers still have a whole POSIX-y userland with a shell and coreutils.
You can extract any container onto a disk and boot it with a static Linux
kernel.

~~~
alyandon

       - Containers still do have an whole POSIX-ly userland with shell and coreutils. 
         You can extract every container onto an disk and make an static linux kernel boot it.
    

What's in a docker image is entirely up to the image creator. Docker
containers don't necessarily have to use a complete userland + init system as
a base.

~~~
blueflow
In practice, containers do have a shell, since a few things (Dockerfile RUN &
CMD in non-exec mode, environment variable substitution, docker exec shell)
don't work without it.

~~~
webster23
But you don't have to have all that; you can have just a single executable in
the container, like here for example:
[https://github.com/containous/traefik-library-image/blob/f24cd6a1a31933c76ab397868b5da84d43f5d2ff/scratch/amd64/Dockerfile](https://github.com/containous/traefik-library-image/blob/f24cd6a1a31933c76ab397868b5da84d43f5d2ff/scratch/amd64/Dockerfile)

------
ebiester
There's a reason that build tools show up in every language that exists: it
turns out that it's easier to build a project using the target language than
using a generic tool.

Consider that Gradle, Ant, and Maven all call Java internally rather than
externally, allowing build paths to be set up in ways other than string
manipulation.

Consider that javascript build tools use internal tools. What value is there
in creating string abstractions or intermediate json manipulation when we can
pass these from function to function instead?

Now, for cross-language projects, makefiles can be useful glue.

------
ericmcer
I think one of the big benefits of these tools is providing common knowledge
between developers. It is a huge headache for a new developer to walk into a
project where someone is building their own solutions or using uncommon tools.

If you want to find someone who can walk into your project and start
updating/debugging the gulpfile or dockerfile on day one, that is completely
doable.

------
bastian
At first glance I read the title as: Killer ants with nuclear weapons.

~~~
katastic
They came from outer space!

/Red_alert

------
jwaldrip
All I am reading here is complacency with the way this person was taught. I
have no problem adapting to new tech, and I welcome the abstraction. I trust
the communities and maintainers of such abstractions to help me when the need
arises.

------
katastic
Challenge accepted.

