
The Handmade Manifesto - AlexeyBrin
http://static.chronal.net/hmh/manifesto.html
======
ObjectiveSub
Meh, that group of developers didn't really die out. People tend to forget
that there's tons of people still worrying about performance.

I personally do computer graphics in video games. We still worry about every
byte, cycle, cache miss, and instruction that we write. We still spend days
trying to eke out another 100 microseconds from that tricky piece of code.

Not every software developer is a web/business developer. Hacker News tends to
forget that.

~~~
smoyer
I spent the first twenty years of my career as an embedded systems engineer
and there are two specific advantages I gained from that experience:

1) When you're writing code for an underpowered microcontroller, you have a plan
before you start. This plan includes execution speed and memory usage targets.

2) When you're sending thousands (hundreds of thousands or millions) of
non-reprogrammable devices into the field, you have a test plan (and a
comprehensive test suite) that guarantees correctness and performance.

Do you know your "machine"? For instance, if you're working with a language
that runs on the JVM, have you read the JVM specification? (It's the equivalent
of your microcontroller's data-sheet.) The part of this manifesto I agree with
is that you should thoroughly understand everything "beneath" your
application.

~~~
logicallee
>The part of this manifesto I agree with is that you should thoroughly
understand everything "beneath" your application.

Say someone (a full-stack developer if you want, or jack-of-all-trades-master-
of-none) uses Photoshop and Illustrator but also is coding against Firefox,
Chrome, and Safari, and their stack includes node.js, Express, Angular, and
PostgreSQL.

At some point, you can't thoroughly understand every cycle of every rendering
engine (which is what you've stated implies).

It is quite literally impossible - as in, a physical impossibility - to
"thoroughly understand everything 'beneath' your application." You just
can't - there are not that many hours in the day. You would have no
application left.

If you froze technology today you could thoroughly understand all of the
mentioned technologies in 10 years. But by then there would be new
technologies.

Instead, you just have to abstract it away, code against some framework that
compiles down to javascript, and only understand that. You can't thoroughly
understand every single thing your database engine is doing either.

Otherwise you can just never get anything done. The modern world is made up of
applications whose combined reference materials total without exaggeration
millions of pages of text. You just can't read a million pages of text.

This is completely different from a single microcontroller's architecture,
which is basically a single layer. On the web, even your target is a
collection of competing browsers.

Why should someone building a front-end site, but also somehow using the
basics of a database, take months out of his or her life to gain a deep
understanding of every cycle of that database engine or what exactly it's
doing?

What does it get them?

Cycles - even billions of cycles - are cheap.

It's like asking a farrier[1] (someone who cares for horses' hooves and then
puts shoes on them) to learn all the intricacies of metallurgy, really go back
to where metal is mined in ores. Oh, and since, "A farrier combines some
blacksmith's skills (fabricating, adapting, and adjusting metal shoes) with
some veterinarian's skills (knowledge of the anatomy and physiology of the
lower limb)" I suppose this farrier suddenly needs to be a complete
veterinarian and really fully understand all layers of the horse?

It just doesn't work that way. At some point the farrier has to work with an
abstraction, and not know what is going on at lower layers, and at some point
the full stack developer has to just use an interface that compiles down to
javascript that will access a database he or she doesn't know. While you may
lament this, beautiful and functional sites have been built this way.

It's not fair to ask someone to fully understand everything. Even as a
microcontroller coder, you didn't understand the circuitry in the
microcontroller at an electrical level - you didn't even get the diagrams.
You, too, worked with an abstraction.

[1]
[https://en.wikipedia.org/wiki/Farrier](https://en.wikipedia.org/wiki/Farrier)

~~~
smoyer
I never said you had to deeply understand your tool-chain ... the use of
Photoshop isn't at all relevant (though understanding the produced image
format might be). PostgreSQL on the other hand becomes a lot more powerful the
more you learn about how it works "beneath the covers". Of course there are
limits ... I knew the timing of my microcontroller's signals but ignored
details of the physics behind the microcontroller's implementation.

The first computer I built was a COSMAC Elf (based on an RCA 1802
microprocessor) - I didn't have billions of cycles so there was a limit to the
work I could do. Now I do have billions of cycles and do my best to make sure
there's a linear increase in the amount of work I can do.

<old-guy-voice>In the old days, we spent large amounts of time reading data-
sheets, specifications and application notes - probably more than we spent
writing software. What you're implying is that you're too impatient to do
engineering but you're willing to be a programmer (or hacker).</old-guy-voice>

So I agree with the premise that we should be producing smaller, faster and
less buggy software than we are. I don't think we need to know everything
related to our systems - I certainly used ICE when available to avoid the
code/burn/test cycle that EPROMs required.

There is (of course) a finite limit to what we can learn, but that doesn't
mean we shouldn't _want_ to know everything. One important skill is being able
to discern what you need to know, what you need to understand and what you can
safely ignore - this is hard. Knowing is also a choice. I can learn more than
most because I don't watch TV or waste inordinate amounts of time
elsewhere (I have a few vices, like HN).

~~~
logicallee
The distinction you start with between stack and tool-chain is fair, so let's
cut out photoshop and illustrator. However, the three mentioned browsers are
certainly part of the stack, and the browsers are the ones executing the
cycles of anything running in the front end, which these days is a lot. Nearly
all projects are literally targeting all of these disparate browsers.

To cut a long story short: when you say "What you're implying is that you're
too impatient to do engineering but you're willing to be a programmer (or
hacker)" - this is correct. The only way for many people to get projects off
the ground is not to engineer them, but just to throw them up.

By the by, in a couple of spare hours I actually toyed with the idea of
formalizing this into a project, where we would teach some valuable skill in
15 seconds. (I applied to a YC fellowship with it but wasn't selected.)

Here's my prototype - (it autoplays 15 seconds of sound, you've been warned.)
[http://iknowkungfu.meteor.com](http://iknowkungfu.meteor.com)

I didn't write the tutorials that are up there now, and many of them are too
long. But
the idea is there: in a few seconds, you can often learn and incorporate
something into your stack that you know next to nothing about.

Do this a few times and you have a complete web app serving dynamic, database-
backed content - and you've still been able to focus on what you know rather
than engineering.

There are 3 billion people out there who deserve to use some of the tools that
are available. You, too, deserve to use some of the modern frameworks that are
available. Without investing hours, days, weeks of your time into it. I know
lots of technologies whose basic usage could be uploaded to someone's brain in
15 seconds of video. Git, to name one.

No, you won't understand how it works or why - but you can commit and roll
back (reset), which is all anyone cares about until they start caring more.
The advantage of using the git that you can learn in 15 seconds, over copying
files and continuously renaming them, is, simply put, astronomical.

Many parts of the toolchain, and many layers of the stack, are quite similar.
Who knows what cycles MongoDB is running? Who cares?

~~~
smoyer
"Learn X in Y minutes" is an awesome concept! Let me know if you ever decide
to run with it. I guess I don't fit the "CoffeeScript is a hipster language"
profile very well, but I still enjoy writing code in it ;)

I basically agree with everything you've said (in both posts) ... you can be
successful while wasting cycles. And if you're working on a low-volume and/or
internal only application, you'll probably never face the limits of a modern
server.

If you need to operate "at web scale" [1], or run into an uncommon (or common)
bug, you'll need to know more about the frameworks and systems your code
relies on (e.g. MongoDB configuration for systems over 2GB [2]). Blog posts
like the one referenced are completely unfair to those that developed MongoDB
- read the manual and understand how MongoDB works _OR_ use it at your own
risk.

So I'll switch arguments and help you make your point. We have an application
written in Oracle's Application Express - while we have extensive expertise in
Oracle's database software, we have this one system which was completed for
expediency's sake. It's kind of horrific but (mostly) works at the scale
required. It would be financially foolish to dig more deeply into ApEx
for this one dead-end application. Everyone is happy.

[1]
[https://www.youtube.com/watch?v=b2F-DItXtZs](https://www.youtube.com/watch?v=b2F-DItXtZs)
(audio NSFW)

[2] [http://www.sarahmei.com/blog/2013/11/11/why-you-should-
never...](http://www.sarahmei.com/blog/2013/11/11/why-you-should-never-use-
mongodb/)

~~~
logicallee
Thanks for the reply! _(I didn't actually build those tutorials, i.e. learn x
in y is another person's site, like I said I only spent a couple of hours on
the concept of a site like this and the current tutorials are external. I did
add the time analysis.)_

I like your final example - and remember, you guys are Oracle experts: you're
the most qualified people on the planet to learn ApEx properly from scratch,
even though you haven't.

Now switch gears and imagine a college student who just has an idea for some
cool project, but barely codes in any language. This describes the computing
needs of 3 billion people. They're not qualified to quickly become experts and
engineers at anything. But they still have a computer in front of them that
does a trillion operations every few minutes. The gulf between using that to
surf facebook or building... anything at all, even very poorly, is immense.
(Like git that you can learn in 15 seconds, versus manually copying and
renaming files for version control.) Thanks for the encouragement.

------
dantiberian
If this resonated with you, then check out Mechanical Sympathy [1], a mailing
list all about writing code that is aware of how computers work and working
with them. Martin Thompson's blog by the same name also has a lot of good
information [2].

To the people who say "Who is going to pay for this?" look at all of the
places that can measure the cost of slow or inefficient computing e.g. Wall
St, Google, Amazon, etc. They all invest in writing low-level, efficient code
_where it makes sense_.

[1]: [https://groups.google.com/forum/#!forum/mechanical-
sympathy](https://groups.google.com/forum/#!forum/mechanical-sympathy) [2]:
[http://mechanical-sympathy.blogspot.co.nz](http://mechanical-
sympathy.blogspot.co.nz)

------
mehwoot
The tradeoff is development speed. Sure, there is a whole lot of software
written these days that isn't world class (in terms of anything really, not
just execution speed) but at the same time a lot of it is software that is
written at a reasonable price that generates value for businesses.

Not every piece of software is a handcrafted work of art. Some of it just saves
X person Z hours of time at some mundane task, and we are able to make that
software for X person because the same 6 layers of abstraction that make it
slow also make programming accessible enough for it to be affordable. Nothing
really wrong with that.

But I guess I wouldn't begrudge someone choosing to develop handcrafted
software, as long as they're not claiming that is the way everything should
be.

~~~
colomon
This is a great point. I put a high value on writing efficient software, but
you have to know when to do it!

For instance, I wrote a script to download the O'Neill's tune collections in
ABC [1] the other day. I wrote it in Perl 6, not exactly known for blazing
performance. (It was a three line script, one of which was just a closing
bracket. [2]) If I'd carefully written it in C++, it would have taken me at
least 20x as long to write, and it would no doubt have been substantially more
"efficient". But it would not have been significantly faster in execution,
because the slow part was using curl to download the ABC files.

[1] For instance,
[http://trillian.mit.edu/~jc/music/book/oneills/waifs/](http://trillian.mit.edu/~jc/music/book/oneills/waifs/)
[2]
[https://gist.github.com/colomon/ef8e4a8801d01b1d5813](https://gist.github.com/colomon/ef8e4a8801d01b1d5813)

------
vinceguidry
Good luck getting people to pay for it. I personally love the artisanal
mindset, but when you're trying to meet business goals, speed of execution
trumps quality of implementation every day of the week. They even coined a
phrase for it, worse is better.

There's a good reason the industrial revolution demolished the old guild
system.

When you prioritize speed of execution, you tend towards popular frameworks,
which is the next line item on the typical client wish-list. They want to know
that they don't have to keep an expensive pain-in-the-ass "rockstar" developer
around just to maintain and build on it.

I suspect the only way we're going to get to make software the way we want to
make it is, if we also run the business using the software. Look at Patrick
and Thomas.

~~~
latitude
> Good luck getting people to pay for it.

No luck needed. There's a metric ton of people who just _crave_ well-crafted
software. I have a project of that nature now and I am absolutely blown away
by the amount of compliments and thank-yous sent our way. It's really quite
something.

~~~
vinceguidry
What I should have said was, "Good luck getting your freelance clients to pay
for it." If you're running a software business, which I see you are, with a
product that serves a need that people actually have, in a way that they
actually need it to be solved, then sure, they'll appreciate craftsmanship.

Because you put the time and attention in _before_ you put a price on it and
asked them to pay for it. If you're asking them to value it before you put it
in, then you're selling them Fort Knox when all they need is a bank.

It's not just about software, it's about business too. If you want to make
software the way you want to make it, you better be willing to sell it
yourself.

Freelancing allows you to make software and get paid for it without having to
validate the business purpose behind the software. In many cases, the business
doesn't need anything fancy, handing them a manifesto isn't going to score you
any points.

------
stdbrouw
I'm reminded of the first industrial products. Handmade meant crummy and for
poor people, whereas something from a production belt meant precision-
engineered. But now that everyone can afford, say, a quality solid-state
amplifier, suddenly it's hip to have the expensive point-to-point tube amp
again. It's really about status and conspicuous consumption, but the snobbery
is hidden behind notions of quality and authenticity.

Now let's circle back to software. High-level programming languages,
interpreted languages, libraries, frameworks... all used to be really cool.
Until people realized that now everyone can learn how to code, so there's a
need for something to separate the elite from the plebeians, something to
separate the software engineers from the script kiddies... how about handmade
programming, maybe that'll do the trick?

------
brandonmenc
What does this even mean? Going back to hand-unrolling loops in assembler when
writing business applications?

The very smart people working on the layers this manifesto bemoans already do
that for us.

~~~
sliverstorm
He talks about abstraction layers, yes- although he's probably not arguing for
assembly as much as fewer stacks on services on stacks on...

But it seems like the main theme is programming to the actual machine rather
than the "frictionless vacuum" target machine that is oh-so-easy to assume.
Like one of my favorite presentations, where a PS3 developer team details how
simple choices in data structure arrangement got them orders of magnitude
speedup because of how different arrangements interplay with the CPU caches.
In OO languages, it's easy to make a pathological cache virus.

~~~
renox
Let me guess: SoA instead of AoS? That's a classic optimisation, there's even
an annotation for this in 'Jay' (well, there will be once it exists:
[https://www.youtube.com/user/jblow888](https://www.youtube.com/user/jblow888))
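For anyone who hasn't met the jargon: AoS (array of structures) keeps each record's fields together, while SoA (structure of arrays) keeps each field contiguous, so a loop that touches only one field streams through a dense run of memory instead of skipping over the fields it doesn't need. A minimal sketch, in JavaScript for illustration only - all names and numbers here are made up, and typed arrays merely stand in for the packed layouts a systems language would give you:

```javascript
const N = 4;

// AoS: each particle is a record; x, y, z, and mass sit together, so a
// loop that only needs x still drags the other fields through the cache.
const particlesAoS = Array.from({ length: N }, (_, i) => ({
  x: i, y: i * 2, z: i * 3, mass: 1.0,
}));

// SoA: one contiguous typed array per field. A loop over x reads a dense,
// cache-friendly run of floats and nothing else.
const particlesSoA = {
  x: Float64Array.from({ length: N }, (_, i) => i),
  y: Float64Array.from({ length: N }, (_, i) => i * 2),
  z: Float64Array.from({ length: N }, (_, i) => i * 3),
  mass: new Float64Array(N).fill(1.0),
};

// Both layouts answer the same question; only the memory traffic differs.
const sumAoS = particlesAoS.reduce((s, p) => s + p.x, 0);
let sumSoA = 0;
for (let i = 0; i < N; i++) sumSoA += particlesSoA.x[i];
```

The two sums are identical; the point of the transformation is purely how the data lands in cache lines as N grows.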

~~~
sliverstorm
Yes, that, but they also go beyond that. I found it today:

[http://harmful.cat-v.org/software/OO_programming/_pdf/Pitfal...](http://harmful.cat-v.org/software/OO_programming/_pdf/Pitfalls_of_Object_Oriented_Programming_GCAP_09.pdf)

------
Udo
I saw this come up during the handmade hero screencast yesterday, and I
appreciate the sentiment a lot. Although I have a feeling that Casey and his
crowd would disagree, I believe this initiative touches only part of the
problem, and worse, it could be interpreted as absolving programmers working
at higher levels of abstraction from doing their part.

Knowing what happens behind the scenes when you write a line of code is
important. Question the decisions made by frameworks and libraries, and
question whether you actually need them. _Keep the stack minimal and thin_.
Choose the right tool for the job, and be knowledgeable of its inner workings.
Resist replacing tradecraft with ritual, especially when programming in
groups.

These can (and should, in my opinion) also apply to web development, a huge
category that seems to be all but despised and written off by the handmade
people.

~~~
SomeCallMeTim
At a visceral level, I really love the concept. I've created my own game
engines (before the age of ubiquitous, well-tested free engines), and I
understand how to optimize code all the way down to counting cycles (which
used to be easier).

At a practical level, though, I'm really, really sick of people creating Yet
Another Framework. NIH is _not_ something that needs encouragement.

And this "manifesto" will be used to justify NIH. It may not be the point of
the authors, but people will absolutely point to it when they want to create
something that's been done well already.

I totally agree that the right idea is to use exactly the tools for the job.
Don't throw every new framework into your app and expect it to be fast enough.
"Minimal and thin" is the right answer.

I've been doing a lot of JavaScript work recently, for example, and I often
visit MicroJS [1] when I need something. Instead of the 100k+ of jQuery &
plugins I can often find one library that's 4k and another that's 6k that
together do all I need. If you need _most_ of jQuery, ZeptoJS [2] is often a
good answer.

That said, sometimes ecosystems are powerful. When you need nontrivial
functionality, and the best options have a dependency on jQuery, it might
still make sense to include jQuery. Development time still counts: Spending an
extra 30 hours to reinvent a component is great if you're spending your own
time on it, but when you're billing by the hour, it's a lot harder to justify
if that component is available off the shelf and Just Works in about 5
minutes.

It's all about compromises. I do think that it's common to see developers
throw in a component if it's even slightly useful, leading to the web of today
where a single page might download and initialize 40-60 different libraries.
It would be healthy if the pendulum would swing back in the other direction.
But we really don't need 300 ways to do every single thing in JavaScript.

[1] [http://microjs.com/#](http://microjs.com/#)

[2] [http://zeptojs.com/](http://zeptojs.com/)

------
Lerc
While this doesn't seem to be very well expressed, I think the sentiment
behind this is worth discussion.

Computers do waste a huge amount of resources in the name of expediency. They
are slow to do things because it is simply cheaper to develop that way.

I think there is scope for the sort of crafted code that is described here
though. Not only as an artform but as a practical measure.

For any small piece of code a compiler can produce a comparable result to hand
written assembly. Entire programs that have been written in assembly tend to
be much smaller, use less memory and run faster. That apparent contradiction
is where the focus of crafted code should be. Most of the gains from hand
crafted assembly do not come from finding the quickest way to do a simple
calculation, but from the additional attention that gets applied. It wouldn't
surprise me if the majority of the gain comes not from instruction-level
optimisation, but from calculation-level optimisation.

For example, how do you find the last word in a string? Have a look on
Stack Overflow and you will see examples in many languages that are the
equivalent of:

    var lastword = fullString.split(" ").pop();

The work done by this operation is easier and easier to overlook as things get
higher level.
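To make the hidden work concrete: `split(" ")` allocates an array holding every word in the string, only to discard all but the last element. One hedged sketch of the calculation-level alternative - `lastWord` is an illustrative name, and like the original one-liner this ignores tabs and trailing spaces:

```javascript
// Scan backwards for the final space and slice from there; no intermediate
// array of words is ever built.
function lastWord(fullString) {
  // lastIndexOf returns -1 when there is no space, so slice(0) correctly
  // returns the whole (single-word) string.
  return fullString.slice(fullString.lastIndexOf(" ") + 1);
}

console.log(lastWord("the quick brown fox")); // "fox"
```

The result is the same either way; the difference is that one version describes the calculation you actually need, and the other describes a more general one you then throw most of away.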

I have often wondered on the merits of writing a virtual assembly instruction
set that is platform agnostic (so compiles/transliterates to native) to
explore this idea. It wouldn't be like JVM bytecode which implements higher
level magic, and it wouldn't be like LLVM which is not aimed at being human
writable. Some existing RISC instruction sets might be close to what is
needed. There would be potential to write things in multiple instruction sets:
if the interfaces between code sections are well defined
(stack/register/memory passing), you could write modular code in a preferred
instruction set.

This isn't practical for many applications, but I'm not convinced it doesn't
have a possible niche.

------
agentultra
If this strikes a chord you might also be interested in _data-oriented design_
[0]. It's the primary philosophy behind Casey Muratori's _Handmade Hero_ [1]
which is where, I assume, the reference in the manifesto's title comes from.
You should also watch Mike Acton's talk at CppCon[2].

It's really about giving up programming-by-folklore. It's about understanding
the problem so that you understand the data transformations your program has
to make given the hardware it will operate on. It's about doing the maths and
following the numbers instead of modelling _things_ in terms of
_abstractions_. It's about letting go of the idea that our job is to write
beautiful code.

Our job is to solve problems. The problem is a human one. The computer
transforms data to solve our problem. The code only needs to do those
transformations which solve the problem.

It's a philosophy I've picked up in part thanks to Mike Acton and Casey. I
mostly work in dynamic languages but it has already helped immensely at
keeping my code simple and forcing me to think about the data. This has a huge
impact on writing fast, simple code.

[0]
[http://dataorienteddesign.com/site.php](http://dataorienteddesign.com/site.php)
[1] [https://handmadehero.org/](https://handmadehero.org/) [2]
[https://www.youtube.com/watch?v=rX0ItVEVjHc](https://www.youtube.com/watch?v=rX0ItVEVjHc)

------
yaur
I write media stuff. Code in the media pipeline proper gets ultraoptimized
because it gets run between 30 and ~10k times per second. Code outside of the
pipeline is largely written in python because the extra startup time, while
non-trivial, is noise compared to the time waiting for PSI to be delivered.

The point is that "computers are slow" is, at least in some part, because
"just works" requires a mind-numbing number of operations.

------
chrismaeda
In Donald Knuth's paper "Structured Programming with go to Statements", he wrote:
"Programmers waste enormous amounts of time thinking about, or worrying about,
the speed of noncritical parts of their programs, and these attempts at
efficiency actually have a strong negative impact when debugging and
maintenance are considered. We should forget about small efficiencies, say
about 97% of the time: premature optimization is the root of all evil. Yet we
should not pass up our opportunities in that critical 3%."
([http://c2.com/cgi/wiki?PrematureOptimization](http://c2.com/cgi/wiki?PrematureOptimization))

------
normloman
I don't think we have to give up libraries and interpreted languages. Just
stop writing shit code because your boss needs it done in 5 minutes. Of
course, that's our bosses' fault, not ours. And it's a symptom of a larger
cultural attitude that values short-term profits over long-term
sustainability. I say, why bother!

------
whistlecrackers
I should make software by hand, rather than typing with my feet. Good idea.

------
toolslive
Stable problems get high quality solutions.

------
renox
I agree and disagree: I disagree because one of the main reasons why "the
computer is slow" is the HDD: use an SSD instead for your main storage and it
makes a world of difference (except if you use bloated websites).

I agree because I remember BeOS: on the same computer it was much, much more
responsive than Windows or Linux.

It's probably much more practical to buy an SSD instead of hoping to replace
all your bloated SW with leaner ones.

