The Handmade Manifesto (chronal.net)
70 points by AlexeyBrin on Aug 21, 2015 | 51 comments



Meh, that group of developers didn't really die out. People tend to forget that there are tons of people still worrying about performance.

I personally do computer graphics in video games. We still worry about every byte, cycle, cache miss, and instruction that we write. We still spend days trying to eke out another 100 microseconds from that tricky piece of code.

Not every software developer is a web/business developer. Hacker News tends to forget that.


I develop thermodynamic code for the oil & gas industry. I count the number of divisions in my code, because a division is a really expensive operation if you need to do it billions of times.
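
A minimal sketch of the kind of rewrite this implies (a hypothetical loop in C++, not my actual code): do the division once, then multiply by the reciprocal.

    #include <cstddef>

    // Naive: one division per element. Expensive when n is in the billions.
    void normalize_naive(double* v, std::size_t n, double divisor) {
        for (std::size_t i = 0; i < n; ++i)
            v[i] /= divisor;               // n divisions
    }

    // Hoisted: one division total, then cheap multiplications.
    // (Not bit-identical under IEEE 754 rounding; check your tolerances.)
    void normalize_hoisted(double* v, std::size_t n, double divisor) {
        const double inv = 1.0 / divisor;  // 1 division
        for (std::size_t i = 0; i < n; ++i)
            v[i] *= inv;                   // n multiplications
    }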

So yes, there are still a lot of people worrying about performance, but a lot of them are coding in Fortran and this is definitely not sexy. Ok, now we also use CUDA, so this can go through the HN filter ;)


Seems to me we oftentimes also forget the "not so outspoken" devs, that just tinker away, doing their thing, without announcing every fart to the world.

People doing great work, without putting themselves into the spotlight.


HN culture doesn't merely forget about them - in fact, it aggressively excludes them.

Brilliant assembler programmer who likes to finish up at the end of the day and play with your kids? Fuck you, where's your GitHub profile.

Developing novel algorithms and preferring to play guitar at home rather than drink at the pub? Fuck you, where are your conference talks?

etc, etc, etc. The broader culture of which HN is a microcosm spends a lot of time and effort devising filters to make people invisible if they don't fit a relentlessly self-promoting profile which has a questionable relationship with the quality of their work. I don't see Knuth and Cutler spending a lot of time on Twitter.


We cannot read what does not get written. We cannot discuss the merits and demerits of code that is not public.

I'm in full agreement that HN is an informational bubble. I'm not so clear on exactly what HN is doing to aggressively exclude people. And where you see exclusion, I see an attempt to encourage people to join the discussion: "hey, why don't you put this up on GitHub so we can take a look", "this would make such a good topic for a conference talk."


We programmers are in a strange place: almost all of our work is covered by copyright such that it can't be shown to other programmers. People can assess our output, but not look at our code. We can't build a portfolio and take it with us.


I spent the first twenty years of my career as an embedded systems engineer and there are two specific advantages I gained from that experience:

1) When you're writing code for an underpowered microcontroller, you have a plan before you start. This plan includes execution speed and memory usage targets.

2) When you're sending thousands (hundreds of thousands or millions) of non-reprogrammable devices into the field, you have a test plan (and comprehensive test suite) that guarantees correctness and performance.

Do you know your "machine"? For instance, if you're working with a language that runs on the JVM, have you read the JVM specification? (It's the equivalent of your microcontroller's data sheet.) The part of this manifesto I agree with is that you should thoroughly understand everything "beneath" your application.


>you have a test plan (and comprehensive test suite) that guarantees correctness and performance.

I value the work you do, but please don't imply that tests can guarantee correctness. Tests can never prove the absence of bugs! To guarantee correctness we need to treat programs as mathematical objects.


>The part of this manifesto I agree with is that you should thoroughly understand everything "beneath" your application.

Say someone (a full-stack developer if you want, or jack-of-all-trades-master-of-none) uses Photoshop and Illustrator but is also coding against Firefox, Chrome, and Safari, and their stack includes node.js, Express, Angular, and PostgreSQL.

At some point, you can't thoroughly understand every cycle of every rendering engine. (Which is what you've stated implies.)

It is quite literally impossible - as in, a physical impossibility - to "thoroughly understand everything 'beneath' your application." You just can't - there are not that many hours in the day. You would have no application left.

If you froze technology today you could thoroughly understand all of the mentioned technologies in 10 years. But by then there would be new technologies.

Instead, you just have to abstract it away, code against some framework that compiles down to javascript, and only understand that. You can't thoroughly understand every single thing your database engine is doing either.

Otherwise you can just never get anything done. The modern world is made up of applications whose combined reference materials total without exaggeration millions of pages of text. You just can't read a million pages of text.

This is completely different from a single microcontroller's architecture, which is basically a single layer. On the web, even your target is a collection of competing browsers.

Why should someone building a front-end site, but also using the basics of a database, take months out of his or her life to gain a deep understanding of every cycle of that database engine, or of what exactly it's doing?

What does it get them?

Cycles - even billions of cycles - are cheap.

It's like asking a farrier[1] (someone who cares for horses' hooves and then puts shoes on them) to learn all the intricacies of metallurgy, going all the way back to where metal is mined as ore. Oh, and since "a farrier combines some blacksmith's skills (fabricating, adapting, and adjusting metal shoes) with some veterinarian's skills (knowledge of the anatomy and physiology of the lower limb)", I suppose this farrier suddenly needs to be a complete veterinarian and really fully understand all layers of the horse?

It just doesn't work that way. At some point the farrier has to work with an abstraction, and not know what is going on at lower layers, and at some point the full stack developer has to just use an interface that compiles down to javascript that will access a database he or she doesn't know. While you may lament this, beautiful and functional sites have been built this way.

It's not fair to ask someone to fully understand everything. Even as a microcontroller coder, you didn't understand the circuitry in the microcontroller at an electrical level - you didn't even get the diagrams. You, too, worked with an abstraction.

[1] https://en.wikipedia.org/wiki/Farrier


I never said you had to deeply understand your tool-chain ... the use of Photoshop isn't at all relevant (though understanding the produced image format might be). PostgreSQL on the other hand becomes a lot more powerful the more you learn about how it works "beneath the covers". Of course there are limits ... I knew the timing of my microcontroller's signals but ignored details of the physics behind the microcontroller's implementation.

The first computer I built was a COSMAC Elf (based on an RCA 1802 microprocessor) - I didn't have billions of cycles so there was a limit to the work I could do. Now I do have billions of cycles and do my best to make sure there's a linear increase in the amount of work I can do.

<old-guy-voice>In the old days, we spent large amounts of time reading data-sheets, specifications and application notes - probably more than we spent writing software. What you're implying is that you're too impatient to do engineering but you're willing to be a programmer (or hacker).</old-guy-voice>

So I agree with the premise that we should be producing smaller, faster and less buggy software than we are. I don't think we need to know everything related to our systems - I certainly used ICE when available to avoid the code/burn/test cycle required when using EPROMs.

There is (of course) a finite limit to what we can learn, but that doesn't mean we shouldn't want to know everything. One important skill is being able to discern what you need to know, what you need to understand and what you can safely ignore - this is hard. Knowing is also a choice. I can learn more than most because I don't watch TV or waste inordinate amounts of time elsewhere (I have a few vices like HN).


The distinction you start with between stack and tool-chain is fair, so let's cut out photoshop and illustrator. However, the three mentioned browsers are certainly part of the stack, and the browsers are the ones executing the cycles of anything running in the front end, which these days is a lot. Nearly all projects are literally targeting all of these disparate browsers.

To cut a long story short: when you say "What you're implying is that you're too impatient to do engineering but you're willing to be a programmer (or hacker)" - this is correct. The only way for many people to get projects off the ground is not to engineer them, but just to throw them up.

By the by, in a couple of spare hours I actually toyed with the idea of formalizing this into a project where we would teach some valuable skill in 15 seconds. (I applied to a YC fellowship with it but wasn't selected.)

Here's my prototype - (it autoplays 15 seconds of sound, you've been warned.) http://iknowkungfu.meteor.com

I didn't write the tutorials that are up there now, and many of them are too long. But the idea is there: in a few seconds, you can often learn and incorporate something into your stack that you know next to nothing about.

Do this a few times and you have a complete web app serving dynamic, database-backed content - and you've still been able to focus on what you know rather than engineering.

There are 3 billion people out there who deserve to use some of the tools that are available. You, too, deserve to use some of the modern frameworks that are available. Without investing hours, days, weeks of your time into it. I know lots of technologies whose basic usage could be uploaded to someone's brain in 15 seconds of video. Git, to name one.

No, you won't understand how it works or why - but you can commit and roll back (reset), which is all anyone cares about until they start caring more. The advantage of using the git you can learn in 15 seconds, over copying files and continuously renaming them, is, simply put, astronomical.
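
For the curious, the 15-second version amounts to something like this (a minimal sketch; the commands are real git, the workflow is the bare minimum):

    git init                  # turn the current directory into a repository
    git add -A                # stage every file
    git commit -m "snapshot"  # record a snapshot you can return to
    # ... hack away, break things ...
    git reset --hard HEAD     # roll everything back to the last snapshot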

Many parts of the toolchain, and many layers of the stack, are quite similar. Who knows what cycles MongoDB is running? Who cares?


"Learn X in Y minutes" is an awesome concept! Let me know if you ever decide to run with it. I guess I don't fit the CoffeeScript is a hipster language profile very well but I still enjoy writing code in it ;)

I basically agree with everything you've said (in both posts) ... you can be successful while wasting cycles. And if you're working on a low-volume and/or internal only application, you'll probably never face the limits of a modern server.

If you need to operate "at web scale" [1], or run into an uncommon (or common) bug, you'll need to know more about the frameworks and systems your code relies on (e.g. MongoDB configuration for systems over 2GB [2]). Blog posts like the one referenced are completely unfair to those that developed MongoDB - read the manual and understand how MongoDB works OR use it at your own risk.

So I'll switch arguments and help you make your point. We have an application written in Oracle's Application Express - while we have extensive expertise in Oracle's database software, we have this one system which was completed for expediency's sake. It's kind of horrific but (mostly) works at the scale required. It would be financially foolish to dig more deeply into ApEx for this one dead-end application. Everyone is happy.

[1] https://www.youtube.com/watch?v=b2F-DItXtZs (audio NSFW)

[2] http://www.sarahmei.com/blog/2013/11/11/why-you-should-never...


Thanks for the reply! (I didn't actually build those tutorials - i.e., Learn X in Y is another person's site; like I said, I only spent a couple of hours on the concept of a site like this, and the current tutorials are external. I did add the time analysis.)

I like your final example - and remember, you guys are Oracle experts: you're the most qualified people on the planet to learn ApEx properly from scratch, even though you haven't.

Now switch gears and imagine a college student who just has an idea for some cool project, but barely codes in any language. This describes the computing needs of 3 billion people. They're not qualified to quickly become experts and engineers at anything. But they still have a computer in front of them that does a trillion operations every few minutes. The gulf between using that to surf facebook or building... anything at all, even very poorly, is immense. (Like git that you can learn in 15 seconds, versus manually copying and renaming files for version control.) Thanks for the encouragement.


You might be surprised at the depth of knowledge your average farrier has about both horses and blacksmithing. They might not have a degree in metallurgy, but you can bet they understand the crystalline structures of iron, and how to use that to their advantage. You can also bet on them knowing the hooves and related structures of a horse much better than your average vet.

The problem with your hypothetical worker is a lack of depth into any one topic. If they're half artist and half front-end developer and half back-end developer, of course they're not going to be able to get into great depth on any one topic.

I firmly believe in the value of such a Jack-of-all-trades, but I'd be silly to not value the full time artist who can create better works in a third of the time, because they understand their tools and their craft.


Yes, it takes three times as long for a jack of all trades to draw something that is a tenth as good as what a full-time artist can do in minutes.

A tenth as good isn't good enough. But on the web, for many parts of the stack, a tenth as good really is enough. Even burning three orders of magnitude more cycles than necessary is still good enough. You just don't need to know.


> Hacker News tends to forget that.

I'm not sure it knew it in the first place. It's particularly entertaining when the web dev crowd chimes in on embedded system topics.


I agree. The fact is that I can now build a database and front end as a team of one. Ten years ago that would have needed a team of maybe 5 developers. Abstractions slow things down at the computer level, but bring a great deal of developer productivity. As always, it's a tradeoff.


If this resonated with you, then check out Mechanical Sympathy [1], a mailing list all about writing code that is aware of how computers work and cooperates with them. Martin Thompson's blog of the same name also has a lot of good information [2].

To the people who say "Who is going to pay for this?": look at all of the places that can measure the cost of slow or inefficient computing, e.g. Wall St, Google, Amazon, etc. They all invest in writing low-level, efficient code where it makes sense.

[1]: https://groups.google.com/forum/#!forum/mechanical-sympathy [2]: http://mechanical-sympathy.blogspot.co.nz


The tradeoff is development speed. Sure, there is a whole lot of software written these days that isn't world class (In terms of anything really, not just execution speed) but at the same time a lot of it is software that is written at a reasonable price that generates value for businesses.

Not every piece of software is handcrafted work of art. Some of it just saves X person Z hours of time at some mundane task, and we are able to make that software for X person because the same 6 layers of abstraction that make it slow also make programming accessible enough for it to be affordable. Nothing really wrong with that.

But I guess I wouldn't begrudge someone choosing to develop handcrafted software, as long as they're not claiming that is the way everything should be.


This is a great point. I put a high value on writing efficient software, but you have to know when to do it!

For instance, the other day I wrote a script to download the O'Neill's tune collections in ABC [1]. I wrote it in Perl 6, not exactly known for blazing performance. (It was a three-line script, one line of which was just a closing bracket. [2]) If I'd carefully written it in C++, it would have taken me at least 20x as long to write, and it would no doubt have been substantially more "efficient". But it would not have been significantly faster in execution, because the slow part was using curl to download the ABC files.

[1] For instance, http://trillian.mit.edu/~jc/music/book/oneills/waifs/ [2] https://gist.github.com/colomon/ef8e4a8801d01b1d5813


Good luck getting people to pay for it. I personally love the artisanal mindset, but when you're trying to meet business goals, speed of execution trumps quality of implementation every day of the week. They even coined a phrase for it: "worse is better".

There's a good reason the industrial revolution demolished the old guild system.

When you prioritize speed of execution, you tend towards popular frameworks, which is the next line item on the typical client wish-list. They want to know that they don't have to keep an expensive pain-in-the-ass "rockstar" developer around just to maintain and build on it.

I suspect the only way we're going to get to make software the way we want to make it is if we also run the business using the software. Look at Patrick and Thomas.


> Good luck getting people to pay for it.

No luck needed. There's a metric ton of people who just crave well-crafted software. I have a project of that nature now and I am absolutely blown away by the amount of compliments and thank-yous sent our way. It's really quite something.


What I should have said was, "Good luck getting your freelance clients to pay for it." If you're running a software business, which I see you are, with a product that serves a need that people actually have, in a way that they actually need it to be solved, then sure, they'll appreciate craftsmanship.

Because you put the time and attention in before you put a price on it and asked them to pay for it. If you're asking them to value it before you put it in, then you're selling them Fort Knox when all they need is a bank.

It's not just about software, it's about business too. If you want to make software the way you want to make it, you better be willing to sell it yourself.

Freelancing allows you to make software and get paid for it without having to validate the business purpose behind the software. In many cases, the business doesn't need anything fancy, handing them a manifesto isn't going to score you any points.


I was thinking about your product as soon as I saw this.


> ... when you're trying to meet business goals, speed of execution trumps quality of implementation every day of the week.

This is a topic I've been mulling over in my head for quite a while, contemplating writing about it and/or trying to give talks/presentations on it (I just have no idea where that would possibly be welcomed/appropriate within the developer community).

I'm curious about where and how you work that you find this to so consistently be the case?

---

Over the years, I've worked with multiple kinds of businesses on custom, from-scratch software. I'm not talking websites; I mean software on which they depend for actual daily business/employee operations. Perhaps I've been extraordinarily lucky in the last 8 years, perhaps I filter out the rubbish, but I have yet to work with a single client that values speed of execution over quality of implementation. In fact, the longer I do this, and the more I get to intimately know my clients and their businesses, the more readily I find they happily pay for quality of implementation. I rarely find myself facing the prospect of earning less in order to provide speed of execution. They consistently pay a premium for quality implementation.

I don't negotiate on speed vs quality. Want speed? Cut features, not quality. Need a lower budget? Cut features, not quality.

I spend more time getting to deeply understand my clients, their businesses, and their goals than many other developers I personally know closely. I begin this on Day One, Meeting One--the first time I'm invited to discuss a project with a potential client.

I ask questions that get a client talking deeply about their business--what weaknesses/inefficiencies they're trying to solve with custom software, what goals they're trying to meet, what they've tried in the past, even what other developers/companies/options they've talked to or considered before me, and what has been offered/promised by my competitors. I drive the conversation to what kinds of business/operational features they're looking to automate/enhance with software, what benefits they're hoping to experience, what value they think software will bring their business.

I always take copious notes. I try to keep the client doing most of the talking. I probe and dig when necessary to really understand what they're trying to accomplish. Within a single meeting, I work to establish a base-level of trust by way of focusing the entire conversation on making sure my client knows they are heard and their needs/goals are understood. When I do take the floor and speak at length, it's to echo back what I've heard, what notes I'm taking, how I understand their needs, how I think we can solve those needs.

The first meeting is usually a long one. I never watch the time. I don't wear a watch. I don't glance at my phone. I get the client talking and keep them talking. I remain mindful of who is quiet in the room, and try to actively engage them in the conversation. I want the whole room in on the discussion, and I want them all feeling like they're part of this thing.

I say we a lot. By the end of an initial meeting, I've usually established an impression and expectation that I'm invested. And it's not bullshit. For me, at least, it's really damn genuine. I want the client to know that we can solve X, Y, and Z in a way that's going to bring great value to the company, and it's going to be of the utmost quality.

I never talk about speed. I rarely even discuss rates in the first meeting. I'm also rarely asked about them by the end of the meeting.

I usually close out these initial meetings with a brief outline of the process we will follow moving forward creating software to improve the business. I explain how I'm going to be doing the heavy lifting of distilling all of our conversation and my notes into a document that highlights and maps out all we've discussed as a complete guide to everything we think is important to helping the company.

Clients don't typically think in terms of features. They think about efficiencies, cost savings, goals, plans, ideas, and above all, value. I never ask them to give me a list of features to build. That's my job to provide them, based on understanding their business and its needs completely. If I can't take the time to do that properly, we're never even going to be worried about speed of execution vs quality of implementation. The project's going to be a disaster from the start.

We get back together some days later, after I've mapped out everything from our discussion into a semi-technical feature spec & roadmap. Everything is broken down into constituent parts. I include explanations that educate them on the process of building their software. I go to great lengths to make sure they understand technical limitations/expectations. I don't ever bullshit them or myself. I've taken to ranking things with a loose grading of technical difficulty and time required for different components of the software. I am always priming them for the price tag without ever talking about it.

When we get to the part of talking about the price, it's never awkward. They've been primed. I've earned their trust by being completely open and honest, and by listening and paying attention to their real needs. They see me as an extension of the company. We're in this together and $X is what it's going to take to get the value the company needs out of the software. When the price is too high, I walk them through prioritizing the features/components into a list of things they want to have now until it fits into their budget for "this phase". I never offer a choice of speed over quality.

> They want to know that they don't have to keep an expensive pain-in-the-ass "rockstar" developer around just to maintain and build on it.

My clients typically want to know they can keep me around forever to maintain and build on whatever I do for them.

From where I sit, it's all about earning trust. Maybe I'm just lucky. Maybe I work with a completely different set of clients. I just can't imagine having to deal with a client who seriously stresses speed over quality.

> I suspect the only way we're going to get to make software the way we want to make it is, if we also run the business using the software.

Or, if we build up enough trust that a client views us as an equal, invested in the business using the software. My clients practically treat me as if I was a partner in their business.

At the very least, there's really no luck involved here. There are better clients out there. Maybe it's time to start rejecting shitty clients.


If you are ever looking for an experienced programmer as a partner, I would definitely see myself embracing such a philosophy.


I'm reminded of the first industrial products. Handmade meant crummy and for poor people, whereas something from a production belt meant precision-engineered. But now that everyone can afford, say, a quality solid-state amplifier, suddenly it's hip to have the expensive point-to-point tube amp again. It's really about status and conspicuous consumption, but the snobbery is hidden behind notions of quality and authenticity.

Now let's circle back to software. High-level programming languages, interpreted languages, libraries, frameworks... all used to be really cool. Until people realized that now everyone can learn how to code, so there's a need for something to separate the elite from the plebeians, something to separate the software engineers from the script kiddies... how about handmade programming, maybe that'll do the trick?


What does this even mean? Going back to hand-unrolling loops in assembler when writing business applications?

The very smart people working on the layers this manifesto bemoans already do that for us.


I can relate. I remember working with Turbo/Borland Pascal on an 80286. To this day it seems faster to me than any C++ compiler I use today on a Core i7. Granted, it is apples and oranges, but the difference in clock frequency alone is a freaking 500x. The reason is that TP was deliberately _designed_ to run as fast as possible: a simple language to parse, a monolithic single-pass compiler, compilation to memory, a cleverly designed file format. We have become complacent, and hardly anyone does that kind of architecting with a ruthless focus on speed and simplicity anymore.


The 'Handmade' part of this is (knowing the author) a reference to Casey Muratori's Handmade Hero [1], where he builds a game 'from scratch'. The game is written in C++, using nothing but the base OS libs (in his case, Win32), minimal reliance on the CRT, etc.

I'd guess they're talking about building programs with a level of care comparable to that project, meaning high performance memory allocation, no interpreted languages, high priority on fast execution and compile times, etc.

[1] https://handmadehero.org/


> Casey Muratori's Handmade Hero

I hadn't heard about that - very nice! I'm starting a little 3D game engine from scratch in C++ and OpenGL, just for fun. I'm sure I'm going to learn a lot watching these videos.


He talks about abstraction layers, yes - although he's probably not arguing for assembly so much as for fewer stacks on services on stacks on...

But it seems like the main theme is programming for the actual machine rather than the "frictionless vacuum" target machine that is oh-so-easy to assume. Like one of my favorite presentations, where a PS3 developer team details how simple choices in data structure arrangement got them orders-of-magnitude speedups because of how different arrangements interplay with the CPU caches. In OO languages, it's easy to write a pathological cache virus.


Let me guess: SoA instead of AoS? That's a classic optimisation; there's even an annotation for it in 'Jay' (well, there will be when it exists: https://www.youtube.com/user/jblow888)
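
For anyone who hasn't met the acronyms, a minimal sketch of the two layouts (hypothetical particle struct and field names, illustration only):

    #include <cstddef>
    #include <vector>

    // Array of Structures (AoS): each particle's fields sit together, so a
    // pass that touches only x drags y/z/mass through the cache as well.
    struct ParticleAoS { float x, y, z, mass; };

    void advance_aos(std::vector<ParticleAoS>& ps, float dt) {
        for (std::size_t i = 0; i < ps.size(); ++i)
            ps[i].x += dt;      // uses 4 bytes of every 16 loaded
    }

    // Structure of Arrays (SoA): each field is contiguous, so the same pass
    // streams through memory and every byte of every cache line is used.
    struct ParticlesSoA { std::vector<float> x, y, z, mass; };

    void advance_soa(ParticlesSoA& ps, float dt) {
        for (std::size_t i = 0; i < ps.x.size(); ++i)
            ps.x[i] += dt;
    }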


Yes, that, but they also go beyond it. I found the presentation today:

http://harmful.cat-v.org/software/OO_programming/_pdf/Pitfal...




I was thinking the same thing when I finished. How deep should I dig? At what level is it 'handmade' enough?


At the magnetized needle level, I think:

https://xkcd.com/378/



I understand why unrolling loops is faster on current hardware, but ought it to be so? Maybe we are putting a lot of complexity into software to cope with limitations of hardware that might no longer exist in a couple of generations (of hardware).
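
For concreteness, a minimal sketch of what manual unrolling buys (hypothetical sum; the separate accumulators break the serial dependency chain, which is the hardware limitation in question):

    #include <cstddef>

    // Summing with a 4x unrolled loop. Four independent accumulators let
    // additions overlap in the pipeline instead of serialising on one total.
    // (Reassociating floating-point sums can change rounding slightly.)
    double sum_unrolled(const double* v, std::size_t n) {
        double s0 = 0, s1 = 0, s2 = 0, s3 = 0;
        std::size_t i = 0;
        for (; i + 4 <= n; i += 4) {
            s0 += v[i];
            s1 += v[i + 1];
            s2 += v[i + 2];
            s3 += v[i + 3];
        }
        for (; i < n; ++i)      // leftover elements
            s0 += v[i];
        return (s0 + s1) + (s2 + s3);
    }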


Hardware innovation in this area is, if anything, slowing down. Fundamentally we have to acknowledge that a computer is itself a distributed system and that every action that crosses a bus or disrupts a pipeline incurs a latency penalty.


I saw this come up during the handmade hero screencast yesterday, and I appreciate the sentiment a lot. Although I have a feeling that Casey and his crowd would disagree, I believe this initiative touches only part of the problem, and worse, it could be interpreted as absolving programmers working at higher levels of abstraction from doing their part.

Knowing what happens behind the scenes when you write a line of code is important. Question the decisions made by frameworks and libraries, and question whether you actually need them. Keep the stack minimal and thin. Choose the right tool for the job, and be knowledgeable of its inner workings. Resist replacing tradecraft with ritual, especially when programming in groups.

These can (and should, in my opinion) also apply to web development, a huge category that seems to be all but despised and written off by the handmade people.


At a visceral level, I really love the concept. I've created my own game engines (before the age of ubiquitous, well-tested free engines), and I understand how to optimize code all the way down to counting cycles (which used to be easier).

At a practical level, though, I'm really, really sick of people creating Yet Another Framework. NIH is not something that needs encouragement.

And this "manifesto" will be used to justify NIH. It may not be the point of the authors, but people will absolutely point to it when they want to create something that's been done well already.

I totally agree that the right idea is to use exactly the tools for the job. Don't throw every new framework into your app and expect it to be fast enough. "Minimal and thin" is the right answer.

I've been doing a lot of JavaScript work recently, for example, and I often visit MicroJS [1] when I need something. Instead of the 100k+ of jQuery & plugins I can often find one library that's 4k and another that's 6k that together do all I need. If you need most of jQuery, ZeptoJS [2] is often a good answer.

That said, sometimes ecosystems are powerful. When you need nontrivial functionality, and the best options have a dependency on jQuery, it might still make sense to include jQuery. Development time still counts: Spending an extra 30 hours to reinvent a component is great if you're spending your own time on it, but when you're billing by the hour, it's a lot harder to justify if that component is available off the shelf and Just Works in about 5 minutes.

It's all about compromises. I do think that it's common to see developers throw in a component if it's even slightly useful, leading to the web of today where a single page might download and initialize 40-60 different libraries. It would be healthy if the pendulum would swing back in the other direction. But we really don't need 300 ways to do every single thing in JavaScript.

[1] http://microjs.com/#

[2] http://zeptojs.com/


While this doesn't seem to be very well expressed, I think the sentiment behind this is worth discussion.

Computers do waste a huge amount of resources in the name of expediency. They are slow to do things because it is simply cheaper to develop that way.

I think there is scope for the sort of crafted code that is described here though. Not only as an art form but as a practical measure.

For any small piece of code, a compiler can produce a result comparable to hand-written assembly. Entire programs that have been written in assembly, though, tend to be much smaller, use less memory and run faster. That apparent contradiction is where the focus of crafted code should be. Most of the gains from hand-crafted assembly do not come from finding the quickest way to do a simple calculation, but from the additional attention that gets applied. It wouldn't surprise me if the majority of the gain comes not from instruction-level optimisation, but from calculation-level optimisation.

For example, how do you find the last word in a string? Have a look on StackOverflow and you will see examples in many languages that are equivalent to:

    var lastword = fullString.split(" ").pop();
The work done by this operation is easier and easier to overlook as things get higher level.
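
For contrast, a minimal sketch of the calculation-level version in C++ (illustration only; function name made up, whitespace handling deliberately simplistic). It scans from the end rather than building an array of every word:

    #include <string>

    // Find the last space and take everything after it: no allocation of an
    // intermediate array holding every word in the string.
    std::string last_word(const std::string& s) {
        const auto pos = s.find_last_of(' ');
        return (pos == std::string::npos) ? s : s.substr(pos + 1);
    }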

I have often wondered about the merits of writing a virtual assembly instruction set that is platform-agnostic (so it compiles/transliterates to native) to explore this idea. It wouldn't be like JVM bytecode, which implements higher-level magic, and it wouldn't be like LLVM, which is not aimed at being human-writable. Some existing RISC instruction sets might be close to what is needed. There would be potential to write things in multiple instruction sets: if the interfaces between code sections are well defined (stack/register/memory passing), you could write modular code in a preferred instruction set.

This isn't practical for many applications, but I'm not convinced it doesn't have a niche.
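
As a toy illustration of the idea (an entirely hypothetical three-op instruction set; a real design would need loads, stores, and branches):

    #include <array>
    #include <cstdint>
    #include <vector>

    // Toy register machine: a handful of RISC-like ops that could, in
    // principle, be transliterated one-for-one to native instructions.
    enum class Op : std::uint8_t { LoadImm, Add, Mul };

    struct Insn { Op op; std::uint8_t dst, a, b; std::int32_t imm; };

    std::int32_t run(const std::vector<Insn>& prog) {
        std::array<std::int32_t, 16> r{};          // 16 registers, zeroed
        for (const Insn& in : prog) {
            switch (in.op) {
                case Op::LoadImm: r[in.dst] = in.imm;            break;
                case Op::Add:     r[in.dst] = r[in.a] + r[in.b]; break;
                case Op::Mul:     r[in.dst] = r[in.a] * r[in.b]; break;
            }
        }
        return r[0];
    }

    // (3 + 4) * 2 == 14:
    // run({{Op::LoadImm, 1, 0, 0, 3}, {Op::LoadImm, 2, 0, 0, 4},
    //      {Op::Add, 0, 1, 2, 0},     {Op::LoadImm, 3, 0, 0, 2},
    //      {Op::Mul, 0, 0, 3, 0}});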


If this strikes a chord you might also be interested in data-oriented design[0]. It's the primary philosophy behind Casey Muratori's Handmade Hero[1] which is where, I assume, the reference in the manifesto's title comes from. You should also watch Mike Acton's talk at CppCon[2].

It's really about giving up programming-by-folklore. It's about understanding the problem so that you understand the data transformations your program has to make given the hardware it will operate on. It's about doing the maths and following the numbers instead of modelling things in terms of abstractions. It's about letting go of the idea that our job is to write beautiful code.

Our job is to solve problems. The problem is a human one. The computer transforms data to solve our problem. The code only needs to do those transformations which solve the problem.

It's a philosophy I've picked up in part thanks to Mike Acton and Casey. I mostly work in dynamic languages but it has already helped immensely at keeping my code simple and forcing me to think about the data. This has a huge impact on writing fast, simple code.

[0] http://dataorienteddesign.com/site.php [1] https://handmadehero.org/ [2] https://www.youtube.com/watch?v=rX0ItVEVjHc


I write media stuff. Code in the media pipeline proper gets ultraoptimized because it gets run between 30 and ~10k times per second. Code outside of the pipeline is largely written in python because the extra startup time, while non-trivial, is noise compared to the time waiting for PSI to be delivered.

The point is that "computers are slow" is, at least in some part, because "just works" requires a mind-numbing number of operations.


In Donald Knuth's paper "Structured Programming with go to Statements", he wrote: "Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%." (http://c2.com/cgi/wiki?PrematureOptimization)


I don't think we have to give up libraries and interpreted languages. Just stop writing shit code because your boss needs it done in 5 minutes. Of course, that's our bosses' fault, not ours. And it's a symptom of a larger cultural attitude that values short-term profits over long-term sustainability. I say, why bother!


I should make software by hand, rather than typing with my feet. Good idea.


Stable problems get high quality solutions.


I agree and disagree. I disagree because one of the main reasons why "the computer is slow" is the HDD: use an SSD instead for your main storage and it makes a world of difference (unless you use bloated websites).

I agree because I remember BeOS: on the same computer it was much, much more responsive than Windows or Linux.

It's probably much more practical to buy an SSD than to hope to replace all your bloated software with leaner alternatives.



