I personally do computer graphics in video games. We still worry about every byte, cycle, cache miss, and instruction that we write. We still spend days trying to eke out another 100 microseconds from that tricky piece of code.
Not every software developer is a web/business developer. Hacker News tends to forget that.
So yes, there are still a lot of people worrying about performance, but a lot of them are coding in Fortran and this is definitely not sexy. Ok, now we also use CUDA, so this can go through the HN filter ;)
People doing great work, without putting themselves into the spotlight.
Brilliant assembler programmer who likes to finish up at the end of the day and play with your kids? Fuck you, where's your GitHub profile.
Developing novel algorithms and prefer playing guitar at home to drinking at the pub? Fuck you, where's your conference talks.
etc, etc, etc. The broader culture of which HN is a microcosm spends a lot of time and effort devising filters to make people invisible if they don't fit a relentlessly self-promoting profile which has a questionable relationship with the quality of their work. I don't see Knuth and Cutler spending a lot of time on Twitter.
I'm in full agreement that HN is an informational bubble. I'm not so clear on exactly what HN is doing to aggressively exclude people. And where you see exclusion, I see an attempt to encourage people to join the discussion: "hey, why don't you put this up on GitHub so we can take a look", "this would make such a good topic for a conference talk."
1) When you're writing code for an underpowered microcontroller, you have a plan before you start. This plan includes execution speed and memory usage targets.
2) When you're sending thousands (hundreds of thousands or millions) of non-reprogrammable devices into the field, you have a test plan (and comprehensive test suite) that guarantees correctness and performance.
Do you know your "machine"? For instance, if you're working with a language that runs on the JVM have you read the JVM specification? (It's the equivalent of your microcontroller's data-sheet). The part of this manifesto I agree with is that you should thoroughly understand everything "beneath" your application.
I value the work you do, but please don't imply that tests can guarantee correctness. Tests can never prove the absence of bugs! To guarantee correctness we need to treat programs as mathematical objects.
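As a toy illustration of that difference (in Lean 4, assuming the standard library's `List.reverse_reverse` lemma): a test only checks the cases you happened to write down, while a proof covers every possible input at once.

```lean
-- A test can only ever sample inputs:
--   #eval [1, 2, 3].reverse.reverse == [1, 2, 3]  -- one case
-- A proof establishes the property for *all* lists of *any* type:
theorem rev_rev (α : Type) (l : List α) : l.reverse.reverse = l :=
  List.reverse_reverse l
```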
Say someone (a full-stack developer if you want, or jack-of-all-trades-master-of-none) uses Photoshop and Illustrator but also is coding against Firefox, Chrome, and Safari, and their stack includes node.js, Express, Angular, and PostgreSQL.
At some point, you can't thoroughly understand every cycle of every rendering engine. (Which is what you've stated implies.)
It is quite literally impossible - as in, a physical impossibility - to "thoroughly understand everything 'beneath' your application." You just can't - there are not that many hours in the day. You would have no application left.
If you froze technology today you could thoroughly understand all of the mentioned technologies in 10 years. But by then there would be new technologies.
Otherwise you can just never get anything done. The modern world is made up of applications whose combined reference materials total without exaggeration millions of pages of text. You just can't read a million pages of text.
This is completely different from a single microcontroller's architecture, which is basically a single layer. On the web, even your target is a collection of competing browsers.
Why should someone building a front-end site, but also using the basics of a database, somehow take months out of his or her life to gain a deep understanding of every cycle of that database engine, or of what exactly it's doing?
What does it get them?
Cycles - even billions of cycles - are cheap.
It's like asking a farrier (someone who cares for horses' hooves and then puts shoes on them) to learn all the intricacies of metallurgy, really going back to where metal is mined as ore. Oh, and since "A farrier combines some blacksmith's skills (fabricating, adapting, and adjusting metal shoes) with some veterinarian's skills (knowledge of the anatomy and physiology of the lower limb)", I suppose this farrier suddenly needs to be a complete veterinarian and really fully understand all layers of the horse?
It's not fair to ask someone to fully understand everything. Even as a microcontroller coder, you didn't understand the circuitry in the microcontroller at an electrical level - you didn't even get the diagrams. You, too, worked with an abstraction.
The first computer I built was a COSMAC Elf (based on an RCA 1802 microprocessor) - I didn't have billions of cycles so there was a limit to the work I could do. Now I do have billions of cycles and do my best to make sure there's a linear increase in the amount of work I can do.
<old-guy-voice>In the old days, we spent large amounts of time reading data-sheets, specifications and application notes - probably more than we spent writing software. What you're implying is that you're too impatient to do engineering but you're willing to be a programmer (or hacker).</old-guy-voice>
So I agree with the premise that we should be producing smaller, faster and less buggy software than we are. I don't think we need to know everything related to our systems - I certainly used ICE when available to avoid the code/burn/test cycle required when using EPROMs.
There is (of course) a finite limit to what we can learn, but that doesn't mean we shouldn't want to know everything. One important skill is being able to discern what you need to know, what you need to understand and what you can safely ignore - this is hard. Knowing is also a choice. I can learn more than most because I don't watch TV or waste inordinate amounts of time elsewhere (I have a few vices like HN).
To cut a long story short: when you say "What you're implying is that you're too impatient to do engineering but you're willing to be a programmer (or hacker)" - this is correct. The only way for many people to get projects off the ground is not to engineer them, but just to throw them up.
By the by, in a couple of spare hours I actually toyed with the idea of formalizing this into a project where we would teach some valuable skill in 15 seconds. (I applied to a YC fellowship with it but wasn't selected.)
Here's my prototype - (it autoplays 15 seconds of sound, you've been warned.) http://iknowkungfu.meteor.com
I didn't write the tutorials that are up there now, and many of them are too long. But the idea is there: in a few seconds, you can often learn and incorporate something into your stack that you know next to nothing about.
Do this a few times and you have a complete web app serving dynamic, database-backed content - and you've still been able to focus on what you know rather than engineering.
There are 3 billion people out there who deserve to use some of the tools that are available. You, too, deserve to use some of the modern frameworks that are available. Without investing hours, days, weeks of your time into it. I know lots of technologies whose basic usage could be uploaded to someone's brain in 15 seconds of video. Git, to name one.
No, you won't understand how it works or why - but you can commit and roll back (reset), which is all anyone cares about until they start caring more. The advantage of the git you can learn in 15 seconds, over copying files and continuously renaming them, is, simply put, astronomical.
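That 15-second git workflow might look something like this (a minimal sketch; the file name and commit messages are just placeholders):

```shell
# Start tracking a project (run inside your project directory)
git init
echo "hello" > notes.txt
git add -A
git commit -m "first version"

# ...edit files, then save another snapshot...
echo "hello again" > notes.txt
git add -A
git commit -m "second version"

# Made a mess? Roll the working tree back to the previous snapshot.
git reset --hard HEAD~1
cat notes.txt   # back to "hello"
```

That's the entire commit-and-roll-back loop - no branches, no remotes, no understanding of the object model required until you want it.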
Many parts of the toolchain, and many layers of the stack, are quite similar. Who knows what cycles MongoDB is running? Who cares?
I basically agree with everything you've said (in both posts) ... you can be successful while wasting cycles. And if you're working on a low-volume and/or internal only application, you'll probably never face the limits of a modern server.
If you need to operate "at web scale", or run into an uncommon (or common) bug, you'll need to know more about the frameworks and systems your code relies on (e.g. MongoDB configuration for systems over 2GB). Blog posts like the one referenced are completely unfair to those that developed MongoDB - read the manual and understand how MongoDB works OR use it at your own risk.
So I'll switch arguments and help you make your point. We have an application written in Oracle's Application Express - while we have extensive expertise in Oracle's database software, we have this one system which was completed for expediency's sake. It's kind of horrific but (mostly) works at the scale required. It would be financially foolish to dig more deeply into ApEx for this one dead-end application. Everyone is happy.
 https://www.youtube.com/watch?v=b2F-DItXtZs (audio NSFW)
I like your final example - and remember, you guys are Oracle experts: you're the most qualified people on the planet to learn ApEx properly from scratch, even though you haven't.
Now switch gears and imagine a college student who just has an idea for some cool project, but barely codes in any language. This describes the computing needs of 3 billion people. They're not qualified to quickly become experts and engineers at anything. But they still have a computer in front of them that does a trillion operations every few minutes. The gulf between using that to surf facebook or building... anything at all, even very poorly, is immense. (Like git that you can learn in 15 seconds, versus manually copying and renaming files for version control.) Thanks for the encouragement.
The problem with your hypothetical worker is a lack of depth in any one topic. If they're half artist and half front-end developer and half back-end developer, of course they're not going to be able to get into great depth on any one topic.
I firmly believe in the value of such a Jack-of-all-trades, but I'd be silly to not value the full time artist who can create better works in a third of the time, because they understand their tools and their craft.
You might think a tenth as good isn't good enough. But on the web, for many parts of the stack, a tenth as good really is enough. Even wasting cycles by three orders of magnitude is still good enough. You just don't need to know.
I'm not sure it knew it in the first place. It's particularly entertaining when the web dev crowd chimes in on embedded system topics.
To the people who say "Who is going to pay for this?": look at all of the places that can measure the cost of slow or inefficient computing, e.g. Wall St, Google, Amazon, etc. They all invest in writing low-level, efficient code where it makes sense.
Not every piece of software is a handcrafted work of art. Some of it just saves person X some Z hours of time at some mundane task, and we are able to make that software for person X because the same 6 layers of abstraction that make it slow also make programming accessible enough for it to be affordable. Nothing really wrong with that.
But I guess I wouldn't begrudge someone choosing to develop handcrafted software, as long as they're not claiming that is the way everything should be.
For instance, I wrote a script the other day to download the O'Neill's tune collections in ABC. I wrote it in Perl 6, not exactly known for blazing performance. (It was a three-line script, one line of which was just a closing bracket.) If I'd carefully written it in C++, it would have taken me at least 20x as long to write, and it would no doubt have been substantially more "efficient". But it would not have been significantly faster in execution, because the slow part was using curl to download the ABC files.
 For instance, http://trillian.mit.edu/~jc/music/book/oneills/waifs/
There's a good reason the industrial revolution demolished the old guild system.
When you prioritize speed of execution, you tend towards popular frameworks, which is the next line item on the typical client wish-list. They want to know that they don't have to keep an expensive pain-in-the-ass "rockstar" developer around just to maintain and build on it.
I suspect the only way we're going to get to make software the way we want to make it is if we also run the business using the software. Look at Patrick and Thomas.
No luck needed. There's a metric ton of people who just crave well-crafted software. I have a project of that nature now and I am absolutely blown away by the amount of compliments and thank-yous sent our way. It's really quite something.
Because you put the time and attention in before you put a price on it and asked them to pay for it. If you're asking them to value it before you put it in, then you're selling them Fort Knox when all they need is a bank.
It's not just about software, it's about business too. If you want to make software the way you want to make it, you better be willing to sell it yourself.
Freelancing allows you to make software and get paid for it without having to validate the business purpose behind the software. In many cases, the business doesn't need anything fancy, and handing them a manifesto isn't going to score you any points.
This is a topic I've been mulling over in my head for quite a while, contemplating writing about it and/or trying to give talks/presentations on it (I just have no idea where that would possibly be welcomed/appropriate within the developer community).
I'm curious: where and how do you work that you find this to be so consistently the case?
Over the years, I've worked with multiple kinds of businesses on custom, from-scratch software. I'm not talking websites; I mean software on which they depend for actual daily business/employee operations. Perhaps I've been extraordinarily lucky in the last 8 years, perhaps I filter out the rubbish, but I have yet to work with a single client that values speed of execution over quality of implementation. In fact, the longer I do this, and the more I get to intimately know my clients and their businesses, the easier I find they happily pay more for quality of implementation. I rarely find myself facing earning less to provide speed of execution. They consistently pay a premium for quality implementation.
I don't negotiate on speed vs quality. Want speed? Cut features, not quality. Need a lower budget? Cut features, not quality.
I spend more time getting to deeply understand my clients, their businesses, and their goals than many other developers I personally know closely. I begin this on Day One, Meeting One--the first time I'm invited to discuss a project with a potential client.
I ask questions that get a client talking deeply about their business--what weaknesses/inefficiencies they're trying to solve with custom software, what goals they're trying to meet, what they've tried in the past, even what other developers/companies/options they've talked to or considered before me, and what has been offered/promised by my competitors. I drive the conversation to what kinds of business/operational features they're looking to automate/enhance with software, what benefits they're hoping to experience, what value they think software will bring their business.
I always take copious notes. I try to keep the client doing most of the talking. I probe and dig when necessary to really understand what they're trying to accomplish. Within a single meeting, I work to establish a base-level of trust by way of focusing the entire conversation on making sure my client knows they are heard and their needs/goals are understood. When I do take the floor and speak at length, it's to echo back what I've heard, what notes I'm taking, how I understand their needs, how I think we can solve those needs.
The first meeting is usually a long one. I never watch the time. I don't wear a watch. I don't glance at my phone. I get the client talking and keep them talking. I remain mindful of who is quiet in the room, and try to actively engage them in the conversation. I want the whole room in on the discussion, and I want them all feeling like they're part of this thing.
I say we a lot. By the end of an initial meeting, I've usually established an impression and expectation that I'm invested. And it's not bullshit. For me, at least, it's really damn genuine. I want the client to know that we can solve X, Y, and Z in a way that's going to bring great value to the company, and it's going to be of the utmost quality.
I never talk about speed. I rarely even discuss rates in the first meeting. I'm also rarely asked about them by the end of the meeting.
I usually close out these initial meetings with a brief outline of the process we will follow moving forward creating software to improve the business. I explain how I'm going to be doing the heavy lifting of distilling all of our conversation and my notes into a document that highlights and maps out all we've discussed as a complete guide to everything we think is important to helping the company.
Clients don't typically think in terms of features. They think about efficiencies, cost savings, goals, plans, ideas, and above all, value. I never ask them to give me a list of features to build. That's my job to provide them, based on understanding their business and its needs completely. If I can't take the time to do that properly, we're never even going to be worried about speed of execution vs quality of implementation. The project's going to be a disaster from the start.
We get back together some days later, after I've mapped out everything from our discussion into a semi-technical feature spec & roadmap. Everything is broken down into constituent parts. I include explanations that educate them on the process of building their software. I go to great lengths to make sure they understand technical limitations/expectations. I don't ever bullshit them or myself. I've taken to ranking things with a loose grading of technical difficulty and time required for different components of the software. I am always priming them for the price tag without ever talking about it.
When we get to the part of talking about the price, it's never awkward. They've been primed. I've earned their trust by being completely open and honest, and by listening and paying attention to their real needs. They see me as an extension of the company. We're in this together and $X is what it's going to take to get the value the company needs out of the software. When the price is too high, I walk them through prioritizing the features/components into a list of things they want to have now until it fits into their budget for "this phase". I never offer a choice of speed over quality.
> They want to know that they don't have to keep an expensive pain-in-the-ass "rockstar" developer around just to maintain and build on it.
My clients typically want to know they can keep me around forever to maintain and build on whatever I do for them.
From where I sit, it's all about earning trust. Maybe I'm just lucky. Maybe I work with a completely different set of clients. I just can't imagine having to deal with a client who seriously stresses speed over quality.
> I suspect the only way we're going to get to make software the way we want to make it is, if we also run the business using the software.
Or, if we build up enough trust that a client views us as an equal, invested in the business using the software. My clients practically treat me as if I was a partner in their business.
At the very least, there's really no luck involved here. There are better clients out there. Maybe it's time to start rejecting shitty clients.
Now let's circle back to software. High-level programming languages, interpreted languages, libraries, frameworks... all used to be really cool. Until people realized that now everyone can learn how to code, so there's a need for something to separate the elite from the plebeians, something to separate the software engineers from the script kiddies... how about handmade programming, maybe that'll do the trick?
The very smart people working on the layers this manifesto bemoans already do that for us.
I'd guess they're talking about building programs with a level of care comparable to that project, meaning high performance memory allocation, no interpreted languages, high priority on fast execution and compile times, etc.
Didn't hear about that, very nice! I'm starting a little 3D game engine from scratch in C++ and OpenGL, just for fun. I'm sure I'm going to learn a lot watching these videos.
But it seems like the main theme is programming to the actual machine rather than the "frictionless vacuum" target machine that is oh-so-easy to assume. Like one of my favorite presentations, where a PS3 developer team details how simple choices in data structure arrangement got them orders-of-magnitude speedups because of how different arrangements interplay with the CPU caches. In OO languages, it's easy to end up with pathologically cache-hostile code.
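That data-layout idea can be sketched even in JavaScript (the particle fields and counts here are invented for illustration): an "array of structures" scatters each record across the heap, while a "structure of arrays" using typed arrays keeps each hot field contiguous, which is the cache-friendly layout those talks advocate.

```javascript
// "Array of structures": each particle is a separate heap object,
// so a pass over the x values chases pointers all over memory.
const aos = [];
for (let i = 0; i < 1000; i++) {
  aos.push({ x: i, y: 2 * i, vx: 0, vy: 0, name: "p" + i });
}
let sumAos = 0;
for (const p of aos) sumAos += p.x;

// "Structure of arrays": all x values sit contiguously in one
// typed array, so the same pass streams through memory linearly.
const xs = new Float64Array(1000);
const ys = new Float64Array(1000);
for (let i = 0; i < 1000; i++) { xs[i] = i; ys[i] = 2 * i; }
let sumSoa = 0;
for (let i = 0; i < 1000; i++) sumSoa += xs[i];

// Same answer either way; the difference is memory traffic, not logic.
console.log(sumAos === sumSoa); // true
```

The logic is identical in both halves; the payoff is purely in how the data meets the cache, which is exactly the point about the "actual machine" versus the frictionless-vacuum one.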
Knowing what happens behind the scenes when you write a line of code is important. Question the decisions made by frameworks and libraries, and question whether you actually need them. Keep the stack minimal and thin. Choose the right tool for the job, and be knowledgeable of its inner workings. Resist replacing tradecraft with ritual, especially when programming in groups.
These can (and should, in my opinion) also apply to web development, a huge category that seems to be all but despised and written off by the handmade people.
At a practical level, though, I'm really, really sick of people creating Yet Another Framework. NIH is not something that needs encouragement.
And this "manifesto" will be used to justify NIH. It may not be the point of the authors, but people will absolutely point to it when they want to create something that's been done well already.
I totally agree that the right idea is to use exactly the tools for the job. Don't throw every new framework into your app and expect it to be fast enough. "Minimal and thin" is the right answer.
That said, sometimes ecosystems are powerful. When you need nontrivial functionality, and the best options have a dependency on jQuery, it might still make sense to include jQuery. Development time still counts: Spending an extra 30 hours to reinvent a component is great if you're spending your own time on it, but when you're billing by the hour, it's a lot harder to justify if that component is available off the shelf and Just Works in about 5 minutes.
Computers do waste a huge amount of resources in the name of expediency. They are slow to do things because it is simply cheaper to develop that way.
I think there is scope for the sort of crafted code that is described here though. Not only as an art form but as a practical measure.
For any small piece of code a compiler can produce a result comparable to hand-written assembly. Yet entire programs that have been written in assembly tend to be much smaller, use less memory and run faster. That apparent contradiction is where the focus of crafted code should be. Most of the gains from hand-crafted assembly do not come from finding the quickest way to do a simple calculation, but from the additional attention that gets applied. It wouldn't surprise me if the majority of the gain comes not from instruction-level optimisation, but from calculation-level optimisation.
For example, how do you find the last word in a string? Have a look on Stack Overflow and you will see examples in many languages that are equivalent to:
var lastword = fullString.split(" ").pop();
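That one-liner splits the entire string and allocates an array holding every word, just to keep the last one. The calculation-level optimisation is to scan from the end instead - same result, far less work (a sketch; the function name is mine):

```javascript
// Avoids building an intermediate array of every word:
// find the last space and slice from there.
function lastWord(s) {
  const i = s.lastIndexOf(" ");
  return i === -1 ? s : s.slice(i + 1);
}

console.log(lastWord("the quick brown fox")); // "fox"
```

No instruction-level cleverness involved - the win comes entirely from doing less calculation, which is the point above.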
I have often wondered about the merits of writing a virtual assembly instruction set that is platform agnostic (so it compiles/transliterates to native) to explore this idea. It wouldn't be like JVM bytecode, which implements higher-level magic, and it wouldn't be like LLVM IR, which is not aimed at being human-writable. Some existing RISC instruction sets might be close to what is needed. There would be potential to write things in multiple instruction sets: if the interfaces between code sections are well defined (stack/register/memory passing), you could write modular code in a preferred instruction set.
This isn't practical for many applications, but I'm not convinced it couldn't find a niche.
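To make the idea concrete, here's a toy sketch of what a human-writable virtual instruction set could look like, interpreted rather than compiled to native (the three opcodes and their encoding are entirely invented for illustration):

```javascript
// A tiny register machine with three invented opcodes:
//   ["li", r, n]  - load immediate n into register r
//   ["add", a, b] - registers[a] += registers[b]
//   ["jnz", r, k] - jump to instruction k if register r is non-zero
function run(program, registers = new Int32Array(4)) {
  let pc = 0;
  while (pc < program.length) {
    const [op, a, b] = program[pc];
    if (op === "li") { registers[a] = b; pc++; }
    else if (op === "add") { registers[a] += registers[b]; pc++; }
    else if (op === "jnz") { pc = registers[a] !== 0 ? b : pc + 1; }
    else throw new Error("unknown opcode " + op);
  }
  return registers;
}

// Sum 5+4+3+2+1 into r0: r1 counts down from 5, r2 holds -1.
const regs = run([
  ["li", 1, 5],
  ["li", 2, -1],
  ["add", 0, 1],  // r0 += r1
  ["add", 1, 2],  // r1 -= 1
  ["jnz", 1, 2],  // loop back while r1 != 0
]);
console.log(regs[0]); // 15
```

A real version would transliterate each opcode to native instructions instead of interpreting, but even this sketch shows the appeal: the programmer is writing at the calculation level, with nothing hidden beneath the instructions.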
It's really about giving up programming-by-folklore. It's about understanding the problem so that you understand the data transformations your program has to make given the hardware it will operate on. It's about doing the maths and following the numbers instead of modelling things in terms of abstractions. It's about letting go of the idea that our job is to write beautiful code.
Our job is to solve problems. The problem is a human one. The computer transforms data to solve our problem. The code only needs to do those transformations which solve the problem.
It's a philosophy I've picked up in part thanks to Mike Acton and Casey. I mostly work in dynamic languages but it has already helped immensely at keeping my code simple and forcing me to think about the data. This has a huge impact on writing fast, simple code.
The point is that "computers are slow" is, at least in some part, because "just works" requires a mind-numbing number of operations.
I agree because I remember BeOS: on the same computer it was much, much more responsive than Windows or Linux.
It's probably much more practical to buy an SSD than to hope all your bloated software gets replaced with leaner alternatives.