“I divide my officers into four classes as follows: The clever, the industrious, the lazy, and the stupid. Each officer always possesses two of these qualities. Those who are clever and industrious I appoint to the General Staff. Use can under certain circumstances be made of those who are stupid and lazy. The man who is clever and lazy qualifies for the highest leadership posts. He has the requisite nerves and the mental clarity for difficult decisions. But whoever is stupid and industrious must be got rid of, for he is too dangerous.”
"Gilbreth studied the methods of various bricklayers—the poor workmen and the best ones, and he stumbled upon an astonishing fact of great importance and significance. He found that he could learn most from the lazy man!"
"Most of the chance improvements in human motions that eliminate unnecessary movement and reduce fatigue have been hit upon, Gilbreth thinks, by men who were lazy—so lazy that every needless step counted.”
"Another important thing Gilbreth noted was that the so-called expert factory workers are often the most wasteful of their motions and strength. Because of their energy and ability to work at high speed, such men may be able to produce a large quantity of good work, and thus qualify as experts, but they tire themselves out of all proportion to the amount of work done."
I got an interesting personal education on this working on some volunteer construction projects down in Mexico. The locals all move at a very deliberate, slow pace. We gringos tried to be more gung-ho, and by 11 am the heat plus our effort had destroyed us. Meanwhile the local folks just kept on truckin' until sunset.
Sustainable Pace is one of those good buzz-terms to throw around - sure you can be fast, but is it sustainable?
It's a good attitude to have in life and work as well - sure, you can do 16 hour binge working days, but for how long can you sustain that? Can you keep that up until retirement, whenever that may be? Is it worth it?
I believe life forms are optimizers .. survival means spending the least amount of effort (according to our mental representations, though our bodies are a major source of data).
Then there's lazy and defeated. I'm not at all the same as my colleagues who are lazy .. they know, up to a point, what is useless and avoid that work. I have a fundamentally different approach (drawn from computing, food shops, and industry): I aim at minimizing work AND maximizing flow. It makes my processes very different, because I remap the task and attack many paths to find the most enjoyable one. The other lazies don't dive into the task; they just take the simplest path and do it slowly.
This is a cute quote, especially if one's difficult decisions involve letting other men do the actual work.
It does not really apply to software, where most good programmers are obsessive and hard-working. Yes, that also applies to Lisp, where someone has to write the actual interpreters and compilers that others use to run their 100 macros on and pontificate about how easy everything is.
It applies to technology organizations though. I think being a smart engineer is very important when making architectural decisions -- but this is not always the bulk of the actual work. Lazy people tend to want to minimize total effort of the organization too.
Come now, I think we all know code is not its line count.
1000 lines of clear code in a mission-critical place is worth far more than a 1-liner which accomplishes the same task and only one person in the company understands...
I've yet to see 1,000 (or even 100, and rarely even 10) lines of code doing what could be done in a one-liner that is both clear and correct; generally, the more verbose the code, the less clear and the less correct it is.
Moreover, the longer version almost invariably (and more likely the longer it is) either is or contains a DRY violation that was originally copy-and-pasted from elsewhere in the same codebase, often with some intentional changes, which has since diverged (or diverged further than the intended differences) because of inconsistent maintenance.
It's not line count, it's the dot product of line count and conceptual clarity.
There's a sweet spot where code is as terse as it can be while still being easy to read. The abstractions are clear and fit the domain, they're not decorated with poorly configured edge cases, and they're no more nested than they need to be.
It's debatable if this ideal has ever existed, but it's nice to believe it might be possible.
> It's not line count, it's the dot product of line count and conceptual clarity.
In my experience, in practice, beyond a very small multiple of the minimum required line count, line count and conceptual clarity for performing any given task are inversely related.
Also, neither of those are vector quantities, so I'm not sure what you are trying to get at with “dot product”.
Citation needed. This is a popular narrative but I don't believe it; line count is the only thing that has been shown to be consistently correlated with bug count.
Without consulting Stack Overflow, describe the “...” operator in Perl.
As for other one-liners, when the one line is actually five instructions squished together inside a ternary operator, you’re probably better off expanding it to an if statement and consuming ten lines of vertical space.
Clarity is far better than terse code that suffers from leaning toothpick syndrome.
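To make the trade-off concrete, here's a contrived sketch in Raku syntax (the sub names and values are mine, purely for illustration, not from the thread):

    # One-liner: chained ternaries crammed onto a single line.
    sub classify(Int $n) {
        $n < 0 ?? 'negative' !! $n == 0 ?? 'zero' !! 'positive'
    }

    # The same logic expanded; more vertical space, arguably easier to scan.
    sub classify-verbose(Int $n) {
        if    $n < 0  { return 'negative' }
        elsif $n == 0 { return 'zero' }
        else          { return 'positive' }
    }

    say classify(-3);          # negative
    say classify-verbose(0);   # zero

Which of the two forms is easier to maintain is exactly the judgment call being argued over here.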
> Without consulting Stack Overflow, describe the “...” operator in Perl.
I don't know Perl. But I'm prepared to do that exercise for APL if you want.
> As for other one-liners, when the one line is actually five instructions squished together inside a ternary operator, you’re probably better off expanding it to an if statement and consuming ten lines of vertical space.
That's exactly what I'm disputing. We know that bug count correlates with line count, and that the chance of bugs increases greatly once a function or class no longer fits on a single screen. So I suspect that squishing five instructions together inside a ternary operator actually makes for fewer bugs.
Whereas I have seen too-smart people creating bugs by cramming things into one line because the brevity required means you end up with hard-to-visually-parse clusters of brackets, braces, and whatever other punctuation your chosen language allows (is that a minus operator, is it negating a value, or is it a range in a regex?).
Bugs are correlated to line count because code complexity drives line count and bug count.
It’s not the number of lines of code causing the bugs, it’s the complexity of the problem causing the number of lines and number of bugs to grow independently of each other.
> We know that bug count correlates with line count
Statistically maybe, but when it comes down to it, clear code > compact code hands down.
Paraphrasing, but there's basically two ways to write code: either it's so complicated there are no obvious deficiencies, or it's so simple there are obviously no deficiencies.
I'm not smart enough to be able to read and understand a nested ternary or other examples of clever / compact code.
> Paraphrasing, but there's basically two ways to write code: either it's so complicated there are no obvious deficiencies, or it's so simple there are obviously no deficiencies.
Anyone who's seen an enterprise Java codebase knows there's a third way: so verbose that no-one can tell whether there are any deficiencies, even if each line in isolation is "simple".
I think we vastly underestimate the importance of sheer code length. Being able to comprehend the function, class, or program as a whole makes a lot more difference than whether nested ternaries or the like were used, IME.
And yet most Perl programmers I know do not work in mailing houses so they have never needed to use “...”.
I have worked with brilliant people who still have trouble figuring out the ternary operator, and people who should know better who construct ternary operators with five levels of parentheses, making the intent impossible to decipher.
So I choose Alexander the Great’s approach to the Gordian Knot: slice it open, breaking the complexity out into multiple layers that mere mortals can understand.
PS: I notice you failed to explain the operator yourself. Minus five points to Slytherin.
Indeed. That was because smart-matching in Perl was symmetric.
The smart-matching in Raku (formerly known as Perl 6) is asymmetric and customizable. If you write:
a ~~ b
you are effectively executing:
b.ACCEPTS(a)
In other words, the right hand side of the smart-match is responsible for accepting the left hand side. So as a developer of a class, you can add your own ACCEPTS (multi-)method to govern how your class will handle smart-matching.
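A minimal sketch of how that looks in practice; the Even class and its rule here are invented for illustration, not taken from any real module:

    # A toy class whose ACCEPTS method matches even numbers.
    class Even {
        # `$x ~~ Even` effectively calls Even.ACCEPTS($x).
        method ACCEPTS($topic) {
            $topic %% 2   # divisible by two?
        }
    }

    say 4 ~~ Even;         # True
    say 7 ~~ Even;         # False
    say Even.ACCEPTS(4);   # the same call, made explicitly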
Besides, in your very implausible example (unless there's some underlying library - or an esoteric language maybe?), the 1 line of code (apparently) replacing 1,000 lines of clear code can have a comment to explain itself.
I think laziness and impatience have always been listed in a sort of tongue-in-cheek way. Patience is probably the single most important attribute a programmer can have; many '10x' engineers are just the people who have an unshakeable ability to step through boring code for 4 hours until the problem reveals itself.
Laziness pays off for me when I argue for simpler requirements and paring down of features on things. It results in a better end product but most of my motivation was that I just didn't want to build all the stuff!
Early career my boss spent all day running reports. Like 7 hours a day. All hand edited excel files. Dozens of them.
She went on vacation and I had to learn how to make them.
She came back to find everything automated.
Like heck was I going to spend all day every day building reports. Soo boring.
10x engineers are a thing. But you have to be pushy to keep it up.
Way too many people set up pipelines that require “effort”. Efficiency suffers. They always get more done in a short period, but so much less in the long run.
Did you ever stop and think you ruined your boss's job for her?
She came back ready to grind out another year and you've freed her of her job, so she can have new and exciting uncertainty with nothing to do.
I assume this is written tongue in cheek, but I have worked with people who legitimately feel this way. I have had quiet parking-lot conversations about how "all this automation business" was going to replace work they knew how to do, and could do with half a brain while listening to books, with work that required them to learn new things. The horror.
Long-term, statistically, sure - but there are real consequences to people losing their jobs over automation, in that the skillset they built up over the years has suddenly become useless. They usually do not grow into a higher position; instead they have to find something else, take a pay cut, or end up unemployed and part of the statistics.
So yeah, on a macro scale there is no evidence that automation causes job loss, but on an individual level it definitely does. These are personal tragedies. What would you do if you lost your job because a system took over? What if you're in your fifties and you find your ability to learn new things is severely diminished? What if you're that age and employers won't hire you because there are people younger, smarter, more energetic than you, or a machine can do the job you're applying for?
Took some positions I wasn’t comfortable with so I would be forced to learn again.
It was many months of pain. But I’m now super in demand despite my age.
Certainly meet lots of people that learn a skill that’s not in demand.
Heck, I have daily talks with my daughter about having a backup option aside from “trick riding”.
Laziness to me is thinking "Eh this doesn't need tests" or "It might be possible to break it this way but I doubt anyone will try that"
What people seem to mean by it is the ability to stop and think "This task is repetitive and tedious; we are passing the threshold where writing an automatic solution is faster than manually completing the task." I'm not sure what word would really describe this, though. "Efficient", probably.
I think it is laziness, but the appropriate kind. I am willing to work harder now, in order to be lazy later. I am so lazy, I do not ever want to manually test a function. Therefore, I write automated tests to enable my laziness. I am so lazy, I do not want to remember caveats on how a function works. Therefore, I do my best to make an interface hard to use incorrectly.
Well, does this kind of laziness really come with an advantage in the modern workplace? I mean, there's always more work, so working hard now doesn't equate to working less later. Unless you're an entrepreneur building your own product.
Sure it does.
If you work with people who don't automate then every time you finish a piece of automation you can slow down your pace a little and still be faster than the people doing stuff manually.
The thing with tests is that the effect is indirect, and you won't really realize how much time is not wasted down the line.
That time could apply to your whole department / the people you work with. In worst cases, that time is saved by there not being a rewrite of the application down the line.
(disclaimer: I'm currently rebuilding an application because the existing codebase is a mess: a web-based UI built in mid-2000s technology in the 2010s by a C developer who never seems to have learned basic code quality / craftsmanship or web development practices. It can probably be salvaged, but it'd be a lot of work (160 KLOC) with little gain compared to rebuilding it in a modern stack)
Change that to "This task is repetitive and tedious, we are passing the threshold where writing an automatic solution is more work than manually completing the task"
I write tests when I'm fed up with manually testing. I continue writing tests into the next program, because I'm still fed up with manually testing.
A caveat though is that I despise working on anything UI related, because it's hard to test and I despise manual testing. But that's when I delegate :-)
But that's also why it's not just laziness, but lazy and intelligent -- the lazy and dumb man finds no solution, and simply drops the ball, because he's too lazy to do it.
The lazy and smart man is too lazy to do it, and gets away with it.
For the latter part, I think "tires of tedium" is maybe what we mean? I mean, I might actually prefer writing more code even if it takes a little longer than doing the dumb task.
An early CS professor I had used to always say "a good programmer is a lazy programmer", because there are already so many well written and tested tools available. This was long before the current days of JS dependency bloat, but his position came from a strong UNIX background and he wanted us to be thinking about tools at our fingertips like grep, sed, awk, etc. instead of reinventing those fantastic tools from scratch.
Master Foo once said to a visiting programmer: “There is more Unix-nature in one line of shell script than there is in ten thousand lines of C.”
The programmer, who was very proud of his mastery of C, said: “How can this be? C is the language in which the very kernel of Unix is implemented!”
Master Foo replied: “That is so. Nevertheless, there is more Unix-nature in one line of shell script than there is in ten thousand lines of C.”
The programmer grew distressed. “But through the C language we experience the enlightenment of the Patriarch Ritchie! We become as one with the operating system and the machine, reaping matchless performance!”
Master Foo replied: “All that you say is true. But there is still more Unix-nature in one line of shell script than there is in ten thousand lines of C.”
The programmer scoffed at Master Foo and rose to depart. But Master Foo nodded to his student Nubi, who wrote a line of shell script on a nearby whiteboard, and said: “Master programmer, consider this pipeline. Implemented in pure C, would it not span ten thousand lines?”
The programmer muttered through his beard, contemplating what Nubi had written. Finally he agreed that it was so.
“And how many hours would you require to implement and debug that C program?” asked Nubi.
“Many,” admitted the visiting programmer. “But only a fool would spend the time to do that when so many more worthy tasks await him.”
“And who better understands the Unix-nature?” Master Foo asked. “Is it he who writes the ten thousand lines, or he who, perceiving the emptiness of the task, gains merit by not coding?”
Upon hearing this, the programmer was enlightened.
Yup, sounds like the "not invented here" adage; it's good that there are multiple ways the same concept is explained.
Of course, as a developer you need to know tools exist and how you can use them to solve a specific problem, and to have the patience to not go coding right away but do a bit of research first.
I wonder about the distribution of patience across ages. I used to be impatient, but so fast I could make progress even without careful plans or observations (basically I was a chaos monkey, iterating 10x faster than average until I had covered a lot of space and found a trick).
I love patience now; being old deprives you of sprinting wild.. you know it's the fastest way to paint yourself into a corner.. that said, full patience often leads to boredom. You need a pacing agent.. or a balance between observations, hypotheses, and tests.
> many '10x' engineers are just the people who have an unshakeable ability to step through boring code for 4 hours until the problem reveals itself.
I dunno, to me the '10x' developers are the ones that tackle the big problems, whilst letting the common folk take care of the actual implementation and nitty-gritty. Maybe I'm confusing it with architects/CTOs/lead developers though.
Eh, stepping through code for hours is a pretty inefficient debugging technique, so a properly lazy programmer should probably be thinking about why their system's diagnostics suck and if there might be a way to improve them. (I mean, I've done it, but then again I'm not a 10x engineer.)
At the end of the day, if you can be very diligent, even when it's so boring and tedious, you are a valuable programmer. Maybe not the best, but it will take you very far.
"Programming Perl" is absolutely chock full of great quotes. I think one of the things that attracted many programmers to Perl was how genuinely witty Larry Wall was in his writings/talks. Add to that his background in linguistics and he comes across as a pretty hip/zany/brilliant character.
I think lots of Perl programmers probably thought of themselves as "being in on" his creation and therefore being a bit like him, themselves. That was certainly one of the upshots for me back in the late 90s early 2000s.
IMHO famous software people have amazing quotes, right from the start, with:
"On two occasions I have been asked, 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question." - Babbage
"If I had a little less brains, I should & would be a good Catholic, & cling to that certainty which I do long for. However I don't wish to be without my brains, tho' they doubtless interfere with a blind faith which would be very comfortable." - Ada Lovelace
"I do not think that this argument is sufficiently substantial to require refutation.
Consolation would be more appropriate..." - Turing
>"On two occasions I have been asked, 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question." - Babbage
To me, this belongs in the same category as Henry Ford's "If I’d asked people what they wanted, they would have asked for faster horses":
Google search does exactly that, trying to offer the right answer, no matter the question.
One day, they will link the user tracking and the search engine, and when you hesitate for a moment, Google will tell you the information that you are lacking. Thus, the right answer no matter the input. But of course, what they tell you to buy when you are making a purchasing decision will be up to the highest bidder.
As to Henry Ford, self-driving cars are faster horses. It seems people don't buy cars because they want to steer by themselves all the time; they just want to indicate the general direction.
Definitely. I'd say "Programming Perl" is a literary masterpiece. Larry Wall's genius was selling Perl as a literate, witty, expressive antidote to all the Java and C++ cruft of the late 90s (Perl 5). "Programming Perl" should be read alongside "Mastering Regular Expressions", which is another masterpiece.
I think Ruby and Clojure are the only language cultures that come close to engaging the whole brain the way Perl 5 did when it was popular. However, while each has a couple of shining stars (Ruby: Matz & DHH, Clojure: Rich Hickey), they don't compare with the sheer number of stars who made up the Perl 5 cast of the early 2000s (Larry Wall, "The Damian" Conway, Chromatic, Curtis Poe, Lincoln Stein, Randal Schwartz, Jon Orwant, Simon Cozens, Tom Christiansen, Nat Torkington & many more). That really was an entertaining community to be part of and is, like disappearing bookshops, sorely missed.
Perl lost a lot of mindshare to Python, famed for its boring predictability and crippled lambdas, which the Perl 5 community would never have tolerated. If you want programming to inspire you the way O'Reilly's catalogue of Perl books did in the late 90s/early 2000s, I'm not sure there's much apart from Matz and Rich Hickey's work. Their lectures are inspiring but their written output is limited. Go read "Programming Perl" - it's timeless.
Well, I hated "Programming Perl" and stopped reading after 50 pages or so.
My favorite language-introduction book is the K&R C book; it is of glittering elegance compared to the "quirky" style of the Perl book (and language). Quirky is the very last thing that you want in a programming language. And I am not even that fond of C, at least with today's alternatives being available.
What you want out of a programming language may not be the same as what, for example, DHH or Matz wants. You seem to want minimalism and proximity to the metal whereas someone who appreciates Perl and Ruby is more likely to value expressiveness. Dynamic and statically-typed languages are not in competition with each other. They both have their place.
There are surely differences in taste. I do like Python for "scripting", small and/or short-lived programs, often used to handle administrative tasks. Python is famously the anti-Perl in many questions of language design.
The discussions about the meaning of "hubris" in Greek mythology are slightly off. Hubris means to insult the gods.
So, while some of the mythological examples in the first long-ish discussion are good ones (e.g. Arachne), it is wrong to say that Leto commits hubris by killing Niobe's children (i.e. an "insult"). The gods cannot commit hubris. Hubris is only for us, mortals. It's to not know our place, that the gods are our masters and we shouldn't piss them off, or they will do something really nasty, and cruel, and unstoppable and extremely unfair- like Apollo and Artemis killing Niobe's children, to punish Niobe (I mean, what did the poor children do?).
Hubris is often glossed as "arrogance", but arrogance itself is not hubris- it's the cause of hubris (though not the only one).
Also, Odysseus did not commit hubris- he pissed off Poseidon by blinding his son, Polyphemus the Cyclops. That did not "insult" the god- it pissed him off and he took revenge, but note that he didn't outright smite Odysseus, just made his life a misery. Because, after all, Polyphemus had tried to eat Odysseus (and ate some of his comrades), so it was a fair fight. The "fight" with Poseidon of course was not "fair"- but then, Odysseus had Athena to watch over him.
Prometheus also did not commit hubris- hubris is for mortals and Prometheus was an immortal, a titan. Prometheus was punished for going against the authority of Zeus, but again that was fair and square, because Zeus himself had dethroned the titans to become the greatest of the gods.
Source: Greek, with Greek schooling and upbringing and a lot of reading of Greek mythology as a kid.
Would you please stop posting unsubstantive comments and supercilious disses to HN? It's not good discussion, regardless of how badly other people are flawed.
You've unfortunately posted like this a lot, and it's not in the intended spirit of the site.
> Humble people don't go around talking about how humble they are,
I'm wrong all the time. It's in my signatures - Often wrong, but never in doubt (source: Garfield). I'm decisive and I make mistakes and those behaviors are not at odds.
It's important to remember that no matter how trivial the programming task, you are likely to have made a mistake somewhere and that's why everything should be tested, code reviewed, etc. People don't re-read their code often enough and these practices force it.
> It's important to remember that no matter how trivial the programming task, you are likely to have made a mistake somewhere and that's why everything should be tested, code reviewed, etc.
A good time to roll out Knuth's "Beware of bugs in the above code; I have only proved it correct, not tried it", and, less slogan-y but maybe more convincing, "Nearly all binary searches and mergesorts are broken" (https://ai.googleblog.com/2006/06/extra-extra-read-all-about...).
I think it's something of a sign that we're all terrible at our jobs that this resonates with us. If you tend to be wrong about things, sometimes it cancels out with your vices. It's not good to do the wrong thing, but if you must, let's hope you're lazy and impatient about it to limit your progress so you can't do much damage.
IDK guys, just found out last night a project I've been working on for months should have gone in a totally different direction, this stuff is hard.
If you need some encouragement: I agree that this is hard. But experiences like this are the most valuable. Many advances in our field came from a place like this.
Sometimes it means throwing away stuff and starting from scratch. This isn’t giving up, quite the opposite.
Laziness is the avoidance of work, not necessarily the minimization of work. A truly lazy person may not look for effort-saving solutions, but rather avoid providing a solution altogether by compromising requirements, making excuses or having someone else do the work.
Impatience will make you avoid tasks that take "too long", like reading documentation and taking the time to understand how things work. It will make you approach problems by trial and error. By acting in this way, your solutions will likely be suboptimal and your understanding will be superficial and incomplete.
Hubris will make you avoid asking for help or code reviews from people that might have something legitimate to say about your code, eventually wasting everyone's time including your own.
So I disagree. I know the source of these is a rather popular book written by smart people, and that those paragraphs make sense. But there might be better words to represent those paragraphs.
I would say that really depends on the person - you can be lazy in "good faith" or in "bad faith". "Bad faith" is just like you described - let someone else deal with that.
It also depends on organization itself, because in some cases the reward for being efficient is more work.
I get that this idea is appealing because it's quirky and counter-intuitive, and self-deprecating as well as self-aggrandizing.
But it's not true, obviously. Truly "lazy" people (i. e. those suffering from depression) will be trying to hack their motivation and beating themselves up over repeatedly failing. It's something that comes up in an Ask HN at least every two months.
Impatience & hubris are what gets "tech bros" the reputation of shallow arrogance. These traits optimize for Uber-like outcomes, in both scale and complete lack of decency. But going for the 1-in-100,000 outcome is rarely a good personal strategy, and some marginal improvement doesn't change that.
I thought for sure hubris would be something about "oh, that'll be easy to automate!" I'm not sure if it's what makes a great programmer but I do think there's some sort of Paradox of Programmers where we can somehow entirely underestimate the "Big Problem" while also being completely undeterred - defiant even - once the "Big Problem" reveals its true size.
I think arrogance is very underrated and should be included. A degree of arrogance makes you take on tasks that you are unqualified for; this aids in growth. It also gives you an inflated trust in your own opinions, and hence promotes exposing them to challenge and scrutiny. Furthermore, it encourages risk-taking, which is usually good for society even though it's often bad for the individual.
Why would you expose your opinions to challenge if you know they are right?
Also why would you need to take any risks if you think you know everything already?
I don't think you know many arrogant people. I deal with people at my company who are both stubborn and usually wrong, which is a lethal combo. You have to work around these people or else the project won't get done.
Sure, but all of those "virtues" have good and bad manifestations. I agree that arrogance can be terrible.
> Why would you expose your opinions to challenge if you know they are right?
You expose them to challenge because you know they're right; you don't act to protect them by keeping them hidden and not testing them. You "put them under load." This is basically mental and rhetorical dogfooding.
edit: I think the difference is between "I think I'm right" and "I want to believe I'm right." If you want to protect your sense that you're right, you'll be both hesitant to test your beliefs and vicious in their defense. Conversely, if you're attached to the actual state of being right, you'll trial your beliefs and abandon them if they fail you. Arrogance, or at least a moderate degree of it, can help you act like the latter by making ideas seem more reliable than they deserve, creating natural trials.
edit: I guess the common thread is that arrogance promotes exploration by suppressing fear of loss.
The way I read the parent comment is "arrogant people expose their opinions" (which unavoidably exposes them to challenge) and "arrogant people have full confidence they know exactly what to do" (but are actually unaware of the risk involved due to them being wrong).
The founders of the (rather successful) company I work at now like to say the entire company is based on two things: "How hard can it be?" followed by "Hold my beer."
Sure, there have been occasions when that has been the wrong approach. But it's also been the right approach in a profitable enough way to compensate for the failures!
I will always have a soft spot for Perl. I got my first real big kid job writing Perl and it was a wonderful experience. The Swiss Army Chainsaw. I came close on a few occasions but I never did cut off (my own) limbs.
It depends: are you developing a prototype or the early iteration of a product, or a mature product?
The "Maximizing the work not done" mantra... really depends on what time-frame you pick. Work not done over a day, a week, a year, a decade? Different time windows will lead to different decision-making processes.
True, but as the timeframe goes up, so does the uncertainty of the decisions.
It also depends on the field, too. Almost nobody made good/relevant “maximising work not done” decisions a decade ago about how their mobile apps should be built - who even knew Swift or Kotlin (or React or Flutter) were on the horizon? And nobody cares, because mobile apps from 10 years back are mostly irrelevant today.
It’s very easy/common to make errors in those sorts of decisions by worrying about things too early. I know many, many stories of startups that ran out of runway while building Google/Facebook-scale infrastructure architectures with no more than family-and-friends-sized user bases. Projects that probably could have been live a year or two earlier using WordPress and a few custom plugins as a backend, and scaled at least to 100 engineers’ worth of revenue before needing heroic internet-scale design from the two- or three-person founding team.
I have also known, and seen first-hand, companies that neglect their code bases for a long time, and then have to let go entire teams once the quality bar is raised, or once they fail to adopt a proactive attitude towards defects.
Startups are not small companies. Startups are companies created with exponential growth in mind, and if your product is not prepared for that kind of growth, you do not need startup scale funding. You could do fine with small business funding. Unless you can find some chump that will buy your small company at a startup price, usually by inflating the payroll with a lot of redundant employees right before selling.
It turns out building scalable software is, most of the time, not about making the kind of investments you talked about. It's about designing each feature with scale in mind. Then, unless the startups you talk about bought multiple geographically distributed datacenters and deployed their own submarine cables, they were not working at that scale. Don't exaggerate.
> Then, unless the startups you talk about bought multiple geographically distributed datacenters and deployed their own submarine cables, they were not working at that scale. Don't exaggerate.
No, you’re right. Totally exaggerating, but that was seriously their mindset. “We can’t use an SQL database, it won’t scale to a billion users!” I strongly suspect the team talked about where they’d deploy their data centers and the risks of using other people’s cables - only _partly_ in jest.
The one I had in mind specifically while writing that was fighting with sharded MongoDB clusters, eventual consistency, and auto-scaling app server fleets, while not having 1% of the user base I support with a more complex app on a handful of t3.medium spot and on-demand instances and a db.t3.large RDS Aurora database. I know it’s always hard to reliably forecast when your current solution’s vertical scaling capability will run out and you need something more scalable than a stateless monolith with a vertically scaling SQL database behind it, but they were easily three, maybe four or more, orders of magnitude short of that inflection point. I have zero doubt their kinda-good idea could have scaled to high seven or low eight digits of revenue before needing any heroic platform architecture. Well before that became a necessary problem to solve, they could have had a 50-person engineering team with leadership who’d done it before to solve that problem for them.
And I totally agree with your “neglected codebase” problem too. I’ve spent the last 18 months or more on a “rewrite the old Grails backend code in Java” journey, which has gone about as smoothly as they always do /\/\/\/\... We’re now finally in a place where the Java platform is as good as the old one in most areas, and better in some important ones. But of course we now have a bunch of “legacy clients” who are not (and may not ever be) migrated off the Grails platform, and I’m fighting very hard to avoid the worst of the “neglected codebase” problems that’s gonna get thrown at me over the next few years while we solve the legacy client problem (and COVID isn’t making it any more likely that some of our government clients will upgrade if they need to pay to do so...)
I like to think of the first virtue as Proactive Laziness. I'll work hard right now (on something that's fun, automating tasks!) to avoid doing a lot more hard work later that's boring (doing the same tedious task manually over and over).
Doing something tedious and manual is a good way to let your brain unwind and kind of meditate. After a bit you get all sorts of insights into how to make the process work better. Or if there's nothing at all worthwhile about the process, your subconscious can still be solving other problems in the background.
One of the problems I find with impressing people with your technical abilities is that they start piling projects on you and you stop doing the tedious stuff that teaches you what needs to be done and how.
The greatest frustration in my career has been that everybody wants to do grandiose automation projects without being willing to assign any proportional effort and talent towards understanding the existing processes.
Amen, I've seen so many failed attempts to automate a process that hadn't even been attempted the "slow", "unscalable" way first. Doing a task manually a few (dozen?) times will reveal a disproportionate amount of information about what actually needs to be done.
Agreed, you want to do it the hard way first to get real insight into the problem. By the time you get around to actually automating it you'll be ready to.
Am fortunate in my present position that the projects I get tend to be complex one-off types. I have a lot of leeway on how it gets done as long as the software engineering aspects are sound.
I've had the opposite frustration at times in the past, where people around me were content to do it the tedious manual way forever. As a typical Perl programmer type that was kind of painful :-).
I get it, it's cheeky to refer to someone who's very keen on automating as "lazy". That being said I'm really not a big fan of using "lazy" as a quality for a good programmer. It's pejorative and it should stay that way. I've never met a "lazy" programmer who I enjoyed working with. Lazy people cut corners because it's less work in the short term.
Being someone who finishes and completes automation is antithetical to lazy. It takes MUCH effort to take things from 80% to 100%.
"Keen on automating" is one of the few definitions of lazy that actually make sense.
If we need to get rid of anything, it's the pejorative meaning. Using the word that way is simply a way to get out of having to deal with the complex system that is human motivation.
If someone is consistently lacking in motivation, there are two responses:
1. Figure out why and be able to do something about it; or
2. Call them "lazy" and hope the problem fixes itself.
As a general rule, option 2 almost never works out productively. It just ascribes a systematic problem to a single person and nothing improves. If anything, things get worse because people's attitudes change, further reducing motivation.
The phrase is up for interpretation; how about a "not overzealous" programmer? Someone who'd rather pick a well-tested off-the-shelf thing over coding it themselves.
> Three great virtues of programming are laziness, impatience, and hubris. Great Perl programmers embrace those virtues. So do Open Source developers.
> But here I’m going to talk about some other virtues: diligence, patience, and humility.
> If you think these sound like the opposite, you’re right. If you think a single community can’t embrace opposing values, then you should spend more time with Perl. After all, there’s more than one way to do it.
I've always heard these as good traits for a programmer, but I must have got a cursed version of them. As to laziness and impatience particularly: I'll sooner decide not to do a task at all rather than somehow automate it.
Just “deciding not to do tasks” is one option that works (or, as I like to say “never put off till tomorrow anything that you can get out of altogether”...)
A much more powerful tool as a senior dev is to push back on pointless or stupid requirements. “No, we will not build a video transcoder so you can upload 8K video files to your WordPress site that gets 85+% of its traffic on phone screens. Learn to resize video before you upload it, or buy that as a service from Brightcove et al.” (Yes, that is a real example of a requirement I pushed back on, because neither the customer nor the account manager understood the problem. One of my devs tried to push back and got talked over, before I put my foot down and said “No, not happening. Happy to explain in detail why, if you’d like.”)
I'm not sure what's behind this account's commenting history about dress, but it's repetitive and has frequently violated the site guidelines, so please don't.
Exactly bro! Now keep on rocking in the free world and understand that hubris alone does not a good programmer make but a good programmer is hubristic. Therefore hubris is a poor heuristic no matter what any tech bro tells you.
Therefore you should at least make an effort to dress up for an interview because you never know if it's your last chance to make any sort of impression.
And let your hubris shine that way instead of choosing the casual path that every other one will certainly choose.
Edit: of course this argument is lost on all the beautiful minds who spare no effort at shabby chic
Relevant quote: "As soon as the guy walked into the room, I knew it was going to be problematic, because he seemed extremely straight-laced and uptight, dressing more like an insurance salesman than a technologist."
Jobs was an ass, but Andy had a point: dressing correctly depends on the context. The idea of dressing "up" or "down" as if it was a single axis was never correct, and is even less nowadays.
General Kurt von Hammerstein-Equord, probably