Efficiency is dangerous and slowing down makes life better (psyche.co)
745 points by _zhqs on Aug 21, 2020 | 305 comments



I've found "efficiency as the opposite of stability" a very powerful concept to think about - even though it's fairly simple, it seems to be almost a fundamental law.

Whether it's about the economy at large, your own household, a supply chain, what have you - as soon as you optimize for efficiency by removing friction, you take all the slack/damping out of the system and become instantly more liable to catastrophic failure if some of your basic conditions change. Efficiency gives you a speed bonus, at the cost of increased risk / less resilience to unforeseen events.

Stewart Brand's concept of "Pace Layering" comes to mind for how to deal with this at a systemic level - https://jods.mitpress.mit.edu/pub/issue3-brand/release/2


> efficiency as the opposite of stability

In statistics, there is a slight variant of this thesis that is true in a precise formal sense: the tradeoff between efficiency and "robustness" (stability in a non-ideal situation).

For example, if you have a population sample, the most efficient way to estimate the population mean from your sample is the sample mean. But if some of the data are corrupted, you're better off with a robust estimator - in this case, a trimmed mean, where the extreme N% of high and low values are discarded.

The trimmed mean is less efficient in the sense that, if none of the data are corrupted, it discards information and is less accurate than the full mean. But it's more robust in the sense that it remains accurate even when a small-to-moderate % of the data are corrupted.
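
To make this concrete, here's a minimal simulation sketch (the contamination model, trim fraction, and sample sizes are arbitrary choices of mine, not from any particular source):

  import numpy as np
  from scipy.stats import trim_mean

  rng = np.random.default_rng(0)
  TRUE_MEAN = 5.0

  def mean_squared_errors(corrupt_fraction, trials=2000, n=100):
      err_full, err_trimmed = [], []
      for _ in range(trials):
          x = rng.normal(TRUE_MEAN, 1.0, size=n)
          n_bad = int(corrupt_fraction * n)
          x[:n_bad] += rng.normal(0.0, 50.0, size=n_bad)  # corrupted readings
          err_full.append((x.mean() - TRUE_MEAN) ** 2)
          # trim_mean(x, 0.1) discards the extreme 10% in each tail
          err_trimmed.append((trim_mean(x, 0.1) - TRUE_MEAN) ** 2)
      return np.mean(err_full), np.mean(err_trimmed)

  print(mean_squared_errors(0.0))   # clean data: the full mean wins (efficiency)
  print(mean_squared_errors(0.05))  # 5% corrupted: the trimmed mean wins (robustness)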


i stumbled on "stability" too, because it's a static quality.

rather than robustness, i prefer to use the term resilience, a dynamic quality, since efficiency is also a dynamic quality. you can trade efficiency for resilience and vice versa (as the parent poster switched to later).

edit:

i should add that i don't entirely agree with the thesis of the article, which exhorts us to slow down, thereby trading efficiency away for resilience. there are a number of ways to add resilience (and trade away efficiency); in some cases slowing down might be the best, but it's certainly not the only option, nor the best one in most cases.

for housing, an example used in the article, we could add more housing to create resilience, which requires reducing friction, like increasing the throughput of permitting/inspections while generally reducing zoning/regulations.


I like resilience better as well - here it just happens that the technical terms of statistics match up fairly nicely with what we're trying to say.


Do you have any reference or source for this statement? Just asking out of curiosity.


I'm completely ignorant on this topic, so I apologize for asking what must be an extremely stupid question to you, but: what makes stability a static quality whereas resilience is a dynamic quality? Are these statistical definitions that I can look up somewhere?


Not OP, but my take is that stability is usually defined as a base state that will continue on in perpetuity unless some outside force disrupts it. Resiliency is more closely defined as the ability to recover from disruptions back to the base state quickly.

Since the context here is that efficiency can remove layers of redundancy, therefore allowing disruptions to wreak more havoc - I believe that's what OP was getting at.


Another example would be forward error correction (adding parity bits to improve robustness at the expense of efficiency).

But inefficiency isn't necessarily more robust unless the extra bits serve some purpose.
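
To illustrate, here's a toy sketch using a repetition code rather than parity bits proper - the same tradeoff in its simplest form:

  # A (3,1) repetition code: each data bit costs 3 transmitted bits, so the
  # rate drops to 1/3, but any single flipped bit per triple is corrected.
  def encode(bits):
      return [b for bit in bits for b in (bit, bit, bit)]

  def decode(received):
      # majority vote over each group of 3 received bits
      return [int(sum(received[i:i + 3]) >= 2)
              for i in range(0, len(received), 3)]

  message = [1, 0, 1, 1]
  tx = encode(message)
  tx[4] ^= 1                    # the channel flips one bit in transit
  assert decode(tx) == message  # redundancy buys back correctness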


I may be wrong, but it seems to me that 20th century (theoretical) statistics research overemphasized efficiency at the expense of robustness. My guess is that this has to do with the (over-)mathematization of statistics in the past century, as opposed to a more empirical/engineering viewpoint. Efficiency typically only holds under extremely narrow (and often impossible to check) assumptions, which is great for mathematicians proving theorems and creating theories of efficiency. On the other hand, robustness is ideally about unknown unknowns and weak assumptions, which is hard to deal with mathematically.

It seems the 21st century is already seeing a more balanced emphasis on theory vs. real-world applications, though.


Not my impression.

The kind of outlier-culling technique suggested by civilized is not recommended these days because it adds unprincipled choice points to what Andrew Gelman calls the 'Garden of Forking Paths' [1, 2]. Thus they are bad for hypothesis testing, which tends to be what most statisticians care about.

Additionally the technique obscures the relationship between the variance of the sample and the population variance if we do not have reliable knowledge of the population distribution; likewise for the mean if the mean is not close to the mode. These problems can be quite dramatic for long-tailed distributions.

[1]: http://www.stat.columbia.edu/~gelman/research/unpublished/p_... [2]: https://statmodeling.stat.columbia.edu/2016/09/30/why-the-ga...


This post seems to be conflating a few different things:

1. Trimming for mean estimation, which removes extreme values in an algorithmic fashion

2. Subjective removal of outliers based on researcher judgment (this is the garden of forking paths Gelman talks about)

3. Estimating other distributional properties, such as the variance, with trimmed estimators

These are all different things and come with different theoretical and practical risks and benefits. Trimmed means are perfectly good statistical tools, although they have their limitations like anything else.


I made two separate points.

The choice of N used in cutting out the N% most extreme results is not determined by widely accepted statistical best practice. Hence it is a source of forks. The algorithm might be deterministic but the choice of this parameter isn't.

My discussion of distributional properties was another issue concerning this technique. You seem to have missed the point that dropping extreme points can also lead to biased estimates of the mean.

Ten years ago, dropping outliers was considered good practice in the social sciences. Today, it has become a reason for rejection in peer review. There are better techniques for dealing with noisy data, such as adding measurements to data points to measure "badness" that can then be adjusted for in a multi-level model.


All but the simplest statistical estimators have researcher degrees of freedom (certainly including multilevel models) so it seems arbitrary to criticize the trimmed mean in particular for that "fault".

Similarly, any estimator can be biased if its assumptions are violated, so I'm not sure why the potential bias of the trimmed mean in particular is an interesting point.

I'm sure that social science peer reviewers have their reasons for their methodological preferences, but trimmed means are great workhorses in other areas of science, like signal processing.

The critique strikes me as potentially valid in its subfield but a bit parochial if it is attempting generality.


I don't deny the technique has its uses. The point is it is a poor technique to use if your goal is hypothesis testing, which, as I said, is what most statisticians care about.

I didn't reply to you, but to goodsector, who claimed that statisticians focus on efficiency at the expense of robustness. I dispute this.


I agree! Statisticians have largely led the development of robust methods, so I don't see how they can be characterized as ignoring the concern.


This also exists in control theory as the tradeoff between performance and robustness!


Wouldn't this be better described as a tradeoff between accuracy and robustness?

Interesting concept.


It's a slightly different aspect of the same thing. If B can perform better than A given the same data, it usually means B can perform equal to A with less data. From Wikipedia:

> Essentially, a more efficient estimator, experiment, or test needs fewer observations than a less efficient one to achieve a given performance.
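
You can check this with a quick simulation comparing the sample mean and the sample median on clean normal data (a rough sketch; the pi/2 factor is the classical asymptotic result):

  import numpy as np

  rng = np.random.default_rng(1)

  def estimator_variance(estimator, n, trials=20000):
      samples = rng.normal(0.0, 1.0, size=(trials, n))
      return estimator(samples, axis=1).var()

  print(estimator_variance(np.mean, 100))    # ~0.0100
  print(estimator_variance(np.median, 100))  # ~0.0157, about pi/2 times worse
  print(estimator_variance(np.median, 157))  # ~0.0100: needs ~57% more data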


Aha. That makes sense.


it's called the bias vs. variance tradeoff, or over-fitting, in stats/machine learning lingo.


No, it isn't. The b/v tradeoff is not the same thing. The efficiency of an estimator is different from its bias.


> The efficiency of an estimator is different from its bias.

I think the comment is drawing a parallel to variance (better efficiency = lower variance). Still not exactly the same, I think, but pretty damn similar.


They're related in that less complex models will degrade more gracefully when making predictions on novel anomalies, and that in general model complexity drives the bias-variance trade-off.

But, erring on the side of efficiency in this discussion is more like over-fitting, which implies an overly complex model. It's making your model too good for one situation, such that it fails to generalize. You'd rather pull back on accuracy and choose a simpler model, in the hopes that it's more resilient to novel observations.
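
A minimal sketch of that analogy, if it helps (the degrees, noise level, and sample size are arbitrary choices):

  import numpy as np

  rng = np.random.default_rng(2)
  x = np.linspace(0.0, 1.0, 15)

  def noisy_sample():
      return 2.0 * x + rng.normal(0.0, 0.2, size=x.shape)  # linear truth + noise

  y_train, y_fresh = noisy_sample(), noisy_sample()
  for degree in (1, 9):
      coeffs = np.polyfit(x, y_train, degree)
      pred = np.polyval(coeffs, x)
      print(degree,
            np.mean((pred - y_train) ** 2),  # fit error: degree 9 "wins"
            np.mean((pred - y_fresh) ** 2))  # fresh-draw error: degree 1 typically wins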


That's a nice way to think about it, and it reminds me of Nassim Taleb's "antifragile" thesis [1]. Basically, the world is more random than you think, and to operate rationally under uncertainty, you need to be open-minded about opportunities and risks with huge asymmetries. Fragile systems are often very successful for a long time because they ignore hidden risks and then collapse due to the unexpected.

[1] https://en.wikipedia.org/wiki/Antifragility


> Fragile systems are often very successful for a long time because they ignore hidden risks and then collapse due to the unexpected.

There's another interesting aspect to this in that things that are failures from some perspectives may not be from others.

If stripping resiliency out of a company nets enough savings in the short term, it may still be profitable to the owners even if it's long-term fatal.

As a hypothetical example, let's say you take a company making $1M a year and trim $19M a year of costs out of it. The company lasts another 10 years and then collapses. You've netted an extra $190M out of that company, or nearly 200 years at their previous rate.

In that case, it's in your local interest to strip the company bare, even if it's not necessarily optimal for your partners, workers, society, or any other stakeholder in this wonderful interconnected world of ours. The benefits are concentrated, the costs are distributed, and there's no mechanism for connecting the two.


Even better, why not get the company to borrow against those 10 years of income, and pay you your profit now?

Even better, since you now don't actually even need to keep the company alive for 10 years since you've already got your profit, you can sell the company's assets now and increase your profit!

Except there's not a company in the world that this logic doesn't apply to (at some point). Asset-stripping has killed off a few old companies that were ready for it, sure. But it's also destroyed lives and communities that didn't deserve that.

There is more to life than money, and one of the things that this article is talking about is that we need to recognise that.


The key difference here is that the “communities” are not recognized with any particular ownership interest.

Does the owner of the company have a right to take risks with the business? It is a serious impairment of ownership if not, and will likely lead to more-stagnant societies. The entire engine of America’s superior prosperity (even at the individual level) has been based on risk taking, while stagnant systems have their own problems (Greece, Italy come to mind with pre-covid crises) and can be a vector of corruption as the principal-agent problem remains unsolved and those entrusted with the well being of the community often work to enrich and empower themselves instead.

This is not to say that we cannot say the community ought to have more of a say, this is merely to point out that there is a tradeoff that affects society in general. If we are to succeed in making this tradeoff it will be in part by better aligning the interests of business owners and the community, and we should be aware of how hard a problem that is when we go to attack it.


> The entire engine of America’s superior prosperity (even at the individual level) has been based on risk taking

This is a story that America tells itself. It's not necessarily true.

And currently the USA is in enormous debt, partly because of the vast cost of bailing out its financial institutions and large corporations. "Risk-taking" is increasingly only being done by private individuals. Large American businesses are certainly not being exposed to the results of their risks - they're being bailed out, socialising the risk but privatising the reward.

In this case, if the community is shouldering the responsibility of bailing out companies that are in danger of collapsing, shouldn't there be some "impairment of ownership" as you put it? Aren't those communities entitled to ask that the company is run for their benefit too?


The option to take risks with the business is not a right, but a privilege. It brings prosperity only if exercised carefully. It is exercised carefully less and less, with more and more arrogant excuses, and sadly, it will be the USA's undoing.


Although I lean towards seeing killing the company as short-sighted, suppose you then invest that money back into a new venture. You have far more capital than you would have had you let the business remain healthy. This may give you even better value in the long-run (say 200 years) if you let this new venture live. But if you kill this venture too and invest its earnings into yet a third venture...


Right - and I think that's generally what you'd see people doing.

I'm not even sure it's necessarily an intrinsically bad pattern - there can be short-term opportunities and circumstances that make a strategy worthwhile for a number of years but not further. I think the issue broadly is twofold. First is the collateral damage - companies forming and dissolving is fine for the investors but murder for the employees, whose livelihoods and health insurance become precarious. Second is that we've applied this to the entire economy in a way that makes us incredibly vulnerable to systemic shocks - see America's toilet paper supply between the months of March and July. Again, on an individual company-wide basis, it might still have been more profitable for Procter & Gamble to do whatever the hell it is they did to make 2-ply an impossible technology to reproduce domestically in 2020, but on an economy-wide basis, the fact that Everyone did it was a goddamn disaster.


> companies forming and dissolving is fine for the investors but murder for the employees, whose livelihoods and health insurance become precarious

That's a problem with the social safety net, not a problem with companies failing.

A lot of companies are going to fail even without someone actively trying to drive them out of business.

Some industries just have collapses in demand and no longer do something that anyone wants. Some companies are just mismanaged.

We need to provide support for the employees who are victims of companies collapsing, not try to prevent any company from ever collapsing.


Yup, but that’s achieved via taxation, currently one of the costs that are being trimmed in the 19M/y example above... It’s a case of having your cake and eating it too


Completely agree with this.


What number of companies still operate in the same form for several decades? A company dying isn't necessarily a negative. The assets don't get set on fire; the people working there don't disappear into a black hole. Investors might get fleeced, but overall the assets and people are put to other ends, and the investors in those ends benefit. The worst outcome is those zombie companies that only exist for the sake of existing.


Yes, invest in a mega yacht or juicero or theranos, etc.


Only if you define local interest as purely financially motivated. I am not even talking about the ethics, but even social status is not only dependent on wealth. We are much more than a bank account number, thinking that we are is more or less a pathological disease, because rationally it makes no sense.


> Only if you define local interest as purely financially motivated <...> social status is not only dependent on wealth

For an awful lot of people, it basically is - I mean, practically the entire field of finance and professional management would look at the example I gave and say, "that's exactly the right thing to do." Mitt Romney's entire professional career follows that principle, and it's largely seen as a net positive to his political career (as a Republican).

I agree with you, I think it's an enormously damaging philosophy, both to the bearer and to the rest of us, but again, it may be locally optimal.


I don't know, I am not a fan of Mitt Romney, but his social status seemed to be more dependent on being a successful governor and managing the Olympics. Both are net positives for society; his business record actually was his weakness.


Could you help me understand this a little better and define what you mean by local interest?


I'm not using a rigorous definition, but essentially, consider two outcomes, each affecting 100 people:

  1. Every person accrues $10
  2. One person, "Bob", accrues $100, everyone else accrues $1
Generally speaking, outcome 1 has higher overall benefits, but outcome 2 is better for Bob. If Bob is the decision-maker, it's in Bob's interest to pick outcome 2, even if it's globally worse.

This is what I mean by local vs global interest - roughly, "in the interest of a given individual" vs "the best outcome across all individuals."


With respect to your 200 years figure, $190M also nets $3.8M/yr investing very conservatively. Even $50M once would usually be valued higher than $1M/yr forever, in a time-value-of-money sense.
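
The arithmetic behind those numbers, for anyone checking (the 2% rate is my assumption, not the parent's):

  r = 0.02           # assumed conservative annual return / discount rate
  print(190e6 * r)   # $3.8M/yr of investment income on the $190M
  print(1e6 / r)     # present value of $1M/yr forever = C/r = $50M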


Behold! Private Equity



That'd be a good post for HN, especially since the topic of efficiency/robustness has come up a few times recently.


One of my main takeaways from Antifragile was that the people often involved with making processes efficient have no business being in that position. He was right to label management science as quackery and practitioners as charlatans.


Reminds me of Machiavelli’s comments on the French state (many small warlords) vs. the Ottoman state (single supreme leader.) The French state was less efficient, but more resilient and difficult to conquer, while the Ottoman state had more efficiency but was highly fragile.

In some cases the old king of the conquered kingdom depended on his lords. 16th century France, or in other words France as it was at the time of writing of The Prince, is given by Machiavelli as an example of such a kingdom. These are easy to enter but difficult to hold.

When the kingdom revolves around the king, with everyone else his servant, then it is difficult to enter but easy to hold. The solution is to eliminate the old bloodline of the prince. Machiavelli used the Persian empire of Darius III, conquered by Alexander the Great, to illustrate this point and then noted that the Medici, if they think about it, will find this historical example similar to the "kingdom of the Turk" (Ottoman Empire) in their time – making this a potentially easier conquest to hold than France would be.


> The French state was less efficient, but more resilient and difficult to conquer, while the Ottoman state had more efficiency but was highly fragile.

Machiavelli is saying the opposite of what you think he's saying.

He's saying that France's governmental structure makes it (relatively) easy to conquer. Because there are many quasi-independent, competing fiefs, an invader is not necessarily facing a unified front, and may in fact be able to recruit dissatisfied lords to their cause. But that doesn't make it the kind of place you'd want to rule, because once you've conquered it, it's (relatively) easy for another invader to conquer you for the same reasons.

In contrast, there were no fiefdoms in Persia. Unlike the lords in France, the regional rulers in Persia were chosen by the state, and picked for their loyalty. When invading Persia, you are far more likely to face a united front, making it (relatively) difficult to conquer. That said, once you've conquered it, it would be (relatively) easy to hold for the same reasons.


French states are not easy to conquer, they are easy to enter, for the reasons you’ve stated.

Otherwise, we are saying the same thing.


So France was its own Vietnam. Fitting.


It is a fundamental law.

Systems theory has the concepts of "gain margin" and "phase margin" -- how much you can amplify feedback or delay feedback, respectively, before your self-adjusting feedback mechanism fails to find equilibrium and turns into an oscillator.

Even though most non-engineering systems don't fit the mathematical theory, the idea that only a finite amount of gain + delay is available, and that the two are somewhat inter-convertible, generalizes astoundingly well.
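
For the curious, here's a sketch of computing both margins for an arbitrary example system, L(s) = 4/(s+1)^3 (not any particular real plant):

  import numpy as np
  from scipy import signal

  # open loop L(s) = 4 / (s + 1)^3 = 4 / (s^3 + 3s^2 + 3s + 1)
  L = signal.TransferFunction([4.0], [1.0, 3.0, 3.0, 1.0])
  w, mag_db, phase_deg = signal.bode(L, np.logspace(-2, 2, 10000))

  # gain margin: spare gain at the frequency where phase crosses -180 degrees
  w_180 = w[np.argmin(np.abs(phase_deg + 180.0))]
  gain_margin_db = -np.interp(w_180, w, mag_db)

  # phase margin: spare phase at the frequency where gain crosses 0 dB
  w_0db = w[np.argmin(np.abs(mag_db))]
  phase_margin_deg = 180.0 + np.interp(w_0db, w, phase_deg)

  print(gain_margin_db)    # ~6 dB: double the loop gain and it oscillates
  print(phase_margin_deg)  # ~27 degrees of delay headroom at crossover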


Do you happen to know a good introduction to that? I'd love to read up more on it!


Steve Brunton, a University of Washington professor, has a sweet lecture series called the Control Bootcamp that dives deep into control fundamentals while still being approachable. Great production values too: https://www.youtube.com/watch?v=Pi7l8mMjYVE&list=PLMrJAkhIeN...


This youtube channel has (seemingly good quality) Khan Academy style lecture videos on control systems, if you just want a quick introduction to the concepts.

https://www.youtube.com/watch?v=ThoA4amCAX4



Yeah, as a controls engineer I’ve worked on systems where the main requirement was not gain or phase margin, but a root sum square of the two. Nichols plots drive home the idea that it’s not really the margin at two discrete points that matters, but the close approach to the critical point.


> It is a fundamental law.

> Systems theory

I'm confused. Is it a law or a theory? And no, it can't be both. Laws are proved. Theories are unproven.


You are confused, twofold:

"Systems Theory" in above refers to a field of study, not a name for the idea they were discussing.

And your assertion about Laws and theories is incorrect.


Well, those two statements aren't referring to the same thing. The law they reference is something found in Systems Theory.

It may be useful, also, for you to read these two pages:

https://en.wikipedia.org/wiki/Scientific_law

https://en.wikipedia.org/wiki/Scientific_theory

The difference between Law and Theory (in scientific discourse) is not what you believe it to be.


> The difference between Law and Theory (in scientific discourse) is not what you believe it to be.

ELI5?


Jtsummers' explanation is correct, but I wanted to weigh in with an easy rule of thumb, which you may find easier to remember:

1. A law gives you a relationship with no mechanism. It almost always appears as a mathematical formula. You know how the pieces change with respect to one another, but not why.

2. A theory gives you a mechanism. It tells you why. On rare occasions, it will not provide quantifiable predictions, in which case it is a qualitative theory.


Well, I'm not interested in explaining this to a five-year old, but I'll treat you as an adult and use words over two syllables.

> Laws are proved. Theories are unproven.

This is not what is used to identify something as either a law or a theory in scientific discourse.

First, "laws" may be disproven, or falsified: Newton's Law of Gravity could actually be considered disproven as it is not accurate at all scales, but it's accurate enough within its scope to continue using it. It's considered a law in the sense that it matches empirical, observed data (within certain bounds). See [0] for details on that. So that, right there, is a flaw in your understanding.

Second, laws do not attempt to offer an explanation of the phenomenon they describe, they offer predictive value like "a ball dropped from 50 meters will, at time t, have velocity ...". Again, Newton's Laws do not explain why gravity works, only offering a model to calculate the effect of it. This brings us to theories.

Theories are, again, falsifiable via empirical evidence (like laws), but they offer an attempt at explanation. A Theory of Gravity would try to explain why objects are attracted to each other and why the mass affects the amount of attraction. The theory can be shown as false, but like a law it can only be shown to match empirical data. This is not the same as proven.

TLDR: The distinguishing characteristic is that both attempt to predict, but theories attempt to explain. Both are falsifiable, and neither is considered proven - each can only be shown to match empirical data (possibly within some constraints).

[0] https://en.wikipedia.org/wiki/Newton%27s_law_of_universal_gr...


Your confusion is understandable. This misunderstanding surfaces every now and then on HN.

“Hypothesis. Theory. Law. These scientific words get bandied about regularly, yet the general public usually gets their meaning wrong.”

https://www.scientificamerican.com/article/just-a-theory-7-m...


I think this is probably better characterized as "efficiency erodes resilience". You can have stability if there are no perturbations. However, if there are, and you have optimized for a regime where there are not, you are very exposed to risk. This is pretty much Table's notion of antifragility as well as the study of resilience engineering.


There's an engineering version of "stable" that might be useful here to draw lines between the three or so different concepts being discussed. A "stable" system is one that will return to the same resting state when perturbed. One can have "equilibrium" in an unstable system, e.g. balancing a broomstick on one's hand, but the system will not return to that equilibrium if perturbed.
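
A toy numerical sketch of the difference, with arbitrary constants - a damped oscillator versus an inverted-broomstick-like system:

  # x'' = k*x - damping*x': k = -1 is stable, k = +1 is an unstable equilibrium
  def settle(k, damping=0.5, x0=0.01, dt=0.01, steps=4000):
      x, v = x0, 0.0  # start slightly perturbed from rest
      for _ in range(steps):
          v += (k * x - damping * v) * dt
          x += v * dt
      return x

  print(settle(-1.0))  # ~0: the perturbation dies out, back to rest
  print(settle(+1.0))  # enormous: the same small nudge grows without bound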


This idea also helps when thinking about local/global (I'd argue most of what we do is local optimization of minima, and what challenges us is when the assumptions about locality of the phenomenon are challenged enough that we can't guarantee that stability).

And the idea is also applicable to trajectories rather than singular points: if you change your starting point by an epsilon, would the trajectory be vastly similar or a bit different? or very different? cue in Lyapunov fun and Lipschitz continuity[0] as a metric and, to a lesser extent, conditions for chaotic trajectories to emerge.

[0] https://en.wikipedia.org/wiki/Lipschitz_continuity
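
The classic toy demonstration is the logistic map at r = 4 (a sketch; the seed values are arbitrary):

  # two trajectories starting an epsilon apart in a chaotic map
  r, eps = 4.0, 1e-9
  a, b = 0.3, 0.3 + eps
  for _ in range(60):
      a, b = r * a * (1.0 - a), r * b * (1.0 - b)
  print(abs(a - b))  # O(1): the epsilon gap has grown to macroscopic size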


It's not a binary opposition. In control theory you can make a system unconditionally stable, or you can allow a little oscillation around the ideal, or you can make it unconditionally unstable so the tiniest hint of noise drives it to an extreme.

In some systems you want to generate constant oscillations, so your system can be in stable perpetual dynamic equilibrium producing a nice sine wave.


I think you mean resilient when you say stable: A stable system NEED not get back to resting state by itself. A resilient one does.

Along these terms, I think equilibrium doesn't really make sense - it means stability in many definitions. From the top google result:

> Equilibrium is defined as a state of balance or a stable situation where opposing forces cancel each other out and where no changes are occurring. An example of equilibrium is in economics when supply and demand are equal. An example of equilibrium is when you are calm and steady.

Another take on this: (Stable vs unstable equilibrium) https://web.ma.utexas.edu/users/davis/375/popecol/lec9/equil....


> Table's notion of antifragility

sigh, someday we'll have auto-correct that just works. Doesn't even need to be AI, just use words in the current page. Heck just use contextual info such as the capitalization. Somebody, please?


"efficiency erodes resilience" is a good line, at a loss of symmetry. resilience also erodes efficiency.


Yes, it's basically a myopic under-appreciation of the current system. I was guilty of that, and many are, I believe; we all strive for better, but sometimes our perspective is off. (not to bring him up on every topic, but Alan Kay said that perspective was worth a lot of IQ points)

In the list of anti-perfection patterns there's mechanical jitter: a catastrophe-avoiding relaxation.


Relieved to learn that other people dwell on this as well. My anxiety stems from the idea that modern corporations are incentivized to ruthlessly optimize for efficiency--short-term gains--thereby outcompeting corporations that are structured for longer-term outlooks by engineering redundancy (which you call stability) into their processes. I don't know how to begin to incentivize the idea that efficiency is not the end-all.


See also "Slack", a book about this very issue.

https://www.penguinrandomhouse.com/books/39276/slack-by-tom-...


It's funny, I've been sucked into Factorio after the 1.0 release announcement and this really rings true.

If you make a production line that has perfect throughput with no buffers then you get fantastic efficiency and productivity right up until a single train hits a biter and is delayed by 30s.

Then you spend 3 hours trying to deal with every machine being out of sync with every other machine with constant start/stops :(
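
You can reproduce the effect outside the game with a toy producer/consumer simulation (made-up rates, with a 30-tick stall standing in for the delayed train):

  def shipped(ticks=600, initial_buffer=0):
      buffer, out = initial_buffer, 0
      for tick in range(ticks):
          if not 100 <= tick < 130:  # the producer stalls for 30 ticks
              buffer += 1            # otherwise it adds one item per tick
          if buffer >= 1:            # the consumer needs one item per tick
              buffer -= 1
              out += 1
      return out

  print(shipped(initial_buffer=0))   # 570: every stalled tick is lost output
  print(shipped(initial_buffer=30))  # 600: a 30-item buffer rides out the stall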


There's a similar relationship between efficiency (convenience) and security (safety). Examples are everywhere. A centralized system (efficient) is a SPOF (not safe). Aggressive caching can be fast but unreliable. Adding a lot of features in a short term makes the code unstable, etc, etc.

People often focus on one thing and overlook the sacrifice they're really making. Everything has a tradeoff.


That concept of efficiency as the opposite of stability seems a bit fallacious in the strong case - inefficiency itself can and has caused systems to collapse, which is the exact opposite of stability.

Whether a cache/reserve helps or hurts efficiency is itself complicated and situational. Overfitting could make a system fragile, but it also depends upon the relative costs that a blind pursuit of efficiency goes for. If something is cheap enough, like data storage, there will be plentiful slack, because slimming it down is irrelevant to efficiency - why bother going with custom 4KB chips when economy of scale means a 250 MB one is cheaper and better? It just isn't worth trying to cut corners there.

A laggard damped system would take longer to get into a "bad state" assuming the environment doesn't demand rapid changes as the baseline to survive. Bad state is relative as always - one can doom themselves both by leaping onto the "next big thing" which isn't and by sticking to the buggy whip and musket when others have cars and automatic rifles.


I think you're both talking about different ideas here. Efficiency is good. But redundancy is also good (necessary, even, for a resilient system), and the problem is that you can always increase efficiency by removing redundancy, so it does get removed by short-sighted efficiency-optimizers.

Topically, in the past week we've seen two giant companies, Adobe and Canon, lose unimaginable amounts of user data. If they had had backups, which are a form of redundancy, this would not have been a problem. But the backups were too expensive--too inefficient--and so now customer trust in their service is absolutely destroyed.


If you are strong everywhere you are weak everywhere. The article is just plain wrong. The problem is the lack of a strategic reserve, not efficiency in itself.


Similar to your argument, but another way to think is that efficiency adds dependency on a higher form of skill (ability to multi-task) or a complicated system.

I dread the day when Google Maps, traffic stats, and Uber have actually delivered us the "perfect people transporter system", maximizing the heck out of existing infrastructure (cities, roads), and then the inevitable happens.

Systems become too big to fail.


That’s one of the points of “The Goal”. As in you need slack to be more productive over time. https://www.ebay.com/p/117280427

I leave the eBay link because last I checked used copies on amazon were very pricey.


It's still in print so you can just get a new copy instead.


I think of this as an optimization kind of problem. The word "efficiency" itself is only meaningful in context of what's being made more efficient.

A system could be "more efficient at becoming stable," for example.

But if by "efficiency" we limit ourselves to mean "the time-cost of a set of actions," (as in the most efficient path is the one that takes the least time), we quickly encounter problems with maximizing usage of time and how that conflicts with unexpected work, which leads to the anti-stability you mentioned.

The way I think about it is that a 100% time-efficient process has zero time-flexibility. If you want to gain time-flexibility (e.g. the ability to pivot, or to work on different things too, or to introduce error bars to your calculations), you lose time-efficiency.


Dr. Richard Hamming said in his lecture series “Life Long Learning” given at the Naval Postgraduate School: a perfectly engineered bridge would collapse if an extra pound was driven across it. I used to think this was a joke, or at least said in jest. Nope.


When I think about personal finance, I often think about efficiency. The obvious examples include buying in bulk, avoiding finance charges to optimize what you get for your money, paying insurance up front when the payment options charge extra, and just plain having fewer subscriptions to keep monthly expenses lower and easier to track.

All of this efficiency increases financial stability. I suppose if we argue that I'm only referring to optimization and not efficiency, then perhaps it's not a great argument.


The two are certainly related but it feels like optimization is a bit different. Maybe the efficiency equivalent for personal finance would be an example like keeping your bank balance at a minimum and immediately transferring any spare cash to paying down a mortgage or otherwise into a fairly illiquid investment because that's where the best returns are. But now if you have an unexpected expense you have to scramble to come up with funds.


Ah yes, I think that makes it more clear.

In this case, efficiency might be automating bill payments, but then you don't catch price changes, and depending on your other systems in place, you might not notice overdrawing your account.


The colloquial words may not be the best. It seems like it's a distinction between optimizing things for which there is essentially no meaningful downside except maybe a bit of thought and time. Versus setting things up so that they're the most efficient system in the current environment but may not handle unexpected events well and therefore might not be the best choice.


Rather than stability, think of it as robustness or flexibility. Past a certain point, efficiency is an enemy of robustness.

Buying in bulk is cool. If you buy a big package of paper towels, it's cheaper and you don't have to worry about running out. The fact that you have a cabinet full of paper towels isn't a big deal. But suppose you find a really great deal on a semi-trailer load of towels and stock up. Now you have your guest bedroom full of paper towels. The next week, your cousin Edgar's house burns down; you'd like to offer him and his wife a room to stay in temporarily, but you have all this paper in the way. You have lost some flexibility.

A bigger problem is, say, corporate supply chains. With just-in-time supply, you don't have to store inputs and can focus on producing outputs; it's very efficient. But then there's a pandemic or a fire in a factory somewhere, and the supply chain falls apart. Now your outputs are perhaps in greater demand, but you can't take advantage because you have no inputs. You're out of business for the duration. You can't flexibly respond.


JIT isn't the problem in your last example, it's a weak supply chain. If you have a singular source for some critical or necessary thing, you have a risk whether you're using JIT concepts or not. Though higher if you take JIT to an extreme. Your sources need to be that, sources, and ideally geographically separated. See the various industry crises following natural disasters in Taiwan for why.

The proper, though not easy, thing to do is have some primary sources of your materials and secondary sources to handle surges and supply issues with your primes. You may still have a production slowdown, but hopefully not a shutdown. And like with backup, if you don't use them you don't have them. You have to place orders with all sources, and use materials from all sources to ensure that they are in fact up to snuff. It'd suck to buy your RAM from Foo until a flood, and then find that BAR's RAM doesn't actually work (or doesn't work with your motherboards).


Right.

But, having a single source is more efficient than multiple sources---you can integrate tighter with their ordering system, packaging, product quirks. Until it stops working.

Having all of your sources in one country is more efficient. Until it doesn't work.

Managing all of your materials just-in-time is more efficient. Until a backhoe hits the gas main in the street outside and you can no longer get trucks into your factory and have to shut down the line for the duration.

A company with a weak (but not completely idiotic) supply chain will have a significant margin over a company paying extra for a strong supply chain. Until something goes wrong.


I guess my point is that, in everything I've read and experienced, JIT isn't about efficiency to an extreme (drop to 1 supplier, have 1 uberefficient factory, have 1 expert). You still maintain a buffer, and if possible multiple suppliers. The source of JIT (in manufacturing) is primarily Lean or the Toyota Production System. Those do not endorse zero-buffers or sole-sourcing for the purposes of efficiency. The purpose of reducing buffers is to expose inefficiencies elsewhere. And you find the right level of buffering for you based on your suppliers and risk tolerance.


But it is not simple to find the "right" level.

You do not have to live forever, so surviving an extinction-level event is more than the "right" level.

Being so brittle that the failure of one delivery kills you is less than the "right" level.

How long do you want to live? And when you die, how is it going to happen?

It is pleasing to see firms go out of business in an orderly fashion, such that they have the resources to shut up the shop properly, and everyone in the business gets "onto the life-rafts". But it is more common to see businesses go down in a screaming heap with huge debts and big messes for others to clean up.

When I was at business school I was taught that a firm should have the correct amount of debt such that it maximised the tax shield of interest payments and had no fat that a hostile acquirer could use to do a hostile takeover.

That was in 2007. I am quite sure the course teaches something different now!


Just in time is synonymous with always late. Might work well with simple fungible goods but breaks down on complex stuff


complex stuff like building cars?


I think so, but you'd have to ask insiders whether it works for them or in spite of it. In my industry it's the latter: product arrives late and there are enormous recovery costs that are rarely mentioned or properly accounted for; everybody up top just sort of pretends it's not routine when it actually is. Typically these sorts of things don't make it out to the general public in press releases, though. Holding a bit more product and some spares would have been far cheaper than the constant recovery effort.


A quality of most JIT systems I've experienced (and texts on the topic) is maintaining a sufficient buffer (alluded to in your last sentence). Taking it to its extreme (that is, essentially 0 inventory buffer) JIT only works if everything else works perfectly. That is an extreme position not endorsed by anyone or any text I have ever encountered. But it does remind me of the shitty takes on Agile I've seen (no planning, in particular). So seeing it in the wild would not surprise me.

"Sufficient" is dependent on a lot of factors, though. No one can tell you a definitive answer without knowing your system (your suppliers, your customers, your rates of production, your supply of capital to withstand a drop in production).


funny that you mention shitty agile, we do that too


It increases your financial stability under the assumption that your life situation evolves in a predictable way. Say you need to cover unforeseen medical expenses, or you develop an allergy to one of the foods you bought in bulk (sorry, that one was a bit contrived), etc. - well, then you'd suddenly wish you'd held off on buying that pallet of canned soup.

On the other hand, if you only buy your food day to day, that is certainly more like JIT logistics, prevents waste & storage space needs, etc., but it screws you if you can't leave your house and the stores get closed due to some... ahem... what might possibly happen that forces you to stay inside.

So it's always a matter of your frame of reference, I guess.


> paying insurance up front when the payment options charge extra

Depending on the cost of the insurance, that sounds to me like a drop in stability: you have infrequent periodic large payments to make instead of frequent smaller payments to make. If you had an unexpected expense arise near the time of the large insurance payment, your financial situation could get temporarily bad; if instead your insurance was small payments on a monthly basis, the unexpected expense would be easier to ride out.

[Note that I pay all of my large expenses in lump sums instead of in small trickles, but that's mostly psychology on my part not efficiency or optimization]


> Whether it's about the economy at large, your own household, a supply chain, what have you - as soon as you optimize for efficiency by removing friction, you take all the slack/damping out of the system and become instantly more liable to catastrophic failure if some of your basic conditions change

I would hope that the fragility of JIT supply chains was laid bare for everyone in the Covid crisis but I expect that lesson will soon be forgotten.


Reminds me of something six sigma manufacturing types call "building monuments." This refers to over-investing in optimization and automation to the point that there is too much sunk cost. As soon as something changes you're in trouble and all that sunk cost is gone, or even worse you can be stuck and unable to produce until you've retooled significantly.


Maybe the problem is that efficiency and robustness are orthogonal?


If you think about the two-dimensional efficiency/robustness space you can pretty easily see how it's isomorphic to the return/volatility space by way of simple transformations. If you allow yourself to be convinced that such a bridge exists, you can also bring tools in the return/volatility space back into the efficiency/robustness space by applying the appropriate inverse transformations. Maybe the conclusions are common sense, but I'd still read a blog framing that process by way of analogy.


Interesting idea, thanks for sharing.




They're orthogonal for non-trivial decisions. Once you hit the efficient frontier of anything, you need to start making trade-offs.


That's a much more reasonable statement.

Add to that that if you optimize one from a set of orthogonal values, the other ones tend to decrease. And so you get all the people claiming there's an intrinsic relation between them, in the face of a world of evidence.


They at least aren't in direct tension.


Up to a point, being inefficient conflicts with robustness. If you're profligately wasting a resource, you are in for a hard time if that resource dries up.

On the other hand, past a point, efficiency conflicts with robustness. To maximize efficiency, you become tightly coupled to some resource and again have a hard time if that resource dries up.


I think this requires more thorough definition. Obviously you can be inefficient without being stable. E.g. I could drive my car at 5 miles an hour to work which would neither be efficient nor stable.


It certainly would be stable. A very moderate increase in speed would allow you to make good on a substantial delay for any kind of reason.


It would not be stable as it would significantly increase the risk of accidents, road rage, or being pulled over for going too slow.


That instability is not because of the 5mph but because you're driving 5mph in a 60mph zone.

You could drive 5mph on back roads with less risk of course.


agreed. the two objectives of efficiency and robustness are not necessarily in conflict in all situations, but if you start trying to optimise a given system focusing on only one objective, the other one may degrade arbitrarily. better to define what tradeoff would be a good deal: how much efficiency would you be willing to lose to gain 1 unit of robustness, etc. then optimise both objectives taking into account your preferred exchange rate.

there's plenty of examples of this kind of thing in engineering design situations. it's cheaper (i.e. more efficient usage of capital, at least in the short run) to not allocate resources for backups or extra capacity in systems that isn't used 95% of the time. it's much more expensive to dig two spatially separated trenches to lay independent paths of fibre optic cable to a given building, but if you cough up the money for that inefficient redundant connection, your internet will have decreased risk of interruption by rogue backhoes. it's cheaper to not hire enough staff and get individuals in a team to over-specialise in their own areas of knowledge rather than having enough spare capacity and knowledge-sharing to be able to cover if people get sick, go on holiday or quit.

ref: https://en.wikipedia.org/wiki/Multi-objective_optimization
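
a sketch of that exchange-rate idea, scalarising the two objectives with an explicit weight (the candidate designs and scores are made up):

  # candidate designs as (efficiency, robustness) scores
  designs = {
      "single trench": (10.0, 2.0),
      "dual trench":   (7.0, 9.0),
      "triple trench": (5.0, 9.5),
  }

  exchange_rate = 0.8  # units of efficiency you'd give up per unit of robustness

  def value(name):
      efficiency, robustness = designs[name]
      return efficiency + exchange_rate * robustness

  print(max(designs, key=value))  # "dual trench" wins under this exchange rate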


Right, there have to be some boundaries defined for what's within "reason".


> I've found "efficiency as the opposite of stability" a very powerful concept to think about

I think this concept misses capacity. In my opinion, it is crucial that you always leave some over-capacity to have stability (let's say, you run at most at 80% capacity). If you then increase your efficiency without sacrificing your buffer capacity, everything is fine. But as soon as you try to run at more than 80% capacity to be more efficient, the slightest problem can have devastating effects.


Efficiency is a broad concept. When it comes to energy, I don't consider stability incongruous with efficiency at all: the less you waste over time, the more stable your electricity demand growth, which saves money and improves QoL for citizens.

In general, learning how to do things better can produce efficiency and more stability too, if the way is better all around.


This is also the primary topic of “blue ocean strategy” - with the difference that there, running without slack causes the system to grind to a halt and/or run in “emergency rush order mode” much of the time. In such a case efficiency and resilience/stability have some linear dependence. Sometimes these “opposites” are actually working together.


Apparently also the reason why plants forgo green light (i.e. the most abundant part of the solar spectrum): https://www.quantamagazine.org/why-are-plants-green-to-reduc...


This efficiency vs. resilience trade off seems to be a general pattern, also for organic systems.

https://www.quantamagazine.org/why-are-plants-green-to-reduc...


That makes sense. If efficiency is achieving maximum ROI, then the best-ROI activities are generally the highest-risk ones, which have the most chance of failing and are thus least stable.


The rule holds true for switchmode power supplies.

Sometimes you have to add a resistance (inefficiency!) in series with the output capacitor to achieve stability.


But even more effective and efficient to use a transfer resistor


"stability" strikes me as the wrong word there; maybe "resilience" or "flexibility"?


I'd say resilience instead of stability, but I agree that it's worth thinking about in many contexts.


Very interesting thought! Although I’d probably call it anti-fragility or resilience instead of stability.


Another way I like to state this is "bottlenecks are beautiful design devices".


Following this mindset, blockchain can be described as low efficiency and high stability?


There's always someone making weird correlations between Blockchain/Cryptocurrency and some random idea.


This isn't really correct. A lot of things are neither stable nor efficient.


> A lot of things are neither stable nor efficient.

Yes, and those are the shitty things.

But when you want to improve them, make them non-shitty, you're facing a choice: stability, or high performance - pick one.


Life is short and we’re not building generational ships so let’s just go fast and efficient and if it crashes it crashes, but by then the lion’s share of profits will have been made and maybe even spent.


I can't help but think of government (or management).


The best efficiency arises from simplicity. Yes, haste makes waste, and a system that's constantly changing will be unstable.

But efficiency itself is neither haste nor churn; in fact, the opposite.


But isn't this a subtle diss against capitalism or a purely market based system? A free market results in _efficient_ allocation of capital, resulting in a _fragile_ allocation.


Also in government. It’s really good that they are slow and inefficient (although it would be nicer if they were less wasteful).

There is little worse than a very efficient government.


A very efficient singular planetary government (or, excessive international homogenization of approaches) seems like one plausible example.

To me, this very conversation is the approach we should be taking to the major problems du jour on the planet (treating it as an incredibly complex system with an infinite number of interacting variables, many of which we do not even know exist). But it seems as if once a system reaches a certain level of complexity, we lose the ability to even realize that it is in fact a complex system, and insist upon discussing it only in simplistic terms. Or maybe it's the fact that we are embedded within the system that makes it impossible to see.


This comment thread is making me realize we don't have a good word to distinguish between efficiency as in "we only have 7 hospital beds because that's all we need on 99% of each day" and efficiency as in "we replaced steps X,Y,Z with just step X', because we found that X' could accomplish everything that XYZ could accomplish but it's faster, more accurate, and cheaper".

One makes a tradeoff by reducing overheads and buffers, and the other doesn't have any tradeoffs, it's just a better way of doing things based on novel techniques.


This seems like it sort of ties in with NN Taleb's idea of "Anti-fragile" [1].

Perhaps also Chesterton's fence [2].

Maybe also the whole premature-optimization thing [3].

And of course the too-clever-by-half coyotes [4].

Really maybe it just comes down to "be wary of making changes that reduce resiliency."

I was hoping to come up with something cohesive with this comment, but really I guess I just agree with what you say.

And I think there are a bunch of people sort of circling around the same idea, which I don't think we've really quite landed on a precise definition of, just as you say.

[1] https://en.wikipedia.org/wiki/Antifragility

[2] https://en.wikipedia.org/wiki/G._K._Chesterton#Chesterton's_...

[3] https://en.wikipedia.org/wiki/Program_optimization#When_to_o...

[4] https://www.epsilontheory.com/too-clever-by-half/


> [4] https://www.epsilontheory.com/too-clever-by-half/

Thank you for this link. I'm halfway through that article and will probably read every single one on that website.


I love this comment on so many levels ;D


epsilontheory.com I love every article I have read so far. It's really expanding me. Thanks for sharing.


Unfortunately, it's probably quite wrong.

He uses the analogy of 'too clever by half' to exemplify his idea that 'financial innovation is always wrong'.

Nothing could be further from the truth. Insurance products have changed the world just as much as any technical innovation.

Mortgage-backed securities are not a bad thing, far from it: they allow more efficient use of capital by having 'saving Germans and Japanese' invest their money where they otherwise would not be able to.

The problem in the 2008 crash was soft systematic corruption and, un-ironically, a lack of fragility (i.e. one bank goes down and it takes the rest down like dominoes) - not necessarily the securities themselves, for which he didn't actually even provide any basis for his negative assertion.

Efficiency is usually how we gain productivity and it's borderline absurd to say there is inherently something wrong with it on the whole. Like anything 'it depends'.

If you can have a software algorithm outperform 100 analysts on weather predictions for your fleet of drivers ... that's probably efficient. But cutting down operating margins so that any bump in the economy will leave you flat is maybe 'over optimisation'.


> He uses the analogy of 'too clever by half' to exemplify his idea that 'financial innovation is always wrong'.

I don't really have a horse in this race, but I think you're misreading the article.

He says: "Every truly disruptive discovery or innovation in history is the work of coyotes. It’s always the non-domesticated schemers who come up with the Idea That Changes Things. We all know the type. Many of the readers of this note ARE the type."

That's not a criticism, that's a point of praise.

He then follows it up immediately by saying: "Financial innovation is no exception. And this is Reason #1 why financial innovation ALWAYS ends in tears, because coyotes are too clever by half. They figure out a brilliant way to win at the mini-game that they’re immersed in, and they ignore the meta-game. Eventually the meta-game blows up on them, and they’re toast."

That isn't saying it's a bad thing, it's saying that the people who come up with the new ideas lose sight of the broader picture and get taken out by "the thieving raccoons" and the State.

He's saying "the coyotes" lose sight of the broader picture, just like in the famed XKCD [1] where the "too-clever-by-half" computer person encrypts all their data, and forgets that the thug who is going to come looking for it will just beat the piss out of them with a wrench until they turn over the key.

The core nugget of the article, in my opinion, is exactly the "meta-game is what always gets you" aspect.

It's the same thing that NN Taleb refers to as "2nd order effects."

[1] https://xkcd.com/538/


the way you wrote this comment, with the sources fully visible at the bottom - amazing. I will start doing this myself.


In economics we would call the latter a Pareto improvement - i.e. an efficiency improvement without any trade-off.



I don't know if those two things are as different as you suggest. Multiple steps add redundancy; if one step fails, you can get much of the value from the other steps. More steps are frequently added in response to edge cases.

Could be a Chesterton’s Fence scenario

Instead of bemoaning efficiency, it’d be interesting to reward/value redundancy and antifragility, at least at the system level.

I think this could mean trust busting, regulation, and general cultural shifts.


Multiple steps don't always add redundancy, sometimes they're just noise or artifacts. I had a colleague who, when opening a file, would first click on the Windows desktop, then "My Computer", then navigate to the directory, open the file, and close the Explorer window.

There was zero redundancy versus leaving the directory open so they could open the next file (or using the application's "Open File" dialog).

That is a perfect example of wasteful motion (in their case due to a poor mental model of how computers worked, as I learned through later discussions) that could be simplified significantly without loss of quality or redundancy in the system.

Contrast this with: The surgical office called me this morning and stated, "The surgery is for a ganglion cyst on your left wrist." Which I confirmed. When I go in on Tuesday for the surgery this will be repeated, and a mark will be made on the area to be cut open (though in this case it'd be really hard to screw up and open the right wrist, since the quite visible cyst is only on the left). That is useful redundancy of the sort you describe. Remove any step (the initial visit a week ago, the call today, the check when I arrive, the mark on the wrist) and you increase the risk of error.


Lots of good examples in sports, where wasted movements lead to failure, particularly at the higher levels


Yep. Not even just sports games, but athletics and movement in general. Learning to avoid certain movements when running, swimming, or rowing is very important, especially if the goal shifts from completing a short distance as fast as possible to completing a long distance in a reasonable time (and often the more efficient form for long distance translates into faster short-distance results as well).


My favorite story about unintended consequences of process improvement is the Vienna Sausage Factory: https://medium.com/dangerous-kitchen/vienna-sausages-a-guy-n...

Punch line - Sausages coming from a new modern factory didn't taste the same. The new, more efficient building removed a long transportation step where the partially finished sausages picked up flavors and scents coming from different parts of the factory. They had to create a new process to manually add those flavors that they were accidentally getting for free from the old factory layout.


I wasn't aware Vienna sausage had any flavor in the first place. In my view, Vienna sausage is in the same category of food as spam, baloney, and wonder bread: total garbage.


Wrong. Quality sausages are awesome. Btw, high-quality spam exists as well; I've never seen it in the USA though.


No, I mean the Vienna sausage brand. Of course there's amazing sausage.


As long as those steps Y & Z are actually adding redundancy/antifragility, then yes. If the purpose is some deprecated feature or dependency that simply serves no purpose or is now an anti-feature, it would be more efficient to remove.

It would be good to have proof of a Chesterton's Fence analysis to say "yes, we know Y & Z's purposes and have analyzed the cost/benefit of removing them and the populations/systems impacted" - would this be an impact analysis?


In some scenarios, the process fails if all of the steps fail. In that case, redundancy is more stable, and you have a tension between stability and efficiency.

In other scenarios, the process fails if any of the steps fail. In that case, redundancy is less stable, and you can improve both stability and efficiency by eliminating unnecessary steps.

In either case, there may be other considerations involved as well (flexibility, visibility, recoverability...) but sometimes we just didn't see a better way to do something.
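
A quick numeric sketch of those two scenarios (a minimal illustration, assuming independent steps that each fail with probability p):

    # Redundant copies: the process fails only if ALL copies fail.
    def redundant_failure(p, n):
        return p ** n

    # Serial chain: the process fails if ANY single step fails.
    def chained_failure(p, n):
        return 1 - (1 - p) ** n

    p, n = 0.05, 3
    print(redundant_failure(p, n))  # 0.000125 - extra steps buy stability
    print(chained_failure(p, n))    # ~0.1426  - extra steps cost stability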


There are obvious cases where you are improving efficiency without eliminating slack. I'm not making my application more fragile by rewriting an O(n^2) operation over a large dataset to an O(n log(n)) operation that provides the same output. It's a win without tradeoffs.

This type of example exists in all industries. For example, finding a new alloy that has strictly superior properties across all dimensions for a specific use case. Or upgrading mail delivery routes using better pathfinding algorithms. Etc.
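
A concrete (made-up) instance of the first kind: checking a large list for duplicates by comparing every pair is O(n^2), while sorting first gives the same answer in O(n log n):

    # O(n^2): compare every pair of elements.
    def has_duplicates_quadratic(xs):
        return any(xs[i] == xs[j]
                   for i in range(len(xs))
                   for j in range(i + 1, len(xs)))

    # O(n log n): sort a copy, then compare only adjacent elements.
    def has_duplicates_sorted(xs):
        ys = sorted(xs)
        return any(a == b for a, b in zip(ys, ys[1:]))

(Though even here, sorted() trades O(n) extra space for the speedup - "no tradeoffs" usually means "no tradeoffs anyone cares about".)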


> I'm not making my application more fragile by rewriting an O(n^2) operation over a large dataset to an O(n log(n)) operation that provides the same output. It's a win without tradeoffs.

Hopefully not nitpicking too much: it's a win without _many_ tradeoffs. E.g., in the usual places where you'd accidentally get an O(n^2) operation rather than O(n log(n)), the O(n^2) operation is constant-space. In a sufficiently anomalous computing environment with a low enough priority on fast results you might still consciously opt for the O(n^2) solution.


Plenty of n log n operations run in essentially constant extra space, average-case in-place quicksort being perhaps the classic example. But most often you see these kinds of improvements by simply removing steps. Net result is less code, less memory usage, less of everything.


Yep, I'm not arguing that it's always a good idea (or even usually...or even that I've seen it be a good idea in the wild), just that it could be. Taking quicksort as an example, it's only constant space if you're allowed/able to mutate in-place. If you need (for some bizarre reason) to stream sorted results in constant space without mutating the underlying data then you'll need some other tool.


> the other doesn't have any tradeoffs, it's just a better way of doing things based on novel techniques

At least in the case of code, this isn't true. The variability comes in terms of change to the system, rather than the running of the system. i.e., if I simplify a process to be less modular and more monolithic, making it more efficient, that also makes it more purpose-built and less flexible. The "risk" increases of running up against a change that needs to be made but is intractably onerous. There's always a tradeoff.


I posted this in a similar thread yesterday:

> Fisher's Fundamental Theorem: The better adapted a system is to a particular environment, the less adaptable it is to new environments. -- Gerald Weinberg,"The Psychology of Computer Programming"

It's something everyone should consider in making critical design decisions. Your adaptable, modular system has some risks (particularly in terms of meeting performance targets, increased cost due to increased complexity), but the monolithic system has its own risks (less adaptable to changing requirements, potentially more fragile against attack or damage). Which you choose depends on many variables including your risk profile and anticipated need for change in the future.


IMO, there's always a tradeoff once we've restricted ourselves to interesting options, precisely because those options are interesting only when they're not obviously dominated by other options.

If we consider any possible solution, we can obviously imagine adding a completely spurious detail.


That's fair


The former is the removal of slack to make a process more efficient: essentially trading slack for efficiency (making a sacrifice to Moloch!). The latter is more what I would term the removal of inefficiencies.


There are always tradeoffs. The discovery that X, Y, Z could be replaced by X required investigation, which is a cost, and then changes had to be deployed, causing disruptions that also have a cost. If the deployment of the optimization doesn't recover the costs, then it ends up not worth it. Usually the cost is recouped because the optimization is discovered once and then deployed at large scale; but that is not always the case.


I think considering the cost of change a cost of the optimized version is muddled thinking. It's a cost of deploying the optimized version, but that is only sometimes relevant.

A pure trade-off between efficiency and stability would imply that, were we already running the efficient version, we could buy stability by switching back to the less efficient code.


* It's instructive to track the underlying resource. If technology_A saves you time, it's more time-efficient (aka quick). If technology_B saves you money, it's more cost-efficient (aka cheap).

* In game theory, "7 hospital beds" weakly dominates "8 hospital beds". But (x') strictly dominates (x, y, z). This is exactly what Pareto Optimality is about. Though perhaps a more colloquial term would be useful here.
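
A toy sketch of the dominance idea (option names and scores are invented; each option is scored on two axes, higher is better on both):

    # a dominates b if it's at least as good on every axis and strictly
    # better on at least one - dominated options are Pareto-inefficient.
    def dominates(a, b):
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

    options = {"x_prime": (3, 3), "xyz": (2, 3), "seven_beds": (3, 1)}
    frontier = [name for name, v in options.items()
                if not any(dominates(w, v) for w in options.values() if w != v)]
    print(frontier)  # ['x_prime'] - the only option no other option dominates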


We do have a word, it's efficiency and productivity.

Efficiency produces the same output with less input.

Productivity produces more output with the same input.

So efficiency is a measure of input to target output, and productivity is a measure of output to target input.

To make a process more efficient, you figure out how to get to some X output while using as little input as possible.

To make a process more productive, you figure out how given some Y input, you can maximize your output.


The first example about hospital capacity really just involves tradeoffs around your specific goals, whereas it only makes sense to talk rigorously about efficiency in the context of some specific output goal.

So if you fix some goal, say, "we want to be over capacity 1% of the time," then the most efficient way of doing that is probably to have the minimum number of beds that you need according to your predictions about utilization. But you can't really talk about efficiency when you're deciding what your goal is, e.g. whether you're okay being over capacity 10% of the time versus 1% of the time.


Yes, efficiency requires a fixed output goal. Because you want to minimize input while sustaining your desired output target.

For example, how can I make all of today's deliveries with fewer delivery trucks?

While productivity requires a fixed input goal. Because you want to maximize output while sustaining your desired input goal.

For example, how can I deliver more products per day without increasing the size of my delivery truck fleet?

Oftentimes, improving one can improve the other, but not always. For example, someone could ask: how can we grow profit? Okay, one way is to be more efficient, and thus spend less money to make the same revenue. Alright, maybe we use cheaper materials, so we now produce the same amount and sell the same, but our margin has increased and we make more profit. Someone else could say: we need to be more productive. Okay, so you invest in better marketing and scale production to meet increased demand. You are as efficient as before, but more productive.

So I feel like, reducing the process from X,Y,Z to only X is about productivity. You still have the same number of employees, but since they don't need to waste time doing Y and Z anymore, they can produce more output. That said, you could choose to apply the gains to efficiency as well, for example, hey, because I eliminated Y and Z, I can now cut my workforce in half and deliver the same output.
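
To put made-up numbers on that delivery-truck framing:

    packages, trucks = 200, 10      # baseline: 20.0 packages per truck

    # Efficiency: hold output fixed (200 packages), shrink the input.
    print(packages / 8)             # 25.0 per truck, using only 8 trucks

    # Productivity: hold input fixed (10 trucks), grow the output.
    print(260 / trucks)             # 26.0 per truck, delivering 260 packages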


There's also efficiency like "I picked up some extra screws at Home Depot because I wasn't sure if I had any at home".

Having ways to avoid an unanticipated repetition of a process, which would result in bunging up the works for dependent parts of the system, can make the entire flow more efficient. See also 'drum buffer rope' from constraint theory.


> ...and efficiency as in "we replaced steps X,Y,Z with just step X', because we found that X' could accomplish everything that XYZ could accomplish but it's faster, more accurate, and cheaper".

optimisation.


I agree it can get ambiguous but I think most of the time it's just framed differently and (to your point) are focused on different dimensions of efficiency.

For example, if your first statement was 'We only have seven beds because we tightened up our discharge workflow and that's all we need 99% of the time' and your second statement was 'We only have seven admins because we replaced steps X,Y,Z with just step X and that's all we need 99% of the time' they start to line up.


It's also not as simple as adding 7 more beds so we have all we need on 99.999% of days. Sure, you have 14 beds, but you also need more doctors, nurses, surgeons, OTs, etc. to support those beds. Those health care providers won't get the patient contact they need to maintain their skills as competent providers. I know, as a part-time EMT, that if I don't go on a certain number of calls a month I see a noticeable decline in my skills.


At some point you must choose your bottlenecks. Excess physical capacity doesn't have to be used all the time (that is, there's no "currency" for a bed, and other equipment can be rotated through use and maintenance cycles). If you choose to constrain yourself with the number of beds needed for your 99% situation, then you can't expand beyond that without great difficulty. If you have physical capacity for your 99% situation x 2 or even just one or two extra beds, then you don't have to maintain full-time staff for it. You could extend staff hours, or bring part-time staffers to full-time hours, or bring in (with supervision) students from a nearby medical or nursing school to handle what they can and offload the burden for that 1% situation.


A hospital can call a part-time EMT and ask them to work a few extra shifts due to some disaster. They can't ask them to bring a bed.


That is why kaizen is so nice: it's about small improvements that you notice in daily operations.

The other kind, the bad "efficiency", I would just call "cost-cutting measures", not efficiency improvements.

With kaizen you try to accommodate what you have. So if Bob is slow, the cost-cutting measure would be to fire him. The efficiency-improvement way would be observing Bob to see what can be changed so you can get more value without messing him up.


It gets worse because even if one had precisely such a pair of words, say ‘efficiency’ versus ‘superlativity’, it would not necessarily be as simple as saying “efficiency bad, superlativity good.” They morph into each other in complex systems.

Consider the paradox of finding that a factory crew has no inputs—they are playing cards waiting for an order to come in—and yelling at them to go do other things around the shop like clean and assist other operations, rather than loafing. Or, for another solution to the problem, you might pre-order all the stuff and make sure that the team is always 100% loaded and never has the free capacity to play cards.

At first blush these improve superlativity, no? We are accomplishing everything that card-playing does but we are “faster, more accurate, and cheaper” if we are measuring, say, labor cost per part and the technician time averaged over the parts they worked on. Have we not just found a “novel technique” which is “just a better way of doing things?”

But staring at it for longer you may find yourself less sure. That’s what I mean by complex systems: they morph into each other. There are more subtle tradeoffs here. For example, when people feel free to loaf when they have no work, you can walk into the shop and ask who’s loafing and why and how you can improve their situation so that they again have proper work to do. There is an increase in latency when that shipment finally comes in and all the workers need to be summoned from across the floor to handle it again. There may be mental fatigue from having to context-switch too much or from having to constantly work on just one thing with no breaks. Or maybe the teams that need whatever they are producing cannot finish their work fast enough, so all of the inventory produced by this team slowly grows until it fills 50% of your factory floor, because you only have a certain amount of space - that’s all you need on 99% of days.

The point is that the greedy algorithm may fail. In a linear circuit, you short out some resistor with some wire, you know that current is going to move faster afterwards. But in a nonlinear circuit, you no longer know this. In the absolute simplest case, the increase in current rapidly breaks a fuse and everything grinds to a halt. In more complicated cases you have a feedback loop and the increased voltage from the short-circuit feeds back to the earlier stages to throttle the current coming through.

Same with weight loss. People think that they will eat fewer calories and they will therefore lose such-and-so amount of weight. Well, probably. But this is a complex system we are talking about. One of the first things that happens when you start burning the fat is that your body burns your muscle too. This is the same reason that you can't burn fat on your stomach by doing crunches: your system is sending the call out to your entire body that it needs to digest surplus material. The loss in muscle mass appears to be the primary culprit which kicks down your basal metabolic rate, and you hit what weight-loss folks call a “wall”: you are literally cold all the time, wearing sweaters, feeling too cranky to exercise, and all that - feedback mechanisms which mean that if you keep eating that restricted amount of calories, you won’t be losing any more weight unless you can “break through” by keeping warm through exercise and thereby bringing your muscle mass back up to where it needs to be. It’s just that it’s a complex system and the greedy algorithm does not always work for such systems.


"Efficiency" can only be defined in terms of something else -- you can be optimizing throughput or optimizing costs and you'll end up at very different solutions. One is efficient with respect to waiting times, the other is efficient with respect to costs.

In your latter example, it could very well be the case that steps Y and Z had purposes you didn't take into account, making the new process less efficient in some cases with respect to the target metric.

Either way, overoptimization and focus on specific metrics to the exclusion of others is a real problem. Circumstances change over time and high levels of optimization make processes more brittle and likely to fail when circumstances change.


One of my favorite quotes on this topic comes from Aurelius' description of his adopted father in Meditations.

[Y]ou would never say of him that he "broke out a sweat": but everything was allotted its own time and thought, as by a man of leisure - his way was unhurried, organized, vigorous, consistent in all.

I feel like I spend a lot of time rushing from one thing to the next, constantly questioning whether I'm spending time wisely. And then I end up accomplishing less because I lack focus in one area. I've instead been trying to relax, slow down, and take tasks one at a time until completion. I'd also recommend Cal Newport's book, Deep Work, on this.


> but everything was allotted its own time and thought, as by a man of leisure

As an extreme example: watching Schumacher at his peak perform during a qualifying lap or a race, in treacherous rainy conditions, while everyone else was absolutely struggling and he was out front, half a lap ahead of everyone, was like watching poetry in motion. You could tell he was very relaxed just by the way his hands operated the steering wheel, hitting the apex every time in a single motion, no twitching or tossing the car around.

It seemed he just had more time, as if time had slowed down for him compared to everyone else.

Edit: Typo; damn you Mac OS auto-correct!


Time dilation is a real thing. You’re likely describing his experience accurately.

We see it in obviously exaggerated forms in film, like the Matrix, but that’s based on real shit. The best baseball hitters describe seeing the pitch the same way.


"As by a man of leisure" is my favorite phrase in the passage. I think it's the key to the whole thing. It implies what kind of mindset you should try to adopt: relax, enjoy what you're doing, and appreciate the moment (if you're able to).

Aurelius' adopted father was a consul three times, which is certainly not a stress-free job! But he apparently was able to keep cool by the way he approached his work.

https://en.wikipedia.org/wiki/Roman_consul


I'm still stumped over the fact that, on average, we humans are still behind ancient Roman existentialism.


Slow is smooth, smooth is fast.


Same-day delivery isn't efficiency; it's made possible by inefficiency, like using more delivery people and vehicles covering inefficient routes, and using more warehouses closer to delivery areas.

Actually we can't discuss efficiency without making it clear what parameter we are optimizing, at the cost of what (if any) other parameters.

However, usually, if we reduce the time something takes not by cleverly eliminating, rearranging, or otherwise streamlining the steps, but rather by some brute-force method that requires more resources (more people, more equipment, more energy), it is hard to frame that as efficiency.


Right. Same-day delivery is fast, not usually efficient.

Nor is multitasking across three different devices at the same time efficient. Again, the author seems to be confusing rushing with efficiency. I didn't make it past those first few incoherent lines, so I don't know whether this confusion persists into the rest of the article.


As the customer, I find it more efficient to receive what I need today rather than after a month. I think same-day delivery is more about that than about optimising a company's operations. They are selling efficiency to their customers.


I agree. For me, the article absolutely did not deliver on what was said in the title. It declares efficiency as itself dangerous, where it is the narrow application of efficiency that really results in problems.


I feel like a lot of conversations and articles recently speak to an inadequate understanding of risk, and planning for risk.

Risk is made up of at least 2 or 3 components: what is the probability something will happen? And, if it does happen, what is the impact and how will you mitigate that impact?

For example, you may believe that a change to a website you are deploying has a low probability of taking the website offline. If it is taken offline, it may cost £X per hour in lost revenue, but you’ll leave the old version running on a standby server, so it only takes a few minutes to switch back. That’s a much more thorough understanding of one aspect of risk than “this rollout is low risk”. Once you have that understanding, it’s reasonable to discuss how to reduce the probability of an outage (better testing?), as well as how to reduce the impact (staged rollout?) or to speed up the fix if it were to happen (practise?).
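
A back-of-the-envelope sketch of that rollout example (all numbers invented for illustration):

    p_outage = 0.02                  # estimated chance the deploy takes the site down
    lost_revenue_per_hour = 10_000   # the £X per hour from above

    def expected_cost(hours_offline):
        return p_outage * lost_revenue_per_hour * hours_offline

    print(expected_cost(4.0))        # ~£800 - no standby, hours to restore service
    print(expected_cost(0.1))        # ~£20  - standby server, minutes to switch back

Better testing attacks p_outage, while the standby server attacks hours_offline - the two levers are independent.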

In COVID terms, we should be discussing the impact of decisions in the light of future pandemics. Could we invest now in reusable PPE, so that next time we don’t have a global rush on the disposable stuff? Do we need to educate the public more readily about reducing disease transmission to reduce the likelihood of a pandemic in the first place? I’m not a doctor, so I have no idea on the specifics, but the likelihood of any given pandemic will always be low, so what is the impact of the decision if there is one and does that impact need to be mitigated? (even if it is less efficient to do so...)


You say risk has 2 or 3 components. The two you list, likelihood and severity, are the two I usually hear. What is the third?


Oh dear, I want to rant.

Efficiency is a dimensionless ratio of energy out to energy in. Economics, as formulated, has little to say about efficiency and lots to say about preference relations on utility functions (the texts do tend to hand-wavingly waffle about how markets attain "efficiency" through hypothetically rational actors maximizing utility functions etc.; if you want a giggle, check out the "fundamental theorems of welfare").

I actually read the fine article, and didn't give up after the first couple of paragraphs.

And in paragraph the third is introduced friction. I dunno, maybe it's because I actually study science, but friction is a well-defined thing and the coefficient of same is another _dimensionless_ variable. It seems every time economists want to incorporate a notion from science proper they go for the dimensionless stuff, because that way they don't have to go through the whole tiresome rigmarole of ... dimensional analysis. It makes me feel like Tantacrul criticising UIs (check him out on youtube, he's both funny and informative).

Anyway, efficiency is not dangerous; efficiency is what will actually allow the survivors to make it through the coming Anthropocene disaster. Slowing down is not a bad notion, because it means most humans can spend more time thinking (quite efficient, actually) and less time haring around the planet distracting themselves from the vapidity of their vanity. However, life is not better because it is slower; it's better because humans appreciate what an extraordinary (literally and figuratively) opportunity it is to be alive.

As an insurance policy against disaster, stop listening to economists, because they observably don't have a clue (they can't make accurate and meaningful predictions); instead study science, especially physics, because its meanings are measured against reality and the resulting predictions are highly reliable (TANSTAAFL < 2TD).

/rant


> We worship efficiency. Use less to get more. Same-day delivery. Multitask; text on one device while emailing on a second, and perhaps conversing on a third. Efficiency is seen as good. Inefficiency as wasteful.

Is this a US thing? I've lived in three different European countries and nobody thinks this way. Efficiency and productivity are things I mostly just read about on HN.


It spills over to countries(like mine) enamoured of the American way of life, but in a twisted manner.

I've worked for companies that saw overburdening people with responsibilities as "efficient". On paper it was.


I agree with your observation. Dutch startups, from what I've seen, live by the motto “steady wins the race”, and to a larger extent I think it speaks to the risk aversion ingrained in their culture - both in their startup and VC mentalities.

They would rather bootstrap and grow with stability than pull a SoftBank and use/give massive capital injections in the hopes of achieving efficient market domination.


The article repeats a commonly believed myth about money being invented as a replacement for barter. Adam Smith thought so, as do many economists, but they forgot to ask anthropologists. See https://www.theatlantic.com/business/archive/2016/02/barter-...


Strongly recommend "Debt: The First 5000 Years" by David Graeber - https://www.indiebound.org/book/9781612194196 - solid treatment of economic history from an anthropological viewpoint.


I thought there were some hints of barter of sorts in palace economies from shipwrecks - no currency, but incoming exotic commodities they couldn't produce and outgoing ships of commodities well beyond any personal use imply some trade relationships.

The fact that palace economies all died out doesn't mean a barter economy never existed, just that it wasn't stable enough to leave any isolated "time capsule cultures". Granted, palace-economy commerce isn't on a personal level, unless you count the ruler who allocates everything.


I think this article is missing something. What it talks about is efficiency of systems, which I think is different than efficiency in process.

Most of what the article talks about is making large systems more efficient for the benefit of the system at large. I agree, this makes individual components of the system more stressed and prone to catastrophic failure at any one point in the system. People within that kind of efficient system are pushed to their limits and breaking points.

However, I've found that making individual processes efficient reduces stress on the individuals involved in that process and allows for that slowing-down time.

The examples I can think of to back my points up:

Previously I worked at a job where the actual process was incredibly inefficient. I ended up working long hours and twice as hard as I needed to. By the time I left that job, I'd increased the efficiency of the process to the point where I was working reasonable hours and had some good downtime to relax or take care of other things I'd had to neglect before.

The overall system at the place though was fairly inefficient, which was a good thing. It meant we were usually a little ahead and could account for things that went wrong.

Another recent example from my current job: we were working with a person who tried to overhaul our inefficient yet working system. Our processes were efficient enough that we always got done what we needed to do. The person we were working with tried to over-engineer an 'efficient' schedule and system for us that in the end caused far too much friction; they ended up losing money, and the business relationship between the company I work for and them ended.


> I think this article is missing something.

The first word of the essay is "we," and it goes downhill from there.


As an "Engineer" by job title, I strive for efficiency at work, and it certainly creeps into my daily personal life also. With this pandemic I have learned to realize that efficiency is not the end game for everything, and the things you do efficiently should be chosen wisely. I have seen the the toll it takes on those in my daily life more clearly, but have also seen how it can lead to a safer less risky life during these times (i.e. shopping habits). It is also clear that efficiencies built into our supply chains (i.e. food) have become a liability as of recently. I do see this as a time of great reflection on efficiency for both individuals and corporations or markets. I think this re-evaluation will lead to some of the longest lasting changes to come out of this pandemic.


>Why hadn’t we stockpiled key supplies and machines, built up hospital capacity, or ensured the robustness of our supply chains? The reason, of course, is that it would have been seen as inefficient and profit-robbing.

>Seen in this light, at least some inefficiency is like an insurance policy. Think about your own situation. Every year that you don’t get into a car accident and your house doesn’t burn down and you stay healthy, you could think to yourself that you have ‘wasted’ your money on various pointless insurance products, and that you’d be financially better off without all those insurance premiums to pay.

This is a faulty claim. Traditional economic efficiency would say we _should_ stockpile medical supplies if it were more efficient for the markets in the long term, which it would have been. The issues here are that _governments_ and experts didn't work together effectively (despite experts regularly noting the possibility of a pandemic - Gates et al., for example) and that our government generally isn't Keynesian. A quote (source: IMF https://www.imf.org/external/pubs/ft/fandd/2014/09/basics.ht...):

>Keynes argued that governments should solve problems in the short run rather than wait for market forces to fix things over the long run, because, as he wrote, “In the long run, we are all dead.”

As it relates to Global Warming, for example, we have various options. We could solve it by being "less efficient" (extracting less from the earth — and in fact, a Keynesian approach to that would be taxation to slow growth in harmful industries) but zoomed out, if we deal with global warming, we are more efficient over the long run. Moreover, we want to be _efficient_ in our development of green technologies.

The trouble isn't so much efficiency, it's zeroing in on making particular processes efficient to the detriment of the whole across time and in the present moment.


Of course not trying to cram as much in as possible makes life better. However, tell that to a CEO who has built a business off an efficiency mindset that is unsustainable for a normal person who wants to have a life outside of work.


The CEO gets it from above. Investors want to hold a portfolio with a particular risk/return profile. In order to do that, the individual components of the portfolio need to be running at higher risk, higher return profiles.

(Portfolio theory is why I am in favour of social safety nets. A worker who wishes for a relatively safe risk/return profile will voluntarily choose a much higher-risk job when they can combine it with a low-risk backstop.)


This was me a few years ago. Hyper-efficient, doing 60+h weeks + workouts + family. Couldn't imagine it differently.

Then I had a medical condition that required me to slow down, changing my lifestyle completely.

Only in retrospect do I realize my situation was extremely fragile.


Or a CEO who is keenly aware of the benefit of what they do. It would be difficult, for example, to convince a solar manufacturer that panel costs should decrease slowly when they know how many fossil fuel plants will be built because of that.


Efficiency is for machines. Effectiveness and creativity are for people.

And the more machines do laundry, the more effective and creative people can choose to be. Or they can choose to watch TV.


The fundamental problem is the inverted-U shape of payoffs to “optimizing” any one factor (or a small subset of key focus areas). Beyond a point, you’re just “overfitting” to the incorrect metric and underperforming in the truer sense. Yeah, you might not have a metric for the “deeper value”... c’est la vie! Any decision making approach that tries to ignore that problem/limitation is simply stupid.
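
The inverted-U is easy to see with a toy payoff function (purely illustrative numbers):

    # True payoff = measurable benefit minus a cost the metric doesn't capture.
    def payoff(effort):
        return 10 * effort - effort ** 2   # peaks at effort = 5

    for e in (2, 5, 8):
        print(e, payoff(e))  # (2, 16), (5, 25), (8, 16) - past the peak, more optimizing hurts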

The other problem with prizing efficiency is that we often optimize for efficiency under an incorrect model of the situation (model-reality-mismatch) — underweighting the likelihood of upsetting possibilities. That’s essentially what the idea of “black swan”/“fat tails” is about. It’s not really about statistics, unless you’re using a flawed and over-simplified statistical model to ground your metric of efficiency.

IMHO the same problem underlies the approach/framework of behavioral economics. In many situations, observed human behavior might be “irrational” only because your model of reality is naively simplistic. It shouldn’t be surprising that a satisficing approach works better in reality than an optimizing approach; if it does sound surprising, consider that your intuitions might be biased by an incorrect model of reality!


The problem is you can only slow down if everyone else does too, unless you're willing to step down in "class" which a lot of people have trouble with.


> The problem is you can only slow down if everyone else does too

That's a myth propagated by people with a stake in the pie. Nature shows us that you only need to be hyper-efficient and in a constant arms race if you're competing for the same resources.

Otherwise, you can adopt an opportunistic survival strategy that trades efficiency for variety and a diversity of usable resources.

Of course, efficiency is the only option when there are companies trying to grab all the resources available, as in our current business setting. But that's definitely not a healthy environment.


And in a number of different domains in life at that


It's something I learned recently from ecology and permaculture design, and then I started applying it to my personal life: instead of heroically finishing things, I work on them in a way that is more like the regenerative and resilient processes of permaculture design.


I'd love to hear you elaborate on this in more detail.


I'm still working it out myself, so this is all work in progress.

I have a teenage stepdaughter doing online schooling. I have been working remote for years now. It took a long time to work out how things work with my wife -- she is a stay-at-home mom. Work and family life intrude on each other, and even more so this year with the pandemic.

I used to burn out big time, and then I burned out in smaller doses. In the last burnout cycle, my wife miscarried, and that sent us both spinning. So I am not entirely sure at this point how much of this resembles when I functioned at my peak.

The things that I have been doing:

- I try to have a few tasks I think I can accomplish and try to do them. But because family life can be so disruptive, I've learned that I'm not really going to be as productive as I am at my peak, and to just be ok with it.

- I try not to be a blocker for either my teammates or for my family.

- I have a garden. I enjoy it. I mainly work in it in the morning and evening. Sometimes I am exhausted (especially in Phoenix summer hell season). In general, though, it's recharging.

- I practice neigong, and I finally got to the point where I can reliably cycle something (which I will not get into technical details about unless you're also a practitioner. It is a rabbit hole). But suffice it to say, it rebalances the vital energy being distributed among my physical, emotional, and mental states. Part of the burnout was exhausting everything mentally, repeatedly, until there was just nothing left.

- When I push through something, it is for small things. Those small things might chain together. Past a certain point, it is better to go for a walk.

- I can tell when my brain is just exhausted. It is better to take a nap. I'll warn my wife that I'm not really present during grocery shopping. She doesn't always like that (it is one of the times we go out to do something together that is not in the house). I might take MCT with some mixed nuts.

- I don't use caffeine -- no coffee or tea. The closest I get is rooibos, and even its 1mg of caffeine can affect me. Fortunately, the caffeine content of chocolate does not affect me as much.

The part I am working out is how to live in a way that follows the permaculture ethics -- care of earth, care of people, and fair share. It is the last of these that has made me realize that the ambition of unbounded growth, whether for society or for myself, is simply impractical. It may be strange to say this on a forum created for people who were or are interested in doing a Y Combinator startup and getting rich by winning the startup lottery ... but I've come to realize that it is not how I want to live or relate to the world.

I used to practice minimalism ... but now I realize that was just a stepping stone toward investing in regenerative and resilient systems. The garden is part of my long-term effort to create a perennial food forest on my property. There is a lot of tech and "shiny" that I realize I don't really need to get.

And while, until recently, I would keep comparing other people's cars as status/wealth symbols, that is ultimately meaningless. And fortunately, I know what I need to do to get my mind to stop doing that.

This is a very unusual way for me to approach this subject. I usually start out with a radical position: inequality and the wealth gap are _intrinsic_ to modern civilizations, resting on the notion that wealth is something to be extracted from the earth and access-controlled. "Efficiency" is how you maximize profit, as if that were the only way to optimize things. Therefore, there will never be any system of economics and free markets (or a command economy) that takes the well-being of the earth and its people into account.

From this perspective, I think it is insanity. Why would anyone want to participate in a system where there will be guaranteed losers? (Because the few exceptions give the false hope that you might be the exception).

It therefore follows that, if I want to participate in a different kind of a "game", then I will have to live my life by those other principles. And so, I'm trying that with my current work. To give and receive my fair share. To reinvest capital gained from extractive wealth and convert it to regenerative wealth. To not tie my personal sense of self-worth to status or wealth symbols.


Having moved from an inefficient setting to an efficient environment, I have noticed a reduction of tolerance and threshold for getting stressed when things don't go smoothly. There is something to be said for the buffer that "inefficiency" provides.


All you have to do is look to history for the movements and societies that have put "efficiency" on a pedestal to gauge the value and practicality.

There was a major one in the 20th century. It did not go well.


I think the article is drawing the wrong conclusions. In every example given, the problem isn't "too much efficiency", but failing to plan for unlikely situations.

Making something inefficient doesn't magically increase preparedness.


No, but maximizing efficiency doesn't necessarily increase preparedness either. Being prepared has costs.


Sure, preparedness has costs, but the article makes the point poorly, IMO.

Many of the arguments are non sequiturs. The last example is particularly bad. On an icy road, it doesn't matter if a car gets 100 mpg or 10 mpg - speed and traction are far more important factors.


Doesn't the author miss that efficiency and deceleration (i.e. slowing down) are in some respects orthogonal? It is possible to go the same distance at the same speed more efficiently, using fewer resources. It is also possible to both slow down and increase efficiency, which can synergistically decrease resource use.

Take the aluminum can. Beverage companies innovated heavily, reduced aluminum use, and saved resources and emissions, and consumers got a win with lighter cans, less environmental pollution, and lower carbon emissions.


the article doesn't make sense to me

when making decisions:

"we should be asking which option will give us good-enough results under the widest range of future states of the world"

doesn't that lead you right back to the struggle for the most efficient way to determine the optimal "good-enough" answer?

Also, the article talks about efficiency, but efficiency of what exactly? What if you're very efficient at maintaining a relaxed and happy lifestyle?


Suppose you're playing a videogame. Your character just leveled up. You can upgrade either Attack or HP.

* "Upgrade Attack" is more efficient. It allows you to beat the game faster. But since your HP is low, you can't afford to make mistakes (get hit).

* "Upgrade HP" is more robust. It increases your damage-buffer, which allows you to absorb more mistakes without dying. But if you make zero mistakes, you won't beat the game as quickly as if you'd upgraded Attack.

IRL the dichotomy is often "income vs wealth", "velocity vs displacement", "throughput vs latency", "strength vs endurance", etc.
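
A minimal sketch of that dichotomy (toy numbers):

    import math

    def profile(attack, player_hp, boss_hp=100, hit_dmg=25):
        turns = math.ceil(boss_hp / attack)   # speed: fewer turns = more efficient
        buffer = player_hp // hit_dmg         # robustness: hits you can absorb
        return turns, buffer

    print(profile(attack=20, player_hp=50))   # (5, 2) baseline
    print(profile(attack=25, player_hp=50))   # (4, 2) Attack upgrade: faster, same slack
    print(profile(attack=20, player_hp=75))   # (5, 3) HP upgrade: same speed, more slack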


As it relates to hacking, I think of it as: clients can be fickle because they don't even know what they want sometimes. So I picture next steps as coalescing, like a wave collapsing. Uncertainty comes to dominate the process, so it works more like evolution than planning (stochastic vs deterministic).

My problem solving algorithm is to look up to 7 steps ahead, along up to 7 different branches in the tree of possibilities (I'd like to examine more choices but my brain can't hold them all). In practice this is always at least 2 and 2, so this task or its alternative, and then looking at the next step along each branch. Then I prune the tree, selecting for the steps that work across as many branches as possible, without rewrites or duplication. I'm often working on a common dependency several steps out that other people aren't aware of yet. This could be thought of as a breadth-first search or parallel search.
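
What's described here reads a lot like a beam search; a toy sketch under that reading (the branching and scoring functions are hypothetical stand-ins):

    # Keep at most `beam` partial plans alive, look at most `depth` steps
    # ahead, and prune to the highest-scoring candidates at each step.
    def beam_plan(branches, score, beam=7, depth=7):
        frontier = [[]]
        for _ in range(depth):
            candidates = [plan + [step] for plan in frontier for step in branches(plan)]
            if not candidates:
                break
            candidates.sort(key=score, reverse=True)
            frontier = candidates[:beam]
        return frontier[0] if frontier else []

    # Example: pick 4 numbers from {1, 2, 3} whose sum lands closest to 10.
    best = beam_plan(lambda plan: [1, 2, 3], lambda plan: -abs(10 - sum(plan)), depth=4)
    print(best)  # [3, 3, 3, 1] - sums exactly to the target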

Unfortunately on an individual level, the tech industry seems to be going the opposite direction, exploring deeply down very long and linear branches of the tree. Coders are expected to implement an idea and be ready to scrap it without hesitation, iterating over and over again until finished, rather than simulating the outcome in their minds. More of a depth-first, serial search.

I think the split is due to programming moving from an individual endeavor towards teams. We can't see inside each other's heads, so we plot the course and divvy up steps among the members. The cost of this is that serial searches will almost always turn up suboptimal results, because the resources to do broader searches get withheld so we miss elegant solutions and even perceive them as more costly.

To answer your question, the serial approach works best for individual tasks, but misses the big-picture view that might have let us see that the task didn't need to be done in the first place. So my vote would be to see more quiet reflection and less spinning and churning. But I'm mostly outvoted, so it probably comes down to personality, with junior developers being more sought-after than senior developers like me who might be perceived as reserved/overcautious/conservative.


"... what of the COVID-19 pandemic? Why hadn’t we stockpiled key supplies and machines, built up hospital capacity, or ensured the robustness of our supply chains? The reason, of course, is that it would have been seen as inefficient and profit-robbing. Money spent on masks and gowns gathering dust in a warehouse could always be put to more ‘productive’ use in the marketplace."

This is what matters right now.


I think this is confusing efficiency with overfitting. A lot of the problems described in the article arise from overfitting models to historical data. We overfitted supply chain management because COVID doesn't happen often; we overfitted credit modeling because a chain reaction of defaults doesn't happen often... We actually weren't efficient enough: we ignored tail events.


I think life is about having good experiences! That is why traveling, visiting new places, and trying new food are great: your mind gets to experience new things. Running around at 80+ mph, you tend to miss things, in my experience. Life is usually not about maximum efficiency unless you are a 100m sprinter. Robots and programs can run as fast as they want, and automation does us favors, but we humans should not strive to do the same.

Compare junk food: you get full quickly, but the nutrients in junk food are usually poor. You can instead choose the experience of slowly cooking a nutritious meal; eating slow-cooked food with your family and friends is often better. Stress usually comes with running around too fast, so you tend to consume unhealthy food and after a while gain weight. I am not saying junk food, or moving fast, is bad every now and then, but doing so all the time means you might miss slower, better experiences...

Same goes for buying local vs buying cheap stuff which does not last long. Often buying local is better for the environment.


It’s an interesting definition of „efficient”.

In my book, you are efficient if you did 8 hours worth of work in 4, did it right on the first try, went home early, and had enough mental capacity left for hobbies and family.

60-hour weeks are not efficient. Busywork is not efficient. So-called „productivity” software more often than not is not, either.


This article has a horrible title, and some highly misleading content. It represents "efficiency" as things like "not buying insurance" in a dangerous fashion. There are VERY few serious people who think that.

"All things being equal" efficiency is always good. (e.g. if an identical task can be done for 1/2 the energy - good) "All things being equal" slowing down does not necessarily make life better. (Cleaning up after an oil spill is something we probably want fast.)

All things are not equal, and you have to make a series of tradeoffs.

How much risk are you willing to take? How much enjoyment do you get out of a task?

The author is of course correct that optimizing efficiency at the expense of all else is typically a problem. But to misrepresent efficiency (the key to our modern age) is not quite fair to the term.


Systems in real life are complex. Limiting them to just one dimension on which to be efficient may leave out the factors that make them work at all, or work better.

But if you really knew every component and every criterion that may be applied to them, you might improve efficiency. Good luck with that.


I think we need more hobbies. Maybe some anthropology classes as well.

There is, for instance, a certain give-and-take in classical Japanese garden aesthetics. In a formal English or French garden, almost all you can see, everywhere you look, is evidence of human hands. Since no plant is in anything you could mistake for its natural state, you are practically beaten over the head by the presence of the unseen gardener. Look at what I made nature do. What piece of work is a man.

It's a dynamically unstable system.

Meanwhile the Japanese garden is a dynamically stable system. The plants are allowed to 'win' in many cases that either support the gardener's goals or are simply not worth the effort required to win the argument. It's more efficient to let the system do what is in its nature than to try and stop it. If you know how to look, you can see the gardener everywhere, over-emphasizing what would have happened without them, in such a way that you see the plant first and the work second. It is organic, with an editor.

You might see a similar philosophy in Judo. You are steering what is there, letting the subject of your efforts do most of the heavy lifting for you. It is quite efficient, but not in the antiseptic way we sometimes imagine that efficiency should possess.

TL;DR: most of us have no idea what efficiency looks like, and we make a mockery of it while trying to pursue it.


Reminds me of this piece from Seth Godin on the importance of building slack into our lives: https://seths.blog/2019/06/investing-in-slack/

(Not the chat app)


One of my RSS feeds is Seth's Blog. Nice mindfulness tips and perspective there.


I came across this related blog post the other day:

„in the name of taking it slow“ https://www.voyageofthezephyr.com/blog/takingitslow


Since when is multitasking efficient?

There’s a big difference between being productive and being busy.


Insurance is the opposite of a lottery. If you think buying lottery tickets is a waste of money because you lose out in the long run, the same argument should lead you to think having insurance is a good thing.


I think you have it backwards - insurance and lottery are (sort of) equal. In the long term, you'll lose money in the lottery. You'll also lose money paying for insurance.

If you have extraordinary "good luck", you may come out ahead in the lottery. And if you have extraordinary "bad luck" you may come out ahead in insurance.

Insurance of course has a more practical value, and a much better return on investment (even if it averages negative returns).
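
In expected monetary value the two do look alike (made-up numbers):

    # Both purchases have negative expected dollar value - by design,
    # since the operator/insurer has to cover costs and profit.
    p_win, jackpot, ticket = 1e-7, 10_000_000, 2
    print(p_win * jackpot - ticket)    # about -$1 per ticket

    p_loss, payout, premium = 0.01, 90_000, 1_200
    print(p_loss * payout - premium)   # about -$300 per year of coverage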


> You'll also lose money paying for insurance.

Only if you put zero value on the peace of mind that comes with having insurance against risks that, if they happened, would bankrupt you. But if you put zero value on that, you wouldn't buy insurance.


It's the same with the lottery. If you understand that the cash expected value is lower than the cost of the ticket, but you get some value from having "played" and the net is at least equal to the price of the ticket, it's not irrational to buy a lottery ticket.


This is true, but the kind of value being provided is different. Peace of mind is not the same thing as entertainment.


Maybe you all can agree that insurance and lottery are the same in that only a fraction of consumers come out ahead, and are opposites in terms of smart vs. stupid.


> Maybe you all can agree that insurance and lottery are the same in that only a fraction of consumers come out ahead

Only a fraction of the consumers come out ahead if we only consider monetary value. But the whole reason insurance and the lottery exist in the first place is that there are other kinds of value besides monetary value. In the case of insurance, it's peace of mind. In the case of the lottery, it's whatever entertainment value comes from being able to visualize yourself winning, even if your chances of actually doing so are tiny.


> Only if you put zero value on the peace of mind that comes with having insurance against risks

Only if you put zero value on seeing your savings go up by the amount you saved on insurance.


Unless you are saving enough to be able to handle the unexpected loss of the asset being insured (in which case, yes, it doesn't make sense to buy insurance since you are able to self-insure), putting the money into savings instead of insurance means losing the peace of mind that you are buying when you buy insurance. If that peace of mind is worth more to you than the money, then buying insurance is a net gain.


this way of framing things is only valid if outcomes of events are somewhat linear, but in real life this isn't true at all: some uninsured bad events can wipe you out completely, whereas insured bad events might not wipe you out. it doesn't matter that your probability of being wiped out was predicted to be 0.1 if you've actually been wiped out.

the decision of purchasing insurance or not depends more on how many resources or alternatives you have, and if you can easily smooth over an uninsured bad event so that it doesn't wipe you out. for individuals with limited resources who will be wiped out if uninsured, it is very rational to buy insurance, including paying a profit margin to the insurer on top.

another way of thinking about insurance is a way for groups of people to pool and share risk -- if they don't get hit by correlated bad luck then they all get to smooth out their bad outcomes.


> for individuals with limited resources who will be wiped out if uninsured

Usually, individuals with limited resources can't really be wiped out: there is not much to wipe out.

Here is my perspective: I consider myself a typical individual with limited resources. I mostly live on my salary (software dev), I don't really have major assets (a 20k car bought for cash, renting a house), and I have some savings (~30k cash). And I only have minimal car liability insurance, as required by law.

So, what kind of "wipe out" event can happen to me / my family? And what kind of insurance do I need to get for that?

Let's say I have an auto accident and I'm liable for a million dollars. The likelihood of such an event is low. Such an event would probably result in bankruptcy, but that's not much worse than constantly paying extra for insurance. So there is no "wiping out" in this seemingly catastrophic event.

So, what I'm saying is that "individuals with limited resources" often don't have much to wipe out, so they don't have to worry about insurance.


Usually they have wipeout problems that can’t be insured for, medical conditions or unemployment being the most common.


Can't be insured for?

I find that ironic, because we actually have medical insurance and unemployment insurance here in the USA, but we often see "individuals with limited resources" wiped out for the reasons you mentioned.


In areas where that works, it’s more of a social insurance, government-run. In the US, medical insurance is a rate subscription service, similar to, say, Costco.


> Let's say I have auto accident and I'm liable for million dollars. The likelihood of such event is low. Such event probably would result in bankruptcy, but it's not much worse, then constantly paying extra for insurance. So, there is "no wiping out" in this seemingly catastrophic event.

Consider the same accident where you fracture your spine, leaving you paralyzed from the neck down. Insurance (at least in the US) affords you access to medical care, and access to follow-up therapy and other resources to (hopefully) recover completely.

If you have limited resources and no insurance, the consequences are a lot worse than bankruptcy.


That depends on how much avoiding bankruptcy is worth to you. It might be worth learning more about what the experience is like?


Insurance increases equality by reducing the financial penalties of bad luck. A lottery increases inequality by creating rich lotto winners who by definition are no more deserving than anyone else.

If you're interested in equality and justice, you should be in favor of insurance (and insurance-like government schemes like social security) and against lotteries.

On the other hand, the good thing about lotteries is that they can be used to raise money for a good cause with entirely voluntary contributions, and even for most people who don't win, it can still be considered pretty cheap entertainment.


Another way of thinking about it. Insurance is a way of marginally reducing the potential effect of variance on your life, lotteries are a way to increase it.


Insurance premiums and the lottery both have expected negative returns.

I don't (generally) play the lottery, and I don't (generally) participate in involuntary insurance schemes.

I don't buy pet insurance, I don't buy extended warranties; I generally try to avoid paying a third party's payroll, G&A, real estate, taxes, and profits, all on top of the actual expenses I incur that they (hopefully) pay for, after much paperwork.

That said, I sure don't carry the bare minimums in auto or health or homeowners insurance because I cannot afford a multi-hundred-thousand-dollar unexpected expense.


> I cannot afford a multi-hundred-thousand-dollar unexpected expense.

Right, which demonstrates that the expected value of insurance can be positive in terms of utility even if it's negative in dollars. For most individuals, a $100k loss is more than 100 times as bad as a $1020 premium. Meanwhile the insurance provider can absorb individual losses and still come out ahead overall.
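
A minimal expected-utility sketch of that point (log utility and made-up numbers; the premium is deliberately worse than the expected payout):

    import math

    wealth, loss, p, premium = 100_000, 90_000, 0.01, 1_200
    u = math.log  # concave: each extra dollar matters less as wealth grows

    eu_uninsured = (1 - p) * u(wealth) + p * u(wealth - loss)
    eu_insured = u(wealth - premium)

    print(eu_uninsured)  # ~11.490
    print(eu_insured)    # ~11.501 - higher, despite the negative dollar EV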


Insurance protects you from low-probability downsides you can't afford. Lottery tickets are about buying into low-probability upsides.

If the loss from not having insurance is bearable, you're better off not buying the insurance. If it's not, it's worth the money even if you'd lose out on average. That's why phone or car insurance (collision and comprehensive) is a waste of money for the average person, but health or homeowners insurance isn't.


The loss from not having car insurance is not bearable for most people, and therefore is not a waste of money. First, your own car can be tens of thousands of dollars. Most people can't take that hit. Then, even if you drive a pile of junk, the car you hit may be worth tens of thousands of dollars. Finally, there's medical bills if you injure someone.

Oh, yeah, and then there are legal requirements that you have car insurance. So no, I don't buy the idea that car insurance is a waste of money.

Phone insurance? There I agree with you.


I should've specified, I meant collision and comprehensive insurance. Liability insurance is legally required, as you said, and I personally take the maximum my insurer offers. Personal liability in an accident is potentially unbounded. This is an example of an unbearable loss that must be protected against, even if it costs money on average.

(If there's an outstanding loan on the car, the lender requires collision and comprehensive to cover their asset. Something to consider when financing a car - it's not just the car payment, but also the additional insurance you have to carry)

Used cars usually are not "tens of thousands of dollars". If one can't afford even a few thousand dollars in a pinch to buy a used car to replace one's current ride, then one probably needs collision and comprehensive too.


> Personal liability in an accident is potentially unbounded. This is an example of an unbearable loss that must be protected against, even if it costs money on average.

Why is it an "unbearable loss"?

There is a saying about that:

If you owe the bank $100 that's your problem. If you owe the bank $100 million, that's the bank's problem.

That may seem unethical, but if you think about it, it's the rich people in our society who need to be careful to manage their risks.

Of course, if you have assets to lose, you need to protect them (by buying insurance), but if you are an "individual with limited resources", it's not much different with or without insurance.

What I often hear is that poor people (with limited resources) need to buy extra insurance to protect rich people from their risks. I disagree.


They were pointing out that liability in the case of a car crash could claim approximately all of your assets, whereas a comprehensive loss can at most claim the current value of your car.

You have a good point, though: if you have essentially no assets to go after, liability insurance isn't worth much to you. This is why it is mandated for auto insurance (and why many places have underinsured-motorist insurance, too).


If you're rich enough to not need collision & comprehensive insurance (i.e. you can put down a few thousand dollars to buy a car at short notice) you probably have assets worth protecting with liability insurance.


If insurance were cost-effective in the long run, insurance companies wouldn't be profitable.


That is only true in a zero-sum system. If the insurer can take, say, a 3% profit margin while enabling the insured to take a riskier approach that produces 7% more profit on slimmer margins, both can profit. There are risks to that sort of leverage for both parties, especially in black swan events.

It is akin to a loan, essentially: the one taking it may be handing profits to someone else, but that doesn't mean it isn't cost-effective.
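
A toy illustration of that non-zero-sum point (all numbers hypothetical):

    # A business faces $1,000/year in expected losses.
    expected_losses = 1_000.0

    # The insurer charges cost plus a ~3% margin.
    premium = 1_030.0
    insurer_profit = premium - expected_losses   # ~$30/year on average

    # Offloading the risk lets the business run leaner, earning say 7% more
    # on a $2,000/year baseline profit.
    extra_profit = 2_000.0 * 0.07                # $140/year

    business_net_gain = extra_profit - insurer_profit   # $110/year
    print(insurer_profit, business_net_gain)     # both sides come out ahead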


You aren't protecting against the expected value though, you are protecting against variance.


No argument there


Insurance schemes do not have to be run for a profit. It's not a law of nature.


This does not sit nicely with the fact that there are many large and successful insurance companies.


Most insurance companies make their money on their float: the premiums they hold, and can invest, between money coming in and claims going out.
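
A back-of-the-envelope illustration of float (all numbers hypothetical):

    premiums_held = 100_000_000   # collected now
    claims_paid   = 101_000_000   # paid out later - a $1M underwriting loss
    annual_return = 0.04          # earned by investing the float in the meantime

    investment_income = premiums_held * annual_return   # $4M
    underwriting_loss = premiums_held - claims_paid     # -$1M
    print(investment_income + underwriting_loss)        # +$3M overall profit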


Why? I'd think the whole point of insurance is that everyone (on average) loses out in the long run?


By that same average, everyone loses out on the lottery in the long run, too.

But for the few who don't lose, it's worth it. That applies to both insurance and the lottery.

The difference, as has already been pointed out, is that insurance prevents a bad scenario, whereas the lottery sometimes results in a good scenario.


>"Multitask; text on one device while emailing on a second, and perhaps conversing on a third."

The result: shitty texts, shitty emails, and a shitty conversation.

I am not sure who in their right mind actually advocates that.


The problem could be twofold: 1) our tools push us towards multitasking. Notifications pop up and there's a sense of immediacy where you need to deal with it now (texting back) or it will get "lost" in the stream of new notifications, never to be dealt with. 2) people today expect everything, even communications, to be responsive, leading to a low-content response immediately as opposed to a high-quality one later.

For #1, modern tools like Slack favor immediate responses, since they're not set up to treat conversations or requests as manipulable items. With email you can drag and drop messages into folders and deal with them one by one. With Slack, once you've "seen" a message, there's really no way to put it somewhere to remind yourself to respond. (If there is, it's not obvious to me.)


You can at least click on a message's context menu and select "Remind me" to set a reminder about it. But yeah, I do agree with you. Slack messages are less "manipulable".


I think no one advocates it but many practice it.


The author lost me at the end with the car analogy. Who drives fast on icy roads? I try to drive the right speed for conditions. I use the right amount of efficiency for the job.


It’s telling that ancient philosophers didn’t include many of the things we value in our hyper-corporate society in their definitions of a good life. Leisure (N.B. not entertainment) is dead in America, and it has significant consequences. I wish it weren’t the case, but I really do find that I do my best thinking when I give myself time to do nothing or go for an aimless walk. Efficiency as a virtue should be reserved for machines, not human beings.


It appears the definition of efficiency here is just "fast", which is not what efficiency is. I've worked with plenty of inefficient codebases that were intrinsically fragile and difficult to work with, and cleaning them up made them more efficient and more robust. Efficiency is about doing the most in the least number of steps, not just minimizing features and making things fast at the expense of stability.


> The creation of ‘option markets’ means that you don’t have to go to the trouble of buying a stock that you’re going to be selling soon anyway. You can just promise to buy it, and then sell it at a price and date specified by the option contract.

Options, as their name implies, aren't a promise to do anything; they provide the right, but not the obligation, to buy or sell at a certain price.
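
A minimal sketch of that asymmetry for a call option (prices are hypothetical):

    # A call gives the right, not the obligation, to buy at the strike price.
    def call_payoff(spot: float, strike: float) -> float:
        # Exercise only when profitable; otherwise let the option expire.
        return max(spot - strike, 0.0)

    print(call_payoff(120.0, 100.0))  # 20.0 - exercise the right
    print(call_payoff(80.0, 100.0))   # 0.0  - walk away; no obligation to buy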


Note that in situations where exponential growth is possible, lower latency means much faster growth, and this can easily happen faster than you have time to react.

Exponential growth at 5% a week is a lot different from 5% a day or 5% a minute. Social media would be less dangerous if there were more automatic delays before a message you post appears to others.
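
A quick back-of-the-envelope for how the same 5% rate behaves at different latencies:

    import math

    # At rate r per period, doubling takes log(2)/log(1 + r) periods.
    doubling = math.log(2) / math.log(1.05)   # ~14.2 periods

    print(f"5% per week   -> doubles in ~{doubling:.0f} weeks (about 3 months)")
    print(f"5% per day    -> doubles in ~{doubling:.0f} days (about 2 weeks)")
    print(f"5% per minute -> doubles in ~{doubling:.0f} minutes")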


There was a time when GCC had to be installed by compiling from source. Doing so involved compiling the GCC source with the native compiler, and then using the resulting binary to compile the GCC source again.

I remember the install instructions advising "festina lente". The Latin phrase translates to "make haste slowly".


This reminds me of Frank Herbert's Bureau of Sabotage (https://en.wikipedia.org/wiki/Bureau_of_Sabotage).


Aren't these ideas well known in the field of economics? They also don't sound too alien to anyone marginally acquainted with psychology (e.g. by means of watching any of Mr. Schwartz's talks).


The mix of font styles in the PSYCHE title strip on top really bothers me. Are there any typography experts in the house that can explain what concepts could be behind that choice?


They should say "efficiency" is dangerous. Actual efficiency is not bad at all. For example, focusing on one task until it is done is sometimes the most efficient way to do it.


Why not move fast and break things? I think it is a trade-off between being efficient (and breaking something) and being stable.


This is the most important lesson of 2020. But how do I find an employer who will agree with me?...


I think we need to think about long-term efficiency instead of short-term efficiency.


Efficiency would not create any big inventions. That's a big problem.


This article badly mis-attributes why these systems are 'efficient' in the way they are.

The lessons of the '08 crisis were crystal clear: anyone who planned for difficult times and kept a contingency lost out. Then look at 2020 - one of the first things the US government did was bail out the financial system. Bailing out banks is almost a Pavlovian response at this point.

That is why there is no robustness built in: the capitalists have correctly identified that they will not be allowed to go broke. Businesses are optimising based on an expected government response. What use is contingency planning when the contingency isn't allowed to eventuate? Excessively risky ventures are promoted in the good times and buffered in the bad.


Reads like GPT-3. Lots of words to say nothing.


Reminds me of the book Slack by Tom DeMarco


Can someone explain what this means?


TL;DR: An author who is paid to write words gets confused when a word has more than one meaning. https://en.wikipedia.org/wiki/Pareto_efficiency =/= "efficiency"


And what's the most efficient way to slow down?


I study empathy, and have often said that efficiency is the opposite of empathy. Whatever we want to call it, the human element is overlooked when we strive for the single dimension of efficiency. Hannah Arendt famously talked about the banality of evil. During the Holocaust, the counting and categorizing of people was a flattening, a reduction of that important human element we all share.


Transcript of a Calvin&Hobbes strip:

Calvin's Dad, sitting at his desk: "It used to be that if a client wanted something done in a week, it was considered a rush job and he'd be lucky to get it."

CD: "Now, with modems, faxes, and car phones, everyone wants everything instantly! Improved technology just increases expectations."

CD: "These machines don't make life easier - they make life more harassed."

Calvin, in the background: "Six minutes to microwave this?? Who's got that kind of time?!"

CD: "If we wanted more leisure, we'd invent machines that do things less efficiently."

https://pics.me.me/it-used-to-be-that-if-a-non-with-modems-3...


Pace is relative, and I think teams will generally average out in terms of pace. Finding a pace that fits you and your style is an essential factor in better work efficacy. My boss works really fast, works constantly, and is probably a workaholic - a pace which I cannot and will not try to match, so I have to work at my own pace. This throws off project planning, because the PMs are used to his pace and don't give me an optimal amount of time on projects, so I find myself rushing, which of course makes me feel bad and triggers my impostor syndrome. Eventually this pace will break down and become unmanageable for me and I'll have to quit, which will hurt their production even more.

Pace and efficiency are definitely linked, but I don't necessarily think faster = more efficient. And there are folks here mentioning stability, also definitely linked, but I think maximum efficiency would lean more toward stability than pace.

Linking pace with efficiency also seems to create the idea that "faster will be the winner" which, you know, that's a whole thing, introducing competition in an arguably healthy way.

I generally tend to find that emotion comes before logic: I build something that could be considered efficient, but a feeling says "this could be more efficient", and then logic jumps in and does the work.

Just some stream of consciousness writing here, food for thought maybe, or extremely poor quality ideas!


> We worship efficiency. [...] Multitask; text on one device while emailing on a second, and perhaps conversing on a third. Efficiency is seen as good. Inefficiency as wasteful.

The premise is total bullshit. No one thinks multitasking is good.

Barry Schwartz consistently doesn't get it, but he's good at presenting "unpopular opinion"-style articles.

People say multi-tasking is great! But it's not! Upvote if you agree.

Efficiency is why your baby doesn't get stuck in the womb, killing two people.

Bot sniping on GrubHub is scary - that's an efficiency that actually worries me. But it's not within this article's scope.


The author writes,

> Arguably, a little friction to slow us down would have enabled both institutions and individuals to make better financial decisions.

But the article fails to bring up the US interest rate. If interest rates had been allowed to rise, that would have been just the sort of "slowdown" the author is looking for.

I find it remarkable that this article could be so insightful yet lack even a single mention of this simple and fundamental fact. Rising interest rates are the market's implementation of precisely what the author is asking for.


Say you use the tips here (https://efficiencyiseverything.com/time/) to save time, particularly the first example about shoes.

You are going to have more time to do anything, relax, think, clean, whatever.

Why would spending more time putting on shoes be better?


"Premature optimization is the root of all evil" -Knuth


TL;DR.


Striving for efficiency is bad science.

The dream takes precedence over the reality.


This article suffers from a misconception of efficiency. A car that gets more miles per gallon than another is more fuel-efficient, not more efficient overall. A system of local optima is an inefficient system. The author is really asking us to make life more efficient by considering efficiency in a broader scope. Efficiency is still good.



