Ephemeralization (wikipedia.org)
146 points by dearwell on Oct 25, 2021 | 68 comments



This is the sneaky, flawed premise behind many instances of what we call optimization. I have a Law of Optimization: the closer you get to optimal on your chosen metrics, the more cost is shifted to unmeasured externalities. (I imagine someone has named this.) A stronger version would be to say that the distance to optimal is inversely proportional to the externalized cost, so that the total externalized cost goes to infinity as you squeeze out ever-smaller increments of gain with respect to your chosen metrics.

The important corollary is that you can get 80% of the way there with very little externalized cost, and that going too far will eventually reveal the oversimplification in your cost model.
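The stronger version can be sketched as a toy model (my own illustrative assumption, not anything measured): suppose reaching fraction `g` of the optimum on the chosen metric pushes cost `k * g / (1 - g)` onto unmeasured externalities. Then the first 80% is cheap and the cost blows up as `g -> 1`:

```python
def externalized_cost(g, k=1.0):
    """Hypothetical externalized cost at fraction g of optimal (0 <= g < 1).

    Toy model only: chosen so cost diverges as g approaches 1.
    """
    return k * g / (1.0 - g)

for g in (0.5, 0.8, 0.95, 0.99, 0.999):
    print(f"{g:.3f} of optimal -> externalized cost {externalized_cost(g):.1f}")
```

Getting from 80% to 99% costs roughly 25x as much externally as the entire first 80% did, on this (made-up) curve.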

You can move an object with near-zero energy, but it'll take 1000 years. You can get to net zero carbon emission by shifting it to someone else or dumping it somewhere it isn't measured or is a temporary sink that will eventually re-release. In my own area, you can reduce garbage collection overhead to zero by either never freeing anything (so the externalized cost is memory) or shifting the GCs to happen in between the timed portions of a benchmark (and this often happens accidentally, especially with a "no regressions" policy!)
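The benchmark-gaming version is easy to sketch in CPython (using the real `gc` module; the workload and numbers here are just illustrative). Collection cost isn't eliminated, it's merely shifted outside the timed region where nobody is looking:

```python
import gc
import time

def timed(fn):
    """Time fn the way a naive benchmark harness might."""
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

def workload():
    # Allocate lots of short-lived cyclic garbage.
    for _ in range(10_000):
        a, b = [], []
        a.append(b)
        b.append(a)

# "Zero GC overhead" inside the measurement:
gc.disable()
elapsed = timed(workload)   # the collector never runs in here
gc.collect()                # ...its cost lands here, unmeasured
gc.enable()
print(f"measured: {elapsed:.4f}s (GC cost externalized to the gap between runs)")
```

A "no regressions" gate over the `elapsed` number alone would happily wave this through.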


A consultant I worked with compared the organization we wanted to measure to an air mattress: If you press it somewhere, i.e., measure a specific point, the air goes elsewhere, and another part is inflated. He used this analogy to explain the importance of taking the entire situation into account, by applying full cost accounting.


If you don't want to take the credit for it, you could call it the Pareto-Goodhart Law of Optimization, and state it something like this:

"The closer a system gets to an optimal state, the greater the propensity for the agents attempting to further optimize it to externalize the costs of the system."


I think this may explain part of why people tend to think of companies as "efficient" and governments as "wasteful": companies have more leeway to choose their metrics and externalize the unmeasured things.


If you think government agencies don't externalize costs, then I have bad news for you! If you look at how communism went in practice, you'd see that those agencies were much better at externalizing costs than Western companies. As long as they could optimize for the target measure they were fine, and could blame everyone else when things didn't work.

People think of governments as wasteful since governments never seem to manage long supply chains well. They can run services like roads, electricity, schools, healthcare, etc. that are simple as long as all the necessary tools and materials are supplied by the private sector. They seemingly can't run stuff like medicine manufacturing, heavy machinery engineering, etc.


"They seemingly can't run stuff like medicine manufacturing, heavy machinery engineering etc. "

Well, I now have an image of Soviet Russia's state factories spitting out tank after tank...

Governments can handle this. Also, "government" can mean very different things, but I agree that, generally, they are not able to handle a complex supply chain. Or at least not as well as a distributed market of private entities.


Yeah, I should have added "efficiently" in there. And manufacturing has gotten a lot more complex since back then; I'm not sure they could properly manage a modern supply chain with the related electronics etc. without relying on the private sector to provide things. But we know both China and the Soviet Union gave up trying to accomplish things without the private sector, so I'd assume they couldn't.


More simply, companies can solve 80% of the problem, while governments have to solve 100% of the problem (i.e. supporting people in wheelchairs in remote local communities without smart-phones), which requires vastly more effort.


Your Law of Optimization reminds me of the (often popularly ignored) denominator in the famous e = m c^2 equation.

I believe the full version has a sqrt(1 - (v/c)^2) denominator on the right side; I think the Lorentz transformations motivate this.

The upshot matches your observation in spirit: the closer you get to the optimum (the celeritas/speed of light) the more energy you require.


> Your Law of Optimization reminds me of the (often popularly ignored) denominator in the famous e = m c^2 equation.

> I believe the full version has a (1 - (v/c)^2) denominator on the right side ….

I think the question is what you mean by `m`. If it's inertial mass, then I believe no correction is necessary. The point is that the inertial mass `m` is the rest mass `m_0` divided by `\sqrt{1 - (v/c)^2}`, as you say.
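For reference, the standard textbook form both comments are circling (writing m_0 for the rest mass):

```latex
E = \gamma\, m_0 c^2,
\qquad
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}},
\qquad\text{so}\quad
E \to \infty \ \text{as}\ v \to c.
```

At rest (v = 0, so gamma = 1) this reduces to the famous E = m_0 c^2.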


yes!


Ironic that the man who wrote "make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience" is most tagged with `e = mc^2`.


Great comment. This is something I've thought for a long while, especially as we see the suppressed costs of "optimized systems" rear their ugly head against unplanned contingencies.


A very common example of this is people suggesting we remove private profits from the modern economy. It seems like a really good way to cut down waste and save costs in theory, but it has never worked in practice when done for the whole economy. It does work really well for some problems, but for many others those private profits seem to be a necessary evil for the system to work well.

What those people are missing is that private profit is a really good instrument for allocating resources and performing quality control efficiently in many sectors. Removing it means you now need to manage all those resource allocations and quality controls via other means, and so far we haven't found anything better for most business-to-business supply chains.

Governments often do well in the business-to-consumer space though, since in many cases cutting costs there means consumers get worse service. But in the business-to-business step, both sides want to push costs onto each other, so they negotiate on who should do what so that both earn money, or they go do business with someone else.


> A very common example of this is people suggesting we remove private profits from the modern economy. It seems like a really good way to cut down waste and save costs in theory, but it has never worked in practice when done for the whole economy. It does work really well for some problems, but for many others those private profits seem to be a necessary evil for the system to work well.

> What those people are missing is that private profit is a really good instrument for allocating resources and performing quality control efficiently in many sectors.

Such a ‘whole economy’ system was being built by Allende’s socialist government in Chile in the form of project Cybersyn (with the help of a well known British cyberneticist). This was before the Chilean bourgeoisie, backed by US imperialist forces, coup-ed Chile: https://www.youtube.com/watch?v=RJLA2_Ho7X0

An exciting project called http://valueflo.ws is being developed by http://mikorizal.org and others that reminds me a lot of Cybersyn.


Interesting thread on Cybersyn: https://news.ycombinator.com/item?id=24764727


This can absolutely be true for many systems and optimisations, but Fuller's premise is that new technology will allow efficiency improvements or substitutions that would not be possible in an extant system.

For example, one might come up with many ways to optimise the use of meeting rooms in an office: maybe making the rooms smaller to allow more simultaneous meetings, but now they're too cramped and you can't get as many people in a meeting. Maybe you add screens and network connections for presentations, but that's expensive. Tradeoffs, tradeoffs. However, if you switch to a new technology like desktop videoconferencing, you can get rid of the meeting rooms completely. Or you can reduce the number of meeting rooms by 5x and make the remaining ones bigger and higher tech, for much better in-person meetings when they do happen, at much lower total cost.

New technology can completely eliminate or restructure whole categories of optimisation tradeoff.


> the closer you get to optimal on your chosen metrics, the more cost is shifted to unmeasured externalities.

This is not inevitable in my experience. You can achieve optimizations that also reduce the dependency on external things. For some of us, this is one of the objectives of optimization. 3rd party dependencies == externalized cost.

The real game for me is to solve the problem using correctly-allocated empathy. Building a product to streamline back-office operations? Empathize with the user and the business processes. Building an options exchange which must handle at least 1 million orders per second? Empathize with the computer hardware. Optimizing for one factor vs. another, and your willingness to take on external cost, should all be driven from this perspective.


There's still going to be something you're not measuring, and this optimisation process is gonna be making that whatever-it-is worse.

As I'm reading it, the fundamental principle we're talking about here is that there's always a point beyond which optimization is a tradeoff.


> You can get to net zero carbon emission by shifting it to someone else or dumping it somewhere it isn't measured or is a temporary sink that will eventually re-release.

Often referred to as tragedy of the commons.


This article is about technological progress enabling doing more with less. I’ve been having a tangentially related thought about general project planning as it pertains to technological progress.

The thesis is: Unless you can create an optimistic project plan that results in the project’s completion within 5 years, the money would be better spent on doing basic research.

The thesis is informed by the ITER fusion reactor project, which was started in 2007 and whose current target date is 2025 (but will likely slip, as it has before), but that date is for first plasma formation and not actual deuterium-tritium fusion, which won’t happen until 2035.

Meanwhile, during this timeline, advances in materials science, namely higher temperature superconductors, have opened the possibility of significantly smaller and simpler designs, some of which are being plausibly pursued by startups (eg SPARC).

So I wonder if a general heuristic to guide science funding would be something like “if the optimistic timeline is > 10 years, invest instead in basic research that could potentially develop technologies that would shorten that timeline”.

This thesis is also generally informed by the relative success of New Space (eg SpaceX) companies compared to Old Space (eg Boeing) companies. Elon has famously said “if the plan is long, the plan is wrong”, and I think that philosophy has demonstrably worked out well for SpaceX.


Agreed, for hardware and manufacturing related development it is very difficult to see out past 5 years where anything new is being attempted. Within 3-5 years you are largely constrained by current available manufacturing, fabs, and processes, but farther out there may be better choices. You may be better served by doing the research (or collaborating with a supplier) to build new production capacity (understanding their challenges) for anything further out.

Look at EUV lithography, where a huge industry flipped on its head in 5 years, while X-ray lithography had been in development (and written off) for so many decades it had to be renamed. That required real R&D, not tweaking currently available equipment, which led to asymmetric advantages and huge profits (Zero to One).



ITER? 2007? I heard about that in the mid 90s. I can still hear the voice of Prof. Miri [1] in my head: "toroidal,... poloidal,... explain it to me"

Had to look it up [0] and found that it started in the late 70s even!

Interesting aspect you are bringing up there, lots of truth in it.

[0] https://en.m.wikipedia.org/wiki/ITER

[1] one of the nicest persons I met at the University https://iranglobal-info.translate.goog/node/57691?_x_tr_sl=a...


Etymologically speaking, I find the adapted meaning of this term irksome. Given that ephemera is a 16th century word[1] that refers to "transitory written and printed matter" and ephemerality "the quality of existing only briefly"[2], one would expect the standard definition for ephemeralization to be "the process of creating ephemera" or similar, as opposed to "ability of technological advancement to do "more and more with less and less until eventually you can do everything with nothing" ".

[1] https://www.lexico.com/definition/ephemera [2] https://en.wikipedia.org/wiki/Ephemera_(disambiguation)


Now tell me how you feel about the common phrase "I'll be there momentarily."

(I may logically accept that language follows usage, but I just can't not be bothered by that one. Also, people don't seem to like it when I reply "I hope you're right.")


Is it so hard to accept that "momentarily" refers to the period of time before arriving, rather than after arriving?

Time is a subject that is hard to avoid using idioms to convey, like when people ask "What time is it?" without specifying what "it" refers to.


> Is it so hard to accept that "momentarily" refers to the period of time before arriving, rather than after arriving?

Yes?

I don't need to make up a different subject for the sentence when it's laid out precisely. If it were "I'll be there briefly", would you still try to contort the words into something like "The time required to get there will pass briefly"?

(Though to argue the other side, "I'll be there shortly" doesn't bother me in the least...)


Interestingly enough, I'm less bothered by that. It could've been worse, e.g. "I'll be there momentishly".


I think the definition you refer to in [1] is apt.

"Things that exist or are used or enjoyed for only a short time."

They are used or enjoyed for a short time because the cost or burden of doing so is extremely low. They are cheap, easily accessible, transportable, convenient, disposable. These are the attributes that make them ephemera, and that Fuller saw as ephemeralizing things.


When one person can accomplish the work of fifty men, ideally the quality of life of all fifty-and-one persons increases. Buckminster Fuller echoed this sentiment and suggested that through innovation we would one day be beyond the need to have jobs and daily labors.

Capitalization on efforts, however, resumes unabated.

Capital-ism seems to not care so much about how much is accomplished, but looks for how much can be exchanged for concentration of resources.

While I enjoy Fuller's notion, wealth is already difficult enough to amass in a single locus, and I wonder how we can achieve both meaningful livelihood and ease of living without sacrificing one or the other.

Yes, everything approaches the push-button rapidity, but that is both things unhelpful and helpful. Exaggerative motions in human intent that slide towards infinity in either direction. Perhaps with the power of the collective come to be, we can protect ourselves from ourselves. Perhaps only our best selves will flourish.

With innovation and with invention, it stands to reason that the lives of the collective lot ought improve, yet we still find disparity and disparity growing [at that].

So what on earth to do, what on earth to do. When gains are concentrated in the few and the ease of the many appears to be squandered. What on earth to do, what on earth to do, when there's so much accomplished, and so much left to do.


CDs and DVDs are a good example of this. I used to have a wall of my apartment dedicated to media, all completely immaterial now.


That's also a good example of how it's often mostly or partly faked through displacement, though. Your CD and DVD collection has been replaced by round-the-clock maintenance of the storage and transmission of that data: from the record label or movie studio, through various data centers, via a number of middle-man companies all requiring their own ever-changing and complex infrastructures, through a system of caching and routing and cabling to your house, over and over, forever. The only dependencies for your old collection were shelving and the electricity grid; otherwise the discs were stamped out of plastic once and good for at least a few decades.


There was a lot of plastic stamped out, and there was a round-the-clock distribution system for shipping those pieces of plastic to middle men distributors and stores before they got to you. There's a huge maintenance cost to these cloud services, but there's plenty of evidence that the environmental cost of that maintenance is much lower than the cost of those DVDs.

Is it perfect? No, I'd much rather a county-or-state library system ran the media library, with a set of servers and data that were shared, archived, and synced with the Library of Congress. But digital delivery, the way it currently works, is by no means more expensive than buying DVDs, in terms of cash, middlemen, or natural resources.


Indeed, books too. I always wanted to build a library in my house when I was younger; as the years go by, the number of books I have hard copies of gets smaller and smaller. Now I just have a bookshelf with reference books, and half of it is empty. Everything else is on the iPad.

A friend of mine had an entire room dedicated to vinyl LP's, now he has a phone.

It is great that I can carry round a whole library but I used to spend time just fiddling with the books, arranging them, dusting them, taking them out and looking at them. I know I could just build a library, and every now and then I buy some paper copies in a fit of nostalgia, but they just gather dust now. A lot of interests and hobbies are all electronic now.


And the mass of material in your apartment's structure is only one kilogram now, too. Your car weighs a few grams. Your food, milligrams. You only drank 4 millilitres of water last year.

Ephemeralisation applies only to things that are essentially informational, not material. Somehow, despite dematerialisation, the world ends up using more stone and steel every year.


The metaverse, NFTs, crypto, and digital goods are pointed to as the next wave.

Staunch proponents will tell you that we'll be living in VR metaverses, doing work remotely, earning crypto, buying virtual properties, etc.

I'm personally bearish on crypto/NFTs, but kids like buying digital goods in games. And VRChat is blowing up and looking more and more like a threat to Facebook.


Well you can only physically move an object so efficiently.

Unless everything we know about physics is wrong - you'll never be able to move an object w/o any energy. For that reason - it's hard to imagine a world with 7Bn people flying around at mach-2.

Our productivity gains are likely on an S-curve. It's hard to say where we are in the curve.


That's a poor example... The theoretical energy expended to move an object is zero.

All energy losses in moving things are due to inefficiencies that we could theoretically engineer away. For example air resistance could be eliminated with roads becoming vacuum tubes. Wheel friction can be eliminated with magnetic levitation. Braking energy can theoretically be fully recovered, etc.

Today we don't do that because we care about other things (cost etc), but if in the future energy costs went up dramatically, engineers would reconsider...


> The theoretical energy expended to move an object is zero.

The theoretical energy expended to keep an object moving is zero. The theoretical energy transferred to the object to get it moving is deeply non-zero, even as the energy transferred from the object to get it to stop moving is also nonzero. If you have a finite energy budget, even under theoretical conditions, you can only have a finite number of objects moving up to speed at any given time.

I know you acknowledge this in the rest of your message, but I don't think the original example is poor for the reason you give.


The theoretical energy expended to move an object tends toward zero, without limit, as the allowed time increases. Which is almost the same thing.

(Allowing it to take 12 billion years to move an inch is rather useless, but that's the same issue with the ephemeralization argument in general: it ignores all other measures of "cost".)
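That tradeoff is easy to make concrete. In an idealized frictionless world where you cross distance d in time t at constant speed v = d/t and the kinetic energy is the whole bill (a sketch under those assumptions, not real transport physics), the required energy is E = m d^2 / (2 t^2), which falls toward zero as the allowed time grows:

```python
def min_kinetic_energy(mass_kg, distance_m, time_s):
    """Kinetic energy (J) to cross distance_m in time_s at constant speed,
    in an idealized frictionless world where braking energy is discarded."""
    v = distance_m / time_s          # constant cruising speed
    return 0.5 * mass_kg * v * v     # E = (1/2) m v^2 = m d^2 / (2 t^2)

# Moving a 1000 kg object 1 km: 10x the patience costs 1/100th the energy.
for t in (10, 100, 1_000, 10_000):
    print(f"t = {t:>6} s -> E = {min_kinetic_energy(1000, 1000, t):,.2f} J")
```

Energy scales as 1/t^2, so it tends to zero without limit but never reaches it for any finite time, which is exactly the asymptote being argued about above.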


There is a hard limit. If you want to get somewhere, you have to get there before the expansion of the universe moves it away forever. Our accessible universe is constantly shrinking. It's a natural law that places firm discontinuities on optimization.

This may seem silly, but microcontroller shrinking is running into a similar problem with quantum noise, which is by far the closer boundary.

The accelerating expansion of the universe also importantly bounds available time, which makes the specific equation governing how fast optimization can be achieved relevant. For example, if the cost of increasing intelligence is sufficiently exponential past a certain point, a true singularity may never come. And if it's more like busy-beaver than exponential, then it may be impossible for it to ever get very far indeed.


I really want to call this argument pedantic, but so was mine. And I can't argue against your argument's correctness or validity.

(Well, as far as we know, anyway. Expansion is still the best fitting theory, but there are observable anomalies. Your argument would still win in some other form, perhaps involving Planck Length or similar.)


There are fundamental thermodynamical limits to certain processes involving energy transfer, conversion etc. Heat transfer comes to mind. Those cannot be engineered away.

For simple mechanical systems, yes, inefficiencies can be removed, but it's more like externalizing them. How is the vacuum created and maintained? That requires a compressor, an inherently not lossless machine.


And for that matter, data can only be processed and transmitted at a certain efficiency.


> is the ability of technological advancement to do "more and more with less and less until eventually you can do everything with nothing,"

Curious, are No-Code tools an example of this?


Smartphones are an example of this, I'd say.

Low/No code usually allows you to do some things, but not everything... IMO.


Smartphones... yes. TVs are also an example. I have a flat screen from 2005 and one from this year. The weight difference is hard to believe.


I worked on a team that built an "ephemeral" system. Part of that was to ensure that no state was logged. They were so proud of it being built that way.

Problem was, the system failed constantly and we had no way of knowing what the issue was without ssh'ing directly into a machine and retrying the steps by hand.

I get what they were going for, but at the end of the day, their opinions on how ideal systems "should be" ended up destroying their project and ignored fundamentals.

Example: one of the main deployment hosts was constantly crashing from disk-space issues for over a year. It turns out they had provisioned the host with 4GB of storage and 64GB of RAM. They never troubleshot the host, and just restarted its services whenever it crashed!


That seems like a different kind of ephemeralization than the one described in the article, but interesting nonetheless.


Most of the technology I work on wouldn't be worth doing if the global population hadn't grown 10x in the last century or so. You need a large mass of people each willing to pitch in a penny for an improvement to their lives from expensive technology, rather than a smaller number of enthusiasts.

This brings economies of scale and suddenly 1000 things that wouldn’t be worth doing make a new thing that wasn’t predicted. The neo-malthusians didn’t predict this happening with innovation and markets (reminds me that the US patent office almost closed 100 years ago saying ‘everything has already been invented’).


Is Malthusianism even believed anymore? The recent statistics I see show world population growth slowing greatly. From what I understand, America and most of the developed world need more reproduction and immigration if they hope to sustain their economic growth. China lifted the one-child rule and no one took the bait, citing the same reasons everyone else does: having kids is really expensive, and the future of the world with regard to climate change, environmental concerns, etc. is quite shaky.


Having kids has been MADE very expensive, culturally.


Such a nice, fancy-sounding word for something that is so great and powerful, but which is also a vast anti-Enlightenment force. We used to see the world about us and could monkey with the things about us, but high-tech goods are deeply resistant to observation and understanding. What used to be apparent and open for interaction has become like magic to us, and we humans, the toolmakers, understand and see less of our ephemeralizing infrastructure and tools.

Also, ephemeralized systems are often highly capable, valuable resources that only big powers in the world can afford or operate well, much less build or develop. This tends to make ephemeralized technology more prescriptive: it is general and adaptive and highly capable, but rarely do we see that power opened up and offered to random people coming into a shop off the street, or to random consumers. Consumers are given a crafted experience, a set of flows carved out of the more capable, generalized ephemeralized systems.


Maybe.

Thought experiment: replace ‘ephemeral’ silicon based technology with what we know of biology, which at the small scale is far more advanced than anything humans have created.

The fact that bio is so information theory intelligible only increases its ability to ‘blend into the background’ and ‘become ephemeral’.

(E.g. your body is constantly creating and destroying iron nanocrystals in protein cages we can’t synthesize to regulate the amount of iron ions involved in constructing a proton potential to move muscles.)


THANK YOU for posting this. Another piece of the puzzle is in place for me.

I myself sometimes thought-experiment about this (the ephemeralization tendency) while running, taking the tram, etc., but until now I didn't realize that much bigger thinkers than I am ;-) had already described this.

The wikipedia article with referenced other pages and this HN topic are a great source of info for me. Thanks for it!


Copying and transporting data is a great example of this. That used to be extremely expensive, but today it is all but free. Today you can, in seconds, copy the equivalent of all books stored in the world 100 years ago and send it to the other side of the world, and the amount of money that costs you is less than a second's worth of work.


This is what I always point to when people say capitalism is unsustainable because it supposedly "requires infinite growth in a closed system."

It doesn't require that, it only requires endless incremental improvements in efficiency, which is perfectly possible, given that even just the tiniest incremental improvement on a massive problem can result in huge productivity gains. A great example is farming. We produce more crop output than ever before all while using less land, labor, and energy.


Right, the outputs are higher than ever with the least conventionally measured inputs. Which was great for a while, but we're well into the tail portion of the curve where the advances are all accompanied with ever greater shifts of cost to the unmeasured externalities. Modern farming is a hyper-efficient process of converting fossil fuels -> fertilizer -> corn & soy -> food and its many imitations, along with massive environmental costs (CO2, runoff, freshwater usage and groundwater depletion, ...)


> In 2020, a meta-analysis of 180 scientific studies notes that there is "No evidence of the kind of decoupling needed for ecological sustainability" and that "in the absence of robust evidence, the goal of decoupling rests partly on faith". [1]

Economic growth without ecological collapse is currently capitalist fantasy and wish-casting.

No one sensible is arguing that efficiencies can't be found. They're pointing you at the very real crisis that we have no idea how to decouple economic growth from environmental resource consumption such that environmental disaster is averted.

1. https://en.wikipedia.org/wiki/Eco-economic_decoupling#Lack_o...


I like to think mathematics is really all about this *theoretical* pursuit.

Assume nothing, explain everything. Thus I claim that mathematical constants are a failure of the mathematicians. This level of constantless mathematics would have its users recalculating pi every single time.


I would argue that constants don't assume something. In fact, the opposite. Take the equation for air resistance, for example: F = b·v^n, where n and b are constants that depend on the environment, so it is actually a layer of abstraction.
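A quick sketch of that abstraction (the coefficients below are made up for illustration): the same F = b·v^n shape covers both linear (Stokes-like, slow viscous flow, n = 1) and quadratic (fast turbulent flow, n = 2) drag, with b packing up fluid density, geometry, and so on.

```python
def drag_force(v, b, n):
    """Generic drag law F = b * v**n; b and n encode the environment."""
    return b * v ** n

# Illustrative (made-up) coefficients for two regimes at v = 2 m/s:
stokes = drag_force(2.0, b=0.5, n=1)      # slow, viscous regime
quadratic = drag_force(2.0, b=0.5, n=2)   # fast, turbulent regime
print(stokes, quadratic)  # 1.0 2.0
```

The formula stays fixed while the constants absorb everything situation-specific, which is the "layer of abstraction" point.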


Would you look at that. Turns out there's a word to describe the utopia high level language programmers use to justify the bloated software they use and produce.


I worry that Heylighen's algorithm would look like Facebook's with similar results.


An interesting thought experiment, really.

But when I read:

> the ability of technological advancement to do "more and more with less and less until eventually you can do everything with nothing"

I add the footnote: "Terms and Conditions may apply" :)

The blind spot of the idea is: limits. Should it be rephrased "eventually you can do almost everything with almost nothing", I'd have dropped the "thought experiment" label.


Agree 100%.

Stated another way: asymptotic functions convincingly appear that they'll hit zero--though obviously never do.

Same reason I'd be shocked if physicists ever achieve absolute zero in a lab.


Realizing that you don't exist, as expressed by nondualist spiritual traditions, would be the ultimate endpoint of this.


Or, realizing that everyone else is a p-zombie (except for you :))…



