No-Code and the IKEA Effect: Software lock-in evolved to make us never churn (capiche.com)
120 points by maguay on Oct 31, 2020 | 76 comments



I've come to the conclusion that code is temporary, and is meant to be discarded after a while. Hardware changes, requirements change, people change. The things that are permanent are data and methods.

That's why I think as long as you have documented and defined your methods (i.e. business logic) well, and your data is portable, you shouldn't worry much about vendor lock-in in the long run.


> I've come to the conclusion that code is temporary, and is meant to be discarded after a while

Interesting, my gut feeling is the opposite. I think code is being written today that will remain running for something approximating "forever". Stuff will work and people won't want to change it - or things will be so intermingled that they won't dare to.


Both can be true. IME 95% of code gets thrown away in pretty short order. 5% really lives on - often for decades.


I am happy to have written code mostly in that 5% for the past 30 years. So IME it is 5/95.


Absolutely. This is why there's still so much COBOL code around. A lot of it 'just works', so 'don't fix what isn't broken' comes into play. And a lot of it, and Fortran, is so intermingled across huge systems that nobody has any certainty about the impact of touching it.

My opinion is that the technology used for the most successful and robust back-end systems will remain long after I'm gone, while that which is purely for the HMI will continue to be as fluid as it is now, simply because behaviour evolves based on individual and institutional retained knowledge.


"just works" is a big factor.

Another part is front end vs back end. I still get contacted occasionally on various back end and infrastructure stuff I built from 1995-2010, while I never hear from anyone about the user facing code. Front end has been replaced many times over, while the back end infrastructure just ticks away quietly...


Out of curiosity what is the context behind the code you think will have a long lifetime? As someone who works on “webcrap” as they say, I don’t think anything I contribute to will work for more than a few years.


Open source code tends to have a long lifetime.

I've contributed almost 10k hours to 100s of projects in the last 10 years, and most of them are still alive and have only grown in adoption. (Also it would suck if all that time went into the bin a few years later!)

From my experience, here's a ranking of how long code lives:

* Proprietary "webcrap" and plumbing logic at companies. This dies the fastest (6-24 months).

* JavaScript frameworks and tools, open source. "Dies like flies" -- many of them, short intense life, then everything dies down and the buzz moves to the next swarm.

* Proprietary backend code. Lives 5 years, then gets rewritten or dies a long death.

* Open source code in dynamic languages. 5-10 years, then mass extinction events end some lineages, such as "The Tertiary Era" (Python), "The Discovery of the Fundamental Theorem of Write-Only Code" (Perl), and "The Birth of the Script Kiddie" (PHP).

* Open-source code in static languages (Haskell, C++, C). Usually already around 10 years old before I started contributing, and showing no signs of getting old.


Rust might change these numbers a bit.

C/C++ are very stable and backwards compatible; Rust tends to be more aggressive about changing things at the moment.


Rust’s core team are pretty fanatical about backwards compatibility, though. Best practices are still rapidly evolving, but if you have code that compiles and works, it should continue to compile and work indefinitely.


If a business lives for a long time, even its "webcrap" might live with it.

Honestly I think the investments made into web technology right now are so massive and pervasive that I wouldn't be surprised if we still use something that looks a lot like the web stack in 50 years, simply because there'll never be a good moment to dump all that baggage when you're so deep in it. I don't think we can look to the past of computing to understand its future; I think that in the past ten years or so we have reached such a critical mass and penetration into the fabric of society that the rules are just different now than they were in the '90s.

Unless the way we interact with our computing devices fundamentally changes (say, a shift from visual/touch to, idk, voice or brain waves or whatever), I have a hard time seeing how there'd ever be a big shift from the web stack to something else like, say, the shift from Win32 apps to web apps. Hell, even if there is, the web might just embrace it. My day job is web-based AR...

The stack will evolve, sure, and maybe at the end there's nothing left and the clients won't know HTTP at all anymore... but I can just as easily see us just piling more stuff on, because I honestly think that, even with voice assistants and AI and whatever, most people are still going to produce and consume text and images from a device a lot like a phone for many years to come, simply because it's a system that works well for humans that have eyes and fingers and pockets.

And if that's the case, then what's going to cause a seismic shift in platforms, when almost all software developers are web developers, and all devices run web applications? And if the stack doesn't change, why wouldn't some of the code stick around, if the business or organization that invested in it does? Some businesses now might replace twenty-year-old code just because, but at least part of that is because of how software stacks have changed and aged, and I think the sheer amount of "webcrap" that's being excreted by people like you and me is going to slow that rate of change right down, even if it doesn't feel like that in the middle of all the framework churn.


I always thought what I wrote would last only months, maybe a year or so; then 5 years passed, then 10. Some things I built, which contain a lot of the first Delphi code I ever wrote (I was a Pascal coder before that), are still running in production 30 years on. And that was with the client saying we would probably rewrite ‘in a couple of years’.

Same goes for webcrap; I have some horrible PHP and JSP prototypes from the early 2000s which have been running in production ever since.


I don't have the same length of perspective you do, but this has been my experience as well. I chat with buddies from old employers, and they still tease me about stuff I put into place 10 years ago that I would have bet good money wouldn't live 10 months.


The code for a lot of the standard Unix utilities was written decades ago. They'll modify the code when there is a reason to, but often there isn't.


Engineering/science code, e.g. modelling real-world phenomena. Fortran code from the '70s is not that unusual to see, still in production...


Code that runs on embedded computers has a long lifetime.


> you shouldn't worry much about vendor lock-in in the long run.

You should always worry about vendor lock-in. Vendors only do it because they gain something by it, namely reduced competition. Reduced competition is always bad for the consumer.


Not necessarily, and vendors don’t only do it to reduce competition, and not all vendors do it either.

As a consumer, having 50 nearly-similar options to choose from is arguably worse than having only a few. With 50 choices you’ll get an entire sub industry devoted to “helping” you make a choice, and you’ll likely miss out on certain economies of scale.

As a vendor, it’s a lot easier to understand, support, and build products for your customers if they are only using your products.

There are obviously negatives to vendor lock-in, but framing it as entirely negative is misleading.


If I were to pick one option out of 50 competitive products completely at random, I'd generally expect that to be better than the counterfactual monopolist alternative.


You should read what I wrote more carefully, nowhere did I claim that monopolies make better products.


You said that it would be easier for a vendor if it could expect to monopolise its customers.

Whilst that is doubtless true, that does not make anything better for customers.


I said nothing about monopolization, I would appreciate if you would not mischaracterize what I said.

If a company’s customers solely use their product, and not a mix of products, it’s much easier for the company to support and build new features for their customers. This is just a fact of reality, and one benefit of customers limiting the number of products they use.

Pointing out beneficial properties of consolidation and specialization doesn’t mean I support monopolies—I absolutely don’t.


Indeed:

Ideally, there would be one car. Ford. Ford would be the only vendor for cars, and Ford would be the only car vendor. We'd have the technology approved by Ford and all of the technology approved by Ford, including black paint, carburetors, tetraethyl lead, and, through a special deal with Radio (formerly RCA), AM radios tuned to Radio, the only Radio company and station in the world.

These vehicles would be simple to maintain, in that the only maintenance option is to ship them to Ford, and simple to customize, in that you cannot.

And anyone who attempts to modify their Ford would be subject to legal penalties, up to and including bomb-making, just as if they'd tried to destroy any other institution.


Wow, that's not at all what I'm saying. Did you notice how I offered no prescriptive or normative statements? I'm just observing that there are tradeoffs involved--there are some beneficial properties that emerge as the number of choices decreases (along with a bunch of detrimental ones). Pretending those don't exist does nobody any favors.

Since you asked, my opinion is that there's a healthy tension between too many choices and not enough choices, and we should strive to maintain that tension. Which requires an objective understanding of the tradeoffs, instead of knee-jerk reactions to the phrase "vendor lock-in".


What counts is not the quantity of choices, it's the complexity and cost of failure, especially if the choices are interrelated with other choices and the cost of failure increases with successive bad choices.

Standardization drastically reduces the cost of choices by letting you make them independently, a healthy market increases the chances that there will be many good choices, transparency increases the chance that you will pick one that is good for you.

A small number of complicated, interrelated choices would be manifestly worse than a large number of independent, mostly good-enough choices. This is why it's mostly OK that there are a billion different cars. Most of your choices are good enough for most purposes, and you don't have to pick your phone, your route home, or your brand of gas station based on your car or its options.

Having few choices is virtually always bad. The converse is a difficult position to defend: rather than fewer choices, one ought nearly always to wish for easier ones.


Taking the 10,000 foot view that "code is temporary"...

I think "business logic" is indeed temporary. With very few exceptions, most code you write for someone else will not stand the test of time. In fact, most of the companies I've worked for do not exist or have greatly changed over the years.

MIT/BSD software is a little better.

And strangely - GPL software does stand the test of time.


I think my usage of the term "business logic" was a bit misleading. What I meant was more the methods and ways which we expect the data to be processed; more like requirements than anything.


Is it because MIT/BSD software is more common in web or scripting language cultures, where code tends to be replaced faster? Or more common in newer code, most of which will not survive for years?


MIT/BSD licenses allow for proprietary forks. The most popular BSD-based kernel is Apple's, which isn't distributed under the BSD license. The most popular GPL kernel is Linux, which still is.

So with the BSD license somebody can take your work, improve it, distribute it under a proprietary license, and supplant you in the market. Compare the macOS or iOS installed base to FreeBSD's.

But then the codebase is henceforth tied to that company. If Apple is dead in twenty years then probably nobody is still using the iOS kernel (and not because they switched back to FreeBSD), but if Google is dead in twenty years most likely Linux is still popular all over.


MacOS/iOS is not based on a BSD kernel.

https://imgpile.com/images/uL3CFP.jpg


macOS/iOS is based on a hybrid of the Mach kernel and the BSD kernel:

https://en.wikipedia.org/wiki/XNU

And Mach too is available under BSD-like terms.


It's important to remember it's not Free/Net/OpenBSD-based, though, even if it has used bits and pieces from there to update its OSFMK-derived kernel (OSFMK shares a common origin with the NeXTSTEP kernel).

Arguably one can just call it Mach, as the BSD server part was delivered as part of Mach.


I was thinking more like:

- software written for a company has a lifetime ~ the life of a company

- MIT/BSD software may be adopted and not shared.

- GPL software would be shared widely and changes would be given back.

I'm just speculating and I have no idea what the real statistics are.


Agreed.

The primary reason the software development field has evolved to be preoccupied with saving/reusing/interoperating code is the difficulty of building even the most trivial piece of software.

Unfortunately, code hoarding is simply a reflection of the age-old human flaw of wanting to recoup sunk cost: you assume that if you've invested a lot of time into something, it must be worth holding on to for as long as possible :)

If software were as easy to make as a piece of office stationery is to write on, no one would care about code reuse/saving/interop... you don't see anyone saving post-it notes or even Excel docs for indefinite reuse.


I couldn't agree more. A personal anecdote: I used to think about what my legacy as a software developer would be. Anything that I wrote would be obsolete in a few years. Nothing I wrote back then remained working. This thought was depressing me.

Then I realized that what I'm actually creating and maintaining is not code. It's the mindset behind the code, and the skillset to convert ideas to working pieces of software. My legacy would be to learn through experimenting, and to help others learn the same.


Who has a legacy anyway and why would a normal person expect to have one? A tailor's clothes get worn out, a farmer's food is eaten, the cook's dish eaten, the hotel cleaner's "output" gets messy again, the architect's apartment block torn down etc.

Who has a true legacy outside a small minority of philosophers, artists, scientists and political leaders?


I think even the idea of having a tombstone is a sort of legacy. People made cave paintings to leave something behind. We all want to cling to existence somehow and to leave a mark on the world when we leave. The simplest way, maybe, is to have children, so your genes would live on after you.


> We all want to cling to existence somehow and want to leave a mark on the world when we leave.

Not everyone wishes to leave a mark on the world after we are gone. There are entire philosophical branches dedicated to the idea of embracing mortality and accepting that which makes us human.


I feel like that's just a way to cope with the horror of death.


Naturally, but it's also a way to approach death if you don't consider it a horror.


One can embrace mortality while also wishing to leave a positive mark on the world.


Most paintings get lost, sometimes even masterpieces.

Bloodlines die out or change enough to not be recognizable.


I see Excel documents that have been reused hundreds of times for dozens of different purposes over the last 20 years. I see that all the time. "Too many styles", VBA macros from the 90s, hundreds of items in the Name manager that refer to long-forgotten hard drives or #REF!. It's everywhere.


In most cases, the only reliable documentation and definition is the code itself. It is rare for most of us that the hard part is figuring out how to do something. In general, the only hard part is figuring out exactly what needs to be done.


Actually, code isn't what does that. It's true that code is one of the things that keeps it "all". However, it's not the best place for documentation or even storage of that knowledge. Code changes.

Tests are what you need to invest in. When you test all of your business use cases in an abstract (perhaps even no-code) way, you are truly independent. You can rewrite the system very quickly if you have a test suite that allows for quick iteration (TDD-style) and does not depend on implementation details.
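
For example, something like this (a minimal sketch in Python with pytest and requests; it assumes the system exposes an HTTP API, and the URL, endpoint, and field names are all made up):

  import requests  # third-party HTTP client; pytest discovers the test_ functions

  BASE_URL = "https://staging.example.com/api"  # hypothetical deployment

  def test_bulk_discount_applied():
      # Business rule: orders of 100+ units get a 10% discount.
      order = {"sku": "WIDGET-1", "quantity": 100, "unit_price": "2.00"}
      resp = requests.post(f"{BASE_URL}/orders", json=order)
      assert resp.status_code == 201
      assert resp.json()["total"] == "180.00"  # 100 * 2.00, minus 10%

  def test_negative_quantity_rejected():
      # Negative case: nonsense input must be refused, whatever the backend.
      order = {"sku": "WIDGET-1", "quantity": -5, "unit_price": "2.00"}
      resp = requests.post(f"{BASE_URL}/orders", json=order)
      assert resp.status_code == 400

Nothing in that suite knows whether the backend is a monolith, a pile of Lambdas, or a no-code tool; port the system and the same tests still tell you whether the business rules survived.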

Given just the code? Well good luck migrating to another cloud vendor. You will probably introduce one million bugs on the way.


What are some techniques you use to test business use cases in a code independent way?


You'll want to test against a stable API/ABI, and use positive and negative test cases. The problematic parts are mocking, simulation, and how tests affect the codebase.


> You'll want to test against stable API/ABI

Good luck with that. lol.

- The only stable APIs I've seen, if an API exists, are production financial APIs, because money is involved.

- DynDNS has been known to change input and output parameter types, breaking calls

- Even Twilio has changed a fundamental API path (!) in the past couple of years, breaking SMS API calls in 2019

- Facebook only supports API version N and N-1. Hope you're not on N-2 and there's a sudden version bump!

(I say production APIs, since even payment gateways often have flaky dev/qa gateways that you can't reliably do automated tests against. Past companies that I worked at had to test against prod gateways with their own personal credit cards.)

Otherwise, if you want an API, ensure you choose a product/partner that offers/supports one.


> you have documented and defined your methods (i.e. business logic) well

Wishful thinking, though, in many cases. When software dictates the workflow, the code _is_ the business process documentation. Add a new function to the code? Did you go and update the docs, too? That takes serious diligence and bigger teams; it's hard for a small business systems department to keep up with that. That's why they use low-code platforms, aside from the skill gap (i.e. once you can really code, you tend not to want to write business systems stuff anymore).


What do you tend to want to write once you can really code?


Pick "boring" and battle-tested solutions and this fades. I implemented a fairly complex kickback-tracking business system 15 years ago; since the rest of the company were entry-level PHP devs, I implemented the whole thing in pl/pgsql. 15 years later it's still ticking as the core of the business, with minor modifications.

And the crazy thing is, in today's world of microservices and hyperfocus on horizontal scaling, I would still implement it in pl/pgsql, and indeed, I'd expect it to keep ticking for the next 20 years as long as the company stays in business.


> That's why I think as long as you have documented and defined your methods (i.e. business logic) well, and your data is portable, you shouldn't worry much about vendor lock-in in the long run.

That's a huge IF. In many, if not most, cases the only reliable documentation of a system is the running code itself.


As Tom Kyte said:

"I take the view that applications come. Applications go. Applications come again and go again. But - the data, the data - oh, it stays and stays and stays and is used by application after application after application. And if you store your data perfectly for "the first application", it'll not be good for the future ones. And I don't see tons of data that isn't used by many applications (need many views of the same data). See, my goal is the data - knowing that applications are like Mayflies but data is more like a tree."


The only things that remain from code are the real-world decisions and actions that come from it.


Not only that. Good code should be easy to throw away without major effort.


For me, writing code has become like making a sandcastle. I sure put a lot of time and effort into it, but I never try to preserve it. It's pointless. What I preserve are the lessons I learned while building it, applying them to build a better sandcastle next time.


The article presented evidence for the primacy of humans. That things are sticky because they are built for humans.

Your counterclaim to this piece is that humans aren't a thing that matters. Do you have reasons?


Big data is permanently in AWS


> Big data is permanently in AWS

Egress costs make sure of that!


Asking this out of ignorance, but are there no ways of exporting and extracting big data out of AWS?


There are definitely ways, and it isn't hard, but AWS charges heavily for egress data. I've had AWS bills of $200 a month where $160 of that was egress charges.
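
(For scale: at the standard ~$0.09/GB internet egress rate, pulling 10 TB out of AWS runs on the order of $900, before any volume discounts. Those figures are just illustrative arithmetic, not a quote.)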


As long as you keep paying, of course.


I don't use Ikea furniture because I have to assemble it. It's just, cheap, reasonably good, easy to find and get. The idea of having to buy a non-Ikea desk scares me because I have no idea where to buy it (I'd have to spend time googling, analyzing and coping with uncertainty), I believe it's going to be expensive (which also implies I'll spend the rest of my life worrying about damaging its surface) and heavy. With Ikea I just go there and get whatever I need (+ an awesome Swedish dinner and best coffee ever).


> I buy Ikea furniture not because I have to assemble it. I buy it because it's cheap, reasonably good, easy to find and get.

Ftfy


Thank you. Obviously the comma after "just" wasn't meant to be there, but replacing "use" with "buy" discarded a portion of the meaning. The article says people overrate the value of what they build themselves and are more likely to keep using it for longer. I don't feel convinced this applies to most Ikea shopping cases - I believe the factors I've mentioned might play more significant roles. I also doubt many people would prefer to spend time tweaking an app in a hypothetical scenario in which they could actually get it pre-configured in perfect accordance with their needs and tastes.


In terms of price and quality there is just a huge gap between IKEA and other furniture sources (at least in the Western countries I've lived in). Basically, it's virtually impossible to find products of similar function at the same price, and even at double the price it's a toss-up whether the quality will be better than IKEA. Sure, if you can afford to spend a fortune on furniture you can obviously find better stuff, but most of us can't.

For example, my 15-year-old IKEA leather couch has gone through three moves already, and is basically as good as new. I could afford it in my first job out of college. The Billy bookcases look a bit janky sagging under the weight, but are holding up even without extra support.


“...even at double the price it's a toss-up whether the quality will be better than IKEA.“

IKEA kitchens in particular.


Might be more applicable in this scenario to the act of throwing out your furniture?


Perhaps, but that's not my case anyway - I would never throw away anything which is not actual garbage. Whenever I feel like replacing something I do my best to find somebody who wants it, no matter how long it takes and how little they pay.


> We may have complained for decades when software locked us in; now, we’re happy to stay.

Being willing to trade off higher costs and less control in order to use new technology is commonplace in consumer tech. Nobody has ever been "happy" about this vendor lock-in; why would no-code be any different?

The current proprietary tools are valuable, but they don't allow users to export their creations between vendors offering the same functionality - that means they are more locked in than ever. Users have the added emotional element of having to abandon their creation and accept the sunk cost if they want to switch. I'm quite sure that nobody is happy about that fact.


> I'm quite sure that nobody is happy about that fact.

Indeed. I think the original post may have mistaken effect for cause. We can look across human history at many cases of authority figures and those who interact with them, but generally, forcing any kind of lock-in has weird second-order effects, and sometimes "the cure is worse than the ailment" -- so too, I think, it could be here.

What would be needed to show this? Portability, for one. The open-core model exists for a reason. Software lock-in needs to be voluntary to be defensible. If you don't make it easy for your users to switch, you have no proof that they are staying with you because you are the best option rather than just what they are forced to use. This can be very dangerous for an organization which takes its incumbency for granted, which is basically every large B2B sales-led SaaS org, almost by default.

It means that when an upstart figures out how to out-execute you, you will have ossified into a living fossil whose organs they will consume from the inside out all in one go, rather than getting advance notice while you can still course-correct. Convenience is great and all, but on a long enough corporate lifecycle (and they seem to be getting shorter and shorter these days), unsustainability eventually takes its toll.


https://en.wikipedia.org/wiki/Data_portability

There is data portability in the EU and you can get all your data... But it is probably useless, or at least not as useful as it was in the context of the SaaS you were using.


There's a legal requirement for data portability. That's a different thing.


Great piece. I think human satisfaction should be the most important metric when measuring products; the article somewhat implies it's not. I also think a product which engages you and makes you feel you learned something adds value, but only if you overcome the challenges (IKEA, Emacs).


Now all vendors have to do is add a work component to their products and customer churn will cease. The only(?) caveat here is probably that if customers realize their required work was added to the product for the sole purpose of customer retention, this scheme will surely backfire.

(Congrats btw for being close to the only comment that actually addresses the gist of the article and not just the title. Had to scroll to the bottom, but here you are :-) People are surely stressed nowadays. Could it be because that Ikea bookshelf isn't up yet?)


> I also think a product which engages you and makes you feel you learned something adds value, but only if you overcome the challenges (IKEA, Emacs)

Do you think that "personality type" could be also a factor here?



