This is why, when iOS restricts apps from eating up the entire physical RAM on the device and developers complain that this is not like on the Mac/PC, they should realize Apple is extremely aware that it's not. And it never will be. When you let go of restrictions, applications bloat up to consume the available resources while adding almost nothing in terms of functionality.
It's one reason why iOS had a five-year head start on Android when it comes to performance, perceived or otherwise, even with much slower hardware: it made much better use of it. Background apps? Yeah no; if you need any background activity, use a push notification service. Whereas on Android, for years it was the norm to just have your application run in the background and poll every once in a while.
You can put an almost unlimited amount of effort into optimization. When do you stop? Do you test that every user interaction takes less than 100 ms, below which it can be considered good?
It’s not as easy as that. What’s acceptable in testing might not be acceptable under your customers’ conditions. It’s difficult to correctly measure the time for every interaction, and humans are extremely bad at telling you how fast something is. Then you also have the problem of regressions that stem from other, unrelated changes.
I believe it can be made easy, if you have an easy way to profile your GUI. Test the feature, get a timer. If it took more than 20ms on your killer test machine, it might be too slow for some users. (Adjust your heuristic with the context of your application of course.)
In my "software development" course at university (so, not programming, but every "management" thing around it: unit testing, git, waterfall/scrum, etc), I remember the instructor telling us that the first approach to making your program faster was upgrading the hardware.
The more time passes, the more shocked I am that an instructor told this to his students, to the point that I now suspect I'm remembering a cruder version of what was actually said.
In any case, that may be the cause!
To some extent I don’t disagree with “just buy a bigger server”, but for software for laptops, home PCs, or mobile it’s not a great option. If you’re trying to tweak every single query to get more performance from your SQL server, then maybe the money is better spent on hardware than on developer time.
Weird counterpoint to “buy more hardware”: I worked with a client who had some code they developed; it takes a dataset, runs some calculations, and returns the result. To speed up part of the calculation they use CUDA.
We helped migrate their system from older physical hardware to AWS, pretty standard. They got more memory, more CPU cores, and jumped three generations of Nvidia GPUs. The result was a 10% speed increase, nowhere near what the hardware alone should have yielded, so clearly in this case there’s something in the code that prevents the hardware from being fully utilised.
We do more things, and have more complex graphics.
E.g., a 320x200x8 display has 64,000 bytes of memory. A 4K 32-bit display has 33,177,600 bytes of memory. That's 518 times more memory used just for a single screen. Take into account that these days we have compositors and each window is present in memory on its own before it's composited into the result, and it's even worse.
We also work on far bigger projects. The project I work on as a hobby has 400K lines of C++. Imagine the amount of processing an IDE needs to do to parse all of that and provide its various convenient functions. And of course a good chunk of that is in RAM, because that's far too much stuff to parse everything every time.
Old tools also don't really cut it anymore. If I try running ctags on it, it comes up with a 14GB tags file. I don't think it handles it quite right.
Tell a Product Manager "I can do a preliminary version of that feature for the next release, but it'll be really inefficient" and they hear only "I can do...that feature for the next release."
It's so much faster to develop wasteful software. It's also a lot easier.
There's no need to use high-performance but difficult-to-develop languages (C, C++). There's no need to be conscious of data structure choices. There's no need to profile code, etc.
Optimizing code can be a deep rabbit hole. If you remember, a few months ago some random person dug into why GTA Online was taking longer and longer to load. The reason turned out to be a quadratic string-parsing inefficiency: sscanf, a core library routine, calls strlen on its input, and doing that repeatedly over a huge JSON buffer made loading quadratic. But it only manifested itself in certain situations.
“Premature optimization is the root of all evil”
- Knuth
So the only exposure most students get is not to do it. I doubt any CS programs make performance analysis or optimization a requirement. Just buy a faster system.
Human effort and attention span are still limited quantities. There are so many more libraries, frameworks, utilities, and doodads, that it's easier to plug something together from lego blocks rather than craft (and debug!) a custom solution that is both efficient and small.
Absolutely. Your example, the App Store, and lots of other aspects of the iOS ecosystem are how Apple prevents most (not all) of the "tragedy of the commons" problems on that platform.
What does the App Store have to do with any of this? I can't imagine any technical restrictions that the App Store enforces that could not be enforced by the OS alone.
For instance, "no background services" and "you can't use up all the physical RAM" are both OS-level restrictions, not app-store-level restrictions.
How about the fact that many of those OS-level restrictions can be bypassed by hacking into private APIs, which is one of the things that will get you banned from the App Store?
The OS does what it can, but the OS is entirely out in the open to be hacked, bypassed, subverted, worked around and so on. A pair of human eyes and a private app scanning process does what the OS can't.
No, it's not perfect. It's not even great lately. But it'd be worse without it, as you can witness on desktop operating systems.
What works is not putting up walls between developers and power users. Linux has no such restrictions, and bloated software isn't something I have to deal with on it (I have the option to if I want, but I don't).
What works is restrictions.