There's more. The App Store also provides a form of delta updates, which saves a lot of bandwidth, but it again fails to report this properly.
App Thinning does not work for standalone image assets. That means you are forced to use Asset Catalogs, where all PNG files are stored as LZFSE-compressed (previously zip-compressed) BGRA bitmaps. That's good, but optimized PNG files can be 30-50% smaller on average. I'm fighting this, but I'm not sure there's a simple solution: if you know of one, I'd love to use it!
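For intuition on why format-aware PNG optimization can beat storing generically compressed bitmaps, here's a toy sketch (not the actual Asset Catalog or PNG pipeline; synthetic data, with zlib standing in for both codecs). PNG's edge comes largely from per-row prediction filters applied before entropy coding:

```python
import zlib

# Synthetic 256x256 smooth grayscale "bitmap" standing in for a UI asset.
W = H = 256
raw = bytes((x * x + y * y) % 256 for y in range(H) for x in range(W))

# Plain approach (roughly what storing a compressed bitmap does):
# feed the raw pixels straight into the compressor.
direct = zlib.compress(raw, 9)

# PNG-style approach: apply the "Up" prediction filter first (each byte
# becomes its difference from the pixel directly above), then compress
# the much more regular residuals.
filtered = bytearray(raw[:W])  # first row has no row above; keep as-is
for y in range(1, H):
    for x in range(W):
        filtered.append((raw[y * W + x] - raw[(y - 1) * W + x]) % 256)
predicted = zlib.compress(bytes(filtered), 9)

print(len(raw), len(direct), len(predicted))
```

On smooth image data like this, the filtered-then-compressed version comes out far smaller than compressing the pixels directly, which is the gap a PNG optimizer exploits.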
The HTML Canvas will drive the needs, as will other W3C specifications related to accessibility, internationalization, and matching user preferences. You cannot do the same thing with "stupid" bitmap graphics. In fact, bitmap graphics will be internally converted to vector graphics for better rendering (this is already what printer drivers do to improve their results; you cannot do that with subsampling and bicubic interpolation alone: you need geometric transforms, notably for adapting text or symbolic designs such as maps). The individual "pixels" in bitmaps are really stupid objects that do not properly capture how they are geometrically linked to their surrounding pixels. There's a need to contextually hide or show some details, or improve some of them; vector fonts for OpenType are perfect examples of that need.
But that still means it's years away from widespread support, because large companies can't rely on it for large user bases, and feature sets still vary widely in my experience. Some support animations, some support complex gradients and masking, and the huge variance in text rendering (which is absolutely essential to finely crafted images) cripples some of the most valuable use cases (internationalized images without making dozens or hundreds of near-duplicates).
It's just not there yet. I hope it gets there, but there are significant hurdles that don't seem to be getting a lot of attention.
... except now you need to do it on the device too, to make sure it renders identically there. Otherwise the same problem!
For very small images it doesn't matter / png may be smaller, but small images with that quality don't typically add up to hundreds of megabytes.
OTOH, people do vector stuff for many things, it's just they typically don't use SVG but something more efficient. XML parsing is not lightweight compared to a binary format or some sort of vector -> code translator speed wise.
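As a rough illustration of that overhead, here's a sketch (a hypothetical polyline, not any particular production binary format) comparing an SVG-style text encoding with a packed binary one:

```python
import struct
import xml.etree.ElementTree as ET

# A hypothetical 1,000-point polyline, as it might appear in an SVG file.
points = [(float(i), float((i * 7) % 100)) for i in range(1000)]

# Text/XML encoding (SVG-style <polyline points="...">).
svg = '<svg><polyline points="{}"/></svg>'.format(
    " ".join("{:.1f},{:.1f}".format(x, y) for x, y in points))

# Binary encoding: point count followed by packed 32-bit floats.
binary = struct.pack("<I", len(points)) + struct.pack(
    "<{}f".format(len(points) * 2), *(c for p in points for c in p))

# Decoding the XML needs a full parser plus string-to-float conversion...
root = ET.fromstring(svg)
decoded_xml = [tuple(map(float, pair.split(",")))
               for pair in root.find("polyline").get("points").split()]

# ...while the binary decode is a single unpack call.
(n,) = struct.unpack_from("<I", binary)
flat = struct.unpack_from("<{}f".format(n * 2), binary, 4)
decoded_bin = list(zip(flat[0::2], flat[1::2]))

print(len(svg), len(binary))
```

Both decodes recover the same coordinates, but the binary form is smaller and skips tokenizing, tree building, and number parsing entirely, which is roughly what a vector-to-code translator buys you.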
Bitmaps will also never be able to adapt their content gracefully to user preferences and constraints. Bitmaps have no intelligence; they are only suitable for photography, but even in that case they lack the information needed for proper rendering, so you will never see the same bitmap rendered the same way twice across devices and rendering surfaces.
Just compare what we had in the past with bitmap fonts (now almost abandoned except for text consoles) to how much we appreciate scalable vector fonts today.
Bitmaps are extremely complicated to adapt to any layout, whereas SVG graphics can now be self-adaptive, rendering only what is needed.
Webkit does a fine job with SVGs. You could draw it on a <canvas> element to get a rasterized view of the SVG in the desired size, and save the resulting image.
AFAIK, the majority of designs are created in Photoshop or Illustrator. No conversion necessary. In any case, I would consider it a liability if I didn't have my designs in an editable format. What if tomorrow I realized I wanted a variant in a different color? Or I decided I wanted to use a different background? If all I had was a bitmap logo, I would consider it worthwhile to have someone change it into a scalable format anyway, even if I wanted to transfer it as PNGs.
Also, numerous free tools exist to convert bitmaps to SVG. Using one of these, plus some manual smoothing and correction, I'm sure the conversion would take no more than half an hour.
It gets even worse once you realize how limited older mobile GPUs are, or which incompatible subsets of features they support at decent performance.
Resolution independent formats are inherently anti-triangle and by the time you've hit triangles you already have a target resolution in mind.
Or put another way, GPUs really don't like vector graphics in general. That's not what they're built for or good at.
Once you've subdivided your curves and such into triangles/vertices/etc., the GPU ends up being a lot happier about its existence.
Specifically vector graphics typically includes curves, which GPUs just don't do at all.
Once you've tessellated a curve into triangles you've already baked in a desired resolution. You can't have resolution independent vector graphics in a GPU friendly way.
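A minimal sketch of that point: flattening a quadratic Bézier into a polyline leaves an approximation error that only shrinks as you add segments, so the segment count has to be chosen with a target pixel density in mind (the curve and the error probe below are illustrative, not a production tessellator):

```python
def bezier(p0, p1, p2, t):
    """Point on a quadratic Bézier at parameter t."""
    u = 1.0 - t
    return (u * u * p0[0] + 2 * u * t * p1[0] + t * t * p2[0],
            u * u * p0[1] + 2 * u * t * p1[1] + t * t * p2[1])

def max_error(p0, p1, p2, segments, probes=1024):
    """Worst distance from the true curve to the tessellated vertices."""
    verts = [bezier(p0, p1, p2, i / segments) for i in range(segments + 1)]
    worst = 0.0
    for i in range(probes + 1):
        t = i / probes
        cx, cy = bezier(p0, p1, p2, t)
        # Distance to the nearest polyline vertex (a crude probe, but it
        # shows the trend well enough for illustration).
        d = min(((cx - vx) ** 2 + (cy - vy) ** 2) ** 0.5 for vx, vy in verts)
        worst = max(worst, d)
    return worst

P0, P1, P2 = (0.0, 0.0), (50.0, 100.0), (100.0, 0.0)
for n in (2, 8, 32):
    print(n, round(max_error(P0, P1, P2, n), 3))
```

If you tessellate coarsely, the error is visible at high zoom; if you tessellate finely enough for any zoom, you've generated geometry sized for a specific resolution anyway.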
SVG graphics are clean and result in no blur at all. They don't even need any TrueType-style hinting. You've probably looked only at early non-conforming implementations using bad approximations. But precise rendering algorithms for SVG are specified and fully tested (this is not the case for bitmaps, whose support is much more problematic: see what happens to photos and videos when editing them! Never twice the same result, and losses and artifacts everywhere, including bad undesired effects, notably moiré effects, which are much worse and do not reproduce what natural scaling would do in your eyes, which are not fitted to a perfect rectangular grid).
So no, it's perfectly possible to create a vector image that results in unblurred strokes at arbitrary pixel sizes. What you want is a geometric transform to bring out certain details more: that is in fact impossible to do with bitmaps, except by sending variants tuned for multiple resolutions. You do that for a limited number of resolutions and assume it is enough, but it isn't, so you add more, and finally you have enormous images that are widely redundant with each other, and better represented by converting their 2D model to 3D with an adaptive 3D-to-2D projection. You can do that easily with vector graphics, not with bitmaps (or their extensions called "mipmaps", which are only approximations and not really scalable, because they're still all based on discretely sampled planes).
The classic easiest to explain in words case is taking a capital letter O and turning it into more of an "elongated stop sign" shape at around 6 pixels of width. There is a tradeoff between remaining faithful to the original shape and creating high-contrast, readable shapes at low resolution. There is nothing impossible about scaling down the vector outlines of a TrueType font in a mathematically perfect way. You still need the hints to make the result more readable.
(I worked on the hinting software for 13 of Apple's System 7 font faces and about 3/4 of Windows 3.1 launch fonts.)
This is a huge wall of text, but apparently you don't understand the subject matter all that deeply. Why wouldn't SVG need hinting? TrueType needs hinting, and it's a vector format as well.
Hinting as done in TrueType is probably overkill for most purposes. But icons are often not displayed in arbitrary sizes, but rather in one of a few known-in-advance sizes. It's not hard to tweak the paths and shapes to fit to the pixel grid for all those. Also, as you approach higher pixel densities, it becomes much less important; basically you may just have to make sure that the smallest sizes fit on whole pixels and that's it.
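A sketch of that kind of tweak, assuming a hypothetical icon drawn on a 24-unit design grid and a few known-in-advance target sizes (the path and sizes here are made up for illustration):

```python
# Snap an icon path's coordinates to whole pixels at each known render size.
DESIGN_UNITS = 24.0
path = [(2.0, 2.0), (22.0, 2.0), (22.0, 22.0), (2.0, 22.0)]  # a square outline

def snapped(path, target_px):
    """Scale design-space coordinates to target pixels and round to the grid."""
    scale = target_px / DESIGN_UNITS
    return [(round(x * scale), round(y * scale)) for x, y in path]

for size in (16, 24, 48):  # the known target sizes
    print(size, snapped(path, size))
```

At sizes where the scale factor is an integer the snap is a no-op; at awkward sizes like 16 it nudges edges onto whole pixels, which is the cheap stand-in for hinting the parent describes.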
Long ago I've automated asset generation in different sizes from SVG for an Android app I worked on and even without caring (much¹) about a pixel grid the results were good enough not to need tweaking.
¹ When hand-writing SVGs I tend to care about integer coordinates simply to not lose my mind.
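For reference, Android's density buckets scale from the mdpi baseline, so an asset pipeline like the one described typically rasterizes one PNG per bucket from the source SVG; a minimal sketch:

```python
# Android density buckets and their scale factors relative to mdpi (1x).
DENSITIES = {"mdpi": 1.0, "hdpi": 1.5, "xhdpi": 2.0,
             "xxhdpi": 3.0, "xxxhdpi": 4.0}

def raster_sizes(base_dp):
    """Pixel size to rasterize for each density bucket, given a dp size."""
    return {name: round(base_dp * scale) for name, scale in DENSITIES.items()}

print(raster_sizes(24))  # the standard 24dp icon size
```

A 24dp icon comes out at 24, 36, 48, 72, and 96 px, which is exactly the family of near-duplicate PNGs a vector source lets you stop shipping.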
E.g. PNG is fine, but if a developer wants to include SVG, then let them.
For developers who do care about clean, unblurred strokes at arbitrary pixel sizes, this wouldn't impact them at all.
For the developers who care more about the space taken up, this would give them a potential useful option.
Ironically IMO I'm pretty sure this used to be one of the major selling points of vector graphics and svg :-)
The catch is that Xcode compiles all those PDFs to PNGs before bundling them into the .ipa, so in the end you have a bitmap either way...
In any case, I suspect something like lyon (https://github.com/nical/lyon) would be a better fit than pathfinder, which is both a) not super portable and b) very focused on rendering text, which it does very well.
From discussions with iOS colleagues it seems that Android has seen a way stronger push for vector assets.
Probably because from day one the OS and SDK has been designed for unknown displays at compile time.
In that sense, the only reason there would be more fragmentation in SVG support on Android than iOS is that it can be harder to upgrade Android devices that did not come from Google.
SVG files are vector based.
Vectors don't have pixels.
That is, developers could compile a different copy of the IPA for all target devices. Using variables like CPU architecture, screen size, and other feature flags, a smart compiler could cut a lot of code and assets that never run or display on certain devices.
(Granted, this would be much simpler on iOS, which has a limited set of targeted devices compared to Android.)
In fact, it seems Apple has already taken steps in this direction with its cloud compilation (I'm not an iOS developer, so not sure of the specifics). What would worry me is if they started requiring all source code be uploaded to their servers for compilation. Going down that road is fraught with ideological pitfalls.
It's not really a bytecode as it's not interpreted, but that's debatable.
App thinning, as they call it (https://developer.apple.com/library/content/documentation/ID...), is almost solely about two things: download time and disk space.
These apps in that list are also built by teams of 100s of engineers working at full speed. In reality, each one is its own little OS full of its own UI frameworks, testing frameworks, and nontrivial code.
Trust me when I say that everyone is plenty aware of how big their footprint is getting, and no one is happy about it. Apple won't even let you submit to the store if the actual single architecture binary is over 60MB.
Some limit that amount of data they download but a few are indiscriminate about it. I hated the Yelp app because it seemingly did not limit how much space it took up. The more reviews you viewed, the more space it took up. I never found a setting to limit it. If you go to Manage Storage you will discover that many apps take over 100mb in documents and data.
I guess we also should be looking at the actual data transferred rather than the number shown on the store page.
That's about 1/8th the size of those apps on iOS. Those are just the first three apps in the article; I'm sure the rest are about the same.
I wonder why it's so different.
Hopefully someone can chime in with the reason.
This actually affects Android 5+ too (Multi-OS engine uses Android's AOT compiler) but since it happens on the device it doesn't affect download sizes.
Any idea why that is?
But there's no way it's anything like a 10x difference. There's definitely something else going on here, like some quirk about how the app store is calculating the sizes of assets.
and source code is generally smaller than machine code.
Other than the environment-provided ones, isn't it possible to bundle some dynamically loaded libraries that can be shared by multiple apps? It makes absolutely no sense for each application to implement its own web browser/runtime.
Can't believe we are constantly reinventing the wheel again.
Shared libraries beyond that would bring the hell of version incompatibility. One need only look at Apple’s evolution of Swift where there still isn’t binary compatibility?
Most people don’t want to crack open their phone and set paths so that their app has a version of Python that works with it.
Also shared libraries can be giant security holes. See the bug in AFNetworking that affected many thousands of apps. Imagine an errant app shipping a backdoor into a shared networking library.
Unless those platforms start distributing a huge number of libraries your software may decide to depend on, shared libraries won't take off on them.
It just never really took off for some reason. It's where all the BCL assemblies existed.
Instead we have NuGet and the libraries exist wherever the application is installed. GAC be damned :(
You would only get a benefit if two or more apps you installed used the exact same version of the dependency, but that seems like it might happen enough; especially once it's expected to be in the ecosystem.
This is more of a general question, but I've never understood why so many shared library systems have the restriction of "only one version of a library". What stops you from storing different versions of the same library for different apps?
(I understand the problem with transitive dependencies - e.g. if the same app transitively depends on two different versions of a library, that might spell trouble. Even for this there are solutions - e.g. different classloaders for the JVM - but even without them, you could disallow this special case without sacrificing much.)
Edit: in other words, common data.
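One answer is that nothing fundamental stops it: a store keyed by (name, version), backed by content hashes, holds multiple versions side by side and still deduplicates apps that pin the same one. A toy sketch (the class and library names are invented for illustration):

```python
import hashlib

class LibraryStore:
    """Keeps one blob per distinct library content; many versions coexist."""
    def __init__(self):
        self.blobs = {}   # content hash -> library bytes
        self.index = {}   # (name, version) -> content hash

    def add(self, name, version, data):
        digest = hashlib.sha256(data).hexdigest()
        self.blobs.setdefault(digest, data)   # dedupe identical content
        self.index[(name, version)] = digest
        return digest

    def resolve(self, name, version):
        return self.blobs[self.index[(name, version)]]

store = LibraryStore()
store.add("http", "1.0.0", b"lib-http-v1")   # app A's dependency
store.add("http", "1.0.0", b"lib-http-v1")   # app B: same version, no new copy
store.add("http", "2.0.0", b"lib-http-v2")   # app C: a different major version
print(len(store.blobs))  # 2 blobs backing 3 declared dependencies
```

Each app resolves exactly the version it pinned, so there's no cross-app breakage; the cost is that dedup only pays off when versions actually coincide, which matches the parent's point about ecosystem expectations.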
> Shared libraries beyond that would bring the hell of version incompatibility. One need only look at Apple’s evolution of Swift where there still isn’t binary compatibility?
Since the source code is shared with Apple once it reaches the marketplace, wouldn't it be simpler for Apple to compile a specific version for whatever version of the Swift language is available on the target device ?
It wouldn't have any market value and wouldn't drive iOS device sales, but it would certainly slim them down a little.
> Also shared libraries can be giant security holes. See the bug in AFNetworking that affected many thousands of apps. Imagine an errant app shipping a backdoor into a shared networking library.
This sorta makes me like iOS a bit more. Memory gets cheaper but security doesn't get any cheaper when the dependencies grow.
Edit: Also doesn’t help if you have C or C++ code mixed in there.
Yes, there is one: the standard library, provided by Apple. However, many apps eschew this to use their own libraries and frameworks for whatever reason, be it ease of use or feature enhancements, and these cannot be shared.
Everything that is shared between apps currently is stuff that's provided by the shared platform.
Are you talking about React Native apps here as well?
It's much more likely that app developers are optimizing for many things, including app size, but reducing app size has a bad cost/benefit ratio. Here are some decisions that may bloat your app:
* Want your network calls to be fast and reliable? Better use that cool new HTTP library rather than writing your own.
* Want to keep everything secure? Rule #1 of hacker news is never roll your own crypto so better import the best lib out there.
* Want to delight your users and their fancy QHD screens? Time to include some high res images and animations. Oh and you can't use vectors, they kill performance.
* Want to access new markets? Time to translate your strings into 80 common languages. Oh and some of these may require custom fonts to look right in your app.
* Want your Android game to have blazing fast graphics? Import that native library, and don't forget multiple architectures.
Not to mention, rarely does app size register to clients, PMs, bizdev in terms of a worthy task. Only when you end up at the top of the app sizes list is time given to optimize.
We work to keep app sizes down and offer line-item tasks for app size optimization on updates, but no one finds it necessary until they risk being near the top of the app list on a user's device when that user wants to delete some apps.
In games, many toolkits/engines like Unity and Unreal also add quite a large chunk on mobile. Building your own engine is rarely an option anymore for a competitive launch. Many games built with Unity/Unreal, even with moderate assets, can reach 200-300MB easily, and creep up further with updates.
If you need to advocate this, it tells a lot about how much app devs care about the issue.
I can forgive games for sacrificing size in favor of high-res animations and a blazing fast graphics stack, but I don't think LinkedIn, for example, is in need of particularly fast graphics.
Given that, I don't think it's unreasonable to consider using a library to do it better, although you have to weigh the increase in apk size. The standard crypto libraries on the phone are great if the features you need were released 5 years ago, but if you want consistent behavior across Android versions and manufacturers, you need to include that yourself too.
Unlike Android, end users can't clear it, though backing up and restoring your phone does that...
It seems that most users don't really care nor use all that many apps.
So it's not just that there's a bit of sloppy (or pragmatically careless) packaging, but there's just a lot of stuff in these apps.
Are there any examples of well-known apps from large organizations that aren't excessively large in size?
Even the main Amazon shopping app is less than 100mb.
I am being deliberately ignorant, having no idea what functionality one might add, but I'm actually genuinely curious. All that I see is a small database and a lot of assets that are being downloaded on the fly.
109 on my device, not sure why it needs over 100 for a menu/search and checkout function that relies entirely on external data.
Also another commenter said Alexa was 78 when it's 124 for me and does even less than the shopping app.
You can bet that those companies' infrastructure is optimized to hell and back.
You'd be surprised. (Source: I work at a unicorn.)
There were a few times already that I thought about using them, but couldn't install it right at the time.
This is nothing new. Look at how much memory your typical Java desktop app, or worse, your average Java server app, uses. As soon as you have enterprises involved, where it's a regular occurrence for devs to be outfitted with the shiniest and beefiest hardware (as a perk or per standard company policy), developers won't care about resource usage until it's so excessive that it either slows down their machines or accounting knocks on the door and asks who ordered dozens of x1.32xlarge machines...
Seriously, another huge problem is assets. In ye olde PC days, you included ONE version of an image and that was it (okay maybe a 16x16 favicon for the small start menu and a 32x32 one for desktop)... these days with loads of different combinations of resolution and DPI, and across multiple platforms (if you're doing a Cordova build and don't split the assets before packaging), stuff can and will grow.
I'm sure they could, but I suspect the engineering effort would've been much greater. That's money down the drain if it turns out users don't care, or if their competitors released earlier and grabbed more market share.
A lot of people go to wal-mart. Doesn't mean that it's a good thing overall for mankind
It's a greedy search. Just because you optimize for your local best interests doesn't mean the COMMUNITY will benefit overall, and the ecosystem itself suffers.
Look at how slow, ad-filled, and error-prone websites are these days.
So yeah, companies and people should be at least _concerned_ with the disparity between "best for me" and "best for all of us." Otherwise, we get things like rampant pollution. Well, this is app pollution.
Enterprise apps are slow because developers lack shiny new tech. You are forced to build yet another Spring App that takes minutes to start and gigabytes of memory. You also need to use MQ to interface with other systems and log/audit almost everything.
Amazon's differentiator is not the quality of their app. Unless they create something really impressive some day, they can only lose customers by moving faster there.
Besides, none of those are small startups anymore. All of them have something to lose.
In my only real experience with Amazon (Audible), the quality of their app is a negative; I'd much prefer they made an API available. The same can be said of others: I'd much rather have native apps connecting to a Netflix API than use their awful web player.
So size being a problem seems only true for engineers. A normal user doesn't care as long as their device still works. And Apple won't care, since they can use this as an opportunity to upsell larger devices.
Software expands to fill all available resources.
If you were testing over a dial-up connection, you would notice it.
Same with app writers. They forget to optimize their apps because they're testing them on the latest and greatest phones on their home network (or a corporate network which is blazing fast). If they tested on a low-end phone, and actually performed the update themselves over a slow cell network, they'd probably notice it.
Many common tools aren't set up for common-sense optimization. Ideally resizing images would be an automatic step, and you wouldn't have to remember. But that's not the case.
I'm sure that there are plenty of iPhone apps with 2 MB images from a camera, when a 256 KB image would do.
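One low-effort mitigation is a build or pre-submit step that simply flags assets over a size budget; a sketch (the 256 KB budget and the `Assets` directory name are assumptions, not any real toolchain's convention):

```python
import os

BUDGET = 256 * 1024  # assumed per-image budget in bytes

def oversized_assets(root):
    """Walk an asset directory and collect images exceeding the budget."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.lower().endswith((".png", ".jpg", ".jpeg")):
                path = os.path.join(dirpath, name)
                size = os.path.getsize(path)
                if size > BUDGET:
                    hits.append((path, size))
    return sorted(hits, key=lambda item: -item[1])  # worst offenders first

for path, size in oversized_assets("Assets"):
    print("{}: {:.1f} KB over budget".format(path, (size - BUDGET) / 1024))
```

Wired into CI, a check like this catches the accidental 2 MB camera original before it ships, without anyone having to remember to look.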
I wish devs would still test like that. Sure, your app works fine in downtown SF on the newest phone, but try using it in Nowherseville, TN, on a phone that is more than a couple years old. I bet a lot of user frustration comes from dealing with that.
Granting that you heavily imply a 4G connection, the choice of "Nowherseville, TN" is extremely interesting: Chattanooga, TN — which as a Tennessean I feel is essentially "Nowheresville" to most non-Tennesseans — has Internet service multiple orders of magnitude better than most of the Bay Area…
I do not, however, mean to imply a 4g connection. Many people don't even get that.
Hell, all of Android is far, far slower than it should be. I won't move to iOS but what Apple can do with a dual core phone is pretty amazing (and yes, I know that newer OS builds get slower).
Does anyone know of a good process-viewer/resource watcher for android?
Apparently writing an IntelliJ plugin of a reasonable quality is much harder than a Python or other script. Who would have thought. And a glorified text editor takes GBs of RAM, likewise a glorified Makefile.
(The resource watcher is built into Android Studio, but ignores GPU memory. To do that you have to run GPU debugger, a separate memory hog.)
They should be doing better than this.
(It came up for me in Ivan Illich's Deschooling Society :)
(As far as I can tell, the answer to "why is LinkedIn.app so large?" is not "because LinkedIn's iOS team sucks", but "because LinkedIn's iOS team works under a number of constraints, including app size, and app size is not a particularly powerful constraint to optimize for.")
I could replace the following with PWAs:
- Google news
- Shopping sites like Walmart, Wish
and many more.
Facebook and Amazon have no PWAs but have mobile websites. (Facebook mobile web works well with Opera. On other browsers it annoyingly redirects to the Play Store to install Messenger.)
Consider an app like Discord, which is built using React Native and is thus a "native" app with some additional cruft like a JS runtime. It clocks in at a relatively small 30MB. Not bad.
Then consider Slack. For nearly all intents and purposes it does the same exact thing; Discord actually has far more functionality than Slack. Yet it is 129MB.
Tweetbot? 12MB. Twitter? 204MB.
The issue has little to do with the technologies used. PWA, React Native, full native, it doesn't matter. The issue is truly that these large companies have horrible, bloated engineering teams and that bloat comes through in the size of the apps produced. It is Conway's Law in action.
How can a clock need the equivalent of a box of floppies? Windows 3 and 3.1 together come to the same number of megabytes, and that includes a clock.
It's the new name for a home screen bookmark.
Essentially, apps on the web that feel and behave like a native app.
oh, the irony!
The best way to understand it is just to try it. It's just a website (PWA) pinned to your home screen, so there's not much cost to trying it.
I'm constantly removing Facebook/Messenger for situations like when I had to download Ticketmaster app for a concert ticket.
And with all these apps disallowing you from moving them to SD card, I can't even really use my 32GB SD card for them.
And if Play Store won't let you install it then download & install manually from here
No, seriously: uninstall that junk and just run stuff in the browser. It works much better than you think, the biggest annoyance being all the nag screens sites throw at you to get you to install their apps.
Also, relevant xkcd: https://xkcd.com/1367/
> It works much better than you think
It still doesn't work as well as a native app.
Fast forward to today, where my MacBook keeps nagging me that my "hard disk" (actually an SSD) is "full" because I only have 3GB of free space. In 20 years, what was once considered the maximum is now considered negligible.
Optimization is important, but regardless, software size is going to keep growing. Wringing hands over it doesn't help much.
It's not the storage size that is problematic as much as the transfer size. Websites and developers really need to be aware how long it will take for someone to even get your product. If a website doesn't load in X seconds, you're losing Y customers. If your app takes an hour to download, your customer has probably already moved on in frustration.
That's because it will use that space for caches and swap, and you're preventing it from doing so and making the system slower.
Then I remember downloading a 3 MB mp3 on my 9600 baud modem and being amazed at how much space was taken up by music that sounded realistic and not like just a bunch of beeps out of speakers that could only make beeps.
Then came the old joke about EMACS standing for "Eight Megs and Constantly Swapping".
Then I remember noticing that commercial software like games filled up a full CD's worth of space (back when software was distributed by physical CD's). After that it was common to ship software as multiple CD's, then multiple DVDs.
Now, is software even shipped on DVDs anymore? I just download everything, and, yeah, apps are still bloating, same as ever.
But the rest of us just want to use a phone with a couple apps. We can't afford phones like that, and high market-share apps shouldn't be designed for the top 1% of phones owners.
If Facebook wants people to use their shit, they can't design it solely for early-adopters.
When EVERY app does the same thing? It's a problem.
Here's some of what's listed in the update list on my iPhone:
Chipotle is 92. The kindle app is 171 (perhaps the fonts?). The Amazon app (which is mostly a web view anyway) is 127.
Robocall blocker? 22. Verizon app? 160. An app for tracking streaks of achieving tasks? 65.
Slack is 123.
Clips? The Apple app for making little movies that includes a fair bit of art? Only 55. That makes sense.
Authy? To show 2-factor codes? 65!
Outside of games (which have a lot of assets) app sizes seem to have absolutely no correlation to what they actually do.
(1) They make substantial profits from memory upsells on their product lines
(2) Larger apps take more horsepower to run— so older models become less effective sooner!
Quite obviously most developers will have:
1) VERY powerful hardware
2) VERY fast (and unlimited) internet connection
It's not like they take a car, drive into some remote countryside, possibly in the middle of nowhere, stay there a couple of days, and try accessing their website (or running their app) on the lowest/cheapest entry-level hardware on a metered connection (though they should, as part of quality assurance or similar).
It's easy, when you have a T1 or faster connection on recent top-end hardware, to forget that a lot of other people have slower devices with less memory, limited bandwidth, and metered connections.
I wouldn't be surprised if half of the outrage about file sizes simply comes from the fact that there are people who remember what software was like before all of this "technological supremacy" was a thing. People joining the workforce today didn't grow up with the experience of installing something off of seven floppy disks, which would have been considered an enormous program before the CD-ROM drive was common. They also don't know how much better those programs ran, because they had to do so with 8 MB of memory or less and nowhere near 1 GB of hard drive space.
It's pure decadence.
Hey, nice home page, it is only 3 Dooms...
For most freelancers and contractors the economics of the market means most of the time you have to be lazy or you will lose the job to someone else.
Larger apps means you have more "need" to upgrade your phone to the latest version with more space, power, speed etc.
I suppose it can help hide price increases of the next higher configurations, such as when the base configuration of the MacBook Pro decreased from 256GB to 128GB between 2016 and 2017, with the list price for 256GB configurations increasing by a couple hundred dollars. However, Apple has also gone as far as developing an entirely new filesystem to decrease storage use.
And it is in Apple's interest to have smaller apps, because then people will be able to download more apps. They'll be willing to try more apps, because there will be less of a barrier between seeing the download button and being up and running.
When your low end configuration is hard to use because Facebook takes up 29% of the storage... that's a problem. Customers get mad.
When they can't update Facebook because it wants more space and the device is full and now the user is locked out from their friends... customers get mad.
When updating a handful of apps uses up 70% of their monthly data... users get mad.
And ALL of that affects Apple's bottom line.
I haven't seen any data in this thread to back that up. It certainly doesn't appear true in my experience.
It's worse knowing something like Facebook will cache a whole bunch of images, friend pictures and everything else.
"This shirt is dry clean only, which means it's dirty."
The idea is that because it's all open sourced, all the vulnerabilities will be found and patched. But more often than not you just end up missing the small notification from the maintainers telling you to update.
Code bloat = lost users
code bloat => "your phone is full" => "oh, my phone is too old" => new iphone => free space => code bloat
That said, I also delete bulky apps before I start deleting media.
If you are on a good network you probably won't ever notice such bloat until you run out of disk space or bust your download limit (Canada)
If I find an app too big for my taste, I normally fall back to the web version, if it exists at all. But even then it's hit or miss, because sometimes the web app is pure, complete garbage, or worse than the app itself.
What surprises me is that in the Facebook iOS app, there is a "FBSharedFramework.framework" with a binary file of 215MB. What the fuck is this? How can a single binary get that big?
> "In the case of the Facebook application, there are more than 18,000 classes in the application"
18,000 classes. Absolutely ridiculous.
I guess the good thing is that it gave me an opportunity to teach my kid about tradeoffs. "Ok, so if you really want this app, we're gonna have to delete 4 of your other games on the iPad." Even a 5 year old could reason his way out of that one.
Look at this poster, he doesn't care. He isn't going to remove any of those apps. There are runner apps that are a lot smaller, but he doesn't care about the size of the app enough to guide what he downloads.
Empty blog posts are empty.
"Why is there so much traffic, someone should do something, I hate driving these days."
This is why I uninstalled Pokémon Go -> it used up too much data and too much battery. Uninstall -> problem solved. Unfortunately, the vast majority of people don't care about app sizes, how much data they use, etc.
On another note, I just went to check some of my apps and iOS 11 got rid of the size from that view. You now need to dive into each app to see the size.
For a while I tried to get by with a 16GB iPhone SE thinking I keep everything in the cloud, so why would I ever need 64GB? Well every couple of months I'd have to delete and re-install NYTimes and reddit and some magazine apps because they just grew and grew and grew in storage until I had 0 space left. Like they simply cache everything you've ever looked at.
It's dumb, because other apps are intelligent -- they'll automatically purge cache data when storage gets too low. But not NYTimes. Not reddit. It seems pretty inexcusable, really.
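A minimal sketch of what that kind of self-limiting cache could look like (the class name and cap are hypothetical, not how NYTimes or reddit actually work): evict the least-recently-used entries once the cache grows past a byte budget.

```python
from collections import OrderedDict

class SizeCappedCache:
    """Evicts least-recently-used entries once total size exceeds a byte cap."""

    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self.total = 0
        self.entries = OrderedDict()  # key -> bytes payload, oldest first

    def put(self, key, payload):
        if key in self.entries:
            self.total -= len(self.entries.pop(key))
        self.entries[key] = payload
        self.total += len(payload)
        # Evict oldest entries until we fit under the cap again.
        while self.total > self.max_bytes:
            _, evicted = self.entries.popitem(last=False)
            self.total -= len(evicted)

    def get(self, key):
        if key not in self.entries:
            return None
        self.entries.move_to_end(key)  # mark as recently used
        return self.entries[key]
```

The point is that the eviction logic is a dozen lines; "cache everything forever" is a choice, not a technical limitation.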
Earlier this year, WeChat released a revolutionary (kinda) feature called 'Miniapp', which supports releasing apps within WeChat itself (a bunch of XML/JS files). All the major Internet companies published their own miniapp in WeChat, which includes the most-used features of their full app and takes less than 1MB of space. Guess what? The miniapps were not adopted by most users; it became just a fad.
This means most users are not sensitive to the disk space used by an app. Apps know this, and thus don't have motivation to reduce the size.
Thing is, some Android users (like me) are at the limit of their internal storage and some apps can't be moved to an external SD card for some reason. Before installing a new app (or indeed just an update), I have to decide which of the hellishly bloated apps on my phone I can delete to free up space.
One way to solve this problem is to promote modular library structures, and package management tools (pod, npm) should support importing fine-grained submodules, even single features of a library/framework, whenever possible.
I'm not an app developer. But I've known about libraries that section off lesser used code as "addons" since... well... since I started programming. This seems so fundamental I don't understand why everyone isn't doing it. Especially considering the gains are far greater in the mobile and web world than desktop.
Vector graphics will solve this, but is not mainstream yet.
The Facebook Android app is 88MB (zipped). I checked the APK to see why it's so big:
90MB of code
30MB of resources (images)
Then you're trading off battery life vs. space. Vector graphics can be far more expensive to render than bitmaps.
I guess it would add considerable overhead for a negligible improvement in size (compared to just shipping all images) for most devices.
EDIT: Too slow...
Doesn't do anything for size on disk, but should help network a lot.
(I'm aware that these sizes do not include the OS and its libraries, but apps on a mobile device also have a similarly rich environment of libraries they can use.)
It's making app bundle sizes explode.
I can see how high-res pictures embedded in the app can add dozens of megabytes, and maybe a different runtime that you have to bundle for compat reasons, but multiple hundreds of MB?
Please, educate me. I'm all ears. :)
You're not allowed to include your own browser, and can only use the platform's web view, which is not duplicated on a per-app basis.
There are many culprits for app bloat, but using a browser is most certainly not one of them.
- For games, assets are a big issue. Some do not compress their assets, and unfortunately most image-authoring tools make it easy to output PNGs that are much larger than they need to be. I think a lot of people by default assume bloat comes from images/icons/etc., but IMO this is a red herring for most non-game apps.
- Library bloat. Even simple apps pull in a large number of external dependencies, which contribute dramatically to app bloat. There's also a lot of code pulled in that replicate platform-provided functionality (see: the bajillions of layout libraries out there), which may be simpler to use than the stock Apple components, but add to your bundle size.
One of the common problems is that iOS open source dependencies are typically all-or-nothing - you end up pulling in a very large library even if you're only using a small slice of its functionality.
I think most disassemblies of iOS app bundles show that library bloat is typically a far larger problem than asset bloat.
In any case, I think the future will be something like Android Instant Apps - where apps are sliced up such that necessary bits can be downloaded on-demand. This gets users into apps faster, and saves space.
This "cure" is worse than the disease.
The status quo is that they must download a 150MB bundle before being able to proceed.
The proposed app slicing will allow them to download maybe 10-20MB before being able to proceed, with additional components (up to the 150MB total) downloaded on-demand.
If the user is on an intermittent or slow connection this is still a significant improvement over the status quo: the user gets into the app much more quickly.
Additionally, if the user only uses a small slice of the app's functionality (which is the 90% use case), additional downloading can be deferred until the user connects to wifi, at which point the rest of the app can be downloaded over a fast and reliable connection, all seamlessly without the user having to worry about it.
Cordova uses the system WebView.
It's really amusing to me when "engineers" start talking about the "scale" of the UI. It's a client. Thin vs. Fat aside, if it's that fat it's almost certainly a bloated mess of redundancy and what is called "overengineering" (which is actually underengineering--that is, a deficiency of the application of engineering and architecture principles to the design of the application).
Something that I'm very happy about in 2017 :-)
Data is expensive in many places, and wifi/broadband is typically cheaper.
"LinkedIn’s new flagship app, nicknamed “Voyager,” is a complete architectural and design overhaul available for iOS and Android, as well as an online mobile experience, in the coming weeks." https://thenextweb.com/apps/2015/10/14/linkedin-voyager/
Sounds like "VoyagerFeed" may be part of their own codebase.
Although the updates aren't always huge, Slack has been an atrocious app on Windows and iOS for performance and indexing. Absurd that the biggest companies have, well, the biggest apps.
I expect this realization will only become widespread when open-source folks actually fix their software's deep-seated usability problems. The way to do that is not by hiring UX designers or appifying it; it's by thinking deeply about how humans interact with computers and blowing away the conventions that make computers and software unforgiving, hard to explore, and opaque to the uninitiated.
* Charge owners a fee for apps over a certain size, "a processing fee" or whatnot
* Charge owners a fee based on the download costs. Under X MB it is free. The more you cost Apple to transfer your app to their customers, the larger the fee.
* Penalize large apps in the App Store search results, or give a bonus to smaller apps.
Or, rather than straight-up bonuses/penalties based on size, go after specific things that cause bloat:
* Apps that don't run pngcrush on their PNGs
* Apps shipping WAV files instead of AAC
Maybe Apple doesn't even need to actually implement any of these, but just threaten to.
We're finally getting back to that original vision.
Apple also encrypts then compresses, which means the binaries you download from the App Store are effectively incompressible.
If Apple wanted to decrease IPA download sizes worldwide, they would let developers skip encryption and just sign their apps. That would be very relevant for developers of popular free apps. I'm guessing they encrypt then compress so they won't have to re-encrypt the binaries on users' phones once an app is uncompressed.
Also, all the big SV companies have A/B testing practices with weekly release cycles that induce large line counts in their apps.
I think everyone copied the Facebook mobile dev style, which simulates what you can do in web dev. In web dev there is no cost to adding another team for another feature that lives in some section of the greater app, since it's just another web page. You can create many A/B tests and roll back things nearly instantly. With the weekly cycle everything is under a feature flag, and you'll see a bunch of half-developed features sitting in the delivered binary, turned off via feature flag. This inflates code size and creates the kinds of issues you see today.
Also, large apps start requiring management structures that I call hallways and elevators. A single indie app can be the equivalent of a one-room hut, which doesn't require any hallways, elevators, floors, boiler rooms, parking structures, or stairs. If you look at the floor plan of a high-rise, you'll realize a lot of the floor space is taken up by elevators and hallways.
Once apps become as large as a high-rise, they start requiring code structures that help manage the chaos, such as reporting systems, rollback systems, well-defined tree structures, and so on. That, and the sheer number of rooms they have, makes the apps larger than they look.
An analogy is downloading a torrent: the client reserves a chunk of space on your disk equal to the size of the file being downloaded, and as the file downloads, it overwrites the junk in the reserved chunk.
Again all this is my speculation.
There is, however, no reason that code, either compiled to a binary format or in a textual format, should use so much data. Heck, the memoirs of Casanova span 3,000 pages and come to 6.5 MB. People don't understand how incredibly large a megabyte is for simple code.
Surely the 275 MB isn't all useful data (I wonder what compression ratios you get on 'apps'), and it should be possible to cut it down to a few MB.
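A quick back-of-envelope on "how large a megabyte is for code", assuming an average of 40 bytes per source line (a rough guess; real codebases vary widely):

```python
# Rough estimate of how many lines of source text fit in N megabytes.
# 40 bytes/line is an assumption, not a measured figure.
AVG_BYTES_PER_LINE = 40

def lines_per_megabytes(mb, avg_bytes_per_line=AVG_BYTES_PER_LINE):
    return int(mb * 1024 * 1024 / avg_bytes_per_line)

# One MB holds on the order of 25,000 lines of source; Casanova's
# 6.5 MB memoirs come out to roughly 170,000 lines' worth of text.
```

Even allowing that compiled code is denser per "line" than text, hundreds of MB is an enormous amount of logic.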
At a previous job we had a monolithic Java server that ended up at over 350MB of compiled code, simply because each development team had imported whatever libraries they thought they needed. In some cases, 2 or 3 versions of the same library were included.
Most of it is actual code - not assets. For some types of apps (see: games) assets do take up a significant portion of total size, but for most everyday apps bloat by code over-inclusion is likely a bigger problem than asset-bloat.
I wonder if it's possible to get major open-source libs to move towards more fine-grained build targets and internal dependency management, so that devs don't pull in a gigantic binary when they're only using a small slice of the functionality.
Also, I think we, as programmers, have taken the "don't reinvent the wheel" principle too far. The idea is to use 3rd-party code to (1) save time writing, (2) reduce the risk of bugs, and (3) lower the maintenance burden.
But this makes sense only if the benefits outweigh the corresponding costs of integration, which also (1) takes time, (2) might be done wrong (especially because you don't understand the part you added), and (3) creates a maintenance burden. Of these, only #1 is solved when your development environment makes adding new libs quick and easy.
What I think GP was arguing for is that you have libstring.so which you can strip down to just having leftpad or w/e with Kconfig or similar configurations (preferably at link time not build time -- otherwise you still have the same problem).
But I agree with what you're saying in the second paragraph, which actually sounds just like what my GP meant by using "Webpack to throw away what isn't used". Having unused parts of larger libraries cut out would be a much cooler solution than just telling everyone to eschew kitchen sinks in favor of many tiny, bespoke items. Especially since finding the right small libraries is much harder than finding a kitchen sink that just works and has a sizable community behind it.
Not necessarily. Distributions have to build everything, and in openSUSE we have an integrated build system which has a lot of very useful features (rebuild when dependencies are updated, and automated QA runs before releases). Those features require you to have all of the dependencies of a project in an RPM package. Even if you don't end up shipping that package, the package is used for tracking security issues and backports and so on. You can't pull anything from the internet during a build, you have to depend on your BuildRequires.
Now take any average JS application that has over 800 dependencies. Ruby was already bad enough with ~80 dependencies, but adding another order of magnitude is just not maintainable. One of my colleagues gave a talk at LCA about this problem.
If we're going to talk about Linux packages, aren't they often written in languages with strong optimizing toolchains? If I write some C99 code that uses a gigantic library but I only use 2% of it, my toolchain will cut out the 98% my code doesn't use, so libleftpad sounds like an awful idea. That's one reason why packages on Linux distros aren't too big.
But I'm talking about iOS apps where, as others have pointed out, the available optimizations suck. As such, I think having libleftpad.o included in everyone's iOS apps isn't a big deal (note that iOS doesn't really have a nice way to create libleftpad.so anyway, AFAICT, because all code for your app is supposed to be sandboxed so no one else can mess with or use it). I agree that it would be really cool to just cut out the 98% of $GIGANTIC_LIBRARY that isn't used at compile time, but since Objective-C doesn't seem to have that now, I think small things would be a really nice way to give users more space on their phones without removing features.
And then you'll have to use software written in 50 different styles, with 50 different type and error conventions, with pieces that couple together or not at random.
All of which matters not one hoot to folks _using_ the software.
Behind the scenes, lodash is built out of many modules, but you as a developer don't need to think about that.
I've seen people try to package JS projects inside a distribution. >800 nested dependencies is neither sane nor sustainable. The fact that they're small is even less excusable, because it means there was an active decision not to consolidate them into a single, properly maintained dependency.
(If you link against the static library then the linker pulls in only the .o files that the app used, so assuming that the library was written in a sensibly modular way you pay only the binary size cost for more-or-less what you use. The linker can probably discard at a finer granularity than whole-object-file these days, but only-the-.o-you-use has been there for decades.)
Since you normally need to bundle dependencies to account for different versions, this adds up quite fast. Qt adds 20MB for OpenGL, 15MB for the VC++ redist, about 30MB for other core libraries. Some stuff in OpenCV requires Nvidia's performance primitives (NPP), so in goes nppi64_80.dll - that's 100MB alone. opencv_cudaarithm310.dll is 70MB and even opencv_imgproc310.dll weighs in at 30MB. And on and on.
So yes, one little call to cv::cuda::remap adds in a boatload of dependencies when all the algorithm is doing is using a lookup table to sample an image.
Now it has turned out exactly the opposite way. If we went back to linking object code together we would get smaller sizes. Instead we have to include huge DLLs.
I'm pretty sure the main motivation is/was to allow patching of dependencies independently from applications. Very popular shared libraries might save space overall, but that is a secondary effect.
It's now 6mb.
I understand your argument, but in this particular case things aren't volatile enough that it causes any problems, and if it does it's just as easy to solve.
Even if you can throw away dead code, you can run into bloat because a library has a massive set of foundational APIs that the rest is built on, and using one little feature of the library ends up bringing in half the library code because it's used everywhere.
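A toy sketch of this effect: if the unit of linking is the module (or .o file), using one entry point pulls in its whole transitive dependency closure. The library layout below is entirely hypothetical.

```python
def closure(deps, roots):
    """Transitive closure over a module dependency graph.

    deps maps module -> set of modules it references.
    """
    seen = set()
    stack = list(roots)
    while stack:
        mod = stack.pop()
        if mod in seen:
            continue
        seen.add(mod)
        stack.extend(deps.get(mod, ()))
    return seen

# Hypothetical library: one small utility sits on broad foundations.
deps = {
    "leftpad":    {"strings", "unicode"},
    "strings":    {"allocator", "unicode"},
    "unicode":    {"tables", "allocator"},
    "tables":     set(),
    "allocator":  set(),
    "networking": {"allocator"},  # genuinely unused, correctly dropped
}
```

Here `closure(deps, {"leftpad"})` pulls in 5 of the 6 modules: the dead-code eliminator did its job, but the foundational layers come along anyway.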
PS: Do you have an example?
Objective-C's dispatch is built around looking up methods by their selector, which is effectively an interned name. Looking up the selector can be slow, but once you have one, invoking a method using a dynamic selector is as fast as invoking one with a selector that's known at compile time.
Due to the frequency of this, objc_msgSend (which handles dynamic method calls) is hand-written in assembly, with caching and "fast paths" to improve speed. The overhead can usually be brought down to that of a virtual function call in C++.
Look at how much code is written in Ruby and Python. Their method dispatches are way slower. To put it in perspective, it takes CPython about an order of magnitude longer to add two numbers together than it takes Objective-C to do a dynamic method dispatch.
When you get down to it, an ObjC message send performs a superset of the work of a C++ virtual call. A C++ virtual call gets the vtable pointer from the object, indexes into that table by a constant offset, loads the function pointer at that offset, and calls it. An ObjC message send gets the class pointer from the object, indexes into that table by a constant offset, loads the method cache information at that index, uses the selector to look up the entry in the cache's hash table, and then if all goes well, it loads the function pointer from the table and jumps to it.
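A toy model of the two lookup shapes described above (names are hypothetical; this mirrors the structure of the lookups, not the real runtimes):

```python
# C++-style virtual call: the vtable is an array indexed by a
# compile-time-constant slot, so dispatch is one indexed load plus a call.
vtable = [lambda: "draw", lambda: "resize"]

def cpp_virtual_call(table, slot):
    return table[slot]()  # constant-offset load, then call

# ObjC-style message send: methods live in a per-class table keyed by
# selector (an interned name); the real runtime puts an inline cache in
# front of this hash lookup, which is what objc_msgSend optimizes.
method_table = {"draw": lambda: "draw", "resize": lambda: "resize"}

def objc_msg_send(table, selector):
    imp = table[selector]  # hash lookup by selector
    return imp()           # then the call, same as the vtable case
```

The extra work in the second path is exactly the selector hash lookup, which is what the method cache and hand-written assembly fast paths amortize away.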
There (usually) is no dead code elimination for libraries.
(Also, Electron bundles 'just the rendering library from Chromium', not 'all of Chrome'.)
(I also don't agree with 'don't need to'. The main takeaway from the left-pad debacle was the fixes to the npm module-deletion policy, and hopefully people learning they shouldn't rely on an 'npm install' for production deployments! Whether people should use small modules is still up for debate; there are trade-offs.)
Also to the GP's point - Apple also now no longer supports splash screen images, so that element of bloat is no longer a factor (though some legacy apps have retained them pointlessly).
I think for non-game apps assets are not the primary driver of bloat.
I didn't know this, as the only mobile work I have dabbled with has been game dev on Android, in which case assets are the main cause of bloat.
That's actually one of the positives from moving away from gradients and shadows everywhere, funnily enough.
The sheer number of vectors could make any renderer I threw at it crash.
 - https://developer.apple.com/videos/play/wwdc2017/413/
This is pretty tone-deaf. The bulk of most apps is audio and UI assets, and the compression rates on those are quite good.
That said, the compression rate for iOS apps is horrible as Apple decides to encrypt and then compress the binaries, completely blowing apart the compression ratio of duplicate data.
Encrypt then compress makes no sense at all. Compress then encrypt, or don't compress at all.
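A quick demonstration of why the ordering matters: well-encrypted output is statistically indistinguishable from random bytes, and random bytes don't compress. Here `os.urandom` stands in for a real cipher's output.

```python
import os
import zlib

# Highly redundant payload, ~24 KB, a stand-in for duplicate data in a binary.
plaintext = b"the same duplicate data " * 1000

# Compress-then-encrypt: the compressor sees the redundancy and shrinks it.
compressed_then = zlib.compress(plaintext)

# Encrypt-then-compress: the compressor sees effectively random bytes
# (simulated here with os.urandom) and can do nothing with them.
encrypted_first = os.urandom(len(plaintext))
compressed_after = zlib.compress(encrypted_first)
```

In the second case the "compressed" output is actually slightly larger than the input, since DEFLATE falls back to stored blocks plus framing overhead.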
Anyway, probably millions of lines of code, 99.9% of which I didn't call. There is a tool for stripping code that will never be called, and with it I reduced my APKs from ~20MB to ~1MB, although I wound up turning it off in the end because it was not trivial to enable correctly. (I was linking 3rd-party binary libraries into my app, which complicated things.)
When asked it was clear they'd already made up their mind, and discussions about optimizations, how the app actually did anything, MVC, MVP or SVGs didn't change that.
And that's how you get large apps!
Any idiot can create something bloated and complex, creating something small and simple requires much more effort.
You cannot just compare Obj-C vs. generics; you also need to account for the fact that when you are doing Obj-C development, you might need some fast code and as a result drop down to C. And code reuse is hard to impossible in C.
The fact that code reuse can be done without a performance hit is enough of a difference on its own, as it pushes down the level at which code reuse can be done. E.g. you can use generics for graphics or audio or general low-level stuff.