App sizes are out of control (trevore.com)
729 points by trevor-e 8 months ago | 442 comments

The situation is messy. Take Facebook.app: the reported size on the App Store is 377MB, but the distributed .ipa is 241MB. It's a universal app that includes fat binaries (armv7 and arm64) and all of the 1x, 2x, and 3x graphics. App Thinning roughly halves that size for end users, yet the App Store reports the full size.

There's more: the App Store also provides a form of delta updates [1], which saves a lot of bandwidth, but again the savings aren't reported properly [2].

App Thinning does not work for standalone image assets, so you are forced to use Asset Catalogs, where all PNG files are stored as LZFSE (previously zip) compressed BGRA bitmaps. That's fine, but optimized PNG files can be 30-50% smaller on average. I'm fighting this [3], though I'm not sure there's a simple solution.
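Much of that gap comes from PNG's pre-compression filters, which rewrite each row as deltas before the codec ever runs. Here's a contrived illustration using stdlib zlib as a stand-in for LZFSE, and a synthetic bitmap chosen to make the effect obvious; real assets won't show this extreme a gap, but the mechanism is why an optimized PNG beats a plainly compressed bitmap:

```python
import zlib

# Contrived 256x256 one-byte-per-pixel bitmap: noisy as raw bytes,
# but perfectly smooth once row-filtered.
WIDTH, HEIGHT = 256, 256
rows = [bytes((x * y) % 256 for x in range(WIDTH)) for y in range(HEIGHT)]

# "Compressed bitmap" approach: run the raw pixels through the codec.
raw = b"".join(rows)
plain = len(zlib.compress(raw, 9))

# PNG approach: apply the Sub filter (each byte becomes its delta
# against the previous byte in the row) before compressing.
def sub_filter(row: bytes) -> bytes:
    return bytes((row[i] - (row[i - 1] if i else 0)) % 256
                 for i in range(len(row)))

filtered = b"".join(sub_filter(r) for r in rows)
delta = len(zlib.compress(filtered, 9))

print(plain, delta)  # the filtered stream compresses far better
```

PNG encoders pick a filter per row (Sub, Up, Average, Paeth), which is exactly the structure a straight pass over BGRA bytes misses.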

[1] https://developer.apple.com/library/content/qa/qa1779/_index...

[2] https://twitter.com/rjonesy/status/878051126704254976

[3] https://twitter.com/vmdanilov/status/892015508203216896

I was wondering why not use SVG (or another vector format) and render all the graphics on the device, instead of including tons of PNGs at various resolutions.

Fuzzy reasons abound, but you do occasionally see this. One major, often-complete blocker I've seen, though, is inconsistent rendering: results aren't identical on all devices. You can't really rely on the device's possibly-weird implementation (who knows what the OEM did to it; except on iThings, probably), so you're forced to bundle an SVG renderer that works reliably on all devices with consistent behavior, and good luck finding a high-quality lib that does that[1]. And, to be friendly to the designers, it has to be reasonably full-featured.

[1]: but if you know of one, I'd love to use it!

Hand tweaking for specific resolutions is an art. You can't expect to just render SVG and get equivalent results.

I've made a few apps and so far nobody cared about this. I thought hand tweaking was for 16x16 icons, but now that we sometimes use 1024x1024 icons, there's really no reason to tweak anything there. Actually, it was the other way around: I wrote scripts that rendered SVG into multiple PNGs, and I think almost everyone does the same.

Even if you provide one SVG for each size range, you still get massive size savings for 99.9% of icons.

SVG standardisation and support is growing fast, notably because it is now part of HTML5 and has strong support in browsers, which are the most actively developed software needed for all kinds of apps. SVG includes many conformance tests and is one of the best graphic standards in terms of standardization, with very precise requirements. It is in fact easy to get the expected results across a wide range of devices, and implementations are now very performant: they ship as reusable open-source libraries or directly in core services of most OS distributions, and are even part of mobile platforms. The hardware is now optimized to support almost all SVG graphic primitives.

The standard was not born from nowhere: it inherited the best practices initiated earlier in PostScript, then OpenGL, DirectX, X11 and similar APIs, and it references other related technologies that are also standardized: Unicode, OpenType, IEC color spaces. There are still ways to extend it, but the SVG standard defines the requirements needed to support many devices, small or large. Only very old implementations will suffer from quirks due to the absence of newer features, but graphics will still render correctly provided they followed the mandatory conformance rules for each version. Remember that SVG extensions are widely upward compatible: newer renderers will support all graphics built for conforming renderers based on older versions. Note that SVG is not designed to give exactly identical results, but results that are suitable for use on each device. This is absolutely not the case for bitmap graphics, which are extremely hard to adapt and waste a lot of bandwidth.
The next step for SVG will be to become a bit less verbose, using another base syntax than XML; JSON seems to be the best candidate, as it offers significant performance improvements over XML parsing. This is possible because SVG is now standardized primarily not for its syntax but for its DOM and API, just like HTML and, more recently, CSS (which is also used by SVG). More importantly, the SVG standard is open, allowing all kinds of experimentation and implementation; no new feature appears in the standard before it has already been experimented with.

The only things that SVG does not yet support completely:

* fine-tuning fonts to handle more semantics and typographical effects
* better support for rendering on non-additive color spaces (SVG still depends too much on sRGB, which does not print very well, and new display technologies now use more than 3 color planes to extend their gamut); colorspace transforms and masks are still better supported in PostScript
* 3D rendering, which is still in alpha stage but will come sooner or later
* splines beyond quadratic and cubic Béziers and elliptical arcs: hyperbolic, sinusoidal, and other parametric curves should appear, with their own rules for subsampling decomposition to match the expected resolutions on target devices
* richer lighting models; the current ones are still too simple, but the OpenGL standard already drives what will appear
* texels for 3D rendering, including mipmaps and synthetic/parametric mipmaps
* integration of video with SVG animations, which is still a challenge given the need to also synchronize in time with audio, user events, or gameplay scenarios

The HTML Canvas will drive the needs, as well as other W3C specifications related to accessibility, internationalization, and matching user preferences. You cannot do the same thing with "stupid" bitmap graphics. In fact, bitmap graphics will be internally converted to vector graphics for better rendering (this is already what printer drivers do to improve their results; you cannot do it with subsampling and bicubic interpolation alone: you need geometric transforms, notably for adapting text or symbolic designs such as maps). The individual "pixels" in bitmaps are really stupid objects that do not properly capture how they are geometrically linked to their surrounding pixels; there's a need to contextually hide or show some details, or improve some of them. Vector fonts in OpenType are a perfect example of that need.

It definitely is improving, and it's fantastic to see the progress. I really enjoy SVG assets, and I'd love to see them mostly take over (obviously they'll never be 100%, nor should they).

But that still means it's years away from widespread support, because large companies can't rely on it for large user-bases, and feature-sets are still pretty widely varying in my experience. Some support animations, some complex gradients and masking, and huge variance in text rendering (which is absolutely essential to finely crafted images) cripples some of the most-valuable use-cases (internationalized images without making dozens or hundreds of near-duplicates).

It's just not there yet. I hope it gets there, but there are significant hurdles that don't seem to be getting a lot of attention.

Well, the importance of red lining decreases as resolution increases. We still aren't there yet, however.

Sure. So hand-tweak the SVG files per density that you need custom-handled. Like you already do for images / when exporting images. Same difference....

... except now you need to do it on the device too, to make sure it renders identically there. Otherwise the same problem!


For very small images it doesn't matter / png may be smaller, but small images with that quality don't typically add up to hundreds of megabytes.

You are lucky: I know one because I wrote it! Take a look at: http://www.amanithsvg.com

Raster images are still faster and often you can make your png images very small and do stretching.

OTOH, people do vector stuff for many things; it's just that they typically don't use SVG but something more efficient. XML parsing is not lightweight, speed-wise, compared to a binary format or some sort of vector-to-code translator.

The XML parser is not much of a problem: the time to parse it is insignificant compared to the resources wasted storing and transporting bitmap graphics.

Bitmaps will also never be able to adapt their content gracefully to user preferences and constraints. Bitmaps have no intelligence; they are only suitable for photography, and even then they lack the information needed for proper rendering, so you will never see the same bitmap twice across devices and rendering surfaces.

Just compare what we had in the past with bitmap fonts (now almost abandoned except for text consoles) and how we appreciate now to have scalable vector fonts.

Bitmaps are extremely complicated to adapt to any layout, whereas SVG graphics can now be self-adaptive, rendering only what is needed.

You could ship SVGs and lazily generate PNGs for the specific device. Once it's generated, you keep it cached. Although it might be a little slower on the initial run. I guess it depends on your SVG's complexity.
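That render-once-then-cache idea is simple to sketch. This is a hypothetical helper, not any platform's API: `rasterize` is a placeholder for whatever SVG renderer the app bundles, stubbed out here so the sketch is self-contained:

```python
import hashlib
from pathlib import Path

def rasterize(svg_bytes: bytes, width: int, height: int) -> bytes:
    """Stand-in for a real SVG rasterizer (a platform API or bundled
    library); it just tags the input so this sketch stays runnable."""
    return b"PNG:%dx%d:" % (width, height) + hashlib.sha256(svg_bytes).digest()

def cached_png(svg_path: Path, width: int, height: int,
               cache_dir: Path = Path("raster_cache")) -> bytes:
    """Render an SVG at the device's exact size once, then reuse the
    cached PNG on every later launch."""
    key = hashlib.sha256(svg_path.read_bytes()).hexdigest()[:16]
    out = cache_dir / f"{key}_{width}x{height}.png"
    if out.exists():                      # cache hit: no re-rendering
        return out.read_bytes()
    cache_dir.mkdir(parents=True, exist_ok=True)
    png = rasterize(svg_path.read_bytes(), width, height)
    out.write_bytes(png)                  # pay the rendering cost once
    return png
```

Keying on the SVG's content hash plus target size means a design update or a new screen density naturally produces a fresh cache entry.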

Webkit does a fine job with SVGs. You could draw it on a <canvas> element to get a rasterized view of the SVG in the desired size, and save the resulting image.

I'm guessing but it seems logical that phones are not the place you want to do extra graphics rendering if you don't have to.

Modern smartphones have quite competent GPUs. They can run 3D games at 1080p/60fps. They could raymarch circles around an svg icon. And if it wasn't enough, the renderings could be cached for later use.

TBH, not many people consider the SVG format worth the effort unless the graphic is made from scratch. An organization rarely rationalizes having a UI dev spend six-odd hours transforming an existing PNG into an SVG, or twelve hours completely replicating an existing design in Photoshop only for it to be exportable as an SVG. I could be far-fetching things, but sometimes business or upper management gets short-sighted on the small things. There's probably no rationalization for spending extra effort getting all these graphics into SVG format, thus improving your application's load and bandwidth utilization, versus pumping more funds toward other easier, quicker solutions.

>replicate an existing design in photoshop

AFAIK, the majority of designs are created in Photoshop or Illustrator, so no conversion is necessary. In any case, I would consider it a liability not to have my designs in an editable format. What if tomorrow I realized I wanted a variant in a different color? Or decided I wanted a different background? If all I had was a bitmap logo, I would consider it worthwhile to have someone convert it into a scalable format anyway, even if I wanted to ship it as PNGs.

Also, numerous free tools exist to convert bitmaps to SVG. Using one of these, plus some manual smoothing and correction, I'm sure the conversion would take no more than half an hour.

GPUs do not like SVGs. They don't handle them well at all. Most SVGs are CPU-rasterized as a result.

So you just CPU-rasterize out to a buffer, and blit it to the screen. After the initial render, it's equivalent to the PNG. The application essentially decompresses its assets on launch.

May be SVG is not the best vector format after all. May be we need something simpler, where every item directly maps to GPU graphics calls, also binary would help.

GPUs don't have graphics calls anymore. At best they have vertex and fragment shaders. Converting a vector graphic into a good shader is not trivial compared to using the built-in texturing functions.

It gets even worse once you realize how limited older mobile GPUs are, or which incompatible subsets of features they support at decent performance.

If you want something simpler that works with GPUs it needs to be exclusively built out of triangles and/or code that can operate on a single pixel independently of its neighbors.

Resolution independent formats are inherently anti-triangle and by the time you've hit triangles you already have a target resolution in mind.

Or put another way, GPUs really don't like vector graphics in general. That's not what they're built for or good at.

GPUs like vector graphics just fine; rasterizing vector graphics is literally what GPUs are designed to do. It's infinitely scalable vector graphics that they tend to dislike.

Once you've subdivided your curves and such into triangles/vertices/etc., the GPU ends up being a lot happier about its existence.

GPUs are not designed for vector graphics, they are designed for triangles and triangles only. Triangles are a subset of vector graphics, but are almost never the subset that people mean when they say "vector graphics."

Specifically vector graphics typically includes curves, which GPUs just don't do at all.

Once you've tessellated a curve into triangles you've already baked in a desired resolution. You can't have resolution independent vector graphics in a GPU friendly way.
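That "baked-in resolution" point is concrete: flattening a curve means subdividing until each piece is straight to within some pixel tolerance, and the tolerance is a resolution choice. A sketch of standard de Casteljau subdivision for a quadratic Bézier (the curve type SVG and TrueType paths use):

```python
def flatten_quadratic(p0, p1, p2, tolerance):
    """Recursively subdivide a quadratic Bezier until each piece is
    within `tolerance` (in pixels) of a straight line, then emit the
    segment endpoints. Smaller tolerance = higher target resolution =
    more segments/triangles to feed the GPU."""
    # Distance from the control point to the chord midpoint bounds
    # how far the curve can stray from a straight line.
    mx, my = (p0[0] + p2[0]) / 2, (p0[1] + p2[1]) / 2
    err = abs(p1[0] - mx) + abs(p1[1] - my)
    if err <= tolerance:
        return [p0, p2]
    # de Casteljau split at t = 0.5: two sub-curves, each flatter.
    a = ((p0[0] + p1[0]) / 2, (p0[1] + p1[1]) / 2)
    b = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
    m = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
    left = flatten_quadratic(p0, a, m, tolerance)
    right = flatten_quadratic(m, b, p2, tolerance)
    return left[:-1] + right  # drop the duplicated midpoint
```

Halve the tolerance (i.e. double the resolution) and the segment count grows, which is exactly why the tessellated output is no longer resolution independent.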

Rendering SVG icons sounds like a non-issue for any "smartphone" ever made.

Because the experience is subpar when that is done. It is impossible to create a vector image that results in clean, unblurred strokes at arbitrarily small pixel sizes. And don't even suggest TrueType-style hinting; I don't think any designer would want to touch that with a 10-foot pole.

I would have said exactly the opposite: blurred strokes belong to bitmaps. You are still thinking of pixels as squares over a regular grid that can only be rotated by multiples of 90 degrees.

SVG graphics are clean and result in NO blur at all. They don't even need any TrueType-style hinting. You've probably only looked at early non-conforming implementations using bad approximations. Precise rendering algorithms for SVG are specified and fully tested. This is not the case for bitmaps, whose support is much more problematic: see what happens to photos and videos when editing them! Never twice the same result, losses and artifacts everywhere, including undesired effects, notably moiré, which is much worse and does not reproduce what natural scaling would do in your eyes, which are not fitted to a perfect rectangular grid.

So no, it's perfectly possible to create a vector image that results in unblurred strokes at arbitrary pixel sizes. What you want is a geometric transform that brings out more detail; that is in fact impossible with bitmaps except by shipping variants tuned for multiple resolutions. You do that for a limited number of resolutions and assume it is enough, but it is not, so you add more, and finally you have enormous images that are largely redundant with each other, better represented by converting their 2D model to 3D with an adaptive 3D-to-2D projection. You'll do that easily with vector graphics, not with bitmaps (or their extensions called "mipmaps", which are only approximations, not really scalable because they are still all based on discretely sampled planes).

Truetype hinting serves to distort the curves beneficially (by being aware of the pixel grid) in order to produce more readable output at low resolutions. SVG graphics do not automagically scale down and stay readable.

The classic easiest to explain in words case is taking a capital letter O and turning it into more of an "elongated stop sign" shape at around 6 pixels of width. There is a tradeoff between remaining faithful to the original shape and creating high-contrast, readable shapes at low resolution. There is nothing impossible about scaling down the vector outlines of a TrueType font in a mathematically perfect way. You still need the hints to make the result more readable.

(I worked on the hinting software for 13 of Apple's System 7 font faces and about 3/4 of Windows 3.1 launch fonts.)

> SVG graphics are clean, result in NO blur at all. They don't even need any Truetype-style hinting. You've probably looked only at early non-conforming implementations using bad approximations.

This is a huge wall of text, but apparently you don't understand the subject matter all that deeply. Why wouldn't SVG need hinting? TrueType needs hinting, and it's a vector format as well.

(Philippe always posts walls of text, even on the SVG mailing list; seems to be their thing.)

Hinting as done in TrueType is probably overkill for most purposes. But icons are often not displayed in arbitrary sizes, but rather in one of a few known-in-advance sizes. It's not hard to tweak the paths and shapes to fit to the pixel grid for all those. Also, as you approach higher pixel densities, it becomes much less important; basically you may just have to make sure that the smallest sizes fit on whole pixels and that's it.
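Tweaking paths for a handful of known sizes can even be partly automated. A hypothetical sketch (not any real tool's API): scale the design-grid coordinates to each target size and snap them to the pixel grid, putting odd-width strokes on half-pixel centers so a 1px line fills exactly one pixel row instead of straddling two:

```python
def snap_path(points, design_size, target_size, stroke_width=1.0):
    """Scale path coordinates from the design grid to a known target
    size, then snap to the pixel grid. Odd-width strokes are centered
    on half-pixel positions so they render crisply."""
    scale = target_size / design_size
    # A 1px stroke drawn at an integer coordinate covers half of two
    # adjacent pixels (blur); at x.5 it covers exactly one.
    offset = 0.5 if round(stroke_width * scale) % 2 == 1 else 0.0
    return [(round(x * scale - offset) + offset,
             round(y * scale - offset) + offset) for x, y in points]
```

Run this once per target size at build time and you get the "fit to the pixel grid for all those sizes" effect without per-size hand editing of the simple cases.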

Long ago I've automated asset generation in different sizes from SVG for an Android app I worked on and even without caring (much¹) about a pixel grid the results were good enough not to need tweaking.


¹ When hand-writing SVGs I tend to care about integer coordinates simply to not lose my mind.

Wouldn't it make sense for the file format to just allow svg in addition to others?

eg png is fine, but if a developer wants to include svg then let them.

For developers who do care about clean, unblurred strokes at arbitrary pixel sizes, this wouldn't impact them at all.

For the developers who care more about the space taken up, this would give them a potential useful option.

> For developers who do care about clean, unblurred strokes at arbitrary pixel sizes, this wouldn't impact them at all.

Ironically IMO I'm pretty sure this used to be one of the major selling points of vector graphics and svg :-)

I might agree that for very small icons (e.g. targeting a ~VGA or 800x600, maybe up to a 1024x768 phone screen) hand-painted bitmap graphics might be best... but shouldn't it be enough to ship 16x16 (or whatever) icons along with vector graphics for high-resolution screens? And if it's too slow to live-render the graphics, how about rendering custom icon sets on install, using the GPU, and then caching them on the device?

SVGs are a problem because they don't have a "preferred size" parameter. In Xcode you can use PDF files for icons (which are much like SVG, but state "all my points are relative to the size WxH"). Once you have that, you can make pixel-perfect icons that are always rendered sharply.

The catch is that Xcode compiles all those PDFs to PNGs before bundling them into the .ipa, so in the end you have a bitmap either way...

Nobody said arbitrary sizes. It's still a limited set.

pcwalton's project Pathfinder (https://github.com/pcwalton/pathfinder) may eventually lead to an efficient GPU-based vector renderer, which would make this feasible.

I mean, it's feasible now; plenty of folks use vector rendering: Caching the result makes the CPU rendering much easier to deal with. The GPU would really help more with animations and dynamically constructed vectors.

In any case, I suspect something like lyon (https://github.com/nical/lyon) would be a better fit than pathfinder, which is both a) not super portable and b) very focused on rendering text, which it does very well.

Thanks for indicating lyon, that looks very nice. I mentioned Pathfinder because path rendering on GPU is important on mobile devices (battery and heat concerns).

Oh, it renders generic paths? I thought it only rendered vector fonts. Today I learned!

iOS vector asset catalogs (which use PDFs) actually don't support true vector rendering; any vector images are converted to PNGs at compile time.

As of iOS 11/Xcode 9 vector images are supported which is quite nice. It may still convert to PNG for backwards compatibility though, I’m not sure if App thinning discards the PNG for iOS 11 devices.

Pretty sure every trade off made at Apple is weighted heavily toward extending battery life.

Android is starting to use it more and more, at least where it makes sense: icons and simple illustrations.

From discussions with iOS colleagues, it seems Android has seen a much stronger push for vector assets, probably because from day one the OS and SDK were designed for displays unknown at compile time.

I'm not sure about iOS, but on Android the reason is the cost penalty of rasterizing larger icons and images on the fly. There's also some fragmentation with differing levels of SVG support on different OS levels. As a result, SVG is limited.

"some fragmentation..." That's like calling the ocean a bit wet! The experience of learning to code on Android revolves a great deal around dealing with that fragmentation.

I was specifically referring to the level of SVG support, which is constant across given OS versions, not per device like the fragmentation in old camera hardware abstraction layer implementations.

In that sense, the only reason there would be more fragmentation in SVG support on Android than iOS is that it can be harder to upgrade Android devices that did not come from Google.

As someone who has spent a lot of time with SVG, I can assure you there are very wide discrepancies in how the same file renders even using different tools on the same platform, let alone cross platform.

For some things like icons and UI elements, the automated resizing gives worse (fuzzy) results than a human artist. It makes a difference.

Because different graphics are shown at different resolutions, rather than purely scaling a highly detailed 1024x1024 SVG down to 128x128.

SVG = scalable vector graphics

SVG files are vector based.

>1024x1024 SVG

Vectors don't have pixels.

But they do generally have a nominal size at which the artist considers the render to be optimal... no need to be so short.

The point is still valid. An SVG designed for a high DPI screen will most likely look bad on a low DPI screen.

However, they usually are designed around a target size. And it's quite possible that, scaled too small, much of the detail is lost. Hence why you'd still want different vector files for different resolutions.

Considering the topic of this post (apps), I would argue assets are usually designed around a physical size; even with multiple target display sizes, it becomes part of the requirements to design assets so they look decent on all phones. Moreover, I find this whole branch of the discussion a bit funny, because assets are already designed at high resolution and later downscaled for the target dimensions, so simply using vectors just pushes that task to the phone instead of the build server.

A simple solution to this would be for Apple (or the developer, via Apple) to distribute different builds based on the device installing the app.

That is, developers could compile a different copy of the IPA for all target devices. Using variables like CPU architecture, screen size, and other feature flags, a smart compiler could cut a lot of code and assets that never run or display on certain devices.

(Granted, this would be much simpler on iOS, which has a limited set of targeted devices compared to Android.)

In fact, it seems Apple has already taken steps in this direction with its cloud compilation (I'm not an iOS developer, so not sure of the specifics). What would worry me is if they started requiring all source code be uploaded to their servers for compilation. Going down that road is fraught with ideological pitfalls.

I think they already do that with bitcode. You upload the LLVM bitcode and it will generate the code for the appropriate architecture as needed. It's standard when you upload your app to iTunes. They don't use your source code for that.

Do you mean "bytecode" or is "bitcode" a thing?

Bitcode is indeed a thing; it's a file format for LLVM IR: https://lowlevelbits.org/bitcode-demystified/

It's not really a bytecode as it's not interpreted, but that's debatable.

I continue to learn more.

Wild guess, but presumably file size was not Apple's only or primary consideration for the image format. Rendering performance, app load times, and power conservation all likely rank higher on Apple's priority list than raw disk space used.

I don't think that's true. Having unused icons on the device has a negligible effect, if any, on performance, load times, or power usage (they are never loaded into RAM).

App thinning, as they call it (https://developer.apple.com/library/content/documentation/ID...), is almost solely about two things: download time and disk space.

+1. Also, it's a plus for Apple if people buy the 128GB iPhones instead of the 8GB ones...

This may be unrelated, but why do they push out updates every other day or so? No other app is like that. And sizes aren’t getting any smaller. They’re at version 137 right now...

Those numbers are mostly unfair. For some reason, in the iOS 10 App Store, Apple started listing the complete fat (both 32-bit and 64-bit) submitted .ipa size. If you want to test that easily, clear your cellular data usage, update or install one of those apps, then go back to Settings and see the actual bytes transferred.

Also, almost everyone is using Swift in some small part, which automatically pulls in the standard library. Then you have companies who switched from Core Data to Realm. And there's a whole subset of companies that have decided to go all-in on JavaScript and have brought in the entire React Native stack with it.

These apps in that list are also built by teams of 100s of engineers working at full speed. In reality, each one is its own little OS full of its own UI frameworks, testing frameworks, and nontrivial code.

Trust me when I say that everyone is plenty aware of how big their footprint is getting, and no one is happy about it. Apple won't even let you submit to the store if the actual single architecture binary is over 60MB.

It's partly unfair, and only a partial picture: many apps proceed to download data after install.

Some limit the amount of data they download, but a few are indiscriminate about it. I hated the Yelp app because it seemingly never limited how much space it took up: the more reviews you viewed, the more space it used, and I never found a setting to cap it. If you go to Manage Storage you will discover that many apps take over 100MB in Documents & Data.

I'm curious, does anyone have the equivalent app sizes on Android? I'd love to see a comparison, to see if this is more of a platform issue as mentioned in a few comments, or a developer issue.

I guess we also should be looking at the actual data transferred rather than the number shown on the store page.

I'm 100% willing to believe this is a universal problem and is just as bad as Android. I'd be FLOORED if Android apps were significantly smaller.

I checked now.

LinkedIn: 20.36MB
Facebook: 76.59MB
Twitter: 25.71MB

That's about 1/8th the size of those apps on iOS. Those are just the first three apps in the article; I'm sure the rest are about the same.


I wonder why it's so different.

Hopefully someone can chime in with the reason.

Java bytecode is generally much smaller than the equivalent machine code. If you use something like Multi-OS Engine, which statically compiles Java bytecode to native code for iOS, it's fairly typical to see an order-of-magnitude difference in binary size.

This actually affects Android 5+ too (Multi-OS engine uses Android's AOT compiler) but since it happens on the device it doesn't affect download sizes.

Really? I've never heard that before. I'd guess it would be bigger if anything.

Any idea why that is?

It's true: Java bytecode was basically designed for small size. The goal was to transmit Java apps quickly over the internet, a hope which mostly died off as JS took over, but which has in some ways been reborn with Android (Dalvik bytecode is somewhat different, but is definitely small). The main tradeoff for the small size is a total lack of optimization until the JIT compilation stage.

But there's no way it's anything like a 10x difference. There's definitely something else going on here, like some quirk about how the app store is calculating the sizes of assets.

I'm talking about the binaries themselves as they are output from dex2oat. The 10x difference is I suspect a product of needing to compile parts of the standard library in addition to the app's code.

Many things that Java does with a single instruction take many more in the target architecture. For example, virtual calls (which are used almost everywhere in Java) take several instructions on x86 and ARM, but only one in Java or Dalvik bytecode.
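This isn't a JVM example, but CPython's stdlib `dis` module makes the same point about stack-based bytecode: a dynamic method call is a handful of compact instructions, where a native compiler would emit vtable loads, an indirect branch, and calling-convention setup for the same call.

```python
import dis

# A dynamic method call compiles to only a few compact, stack-based
# bytecode instructions; the receiver lookup, dispatch, and argument
# passing are all implied by the interpreter, not spelled out.
code = compile("obj.method(1, 2)", "<example>", "eval")
dis.dis(code)
print(len(code.co_code), "bytes of bytecode")
```

The exact instruction names vary by Python version, but the whole expression stays within a few dozen bytes of bytecode.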

Bytecode is much closer to source code than machine code (there's a cool discussion about this in https://channel9.msdn.com/Shows/Going+Deep/Expert-to-Expert-...)

and source code is generally smaller than machine code.

It's 174MB installed, with another 150MB of Data (which is a thing), even though I hardly use Facebook more than once a day (just to check notifications, not even the feed). That's quite big, and no cleaner is going to delete the Data, as it is not cache. The Facebook app is a big one and I don't even know why.

Pretty sure some Android versions or phones let you delete the documents and data, whereas on iOS you have no way to clear the cache manually; it only happens when you run low on storage.

I'm as far from mobile development as I can be, but are there no shared libraries on those platforms?

Other than the ones provided by the environment, isn't it possible to bundle dynamically loaded libraries that can be shared by multiple apps? It makes absolutely no sense for each application to implement its own web browser/runtime.

Can't believe we are constantly reinventing the wheel again.

For iOS, there are the shared libraries provided by the OS. UIKit, PhotoKit, etc...

Shared libraries beyond that would bring the hell of version incompatibility. One need only look at Apple's evolution of Swift, where there still isn't binary compatibility.

Most people don’t want to crack open their phone and set paths so that their app has a version of Python that works with it.

Also shared libraries can be giant security holes. See the bug in AFNetworking that affected many thousands of apps. Imagine an errant app shipping a backdoor into a shared networking library.

It's pretty telling that Linux distros are pretty much the only end-user platforms where binary distribution and packaging is built around shared libraries. Pretty much everyone else (Windows/OS X/iOS/Android/etc) generally expect binaries to include their own bundled versions of 3rd party libraries. Lots of people have looked at this problem and most have opted against the shared library approach.

That's because Linux distros are more centralized than Apple's walled garden. They just lack the walls, but they are as much a garden as you can get.

Unless those platforms start distributing a huge number of libraries your software may decide to depend on, shared libraries won't take off on them.

Not sure if "Linux distros" is the completely right term there. The various *BSD based "ports" systems do similar too.

The Global Assembly Cache for .NET was supposed to be a shared library storage with support for multiple versions of the same library + signed assembly/hash matching.

It just never really took off for some reason. It's where all the BCL assemblies existed.

Instead we have NuGet and the libraries exist wherever the application is installed. GAC be damned :(

If Apple (or Google) wanted to make this work, it wouldn't be that hard. An app could list its dependencies, perhaps with a name, but definitely with a hash. The developer would have to upload the dependencies too. If the hash matched, the server would also do a byte-by-byte comparison, and if that matched, mark the dependency as sharable. When downloading an app with a sharable dependency, the phone would check whether it was already installed, and otherwise download and install it along with the app. Refcount the sharable dependencies at install/remove/upgrade time, and remove unused dependencies after a reasonable period.

You would only get a benefit if two or more apps you installed used the exact same version of the dependency, but that seems like it might happen enough; especially once it's expected to be in the ecosystem.
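The scheme described above can be sketched as a content-addressed, refcounted store. This is a toy model under the comment's assumptions (the class and its in-memory dicts are hypothetical, not any real App Store mechanism):

```python
import hashlib

class SharedDepStore:
    """Hypothetical on-device store: dependencies are keyed by the
    SHA-256 of their bytes, shared across apps, and refcounted so a
    blob is reclaimed once no installed app uses it."""

    def __init__(self):
        self.blobs = {}      # hash -> dependency bytes
        self.refcount = {}   # hash -> number of apps using it

    def install(self, dep_bytes):
        digest = hashlib.sha256(dep_bytes).hexdigest()
        if digest in self.blobs:
            # Already on device: skip the download, just bump the refcount.
            self.refcount[digest] += 1
        else:
            self.blobs[digest] = dep_bytes
            self.refcount[digest] = 1
        return digest

    def uninstall(self, digest):
        self.refcount[digest] -= 1
        if self.refcount[digest] == 0:
            # Last app using this dependency is gone; reclaim the space.
            del self.blobs[digest]
            del self.refcount[digest]

store = SharedDepStore()
lib = b"...fake library bytes..."
h1 = store.install(lib)   # app A installs the library
h2 = store.install(lib)   # app B hits the cache: same hash, no second copy
assert h1 == h2 and store.refcount[h1] == 2
store.uninstall(h1)
store.uninstall(h2)
assert h1 not in store.blobs  # reclaimed once the refcount hits zero
```

The "exact same version" caveat below falls out of the hashing: any byte difference between two builds of the library produces a different digest, so only bit-identical dependencies dedupe.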

Sounds like you just invented https://nixos.org/nix/

Google is actually working on this for the support libraries. All the Android apps use them, so it is a good target (and of course it might be possible to extend it to other libs). I guess it makes the most sense in the markets where data is very costly. By itself the support lib is not really big but if each KB counts, let's not download it several times.

> Shared libraries beyond that would bring the hell of version incompatibility

This is more of a general question, but I've never understood why so many shared-library systems have the restriction of "only one version of a library". What stops you from storing different versions of the same library for different apps?

(I understand the problem with transitive dependencies - e.g. if the same app transitively depends on two different versions of a library, that might spell trouble. Even for this there are solutions - e.g. different classloaders for the JVM - but even without them, you could disallow this special case without sacrificing much.)
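Side-by-side versioning amounts to keying the loader on (name, version) instead of just name. A purely illustrative sketch (neither iOS nor Android resolves libraries this way; all paths and names are made up):

```python
# A shared system store holding multiple versions of the same library.
SYSTEM_LIBS = {
    ("libfoo", "1.2.0"): "/libs/libfoo/1.2.0/libfoo.so",
    ("libfoo", "2.0.1"): "/libs/libfoo/2.0.1/libfoo.so",
}

def resolve(manifest):
    """Map an app's pinned {name: version} manifest to on-disk paths."""
    return {name: SYSTEM_LIBS[(name, ver)] for name, ver in manifest.items()}

app_a = resolve({"libfoo": "1.2.0"})
app_b = resolve({"libfoo": "2.0.1"})
# Both versions coexist; neither app forces the other to upgrade.
assert app_a["libfoo"] != app_b["libfoo"]
```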

For Apple libraries, it is because the latest framework could be providing a completely different implementation and they both want and need everyone on that implementation. For instance, they reimplemented Photos, introducing a new interface, maintaining the old interface but deprecating it.

Edit: in other words, common data.

I guess they went as far as they could without a proper package manager.

> Shared libraries beyond that would bring the hell of version incompatibility. One need only look at Apple’s evolution of Swift where there still isn’t binary compatibility?

Since the source code is shared with Apple once it reaches the marketplace, wouldn't it be simpler for Apple to compile a specific version for whatever version of the Swift language is available on the target device?

It wouldn't have any market value and wouldn't drive iOS device sales, but it would certainly slim them down a little.

> Also shared libraries can be giant security holes. See the bug in AFNetworking that affected many thousands of apps. Imagine an errant app shipping a backdoor into a shared networking library.

This sorta makes me like iOS a bit more. Memory gets cheaper but security doesn't get any cheaper when the dependencies grow.

Your source code isn’t shared with Apple. You ship Apple a binary or intermediate code.

Edit: Also doesn’t help if you have C or C++ code mixed in there.

> isn't it possible to bundle some dynamically loaded libraries that can be shared by multiple apps

Yes, there is one: the standard library, provided by Apple. However, many apps eschew this to use their own libraries and frameworks for whatever reason, be it ease of use or feature enhancements, and these cannot be shared.

The Facebook API library provided by Facebook is notoriously huge, in source code at least, and they're pretty proud of the size of it. Can't find the link right now, but they had a blog post talking about it a few years ago.

Here's a link to the oft-mocked slides from the talk where a Facebook engineer spoke about how iOS can't handle the scale of their app because they do the very logical thing of using over 18,000 classes: https://www.columbia.edu/~ng2573/zuggybuggy_is_2scale4ios.pd...

It would have to be entirely App Store managed - it's not currently possible for iOS developers to share code between apps.

Everything that is shared between apps currently is stuff that's provided by the shared platform.

> In reality, each one is its own little OS full of its own UI frameworks, testing frameworks, and nontrivial code.

Are you talking about React Native apps here as well?

I am surprised at all of the app developer shaming in this thread. Is it really likely that every developer working on a popular BigCorp app is an idiot who imports 10MB libraries every time he/she faces the slightest challenge?

It's much more likely that app developers are optimizing for many things, including app size, but reducing app size has a bad cost/benefit ratio. Here are some decisions that may bloat your app:

  * Want your network calls to be fast and reliable?  Better use that cool new HTTP library rather than writing your own. 

  * Want to keep everything secure?  Rule #1 of hacker news is never roll your own crypto so better import the best lib out there.

  * Want to delight your users and their fancy QHD screens?  Time to include some high res images and animations.  Oh and you can't use vectors, they kill performance. 

  * Want to access new markets?  Time to translate your strings into 80 common languages.  Oh and some of these may require custom fonts to look right in your app. 

  * Want your Android game to have blazing fast graphics?  Import that native library, and don't forget multiple architectures. 

The biggest app I ever worked on was Google Santa Tracker. It was about 60MB. We spent a lot of time optimizing the app size in this year's version. We managed to drop 10MB while adding a few new games to the app. I'm proud of it, but if I didn't have the freedom to pursue app size I certainly would have taken the extra bloat to ship the new content.


Is it really likely that every developer working on a popular BigCorp app....

Not to mention, rarely does app size register with clients, PMs, or bizdev as a worthy task. Only when you end up at the top of the app-sizes list is time given to optimizing.

We work to keep app sizes down and offer line-item tasks for app-size optimization on updates, but no one finds them necessary until they risk being near the top of the app list on a user's device when that user wants to delete some apps.

In games, toolkits/engines like Unity and Unreal also add quite a large chunk on mobile. Building your own engine is rarely an option anymore for a competitive launch. Many games built with Unity/Unreal, even with moderate assets, can easily reach 200-300MB and creep up further on updates.

Reducing APK size has been a constant talk topic at the last three Google I/Os, and at WWDC as well.

If you need to advocate for this, it says a lot about how much app devs care about the issue.

Every popular mobile OS has standard libraries for making HTTP requests and doing crypto.

I can forgive games for sacrificing size in favor of high-res animations and a blazing fast graphics stack, but I don't think LinkedIn, for example, is in need of particularly fast graphics.

If you've ever looked at a pcap of what the Android standard libraries do for HTTPS chunked upload, you would cry. One TLS record for the per-chunk header, one for the chunk of data, and one for the carriage-return/newline. I've thankfully forgotten whether these also managed to be separate TCP packets.

Given that, I don't think it's unreasonable to consider using a library to do it better, although you have to weigh the increase in apk size. The standard crypto libraries on the phone are great if the features you need were released 5 years ago, but if you want consistent behavior across Android versions and manufacturers, you need to include that yourself too.
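For context, HTTP/1.1 chunked encoding frames each chunk as a hex size line, the payload, and a trailing CRLF. A minimal sketch of the three writes described above (the helper is hypothetical, not the actual Android library code):

```python
def chunk_frame(data):
    """One HTTP/1.1 chunk as three separate writes: the hex size line,
    the payload, and the trailing CRLF. If each write is flushed on its
    own, each can land in its own TLS record and pay per-record
    header/MAC overhead on the wire - the behavior the pcap showed."""
    return [b"%x\r\n" % len(data), data, b"\r\n"]

frames = chunk_frame(b"hello world")
assert frames == [b"b\r\n", b"hello world", b"\r\n"]  # 11 bytes -> hex "b"

# Framing overhead is fixed per chunk, so tiny chunks are mostly framing:
framing = len(b"%x\r\n" % 16) + len(b"\r\n")
assert framing == 6  # "10\r\n" plus the trailing "\r\n" for a 16-byte chunk
```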

Except the standard library will be updated if this actually becomes a problem, whereas you might decide the cost/benefit ratio of updating your bundled library isn't worth it.

To a first approximation, the standard library on deployed Android phones is never updated.

Yet the installed size of the Facebook app on my Android phone is 350MB, and since the total phone storage is 2 GB, I have a storage problem (less than ideal free space). Meanwhile, Facebook Lite is not a smooth experience compared to the full app.

"Installed size" as reported by iOS can be deceptive, since it includes a cache.

Unlike Android, end users can't clear it, though backing up and restoring your phone does that...

Mobile platforms should really do something about vector graphics. Shipping a raster image that is much larger than the equivalent vector is stupid. Shipping 3-5 raster variants is beyond stupid. Surely there could be a way to render images on demand, or to roll some simple vector format that would be fast to render (it's not as if decompressing PNG is free; it takes processor time as well).

Android offers VectorDrawable for this purpose, and Android Studio will automatically convert SVGs for you. It works great for icons and other "material" stuff, anything where flat colors and simple paths are sufficient. I believe there is also a support library to use it on older OS versions but have no experience with that since we're happy with 5.0+ for our app. I highly recommend it over making all those PNGs for different screen densities.

I have seen projects that did not care about app size. Or rather, app size was priority "super low", because critical and important issues were still open.

It seems that most users don't really care nor use all that many apps.

There were a few articles with actual content on this topic covered on Daring Fireball in recent months. Not sure what this blog blurb is adding to the conversation.

1. https://sensortower.com/blog/ios-app-size-growth

2. http://blog.timac.org/?p=1707

3. https://blog.halide.cam/one-weird-trick-to-lose-size-c0a4013...

It's a blog, the content doesn't have to be novel. That said, the author would improve the post by replicating your context links.

Yea, I wasn't trying to think too hard about the subject, I know there are many excellent posts out there already (like those linked above) that go into detail. I only wanted to show some actual download numbers to demonstrate how absurd it has become. Linking those other articles is a great idea, thanks for the feedback!

That second link about the FB app is hilarious. As an example, there are 3 copies of a 3.6 MB binary file used as part of an optical flow algorithm (likely for 360-degree videos)

So it's not just that there's a bit of sloppy (or pragmatically careless) packaging, but there's just a lot of stuff in these apps.

It's pretty rich how these companies are well known for the rigor they apply to interviewing candidates on technical subjects, yet actually drop the ball in production with poor engineering like this. Where does that rigor go after the interviews are done?

Are there any examples of well-known apps from large organizations that aren't excessively large in size?

I work on Amazon Prime Photos iOS app. It's currently 60mb in size, and a good chunk of that is the Swift runtime.

Even the main Amazon shopping app is less than 100mb.

I'll bite. In what ways can you use up 100MByte with a fancy reimplementation of an online shop? That should be enough for a text to speech engine, or a 3d engine with quite a few assets.

I am being deliberately ignorant, having no idea what functionality one might add, but I'm genuinely curious. All I see is a small database and a lot of assets that are downloaded on the fly.

I don't know the division of work between the app and server but don't forget that the app includes some level of voice recognition, and has not just a bar-code scanner but enough of a computer vision system that I can pretty reliably look up products simply by taking a picture of them.

I believe it does have a text-to-speech engine... the Android app has one.

>Even the main Amazon shopping app is less than 100mb.

109 on my device, not sure why it needs over 100 for a menu/search and checkout function that relies entirely on external data.

Also another commenter said Alexa was 78 when it's 124 for me and does even less than the shopping app.

Those apps are standouts but generally Amazon iOS apps are appalling. Alexa is 78MB of pure UX hell.

Wait. Do you have to include the runtime with the app? That seems backwards.

Because the customers are the ones who are paying for that inefficiency, not the company.

You can bet that those companies' infrastructure is optimized to hell and back.

> You can bet that those companies' infrastructure is optimized to hell and back.

You'd be surprised. (Source: I work at a unicorn.)

You know, I'm not currently an Uber customer because their app is too big and I don't have enough spare space on my phone.

There were a few times already that I thought about using them, but couldn't install it right at the time.

OK, so that's one. How many people even look? I have no idea what size the applications on my phone are.

I never really looked (not into Uber). I just tried installing it, and it failed due to lack of space... And, well, I can take a cab when I need it.

Do you have a lot of space on your phone, or just not use a lot of apps or media? People might not have the same situation as you.

I have a 64 GB phone with a fair amount of apps and media. I don't know if it's more or less than average.

64GB is huge. Many common phones only have 8GB. Inexpensive phones these days have 4GB. Super-cheap phones for developing nations may try to sneak in with less. When your phone is full, you generally go to the installed-app list, sort by size, and kill the biggest thing you can live without. Eventually you get to the point where you have to delete app A to run app B, and then delete B to run A. If your app is smaller, it'll stay installed longer.

I don't know much about developing markets, but I feel like most phones have for years come in 16, 32, 64, and 128 GB sizes. In any event, I'm not sure someone in a developing country with the absolute cheapest smart phone available is necessarily the kind of user Uber has in mind. I'd be skeptical of the idea that reducing app size is the most revenue-driving thing they could do to their app.

I have to imagine the vast majority either don't care, or have other reasons for not having the app on there (like the company itself, for one).

Not exactly large organisation, but the slack app is 80MB. For reference, discord is 17.

Please also count the server-side data downloaded for identical functionality. It doesn't help if 80% of the app is downloaded after installation.

I'm on iOS, so checking the sizes in Storage. Slack is 74MB with 14MB extra downloaded. Discord is 18MB with 6MB extra downloaded.

> Are there any examples of well-known apps from large organizations that aren't excessively large in size?

This is nothing new. Look how much memory your typical Java desktop app, or worse, your average Java server app, uses. As soon as enterprises are involved, where it's a regular occurrence for devs to be outfitted with the shiniest and beefiest hardware (as a perk or per standard company policy), developers won't care about resource usage until it's so excessive that it either slows down their machines or accounting knocks on the door and asks who ordered dozens of x1.32xlarge machines...

Seriously, another huge problem is assets. In ye olde PC days, you included ONE version of an image and that was it (okay maybe a 16x16 favicon for the small start menu and a 32x32 one for desktop)... these days with loads of different combinations of resolution and DPI, and across multiple platforms (if you're doing a Cordova build and don't split the assets before packaging), stuff can and will grow.
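To make the multiplication concrete, here is the standard Android dp-to-px scale math for a single 48dp icon across the common density buckets (the byte counts are uncompressed RGBA, so real PNG sizes will differ; this only illustrates why each extra bucket adds payload):

```python
# Standard Android density-bucket scale factors (dp -> px multipliers).
DENSITY_SCALE = {"mdpi": 1.0, "hdpi": 1.5, "xhdpi": 2.0,
                 "xxhdpi": 3.0, "xxxhdpi": 4.0}

def px_sizes(dp):
    """Pixel edge length of a square dp-sized asset per density bucket."""
    return {bucket: int(dp * scale) for bucket, scale in DENSITY_SCALE.items()}

sizes = px_sizes(48)
assert sizes["mdpi"] == 48 and sizes["xxxhdpi"] == 192

# Uncompressed RGBA bytes for all five variants vs. shipping only the largest:
total = sum(s * s * 4 for s in sizes.values())
largest = 192 * 192 * 4
assert total > largest  # roughly 2x the bytes of the largest variant alone
```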

It's not that Devs don't care. It's because every metric anyone has ever gathered says that users don't care.

It's not that users don't care; they just don't know it's an issue or whom to blame. When they're out of space on their phone, they blame whatever company's logo appears on the phone; they don't blame the authors of the dozen bloated apps they installed.

This. People blamed Apple for them running out of space on a 16 GB phone. I agree photos, videos, and music take up space, but the first iPhone had 4 GB of storage. The iPod Shuffle had 500 MB. Not exactly apples to apples comparison, but Snapchat used to be about 5 MB, and now it’s over 100!

Could Snap have made the current app using only 6mb of space?

I'm sure they could, but i suspect the engineering effort would've been much greater. That's money down the drain if it turns out users don't care, or their competitors released earlier and grabbed more market share.

> It's because every metric anyone has ever gathered says that users don't care.

A lot of people go to wal-mart. Doesn't mean that it's a good thing overall for mankind

If you're looking for apps to be developed "for the good of mankind," without consideration of profitability, then you're going to have to have a different model than the pursuit of profit.

well of course, and I think there should be legal action in this direction. No reason to waste CPU cycles on the technologie du jour.

Adding onto this comment because people are latching onto the wrong part.

It's a greedy search. Just because you optimize for your local best interests doesn't mean the community will benefit overall, and the ecosystem itself suffers.

Look at how slow, ad-filled, and error-prone websites are these days.

So yeah, companies and people should be at least _concerned_ with the disparity between "best for me" and "best for all of us." Otherwise, we get things like rampant pollution. Well, this is app pollution.

I do not think you have worked in Enterprise. Getting new tech approved is difficult; provisioning a VM takes weeks. Most enterprises lag at least two years behind everyone else.

Enterprise apps are slow because developers lack shiny new tech. You are forced to build yet another Spring App that takes minutes to start and gigabytes of memory. You also need to use MQ to interface with other systems and log/audit almost everything.

It's hard to move fast and have a well engineered product.

Then don't move fast. It's as simple as that. Move at the pace that you can sustain a reasonably engineered app.

At which point everyone who moves fast will beat you.

Will they?

Amazon's differentiator is not the quality of their app. Unless they create something really impressive some day, they can only lose customers by moving faster there.

Besides, none of those are small startups anymore. All of them have something to lose.

> Amazon's differentiator is not the quality of their app.

In my only real experience with Amazon (Audible), the quality of their app is a negative; I'd much prefer they made an API available. The same can be said of others: I'd much rather have native apps connecting to a Netflix API than use their awful web player.

There might be countless failed chat/social apps that died because of the time they spent engineering things like size reduction or code quality. You probably just never hear about them, since, you know, they failed to get traction.

So size seems to be a problem only for engineers. A normal user doesn't care as long as their device still works. And Apple won't care, since they can use this as an opportunity to upsell larger devices.

90% of the time, the organizations that say stuff like this just don't have any discipline. You can move plenty fast with a well engineered product.

Awesomely, this blog post of less than 200 words and one screenshot loads over 1.41 MB for me.

Software expands to fill all available resources.

For the few among us who haven't read "The Website Obesity Crisis", this is as good an excuse as any to link it again:


The only issue I have with that site is that it's awful to look at.

Sorry for the extra bandwidth, forgot to optimize the images.

Well, that's the answer to your question. You forgot to optimize the images because, in your test environment, you don't notice it.

If you were testing over a dial-up connection, you would notice it.

Same with app writers. They forgot to optimize their app because they're testing it on the latest and greatest phones on their home network (or a corporate network which is blazing fast). If they tested on a low-end phone, and actually performed the update themselves over a slow cell network, they'd probably notice it.

Many common tools aren't set up for common-sense optimization. Ideally resizing images would be an automatic step, and you wouldn't have to remember. But that's not the case.

I'm sure that there are plenty of iPhone apps with 2 MB images from a camera, when a 256 KB image would do.

I used to work at a shop that ran a 1.5 mbps dsl line, and a ~5 year old, cheap, slow computer. If their code worked on that, it would work on any of our customer's machines.

I wish devs would still test like that. Sure, your app works fine in downtown SF on the newest phone, but try using it in Nowherseville, TN, on a phone that is more than a couple years old. I bet a lot of user frustration comes from dealing with that.

> Sure, your app works fine in downtown SF on the newest phone, but try using it in Nowherseville, TN

Granting that you heavily imply a 4G connection, the choice of "Nowherseville, TN" is extremely interesting: Chattanooga, TN — which as a Tennessean I feel is essentially "Nowheresville" to most non-Tennesseans — has Internet service multiple orders of magnitude better than most of the Bay Area[1]…

[1]: https://www.washingtonpost.com/news/the-switch/wp/2013/09/17...

I can agree that may be a bad example, but I think the point is fairly clear.

I do not, however, mean to imply a 4g connection. Many people don't even get that.

The Spotify app is dog slow on a Samsung S8+. Do you know how ridiculously overspecced this phone is and it can't load a Winamp clone with a terrible UI!

Hell, all of Android is far, far slower than it should be. I won't move to iOS but what Apple can do with a dual core phone is pretty amazing (and yes, I know that newer OS builds get slower).

I really wish android came with better tools to let you know what is bogging down the machine. If only they hadn't gutted the linux part of it, we could have much better tools at our fingertips....

Does anyone know of a good process-viewer/resource watcher for android?

Google killed all of the outside tooling by messing with their own variant of IntelliJ and build system. (Especially the latter. Eclipse plugin for gradle w/ android support is hopelessly incomplete, killing eclim which was the one reasonable code completion software.)

Apparently writing an IntelliJ plugin of a reasonable quality is much harder than a Python or other script. Who would have thought. And a glorified text editor takes GBs of RAM, likewise a glorified Makefile.

(The resource watcher is built into Android Studio, but ignores GPU memory. To do that you have to run GPU debugger, a separate memory hog.)

Sure, but the apps the blog talks about aren't two-man teams. And some of these firms were already touted as dogfooding their apps under resource constraints-


They should be doing better than this.

Images do not seem to be the biggest culprit (unless you already fixed them). I see 200kB of images and 600kB of javascript (most of which comes from disqus)

So which huge apps did you delete? Oh, none. Now you have your answer to why they don't optimize for app size.

For those who don't know, this is Parkinson's law :-)

yes! Do you remember where you learned of it by name?

(It came up for me in Ivan Illich's Deschooling Society :)

Worth pointing out: The author of this article works at Kayak, which has an iOS app of 176MB.


(As far as I can tell, the answer to "why is LinkedIn.app so large?" is not "because LinkedIn's iOS team sucks", but "because LinkedIn's iOS team works under a number of constraints, including app size, and app size is not a particularly powerful constraint to optimize for.")

I have been replacing traditional apps with PWA's or mobile websites wherever possible (on Android). They hardly take up any space and also seem to behave well (drains less battery) compared to traditional apps.

I could replace the following with PWAs:

- Twitter

- Uber

- Lyft

- Google news

- Instagram

- Flipboard

- Shopping sites like Walmart, Wish

and many more.

Facebook and Amazon have no PWA's but have mobile websites. (Facebook mobile web works well with Opera. On other browsers it annoyingly redirects to play store to install messenger)

I think that's a fine solution, but its looking at the wrong problem.

Consider an app like Discord [1], which is built using React Native and is thus a "native" app with some additional cruft like a JS runtime. It clocks in at a relatively small 30mb. Not bad.

Then consider Slack [2]. For nearly all intents and purposes it does the exact same thing - Discord actually has far more functionality than Slack. Yet Slack is 129mb.

Tweetbot [3]? 12mb. Twitter [4]? 204mb.

The issue has little to do with the technologies used. PWA, React Native, full native, it doesn't matter. The issue is truly that these large companies have horrible, bloated engineering teams and that bloat comes through in the size of the apps produced. It is Conway's Law in action.

[1] https://itunes.apple.com/us/app/discord-chat-for-gamers/id98...

[2] https://itunes.apple.com/us/app/slack-business-communication...

[3] https://itunes.apple.com/us/app/tweetbot-4-for-twitter/id101...

[4] https://itunes.apple.com/us/app/twitter/id333903271?mt=8

Well, here is an application that defies your approach: the Android clock. A 17MB update to that just now, on vanilla Android. Plus Google does some diff-style updates, so the full app is probably a lot more - and this is for a clock. Presumably it has 17MB of updated alarms in surround sound, and presumably these are needed; however, I can't see any other obvious bloat potential, as the clock should use the Android UX.

How can a clock need the equivalent of a box of floppies? Windows 3 and 3.1 together comes to the same Mb and that comes with a clock.

Those clocks did not have long high-fidelity ringtones and a billion options. And especially no Google Calendar support. ;)

PWAs? Public Welfare Assistance schemes?

Progressive Web Applications

It's the new name for a home screen bookmark.

Progressive Web Apps - basically HTML5 web sites that work well on a phone, and take advantage of some newer browser features to make a web page more "app-like." Not sure if there's a distinction to be made between PWA and Single Page Applications (SPA), but in either case, you can use Service Workers, local storage, offline mode etc as well as "app" features like notifications and a homepage shortcut. The experience ends up being very much like that of an app, without a huge install.

So sounds like the promise of web-based "apps" from pre-app store iOS to WebOS to Firefox OS... finally about to be realized thanks to official Google backing. RIP to all the minor players who came before and failed.

... without a huge install if you watch the weight of the initial page load and optimize for that. ;)

Pretty^W Progressive Web Apps (https://developers.google.com/web/progressive-web-apps/)

Essentially, apps on the web that feel and behave like a native app.

Progressive Web Apps

But those websites suck down enormous amounts of data each time they load.

Not if they use caching (whether service worker based or not) aggressively.

If an organization isn't taking the time and effort to make sure their app isn't bloated full of stuff, what makes you think they'll do that?

caching can be done reasonably well for straightforward cases with yet-another-library https://serviceworke.rs/

oh, the irony!

For the developers out there, here's one option for native applications that leverages the beauty of the web: http://jasonette.com/

I've been doing this too; but annoyingly I usually am in Private Browsing mode and when I go to launch one of these apps it doesn't work because I'm no longer logged in.

What exactly is the difference between a PWA and a mobile website?

I used one accidentally without realising (at first) that it was a PWA. So I'll just describe it in layman's terms: from what I can tell, it's a website, but it definitely felt more like an app than a website.

The best way to understand it is just to try one. It's just a website (PWA) pinned to your home screen, so there's not much cost to trying it.

Sounds very diffuse. I can pin any mobile website to my homescreen but I presume that alone doesn't make it a PWA.

PWAs can run from the home screen without an address bar and with a splash screen, and you can use things like Service Workers (some things can run in the background, you get better caching/storage, and you can even run your app offline, showing information/data from the last time you synced with the backend server).

Branding, obviously. :)

I have a cheap phone, and due to this I can only have like 6 apps installed at a time.

I'm constantly removing Facebook/Messenger for situations like when I had to download Ticketmaster app for a concert ticket.

And with all these apps disallowing you from moving them to SD card, I can't even really use my 32GB SD card for them.

There's also Facebook Lite https://play.google.com/store/apps/details?id=com.facebook.l...

And if Play Store won't let you install it then download & install manually from here http://www.apkmirror.com/apk/facebook-2/lite/

protip: switch to browser-based FB. it's perfectly reasonable.

And, you can still access messages by selecting "Request desktop version" from the Chrome menu.

This is faster than the app for me. My main problems with the app are: a) it was pre-installed, so it can't be moved to the SD card; b) not only does the app include a Snapchat clone I don't use, it also stores hundreds of MBs of user data; c) it tells me I have messages, but reading them requires hundreds more MBs for another app with another 100 features I don't use.

Didn't know about this one, thanks. That's an extra 300MB free on my phone from removing Messenger.

This no longer works for me, YMMV.

And the permissions aren't as nuts.

If you're on Android, you can try SlimSocial for Facebook + Notifications for Facebook, both available through FDroid. 223 KiB and 103 KiB respectively.

Folks: there's a built-in technology on your phone that allows you to load and run an app on-demand over the internet without dedicating any internal storage at all! It allows clean integration with many of the "native" features you expect like camera and notification and timers and stuff. And it's based on completely open standards with multiple, competing open source implementations.

No, seriously: uninstall that junk and just run stuff in the browser. It works much better than you think, the biggest annoyance being all the nag screens sites throw at you to get you to install their apps.

the biggest annoyance being that these are even less integrated and fine tuned to the environment than all those bloaty corporate apps. The best apps still are made by small indy developers, feeling right at home in their OS. Take for example Instapaper, Fantastical, Outbank, Due, Reeder.. You just can't make web apps so polished, so well integrated into the OS..

I'm afraid it's exactly as bad as I think. I use web sites instead of apps whenever I reasonably can, but most stuff I use frequently is much better as an app.

What is it that keeps online web apps from turning into the bloated monsters that Electron apps are on the desktop?

They don't need to bundle their own copy of Chromium and Node, presumably.

Facebook in particular likes to break their web apps to push you to use the native one. If you ever need to use Facebook in the browser try mbasic.facebook.com(it even includes their messenger).

Also, relevant xkcd: https://xkcd.com/1367/

> It works much better than you think

Thanks, but we do have smartphones and we are aware of this thing called a browser. We are also aware of how it works. Not "much better" for sure, if better at all.

"It works much better than you think"

It still doesn't work as well as a native app.

One of my first jobs was as a technician at a tech support call center. For a while around 1997-1998, a good 1/3 of our calls were customers who ordered a new system with a hard drive over 2.1GB, but Windows/DOS could only make partitions as large as 2.1GB. Customers wondered why they got a smaller hard disk than they ordered, not realizing the extra space was that Drive D under My Computer.

Fast forward to today, where my MacBook keeps nagging me that my "hard disk" (actually an SSD) is "full" because I only have 3GB of free space. In 20 years, what was once considered the maximum is now considered negligible.

Optimization is important, but regardless, software size is going to keep growing. Wringing hands over it doesn't help much.

This misses a large point of the article:

It's not the storage size that is problematic as much as the transfer size. Websites and developers really need to be aware how long it will take for someone to even get your product. If a website doesn't load in X seconds, you're losing Y customers. If your app takes an hour to download, your customer has probably already moved on in frustration.
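A back-of-envelope sketch of that transfer cost (the 275 MB size here is just an example, matching a figure mentioned elsewhere in the thread):

```python
# Rough download-time estimate: size in megabytes, link speed in
# megabits per second (1 byte = 8 bits).
def download_minutes(size_mb, mbps):
    return size_mb * 8 / mbps / 60

# A 275 MB app over a 2 Mbit/s cellular link takes over 18 minutes;
# the same app over 50 Mbit/s wifi takes under a minute.
for mbps in (2, 10, 50):
    print(f"275 MB at {mbps} Mbps: {download_minutes(275, mbps):.1f} min")
```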

Yes, this guy doesn't seem to care at all about it, beyond writing a blog post. If he did, he would remove those huge apps and find alternatives.

Perhaps the lament is that because nobody else cares, you can't actually find an app that is small to replace with!

There are lots of apps smaller and better than some of these. Consider a different running app, or using Facebook in Safari.

OP is saying that what we think is large now could be considered negligible tomorrow, storage and transfer alike.

> my MacBook keeps nagging me that my "hard disk" (actually an SSD) is "full" because I only have 3GB of free space

That's because it will use that space for caches and swap, and you're preventing it from doing so and making the system slower.

Yes, I know. My point was that today's "practically none" was 20 years ago's "more than the OS can handle."

I remember when apps used to take less than 4k of memory, because 4k was all the RAM in your computer.

Then I remember downloading a 3 MB mp3 on my 9600 baud modem and being amazed at how much space was taken up by music that sounded realistic and not like just a bunch of beeps out of speakers that could only make beeps.

Then came the old joke about EMACS standing for "Eight Megs and Constantly Swapping".

Then I remember noticing that commercial software like games filled up a full CD's worth of space (back when software was distributed by physical CD's). After that it was common to ship software as multiple CD's, then multiple DVDs.

Now, is software even shipped on DVDs anymore? I just download everything, and, yeah, apps are still bloating, same as ever.

Right? I just got a new phone that has 128 GB of internal storage. A 334 MB app? No problem.

That's cool that you can afford $500+ for a phone.

But the rest of us just want to use a phone with a couple apps. We can't afford phones like that, and high market-share apps shouldn't be designed for the top 1% of phones owners.

If Facebook wants people to use their shit, they can't design it solely for early-adopters.

When it only has the functionality of a 20 MB app... it's a problem.

When EVERY app does the same thing? It's a problem.

From that perspective, you have a very valid point.

It's really crazy. Overcast (my favorite podcast app) is under 10 MB because Marco cares about things like that.

Here's some of what's listed in the update list on my iPhone:

Chipotle is 92MB. The Kindle app is 171MB (perhaps the fonts?). The Amazon app (which is mostly a web view anyway) is 127MB.

Robocall blocker? 22MB. Verizon app? 160MB. An app for tracking streaks of achieving tasks? 65MB.

Slack is 123MB.

Clips? The Apple app for making little movies that includes a fair bit of art? Only 55MB. That makes sense.

Authy? To show 2-factor codes? 65MB!

Outside of games (which have a lot of assets), app sizes seem to have absolutely no correlation with what they actually do.

Short of a major customer outcry, Apple is largely incentivized not to fix this:

(1) They make substantial profits from memory upsells on their product lines. (2) Larger apps take more horsepower to run, so older models become less effective sooner!

But it also adds to their bandwidth and storage bills

It's probably a drop in an ocean compared to their other services.

Which they can easily price into the cost of the device, or into the cost of the app.

But if Android apps are smaller and faster, people will switch over to Android.

I highly doubt that. Few people actually know how big any given app is. Plus, there are more compelling things keeping people on their platform of choice, like iMessage or the sunken cost of apps that they've already purchased.

+1 for this. I think the 16GB iPhone 6 exists to promote sales of models with more storage. Now I have to uninstall an app to install a new one on my 16GB iPhone, EVERY TIME; it's painful. Next time, I will definitely go for 64GB.

The base for iPhones now is 32GB, not 16GB.

Most of the time it's the same reason web pages are MBs in size today: lazy developers who use a new library for every feature they need, without any deeper analysis of costs and benefits.

Not only that; there is IMHO also a "technological supremacy" bias.

Quite obviously most developers will have:

1) VERY powerful hardware

2) VERY fast (and unlimited) internet connection

It's not like they take a car, drive to some remote countryside (possibly in the middle of nowhere), stay there a couple of days, and try accessing their website (or running their app) on the cheapest entry-level hardware over a metered connection, even though they should, as part of quality assurance.

It's easy, when you have a T1 or faster connection and recent top-end hardware, to forget that a lot of other people have slower devices with less memory, limited bandwidth, and metered connections.

This absolutely.

I wouldn't be surprised if half of the outrage about file sizes simply comes from the fact that there are people who remember what software was like before all of this "technological supremacy" was a thing. People joining the workforce today didn't grow up with the experience of installing something off of seven floppy disks, which would have been considered an enormous program before the CD-ROM drive was common. They also don't know how much better those programs ran, because they had to do so with 8 MB of memory or less and nowhere near 1 GB of hard drive space.

It's pure decadence.

OT but not much, and JFYI, I started using an alternate unit of measure for web page sizes, the Doom:



Like: Hey, nice home page, it's only 3 Dooms...

I've watched entire episodes of anime in RM format in high school that were smaller than the majority of apps these days. Not just smaller, but like... 4x smaller or more.

It's not just "lazy developers".

For most freelancers and contractors the economics of the market means most of the time you have to be lazy or you will lose the job to someone else.

The problem is it's not really in Apples interest to get app sizes smaller.

Larger apps means you have more "need" to upgrade your phone to the latest version with more space, power, speed etc.

While that sounds like a clever trick that Apple could use, they really want to minimize production costs and maximize utility for consumers. Their profit margin comes from a share of that surplus utility.

I suppose it can help hide price increases of the next higher configurations, such as when the base configuration of the MacBook Pro decreased from 256GB to 128GB between 2016 and 2017, with the list price for 256GB configurations increasing by a couple hundred dollars. However, Apple has also gone as far as developing an entirely new filesystem to decrease storage use.

Except they have been working on it, by repackaging apps to strip out unneeded assets.

And it is in Apple's interest to have smaller apps, because then people will be able to download more apps. They'll be willing to try more apps, because there will be less of a barrier between seeing the download button and being up and running.

Plus Apple likes shipping lower end/cheaper devices to people who won't buy the high end.

When your low end configuration is hard to use because Facebook takes up 29% of the storage... that's a problem. Customers get mad.

When they can't update Facebook because it wants more space and the device is full and now the user is locked out from their friends... customers get mad.

When updating a handful of apps uses up 70% of their monthly data... users get mad.

And ALL of that affects Apple's bottom line.

This is usually my thought when I see issues like this. If Apple wanted to fix the problem, they would fix the problem.

this incentive analysis is so shallow it doesn't pass muster. does fitting/selling more apps make Apple more money? are storage and bandwidth expenses? do users value performance? most people here seem to agree more bloat = fewer users.

most people here seem to agree more bloat = less users

I haven't seen any data in this thread to back that up. It certainly doesn't appear true in my experience.

i think the disagreement is on what constitutes "bloat"

As someone with 16GB of space on my phone (before the OS), this has become really noticeable. It's a breath of fresh air when you install an app (like the habit tracker I installed the other day) and it's only 2MB...

It's worse knowing something like Facebook will cache a whole bunch of images, friend pictures and everything else.

Is anyone else worried about the massive amounts of bundled third-party libraries that come with each app from a security, rather than a size, perspective? What happens when such a library receives a security patch? AFAIK it's up to each developer to keep all bundled libraries up-to-date, which means that, realistically, everyone is shipping lots of vulnerable stuff and they don't even know it.

"This shirt is dry clean only, which means it's dirty."

Yes, that's exactly what it means. React has 630 dependencies, so ~630 separate libraries and components. You might even stop updating a component because new versions change the interface and would end up breaking sections of your codebase.

The idea is that because it's all open sourced, all the vulnerabilities will be found and patched. But more often than not you just end up missing the small notification from the maintainers telling you to update.

This is definitely getting to be a major problem. I've removed several apps for this reason as well.

Code bloat = lost users

I think the way it works in the real world unfortunately is:

code bloat => "your phone is full" => "oh, my phone is too old" => new iphone => free space => code bloat

That said, I also delete bulky apps before I start deleting media.

Yup, and also bulky apps that insist on a two week rolling release schedule...

Yeah it is a problem, and so far the way most companies fixed this issue is by releasing a 'lite' version which of course is probably not feature complete but will be significantly faster to update and load.

If you are on a good network you probably won't ever notice such bloat until you run out of disk space or bust your download limit (Canada)

If I find an app too big for my taste, I normally fall back to the web version, if it exists at all. But even then it's hit or miss, because sometimes the web app is complete garbage, or worse than the app itself.

I think it is unlikely that most people care, and the improved analytics and development speed are probably worth it. My Android is connected to wifi almost 24/7, so it doesn't matter how long it takes to update.

Most people have small flash storage and can't fit many apps of this size in the space that is left over from the OS and photos/media.

I tweeted some frustration about this a few days ago (https://twitter.com/GregoryOriol/status/889859849353383937).

What surprises me is that in the Facebook iOS app there is a "FBSharedFramework.framework" with a binary file of 215MB. What the fuck is this? How can a single binary get that big?

Facebook is THE mother lode of terrible engineering. They have turned it into an "art". You can see it across their end user offerings, as well as their open source ones. What is perplexing is that they have some of the most talented engineers out there. I'm guessing those people are using the resources of the company to research and do what is interesting to them, while clowns run the actual projects.

And here's a source to prove it: http://quellish.tumblr.com/post/126712999812/how-on-earth-th...

> "In the case of the Facebook application, there are more than 18,000 classes in the application"

18,000 classes. Absolutely ridiculous.

LEGO's app for their new robotic toy is 1.03 GB. https://itunes.apple.com/de/app/lego-boost/id1217385613

The Philips Sonicare Kids app, which is nothing more than a simple game for kids to track brushing their teeth, is 245MB.

I guess the good thing is that it gave me an opportunity to teach my kid about tradeoffs. "Ok, so if you really want this app, we're gonna have to delete 4 of your other games on the iPad." Even a 5 year old could reason his way out of that one.

We released 2 apps on iOS and Android. One is 0.8 MB and the second just 1.8 MB. Obviously we don't have a lot of graphics embedded. We use mostly C and 2 cross-platform projects, SFML and Nuklear, for the GUI. The GUI is more in the gaming style, but for us it fits the bill, and we render at 60 FPS on most devices, including my daughter's original iPad mini.

People don't care about app sizes, otherwise it would be an issue.

Look at this poster, he doesn't care. He isn't going to remove any of those apps. There are runner apps that are a lot smaller, but he doesn't care about the size of the app enough to guide what he downloads.

Empty blog posts are empty.

"Why is there so much traffic, someone should do something, I hate driving these days."

How do you know people do not care? There are billions of people with phones without enough space and slow or almost nonexistent internet access. They are just not vocal about it.

the solution is, everyone should uninstall these apps. if everyone did that, i bet you they would fix the problem really fast.

This is why I uninstalled Pokemon Go: it used too much data and too much battery. Uninstall, problem solved. Unfortunately, the vast majority of people don't care about app sizes, or how much data they use, etc.

Our app downloaded is just under 100MB. Biggest part? Google Maps for Business at 30MB. Why do we use this monstrosity? We signed some marketing deal with Google. We have a replacement using Apple Maps that's about 2MB. But the beancounters won't let us use it because of the $. Not every size problem is some programmer's fault.

I was comparison shopping something this week and wanted to check Best Buy, so I went to the app store. 100+ MB and needed to be on wifi to download. What in the Best Buy app could be over 100 MB?

On another note, I just went to check some of my apps and iOS 11 got rid of the size from that view. You now need to dive into each app to see the size.

Why didn't you go straight to their web site? It's quite usable in a mobile browser.

Cordova, debugging info, unscaled and uncompressed images.

It's not just the install size, but apps which accumulate and seemingly never delete data.

For a while I tried to get by with a 16GB iPhone SE, thinking I'd keep everything in the cloud, so why would I ever need 64GB? Well, every couple of months I'd have to delete and re-install NYTimes and reddit and some magazine apps, because they just grew and grew and grew in storage until I had 0 space left. It's like they simply cache everything you've ever looked at.

It's dumb, because other apps are intelligent -- they'll automatically purge cache data when storage gets too low. But not NYTimes. Not reddit. It seems pretty inexcusable, really.
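A minimal sketch of what that intelligent purging could look like: a size-capped cache that evicts its oldest entries once a byte budget is exceeded (the budget and entry names here are made up for illustration):

```python
from collections import OrderedDict

class SizeCappedCache:
    """Cache that evicts least-recently-added entries once over a byte budget."""
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.entries = OrderedDict()  # key -> size, oldest first
        self.total = 0

    def put(self, key, size):
        if key in self.entries:
            self.total -= self.entries.pop(key)
        self.entries[key] = size
        self.total += size
        # Purge oldest entries until we fit the budget again.
        while self.total > self.budget:
            _, evicted_size = self.entries.popitem(last=False)
            self.total -= evicted_size

cache = SizeCappedCache(budget_bytes=100)
cache.put("article-1", 60)
cache.put("article-2", 60)   # pushes the total over budget
print(list(cache.entries))   # only article-2 remains
```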

Another question has to be asked: do they have the motivation to reduce the size of apps?

Earlier this year, Wechat released a revolutionary (kinda) feature called 'Miniapp'. It supports releasing apps within Wechat itself (a bunch of xml/js files). All major Internet companies published their own miniapp in Wechat, which includes the most-used features of their full app and takes less than 1MB of space. Guess what? The miniapps were not adopted by most users; it became just a fad.

This means most users are not sensitive to the disk space used by an app. Apps know this, and thus don't have motivation to reduce the size.

I'm not sure how your anecdote is supposed to show that users don't care about disk space used by an app, as opposed to not liking one particular attempt to reduce it. WeChat also includes a feature to manage storage space on a per-chat basis, and I have used it before when I ran out of space.

Thing is, some Android users (like me) are at the limit of their internal storage and some apps can't be moved to an external SD card for some reason. Before installing a new app (or indeed just an update), I have to decide which of the hellishly bloated apps on my phone I can delete to free up space.

They don't. It's the same race we had at the start of the PC era. Most of the "optimizers" optimized themselves out of existence while twiddling bits in assembler.

A major culprit of the bloat is monolithic 3rd-party libraries/frameworks. You have to import the whole thing, even if what you need is just one simple function. Of course, you have the option of carefully studying the code and hand-picking the parts you need, but most developers will not do this, due to poor ROI.

One way to solve this problem is to promote modular library structures, and package management tools (CocoaPods, npm) should support importing fine-grained submodules, even single features of a library/framework, whenever possible.

Everyone keeps saying this, and it's probably true. What I don't get is... why the hell don't the libraries everyone is using support MODULAR compilation? Why are they including stuff people aren't using?

I'm not an app developer. But I've known about libraries that section off lesser used code as "addons" since... well... since I started programming. This seems so fundamental I don't understand why everyone isn't doing it. Especially considering the gains are far greater in the mobile and web world than desktop.
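Dead-code elimination is, at heart, reachability over the symbol graph: keep what the entry point can reach, drop the rest. A toy model (the call graph here is invented):

```python
# Toy dead-code elimination: keep only functions reachable from the
# entry point; everything else can be dropped from the shipped binary.
calls = {
    "main":       ["parse", "render"],
    "parse":      ["log"],
    "render":     [],
    "log":        [],
    "export_pdf": ["render"],  # shipped by the library, never called
}

def reachable(entry, graph):
    seen, stack = set(), [entry]
    while stack:
        fn = stack.pop()
        if fn not in seen:
            seen.add(fn)
            stack.extend(graph[fn])
    return seen

print(sorted(reachable("main", calls)))  # export_pdf is eliminated
```

Real toolchains do this at the symbol or section level (linker garbage collection, JS tree shaking), but it only helps when libraries are structured so unused pieces are actually separable, which is the modularity problem the parent comments describe.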

Normal apps are around 10 to 20MB, which is still big. But the reason for this is support for different screen resolutions: images of several sizes are bundled with the app, even though the phone only needs one of them.

Vector graphics will solve this, but is not mainstream yet.

The Facebook Android app is 88MB (zipped). I checked the APK to see why it's so big:

90MB of code, 30MB of assets (JavaScript, metadata, librtc, big JSON files for animations), and 30MB of resources (images).

> Vector graphics will solve this, but is not mainstream yet.

Then you're trading off battery life vs. space. Vector graphics can be far more expensive to render than bitmaps.

Is it possible maybe to render out the vector graphics once (on the device) and then have them pre-rendered (cached) until they change?

I guess it would add considerable overhead for a negligible improvement in size (compared to just shipping all images) for most devices.

EDIT: Too slow...

Why vectors can't be pre-rendered and cached?

That's what a bitmap asset is.

Except that a cache is disposable, and disposed often in practice.

But you don't have to redraw them all the time.
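A minimal sketch of that render-once-and-cache idea; `rasterize` is a hypothetical stand-in for a real SVG renderer:

```python
# Rasterize each vector asset once per (name, scale) and reuse the
# cached bitmap afterwards, paying the rendering cost only on first use.
_bitmap_cache = {}

def bitmap_for(name, scale, rasterize):
    key = (name, scale)
    if key not in _bitmap_cache:
        _bitmap_cache[key] = rasterize(name, scale)  # expensive, done once
    return _bitmap_cache[key]

render_calls = []
def fake_rasterize(name, scale):
    render_calls.append((name, scale))
    return f"bitmap:{name}@{scale}x"

bitmap_for("icon", 2, fake_rasterize)
bitmap_for("icon", 2, fake_rasterize)
print(len(render_calls))  # rasterized only once
```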

Yep, agree with the image bundling issues. We have tried to keep our footprint as small as possible in our own product (3D modeling application https://3dc.io) and are clocking in at 8mb on iOS. Out of that about 3MB are icons/splash images and other media. Regarding vector graphics, we couldn't use them, as they cause bugs and flickering with css animations in iOS Safari. But we do generate everything we can procedurally (primitives, colorpicker etc.) which saves a ton of space.

I don't know the status of iOS, but Android has done some work on sending diffs on updates, so real sizes should be smaller: https://android-developers.googleblog.com/2016/07/improvemen...

Doesn't do anything for size on disk, but should help network a lot.
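The idea behind those diffs can be sketched in a few lines. Real stores use bsdiff-style binary diffs; this toy version just reuses the unchanged head and tail of the old binary:

```python
# Toy delta update: ship only the changed middle of the new binary, plus
# how much of the old binary's head and tail to keep.
def make_delta(old, new):
    p = 0
    while p < min(len(old), len(new)) and old[p] == new[p]:
        p += 1
    s = 0
    while s < min(len(old), len(new)) - p and old[len(old)-1-s] == new[len(new)-1-s]:
        s += 1
    return p, s, new[p:len(new)-s]

def apply_delta(old, delta):
    p, s, middle = delta
    return old[:p] + middle + (old[len(old)-s:] if s else b"")

old = b"HEADER" + b"v1-code" + b"ASSETS" * 100
new = b"HEADER" + b"v2-code!" + b"ASSETS" * 100
delta = make_delta(old, new)
print(len(delta[2]), "bytes shipped instead of", len(new))
```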

Analytics and ads/tracking is one of the reasons. They're always writing/including more code to track every single click and pixel.

No discussion of bloat is complete without a mention of the demoscene, where sub-MB applications generate immensely complex graphics:




(I'm aware that these sizes do not include the OS and its libraries, but apps on a mobile device also have a similarly rich environment of libraries they can use.)

One word: Swift

It's making app bundle sizes explode.

The lack of a stable ABI means that the Swift runtime libraries have to be bundled into the apps, which accounts for around 10MB of bloat. Unfortunately, 10MB is almost unnoticeable on the scale of bloated apps these days. Does Swift do anything else to make apps bigger?

Could you please elaborate, or provide examples?

I can see how high-res pictures embedded in the app can add dozens of megabytes, and maybe a different runtime that you have to bundle for compat reasons, but multiple hundreds of MB?

Swift libs are ~ 10 MB, still 265 MB to go.

I have a limited understanding of how this works, but could it be that app devs are leaving debug stuff in the builds, and not removing that when it gets pushed to production?

Please, educate me. I'm all ears. :)

Often, it's really bad design. For instance, Facebook's app has 18k classes. In other cases, it's a lot of big 3rd party libraries.

With text-only web pages over 5MB, I wouldn't expect less from mobile apps.

Six apps, six embedded web browsers.

Does that add up to much in terms of app size? I was under the impression most embedded web browsers used platform web view components?

You're right - there should be minimal (if any) size bloat because of the use of browsers on iOS.

You're not allowed to include your own browser, and can only use the platform's web view, which is not duplicated on a per-app basis.

There are many culprits for app bloat, but using a browser is most certainly not one of them.

I'm sincerely curious what would be the worst offenders for app bloat.

It varies, but for most apps I'd wager it's library bloat. In my experience doing iOS dev, here are the common culprits:

- For games, assets are a big issue. Some do not compress their assets, and unfortunately most image-authoring tools make it easy to output PNGs that are much larger than they need to be. I think a lot of people by default assume bloat comes from images/icons/etc, but IMO this is a red herring for most non-game apps.

- Library bloat. Even simple apps pull in a large number of external dependencies, which contribute dramatically to app bloat. There's also a lot of code pulled in that replicate platform-provided functionality (see: the bajillions of layout libraries out there), which may be simpler to use than the stock Apple components, but add to your bundle size.

One of the common problems is that iOS open source dependencies are typically all-or-nothing - you end up pulling in a very large library even if you're only using a small slice of its functionality.

I think most disassemblies of iOS app bundles show that library bloat is typically a far larger problem than asset bloat.

In any case, I think the future will be something like Android Instant Apps - where apps are sliced up such that necessary bits can be downloaded on-demand. This gets users into apps faster, and saves space.

Congratulations, now you cannot use any app on an intermittent or slow network connection.

This "cure" is worse than the disease.

How so? The scenario being modeled here is a user who wants to use an app but does not already have it.

The status quo is that they must download a 150MB bundle before being able to proceed.

The proposed app slicing will allow them to download maybe 10-20MB before being able to proceed, with additional components (up to the 150MB total) downloaded on-demand.

If the user is on an intermittent or slow connection this is still a significant improvement over the status quo: the user gets into the app much more quickly.

Additionally, if the user only uses a small slice of the app's functionality (which is the 90% use case), additional downloading can be deferred until the user connects to wifi, at which point the rest of the app can be downloaded over a fast and reliable connection, all seamlessly without the user having to worry about it.

Apparently, Crosswalk [1] for Android adds "only" about 20MB.

I just compiled a Cordova-based Android-app without Crosswalk, but also without images. Just JavaScript, HTML and CSS and a few plugins and the APK is < 3MB. Was quite surprised by that.

[1]: https://crosswalk-project.org/

I think a few more years and we can finally stop bundling browsers. It's not like anyone wants to bundle a browser inside their app for fun. That being said, Crosswalk really solved some serious issues for my product so cheers to them.

Yeah, I used it a year or two ago, and it was nice for making sure the experience was the same across all Android devices. Glad to see the built-in WebView is taking its place.

Nope. Only Crosswalk embeds a browser engine on Android, and the project has been cancelled.

Cordova uses the system WebView.

Facebook did flirt with HTML5 but ditched it in the end. So that's not really true.

They went off the deep end with their "engineering" of the app (see: https://www.reddit.com/r/programming/comments/3m5n2n/faceboo...).

It's really amusing to me when "engineers" start talking about the "scale" of the UI. It's a client. Thin vs. Fat aside, if it's that fat it's almost certainly a bloated mess of redundancy and what is called "overengineering" (which is actually underengineering--that is, a deficiency of the application of engineering and architecture principles to the design of the application).

I had a friend who quit working for Facebook about a month before that slideshow was published, and it was hilarious to see her mock it so mercilessly. She estimated a good 40% was completely dead code, and another 20% was partially dead code (e.g. code for notifying people of "pokes" without any mechanism to view them). She also had some choice words for the engineer who published the slides, but those can mainly be pieced together from the general mockery on Reddit and hacker news.

Are these slides available anywhere? The backup link is down, too.

I don't know. I just did a search for "facebook ios can't handle our scale" because I remembered reading this some time ago.

There was a time when people used more and more websites because they were fed up with desktop apps. Hopefully we see the same soon again.

275MB for LinkedIn is the complaint; to one-up that, I recently got an update for Hearthstone on iOS of 2.3GB. It adds one pack of cards.

Pretty soon these will start to hit Apple's limit on downloads over cellular (something I still can't believe exists in 2017).

It should definitely exist, but Apple's one-size-fits-all approach is pretty dumb. It ought to be configurable. The 100MB limit is way too big for someone on a really limited cellular plan, and way too small for someone on a high-end plan.

It should be as simple as a toggle to turn it off.

You're typing from a first world country where there are not huge cellular costs or speed limitations. If you go to other countries, even say, Botswana, the option to download over a gig in apps every (other) week isn't cheap. Bandwidth costs.

Don't worry, you can get to third world carrier really quickly by taking a Greyhound bus to Canada.

As a Canadian, to get 2.5 GB of data from a larger cell phone company costs me over $100 / month (there are smaller companies that will reduce that, but still, it's little data for quite a bit of money). I'm very happy I can force things to download only over WiFi.

>(something I still can't believe exists in 2017)

Something that I'm very happy about in 2017 :-) Data is expensive in many places, and wifi/broadband is typically cheaper.

"It works for me so you shouldn't need the option" mentality at heart. I have unlimited data so why should I be under a ridiculous limit?

The reason the LinkedIn app is so big is the number of frameworks they use: 87 frameworks accounting for 248MB of the 277. The Swift runtime takes up about 20MB, and something called VoyagerFeed takes up over 190MB. All the images actually only make up about 12MB. There's also about 0.5MB for each localisation.
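For anyone who wants to reproduce this kind of breakdown: an .ipa is just a zip archive, so a rough per-framework tally can be scripted ("MyApp.ipa" below is a placeholder path):

```python
# Rough per-framework size breakdown of an .ipa (a zip archive).
# Sums the uncompressed bytes of every file under each */Frameworks/<name>/.
import zipfile
from collections import Counter

def framework_sizes(ipa_path):
    totals = Counter()
    with zipfile.ZipFile(ipa_path) as z:
        for info in z.infolist():
            parts = info.filename.split("/")
            if "Frameworks" in parts[:-1]:
                fw = parts[parts.index("Frameworks") + 1]
                totals[fw] += info.file_size
    return totals.most_common()
```

`framework_sizes("MyApp.ipa")` returns (framework, bytes) pairs, biggest first. Note these are uncompressed on-disk sizes, so they won't match the App Store's reported download size.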

>something called VoyagerFeed takes up over 190

"LinkedIn’s new flagship app, nicknamed “Voyager,” is a complete architectural and design overhaul available for iOS and Android, as well as an online mobile experience, in the coming weeks." https://thenextweb.com/apps/2015/10/14/linkedin-voyager/

Sounds like "VoyagerFeed" may be part of their own codebase.

I hope someone might take it upon themselves to make a Wall of Shame to let me know which apps I should just uninstall if I rarely use them.

Although the updates aren't always huge, Slack has been an atrocious app on Windows and iOS for performance and indexing. Absurd that the biggest companies have, well, the biggest apps.

Things will only get worse. The application paradigm is unsustainable.


You're entirely right, but "it is difficult to get a man to understand something, when his salary depends on his not understanding it" certainly applies to the entire "tech" industry, and the app factory in particular.

I expect this realization will only become widespread when open-source folks actually fix their software's deep-seated usability problems. The way to do that is not by hiring UX designers or appifying it; it's by thinking deeply about how humans interact with computers and blowing away the conventions that make computers and software unforgiving, hard to explore, and opaque to the uninitiated.

...So by hiring / lucking into someone whose job it is to look at UX. Like a designer.

The current alternative, treating the Browser like a poorly implemented, feature-scarce OS on which to run "web apps", is worse.

When I see articles like this it always reminds me of Carl Sassenrath's Fight Software Complexity Pollution - http://www.rebol.com/article/0497.html

Once upon a time I deleted music and videos from my phone to make room for newer music and videos, but nowadays I catch myself removing apps. Not because of the space they are using, but because of the app size itself. An app, once removed, is unlikely to be reinstalled.

Not sure that Apple has any interest in reporting smaller sizes. They have an interest in lower download sizes (less strain on their servers) and in users buying bigger storage tiers, along with the iCloud subscription of course.

They're working on making apps smaller on people's iPhones. They strip out all the graphics that don't apply to the phone that's downloading the app, plus bitcode is compiled for the specific architecture, so you don't have multiple architectures installed that you don't need.

Pile on the frameworks to speed development and to make mobile development more familiar-feeling for non-mobile coders. Instrument every user interaction. Use a cross-platform SDK. Squeeze every penny out of every ad network. Whoops! Bloated.

People here seem to be blaming over eager use of third party libraries for a lot of bloat. Is there no such thing as dead code elimination for apps that would eliminate the unused portions of your dependencies?

The only real solution I can think of would be for Apple to actually incentivize app creators to reduce their size. Bloated apps actively harm the Apple ecosystem by preventing users from having a large number of apps, which hurts Apple.

* Charge owners a fee for apps over a certain size, a "processing fee" or whatnot

* Charge owners a fee based on download costs. Under X MB it is free. The more your app costs Apple to transfer to their customers, the larger the fee.

* Penalize large apps in the app store search results or give bonus to smaller apps.

Or rather than straight-up bonuses/penalties based on size, go after specific things that cause bloat:

* Apps that don't use pngcrush on their PNGs

* Shipping WAV files instead of AAC

* ...

Maybe Apple doesn't even need to actually implement any of these, but just threaten to.

There are more native capabilities on devices now, which require more SDKs to leverage for cool stuff. Some of these SDKs are huge and bloat the size of your app.

And supposedly we don't need PWAs according to Apple...

Jobs wanted PWAs before they were around. The devs for the first version of the iPhone were told to write web apps until we realized the dev environment just wasn't up to snuff.

We're finally getting back to that original vision.

Swift induces large binary sizes through the language itself. You'd be surprised what a simple optional if-let creates in assembly, or how it uses template specialization with pretty much every typed collection interaction or some other standard library interaction. That plus the 10-20+MB that the Swift standard lib adds contributes a good chunk. Once ABI stability comes in and a bunch of in-progress size optimizations land, binary sizes will decrease quite a bit for Swift-using apps.

Apple also encrypts then compresses, which means the binaries you download in the app store are incompressible.

If Apple wanted to decrease IPA download size worldwide, they would let developers skip encryption and just sign their apps. That would be very relevant for developers of popular free apps. I'm guessing they encrypt then compress so they won't have to re-encrypt the binaries on users' phones once an app is uncompressed.

Also, all the SV big-cos have A/B testing practices with weekly release cycles that induce large line counts in their apps.

I think everyone copied the Facebook mobile dev style, which simulates what you can do in webdev. In webdev there is no cost to adding another team for another feature that lives in some section of the greater app, since it's just another webpage. You can create many A/B tests and roll things back nearly instantly. With the weekly cycle everything is under a feature flag, and you'll see a bunch of half-developed features sitting in the delivered binary turned off via feature flag. This inflates code size and creates the kinds of issues you see today.

Also, large apps start requiring management structures that I call hallways and elevators. A single indie app can be the equivalent of a one-room hut, which doesn't require any hallways, elevators, floors, boiler rooms, parking structures or stairs. If you look at the floor plan of a highrise, you'll realize a lot of the floor space is taken up by the elevators and hallways.

Once apps become as large as a highrise, they start requiring code structures that help manage the chaos, such as reporting systems, rollback systems, well-defined tree structures and so on. That and the sheer number of rooms they have create apps larger than they look.

I'm pretty surprised nobody mentioned ProGuard. Almost no one enables it unless they have to, and that's part of the problem.

What blogging engine is this? Real nice and clean.

There is a reference to django.css in the source, assuming you mean the style, as the blogging engine wouldn't necessarily dictate the cleanliness of the look.

My phone currently only manages 2 non-default apps. If it could actually install to SD like it promises I'd be happy.

It would be cool if users could specify a max size for an app, and then the app tries to meet that size or be deleted.

Many apps are loaded with lots of A/B tests, increasing the binary size... because A/B testing.

Interestingly LinkedIn reports on the Play Store as only 20MB. Though once installed it uses 111MB.

Just wait until they start adding 100MB+ (Core)ML data models after iOS 11 (with A9+ chips).

I'm just guessing this is to corner off a good amount of disk space so the app does not stop working from lack of space.

An analogy is downloading a torrent: the client preallocates a chunk of space on your disk equal to the size of the file being downloaded, and as the download proceeds it replaces the junk in that reserved chunk.

Again all this is my speculation.

Why isn't rsync used to update apps, instead of passing archives around?

Why isn't the Swift runtime included as part of the iOS ecosystem?

Didn't Apple implement delta updates for apps?

Advertising APIs.

I never understand where this increasing size comes from. For videos or hi-res photographs, I understand.

There is, however, no reason that code, whether compiled to a binary format or in a textual format, uses so much data. Heck, the memoirs of Casanova span 3000 pages, and are 6.5 MB. People don't understand how incredibly large a megabyte is for plain code.

Surely the 275 MB isn't all useful data (I wonder what compression ratios you get on 'apps'), and it should be possible to cut it down to a few MB.

Could it be that whenever coders today need some fairly trivial functionality, they tend to go out and find a library that contains it? So you end up with lots and lots of libraries where only tiny bits of them are used. Just a hypothesis though.

There are many reasons why apps are so big, but I think you are right about a major part of the problem. Development environments these days make it very easy to add in third party libraries for very little effort.

At a previous job we had a monolithic Java server that ended up at over 350MB of compiled code simply because each development team had imported whatever libraries they thought they needed. In some cases, 2 or 3 versions of the same library were included.

How did they get multiple versions of the same library to work together? Java loads things via the class path so I would have thought that would cause some sort of error.

Some libraries change the namespace (package names) between major versions, specifically to allow transition from one to another in a gradual manner, or to start following namespace guidelines better.


I don't see why any of this is a problem

This is why code coverage is so important as part of the general development cycle.

Wait, code coverage? How does that address the number of libraries used or duplicate versions of the same library?

I think this is likely the biggest culprit. Another user pointed out an analysis of the Facebook app:


Most of it is actual code - not assets. For some types of apps (see: games) assets do take up a significant portion of total size, but for most everyday apps bloat by code over-inclusion is likely a bigger problem than asset-bloat.

I wonder if it's possible to get major open-source libs to move towards more fine-grained build targets and internal dependency management, so that devs don't pull in a gigantic binary when they're only using a small slice of the functionality.

Depending on how they are made, how the linker is configured, and the phase of the moon, static linking can help by just taking the required functions.

Also I think we, as programmers, have taken the "don't reinvent the wheel" principle too far. The idea is to use 3rd party code to (1) save time writing, (2) reduce the risk of bugs, and (3) lower the maintenance burden.

But this makes sense only if the benefits outweigh the corresponding costs of integration, which also (1) takes time, (2) might be done wrong (especially because you don't understand the part you added), and (3) creates a maintenance burden. Of these, only #1 is solved when your development environment makes adding new libs quick and easy.

> I wonder if it's possible to get major open-source libs to move towards more fine-grained build targets and internal dependency management, so that devs don't pull in a gigantic binary when they're only using a small slice of the functionality.

Fwiw, this is already a trend in JavaScript land. Libraries are moving towards many small packages so you can import only the parts you need either manually or using a tool like Webpack to throw away what isn't used.

I don't think that's the right solution to the problem. Nobody wants libleftpad.so, and as someone who works on a distribution the very concept is horrific (making distribution packages for every three-line package does not make anyone happy).

What I think GP was arguing for is that you have libstring.so which you can strip down to just having leftpad or w/e with Kconfig or similar configurations (preferably at link time not build time -- otherwise you still have the same problem).

JavaScript people do not create libleftpad.so, they create libleftpad.o (metaphorically speaking), which will not clutter up your distribution, it will just clutter up some apps. I'd rather see 50 little files like libleftpad.o than 5 gigantic libraries.

But I agree with what you're saying in the second paragraph, which actually sounds just like what my GP meant by using "Webpack to throw away what isn't used". Having unused parts of larger libraries cut out would be a much cooler solution than just telling everyone to eschew kitchen sinks in favor of many tiny, bespoke items. Especially since finding the right small libraries is much harder than finding a kitchen sink that just works and has a sizable community behind it.

> which will not clutter up your distribution, it will just clutter up some apps.

Not necessarily. Distributions have to build everything, and in openSUSE we have an integrated build system[1] which has a lot of very useful features (rebuild when dependencies are updated, and automated QA runs before releases). Those features require you to have all of the dependencies of a project in an RPM package. Even if you don't end up shipping that package, the package is used for tracking security issues and backports and so on. You can't pull anything from the internet during a build, you have to depend on your BuildRequires.

Now take any average JS application that has over 800 dependencies. Ruby was already bad enough with ~80 dependencies, but adding another order of magnitude is just not maintainable. One of my colleagues gave a talk at LCA about this problem[2].

[1]: https://build.opensuse.org/ [2]: https://www.youtube.com/watch?v=4ua5aeKKDzU

I had no idea about this - thank you for bringing it up! But we seem to be discussing different ideas. I was trying to argue about apps, specifically iOS apps, like the original article was about. iOS apps aren't distributed on openSUSE, so if some devs make some kind of libleftpad.o for Objective C developers making iOS apps, I'm saying that's fine, and I doubt it would clutter up your distro because Objective C isn't exactly known for its cross-platform adoption :)

If we're going to talk about Linux packages, aren't they often written in languages with amazing optimizing compilers? If I write some C99 code that uses a gigantic library but I only use like 2% of it, my compiler will cut out the 98% my code doesn't use, so libleftpad sounds like an awful idea. That's one reason why packages on Linux distros aren't too big.

But I'm talking about iOS apps where, as others have pointed out, the available optimizations suck[0], and as such, I think that having libleftpad.o included in everyone's iOS apps isn't a big deal (note that iOS doesn't really have a nice way to create libleftpad.so anyway AFAICT because all code for your app is supposed to be sandboxed so no one else can mess with or use it). I agree that it would be really cool to just cut out the 98% of $GIGANTIC_LIBRARY that isn't used at compile time, but since Objective C doesn't seem to have that now, I think small things would be a really nice way to give users more space on their phones without removing features.

[0] https://news.ycombinator.com/item?id=14902174

Ah okay, I was talking about things like electron applications or desktop/server javascript projects. If you want to use $latest_js_web_framework and package it in a distribution, you're going to be in a world of pain (the same applies for Rust because their package management was based on NPM).

> I'd rather see 50 little files like libleftpad.o than 5 gigantic libraries.

And then you'll have to use software written in 50 different styles, with 50 different type and error conventions, with pieces that couple together or not at random.

> And then you'll have to use software written in 50 different styles, with 50 different type and error conventions, with pieces that couple together or not at random.

All of which matters not one hoot to folks _using_ the software.

Right, and most people could have much more free space on their phones (or, more apps). I'd much rather write some glue code than ask my users to dedicate 275 MB to install my app, not even counting cached data or anything else. There's a reason many of my friends and I have deleted apps like LinkedIn - it's not that we don't like them, it's that the apps were too big for the little value they gave us, especially when considering the web versions give us almost all the same capabilities, and only cost a couple KB of persistent storage (cookies + history + maybe some localStorage or whatever).

Maybe you're not familiar with how it works. The idea is that you do, let's say, "import lodash" then your packaging system says "A-ha! But you're only using these 5 methods!" and only packages those.

Behind the scenes, lodash is built out of many modules, but you as a developer don't need to think about that.

Because tooling isn't perfect yet, you have the option of helping it out manually and being explicit about importing submodules. We were able to reduce our compressed JavaScript size by some 60% using a combination of automatic and manual tricks like that at my day job.

Right, but from what I've seen (though I'm not a JS developer, so I might be wrong) is that most projects are not designed like lodash. They are small, single-purpose libraries that would be much better served as part of a module of a larger library. That was the point I was trying to make with the whole "libstring" and "libleftpad" point.

I've seen people try to package JS projects inside a distribution. >800 nested dependencies is simply not sane nor sustainable. The fact that they're small is even less excusable because it means that there was an active decision to not consolidate them into a single dependency that is properly maintained.

If you want "just pull in the things I use" you can already get that by having a static library, and if you were going to be shipping a private copy of the .so file in your app then surely you would be better off with a static library -- you don't get any of the benefits of the DLL (sharing with other apps, can update the DLL without shipping a new copy of the whole app), so why pay the cost of it?

(If you link against the static library then the linker pulls in only the .o files that the app used, so assuming that the library was written in a sensibly modular way you pay only the binary size cost for more-or-less what you use. The linker can probably discard at a finer granularity than whole-object-file these days, but only-the-.o-you-use has been there for decades.)

Probably the only widely used static C library with fine-grained dependency tracking is glibc. The source code structure required for that is probably one of the major reasons people call glibc's source code unreadable and/or unmaintainable.

I interpreted the comment in exactly the same way, and once again I'm baffled to see a perfectly polite and correct comment downvoted and greyed out.

Young minds don't like it when someone questions the wisdom of how they reinvented the wheel, even when the question comes from confusion instead of malice.

I'd be interested in seeing an analysis of the Instagram app post-takeover by FB. Its size has risen considerably since then.

Many C++ libraries follow this approach.

This happens a lot, especially with dynamic linking. I have a project that uses Qt, OpenCV, CUDA, PCL and VTK - fairly standard stack for 3D imaging and visualisation.

Since you normally need to bundle dependencies to account for different versions, this adds up quite fast. Qt adds 20MB for OpenGL, 15MB for the VC++ redist, about 30MB for other core libraries. Some stuff in OpenCV requires Nvidia's performance primitives (NPP), so in goes nppi64_80.dll - that's 100MB alone. opencv_cudaarithm310.dll is 70MB and even opencv_imgproc310.dll weighs in at 30MB. And on and on.

So yes, one little call to cv::cuda::remap adds in a boatload of dependencies when all the algorithm is doing is using a lookup table to sample an image.


It's pretty ironic that dynamic linking's main motivation was to eliminate duplicate code and reduce code size. Deploy a DLL once, every software uses it and doesn't have to include it.

Now it has turned out exactly the opposite way. If we went back to linking object code together we would get smaller sizes. Instead we have to include huge DLLs.

> dynamic linking's main motivation was to eliminate duplicate code and reduce code size

I'm pretty sure the main motivation is/was to allow patching of dependencies independently from applications. Very popular shared libraries might save space overall, but that is a secondary effect.

When I started out the main motivation was to get install sizes down. At least on Windows patching never really worked and ended in "DLL hell".

IIRC Windows has pretty involved infrastructure for deduplicating same executable images both in memory and on disk.

I had a fun time with this once. The Jetbrains Toolbox app is written using Qt and is 150MB. I opened the .app package on Mac, deleted the frameworks directory, and symlinked it to my system's Qt install.

It's now 6MB.

And it will break at random due to API or ABI incompatibility. Have fun gluing the pieces back together.

They seem to build it against a reasonable version of Qt, I haven't updated it in a while (because it's just a launcher), and when I do I'll just re-symlink against whatever Qt is currently installed (which I do keep updated)

I understand your argument, but in this particular case things aren't volatile enough that it causes any problems, and if it does it's just as easy to solve.

This would be true if the entirety of the library was used by the application. Wouldn't the linker throw out anything unused?

Depends on the language. If you're using something like C++, you can probably remove a lot of dead code. But with more dynamic languages, like Objective-C or Ruby or JavaScript, it becomes difficult or impossible to prove that a given chunk of code is never used. In Objective-C (which I'm most familiar with), the linker will keep all Objective-C classes and methods because it has no idea if you might be looking things up by name at runtime and invoking them that way.

Even if you can throw away dead code, you can run into bloat because a library has a massive set of foundational APIs that the rest is built on, and using one little feature of the library ends up bringing in half the library code because it's used everywhere.

You could use static analysis to find out if anything calls objects dynamically by name at run time, which could be a useful optimization. To be safe this would be one bool value for the entire code base, but with some care it would help. A larger issue is that size is simply not considered a significant issue.

With Objective-C, that will always come back "true." Even if you never use dynamic lookups, Apple's frameworks do. I suspect the same is true of other languages.

Interesting, I find that surprising due to the overhead involved. In most languages calling functions via string names is very expensive.

PS: Do you have an example?

Nibs and storyboards are full of this stuff. They instantiate classes by name, call methods by name, etc. Some example documentation if you're curious:




Objective-C's dispatch is built around looking up methods by their selector, which is effectively an interned name. Looking up the selector can be slow, but once you have one, invoking a method using a dynamic selector is as fast as invoking one with a selector that's known at compile time.

> Interesting, I find that surprising due to the overhead involved.

Due to the frequency of this, objc_msgSend (which handles dynamic method calls) is hand-written in assembly, with caching and "fast paths" to improve speed. The overhead can usually be brought down to that of a virtual function call in C++.

That's like saying shooting yourself in the foot only hurts the first time. But, it's probably not a major performance issue in practice.

That's the thing about performance. If you do something a million times, it's usually OK if the first time takes a thousand times longer than the fast case, as long as subsequent times are fast.

Look at how much code is written in Ruby and Python. Their method dispatches are way slower. To put it in perspective, it takes CPython about an order of magnitude longer to add two numbers together than it takes Objective-C to do a dynamic method dispatch.

The interesting part of this is that all this dynamism and dependence on the runtime can actually improve performance compared to C++. For example, it is perfectly possible to implement objc_msgSend in portable C such that, on modern superscalar and heavily cache-dependent CPUs, it is on average faster than a C++-style virtual method call.

Really? That seems highly unlikely, and doesn't match with any of the speed testing I've done.

When you get down to it, an ObjC message send performs a superset of the work of a C++ virtual call. A C++ virtual call gets the vtable pointer from the object, indexes into that table by a constant offset, loads the function pointer at that offset, and calls it. An ObjC message send gets the class pointer from the object, indexes into that table by a constant offset, loads the method cache information at that index, uses the selector to look up the entry in the cache's hash table, and then if all goes well, it loads the function pointer from the table and jumps to it.

> Wouldn't the linker throw out anything unused?

There (usually) is no dead code elimination for libraries.

Depends on whether you're statically or dynamically linking. When statically linking, the final binary shouldn't contain unused functions from external libraries.

A lot of modern runtimes don't have linkers. With Java, C# and JavaScript you get the whole library. There is no elimination of unused code.

Java as of Java 9 does have a linker.

I'm pretty sure this is part of it. Remember leftpad? For electron based apps they're just bundling all of chrome which explains the size too.

Leftpad is not an example of the issue mentioned. Leftpad is an example of a small module that was being used by a lot of projects. GPP is talking about pulling in larger libraries which contain code not actually used by the app.

(Also Electron bundles 'just the rendering library from Chromium' [1], not 'all of Chrome').

[1] https://electron.atom.io/docs/tutorial/about/#core-philosoph...

Sorry for not being specific. As for leftpad I just meant the concept of using libraries when you really don't need to. Not the size.

Well it wasn't that you weren't being specific, more that you were casually maligning the JS community unfairly, when the article isn't even about web/JS :)

(I also don't agree with 'don't need to'. The main takeaway from the leftpad debacle was the fixes to the npm module deletion policy, and hopefully people learning they shouldn't rely on an 'npm install' for production deployments! Whether people should use small modules is still up for debate, there are trade-offs [1] [2]).

[1] https://github.com/sindresorhus/ama/issues/10#issuecomment-1... [2] https://medium.com/@Rich_Harris/small-modules-it-s-not-quite...

As I understand it, when you link your binary, the linker will only include the used parts of the libraries. Linking to a 10M library but calling only one function will not increase your binary size by 10M, probably just a few bytes.

I think this is exactly it. If you start vendoring dependencies you begin to notice how much of other's people's code your software is really using.

A huge chunk of that probably goes to, well, videos or hi-res photographs to be used in building the UI. Hi-res splash screens plus a bunch of hero images plus 2+ prerendered sizes of each display element for different pixel densities plus a dozen 30 second tutorial videos can easily add up to a couple hundred megs of assets alone.

I think a large factor is the ever-increasing need to support different display sizes and pixel densities; developers essentially have to create and package several versions of the same assets into one package, and that is a lot of waste.

At least in the context of this article (iOS app bloat) this is no longer the case - developers upload all assets for all pixel densities, but Apple repackages for each specific device, so that each device only gets one set of assets. Same goes for binaries for multiple architectures - each device only gets the binary for its specific CPU architecture.

Also to the GP's point - Apple also now no longer supports splash screen images, so that element of bloat is no longer a factor (though some legacy apps have retained them pointlessly).

I think for non-game apps assets are not the primary driver of bloat.

Interesting, thanks for sharing that.

I didn't know this, as the only mobile work I have dabbled with has been game dev on Android, in which case assets are the main cause of bloat.

If the application is properly packaged, the appstore should be able to generate configuration-specific variants which only include the assets relevant to your device class (https://developer.apple.com/library/content/documentation/ID...)

This makes sense, a much better way to do things.

The worst part is that they probably use huge PNGs to make sure the flat graphics are clean. You know, the kind that are perfectly suitable for vector graphics.

Flat graphics are actually extremely compressible as PNGs.

That's actually one of the positives from moving away from gradients and shadows everywhere, funnily enough.

They are, but as soon as there is anything that is not a rectangular gradient with a continuous delta, they're probably still bigger than SVGs.

Unless your graphic designer provides you with vector graphics embedding a vector for each pixel of the rasterised image of the original vector graphic.


That's almost as bad as the autogenerated SVG I was once sent that used hundreds of thousands of thin horizontal lines to fill an area. Instead of, you know, a filled polygon.

The sheer number of vectors could make any renderer I threw at it crash.

Not for apps like Facebook. Their app binary is in the hundreds of megabytes.

Lots of things. Most companies now have huge development teams, and that means you're going to get a lot of duplicate and triplicate media assets, and duplicate and triplicate static libraries and frameworks (which Apple is trying to address with its new dyld improvements[1]). And you can't forget all of the analytics and A/B testing frameworks that are put in place. Each one of these apps can probably run 100^2 permutations of the app.

[1] - https://developer.apple.com/videos/play/wwdc2017/413/

>Surely the 275 MB isn't all useful data (I wonder what compression ratios you get on 'apps'), and it should be possible to cut it down to a few MB.

This is pretty tone-deaf. The bulk of apps are audio and UI assets, and the compression rates on those are quite good.

That said, the compression rate for iOS apps is horrible as Apple decides to encrypt and then compress the binaries, completely blowing apart the compression ratio of duplicate data.

Compress then encrypt is continually a source of vulnerabilities.

For protocols yes, but you need to look at the full picture. e.g., what would the oracle be in this case?

Encrypt then compress makes no sense at all. Compress then encrypt, or don't compress at all.

When I was developing a few pet apps, the reason was pretty simple. I was pulling in entire libraries for one or two functions. They were good libraries and good functions. I think multiple examples were Google-provided app development libraries that you roll into your app. Like, appcompat. Nobody can get by without appcompat anymore it seems, but nobody needs all of its functionality either.

Anyway, probably millions of lines of code, 99.9% of which I didn't call. There is a tool for stripping code that will not be called, and I reduced my APKs from ~20MB to ~1MB, although I wound up turning it off in the end because it was not trivial to enable correctly. (I was linking 3rd party binary libraries into my app, which complicated things.)

While I was typing this comment ghostly pointed out some great blog entries: https://news.ycombinator.com/item?id=14901602

Too many programmers, too many templates and frameworks, poor factoring, too much code reuse, bad build configurations, and lack of LTO.

In several interviews I've been questioned about the importance of my contributions throughout my entire career because the app sizes were so small (typically 8-12 MB).

When asked it was clear they'd already made up their mind, and discussions about optimizations, how the app actually did anything, MVC, MVP or SVGs didn't change that.

And that's how you have large apps!

To paraphrase Bill Gates, that's like measuring an aircraft designer's skill by the final weight of the aircraft.

Any idiot can create something bloated and complex, creating something small and simple requires much more effort.

But when you think about it, the final weight of the aircraft does say something, though? I guess an amateur could design something like a Cessna and have it flying, but designing an An-225 is a different matter.

This is more like people building a Cessna but having it turn out the weight of a 747.

Don't download them? Out of those apps the only one you actually need is Uber, everything else can be replaced with a web browser. I don't download anything larger than 5MB unless I absolutely need it.

I wonder if this is due to the lack of generics in ObjC.

Objective-C has fully dynamic dispatch. In a context like that, generics are just a type safety thing to ensure that you aren't sending messages to the wrong type — they don't make the executable any smaller.

They make code reuse possible. Watch this Sean Parent video https://www.youtube.com/watch?v=4moyKUHApq4 where he estimates that if Photoshop were rewritten using generics, the code size would go from 3,000,000 LOC to 30,000 LOC. You're right that during compilation generics are specialized, so you still end up with the code; however, going all in on generics removes a lot of accidental complexity.

I think you've misunderstood my point. What code reuse is possible with generics that is not possible in Objective-C without generics? I don't think there is very much, because Objective-C fundamentally doesn't care about the types of objects.

The full answer would be kinda long but this is a rehash of the debate that has been had many times over.

You cannot just compare Obj-C vs generics; you also need to account for the fact that when you are doing Obj-C development, you might need some fast code and as a result drop to C. And code reuse is hard to impossible in C.

The fact that code reuse can be done without a performance hit is enough of a difference on its own, as it pushes down the level at which code reuse can be done. E.g. you can use generics for graphics or audio or general low-level stuff.

Err, why not just drop to C++? Objective-C++ is real and works great. That's how Mac Chrome does things for example.

Interesting talk. If he's right, then a Photoshop clone ought to be doable. 30k LOC is doable for a single person or a small team. I'm guessing that even with generics, it wouldn't help that much. Also, I think there are probably a lot more algorithm LOC in Photoshop than he thinks...

Objective-C has had generics for years. Here's mention of how they are imported into Swift: https://developer.apple.com/library/content/documentation/Sw...

I'm aware, they are nowhere near comparable.

That has nothing to do with code size. You'll either get some sort of type erasure, which will create the types at compilation time anyway, or you'll have some sort of dynamic dispatch. Neither approach will make the generated code any smaller and may even make it bigger.

Other way around. Generics make apps bigger, not smaller, because the code gets duplicated as it is specialized for different types.

I know this affects Xamarin apps.

I'm building a product that would bring app sizes way down.
