Why Apple’s new M1 chips are essential for rapid iOS development (doordash.engineering)
119 points by apalom on March 1, 2022 | 127 comments



"Based on rough usage patterns, we assumed an average iOS engineer does around five clean builds and 30 incremental builds each day..."

...wait, what? I know YMMV, but for me and most of my colleagues it's 1-2 clean builds per day and 30 incremental... per hour, sometimes.

"...In parallel with our modularization effort, we’re also adopting new technologies like SwiftUI and Xcode Previews. These technologies allow us to almost entirely remove the tweak-compile-and-run loop when developing user interfaces."

This sounds more like a SwiftUI ad/SEO blog post than a real developer's insights. SwiftUI Previews is notorious for crashes, timeouts and random errors. I have a lot of problems with these on small projects, not to mention bigger ones at work. It's also nonsense to claim it "removes the tweak-compile-and-run loop": any change to the model layer requires you to rebuild the project, otherwise Previews will show stale state. Not to mention that Xcode likes to rebuild the project even for a UI-only change.


30 incremental builds an hour doesn't sound like a good process to me. Personally I find if I am building too often I'm not thinking about what I am doing enough and I'm just throwing things at a wall hoping they'll stick/work. A good software process involves a lot of thinking, a medium amount of typing, a small amount of compiling. I understand that sometimes the ideal and the reality are different of course.


30 incremental builds an hour while doing iOS UI development can be entirely the norm: Tweak a font size; rebuild. Tweak the font again; rebuild. Change the text color from 'label' to something slightly gray; rebuild. et cetera.


I know people do this sort of thing, but I still don't approve of it as a good way to live :-) Sounds too much like editing a document by manually styling every last element instead of creating a theme with a consistent set of styles.
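(For what it's worth, a minimal sketch of what that could look like in SwiftUI, with a hypothetical Theme type centralizing fonts and colors so a tweak happens in one place rather than per-view; the names and values here are made up for illustration.)

    import SwiftUI

    // Hypothetical design tokens: change a font or color once here,
    // instead of hunting through every view.
    enum Theme {
        static let body = Font.system(size: 15, weight: .regular)
        static let caption = Font.system(size: 12, weight: .medium)
        static let secondaryText = Color.gray.opacity(0.8)
    }

    struct OrderRow: View {
        let title: String
        let subtitle: String

        var body: some View {
            VStack(alignment: .leading, spacing: 4) {
                Text(title).font(Theme.body)
                Text(subtitle)
                    .font(Theme.caption)
                    .foregroundColor(Theme.secondaryText)
            }
        }
    }

You still pay for a rebuild when the tokens change, but at least the tweaking is no longer scattered across every screen.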


It isn’t. Welcome to iOS development :-|


Edit: Whoops, I misunderstood your meaning, please ignore and accept my apologies! You mean it isn't a good way to live!

What do you mean it isn't? Of course it is, at a conceptual level. By your own admission you're endlessly tweaking each presentation detail. Wouldn't it be better to operate in an environment where minor presentational details look after themselves?

Don't get me wrong, if that's the way you have to do it, that's the way you have to do it. I've done similar things in GUI environments myself. But I didn't enjoy doing it that way and yearned for something better - even if I didn't have the time, opportunity, skill or talent to construct that better way myself.


Yep, all good :)

> Wouldn't it be better to operate in an environment where minor presentational details look after themselves?

From your lips to Tim's ears, buddy :)


You should seriously consider switching to Flutter when you can; its Hot Reload takes 1-3 seconds and is a GODSEND for UI developers. In team meetings it allows me to make changes on the fly in front of the team, love it!


Have you worked with SwiftUI or Jetpack Compose? The new tooling for mobile development basically requires building the app to simulate the layout engine. And when you're building interfaces, ideally you want < 1 second updates so you can tweak things that don't lend themselves perfectly to foresight and analytical thinking - you just need feedback. Jetpack Compose in particular is atrociously slow. When it was taking 10 seconds for Android Studio to register my keypresses, I knew it was time to upgrade to an M1.


The efficiency of these IDEs (Xcode and Android Studio) is absolutely terrible. My guess is that once these wonderful M1 processors become more mainstream, the efficiency will get worse until the M1s seem slow. Hardware can never catch up to terrible software performance; software developers won't allow it.


No I haven't done that specifically, although I've done some stuff with other GUI builders and I think I know exactly what you mean. Basically this is why I added my ideal vs reality YMMV qualifier :)

In an ideal world GUI tools would let you easily specify what parameters were required from the user, what standard very flexible and adaptable widgets were to be used to get those parameters, and how the parameters are constrained. All with a coherent special purpose language or some other fit for purpose technique, with an absolute minimum of pixel tweaking etc involved.

Sadly it's not an ideal world.


The main issue with these new UI frameworks is that they are composable and dynamic. It's great because you can make your own application specific or organisation wide UI libraries that are, in essence, a collection of free functions. But it also means that the dependencies of all those functions need to be compiled. This same workflow is fine in the JavaScript world because it's an interpreted language and the compilation step can be skipped.
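(A rough sketch of what that means in practice, assuming a shared SwiftUI components module; the module name and component below are invented for illustration. Because call sites depend on the shared code, editing the component forces recompilation of everything that uses it.)

    import SwiftUI

    // Imagine this lives in a shared "DesignSystem" module:
    // essentially a free function that returns a view.
    public func primaryButton(_ title: String, action: @escaping () -> Void) -> some View {
        Button(action: action) {
            Text(title)
                .padding(.horizontal, 16)
                .padding(.vertical, 10)
                .background(Color.blue)
                .foregroundColor(.white)
                .clipShape(Capsule())
        }
    }

    // And this lives in a feature module: any change to primaryButton
    // ripples into a rebuild of this call site and its dependents.
    struct CheckoutScreen: View {
        var body: some View {
            primaryButton("Place order") { /* ... */ }
        }
    }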


How big is the app you are working on? How long does an incremental build take including linking and signing?

30 incremental builds an hour sounds pretty high to me, unless you’ve got a smaller app. On Intel machines, I’ve been seeing minimal incremental build times of over a minute. The M1 obviously makes some pretty significant improvements in this situation.

However, while I’m relatively new to the iOS world, it seems linker performance is a fairly hot topic. I know more than a few people are eyeing support for Mach-O linking in mold, based on the massive speedups shown for Linux.


It really depends on coding style. For example, I always like to write the smallest possible change to address a feature or bug fix that's testable, then I start writing unit tests, then go back to code, then back to unit tests, and so on. So it's not uncommon to do a lot of incremental builds (but yeah, 30/hour is on the high side). Sometimes this is not feasible, e.g. working on a complex component which has a simple public API but does a lot of heavy lifting internally - in that case it takes a while to get to a testable change.


> SwiftUI Previews is notorious for crashing, timeouts and random errors.

Just going to plus one this point in particular. I just started working on an app for the first time in January, chose SwiftUI, and probably wasted at least a few days' worth of work time dealing with non-problems that become problems only because of using SwiftUI Previews.

Once I disabled Canvas and removed the Preview struct, my dev time got quite a bit more efficient.
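(For anyone curious what "removing the Preview struct" amounts to: the preview is just an extra type in the file, something like the sketch below, and deleting it, or at least wrapping it in #if DEBUG, is enough to stop Xcode from trying to render a canvas for that file.)

    import SwiftUI

    struct SettingsView: View {
        var body: some View {
            Text("Settings")
        }
    }

    // This boilerplate exists only to feed the canvas; deleting it
    // (or fencing it off for debug builds) disables the preview.
    #if DEBUG
    struct SettingsView_Previews: PreviewProvider {
        static var previews: some View {
            SettingsView()
        }
    }
    #endif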


I used SwiftUI for a side project and after 6 months binned the whole thing and switched to Flutter. I was finding that a huge portion of my time was spent trying to get the project to compile (it generally just needed many recompiles, so presumably there are race conditions in Xcode or the compiler) and the previews induced several minutes' worth of spinner every time I went near them.


That's pretty crazy. Fortunately once I stopped relying on Previews things quieted down. I have trouble debugging some runtime issues, but that's my fault for not being very confident with Xcode's debugger and such rather than a compiler or Swift issue.


What’s wrong with nibs or storyboards? Why does it have to be the latest shiny abominable object to emerge from the incubator?


> I know YMMV, but for me and most of my colleagues it's 1-2 clean builds per day and 30 incremental... per hour, sometimes.

Yeah, this doesn't make any sense for me either... I only do clean builds of any of my software when I am working on the build system itself or am cutting releases (so I'd measure them on a small handful per week), and I'm constantly doing incremental builds so quickly it is essentially a pipeline: I don't even wait for the build to finish before I'm making more changes. Why are they doing so many clean rebuilds?


I've actually found SwiftUI previews work fairly well, have to admit they get better with every Xcode version, and I don't understand why sometimes they need to recompile. Using them I generate snapshot tests automatically, and an app for our designer to see the various UI components on device through TestFlight. Currently only for our UI components library, but I'm hoping to expand the automatic snapshot testing to the app at some point.
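(Not necessarily how the poster does it, but a minimal sketch of the idea using the pointfreeco swift-snapshot-testing package, with its API as I recall it; the Badge view is a stand-in for a real design-system component.)

    import SnapshotTesting
    import SwiftUI
    import XCTest

    // Stand-in for a real UI component from the components library.
    struct Badge: View {
        let text: String
        var body: some View {
            Text(text)
                .padding(8)
                .background(Color.orange)
                .clipShape(Capsule())
        }
    }

    final class BadgeSnapshotTests: XCTestCase {
        func testBadge() {
            // The first run records a reference image; later runs diff against it.
            assertSnapshot(
                matching: Badge(text: "New"),
                as: .image(layout: .fixed(width: 200, height: 60))
            )
        }
    }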


My biggest issue with SwiftUI previews is tweaks to a single UI file rebuilding random dependencies. Like, I just changed a color, why is it rebuilding the graphing library AGAIN?


I'm not an expert on Swift, but I can only guess that the code injection mechanism used for Previews resolves changes incorrectly, or that the state gets out of sync, so it's safer to rebuild the project than to crash or time out. Which unfortunately still happens quite often, even after a rebuild.


One thing that is surprisingly slow on the M1 is running an OpenGL ES application in the simulator.

I developed a game on a shoestring budget, and hoped that an M1 could be used for capturing video of the game at the resolutions Apple wants, but it was way too slow for that, so I had to just leave video out of my store page.

The same app run directly on the M1 Mac via its iOS compatibility works completely fine (as it should; it is a 2D game with about 10 sprites that just get drawn with scaling and no rotation).


I've tried a number of games from the app store (relatively simple 2D games which don't melt your phone) on an M1 Pro MBP, and found they all constantly maxed out multiple CPU cores for no apparent reason.

There's definitely something wrong even with the iOS compatibility layer.


There is no OpenGL driver on M1. So your app is running through a translation layer to the native Metal driver underneath. All OpenGL rendering on M1, no matter the framework (iOS, Catalyst, macOS) has pretty terrible performance because of that.


Is this true for WebGL as well or does Safari/Chrome/Firefox do something special?


All browsers use the ANGLE library to translate to the underlying platform 3D engine (Direct3D, Metal, OpenGL, Vulkan?).

That aside, AFAIK M1 Macs only have OpenGL as a translation layer (to Metal) for programs running under Rosetta.


Simulator has never supported hardware acceleration for OpenGL ES. It is no different on an Intel Mac. Only Metal in Simulator runs on the GPU. This includes WebGPU if you're using Safari.


Sounds like it went through the software rendering path...


You are correct. It is frustrating because it clearly can be hardware accelerated; Apple just hasn't bothered to do it.


I was under the impression that OpenGL on M1 devices just passes through to Metal at this point.

Does OpenGL ES not get this treatment...?


My guess is that they haven’t wired the iOS Simulator to use the M1’s hardware acceleration. I’m pretty sure WebGL (which uses OpenGL ES) is hardware accelerated on M1.


And iOS apps not running within the Simulator have hardware accelerated OpenGL ES too.


Simulator support for GPU rendering is far from trivial so it isn't about "bothering". Metal jobs are packaged up along with shaders and proxied out to the host macOS Metal framework. Textures are backed by IOSurfaces which do some fancy Mach memory mapping tricks to share memory between your app in a simulator, the macOS side of Metal, and sometimes GPU memory (on Intel Macs with discrete GPUs). For the most part this is transparent to you and works even when Simulator vNext is running on macOS vPrevious.

The vast majority of developers use higher-level system frameworks, an engine that already has a Metal renderer, WebGPU, or Metal directly on Apple platforms.


With the amount of cash Apple has, it very much is about bothering. They have the resources to make their development tools not terrible, but instead they seem to be investing in... cash for a rainy day.


Maybe they plan to bring an M1 iPhone to market, so it's not worth the considerable engineering effort to fix a problem that will go away by itself in a while.


I upgraded to an M1 Pro from a 2015 i5 Macbook Pro and have seen even more dramatic results. ~3 minute clean compile times for our iOS app, down from over 20 minutes!


That's exactly the jump I made.

From my perspective, compilation I do (mostly Go) is near instant versus 20 seconds.


> Figure 1: Plotting the cost of “time spent compiling” of the old vs. new laptops shows the one-time cost of the hardware upgrade quickly pays for itself.

Where the horizontal axis is labeled "10", "20", "30", ... without any unit, and the vertical axis isn't even labeled beyond "Cost" (no units, no numbers). If you want to make a graph, fine, but remember your math teachers and at least get the axes right before even thinking about the data. And before drawing any conclusion.


Wonder if they've looked at Bazel and rules_ios [1] (or similar)? Bazel brings caching and reproducible builds (all but eliminating the need for clean builds) in addition to transparent remote build execution. I haven't used the iOS rules, but I'm pretty sure that Square and some other large companies do, and Reddit just posted [2] about their own transition.

1: https://github.com/bazel-ios/rules_ios

2: https://www.reddit.com/r/RedditEng/comments/syz5dw/ios_and_b...


Essential for any development occurring on a Mac IMO. I upgraded from a fully specced out 16inch to a base model M1 with the 32GB upgrade (a machine roughly half the price I think) and it's night and day. I can use Docker now without Slack getting laggy.


I guess the new benchmark for development is can it run Slack? (While developing)


In my experience, yeah unfortunately. Slack goes to shit if the computer is under load, way before other software.


Just be thankful you don't have to use Teams.


Arguably, Slack having performance issues might increase most developers' productivity.


I realise your post is in jest, but I've found that responsive Slack (due to a similar machine upgrade) has actually had a noticeably positive effect on my productivity. Checking a Slack notification is now a ~2-5 second task, quick enough that I can switch back to my task without losing context. It used to be more like 10-20 seconds if my machine was under load, which was highly disruptive.


I do not in any way post in jest.

Slack is an abomination.

Slack is all the bad organizational behavioral issues of email, except harder to filter with rules & with more sender-side options for creating noise for the recipient.

It's lovely to send a slack and get quick response.

It's a huge productivity killer to get dozens/100s of half-thoughts ad-hoc sent over IMs to individuals and blasted to groups.

Is there a massive production outage? (I care about)

Did someone just share a funny GIF? Is Bob helping Jim debug a firewall issue for a client? (none of which I care about)

I got that stupid red dot, hard to tell without checking.


When they broke Slack the other day I actually celebrated for a couple of hours and got some work done for a change. No joke, it's the single most productivity-destroying tool I have ever used.


When I had to use Teams it gave me a good excuse to opt out of bullshit meetings. It used to be that I could leave the BS meeting in the background and get some work done, but with Teams running I couldn't even use my IDE without seconds' worth of keyboard latency, so now they had to explicitly choose whether to let me go or have me stay and get nothing done.


I can’t wait for the next macOS + iOS updates so I can use Universal Control to run Slack on my iPad and never have to open their bloated web/electron app ever again. I wish Slack would just let us run the iOS app on our macs with Catalyst, but they seem pretty bent on pushing desktop OS users to the web/electron version.


Is the iOS app not also web based?


For a short while when the M1 macs first released, people were successfully sideloading the iOS versions of Discord and Slack on them which were significantly lighter, but I think Apple has patched it now and it's not possible anymore


Why would they block that? That's a huge selling point of the M1!


Last I knew the Slack iOS app is built with React Native, which still isn’t the best in terms of resource usage but is better.


Probably, but if it’s not using my development system’s resources I don’t particularly care.


I thought the base model M1 maxed out at 16GB? Must be an M1 Pro or Max.


The 14- and 16-inch models only come with the Pro or Max, so yes, that must be what they meant.


Yeah, that was unclear. I meant the base model 16-inch, which has an M1 Pro chip.


Why do companies even build locally anymore? It's never been cheaper to spin up your own cluster that's going to be more than enough horsepower. If you give your devs the $899 air instead of the $2300 16", I feel like that $1400 surplus per dev would go a long way in buying some nodes that are going to still be some muscle 5 years from now. You could buy two mac minis per dev and set up an m1 cluster if you wanted. If you didn't want that overhead, plenty of companies sell compute by the second now these days too.


Having your entire codebase and build process running on the machine in front of you has a lot of other efficiencies that an external cluster takes away. What about the overhead of syncing your code to the server every time you make a change, and getting the binaries back to test? What if you want to work from outside the office? What if the network connection is bad?


>What about the overhead of syncing your code to the server every time you make a change, and getting the binaries back to test? What if the network connection is bad?

A bad network connection is severely disruptive to almost all dev workflows, so it's not a unique argument against remote development. The other factors you raise are not really relevant: the "overhead" of syncing code should be invisible with modern dev tools, and binaries should be built remotely anyway. I.e. the laptop should basically be used as a thin client + web browser.

>What if you want to work from outside the office?

This is an argument in favor of remote development. When everything is on the cloud, it doesn't matter whether you are in the office, at home, or travelling.


Absolutely no way. I often work remotely and can go for hours without even a network connection and still be 100% productive, all on my MacBook Air, which is not too expensive.

If you are outside the office, chances are your internet connection is not as good. The scenario given here where everything is in the cloud sounds like a nightmare for me.


What are these modern dev tools? I'm still using git. In order to send my code anywhere, I have to make a commit.


Developer happiness. Same reason my company doesn't make me fly economy when I travel. A cheap laptop says "drone on, you drones." A nice laptop says "we want you to be happy here."


I'm with you on the point about the laptop, because the laptop is something I use every day, but not on air travel.

Air travel is such a colossal waste of environment and time that topping it off with additional cost to the company for "developer experience" is so unnecessary. If a company wants me, a developer, to fly in business class, I'd immediately think that they prefer to concentrate on showing off instead of concentrating on things that really matter. Unless it's a fucking Ryanair (I don't know what the US equivalent of a shitty el cheapo airline is), I'd be fine flying economy. I'd rather the company spend the money on my salary or work environment instead of a one-time show-off so that I could sip champagne and put my feet up while flying for 4 hours.


> Air travel is such a colossal waste of environment and time

You're sure? For everyone across all fields? The simple fact is in many professional contexts, air travel is mandatory (including for many developers). I fly several times a year for work because the nature of my development job demands it.

> I'd rather the company spend the money on my salary or work environment

This is the point I was making. Since I have to travel, a relatively cheap business class upgrade (not first class) is the company spending money on my work environment.


Champagne? Sounds like you are describing first class and I was thinking the person you replied to was talking about business class. Typically, that's a seat closer to the front of the plane with a tiny bit more space.


The 16" MBP with the M1 Max is the best-performing Apple computer available, according to Geekbench. Some Mac Pro configurations outperform it on multi-core benchmarks, but those start at $6,500 for rack-mounted configurations.

The new MBPs are incredible machines and worth the money in engineer productivity. For an iOS engineer, I'm not sure there's a better setup in Apple's current lineup.


For a lot of companies, $1400 per developer isn't enough to worry about. That's probably a day or two of all-in developer time. Not worth it for something you are only paying for every year or two, especially if your developer prefers the 16" machine over the Air.


The cost to buy devs their own high spec devices, that makes them independent of any network issues, is a rounding error compared to the cost of the time and effort lost in maintaining the distributed build infrastructure, and the cost of lost work during network outages.


A build cluster is necessary for CI, but using it in place of local builds sucks from the developer's perspective, especially when working remotely, or when the build queue is stacked, or when the scheduler randomly takes a dump and you need a sysadmin to kick something.


Yeah, let me build a binary on the server and then pull it down to run on my phone.


What about the salvage value of the laptops? This analysis seems to ignore that. Macs hold their value quite nicely.


Even Apple is moving into this space with Xcode Cloud.


I bought an MBP with an M1 two weeks ago. Never owned a Macbook before, although I use a company MBP, which is older and Intel based.

Actually I wanted to compile an Android app on it, which uses some C code. On an MBP with an M1, Android Studio does not handle that natively and without hassle in the release or beta channels - they seem to have some support in Android Studio Canary now, with a little futzing.

I'm not in a rush, and will probably wait until I can just compile the Android app from Android Studio release without any jury-rigging needed. So I'm not stuck really, I have the time to wait for it to all get into the NDK post-canary release and the Android Studio post-canary release and I will proceed from there.

Also, I have never used Fidelity's Active Trader Pro app, and wanted to check it out, as my desktop is a System76 box running Ubuntu and ATP is not available for that. So I tried to install it on the MBP with the M1 and - no go. Not sure what is needed for this to work.

I think I have Rosetta installed, but it hasn't helped much.

I'm not really stuck on these things, they're just observations. It is as fast as everyone says. The local Apple store only had 16GB RAM MBPs in stock, so I got one instead of waiting for a 32GB one. Android Studio still compiles apps pretty quickly though.

It would be nice if all of these apps and operations worked smoothly and out of the box now, but it's not a huge deal. The purchase works for me for other things and neither of these things have high urgency presently.


"Slow build times? just pump more hardware"

Sounds like what Rust people do: they can't acknowledge the stupidly, insanely slow build times, so they tell you "it works fine on my Threadripper, don't be poor lmao".


> Based on rough usage patterns, we assumed an average iOS engineer does around five clean builds and 30 incremental builds each day

Why five clean builds a day? That seems like a lot. Any iOS devs care to chime in?


Depends on what you’re working on and how that’s going. It’s easy to imagine shifting between completely different forks/branches with a ton of dependencies where you’d want to clean build. Or when you go to submit work you make sure unit tests run from scratch, etc.


I believe it. I did benchmarking on my algorithms and it wasn't uncommon to see speedups in that range just executing on newer hardware from an old laptop.

However, it's kind of amusing to think how shocked we are by this development when such things were taken for granted in the 90s.


I have been putting off upgrading my ThinkPad for a while now because the performance increase has always been incremental - a 15% speedup does not really warrant a whole new laptop - but the M1 is the first time I've actually been impressed with the gains.


Where do these companies exist that buy the latest hardware for their devs? Please tell me, I've never seen one! ;) Honestly, maybe it's an Australian thing (where I'm based), but I work for an international company now too and it's the same.


Silicon Valley companies often do this, especially when something as newsworthy as this comes out.


The same is true for Android development - we saw build times cut in half after upgrading to M1 machines. Another advantage is that there is no noticeable slowdown while Gradle does its thing, so multitasking is now significantly easier.


Does the toolchain on x86 still build an x86 and ARM executable(s)? Perhaps that contributes to the 2x build time?


For macOS universal binaries, yes, they still build both architectures. For iOS, it's only ever building for the platform you target, simulator or device, so the only change there is that the simulator is now an arm64 target.


Typically in debug mode "Active Architecture Only" is set to "Yes." This means for most local builds you'll only have been building one architecture regardless of platform

This applies to incremental and clean builds running locally with default Xcode settings

When you archive and build for the store then your compilation will include all architectures


Writing integration tests on the M1 is now fun instead of tedious and painful!


I had to laugh at the opening: "The tools are largely free", except the bare minimum to develop for iOS is a Mac and the Apple developer account (if you ever expect to publish your app). And a significant subset of functionality only works on physical devices and not in the Simulator. If you don't already have the Mac and iPhone, you're looking at $600 for a used MacBook Air, ~$300 for a used two-generations-old iPhone, and $100 for the dev account. $1000 is not "largely free".


> $1000 is not "largely free".

It is absolutely insignificant compared to the cost of developing software.

It’s a tool needed to do a job. Go talk to a carpenter, a builder or a mechanic and ask them how much they spent on tools. Then compare their hourly rates to that of a software engineer.


What cost(s) are you referring to?

I can develop software comfortably on a $100 laptop and whatever choice of FOSS operating system. You can get a workable but less comfortable setup with an inexpensive device like the raspberry pi or a used system.

That is part of what makes CS special compared to other fields, the tools and information are available to everyone who has a working computer and internet access instead of being extremely expensive and locked behind gatekeepers like colleges.

Also keep in mind that the free tools and operating systems are not "junky" or whatever you might assume if you aren't familiar with them. GCC and LLVM are examples of extremely high quality FOSS toolchains, Debian and FreeBSD for operating systems.


In the developed world I presume.


> Then compare their hourly rates to that of a software engineer.

Their rates are equivalent, if not higher, where I am.


How quickly we forget what development costs looked like in the recent past

>PlayStation developers need to cough up £ 12,000 for the full system (which Sony is adamant it doesn’t make money on), although all subsequent software tools and hardware upgrades are free.

https://www.retroreversing.com/official-playStation-devkit

Not to mention other obscene costs that were common, like charging developers to issue a patch for software sold online.

>Double Fine's Tim Schaefer pegged the cost of submitting an Xbox 360 patch at $40,000

https://arstechnica.com/gaming/2012/07/microsoft-comes-under...


I don't know if that's a great comparison since only 7,918 PlayStation games were ever released while computers and phones have literally millions.


Just reading the replies to this thread it's unbelievable how out of touch people are with how software development happens in most of the world. I can guarantee that there are many companies in India where the entire engineering department has less than $1000 worth of hardware combined.


You need a computer to compile at all. If you’re doing iOS you probably already have an iPhone. Old physical phones (cheap) are fine for dev, can simulate latest. $100/yr only needed if you’re actually charging for the app - which should earn you those costs pretty soon. Yes, $1000 baseline for what you likely already have most of.

The “largely free” is in contrast with platforms having >$25,000/yr entry costs just for a license.


Except that it's actually in direct contrast with Android development, where the tools are available, actually for free, on Windows, Linux, and Mac. And the developer account is a one-time $25.

My personal situation: I wanted to add an iOS client for the game I've been developing in Flutter. I did not have a mac or an iphone already. Fortunately, I've been able to borrow them for builds and occasional testing. But, that's obviously not optimal, and there's zero good reason that Apple couldn't provide their build tools for other platforms. Instead, they leverage their monopoly on the app store to force hardware sales to developers.


> and there's zero good reason that Apple couldn't provide their build tools for other platforms

How do you figure? Then Apple would have to maintain their build tools for other platforms. Waste of resources for something that ultimately isn't going to make Apple money. It's no different than Microsoft intentionally gimping Excel on Mac.

I've never seen anyone complain that they have to own a Windows computer to develop Windows apps. Or a Playstation to develop Playstation games. I'm not sure why Apple is such an exception in your eyes.


> Then Apple would have to maintain their build tools for other platforms. Waste of resources for something that ultimately isn't going to make Apple money.

It makes sense on Apple's part, but I still dislike it.

> I've never seen anyone complain that they have to own a Windows computer to develop Windows apps.

Windows can be installed on any computer, whereas macOS requires purchasing Mac hardware. Additionally you can cross-compile Windows apps on Linux (and possibly Mac) using MinGW toolchains (or with great difficulty, MSVC on Wine).


> Windows can be installed on any computer, whereas macOS requires purchasing Mac hardware

Running Windows natively requires an Intel PC, an Intel Mac via Boot Camp (but I'd advise buying a two-button mouse with a scroll wheel if you do that), or a subset of ARM PCs for which Microsoft provides ARM builds. That's definitely not "any computer" or even "practically any computer."


Ever heard of AMD?


Came to post this, what a joke. Are they steeped in privilege or what?!

Maybe they're simply unaware of the alternative ecosystem costs.


It's not just about JavaScript and Android; take a look at FPGA development... several multi-thousand-dollar licenses out the wazoo.


Well, professionally (and it's likely a lot more than that), but in the hobby market there are a lot of low-end devices with free or zero-cost tools.

The iCE40 boards are sub-$20 (iCESugar nano) and have open source toolchains. And things like the ECP5(G) boards are large enough to host full RISC-V/etc. environments for less than $50.

So, yeah, FPGAs can get outrageously expensive, but there are a lot of low-end devices with a lot of capabilities (those ECP5s have high-speed SerDes, which can act as PCIe cards, for example).


Yeah, I was gonna post a high-end JTAG as an example, but for mobile development, sheesh, Android is actually virtually free.


On one hand, these machines are clearly superior, and it's a no brainer for companies to purchase them for development. On the other, this is just another one in the long line of examples of bad software practices being overcome by throwing more money and hardware at the problem. Is there any reason for an iOS app build to take 8+ minutes to begin with? And does anyone realistically believe that once all developers have these shiny new computers build time won't creep up again till everyone starts demanding M2 and M3?

This problem extends to the consumer side as well. Developers use expensive MacBooks and fiber internet connections and the end user has a laggy experience running the hundreds of megabytes of JavaScript on an average site. The user has no choice but to upgrade their computer, and then companies see the new spare CPU cycles and fill them up as well. The end result is that while hardware keeps getting ridiculously faster and more efficient, user experience doesn't noticeably improve.

It is just Wirth's law (https://en.wikipedia.org/wiki/Wirth%27s_law) in action.


Yeah that's about it.

What I find really offensive, which my Mac likes to remind me of regularly on the energy usage side of things, are the siloed vats of excrement that JavaScript, browsers and Electron have enabled. These are absolutely horrible to use compared to native applications, and each one has its own ozone hole above it. If I don't open a browser or Electron app my battery lasts 2x as long.

I do the occasional bit of Swift with Xcode just for fun, and concentrating on that case really does a disservice to the real elephant in the room above, because at least Swift is not burning the universe to the ground and recompiling bits of it every time you click something. When you're done, it's native.

Edit: personal wastage shitlist: Lens, Discord, Teams, Slack, VS Code. VS Code gets a pass as it doesn't annoy me as much as the others.


Teams is the worst offender: it will single-handedly cycle my battery on a medium-long video call.

That’s just battery usage, not even touching on general performance, how buggy the app is in general.


Honestly, the worst-performing app I've ever used in my life. It makes any machine running it slow as shit.

When someone is sharing video or their screen it's even worse.

I absolutely cannot fathom how it passes any form of QA.


It's not that bad on my M1 MacBook Air to be fair. I had to do a training session which was 3h long on Teams and it ate 20% of the MBA's battery.

It was horrible on the previous Intel i5 MBP though.

I refused to install it on my M1 Pro MBP!


100%. This is the new Flash that Steve Jobs successfully railed against for ruining the battery life and performance of Macs back in the day - https://youtu.be/EMXwa9EtehE. And things got so much better for a while when that went away.

I feel that with JavaScript, and especially everything being Electron (I shudder to think how many copies of Chromium I am running at the moment), we've gone back to those days - but worse.


Look at the bright side: bad/lazy software helps the world develop faster computers that can be used efficiently for "real" projects (i.e. scientific research). If computers were "fast" enough there would be no incentive to make them faster and we would never reach the singularity.


It's also probably just the most efficient use of time and resources. "The market" could choose to optimize the last drop of speed out of millions of bits of software, or improve the speed of a handful of chips, which benefits those millions of bits of software.


Your argument is more or less a critique of Swift itself, which is a far more full-featured and "safe" language than Objective-C, at the expense of substantially higher compilation times for otherwise like-for-like programs.

You're not necessarily wrong, but that ship sailed for iOS development long ago, and it isn't necessarily the fault of teams like DoorDash, who have probably (like others) already squeezed as much build-time performance as they can out of their given app size and feature set.

Mobile apps are inherently monoliths, and there is unfortunately no real way around some code changes triggering full rebuilds of millions of LoC.


Definitely will echo this. The app I work on probably has a lot of the bloat the OP is complaining about, but compared to the last app I worked on it's quite a bit leaner and still compiles at least 5 times more slowly because it uses Swift, whereas the last project was 100% Obj-C. Even if we could somehow convince management to let us clean up the app, we'd still be at the mercy of however long the Swift compiler takes.


Don't use type inference, and modularize the codebase into separate static libraries. That should reduce the number of full rebuilds and make builds faster. (ETA: also try to avoid generics.)

Ofc, probably a lot harder to go back and redo a huge legacy codebase where you still have new deadlines to meet, but for folks just starting out, these should be guiding principles in a Swift codebase if you care about compilation time.
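(A small illustration of the type-inference point, with made-up numbers: one big inferred expression gives the type checker a lot of overload combinations to explore, while naming the intermediate types keeps each step cheap to check.)

    import Foundation

    // Slow pattern: literals, operators and a closure in one inferred expression.
    let subtotal = [1.99, 4.25, 12.0]
        .map { $0 * 1.08 + 0.30 }
        .reduce(0, +)

    // Faster to type-check: annotate the intermediates so each line is
    // resolved in isolation.
    let prices: [Double] = [1.99, 4.25, 12.0]
    let withFees: [Double] = prices.map { (p: Double) -> Double in p * 1.08 + 0.30 }
    let total: Double = withFees.reduce(0.0, +)

Swift can also flag the worst offenders with the -warn-long-expression-type-checking and -warn-long-function-bodies frontend flags, if I remember the flag names right.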


This reminds me of the same phenomenon that occurs in traffic planning[0]

[0]: https://www.vtpi.org/tdm/tdm64.htm

Briefly, as impediments to congestion are introduced (bike lanes, tolls), individual cars either join or leave the traffic system until the delay for cars reaches a rough equilibrium based on what people will tolerate.

Seems the same is true for computing capacity.


This may be true of 50-in-one megaprojects such as those from the likes of Uber and Google, but these machines are also beneficial to developers of more lean, focused, and optimized apps too. The few seconds my M1 Pro shaves off of incremental builds and the general improvements in IDE responsiveness compared to an upper-spec Intel mac pile up and become worth it quickly, not even mentioning quality of life improvements like far better battery life and being able to use the laptop in my lap comfortably.

For the projects I work on, compile time has been declining with each hardware gen despite app complexity increasing significantly. I’m sure there are other devs who this is true for too. We’re not all 5 abstractions deep and nonchalant about heavy resource usage and bad engineering.


> far better battery life and being able to use the laptop in my lap comfortably

These are the main reasons I sold my 2015 MBP to get the M1 Air. I was not doing anything computationally heavy, so I don't notice any performance improvement. But the Air never gets hot. And the battery usage is so light that I don't worry about its charge status.


> Is there any reason for an iOS app build to take 8+ minutes to begin with?

The speedup isn't limited to iOS app builds. For instance, at Reddit:

>We recently found that the new 2021 M1 MacBooks cut our Android build times in half.

https://twitter.com/softwarejameson/status/14559711620606976...


I guess Wirth was still not cynical enough because even he couldn't foresee that developers would deliberately break and degrade a working, fast application (namely, the website experience) just so they can go and write two separate, bloated applications that do somewhat the same but not really. Only to then complain about build times of applications that didn't need to exist in the first place.


Developers don't make those choices.


The user does have a choice: do not use crappy slow services. For example, I stopped using Instacart because there was a long lag each time I typed a single character in the search box. Amazon and Peapod did not have this problem.


That is such a rookie mistake if they are doing the search immediately on every input! Debounce it, ffs! I hope that was it and not general UI slowness.
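(Something like the minimal Combine sketch below; the view model and search call are hypothetical.)

    import Combine
    import Foundation

    final class SearchViewModel: ObservableObject {
        @Published var query = ""
        @Published private(set) var results: [String] = []

        private var cancellables = Set<AnyCancellable>()

        init() {
            // Wait for a pause in typing before searching, instead of
            // firing a request on every keystroke.
            $query
                .debounce(for: .milliseconds(300), scheduler: DispatchQueue.main)
                .removeDuplicates()
                .sink { [weak self] text in
                    self?.performSearch(text)
                }
                .store(in: &cancellables)
        }

        private func performSearch(_ text: String) {
            // Placeholder for the real network call.
            results = text.isEmpty ? [] : ["Results for \(text)"]
        }
    }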


Yep, agreed. Having spent some time digging into why Xcode builds are slow, it looks like low investment is a major reason. The tooling and optimizations are fairly primitive compared to other ecosystems, and harder to get right where they do exist. It's just not a priority, it seems, likely because new features win out... and selling new hardware "solves the problem".


How much did Apple pay DoorDash for this blog post?


And how about treating your contractor labor as real employees next, eh? Why not give them some nice, shiny, new hardware?



