NetNewsWire is another great example of a quality open source Mac application that doesn't force my laptop into 100% CPU utilization, which is rare among RSS readers especially.
Given that the author recently wrote an article about technical programming interviews, I'd consider this article evidence enough that he knows his stuff when it comes to data structures and optimizing apps, including apps beyond his own, and that he would bypass the programming interview stage anyway. (All this code is open source in NetNewsWire.)
Wow, I had no idea NetNewsWire had become free and open source. I stopped using it after v3 (because v4 removed some AppleScript functionality I depended on, although I now can't even remember what it was) and completely missed the news that a free v5 had been released, and open source to boot. This is fantastic!
I love the detail in the post! A lot of this seems to boil down to the author being able to drop to a lower level of abstraction when necessary. I'm curious to know how they got that facility. Did they dig deep when they found something was slow? Or did they do a bunch of research beforehand? Maybe the higher-level tooling didn't exist when they started working on the platform, so they had to pick things up out of necessity?
> A lot of this seems to boil down to the author being able to drop to a lower level of abstraction when necessary. I'm curious to know how they got that facility. Did they dig deep when they found something was slow? Or did they do a bunch of research beforehand?
I'm not the author, but I've got this facility as well. I guess part one is to have motivation --- it's hard to know what to research beforehand[1], but if you've got something that is actively too slow now, that gives you a clear direction. Part two is not being easily satisfied that things are fast enough, or as fast as you can get. There's almost always a way to make UI faster; it's just a matter of what needs to break to make it happen --- sometimes it's abstraction layers. You can go back in time and look at 1980s UIs, and while some were slow, many were super fast, even though the CPUs were slower than dirt compared to today. We're often marshalling a lot more data and pixels, but it just shows that processing everything in time for the next frame is a reasonable target. Learning to use the profiling tools available is a good concrete first step.
[1] But, don't let that stop you from research into things that are interesting to you, or seem like they might be useful. Knowing something can be done is immensely helpful to doing it, even if you don't remember how it was done.
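On Apple platforms, one concrete step beyond the Time Profiler is adding signposts, so your own phases show up as named intervals in Instruments. A minimal sketch (the subsystem and function are made up, not from NetNewsWire):

```swift
import Foundation
import os.signpost

// Hypothetical subsystem; "Points of Interest" makes the intervals
// visible in Instruments' os_signpost track.
let log = OSLog(subsystem: "com.example.feedreader", category: .pointsOfInterest)

func parseFeed(_ data: Data) {
    let spid = OSSignpostID(log: log)
    os_signpost(.begin, log: log, name: "Parse Feed", signpostID: spid)
    defer { os_signpost(.end, log: log, name: "Parse Feed", signpostID: spid) }
    // ... the actual parsing work being measured ...
}
```

Run the app under Instruments and each begin/end pair shows up as a measurable, named interval, which makes "where did the time go" a much easier question.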
Knowing what little I do about the author’s experience, his facility in dropping to lower levels probably has a lot to do with his having worked extensively at those lower levels out of necessity in the past.
Ah, I wasn't even thinking about motivation! Yeah, it makes sense that I'd have to start with motivation when I'm breaking down a perf problem. I'm reminded of the talk "So You Want to Be a Wizard" [1]. It definitely gets at what you're talking about.
Don't forget overcoming fear. SICP helped me most in that I overcame unconscious barriers of fear of complexity. The only way to defeat those is to embrace complexity and fight through it.
Yeah, it’s impressive indeed.
Simmons has been developing macOS applications for a long time, e.g. when he worked for Omni Group. Here’s an interview from 2003 where he talks about an early version of NetNewsWire: https://daringfireball.net/2003/03/interview_brent_simmons
Ah, dark knowledge through blood sacrifice. Was hoping they'd found an easier way :( Thanks for the pointer; it's been a while since I read some OG John Gruber.
There's no black magic here. Everything you see here can be found by running the profiler and reading the docs. I haven't been a Mac developer nearly as long as he has, and almost all of these are things that I do.
Mostly, yes; but the folklore like (paraphrasing) "don't use Core Data for this particular purpose" is also critical and isn't as easy to come by except through bitter experience (or war stories from other developers, if you're a good listener).
It’s clear from the docs that Core Data can use SQLite internally but doesn’t give the programmer full access to all the features of the database. No bitter experience necessary.
The bitter experience is needed to learn that there are no shortcuts, and that the higher-level library that makes everything easy for you will also bite you in the ass sooner or later. We aren't born knowing these things, sadly.
Aside from the specific optimizations listed in the article, and caring about improving round-trip times for remote API calls in general, the fact is that any app built properly with native MacOS/iOS APIs will be extremely fast.
Apple's APIs for macOS and iOS are simply better than those provided by Microsoft (for Windows) or Google (for Android). Apple's APIs are, for the most part, very well thought out; their engineers have put together fantastic layers in the operating system and its UI components. The primary reason so many apps perform poorly on Apple platforms is that those apps were developed using 3rd-party cross-platform tools (e.g. Electron, or making an iOS app with webviews rather than native elements).
Developers who make efficient use of Apple's APIs, and who spend time to reduce their backend response time on remote API calls, will always produce an amazing product. The best example I can think of off the top of my head is the Apollo app for Reddit on iOS; it is coded with 100% native platform APIs, with no slow cross-platform garbage. It's such a breath of fresh air to witness.
> the fact is that any app built properly with native MacOS/iOS APIs will be extremely fast
Nope, you’re reaching too far. Applications built using native frameworks will have a strong push to generally do it right, but nobody can save you from an O(n^2) loop. And even with native frameworks and engineers with access to the best resources, you can totally mess up: take macOS Catalina’s Activity Monitor app, which takes up approximately half a core updating a table view every second, which it managed to do previously with a mere fraction of the resource usage. You can’t fix everything.
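To make the O(n^2) point concrete, here's the kind of accidental quadratic that no framework will save you from, next to its linear fix (`Article` is a stand-in type, nothing to do with Activity Monitor):

```swift
// Hypothetical model type, purely for illustration.
struct Article {
    let id: String
}

// Quadratic: every incoming item rescans the whole existing list,
// so n existing * m incoming comparisons.
func newArticles(incoming: [Article], existing: [Article]) -> [Article] {
    incoming.filter { item in
        !existing.contains { $0.id == item.id }
    }
}

// Linear: hash the known IDs once; each membership test is O(1).
func newArticlesFast(incoming: [Article], existing: [Article]) -> [Article] {
    let known = Set(existing.map(\.id))
    return incoming.filter { !known.contains($0.id) }
}
```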
> The best example I can think of off the top of my head is the Apollo app for Reddit on iOS; it is coded with 100% native platform APIs, with no slow cross-platform garbage.
Apollo is not cross-platform, but it doesn’t always use native platform widgets as I have pointed out to Christian in the past. I don’t think this has had a major performance impact, especially since they aren’t in the most computationally expensive path, but those parts probably do not interact with the platform at the layer you think they do.
1. I did not mention Apple's own native apps as a perfect example. I have also noticed that Activity Monitor is an imperfect example. I'm not sure if it's a limitation of the APIs available, or if Apple engineers just fucked up. I 100% agree that Activity Monitor takes up far more resources than it should. Then again, monitoring every single process (plus thousands of threads!) running on an operating system is an intensive undertaking, so I'm not entirely sure who to blame, if anyone.
2. I will 100% defend Christian until the end of days. I respect the fact that you at least know his name, assuming that you can in return understand that everything he does is amazing. Yes, some of what he does directly counteracts the weaknesses that are innate to the Apple platform (which should be called out against Apple). His most recent release, for example, offers a "Smart Rotation Lock", which swaps between portrait and landscape depending on whether a GIF or video is playing; something Apple should have built in from day one. Also, he had a volume bar in the top-left corner on iPhone X and beyond before Apple even made that standard. IMO everything he does to break the "Apple philosophy" winds up getting adopted by Apple. Christian is that good.
3. Everything I said above, whether already existing at Apple or pushed towards the future, is already better than Windows or Linux. MacOS and iOS are positioned to take over the future of computing, minus one mother-fucking aspect (fuck you, Apple): gaming. Even if Metal is technically superior, Apple's vendor lock-in makes Metal worthless vs. DirectX and OpenGL. This is the only point I am willing to concede… Apple+MacOS+iOS needs to figure out how to play nice with cross-platform gaming.
Remember that Core Data manages a graph of objects: it’s not a database.
and therein lies the primary reason that I try to never use Core Data if I can avoid it. I almost always want a database. I almost never want to manage a graph of objects. If the backends I spoke to were also organized as object graphs, maybe I’d feel differently about it.
I always thought that using it as basically the db or cache for a remote API was a terrible misuse of Core Data, but a graph of objects does come in handy when driving complex UIs.
It is absolutely a terrible use of Core Data. Raw sqlite or wrappers are fine for that and will be considerably less difficult to debug when something goes wrong.
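For what it's worth, going straight at SQLite from Swift isn't much code either. A rough sketch against the system C API (hypothetical `articles` table, most error handling omitted):

```swift
import SQLite3

// Hypothetical schema: articles(url TEXT, read INTEGER).
var db: OpaquePointer?
guard sqlite3_open("articles.db", &db) == SQLITE_OK else {
    fatalError("could not open database")
}
defer { sqlite3_close(db) }

var stmt: OpaquePointer?
let sql = "SELECT url FROM articles WHERE read = 0;"
if sqlite3_prepare_v2(db, sql, -1, &stmt, nil) == SQLITE_OK {
    while sqlite3_step(stmt) == SQLITE_ROW {
        if let cString = sqlite3_column_text(stmt, 0) {
            print(String(cString: cString))  // one unread URL per row
        }
    }
}
sqlite3_finalize(stmt)
```

When something goes wrong here, you're one `sqlite3_errmsg` call away from the real error, rather than several abstraction layers deep.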
These all have a lot more network overhead than an RSS reader that only has to periodically fetch a bunch of XML files. I feel like optimizing these kinds of apps is a whole different problem space, and a "native" app is probably not going to change that so much.
At heart, Slack is a chat app. Chat apps have existed for 40 years and don’t require a whole lot of data.
The complex part might be displaying the data, but that’s not much different than what an RSS reader does.
Even calling a chat app “complex” sounds absolutely ludicrous. It’s just that devs are now used to shipping 100MB apps in the name of fast development, so Slack is what we get.
I'm not sure when you last used Slack, but there's a lot more to it than "just a chat app", and this sounds like one of those "I could build that in a weekend" comments.
There's video calling, audio calling, screen sharing, file sharing, permissions, media files sent around. It's not comparable to some old chat app where you only have to get new messages and display them in a chronological list.
I'm sure the app has to be able to deal with all that, and that unmistakably adds a layer of complexity which ought to show up _somewhere_.
On the other hand, when using the Slack app as a simple chat app - formatted text only, a few formatted inline code snippets, things like that - the experience can often be abysmal.
Just the other day, I was typing a _text_ message in a private chat, text only - no images, no fancy stuff other than a few bold and italics and inline code tags, and I could visibly measure the delay between typing and the text appearing on the screen.
That should never happen, regardless of whichever other fancy features the app _might_ be able to do.
> video calling, audio calling, screen sharing, file sharing, permissions, media files
Back before Apple dumped half the features and renamed it Messages, iChat had literally all of these features and then some, and it was fast. It was efficient. It didn't run my CPU up to max no matter how I was using it or how much, and it used very little RAM for an application in its class. It ran very well on hardware that Slack would choke to death on.
Slack is the way it is because Slack has other priorities above being fast and efficient, it's not inherent to the nature of a chat app with AV calling, screen sharing and file sharing (file sharing was table stakes for a chat app even 15 years ago or more).
Telegram has done it twice, since both their cross-platform and their native apps are speedy, and they're different codebases started by different people.
Slack/others got where they are by not paying attention, not because it's all that difficult.
There is an unofficial one for Mac, Seaglass (not by the core devs, and not actively developed anyway). Pretty much dead/abandoned.
Besides, irrespective of what the founder (or someone) might say or indicate at times, I really don't think Matrix/Riot is looking to compete with Telegram/WhatsApp/Signal. They are trying for a piece of what Slack and the like have. I am also giving up on Matrix ever becoming a personal IM app/service. It just doesn't make sense, and the biggest reasons it won't happen are:
- People are not going to bother finding different instances, or host their own - they just want one service, one server
- The UI of Riot is specifically designed for group/team chats and I don't think they will try to stand in two boats in one app or have two separate apps.
- There's no money in personal IM apps. So unless there's a war chest like Fb/Telegram have, or someone like Acton donates handsomely at some point, there isn't much there other than being another Diaspora, but in the instant messaging space.
> It’s just that devs are now used to shipping 100MB apps in the name of fast development
Reminds me of that time people were upset they had to download a hundred-MB OS update file. Turned out someone forgot to downscale some installer images from the raw bitmaps. After that, the update was actually only a few MB.
But they control both server and client, so they could use an optimized protocol over a socket, like something from the protobuf family. Keep in mind that XML is a weird format in that you take your data, convert it to text, then compress it, transfer it, decompress it, convert it from text back to data, and then interpret it. With binary formats these steps are a lot faster and more native to the machine.
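To illustrate the difference on Apple platforms (not a claim about what Slack actually uses): Foundation can encode the same Codable value as JSON text or as a binary property list, and the binary route skips the to-text/from-text steps entirely:

```swift
import Foundation

// Hypothetical payload type.
struct Message: Codable {
    let id: Int
    let body: String
}

let message = Message(id: 42, body: "hello")

// Text route: data -> JSON text -> compress/send/decompress -> parse text -> data.
let json = try! JSONEncoder().encode(message)

// Binary route: no text intermediary; closer to what the machine works with.
let encoder = PropertyListEncoder()
encoder.outputFormat = .binary
let binary = try! encoder.encode(message)

print(json.count, binary.count)  // the real win is parsing cost, not just size
```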
RSS readers usually refresh every 15 minutes by default. So it's one request per feed every 15 minutes, which is very low. Compare that to Discord and Slack, which are permanently polling and streaming.
Discord always feels pretty fast once it’s started. Slack is a different beast, at least on my machine. I’m not doubting there’s room for improvement, just that the comparison to the RSS reader in the OP isn’t a good one.
I wouldn't advocate for slower software, but is all of this really necessary for a fast RSS reader? My RSS reader downloads the entire XML file as a blob, parses it into an array of structs, then walks over the array looking for new links. It's not particularly slow. I can't imagine it would be worth optimizing this aspect of the code base.
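Roughly the shape I mean, using Foundation's stock XMLParser (a simplified sketch, not my actual code):

```swift
import Foundation

// SAX-style pass over an RSS blob: collect the <link> of every <item>.
final class LinkCollector: NSObject, XMLParserDelegate {
    private var stack: [String] = []
    private var buffer = ""
    private(set) var links: [String] = []

    func parser(_ parser: XMLParser, didStartElement elementName: String,
                namespaceURI: String?, qualifiedName: String?,
                attributes: [String: String]) {
        stack.append(elementName)
        buffer = ""
    }

    func parser(_ parser: XMLParser, foundCharacters string: String) {
        buffer += string
    }

    func parser(_ parser: XMLParser, didEndElement elementName: String,
                namespaceURI: String?, qualifiedName: String?) {
        if stack.suffix(2).elementsEqual(["item", "link"]) {
            links.append(buffer.trimmingCharacters(in: .whitespacesAndNewlines))
        }
        stack.removeLast()
    }
}

let feedData = Data("""
<rss><channel><item><link>https://example.com/a</link></item></channel></rss>
""".utf8)

let collector = LinkCollector()
let parser = XMLParser(data: feedData)
parser.delegate = collector
parser.parse()

let seen: Set<String> = []  // links stored from the last refresh
let newLinks = collector.links.filter { !seen.contains($0) }
print(newLinks)
```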
I don't know if it's "necessary," but you can really see the speed difference between NetNewsWire and, say, Reeder on the same hardware. It's not that Reeder is slow, it's that NNW is just really fast.
> I can't imagine it would be worth optimizing this aspect of the code base.
Well, this is the code path taken when the news reader is starting up or refreshing, right? It feels to me like that's the most important place to optimize an RSS reader, because it has a direct impact on the time between when the user clicks the "Refresh" button and everything is updated. Even if the user doesn't actually need to sit there and do nothing when it's updating, if updating takes 3 seconds instead of 10, it makes the whole experience feel much faster.
FWIW, I find Reeder to be faster and slightly more reliable than NNW on the same hardware. I'm switching to NNW anyway because the UI is so much better. Reeder's custom UI feels fluid, but it's really easy to do things like accidentally mark lots of articles as read and then delete them from the local cache. For some things, the boring, native UI is really the best UI.
Interesting -- I find NNW to be faster than Reeder on my hardware, but keep drifting back to Reeder because... hmm. I suppose because of its UI, even though I actually agree that NNW has a better UI. :) It's specifically because I like to have the newsreader only show me unread articles in my feeds unless I specifically switch it to "show all," hiding each article after I read it. That's something NNW doesn't do, and the affordances it has in that direction -- the "All Unread" smart feed view and the "Go to next unread" command -- aren't quite what I want.
Frequently, yes - profiling is important, of course, but feed readers disproportionately have power users who subscribe to many feeds, some of which can be very busy (e.g. monitoring or saved-search services), and some people want a fair-sized archive. When I first open a desktop reader app, I'm asking it to poll hundreds of feeds - it's not a server running constantly - and easily many tens of thousands of items before filtering. That means lots of repeated updates - and wanting that not to impact whatever is happening in the foreground.
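That startup burst is also where it pays to bound the parallelism, so polling hundreds of feeds doesn't starve the foreground. A sketch with Swift concurrency (my own toy version, not how any particular reader does it):

```swift
import Foundation

// Fetch many feeds with at most `width` downloads in flight,
// off the main thread so the UI stays responsive.
func pollFeeds(_ urls: [URL], width: Int = 8) async throws -> [Data] {
    try await withThrowingTaskGroup(of: Data.self) { group in
        var results: [Data] = []
        var pending = urls.makeIterator()

        // Seed the group with the first `width` downloads.
        for _ in 0..<width {
            guard let url = pending.next() else { break }
            group.addTask { try await URLSession.shared.data(from: url).0 }
        }
        // As each finishes, start the next one.
        while let data = try await group.next() {
            results.append(data)
            if let url = pending.next() {
                group.addTask { try await URLSession.shared.data(from: url).0 }
            }
        }
        return results
    }
}
```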
Not critical but definitely noticeable - kind of like how the Atom editor had a bad reputation for input latency even if it was still usable, coloring your perception of the entire app.
I kinda overlooked that this has to be a desktop app. Having the program download 100 feeds when I open it seems suboptimal when it could have refreshed them all in the background, but I guess that's the way it is. Work that's already done is always faster than work you have to do, if that's an option.
Definitely - the original NNW and, I believe, this successor can use server sync to avoid spending time polling things which haven't updated, but it's always been a bit more resource-intensive than it seems at first, especially when they do things like diffing items to detect changes.
That’s just moving the problem down the pipeline. But it’s the same problem. You can have a daemon to update the data, but if the data is updated slowly, or if the UI is updated slowly after data update, the user experience is still terrible.
No optimization is worth the effort before profiling.
If they found out that these operations are responsible for long enough delays, or high enough energy consumption spikes, it would make all the sense to optimize them.
Profiling is good at telling you which parts of your code account for most of the program’s runtime, but that’s it. It can’t tell you whether the performance is reasonable relative to the work done, or whether a totally different approach is necessary. Writing fast code goes well beyond using profilers, in keeping with the article’s thesis.
RSS applications all have a fixed feature set: subscribe to feeds, check feeds for updates, organize & sync data for multiple clients, and present the user's reading list on the screen. There are several RSS readers that all tick these boxes.
NNW is clearly a labor of love for the author who has been involved in macOS development and RSS for a long time. NNW is unabashedly Mac-only and unabashedly fast. I love reading about people's passion projects.
I just discovered the iOS app for NetNewsWire and I am blown away that it is only 7.5 MB in size! I did Xamarin development for a while; a blank hello world application is about 45 MB. Kudos to the team!
What is a good use case for such a "free" mobile app if the sync feature is behind a third-party paid service? Am I expected to always check my feeds only on my mobile? How useful is that?
As much as I would like to use NNW right away, it isn't really a useful service for me at this point in time.
There's in-progress work on supporting other sync services. Most relevantly, there's work on using a CloudKit backend[1], which would effectively be a NetNewsWire-only feed syncer service.
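Conceptually, that would mean read state lives in the user's own iCloud database rather than behind a third-party account. Something like this hypothetical sketch (record type and fields invented for illustration):

```swift
import CloudKit

// Hypothetical record type/fields; the real schema would differ.
func markRead(articleURL: String) {
    let record = CKRecord(recordType: "ReadState")
    record["articleURL"] = articleURL as NSString
    record["read"] = NSNumber(value: true)

    // Saves into the user's own iCloud account; no third-party server.
    CKContainer.default().privateCloudDatabase.save(record) { _, error in
        if let error = error {
            print("CloudKit save failed: \(error)")
        }
    }
}
```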
Feedly is limited with a free account. The point I'm making is that NNW should be using its own service to maintain the user's "read" states between its desktop and mobile apps.
I'm not a big Inoreader fan, but their mobile app is decent and stays in sync with their online service. In this case, I never have to worry about my "read" states getting out of sync.
When it comes to RSS adoption, the read-state needs to be seamless across all platforms.
It seems like you are saying, you should get this great use case handled for you, and not have to pay for anything: not for the app or for the service.
Is that what you read in two of my messages above? Interesting!
BTW, NNW is not charging as of yet. What I was talking about in my two messages above is the adoption issue with disparate platforms when the read-state feature isn't available. If I use NNW mobile, I'm bound to using it all the time (which is not practical). If I use its desktop app, I'm bound to using it on my Mac all the time (again, not practical). Anyone with a hundred or more feeds would find it hard to adopt such a service. I am not here to defend my liking of NNW, so I won't say anything in this regard.
Yeah your comments were not clear. But the reply comment to you made a great point (that you actually acknowledged, but don't seem to understand underlies why your point is problematic): NNW is free (in both senses of the word) and open source software. Brent and the team are not being paid to work on it. They are using a third party service (Feedly) as a backing store. You have the option of paying for it, or you have the option of not paying for it. Either way, if your argument is that NNW lacks a syncing feature you are wrong. And if your argument is that the syncing feature (Feedly) should be free, then your argument effectively boils down to "hey people who did this awesome free software, please make more free software for me." Which is not a strong argument.
Any particular reason why this is a local (desktop/mobile) app? I use Feedly and performance was never an issue, with the bonus that content and status are synced everywhere.
I don't want a website open all the time. I don't want to have to be connected to the internet to read my catalogue of saved articles. I want native performance, not "good enough" or "not really an issue" performance. I want 100% integration with my system; I want AppleScript-based automation. I want my system's accessibility settings to determine how I use the content, not depend on how some website's CSS handles my use cases — not that a lot of websites, particularly SPAs, are made such that VoiceOver can deal with them.
Those are my reasons why I'd want a local desktop app, anyway.
The whole point of the app is to provide native macOS/iOS UI. I use Feedly to sync with NetNewsWire on iOS (and will on the Mac once it's supported, supposed to be in next update).
Developer preference. Simmons is famous in the Indie Mac world, even though a good portion of his career has been at companies. He’s still most well known as the guy who developed the original NetNewsWire before selling that to NewsGator, a kind of Feedly before Feedly existed, and then after some events I really can’t quite remember, taking it independent again and eventually selling it to Black Pixel who gave it back. I believe they were acquihired a bit after that if I’m remembering correctly (but don’t quote me on that), so it might have been because they knew they were soon to sell.
This particular iteration of NetNewsWire only shares the name. NetNewsWire 5.0 was originally being developed as Evergreen because Simmons got interested in making another RSS reader. It would never exist as a web app simply because he’s not the type to want it to be anything other than a native app.
Less well known is he was also the original developer of MarsEdit before selling to Daniel Jalkut. MarsEdit to write blogs, NetNewsWire to read them.
I used to use NNW, but one time my 2-year-old kid did one quick swipe on my phone's screen while NNW was open and it deleted all the content. I'm not sure what gesture made it happen, but I switched back to Feedly.
Great article and nice work on NetNewsWire!