Time Machine in macOS 10.15.3 is very slow on first full backup (eclecticlight.co)
178 points by miles on Feb 12, 2020 | 153 comments

The article is mostly about the fact that the first backup is slow. That was my experience too, but I wouldn't call that a "serious bug" in the context of backups. I'd call silently corrupted backups a serious bug. There's this paragraph:

"Several users have reported to me that they too have experienced serious problems with Time Machine in 10.15.3, both in making first full backups and in trying to restore from existing backups. At least one of these has been reported to Apple as a bug in Time Machine, and has apparently joined several previous reports of the same problem." (My emphasis)

Clearly there are people who have what I'd call serious bugs, including people in the comments here. But it doesn't seem to me like anyone has proved that there are replicable serious bugs with Time Machine - although of course just because it's unproved doesn't mean it's not the case, and two backup methods are clearly better than one.

I've never had an issue with Time Machine (touch wood) and have found the ability to easily revert to previous versions a godsend sometimes.

I've been using Time Machine backups for years (company policy) via an external USB drive, and now that it's full and I only tend to remember it once every couple of months, a backup takes a full workday or more to run.

At the same time I've got Arq Backup running to back up my code folder (not everything in there is on accessible git remotes for me), but it's very heavy as well given the number of small files (code + git files). But at least it doesn't end up months out of date I guess.

Does anyone have a good backup solution for one's code folder? It's a large number of small files (probably tens or hundreds of thousands; it's got a load of node_modules folders in there as well, I'm sure).

I have an hourly cron job that rsyncs my code folder to another one with venv, node_modules, Rust target directory etc. excluded. I only back up that folder. That cuts out most pointless small files and saves a lot of space too. I haven’t had much problem with .git folders because (1) I can only work so fast so the delta is usually fairly small; (2) periodic rsync cuts out a lot of intermediate changes. But if you still need to optimize, maybe you could repack .git too.
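
A minimal sketch of that kind of hourly mirror, assuming an example layout like ~/code and ~/code-mirror (the paths and the exclude list are illustrative, not a definitive setup):

```shell
# Mirror a code tree, skipping dependency/build dirs (example excludes).
mirror_code() {
    src="$1" dst="$2"
    rsync -a --delete \
        --exclude 'node_modules/' \
        --exclude 'target/' \
        --exclude 'venv/' \
        "$src/" "$dst/"   # trailing slashes: copy contents, not the dir itself
}
# e.g. wrap in a script and run hourly from cron (illustrative):
# 0 * * * *  /usr/local/bin/mirror-code.sh "$HOME/code" "$HOME/code-mirror"
```

Time Machine or Arq then only ever sees the mirror, which stays free of the churning small files.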

My code folder is a symlink to a folder in my iCloud Drive. The beauty is that all my code is synced to the cloud automatically, and the bonus is that I can see this folder on every machine that syncs with the same Apple account. I believe this approach works with any cloud drive, like Dropbox, Google Drive, OneDrive, etc.

It also means that you can destroy your ‘backup’ easily, and get your erroneous change synced to every machine that syncs with the same Apple account, destroying most copies of the affected file(s).

Unlike Dropbox, iCloud Sync doesn’t version files, so that’s not easily corrected, even if you spot the problem early.

> Does anyone have a good backup solution for one's code folder? Large amount of small files (probably tens or hundreds of thousands. It's got a load of node_module folders as well I'm sure)

I've been using Duplicacy (along with Arq and Time Machine) which is amazing at speedy backups and deduplication. However, I've found that restores require quite a bit of CLI-fu [1].

Considering a move to Restic, because they have a killer feature which allows one to mount a snapshot as a FUSE filesystem [2].

[1] https://forum.duplicacy.com/t/restore-command-details/1102

[2] https://restic.readthedocs.io/en/v0.3.2/Manual/#mount-a-repo...

Borg Backup is quite good. It doesn't take advantage of FSEvents, but otherwise is well designed.

Arq is great, but like all third-party backup solutions, it relies on scanning the file system. See my comment: https://news.ycombinator.com/item?id=22311053.

If the backup disk is full, I'm not surprised a backup takes ages. It's got to purge old data on the fly alongside backing up new data, constantly rebuilding indexes, which grow and trigger even more purging, requiring more re-indexing. Horrible. Your backup disk should always have some free working space so it can run efficiently.

Can't you reduce the retention period so you always have some working space free, or just get an appropriately sized backup disk? Time Machine backups should run automatically every hour in just a few minutes on a well-configured system.

Use git and gitignore the node_modules folder. You shouldn't need to back up that directory; the package.json file should have all the required package info.
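
For reference, the ignore rule is a single line in .gitignore (the comment is just annotation):

```
# node_modules is reproducible from package.json + the lockfile
node_modules/
```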

Or use Yarn 2, where, if you check the local Yarn cache folder into the repo, it's much, much smaller, because it stores compressed blobs for each dependency.

Probably best to ignore both git and node_modules folders with Arq. Arq goes back and validates old backups sometimes (weirdly, this makes my fans spin up much more than actual backing up does), and both of those folders are going to be scanned and revalidated after the initial backup.

Fun fact: rumor on the webs is that a new Arq version is in the works! (Look at their Twitter feed for screenshots they sent somebody recently.)

> Probably best to ignore both git and node_modules folders with arq.

I agree with this. I also have (is) `Photos Library.photoslibrary` and (contains) `cache` as my exclusion rules. The exclusions have made my Arq backups less painful.

ValentineC, this is the best Hacker News response I've gotten in a long time! Excited to mark those cache folders and stop syncing 200 MB every hour.

It's not a Mac-centric solution, but all of my computers' crucial folders are synced to each other via Syncthing.

Desktop computers also back them up to another disk via Back In Time (an advanced backup tool built on rsync).

This gives me realtime replication and two co-located backups. It's a fire-and-forget kind of setup.

It’s definitely good to add more to the exclusion list for Time Machine (using System Preferences), since it might pull in very large things that you just don’t care about. For example, take a look at some things in ~/Library.
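
As a side note, exclusions can also be managed from the Terminal with macOS's tmutil; a sketch with example paths (`-p` records the exclusion by path rather than as a sticky flag on the item):

```shell
tmutil addexclusion -p ~/Library/Caches                  # keep caches out of backups
tmutil addexclusion -p ~/code/big-project/node_modules   # example project path
tmutil isexcluded ~/Library/Caches                       # verify the exclusion took
```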

Can’t you just keep the external drive connected so that it does small incremental backups all the time?

I have a local external drive on a usb hub. I also have Backblaze for offsite backup. Neither one requires active management

If you use git, even without remotes, then perhaps use rsync or rclone to sync those repos to one or more storage areas? Could be sent to an SSH server, Google Drive, Backblaze B2 or whatever you like.
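
One portable way to do that is to snapshot each repo into a single bundle file first, then sync the bundles anywhere; a sketch under assumed paths (the function name and file names are made up):

```shell
# One .bundle file per repo: a single file holding all branches and tags.
bundle_repo() {
    repo="$1" out="$2"
    git -C "$repo" bundle create "$out" --all   # --all includes all refs plus HEAD
}
# Later, ship the .bundle files with rsync/rclone to SSH, B2, Drive, etc.
# Restore with: git clone repo.bundle restored-repo
```

Syncing one bundle file per repo also sidesteps the many-small-files problem inside .git.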

I use FreeFileSync periodically.

Relying on a cloud-based system always seems dodgy to me, since they sometimes get "confused" over which is the master.

I also put my code into Fossil for a .db file that's easy to copy and move.

I back up my entire home directory, including all git-managed code projects, twice daily using restic. It seems to work for now.

Hook up a timemachine NAS or similar to do the backups wirelessly?

You publish it on the Internet.

When it comes to backup, the time it takes for the backup to complete is not a property that matters the most in my opinion. The time it takes to restore the backup matters, and being sure that the backup is able to restore and is not corrupted matters even more.

A backup which takes so long that the user aborts it before it completes (say if it's from a laptop to an external USB drive and user wants to take the laptop out with them) IS a problem though.

That's not news. I stopped using Time Machine when it couldn't transfer my backup to a new MacBook. Rsync worked better. If you look at the actual backup, they use hard links to make snapshots. Not great for many small files or small changes to large files.

Since then I wrote a simple frontend for Borg Backup for macOS, called Vorta[1]. Use it for local and remote backups[2]. Fast and works on any file system.

1: https://vorta.borgbase.com

Having huge numbers of tiny files in the filesystem definitely takes its toll. As I routinely handle multiple JavaScript codebases with complicated node_modules trees, to reduce the overhead I started wrapping the relevant directory trees in sparse bundle images; that helped quite a bit.
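
A sketch of that wrapping step, with an example name, size, and filesystem (the bundle only grows on demand up to -size):

```shell
hdiutil create -type SPARSEBUNDLE -size 50g -fs APFS \
    -volname CodeTree ~/CodeTree.sparsebundle
hdiutil attach ~/CodeTree.sparsebundle    # mounts at /Volumes/CodeTree
# Keep the node_modules-heavy trees inside /Volumes/CodeTree; backup tools
# then see a modest number of fixed-size band files instead of millions of inodes.
```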

Thanks for mentioning this.

I've been looking for something that would let me do what Time Machine does, but to the cloud, for a while. I've been using Backblaze, but it has poor metadata support and mandatory exclusions for a lot of stuff I want backed up, including system configuration. I tried CloudBerry to Backblaze B2, but the initial backup never completed, even after several weeks on a >250 Mbps connection.

It strikes me as odd that APFS doesn't support hard links any more despite being the "new" filesystem, yet the entire macOS backup system relies on the old deprecated HFS+ filesystem for Time Machine disk format, since it uses hard links....

Seems like a backwards, muddled, confused step (as does Catalina itself!).

The issue at question is hard links for directories, not just hard links. Hard links aren’t going anywhere, whereas hard links for directories was a dangerous feature to begin with (I won’t expand on that but you can easily find explanations) and basically only there for Time Machine.

APFS has native support for COW and snapshots which is way better than the directory hard links hack. They’re just slow to port Time Machine to APFS targets.

Thanks for the explanation! I had glossed over the hard links "for directories" only bit. Thanks.

The author keeps saying "serious bugs" and calling things infeasible when they actually mean they don't like the first backup taking eight hours when they paid so much for their storage devices.

I have never waited for the first backup in Time Machine because it always happens when I am asleep. They probably take several hours. Nothing to see here, especially since Time Machine is a totally low-touch, simple end-user service that, as far as I'm concerned, is one of the last truly well-engineered bits of user experience to come out of Apple. Who ever said it was fast?

Time Machine is wonderful in my opinion. I've been using it a bit lately to restore files here and there; there's nothing like it on any other platform AFAIK.

Pick a file, hit Enter Time Machine, wait a bit (I still have a Time Capsule that I'll miss dearly if it goes), scroll through the history, and restore your file. It's so easy.

It is the most advanced backup system I've ever used.

Edit: I think the really great thing about Time Machine/Capsule is that I don't know how it works. It's like a toaster: I plugged it in, clicked a couple of things on my Mac years ago, and it still works, even after 3 Macs. I remember the days of typing tar cvf /dev/rmt0 or some such, and it's a miracle in comparison.

> I still have a time capsule, that I'll miss dearly if it goes

Many consumer NAS devices can act as time capsules so you’ll be able to continue.

Oh yes thanks, I'm aware, but it won't be as easy as the Time Capsule I'm sure.

In my experience you just use the config tool to tell the NAS to be a time machine server and then point your Mac at the NAS same as you would the time capsule. Not bad, actually!

> you just use the config tool

yah, and then I'll probably have to reformat the drives because they're NTFS or something. Oh, but the drivers aren't compatible; just go and download the latest. Oh, they don't work with this version of the OS; maybe try a different distro. And so on.

Sigh, I like the apple way.

Sadly, there are hundreds of common bugs and annoyances that exist in all of Apple platforms that bite people everyday but will never get fixed. Reporting them is pointless because Apple rarely appears to stop to fix things. In the past, reporting them might've been helpful, but these days, it's as useful as shrieking into a hurricane.

Here's a list I started of just what I've found and could think of readily:


> Dropping support for 32-bit apps was a terrible idea.

What? Never?

First of all, I think old CPU instruction sets should be deprecated in finite time. Secondly, I think a decade is a reasonable time frame to do it in. If anything, that's too slow, not too fast.

I got my first 64-bit Mac in 2006. Fourteen years later, why should the Mac have to support the older, slower way that doesn't release the potential of the hardware?

I welcome a 64-bit only OS and haven’t suffered any loss from it.

I still use a lot of software, like Aperture, for which there is no good replacement and no possibility of me recompiling it for 64-bit. I will probably never upgrade my personal machine to Catalina as a consequence. When it becomes infeasible to continue using Mojave I will probably abandon MacOS entirely.

Wouldn't it work via a VM?

Aside from that, Aperture was abandoned 5 years ago. I would blame Apple for not releasing a 64-bit version of it before dropping it, but not for dropping 32-bit altogether.

Aperture is actually a 64-bit app but it depends on some system frameworks that weren't migrated to 64-bit and dropped in Catalina.

It's possible to get it running in 10.15 by removing the references to those frameworks, it breaks some of the features like slideshows but the app will still run: https://medium.com/@cormiertyshawn895/deep-dive-how-does-ret... (the author has also made an app that automates this process, I haven't tried it so I can't vouch for how well it works)

If you abandon MacOS for photo management, what system will you go to that does not already support MacOS?

If you go to Windows and use Lightroom, why not stay on your mac?

If you go to Linux and use FSpot or Darktable, you can get them both on a Mac too.

I liked Aperture, but moved on to Lightroom and then to Capture One.

Even in an operating system that supports 32bit binaries, there shouldn't be any runtime overhead if you stick to 64bit apps only, except maybe a few megabytes of wasted disk space. (which could even have been made into an on-demand component install, if that's too much to spare)

For those of us with legacy 32bit apps and games, having them run ("slower" is in the eye of the beholder, as they perform as well as they always have) is better than not running at all.

When I run a legacy 32bit app, it's not like I'm bothered by the fact that it also loads a bunch of 32bit OS libraries. It's not like I feel like my value for money for owning the machine is restored if clicking the .app icon just shows a "Not supported" dialog box.

> I got my first 64-bit Mac in 2006. Fourteen years later, why should the Mac have to support the older, slower way that doesn't release the potential of the hardware?

Aren't 32bit binaries (when not needing 64bits) slightly faster than 64bit versions due to the smaller pointer size?

Depends on the use case. It's a balance between less memory and cache use (32b) and having more registers + new specialised instructions (64b). Each app will use those differently. Ideally, we'd revive x32 which is a combination of short pointers on the new architecture, but that never really got popular.

I agree even though I haven't switched yet because some of my apps would not work properly (or at all). Decade should be enough.

But hey, I also think the switch to Python 3 gave everyone enough time to switch too, and there's no shortage of people who disagree :)

One bug that's very annoying and doesn't seem to be on Apple's radar: Finder has a hard time understanding non-US date formats and will sometimes show files as having "no date".

File system dates are stored in a locale dependent format?

No, I don't think the issue is in the fs format, it's on Finder "sort by date"

Finder sort by date operates on the locale dependent date instead of Unix time? That is pretty silly!

> It should be possible to buy a license for macOS for non-Apple PC's that includes a number of drivers to make it work.

This will literally NEVER happen, and it makes me so sad. Apple would rather drop macOS completely than let this happen. In my opinion this is also the only reason for the T1/T2 security chips on the recent Macs. Yes, I understand the other features of the T1/T2 chip (secure boot, Touch ID, etc.), but to me these have got to be bullshit reasons. There is no reason why Apple's special little security chip should be controlling the exposure of your webcam [0].

[0] https://support.apple.com/en-us/HT208862

I've been using a "Hackintosh" desktop for the last five years, and once it was set up, it has worked flawlessly. That includes all the hardware except the Intel Bluetooth, which was replaced by a tiny dongle that came with something or other. Even the yearly major update has never been a problem. It's far less hassle than my last Linux-on-Desktop attempts, even though Apple isn't even trying.

So that would seem to be evidence against your theory that Apple is concerned about people running MacOS on generic hardware. While some cryptographic hardware would indeed be needed to reliably prevent such shenanigans, I'm somewhat certain they could sabotage such systems with minimal effort and raise the pain to levels where it's just not worth it.

They don't even bother to, say, check the CPU and refuse to run on AMD. That could probably be done in a single line of source code. Not doing anything like that and instead designing custom silicon just isn't rational behaviour.

I think the reason you've been having a good Hackintosh experience is solely because of the amazing Hackintosh community and the advancements that have been made within these past few years. I've been involved in that community for a little over 10 years now. Some time back we did not have these incredible tools (new bootloaders like Clover/OpenCore, vastly improved audio/storage/graphics kexts, advanced SSDT/DSDT tools and patching) that in their current state make Hackintosh much easier and more straightforward.

I agree with you that Apple hasn't been actively going against the community, but it's definitely begun. As for sabotaging Hackintosh systems with minimal effort: they definitely wouldn't just start pushing software updates hunting for Hackintosh systems. It would work, but it'd feed more into the cat-and-mouse games we see. (This behavior is not out of line for them; see their war on iPhone app sideloading and jailbreaking.) The best approach would be to begin integrating these chips into the lineup, and soon enough, after some generations of new hardware and software, Macs without these chips won't be supported by macOS. Effectively choking out the Hackintosh community silently.

Apple has deprecated kernel extensions [1], but I'm sure the Hackintosh community will soon find a way around it.

[1] https://news.ycombinator.com/item?id=22251076

>There is no reason why Apple's special little security chip should be controlling the exposure of your webcam...

Er, yes there is. The custom image signal processor on the T2 enables the face detection feature that drives tone and white balance mapping _on_ _faces_. That requires custom silicon to do in real time. The T2 is a custom ARM chip, so I expect the image signal processor is a subsystem taken from one of the recent iPhone chips, which also have hardware enabled dynamic face detection and mapping.

There are no depth cameras on any Macs. The Tx chips manage fingerprints the way the iPhone button did.

I suppose they could be lying, but I've no reason to suspect that and this is what they say. There are ways to do facial recognition and mapping for this sort of thing with a single camera and no 3D sensor, it's just not as good.

I think simonh is saying it was economical for them to re-use the design from the iPhone ICs, which do handle facial recognition. That sounds pretty dubious to me, though.

I assume they used a version of the iPhone button which included a Secure Enclave and finger recognition bonded to the sensor. However, the T1/T2 do other stuff, like disk encryption, that AFAIK the phone does not have. The two devices (iOS devices and Macs) share quite a bit of DNA but have many different design criteria.

Apple could have trivially blocked hackintoshes but couldn’t be bothered doing so as they constitute a trivial number of devices.

The Tx chips may indeed spell the end of hackintoshes (I hope not, but who knows) but I’m sure hackintoshes played no role in any decision to use them.

That's because you don't pay, and the Cascade of Attention-Deficit Teenagers doesn't care about fixing bugs. If you paid, you'd get service and you wouldn't be the product.

Oh wait it's 2020, sorry, that's not playing anymore. What's this year's RDF spin?

But you do pay! Apple hardware is famous for that!

(I expect the problem is that you mainly pay for the hardware, not so much the software. At some point Apple found out that if the hardware is nice enough, the software can be a bit shit, and people won't mind.)

This is said the other way around at least as often. As in: the high price of Apple Hardware is only worth it because it's a requirement to get MacOS.

And I'm willing to admit that there have been more annoyances in the OS. But having recently tried Windows and Linux again for the first time in over a decade, I can report that nothing much has changed in their relative standings. Just the Windows Activation process alone is a crime against customers. System settings/Utilities/etc are some abomination of cruft accumulated over decades, in a conglomerate of interfaces that seem to try to emulate websites, and sometimes actually are websites that somehow make changes to your system.

I'm actually really happy that this needled the incessant apple boosters around here. You love a company that literally refuses to fix bugs. You pay for that. Sometimes trolling has a point - Pin for the reality distortion field bubble.

Tell me, what doesn't have serious bugs in modern macOS? Is it only me who's constantly dealing with various lags and hangs? Is it only me who has to force quit the Music app at least every 24 hours?

Honestly not sure why this is downvoted. I run a multi-user setup and switching between users is a giant pain. Here is one scenario that I find really weird, at the very least since High Sierra.

Assume user MAIN and user WORK:

1) Open Macbook

2) Login prompt for MAIN shows, "Switch User" button below

3) Click "Switch User", now prompt with logins for MAIN and WORK shows

4) Click and successfully login to WORK, the desktop for WORK now shows

--- Getting weird now

5) Get flashing image of desktop of user MAIN (?)

6) Get login prompt for user MAIN (??)

7) Click cancel, bounce back to login screen with MAIN and WORK (???)

8) Login to WORK again, good to go from here.

Switching between users on macOS is not just weird sometimes but at least it feels downright insecure when I am able to see flashing images of MAIN when I am logged into WORK.

Active Directory stuff is broken whenever the Mac is hibernated. Trying to use an AD Admin password to approve something almost always requires a reboot before the password window stops shaking.

I switch users quite frequently and have never seen this or anything like it. (I see some others have mentioned that Firefox might be the culprit, though, and I don't use it.)

This, plus all the random 3D animations and programs bugging out/freezing after switching between two active users. (Looking at you, Firefox.)

Oh yes, do not dare to have Firefox open across users!

I use the latest Mac OS on my 2015 MBP without any issues since the day it came out.

I ditched Microsoft about 10 years ago and have been a fervent proponent of Apple ever since, but since WWDC 2019, I cannot honestly recommend Apple to a new user anymore.

There are just too damn many bugs.

In everything. Operating systems, software, services, built-in apps, even their developer tools and even in their frameworks and the Swift language itself.

I run into at least one bug literally every day. Someone could write a daily blog about this. Core features like keyboard input, text selection, AirDrop, photo picker, iCloud Drive etc. are erratic and unreliable. It's a death by thousand cuts. Apple is no longer the clear best, just the least worst.

I love the 16" MBP though, and restoring it to the exact state as my previous 15" MBP from a Time Machine backup was smooth and effortless. But when I tried to backup the new system, it seemed to ignore the existing backup I had just restored from, proceeding to write 300+ GB all over again and not showing the older snapshots in the UI.

Sadly Catalina is the first incarnation of OS X/MacOS that I haven't upgraded to within the first couple of weeks of it coming out. I really feel like I'm missing few benefits by not upgrading and quite a few problems.

I have found that upgrading to macOS current - 1 with latest update applied works well. For example I decided to update to Mojave last month, and enjoyed my High Sierra until 2019.

Having said that, I am probably not going to upgrade to catalina anytime soon (at least not until Catalina+1 macOS gets first major patch release).

Why? Well my reasoning is that lots of stuff magically isn’t going to recompile itself from 32bit to 64bit. I can probably help with that while being on Mojave.

Same. You lose every cool program that happened to go unmaintained, and iTunes evaporates. If I do upgrade, I'm thinking about a Mojave VM for maintaining 32-bit support. The problem is I feel like Apple is throwing buckets of features into macOS without ever fixing anything and occasionally randomly breaking stuff that worked, so that the net result is always worse than before.

I can understand. Way back when I had an internship at an Apple reseller, I always found it crazy that people were strongly against upgrading to the latest release. A lot of it was because of third-party software lagging behind on support. But nowadays I want my machine to just work, and I completely understand.

I was not intending to but there are some dark patterns on the software update screen - Watch out!

I thought I was doing a general software update, half an hour of waiting later and I’ve got Catalina now.

I have been using Time Machine for more than 10 years now, I think, and am on my third laptop.

Every time, at some point it starts clearing out a corrupted backup history due to some issues it's found with its own backup. Frankly, I don't trust it anymore; I just set it up for convenience, and every other week I start a restic backup (https://restic.net/).

I'd say Time Machine seems like one of those programs neglected by a vendor.

I have been back-and-forth between Time Machine and switching to something else. I'm still using TM, but what alternatives do I have? Any recommendations?

I gave up on Time Machine and switched to Carbon Copy Cloner (https://bombich.com). Not going back.

Solid well-maintained software. One nice differentiating feature is that it fully supports bootable backups.

Me too. It just works.


The interface is a bit weird, but it works well enough.

Arq + B2 is fantastic for me!

That's what I use.

Arq is great, but my quibble with it is that, like all third-party backup solutions, it relies on scanning the file system. Any such program, Arq included, will take a very long time to run on a hard drive where absolutely nothing has changed.

What Time Machine does is use macOS's local database of pending changes, FSEvents. On the next backup cycle, it knows what's changed and doesn't need to scan unless the FSEvents database is missing or corrupt.

I'm looking forward to a time when Time Machine can take advantage of file system snapshotting, which apparently was a design goal for APFS. Snapshotting is used for Apple's Software Restore function now, but the Time Machine parts weren't ready for Catalina [1].

[1] https://eclecticlight.co/2019/08/16/are-we-ready-for-time-ma...

I have been using Arq for years. Locally over SSH to our NAS, remotely with B2 as well. I have done many restores over those years, both when installing a new machine or when I lost some file. It's really awesome, especially because it's decoupled from any storage vendor. You just buy a license and choose where you want to store backups.

On Linux I use restic, which is also great, but on macOS Arq is just more seamless.

I use restic on my Debian laptop and I think that it misses a GUI to be perfect.

I am fine with using the CLI for setup but unless I closely monitor systemd timers, backups could be failing silently and I would not notice.

Indeed! I also have a setup where the secrets come from the pass password manager, which I use with a hardware OpenPGP token. So I get these global PIN dialogs once gpg-agent expires the PIN, which is quite annoying :(. Not sure, though, how to solve this nicely without making the secrets too visible to the rest of the system.

Also using Arq + B2, though it's much better with a high speed internet connection. On a 1Mbps max upload speed it was sometimes painful and had problems, with my new 20Mbps upload it's great.

I trust my SuperDuper local backups much more though. SuperDuper is amazing.

My setup too. The cost is so cheap. I also run Time Machine for local backups because I have spare disks.

Time Machine has been broken for years.

Backing up to local storage, or through ethernet has worked, but when backing up via WiFi it routinely comes up with the "Time Machine needs to recreate your backup" message.

Furthermore Time Machine insisted (probably still does) on using the deprecated AFP protocol that even Apple no longer maintains.

A backup that cannot be trusted is not a backup, so years ago I switched to Arq Backup (https://www.arqbackup.com/) instead. While not as "polished" as Time Machine (the restore functionality could _really_ use some work), it actually works, and in all my years of using it, I have never experienced a failed repository.

I wondered about that. Back when my Air was still running macOS, I set up netatalk on a Linux server so I could point Time Machine there. But every few weeks I got that message and had to repair my backup. I always assumed it was some issue with netatalk or my configuration. Eventually I gave up and used an external drive.

I get this message every few weeks. Is there any way to fix the backup? Until now I thought my Synology NAS was the culprit...

I get it too when using synology. This is what always fixes the backup for me: https://jd-powered.net/notes/fixing-your-time-machine-backup...
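
For anyone who'd rather not click through: the recipe in that link boils down to repairing the HFS+ volume inside the sparse bundle by hand; roughly (share name, bundle name, and the /dev/diskNs2 node all vary per setup):

```shell
# Stop Time Machine first, then:
chflags -R nouchg /Volumes/TMShare/MyMac.sparsebundle   # clear any lock flags
hdiutil attach -nomount -noverify -noautofsck \
    /Volumes/TMShare/MyMac.sparsebundle                 # prints the disk node, e.g. /dev/diskNs2
fsck_hfs -drfy /dev/diskNs2                             # repair the volume inside the bundle
# Finally, mark the backup as verified again by editing
# com.apple.TimeMachine.MachineID.plist inside the bundle (details in the link).
```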

It only works with SMB in current versions of macOS. But it insists on using AFP+ as its filesystem.

My laptop is set to back up to a share on a mac os desktop, in theory. So whatever protocol they like. It says the last backup was made on Aug 22. I never turned it off.

It doesn't look like it works even between macs via SMB to me.

Mind, the laptop is on 10.14 and the desktop on 10.13, so that may be a problem.

> But it insist of using AFP+ as its Filesystem.

(Assuming you mean APFS) Not true. I have OpenMediaVault running on my Raspberry Pi 4, and the two drives I use for dual backup are both ext4. (I can really recommend RPi4+OMV for Time Machine, btw.)

> I have Openmediavault running on my raspberry pi 4 and the two drives i use for dual-backup are both ext4.

I suspect that's because the Time Machine solutions for NASes create a sparse bundle that has a HFS+ (or APFS, I'm not up-to-date) partition within.

Time Machine seems like ripe for some kind of overhaul.

"The first backup is slow" is not a new bug in 10.15.3. I had that issue in at least 10.14 as well, and who knows which other versions. Time Machine GUI is great; but for backups off the device you need NetAtalk (which has been reverse engineered though). The TM GUI would've been great with ZFS back in the days, if that deal went through. It did not. APFS does not support deduplication.

There's a bunch of Time Machine GUI FOSS applications which are inspired by Time Machine GUI, for Linux. Not sure which one works best, or if they work on macOS.

Last night Time Machine returned error 45 for me backing up to NAS. Google sent me down the rabbit hole: a bunch of people are having the same issue with Catalina. Here's the kicker: Time Machine isn't officially supported on NAS, although Apple support will help, apparently. I ended up having to delete the backup and restart; I wasn't bothered because I have a second backup on a local USB drive. Having said this, I do think I need to switch to Carbon Copy Cloner (but I've been saying this for years now)...

Catalina + Time Machine may have bricked my mac.

Painfully I just ran into a serious restore bug with Catalina. I did a clean install of Catalina and migrated from a Time Machine backup using Migration Assistant. The migration hung (no progress after 3h) so I hit the cancel button.

Big mistake.

I’m now unable to log in to my box. Migration assistant migrated some of my data over since my profile picture changed from a stock Catalina tennis ball photo to the custom one I used on Mojave. I would have expected cancelling to not move ANY data over. Guessing the partial copy somehow messed up my ability to log in.

Even worse: I wasn’t using iCloud to reset my password. And the recovery code I threw in 1Password just... isn’t in 1P any longer. When Catalina provided the recovery code, it never said that this was the only way to reset your password if you lose your login password (or if it locks you out).

I can’t even do another clean install because Catalina asks you for your account password in Recovery mode.

So: can’t login because Catalina messed up a migration, and I’m not using iCloud and don’t have the recovery code.

The solution from Apple support is to take my machine into the Genius Bar.

I’m hating some Catalina product manager (because I can’t reset my password) and dev team (because of the migration bug that corrupted my login password) right now.

Catalina bricked my iMac, so I'm not surprised. Thinking of salvaging what I can and getting a beefy Lenovo after like 20 years of using a Mac.

While ThinkPads are great machines (I just bought a cheaper one), I'd consider Dell XPSs as well, for the much better trackpads alone. Only two or three years ago, the displays on ThinkPads were absolute jokes. A traditional ThinkPad is a somewhat "masculine" and bulky thing, and the trackpad button layout really gets in the way of a fluid workflow for me. I bought mine because my XPS was dying (the XPSs' weak point is the battery, with high failure rates), but I'm planning to buy a new XPS later this year anyway.

As does all software made by Apple.

I was a fervent fan of apple around 2010, with Snow Leopard and iPhone 4. It really "just worked".

Now it just doesn't.

I believe these problems started with Catalina in general. I figured it was just me and disabled time machine and switched over to restic.

I made a full switch over to macOS for my workstations (including laptops) as of 2018, and while it has been better, I've slowly watched macOS degrade. Multiple monitor support has gotten worse, displays over Thunderbolt 3 have gotten worse, Time Machine is now unusable, wallpapers reset on reboot. Disk Utility, while not as bad as some previous versions, is still bug-laden. There are also plenty of other issues I can't immediately recall.

I switched to macOS because POSIX on Windows at the time was bad and Linux desktop is still a buggy affair. However I might hit a breaking point soon if Apple can't fix things. Linux Desktop is worse, but not by much.

It's interesting isn't it, for me I've only had one problem with Catalina (not logging into iCloud) and that was resolved easily. Time Machine works, Disk Utility has been fine, running a display over thunderbolt is fine. I run Windows on the same hardware and have far more problems with that.

Some people clearly are having problems however it's hard to get a handle on how widespread this really is.

I switched to macOS in 2009 because Windows was a perpetual patch Tuesday and UX beige box. And that's after cleaning up the fall-out of Blaster, Code Red, Code Blue, Nimda and on and on across hundreds of endpoints and getting sucked into the MCSE/VMware/EMC cert/training dance. I did plenty of work in C (embedded, kernel and userland) and heterogenous net admin-ing on HP-UX, Sun/Solaris, SCO, AIX, SGI from 1996-2013, so I wasn't married to any particular vendor. (Scripting a Cisco 1604 ISDN router to auto-dial the ISP was fun.. and I discovered AIX was phoning-home to IBM and blocked them. LOL.)

If I were the Apple CEO, I'd pause new features on all platforms in order to fix what's broken, rather than just adding new features, ripping things apart, and kicking the technical-debt can down the road to keep snowballing.

> Linux desktop is still a buggy affair.

Even gnome isn’t as bad as OS X IMO. If you use a decent DE like fvwm or xfce it’s much better.

The problem everyone has is that their favorite mac software doesn’t have a Linux port and a lot of the alternatives are less sexy (and sometimes way worse.)

Even gnome isn’t as bad as OS X IMO.

It's many times worse, and I say this as a daily GNOME user. Well, perhaps not in terms of bugs ;). Qualms compared to macOS:

- They removed menus (with discoverable shortcuts) and replaced them by stupid hamburger menus that miss a lot of the prior functionality. (If menus bother you, just move them into the system tray.)

- They removed system tray icons. There are extensions, but they only work for a subset of the applications that I use and quite badly.

- Keyboard shortcuts are inconsistent between applications.

- Inconsistent ways to make applications full-screen.

- Removed desktop icons (I use the desktop as a short-term cache of stuff that I want to be able to open quickly).

- A lot of things are not configurable through the Settings application. Some additional things can be configured through gnome-tweaks.

- Much worse noise cancellation than macOS.

- Video playback is not hardware accelerated in web browsers (not GNOME's fault).

- The GNOME applications are much less usable. E.g. most (all?) types of remote calendars cannot be added to GNOME Calendar. You have to go through some archaic (compared to macOS) setup in Evolution (in my case I had to add multiple calendars from one account one by one). Evince misses a lot of basic operations that Preview supports (such as reordering pages in a PDF).

I use GNOME because it is the only traditional desktop environment that has great Wayland and HiDPI support. Linux is great, but the Linux desktop is a tire fire compared to macOS.

Fully agree. I was running Ubuntu 16.10 until the weekend and am in the process of transitioning over to 20.04 LTS when it comes out. I could live with GNOME quite well so far, but now I think gnome-shell (already in 18.04, I believe) is such a regression compared to Unity that I seriously think about using a lightweight DE instead. GNOME Settings and gnome-tweaks are jokes: they can only tune the limited and old-school libinput trackpad settings (and it's driving me mad), pompous and heavyweight animations, slow response from wakeup, no menu, useless search, pointless display of the time in the middle of the menu bar, etc. May I ask what you're using as a lightweight "DE" (I just need a Unity-style launcher, I guess) on top of GNOME, one that's preferably available via apt and widely used on Ubuntu?

Edit: I want to add that the freelance gigs I'm doing mostly use Ubuntu desktops, and it's been working great in recent years, so for me, 2018 was the year of the Linux desktop. At a FinTech where I worked last year, the employer gave the staff the choice of using macOS or Ubuntu, and most (myself included) opted for Ubuntu; it performed very well as a workhorse.

Strong disagree there. When forced to use Apple or Microsoft desktop environments I find they are a "tire fire" compared to stock GNOME. Really. Not in the same ballpark of usability, polish, aesthetics, quality control.

Does it work the way you're used to? For me, yes. I'm used to it and used to a workflow using it. If you're used to something else, then no, you're not. The end.

Not in the same ballpark of usability, polish, aesthetics, quality control.

We must be living in different universes ;). I cannot comment on Microsoft Windows, since I have never really used it.

I'm used to it and used to a workflow using it. If you're used to something else, then no, you're not.

For reference: I have used macOS since 2007 and GNOME since before 1.0.0. Mid 2000s GNOME 2 was really awesome. Sure, it had its problems, but from the perspective of usability and completeness it was awesome. For years Sun Microsystems poured money into GNOME 2 usability studies and improvements, because it was supposed to replace CDE on Solaris.

Edit: should add that I am waiting until KDE on Wayland is well supported on NixOS. I have recently tried KDE on X.org a bit and it seems like it would be a leap forward for me.

If you use gnome3, learn how. Learn the workflows. Find some youtube demos or something. It is different to what you are used to. I find it very much faster and more efficient to the point that I don't think about it. You basically never touch the mouse, so yeah, it's different.

I used gnome 2 for many years. It's great. Loved it. No desire to go back at all. Did not find Sun moving the main menu item to the bottom left and calling it start a breakthrough in usability at all. I use osx semi-regularly. Not a pleasant change when I do. I use windows almost never and when I do my god the awful.

Try and use any desktop the same way you would use something different and it will be inferior to something different. Good luck.

The latest version of Oracle Solaris now uses Gnome 3, deprecating the excellent but old Gnome 2 implementation they had (that ZFS Directory history integration in Nautilus was very cool).

Catalina's slew of bugs broke me, and I switched my 2014 11-inch MacBook Air to Ubuntu 19.10 in November. It's not perfect, but I wouldn't go back.

- It is very quick in comparison to macOS; it makes the 6-year-old hardware feel new

- Your experience of bugs depends on what you do, of course, but in my personal experience it is much more stable than macOS

- A lot of what I use is terminal based, or electron based, so I really don't have an issue with not finding software

- The install process was seamless. I realise this is truer for old hardware than it is for new hardware, but given that my 2014 Air was getting unusably laggy with macOS, the choice was between buying a new, extremely expensive MacBook to get acceptable performance, or using my old hardware with Linux. If I were to replace it, I would do so by buying a cheaper, Linux-supporting laptop rather than paying for a MacBook

Despite the great work done on the linux desktop, Ubuntu 19.10 doesn't quite measure up to the 10.5-10.7 "golden era" of stable, elegant, developer friendly macOS for me, but it comes a lot closer than the last few macOS versions have.

I persisted with Linux for a year but eventually gave up because of the poor high-dpi support, mostly with applications. In some ways I wish I had never seen a retina screen, then I wouldn't know what I'm missing.

I'm using a 4k screen with my sway tiling desktop. Enlarge the fonts in alacritty and emacs, set fractional scaling to 1.5 in Firefox, and you're done.
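For reference, the per-app Firefox scaling mentioned here is a hidden preference set via about:config (1.5 here only to match the comment; pick whatever factor suits your screen):

```
layout.css.devPixelsPerPx = 1.5
```

The default value of -1 means "follow the system/display scaling"; any positive number overrides it for all Firefox UI and page content.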

Except that Xwayland applications are either tiny or scaled up and blurry. I guess it's acceptable if you have a large (e.g. 30") screen. But on a 24" 4k screen both options are very unpleasant.

Also, I don't think Sway is an alternative for most macOS desktop users.

I'm an old macOS user. I downsized what I need from a desktop and I'm very happy with my minimal setup.

I'm not using scaling from sway/wayland, but I basically just use emacs, firefox and alacritty, so changing those apps is all I need to do to get scaling done.

I've actually switched back to using exclusively "low" DPI screens because I feel like it's just not worth the hassle. They're high enough resolution for me, maybe if I was a photographer or into video I'd care more about hidpi, but I just don't see the point.

The reason I’m still on Mac is 5k monitor support. If not for that, I’d be running Solus.

Since no-one's mentioned it yet, I've experienced super slow initial backups too, but found disabling the built in throttling sped things up so that it actually completes more quickly: sudo sysctl debug.lowpri_throttle_enabled=0

It seems to reset itself automatically.

I would love to be able to configure time machine to only back up a list of directories.

I keep my files organized in specific directories in my homedir, “dev”, and “personal”. I would like those to be backed up and nothing else.

My guess is that something like this would be much less prone to bugs and much quicker to run.

I realize some people have a perfect and/or highly complex setup that they want to be able to come back to if their system fails, but my experience over the last 10 years has taught me that restoring from a Time Machine backup is extremely rare.

I would rather just make sure my important files are backed up (at the click of a button, or automatically) and I can easily rebuild my system by reinstalling things from scratch.

You can exclude stuff from Time Machine; I am able to reduce backup size that way. I would also prefer being able to add folders, though. I would also like to be able to add folders to iCloud Drive instead of syncing only one big folder.
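Exclusions can also be managed from the command line with tmutil. A quick sketch; the path is a hypothetical example, and the whole block is guarded so it is a no-op on anything but macOS:

```shell
# Manage Time Machine exclusions from the command line (macOS only).
if command -v tmutil >/dev/null 2>&1; then
  tmutil addexclusion "$HOME/code/node_modules"  # sticky: travels with the folder
  tmutil isexcluded  "$HOME/code/node_modules"   # prints [Excluded] on success
fi
```

The sticky form shown here stores the exclusion as metadata on the item itself; `tmutil addexclusion -p` (run with sudo) instead records a fixed path in Time Machine's own preference list, like the GUI does.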

I use Time Machine, Carbon Copy Cloner and Arq at the same time for different use cases. Time Machine is good enough for quick file recovery or to restore a new laptop.

Does anyone know the name of that app in the blog post, looks like some kind of TM health checker?

It’s by the writer; check his downloads page for The Time Machine Mechanic. Whilst you are there, check out some of his other apps. Personal fave is Podofyllin, a brilliant little PDF viewer.

Thanks, now I just need to find something like this for Photos.app. It's always hard to know what's going on in these apps, as anything beyond the most basic information is hidden from the user.

Having to manage Macs in an AD environment is such a pain. I'm still awaiting the day I can back them up with Veeam, just like everything else.

Time Machine works fine here on 10.15.2; I am serving it from a Time Capsule. I replaced the Time Capsule hard drive with an 8TB version. Even wireless backups work like a charm. I do not use the wireless network from the Time Capsule, but a Linksys Velop network.

It was impossible for me to host the backups reliably from a Linux computer, though.

A big issue that bit me in the ass with Time Machine was that it doesn't back up partitions outside of the main one.

I had to discover this after wiping my disk and losing three years of personal projects / test / scripts in the process.

Was my own fault for not checking but still ... fun times.

I just let Time Machine do its thing in the background, but I also rsync to a 2nd backup disk just in case... Oh, and all the important things are also backed up to cloud storage.
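That second rsync pass can be kept small by skipping rebuildable build artifacts, which several people in this thread struggle with. A minimal sketch as a function; the source/destination in the usage line are hypothetical:

```shell
# Mirror a source tree to a backup location, excluding directories
# that can be rebuilt (node_modules, Python venvs, Rust target dirs).
mirror_code() {
  rsync -a --delete \
    --exclude 'node_modules/' \
    --exclude '.venv/' \
    --exclude 'target/' \
    "${1%/}/" "${2%/}/"
}

# usage (hypothetical second disk):
# mirror_code "$HOME/code" "/Volumes/Backup2/code"
```

The trailing slashes matter in rsync: "src/" means "the contents of src", so the destination ends up as an exact mirror rather than gaining a nested src directory.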

I’ve had issues with Time Machine in the past but have been pretty happy to date with Arq and Backblaze B2. Eventually want to configure it to use a local NAS as well.

Yup, I came here to second Arq as well. I back up to a local Windows PC which happens to be always on. Nowadays, it's very easy to enable SSH on Windows, create an account for backups, and then point Arq to that.

The only thing that got me is that if you configure Arq to use SSH on Windows, you have to enter disk volumes as paths.

For example, if you want your backups to appear on Windows in D:\BigDisk\Backups then configure Arq to back up to /D/BigDisk/Backups. That's not an Arq thing, that's how the OpenSSH server interprets paths on Windows.

Really wish Apple still offered the Time Capsule, even without the AirPort function.

Along with an iOS Time Capsule. Instead they keep pushing their "Cloud" solution.

A lot of people here talk about backing up their code. I did that too. In 2003. Anyone heard of GitHub? It has unlimited private repos.

.. and then you lose your 2FA-info or are blocked because Iran.

You need to lose your 2FA device, like your phone, the recovery codes (normally in a password manager), AND the local copy of your code at the same time.

Reality may be a bit more complex.

- You may not have all your source code repos synced locally. Maybe you think you do but forgot one.

- You may keep your phone in your computer bag and lose both at once (both your 2FA and your code).

- You may drop your phone on the ground and fail to get access to your password manager.

.. etc.

Having a backup strategy which involves you not breaking a piece of glass (aka phone) you play with while sitting on the toilet is a bit risky.

macOS is the new Windows 10

Why use time machine over rsync?

A bit unrelated to this post, but Time Machine really sucks.

How can I get all the photos I’ve ever had on my laptop without going through each revision? I wish TM worked like rsync with some added metadata files for functionality. I’m literally having to zip each and every previous version (which takes days) and unzip those files on a hard drive to eventually have a replica of my current macOS filesystem with all the files I’ve ever had conglomerated there.
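If the goal is just one folder containing every photo across every snapshot, a small script can walk the backup tree and copy each unique file exactly once, keyed by content hash. A sketch with hypothetical paths (the real Backups.backupdb layout may need a deeper path):

```shell
# Collect every unique photo from a tree of snapshots into one folder.
collect_photos() {
  src="$1"; dest="$2"
  mkdir -p "$dest"
  find "$src" -type f \( -iname '*.jpg' -o -iname '*.png' -o -iname '*.heic' \) |
  while IFS= read -r f; do
    h=$(shasum -a 256 "$f" | cut -d ' ' -f 1)   # content hash for dedup
    out="$dest/$h-$(basename "$f")"
    [ -e "$out" ] || cp "$f" "$out"             # skip photos already collected
  done
}

# usage, assuming a mounted backup drive (hypothetical path):
# collect_photos "/Volumes/TM/Backups.backupdb/MyMac" "$HOME/all-photos"
```

Because identical files hash to the same name, the thousands of hard-linked copies Time Machine keeps across snapshots collapse into a single file each. (Filenames containing newlines would break the read loop; fine for a sketch.)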

Sounds like you need a digital asset management platform, or to use git, because that sounds like an overly complicated, time-wasting workflow.

I really don’t trust this kind of ‘trust us we’re doing it in the background. Don’t worry about it’ type software anymore, without sufficient indicators of progress.

I do love Backblaze though. It’s like time machine in the cloud.

ANTI-DISCLAIMER: I don’t work there and deliberately not posting any referral code type stuff

I like TM because it is more than a traditional rsync-style backup.

It appears to back up the state of the machine and has allowed me to recover access to old Google accounts by formatting the MBP and restoring it to a 3-year-old TM backup point.

Can backblaze let me restore my MBP to its state (including firefox/chrome logins, keyring/keychain passwords) as it was on Feb 12, 2018?

> Can backblaze let me restore my MBP to its state (including firefox/chrome logins, keyring/keychain passwords) as it was on Feb 12, 2018?

The Backblaze consumer product is unlikely to be able to do so, with all the files that they automatically exclude: https://help.backblaze.com/hc/en-us/articles/217665388-What-...

Yes they can, if they back up everything (apart from the system) while running as a privileged user and files are quiesced too.

Everything is a file in macOS.

There's no magic, nothing hidden and nothing special about TM.

Who's they? Is it some configuration that I can handle at my end as an administrator? Or is this some possibility that we aren't sure can be actualized?

Also how do I restore that backup to a blank MBP/disk so that I get the same state of affairs as I had on Feb 12th 2018?

The feature you want is called dedup(lication).

I searched for deduplication on Backblaze and it's definitely not what I want.

The feature I want is exactly as advertised by TimeMachine : an ability to restore a blank MBP/hdd to a point in time on the Time Machine backup timeline. See: https://support.apple.com/en-us/HT203981#macos

Apparently, you can pay backblaze $$ and they'll ship you a USB drive assuming you backed up your TimeMachine to backblaze: https://help.backblaze.com/hc/en-us/articles/115000046834-Ho...

So, I might as well have a local USB drive taking TM backups. Point is, Time Machine has a specific utility that Backblaze cannot provide. I am not sure if anything like this is even possible on Windows?

You should not bank on 1 backup, but at least 2 (1 of which is offsite). So you have the original, an on-site backup, and an off-site backup. In case of big trouble, e.g. fire/burglary/disaster, you still have your off-site backup. Your on-site backup could be a NAS or USB stick, and your off-site backup could be on Backblaze. You only use the off-site backup when your on-site backup fails; at that point, you need to shell out money, because it costs money to download the data (costs per GB).

Dedup saves you space. With incremental backups, lots of it. Saving space saves you money, both on retrieval and on storage.

All that Time Machine does for you is a really nice UI allowing you to go back to snapshots. Git allows that, too. ZFS and Btrfs allow that, too. I'm not aware of a UI as good as Time Machine's, though; that is the problem. But for an off-site backup, maybe you don't require that.

I am afraid you missed the point I was stressing about TM. It’s not the UI for me but the UX: the ability to restore a blank HDD to macOS plus the state of all applications (including the cookies and caches that’ll allow you to “go back in time”).

Agreed on other points about offline storage and offsite backups. I just think TM has its place and Backblaze/Syncthing have their own use cases. My argument is that you can’t replace one with the other.

Yes, I can see the use-case. It is one reason why I am excited about NixOS. I just believe with a good UI you can make it good enough, akin to TM.

It's certainly a massive success for average users. No friction and extremely simple to use. There's really nothing like it that's as accessible.
