Hacker News
Arq 5.8.5 for Mac Fixes a Bad Bug (arqbackup.com)
139 points by ivank on May 23, 2017 | 49 comments



While I agree with others that honesty is best, even if it hurts in the short term, there was a similar (perhaps even far worse) bug in the Windows version recently with no corresponding blog post or announcement.

I think it was the 5.7.8 (Windows) release that introduced a bug causing Arq to believe a budget of 0 should be enforced. This meant that if you backed up to AWS (probably other targets too), it would delete ALL backup records except the most recent one (and then immediately clean up the now-unreferenced objects, so even the reflog feature can't help you).

I had only been using Arq for a few months so I didn't lose anything major, but if I had been keeping years of historical records, I would be furious.

This was fixed in the next release 5.7.13 ("Fixed an issue that could cause Arq to enforce a budget when no budget was configured.") but still - I thought a bigger deal should have been made of this.


Windows users are not that fussy. Mac users are very loud so they require special treatment.


Another way to put this would be "Mac users are accustomed to higher quality software than Windows users."

Of course, either way of putting it would be engaging in lazy stereotypes about both platforms that haven't been accurate for years, if they ever really were; so let's not.


So the majority of Windows software has HiDPI support?


So the majority of software has Mac support.


I dunno, data loss from backup software seems like a pretty big fuckup.

NB: I'm a very happy Arq user; I bought it because Time Machine crapped itself every couple of months, requiring me to recreate the backup (a pain in its own right: 400 GB over Wi-Fi) and leading to a severe loss of confidence that backups were actually going to be there if I needed them. I tracked it down to Time Machine getting really unhappy if you sleep your laptop while it's in the middle of a backup. However, having to check whether Time Machine is running before closing the lid, or being willing to run a 1-2% chance of corrupting the backup, isn't really a workable state of affairs.

In a couple years, Arq hasn't eaten my backups. It's like Time Machine except written by competent people. This situation sucks but I'm sure Stefan will add enough tests.

On a digression, this is part of my increasing frustration with how shitty Apple sourced mac software is. This tiny company beats the pants off Time Machine, even including the recent screwup.


I've noticed the same problems. Apparently it is possible to write a program that will keep the laptop from sleeping even if the lid is closed. (https://www.appducate.com/2012/12/keep-laptop-awake-even-whe... is one I found, but I haven't tried it.) Given that, I wonder if it is possible to write another program that will check whether a Time Machine backup is running, keep the machine from sleeping until the backup is completed, and then let it sleep.
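As a rough sketch of that check-then-hold idea, using macOS's built-in `tmutil` and `caffeinate` tools: `tmutil status` reports a `Running = 1` line while a backup is active, and `caffeinate -i` holds off idle sleep. (Caveat: `caffeinate` does not prevent the sleep triggered by actually closing the lid, so this is at best a partial solution.)

```python
import re
import subprocess

def backup_running(status_text: str) -> bool:
    """Parse `tmutil status` output; it contains `Running = 1` mid-backup."""
    return bool(re.search(r"Running\s*=\s*1", status_text))

def tmutil_status() -> str:
    """Fetch current Time Machine status text (macOS only)."""
    return subprocess.run(["tmutil", "status"],
                          capture_output=True, text=True).stdout

def hold_sleep_while_backing_up() -> None:
    """Poll Time Machine; hold an idle-sleep assertion while it runs."""
    while backup_running(tmutil_status()):
        # Block idle sleep for 60 seconds, then re-check the backup state.
        subprocess.run(["caffeinate", "-i", "-t", "60"])
```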


It really is incredible that a one man shop can do a better job with backup software than Apple's Time Machine. Good job, Stefan, and boo hiss to Apple engineering.


Adding to the sentiment of others, I've been a long time Arq user and grateful for Stefan's openness and honesty.

I imagine most or all of us with that sentiment did not suffer any data loss, or rather, did not need to do a retrieval and discover data loss during that window. It's much easier for us to write it off as a non-issue. If it had affected us, I imagine we'd all be quite a bit more upset.

Meanwhile, most people use common alternatives such as Time Machine, which runs into data issues every few months with far less transparency. Stefan could have quietly shipped a fix and waited to see if anyone discovered the issue, announcing it only if someone did. Instead he proactively announced it. So bravo. I'm sure he will put new procedures in place to hopefully avoid this in the future, but of course with any software there is always room for error no matter how much testing you do.


I'm a long time Arq user. I very much appreciate the honesty. I can't solve a data backup problem if I don't know what that problem is.

On this note, I've thought about something at times: if there's one feature I'd like to see, it's an option to verify backups. Beyond the possibility of Arq having a fault, it's also very possible that a backup service has silently failed. Either the files are corrupted, or the service claims the files exist but you can't actually download them. I would appreciate the ability to have files (either periodic random samples or specifically marked files) pulled and compared to local files to verify the backup. I would also want to cap the max files/bytes pulled in a month so that I don't burn too much money.
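A minimal sketch of that spot-check idea, assuming a hypothetical `fetch_remote` callback that downloads one backed-up file from the provider:

```python
import hashlib
import random

def verify_sample(backed_up_files, fetch_remote, byte_cap):
    """Randomly spot-check backups against local copies, capped by bytes.

    backed_up_files: dict of path -> local file contents (bytes)
    fetch_remote:    callable(path) -> remote bytes (hypothetical API)
    byte_cap:        stop once downloading another file would exceed this
    """
    spent = 0
    mismatches = []
    items = list(backed_up_files.items())
    for path, local_bytes in random.sample(items, len(items)):
        # Budget against the local size; the remote copy should match it.
        if spent + len(local_bytes) > byte_cap:
            break
        remote_bytes = fetch_remote(path)
        spent += len(remote_bytes)
        if hashlib.sha256(remote_bytes).digest() != \
           hashlib.sha256(local_bytes).digest():
            mismatches.append(path)
    return mismatches
```

Run monthly with a modest `byte_cap` and over time every file gets sampled without a surprise bandwidth bill.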

I know it took guts to make the post. You did well.


Love this idea!! Verifying backups, or some random portion of them, from time to time would be great. This is one of the biggest problems I've seen with backups: a number of cases where "it used to work," but a year later, when you actually need the backup, it doesn't.


I wonder if the easiest way of doing that wouldn't be to compare checksums? That saves having to re-download portions of a backup set, which could get expensive in terms of bandwidth costs and has the disadvantage of only confirming whichever files you happen to look at.
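As a sketch, assuming the provider exposes a per-object checksum (how you obtain `remote_checksum` is provider-specific and hypothetical here), the local side of the comparison could be:

```python
import hashlib

def local_sha256(path: str) -> str:
    """Stream a local file through SHA-256 so large files aren't loaded whole."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def checksums_match(path: str, remote_checksum: str) -> bool:
    """Compare the local hash to the provider-reported one (hypothetical API)."""
    return local_sha256(path) == remote_checksum.lower()
```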


But that's still trusting the provider. The provider likely computes the checksum at upload time and then stores it in metadata. So having that stored result returned is no more proof that your files exist than listing them.


Thanks for the honesty, I really appreciate this.

And BTW, I am a happy Arq user, I've been using it on multiple machines for the past several years. It is the ONLY backup solution out there that:

* encrypts my data without me having to supply the decryption key to a third party when decrypting,

* decouples backup and encryption from a particular storage provider,

* makes me independent of ebbs and flows of storage providers by supporting pretty much every one out there.

I've learned to value independence over cheapness, so I stick with Arq. Plus, it's been working really well over the years.


Duplicati hits everything you're talking about (I have essentially the same criteria for backups).[1] It's cross-platform (Windows, macOS and Linux).

I love using Arq on my MacBook Pro, but since they won't release a Linux client I use Duplicati on Ubuntu. I like it a lot.

__________________________

1. https://www.duplicati.com/


Nice! Thanks for posting this — I've been looking for something like Arq for Linux, and this seems to be it.


These are also handled by Borg [0], which supports compressed data as well. However, it does not have direct support for different cloud providers so the archive must be synced with different tools.

[0] https://borgbackup.readthedocs.io/en/stable/


A couple people use external tools (rclone and git-annex are popular) to sync their Borg repos somewhere else. This requires enough local space (hard drive / NAS / whatever) to store the repository. This can be an advantage (=storing to a local drive is very fast; if it's on a NAS the NAS can sync it to the cloud; also you can quickly recover files from a local drive), but is not always practical.

In the long run (Borg 1.2 - 1.3) we'll add an interface to Borg to make it storage-independent ("repository drivers") in a secure way (this is already sketched out, just not enough manpower in the project to implement it all right now. Finishing 1.1 has top priority and received a bit of funding).

Adding support for S3 and $threeDozenOtherAPIs directly to Borg is generally speaking unlikely.

PS: To-be-released Borg 1.1 will optionally get rid of cache syncs for "borg create".


Oof, unfortunate. Can well imagine the sinking feeling that comes with the discovery.

How do people feel about kill switches connected to C&C servers, so that if one releases broken software, one can block its use or force an update? I wouldn't be very upset if there were just a flag you could turn off to prevent kill-switch enforcement.
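A minimal sketch of such a client-side check, with the opt-out flag described above. The endpoint URL and JSON shape are entirely hypothetical; a real deployment would also want to authenticate the response so the kill switch itself can't be spoofed.

```python
import json
import urllib.request

KILLSWITCH_URL = "https://example.com/killswitch.json"  # hypothetical endpoint

def _default_fetch():
    """Download the blocklist; expected shape: {"blocked_versions": [...]}."""
    with urllib.request.urlopen(KILLSWITCH_URL) as resp:
        return json.load(resp)

def release_blocked(version: str, enforce: bool = True, fetch=None) -> bool:
    """Return True if this release is remotely blocked and enforcement is on."""
    if not enforce:          # the user-facing opt-out flag
        return False
    fetch = fetch or _default_fetch
    try:
        blocked = fetch().get("blocked_versions", [])
    except OSError:
        return False         # fail open if the C&C server is unreachable
    return version in blocked
```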


Google has a thing like that for Chrome iirc (and its granularity is to the feature level)


MSN Messenger had a thing like that, also granular to the feature level. Disclaimer: I was involved with its development 12 years ago or so.


As an Arq user, the honesty and openness are greatly appreciated. Hopefully, nobody has gotten bitten by this issue when attempting to recover data.


I have been a happy Arq user for years. Thanks for being so honest!

This only increases my trust in Arq. Every program will have terrible bugs once in a while, even backup software. This shows that we can trust Haystack software to inform us, rather than sweeping things under the rug.

And as always: make multiple backups to different locations.


Ouch. The openness is appreciated, though.

Beyond randomly checking a time in the past and trying to download a file, is there a way I can tell which / how many files are now missing?
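Hypothetically (this is not Arq's actual data model), if you could enumerate both the object IDs your backup records reference and the objects actually present in storage, the audit reduces to a set difference:

```python
def missing_objects(referenced_ids, remote_ids):
    """Return object IDs that backup records reference but storage lacks.

    referenced_ids: IDs gathered by walking the backup records (hypothetical)
    remote_ids:     IDs from listing the bucket/folder in the storage target
    """
    return sorted(set(referenced_ids) - set(remote_ids))
```

Every ID in the result corresponds to data that is unrecoverable from that target, which would also tell you how many files are affected.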


This could have been worse. I was using Crashplan and tried to add an ignorable directory in the web panel, which had no flags or warning labels or anything around it indicating that it could be destructive. The web server had some kind of error and when I reloaded the page, my settings were wiped out. Several hours later I got an email that all my backups were going to be deleted by 5pm. The email was received at 5:20pm. Customer support basically just shrugged it off, and I was an ex customer by the next day.

Shit happens, but that was particularly egregious. This is why you should have multiple backup methods in place, and if you've never restored one, you don't have a backup.


For those that use Arq, but counsel secondary backup tools as well, any recommendations?

I'm on a Mac - I use Dropbox (with permanent version history turned on) for common files, and Time Machine to a Drobo for short-term versioned/incremental backups (short-term because the backup invariably gets corrupted every year or so and I have to start over). (I also currently use Crashplan but am thinking of switching to Arq, since Crashplan's uploads are very slow - slower than my ISP's upload speed allows.)

I'm also figuring out a way to remotely have a mirrored boot drive for my Macbook but haven't quite gotten there.


In addition to Arq, I run a local time machine and backblaze.

Backblaze runs all the time on _everything_ (some 4TB), and I have Arq running nightly on my most important stuff (photos, documents, music, recordings) to AWS as well as Amazon Cloud Drive.


If you happen to have critical virtual machines or disk images, make sure you back them up elsewhere, since Backblaze doesn't back them up. Made it an annoyance for me personally, but it otherwise works rather well for the typical user.


Backblaze backs up my virtual machines... You just have to remove those file types from the exclusion list (where they are by default), and raise the max file size to "No Limit". Neither of these are hidden options.


Wow, as a Mac user of arq I've completely lost any trust I had in this app. I guess I'll have to run both arq and something like duplicati...


Never trust one backup solution. While I am happy to see people use any backup at all, as a professional and as someone who cares about my own stuff I feel a much more sophisticated approach is needed.

Since 1980, I have learned to keep multiple backups, using multiple independent software, stored in multiple locations, on multiple types of media, overseen by multiple people.

An overview of my personal backup system for my primary system a MacBook Pro:

* Bootable mirror to a remote hard drive via Carbon Copy Cloner

* Data backup using Arq to a home file server with RAID, and to Google Cloud (previously to Amazon and Microsoft)

* Data backup using Backblaze

* Data sync via Resilio to the home server

* Code stored in Git

* Some stuff in Dropbox, primarily synced with iOS apps

* FastMail IMAP email synced to a local backup store

* I just stopped use of CrashPlan after 9 years

I have other backups going for the home file server.

And of course, my servers for work have a whole other multilayered backup system.


How often do you create the bootable mirror to an external drive?


The clone runs once a day, and I rotate between two drives. The rotation is done by hand; I try to rotate once a week, but it's more likely every two weeks.


It's not a good idea to have just one backup system. I like to have 2 completely independent backups.


You're completely right, although typically I only overlap backup methods for very important files and photos. Otherwise the storage requirements/costs get pretty extreme.


Backblaze is $5/mo.


Guess this is a clear case for why diversity of backup formats is a good thing.

"If we could, we’d make backup copies of our valuable data on clay tablets" https://landing.google.com/sre/book/chapters/data-integrity....


bad things happen; it's all in how you respond. keep up the good work.


The blog post doesn't actually say that 5.8.5 repairs old backup records. Does it?


I read it as that it will re-upload files that still exist on your live system that aren't (but should have been) in your backups, but will not recover deleted files that should be in backups.


No. Old backup records might have missing objects (if....) and this cannot be repaired.


This is an important point that is not clearly addressed.


It's a bit off topic, but I wonder why they add such a visible ® after their name in the navbar.


I've worked with clients who were very adamant that the trademark and registered-trademark symbols be overly apparent. Can anyone explain why?



Probably just didn't occur to them to put it between <sup> tags.


This is part of a rather disturbing trend of shipping frequent and not-well-tested updates. Another example of this is 1Password.


What did 1pw do recently?


Everyone makes mistakes. This one hurts, but they aren't trying to sweep it under the rug.

