CrashPlan is exiting the consumer market (code42.com)
274 points by akulbe 114 days ago | 334 comments



I actually have 3 CrashPlan for Families accounts on my hands.

My mom and dad's iMacs are one of them. This is literally easy money for CrashPlan. But the price would double to move them to the new Small Business plan.

What I liked about CrashPlan, besides the unlimited space, was that if files got deleted and you only found out about it 3 months later, you could still recover them.

With Backblaze, which is mentioned here, you couldn't. Which, for me, defeats the whole purpose of doing backups at all.

See here:

https://help.backblaze.com/hc/en-us/articles/217666688-What-...

This makes no sense to me: if a malicious hacker, virus, Trojan or your kid deletes a folder... by the time you find out, Backblaze might already have purged its own copies of it.


After we are done talking about Backblaze being a snappy app (yes, it has always been many times faster than CrashPlan), having a cleaner and simpler interface (no doubt here), and being a lot faster (this has been my experience and seems to be the case for a lot of people), it really isn't a good choice as a backup service.

I have mentioned before, I think here on HN too, the data retention policy is just too bad and their reasoning for it, that "it's not an archiving tool", is beyond ridiculous.

And then their interface for restoring files is only from the website. Yes, even if you have to download a 15GB file you do it in the browser or ask them to ship the hard disk to you. I guess it's tough luck if you are in a different country, or different continent altogether.

I will stick to CrashPlan for the time being and go for the 1-year Small Business plan (at a 75% discount for existing Home customers), but beyond that, or maybe even before that, I am looking at something solid and self-hosted. I already back up my Dropbox folder to Tarsnap (which has a kind of okay GUI now); I will look at Glacier or a VPS and back up using something like Attic or Borg. Sad they have no GUI. I feel more confident about backup apps with a GUI - I find it intuitive to customize, and I can see at a glance (or keep watching) what's happening if I want to.


Brian from Backblaze here.

> I guess it's tough luck if you are in a different country, or different continent altogether.

Backblaze sends USB hard drive restores to other continents every day! Today we have one restore shipping to Finland and one to Great Britain. And to be clear: it is a $189 USD deposit and we absorb 100% of the shipping cost, whether it is going to Texas or Finland.

Also remember that the USB hard drive restores are "free" to customers if they return the USB hard drive to our office within 60 days. The only thing the customer has to pay for is "return shipping", and that can be on a super slow service - we're talking about 3.5" drives here that weigh just a few ounces. Or you can keep the 4 TByte drive for the $189 deposit (it turns into the "purchase price"), which includes shipping, the hard drive and the service.


Hey Brian, so nice of you to be commenting here. Thank you.

I am sorry if I am sounding like an obstinate whiner but no, I don't want that. I mean I am glad if you offer that but I want:

1. A way to restore that works from the app and that can withstand disconnection and gaps of hours or maybe days - right inside your really awesome app (I mean it), with partial downloads and shit.

2. (I didn't know about international shipping, thank you for telling me.) I don't want to rely on that, and I honestly am not looking forward to covering whatever little (or not so little) cost that is; being outside the USA it will be more, and then the hassle of returning the drive is even worse.

You know, I may just want 30GB out of my 3TB, or hell maybe even 5GB and even for that "the browser restore interface" is just sad.

As a customer I would urge you to have a look at it and maybe listen to customers on it (or future customers :))

And data retention! You didn't even touch upon it. Kind of shows your/BB's stand is clear on it, right?

[edit]

And I forgot that you need my encryption password for sending me that hard disk. Now that's a design problem you shouldn't even have in the first place :(


It is a lot of valid feedback, we're doing a lot of emergency planning this week, but when the smoke clears we'll definitely start figuring out a "local restore".

> I may just want 5GB out of my 3TB, even for that "the browser restore interface" is just sad.

I'm slightly confused by this one. I assumed that pixel-for-pixel the interface of local restore would match the pixels in "the browser restore interface"? Is there an assumption that if it is a local application, then there would not be a "tree view" you see in the web and instead you would get something that looks like a Macintosh Finder or a Windows native "Explorer" interface?

If that is the case, what if we built something that looks IDENTICAL to the local Finder or Explorer but in a web page would that make some people more happy?

> And data retention! You didn't even touch upon it.

Just an oversight. :-)

> Kind of shows your/BB's stand is clear on it, right?

No, I can ALMOST guarantee you the Backblaze data retention policy is about to change, and possibly within a week. We heard all of you loud and clear and we're just struggling to figure out exactly how much this is going to cost Backblaze, or exactly how much of that cost we plan on passing on to customers and what that looks like.


Thank you again.

No, what I meant by:

>> "I may just want 5GB out of my 3TB, even for that "the browser restore interface" is just sad"

Browsers are usually not (or maybe never) good at downloading very large files. For example, I was downloading a 6GB ZIP from BB using my browser and it crashed when I was 5.6GB in. In most cases those 5.6 GB would be gone and I would have to start over. Whereas a desktop app or some kind of download manager/helper (which would have my login state/credentials just like the Backblaze app has) could do it over a period of time and would survive the machine going to sleep, shutdowns, network disconnections, etc.
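
Just to illustrate what I mean by a resumable restore, this is the kind of thing any download manager already does (the restore URL below is made up):

    # -C - makes curl inspect the partial file on disk and continue from that offset
    curl -C - -o restore.zip "https://example.com/restores/my-backup.zip"
    # re-running the exact same command after a crash, sleep, or network drop resumes
    # the transfer instead of starting the 6GB download from zero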

But since you mention:

> what if we built something that looks IDENTICAL to the local Finder or Explorer but in a web page would that make some people more happy?

As I've mentioned above - it was about the functionality and not the look and feel.

And, imho, it would really be good if you could move the "restore" functionality into the main Backblaze app:

- where I can see my files and hopefully the available versions too, and select a version > the files at that version, or a file > the versions of that file (some apps first let a point in time be selected and then the files at that time are available for restore, and some apps do the opposite)

- or, at the least, let me do all that on the web and have the Backblaze app start restoring into a folder designated by me when it connects to the Internet again

- restores retain the same folder structure. E.g., I am restoring a.png which was at ~/MyFolder/A/B/C/Z/B/A/CoolPics/a.png and the restore function dumps it like this: ~/<BackBlaze Restore Folder>/a.png. Ideally it should give me the option to either dump just the file or give me the restore like this: ~/<BackBlaze Restore Folder>/MyFolder/A/B/C/Z/B/A/CoolPics/a.png, or at least recreate it where it was and, if another copy with the same name exists, rename the other file; but this (latter) option would be a mess in the case of many files

> I can ALMOST guarantee you the Backblaze data retention policy is about to change

That's very good to hear. Probably the best thing I've heard since yesterday :-)

PS. Please make sure this retention is something really cool and preferably not something like "well, let's make it 2 or maybe 3 months from 1" :-)

Because, as you must know, the point of a backup is that if I go looking for an important Excel file after 7 months and it has somehow been corrupted or deleted, I'll turn to my backup; with a short retention it would be gone forever, defeating the purpose of my backup.

While we are at it, here's another piece of feedback: maybe start warning users when Backblaze sees files it was previously backing up get deleted, before actually deleting them from the archive (mark them for deletion in a month or so). Maybe let the user take some action, like "yeah, go ahead, get rid of them" or "Uh-uh, didn't mean to delete them - please restore them or put them back where they should be".


Linux support please...


I've done this. Cost me like $15 to get all my data back in 2 days :D


It also works on Linux, which Backblaze doesn't. I have Ubuntu on all the family machines and some extended-family machines, and I'm using CrashPlan on them currently.

Can anyone recommend a good backup alternative that works on a Linux desktop?


For full folder backups, duplicity [1] has been great for me, does incremental backups, and has quite a few backends. You control rotation policies, full-snapshot frequency, etc, and bring your own storage (works with s3, rsync, etc.). Full-folder recovery is also pretty easy in my experience.

The downside is that it's all on you to configure providers, pay for storage, and I'm not aware of any indexing or ability to retrieve single files. Also, it's CLI-based, so it might not be very intuitive for desktop use. Still, a free solution with lots of utility. :-)

[1] http://duplicity.nongnu.org/
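
In case it helps anyone evaluating it, day-to-day use looks roughly like this (host, paths and retention below are placeholders, not a recommendation of specific values):

    # incremental backup of ~/Documents over SSH; a fresh full chain is started monthly
    duplicity --full-if-older-than 1M ~/Documents sftp://user@backuphost//backups/documents

    # full-folder recovery into a new location
    duplicity restore sftp://user@backuphost//backups/documents ~/Documents.restored

    # rotation: drop backup chains older than six months
    duplicity remove-older-than 6M --force sftp://user@backuphost//backups/documents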


The issue with these for me is that with the common cloud storage providers, even just backing up my Documents folder raises it above the price of BackBlaze or a similar offering.


Duplicity can back up to Backblaze's B2 service, which is an S3-alike where you only pay for storage and download, not upload. $40 to restore a backup in the case of a failed disk, and a few bucks a month (literally) for storage.
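
If I remember the syntax right, pointing Duplicity at B2 is just a different target URL (the account ID, application key and bucket name here are placeholders):

    # back up to a B2 bucket; storage is billed per GB, uploads are free
    duplicity ~/Documents "b2://ACCOUNT_ID:APPLICATION_KEY@my-backup-bucket/documents"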


Anyone have any experience with Sia? $2 per Terabyte-month sounds really attractive.


I don't think there's an easy way to push incremental backups into Sia.


If you're willing to pay €50/year, you could get a hubic [1] subscription for 10TB of cloud storage.

I haven't tested it myself though, just heard about it.

It's supposedly possible to share this account with friends, because duplicity backups are GPG-encrypted.

[1] https://hubic.com/en/offers/storage-10tb


Hubic has an unpublished 10Mbps upload speed limit [1][2], which should be sufficient for most backups, but is still something worth taking note of.

[1] https://forums.hubic.com/showthread.php?173782-10Mbps-bandwi...

[2] https://www.reddit.com/r/DataHoarder/comments/4o9tm2/psa_hub...


I use rsync [1] to backup my laptop to my server twice an hour. (Hard-linking files that already exist in previous images.) Then another script [2] on the server prunes old backups every night.

[1] https://gist.github.com/azag0/187174485a35a70d9d1d79c8533a6b...

[2] https://gist.github.com/azag0/b975c8a1935db29376e20926404d00...
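
For anyone who doesn't want to dig through the gists, the core of the trick is rsync's --link-dest (host and paths below are illustrative, not copied from my scripts):

    # unchanged files become hard links into the previous snapshot on the server,
    # so every image looks complete but only changed files take up new space
    NOW=$(date +%Y-%m-%dT%H-%M)
    rsync -a --delete --link-dest=../latest /home/me/ me@server:/backups/laptop/"$NOW"/
    ssh me@server "ln -sfn $NOW /backups/laptop/latest"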


https://www.cloudberrylab.com

These guys have a fantastic tool for rolling out a backup plan using storage that makes sense for your needs.


I agree it's a concern. If you're on a Mac I'd recommend doing Time Machine backups as well. I consider those my first-resort backups for file recovery and such. My Backblaze backups are a worst-case-scenario, belt-and-braces, off-site option.

Even my time machine backups only go back about a year though due to capacity limitations on the backup drive. My most critical files are also on Dropbox, which I know isn't really backup but it's nice to have as an extra recovery option.


Agreed, it's not a backup if it only exists in one form, or one space.

After the purchase and eventual death of your 10th or 15th laptop, you kind of want to spend less time doing those transfers, especially with complex installs that can't be trivially pushed to the cloud.

For Mac users, I like the backup practice of using two drives in two locations - Time Machine for incremental backups and Carbon Copy Cloner for bootable backups - and then bumping one of those images up to the cloud. One drive on a desk at home, one at the office, off-site, or at a family member's house.

Additionally, keep a third hard drive in rotation elsewhere, or use a local NAS that bumps up to the cloud for you as well.

The bootable backups that tools like Carbon Copy Cloner or SuperDuper provide are priceless in my mind... you can simply plug your HD into another Mac and boot off the external drive, not missing a beat while your hardware gets attention.


I don't have bootable backups at the moment but it does sound like a good idea. I'll look into that, thanks.


The bootable backup is key. As the commenter above said, that will be the thing that lets you get back up and address any other missing data and find long-term replacement hardware. SuperDuper has saved me and my family a couple of times.


I'm in this same predicament. I used to have to do constant backup/restore on family member computers, and Crashplan really made that painless. This transition is going to be a real pain.

So far it doesn't look like they're keeping the free peer-to-peer backups, which I was using for the less important computers, so this would more than double the price if I went for the SB plans.


> So far it doesn't look like they're keeping the free peer-to-peer backups, which I was using for the less important computers, so this would more than double the price if I went for the SB plans.

Yeah, it seems that not only would the peer-to-peer backup process stop working, the already-backed-up files would be rendered useless too.

Relying on user accounts and "backup codes" stored on CrashPlan's servers sounds like it was an additional point of failure.


The same point about file recovery was why I chose Crash Plan and why I did not believe their plan could be sustainable. :(


There was probably a lot of abuse going on with the "unlimited" disk space aspect of it. I can imagine some people backing up their entire movie collection to their servers.


It's not "abuse" if you claim it's unlimited and people use it as if it were.


It is abuse, however, if you sell a service as a backup, and people use it as primary storage (e.g, by running backup/restore cycles to store and retrieve files). I don't know if people were doing this, but given some of the silly things I've heard of happening on other backup services, I wouldn't be surprised.


It's not abuse if you are following the TOS though. So if you want to classify it as abuse you need some objective measurement of what is acceptable use or not. 1 restore a month? 10? 100? You have to have a documented AUP before you can call some usage pattern abuse.


Sure, this is a figure of speech -- if it were abuse in a technical sense they would be going after users for terms of use violations rather than changing their pricing. I think the intent here is that they probably predicated their pricing on a certain anticipated usage pattern where most of their users would be regular people making backups of typical hard drive sizes and contents, and that more people took advantage of the unlimited offering to do something other than that, which meant their pricing didn't cover their costs.


I have a feeling (if this has not already happened) that someone is going to sue over the use of the term 'unlimited' and some court is going to rule that 'unlimited' does not mean 'without limits' but 'a reasonable amount based on the average expected use'.


This was sorted out back in the dial-up days when providers (including the one I worked for) offered "unlimited" dial-up service.

Customers sometimes argued that they should be able to stay connected 24/7 all month long because that's "unlimited".

Our definition (that we indicated in our terms) was "you will never be billed a surcharge for usage during a month", but there was no guarantee of being able to use a certain number of hours. So we reserved the right to de-prioritize (i.e. disconnect) high-usage users during peak times to allow other users to get on when our lines got full.

I'm sure somewhere in Crash Plan's terms of service they define what the term "unlimited" means in the context of their offering, and I would bet it means "there will never be a surcharge for usage of storage".

So they will never ding you for using more than X storage, but there's no guarantee that you can store a specific amount of data on the service.


While I like Backblaze's storage offering, they really have an extremely inferior management platform for their system. Management of devices is manual and reports are sent manually in Excel format. Cancellations and device reallocation are also done manually and over email.


If you're looking to recover files 3 months later, have you tried setting them all up with Time Machine as the first layer of backup? Cloud backup services should really just be for catastrophic cases.


Yeah, I'm bummed about this as well. One answer to your accidentally deleted file might be to do snapshots at home (I do them using ZFS). That way, you can recover those locally if required and then use your external backup provider for true disasters. But I agree it's a shitty compromise.
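
If it helps, the local-snapshot half of that is only a couple of commands (pool and dataset names here are made up):

    # take a named snapshot of the home dataset; run this from cron as often as you like
    zfs snapshot tank/home@$(date +%Y-%m-%d)

    # list what you have, then copy a deleted file back out of the hidden .zfs directory
    zfs list -t snapshot
    cp /tank/home/.zfs/snapshot/2017-08-01/Documents/report.ods ~/Documents/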


WTF. This sucks, my main use case for Crashplan is downloading files that I long ago deleted (or left on an old machine), or downloading stuff from old external hard drives (which Backblaze also doesn't support). I might just suck it up and pay the small business price.


Sounds like you're not using their backup service as a backup service, then, and perhaps part of the reason why they've discontinued it.


To my mind, keeping copies of old, deleted files is an important part of a backup. Just to be clear, I wasn't using crashplan as a kind of media file store or anything - their restore experience was far too clunky for me to bother with that!

I'm more concerned about old files I've accidentally or deliberately deleted that I later turn out to need.

Anyway, my storage requirements are pretty low (probably 1tb) so I'm going to look into solutions like B2 with Arq (along with my existing Time Machine).

Just frustrating to have to go through all this!


So what's a good Linux consumer solution? We here are all capable of rolling our own hacks, sure. But I want a good clean friendly product I don't have to configure or think about. The real challenge in these things is the UX for restoring files.

(For home-grown options, I've been using rsnapshot for years. But it really works best on a local disk and the rsync-over-ssh hacks I've done are the opposite of "clean consumer solution".)


Duplicity [1] is a free software project that creates incremental backups and encrypts them using GPG. It can use Backblaze B2 [2] as a remote synchronization option, which when I looked at options last year was also the cheapest cloud storage provider anywhere (if this has changed, let me know!). Duplicity also supports a very wide array of other cloud storage services if you'd like to use another one [1].

The killer feature Duplicity has for consumer use is that it's probably the only (?) backup program on Linux that has an actual quality desktop GUI, called Déjà Dup [3], which is included by default in Ubuntu. It works somewhat similarly to how Time Machine works on Macs. But if you need to back up headless systems, the Duplicity command line interface works fine as well.

When it comes to backup features it's not the most powerful tool compared to some other solutions like Attic or Borg, but I think the GUI and out-of-the-box (encrypted) integration with cloud services make it one of the most user friendly solutions on Linux.

[1]: http://duplicity.nongnu.org/

[2]: https://www.backblaze.com/b2/cloud-storage.html

[3]: https://www.linux.com/learn/total-system-backup-and-recall-d...

[3b] EDIT: Duplicati also has a nice web GUI. Worth looking into as well.


While it works well, Duplicity is rather slow and only supports expensive full-incremental-full backup cycles.

Modern incremental-forever backup tools like borg[1] (fork of Attic[2]) are much faster since they are based on block hashing, which also gets you deduplication for free.

Basically, a backup is a set of hashes - this means that you can selectively delete or retain old backups without having to merge them.

I built a >50 TB enterprise backup cluster with borg and it works extremely well.

[1]: https://borgbackup.readthedocs.io/en/stable/

[2]: https://www.stavros.io/posts/holy-grail-backups/
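
As a rough sketch of the incremental-forever flow (repository location and retention knobs below are illustrative, not our production setup):

    # one-time repository setup, encrypted with a repo key
    borg init --encryption=repokey ssh://backup@host/./borg/laptop

    # each archive only uploads chunks the repository has not seen (block hashing = dedup)
    borg create --stats ssh://backup@host/./borg/laptop::$(date +%Y-%m-%d) ~/Documents ~/Photos

    # thin out old archives selectively; no merging of incrementals is ever needed
    borg prune --keep-daily=7 --keep-weekly=4 --keep-monthly=12 ssh://backup@host/./borg/laptop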


For the "cheap" part one'll probably need a system that has erasure coding and can survive partial outages.

E.g. if one backs up to hubiC, OneDrive and Google Drive with RAID5-like redundancy, they can have just 33% overhead and still be sure that, should any of those vendors discontinue the service or suffer a failure, their data would still be safe. Some call that RAIC: a Redundant Array of Inexpensive Clouds.

git-annex and Tahoe-LAFS do this, but they're not actual backup solutions.


Borg is amazing, except for the lack of public key encryption, which makes it unusable: https://github.com/borgbackup/borg/issues/672


Been keeping my eye on restic for just this reason.

https://github.com/restic/restic


oops, missed the 'public key' part - restic, last I checked does not have that... the project author is thinking about the design of it:

https://github.com/restic/restic/issues/187


Agree, but lack of public key support is the only reason for me not using it


We've handled backup with Duplicity for various things, and found it useful.

If you go with this one, do be aware that by its nature, restores when you've done a lot of incremental backups can be extremely slow. (Think hours, where other systems would retrieve the data in minutes if not seconds.)

That means you really need to be doing a full, and therefore also slow, backup reasonably frequently to keep Duplicity usable. As a rule of thumb, we found a full backup each month and dailies each night was just about tolerable even if you needed a restore quite late in the month, but even then a 10GB backup with 30 days of incrementals on top could be looking at an hour or more to retrieve a single file.

As I understand it, Duplicity's scheme can also be vulnerable to minor corruption (e.g., due to disk error) anywhere since your last full backup taking out the entire backup, so if you're going to use it then you might like to investigate the options it provides for mitigating this risk as well.
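
Something like the following nightly job implements that rule of thumb (paths and counts are examples, not our exact setup):

    #!/bin/sh
    # run nightly from cron: duplicity uploads an incremental, or starts a fresh full
    # chain automatically once the current chain is a month old
    duplicity --full-if-older-than 1M /srv/data sftp://backup@host//backups/data
    # keep only the two most recent full chains so a restore never has to replay
    # a long tail of incrementals
    duplicity remove-all-but-n-full 2 --force sftp://backup@host//backups/data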


Adding a vote of support for Duplicati. Been running it in a Docker container on my servers for a while and it's been pretty smooth sailing. Sending all backups to a G Suite business account.


Before any confusion arises, Duplicity and Duplicati despite their similar names and goals are two different projects that are not mutually compatible. The majority of my post is about the former, not the latter.

If my understanding is correct Duplicati started out as a C# rewrite of Duplicity (hence the name), but the projects have grown apart and are different things now. I don't think you can use Déja Dup with Duplicati, for example.


Duplicati is surprisingly easy to set up for Windows -> Linux. I installed the Windows client, made an account on my Linux box and pointed the Windows client at it.

Is it a proper service on Windows? That's the problem for me: I want to configure it from an admin account and have it still work when non-admin/non-tech-savvy users log in.


I can't answer your question, but every time I've tried Duplicati the DB gets corrupted pretty quick.


Some folks at r/DataHoarder recommend Duplicati: https://www.reddit.com/r/DataHoarder/comments/6vbcok/crashpl...


I'm pretty sure that whatever solution the r/datahoarder people go for is exactly the one I DON'T want to choose. Those are the exact people who make attractive pricing unsustainable.


Maybe you don't like the way they abuse their accounts, but it doesn't mean their tools are bad.


It does mean the tools are more likely to be driven out of business, though.


Duplicati is backup software only; you bring-your-own-(local or cloud)-storage.


I think the ship for "good clean friendly product I don't have to configure or think about" sailed with the mention of consumer Linux.


There's rsync.net which very explicitly supports rsync/zfs over SSH


I have an account with rsync.net and I like them a lot for their Unix-friendly services. However, I unfortunately have to recommend that you NOT use rsync.net if you are on a Comcast/Xfinity cable modem.

rsync.net has some kind of problem with the "Powerboost" feature. The first 10 megabytes of a transfer are fast as hell, then it drops to ~150kB/s. Have fun backing up gigabytes at that speed. There is something wrong with their traffic shaping. After much back-and-forth their tech support basically threw up their hands and gave up.


Do Comcast throttle torrents for you? Perhaps the traffic to rsync in some way looks like torrents to them. Or like spam/DOS or something like that. Or like a competitor. The things I've read about Comcast are not pretty, alas.


No, my torrents routinely saturate my connection. It's definitely rsync.net. In the past they have tweaked their settings and been able to get it to work, but it has never stuck.


I have 640 GB to backup, that would be 52 USD per month on rsync.net.

Dropbox takes 9.9 USD per month for 1 TB


I use rsync.net with borg. Works great.


They have a dedicated account for borg/attic: http://rsync.net/products/attic.html

There's also restic, which works similarly but has native support for cloud storage: https://restic.github.io/ - sadly its claims of being efficient are rather undermined by the continued lack of compression support.
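
For reference, the native cloud support means no extra glue is needed; with B2 it looks roughly like this (bucket name and credentials are placeholders):

    # restic picks up B2 credentials from the environment
    export B2_ACCOUNT_ID="..." B2_ACCOUNT_KEY="..."
    restic -r b2:my-restic-bucket:/laptop init
    restic -r b2:my-restic-bucket:/laptop backup ~/Documents
    restic -r b2:my-restic-bucket:/laptop snapshots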


I've been really happy with Rockstor - http://rockstor.com/ - it's essentially like FreeNAS, but with Linux, Btrfs, and lower RAM requirements.

I have a mirrored 2x6TB setup that all of my computers back up to. It shows up as an Apple Time Machine target, so MacBook backup and restore is easy; and then it also holds some media storage with automatic nightly/weekly/monthly snapshots - the monthly ones hang around for a year, the others are cleaned up sooner. For those, I have to un-hide the snapshot folders, but then I can just copy-paste any deleted files that I want to restore.

It definitely takes some setup work, but I've been pretty happy overall.

That server is in my basement, my work laptop also has CrashPlan for off-site backups, but that's a business plan, so it won't be affected by this. I do need to figure out an off-site backup for the rest of it...


I have a strong suspicion Linux support doesn't exist because that user category is going to buy one unlimited plan, put it on their 20TB home SAN, and, well... 20TB of cloud storage is not cheap (price it out at Backblaze's business pricing).


The last one I tried was SpiderOak. I've got to be honest though, I wasn't a huge fan then, and it still looks like things haven't improved all that much recently: https://ctrl.blog/entry/review-spideroak


I don't think I'd call it a "consumer" solution really, but it works very well, once you set it up.

rsync.net

It's not the cheapest either, but I never had issues when I was using them.


Tarsnap.


Tarsnap is not a reasonable alternative for most users. Consider that I was paying $5/mo for 200GB of storage from CrashPlan. From Tarsnap, that would be $50/mo.

And, frankly, my measly 200GB isn't much data. A 2TB HDD costs under $100 and could cost $500/mo to back up with tarsnap. It would still have been $5/mo with CrashPlan.

I get that Colin is basically just charging the S3 costs plus a small margin, but the reality is that S3 is far too expensive to be used to store low-value consumer data.


The most expensive S3 storage tier is $0.023/GB/month, or $4.60/month for 200 GB. That's quite a markup to $50.


It seems the price of S3 has dropped significantly over the years while tarsnap has remained more or less constant. I guess my statement was out of date.


I backup quite a bit and after deduplication with tarsnap it's not nearly as much.


I've been using duplicacy and it supports a bunch of different remote endpoints. https://duplicacy.com/


I hear that a lot about tarsnap, do you all mainly back up VMs or log files?

The majority of my most important backup data is my photos. I don't see how Tarsnap could deduplicate (or further compress) that.


I evaluated tarsnap for my photos a while back, and this was the conclusion I came to. I would have been paying for basically the entire size of my photo library, which is quite large.


That's kind of another problem though. No way to predict costs in advance. All you can do is know the upper bound.

I have approx 2.5tb on CrashPlan, so that gives me an upper bound of $625 per month. Maybe it might be 50% lower when deduped, maybe more, maybe less. Who knows. Not really a pricing model I can make a decision on.

Tarsnap looks great for certain use cases. But it doesn't really make sense for home users with a large amount of data.


The Tarsnap client has a "dry-run" mode, which will dedup your data and tell you how many GB it will take up. Based on that, you can then calculate how much it will cost.
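
For reference, the estimate run looks something like this (archive name and path are placeholders); nothing is uploaded, it just reports the sizes:

    # simulate a backup and print how much unique data would actually be stored
    tarsnap -c --dry-run --print-stats --humanize-numbers -f estimate ~/Photos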


Didn't know this. Thanks. I might give it a look as I'm being forced to switch provider anyway.


You can do a dry run with Tarsnap to see how much it would transfer and how much it would deduplicate before ever running for Realz(tm).

Using the --dry-run flag.


While I'm sure there would be minor savings, the price differential is so large that the data would need to be >90% duplicate to be cost-competitive even for small data sets.


Agree! It now even has a nice GUI https://github.com/Tarsnap/tarsnap-gui


One of the options I looked up after getting the email from Crashplan. Unfortunately the price is crazy, it would cost over $500/month for the amount of data I have in Crashplan for $5/month. (Granted that product is going away, but even the upgrade path to Crashplan Pro will only cost $10/month.)


You can only say that after you have tried a dry run of Tarsnap to see what your actual usage would be. You can't take the rate Colin charges at face value because that is not what you will actually wind up paying.


Haha, I was just typing the same thing; you beat me to it.


In my mental checklist of evaluating new products and services, Backblaze has always gone above and beyond while Crashplan irritated me enough to make me switch:

* Are they one of the market leaders?

* Do they have a company blog? Are they passionate about their product? (Backblaze's technical articles are always a treat; barely a word from Crashplan.)

* Do they regularly interact with their users, e.g. on HN? (I often see Backblaze commenting on relevant articles.)

* Is their website user-focused, or is it aimed at the enterprise? (Crashplan's website is a confusing mess.)

* Do they offer secondary products that relate to their primary offering? (B2 leverages Backblaze's fine-tuned backup architecture.)

* Is their software well-built/designed, or is it annoying and bloated? (Crashplan's client forces you to wait for reindexing every time you check a new folder! The tree view is also very annoying, and Backblaze's whitelist just makes a whole lot more sense.)

A shame for Crashplan users, but I hope this change ensures BB's longevity for many more years!


Their 30 day file retention policy makes all these questions irrelevant.

"Do they have a blog"?

Really? I'm sorry, but when it comes to backups I'd rather have them working on the product instead of having a marketing machine up and running.


There's a pronounced difference between flat-out marketing and posting interesting articles that help promote your product. It's clear to me which side of the divide Backblaze lies on. And I think Crashplan's shuttering bears this out: their tech was always a means to an end for them, and their end simply did not include power users.


This is basically the marketing that got Backblaze its respect.

The service features and GUI experience fall short, and yes, I am largely just referring to the 30-day file retention. In practice, this basically means you need yet another plan to make up for that. Once you find that other plan, it should make Backblaze unnecessary if it works in a reasonable way.


There are only so many interesting posts you can write about "backups". It's a boring product, like "email". So you don't use Gmail because the team doesn't write enough enthusiastic posts about it?

This is typically a product you just want to work and are not busy with reading the company's blog as long as things are working fine.

You set the settings and can forget about it.

That said, your whole post seems somewhat biased.


I've always been a big fan of BB's blog, and have recommended their approach to friends and clients for their own blogs and sites. Their data, plans, and progress are not only interesting to people who care about the industry, but it's also a window into the kind of people they are. Proud of their accomplishments (their backup servers, which they've open sourced), and their tools (like their server lifts).


Poor thing, it's a blog. It's maintained and updated by the consumer oriented corporate brand. It's marketing.

I am not saying it's evil or that you're wrong, but that your deductions are illogical.


Perhaps GP is trying to say that, in contrast to "just a blog", it's a blog with really good technical content. This makes, IMHO, a great difference, since basically anyone can "just write a blog" full of buzzword bingo, but writing a competent, high-quality, relevant, technical blog is really hard and takes time to do properly. Doing so might mean that they care for their product, that little extra love. That in itself is marketing, yes, but also an indicator of pride, which in itself _might_ indicate a well-greased machine with lots of love spent on it.

I'm a bit partial though, as that fits us quite well in what we are doing at our place.


Backblaze's reports on their hard drive failure rates regularly top Hacker News' front page: https://news.ycombinator.com/from?site=backblaze.com


* Do they have a Linux client? -No

* Do they allow you to recover deleted files? -No (I have used this a few times, it saved some of the digitized family photos which I blew away with an rsync --delete flag)

* Do they offer secondary products that relate to their primary offering? Ok, I can read that as they are stretching too thin not focusing on backups enough. Will they announce they are dropping out of consumer backups and moving on to B2 as their main product?

* Is their software well-built/designed, or is it annoying and bloated? - Currently it doesn't run at all on my systems.


* Do they allow a data restore without you giving them your encryption password? - No: https://www.backblaze.com/backup-encryption.html


Ha, that was surprising indeed:

" we decrypt your data on our secure restore servers and we then zip it and send "

That seems shady to me


The key drawback for Backblaze for me is their lack of decent external HDD support. If you don't connect your external drive for 30 days then the files are marked for deletion.

I usually keep my photo library on external HDD and archive every few months from my laptop. I'd be anxious about not remembering to plug back in to keep the files backed up to the cloud.

If Backblaze sorted this out and extended the time to a year then I'd happily sign up again.


Agreed, it's a bit annoying. Wish they would fix it, or at least extend the limit. (Also, I can't seem to get one of my USB sticks to reset even when plugged in. Do they get queued up if there's a longer backup in progress?)


I switched from Backblaze to Crashplan a few years ago, and I've been MUCH happier with Crashplan. The biggest issue is that, at least on Mac, Backblaze noticeably slowed down the machines (multiple machines, not just a problem on one), while Crashplan didn't. Any backup software that makes my computer feel slow, creates regular beachballs, etc. is not something I'm going to pay for. Then the nail in the coffin for them was when I very carefully followed their instructions for migrating to a new computer, and yet it STILL insisted on starting from zero and backing up everything again. I figured if I had to backup from scratch anyway, I might as well do a trial run of something else, and I haven't looked back since.

So, this news makes me very sad, but there's no way I'm returning to Backblaze...


Disclaimer: I work at Backblaze, and I'm one of the main people who writes the client.

I would encourage you to try the Backblaze client again. We JUST released the 5.0 client (last week) and we continue to make little fixes that improve the customer experience. The client team's MAIN GOAL has always been that the client run in "Continuously" mode and NOT annoy the customer.

For example, even up until a year ago we didn't realize how much reading certain data structures from disk that we need for backup was hurting performance on non-SSD computers. As developers we tend to run with SSDs now, so it went undetected for a while. So we're in the process of making that more and more invisible for customers still on spinning drives in their laptops. The way we make this invisible is actually by slowing down the reading of those data structures so that it doesn't interrupt what you are doing.

> and yet it STILL insisted on starting from zero and backing up everything again

If you do an "Inherit Backup State" you will not re-upload everything. However, Backblaze has to read EVERY file off of local disk and compare the checksums (SHA1) and if the files have moved it must deduplicate and record the new location, so this can appear like a massive load on your computer for a few hours. I would encourage you to open the Network control panel and notice no files are actually pushed.


It's pretty clear in the comments here that all efforts other than solving the 30-day file retention limit are going to be a waste of time until you address this. It simply reduces the utility of the product to the point that you need yet another product, which will almost inherently render your product unnecessary.

Until you address this, I can't justify investing the time, the processor performance, or the tiny monthly payment.

Complete backup plans are taken seriously by myself and other HN users and when one tries to put one together and include Backblaze, it just doesn't add up.

Maybe you could make a blog post that addresses your recommendations head on. I remain open!


> all efforts other than solving the 30 day file retention limit

Trust me, that message has come through loud and clear! :-) We are going to do some analysis to see if we can afford it, but we'll need at least a week or two to figure that out.


Another feature (which didn't exist the last time I checked) SHOULD be to have /append only/ accounts.

It should be a /more privileged/ instruction to delete / replace / modify already stored files. This could prevent a backup service on a compromised system from removing remote backups. (Assuming the administrative information was kept secure using other means.)

Ideally I'd like to be able to manually assign privileges to sub-accounts.

    * Modify: resume uploads older than 24 hours
    * Modify: remove/delete
    * Modify: change storage (filestream/parameters/metadata etc).
    * Append: create new buckets
    * Append: add a new file to a bucket
    * Append: add a new /version/ of a file in a bucket.
    * Append: add metadata
    * Read: all list operations
    * Read: all download operations
Note: Since the main way of racking up money on a B2 account is downloads and since compromised clients could be used to engage in a DDoS attack (by compromised legitimate customer accounts) //read// operations are actually more sensitive than might be initially guessed.

A simplified customer UI might bundle those operations together, but some advanced way of providing finely grained privileges should be created.

Edit: Fixing formatting.


If you find that significantly expanding the retention limit doesn't fit within the current pricing structure (as I expect it wouldn't given how competitively priced Backblaze is), consider offering it as an add-on feature, as Dropbox does with Extended Version History (+$40/yr for 1-year retention of deleted files).


I think at a high level users might be OK with paying for their actual stored file size.

A way of reducing the stored size might be to allow the user to white-list (with some defaults) folders more likely to contain 'third party' files (e.g. C:\\Windows, C:\\Program*, /usr/), which would either be excluded if they're on a list of common files or de-duplicated against a public list of common files (and checksums). It would be useful to add-on programs and scripts if that list were public, and if there were a way of cloning those files into a discounted pool by 'uploading' them again (to ensure the customer actually /has/ that file and thus presumably the right to restore it).


With the prevalence of crypto ransomware... I'd really like more than 30 days.

Speaking of which (possibly a simple feature request): when a file's entropy suddenly goes from low to high, alert the user and keep the files longer.
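
As a very rough illustration of the kind of check I mean (this is just my sketch, not anything Backblaze does today): byte-level Shannon entropy jumps toward 8 bits per byte once a file has been encrypted, so a sudden jump on a previously low-entropy file is a decent red flag.

    # prints approximate Shannon entropy (bits per byte) for a file
    entropy() {
        od -An -v -tu1 "$1" | tr -s ' ' '\n' | grep -v '^$' | sort -n | uniq -c |
        awk '{ n += $1; c[NR] = $1 }
             END { for (i in c) { p = c[i] / n; H -= p * log(p) / log(2) } print H }'
    }
    entropy ~/Documents/notes.txt   # plain text sits around 4-5 bits/byte; encrypted
                                    # (or already compressed) data comes out close to 8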


Why not allow for additional retention as a paid add-on to the existing service? I can understand why Unlimited storage combined with Unlimited retention don't work well with a fixed monthly pricing model. But couldn't you keep the Unlimited storage and marry it to a per-GB cost structure for retained versions beyond 30-days? Then it would be up to advanced users to opt-in to this paid extra so that they can take advantage as they see fit (similar to Dropbox and Extended File History). Ideally all this would fit into the standard service... but if CrashPlan couldn't make that work economically then perhaps a different approach would be best?


> If you do an "Inherit Backup State" you will not re-upload everything

I tried this, and it failed. I went back and forth with support for some time, and at the end they effectively shrugged. I may have encountered some weird bug, I dunno, but it was definitely uploading everything again, and support acknowledged as much.


I went from CrashPlan (took much too long to back up my files) to Backblaze (every time it would hit the end of uploading, my backups would disappear and I would have to re-upload 3 terabytes of data; BB support couldn't figure anything out) to finally just rolling my own external hard drive backups for media, and using Dropbox (which I had been using anyway and love) plus a hard drive for my files and pictures. I am going to upload just pictures to a cloud service in the future. Between Dropbox file restore and the external HD, my files and pictures don't disappear. I don't care about my music, and I have found that my media backups simply need to be made and then checked every couple of months. Turns out TV shows, audiobooks, sporting events, and other static media are wonderful for just tossing onto a hard drive and labeling the drive, either with a reference to a text file showing what is on it or by labeling the drive itself. For example, one of the HDs I have is literally labeled:

- 2014 FIFA World Cup - Brasil

- 2015-16 Penguins Stanley Cup

- 2016-17 Penguins Stanley Cup

- 2015-16 Los Angeles Lakers (Kobe's Final Season)

Every so often I go in and check that a game from each of these is playable at the start and end of a half, period, or quarter.

The TV shows and movies I have on Blu-Ray/DVD I don't even check. This is more "I'm too lazy to re-rip, but if I have to, I will" backups.

Ebooks are really the only media that I have to remember to backup, and that library is so small (and relatively static) that I just drag it to a USB stick every couple of months.


My main problem with BackBlaze is a lack of a Linux solution:

https://help.backblaze.com/hc/en-us/articles/217664628-Is-Ba...


Disclosure: I work at Backblaze.

For Linux, Backblaze offers "B2 Object Storage". You can choose all sorts of Linux 3rd party clients! The billing is different -> half of 1 penny per GByte per month. This is cheaper for less than a Terabyte, and can be a bit more expensive for more than a Terabyte.

Check the "integrations" page for pictures of a penguin: https://www.backblaze.com/b2/integrations.html

Among others, CloudBerry, Duplicity, GoodSync, and HashBackup are all options that can store files on Backblaze's reliable storage in our datacenter.


I just want a simple, predictable flat fee, not a pricing guide ...

https://help.backblaze.com/hc/en-us/articles/217667478-Under...


There is a calculator on this web page:

https://www.backblaze.com/b2/cloud-storage-pricing.html (Scroll down to "B2 Cost Calculator")

If you type in the size of your hard drive, that will be the "predictable bill" you will get each day.


That is pretty awesome (but I'd still rather have a fixed fee).

Also -- how do I know my 3rd party B2 backup system isn't going to cause me extra charges from using API's inefficiently?


> how do I know my 3rd party B2 backup system isn't going to cause me extra charges from using API's inefficiently?

Uploads are completely free. I think what you will find is that the bulk of your charges will simply be "storage" (at half a penny per GByte per month); the API charges aren't intended to nickel-and-dime you. We are simply trying to discourage abuse of our API servers.

Creating a Backblaze B2 account is totally free, and if you stay under 10 GBytes it is totally free (and we will cap you at that until you give us a credit card). Seriously, try it out! There is a handy "Reports" page after you log in (look for the word "Reports" along the left). It will tell you EXACTLY what API calls have occurred and how many times.

If you need more than 10 GBytes and are still worried about cost, as part of your experiment look for "Caps & Alerts" along the left after you sign into your B2 account. You can setup daily "Caps" (like for $1/day) so that you absolutely won't exceed your budget until you are comfortable with the B2 service!


... Backblaze B2 is sounding pretty good.

What I would miss, however, is the central management and reporting and alerts that crashplan gives me (e.g., host X hasn't been backed up in N days!)

That of course is all dependent on the client, which all seem standalone.


Just make a client front-end for Linux, please! There are literally dozens of us who would pay for the privilege.

Or let us pay what the Mac and Windows users do for whatever frontend we want. Switching everyone over to the B2 protocol is one thing, but making us pay more for the privilege of doing the same backups your Windows/Mac users do is another.


> making us pay more for the privilege of doing the same backups your Windows/Mac users do is another

Just to be clear, it should be LESS expensive to use B2 in most cases. If you need less than 10 GBytes it is totally free in B2, but it is $5/month using the Backblaze Online Backup client. If you have 500 GBytes to back up, that is only $2.50/month using B2, but $5/month using the Backblaze Online Backup client. You break even at 1 TByte. And it costs you more if you have 2 TBytes to store.

Backblaze is not charging a premium for the B2 service. Let me put it this way: if somebody ELSE had been willing to offer storage at B2 prices, we never would have built our own storage for the Online Backup client. I'm not kidding. The only reason we had to build the storage layer is that, to this day, Amazon S3 gouges customers at 4x what it actually costs them to provide the service. And don't even get me started on what Amazon S3 downloads cost.

Backblaze Online Backup charges $5/month, and of that maybe $0.50 is profit (50 cents) living on the AVERAGES (make money on some customers, lose money on others). If it was based on Amazon S3 we would be losing $15.00/month by charging $5/month.


I am a film student. After each film is finished, I store the smaller proxy files and get rid of the rest. Since this is an archive of my literal life's work, I store everything on my attached USB drive, which gets synced to my NAS, which then should sync to the cloud, fulfilling 3-2-1. I am a broke student: CrashPlan was my only lifeline to guaranteed recovery in a total disaster scenario. The smaller proxies still clock in at over your cost threshold for B2 vs Backblaze.

Now, because I believe in the ideals of free and open source software and use the operating system that is an instrument of those ideals, I am not worth it, despite knowing that, given a passage, the open source community would build you a Hyperloop.


B2 is not a replacement for the backup service, though possibly it could be hacked to serve as one.

How about releasing a backblaze protocol spec or the client source code? That way us Linux fans can roll our own.


> How about releasing a backblaze protocol spec...

That is all described here: https://www.backblaze.com/b2/docs/

Now the Mac & Windows Backblaze backup client uses those protocols, but for full disclosure ALSO still uses one or two additional old legacy protocols. But our intention is to move entirely over to using the new B2 protocols, and if it is not possible we plan to extend the B2 protocols to make it possible.

One example is that the Macintosh Backblaze backup client has available the ability to push up to 1,000 individual small files in a single protocol request. (The individual setup and tear down of HTTPS for 1 or 2 or 3 byte files hurts performance.) So we'll be adding that to the "official" B2 protocols.


Cool, thank you!


Are there any plans to allow backing up mounted disks? I have a small SSD in my notebook (30GB) and all my data is stored on an external NAS.

As far as I know, Backblaze doesn't allow me to back up mounted folders from a NAS. This is the only reason I don't use Backblaze.


Is there any chance that one day Backblaze will do the same as Crashplan does today?


Disclaimer: I'm biased because I work at Backblaze. I'm also one of the founders. Backblaze started in my living room in Palo Alto.

Backblaze is A LOT LESS LIKELY to exit out of providing consumer online backup anytime soon as it is our primary business.

I don't fully understand what just occurred with CrashPlan, but they had two separate "clients" - one for consumers and one for businesses. As far as I can tell, CrashPlan is discontinuing one of their two clients but doubling down on the "business client". I don't understand how they got into that situation, but Backblaze only has one backup client so we can't really abandon the one true client we have. :-)

Finally, Backblaze is profoundly different than CrashPlan in that we never really raised any bank financing or VC financing. We're 90% employee owned, and there are no deep pockets. CrashPlan raised something like $150 million which comes with "pressure to grow fast or die". Backblaze is free of any such pressure, we own our own fate.

On a final note: I have ALWAYS liked CrashPlan and I am sad to see them go. Realistically they were never a "competitor" to Backblaze. Our biggest competitor was customer apathy and customers not realizing that online backup was a good option. The money companies like Carbonite and CrashPlan poured into advertising INCREASED Backblaze sales merely by raising awareness of online backup. CrashPlan (and Carbonite) have been absolutely wonderful for Backblaze because we didn't have the gigantic amounts of money to advertise that they had, and they essentially advertised on our behalf (for free). We also know some of the CrashPlan people and I believe they are good people who want the best for their customers.


Thanks for sharing that fascinating insider insight regarding non-zero-sum competition and advertising!

This whole CrashPlan Home episode deepens my skepticism about services that claim to offer an unlimited amount of a physical resource (atoms on a hard drive, in this case) for a fixed price. Code42 has mentioned certain reasons for exiting the home user market, but I wonder if an unmentioned reason is the cost of providing service to an excessive number of users who consume more hard drive space than is profitable. As a current Crash Plan Home customer, I might be one of those users, given my 2.2 TB data set. It's felt like a steal for me, but in the future, I'll likely seek an option where I can get a good deal, a good, fair price instead of a steal (which in the end results in pain and hassle and ends up costing more than what I bargained for).

Is the cost of disk space the reason why BackBlaze expunges files deleted on a backed-up computer from the backup on your servers after 30 days? Though I think you guys are cool, this policy is a deal breaker for me, as far as a Crash Plan Home alternative is concerned. At this point, I'd rather pay a reasonable price for the disk space I use rather than have an "unlimited" plan where I have to constantly look over my shoulder at my files, to make sure important files haven't been inadvertently deleted within the past month. If the cost is too much for me, I'd rather be the one that makes the decision as to which files to exclude from my online backup, rather than have the backup service do it for me.

P.S. and Disclaimer: I developed a backup companion utility (Bitrot Detector) that happens to be more relevant and useful to users of a service like Backblaze which performs file mirroring than one like CrashPlan Home which performs file versioning. However, I'd prefer it if every backup service did versioning rather than mirroring, as version-preservation is what allows you to set-and-forget a backup, rather than to set-and-constantly-worry. Though of course I want more customers for my product, I'd rather have relatively fewer if it meant I lived in a world where every backup service did versioning and fewer people experienced data loss and the resulting grief.


> Is the cost of disk space the reason why BackBlaze expunges files deleted on a backed-up computer from the backup on your servers after 30 days?

The original reason was to prevent customers who owned a single 1 TByte hard drive from filling it with content, backing it up to Backblaze, writing down the date, then emptying the hard drive and filling it with DIFFERENT content, backing up to Backblaze, writing down the date, repeat 50 times.

The idea was this: if the data is not important enough for you to keep a local copy, Backblaze is not going to keep it either. You aren't allowed to go down to just "one copy in Backblaze".

We chose the 30 days as something we thought of as "reasonable". For example, we expected that within 30 days you would realize your laptop was stolen so you could request a full restore. That sort of thing.

However, we are now all furiously debating the 30 days based on the enormous amount of feedback we are getting today. We will do an analysis, and if we can afford it, we will be increasing that number.


A couple of years ago I went to use GPG for the first time in a long time. My private key didn't work. I looked at the file, and its size was 0 bytes. Cosmic ray? GPG crash? System crash while the file was being written? How it got truncated, I'll never know.

No problem, I'll just restore from backup. I have CrashPlan online backups, and local backups with Obnam, so I'll just recover it from one of them.

Every snapshot in CrashPlan and Obnam had the truncated file, going all the way back to the first snapshots. I thought I had lost my GPG key forever.

Then I remembered that I had some old CD-R/RW backups from years ago. I started going through them. Some of the discs were unreadable. Finally I found one that was readable and had the untruncated private key file.

Lesson learned: always keep your old backups. You never know which files on your system have suffered from bitrot or accidental truncation or accidental deletion--until you try to access them. It's very likely that some of them will have been destroyed more than 30 days ago.

Now keeping old backups doesn't mean keeping every snapshot, ever. CrashPlan takes 15-minute snapshots by default, so obviously I don't need every one of those going back years. But I definitely want to keep at least one snapshot for every year I've used the system, at least one for each of the last 12 months, etc.


Lack of file versioning and not persisting deleted files means Backblaze isn't even on my radar.

The mentions here of poor restore options doesn't sound all that positive either, but due to the former I'm not even considering going as far as trying you guys out.

The way things are looking I'm probably going to stay with Crashplan, migrating to their business plan. I'm not a data hoarder (backup set is ~400GB), but I haven't found anything else which has the same feature set for a reasonable price.


I'm not sure I understand the use case you mention. Is it in essence the idea of someone abusing an unlimited plan to back up more data that you'd reasonably expect them to? Can it be boiled down to Backblaze's cost of hard drive space and whether users pay enough to cover Backblaze's costs?

The use case you mention seems to be very exotic, but who knows, maybe it's not as rare as I would expect. To me, expunging deleted files because of this rare, somewhat-malicious case seems like it would cause undesirable collateral damage among your service's typical users.


To expand on my post, the features I require are:

- Keep at least the first (non-zero-sized) version of a file, at least one version per month (if modified), at least one version per 15 minutes for the last week, and the last version of the file.

- Ability to back up files in certain directories very frequently (i.e. at 15-minute intervals).

- Ability to search for and restore individual files in the client, i.e. without sending my password to someone else.

For this I find the $10/mo Crashplan now charges to be reasonable (my current backup set is ~400GB). I would be prepared to pay more if I required more storage, though within reason.


Maybe you could also furiously debate the Linux client while you're at it :)

(I can better understand the policy now as well -- keep people from using as primary offline storage.)


As a personal anecdote: I was a BackBlaze customer for over a year around 2011/2012. I backed up one computer with 2 attached hard drives.

Over a period of time one of the drives failed, slowly. It would lock up sometimes, and I would restart it. Months later I realized it was losing files when it did this. And when I tried to restore those files from Backblaze, I discovered that they had been purged, and that this was considered normal behavior.

BackBlaze had lulled me into a false sense of security regarding my data by claiming to back up my files but actually mirroring a hardware failure on my local system.

I had an unhappy email exchange with BackBlaze tech support, and also Brian, and came to realize how flawed the system was.

I switched to CrashPlan at that point.

I will not consider using a product that lacks proper file versioning and a much longer retention period for deleted files.


This is a nice ego boost for you, but I wish you had answered some of the questions that folks have posted.

What good is a backup tool if you a) can't restore using the native tool and b) can't restore files from more than 30 days ago?


> What good is a backup tool if you a) can't restore using the native tool

I am so confused by your question. With Backblaze, you can get 100% of your data back in two ways: 1) you can get a free external USB hard drive sent to you with all your data on it, or 2) you can prepare a free ZIP file with all your data and download it. We provide a restartable native bzdownloader to help you download the recovered files.

> What good is a backup tool....

Isn't the goal to get all your data back? Backblaze does that. Maybe I'm misunderstanding your question?

> you can't restore files from more than 30 days.

With Backblaze, you can recover files for more than 30 days. You can ALWAYS get "the most recent version" even after 9 years. What you cannot do is get all the "intermediate versions" (like if you change a text document, we retain all versions for 30 days, then we only keep the most recent version forever). I do agree that would be a better product (retain infinite versions of every file). Unfortunately we would need to charge more for that, and many customers only want the most recent copy of all their documents.

I understand if Backblaze is not a good solution for you. I just want to be absolutely clear what we provide and what we do not provide.


Do consider a (paid?) add-on similar to Dropbox's Extended Version History [1]. It'll at least protect against ransomware and other mishaps.

It'd also be nice if you could provide an option to get alerts if a certain subset of files changes. I have folders on my NAS (I know Backblaze doesn't back up servers) which are basically files from my old desktops that I'll sort through some day in the future, and which are highly unlikely to change, maybe ever.

[1] https://www.dropbox.com/help/security/extended-version-histo...


I just posted a long reply to your original message, without first seeing that you'd replied to this comment (made by someone else).

If this is how the 30 day policy works, it is slightly more appealing than what I thought. However, it's still not enough peace-of-mind for me. This means that recent files are vulnerable to corruption during their first month of existence (through accidental overwriting, and some kinds of viruses and ransomware). The "most recent version" that is backed up may end up being a corrupted, useless version.

I agree that retaining infinite versions of every file would be cost-prohibitive. However, you may be able to find a healthy compromise. Maybe you could retain a ton of versions of small files and fewer versions of large files. Maybe you could provide users with a space allowance for versioning and allow them to decide which versions of which files to delete (if they reach the allowance limit) or to pay more to increase the allowance.

To summarize: You need versioning in order to be an awesome backup service (and an awesome backup service is the only thing I'll happily settle for!)


Without some form of versioning, there's always the risk that the only backed-up copy of a file is a corrupted, useless version. This is the best summary of everything else I've said here.


Indeed, I nearly lost my GPG key due to undetected corruption, because the corrupted file had been backed up for a long time. I only recovered it from a very old CD-R backup I made years earlier.

Keeping old snapshots (e.g. one or two per year) is an absolute must.


> if I delete a file will you retain it forever

No. We consider a file deletion as "the final version of the file you wish to retain". In that case you can get the deleted version for 30 days, then it is gone forever.


I'm sorry but this is a complete no-go. As much as I like Backblaze's attitude and pricing plans, this is unacceptable for a backup service. It's way too easy for files to be accidentally, unknowingly deleted, and for this to not be noticed for months or years. Any backup system that deletes missing files is not a backup but a mirror.


Wait, if I delete a file you will keep the latest version forever?


no one can predict the future.


B2 and Backblaze are wholly separate products and services - I'm explicitly not looking for a replacement for S3; I'm looking for unlimited backup for a fixed fee for personal amounts of data.


This is one of those places where open sourcing the client would make good sense.


Sadly Backblaze fails the main criterion on my mental checklist:

* Can it back up Linux machines?


Just use Backblaze's B2. There are integrations with Duplicati, among others.


Disclosure: I work at Backblaze.

You can see a list of integrations here: https://www.backblaze.com/b2/integrations.html


Guessing Backblaze is getting a lot of hits right now, their site is slow for me.


Like many in this thread, I'm here looking for alternatives. CrashPlan was sufficient for my family needs, and none of the other services I looked at hit the same sweet spot. Slow, saturated uploads, and their bloated client were tolerable for what I was paying, and what I got. We've got a few mentioned replacement possibilities scattered in the comments and replies below, but I'd like to put forth my "asks" for solutions, which seem to match others.

I'm a tech guy, and "rolling my own" Linux solution is possible, but in this case, I'd be happy to pay someone else to worry about the details of storing a copy of my bits, so I can do other things with my time.

My wish list is as follows:

MUST

======

have both Linux and Windows clients

ability to whitelist/blacklist/select subsets of files/folders to be backed up

======

SHOULD offer as many of the following as possible (all optional)

=======

some sort of "Family" pricing deal for multiple machines in one account

encrypted backups with consumer controlled key not necessarily given to service

option to switch out back end bulk storage infrastructure

"forever" file retention / restoration of old versions (charge me for storage)

user-scheduled backups and/or don't bring the client to a crawl or saturate the home line

======

Feel free to add any replies with suggestions here which might be useful to others.

From my point of view, if CrashPlan found the home offering unsustainable due to "abuse" of the "unlimited" feature, I would have much preferred they introduce a cap on the home plan at some "reasonable" value. They could have balanced the books and saved 95% of their customers. Start rolling off the oldest file-version backups when you hit the space cap and/or offer extra space at cost+profit. Done.


I think https://www.spideroak.com fits your list (specifically "One" which is their consumer solution/app). I'm a happy user who's had an account with them for years (it costs me under $10/month iirc).

It has/supports:

* Windows, Mac, Linux, iOS and Android

* Lets you share your quota across unlimited machines/devices

* Lets you whitelist/blacklist folders & files to be backed up (by pointing and clicking and/or by specifying patterns)

* Does LAN syncing

* Keeps versioned file history forever (i.e. point-in-time recovery)

* Has scheduling but I haven't used this, it just runs quietly in the background and I never notice it negatively impacting anything (client and accompanying background service is currently using a total of 51MB of memory). You can cap the upload speed if you want to make sure it never saturates your uplink.

* Uses end to end encryption i.e. they don't have your key (important caveat: this is provided you only use their app and don't use their web interface, although even then they claim the key is never stored anywhere - just kept temporarily in the server's RAM while it auths/decrypts files for you)

* Been around for ages and not likely to go out of business any time soon

* Lets you share files with a self-destructing link

* Uses aggressive de-duping on their backend (ZFS based I think?) so you can squeeze the most out of your quota (just checked my account and I'm using 94GB without compression & de-duplication but only being charged for 41GB)

Edit: and they release a lot of opensource: https://github.com/SpiderOak


I've tried Spideroak and I really tried to like / use it, but it never worked very well, feels unfocused and a bit unfinished.

I'm probably going to just move my photos to Dropbox (I know, it's not a backup program, but it works and is very simple to deal with).


Not sure how recently you tried it but the current SpiderOak One app is solid and polished (imho anyway). It just works and is brain dead simple to use. Much more so than I remember Crash Plan ever being (admittedly haven't used them in years).

Edit: I've only ever used SpiderOak's Windows app though. No idea how good/bad their Linux client is.

Edit 2: this is what the current version looks like on my windows machine: http://imgur.com/a/KfnlQ


I've used them for a few years and have not been particularly impressed with their speed. I'm also currently fighting a problem with the client quietly wedging on upload (not sure how long ago it stopped, but the upload queue was past 6GB), and the re-synchronization process just sitting there seemingly doing nothing.

> Uses aggressive de-duping on their backend (ZFS based I think?)

Err, I hope not. Pretty sure they sync a database of seen blocks between clients and they collectively dedupe based on that - their backend simply shouldn't have the information necessary to dedupe itself.


>Err, I hope not. Pretty sure they sync a database of seen blocks between clients and they collectively dedupe based on that - their backend simply shouldn't have the information necessary to dedupe itself.

Think you're right based on reading: https://spideroak.com/faq/what-is-deduplication

I assumed they'd use both techniques (i.e. de-dup client side when possible to save on upload) but that the way in which encrypted blocks were produced before uploading was deterministic (a bit like a hash function) so that if the same file was marked for backup on a different device the resulting blocks would be the same and could be de-duped server side.

Provided a unique salt was used client-side (shared across all your devices and not known by SpiderOak) while making the block, it should be possible to end up with deterministic encrypted blocks that are still unique outside of your account (I think, anyway... I barely know what I'm talking about when it comes to the intricacies of crypto!)
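For what it's worth, here's a toy sketch of that idea in Python (purely hypothetical, not SpiderOak's actual scheme): derive the block key, nonce and block ID deterministically from the block contents plus an account-wide secret salt, so identical plaintext blocks produce identical ciphertexts within one account and the server can dedupe them without being able to read them or match them across accounts.

    import hashlib, hmac
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    ACCOUNT_SALT = b"secret shared by all of this account's devices"  # hypothetical

    def encrypt_block(block: bytes):
        # Key and nonce depend only on the salt and the contents, so the same
        # plaintext block always encrypts to the same ciphertext (convergent
        # encryption scoped to one account). A misuse-resistant mode such as
        # AES-SIV would be the more careful choice in a real design.
        key = hmac.new(ACCOUNT_SALT, block, hashlib.sha256).digest()
        nonce = hmac.new(ACCOUNT_SALT + b"nonce", block, hashlib.sha256).digest()[:12]
        block_id = hmac.new(ACCOUNT_SALT + b"id", block, hashlib.sha256).hexdigest()
        return block_id, AESGCM(key).encrypt(nonce, block, None)

Any of your devices can recreate the key from the salt plus the contents, but the server (and other accounts) cannot, which is roughly the property being described above.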


Thanks corford.

I think I looked at spideroak a few years ago. Good to see they are still around. Might be my best fit.

A friend also suggested grabbing this table, exporting it to your own spreadsheet, and sorting by what matters to you.

https://en.wikipedia.org/wiki/Comparison_of_online_backup_se...


Ha, I think your friend is right :) The size of that list just goes to show there isn't a "one size fits all" backup solution out there. SpiderOak is my sweet spot but YMMV.

Think they offer a free trial (or they used to at least) so no harm in giving it a whirl. Good luck on your quest!


I am not sure if Synology allows for snapshots (I am sure there must be an option for that), but it is certainly the cheapest solution over the long run. With Cloud Station you can sync a folder to your NAS Dropbox-style (be careful to use a different user on each machine), which you can access from outside your home if you open a port. You are in control of your data. It doesn't protect you against your home burning down, but you could have one in each home in the family. Modern hard drives are monstrously large. You don't need a massive system unless you have loads of data.

[Edit] actually you can probably just rsync the diskstation volume with a backup volume on the same NAS.

[Edit#2] actually you don't even need rsync. Synology has a utility called Hyper Backup that does time machine like incremental backups, either on the same NAS, or on a remote NAS.


arqbackup.com will do Windows and Mac; you pay for your own S3/GCP data usage. It allows you to (1) blacklist/whitelist; (2) encrypt client-side; (3) use S3/GCP/your own NAS; (4) schedule on demand.

I guess the reason for CrashPlan bailing on the home market wasn't just people trying to store 20T of movies, but also just the support needs. The economics of providing end-user support on $60/year plans are horrific. Note that all CrashPlan really did was increase the cheapest plan from $6 to $10/mo...


Under the old plan you could pay $10 for 2-10 devices, whereas now it is $10 per device, so the price change is a bit more than you suggested.


x0x0 not quite. $6 to $10 a month would be acceptable to me, but it's now $10 a month per device, where before it was $6 a month for a "family" of up to 10 devices. That's the real cost-increase problem. I'm still only one customer, with almost no support burden, but the price to go to the office version is substantially different for multiple machines.


Exactly. The short list for my family was: me, my wife, my mother, father, sister, and mother-in-law. 6 users under one plan, also using each other's computers as targets, because the connection speed at my parents' is atrocious. So now, instead of $6 total, which I could just sweep under a rug and call it a day, I'd have to pay 10x that - or bother getting money from them, because none of them want to bother managing their own backups. Hell, my parents couldn't even be bothered to plug in a USB HDD every once in a while.

This seriously sucks, especially since I now have to find a new solution, deploy it remotely on multiple PCs, and they'll all have to suck it up for another couple of weeks, if not months, while everything re-uploads to the new solution.

As for me - even if I migrate to a new solution, I'd still effectively lose the versioning. This is a serious bummer!


arqbackup ($50/computer) will probably do roughly what you want. If you remote-mount the drives between the computers, they will look like a NAS that it can write to.


oh wow, that is a big price increase

$6/mo for a family's worth of devices is ludicrously cheap


I was a heavy Crashplan user. I have around 4 family machines backing up to my machine (free), and then my machine backing up to Crashplan (paid).

What are my options going forward?

I can easily switch myself over to Backblaze, but are there any simple, free, GUI clients for creating incremental, encrypted backups to another machine over the internet? The great thing about CrashPlan has been that I just install it on the family member's machine and not once have I had a single problem. It just worked and provided my family members with reassurance that I had a backup of their data, which I couldn't touch or see, in case they had problems with their own machine.

And as I see that BB employees are in this thread, how about making backup folders opt-in for power users? I have sensitive files all over my machine, and who knows what crap MS is storing in User/LocalData etc., so I prefer to opt in to folders rather than take the risk of opting out and syncing a password or keyfile to your servers. The UI for exclusions is awful too.


You could try using Duplicati. It has a web GUI. You could also just create the configs at home and then export them! You could also just invoke it from the CLI.


I have looked into Duplicati, but I'm not so keen on running an SSH server on my machine and/or SFTP in order to let others back up to it. A lot of extra admin work would be involved in protecting that set up and creating the necessary port-forwards and firewalls. Whereas crashplan just works.

I'm certainly not going to go through the steps of setting up CLI tasks on my family members' machines either.


You can choose any kind of destination for Duplicati, even a local one.

I personally use it with Google Drive.


If you see the parent comment, I am referring to backing up relatives' machines to mine. These relatives are not local. Backup must happen over the internet.


You could switch to Backblaze, but you have to give them your encryption key when you restore.

Use B2 with a decent client instead, you don't get an all-in-one plan for your family, but at least you keep control over your data.


I don't wish to store my family's data in the cloud for a fee. Crashplan allowed me to use my own (spare) storage space to provide this for them.


That was a great feature.

Mind if I ask why you don't want encrypted data in the cloud?


Because it costs me money to do so. Crashplan has always been free to back up data between machines and was/is very good at it. I don't mind paying for a 1TB hard drive to back up my mother's few photos, but trying to sell her on the idea of paying a monthly fee to do so is not going to get me far :)


You can sell it as a real backup: if her computer gets struck by lightning, her 1tb hard drive will too.


Good riddance. I switched from Backblaze to crashplan because it was cheaper for families. I've regretted it ever since. The client is horrible. Right now it's consuming about 4GB of memory. I had to increase that recently, otherwise it wouldn't even start. It littered gigs of log files around my system when it did that.

Often the webui would just timeout if I wanted to recover a file. Never could figure out why, so I couldn't really recover files reliably anyway.

I'm glad I'm forced to find a solution that's better. I'd definitely stay away from them for business solutions.


I think this means it is also an end to CrashPlan's peer to peer free backup solution? I don't believe their business plan offers that.

Thank goodness I bought into their long service plan. I have 1.9 years to figure this out.

We have one Linux NAS and a bunch of Windows clients. I could go with Crashplan Business on the NAS and then punt on the Windows clients. Or go back to BackupPC for windows... I hope their windows client solution has improved -- I used to have to install a Cygwin rsync daemon.


It's worth pointing out that CrashPlan's business client app uses Electron (because the Java client app for home users wasn't bloated enough). That's reason enough for me to look elsewhere.


According to their website you continue using the same Java app: "Do I need to install a new CrashPlan app? No. The conversion process automatically updates the CrashPlan app on your computers. Instead of the green-themed CrashPlan for Home branding, you'll see the blue-themed CrashPlan for Small Business branding."

[0] https://support.crashplan.com/Subscriptions/Migrate_your_Cra...


Hmm, maybe it's just the Enterprise version that's using Electron.


Wow, I had heard it was native for business. Being Electron is even less native than the Java client, no thanks.


Hmm, they said end of service would be October 22nd 2018, so I'm wondering if they were only expecting one year plus sixty days, not 1.9 years plus sixty days... may need to check on your situation.

In any case, it sounds like the business version drops the peer to peer free solution.


Damn, this came out of left field. CrashPlan was a constant pain to use for me (slow uploads, weird bugs where it wouldn't start up after updating, actually deleting all my backups when I didn't renew my 4-year old subscription over winter holidays with barely any warning). But the other services simply couldn't compare on price and backup retention. Having old versions/deleted files always available saved my ass so many times. The new business plan seems only twice as expensive as my current one, but it seems like a rip off to be charged twice as much for the same functionality. I think I'll take them for the 75% discount offer, and then wait for any reasonable competitor to pop up.


Are you allowed to go with the business plan without having a business? What do you put in as company name then?


Another bummed out Crashplan user here. They offer account migration[1]. Of course you then have to pay per computer, and you lose the computer-to-computer backup feature.

[1]: https://www.crashplanpro.com/migration/


Yes! It says to enter a company name. The question is how to proceed there, because it is for my family and not a company.


d/b/a <Insert Favorite Pet Name>


This is unfortunate. I chose Crashplan initially due to great family pricing, but more importantly as it had a Linux client. Given I backup Linux home server, a Linux desktop and a Linux laptop this has been its killer feature for me.


This is also what drew me to Crashplan, but I've been irritated with the service long enough that I think I'm glad they've forced my hand.

For one, their client is very bad at backing up without affecting other network services; when I'm on a video call I frequently have to pause Crashplan backups entirely or I'll have stream issues (or manually limit the upload speed to something really small that I'll inevitably forget to undo later). I've never had this issue with Dropbox, for instance. The client is also written in Java, so it's a resource hog; beyond being annoying on my desktop machine, that also made it hard to install directly on a somewhat resource-limited Synology NAS device a few years ago (I eventually got the install to work, but it sporadically won't start up on boot due to memory constraints).

Really, on my Linux desktop my most important files are my code, documents, pictures, and video, which are all already backed up to Dropbox. If my hard drive died it wouldn't be a big deal to do a fresh install as long as I can sync my code, documents, etc. with Dropbox. So I may just go without a full desktop backup solution.

I don't know how they're going to transition to enterprise given their trashy desktop client but good luck.

edit: in addition to Dropbox I'll probably add tarsnap to sync the important things to S3


> I don't know how they're going to transition to enterprise given their trashy desktop client but good luck.

I kinda feel that makes them perfectly suited for enterprise. I've never thought of enterprise software as having a particularly good user experience coffsharepointcoff

I'm also rather glad CrashPlan forced my hand. There were a lot of good ideas about it (the peer-to-peer features were interesting), but I really hated the Java client. And I always had to pause its uploads if watching Netflix, because it would saturate the upload channel completely. (On a Mac here, for what it's worth.)

They've given plenty of notice though, so I'm thankful for that.


In my experience, CrashPlan was easy to throttle to prevent network issues. Backblaze I would basically turn off during the day and re-enable at night because its throttling was useless.


Their email to users offers a transition to the Small Business product... which appears to be the same product at $10/month/device. So this is really a pricing change, though you can enjoy a 75% discount for a year while you decide what's next.


Be careful if you rely on old versions and deleted files being backed up indefinitely. Their Home offering includes this, but it looks like Small Business does not.

edit: I got confused - Small Business likely still offers indefinite retention.


Where are you getting this from? Their feature comparison [0] lists both "keep deleted files" and "multiple file versions" as "forever"

[0] https://www.crashplan.com/en-us/business/compare/


It seemed like there was some conflicting information on the support pages for restoring deleted files:

https://support.code42.com/CrashPlan/4/Restoring/Retain_and_...

https://support.code42.com/CrashPlan/6/Restoring/Retain_and_...

That red icon next to Small Business on the version 6 page scared me.

Hope you're right! Indefinite retention is a killer feature for me.


Same reason I was using Crashplan, now looking for suggestions for a good cross platform solution


Me too. They suggest Carbonite, but it doesn't have a Linux client.


I used to have Carbonite for my wife's iMac. It was a little slow to back up but worked fine until we actually needed to recover all the data after a HD failure. Since then I couldn't dislike Carbonite more than I do. For the first ten minutes of the recovery process the download was fast. Then it continuously slowed down. Of course customer support blamed it on our connection, which was complete BS. I then started downloading the files we needed most through their web UI, which was not bottlenecked on their end. However, there seemed to be some restriction on their end on how many files you can download at once. The limit seemed to be around 100 files. If you download too many, you see a spinner for minutes and then download an empty tgz. It was incredibly painful and tedious. We gave up on most files. The entire idea of unlimited space is obviously not viable and something's gotta give. Apparently in Carbonite's case it's the download bandwidth. Should have seen this coming.

I can't advise against Carbonite strongly enough.

We now have a NAS and back up truly important data to S3 and iCloud.


Well, shit. I just switched da wife to Crashplan, touting its several advantages over Carbonite (which she was happy with). Now I'll have to eat crow and switch her back. :(


Check out spideroak.


I'm sure they're tired of consumers taking advantage of the unlimited storage by storing PBs of data, and losing $$$. Amazon basically did the same thing recently by removing unlimited storage. Business data is a bit more predictable with storage needs and really makes a lot more money.

I dumped CrashPlan a few years ago due to the bandwidth restrictions and the time it would take to download TBs worth of data, and built my own solution with a remote Synology NAS. It's a real cost issue once you have more than 1-2TB of data; otherwise it's just easier to use Google Drive or Dropbox for most needs.


> Amazon basically did the same thing recently removing unlimited storage.

I'm working out what to do next since they did this. I use Arq (https://arqbackup.com) and back up my desktop (Windows) and laptop (OSX) to Amazon Cloud Drive. Together it's approx 1.5TB of backups - and this is excluding a bunch of folders and accepting data loss on them.

The doubled price doesn't appeal, but I haven't found an up-to-date storage price calculator to see what my options are in the TB range.


> I haven't found an up-to-date storage price calculator to see what my options are in the TB range.

For pay-as-you-go storage, I found this: http://coststorage.com/


> I'm sure they're tired of consumers taking advantage of the unlimited storage by storing PBs of data

I don't see how this makes sense. Instead of closing the whole service, they could set a limit that would probably satisfy the 95% of users who aren't using enough storage to make it unprofitable. So I assume this isn't the reason.


I don't know for sure why CrashPlan ceased, but I recall reading here on HN that someone happily posted they were using like 16+ PBs on Amazon Drive, which is insane. I'm sure plenty more abused it, but in reality that much space equals quite a few average users. CrashPlan could possibly have created some sort of tiered pricing structure for those using over 5-10 TB, but to build up your service and brand and then dump it suddenly really erodes trust.

I wouldn't be surprised if they just ran out of data center capacity and decided that cutting loose low-revenue consumers was cheaper than building out new space.


Well, fuck.

Let the suggestions begin - who should I switch to? Bonus points if moving data to a newly purchased hard drive isn't a terrifying process that looks like it's losing all of my backups, as well as Linux/Windows/Mac support.


> Let the suggestions begin - who should I switch to? Bonus points if moving data to a newly purchased hard drive isn't a terrifying process that looks like it's losing all of my backups, as well as Linux/Windows/Mac support.

I'm kinda partial to Arq with your choice of cloud storage. I heard that they don't support moving backup data between destinations though so I'm going to have to re-evaluate my choices when my Amazon Drive unlimited subscription comes to an end as well.



We need a p2p backup solution. Something like what wuala used to be but open source and distributed. Give up 1TB of local space to get ~750GB offsite, or something like that.


You might need a p2p backup solution where your computer is part of the network, but I want a me2idontcare solution, where I give up dollars, and gain gigabytes offsite.

I want a boring solution, written by a boring company, that very boringly stores my files.


Isn't that FileCoin?


Tarsnap. Seriously.


Only about 1000 USD per month for what I have in Crashplan - I think I'll give it a pass.


GAH! How much are you storing in CrashPlan? 1000 USD/mo seems a little insane. Is that after deduplication with Tarsnap?


My data is already deduplicated - it's mostly video by volume. Total is in the region of 4T.



I don't suppose there's a Windows client?


Like many, I'm really disappointed by this news. Not only have I been using CrashPlan for over 7 years, I recommended it to many family members and friends. I remember back in 2010 it took more than a month to upload / back up all my Macs at home... Since then I have added a few extra TB of data, which CrashPlan uploaded without any issues. At one time, I even had to restore my MBP, which had a drive failure, and I lost only 15 minutes of work.

I now need to check all the different solutions and start from scratch again! I wish they had offered some bridge to transfer what is already backed up to another provider; yes, I know this is not a simple task, but at least their customers would not be as upset by this news.


Their email was a good reminder to watch out with date formats.

"Your new subscription expiration date is 11/06/2017" had me thinking they accidentally set the expiration to last June...


Oh, what a shame! My family is a heavy user of the computer-to-computer backups that CrashPlan for Home offers. It's great knowing all of the data on our family computers is securely backed up, so that I can fetch the drive from the office or my parents' house, a 20-minute drive away.

In case of catastrophic failure, where on-site backups would also be destroyed or corrupted, I would hate to have a drive shipped from the United States to me. It would probably take forever.

Is there an alternative available that does computer-to-computer backups and supports Mac and Linux?


My own notes on alternative backup/sync solutions: https://github.com/pjc50/pjc50.github.io/blob/master/secure-...

SyncThing+cloud storage seems most promising but I've not actually tried it yet.


Yep. I have been using Syncthing for around 3 years now and never had a problem; upgrades are flawless, and it uses really not that much CPU/RAM (<1% and 50 MB for 18,000 files, ~70GB). My setup is like this:

1. syncthing on PC -> syncthing on Synology NAS.

2. NAS -> Internal backup from 3-Disk RAID to additional 4th disk

3. I am thinking about additionally uploading it to Amazon Glacier or Backblaze


Well, this sucks.

CrashPlan was awesome in that it doubled as both a backup app and a cloud storage solution. Since I had my settings set to NEVER remove deleted files, I didn't need to have each of my backed up external hard-drives plugged in at all times in order to have them backed up. I could just plug them in as needed, make sure Windows assigned them a unique drive letter, and let Crashplan sync any new data up.

None of the alternatives I'm looking at can do this. Carbonite lets you back up one external hard drive on the Prime membership. Backblaze seems like it would delete all of the external drives if I don't keep them constantly plugged in. And no Linux client.

It's kinda sad that the market hasn't yet created something that seems fundamentally basic to data hoarders:

- TRULY unlimited

- BACKUP, not SYNC (that means I shouldn't need to keep three external hard-drives plugged in at all times just to ensure you don't delete it!)

- Fully cross-platform (yes, Linux users exist, too)


I don't see how "truly unlimited" and "data hoarders" can ever be the basis of a sustainable business. Each additional terabyte costs them real money to store.


I think you underestimate how many people are in the "lite data hoarders" category. I don't consider myself a huge data hoarder. I have about 4 Terabytes. That's the totality of two decades of collecting music, video, pictures, ROMS, game ISOs, etc. It's enough that I can still fit it all on today's portable external hard-drives. And I don't think it's really much more than other people who might do amateur photography or videos and like to store their RAWs. However, it's more than is feasible for your standard iCloud/Google Drive/Dropbox plan can provide without significant burden.

At this point there's nothing really comparable out there to what Crashplan offers. Even the services like Amazon Glacier or Backblaze's B2 start becoming way more expensive once you pass the terabyte mark or so.


> Even the services like Amazon Glacier or Backblaze's B2 start becoming way more expensive once you pass the terabyte mark or so.

Yes, this is my point! It's expensive to store a terabyte in the cloud. There is currently no way around that. Crashplan was probably losing money on customers who had multiple TBs in the cloud.


>I have about 4 Terabytes. That's the totality of two decades of collecting music, video, pictures, ROMS, game ISOs, etc.

Good lord. And you need to back this all up in the cloud?


If you had a lossless rip of a rare album that is not on iTunes, not on Amazon, not on Google Play Music, and cannot be found via torrents, wouldn't you want a backup of it in the cloud as well?


Of course, but it might be better for everyone involved if I paid for it per GB.


With deduplication, that is feasible.

With E2E encryption, not so much.


You could compromise and use something like MaidSafe does, where the data is encrypted using a key based on the contents of the file itself (self-encryption), so you need to know the key for the file in order to decrypt it but deduplication is still possible.
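A bare-bones sketch of that self-encryption idea (illustrative only, not MaidSafe's actual construction): the key is just a hash of the chunk itself, so anyone who already has the data can decrypt it and identical chunks dedupe globally, at the cost of letting the provider confirm whether you hold a file it has seen before.

    import hashlib
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def self_encrypt(chunk: bytes):
        # The key is derived purely from the chunk's contents, so only
        # someone who knows the contents can derive it.
        key = hashlib.sha256(chunk).digest()
        # A fixed nonce is tolerable here only because every distinct chunk
        # gets a distinct key; identical chunks intentionally encrypt identically.
        ciphertext = AESGCM(key).encrypt(b"\x00" * 12, chunk, None)
        chunk_id = hashlib.sha256(key).hexdigest()  # server-side dedup handle
        return chunk_id, ciphertext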


[flagged]


If you really believe it is pennies per terabyte, then go ahead and start your own online backup company. It isn't that cheap, especially since you also have to pay for bandwidth, customer support, etc.


I'm guessing they're leaving the business directly because individual customers just don't cost pennies. As mentioned above, "unlimited" backup plans at a flat rate just aren't sustainable, unless you're going to cut corners. Cutting corners when it comes to backups just leads to disaster down the line eventually anyway.


>all that enterprise corporate schlong in their mouths. Hope that money shot tasted good, effing traitors!

Holy fuck. Let's settle down a bit buddy.


[flagged]


Would you please not rant on HN, regardless of how annoying you find something?


"Truly" unlimited storage could potentially have unlimited costs. How on earth do you propose that could ever make economic sense to sell you?


Fwiw, I've been a crashplan family user for years, here's my future plan for family backups:

- Use a freenas at home

- Make everyone in the family back up to it instead of crashplan. (No idea what app to use here yet but assume there's a dozen of em)

- Anything that isn't already acting as an offsite backup via the FreeNAS server (i.e. my own stuff and server-only files), upload to Backblaze B2.

I'm overall okay with the announcement. Was already planning to switch away since crashplan wasn't offering the great deals they had years ago (crashplan family was ~7.50/month with 4 year plan in 2014)

I look forward to no longer futzing with Bhyve/Linux just to use crashplan from my server too.

I don't look forward to spending more time figuring this stuff out more though.


I chose pCloud as my provider of choice. It's not unlimited, but it's $10/mo for 2 TB of storage and 2 TB of traffic. That is more than enough for most users, and you can increase the size in 2 TB increments at additional cost. pCloud has pretty much all the bells and whistles one would need. I certainly consider it rich in features. I also like that it can be used as a sync storage or just as a "hard drive in the cloud" where I don't have to keep the data or folder structure on my machine if I don't want to.

The only downside is that the integrated encryption is an additional charge. To get around that, I decided to go with Boxcryptor, which I already had. The average user can use the free version of Boxcryptor. It is awesome and I highly recommend it. Both pCloud and Boxcryptor show up as physical drives on your computer, which I find convenient. So I have my primary source drive synced with the Boxcryptor drive/location and Boxcryptor is set up on my machine to encrypt the data on the fly as it is copied (surprisingly, this is not the default setting). Then, I have a separate process that syncs the raw Boxcryptor directory where the files show in their encrypted format (the "drive" version shows them unencrypted because everything is done on the fly both ways) to my pCloud storage. In the end, I have 3 versions - the unencrypted source, the encrypted Boxcryptor copy, and the pCloud backup which contains the encrypted copy.


I recently re-evaluated my entire backup plan, and CrashPlan was one of the options I considered. It was one of two finalists I was debating between. Boy am I glad I didn't go with them. I understand that they will not just cancel the existing users and are only preventing new customers from signing up. But let's be real. We know what this means. It means they will not be giving individual consumers any sort of priority whatsoever. The non-commercial customers have now become a legacy operation.

FWIW - the reasons I decided against CrashPlan were:

1) I inherently don't trust anything that is unlimited everything (storage space, versions, undelete, etc.)

2) I especially don't trust anything that gives all those unlimited features for a paltry $5 a month.

3) My speed was horrendously slow, seemingly capped at about 2 Mbps (250 kB/s).

4) The application seemed to be bloated and resource-heavy.

5) They have a well-known issue where you can wipe your entire backup from their server if you encounter the dreaded disaster scenario (or get a new machine) and have a fresh OS with a new installation of the software. If you don't do the steps in exactly the correct order, it will see your new installation as being the source copy with nothing to store, and subsequently wipe everything on their servers. I see that they recently posted something on August 1, 2017 about "adoption", which seems to be migration for new machines, so maybe this issue is finally addressed.


I went through the "adoption" procedure in July with no issues. I was migrating from an aging Windows system to a Linux system. I was expecting to have problems and to have to re-upload my ~500GB of data, but surprisingly it figured out where everything was. That even included moving the data around so I could reformat the NTFS drive to ext4. My only complaint over the years I've been using CrashPlan has been that the Java client is a bit memory hungry.


And who was the other finalist?


Sorry, I had split my original (long) comment into two separate ones and I thought I left a brief mention of it.

The winner was pCloud. Please see my other comment for a more detailed explanation of how I have things set up and why I use them if you are interested. https://news.ycombinator.com/item?id=15075935


After checking them out, it does seem like they hit a nice sweet spot of features and price. The lifetime plan is pretty tempting as well, but I do wonder how sustainable that is. I assume they are banking on a charge for increased storage in the future.


HashBackup[0] works well for me and works with several remote storage providers such as B2. I recently wrote a post about one of the ways I use it[1].

[0] http://www.hashbackup.com [1] https://jacobhands.com/hashbackup-example-1/


For all those complaining about Backblaze not holding deleted data forever: Do you really rely on a cloud solution as your one and only backup?

I use Backblaze, and I don't worry. It's not my only backup. I additionally have a local backup, which today has files I deleted like 5 years ago, as you should too.

So go get happy with Backblaze; it's a nice application, not some slow Java thing like CrashPlan, and it just works.


>Do you really rely on a cloud solution as your one and only backup?

Most likely yes. Considering most people don't have any backups of any kind, having at least one backup, and an off-site backup at that, is infinitely better.

The number of people who have any backup is small. The number of people who have two is far, far smaller. Most people just don't care about their data that much unless they're a business, which is why this thread exists to begin with.


I just canceled my Crashplan about 2 weeks ago and fully switched to Backblaze B2. Mostly because I consolidated all my computers to a Synology NAS and now I upload to B2 via Synology Cloud Sync. This way I have a single on-site backup plus my offsite backup. I never liked that I had to manage Crashplan on each individual computer. Now I just setup NAS sync and upload from the NAS.


I got a 4 year contract with Crashplan (expiring in 2018) and I'm also pissed by this news...

I paid less than $4 per month (the price would've already been higher if I had renewed yesterday, but still acceptable) and stored around 130 GB of data (so not a lot, but it's already more than the 100GB threshold that some services use).

I need something that would keep watch on all the files on the system (see inotify) to avoid continuous rescans of the disks, and that would work on Linux...

Just these 2 simple requirements already seem to disqualify any other tool suggested here:

Carbonite (no Linux)

Backblaze (no Linux, and none of the 3rd party integrations with B2 seems to support live watching of files)

Tarsnap (no watching)

Borg (no watching)

Does anyone know if it's feasible to keep the whole / under Dropbox, Google Drive or Spideroak? (I know that GDrive doesn't have an official Linux client.)

The only solutions I know would be:

Lsync (but it would be syncing only, no history), r1soft (but it seems to target servers only; I haven't checked the features yet)...

Are there any other tools out there?

I'm really thinking that I should write my own.


Brian from Backblaze here.

> Backblaze ... none of the 3rd party integrations with B2 seems to support live watching of files

I'm not exactly sure what you are going for, but you can "preview" files through the web interface on BOTH the Backblaze online backup and also B2.

If you are specifically asking about movies, just stay tuned. The first step was allowing preview of simple images, next we want to support previewing of movies.

Also, a feature we added a week ago in Backblaze Online Backup is the ability to "Share Files" (any file in your backup) with other people. For the first release we limited the file size (just so our servers did not tip over) but we plan to support the ability to share all files less than 10 GBytes.


>I need something that would keep watch on all the files on the system (see inotify) to avoid continuous rescans of the disks, and that would work on Linux...

What the OP is looking for has nothing to do with previews.


By live watching, the commenter is referring to 'watching for changes to a file'.


Just googling "inotify linux backup", I found my way to this post about inotify not being recursive [0]; doesn't that limit its usefulness for triggering backups of an entire drive?

For Linux and the relatively small amount of storage you require, you're very likely going to be looking for client software independent of backup storage medium, not a service that provides a client. 130GB would be about $3/month on AWS S3.
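(For reference, assuming S3 Standard at roughly $0.023/GB-month: 130 GB x $0.023 comes to about $3.00/month, before request and download charges.)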

[0] https://www.quora.com/Linux-Kernel/Inotify-monitoring-of-dir...


Yes, from my understanding you have to add an inotify watch for every single directory on your system (watching / alone is not enough), which causes a considerable increase in memory usage for the kernel (like, more than 500MB on my laptop), along with forcing you to increase fs.inotify.max_user_watches.

Not ideal, but you only need to do it once and I already have it set up in my ansible playbook:

https://github.com/berdario/dotfiles/blob/5c999113d9ce993f01...
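To make the cost concrete, here's a minimal sketch using the third-party inotify_simple package (the path and event mask are just illustrative). Because inotify isn't recursive, you walk the tree and register a watch per directory, which is exactly where the kernel memory usage and the fs.inotify.max_user_watches limit come from:

    import os
    from inotify_simple import INotify, flags

    MASK = flags.CREATE | flags.MODIFY | flags.DELETE | flags.MOVED_FROM | flags.MOVED_TO

    inotify = INotify()
    wd_to_path = {}
    for dirpath, _dirs, _files in os.walk("/home"):  # "/" in the real case
        try:
            wd_to_path[inotify.add_watch(dirpath, MASK)] = dirpath
        except OSError:
            pass  # permission denied, or we ran into max_user_watches
    print("registered", len(wd_to_path), "directory watches")

    # Events name files relative to the watched directory; newly created
    # directories have to be added to the watch set as they show up.
    for event in inotify.read(timeout=1000):
        print(wd_to_path[event.wd], event.name, hex(event.mask))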


I just researched solutions that let me back up all my family PCs and found these two that I will now try out:

https://www.duplicati.com/ https://www.urbackup.org/

edit: It's client/server backup software. Not an online service. I used CrashPlan in a similar manner.


I'm also a Crashplan user. Don't have great suggestions, but in terms of Google Drive on Linux, Insync (https://www.insynchq.com/) seems to work well enough.

Wish Google would just release the Linux client they use internally.


Can you explain what you mean by "watching", and why it's important to you?


I think there are a couple of variants. First is watching for files to change and then uploading them when they do (possibly waiting a short period and collecting a set of changed files before uploading). This is what folder syncing programs like Dropbox do and probably what berdario is seeking.

Second is watching for files to change and saving that list until the next scheduled backup is run. This is what macOS's Time Machine does, the OS's fsevents continuously keeps track of changes and Time Machine just asks it for the list of changed files since a given time (the last time Time Machine ran). This is an alternative to a backup client scanning the filesystem itself on a schedule.


Yes, what I'm actually looking for is the 2nd one. (that is, I don't care that the backup happens now or in 15 minutes, but I care that when the backup runs, the tool doesn't have to rescan the whole filesystem)
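For what it's worth, a rough sketch of that second variant on Linux using the third-party watchdog package (which sits on top of inotify); the path, the interval and the print standing in for the real backup call are just placeholders:

    import time
    from watchdog.observers import Observer
    from watchdog.events import FileSystemEventHandler

    changed = set()  # paths touched since the last backup run

    class Journal(FileSystemEventHandler):
        def on_any_event(self, event):
            changed.add(event.src_path)       # creations, modifications, deletions
            if hasattr(event, "dest_path"):
                changed.add(event.dest_path)  # moves / renames

    observer = Observer()
    observer.schedule(Journal(), "/home/me", recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(15 * 60)                   # the "every 15 minutes" case
            to_back_up, changed = changed, set()  # hand the journal to the backup run
            print("backing up", len(to_back_up), "changed paths")
    finally:
        observer.stop()
        observer.join()

Deleted paths land in the same journal, so the backup run can record deletions too instead of rescanning the whole filesystem.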


> I need something that would keep watch on all the files on the system (see inotify) to avoid continuous rescans of the disks

Inotify (on Linux, and fanotify wouldn't be enough, since the backup tool would have to track file deletions as well) is the only way to have seamless background backups

I tried to use other tools in the past, like duplicity... But having a scan every hour that takes several minutes and has noticeable CPU usage is a deal breaker

(and having a backup scheduled less frequently than that is also a dealbreaker, since you cannot rely on the laptop being up and running at a certain time during the day)

(CrashPlan also does a full rescan, but it's not strictly necessary... It's basically only needed if the backup daemon crashed or was shut down, or stuff was changed on the filesystem while the OS was off, so that sanity-check rescan can actually wait a few days)
