My mom and dad's iMacs are among them. This is literally easy money for CrashPlan. But the price would double to move them to the new Small Business plan.
What I liked about CrashPlan, besides the unlimited space, was that if files got deleted and you only found out about it 3 months later, you could still recover them.
With Backblaze, which is mentioned here, you couldn't. Which, for me, defeats the whole purpose of doing backups at all.
This makes no sense to me: if a malicious hacker, virus, Trojan, or your kid deletes a folder, then by the time you find out, Backblaze might already have stopped keeping backups of it.
As I have mentioned before, I think here on HN too, the data retention policy is just too short, and their reasoning for it, that "it's not an archiving tool", is beyond ridiculous.
And then their interface for restoring files is web-only. Yes, even if you have to download a 15GB file, you do it in the browser or ask them to ship a hard disk to you. I guess it's tough luck if you are in a different country, or a different continent altogether.
I will stick with CrashPlan for the time being and go for the 1-year Small Business plan (at a 75% discount for existing Home customers), but beyond that, or maybe even before that, I am looking for something solid and self-hosted. I already back up my Dropbox folder to Tarsnap (which has a kind-of-okay GUI now); I will look at Glacier or a VPS and back up using something like Attic or Borg. Sad that they have no GUI. I feel more confident about backup apps with a GUI: I find them intuitive to customize, and I can see at a glance (or keep watching) what's happening if I want to.
> I guess it's tough luck if you are in a different country, or different continent altogether.
Backblaze ships USB hard drive restores to other continents every day! Today we have one restore shipping to Finland and one to Great Britain. And to be clear, we absorb 100% of the cost: it is a $189 USD deposit and we absolutely absorb 100% of the shipping cost, whether it is to Texas or Finland.
Also remember that the USB hard drive restores are "free" to customers if they return the USB hard drive to our office within 60 days. The only thing the customer has to pay for is return shipping, and that can be on a super slow service; we're talking about 3.5" drives here that weigh just a few ounces. Or you can keep the 4 TByte drive for the $189 deposit (it turns into the purchase price), which includes shipping, the hard drive, and the service.
I am sorry if I am sounding like an obstinate whiner but no, I don't want that. I mean I am glad if you offer that but I want:
1. A way to restore from within the app that can withstand disconnections and gaps of hours, or maybe days, right inside your really awesome app (I mean it), with partial downloads and all.
2. (I didn't know about international shipping, thank you for telling me.) I don't want to rely on that, and I honestly am not looking forward to covering whatever little (or not so little) cost that is; being outside the USA it will be more, and then the hassle of returning the drive is even worse.
You know, I may just want 30GB out of my 3TB, or hell, maybe even 5GB, and even for that "the browser restore interface" is just sad.
As a customer I would urge you to have a look at it and maybe listen to customers on it (or future customers :))
And data retention! You didn't even touch upon it. Kind of shows your/BB's stand is clear on it, right?
And I forgot that you need my encryption password to send me that hard disk. Now that's a design problem you shouldn't even have in the first place :(
> I may just want 5GB out of my 3TB, even for that "the browser restore interface" is just sad.
I'm slightly confused by this one. I assumed that, pixel for pixel, the interface of a local restore would match "the browser restore interface"? Is the assumption that if it were a local application, there would not be the "tree view" you see on the web, and instead you would get something that looks like the native Macintosh Finder or Windows Explorer interface?
If that is the case, what if we built something that looks IDENTICAL to the local Finder or Explorer, but in a web page? Would that make some people happier?
> And data retention! You didn't even touch upon it.
Just an oversight. :-)
> Kind of shows your/BB's stand is clear on it, right?
No, I can ALMOST guarantee you the Backblaze data retention policy is about to change, and possibly within a week. We heard all of you loud and clear and we're just struggling to figure out exactly how much this is going to cost Backblaze, or exactly how much of that cost we plan on passing on to customers and what that looks like.
No, what I meant by:
>> "I may just want 5GB out of my 3TB, even for that "the browser restore interface" is just sad"
Browsers are usually not (maybe never) good at downloading very large files. For example, I was downloading a 6GB file or ZIP from BB using my browser and it crashed when I was 5.6GB in. In most cases those 5.6 GBs would be gone and I would have to start over. Whereas a desktop app or some kind of download manager/helper (which would have my login state/credentials just like the Backblaze app has) can do it over a period of time and will withstand the machine going to sleep, shutdowns, network disconnections, etc.
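The resume-after-interruption behavior described above is essentially what HTTP Range requests provide; any client that remembers how many bytes are already on disk can continue where it left off. A minimal sketch (the URL here is a placeholder, and real Backblaze restore endpoints may require authentication this sketch omits):

```python
import os
import urllib.request


def resume_request(url, dest_path):
    """Build a download request that resumes from whatever is already
    on disk, using an HTTP Range header. If a crashed browser left a
    5.6 GB partial file, only the remaining bytes are requested."""
    offset = os.path.getsize(dest_path) if os.path.exists(dest_path) else 0
    req = urllib.request.Request(url)
    if offset:
        req.add_header("Range", "bytes=%d-" % offset)
    return req, offset


def download(url, dest_path, chunk=1 << 20):
    """Append the remaining bytes; safe to re-run after sleep,
    shutdown, or a network drop (requires server Range support)."""
    req, _ = resume_request(url, dest_path)
    with urllib.request.urlopen(req) as resp, open(dest_path, "ab") as f:
        while True:
            block = resp.read(chunk)
            if not block:
                break
            f.write(block)
```

This is exactly what download managers (and tools like `curl -C -`) do; the point is that the logic is simple enough to live inside a desktop backup client.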
But since you mention:
> what if we built something that looks IDENTICAL to the local Finder or Explorer but in a web page would that make some people more happy?
As I've mentioned above - it was about the functionality and not the look and feel.
And, imho, it would really be good if you could move the "restore" functionality into the main Backblaze app:
- where I can see my files, and hopefully the available versions too, and select either a point in time and then the files available at that time, or a file and then its available versions (some apps do it the first way, some the opposite)
- or, at the least, let me do all that on the web, and the Backblaze app starts restoring into a folder designated by me when it connects to the Internet again
- restores should retain the original folder structure. E.g., I restore a.png, which was at ~/MyFolder/A/B/C/Z/B/A/CoolPics/a.png, and the restore function dumps it as ~/<Backblaze Restore Folder>/a.png. Ideally it should give me the option to either dump just the file, or restore it as ~/<Backblaze Restore Folder>/MyFolder/A/B/C/Z/B/A/CoolPics/a.png, or at least recreate it where it was and rename any existing file with the same name (though this last option would be a mess with many files)
> I can ALMOST guarantee you the Backblaze data retention policy is about to change
That's very good to hear. Probably the best thing I've heard since yesterday :-)
PS. Please make sure this retention is something really cool and preferably not something like "well, let's make it 2 or maybe 3 months instead of 1" :-)
Because, as you must know, the point of a backup is that if I am looking for an important Excel file after 7 months and it was somehow corrupted or deleted, I'll turn to my backup; with a short retention it would be gone forever, defeating the purpose of my backup.
While we are at it, here's another piece of feedback: maybe start warning users when Backblaze sees files (that it was previously backing up) deleted, before actually deleting them from the archive (mark them for deletion in a month or so). Maybe let the user take an action like "yeah, go ahead, get rid of them", or "uh-oh, didn't mean to delete those; please restore them to where they should have been".
Can anyone recommend a good backup alternative that works on a Linux desktop?
The downside is that it's all on you to configure providers, pay for storage, and I'm not aware of any indexing or ability to retrieve single files. Also, it's CLI-based, so it might not be very intuitive for desktop use. Still, a free solution with lots of utility. :-)
I haven't tested it myself though, just heard about it.
It's supposedly possible to share this account with friends, because duplicity backups are GPG-encrypted.
These guys have a fantastic tool for rolling out a backup plan using storage that makes sense for your needs.
Even my time machine backups only go back about a year though due to capacity limitations on the backup drive. My most critical files are also on Dropbox, which I know isn't really backup but it's nice to have as an extra recovery option.
After weathering the eventual death and retirement of your 10th or 15th laptop, you kind of want to spend less time doing those transfers, especially with complex installs that can't be trivially offloaded to the cloud.
For Mac users, I like the backup practice of using two drives in two locations, Time Machine for incremental backups, and Carbon Copy Cloner for bootable backups, and then bump one of those images up to the cloud. One drive on a desk at home, one at the office, offsite, or a family member's house.
Additionally, keep a third hard drive in rotation elsewhere, or use a local NAS that bumps up to the cloud for you as well.
The bootable backups that tools like Carbon Copy Cloner or SuperDuper provide are priceless in my mind: you can simply plug your drive into another Mac and boot off the external drive, not missing a beat while your hardware gets attention.
So far it doesn't look like they're keeping the free peer-to-peer backups, which I was using for the less important computers, so this would more than double the price if I went for the SB plans.
Yeah, it seems that not only would the peer-to-peer backup process stop working, the already-backed-up files would be rendered useless too.
Relying on user accounts and "backup codes" stored on CrashPlan's servers sounds like it was an additional point of failure.
Customers sometimes argued that they should be able to stay connected 24/7 all month long because that's "unlimited".
Our definition (that we indicated in our terms) was "you will never be billed a surcharge for usage during a month", but there was no guarantee of being able to use a certain number of hours. So we reserved the right to de-prioritize (i.e. disconnect) high-usage users during peak times to allow other users to get on when our lines got full.
I'm sure somewhere in Crash Plan's terms of service they define what the term "unlimited" means in the context of their offering, and I would bet it means "there will never be a surcharge for usage of storage".
So they will never ding you for using more than X storage, but there's no guarantee that you can store a specific amount of data on the service.
I'm more concerned about old files I've accidentally or deliberately deleted that I later turn out to need.
Anyway, my storage requirements are pretty low (probably 1tb) so I'm going to look into solutions like B2 with Arq (along with my existing Time Machine).
Just frustrating to have to go through all this!
(For home-grown options, I've been using rsnapshot for years. But it really works best on a local disk and the rsync-over-ssh hacks I've done are the opposite of "clean consumer solution".)
The killer feature Duplicity has for consumer use is that it's probably the only (?) backup program on Linux with an actual quality desktop GUI, Déjà Dup, which is included by default in Ubuntu. It works somewhat like Time Machine on Macs. But if you need to back up headless systems, the Duplicity command-line interface works fine as well.
When it comes to backup features it's not the most powerful tool compared to some other solutions like Attic or Borg, but I think the GUI and out-of-the-box (encrypted) integration with cloud services make it one of the most user friendly solutions on Linux.
EDIT: Duplicati also has a nice web GUI. Worth looking into as well.
Modern incremental-forever backup tools like borg (fork of Attic) are much faster since they are based on block hashing, which also gets you deduplication for free.
Basically, a backup is a set of hashes - this means that you can selectively delete or retain old backups without having to merge them.
I built a >50 TB enterprise backup cluster with borg and it works extremely well.
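The "backup is a set of hashes" model described above is easy to sketch. This toy version uses fixed-size chunks for simplicity (borg actually uses content-defined chunking, plus encryption and compression, none of which are shown here):

```python
import hashlib

CHUNK = 4 * 1024 * 1024  # 4 MiB fixed-size chunks; borg chunks by content


def backup(data: bytes, store: dict) -> list:
    """A snapshot is just an ordered list of chunk hashes.
    Chunks already in the store are never stored again (dedup)."""
    manifest = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        h = hashlib.sha256(chunk).hexdigest()
        store.setdefault(h, chunk)   # "upload" only if unseen
        manifest.append(h)
    return manifest


def prune(snapshots: list, store: dict):
    """Deleting a snapshot never requires merging incrementals:
    drop its manifest, then garbage-collect unreferenced chunks."""
    live = {h for manifest in snapshots for h in manifest}
    for h in list(store):
        if h not in live:
            del store[h]
```

This is why you can selectively delete or retain old backups without a rebase or merge step: each snapshot is independent, and shared chunks are only stored once.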
E.g. if one backs up to hubiC, OneDrive, and Google Drive with RAID5-like redundancy, they can have just 33% overhead and still be sure that, should any one of those vendors discontinue the service or suffer a failure, their data would still be safe. Some call that RAIC: a Redundant Array of Inexpensive Clouds.
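The RAID5-like idea boils down to parity: stripe the data across two providers and store their XOR on a third, so any single provider can vanish and the data is still recoverable. A toy sketch of the principle (real tools use proper erasure coding, not this even-length padding trick):

```python
def split_with_parity(data: bytes):
    """Split data into two halves plus an XOR parity block.
    Any one of the three pieces can be lost and recomputed
    from the other two."""
    if len(data) % 2:
        data += b"\x00"              # pad to even length (toy scheme)
    half = len(data) // 2
    a, b = data[:half], data[half:]
    parity = bytes(x ^ y for x, y in zip(a, b))
    return a, b, parity              # e.g. hubiC, OneDrive, Google Drive


def recover_first_half(b: bytes, parity: bytes) -> bytes:
    """Reconstruct the first piece after losing its provider,
    using XOR's self-inverse property: a = b ^ (a ^ b)."""
    return bytes(x ^ y for x, y in zip(b, parity))
```

With two data pieces and one parity piece, the parity is one third of the total stored, which matches the "33% overhead" figure above.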
git-annex and Tahoe-LAFS do this, but they're not actual backup solutions.
If you go with this one, do be aware that by its nature, restores when you've done a lot of incremental backups can be extremely slow. (Think hours, where other systems would retrieve the data in minutes if not seconds.)
That means you really need to be doing a full, and therefore also slow, backup reasonably frequently to keep Duplicity usable. As a rule of thumb, we found a full backup each month plus dailies each night was just about tolerable even if you needed a restore quite late in the month, but even then, a 10GB backup with 30 days of incrementals on top could take an hour or more to retrieve a single file.
As I understand it, Duplicity's scheme can also be vulnerable to minor corruption (e.g., a disk error) anywhere in the chain since your last full backup taking out the entire backup, so if you're going to use it, you might like to investigate the options it provides for mitigating this risk as well.
If my understanding is correct, Duplicati started out as a C# rewrite of Duplicity (hence the name), but the projects have grown apart and are different things now. I don't think you can use Déjà Dup with Duplicati, for example.
Is it a proper service on Windows? If not, that's a problem. I want to configure it from an admin account and have it still work when non-admin, non-tech-savvy users log in.
rsync.net has some kind of problem with the "Powerboost" feature. The first 10 megabytes of a transfer are fast as hell, then it drops to ~150kB/s. Have fun backing up gigabytes at that speed. There is something wrong with their traffic shaping. After much back-and-forth their tech support basically threw up their hands and gave up.
Dropbox charges 9.9 USD per month for 1 TB.
There's also restic, which works similarly but has native support for cloud storage: https://restic.github.io/ - sadly, its claims of efficiency are rather undermined by the continued lack of compression support.
I have a mirrored 2x6TB setup that all of my computers back up to. It shows up as an Apple Time Machine target, so MacBook backup and restore is easy; it also holds some media storage with automatic nightly/weekly/monthly snapshots. The monthly snapshots hang around for a year; the others are cleaned up sooner. For those, I have to un-hide the snapshot folders, but then I can just copy-paste any deleted files that I want to restore.
It definitely takes some setup work, but I've been pretty happy overall.
That server is in my basement, my work laptop also has CrashPlan for off-site backups, but that's a business plan, so it won't be affected by this. I do need to figure out an off-site backup for the rest of it...
It's not the cheapest either, but I never had issues when I was using them.
And, frankly, my measly 200GB isn't much data. A 2TB HDD costs under $100 but could cost $500/mo to back up with Tarsnap. It would still have been $5/mo with CrashPlan.
I get that Colin is basically just charging the S3 costs plus a small margin, but the reality is that S3 is far too expensive to be used to store low-value consumer data.
The majority of my most important backup data is my photos. I don't see how tarsnap could duplicate (or even further compress) that.
I have approx 2.5TB on CrashPlan, so that gives me an upper bound of $625 per month. Maybe it would be 50% lower when deduped; maybe more, maybe less. Who knows. Not really a pricing model I can make a decision on.
Tarsnap looks great for certain use cases. But it doesn't really make sense for home users with a large amount of data.
Using the --dry-run flag.
* Are they one of the market leaders?
* Do they have a company blog? Are they passionate about their product? (Backblaze's technical articles are always a treat; barely a word from Crashplan.)
* Do they regularly interact with their users, e.g. on HN? (I often see Backblaze commenting on relevant articles.)
* Is their website user-focused, or is it aimed at the enterprise? (Crashplan's website is a confusing mess.)
* Do they offer secondary products that relate to their primary offering? (B2 leverages Backblaze's fine-tuned backup architecture.)
* Is their software well-built/designed, or is it annoying and bloated? (Crashplan's client forces you to wait for reindexing every time you check a new folder! The tree view is also very annoying, and Backblaze's whitelist just makes a whole lot more sense.)
A shame for Crashplan users, but I hope this change ensures BB's longevity for many more years!
"Do they have a blog"?
Really? I'm sorry, but when it comes to backups I'd rather have them working on the product instead of keeping a marketing machine up and running.
The service features and GUI experience fall short, and yes, I am largely just referring to the 30-day file retention. In practice, this basically means you need yet another plan to make up for it. And once you find that other plan, it should make Backblaze unnecessary if it works in any reasonable way.
This is typically a product you just want to work and are not busy with reading the company's blog as long as things are working fine.
You set the settings and can forget about it.
That said, your whole post seems somewhat biased.
I am not saying it's evil or that you're wrong, just that your deductions are illogical.
I'm a bit partial though, as that fits us quite well in what we are doing at our place.
* Do they allow you to recover deleted files? - No. (I have used this a few times; it saved some of the digitized family photos which I blew away with an rsync --delete flag.)
* Do they offer secondary products that relate to their primary offering? OK, I can read that as: they are stretching themselves too thin and not focusing enough on backups. Will they announce they are dropping consumer backups and moving on to B2 as their main product?
* Is their software well-built/designed, or is it annoying and bloated? - Currently it doesn't run at all on my systems.
" we decrypt your data on our secure restore servers and we then zip it and send "
That seems shady to me
I usually keep my photo library on external HDD and archive every few months from my laptop. I'd be anxious about not remembering to plug back in to keep the files backed up to the cloud.
If Backblaze sorted this out and extended the time to a year then I'd happily sign up again.
So, this news makes me very sad, but there's no way I'm returning to Backblaze...
I would encourage you to try the Backblaze client again. We JUST released the 5.0 client (last week) and we continue to make little fixes that improve the customer experience. The client team's MAIN GOAL has always been that the client run in "Continuously" mode and NOT annoy the customer.
For example, even up until a year ago we didn't realize how reading certain data structures from disk that we need for backup was hurting performance on non-SSD computers. As developers we tend to run with SSDs now, so it went undetected for a while. So we're in the process of making that more and more invisible for customers still on spinning drives in their laptops. The way we make it invisible is actually by slowing down the reading of those data structures so that it doesn't interrupt what you are doing.
> and yet it STILL insisted on starting from zero and backing up everything again
If you do an "Inherit Backup State", you will not re-upload everything. However, Backblaze has to read EVERY file off the local disk and compare the checksums (SHA1), and if files have moved it must deduplicate and record the new locations, so this can look like a massive load on your computer for a few hours. I would encourage you to open the Network control panel and notice that no files are actually being pushed.
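The checksum pass described here is conceptually just a streamed SHA1 per file, compared against what the server already knows. A simplified sketch (the real client also tracks moves and block-level dedup, which this omits):

```python
import hashlib


def sha1_of(path, bufsize=1 << 20):
    """Stream the file in 1 MiB buffers so multi-GB files
    never need to fit in RAM; this is the disk-heavy step."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while True:
            buf = f.read(bufsize)
            if not buf:
                break
            h.update(buf)
    return h.hexdigest()


def needs_upload(path, known_hashes: set) -> bool:
    """Files whose SHA1 the server already has are only re-recorded,
    not re-uploaded; hence the busy disk but idle network."""
    return sha1_of(path) not in known_hashes
```

This is why an "Inherit Backup State" pass hammers the disk for hours while the network panel shows nothing being transferred.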
Until you address this, I can't justify investing the time, the processor performance, or even the tiny monthly payment.
Complete backup plans are taken seriously by myself and other HN users, and when one tries to put one together that includes Backblaze, it just doesn't add up.
Maybe you could make a blog post that addresses your recommendations head on. I remain open!
Trust me, that message has come through loud and clear! :-) We are going to do some analysis to see if we can afford it, but we'll need at least a week or two to figure that out.
Deleting, replacing, or modifying already-stored files should require a /more privileged/ instruction. This would prevent a backup service on a compromised system from removing remote backups (assuming the administrative credentials are kept secure by other means).
Ideally I'd like to be able to manually assign privileges to sub-accounts.
* Modify: resume uploads older than 24 hours
* Modify: remove/delete
* Modify: change storage (filestream/parameters/metadata etc).
* Append: create new buckets
* Append: add a new file to a bucket
* Append: add a new /version/ of a file in a bucket.
* Append: add metadata
* Read: all list operations
* Read: all download operations
A simplified customer UI might bundle those operations together, but some advanced way of providing finely grained privileges should be created.
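The modify/append/read split above maps naturally onto capability sets checked per operation. A hypothetical sketch (these privilege names are mine, modeled on the bullets above, not a real B2 API):

```python
# Hypothetical capability names following the list above;
# B2's actual application-key capabilities differ.
MODIFY = {"resume_old_upload", "delete", "change_storage"}
APPEND = {"create_bucket", "add_file", "add_version", "add_metadata"}
READ = {"list", "download"}


class SubAccount:
    """A sub-account key holding a fixed set of capabilities."""

    def __init__(self, granted):
        self.granted = set(granted)

    def check(self, op: str):
        """Raise if this key lacks the capability. A compromised
        backup host holding an append-only key can add new file
        versions but can never delete or overwrite stored data."""
        if op not in self.granted:
            raise PermissionError(op)


backup_host = SubAccount(APPEND | READ)   # append-only: safe on the client
admin = SubAccount(MODIFY | APPEND | READ)  # kept off the backed-up machine
```

The point of the design is that ransomware on the client machine can at worst add noise, never destroy history; destruction requires the admin key, which lives elsewhere.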
Edit: Fixing formatting.
A way of improving the storage situation might be to let the user white-list (with some defaults) folders more likely to contain 'third party' files (e.g. C:\\Windows, C:\\Program*, /usr/), either excluding them if they're on a list of common files, or de-duplicating them against a public list of common files (and checksums). It would be useful to add-on programs and scripts if that list were public, and if there were a way of cloning those files into a discounted pool by 'uploading' them again (to ensure the customer actually /has/ the file and thus, presumably, the right to restore it).
Speaking of which, a possibly simple feature request: when a file's entropy suddenly goes from low to high, alert the user and keep the old versions longer.
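The entropy heuristic is cheap to compute: encrypted or compressed data looks near-random (close to 8 bits per byte), while typical documents sit much lower. A sketch of the check (the 7.5 bits/byte threshold is my assumption, not anything Backblaze uses; real detectors also have to avoid false positives on already-compressed media like JPEGs):

```python
import math
from collections import Counter


def shannon_entropy(data: bytes) -> float:
    """Bits per byte: near 8.0 for encrypted/compressed data,
    much lower for typical text and office documents."""
    if not data:
        return 0.0
    n = len(data)
    counts = Counter(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())


def looks_encrypted(before: bytes, after: bytes, threshold=7.5) -> bool:
    """Flag a file whose entropy jumped from low to high between
    versions, a common ransomware signature; such files are worth
    retaining in the archive longer and alerting the user about."""
    return shannon_entropy(before) < threshold <= shannon_entropy(after)
```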
I tried this, and it failed. I went back and forth with support for some time, and at the end they effectively shrugged. I may have encountered some weird bug, I dunno, but it was definitely uploading everything again, and support acknowledged as much.
- 2014 FIFA World Cup - Brasil
- 2015-16 Penguins Stanley Cup
- 2016-17 Penguins Stanley Cup
- 2015-16 Los Angeles Lakers (Kobe's Final Season)
Every so often I go in and check that a game from each of these is playable at the start and end of a half, period, or quarter.
The TV shows and movies I have on Blu-Ray/DVD I don't even check. This is more "I'm too lazy to re-rip, but if I have to, I will" backups.
Ebooks are really the only media that I have to remember to backup, and that library is so small (and relatively static) that I just drag it to a USB stick every couple of months.
For Linux, Backblaze offers "B2 Object Storage", and you can choose from all sorts of third-party Linux clients! The billing is different: half of one penny per GByte per month. This is cheaper for less than a terabyte, and can be a bit more expensive for more than a terabyte.
Check the "integrations" page for pictures of a penguin: https://www.backblaze.com/b2/integrations.html
Among others, CloudBerry, Duplicity, GoodSync, and HashBackup are all options that can store files on Backblaze's reliable storage in our datacenter.
(Scroll down to "B2 Cost Calculator")
If you type in the size of your hard drive, that will be the "predictable bill" you will get each day.
Also -- how do I know my 3rd party B2 backup system isn't going to cause me extra charges from using API's inefficiently?
Uploads are completely free. I think what you will find is that the bulk of your charges will simply be storage (at half a penny per GByte per month); the API charges aren't intended to nickel-and-dime you. We are simply trying to discourage abuse of our API servers.
Creating a Backblaze B2 account is totally free, and if you stay under 10 GBytes the service is free too (and we will cap you at that until you give us a credit card). Seriously, try it out! There is a handy "Reports" page after you log in (look for the word "Reports" along the left). It will tell you EXACTLY which API calls have occurred and how many times.
If you need more than 10 GBytes and are still worried about cost, as part of your experiment look for "Caps & Alerts" along the left after you sign into your B2 account. You can setup daily "Caps" (like for $1/day) so that you absolutely won't exceed your budget until you are comfortable with the B2 service!
What I would miss, however, is the central management and reporting and alerts that crashplan gives me (e.g., host X hasn't been backed up in N days!)
That of course is all dependent on the client, which all seem standalone.
or let us pay what the Mac and Windows users do for whatever frontend we want. switching everyone over to the B2 protocol is one thing, but making us pay more for the privilege of doing the same backups your Windows/Mac users do is another.
Just to be clear, it should be LESS expensive to use B2 in most cases. If you need less than 10 GBytes, it is totally free on B2, but $5/month using the Backblaze Online Backup client. If you have 500 GBytes to back up, that is only $2.50/month using B2, but $5/month using the Backblaze Online Backup client. You break even at 1 TByte, and B2 costs you more if you have 2 TBytes to store.
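The break-even arithmetic above is easy to check with the prices quoted in this thread ($0.005 per GB-month for B2 storage vs. the $5/month flat client; B2 also bills some downloads and API calls, which this ignores):

```python
B2_PER_GB_MONTH = 0.005   # half a penny per GByte per month (storage only)
FLAT_MONTHLY = 5.00       # Backblaze Online Backup client, flat rate


def b2_monthly_cost(gbytes: float) -> float:
    """Storage-only B2 bill for a month, in dollars."""
    return gbytes * B2_PER_GB_MONTH


def cheaper_option(gbytes: float) -> str:
    """Which offering is cheaper at this data size?
    Rounded to cents to dodge float noise at the 1 TB break-even."""
    b2 = round(b2_monthly_cost(gbytes) - FLAT_MONTHLY, 2)
    if b2 < 0:
        return "B2"
    if b2 > 0:
        return "flat"
    return "break-even"
```

So 500 GB costs $2.50 on B2 (cheaper than the flat $5), 1,000 GB is the break-even point, and 2 TB costs $10 on B2 (the flat client wins).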
Backblaze is not charging a premium for the B2 service. Let me put it this way: if somebody ELSE had been willing to offer Backblaze storage at B2 prices, we never would have built our own storage for the Online Backup client. I'm not kidding. The only reason we had to build the storage layer is that, to this day, Amazon S3 gouges customers at 4x what it actually costs them to provide the service. And don't even get me started on what Amazon S3 downloads cost.
Backblaze Online Backup charges $5/month, and of that maybe $0.50 is profit (50 cents) living on the AVERAGES (make money on some customers, lose money on others). If it was based on Amazon S3 we would be losing $15.00/month by charging $5/month.
Now, because I believe in the ideals of free and open source software and use the operating system that is an instrument of those ideals, I am not worth it, despite knowing that, given a passage, the open source community would build you a Hyperloop.
How about releasing a Backblaze protocol spec or the client source code? That way, us Linux fans could roll our own.
That is all described here: https://www.backblaze.com/b2/docs/
Now the Mac & Windows Backblaze backup client uses those protocols, but for full disclosure ALSO still uses one or two additional old legacy protocols. But our intention is to move entirely over to using the new B2 protocols, and if it is not possible we plan to extend the B2 protocols to make it possible.
One example is that the Macintosh Backblaze backup client has available the ability to push up to 1,000 individual small files in a single protocol request. (The individual setup and tear down of HTTPS for 1 or 2 or 3 byte files hurts performance.) So we'll be adding that to the "official" B2 protocols.
As far as I know, Backblaze doesn't let me back up mounted folders from a NAS. This is the only reason I don't use Backblaze.
Backblaze is A LOT LESS LIKELY to exit consumer online backup anytime soon, as it is our primary business.
I don't fully understand what just occurred with CrashPlan, but they had two separate "clients" - one for consumers and one for businesses. As far as I can tell, CrashPlan is discontinuing one of their two clients but doubling down on the "business client". I don't understand how they got into that situation, but Backblaze only has one backup client so we can't really abandon the one true client we have. :-)
Finally, Backblaze is profoundly different than CrashPlan in that we never really raised any bank financing or VC financing. We're 90% employee owned, and there are no deep pockets. CrashPlan raised something like $150 million which comes with "pressure to grow fast or die". Backblaze is free of any such pressure, we own our own fate.
In a final note: I have ALWAYS liked CrashPlan and I am sad to see them go. Realistically they were never a "competitor" to Backblaze. Our biggest competitor was customer apathy and customers not realizing that online backup was a good option. The more money companies like Carbonite and CrashPlan poured into advertising INCREASED Backblaze sales merely by raising awareness of online backup. CrashPlan (and Carbonite) have been absolutely wonderful to Backblaze because we didn't have the gigantic amounts of money to advertise that they had and they essentially advertise on our behalf (for free). We also know some of the CrashPlan people and I believe they are good people who want the best for their customers.
This whole CrashPlan Home episode deepens my skepticism about services that claim to offer an unlimited amount of a physical resource (atoms on a hard drive, in this case) for a fixed price. Code42 has cited certain reasons for exiting the home user market, but I wonder if an unmentioned reason is the cost of serving an excessive number of users who consume more hard drive space than is profitable. As a current CrashPlan Home customer, I might be one of those users, given my 2.2 TB data set. It has felt like a steal for me, but in the future I'll likely seek an option with a good, fair price instead of a steal (which in the end causes pain and hassle and costs more than what I bargained for).
Is the cost of disk space the reason why Backblaze expunges files deleted on a backed-up computer from the backup on your servers after 30 days? Though I think you guys are cool, this policy is a deal-breaker for me as far as a CrashPlan Home alternative is concerned. At this point, I'd rather pay a reasonable price for the disk space I use than have an "unlimited" plan where I have to constantly look over my shoulder at my files to make sure important ones haven't been inadvertently deleted within the past month. If the cost is too much for me, I'd rather be the one who decides which files to exclude from my online backup, rather than have the backup service do it for me.
P.S. and Disclaimer: I developed a backup companion utility (Bitrot Detector) that happens to be more relevant and useful to users of a service like Backblaze which performs file mirroring than one like CrashPlan Home which performs file versioning. However, I'd prefer it if every backup service did versioning rather than mirroring, as version-preservation is what allows you to set-and-forget a backup, rather than to set-and-constantly-worry. Though of course I want more customers for my product, I'd rather have relatively fewer if it meant I lived in a world where every backup service did versioning and fewer people experienced data loss and the resulting grief.
The original reason was to prevent customers who owned a single 1 TByte hard drive from filling it with content, backing it up to Backblaze, writing down the date, then emptying the hard drive and filling it with DIFFERENT content, backing up to Backblaze, writing down the date, repeat 50 times.
The idea was this: if the data is not important enough for you to keep a local copy, Backblaze is not going to keep it either. You aren't allowed to go down to just "one copy in Backblaze".
We chose the 30 days as something we thought of as "reasonable". For example, we expected that within 30 days you would realize your laptop was stolen so you could request a full restore. That sort of thing.
However, we're now all furiously debating the 30 days based on the enormous amount of feedback we are getting today. We will do an analysis and, if we can afford it, we will increase that number.
No problem, I'll just restore from backup. I have CrashPlan online backups, and local backups with Obnam, so I'll just recover it from one of them.
Every snapshot in CrashPlan and Obnam had the truncated file, going all the way back to the first snapshots. I thought I had lost my GPG key forever.
Then I remembered that I had some old CD-R/RW backups from years ago. I started going through them. Some of the discs were unreadable. Finally I found one that was readable and had the untruncated private key file.
Lesson learned: always keep your old backups. You never know which files on your system have suffered from bitrot or accidental truncation or accidental deletion--until you try to access them. It's very likely that some of them will have been destroyed more than 30 days ago.
Now keeping old backups doesn't mean keeping every snapshot, ever. CrashPlan takes 15-minute snapshots by default, so obviously I don't need every one of those going back years. But I definitely want to keep at least one snapshot for every year I've used the system, at least one for each of the last 12 months, etc.
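That kind of thinning rule (often called grandfather-father-son retention) is simple to express. Here is a minimal Python sketch of the policy described above — keep everything from the last week, the newest snapshot of each month within the last year, and the newest snapshot of every year. The function and its inputs are hypothetical, not CrashPlan's actual behavior:

```python
from datetime import datetime, timedelta

def prune(snapshots, now):
    """Return the snapshots worth keeping: everything from the last week,
    the newest snapshot of each month within the last year, and the
    newest snapshot of each year, ever."""
    keep = set()
    week_ago = now - timedelta(days=7)
    year_ago = now - timedelta(days=365)
    newest_in_month = {}
    newest_in_year = {}
    for ts in snapshots:
        if ts >= week_ago:
            keep.add(ts)                      # recent: keep everything
        if ts >= year_ago:
            key = (ts.year, ts.month)
            if key not in newest_in_month or ts > newest_in_month[key]:
                newest_in_month[key] = ts     # one per recent month
        if ts.year not in newest_in_year or ts > newest_in_year[ts.year]:
            newest_in_year[ts.year] = ts      # one per year, forever
    keep.update(newest_in_month.values())
    keep.update(newest_in_year.values())
    return sorted(keep)
```

The point is that the storage cost of "keep old backups" grows roughly logarithmically with a rule like this, not linearly with every 15-minute snapshot.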
The mentions here of poor restore options don't sound all that positive either, but because of the former I'm not even considering trying you guys out.
The way things are looking I'm probably going to stay with Crashplan, migrating to their business plan. I'm not a data hoarder (backup set is ~400GB), but I haven't found anything else which has the same feature set for a reasonable price.
The use case you mention seems to be very exotic, but who knows, maybe it's not as rare as I would expect. To me, expunging deleted files because of this rare, somewhat-malicious case seems like it would cause undesirable collateral damage among your service's typical users.
- Keep at least the first (non-zero-sized) version of a file, at least one version per month (if modified), at least one version per 15 minutes for the last week, and the last version of the file.
- Ability to back up files in certain directories very frequently (i.e. 15-minute intervals).
- Ability to search for and restore individual files in the client, i.e. without sending my password to someone else.
For this I find the $10/mo Crashplan now charges to be reasonable (my current backup set is ~400GB). I would be prepared to pay more if I required more storage, though within reason.
(I can better understand the policy now as well -- keep people from using as primary offline storage.)
Over a period of time, one of the drives failed slowly. It would lock up sometimes, and I would restart it. Months later I realized it was losing files when it did this. And when I tried to restore those files from BackBlaze, I discovered that they had been purged, and that this was considered normal behavior.
BackBlaze had lulled me into a false sense of security regarding my data by claiming to back up my files but actually mirroring a hardware failure on my local system.
I had an unhappy email exchange with BackBlaze tech support, and also Brian, and came to realize how flawed the system was.
I switched to CrashPlan at that point.
I will not consider using a product that lacks proper file versioning; and a much longer lifetime for deleted files.
What good is a backup tool if a) you can't restore using the native tool and b) you can't restore files older than 30 days?
I am so confused by your question. With Backblaze, you can get 100% of your data back in two ways: 1) you can get a free external USB hard drive sent to you with all your data on it, or 2) you can prepare a free ZIP file with all your data and download it. We provide a restartable native bzdownloader to help you download the recovered files.
> What good is a backup tool....
Isn't the goal to get all your data back? Backblaze does that. Maybe I'm misunderstanding your question?
> you can't restore files from more than 30 days.
With Backblaze, you can recover files for more than 30 days. You can ALWAYS get "the most recent version" even after 9 years. What you cannot do is get all the "intermediate versions" (like if you change a text document, we retain all versions for 30 days, then we only keep the most recent version forever). I do agree that would be a better product (retain infinite versions of every file). Unfortunately we would need to charge more for that, and many customers only want the most recent copy of all their documents.
I understand if Backblaze is not a good solution for you. I just want to be absolutely clear what we provide and what we do not provide.
It'd also be nice if you could provide an option to get alerts if a certain subset of files changes. I have folders on my NAS (I know Backblaze doesn't back up servers) which are basically files from my old desktops that I'll sort through some day in the future, and which are highly unlikely to change, maybe ever.
If this is how the 30 day policy works, it is slightly more appealing than what I thought. However, it's still not enough peace-of-mind for me. This means that recent files are vulnerable to corruption during their first month of existence (through accidental overwriting, and some kinds of viruses and ransomware). The "most recent version" that is backed up may end up being a corrupted, useless version.
I agree that retaining infinite versions of every file would be cost-prohibitive. However, you may be able to find a healthy compromise. Maybe you could retain a ton of versions of small files and fewer versions of large files. Maybe you could provide users with a space allowance for versioning and allow them to decide which versions of which files to delete (if they reach the allowance limit) or to pay more to increase the allowance.
To summarize: You need versioning in order to be an awesome backup service (and an awesome backup service is the only thing I'll happily settle for!)
Keeping old snapshots (e.g. one or two per year) is an absolute must.
No. We consider a file deletion as "the final version of the file you wish to retain". In that case you can get the deleted version for 30 days, then it is gone forever.
* Can it backup linux machines?
You can see a list of integrations here: https://www.backblaze.com/b2/integrations.html
I'm a tech guy, and "rolling my own" Linux solution is possible, but in this case, I'd be happy to pay someone else to worry about the details of storing a copy of my bits, so I can do other things with my time.
My wish list is as follows:
- Have both a Linux and a Windows client
- Ability to whitelist/blacklist/select subsets of files/folders to be backed up
SHOULD offer as many of the following as possible (all optional):
- Some sort of "Family" pricing deal for multiple machines in one account
- Encrypted backups with a consumer-controlled key not necessarily given to the service
- Option to switch out the back-end bulk storage infrastructure
- "Forever" file retention / restoration of old versions (charge me for storage)
- User-scheduled backups and/or a client that doesn't bring the machine to a crawl or saturate the home line
Feel free to add any replies with suggestions here which might be useful to others.
From my point of view, if CrashPlan found the home offering unsustainable due to "abuse" of the "unlimited" feature, I would have much preferred they introduce a cap on the home plan at some "reasonable" value. They could have balanced the books and saved 95% of their customers. Start rolling off the oldest file-version backups when you hit the space cap and/or offer extra space at cost plus profit. Done.
* Windows, Mac, Linux, iOS and Android
* Lets you share your quota across unlimited machines/devices
* Lets you whitelist/blacklist folders & files to be backed up (by pointing and clicking and/or by specifying patterns)
* Does LAN syncing
* Keeps versioned file history forever (i.e. point-in-time recovery)
* Has scheduling but I haven't used this, it just runs quietly in the background and I never notice it negatively impacting anything (client and accompanying background service is currently using a total of 51MB of memory). You can cap the upload speed if you want to make sure it never saturates your uplink.
* Uses end to end encryption i.e. they don't have your key (important caveat: this is provided you only use their app and don't use their web interface, although even then they claim the key is never stored anywhere - just kept temporarily in the server's RAM while it auths/decrypts files for you)
* Been around for ages and not likely to go out of business any time soon
* Lets you share files with a self-destructing link
* Uses aggressive de-duping on their backend (ZFS based I think?) so you can squeeze the most out of your quota (just checked my account and I'm using 94GB without compression & de-duplication but only being charged for 41GB)
Edit: and they release a lot of opensource: https://github.com/SpiderOak
I'm probably going to just move my photos to Dropbox (I know, it's not a backup program, but it works and is very simple to deal with)
Edit: I've only ever used SpiderOak's Windows app though. No idea how good/bad their Linux client is.
Edit 2: this is what the current version looks like on my windows machine: http://imgur.com/a/KfnlQ
> Uses aggressive de-duping on their backend (ZFS based I think?)
Err, I hope not. Pretty sure they sync a database of seen blocks between clients and they collectively dedupe based on that - their backend simply shouldn't have the information necessary to dedupe itself.
Think you're right based on reading: https://spideroak.com/faq/what-is-deduplication
I assumed they'd use both techniques (i.e. de-dup client side when possible to save on upload) but that the way in which encrypted blocks were produced before uploading was deterministic (a bit like a hash function) so that if the same file was marked for backup on a different device the resulting blocks would be the same and could be de-duped server side.
Provided a unique salt is used client-side (shared across all devices and not known by SpiderOak) while making the block, it should be possible to end up with deterministic encrypted blocks that are still unique outside of your account (I think, anyway... I barely know what I'm talking about when it comes to the intricacies of crypto!)
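The idea in the last few comments can be sketched as deterministic, salted block identifiers — a toy illustration of the concept only, not SpiderOak's actual scheme (a real system would encrypt the blocks, not just hash them):

```python
import hashlib

def block_id(data: bytes, account_salt: bytes) -> str:
    """Derive a deterministic block identifier from content plus a
    per-account salt. Identical content on any of the account's devices
    yields the same id (so it dedupes within the account), while other
    accounts - with a different salt - get unrelated ids for the same
    content, so the server learns nothing across accounts."""
    return hashlib.sha256(account_salt + data).hexdigest()
```

Because the salt is shared across one account's devices but secret from everyone else, identical blocks dedupe within the account without revealing cross-account duplicates — essentially salted convergent encryption, minus the encryption step.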
I think I looked at spideroak a few years ago. Good to see they are still around. Might be my best fit.
A friend also suggested grabbing this table, exporting it to your own spreadsheet, and sorting by what matters to you.
Think they offer a free trial (or they used to at least) so no harm in giving it a whirl. Good luck on your quest!
[Edit] actually you can probably just rsync the diskstation volume with a backup volume on the same NAS.
[Edit#2] actually you don't even need rsync. Synology has a utility called Hyper Backup that does time machine like incremental backups, either on the same NAS, or on a remote NAS.
I guess the reason for CrashPlan bailing on the home market wasn't just people trying to store 20TB of movies, but also the support needs. The economics of providing end-user support on $60/year plans are horrific. Note that all CrashPlan really did was increase the cheapest plan from $6 to $10/mo...
This seriously sucks, especially since I now have to find a new solution, deploy it remotely on multiple PCs, and they'll all have to suck it up for another couple of weeks if not months while everything re-uploads to the new solution.
As for me - even if I migrate to a new solution, I'd still effectively lose the versioning. This is a serious bummer!
$6/mo for a family's worth of devices is ludicrously cheap
What are my options going forward?
I can easily switch myself over to Backblaze, but are there any simple, free, GUI clients for creating incremental, encrypted backups to another machine over the internet? The great thing about crashplan has been I just install it on the family member's machine and not once have I had a single problem. It just worked and provided my family member's with reassurance that I had a backup of their data, that I couldn't touch or see, if they had problems with their own machine.
And as I see that BB employees are in this thread, how about making backup folders opt-in for power users. I have sensitive files all over my machine, and who knows what crap MS is storing in User/LocalData etc. so I prefer to opt-in to folders rather than take a risk opting out and syncing a password or keyfile to your servers. The UI for exclusions is awful too.
I'm certainly not going to go through the steps of setting up CLI tasks on my family members' machines either.
I personally use it with Google Drive.
Use B2 with a decent client instead, you don't get an all-in-one plan for your family, but at least you keep control over your data.
Mind if I ask why you don't want encrypted data in the cloud?
Often the web UI would just time out when I wanted to recover a file. I never could figure out why, so I couldn't really recover files reliably anyway.
I'm glad I'm being forced to find a solution that's better. I'd definitely stay away from them for business solutions.
Thank goodness I bought into their long service plan. I have 1.9 years to figure this out.
We have one Linux NAS and a bunch of Windows clients. I could go with Crashplan Business on the NAS and then punt on the Windows clients. Or go back to BackupPC for windows... I hope their windows client solution has improved -- I used to have to install a Cygwin rsync daemon.
In any case, it sounds like the business version drops the peer to peer free solution.
For one, their client is very bad at backing up without affecting other network services; when I'm on a video call I frequently have to pause Crashplan backups entirely or I'll have stream issues (or manually limit the upload speed to something really small that I'll inevitably forget to undo later). I've never had this issue with Dropbox for instance. The client is also written in Java so it's a resource hog; beyond annoying my desktop machine that also made it hard to install directly on a somewhat resource-limited Synology NAS device a few years ago (I eventually got the install to work but it sporadically won't start up on boot due to memory constraints).
Really, on my Linux desktop my most important files are my code, documents, pictures, and video, which are all already backed up to Dropbox. If my hard drive died it wouldn't be a big deal to do a fresh install as long as I can sync my code, documents, etc. with Dropbox. So I may just go without a full desktop backup solution.
I don't know how they're going to transition to enterprise given their trashy desktop client but good luck.
edit: in addition to Dropbox I'll probably add tarsnap to sync the important things to S3
I kinda feel that makes them perfectly suited for enterprise. I've never thought of enterprise software as having a particularly good user experience (*cough* SharePoint *cough*)
I'm also rather glad CrashPlan forced my hand. There were a lot of good ideas in it (the peer-to-peer features were interesting), but I really hated the Java client. And I always had to pause its uploads if watching Netflix, because it would saturate the upload channel completely. (On a Mac here, for what it's worth.)
They've given plenty of notice though, so I'm thankful for that.
edit: I got confused - Small Business likely still offers indefinite retention.
That red icon next to Small Business on the version 6 page scared me.
Hope you're right! Indefinite retention is a killer feature for me.
I can't recommend against Carbonite strongly enough.
We now have a NAS and back up truly important data to S3 and iCloud.
I dumped crashplan a few years ago due to the bandwidth restrictions and time it would take to download TB's worth of data, built my own solution with a remote Synology nas. It's a real cost issue once you have more than 1-2TB of data, otherwise it's just easier to use gdrive or dropbox for most needs.
I'm working out what to do next since they did this. I use Arq (https://arqbackup.com) and back up my desktop (Windows) and laptop (OSX) to Amazon Cloud Drive. Together it's approx 1.5TB of backups - and this is excluding a bunch of folders and accepting data loss on them.
The doubled price doesn't appeal, but I haven't found an up-to-date storage price calculator to see what my options are in the TB range.
For pay-as-you-go storage, I found this: http://coststorage.com/
I don't see how this makes sense. Instead of closing the whole service, they could have set a limit, which would probably satisfy the 95% of users who aren't using enough storage to make it unprofitable. So I assume this isn't the reason.
I wouldn't be surprised if they just ran out of data center capacity and decided cutting loose low-revenue consumers was cheaper than building out new space.
Let the suggestions begin - who should I switch to? Bonus points if moving data to a newly purchased hard drive isn't a terrifying process that looks like it's losing all of my backups, as well as Linux/Windows/Mac support.
I'm kinda partial to Arq with your choice of cloud storage. I heard that they don't support moving backup data between destinations though so I'm going to have to re-evaluate my choices when my Amazon Drive unlimited subscription comes to an end as well.
I want a boring solution, written by a boring company, that very boringly stores my files.
I now need to check all the different solutions and start from scratch again!
I wish they had offered some bridge to transfer what is already backed up to another provider; yes, I know this is not a simple task, but at least their customers would not be as upset by the news.
"Your new subscription expiration date is 11/06/2017" had me thinking they accidentally set the expiration to last June...
In case of catastrophic failure, where on-site backups would also be destroyed or corrupted, I would hate to have a drive shipped from the United States to me. It would probably take forever.
Is there an alternative available that does computer-to-computer backups and supports Mac and Linux?
SyncThing+cloud storage seems most promising but I've not actually tried it yet.
1. syncthing on PC -> syncthing on Synology NAS.
2. NAS -> Internal backup from 3-Disk RAID to additional 4th disk
3. I am thinking about additionally uploading it to Amazon Glacier or Backblaze
CrashPlan was awesome in that it doubled as both a backup app and a cloud storage solution. Since I had my settings set to NEVER remove deleted files, I didn't need to have each of my backed up external hard-drives plugged in at all times in order to have them backed up. I could just plug them in as needed, make sure Windows assigned them a unique drive letter, and let Crashplan sync any new data up.
None of the alternatives I'm looking at can do this. Carbonite lets you back up one external hard drive on the Prime membership. Backblaze seems like it would delete all of the external drives if I don't keep them constantly plugged in. And no Linux client.
It's kinda sad that the market hasn't yet created something that seems fundamentally basic to data hoarders:
- TRULY unlimited
- BACKUP, not SYNC (that means I shouldn't need to keep three external hard-drives plugged in at all times just to ensure you don't delete it!)
- Fully cross-platform (yes, Linux users exist, too)
At this point there's nothing really comparable out there to what Crashplan offers. Even the services like Amazon Glacier or Backblaze's B2 start becoming way more expensive once you pass the terabyte mark or so.
Yes, this is my point! It's expensive to store a terabyte in the cloud. There is currently no way around that. Crashplan was probably losing money on customers who had multiple TBs in the cloud.
Good lord. And you need to back this all up in the cloud?
With E2E encryption, not so much.
Holy fuck. Let's settle down a bit buddy.
- Use a freenas at home
- Make everyone in the family back up to it instead of crashplan. (No idea what app to use here yet but assume there's a dozen of em)
- Anything that isn't already acting as an offsite backup via the freenas server (I.e. My own stuff, and server only files), upload to backblaze b2.
I'm overall okay with the announcement. Was already planning to switch away since crashplan wasn't offering the great deals they had years ago (crashplan family was ~7.50/month with 4 year plan in 2014)
I look forward to no longer futzing with Bhyve/Linux just to use crashplan from my server too.
I don't look forward to spending more time figuring this stuff out more though.
The only downside is that the integrated encryption is an additional charge. To get around that, I decided to go with Boxcryptor, which I already had. The average user can use the free version of Boxcryptor. It is awesome and I highly recommend it. Both pCloud and Boxcryptor show up as physical drives on your computer, which I find convenient. So I have my primary source drive synced with the Boxcryptor drive/location and Boxcryptor is set up on my machine to encrypt the data on the fly as it is copied (surprisingly, this is not the default setting). Then, I have a separate process that syncs the raw Boxcryptor directory where the files show in their encrypted format (the "drive" version shows them unencrypted because everything is done on the fly both ways) to my pCloud storage. In the end, I have 3 versions - the unencrypted source, the encrypted Boxcryptor copy, and the pCloud backup which contains the encrypted copy.
FWIW - the reason I decided against CrashPlan was:
1) I inherently don't trust anything that is unlimited everything (storage space, versions, undelete, etc.)
2) I especially don't trust anything that gives all those unlimited features for a paltry $5 a month
3) My speed was horrendously slow. Seemingly capped at about 2 Mbps (250 KB/s).
4) The application seemed to be bloated and resource heavy.
5) They have a well known issue where you can wipe your entire backup from their server if you encounter the dreaded disaster scenario (or get a new machine) and have a fresh O/S with a new installation of the software. If you don't do the steps exactly in the correct order, it will see your new installation as being the source copy and having nothing to store, and subsequently wipe everything on their servers. I see that they recently posted something on August 1, 2017 about "adoption" which seems to be migration for new machines so maybe this issue is finally addressed.
The winner was pCloud. Please see my other comment for a more detailed explanation of how I have things set up and why I use them if you are interested.
I use Backblaze, and I don't worry. It's not my only backup. I additionally have a local backup, which today has files I deleted like 5 years ago, as you should too.
So go get happy with Backblaze; it's a nice application, not some slow Java thing like CrashPlan, and it just works.
Most likely yes. Considering most people don't have any backups of any kind, having at least one backup, and an off-site backup at that, is infinitely better.
The number of people who have any backup is small. The number of people who have two is far, far smaller. Most people just don't care about their data that much unless they're a business, which is why this thread exists to begin with.
I paid less than $4 per month (the price would've been already higher if I had renewed yesterday, but still acceptable) and stored around 130 GB of data (so not a lot, but it's already more than the 100GB threshold that some services use)
I need something that would keep watch on all the files on the system (see inotify) to avoid continuous rescans of the disks, and that would work on Linux...
Just these 2 simple requirements already seem to disqualify any other tool suggested here:
- Carbonite (no Linux)
- Backblaze (no Linux, and none of the 3rd-party integrations with B2 seems to support live watching of files)
- Tarsnap (no watching)
- Borg (no watching)
Does anyone know if it's feasible to keep the whole / under Dropbox, Google Drive or Spideroak? (I know that GDrive doesn't have an official linux client)
The only solutions I know would be:
Lsync (but it would be only syncing, no history), r1soft (but it seems to be only servers, haven't checked yet the features)...
Are there any other tools out there?
I'm really thinking that I should write my own
> Backblaze ... none of the 3rd party integrations with B2 seems to support live watching of files
I'm not exactly sure what you are going for, but you can "preview" files through the web interface on BOTH the Backblaze online backup and also B2.
If you are specifically asking about movies, just stay tuned. The first step was allowing preview of simple images, next we want to support previewing of movies.
Also, a feature we added a week ago in Backblaze Online Backup is the ability to "Share Files" (any file in your backup) with other people. For the first release we limited the file size (just so our servers did not tip over) but we plan to support the ability to share all files less than 10 GBytes.
What op is looking for has nothing to do with previews
For Linux and the relatively small amount of storage you require, you're very likely going to be looking for client software independent of backup storage medium, not a service that provides a client. 130GB would be about $3/month on AWS S3.
Not ideal, but you only need to do it once and I already have it set up in my ansible playbook:
edit: It's client/server backup software. Not an online service. I used CrashPlan in a similar manner.
Wish Google would just release the Linux client they use internally.
Second is watching for files to change and saving that list until the next scheduled backup is run. This is what macOS's Time Machine does, the OS's fsevents continuously keeps track of changes and Time Machine just asks it for the list of changed files since a given time (the last time Time Machine ran). This is an alternative to a backup client scanning the filesystem itself on a schedule.
Inotify (on Linux; fanotify wouldn't be enough, since the backup tool would have to track file deletions as well) is the only way to get seamless background backups
I tried to use other tools in the past, like duplicity... But having a scan every hour that takes several minutes and has noticeable CPU usage is a deal breaker
(and having a backup scheduled less frequently than that is also a dealbreaker, since you cannot rely that the laptop would be up an running at a certain time during the day)
(CrashPlan also does a full rescan, but it's not strictly necessary... it's basically needed if the backup daemon crashed or was shut down, or stuff changed on the filesystem while the OS was off, so that sanity-check rescan can actually wait a few days)