Show HN: A tool for painless automated backups (github.com)
22 points by christophetd 6 months ago | 18 comments



If it uses https://github.com/gilbertchen/duplicacy in the background, it needs a (huge?) disclaimer that a $20-per-year license is required for commercial use.

What is the license on your code?


Thank you for pointing this out, I had forgotten about this aspect of Duplicacy's licensing. I added a disclaimer: https://github.com/christophetd/duplicacy-autobackup/commit/...

Regarding the license of my own code, I'm unsure what to use. Duplicacy's license (https://github.com/gilbertchen/duplicacy/blob/master/LICENSE...) states that "Modification and redistribution are permitted, but commercial use of derivative works is subject to the same requirements", so it seems to me that as long as I include such a disclaimer, it should be fine. I've opened an issue on the Duplicacy project to ask the author for his opinion.


I used to use duplicacy, but switched to Duplicati: https://www.duplicati.com/

It runs on Mono on all OSes; I've used it on macOS, Linux, and Windows. It supports GPG (public-key) and AES (secret-key) encryption, all the major cloud backends, etc., and provides a web UI.

It's also LGPL, so there are no concerns about commercial licensing à la Duplicacy.


I initially wanted to use Duplicati as well, but I tried it twice, about a year apart, and both times the web UI felt very buggy. I have a hard time trusting my backups to a tool with a buggy UI.


Given how close 'duplicacy' sounds to 'duplicity', and that they have the exact same purpose (backup tool), I'm surprised they haven't been sued yet. Particularly since there is a paid license cost:

https://duplicacy.com/buy.html


Agreed, the names are terribly chosen and very confusing. There was a discussion on the topic here: https://news.ycombinator.com/item?id=14507778


There is also Duplicati.

But I can't imagine suing would achieve anything; they're all dictionary words.


I find rclone much better (https://rclone.org/docs/)


Can you use rclone for incremental, content-deduplicated backups? Is it as easy as Duplicacy to use?


I haven't used rclone, but `rsync --link-dest` lets you create an incremental snapshot using hard links so that you don't have duplicate files between snapshots. Wrap it in a shell script, and you have something surprisingly useful.

I won't link to my wrapper script because I don't want my HN account linked to my GitHub/real name, but this guy's work is what I based mine on: https://blog.interlinked.org/tutorials/rsync_time_machine.ht...
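
The core of the pattern only takes a few lines anyway; here is a minimal sketch of the idea (not my actual script, paths and names are illustrative, no error handling):

    #!/bin/sh
    SRC="/home/user/"
    DEST="/backups"
    SNAP="$DEST/$(date +%Y-%m-%d_%H%M%S)"

    # Unchanged files are hard-linked against the previous snapshot,
    # so each dated directory looks complete but costs little extra space.
    rsync -a --delete --link-dest="$DEST/latest" "$SRC" "$SNAP" \
      && ln -snf "$SNAP" "$DEST/latest"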


rsync --link-dest can't do content-level deduplication, can't upload to the cloud, and it's much more inconvenient for creating and managing backups than Duplicity, Duplicacy or Duplicati.


> rsync snip can't upload to the cloud

First you have to settle on a definition for cloud before you get to make claims about what tools can or can’t do.


rclone can!
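
For example, assuming a remote has already been set up with `rclone config` (the remote and bucket names here are placeholders):

    rclone sync /home/user/data b2:my-bucket/data

Note that `sync` makes the destination match the source, so deletions propagate; `rclone copy` only adds and updates files.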


Any support for unattended asymmetric encryption, with the private key stored off-box?
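
I.e. something along these lines (a generic GPG illustration, not a feature of this tool), where the backup host only holds the public key and the private key needed to restore is kept elsewhere:

    tar -cz /data | gpg --encrypt --recipient backups@example.org \
        --output /backups/data-$(date +%F).tar.gz.gpg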


Duplicacy doesn't support it, so I don't plan to add support for it either.


Take this as constructive criticism, I don't want to put you down, but today everyone treats backup as a problem of moving data, which is wrong, and this is far from what I would expect from a backup tool. You have many targets, but then this is just a tool for moving data to the cloud. On the other hand, rarely does any serious organisation move its data to the cloud, and even when it does, the cloud is treated as second-tier storage. The primary target is always the streamer (tape); it is THE only storage that has been time-proven.

There are also a lot of concepts: incremental, differential, delta, rehydrating, archiving, and let's not even get into deduplication... and there are open source solutions for all of that. "Backup" is not a simple tag; do your homework and study the backup field. It is highly complex and far from moving data from point A to point B. Just a thought to show the importance: some organisations have a separate backbone and disk arrays for backups to minimise downtime, the data are not always on disk, the minimum you need is to flush the disk cache, for databases you will run into words like "freeze", and for virtual machines... well... good luck :)

But anyway, nice to see you like it enough to write a tool, just keep on going :)

----

After my initial post, I was downvoted. To justify my point: this is not a backup tool; it is just an alternative to rsync or cp with multiple targets, and more of something I would run after a real backup has finished. If someone is looking for a real backup, there are multiple open source backup products that know their domain. This one clearly does not.


The scope of OP's project is to make Duplicacy (an incremental, deduplicating backup tool) easier to install and easier to use for client-initiated, automatically scheduled backups.
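
Roughly, the workflow being automated is Duplicacy's usual CLI steps plus a schedule, something along these lines (the snapshot ID, storage URL, and cron entry below are only illustrative):

    # one-time setup in the directory to back up (-e enables encryption)
    cd /data && duplicacy init -e my-snapshots b2://my-bucket

    # then run periodically, e.g. from cron:
    # 0 2 * * * cd /data && duplicacy backup -stats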

Can you recommend anything the OP should do differently to achieve these goals?


I did. Now you can downvote me even more. I have finished the discussion.



