
How does it compare to http://freefilesync.sourceforge.net/ ? Apart from the delta copying the feature list sounds similar. I find the FreeFileSync UI quite usable.



Faster bulk copying, a lighter build, lower resource usage, removable device tracking, and a much better UI (it's subjective, of course, but just try it out and see for yourself). Also a formal backup planner, dry runs, and native support for running as a service with separate engine and UI processes.

  -- edit --
Let me give an example of what I mean by "much better UI".

Bvckup 2 has a hierarchical log viewer [1], and this thing does wonders for usability. Essentially, you can go from a backup run summary to the specifics of a failure in 2-3 clicks without being overwhelmed in the process.

The issue, however, is that it was a total bitch to implement so that it wouldn't require keeping the entire log in memory at all times. See, if the log is flat, it's easy to display its relevant part by looking at the scroll position and then rendering the respective chunk of the log file in a window. But once you have a tree, determining which items are visible becomes a hard problem, because random nodes can be opened and closed. Consider the case where you have a million-item log and, say, 15% of the nodes are closed. Moreover, the visibility lookup needs real-time performance, because the window has to respond instantly as the user drags the scroll thumb up and down. So, behind the scenes, this hierarchical log viewer is backed by a double-indexed b-tree index file - a construct that has no prior art and that took me several weeks to converge on and implement [2]. It could very well pass for a modest PhD thesis in a smaller university :)
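For the curious, the core lookup problem can be sketched like this. This is a toy in-memory version of my own making, nothing like the actual on-disk double-indexed structure - the `Node` class and `node_at_row` are purely illustrative - but it shows why caching per-subtree visible-row counts lets the viewer skip whole collapsed branches instead of walking every item:

```python
# Toy sketch: map a scroll position (visible row index) to a tree node
# when arbitrary nodes may be collapsed. Each node can report how many
# visible rows its subtree occupies; a real implementation would cache
# and incrementally update these counts on expand/collapse, and keep
# them in an on-disk index rather than recomputing recursively.

class Node:
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []
        self.expanded = True

    def visible_rows(self):
        # 1 for the node itself, plus its children only if expanded.
        n = 1
        if self.expanded:
            n += sum(c.visible_rows() for c in self.children)
        return n

def node_at_row(root, row):
    """Return the node shown at visible row `row` (0 = root)."""
    if row == 0:
        return root
    row -= 1  # skip the root itself
    for child in (root.children if root.expanded else []):
        span = child.visible_rows()
        if row < span:
            return node_at_row(child, row)
        row -= span  # jump over this entire (partially hidden) branch
    raise IndexError("row past end of visible tree")

# Example: a run summary with two sections, the second one collapsed.
tree = Node("backup run", [
    Node("copied", [Node("a.txt"), Node("b.txt")]),
    Node("errors", [Node("c.txt: CRC failure")]),
])
tree.children[1].expanded = False
print(node_at_row(tree, 3).label)  # b.txt
print(node_at_row(tree, 4).label)  # errors (its child is hidden)
```

The point of the exercise is that the lookup cost scales with tree depth and branching, not with the total log size - which is what makes real-time scrolling over a million-item log feasible.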

So, I mean, there are smaller, simple-looking differences between Bvckup's UI and other apps, but the thing is that they are sometimes really hard to implement. They do, however, improve the overall UI experience quite a bit.

[1] http://bvckup2.com/img/r8/screenshot-3-log-viewer-2.png

[2] http://bvckup2.com/wip/#27012014


> It could very well pass for a modest PhD thesis in a smaller university :)

This may be a bit far-fetched. Some might even say disrespectful to the bulk of PhDs out there, who typically worked several years (not weeks) to complete their dissertations.

From properly relating to the current state of the art, to actually developing multiple techniques (not just a single one) that improve on it, to presenting all that work in a scientific manner: your nice trick has a long way to go before it passes as a PhD.

EDIT: Ok, I didn't get the playful tone. Sorry.


A common standard for a Ph.D. dissertation is "an original contribution to knowledge worthy of publication", and the usual standard for publication is "new, correct, and significant". The length of time devoted is not part of the criteria. Maybe one can do the work in a weekend; maybe it takes years.

When I was a grad student, at one point I took a 'reading course'. A paper was required, maybe just expository and not necessarily original. I started with a problem seen but not solved in a course. I hit the library and saw no solution. In an evening I got some rough ideas for a solution and then proposed solving the problem as my 'course'. A prof looked at the problem for a weekend, didn't see a solution, and agreed that the problem was significant enough. We shook hands. Then immediately I outlined my first cut solution.

Then in some pleasant evenings for two weeks, sitting with my wife on our bed as she watched TV, I found a good, clean, solid solution. In addition I discovered a new theorem comparable with the famous Whitney extension theorem and applied it to solve my problem and, also, produce some curious additional, new results. I also found that I'd solved a problem stated but not solved in a famous paper in mathematical economics by Arrow, Hurwicz, and Uzawa. Poor Uzawa -- apparently so far he has yet to get his Prize! I published the paper in JOTA, right away, with no significant revisions. So, about four weeks of pleasant, not very hard, work, and I'd met the formal requirement for a Ph.D. dissertation. I used another piece of work I'd done for my Ph.D. dissertation, but that paper did 'polish my halo' in the department.

Net, length of time is not one of the criteria!


Yes, I know how doctorate degrees work and what they mean. It was a joke (as indicated by a smiley) and as with every joke there's a grain of joke in it. Not all PhDs are created equal.


Why would I want to delve into my backup run's log files? Seems pretty counter-intuitive as it's something you expect to work out of the box, not something that you'd have to debug and have issues with.


Because things happen. Like bad sectors leading to internal CRC failures, network drives disappearing, WiFi getting saturated, etc.


I'd love to hear more, if you're comfortable sharing, about the data structure you use for the log viewer. Either way, this looks like an excellent product; thanks for sharing.


FreeFileSync does do dry runs. Can Bvckup 2 copy only the changed portions of files? Does it compare file contents, or just size/modified time? (Or is that an option?)

Agreed that trying to parse the giant logfile dumps of most sync programs is a pain, so kudos for that.


Re: dry runs - my bad, I should've checked more carefully.

Re: copying changed portions only - yes, http://bvckup2.com/#delta

Re: compare file contents - not for deciding whether a file was modified, no. There was just no demand for this. That said, it's easy to add, and it can be done more efficiently than doing a raw comparison between the original and the backup copy. That's because the delta copier computes a file hash as part of the process, so the app can simply go through the source file, redo the hash and see if it's changed since the last run. But as I said, there was zero demand for this, so the comparison is done based on timestamps and the file size. Depending on the file system support, the app automatically selects between comparing just the modified time or both the modified and created times. HTH.
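The timestamp/size comparison described above can be sketched roughly like this. To be clear, this is an illustration of mine, not Bvckup 2's actual code - the snapshot fields and helper names are made up, and a real engine would persist these snapshots in its backup state between runs:

```python
# Illustrative sketch: detect file changes from a stored snapshot of
# size + modified time, optionally also the created time where the
# file system reports it reliably.
import os
import tempfile
import time
from collections import namedtuple

Snapshot = namedtuple("Snapshot", "size mtime ctime")

def take_snapshot(path, use_ctime=False):
    st = os.stat(path)
    return Snapshot(st.st_size, st.st_mtime,
                    st.st_ctime if use_ctime else None)

def needs_copy(path, prev, use_ctime=False):
    """True if the file appears changed since the previous snapshot."""
    return take_snapshot(path, use_ctime) != prev

# Demo: snapshot a file, append to it, and re-check.
fd, path = tempfile.mkstemp()
os.write(fd, b"hello")
os.close(fd)
prev = take_snapshot(path)
time.sleep(0.01)
with open(path, "ab") as f:
    f.write(b" world")
print(needs_copy(path, prev))  # True: the size (and mtime) changed
```

The appeal of this scheme is that a scan is just one `stat` per file - no file contents are read at all unless a change is detected.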


I, too, have found FreeFileSync to be the best option so far for Windows. Changed everything over from SyncToy recently. (I mirror everything I want to back up to a home server, then do cloud backups from there using CrashPlan.)

I may have to give this a try, although it would be nice to see some figures on just how much faster it is, to see whether it would be worth it.


What about Cobian Backup, Duplicity, Duplicati, rsync.net ?


I compiled a few notes on some cloud backup solutions a while ago (2012): http://alicious.com/cloud-backup-solutions/. In the end I use Duplicati with cloud storage (it was Livedrive, but that doesn't work, and it's actually a current todo to move over to Google storage - the backups are encrypted) and CrashPlan with buddy storage.


C'mon, let's just list them all!



