
Show HN: Archive websites from the command line - flippant
https://github.com/marvelm/erised
======
bnj
At first glance I love this, looking forward to putting it through its paces.
I often use wget to back up local copies of websites, and it's been frustrating
having no easy/automatic system to (a) record when each snapshot was taken and
(b) let me incrementally relate that snapshot to a network of historical info.

~~~
flippant
Thanks for checking out the project. You can also query the underlying
database at ~/.erised/erised.sqlite3 if you prefer SQL over JSON strings.
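For anyone who wants to poke at that database, here's a quick sketch in Python that just lists the tables (I'm not assuming anything about erised's actual schema, only that it's a standard SQLite file at that path):

```python
import sqlite3

def list_tables(db_path):
    """Return the names of all tables in a SQLite database file."""
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"
        ).fetchall()
        return [name for (name,) in rows]
    finally:
        con.close()
```

Then something like `list_tables(os.path.expanduser("~/.erised/erised.sqlite3"))` shows what's there, and from the table names you can start writing real queries.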

------
mark_sz
Cool.

How does it work? What's under the hood?

~~~
flippant
It uses Electron under the hood.

[https://electron.atom.io/](https://electron.atom.io/)

~~~
mark_sz
OK, but does it archive the whole website or just one page (the given URL)?

~~~
flippant
It only archives the one page.

