mikeqq2024, 4 months ago | on: Show HN: Crawl a modern website to a zip, serve th...

For a long crawling task, if the process exits or breaks for any reason, does it save its progress and resume on the next run?
eXpl0it3r, 4 months ago

The README says:
> Can resume if process exit, save checkpoint every 250 urls
mikeqq2024, 4 months ago

Nice. Better to make it a command-line option with a default value; 250 is too many for large files on a slow connection.
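The suggestion above — a configurable checkpoint interval with a sensible default — could be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the flag name `--checkpoint-interval` and the `crawl`/`save_checkpoint` helpers are all hypothetical.

```python
# Hypothetical sketch of a crawler with a configurable checkpoint interval.
# All names here (--checkpoint-interval, save_checkpoint, crawl) are
# illustrative; they are not the API of the tool being discussed.
import argparse
import json


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="crawler checkpoint options")
    parser.add_argument(
        "--checkpoint-interval",
        type=int,
        default=250,
        help="save progress every N URLs (default: 250)",
    )
    return parser


def save_checkpoint(path: str, visited: set, queue: list) -> None:
    """Persist crawl progress so an interrupted run can resume."""
    with open(path, "w") as f:
        json.dump({"visited": sorted(visited), "queue": queue}, f)


def load_checkpoint(path: str):
    """Load a previous run's state, or start fresh if none exists."""
    try:
        with open(path) as f:
            state = json.load(f)
        return set(state["visited"]), state["queue"]
    except FileNotFoundError:
        return set(), []


def crawl(urls: list, interval: int, checkpoint: str = "checkpoint.json") -> set:
    visited, queue = load_checkpoint(checkpoint)
    queue = queue or list(urls)
    while queue:
        url = queue.pop(0)
        if url in visited:
            continue
        # ... fetch and process url here ...
        visited.add(url)
        if len(visited) % interval == 0:
            save_checkpoint(checkpoint, visited, queue)
    save_checkpoint(checkpoint, visited, queue)  # final state
    return visited
```

With this shape, `crawl(urls, build_parser().parse_args([]).checkpoint_interval)` uses the 250 default, while `--checkpoint-interval 50` trades extra disk writes for less lost work on a slow connection, which is exactly the trade-off raised in the comment.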