
For a long crawling task, if the process exits or breaks for any reason, does it save progress and resume on the next run?



The README says:

> Can resume if process exit, save checkpoint every 250 urls


Nice. Better to make it a command-line option with a default value, though; 250 is too many for large files on a slow connection.
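For what it's worth, a checkpoint-every-N loop with a configurable interval is a small amount of code. Below is a minimal Python sketch of the idea: the flag name, checkpoint file format, and function names are all hypothetical, not taken from the project.

```python
import argparse
import json
import os

def parse_args(argv=None):
    # Hypothetical CLI: expose the checkpoint interval as a flag,
    # keeping 250 (the README's value) as the default.
    parser = argparse.ArgumentParser()
    parser.add_argument("--checkpoint-every", type=int, default=250,
                        help="save a checkpoint after every N urls")
    parser.add_argument("--checkpoint-file", default="checkpoint.json")
    return parser.parse_args(argv)

def crawl(urls, checkpoint_path, checkpoint_every):
    """Crawl urls, skipping any already recorded in the checkpoint file."""
    done = set()
    if os.path.exists(checkpoint_path):
        with open(checkpoint_path) as f:
            done = set(json.load(f))

    def save():
        with open(checkpoint_path, "w") as f:
            json.dump(sorted(done), f)

    processed = 0
    for url in urls:
        if url in done:
            continue  # already fetched on a previous run
        # fetch(url) would go here
        done.add(url)
        processed += 1
        if processed % checkpoint_every == 0:
            save()
    save()  # final flush so the last partial batch is not lost
    return processed
```

On a slow connection a smaller interval just means a bit more disk I/O per URL, so letting the user tune it costs almost nothing.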



