Now I'm suddenly tempted to write up some kind of program that would automatically archive not just the Wikipedia pages themselves¹, but also their citations and external links (rough sketch below). Or maybe (probably) someone else has already written such a program.

¹ I vaguely recall Wikipedia already providing some way to download all pages in bulk, but I can't seem to find it (if it even still exists, or ever existed outside my imagination).
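In case it helps anyone, here's a rough Python sketch of the "archive the external links" half. It uses two real endpoints, the MediaWiki API's extlinks query and the Wayback Machine's Save Page Now URL; the article title and the 5-second delay are just placeholder choices, and error handling is deliberately minimal:

    """Sketch: ask the Wayback Machine to snapshot every external
    link cited on a Wikipedia article. Not production code."""
    import time
    import requests

    WIKI_API = "https://en.wikipedia.org/w/api.php"
    SAVE_URL = "https://web.archive.org/save/"

    def external_links(title: str) -> list[str]:
        """Fetch all external links on a page, following API continuation."""
        links = []
        params = {
            "action": "query", "format": "json",
            "prop": "extlinks", "titles": title, "ellimit": "max",
        }
        while True:
            data = requests.get(WIKI_API, params=params, timeout=30).json()
            for page in data["query"]["pages"].values():
                links += [el["*"] for el in page.get("extlinks", [])]
            if "continue" not in data:
                return links
            params.update(data["continue"])  # resume where the API left off

    def archive(url: str) -> None:
        """Trigger a Save Page Now snapshot of one URL."""
        resp = requests.get(SAVE_URL + url, timeout=120)
        print(resp.status_code, url)

    if __name__ == "__main__":
        for link in external_links("Hacker News"):  # placeholder title
            archive(link)
            time.sleep(5)  # be polite; Save Page Now rate-limits anonymous use

The hard part in practice would be rate limiting: Save Page Now throttles anonymous requests, so a real version would want retries with backoff (and probably an API key).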

Nice, thanks!