I made a simple Python tool to download all the JFK files as they're released for the 2025 collection.
It's pretty straightforward if you're comfortable with basic Python setup. I'm using it to test some automated analysis software I'm working on and figured others might want easy access to the documents as they become available.
Feel free to fork or contribute if you're working on similar analysis tools; I'm interested to see what others come up with.
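For anyone curious what a downloader like this involves, here's a minimal sketch. It assumes the files are linked as PDFs from a single listing page; `RELEASE_URL`, `extract_pdf_links`, and `download_all` are illustrative names, not the actual tool's API, and the real release page layout may differ.

```python
# Minimal downloader sketch. RELEASE_URL is an assumed listing page;
# verify the real URL and page structure before relying on this.
import re
import urllib.request
from pathlib import Path

RELEASE_URL = "https://www.archives.gov/research/jfk/release-2025"  # assumption

def extract_pdf_links(html: str, base: str = "https://www.archives.gov") -> list[str]:
    """Pull every href ending in .pdf out of the page HTML."""
    links = re.findall(r'href="([^"]+\.pdf)"', html)
    # Resolve site-relative links against the archive's domain.
    return [l if l.startswith("http") else base + l for l in links]

def download_all(dest: str = "jfk_files") -> None:
    """Fetch the listing page, then download each linked PDF once."""
    Path(dest).mkdir(exist_ok=True)
    with urllib.request.urlopen(RELEASE_URL) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    for url in extract_pdf_links(html):
        out = Path(dest) / url.rsplit("/", 1)[-1]
        if out.exists():  # skip files already fetched on a previous run
            continue
        urllib.request.urlretrieve(url, out)
```

Re-running `download_all()` after each new release only fetches documents you don't already have, which is handy when files trickle out in batches.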
I’m sharing a bash script I built to automate pre-commit file management in Git, particularly for handling large files with Git LFS. I briefly looked for existing solutions but decided this would be a good thing to automate myself as an experiment.
Key features:
- Compresses files over 50MB and automatically tracks them with Git LFS
- Option to create backups before every commit
- Includes dry-run, stash, and revert options
- Experimenting with adding a Git hook to run the script automatically
Right now, the script is focused on macOS, but I’ll update it for Linux as I use it across other repos.
Would love feedback or suggestions from anyone who's dealt with similar issues!
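To give a sense of the core idea, here's a rough sketch of the large-file step in bash. The 50MB threshold and gzip compression match the feature list above, but the function names (`file_size`, `needs_lfs`, `process_staged`) are illustrative, not the script's actual internals.

```shell
#!/usr/bin/env bash
# Sketch of the pre-commit large-file step: compress staged files over
# 50MB and route the compressed copies through Git LFS.

THRESHOLD_BYTES=$((50 * 1024 * 1024))  # 50MB

file_size() {
  # BSD stat (macOS) first, GNU stat as the Linux fallback.
  stat -f%z "$1" 2>/dev/null || stat -c%s "$1"
}

needs_lfs() {
  [ "$(file_size "$1")" -gt "$THRESHOLD_BYTES" ]
}

process_staged() {
  # For each staged file over the threshold: compress it, track the
  # compressed copy with Git LFS, and stage the results.
  git diff --cached --name-only --diff-filter=ACM | while IFS= read -r f; do
    [ -f "$f" ] || continue
    needs_lfs "$f" || continue
    gzip -k "$f"            # -k keeps the original alongside f.gz
    git lfs track "$f.gz"
    git add .gitattributes "$f.gz"
  done
}
```

Wiring this into `.git/hooks/pre-commit` (and making it executable) is what makes it run automatically before each commit, which is presumably what the Git hook experiment mentioned above amounts to.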