Hi HN,
I’m a B.Tech student from India. I built Recall because my desktop was a disaster of downloaded PDFs and screenshots, but I didn't feel comfortable uploading my personal documents to a cloud-based AI just to get them organized.
What it is: A desktop app that runs 100% locally. It uses Ollama (Llama 3.2) to analyze file content, generate context-aware folder names, and move files automatically. I also recently added a local RAG (Retrieval-Augmented Generation) chat so I can query my notes without internet access.
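For anyone curious about the organize step, it boils down to: read the file's text, ask the model for a folder name, then move the file. Here's a minimal sketch of that loop — the names are illustrative, not the actual Recall code, and the keyword stub stands in for the real call out to Ollama:

```python
from pathlib import Path
import shutil

def suggest_folder(text: str) -> str:
    # Stand-in for the LLM call: the real app sends the file's content
    # to a local model and asks for a short, context-aware folder name.
    keywords = {"invoice": "Finance", "lecture": "Coursework", "recipe": "Cooking"}
    for word, folder in keywords.items():
        if word in text.lower():
            return folder
    return "Misc"

def organize(file_path: Path, root: Path) -> Path:
    # Read the file, pick a destination folder, create it, and move the file.
    text = file_path.read_text(errors="ignore")
    dest_dir = root / suggest_folder(text)
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / file_path.name
    shutil.move(str(file_path), dest)
    return dest
```

The nice part of doing this locally is that a bad suggestion only costs you a `mv`, not an upload.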
The Stack:
Backend: Python
Inference: Ollama (running Llama 3.2 or Mistral)
GUI: CustomTkinter
Storage: Local ChromaDB (for the chat vectors)
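The RAG chat is conceptually simple: embed the note chunks, store the vectors, and at query time retrieve the nearest chunks to prepend to the prompt. A toy sketch of the retrieval step — bag-of-words counts instead of real embeddings, and plain cosine similarity instead of ChromaDB's index, just to show the shape of it:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words term counts. The real app uses a
    # proper embedding model and persists the vectors in ChromaDB.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    # Rank stored chunks by similarity to the query; the top-k chunks
    # get stuffed into the LLM prompt as context.
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]
```

Swapping the toy pieces for a real embedding model plus ChromaDB gives you the same flow with much better recall.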
Installation Note for Mac Users: Since I'm a student developer without an Apple Developer certificate, macOS may flag the app as being from an unverified developer. This is a standard Gatekeeper check.
The Fix (2 methods):
GUI: Go to System Settings > Privacy & Security and click "Open Anyway" next to the blocked-app message.
Terminal: run xattr -d com.apple.quarantine /path/to/Recall.app (substitute the path where you put the app) to clear the quarantine flag.
Happy Thanksgiving! I’ve been reading HN for a while but only recently started participating. It’s rare to find a corner of the internet that has maintained this level of thoughtful discourse for so long. Huge thanks to the mods for their hard work.