Hacker News | wanglet33's comments

The sandbox approach mentioned by @arty_prof is essential, but there's also the other side of the coin: data leakage. If an LLM agent has filesystem access to "help" with code, it effectively has a map of your credentials. Aside from Dockerizing everything, are people running local, air-gapped LLMs for sensitive security logic to eliminate the "phone home" risk entirely? Curious whether anyone has successfully integrated something like Ollama into their dev flow for this specific reason.
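For context on what I mean by "integrated": the basic loop with Ollama is just a POST to its local HTTP API, so prompts never leave the machine. A minimal Python sketch, assuming Ollama is running on its default port (the model name and prompt here are placeholders, not a recommendation):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_request(prompt: str, model: str = "codellama") -> dict:
    """Build the JSON payload for a single, non-streaming completion."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_llm(prompt: str) -> str:
    """Send the prompt to the local Ollama daemon; nothing leaves the machine."""
    payload = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(ask_local_llm("Review this auth middleware for injection risks: ..."))
```

The point being: since the endpoint is loopback-only by default, you can firewall the dev box outright and the agent still works, which is the whole appeal for sensitive code review.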

This is a cool concept for prototyping! I've spent the last few weeks manually building out a P2P vault app (using PeerJS and WebRTC), and the "last mile" of deployment and cross-platform permissions always takes 10x longer than I expect. If your tool really handles deployment to a live URL that seamlessly, it's a huge time-saver. Does it handle local browser storage (IndexedDB/LocalStorage) logic well yet, or is it mostly focused on UI generation?

