Hacker News | tygra's comments

That's dark, baby!

You can buy a good laptop and run open-source LLMs locally. 100% private. Disable its network. Stay offline.

Time to stop feeding the machine!

YOU CAN!

;)


Obviously you need decent hardware to run LLMs locally, but you don’t need a super high-end computer just to host qwen3:30b or gpt-oss:20b. Those models are already pretty solid for writing and coding.


No, I'm not talking about serious medical issues. I mean uploading your blood test results and asking an AI, "Give me specific nutrition, lifestyle, and supplementation recommendations."

For work, as an employee, sure, it's easy to say the company approved ChatGPT or Gemini, so you can go ahead and upload, for example, usage data to get a retention analysis. But what if you're the employer?


We are talking about sensitive data, like bank statements, blood test results, and citizenship documents.


Oh, wow.

"All these sites may as well have all my information." I don't think so. That's scary!

But you do have a choice.

If you want to upload your bank statements and ask an AI, "How did my spending habits change in Q2 compared to Q1?" or upload your blood test results and ask, “Give me specific nutrition, lifestyle, and supplementation recommendations," you can run LLMs locally. No data will leave your computer.
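As a concrete sketch of the bank-statement example: assuming your transactions are exported as a simple CSV with `date` and `amount` columns (the format and column names are illustrative assumptions, not any particular bank's export), the Q1-vs-Q2 comparison is just a group-by on the quarter, which you can do entirely offline:

```python
import csv
import io

# Hypothetical sample statement; in practice you'd open your exported CSV file.
statement = io.StringIO(
    "date,amount\n"
    "2024-01-15,120.00\n"
    "2024-02-03,80.50\n"
    "2024-04-10,200.00\n"
    "2024-05-22,150.25\n"
)

totals = {}  # quarter label -> total spend
for row in csv.DictReader(statement):
    month = int(row["date"].split("-")[1])
    quarter = f"Q{(month - 1) // 3 + 1}"
    totals[quarter] = totals.get(quarter, 0.0) + float(row["amount"])

print(totals)  # {'Q1': 200.5, 'Q2': 350.25}
print(f"Q2 vs Q1 change: {totals['Q2'] - totals['Q1']:+.2f}")
```

A local LLM can generate or refine exactly this kind of script for you, so the raw statement never has to leave your machine.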


Work: Because your company is using Microsoft 365, you can upload customer data to ChatGPT - for example, for revenue reporting. Am I understanding you correctly?

Personal: So, you download and run open-source LLMs on your computer?


We have confidentiality agreements with Microsoft that are worth whatever lawyers think they are worth. Using ChatGPT through the company Azure accounts is fine, and the enterprise Microsoft Copilot is fine too, but standard ChatGPT isn't fine.

Personally, no, I don't use LLMs for confidential data; the local ones that can run on my personal computers aren't good enough.


Understood. Makes sense.

Give qwen3:30b and gpt-oss:20b a try. You don't need a fancy GPU, just a modern CPU. Those models are already pretty solid for analyzing your personal data.


Oh yeah, I've been there!


Interesting.

If you can run an open-source LLM locally on your own computer, completely offline, and use it for legal, finance, or medical topics, would you still say no to that?


I can and have, when I was required to. It was slow and had worse results than I had hoped, probably because I don't have enough VRAM for the big open-source models, so I was using 8B ones.

I left this bit out because my original comment was getting long, but I think it's important to be respectful of others' privacy wishes. So I didn't use an API when it concerned other people.


Nice.

Of course, you need decent hardware to run LLMs locally, but you don’t need a super high-end computer to host qwen3:30b or gpt-oss:20b. You don’t even need a GPU for those models, as long as you’ve got a modern CPU. And they’re already pretty solid for writing and coding.


Wait… why would anyone even upload a file containing passwords or secret keys to ChatGPT or Gemini?!


Anonymization is a good idea.

Just last week I wanted to do some financial analysis with ChatGPT (GPT-5), but I wasn't comfortable uploading the CSV with all my transaction data. Instead, I used Qwen3 running locally on my MacBook to anonymize the data first, and then uploaded the sanitized version to ChatGPT.
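A minimal sketch of that anonymization step, done deterministically rather than with an LLM (a salted-hash substitution is a swapped-in technique for illustration; the `merchant` column name and the sample data are assumptions). The point is the same: identifying fields get replaced with stable, opaque tokens before anything leaves the machine, and the same merchant always maps to the same token so the analysis still works:

```python
import csv
import hashlib
import io

SALT = "local-secret"  # keep this offline; it makes tokens stable across files

def tokenize(value: str) -> str:
    """Replace an identifying value with a stable, non-reversible token."""
    digest = hashlib.sha256((SALT + value).encode()).hexdigest()[:8]
    return f"merchant_{digest}"

# Hypothetical transaction export; in practice you'd read your real CSV.
raw = io.StringIO(
    "date,merchant,amount\n"
    "2024-04-01,Acme Grocers,54.10\n"
    "2024-04-03,City Dental,210.00\n"
    "2024-04-09,Acme Grocers,12.99\n"
)

sanitized = io.StringIO()
reader = csv.DictReader(raw)
writer = csv.DictWriter(sanitized, fieldnames=reader.fieldnames)
writer.writeheader()
for row in reader:
    row["merchant"] = tokenize(row["merchant"])  # only this column is sensitive here
    writer.writerow(row)

print(sanitized.getvalue())  # same rows, merchants replaced by opaque tokens
```

Keep the salt (or a token-to-name mapping) locally if you ever need to translate the cloud model's answer back to real merchant names.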

¯\_(ツ)_/¯

