And that's not a coincidence. That's not what the word "coincidence" means. It's a complete misunderstanding of how these tools work.




I don't think you're in any position to claim "complete misunderstanding" when you yourself claim that training an LLM on regulations would produce a system capable of answering questions about those regulations.

> you claim that training an LLM on regulations would produce a system capable of answering questions about those regulations.

Huh? But it does do that? What do you think training an LLM entails?

Are you of the belief that an LLM trained on non-medical data would have the same statistical chance of answering a medical question correctly?
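For concreteness, here's a minimal sketch of what "training an LLM on domain text" entails, assuming the Hugging Face transformers and datasets libraries; "gpt2" and "regulations.txt" are placeholders, not anything from this thread. Fine-tuning a causal language model on a corpus shifts its next-token distribution toward that corpus, which is the mechanism behind the claim:

    # Minimal sketch: continued pretraining / fine-tuning a causal LM on
    # domain text. Model name and corpus file are hypothetical placeholders.
    from transformers import (
        AutoModelForCausalLM,
        AutoTokenizer,
        DataCollatorForLanguageModeling,
        Trainer,
        TrainingArguments,
    )
    from datasets import load_dataset

    model_name = "gpt2"  # placeholder; any causal LM checkpoint works
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # regulations.txt stands in for a corpus of regulatory text.
    dataset = load_dataset("text", data_files={"train": "regulations.txt"})

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, max_length=512)

    tokenized = dataset["train"].map(
        tokenize, batched=True, remove_columns=["text"]
    )

    trainer = Trainer(
        model=model,
        args=TrainingArguments(
            output_dir="regs-lm",
            num_train_epochs=1,
            per_device_train_batch_size=2,
        ),
        train_dataset=tokenized,
        # mlm=False -> standard causal (next-token) language modeling objective
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()

Whether one pass of fine-tuning is enough to answer questions reliably is a separate empirical question; the point is only that the training data changes the model's output distribution, so domain training raises the odds of correct domain answers.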

We're at the "redefining what words mean so I don't have to admit I was wrong" stage of this argument.



