As a person I can at least tell you what I do and don't understand about something, ask questions to improve/correct my understanding, and truthfully explain my perspective and reasoning.

The machine model is not only a black box, but one incapable of understanding anything about its input, "thought process", or output. It will blindly spit out a response based on its training data and weights, without knowing whether it is true or false, meaningful or complete gibberish.




As a person, you can tell what you think you do and don't understand, and you can explain what you think your reasoning is. In practice, people get both wrong all the time. People aren't always truthful about it, either, and there's no reliable way to tell if they are.


