
Interesting read. Time wasn't a variable I had considered missing from interactions with AI, but it makes sense.

I'd also add this: tools like the AI bots so prevalent today are flawed because they cannot consider things like context, limitations, dependencies, and scope. I ask a question, and they attempt to spit out a complete answer with complete disregard for the context my question is coming from.

AI fails in the same way a monkey can't drive a car: abstraction. We humans know a red light ahead means stop at the stop light, not stop immediately where you are right now. All AI can do is make a best guess at what the input pattern-matches to. It's like always having an answer without ever asking for clarification or context.

Exactly. What I consider a patch, and definitely a symptomatic solution, is the way this gets "solved" via agents that search the web (e.g. if I ask for this year's weather forecast, the LLM cannot know which year I am referring to except via a web search). Generally speaking, LLMs lack direct temporal awareness: standard models do not represent the flow of time unless explicitly trained to. Some models can encode a model of time when trained on sequential video data, relying on external encoders to provide temporal structure, but that is a very narrow application (video, in this example). It cannot be considered a generic awareness of time as a concept through which facts can change.
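To illustrate the kind of patch I mean: many chat systems simply inject the current date into the prompt so the model can resolve phrases like "this year". A minimal sketch (the prompt wording and function name are my own, purely illustrative):

```python
from datetime import datetime, timezone

def build_system_prompt() -> str:
    # Hypothetical workaround: prepend today's date so the model can
    # ground relative time references ("this year", "last month").
    # The model itself still has no internal notion of "now".
    today = datetime.now(timezone.utc).date()
    return f"You are a helpful assistant. Today's date is {today.isoformat()}."
```

The point is that this grounds only the references the prompt author anticipated; it is context injection, not temporal awareness.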


