
If you have some data and continuously update it with a function, we usually call that data state. That's what happens when you keep adding tokens to the output. The "story so far" is the state of an LLM-based AI.
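A minimal sketch of what I mean (the `next_token` function here is a hypothetical stand-in for a real model forward pass): each call is a pure function of its input, but the loop keeps folding new tokens back into the "story so far", which is exactly the state-update pattern.

    def next_token(tokens: list[str]) -> str:
        """Stateless: output depends only on the input sequence."""
        # Placeholder for a real model forward pass.
        return "world" if tokens[-1] == "hello" else "<eos>"

    def generate(prompt: list[str], max_steps: int = 10) -> list[str]:
        story_so_far = list(prompt)          # this accumulator is the state
        for _ in range(max_steps):
            tok = next_token(story_so_far)   # pure function of the current state
            if tok == "<eos>":
                break
            story_so_far.append(tok)         # state update: state' = f(state)
        return story_so_far

    print(generate(["hello"]))               # ['hello', 'world']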


'If you have some data and continuously update it with a function, we usually call that data state. That's what happens when you keep adding tokens to the output. The "story so far" is the state of an LLM-based AI.'

You're conflating UX and LLM.


You're being pedantic. While the core token generation function is stateless, that function is not, by a long shot, the only component of an LLM AI. Every LLM system being widely used today is stateful. And it's not only 'UX'. State is fundamental to how these models produce coherent output.
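Rough sketch of the distinction, assuming a hypothetical stateless `complete(messages)` call: the surrounding system keeps conversation history, which is state in the ordinary sense, even though each individual model invocation is pure.

    from typing import Callable

    def make_chat(complete: Callable[[list[dict]], str]):
        history: list[dict] = []                     # system-level state
        def send(user_msg: str) -> str:
            history.append({"role": "user", "content": user_msg})
            reply = complete(history)                # stateless call over the full history
            history.append({"role": "assistant", "content": reply})
            return reply
        return send

    # Toy stand-in for a model call; a real one would hit an LLM API.
    echo = lambda msgs: f"echo: {msgs[-1]['content']}"
    chat = make_chat(echo)
    chat("hi")        # history now holds 2 messages
    chat("again")     # 4 messages: the system remembers, the model call doesn't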


"State is fundamental to how these models produce coherent output."

Incorrect.


I never said LLMs are stateful.


[flagged]


Please don't do flamewar on HN. It's not what this site is for, and destroys what it is for.

https://news.ycombinator.com/newsguidelines.html


Really?

Delete my account.





