Hacker News

I believe it can do that, but there are two periods when it can "think".

1. During training, which is very expensive in terms of input text and money; training runs also often fail outright, which is why you need checkpoints, hyperparameter searches, etc.

2. During inference, but it doesn't have the arbitrary thinking and memory abilities a human does; it only has however much space is in the input tokens plus its weights. Some thoughts can't fit even in an optimal model.

GPT isn't just a model: it also has a sampling system, which is a regular computer program (as opposed to a learned one), and that does give it extra abilities.
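To make that split concrete, here is a minimal sketch of a sampling loop in Python. The "model" below is a toy bigram table standing in for the frozen network (the tokens and probabilities are invented for illustration); everything else — temperature scaling, drawing a token, stopping — is ordinary hand-written code, which is the part the comment calls "a regular computer program".

```python
import math
import random

# Toy stand-in for a trained model: a fixed table of next-token
# probabilities. Like real model weights, it is frozen at inference time.
BIGRAMS = {
    "the": {"cat": 0.5, "dog": 0.3, "end": 0.2},
    "cat": {"sat": 0.6, "ran": 0.2, "end": 0.2},
    "dog": {"ran": 0.7, "end": 0.3},
    "sat": {"end": 1.0},
    "ran": {"end": 1.0},
}

def sample_next(token, temperature=1.0, rng=random):
    """Non-learned sampling code: rescale the model's probabilities
    by temperature, renormalize, and draw one token."""
    logits = {t: math.log(p) / temperature for t, p in BIGRAMS[token].items()}
    z = sum(math.exp(l) for l in logits.values())
    probs = {t: math.exp(l) / z for t, l in logits.items()}
    r = rng.random()
    acc = 0.0
    for t, p in probs.items():
        acc += p
        if r < acc:
            return t
    return t  # floating-point fallback

def generate(start, max_tokens=10, temperature=1.0, seed=0):
    """The outer loop is also plain code: feed the last token back in,
    sample, and stop on an end marker or a length limit."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(max_tokens):
        nxt = sample_next(out[-1], temperature, rng)
        if nxt == "end":
            break
        out.append(nxt)
    return out
```

With a low temperature the loop behaves almost greedily, e.g. `generate("the", temperature=0.01, seed=0)` yields `["the", "cat", "sat"]`; raising the temperature flattens the distribution and varies the output. Swapping in beam search or top-k here changes the model's apparent abilities without touching the weights at all.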



