Hacker News

> As for linguistics, IMHO the existence and success of GPT pretty much puts Chomsky into the proven-wrong bucket, so again, not a good example. (His whole point used to be that statistical models can't learn syntax in principle, and GPT's syntax is close to impeccable.)

What do you disagree with? He appears to be correct. The software hasn't learned anything; it mixes and matches based on its training data.

https://m.youtube.com/watch?v=ndwIZPBs8Y4

According to the scientific method, on which the rest of the natural sciences currently rest, GPT is a valid model of syntax: it makes testable predictions about which strings are grammatical, and those predictions hold up.

There are "alternatives" to the method according to some philosophers, but AFAIK none of them is useful to any degree, and they can be considered fringe at this point.
