It seems obvious to me, but there was a camp that believed, at least at one time, that probabilistic next-token prediction could effectively be what humans are doing anyway, just scaled up several more orders of magnitude. It always felt obvious to me that there is more to human cognition than very sophisticated pattern matching, so I'm not surprised these approaches are hitting walls.