GPT-4 has a 32,000-token limit. GPT-5 will probably have a context window an order of magnitude bigger, and by the seventh or eighth model there may be enough context to fit Twitter's entire codebase or something. Then again, perhaps the context window really is a fundamental limitation of LLMs and a completely different architecture is required. That's alright; Anthropic might make progress with its "constitutional" AI, and if that fails, DeepMind has its own multimodal agent, which is probably running a unique architecture as well.