GPT-4, per the API docs, has both an 8K and a 32K version; I wouldn't be surprised if they're only putting the smaller one up on the web interface for resource reasons, but that doesn't explain an apparent 4K limit.
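For what it's worth, a rough way to check where you're actually getting cut off is to count the prompt's tokens locally with tiktoken rather than guessing from the error. This is just a sketch; the "gpt-4" model name passed to encoding_for_model and the placeholder prompt are assumptions, and the GPT-4-era models all use the cl100k_base encoding anyway:

    # Rough sketch: count tokens locally to see which limit a prompt would hit.
    # Assumes the tiktoken package is installed (pip install tiktoken).
    import tiktoken

    def count_tokens(text: str, model: str = "gpt-4") -> int:
        """Return how many tokens `text` uses under the given model's encoding."""
        try:
            enc = tiktoken.encoding_for_model(model)
        except KeyError:
            # Fall back to the encoding used by GPT-4-era chat models.
            enc = tiktoken.get_encoding("cl100k_base")
        return len(enc.encode(text))

    prompt = "some text " * 2000  # placeholder: swap in the prompt that got rejected
    n = count_tokens(prompt)
    print(n, "tokens")
    print("fits 4K:", n <= 4096, "| fits 8K:", n <= 8192, "| fits 32K:", n <= 32768)

If a prompt well under 8K tokens is being refused in the web UI, that would point to the interface enforcing a smaller limit than the underlying model supports.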
I just ran into the same thing. I thought it had way more tokens than GPT-3.5... That didn't seem to be the case in the demo video I was watching earlier.