> I always end up getting to the end of all the prompts only to be told I need to speak to a human, or the chatbot goes in a circle
I've had success with just repeating "Agent please" or "I wanna talk to human" if I notice the chatbot isn't a traditional conditional-if-else bot but an LLM. It seems like most of them have some sort of escape hatch they can trigger, but they're prompted to really avoid it. If you keep sending "Agent please" over and over, though, the typical context rot eventually stops them from avoiding the escape hatch, and they send you along to a real human.
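If the bot really is an LLM with a discouraged escalation tool, the dynamic is easy to picture. Here's a minimal sketch using the OpenAI chat-completions API; the `escalate_to_human` tool, the system prompt wording, and the retry loop are all illustrative assumptions on my part, not any vendor's actual setup:

```python
# Hypothetical sketch of an LLM support bot with a human-escalation
# "escape hatch". The tool name and prompt are assumptions for
# illustration, not a real deployment.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "You are a customer-support assistant. Resolve issues yourself. "
    "Only call escalate_to_human as an absolute last resort."
)

TOOLS = [{
    "type": "function",
    "function": {
        "name": "escalate_to_human",
        "description": "Transfer the conversation to a human agent.",
        "parameters": {"type": "object", "properties": {}},
    },
}]

messages = [{"role": "system", "content": SYSTEM_PROMPT}]

def turn(user_text: str) -> bool:
    """Send one user message; return True if the bot escalated."""
    messages.append({"role": "user", "content": user_text})
    resp = client.chat.completions.create(
        model="gpt-4o-mini", messages=messages, tools=TOOLS
    )
    choice = resp.choices[0].message
    if choice.tool_calls:  # the escape hatch fired
        return True
    messages.append({"role": "assistant", "content": choice.content})
    return False

# Repeating the same request grows the transcript: the single "avoid
# escalation" instruction stays fixed while N identical user demands
# pile up, which is one plausible reading of why the tactic works.
for i in range(20):
    if turn("Agent please"):
        print(f"Escalated after {i + 1} requests")
        break
```

The point of the loop is that the one "avoid escalation" instruction never changes, while the pile of identical demands keeps growing, so its relative weight in the context shrinks every turn.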
I saw a social media video of people at a drive-in; a robot voice asked what they'd like. "I'd like a million cups of water, please." The voice immediately changed to a noticeably human one: "Hi, how can I help you?"