That would make each API call cost at least $3 (at $3 per million input tokens), since a stateless chat API resends the full context on every call. And with a 10-message interaction, you're looking at $30+ for the conversation. Is that what you would expect?
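To make the arithmetic concrete, here's a rough sketch, assuming roughly a million input tokens resent per turn at the $3/million rate (both figures taken from the numbers above, not from any official pricing page):

```python
# Rough cost sketch: a stateless chat API resends the full context each turn,
# so input-token cost accrues on every call, not once per conversation.
PRICE_PER_MTOK = 3.00        # $ per million input tokens (assumed rate)
CONTEXT_TOKENS = 1_000_000   # tokens resent on every call (assumed size)
TURNS = 10

per_call = (CONTEXT_TOKENS / 1_000_000) * PRICE_PER_MTOK
total = per_call * TURNS
print(f"${per_call:.2f} per call, ${total:.2f} for {TURNS} turns")
# prints "$3.00 per call, $30.00 for 10 turns"
```

In practice the context grows each turn, so the real total would be somewhat higher than this flat estimate.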
Gemini 1.5 Pro charges $0.35/million tokens for prompts up to one million tokens, or $0.70/million for prompts longer than that, and it supports a multi-million-token context window.
Substantially cheaper than $3/million, but I guess Anthropic’s prices are higher.
Is it, though? In my limited tests, Gemini 1.5 Pro (through the API) is very good at tasks involving long context comprehension.
Google's user-facing implementations of Gemini are pretty consistently bad when I try them out, so I understand why people might have a bad impression of the underlying Gemini models.