This is a cool idea! A service that allows you to dump information and then query it with an LLM seems really useful!
It seems a bit expensive for the amount of storage you get, though... Is there a reason for that? I don't know much about LLMs, but do they need to generate a lot of additional data from the user's dataset to support querying?