What do you use when you want to throw a set of documents and/or a nontrivial code base into an LLM workspace and ask questions about it, etc.? What the cloud-based services provide goes way beyond a simple chat interface or mere code completion (as you know, of course).
Now I have all the Python and Markdown files from the current project on my clipboard, in Claude's recommended XML-like format (which I find works well with other models too).
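Getting the files into that shape takes only a few lines of Python. Here's a rough sketch rather than my exact script: the `<documents>`/`<document>`/`<source>`/`<document_contents>` tag names follow what I understand Anthropic's long-context guidance to suggest, so treat the details as assumptions, and the clipboard step itself is left to an external tool:

```python
from pathlib import Path
import sys

def project_to_xml(root: str = ".") -> str:
    """Wrap every .py and .md file under root in Claude-style <document> tags."""
    parts = ["<documents>"]
    files = sorted(p for p in Path(root).rglob("*")
                   if p.is_file() and p.suffix in {".py", ".md"})
    for i, path in enumerate(files, start=1):
        parts.append(f'<document index="{i}">')
        parts.append(f"<source>{path}</source>")
        parts.append("<document_contents>")
        parts.append(path.read_text(encoding="utf-8"))
        parts.append("</document_contents>")
        parts.append("</document>")
    parts.append("</documents>")
    return "\n".join(parts)

if __name__ == "__main__":
    # Print to stdout so the result can be piped to a clipboard tool of your choice.
    sys.stdout.write(project_to_xml(sys.argv[1] if len(sys.argv) > 1 else "."))
```

Piping that output through `pbcopy` (macOS) or `xclip -selection clipboard` (X11) is what actually lands it on the clipboard.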
Then I paste that into the Claude web interface (or into Google's AI Studio if it's too long for Claude) and ask questions there.
Sometimes I'll pipe it straight into my own LLM CLI tool and ask questions that way.
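Roughly along these lines: this is a sketch, not my actual tool, so `my-llm-cli` and the question string are placeholders, but any CLI that reads its prompt from stdin slots in the same way:

```python
import subprocess
import sys

# Hypothetical sketch: read the XML-formatted project dump from stdin, append a
# question, and hand the whole thing to a local LLM command-line tool.
# "my-llm-cli" is a placeholder; substitute whatever CLI wrapper you actually use.
payload = sys.stdin.read()
question = "Give me a high-level overview of this code base and how its modules fit together."
result = subprocess.run(
    ["my-llm-cli"],
    input=f"{payload}\n\n{question}",
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)
```

The nice thing about the stdin pattern is that the same project dump works for both routes, the clipboard/web one and the CLI one.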
Thanks. Google AI Studio isn’t local, I think, is it? I’ll have to test this, but our project sizes and specification documents are likely to run into size limitations for local models (or for the clipboard at the very least ;)). And what I’d be most interested in are big-picture questions and global analyses.