
Can it be adapted to use Ollama? It seems like a good tool to set up locally as a navigation tool.



Yes, you can certainly use Ollama! However, we strongly recommend using a more capable model to get reliable results. Check out the external_client.ts file in examples/ that shows how to set up a custom LLMClient: <https://github.com/browserbase/stagehand/blob/main/examples/...>
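
For reference, a minimal sketch of what the Ollama side of such a custom client might look like, assuming Ollama's OpenAI-compatible endpoint on localhost:11434 and the openai Node package. The model name and how this gets wired into Stagehand's LLMClient are illustrative assumptions, not the project's actual API; see external_client.ts for that.

    import OpenAI from "openai";

    // Ollama exposes an OpenAI-compatible API at /v1; the apiKey is unused
    // locally but the SDK requires a value.
    const ollama = new OpenAI({
      baseURL: "http://localhost:11434/v1",
      apiKey: "ollama",
    });

    // A custom LLMClient would forward its chat requests through a client
    // like this; "llama3.1" is an assumed locally pulled model.
    const response = await ollama.chat.completions.create({
      model: "llama3.1",
      messages: [{ role: "user", content: "Extract the page title." }],
    });

    console.log(response.choices[0].message.content);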


It doesn’t look like external projects can access the LLMClient for this in the latest release, since that example relies on being inside the project itself. (At least when working through the quick start guide.)


We accidentally didn't release the right types for LLMClient :/ However, if you set the version in package.json to "alpha", it will install what's on the main branch on GitHub, which should include the typing fix.
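
For anyone following along, that change looks something like this in package.json (the "@browserbasehq/stagehand" package name is assumed here; use whatever name your project already depends on):

    {
      "dependencies": {
        "@browserbasehq/stagehand": "alpha"
      }
    }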


Yeah, I saw it was a recent change in your GitHub repo and was happily running your examples.

To be honest, it took about 2 minutes of playing around before I got annoyed with the inaccuracies of the locally hosted model for that, so I get why you encourage the other approaches.



