
I like the idea of this and the use case, but I don't love the tight coupling to OpenAI. I'd love to see a framework that allows BYOM (bring your own model).




It's been 2.5 years since ChatGPT came out, and so many projects still don't allow for easy switching of OPENAI_BASE_URL or related parameters.

There are so many inference libraries that serve an OpenAI-compatible API that any new project locked into OpenAI alone is a big red flag for me.
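To illustrate the point: the reason OpenAI-compatible servers (vLLM, llama.cpp, LM Studio, etc.) are drop-in replacements is that clients only need an overridable base URL, with everything else about the request unchanged. A minimal sketch of that pattern, using a hypothetical helper name and the standard OPENAI_BASE_URL environment variable:

```python
import os

# Hypothetical helper: resolve the chat-completions endpoint from an
# overridable base URL. This is the whole trick behind "OpenAI-compatible"
# backends: same paths and payloads, different host.
def chat_completions_url(base_url=None):
    base = base_url or os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
    return base.rstrip("/") + "/chat/completions"

# Default targets OpenAI; passing (or exporting) a different base URL
# points the exact same client code at a local vLLM or llama.cpp server.
print(chat_completions_url())
print(chat_completions_url("http://localhost:8000/v1"))
```

A project that hardcodes the host instead of reading it from config loses this for free.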


Thanks for the feedback! Totally hear you on the tight OpenAI coupling - we're aware and already working to make BYOM easier. Just to echo what Zecheng said earlier: broader model flexibility is definitely on the roadmap.

Appreciate you calling it out — helps us stay honest about the gaps.


Yes, there is a roadmap to support more models. For now there is an in-progress PR to support Anthropic models, https://github.com/traceroot-ai/traceroot/pull/21 (contributed by some active open-source contributors). Feel free to let us know which (open source) model or framework (vLLM, etc.) you'd like to use :)

Why not use something like litellm?

That's also an option; we'll consider adding it later :)

Adding model provider abstraction would significantly improve adoption, especially for organizations with specific LLM preferences or air-gapped environments that can't use OpenAI.
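For what that abstraction could look like: a thin provider interface keeps vendor SDKs out of application code, so an air-gapped deployment can register an on-prem backend without touching anything else. A minimal sketch with hypothetical names (ChatProvider, make_provider, EchoProvider are illustrative, not from the project):

```python
from dataclasses import dataclass
from typing import Protocol

# One-method interface: the rest of the app depends on this, never on
# a specific vendor SDK.
class ChatProvider(Protocol):
    def complete(self, prompt: str) -> str: ...

@dataclass
class EchoProvider:
    # Stand-in for an on-prem model in an air-gapped environment;
    # a real implementation would call vLLM, Ollama, etc.
    prefix: str = "local"

    def complete(self, prompt: str) -> str:
        return f"[{self.prefix}] {prompt}"

def make_provider(name: str) -> ChatProvider:
    # Registry keyed by a config string: supporting a new provider
    # means adding one entry here, nothing else changes.
    registry = {"echo": EchoProvider}
    return registry[name]()
```

Organizations with a preferred LLM then select it via config rather than a code change.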

Yep, you're spot on - and we're hearing this loud and clear across the thread. Model abstraction is on the roadmap, and we're already working on making BYOM smoother.


