Replit just released a report on OpenAI's dominance in AI-related projects & the growing importance of HuggingFace and LangChain.
The competition between HuggingFace & Replicate on open source model hosting looks fierce. But I was mostly surprised by the evolution of JavaScript vs Python over the last few quarters: ggml, llama.cpp and the like offer inference speed gains, but for environments that aren't JavaScript-driven. Does the JavaScript growth come from high-level wrappers around ML hosting platforms' APIs?
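If that's the case, the pattern would look roughly like this (a minimal sketch, assuming Node 18+ with global fetch and the HuggingFace hosted Inference API endpoint shape; the model name and env var are placeholders, not anything from the report):

```typescript
// Sketch of a thin JavaScript wrapper: calling a hosted model over HTTP
// instead of running inference locally (no ggml/llama.cpp involved).

const HF_TOKEN = process.env.HF_TOKEN; // assumed to be set in the environment

async function generate(prompt: string): Promise<string> {
  const res = await fetch(
    // Model name is illustrative; any text-generation model hosted on the
    // Inference API would follow the same URL shape.
    "https://api-inference.huggingface.co/models/gpt2",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${HF_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ inputs: prompt }),
    }
  );
  const data = await res.json();
  // Text-generation responses come back as an array of { generated_text } objects.
  return data[0].generated_text;
}

generate("Explain ggml in one sentence.").then(console.log);
```

Nothing about that requires Python, which would explain JS growth that's decoupled from where the actual inference work happens.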