Ollama can run any GGUF Model on Hugging Face Hub now (huggingface.co)
15 points by djhu9 16 days ago | 1 comment



Oh, that makes more sense:

> You can use any GGUF quants created by the community (bartowski, MaziyarPanahi and many more) on Hugging Face directly with Ollama, without creating a new Modelfile.

> [...]

> Getting started is as simple as:

> ollama run hf.co/{username}/{repository}

> Please note that you can use both hf.co and huggingface.co as the domain name.

And the quant selection is nice too. I'd initially wondered why this was worth announcing, since I've been feeding Ollama arbitrary GGUFs for a while, but that required writing a (trivial one-line) Modelfile, so this is a nice improvement; a quick old-vs-new sketch is below.
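For anyone comparing the two workflows, roughly what this replaces (the repo and quant names here are just examples, and the tag-based quant selection is as described in the linked post):

    # Old route: download a GGUF, point a one-line Modelfile at it,
    # then register it with `ollama create`
    echo 'FROM ./Llama-3.2-3B-Instruct-Q4_K_M.gguf' > Modelfile
    ollama create llama3.2-3b-q4 -f Modelfile
    ollama run llama3.2-3b-q4

    # New route: pull straight from the Hugging Face Hub, no Modelfile
    ollama run hf.co/bartowski/Llama-3.2-3B-Instruct-GGUF

    # Picking a specific quant (per the linked post) by appending a tag
    ollama run hf.co/bartowski/Llama-3.2-3B-Instruct-GGUF:Q4_K_M

Without the tag, Ollama falls back to a default quant from the repo, so the tag is only needed when you want a specific size/quality trade-off.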



