> You can use any GGUF quants created by the community (bartowski, MaziyarPanahi and many more) on Hugging Face directly with Ollama, without creating a new Modelfile.
> [...]
> Getting started is as simple as:
> ollama run hf.co/{username}/{repository}
> Please note that you can use both hf.co and huggingface.co as the domain name.
And the quant selection is nice too. I'd initially wondered why the title was worth announcing, since I've been feeding Ollama arbitrary GGUFs for a while, but that required writing a (trivial, one-line) Modelfile, so this is a nice improvement.
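For context, the old workflow the comment alludes to looked roughly like this: write a one-line Modelfile pointing at a locally downloaded GGUF, then register it with `ollama create`. A minimal sketch (the model filename below is a hypothetical example, not from the announcement):

```
# Modelfile — point Ollama at a locally downloaded GGUF file
FROM ./Mistral-7B-Instruct-v0.3.Q4_K_M.gguf
```

followed by `ollama create my-model -f Modelfile` and `ollama run my-model`. With the direct Hugging Face syntax, both steps collapse into the single `ollama run hf.co/{username}/{repository}` command, and my understanding is that a specific quant can be chosen by appending its tag, e.g. `:Q4_K_M`, assuming the repository publishes a file with that quantization.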