schappim | 11 months ago | on: Run Google Gemma 2 2B 100% Locally
What is the advantage of this over running: ollama run gemma2:2b?
mappu | 11 months ago
ollama is a thin wrapper over llama.cpp, so I'd pose the opposite question: what does ollama give you over using llama.cpp directly?
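For illustration, a rough sketch of the direct llama.cpp route, assuming you have already built llama.cpp and downloaded a GGUF build of the model yourself (the file name here is just an example, and flag details can vary between llama.cpp versions):

  # run the llama.cpp CLI against a locally downloaded GGUF file
  ./llama-cli -m ./gemma-2-2b-it-Q4_K_M.gguf -p "Why is the sky blue?" -n 128

The trade-off the thread is discussing: this gives you full control, but you handle model download, quantization choice, and serving yourself, which is what ollama automates.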
schappim | 11 months ago
Model management, customisable HTTP APIs, monitoring, security features, "parallel requests" (batch processing), no requirement for HF auth, etc.
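To illustrate the HTTP API point, a minimal sketch of calling ollama's generate endpoint (ollama serves on localhost:11434 by default; the prompt is just an example):

  curl http://localhost:11434/api/generate -d '{
    "model": "gemma2:2b",
    "prompt": "Why is the sky blue?",
    "stream": false
  }'

With "stream": false the server returns a single JSON response instead of streaming tokens, which is convenient for quick scripting against a locally running model.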
yjftsjthsd-h | 11 months ago
Ease of use. Much like Arduino, or Docker vs chroot/jails/zones: there's nothing wrong with using the underlying tech directly, but lowering friction has value.
umtksa | 11 months ago
I tried both of them, and ollama somehow handled everything better for gemma2.