Hacker News
vorticalbox | 5 months ago | on: Open source AI is the path forward
If you have the RAM for it. Ollama will offload as many layers as it can to the GPU, then run the rest on the CPU from system RAM.
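The split can also be capped explicitly. A minimal sketch using Ollama's documented `num_gpu` Modelfile parameter (the model name here is illustrative; the right layer count depends on your VRAM):

```
FROM llama3
# num_gpu caps how many layers Ollama offloads to the GPU;
# any layers beyond this run on the CPU out of system RAM.
PARAMETER num_gpu 20
```

Leaving `num_gpu` unset keeps the default behavior described above: Ollama offloads as many layers as fit in VRAM on its own.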