Hacker News
Demo: Use WebAssembly to Run LLMs on Your Own Device with WasmEdge (youtube.com)
2 points by 3Sophons 8 months ago | hide | past | favorite | 6 comments



Super cool idea, I've been hoping for such a thing!

However, terrible interface!

Firstly, I hate blindly running shell scripts from third-party URLs.

Secondly, yours is awful!

It does too much. It would be better to explain how to set up the Wasm environment and how to download the LLM separately from running the dang thing.
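For what it's worth, the pieces can be split into three manual stages, sketched below as separate functions you can inspect and run one at a time. Assumptions to verify against the WasmEdge/LlamaEdge docs before running: the installer URL, the `wasi_nn-ggml` plugin name, the `llama-chat.wasm` artifact name, and the `--nn-preload` syntax; the model URL is a placeholder, not a real link.

```shell
# Hedged sketch: each stage is its own function so nothing runs blindly.

install_wasmedge() {
  # Fetch the installer to a file first so it can be audited before executing.
  curl -sSfL -o install.sh \
    https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh
  # (read install.sh here before the next line)
  bash install.sh --plugin wasi_nn-ggml
}

download_model() {
  # MODEL_URL is a placeholder: substitute the GGUF model you actually want.
  curl -LO "$MODEL_URL"
}

run_llm() {
  # Mount the current dir and preload the model under the name "default".
  wasmedge --dir .:. \
    --nn-preload default:GGML:AUTO:tinyllama.gguf \
    llama-chat.wasm
}

# Run manually, in order, checking each step's result:
# install_wasmedge; MODEL_URL=... download_model; run_llm
```

The point is less the exact flags than the shape: fetch-then-audit-then-run, with the model download decoupled from both install and execution.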

I'm laboriously separating these parts now to manually troubleshoot the ones that are failing.

Thanks for your efforts, though.

If I get TinyLlama working I'll be piping it through Piper for some speech output!
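That hookup could be little more than a pipe. A hedged sketch, assuming a working LlamaEdge setup: the `llama-chat.wasm` and model filenames, Piper's `--model`/`--output_file` flags, the voice name, and `aplay` for playback are all assumptions to check against each project's docs.

```shell
# Hedged sketch: feed one prompt to the LlamaEdge chat app, hand the
# reply text to Piper for synthesis, then play the resulting WAV.
speak_reply() {
  local prompt="$1"
  echo "$prompt" |
    wasmedge --dir .:. \
      --nn-preload default:GGML:AUTO:tinyllama.gguf \
      llama-chat.wasm |
    piper --model en_US-lessac-medium.onnx --output_file reply.wav
  aplay reply.wav   # or paplay/afplay, depending on OS
}

# speak_reply "Summarize the WasmEdge project in one sentence."
```

One caveat worth noting: a chat app may emit banners or prompts alongside the reply, so some filtering of the wasmedge output before it reaches Piper may be needed.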


Please look at these separated steps, as requested! https://www.secondstate.io/articles/wasm-runtime-agi/


Yo, can you share your email? We should talk.

You will be interested in this:

https://arxiv.org/abs/2110.04913


that's promethean.way /at/ protonmail /dot/ com


Please let us know if you have other concerns or bump into any issues. There is a separate article for each LLM, e.g. the Mixtral MoE: https://www.secondstate.io/articles/mixtral-8-7b/


LlamaEdge revolutionizes the AI/LLM runtime with lightweight (<5 MB), portable, and secure applications for diverse CPUs/GPUs across different OSes, simplifying development and deployment from local to edge to cloud. https://www.secondstate.io/LlamaEdge/




