By multi-device (mobile, tablet, desktop) I guess we cover cross-platform, but the idea is to run on any device/platform with access to WhatsApp, with one session running as the "server" (the desktop one). This server is the user of a WhatsApp number (connected using wa-js) and is also connected to an LLM of choice.
The current use case is the most basic possible: I'm using it to keep a daily log, take notes, and ask general questions. The LLM then saves some of the data it extracts from the notes into collections that I plan to use later. It's still quite a new project.
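The flow above (messages come in from any device, the "server" session routes them through an LLM, and extracted data lands in collections) can be sketched roughly as below. This is a hypothetical illustration, not the project's actual code: `extractNotes` stands in for the LLM extraction step (here a trivial tag parser), and `handleMessage` stands in for whatever wa-js message hook the server session uses.

```typescript
// Sketch of the note-extraction flow. Names here (extractNotes,
// handleMessage, collections) are illustrative placeholders, not wa-js APIs.

type Note = { date: string; text: string };

// In-memory stand-in for the saved "collections" of extracted data.
const collections: Map<string, Note[]> = new Map();

// Stand-in for the LLM extraction step: a trivial parser that files lines
// starting with "todo:" or "idea:". A real setup would prompt an LLM instead.
function extractNotes(text: string): { collection: string; note: Note }[] {
  const out: { collection: string; note: Note }[] = [];
  for (const line of text.split("\n")) {
    const m = line.match(/^(todo|idea):\s*(.+)$/i);
    if (m) {
      out.push({
        collection: m[1].toLowerCase(),
        note: { date: new Date().toISOString().slice(0, 10), text: m[2] },
      });
    }
  }
  return out;
}

// Called by the server session whenever a WhatsApp message arrives.
function handleMessage(text: string): void {
  for (const { collection, note } of extractNotes(text)) {
    if (!collections.has(collection)) collections.set(collection, []);
    collections.get(collection)!.push(note);
  }
}

// Example: a daily note sent from any device to the "server" session.
handleMessage("todo: review weekly goals\nidea: small-model router");
```

In the real project the incoming text would come from a wa-js event listener and the extraction from an LLM call; the point is just that the server session is the single place where messages, the LLM, and the stored collections meet.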
OpenELM, a family of efficient language models developed by Apple, is trending on Hugging Face!
OpenELM offers models from 270M to 3B parameters, both pre-trained and instruction-tuned, with good results across various benchmarks.
My Feedback:
First Phi-3, now OpenELM. It's great to see these small models improving. I know they're not ready for production in all cases, but they're really great for specific tasks.
I see small open-source models as the future because they offer better speed, require less compute, and use fewer resources, making them more accessible and practical for a wider range of applications.
What do you think about this? Would you consider using small open-source models? If so, what are you thinking of building?