Llama Run Update: Cloud-Based AI Inference with Next.js and Google Gemini (llamarun.vercel.app)
2 points by KrishBaidya 5 months ago | 1 comment


Hey everyone, I'm thrilled to share a pretty big update to Llama Run! For those who don't know, Llama Run is an AI desktop assistant I've been working on, designed to help you automate tasks using your own custom Python plugins. And now it's getting a serious upgrade: cloud-based AI.

So, what's new?

- Snappier Performance: We've rebuilt parts of Llama Run with Next.js and are hosting it on Vercel. The app is faster and more responsive, especially when running your automated tasks and custom plugins.

- Cloud AI Power: We've connected Llama Run to Google Gemini and OpenAI. This lets you tap into powerful AI for things like generating intelligent responses or handling complex processes, and it all happens in the cloud for speed and scalability.

- Secure Authentication: We're now using Firebase for user authentication, which makes it easy and secure to connect the app to those cloud services.
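To make the cloud-inference idea concrete, here's a rough Python sketch of what a call to Gemini can look like. All of the names here (build_prompt, run_inference, the GEMINI_API_KEY variable) are my own illustration, not Llama Run's actual code, and the real app routes these calls through its Next.js backend:

```python
# Hypothetical sketch of cloud inference with Google Gemini.
# Assumes `pip install google-generativeai` and a GEMINI_API_KEY env var.
import os

def build_prompt(task: str, plugin_output: str) -> str:
    """Combine a user task and a plugin's output into a single prompt."""
    return f"Task: {task}\nPlugin result:\n{plugin_output}\nRespond concisely."

def run_inference(prompt: str) -> str:
    """Send the prompt to Gemini and return the model's text reply."""
    import google.generativeai as genai
    genai.configure(api_key=os.environ["GEMINI_API_KEY"])
    model = genai.GenerativeModel("gemini-1.5-flash")
    return model.generate_content(prompt).text

if __name__ == "__main__" and "GEMINI_API_KEY" in os.environ:
    print(run_inference(build_prompt("summarize the logs", "3 errors, 1 warning")))
```

The point is just the shape of the flow: the desktop side assembles a prompt from the user's task plus local plugin output, and the heavy lifting happens server-side.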

Basically, this update brings the power of cloud-connected AI processing directly to your desktop. The goal is to make Llama Run even more efficient and responsive, giving you a taste of the future.

What's coming up?

My next big focus is fully supporting Python plugins! It's still a work in progress, but I'm committed to creating a robust plugin system that's useful for both developers and end-users.
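Since the plugin API is still in progress, here's only a minimal sketch of how a Python plugin registry could work; the names (plugin, run_plugin, PLUGINS) are assumptions of mine, not the final Llama Run interface:

```python
# Minimal sketch of a Python plugin registry (illustrative names only).
PLUGINS = {}

def plugin(name):
    """Decorator that registers a callable under a plugin name."""
    def register(fn):
        PLUGINS[name] = fn
        return fn
    return register

@plugin("word_count")
def word_count(text: str) -> int:
    """Example plugin: count words in a string."""
    return len(text.split())

def run_plugin(name, *args):
    """Look up a registered plugin by name and invoke it."""
    if name not in PLUGINS:
        raise KeyError(f"unknown plugin: {name}")
    return PLUGINS[name](*args)
```

With something like this, a user could drop a decorated function into a plugins folder and have the assistant discover and run it by name, e.g. run_plugin("word_count", "hello world").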

I'd love to hear what you think! You can try out the app on the Microsoft Store. And if you're curious about how it all works, the project is open source on GitHub, and I'd welcome any contributions!

Thanks for taking a look!



