Thank you for sharing this. I probably missed something very obvious, but under which circumstances does information the user enters leave the user's device and go to Google? Is there a heads-up before this happens?
This is the best I could find:
"Some NeuroTools features use Google's Gemini AI service. To make these work, we send the relevant input from the app to the Google Gemini API."
I would love to play around with it, but local-only is a must for me.
On that note, would you agree that in principle it should be possible to run this with a local LLM?
Thank you, you're right, I should clarify that further: most features do use AI. Only the task list does not immediately use AI; the rest of the features use AI after the initial input, once you click submit.
And yes, this can definitely work with a local LLM, and there's a high chance I will do that next. I created a local, open-source LLM Mac app last year (paid, but open source, so you could just compile it yourself); you can find it on my profile, so it's definitely something I am very interested in. I wanted to tap a broader audience this time, so I didn't release it with a local LLM feature. But there's a very high chance it'll come next!
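For anyone curious what that swap could look like, here is a minimal sketch of routing the same prompt to a local model instead of a remote API. It assumes an Ollama server running on localhost:11434 and a model named "llama3"; the endpoint, model name, and client type are all illustrative, not the actual NeuroTools code:

    import Foundation

    // Hypothetical local-LLM client: sends the prompt to an Ollama server
    // running on the user's machine, so nothing leaves the device.
    struct LocalLLMClient {
        let endpoint = URL(string: "http://localhost:11434/api/generate")!
        let model = "llama3"  // assumed locally installed model

        func complete(prompt: String) async throws -> String {
            var request = URLRequest(url: endpoint)
            request.httpMethod = "POST"
            request.setValue("application/json", forHTTPHeaderField: "Content-Type")
            // Non-streaming request: Ollama returns a single JSON object.
            let body: [String: Any] = ["model": model, "prompt": prompt, "stream": false]
            request.httpBody = try JSONSerialization.data(withJSONObject: body)

            let (data, _) = try await URLSession.shared.data(for: request)
            let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
            // The generated text is returned in the "response" field.
            return (json?["response"] as? String) ?? ""
        }
    }

In practice the app would just need a setting to choose between the remote backend and a local one like this, since both take a prompt string and return generated text.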
To be fair, it seems to simply have followed the customer's instructions. The fact that it did not give any helpful answers bothers me significantly more. I wonder if it has access to any kind of internal database in order to be able to give more specific answers regarding a customer's queries.
I concur. Used to be an Ubuntu user, and for me Mint fixed quite a few of the problems I started having with Ubuntu (e.g. snaps). Plus, I find their defaults quite usable. If you like choice, give it a go using a live system.