MyOllama is an open-source mobile client that lets you interact with Ollama-hosted LLMs from your iOS or Android device.
Technical highlights:
- Built with Flutter for cross-platform support
- Remote LLM access: connect to any Ollama server by entering its IP address
- Implements custom prompt engineering capabilities
- Compatible with multiple LLM architectures (Llama, Gemma, Qwen, Mistral)
- Image recognition support for compatible models
- Persistent conversation management
- Localization for English, Korean, and Japanese
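The remote-access model is simple: the app points at an Ollama server by IP and talks to its standard HTTP API. A minimal sketch of how such a client could build a request (the helper name and example IP are hypothetical; Ollama's default port 11434 and `/api/generate` endpoint are standard):

```python
import json

# Ollama's HTTP API listens on port 11434 by default.
OLLAMA_PORT = 11434

def build_generate_request(host: str, model: str, prompt: str):
    """Return the endpoint URL and JSON payload for a text-generation call.

    Hypothetical helper illustrating the request shape a mobile client
    like MyOllama would send to a remote Ollama server.
    """
    url = f"http://{host}:{OLLAMA_PORT}/api/generate"
    payload = json.dumps({"model": model, "prompt": prompt, "stream": True})
    return url, payload

# Example: a server on the local network, running a Llama model.
url, payload = build_generate_request("192.168.1.20", "llama3", "Hello!")
```

Because the request is plain HTTP on your own network, no traffic ever leaves your machines.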
Key differentiators:
- No cloud dependencies: inference runs entirely on your own machine, with the app connecting directly to your Ollama server
- Privacy-focused architecture
- Zero subscription fees
- Custom instruction templating
- Streamlined API integration with Ollama backend
The app is available on the App Store, and the source code is released under a GNU license on GitHub.
GitHub: https://github.com/bipark/my_ollama_app
App Store: https://apps.apple.com/us/app/my-ollama/id6738298481
Looking for contributors and feedback from the community.