A self-hosted, minimalist web interface that connects to your Ollama API from any device on your network.
⚠️ Project status: vynUI is in active development. Expect frequent updates and experimental features.
If you'd like to follow progress or contribute to planning, check out the project boards (Miro / Trello).

Features:
- 🌐 Connect to your host Ollama machine over your local network
- ⚡ Real-time chat with streaming responses
- 💻 Lightweight, responsive UI focused on minimalism
- 🔒 100% open-source & self-hosted
Configuration:
- The app connects to an Ollama host by IP and port; set the connection string in the UI connection modal (see the connectivity sketch below).
- No environment variables or config files are needed yet; they will be documented here if server-side logic is added later.
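If you want to verify that the UI can reach your host before chatting, something like the sketch below works. It is a minimal check, not vynUI's actual code: it assumes plain HTTP and uses Ollama's `/api/tags` endpoint (which lists installed models) as a ping. The IP address is a placeholder for your own machine. Note that Ollama binds to 127.0.0.1 by default; to accept LAN connections, set `OLLAMA_HOST=0.0.0.0` on the host machine before starting the server.

```typescript
// Minimal connectivity check against an Ollama host (sketch).
// The address below is an example; substitute your own host's LAN IP.
const OLLAMA_HOST = "192.168.1.50"; // placeholder LAN IP of the Ollama machine
const OLLAMA_PORT = 11434;          // Ollama's default port

async function checkOllama(): Promise<boolean> {
  try {
    // GET /api/tags lists installed models; a 200 means the host is reachable.
    const res = await fetch(`http://${OLLAMA_HOST}:${OLLAMA_PORT}/api/tags`);
    return res.ok;
  } catch {
    // Network error: wrong address, firewall, or Ollama not running.
    return false;
  }
}

checkOllama().then((up) => console.log(up ? "Ollama reachable" : "Ollama unreachable"));
```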
vynUI connects through your local network to a host running Ollama. Messages typed in the UI are forwarded to Ollama, which replies with streaming responses. The UI is deliberately minimal to keep latency and resource usage low.
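As a rough illustration of that flow (not vynUI's actual implementation), the TypeScript sketch below posts a message to Ollama's `/api/chat` endpoint with `stream: true` and prints the reply as newline-delimited JSON chunks arrive. The host address and model name are placeholders; the snippet is Node-flavored, whereas a browser UI would append each chunk to the DOM instead of stdout.

```typescript
// Sketch: send one chat message to Ollama and stream the reply.
async function streamChat(prompt: string): Promise<void> {
  const res = await fetch("http://192.168.1.50:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2", // placeholder: any model installed on the host
      messages: [{ role: "user", content: prompt }],
      stream: true, // Ollama then returns newline-delimited JSON chunks
    }),
  });
  if (!res.body) throw new Error("No response body");

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Each complete line is one JSON object carrying a partial assistant message.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any trailing partial line for the next read
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) process.stdout.write(chunk.message.content);
      if (chunk.done) console.log(); // final chunk: end the output line
    }
  }
}

streamChat("Why is the sky blue?").catch(console.error);
```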
We’d love help. Please read CONTRIBUTING.md before opening issues or PRs.
Good first issues:
- Improve documentation or add examples
- Report or fix bugs in the code
When opening PRs:
- Keep commits focused and atomic
- Document behavior changes in the PR description
- Add screenshots if the UI is modified
If you like this project:
- Star the repo ⭐
- Share it with devs who run local LLM setups
- Open issues for bugs or feature requests
Where we plan to share updates:
- Repo releases
- Project boards (Miro / Trello)
Licensed under the MIT License. See LICENSE.md.
© 2025 Lunar Productions