This release has two important features:
- Local LLM support for models running under Ollama.
- Command-line flags for every config file option. Depending on your exact configuration, you may be able to run the SSH server without a config file at all (see the sketch after this list).
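
To illustrate the flags-mirror-config idea, here is a minimal Go sketch of the pattern: every config file option also gets a command-line flag with a sensible default, so the server can start even when no config file is present. The flag names (`--host`, `--port`, `--ollama-url`) are hypothetical and only illustrate the approach; check the project's `--help` output for the real flags.

```go
package main

import (
	"flag"
	"fmt"
)

// Config holds the same options the config file would provide.
type Config struct {
	Host      string
	Port      int
	OllamaURL string
}

func main() {
	var cfg Config

	// Each config option is exposed as a flag with a sensible default,
	// so the server can run with no config file at all.
	flag.StringVar(&cfg.Host, "host", "localhost", "address to bind the SSH server to")
	flag.IntVar(&cfg.Port, "port", 2222, "port to listen on")
	flag.StringVar(&cfg.OllamaURL, "ollama-url", "http://localhost:11434", "base URL of a local Ollama instance")
	flag.Parse()

	fmt.Printf("starting SSH server on %s:%d (Ollama at %s)\n", cfg.Host, cfg.Port, cfg.OllamaURL)
}
```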