Initial setup #312
@johny-mnemonic apologies for the confusion — the docs were out of date. Provider configuration lives in the Providers app (a top-level app in the dock, not a section inside Settings). The "Settings → Providers → Add Provider" line in the docs is stale.

**Adding your Ollama endpoint via the UI:** do it through the Providers app, not the Advanced section of Settings.

**About your manual YAML edit:** hand-editing the config file isn't the supported path; entries added in the Providers app are the source of truth.

**About a setup wizard:** You're right, there isn't one yet — a first-launch wizard is on the roadmap but not built. A model has to be configured through the Providers app before Agents can deploy. Tracking that gap.

**External LiteLLM:** taOS bundles its own LiteLLM proxy that's automatically configured from your Providers entries — that's the path of least friction. Pointing taOS at a separate LiteLLM you already run elsewhere isn't supported as a first-class option today; it'd require pointing the agents directly at your LiteLLM URL and skipping the Providers layer entirely. Possible but undocumented. If your existing LiteLLM has providers configured already, the simpler path is to add those same providers to taOS's Providers app — taOS's bundled LiteLLM will route through them and you keep one source of truth in the UI.

Thanks for the detailed report — it's the kind of feedback that closes onboarding gaps fast.
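For the "point the agents directly at your LiteLLM URL" route, what that amounts to is talking to the proxy's OpenAI-compatible API yourself. A minimal sketch, assuming a hypothetical proxy address and model name (substitute your own — nothing here is a taOS default):

```python
# Sketch of an OpenAI-compatible chat request aimed at an external LiteLLM proxy.
# LITELLM_URL and the model name are hypothetical placeholders.
import json
import urllib.request

LITELLM_URL = "http://litellm.local:4000"  # your existing proxy's address

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST to the proxy's OpenAI-compatible /v1/chat/completions route."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{LITELLM_URL}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Sending it is just `urllib.request.urlopen(build_chat_request("llama3", "hello"))` — but wiring the agents to consume that response without the Providers layer is exactly the undocumented part.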
I have successfully installed TAOS and logged into Web Desktop, but there is no model to select and I see no way to add model providers. There was no "setup wizard" on first launch. I guess it is not implemented yet...
There is a reference to "Settings → Providers → Add Provider", but I don't see such a section in the Settings.
There is an "Advanced" section in the Settings, but when I try to add, for example, one of my Ollama endpoints, it produces a `Save failed (404)` error and the "Validate" button seems to do nothing. The log says `"PUT /api/settings/config HTTP/1.1" 404 Not Found`, so I guess this is also not implemented yet and I will have to find where the config file is and modify it on the CLI, right? I have modified the defaults there to look like this:
Hope this is correct.
I already have LiteLLM running in my network. Not sure if that can be used instead of the internal LiteLLM referenced in the docs.