
Commit 60e8c28

Improve conf examples
1 parent: b893dee

4 files changed: +143 −119 lines changed


README.md

Lines changed: 82 additions & 119 deletions
@@ -1,12 +1,24 @@
 # mcp-server-uyuni
 
-Model Context Protocol Server for Uyuni Server API.
 
-## Disclaimer
+The Uyuni MCP Server is a Model Context Protocol (MCP) server implementation that bridges the gap between Large Language Models (LLMs) and the Uyuni systems management solution.
 
-This is an open-source project provided "AS IS" without any warranty, express or implied. Use at your own risk. For full details, please refer to the [License](#license) section.
+This project allows AI agents (such as Claude Desktop or other MCP-compliant clients) to securely interact with your Uyuni server. By exposing Uyuni's API as standardized MCP tools, it enables users to manage their Linux infrastructure using natural language commands. Instead of navigating the web UI or writing complex API scripts, you can simply ask your AI assistant to perform tasks like auditing systems, checking for updates, or scheduling maintenance.
+
+Key Capabilities
+This server exposes a suite of tools that allow LLMs to:
+
+Inspect Infrastructure: Retrieve lists of active systems, check CPU usage, and view system details.
+
+Manage Updates: Identify systems with pending security updates or CVEs and schedule patch applications.
+
+Execute Actions: Schedule reboots.
 
-## Tools
+It is designed to be run as a Docker container or locally, offering a streamlined way to integrate AI-driven automation into your system administration workflows.
+
+
+
+## Tool List
 
 * get_list_of_active_systems
 * get_cpu_of_a_system
@@ -28,11 +40,13 @@ This is an open-source project provided "AS IS" without any warranty, express or
 
 There are two main ways to run the `mcp-server-uyuni`: using the pre-built Docker container or running it locally with `uv`. Both methods require a `config` file.
 
-### Config File
+To use `mcp-server-uyuni`, you must first create a configuration file. Once configured, you can run the server using Docker (recommended) or locally with uv.
 
-Before running the server, you need to create a `config` file. You can place it anywhere, but you must provide the correct path to it when running the server.
+### 1\. Configuration
 
-```
+Create a file (e.g., `uyuni-config.env`) to store your environment variables. You can place this file anywhere, but you must reference its path when running the server.
+
+```bash
 # Required: Basic server parameters.
 UYUNI_SERVER=192.168.1.124:8443
 UYUNI_USER=admin
@@ -81,46 +95,11 @@ Replace the values with your Uyuni server details. **This file contains sensitiv
 
 Alternatively, you can also set environment variables instead of using a file.
 
-## Debug with mcp inspect
-
-You can run (docker option)
-
-`npx @modelcontextprotocol/inspector docker run -i --rm --env-file /path/to/your/config ghcr.io/uyuni-project/mcp-server-uyuni:latest`
 
-or you can run (uv option)
+### 2\. Running as a Container (Recommended)
 
-`npx @modelcontextprotocol/inspector uv run --env-file=.venv/config --directory . mcp-server-uyuni`
+The easiest way to run the server is using the pre-built Docker image. This method isolates the environment and requires no local dependencies other than Docker.
 
-## Use with Open WebUI
-
-Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners like Ollama and OpenAI-compatible APIs, with built-in inference engine for RAG, making it a powerful AI deployment solution. More at https://docs.openwebui.com/
-
-> [!NOTE]
-> The following instructions describe how to set up Open WebUI and the MCP proxy for **local development and testing purposes**. For production deployments, please refer to the official [Open WebUI documentation](https://docs.openwebui.com/) for recommended setup procedures.
-
-### Setup Open WebUI
-
-You need `uv` installed. See https://docs.astral.sh/uv
-
-Start v0.6.10 (for MCP support we need a version >= 0.6.7)
-
-```
-uv tool run open-webui@0.6.10 serve
-```
-
-Configure the OpenAI API URL by following these instructions:
-
-https://docs.openwebui.com/getting-started/quick-start/starting-with-openai
-
-For gemini, use the URL https://generativelanguage.googleapis.com/v1beta/openai and get the token API from the Google AI Studio https://aistudio.google.com/
-
-### Setup Open WebUI MCP Support
-
-First, ensure you have your `config` file ready as described in the Usage section.
-
-Then, you need a `config.json` for the MCP to OpenAPI proxy server.
-
-### Option 1: Running with Docker (Recommended)
 
 This is the easiest method for deployment. Pre-built container images are available on the GitHub Container Registry.
 
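As a quick sanity check before wiring up any client, the published image can be pulled ahead of time. A minimal sketch; the image name and `latest` tag are the ones used throughout this README:

```bash
# Pre-fetch the published image from the GitHub Container Registry.
docker pull ghcr.io/uyuni-project/mcp-server-uyuni:latest
```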
@@ -139,7 +118,6 @@ This is the easiest method for deployment. Pre-built container images are availa
     }
   }
 }
-```
 
 Alternatively, you can use environment variables instead of a file.
 
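As a minimal sketch of that environment-variable alternative, the same variables shown in the configuration example can be passed directly to `docker run` with `-e` instead of `--env-file` (the values here are placeholders):

```bash
docker run -i --rm \
  -e UYUNI_SERVER=192.168.1.124:8443 \
  -e UYUNI_USER=admin \
  -e UYUNI_PASS='example_password' \
  ghcr.io/uyuni-project/mcp-server-uyuni:latest
```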
@@ -158,101 +136,80 @@ Alternatively, you can use environment variables instead of a file.
     }
   }
 }
+
+**Command:**
+
+```bash
+docker run -i --rm --env-file /path/to/uyuni-config.env ghcr.io/uyuni-project/mcp-server-uyuni:latest
 ```
 
-### Option 2: Running Locally with `uv`
+* **`-i`**: Keeps STDIN open (required for MCP communication).
+* **`--rm`**: Removes the container after it exits.
+* **`--env-file`**: Points to the configuration file you created in step 1.
+
+### 3\. Running Locally with `uv`
+
+If you are developing or prefer running Python directly, you can use `uv`.
+
+**Prerequisites:**
+
+* Install `uv`: [https://docs.astral.sh/uv](https://docs.astral.sh/uv)
+* Clone this repository.
 
-This method is ideal for development.
+**Setup and Run:**
 
-1. **Install `uv`:** See https://docs.astral.sh/uv
-2. **Install dependencies:**
+1. Sync dependencies:
 ```bash
 uv sync
 ```
-3. Replace `/path/to/your/config` with the absolute path to your `config` file.
+2. Run the server:
+```bash
+uv run --env-file /path/to/uyuni-config.env --directory . mcp-server-uyuni
+
+
+### 4\. Client Configuration Examples
+
+MCP servers are rarely run manually; they are usually configured within an MCP Client (like Claude Desktop). Below are examples of how to configure your client to use `mcp-server-uyuni`.
+
+#### Claude Desktop Configuration
+
+Add the following to your `claude_desktop_config.json`:
+
+**Docker Method:**
 
 ```json
 {
   "mcpServers": {
-    "mcp-server-uyuni": {
-      "command": "uv",
+    "uyuni": {
+      "command": "docker",
       "args": [
-        "run",
-        "--env-file", "/path/to/your/config",
-        "--directory", ".",
-        "mcp-server-uyuni"
+        "run", "-i", "--rm",
+        "--env-file", "/absolute/path/to/uyuni-config.env",
+        "ghcr.io/uyuni-project/mcp-server-uyuni:latest"
      ]
    }
  }
 }
 ```
 
-### Start the MCP to OpenAPI proxy server
-
-
-Then, you can start the Model Context Protocol to Open API proxy server:
-
-```
-uvx mcpo --port 9000 --config ./config.json
-```
-
-### Add the tool
-
-And then you can add the tool to the Open Web UI. See https://docs.openwebui.com/openapi-servers/open-webui#step-2-connect-tool-server-in-open-webui .
-
-Note the url should be http://localhost/mcp-server-uyuni as explained in https://docs.openwebui.com/openapi-servers/open-webui#-optional-using-a-config-file-with-mcpo
-
-
-![OpenWeb UI with MCP Support with GPT 4 model](docs/example_openwebui_gpt.png)
-![OpenWeb UI with MCP Support with Gemini 2.0 flash model](docs/example_openwebui_gemini.png)
-
-### Testing Advanced Capabilities (Elicitation)
-
-> [!NOTE]
-> The Model Context Protocol (MCP) includes advanced features like **Elicitation**, which allows tools to interactively prompt the user for missing information or confirmation.
->
-> As of this writing, not all MCP clients support this capability. For example, **Open WebUI does not currently implement elicitation**.
->
-> To test tools that leverage elicitation (like the `add_system` tool when an activation key is missing), you need a compatible client. The official **MCP extension for Visual Studio Code** is a reference client that fully supports elicitation and is recommended for developing and testing these features.
-
+**Local `uv` Method:**
 
-## Local Development Build
-
-To build the Docker image locally for development or testing purposes:
-```bash
-docker build -t mcp-server-uyuni .
+```json
+{
+  "mcpServers": {
+    "uyuni": {
+      "command": "/path/to/uv",
+      "args": [
+        "run",
+        "--env-file", "/absolute/path/to/uyuni-config.env",
+        "--directory", "/absolute/path/to/mcp-server-uyuni-repo",
+        "mcp-server-uyuni"
+      ]
+    }
+  }
+}
 ```
 
-Then, you can use `docker run -i --rm --env-file .venv/config mcp-server-uyuni` at any of the mcp-client configurations explained above.
-
-## Release Process
-
-To create a new release for `mcp-server-uyuni`, follow these steps.
-
-1. ** Create a release branch:** `git fetch upstream && git checkout upstream/main -b release-x.y.z`. Assuming upstream is the remote alias for the upstream git
-2. **Update Documentation (`README.md`):**
-* Ensure the list of available tools under the "## Tools" section is current and reflects all implemented tools in `srv/mcp-server-uyuni/server.py`.
-* Review and update any screenshots in the `docs/` directory and their references in this `README.md` to reflect the latest UI or functionality, if necessary.
-* Verify all usage instructions and examples are still accurate.
-3. **Update Test Cases (`TEST_CASES.md`):**
-* Refer to the "How to Update for a New Tag/Release" section within `TEST_CASES.md`.
-4. **Commit Changes:** Commit all the updates to `README.md`, `TEST_CASES.md`, and any other changed files.
-5. **Update version in pyproject.toml:** Use semantic versioning to set the new version.
-6. **Update uv.lock:** Run `uv lock` to update uv.lock file with the version set in pyproject.toml
-7. **Update CHANGELOG.md:**
-* Generate the changelog using `conventional-changelog-cli`. If you don't have it installed globally, you can use `npx`.
-* The command to generate the changelog using the `conventionalcommits` preset and output it to `CHANGELOG.md` (prepending the new changes) is:
-```bash
-npx conventional-changelog-cli -p conventionalcommits -i CHANGELOG.md -s
-```
-* Review the generated `CHANGELOG.md` for accuracy and formatting.
-* Commit the updated `CHANGELOG.md`.
-8. Push the branch and create a new Pull Request: `git push origin release-x.y.z` . Assuming origin is the remote alias for your git fork.
-7. **Create Git Tag:** Create a new Git tag for the release (e.g., `git tag vX.Y.Z`). Follow [semantic versioning rules](https://semver.org/).
-8. **Push Changes and Tags:** Push your commits (including the changelog update) and the new tag to the repository (e.g., `git push && git push --tags`).
-9. **Automated Build and Push:** Pushing the tag to GitHub will automatically trigger the "Docker Publish" GitHub Action. This action builds the Docker image and pushes it to the GitHub Container Registry (`ghcr.io`) with tags for the specific version (e.g., `v0.3.0`) and major.minor (e.g., `v0.3`). Pushing to `main` will update the `latest` tag.
-10. **Test the container:** Pull the newly published image from `ghcr.io` and run the tests in `TEST_CASES.md` against it.
-`docker run -i --rm --env-file .venv/config ghcr.io/uyuni-project/mcp-server-uyuni:VERSION` (replace VERSION with the new tag).
 
 ## Feedback
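Although the old "Debug with mcp inspect" section is removed above, the same inspector command still works as a smoke test for the reorganized instructions; a sketch adapted to the renamed config file (assuming the MCP Inspector is run via `npx` as before):

```bash
# Launch the MCP Inspector against the containerized server.
npx @modelcontextprotocol/inspector docker run -i --rm \
  --env-file /path/to/uyuni-config.env \
  ghcr.io/uyuni-project/mcp-server-uyuni:latest
```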

@@ -265,3 +222,9 @@ Thanks in advance from the uyuni team!
 ## License
 
 This project is licensed under the Apache License, Version 2.0. See the [LICENSE](LICENSE) file for details.
+
+
+## Disclaimer
+
+This is an open-source project provided "AS IS" without any warranty, express or implied. Use at your own risk. For full details, please refer to the [License](#license) section.
+

config.example.gemini.json

Lines changed: 25 additions & 0 deletions
@@ -0,0 +1,25 @@
+{
+  "mcpServers": {
+    "mcp-server-uyuni": {
+      "command": "docker",
+      "args": [
+        "run",
+        "-i",
+        "--rm",
+        "-v",
+        "/tmp/mcp-server-uyuni.log:/tmp/mcp-server-uyuni.log",
+        "--name",
+        "mcp-server-uyuni",
+        "--env-file",
+        "./uyuni-connection-config-stdio.env",
+        "registry.opensuse.org/systemsmanagement/uyuni/ai/devel_bci_16.0_containerfile/uyuni-ai/mcp-uyuni-server",
+        "/usr/bin/mcp-server-uyuni"
+      ]
+    }
+  },
+  "security": {
+    "auth": {
+      "selectedType": "gemini-api-key"
+    }
+  }
+}
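Before a client launches the server with the configuration above, both the bind-mounted log file and the referenced env file should already exist on the host; a minimal preparation sketch (the source path of the config file is only an example):

```bash
# Create the log file first; with a missing source path, Docker would create a directory there instead.
touch /tmp/mcp-server-uyuni.log
# The env file is resolved relative to the client's working directory.
cp /path/to/your/uyuni-config.env ./uyuni-connection-config-stdio.env
```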
Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
+{
+  "mcpServers": {
+    "mcp-server-uyuni": {
+      "command": "uv",
+      "server_type": "stdio",
+      "args": ["run", "--env-file=./uyuni-connection-config-stdio.env","--directory","./src/mcp-server-uyuni/","mcp-server-uyuni"]
+    }
+  }
+}
+
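This variant launches the server from a local checkout via `uv`. A minimal setup sketch, assuming the repository is checked out at `./src/mcp-server-uyuni/` relative to the client's working directory, as the `--directory` argument above expects:

```bash
# Install the project's dependencies once before the client starts the server.
cd ./src/mcp-server-uyuni/
uv sync
```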
Lines changed: 26 additions & 0 deletions
@@ -0,0 +1,26 @@
+# Required: Basic server parameters.
+UYUNI_SERVER=example_server:example_port
+UYUNI_USER=example_user
+UYUNI_PASS=example_password
+
+# Optional: Set to 'false' to disable SSL certificate verification. Defaults to 'true'.
+# UYUNI_MCP_SSL_VERIFY=false
+
+# Optional: Set to 'true' to enable tools that perform write actions (e.g., POST requests). Defaults to 'false'.
+# UYUNI_MCP_WRITE_TOOLS_ENABLED=false
+
+# WARNING - Security Note on Write Tools:
+# Enabling UYUNI_MCP_WRITE_TOOLS_ENABLED allows the execution of state-changing and potentially destructive actions (e.g., removing systems, applying updates). When combined with UYUNI_MCP_TRANSPORT=http, this risk is amplified, as any client with network access can perform these actions. Only enable write tools in a trusted environment.
+
+# Optional: Set the transport protocol. Can be 'stdio' (default) or 'http'.
+# UYUNI_MCP_TRANSPORT=stdio
+
+# WARNING - Security Note on HTTP Transport:
+# When UYUNI_MCP_TRANSPORT is set to 'http', the server runs without authentication. This means any client with network access can execute commands. Only use this mode in a trusted, isolated network environment. For more details, see the Security Policy.
+
+# Optional: Set the path for the server log file. Defaults to logging to the console.
+# UYUNI_MCP_LOG_FILE_PATH=/var/log/mcp-server-uyuni.log
+
+# Required to bootstrap new systems into Uyuni via the `add_system` tool.
+UYUNI_SSH_PRIV_KEY="-----BEGIN OPENSSH PRIVATE KEY-----\n..."
+UYUNI_SSH_PRIV_KEY_PASS=""
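One possible way to produce the single-line, `\n`-escaped value that `UYUNI_SSH_PRIV_KEY` expects (the key path is illustrative):

```bash
# Print each line of the key followed by a literal \n, all on one line.
awk '{printf "%s\\n", $0}' /path/to/bootstrap_ssh_key
```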
