MCP #1209

36 commits
e98d46a
wip: query_ai
rboixaderg Jan 29, 2026
dade433
feat: Enhance AI query handling with logging and multi-step support
rboixaderg Jan 29, 2026
3f6dda7
feat: Implement retry mechanism for empty query results in AI query h…
rboixaderg Jan 30, 2026
a30945b
feat: Introduce MCP integration
rboixaderg Jan 30, 2026
3052880
feat: Add token issuance for MCP with configurable duration
rboixaderg Jan 30, 2026
0524444
feat: Remove AI query module and related components
rboixaderg Jan 30, 2026
bd98a75
feat: Add code formatting and testing targets to Makefile
rboixaderg Jan 30, 2026
1cfb2f6
feat: Refactor MCP integration by removing unused components and enha…
rboixaderg Jan 30, 2026
abd32a3
feat: Remove MCP README documentation as part of project cleanup
rboixaderg Jan 30, 2026
b18ee39
feat: Add chat functionality to MCP with LLM integration
rboixaderg Jan 30, 2026
3d4c7c4
feat: Introduce MCP tools and chat endpoint with enhanced configuration
rboixaderg Jan 30, 2026
8ba8cf2
feat: Add MCP contrib and update requirements
rboixaderg Jan 30, 2026
ae4938c
chore: copilot suggestions
rboixaderg Jan 30, 2026
f8701c0
fix: Update MCP requirements and documentation for Python 3.10+ compa…
rboixaderg Jan 30, 2026
85f4f44
chore: Update Makefile to add .PHONY declaration for tests target
rboixaderg Jan 30, 2026
83abee0
chore: Update jinja2 and MarkupSafe versions in contrib-requirements.…
rboixaderg Jan 30, 2026
a98eb19
feat: Enhance MCP chat model support and update documentation
rboixaderg Jan 31, 2026
6031e7d
feat: Implement security checks in InProcessBackend for content visib…
rboixaderg Jan 31, 2026
dabd2fa
feat: Refactor MCP backend and chat tool execution for improved conte…
rboixaderg Feb 1, 2026
0ce4b9d
chore: Update dependencies in requirements.txt and setup.py for impro…
rboixaderg Feb 2, 2026
bf84b96
refactor: Simplify type annotations in MCP backend for improved reada…
rboixaderg Feb 2, 2026
1ee5075
refactor: Organize type imports in MCP backend for clarity
rboixaderg Feb 2, 2026
779fec4
chore: Update CI workflow to support Python versions 3.10, 3.11, 3.12…
rboixaderg Feb 2, 2026
cc84e39
chore: Update CI workflow to limit supported Python versions to 3.10 …
rboixaderg Feb 2, 2026
e498d4f
refactor: Update register_tools function to use InProcessBackend type…
rboixaderg Feb 11, 2026
030b9a5
test: Update MCP service tests to assert correct status codes for ena…
rboixaderg Feb 11, 2026
5261e38
adong locking
nilbacardit26 Feb 12, 2026
50304ff
feat: Introduce MCPUtility for managing FastMCP server and app instan…
rboixaderg Feb 13, 2026
8ba5d30
chore: Update contrib-requirements and setup.py to include MCP depend…
rboixaderg Feb 13, 2026
bb43682
chore: Update requirements.txt to specify version ranges for cffi and…
rboixaderg Feb 13, 2026
8b306b0
chore: Update jsonschema version specifications in requirements.txt t…
rboixaderg Feb 13, 2026
5d37add
chore: Adjust PyJWT version specifications in requirements.txt for im…
rboixaderg Feb 13, 2026
e76c321
chore: Refine version specifications for uvicorn, jsonschema, cffi, a…
rboixaderg Feb 13, 2026
d8d4e2e
feat: Add lifespan management for MCP utility and normalize query han…
rboixaderg Feb 16, 2026
1d99968
refactor: Enhance MCP lifespan management by implementing asynchronou…
rboixaderg Feb 16, 2026
10a6540
wip: Mcp
rboixaderg Feb 27, 2026
2 changes: 2 additions & 0 deletions .github/workflows/continuous-integration.yml
@@ -48,6 +48,7 @@ jobs:
DATABASE: ${{ matrix.database }}
DB_SCHEMA: ${{ matrix.db_schema }}
MEMCACHED: "localhost:11211"
DOCKER_API_VERSION: "1.44"

steps:
- name: Checkout the repository
@@ -65,6 +66,7 @@
pip install -r contrib-requirements.txt
pip install -e .[test]
pip install -e .[testdata]
if [[ "${{ matrix.python-version }}" == "3.10" || "${{ matrix.python-version }}" == "3.11" ]]; then pip install -e .[mcp]; fi

- name: Start memcached image
uses: niden/actions-memcached@v7
1 change: 1 addition & 0 deletions CHANGELOG.rst
@@ -6,6 +6,7 @@ CHANGELOG

- Docs: Update documentation and configuration settings
- Chore: Update sphinx-guillotina-theme version to 1.0.9
- Feat: Add MCP (Model Context Protocol) contrib
[rboixaderg]


10 changes: 10 additions & 0 deletions Makefile
@@ -41,3 +41,13 @@ create-cockroachdb: ## Create CockroachDB
@echo ""
@echo "$(YELLOW)==> Creating CockroachDB $(VERSION)$(RESET)"
./bin/python _cockroachdb-createdb.py

.PHONY: format
format: ## Format code
flake8 guillotina --config=setup.cfg
black guillotina/
isort -rc guillotina/

.PHONY: tests
tests: ## Run tests
DATABASE=POSTGRES pytest -s -x guillotina
4 changes: 2 additions & 2 deletions contrib-requirements.txt
@@ -8,8 +8,8 @@ codecov==2.1.13
mypy-zope==1.0.11
black==22.3.0
isort==4.3.21
jinja2==2.11.3
MarkupSafe<2.1.0
jinja2>=3.1.2,<4.0.0
MarkupSafe>=2.0
pytz==2020.1
emcache==0.6.0; python_version < '3.10'
pymemcache==3.4.0; python_version < '3.10'
1 change: 1 addition & 0 deletions docs/source/contrib/index.rst
@@ -16,4 +16,5 @@ Contents:
pubsub
swagger
mailer
mcp
dbusers
196 changes: 196 additions & 0 deletions docs/source/contrib/mcp.md
@@ -0,0 +1,196 @@
# MCP (Model Context Protocol)

The `guillotina.contrib.mcp` package exposes Guillotina content to [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) clients and provides a chat endpoint where an LLM can query content using the same tools.

**What you get:**

- **@mcp** — MCP-over-HTTP endpoint so IDEs (Cursor, VS Code, etc.) and other MCP clients can discover and call tools against a container.
- **@chat** — REST endpoint to send a message and get an LLM reply; the LLM can call the same tools (search, count, get_content, list_children) in-process. API keys stay on the server.

Both use the same read-only tools and the same permissions.

## Installation

**Requires Python 3.10+** (the `mcp` package does not support older versions). Guillotina core supports Python 3.8+; MCP is optional and only needed when you use this contrib.

1. Install the MCP extra (Python 3.10+ only):

```bash
pip install guillotina[mcp]
```

   Or add to your requirements: `guillotina[mcp]`, or pin `mcp>=1.0.0; python_version >= "3.10"` and `litellm>=1.0.0` directly. `litellm` is required only if you use **@chat**.

2. Enable the contrib in your app config:

```yaml
applications:
- guillotina
- guillotina.contrib.mcp
```

## Configuration

You can override these in your application config:

| Setting | Description |
|--------|-------------|
| `mcp.enabled` | If `false`, `@mcp` returns 404. Default: `true`. |
| `mcp.chat_enabled` | If `false`, `@chat` returns 404. Default: `true`. |
| `mcp.chat_model` | Model for @chat (LiteLLM). Required if you use chat. Examples: `openai/gpt-4o-mini`, `gemini/gemini-1.5-flash`, `anthropic/claude-3-haiku`, `groq/llama-3.1-8b-instant`, `openrouter/google/gemini-2.0-flash-001`, `minimax/MiniMax-M2.1`, `mistral/mistral-small-latest`, `deepseek/deepseek-chat`, `cerebras/llama3-70b-instruct`. |
| `mcp.token_max_duration_days` | Max `duration_days` for `@mcp-token`. Default: `90`. |
| `mcp.token_allowed_durations` | Optional list of allowed values (e.g. `[30, 60, 90]`). If set, only these values are accepted. |
| `mcp.description_extras` | Optional dict: tool name → string appended to that tool’s description (for LLM context). Keys: `search`, `count`, `get_content`, `list_children`. |
| `mcp.extra_tools_module` | Optional dotted path to a module that defines `register_extra_tools(mcp_server, backend)` (and optionally chat extensions). See [Extending](#extending). |

For `mcp.chat_model`, use the `provider/model-name` format. For current model names use the [LiteLLM Providers](https://docs.litellm.ai/docs/providers) index and the docs for each provider.
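For example, a minimal override in your application config might look like this (all values illustrative; the keys are the settings from the table above):

```yaml
mcp:
  chat_enabled: true
  chat_model: "openai/gpt-4o-mini"
  token_max_duration_days: 60
  token_allowed_durations: [30, 60]
```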

**@chat** reads credentials **only from environment variables**. Set the variables for the provider implied by your `mcp.chat_model`:

| Provider | Required | Optional (base URL) |
|----------|----------|----------------------|
| OpenAI | `OPENAI_API_KEY` | — |
| Google (Gemini) | `GEMINI_API_KEY` or `GOOGLE_API_KEY` | — |
| Anthropic | `ANTHROPIC_API_KEY` | — |
| Groq | `GROQ_API_KEY` | — |
| OpenRouter | `OPENROUTER_API_KEY` | `OPENROUTER_API_BASE` (default: `https://openrouter.ai/api/v1`) |
| MiniMax | `MINIMAX_API_KEY` | `MINIMAX_API_BASE` (e.g. `https://api.minimax.io/v1` for chat completions) |
| Mistral | `MISTRAL_API_KEY` | — |
| Deepseek | `DEEPSEEK_API_KEY` | — |
| Cerebras | `CEREBRAS_API_KEY` | — |

Do not put API keys in config files.

## Using the MCP endpoint (@mcp)

- **URL**: `POST` or `GET` on `/{db}/{container_path}/@mcp` (e.g. `POST /db/guillotina/@mcp`).
- **Auth**: Same as any Guillotina request. Use either:
- **Basic**: `Authorization: Basic <base64(user:password)>`.
- **Bearer**: Obtain a token with `POST /{db}/{container}/@login` or `POST /{db}/{container}/@mcp-token`, then `Authorization: Bearer <token>`.
- **Headers**: Clients must send `Accept: application/json, text/event-stream`; otherwise the server returns 406.

The resource you call (e.g. a container) is the context for all tools: search, count, get_content, and list_children are scoped to that resource.

**Obtaining a long-lived token for MCP clients**

- **Endpoint**: `POST /{db}/{container_path}/@mcp-token`
- **Auth**: Required (e.g. Basic or short-lived JWT).
- **Body** (optional): `{"duration_days": 30}`. Default 30; max and allowed values come from config.
- **Response**: `{"token": "<jwt>", "exp": <timestamp>, "duration_days": <number>}`.

Use this token as `Authorization: Bearer <token>` when calling `@mcp` from Cursor, VS Code, or other MCP clients.

**Example (list tools)**

```bash
# With Basic auth
curl -X POST -u root:password "http://localhost:8080/db/guillotina/@mcp" \
-H "Content-Type: application/json" \
-H "Accept: application/json, text/event-stream" \
-d '{"jsonrpc":"2.0","method":"tools/list","id":1}'
```

MCP clients discover tools via the standard MCP protocol (e.g. `tools/list`, `tools/call`). Configure your IDE/client with the `@mcp` URL and the same auth (Basic or Bearer).
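As an illustration, an HTTP-server entry in a Cursor-style `mcp.json` might look like the following (the exact schema varies by client and version, so check your client's documentation; the token is a placeholder):

```json
{
  "mcpServers": {
    "guillotina": {
      "url": "http://localhost:8080/db/guillotina/@mcp",
      "headers": {
        "Authorization": "Bearer <token>"
      }
    }
  }
}
```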

## Using the chat endpoint (@chat)

- **URL**: `POST /{db}/{container_path}/@chat`
- **Auth**: Same as `@mcp` (e.g. Bearer from `@mcp-token` or `@login`).
- **Body**:
- Single message: `{"message": "user text"}`.
- Full history (to keep context): `{"messages": [{"role": "user", "content": "..."}, {"role": "assistant", "content": "..."}, ...]}`.
- **Response**: `{"content": "assistant reply text"}`.

The server runs an LLM (LiteLLM) and executes the same tools (search, count, get_content, list_children) when the model requests them. To keep conversation context, your client should accumulate all messages and send them in `messages` on each request.

**Example**

```bash
TOKEN=$(curl -s -X POST -u root:password "http://localhost:8080/db/guillotina/@mcp-token" \
-H "Content-Type: application/json" -d '{"duration_days": 30}' | jq -r .token)

curl -s -X POST -H "Authorization: Bearer $TOKEN" \
"http://localhost:8080/db/guillotina/@chat" \
-H "Content-Type: application/json" \
-d '{"message": "How many items are in this container?"}' | jq .
```
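Since the server is stateless between requests, the client owns the history bookkeeping. A minimal sketch of that loop (the payload shapes follow the `@chat` contract above; the reply text is illustrative, and the HTTP call itself is elided):

```python
def build_chat_payload(history, user_text):
    """Return the @chat request body, appending the new user message to the history."""
    return {"messages": history + [{"role": "user", "content": user_text}]}


history = []
payload = build_chat_payload(history, "How many items are in this container?")
# POST payload to .../@chat; suppose the server responds with {"content": "..."}
reply = {"content": "There are 12 items."}  # illustrative response
# Keep context for the next turn: record both the user message and the reply.
history = payload["messages"] + [{"role": "assistant", "content": reply["content"]}]
```

On the next turn, call `build_chat_payload(history, ...)` again so the model sees the whole conversation.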

## Built-in tools

All tools are read-only and scoped to the resource (container) you call @mcp or @chat on:

| Tool | Purpose |
|------|---------|
| `search` | Catalog search (same query keys as Guillotina `@search`: `type_name`, `term`, `_size`, `_from`, `_sort_asc` / `_sort_des`, field filters like `field__eq`, etc.). |
| `count` | Count catalog results with the same query filters (no `_size` / `_from` / sort). |
| `get_content` | Get a resource by path (relative to container) or by UID. |
| `list_children` | List direct children of a path, with optional pagination (`from_index`, `page_size`). |

Parameters and descriptions are exposed via MCP (`tools/list`) and to the LLM in @chat automatically.
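For example, a `tools/call` request body invoking `search` is an ordinary JSON-RPC 2.0 message (the `type_name` and `_size` values here are illustrative):

```python
import json

# JSON-RPC 2.0 body calling the built-in `search` tool via @mcp.
# The arguments mirror Guillotina's @search query keys (type_name, term, _size, ...).
body = {
    "jsonrpc": "2.0",
    "method": "tools/call",
    "id": 2,
    "params": {"name": "search", "arguments": {"type_name": "Item", "_size": 5}},
}
wire = json.dumps(body)
```

Send `wire` as the POST body to `@mcp` with the `Accept` and auth headers shown earlier.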

## Permissions

- **@mcp** and **@chat** require permission `guillotina.mcp.Use`.
- **@mcp-token** requires `guillotina.mcp.IssueToken`.
- The contrib grants both to `guillotina.Authenticated`. Adjust grants in your app if you need to restrict or extend access.

## Extending

### Richer tool descriptions

Set `mcp.description_extras` to a dict mapping tool name to extra text (appended to the built-in description), or register a utility providing `guillotina.contrib.mcp.interfaces.IMCPDescriptionExtras` that returns such a dict. Useful to describe your content types or project-specific usage.
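For example (the description text is illustrative; the keys are the built-in tool names):

```yaml
mcp:
  description_extras:
    search: "Content types in this project: NewsItem, Report."
    get_content: "Paths are relative to the container root."
```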

### Custom tools (MCP and optional @chat)

Set `mcp.extra_tools_module` to a dotted path to a module that defines:

- **`register_extra_tools(mcp_server, backend)`** — Required. Register additional tools with `@mcp_server.tool()`. They are then available on **@mcp** (e.g. to Cursor/VS Code). Use `backend` (InProcessBackend) to call search, get_content, etc. from your tool.

To make the same tools available in **@chat** (so the LLM can call them), the same module can optionally define:

- **`get_extra_chat_tools()`** — Returns a list of tool definitions in the same format as the built-in ones: each item is `{"type": "function", "function": {"name": "...", "description": "...", "parameters": {"type": "object", "properties": {...}}}}`.
- **`execute_extra_tool(backend, name, arguments)`** — Async. Called when the LLM invokes one of your extra tools. Receives `backend`, tool `name`, and a dict of `arguments`; return a JSON-serialisable result.

Tool names must be unique and must not clash with the built-in names: `search`, `count`, `get_content`, `list_children`.

**Example**

```yaml
# config
mcp:
  extra_tools_module: "myapp.mcp_tools"
```

```python
# myapp/mcp_tools.py
def register_extra_tools(mcp_server, backend):
    @mcp_server.tool()
    async def my_tool(container_path: str | None = None, query: str = "") -> dict:
        """My project tool. Does X with container_path and query."""
        # use backend.search(...), backend.get_content(...), etc.
        return {"result": "..."}


def get_extra_chat_tools():
    return [
        {
            "type": "function",
            "function": {
                "name": "my_tool",
                "description": "My project tool. Does X with container_path and query.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "container_path": {"type": "string", "description": "Optional path relative to container."},
                        "query": {"type": "string", "description": "Query."},
                    },
                },
            },
        },
    ]


async def execute_extra_tool(backend, name, arguments):
    if name == "my_tool":
        # run the same logic as the MCP tool
        return {"result": "..."}
    return {"error": f"Unknown tool: {name}"}
```
29 changes: 29 additions & 0 deletions guillotina/contrib/mcp/__init__.py
@@ -0,0 +1,29 @@
from guillotina import configure


app_settings = {
    "mcp": {
        "enabled": True,
        "description_extras": {},
        "extra_tools_module": None,
        "token_max_duration_days": 90,
        "token_allowed_durations": None,
        "chat_enabled": True,
        "chat_model": None,
    },
    "load_utilities": {
        "guillotina.mcp": {
            "provides": "guillotina.contrib.mcp.interfaces.IMCPUtility",
            "factory": "guillotina.contrib.mcp.utility.MCPUtility",
            "settings": {},
        }
    },
}


def includeme(root, settings):
    configure.scan("guillotina.contrib.mcp.permissions")
    configure.scan("guillotina.contrib.mcp.tools")
    configure.scan("guillotina.contrib.mcp.lifespan")
    configure.scan("guillotina.contrib.mcp.services")
    configure.scan("guillotina.contrib.mcp.chat")