
Conversation

@anuraaga
Contributor

I am working with @codefromthecrypt on MCP instrumentation for context propagation, and he suggested I send a PR here for it. This adds instrumentation for the MCP Python SDK as an implementation of the discussion in modelcontextprotocol/modelcontextprotocol#246.

The key point is picking request.params._meta as the carrier for context. If a different field were defined in the protocol in the future, it would be relatively simple to migrate to it, since the instrumentation otherwise doesn't change at all.
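For illustration, here is a minimal sketch (not the actual code in this PR) of the idea, using OpenTelemetry's global propagator to write and read W3C trace context through a plain dict standing in for request.params._meta; the helper names are hypothetical:

```python
from opentelemetry import propagate


def inject_into_meta(meta: dict) -> dict:
    # Client side: write traceparent/tracestate into the _meta carrier
    # before the MCP request is serialized.
    propagate.inject(meta)
    return meta


def extract_from_meta(meta: dict):
    # Server side: read the same keys back and return an OpenTelemetry
    # Context to use as the parent when handling the request.
    return propagate.extract(meta)
```

Only the carrier lookup would need to change if the protocol later standardized a different field.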

I have not included any examples / quickstart yet since it seemed a bit awkward to me: because the instrumentation doesn't generate telemetry itself, an example would mean either including other instrumentation or manually creating spans, both of which seemed out of scope for an example. If this instrumentation generates MCP telemetry in the future, that problem would go away. But let me know if you have any advice on this point.

@anuraaga requested a review from a team as a code owner April 17, 2025 05:48
dosubot bot added the size:L label (This PR changes 100-499 lines, ignoring generated files.) Apr 17, 2025

github-actions bot commented Apr 17, 2025

CLA Assistant Lite bot: All contributors have signed the CLA ✍️ ✅

@anuraaga
Contributor Author

I have read the CLA Document and I hereby sign the CLA

github-actions bot added a commit that referenced this pull request Apr 17, 2025
@codefromthecrypt
Contributor

@anuraaga good point on the example. How about raising a PR on this pointing the requirements.txt to your branch here? It should solve the broken trace. That example already uses openinference for openai-agents and MCP, so this should fix it, I think.

        self.traces.clear()


class OTLPServer(HTTPServer):
Contributor Author

We need to start up an HTTP collector mostly because the stdio transport starts a subprocess. I considered just returning the parent span context in the tool's response, but figured it's worth setting this up now so it can be extended to actually generate telemetry in the future without new test setup.
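For reference, a rough sketch (not this PR's actual test code) of what such an in-process OTLP/HTTP collector can look like, assuming the opentelemetry-proto package provides the protobuf request/response messages:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

from opentelemetry.proto.collector.trace.v1.trace_service_pb2 import (
    ExportTraceServiceRequest,
    ExportTraceServiceResponse,
)


class OTLPHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # The OTLP/HTTP exporter POSTs protobuf-encoded spans to /v1/traces.
        body = self.rfile.read(int(self.headers["Content-Length"]))
        request = ExportTraceServiceRequest()
        request.ParseFromString(body)
        self.server.traces.append(request)  # collected for test assertions

        self.send_response(200)
        self.send_header("Content-Type", "application/x-protobuf")
        self.end_headers()
        self.wfile.write(ExportTraceServiceResponse().SerializeToString())


class OTLPServer(HTTPServer):
    def __init__(self, address=("127.0.0.1", 0)):
        super().__init__(address, OTLPHandler)
        self.traces = []
```

The subprocess can then be pointed at the server's ephemeral port via OTEL_EXPORTER_OTLP_ENDPOINT, and the test asserts on the collected ExportTraceServiceRequest messages.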

Contributor

Agree, this is the neatest way out I can think of. The MCP general discussion also has some hints about OTLP proxying on the agent (MCP-client) side, so this doubles as a hint of a potential future direction.

            await client.initialize()
            yield client
        case "sse":
            proc = await asyncio.create_subprocess_exec(
Contributor Author

While the SSE version could start the server in the same process, it seemed simplest to keep it consistent with stdio for these tests.

Contributor

Makes sense

@codefromthecrypt
Contributor

Probably after folks buy in generally, the next step could be to find where arize examples live and make something like I did, except in typescript for the https://github.com/Arize-ai/phoenix/tree/main/js/packages/phoenix-mcp server. That would be pretty cool.

@anuraaga
Contributor Author

how about raising a PR on this pointing the requirements.txt to your branch here? It should solve the broken trace.

Good idea. Just one extra requirements dependency fixed the trace:

elastic/observability-examples#62

the next step could be to find where arize examples live and make something like I did, except in typescript for the https://github.com/Arize-ai/phoenix/tree/main/js/packages/phoenix-mcp server

Makes sense, will work on TypeScript next.

Collaborator

@mikeldking left a comment

Thanks so much for the PR @anuraaga and the intro @codefromthecrypt! This seems like a great start to me, but I will get some folks on my team to take a look just for context building.

cc @axiomofjoy

@@ -0,0 +1 @@
__version__ = "1.0.0"
Collaborator

@mikeldking Apr 17, 2025

@axiomofjoy self-requested a review April 17, 2025 20:23
Contributor

@axiomofjoy left a comment

This is a solid foundation to build on. Small comments and questions, but it's ready to merge after those are addressed.

High-level question: My understanding is that MCP also allows servers to invoke clients. Will we need to handle context propagation in the opposite direction?

beeai: uv pip install --reinstall {toxinidir}/instrumentation/openinference-instrumentation-beeai[test]
mcp: uv pip uninstall -r test-requirements.txt
mcp: uv pip install --reinstall-package openinference-instrumentation-mcp .
mcp: python -c 'import openinference.instrumentation.mcp'
Contributor

What is this import needed for?

Collaborator

@mikeldking left a comment

Excited to try this out! Will merge and publish - just need to get the release-please in place.

@axiomofjoy changed the title from "feat: mcp python context propagation" to "feat(mcp): mcp python context propagation" Apr 17, 2025
@axiomofjoy merged commit 1af5f7d into Arize-ai:main Apr 17, 2025
9 checks passed
github-actions bot mentioned this pull request Apr 17, 2025
@axiomofjoy
Contributor

It's out: https://pypi.org/project/openinference-instrumentation-mcp/

Keep cooking

@anuraaga
Contributor Author

Thanks for the help! Especially 82b3972 - sorry, I had forgotten to fix that minimum version, but you found it.

@anuraaga
Contributor Author

High-level question: My understanding is that MCP also allows servers to invoke clients. Will we need to handle context propagation in the opposite direction?

This is a good point - I think the standard tools/resources wrapper on top of the protocol (what the MCP SDK sometimes calls FastMCP) does not expose sending requests from server to client, while the underlying protocol layer does support it. I'll look at getting demo code of that case set up to explore it - I'm a bit scared of how the trace may render if the parent/child relationship gets reversed in the middle of a request; I don't think I've ever tried something like that before.

@axiomofjoy
Contributor

This is a good point - I think the standard tools/resources wrapper on top of the protocol (what the MCP SDK sometimes calls FastMCP) does not expose sending requests from server to client, while the underlying protocol layer does support it. I'll look at getting demo code of that case set up to explore it - I'm a bit scared of how the trace may render if the parent/child relationship gets reversed in the middle of a request; I don't think I've ever tried something like that before.

Awesome, excited to see what you find

@codefromthecrypt
Contributor

For the follow-up, I think the best use case in the MCP protocol to showcase reverse propagation is sampling (a server asking a client to do an LLM completion). I elaborated on that here and hope it helps! #1543
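To make that concrete, here is a hypothetical sketch (not code from this PR or the MCP SDK) of how the same _meta carrier could flow in reverse for a sampling request; the send and on_sampling_request hooks are placeholders standing in for the real transport:

```python
from opentelemetry import propagate, trace

tracer = trace.get_tracer("mcp-reverse-propagation-sketch")


def send_sampling_request(send) -> None:
    # Server side: while handling a tool call, inject the current span's
    # context into the outgoing sampling request's _meta.
    with tracer.start_as_current_span("sampling/createMessage"):
        meta: dict = {}
        propagate.inject(meta)  # writes traceparent/tracestate into the dict
        send({"method": "sampling/createMessage", "params": {"_meta": meta}})


def on_sampling_request(params: dict) -> None:
    # Client side: extract _meta so the LLM-completion span becomes a child
    # of the server's span, even though the client initiated the session.
    ctx = propagate.extract(params.get("_meta") or {})
    with tracer.start_as_current_span("llm completion", context=ctx):
        ...  # call the model here
```

This only shows where inject/extract would sit; whether the resulting parent/child shape renders sensibly in a trace UI is exactly the open question above.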
