
Conversation

@anuraaga
Contributor

This reworks the context propagation instrumentation for MCP to instrument Transport instead of Client/Server. I had intentionally picked the latter at first out of concern about the implications of propagating context back from a server to a client, but on reflection that concern was overblown. If we treat them as two nodes talking to each other, it is perfectly fine to have A -> B -> A, since the second A just happens to be the same server but is not the same RPC method. This is the point raised in #1524 (review), where it was noted that MCP explicitly supports a server calling back to the client. This PR adds that usage to the test cases to demonstrate it.

Aside from supporting a server calling the client, we found a couple of other advantages of instrumenting Transport:

  • Some libraries, such as vercel AI, can accept the Transport from @modelcontextprotocol/sdk while using hand-written clients. By instrumenting Transport, these are traced without explicitly instrumenting vercel AI.
  • Some libraries don't use Client/Server at all to set up an MCP stream; notably, proxies such as the one in mcp-inspector work this way. Instrumenting Transport handles this case too.

The downside is that instrumentation needs to be updated for any new transport, though that is probably not a frequent event. The instrumentation logic is the same, so it is mostly a registration step.
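For illustration only, here is a minimal sketch of what wrapping a transport for bidirectional context propagation could look like, using the OpenTelemetry propagation API. The `Transport` and `JSONRPCMessage` shapes, the `wrapTransport` helper, and the use of `params._meta` as the carrier are assumptions for this sketch, not the actual code in this PR.

```typescript
import { context, propagation } from "@opentelemetry/api";

// Minimal shapes of an MCP JSON-RPC message and Transport, simplified from
// @modelcontextprotocol/sdk purely for illustration.
type JSONRPCMessage = {
  method?: string;
  params?: { _meta?: Record<string, string>; [key: string]: unknown };
  [key: string]: unknown;
};

interface Transport {
  send(message: JSONRPCMessage): Promise<void>;
  onmessage?: (message: JSONRPCMessage) => void;
}

// Wrap a single transport instance so trace context flows in both directions.
export function wrapTransport(transport: Transport): Transport {
  const originalSend = transport.send.bind(transport);

  // Outgoing requests: inject the active trace context into params._meta.
  transport.send = (message) => {
    if (message.method !== undefined) {
      const meta: Record<string, string> = { ...(message.params?._meta ?? {}) };
      propagation.inject(context.active(), meta);
      message.params = { ...(message.params ?? {}), _meta: meta };
    }
    return originalSend(message);
  };

  // Incoming messages: extract the propagated context and run the original
  // handler inside it. A real instrumentation would hook the `onmessage`
  // setter, since Client/Server assigns it after connect; this sketch
  // assumes it is already set.
  const originalOnMessage = transport.onmessage?.bind(transport);
  transport.onmessage = (message) => {
    const carrier = message.params?._meta ?? {};
    const extracted = propagation.extract(context.active(), carrier);
    context.with(extracted, () => originalOnMessage?.(message));
  };

  return transport;
}
```

Under this approach, supporting a new transport would mostly mean applying the same wrapper to the new class, which is the registration step mentioned above.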

/cc @codefromthecrypt

@anuraaga anuraaga requested a review from a team as a code owner April 24, 2025 05:56
@dosubot dosubot bot added the size:XL This PR changes 500-999 lines, ignoring generated files. label Apr 24, 2025
Contributor

@codefromthecrypt codefromthecrypt left a comment

Thanks so much for this @anuraaga! I had hoped both vercel and inspector would be traceable with OpenInference by my next talk (Cloud Native Summit early next week in Auckland).

Bravo and thanks for the support!

@mikeldking fyi another MCP win

Collaborator

@mikeldking mikeldking left a comment

This is great @anuraaga. Cool to see the new streamable module too.

@mikeldking mikeldking merged commit 292489e into Arize-ai:main Apr 24, 2025
4 checks passed