Improve OTel GenAI semantic convention compliance for LLM instrumentations #1593
Conversation
Force-pushed from abec7c9 to 265e892
…tions

This commit enhances OpenAI and Anthropic instrumentations to better comply with OpenTelemetry GenAI semantic conventions (https://opentelemetry.io/docs/specs/semconv/gen-ai/).

Changes:
- Set SpanKind=CLIENT for all GenAI API calls (was INTERNAL)
- Use OTel-compliant span names in "{operation} {model}" format (e.g., "chat gpt-4o-mini")
- Add request parameter attributes: gen_ai.request.max_tokens, temperature, top_p, top_k, stop_sequences, seed, frequency_penalty, presence_penalty
- Add gen_ai.tool.definitions when tools are provided in requests
- Add server.address and server.port attributes extracted from client base_url
- Remove deprecated gen_ai.system attribute from OpenAI instrumentation
- Add finish_reasons mapping for the OpenAI Responses API (status → finish_reason)

Files modified:
- main.py: Added _span_kind parameter support to Logfire.span()
- llm_provider.py: Set SpanKind.CLIENT, OTel span naming, server attributes
- openai.py: Request params extraction, tool definitions, remove deprecated attr
- anthropic.py: Request params extraction, tool definitions
- semconv.py: Added new semantic convention constants

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <[email protected]>
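For reference, a minimal sketch of the span shape these changes aim for, written against the plain OpenTelemetry Python API rather than Logfire's internals; the attribute values (and `gen_ai.request.model`) are illustrative, not taken from this PR's code:

```python
from opentelemetry import trace
from opentelemetry.trace import SpanKind

tracer = trace.get_tracer("example-instrumentation")

# Span name follows the "{operation} {model}" convention; SpanKind is CLIENT
# because the span wraps an outgoing API call.
with tracer.start_as_current_span("chat gpt-4o-mini", kind=SpanKind.CLIENT) as span:
    span.set_attribute("gen_ai.provider.name", "openai")
    span.set_attribute("gen_ai.request.model", "gpt-4o-mini")
    span.set_attribute("gen_ai.request.max_tokens", 256)
    span.set_attribute("gen_ai.request.temperature", 0.7)
    span.set_attribute("server.address", "api.openai.com")
    span.set_attribute("server.port", 443)
    # ...make the actual API call here, then record response attributes
    # such as gen_ai.response.finish_reasons.
```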
Force-pushed from 265e892 to eec1209
Add LangChain/LangGraph OTel GenAI semantic convention support

- Transform LangSmith spans to add OTel GenAI attributes
- Add gen_ai.input.messages and gen_ai.output.messages in the OTel schema
- Extract gen_ai.system_instructions from system messages
- Add gen_ai.response.finish_reasons
- Add logfire.instrument_langchain() to capture tool definitions
  - Patches BaseCallbackManager to inject a callback handler
  - Captures gen_ai.tool.definitions (not available via LangSmith OTEL)
  - Uses the _start()/_end() pattern to avoid async context issues
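This is not the PR's code, but a rough sketch of the capture side: a LangChain callback handler of the kind the patched BaseCallbackManager could inject. It assumes tool definitions surface via the `invocation_params` callback kwarg, which can vary between model integrations:

```python
from typing import Any

from langchain_core.callbacks import BaseCallbackHandler


class ToolDefinitionHandler(BaseCallbackHandler):
    """Illustrative handler that records tool definitions when a chat model starts."""

    def on_chat_model_start(self, serialized: dict, messages: Any, **kwargs: Any) -> None:
        # Assumption: bound tools show up in invocation_params for this callback.
        params = kwargs.get("invocation_params") or {}
        tools = params.get("tools")
        if tools:
            # The real instrumentation would attach these to the active span
            # as gen_ai.tool.definitions instead of printing them.
            print("captured tool definitions:", tools)
```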
Force-pushed from 5121d6a to 9307923
Force-pushed from 47a633e to 015babe
Force-pushed from 54a2096 to 2f34d18
Force-pushed from e41de92 to d35ecc4
Please make a new PR that just adds/changes these: SpanKind. The LangChain stuff should definitely be completely separated from the OpenAI/Anthropic stuff. Changing span names and dropping
@alexmojaki Thanks for the review, I am breaking this PR down further into 3 distinct PRs. I will also close this PR now.
Summary
This PR improves OTel GenAI Semantic Convention compliance across LLM instrumentations.
What's New
1. OpenAI & Anthropic Instrumentation Improvements
Enhanced existing instrumentations to better comply with OTel GenAI Semantic Conventions:
| Before | After |
| --- | --- |
| Span kind `INTERNAL` | Span kind `CLIENT` |
| Span name `"Chat Completion with {model!r}"` | Span name `"chat gpt-4o-mini"` |
| `gen_ai.system` = `"openai"` (deprecated) | `gen_ai.provider.name` = `"openai"` / `"anthropic"` |
| — | `server.address` = `"api.openai.com"` |
| — | `gen_ai.request.*`: `max_tokens`, `temperature`, `top_p`, etc. |
| — | `gen_ai.tool.definitions` |

2. LangChain/LangGraph Instrumentation (New)
Adds `logfire.instrument_langchain()` to capture tool definitions from LangChain agents.

Why: LangSmith's OTEL integration does NOT include tool definitions; they're sent via their proprietary API. This instrumentation patches LangChain's callback system to capture them.
Usage:
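A minimal sketch of the intended usage, assuming the new entry point needs no required arguments (`logfire.configure()` is the existing Logfire setup call; `instrument_langchain()` is the function added by this PR):

```python
import logfire

logfire.configure()

# Added by this PR: patches LangChain's BaseCallbackManager so that tool
# definitions (and message/finish-reason attributes) are recorded on spans.
logfire.instrument_langchain()

# Any LangChain/LangGraph agent run after this point is captured with the
# extra OTel GenAI attributes.
```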
Captures:
- `gen_ai.tool.definitions` (not available via LangSmith's OTEL export)
- `gen_ai.input.messages` and `gen_ai.output.messages` in the OTel schema
- `gen_ai.system_instructions` extracted from system messages
- `gen_ai.response.finish_reasons`
Related