feat: integrated litellm for tracing #532
Open · +1,187 −0
Pull Request
Summary
Add comprehensive LiteLLM integration to the OpenLayer Python SDK, enabling automatic tracing and monitoring of completions across 100+ LLM providers through LiteLLM's unified interface.
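As a rough sketch of the intended usage (the no-argument `trace_litellm()` call and the environment-variable configuration are assumptions for illustration, not confirmed by this diff):

```python
# Minimal usage sketch. trace_litellm() is the entry point this PR adds to
# openlayer.lib; the no-argument call and the environment-variable configuration
# (OPENLAYER_API_KEY, OPENLAYER_INFERENCE_PIPELINE_ID) are assumptions here.
import litellm
from openlayer.lib import trace_litellm

trace_litellm()  # patch LiteLLM so subsequent completions are traced

response = litellm.completion(
    model="gpt-4o-mini",  # any LiteLLM-supported model string
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)
print(response.choices[0].message.content)
```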
Changes
- New `litellm_tracer.py` tracer module with full support for streaming and non-streaming completions
- `trace_litellm()` function exposed in `openlayer.lib` for easy integration
- Token usage captured on streamed completions via `stream_options={"include_usage": True}` (see the streaming sketch below)
- Example notebook (`litellm_tracing.ipynb`) with multi-provider examples
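A sketch of how a traced streaming call might look; the flag is the one named above, while the model string and prompt are placeholders:

```python
# Streaming sketch. stream_options={"include_usage": True} is the flag called out
# above for capturing token usage on streamed completions; tracing is assumed to
# have been enabled beforehand with trace_litellm().
import litellm

stream = litellm.completion(
    model="gpt-4o-mini",  # illustrative model string
    messages=[{"role": "user", "content": "Tell me a one-line joke."}],
    stream=True,
    stream_options={"include_usage": True},  # final chunk carries token usage
)

for chunk in stream:
    # The usage-only final chunk has no choices, so guard before reading the delta.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```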
Context
LiteLLM is a popular library that provides a unified interface to call 100+ LLM APIs (OpenAI, Anthropic, Google, AWS Bedrock, etc.) using the same input/output format. This integration lets users trace and monitor completions from any of these providers without provider-specific instrumentation.
This addresses the need for comprehensive LLM monitoring across diverse model providers in production environments.
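To make the unified-interface point concrete, the same call shape covers multiple providers, which is what lets a single tracer handle all of them. The model strings below are illustrative and assume the corresponding provider API keys are configured:

```python
# Illustration of LiteLLM's unified interface: one call shape across providers.
# Model strings are examples only and require the matching provider API keys.
import litellm

messages = [{"role": "user", "content": "In one sentence, why does tracing matter?"}]

for model in ("gpt-4o-mini", "claude-3-haiku-20240307", "gemini/gemini-1.5-flash"):
    response = litellm.completion(model=model, messages=messages)
    print(f"{model}: {response.choices[0].message.content}")
```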
Testing
Test Results:
Key Technical Achievements: