feat: integrated litellm for tracing #532
Open · shah-siddd wants to merge 5 commits into main from siddhant/open-7336-integrate-litellm-for-tracing
Commits (5):
- 9220e10 feat: integrated litellm for tracing (shah-siddd)
- 34061e9 style: lint fixes (shah-siddd)
- cda21db test: fix tests (shah-siddd)
- 831192d fix: fixed model names and OpenLayer to Openlayer (shah-siddd)
- 3b78fde test: fixed test cases (shah-siddd)
@@ -0,0 +1,169 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"[](https://colab.research.google.com/github/openlayer-ai/openlayer-python/blob/main/examples/tracing/litellm/litellm_tracing.ipynb)\n",
"\n",
"\n",
"# <a id=\"top\">LiteLLM monitoring quickstart</a>\n",
"\n",
"This notebook illustrates how to get started monitoring LiteLLM completions with Openlayer.\n",
"\n",
"LiteLLM provides a unified interface to call 100+ LLM APIs using the same input/output format. This integration allows you to trace and monitor completions across all supported providers through a single interface.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install openlayer litellm\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 1. Set the environment variables\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"import litellm\n",
"\n",
"# Set your API keys for the providers you want to use\n",
"os.environ[\"OPENAI_API_KEY\"] = \"YOUR_OPENAI_API_KEY_HERE\"\n",
"os.environ[\"ANTHROPIC_API_KEY\"] = \"YOUR_ANTHROPIC_API_KEY_HERE\" # Optional\n",
"os.environ[\"GROQ_API_KEY\"] = \"YOUR_GROQ_API_KEY_HERE\" # Optional\n",
"\n",
"# Openlayer env variables\n",
"os.environ[\"OPENLAYER_API_KEY\"] = \"YOUR_OPENLAYER_API_KEY_HERE\"\n",
"os.environ[\"OPENLAYER_INFERENCE_PIPELINE_ID\"] = \"YOUR_OPENLAYER_INFERENCE_PIPELINE_ID_HERE\"\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 2. Enable LiteLLM tracing\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from openlayer.lib import trace_litellm\n",
"\n",
"# Enable tracing for all LiteLLM completions\n",
"trace_litellm()\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 3. Use LiteLLM normally - tracing happens automatically!\n",
"\n",
"### Basic completion with OpenAI\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Basic completion with OpenAI GPT-4\n",
"response = litellm.completion(\n",
" model=\"gpt-4\",\n",
" messages=[\n",
" {\"role\": \"system\", \"content\": \"You are a helpful assistant.\"},\n",
" {\"role\": \"user\", \"content\": \"What is the capital of France?\"}\n",
" ],\n",
" temperature=0.7,\n",
" max_tokens=100,\n",
" inference_id=\"litellm-openai-example-1\" # Optional: custom inference ID\n",
")\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Multi-provider comparison\n",
"\n",
"One of LiteLLM's key features is the ability to easily switch between providers. Let's trace completions from different providers:\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Test the same prompt with different models/providers\n",
"prompt = \"Explain quantum computing in simple terms.\"\n",
"messages = [{\"role\": \"user\", \"content\": prompt}]\n",
"\n",
"models_to_test = [\n",
" \"gpt-3.5-turbo\", # OpenAI\n",
" \"claude-3-haiku-20240307\", # Anthropic (if API key is set)\n",
" \"groq/llama-3.1-8b-instant\", # Groq (if API key is set)\n",
"]\n",
"\n",
"for model in models_to_test:\n",
" response = litellm.completion(\n",
" model=model,\n",
" messages=messages,\n",
" temperature=0.5,\n",
" max_tokens=150,\n",
" inference_id=f\"multi-provider-{model.replace('/', '-')}\"\n",
" )\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 4. View your traces\n",
"\n",
"Once you've run the examples above, you can:\n",
"\n",
"1. **Visit your Openlayer dashboard** to see all the traced completions\n",
"2. **Analyze performance** across different models and providers\n",
"3. **Monitor costs** and token usage\n",
"4. **Debug issues** with detailed request/response logs\n",
"5. **Compare models** side-by-side\n",
"\n",
"The traces will include:\n",
"- **Request details**: Model, parameters, messages\n",
"- **Response data**: Generated content, token counts, latency\n",
"- **Provider information**: Which underlying service was used\n",
"- **Custom metadata**: Any additional context you provide\n",
"\n",
"For more information, check out:\n",
"- [Openlayer Documentation](https://docs.openlayer.com/)\n",
"- [LiteLLM Documentation](https://docs.litellm.ai/)\n",
"- [LiteLLM Supported Models](https://docs.litellm.ai/docs/providers)\n"
]
}
],
"metadata": {
"language_info": {
"name": "python"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
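The notebook's section 1 sets several environment variables inline. As a minimal sketch of a fail-fast check (the `require_env` helper below is hypothetical, not part of the Openlayer SDK or LiteLLM), the required variables could be validated before tracing is enabled, so a missing key fails immediately rather than mid-request:

```python
import os


def require_env(*names: str) -> dict:
    """Return the requested environment variables, raising if any are unset."""
    # Collect every missing name first so the error message lists them all at once.
    missing = [n for n in names if not os.environ.get(n)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {n: os.environ[n] for n in names}


# Guard the two variables the Openlayer integration reads in the notebook.
config = require_env("OPENLAYER_API_KEY", "OPENLAYER_INFERENCE_PIPELINE_ID")
```

This keeps the setup cell's behavior unchanged while surfacing configuration mistakes with a single clear error.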
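The multi-provider loop builds each `inference_id` with `model.replace('/', '-')`, flattening provider-prefixed model names like `groq/llama-3.1-8b-instant` into a single slug. That convention can be factored into a small pure helper (the name `make_inference_id` is illustrative, not an SDK function):

```python
def make_inference_id(prefix: str, model: str) -> str:
    """Derive a flat inference ID from a run prefix and a LiteLLM model name.

    Model names may carry a provider prefix separated by "/", e.g.
    "groq/llama-3.1-8b-instant"; replacing "/" with "-" keeps the ID flat.
    """
    return f"{prefix}-{model.replace('/', '-')}"


# IDs for the three models in the notebook's comparison loop
ids = [
    make_inference_id("multi-provider", m)
    for m in ["gpt-3.5-turbo", "claude-3-haiku-20240307", "groq/llama-3.1-8b-instant"]
]
```

Centralizing the slug rule means every traced completion in a comparison run gets a consistently formatted, easily filterable ID.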
Review comment: Some instances of "OpenLayer" here and in other files (e.g., litellm_tracer.py and __init__.py)