LLM Langchain wrap call in chain to display it in sentry AI tab #13905


Open · wants to merge 6 commits into master

Conversation

@Luke31 Luke31 commented Jun 3, 2025

DESCRIBE YOUR PR

tackles part of issue #13167 and getsentry/sentry-python#3007 (comment)

The current documentation works great for traces and profiles. However, if you set up Langchain as described in the documentation, the AI LLM Monitoring tab in Sentry remains empty.

Update the Langchain example to introduce a chain with a custom run name, which will be displayed as a pipeline in the AI dashboard.

image

This will show with the error as

image

including the message

image

Also changed "Paris" to "France", since Paris has no capital 🇫🇷

IS YOUR CHANGE URGENT?

Help us prioritize incoming PRs by letting us know when the change needs to go live.

  • Urgent deadline (GA date, etc.):
  • Other deadline:
  • None: Not urgent, can wait up to 1 week+

SLA

  • Teamwork makes the dream work, so please add a reviewer to your PRs.
  • Please give the docs team up to 1 week to review your PR unless you've added an urgent due date to it.
    Thanks in advance for your help!

PRE-MERGE CHECKLIST

Make sure you've checked the following before merging your changes:

  • Checked Vercel preview for correctness, including links
  • PR was reviewed and approved by any necessary SMEs (subject matter experts)
  • PR was reviewed and approved by a member of the Sentry docs team

LEGAL BOILERPLATE

Look, I get it. The entity doing business as "Sentry" was incorporated in the State of Delaware in 2015 as Functional Software, Inc. and is gonna need some rights from me in order to utilize my contributions in this here PR. So here's the deal: I retain all rights, title and interest in and to my contributions, and by keeping this boilerplate intact I confirm that Sentry can use, modify, copy, and redistribute my contributions, under Sentry's choice of terms.

EXTRA RESOURCES

paris is already the capital

vercel bot commented Jun 3, 2025

@Luke31 is attempting to deploy a commit to the Sentry Team on Vercel.

A member of the Team first needs to authorize it.

Luke31 added 2 commits June 3, 2025 13:06
convert to chain
indentation
@Luke31 Luke31 changed the title Update index.mdx LLM Langchain wrap call in chain to display it in sentry AI tab Jun 3, 2025
Add hint to enable tracing to view data in LLM tab
@Luke31 Luke31 marked this pull request as ready for review June 3, 2025 04:44
@AbhiPrasad AbhiPrasad requested a review from colin-sentry June 9, 2025 19:35
@szokeasaurusrex szokeasaurusrex left a comment


Looks good, although I have a small question

import sentry_sdk

sentry_sdk.init(...)  # same as above

llm = ChatOpenAI(model="gpt-3.5-turbo-0125", temperature=0, api_key="bad API key")
chain = (llm | StrOutputParser()).with_config({"run_name": "test-run"})
with sentry_sdk.start_transaction(op="ai-inference", name="The result of the AI inference"):
    response = chain.invoke([("system", "What is the capital of France?")])

I'm admittedly not super familiar with Langchain. Could you explain what this change does differently from what we had before? I get that the run_name allows it to show up in Sentry, but what is the StrOutputParser() for?
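For context: as I understand it, LangChain's `StrOutputParser` extracts the plain-text content from the model's chat message, so the chain's final output is a `str` rather than an `AIMessage` object. A rough pure-Python sketch of that behavior (`FakeAIMessage` and `parse_str_output` are hypothetical stand-ins, not LangChain APIs):

```python
# Hypothetical stand-in for langchain_core.messages.AIMessage
class FakeAIMessage:
    def __init__(self, content: str):
        self.content = content

def parse_str_output(message) -> str:
    # Like StrOutputParser: return the message's text content so downstream
    # steps receive a plain string instead of a message object.
    return message.content if hasattr(message, "content") else str(message)

result = parse_str_output(FakeAIMessage("The capital of France is Paris."))
print(result)  # -> The capital of France is Paris.
```

In the chain `(llm | StrOutputParser())`, this is simply the last pipeline step, turning the model's message into a string.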

@@ -36,6 +36,8 @@ An additional dependency, `tiktoken`, is required to be installed if you want to

In addition to capturing errors, you can monitor interactions between multiple services or applications by [enabling tracing](/concepts/key-terms/tracing/). You can also collect and analyze performance profiles from real users with [profiling](/product/explore/profiling/).

Tracing is required to see AI pipelines in Sentry's [AI tab](https://sentry.io/insights/ai/llm-monitoring/).
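For reference, a minimal sketch of what enabling tracing looks like in the Python SDK; the DSN below is a placeholder, and `traces_sample_rate=1.0` samples every transaction (you'd typically lower it in production):

```python
import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    # Enabling tracing is what makes AI pipelines appear in LLM Monitoring
    traces_sample_rate=1.0,  # sample 100% of transactions; tune for production
)
```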
Thanks for adding this, definitely an oversight that we forgot to add this before!

vercel bot commented Jun 10, 2025

The latest updates on your projects:

  • sentry-docs: ✅ Ready (Jun 10, 2025 9:28am)
  • develop-docs: ⬜️ Ignored, deployment skipped (Jun 10, 2025 9:28am)
