
Added support to get usage for ChatGoogleGenerativeAI #893

Open · wants to merge 4 commits into base: main

Conversation

NeerajG03

This PR has been raised to add support for tracking the usage of ChatGoogleGenerativeAI's LangChain calls.
It adds a simple check for the usage_metadata attribute and then parses it using _parse_usage_model.
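
Roughly, a minimal sketch of that check, assuming LangChain's message chunks expose usage_metadata as a dict of token counts and reusing the existing _parse_usage_model helper in langfuse/callback/langchain.py (the actual diff is quoted further down in this thread):

    # Sketch only, not the verbatim PR diff: read token usage off the message
    # chunk when the provider (e.g. ChatGoogleGenerativeAI) reports it there.
    def _usage_from_message_chunk(message_chunk):
        usage = getattr(message_chunk, "usage_metadata", None)
        if usage is not None:
            # usage is a dict such as {"input_tokens": ..., "output_tokens": ..., "total_tokens": ...}
            return _parse_usage_model(usage)
        return None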

@CLAassistant

CLAassistant commented Aug 29, 2024

CLA assistant check
All committers have signed the CLA.

@greptile-apps (bot) left a comment

Disclaimer: Experimental PR review

PR Summary

Added support for tracking usage of ChatGoogleGenerativeAI's langchain calls in the Langfuse Python SDK.

  • Introduced check for 'usage_metadata' attribute in message chunks in langfuse/callback/langchain.py
  • Implemented parsing of 'usage_metadata' using _parse_usage_model function
  • Enhanced usage tracking capabilities for Google's Generative AI in addition to existing LLM providers

1 file reviewed, no comments

@maxdeichmann
Member

maxdeichmann commented Aug 29, 2024

@NeerajG03 did you test this locally? Thanks a lot for the contribution!

@NeerajG03
Author

NeerajG03 commented Aug 29, 2024

@maxdeichmann I'm sorry, I'm very new to contributing to projects.
If you are asking whether I ran the tests with poetry run pytest -s -v --log-cli-level=INFO, then that's a yes. 👍
I ran it for test_langchain.py and test_langchain_integration.py.

@NeerajG03
Author

Here is a screenshot of the run summary:
[Screenshot: test run summary, 2024-08-29]

@aunitt

aunitt commented Apr 22, 2025

This also worked for me for ChatVertexAI.

It appears to fix my issues, which seem to be the ones mentioned in langfuse/langfuse#5468.

@@ -1139,6 +1139,11 @@ def _parse_usage(response: LLMResult):
break

I would suggest adding:

Suggested change
-    break
+    if llm_usage:
+        break

or, even better, modify the previous condition to:

    if generation_chunk.generation_info and (
        generation_chunk.generation_info.get("usage_metadata", None)
    ):

as in my testing with ChatVertexAI, generation_info contains an empty usage_metadata while the message contains a valid one.
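
A hedged sketch of the combined handling described here, using the names from the diff excerpts in this thread rather than the actual SDK source: take the first non-empty usage_metadata, whether it arrives on generation_info or on the message itself:

    # Sketch: walk the generations of the LLMResult and stop at the first
    # populated usage_metadata; empty dicts (as seen with ChatVertexAI's
    # generation_info) are skipped.
    llm_usage = None
    for generation in response.generations:
        for generation_chunk in generation:
            info = generation_chunk.generation_info or {}
            message_chunk = getattr(generation_chunk, "message", None)
            usage = info.get("usage_metadata") or getattr(message_chunk, "usage_metadata", None)
            if usage:
                llm_usage = _parse_usage_model(usage)
                break
        if llm_usage:
            break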

Comment on lines +1142 to +1146

    if hasattr(message_chunk, "usage_metadata") and message_chunk.usage_metadata is not None:  # for ChatGoogleGenerativeAI
        llm_usage = _parse_usage_model(message_chunk.usage_metadata)
        break

Looking at it more closely, this should have no effect, as a few lines down there is:

    or getattr(message_chunk, "usage_metadata", None)  # for Ollama
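
For context, a rough paraphrase (not the verbatim source) of the kind of fallback that line is part of; since the existing condition already ors in the message-level attribute, the new hasattr check above is effectively covered:

    # Sketch: the existing branch already falls back to the message-level
    # attribute, which is what covers ChatGoogleGenerativeAI as well.
    usage_source = (
        (generation_chunk.generation_info or {}).get("usage_metadata")
        or getattr(message_chunk, "usage_metadata", None)  # for Ollama
    )
    if usage_source:
        llm_usage = _parse_usage_model(usage_source)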
