
Conversation

@keyuchen21
Contributor

Summary

Fixes #32637 - ChatPromptTemplate.save() method now properly saves templates to files instead of raising NotImplementedError.

Problem

The save() method in ChatPromptTemplate was not implemented and simply raised NotImplementedError, preventing users from saving their chat prompt templates to files.

Solution

  • Implemented a working save() method that uses dumpd() for proper serialization
  • The method supports both JSON and YAML formats
  • Ensures saved templates can be correctly loaded back using the load() function

Changes

  1. Implemented ChatPromptTemplate.save() method:

    • Uses dumpd() from langchain_core.load for proper serialization
    • Supports JSON and YAML file formats
    • Creates parent directories if they don't exist
    • Validates file extensions and raises appropriate errors
  2. Added comprehensive unit tests:

    • Test saving/loading to JSON format
    • Test saving/loading to YAML format
    • Test with MessagesPlaceholder
    • Test error handling for invalid file extensions
    • Test error handling for partial variables
    • Test directory creation
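The file-handling steps listed above can be sketched in isolation. This is an illustrative outline, not the PR's exact code: `save_serialized` is a hypothetical helper name, and in the real method the dict comes from `dumpd()` applied to the template.

```python
import json
from pathlib import Path


def save_serialized(prompt_dict: dict, file_path) -> None:
    """Write an already-serialized prompt dict to a .json or .yaml file."""
    save_path = Path(file_path)
    # Create parent directories if they don't exist.
    save_path.parent.mkdir(parents=True, exist_ok=True)
    if save_path.suffix == ".json":
        save_path.write_text(json.dumps(prompt_dict, indent=2))
    elif save_path.suffix in (".yaml", ".yml"):
        import yaml  # lazy import: only needed for YAML output

        save_path.write_text(yaml.dump(prompt_dict, default_flow_style=False))
    else:
        # Validate the file extension and raise an appropriate error.
        raise ValueError(f"{save_path} must end in .json, .yaml, or .yml")
```

The actual method additionally rejects templates with partial variables before serializing, per the error-handling bullets above.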

Test plan

Added unit tests that verify:

  • Templates can be saved to JSON and loaded back correctly
  • Templates can be saved to YAML and loaded back correctly
  • Proper error handling for invalid file extensions
  • Proper error handling when partial variables are present
  • Parent directories are created automatically when saving to nested paths
  • Templates with MessagesPlaceholder work correctly

All tests pass successfully.

Example Usage

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.load import load

# Create a chat prompt template
template = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant named {assistant_name}."),
    ("human", "{question}")
])

# Save to file
template.save("my_prompt.json")

# Load it back
import json

with open("my_prompt.json") as f:
    loaded_template = load(json.load(f))

# Use the loaded template
messages = loaded_template.format_messages(
    assistant_name="Claude",
    question="What can you help me with?"
)

shahrukh-shaik and others added 5 commits August 20, 2025 18:48
If "annotations" is not present or is None, it defaults to an empty list []. Similar to langchain-ai#3875:
"annotations": block.get("annotations") or []
Fixes langchain-ai#26348 where streaming LLM outputs could fail with TypeError when
UsageMetadata contains None values for token counts.

The issue occurred when providers returned None for token counts during
streaming, causing _dict_int_op to raise a ValueError about unsupported
types.

Changes:
- Modified _dict_int_op to treat None values as the default value (0)
- Added comprehensive tests for None value handling
- Updated error message to include None as a supported type

This ensures robust handling of partial or missing token metadata
during streaming operations.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
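The None-tolerant merge described in the commit above can be sketched as follows; `merge_token_counts` is an illustrative stand-in, not the actual signature of the internal `_dict_int_op` helper.

```python
def merge_token_counts(left: dict, right: dict, default: int = 0) -> dict:
    """Add two token-count dicts, treating missing or None values as `default`."""
    merged = {}
    for key in set(left) | set(right):
        l_val = left.get(key, default)
        r_val = right.get(key, default)
        # Treat explicit None values as the default (0) instead of raising,
        # as some providers return None for token counts during streaming.
        merged[key] = (default if l_val is None else l_val) + (
            default if r_val is None else r_val
        )
    return merged
```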
- Replace NotImplementedError with working implementation
- Use dumpd() for proper serialization to ensure compatibility with load()
- Support both JSON and YAML formats
- Add comprehensive unit tests for save functionality
- Handle error cases (invalid extensions, partial variables)
- Ensure directory creation when saving to nested paths

Fixes langchain-ai#32637

Co-Authored-By: Keyu <[email protected]>
@keyuchen21 keyuchen21 requested a review from eyurtsev as a code owner August 22, 2025 21:27
@vercel

vercel bot commented Aug 22, 2025

The latest updates on your projects.

1 Skipped Deployment: langchain — Ignored (Preview), updated Aug 22, 2025 9:38pm (UTC)

@codspeed-hq

codspeed-hq bot commented Aug 22, 2025

CodSpeed WallTime Performance Report

Merging #32654 will not alter performance

Comparing keyuchen21:fix-chat-prompt-save (abdb539) with master (2d0713c)

⚠️ Unknown Walltime execution environment detected

Using the Walltime instrument on standard Hosted Runners will lead to inconsistent data.

For the most accurate results, we recommend using CodSpeed Macro Runners: bare-metal machines fine-tuned for performance measurement consistency.

Summary

✅ 13 untouched benchmarks

- Remove trailing whitespace
- Remove whitespace in blank lines
- Use Path.open() instead of open() for file operations
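The `Path.open()` change in the last bullet refers to the pathlib idiom (commonly flagged by lint rules such as ruff's PTH123); a minimal before/after, using a throwaway temp directory:

```python
import tempfile
from pathlib import Path

path = Path(tempfile.mkdtemp()) / "prompt.json"

# Before: builtin open() on a stringified path
with open(str(path), "w") as f:
    f.write("{}")

# After: Path.open() keeps the operation on the Path object itself
with path.open("w") as f:
    f.write("{}")
```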
@keyuchen21 keyuchen21 changed the title from "core: fix ChatPromptTemplate.save() implementation" to "fix(core): implement ChatPromptTemplate.save() method" Aug 22, 2025
- Parse JSON data before passing to load() function
- Fix trailing whitespace issues
- Tests now correctly verify save/load functionality
@codspeed-hq

codspeed-hq bot commented Aug 22, 2025

CodSpeed Instrumentation Performance Report

Merging #32654 will not alter performance

Comparing keyuchen21:fix-chat-prompt-save (abdb539) with master (2d0713c)

Summary

✅ 14 untouched benchmarks

- Add blank line after import statement
- Add trailing comma in function arguments
- Consolidate single-line ChatPromptTemplate instantiations
@mdrxy mdrxy added the core Related to the package `langchain-core` label Aug 25, 2025
@mdrxy mdrxy changed the title fix(core): implement ChatPromptTemplate.save() method fix(core): implement ChatPromptTemplate.save() method Aug 25, 2025
@keyuchen21
Contributor Author

@mdrxy any update?

Collaborator

@ccurme ccurme left a comment


Closing as duplicate of #33631.

@ccurme ccurme closed this Oct 22, 2025

Labels

core Related to the package `langchain-core`


Development

Successfully merging this pull request may close these issues.

ChatPromptTemplate save() not implemented

4 participants