Added caching of system prompt and user input #72
Conversation

@ottokruse can you review it
ottokruse
left a comment
@RyanFrench should have a look as he's the maintainer of this repo now.
From what I can see, this looks good and useful! Thanks @karthiky7
No tests though -- can we add a quick unit test for this?
If we run the GH action we'll see if typing and linting are ok.
    user_input: str | None,
    tools: Sequence[Callable | Tool] | None = None,
    stop_event: Event | None = None,
    enable_cache: bool | None = False,
nitpick: should we do
enable_cache: bool | None = None,
so you can see the difference between explicitly turned off and not set? Otherwise, it could be
enable_cache: bool = False,
Thank you @karthiky7 for this! As @ottokruse mentioned, could we please add some tests to cover both the caching-enabled and caching-disabled flows so that we ensure the correct messages are being added. I also agree with the other comment about setting the default value to None for enable_cache.
Issue #, if available:
Description of changes:
Implemented caching for system prompts and user inputs so repeated conversations can reuse cached context, reducing token consumption, latency, and overall cost while preserving existing agent behavior.
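As a rough sketch of the idea -- the dict shapes below mimic Converse-style cache-point blocks, but the field names are my assumptions, not necessarily what this PR implements:

```python
# Sketch: append a cache point after the system prompt and after the
# latest user input, so the provider can reuse that cached prefix on
# repeated turns. Field names are assumed, Converse-style shapes.

def with_cache_points(system_prompt: str, user_input: str) -> dict:
    cache_point = {"cachePoint": {"type": "default"}}
    return {
        "system": [{"text": system_prompt}, cache_point],
        "messages": [
            {
                "role": "user",
                "content": [{"text": user_input}, cache_point],
            }
        ],
    }


request = with_cache_points("You are a helpful assistant.", "Summarize this doc.")
```

Because the cached prefix is billed at a reduced rate and skipped on re-processing, repeated conversations see lower token cost and latency without any change to agent behavior.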
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.