Allow specification of maximum event contents in prompt to mitigate error compounding #154

Status: Open. Wants to merge 2 commits into base: main.
8 changes: 8 additions & 0 deletions src/google/adk/agents/llm_agent.py
@@ -133,6 +133,14 @@ class LlmAgent(BaseAgent):
user messages, tool results, etc.
"""

max_contents: Optional[int] = None
"""The maximum number of contents to include in the model request.

Hallucinated content can cause compounding errors if it is allowed to
persist in the request contents indefinitely. Set this large enough that
relevant content stays available to the model, but small enough that
older, irrelevant content is eventually dropped.
"""

# Controlled input/output configurations - Start
input_schema: Optional[type[BaseModel]] = None
"""The input schema when agent is used as a tool."""
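Assuming the new field acts as a simple cap on conversation history, configuring it might look like the sketch below. A stand-in dataclass is used so the example is self-contained; it mirrors only the field shown in this diff, not the full `LlmAgent` API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgentConfig:
    """Stand-in for the LlmAgent fields shown in the diff above."""
    name: str
    max_contents: Optional[int] = None  # None keeps the full history

# Cap the request to the 50 most recent contents so stale or
# hallucinated turns eventually age out of the model's context.
agent = AgentConfig(name="research_agent", max_contents=50)
```

Leaving `max_contents` at its default of `None` preserves the existing behavior of sending the entire event history.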
8 changes: 7 additions & 1 deletion src/google/adk/flows/llm_flows/contents.py
@@ -48,6 +48,7 @@ async def run_async(
invocation_context.branch,
invocation_context.session.events,
agent.name,
agent.max_contents,
)

# Maintain async generator behavior
@@ -186,7 +187,10 @@ def _rearrange_events_for_latest_function_response(


def _get_contents(
current_branch: Optional[str], events: list[Event], agent_name: str = ''
current_branch: Optional[str],
events: list[Event],
agent_name: str = '',
    max_contents: Optional[int] = None,
) -> list[types.Content]:
"""Get the contents for the LLM request.

@@ -224,6 +228,8 @@
result_events = _rearrange_events_for_async_function_responses_in_history(
result_events
)
if max_contents:
result_events = result_events[-max_contents:]
contents = []
for event in result_events:
content = copy.deepcopy(event.content)
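The truncation added in `_get_contents` is a plain negative slice guarded by a truthiness check. Its edge cases can be sketched in isolation (the helper name below is illustrative, not part of the PR):

```python
from typing import Optional

def truncate_events(events: list, max_contents: Optional[int]) -> list:
    # Mirrors the guard in the diff: a falsy value (None or 0) keeps
    # everything. Note that events[-0:] would return the full list
    # anyway, so `if max_contents:` also sidesteps that slicing quirk.
    if max_contents:
        return events[-max_contents:]
    return events

history = ["e1", "e2", "e3", "e4", "e5"]
truncate_events(history, 3)     # keeps the 3 newest: ["e3", "e4", "e5"]
truncate_events(history, None)  # keeps all 5
truncate_events(history, 10)    # cap exceeds history length: keeps all 5
```

One consequence of slicing after the rearrangement passes: a window that cuts between a function call and its response could still drop one half of the pair, so the cap is best set well above the longest expected tool-call exchange.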