Python: Add support for AutoGen's 0.2 ConversableAgent #10607

Merged (14 commits) on Feb 20, 2025
1 change: 1 addition & 0 deletions python/.coveragerc
@@ -1,6 +1,7 @@
[run]
source = semantic_kernel
omit =
    semantic_kernel/agents/autogen/autogen_conversable_channel.py
    semantic_kernel/connectors/memory/astradb/*
    semantic_kernel/connectors/memory/azure_cognitive_search/*
    semantic_kernel/connectors/memory/azure_cosmosdb/*
3 changes: 3 additions & 0 deletions python/pyproject.toml
@@ -50,6 +50,9 @@ dependencies = [

### Optional dependencies
[project.optional-dependencies]
autogen = [
    "autogen-agentchat == 0.2.40"
]
azure = [
    "azure-ai-inference >= 1.0.0b6",
    "azure-ai-projects >= 1.0.0b5",
@@ -0,0 +1,20 @@
## AutoGen Conversable Agent (v0.2.X)

Semantic Kernel Python supports running AutoGen Conversable Agents from the AutoGen 0.2.X package.

### Limitations

Currently, there are some limitations to note:

- AutoGen Conversable Agents in Semantic Kernel run asynchronously and do not support streaming of agent inputs or responses.
- The `AutoGenConversableAgent` in Semantic Kernel Python cannot be configured as part of a Semantic Kernel `AgentGroupChat`. As we progress towards GA for our agent group chat patterns, we will explore ways to integrate AutoGen agents into a Semantic Kernel group chat scenario.

### Installation

Install the `semantic-kernel` package with the optional `autogen` dependency:

```bash
pip install "semantic-kernel[autogen]"
```

For an example of how to integrate an AutoGen Conversable Agent using the Semantic Kernel Agent abstraction, please refer to [`autogen_conversable_agent_simple_convo.py`](autogen_conversable_agent_simple_convo.py).
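The samples in this PR consume the agent's `invoke` method as an async generator. Stripped of the AutoGen and Semantic Kernel dependencies, that consumption pattern can be sketched as follows; `ChatMessage` and `fake_invoke` are hypothetical stand-ins for the content objects and the real `invoke` method:

```python
import asyncio
from dataclasses import dataclass


@dataclass
class ChatMessage:
    """Hypothetical stand-in for the content objects yielded by `invoke`."""

    role: str
    name: str
    content: str


async def fake_invoke(message: str):
    # Hypothetical stand-in for AutoGenConversableAgent.invoke: an async
    # generator that yields one content object per conversation turn.
    yield ChatMessage(role="user", name="user", content=message)
    yield ChatMessage(role="assistant", name="code_executor_agent", content="done")


async def main() -> list[str]:
    # Consume the async generator turn by turn, as the samples do.
    lines = []
    async for content in fake_invoke("hello"):
        lines.append(f"# {content.role} - {content.name or '*'}: '{content.content}'")
    return lines


for line in asyncio.run(main()):
    print(line)
```

The real `invoke` yields richer content objects (including `items` such as function calls and results), but the iteration shape is the same.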
@@ -0,0 +1,61 @@
# Copyright (c) Microsoft. All rights reserved.

import asyncio

from autogen import ConversableAgent
from autogen.coding import LocalCommandLineCodeExecutor

from semantic_kernel.agents.autogen.autogen_conversable_agent import AutoGenConversableAgent

"""
The following sample demonstrates how to use the AutoGenConversableAgent to create a reply from an agent
to a message with a code block. The agent executes the code block and replies with the output.
The sample follows the AutoGen flow outlined here:
https://microsoft.github.io/autogen/0.2/docs/tutorial/code-executors#local-execution
"""


async def main():
    import os

    # Store generated code files in the directory where this script is located.
    temp_dir = os.path.dirname(os.path.realpath(__file__))

    # Create a local command line code executor.
    executor = LocalCommandLineCodeExecutor(
        timeout=10,  # Timeout for each code execution in seconds.
        work_dir=temp_dir,  # Directory in which to store the code files.
    )

    # Create an agent with code executor configuration.
    code_executor_agent = ConversableAgent(
        "code_executor_agent",
        llm_config=False,  # Turn off LLM for this agent.
        code_execution_config={"executor": executor},  # Use the local command line code executor.
        human_input_mode="ALWAYS",  # Always take human input for this agent for safety.
    )

    autogen_agent = AutoGenConversableAgent(conversable_agent=code_executor_agent)

    message_with_code_block = """This is a message with code block.
The code block is below:
```python
import numpy as np
import matplotlib.pyplot as plt
x = np.random.randint(0, 100, 100)
y = np.random.randint(0, 100, 100)
plt.scatter(x, y)
plt.savefig('scatter.png')
print('Scatter plot saved to scatter.png')
```
This is the end of the message.
"""

    async for content in autogen_agent.invoke(message=message_with_code_block):
        print(f"# {content.role} - {content.name or '*'}: '{content.content}'")


if __name__ == "__main__":
    asyncio.run(main())
Original file line number Diff line number Diff line change
@@ -0,0 +1,92 @@
# Copyright (c) Microsoft. All rights reserved.

import asyncio
import os
from typing import Annotated, Literal

from autogen import ConversableAgent, register_function

from semantic_kernel.agents.autogen.autogen_conversable_agent import AutoGenConversableAgent
from semantic_kernel.contents.function_call_content import FunctionCallContent
from semantic_kernel.contents.function_result_content import FunctionResultContent

"""
The following sample demonstrates how to use the AutoGenConversableAgent to create a conversation between two agents
where one agent suggests a tool function call and the other agent executes the tool function call.

In this example, the assistant agent suggests calculator tool calls and the user proxy agent executes them. Both
agents are created with the ConversableAgent class, and the calculator function is registered so that the assistant
can propose calls and the user proxy can run them.

This sample follows the AutoGen flow outlined here:
https://microsoft.github.io/autogen/0.2/docs/tutorial/tool-use
"""


Operator = Literal["+", "-", "*", "/"]


async def main():
    def calculator(a: int, b: int, operator: Annotated[Operator, "operator"]) -> int:
        if operator == "+":
            return a + b
        if operator == "-":
            return a - b
        if operator == "*":
            return a * b
        if operator == "/":
            return int(a / b)
        raise ValueError("Invalid operator")

    assistant = ConversableAgent(
        name="Assistant",
        system_message="You are a helpful AI assistant. "
        "You can help with simple calculations. "
        "Return 'TERMINATE' when the task is done.",
        llm_config={"config_list": [{"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}]},
    )

    user_proxy = ConversableAgent(
        name="User",
        llm_config=False,
        is_termination_msg=lambda msg: msg.get("content") is not None and "TERMINATE" in msg["content"],
        human_input_mode="NEVER",
    )

    # Register the calculator tool with both agents. `register_function` is equivalent to calling
    # `assistant.register_for_llm(...)` and `user_proxy.register_for_execution(...)` separately.
    register_function(
        calculator,
        caller=assistant,  # The assistant agent can suggest calls to the calculator.
        executor=user_proxy,  # The user proxy agent can execute the calculator calls.
        name="calculator",  # By default, the function name is used as the tool name.
        description="A simple calculator",  # A description of the tool.
    )

    autogen_conversable_agent = AutoGenConversableAgent(conversable_agent=user_proxy)

    async for content in autogen_conversable_agent.invoke(
        recipient=assistant,
        message="What is (44232 + 13312 / (232 - 32)) * 5?",
        max_turns=10,
    ):
        if any(isinstance(item, FunctionResultContent) for item in content.items):
            for item in content.items:
                if isinstance(item, FunctionResultContent):
                    print(f"# {content.role} - {content.name or '*'}: '{item.result}'")
        elif any(isinstance(item, FunctionCallContent) for item in content.items):
            for item in content.items:
                if isinstance(item, FunctionCallContent):
                    print(
                        f"# {content.role} - {content.name or '*'}: Function Name: '{item.function_name}', "
                        f"Arguments: '{item.arguments}'"
                    )
        else:
            print(f"# {content.role} - {content.name or '*'}: '{content.content}'")


if __name__ == "__main__":
    asyncio.run(main())
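A side note on the tool sample above: its `calculator` truncates division to an int (`int(a / b)`), so the tool's step-by-step answer for the prompt `(44232 + 13312 / (232 - 32)) * 5` differs from exact float arithmetic. A standalone sanity check (not part of the PR):

```python
# Same calculator logic as the sample's tool function.
def calculator(a: int, b: int, operator: str) -> int:
    if operator == "+":
        return a + b
    if operator == "-":
        return a - b
    if operator == "*":
        return a * b
    if operator == "/":
        return int(a / b)  # Truncating division: int(66.56) -> 66.
    raise ValueError("Invalid operator")


# (44232 + 13312 / (232 - 32)) * 5, evaluated with the tool step by step:
step1 = calculator(232, 32, "-")       # 200
step2 = calculator(13312, step1, "/")  # int(66.56) -> 66
step3 = calculator(44232, step2, "+")  # 44298
step4 = calculator(step3, 5, "*")      # 221490
print(step4)

# Exact float arithmetic yields a different value (~221492.8):
print((44232 + 13312 / (232 - 32)) * 5)
```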
@@ -0,0 +1,47 @@
# Copyright (c) Microsoft. All rights reserved.

import asyncio
import os

from autogen import ConversableAgent

from semantic_kernel.agents.autogen.autogen_conversable_agent import AutoGenConversableAgent

"""
The following sample demonstrates how to use the AutoGenConversableAgent to create a conversation between two agents,
where one agent asks for a joke and the other responds with one.

The sample follows the AutoGen flow outlined here:
https://microsoft.github.io/autogen/0.2/docs/tutorial/introduction#roles-and-conversations
"""


async def main():
    cathy = ConversableAgent(
        "cathy",
        system_message="Your name is Cathy and you are a part of a duo of comedians.",
        llm_config={
            "config_list": [{"model": "gpt-4o-mini", "temperature": 0.9, "api_key": os.environ.get("OPENAI_API_KEY")}]
        },
        human_input_mode="NEVER",  # Never ask for human input.
    )

    joe = ConversableAgent(
        "joe",
        system_message="Your name is Joe and you are a part of a duo of comedians.",
        llm_config={
            "config_list": [{"model": "gpt-4", "temperature": 0.7, "api_key": os.environ.get("OPENAI_API_KEY")}]
        },
        human_input_mode="NEVER",  # Never ask for human input.
    )

    autogen_agent = AutoGenConversableAgent(conversable_agent=cathy)

    async for content in autogen_agent.invoke(
        recipient=joe, message="Tell me a joke about the stock market.", max_turns=3
    ):
        print(f"# {content.role} - {content.name or '*'}: '{content.content}'")


if __name__ == "__main__":
    asyncio.run(main())
20 changes: 20 additions & 0 deletions python/semantic_kernel/agents/autogen/README.md
@@ -0,0 +1,20 @@
## AutoGen Conversable Agent (v0.2.X)

Semantic Kernel Python supports running AutoGen Conversable Agents from the AutoGen 0.2.X package.

### Limitations

Currently, there are some limitations to note:

- AutoGen Conversable Agents in Semantic Kernel run asynchronously and do not support streaming of agent inputs or responses.
- The `AutoGenConversableAgent` in Semantic Kernel Python cannot be configured as part of a Semantic Kernel `AgentGroupChat`. As we progress towards GA for our agent group chat patterns, we will explore ways to integrate AutoGen agents into a Semantic Kernel group chat scenario.

### Installation

Install the `semantic-kernel` package with the optional `autogen` dependency:

```bash
pip install "semantic-kernel[autogen]"
```

For an example of how to integrate an AutoGen Conversable Agent using the Semantic Kernel Agent abstraction, please refer to [`autogen_conversable_agent_simple_convo.py`](../../../samples/concepts/agents/autogen_conversable_agent/autogen_conversable_agent_simple_convo.py).
5 changes: 5 additions & 0 deletions python/semantic_kernel/agents/autogen/__init__.py
@@ -0,0 +1,5 @@
# Copyright (c) Microsoft. All rights reserved.

from semantic_kernel.agents.autogen.autogen_conversable_agent import AutoGenConversableAgent

__all__ = ["AutoGenConversableAgent"]