Python: Add support for AutoGen's 0.2 ConversableAgent (#10607)
### Motivation and Context

For those who use AutoGen's 0.2 `ConversableAgent`, we're providing support in Semantic Kernel to run this agent type. This assumes one will port their existing AG 0.2.X `ConversableAgent` code and run it in Semantic Kernel.

Note: as we move towards GA for our agent chat patterns, we are analyzing how to support AutoGen agents in those patterns as well. This PR does not provide support for the AG `ConversableAgent` group chat patterns that exist in the 0.2.X package.

### Description

Add support and samples for the AG 0.2.X `ConversableAgent`:

- Add unit test coverage.
- Add samples and READMEs.
- Closes #10407

### Contribution Checklist

- [X] The code builds clean without any errors or warnings
- [X] The PR follows the [SK Contribution Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md) and the [pre-submission formatting script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts) raises no violations
- [X] All unit tests pass, and I have added new tests where possible
- [X] I didn't break anyone 😄

Co-authored-by: Chris <[email protected]>
Showing 10 changed files with 712 additions and 66 deletions.
20 changes: 20 additions & 0 deletions
python/samples/concepts/agents/autogen_conversable_agent/README.md
@@ -0,0 +1,20 @@
## AutoGen Conversable Agent (v0.2.X)

Semantic Kernel Python supports running AutoGen Conversable Agents provided in the 0.2.X package.

### Limitations

Currently, there are some limitations to note:

- AutoGen Conversable Agents in Semantic Kernel run asynchronously and do not support streaming of agent inputs or responses.
- The `AutoGenConversableAgent` in Semantic Kernel Python cannot be configured as part of a Semantic Kernel `AgentGroupChat`. As we progress towards GA for our agent group chat patterns, we will explore ways to integrate AutoGen agents into a Semantic Kernel group chat scenario.

### Installation

Install the `semantic-kernel` package with the `autogen` extra:

```bash
pip install semantic-kernel[autogen]
```

For an example of how to integrate an AutoGen Conversable Agent using the Semantic Kernel Agent abstraction, please refer to [`autogen_conversable_agent_simple_convo.py`](autogen_conversable_agent_simple_convo.py).
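For orientation, here is a minimal, self-contained sketch of the pattern the samples in this PR follow: build a plain AutoGen 0.2 `ConversableAgent`, wrap it in `AutoGenConversableAgent`, and iterate its asynchronous `invoke`. This is only a sketch mirroring the samples below; it assumes the same `OPENAI_CHAT_MODEL_ID` and `OPENAI_API_KEY` environment variables, and the system message and prompt are placeholders.

```python
# Minimal sketch based on the samples in this PR; the system message and prompt are illustrative only.
import asyncio
import os

from autogen import ConversableAgent

from semantic_kernel.agents.autogen.autogen_conversable_agent import AutoGenConversableAgent


async def main():
    # A plain AutoGen 0.2 ConversableAgent, configured exactly as existing AG 0.2.X code would configure it.
    assistant = ConversableAgent(
        "assistant",
        system_message="You are a helpful assistant.",
        llm_config={
            "config_list": [
                {"model": os.environ["OPENAI_CHAT_MODEL_ID"], "api_key": os.environ["OPENAI_API_KEY"]}
            ]
        },
        human_input_mode="NEVER",
    )

    # Wrap it in the Semantic Kernel agent abstraction and invoke it asynchronously (no streaming support).
    agent = AutoGenConversableAgent(conversable_agent=assistant)
    async for content in agent.invoke(message="Summarize Semantic Kernel in one sentence."):
        print(f"# {content.role} - {content.name or '*'}: '{content.content}'")


if __name__ == "__main__":
    asyncio.run(main())
```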
61 changes: 61 additions & 0 deletions
...ples/concepts/agents/autogen_conversable_agent/autogen_conversable_agent_code_executor.py
@@ -0,0 +1,61 @@
# Copyright (c) Microsoft. All rights reserved.

import asyncio

from autogen import ConversableAgent
from autogen.coding import LocalCommandLineCodeExecutor

from semantic_kernel.agents.autogen.autogen_conversable_agent import AutoGenConversableAgent

"""
The following sample demonstrates how to use the AutoGenConversableAgent to create a reply from an agent
to a message with a code block. The agent executes the code block and replies with the output.
The sample follows the AutoGen flow outlined here:
https://microsoft.github.io/autogen/0.2/docs/tutorial/code-executors#local-execution
"""


async def main():
    import os

    # Use the directory where this script is located to store the generated code files.
    temp_dir = os.path.dirname(os.path.realpath(__file__))

    # Create a local command line code executor.
    executor = LocalCommandLineCodeExecutor(
        timeout=10,  # Timeout for each code execution in seconds.
        work_dir=temp_dir,  # Use this directory to store the code files.
    )

    # Create an agent with code executor configuration.
    code_executor_agent = ConversableAgent(
        "code_executor_agent",
        llm_config=False,  # Turn off LLM for this agent.
        code_execution_config={"executor": executor},  # Use the local command line code executor.
        human_input_mode="ALWAYS",  # Always take human input for this agent for safety.
    )

    autogen_agent = AutoGenConversableAgent(conversable_agent=code_executor_agent)

    message_with_code_block = """This is a message with code block.
The code block is below:
```python
import numpy as np
import matplotlib.pyplot as plt
x = np.random.randint(0, 100, 100)
y = np.random.randint(0, 100, 100)
plt.scatter(x, y)
plt.savefig('scatter.png')
print('Scatter plot saved to scatter.png')
```
This is the end of the message.
"""

    async for content in autogen_agent.invoke(message=message_with_code_block):
        print(f"# {content.role} - {content.name or '*'}: '{content.content}'")


if __name__ == "__main__":
    asyncio.run(main())
95 changes: 95 additions & 0 deletions
...s/concepts/agents/autogen_conversable_agent/autogen_conversable_agent_convo_with_tools.py
@@ -0,0 +1,95 @@
# Copyright (c) Microsoft. All rights reserved.

import asyncio
import os
from typing import Annotated, Literal

from autogen import ConversableAgent, register_function

from semantic_kernel.agents.autogen.autogen_conversable_agent import AutoGenConversableAgent
from semantic_kernel.contents.function_call_content import FunctionCallContent
from semantic_kernel.contents.function_result_content import FunctionResultContent

"""
The following sample demonstrates how to use the AutoGenConversableAgent to create a conversation between two agents
where one agent suggests a tool function call and the other agent executes the tool function call.
In this example, the assistant agent suggests a calculator tool function call to the user proxy agent. The user proxy
agent executes the calculator tool function call. The assistant agent and the user proxy agent are created using the
ConversableAgent class. The calculator tool function is registered with the assistant agent and the user proxy agent.
This sample follows the AutoGen flow outlined here:
https://microsoft.github.io/autogen/0.2/docs/tutorial/tool-use
"""


Operator = Literal["+", "-", "*", "/"]


async def main():
    def calculator(a: int, b: int, operator: Annotated[Operator, "operator"]) -> int:
        if operator == "+":
            return a + b
        if operator == "-":
            return a - b
        if operator == "*":
            return a * b
        if operator == "/":
            return int(a / b)
        raise ValueError("Invalid operator")

    assistant = ConversableAgent(
        name="Assistant",
        system_message="You are a helpful AI assistant. "
        "You can help with simple calculations. "
        "Return 'TERMINATE' when the task is done.",
        # Note: the model "gpt-4o" leads to a "division by zero" error that doesn't occur with "gpt-4o-mini"
        # or even "gpt-4".
        llm_config={
            "config_list": [{"model": os.environ["OPENAI_CHAT_MODEL_ID"], "api_key": os.environ["OPENAI_API_KEY"]}]
        },
    )

    # Create a Semantic Kernel AutoGenConversableAgent based on the AutoGen ConversableAgent.
    assistant_agent = AutoGenConversableAgent(conversable_agent=assistant)

    user_proxy = ConversableAgent(
        name="User",
        llm_config=False,
        is_termination_msg=lambda msg: msg.get("content") is not None and "TERMINATE" in msg["content"],
        human_input_mode="NEVER",
    )

    # Register the tool function with the assistant agent so it can suggest calls to it.
    assistant.register_for_llm(name="calculator", description="A simple calculator")(calculator)

    # Register the tool function with the user proxy agent so it can execute the calls.
    user_proxy.register_for_execution(name="calculator")(calculator)

    # register_function performs both registrations (caller and executor) in a single call.
    register_function(
        calculator,
        caller=assistant,  # The assistant agent can suggest calls to the calculator.
        executor=user_proxy,  # The user proxy agent can execute the calculator calls.
        name="calculator",  # By default, the function name is used as the tool name.
        description="A simple calculator",  # A description of the tool.
    )

    # Create a Semantic Kernel AutoGenConversableAgent based on the AutoGen ConversableAgent.
    user_proxy_agent = AutoGenConversableAgent(conversable_agent=user_proxy)

    async for content in user_proxy_agent.invoke(
        recipient=assistant_agent,
        message="What is (44232 + 13312 / (232 - 32)) * 5?",
        max_turns=10,
    ):
        for item in content.items:
            match item:
                case FunctionResultContent(result=r):
                    print(f"# {content.role} - {content.name or '*'}: '{r}'")
                case FunctionCallContent(function_name=fn, arguments=arguments):
                    print(
                        f"# {content.role} - {content.name or '*'}: Function Name: '{fn}', Arguments: '{arguments}'"
                    )
                case _:
                    print(f"# {content.role} - {content.name or '*'}: '{content.content}'")


if __name__ == "__main__":
    asyncio.run(main())
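As a quick sanity check on the prompt above (not part of the commit): the registered calculator truncates division to an int, so the expected chain of tool results is roughly the following.

```python
# Hypothetical check mirroring the sample's calculator, which uses int(a / b) for division.
inner = int(13312 / (232 - 32))   # 13312 / 200 -> 66 (the exact value is 66.56)
total = (44232 + inner) * 5       # (44232 + 66) * 5 -> 221490
print(inner, total)               # pure floating-point arithmetic would give 221492.8 instead
```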
61 changes: 61 additions & 0 deletions
...mples/concepts/agents/autogen_conversable_agent/autogen_conversable_agent_simple_convo.py
@@ -0,0 +1,61 @@
# Copyright (c) Microsoft. All rights reserved.

import asyncio
import os

from autogen import ConversableAgent

from semantic_kernel.agents.autogen.autogen_conversable_agent import AutoGenConversableAgent

"""
The following sample demonstrates how to use the AutoGenConversableAgent to create a conversation between two agents
where one agent asks for a joke and the other agent responds with one.
The sample follows the AutoGen flow outlined here:
https://microsoft.github.io/autogen/0.2/docs/tutorial/introduction#roles-and-conversations
"""


async def main():
    cathy = ConversableAgent(
        "cathy",
        system_message="Your name is Cathy and you are a part of a duo of comedians.",
        llm_config={
            "config_list": [
                {
                    "model": os.environ["OPENAI_CHAT_MODEL_ID"],
                    "temperature": 0.9,
                    "api_key": os.environ.get("OPENAI_API_KEY"),
                }
            ]
        },
        human_input_mode="NEVER",  # Never ask for human input.
    )

    cathy_autogen_agent = AutoGenConversableAgent(conversable_agent=cathy)

    joe = ConversableAgent(
        "joe",
        system_message="Your name is Joe and you are a part of a duo of comedians.",
        llm_config={
            "config_list": [
                {
                    "model": os.environ["OPENAI_CHAT_MODEL_ID"],
                    "temperature": 0.7,
                    "api_key": os.environ.get("OPENAI_API_KEY"),
                }
            ]
        },
        human_input_mode="NEVER",  # Never ask for human input.
    )

    joe_autogen_agent = AutoGenConversableAgent(conversable_agent=joe)

    async for content in cathy_autogen_agent.invoke(
        recipient=joe_autogen_agent, message="Tell me a joke about the stock market.", max_turns=3
    ):
        print(f"# {content.role} - {content.name or '*'}: '{content.content}'")


if __name__ == "__main__":
    asyncio.run(main())
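A brief note on running these samples: the tool-use and simple conversation samples read the model and API key from the `OPENAI_CHAT_MODEL_ID` and `OPENAI_API_KEY` environment variables, so both must be set beforehand, while the code executor sample disables the LLM (`llm_config=False`) and, with `human_input_mode="ALWAYS"`, prompts for human input before executing the code block locally.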
@@ -0,0 +1,20 @@
## AutoGen Conversable Agent (v0.2.X)

Semantic Kernel Python supports running AutoGen Conversable Agents provided in the 0.2.X package.

### Limitations

Currently, there are some limitations to note:

- AutoGen Conversable Agents in Semantic Kernel run asynchronously and do not support streaming of agent inputs or responses.
- The `AutoGenConversableAgent` in Semantic Kernel Python cannot be configured as part of a Semantic Kernel `AgentGroupChat`. As we progress towards GA for our agent group chat patterns, we will explore ways to integrate AutoGen agents into a Semantic Kernel group chat scenario.

### Installation

Install the `semantic-kernel` package with the `autogen` extra:

```bash
pip install semantic-kernel[autogen]
```

For an example of how to integrate an AutoGen Conversable Agent using the Semantic Kernel Agent abstraction, please refer to [`autogen_conversable_agent_simple_convo.py`](../../../samples/concepts/agents/autogen_conversable_agent/autogen_conversable_agent_simple_convo.py).
@@ -0,0 +1,5 @@
# Copyright (c) Microsoft. All rights reserved.

from semantic_kernel.agents.autogen.autogen_conversable_agent import AutoGenConversableAgent

__all__ = ["AutoGenConversableAgent"]