### Understanding Semantic Kernel AI Connectors

AI Connectors in Semantic Kernel are components that facilitate communication between the Kernel's core functionalities and various AI services. They abstract the intricate details of service-specific protocols, allowing developers to seamlessly interact with AI services for tasks like text generation, chat interactions, and more.

### Using AI Connectors in Semantic Kernel

Developers use AI connectors to connect their applications to different AI services. The connectors handle requests and responses, providing a uniform way to leverage these services without dealing with each service's specific communication protocol.

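The value of this abstraction can be illustrated without the SDK itself. The sketch below (plain Python, with hypothetical class and method names, not Semantic Kernel's actual API) shows the underlying pattern: application code depends only on a shared interface, so backends can be swapped without touching the caller.

```python
from abc import ABC, abstractmethod


class ChatService(ABC):
    """Minimal stand-in for a connector interface (hypothetical)."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...


class EchoService(ChatService):
    # Pretends to be one provider: echoes the prompt back.
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"


class CannedService(ChatService):
    # Pretends to be another provider: always returns a fixed reply.
    def complete(self, prompt: str) -> str:
        return "canned reply"


def ask(service: ChatService, prompt: str) -> str:
    # Application code sees only the interface, never the provider details.
    return service.complete(prompt)


print(ask(EchoService(), "Hello"))    # echo: Hello
print(ask(CannedService(), "Hello"))  # canned reply
```

Semantic Kernel's connectors apply the same principle at a larger scale, with async methods, typed message content, and per-service execution settings.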
### Creating Custom AI Connectors in Semantic Kernel

To create a custom AI connector in Semantic Kernel, subclass the base classes the framework provides, such as `ChatCompletionClientBase` (which builds on `AIServiceClientBase`). Below is a guide and example for implementing a mock AI connector:

#### Step-by-Step Walkthrough

1. **Understand the Base Classes**: The foundational classes `ChatCompletionClientBase` and `AIServiceClientBase` provide the methods and structures needed to create chat-based AI connectors.

2. **Implement the Connector**: Here's a mock implementation showing how to build a connector without a real service dependency, while remaining compatible with the Pydantic models the framework uses:

```python
# Import paths reflect recent versions of the Python SDK and may vary by version.
from semantic_kernel.connectors.ai.chat_completion_client_base import ChatCompletionClientBase
from semantic_kernel.contents.chat_message_content import ChatMessageContent
from semantic_kernel.contents.utils.author_role import AuthorRole


class MockAIChatCompletionService(ChatCompletionClientBase):
    def __init__(self, ai_model_id: str):
        # The base class is a Pydantic model, so fields must be passed
        # through the base initializer rather than assigned directly.
        super().__init__(ai_model_id=ai_model_id)

    async def _inner_get_chat_message_contents(self, chat_history, settings):
        # Mock implementation: a real connector would call the remote service
        # here and translate its response into ChatMessageContent objects.
        return [ChatMessageContent(role=AuthorRole.ASSISTANT, content="Mock response based on your history.")]

    def service_url(self):
        return "http://mock-ai-service.com"
```

### Usage Example

The following example demonstrates how to integrate and use the `MockAIChatCompletionService` in an application:

```python
import asyncio

from semantic_kernel.connectors.ai.prompt_execution_settings import PromptExecutionSettings
from semantic_kernel.contents.chat_history import ChatHistory


async def main():
    # Build the conversation through ChatHistory's helper methods.
    chat_history = ChatHistory()
    chat_history.add_user_message("Hello")

    settings = PromptExecutionSettings()
    service = MockAIChatCompletionService(ai_model_id="mock-model")

    # get_chat_message_contents is the public entry point inherited from
    # ChatCompletionClientBase; it delegates to the mock implementation above.
    response = await service.get_chat_message_contents(chat_history, settings)
    print(response)


# Run the main function
asyncio.run(main())
```
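Because the mock has no external dependencies, it is also convenient for unit testing. The sketch below exercises the same async flow with a simplified stand-in class (hypothetical names, no SDK dependency), so the calling pattern can be verified in isolation:

```python
import asyncio


class StandInChatService:
    """Simplified stand-in mirroring the mock connector's shape (hypothetical)."""

    def __init__(self, ai_model_id: str):
        self.ai_model_id = ai_model_id

    async def get_chat_message_contents(self, chat_history, settings):
        # Returns dummy messages, just as the mock connector above does.
        return [{"role": "assistant", "content": "Mock response based on your history."}]


async def exercise_service():
    service = StandInChatService(ai_model_id="mock-model")
    messages = await service.get_chat_message_contents(chat_history=[], settings=None)
    assert messages[0]["role"] == "assistant"
    return messages


result = asyncio.run(exercise_service())
print(result[0]["content"])  # Mock response based on your history.
```

The same structure carries over to tests against the real `MockAIChatCompletionService` once the SDK is installed.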

### Conclusion

By understanding the base class functionality, developers can effectively create custom connectors within Semantic Kernel. This structured approach streamlines integration with various AI services while keeping implementations aligned with the framework's architecture. Custom connectors offer flexibility: developers can add logging, authentication, or protocol-specific behavior as a particular service requires. The pattern shown here is a foundation on which more complex, service-specific extensions can be built, promoting robust and scalable AI service integration.