diff --git a/docs/karpor/1-getting-started/2-installation.md b/docs/karpor/1-getting-started/2-installation.md
index d6a1bb5d..d30deda6 100644
--- a/docs/karpor/1-getting-started/2-installation.md
+++ b/docs/karpor/1-getting-started/2-installation.md
@@ -91,34 +91,33 @@ helm install karpor-release kusionstack/karpor --set registryProxy=docker.m.daoc
 
 ### Enable AI features
 
-If you are trying to install Karpor with AI features, including natural language search and AI analyze, `ai-auth-token` and `ai-base-url` should be configured, e.g.:
+If you want to install Karpor with AI features, including natural language search and AI analysis, you should configure parameters such as `ai-auth-token`, `ai-base-url`, etc., for example:
 
 ```shell
-# At a minimum, server.ai.authToken and server.ai.baseUrl must be configured.
+# Minimal configuration, using OpenAI as the default AI backend
 helm install karpor-release kusionstack/karpor \
---set server.ai.authToken=YOUR_AI_TOKEN \
---set server.ai.baseUrl=https://api.openai.com/v1
+  --set server.ai.authToken={YOUR_AI_TOKEN}
 
-# server.ai.backend has default values `openai`, which can be overridden when necessary.
-# If the backend you are using is compatible with OpenAI, then there is no need to make
-# any changes here.
+# Example using Azure OpenAI
 helm install karpor-release kusionstack/karpor \
---set server.ai.authToken=YOUR_AI_TOKEN \
---set server.ai.baseUrl=https://api.openai.com/v1 \
---set server.ai.backend=huggingface
+  --set server.ai.authToken={YOUR_AI_TOKEN} \
+  --set server.ai.baseUrl=https://{YOUR_RESOURCE_NAME}.openai.azure.com \
+  --set server.ai.backend=azureopenai
 
-# server.ai.model has default values `gpt-3.5-turbo`, which can be overridden when necessary.
+# Example using Hugging Face
 helm install karpor-release kusionstack/karpor \
---set server.ai.authToken=YOUR_AI_TOKEN \
---set server.ai.baseUrl=https://api.openai.com/v1 \
---set server.ai.model=gpt-4o
+  --set server.ai.authToken={YOUR_AI_TOKEN} \
+  --set server.ai.model={YOUR_HUGGINGFACE_MODEL} \
+  --set server.ai.backend=huggingface
 
-# server.ai.topP and server.ai.temperature can also be manually modified.
+# Custom configuration
 helm install karpor-release kusionstack/karpor \
---set server.ai.authToken=YOUR_AI_TOKEN \
---set server.ai.baseUrl=https://api.openai.com/v1 \
---set server.ai.topP=0.5 \
---set server.ai.temperature=0.2
+  --set server.ai.authToken={YOUR_AI_TOKEN} \
+  --set server.ai.baseUrl=https://api.openai.com/v1 \
+  --set server.ai.backend=openai \
+  --set server.ai.model=gpt-3.5-turbo \
+  --set server.ai.topP=0.5 \
+  --set server.ai.temperature=0.2
 ```
 
 ## Chart Parameters
diff --git a/i18n/zh/docusaurus-plugin-content-docs-karpor/current/1-getting-started/2-installation.md b/i18n/zh/docusaurus-plugin-content-docs-karpor/current/1-getting-started/2-installation.md
index d1233cef..f3ff55df 100644
--- a/i18n/zh/docusaurus-plugin-content-docs-karpor/current/1-getting-started/2-installation.md
+++ b/i18n/zh/docusaurus-plugin-content-docs-karpor/current/1-getting-started/2-installation.md
@@ -91,29 +91,33 @@ helm install karpor-release kusionstack/karpor --set registryProxy=docker.m.daoc
 
 ### 启用 AI 功能
 
-如果您要安装带有AI功能的Karpor,包括自然语言搜索和AI分析,则应配置 `ai-auth-token` 和 `ai-base-url`,例如:
+如果您要安装带有 AI 功能的 Karpor,包括自然语言搜索和 AI 分析,则应配置 `ai-auth-token`、`ai-base-url` 等参数,例如:
 
 ```shell
-# 至少需要配置 server.ai.authToken 和 server.ai.baseUrl。
+# 最少配置,默认使用 OpenAI 作为 AI Backend
 helm install karpor-release kusionstack/karpor \
---set server.ai.authToken=YOUR_AI_TOKEN \
---set server.ai.baseUrl=https://api.openai.com/v1
-# server.ai.backend 的默认值是 `openai`,可以根据需要进行覆盖。如果你使用的后端与 OpenAI 兼容,则无需在此处进行任何更改。
+  --set server.ai.authToken={YOUR_AI_TOKEN}
+
+# 使用 Azure OpenAI 的样例
 helm install karpor-release kusionstack/karpor \
---set server.ai.authToken=YOUR_AI_TOKEN \
---set server.ai.baseUrl=https://api.openai.com/v1 \
---set server.ai.backend=huggingface
-# server.ai.model 的默认值是 `gpt-3.5-turbo`,可以根据需要进行覆盖。
+  --set server.ai.authToken={YOUR_AI_TOKEN} \
+  --set server.ai.baseUrl=https://{YOUR_RESOURCE_NAME}.openai.azure.com \
+  --set server.ai.backend=azureopenai
+
+# 使用 Hugging Face 的样例
 helm install karpor-release kusionstack/karpor \
---set server.ai.authToken=YOUR_AI_TOKEN \
---set server.ai.baseUrl=https://api.openai.com/v1 \
---set server.ai.model=gpt-4o
-# server.ai.topP 和 server.ai.temperature 也可以手动修改。
+  --set server.ai.authToken={YOUR_AI_TOKEN} \
+  --set server.ai.model={YOUR_HUGGINGFACE_MODEL} \
+  --set server.ai.backend=huggingface
+
+# 自定义配置
 helm install karpor-release kusionstack/karpor \
---set server.ai.authToken=YOUR_AI_TOKEN \
---set server.ai.baseUrl=https://api.openai.com/v1 \
---set server.ai.topP=0.5 \
---set server.ai.temperature=0.2
+  --set server.ai.authToken={YOUR_AI_TOKEN} \
+  --set server.ai.baseUrl=https://api.openai.com/v1 \
+  --set server.ai.backend=openai \
+  --set server.ai.model=gpt-3.5-turbo \
+  --set server.ai.topP=0.5 \
+  --set server.ai.temperature=0.2
 ```
 
 ## Chart 参数
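For reference, the `server.ai.*` settings shown in the new examples can also be supplied as a values file instead of repeated `--set` flags, since Helm maps dotted `--set` paths to nested values keys. Below is a minimal sketch of the "Custom configuration" example in that form, assuming only the keys used above; the file name `karpor-ai-values.yaml` is illustrative, not part of the chart.

```shell
# Sketch: the same custom configuration expressed as a values file.
# Key names mirror the --set paths above; replace {YOUR_AI_TOKEN} with a real token.
cat > karpor-ai-values.yaml <<'EOF'
server:
  ai:
    authToken: "{YOUR_AI_TOKEN}"
    baseUrl: "https://api.openai.com/v1"
    backend: "openai"
    model: "gpt-3.5-turbo"
    topP: 0.5
    temperature: 0.2
EOF

# Install using the values file instead of a chain of --set flags.
helm install karpor-release kusionstack/karpor -f karpor-ai-values.yaml
```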