docs: update AI feature installation instructions
- Rephrase instructions for enabling AI features for clarity
- Add examples for different AI backends (OpenAI, Azure OpenAI, Hugging Face)
- Restructure configuration examples to improve readability

These changes aim to make the installation process for AI features more intuitive and provide better guidance for users configuring different AI backends.
elliotxx committed Jan 27, 2025
1 parent c6376ae commit e6a2cbc
Showing 2 changed files with 39 additions and 36 deletions.
37 changes: 18 additions & 19 deletions docs/karpor/1-getting-started/2-installation.md
@@ -91,34 +91,33 @@ helm install karpor-release kusionstack/karpor --set registryProxy=docker.m.daoc

### Enable AI features

-If you are trying to install Karpor with AI features, including natural language search and AI analyze, `ai-auth-token` and `ai-base-url` should be configured, e.g.:
+If you want to install Karpor with AI features, including natural language search and AI analysis, you need to configure parameters such as `ai-auth-token` and `ai-base-url`, for example:

```shell
-# At a minimum, server.ai.authToken and server.ai.baseUrl must be configured.
+# Minimal configuration, using OpenAI as the default AI backend
helm install karpor-release kusionstack/karpor \
-  --set server.ai.authToken=YOUR_AI_TOKEN \
-  --set server.ai.baseUrl=https://api.openai.com/v1
+  --set server.ai.authToken={YOUR_AI_TOKEN}

-# server.ai.backend has default values `openai`, which can be overridden when necessary.
-# If the backend you are using is compatible with OpenAI, then there is no need to make
-# any changes here.
+# Example using Azure OpenAI
helm install karpor-release kusionstack/karpor \
-  --set server.ai.authToken=YOUR_AI_TOKEN \
-  --set server.ai.baseUrl=https://api.openai.com/v1 \
-  --set server.ai.backend=huggingface
+  --set server.ai.authToken={YOUR_AI_TOKEN} \
+  --set server.ai.baseUrl=https://{YOUR_RESOURCE_NAME}.openai.azure.com \
+  --set server.ai.backend=azureopenai

-# server.ai.model has default values `gpt-3.5-turbo`, which can be overridden when necessary.
+# Example using Hugging Face
helm install karpor-release kusionstack/karpor \
-  --set server.ai.authToken=YOUR_AI_TOKEN \
-  --set server.ai.baseUrl=https://api.openai.com/v1 \
-  --set server.ai.model=gpt-4o
+  --set server.ai.authToken={YOUR_AI_TOKEN} \
+  --set server.ai.model={YOUR_HUGGINGFACE_MODEL} \
+  --set server.ai.backend=huggingface

-# server.ai.topP and server.ai.temperature can also be manually modified.
+# Custom configuration
helm install karpor-release kusionstack/karpor \
-  --set server.ai.authToken=YOUR_AI_TOKEN \
-  --set server.ai.baseUrl=https://api.openai.com/v1 \
-  --set server.ai.topP=0.5 \
-  --set server.ai.temperature=0.2
+  --set server.ai.authToken={YOUR_AI_TOKEN} \
+  --set server.ai.baseUrl=https://api.openai.com/v1 \
+  --set server.ai.backend=openai \
+  --set server.ai.model=gpt-3.5-turbo \
+  --set server.ai.topP=0.5 \
+  --set server.ai.temperature=0.2
```

## Chart Parameters
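As context for the flags changed in this diff: each `--set server.ai.*` flag addresses a path in the chart's values, so the same configuration can equivalently be kept in a values file and passed to `helm install` with `-f`. A minimal sketch of that approach follows; the file path and placeholder values are illustrative, not part of the commit:

```shell
# Write the AI settings from the "custom configuration" example as a values file.
# YOUR_AI_TOKEN is a placeholder; replace it with a real token before installing.
cat > /tmp/karpor-ai-values.yaml <<'EOF'
server:
  ai:
    authToken: YOUR_AI_TOKEN
    baseUrl: https://api.openai.com/v1
    backend: openai
    model: gpt-3.5-turbo
    topP: 0.5
    temperature: 0.2
EOF

# Sanity-check that the keys landed in the file.
grep -E 'authToken|backend' /tmp/karpor-ai-values.yaml

# The install command then becomes (requires a cluster; shown for reference only):
#   helm install karpor-release kusionstack/karpor -f /tmp/karpor-ai-values.yaml
```

Note that Helm merges `-f` files and `--set` flags, with `--set` taking precedence, so a values file can hold stable defaults while tokens are still passed on the command line.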
@@ -91,29 +91,33 @@ helm install karpor-release kusionstack/karpor --set registryProxy=docker.m.daoc

### Enable AI features

-If you want to install Karpor with AI features, including natural language search and AI analysis, you should configure `ai-auth-token` and `ai-base-url`, for example:
+If you want to install Karpor with AI features, including natural language search and AI analysis, you need to configure parameters such as `ai-auth-token` and `ai-base-url`, for example:

```shell
-# At a minimum, server.ai.authToken and server.ai.baseUrl must be configured.
+# Minimal configuration, using OpenAI as the default AI backend
helm install karpor-release kusionstack/karpor \
-  --set server.ai.authToken=YOUR_AI_TOKEN \
-  --set server.ai.baseUrl=https://api.openai.com/v1
+  --set server.ai.authToken={YOUR_AI_TOKEN}

-# server.ai.backend defaults to `openai` and can be overridden when necessary. If the backend you are using is compatible with OpenAI, no changes are needed here.
+# Example using Azure OpenAI
helm install karpor-release kusionstack/karpor \
-  --set server.ai.authToken=YOUR_AI_TOKEN \
-  --set server.ai.baseUrl=https://api.openai.com/v1 \
-  --set server.ai.backend=huggingface
+  --set server.ai.authToken={YOUR_AI_TOKEN} \
+  --set server.ai.baseUrl=https://{YOUR_RESOURCE_NAME}.openai.azure.com \
+  --set server.ai.backend=azureopenai

-# server.ai.model defaults to `gpt-3.5-turbo` and can be overridden when necessary.
+# Example using Hugging Face
helm install karpor-release kusionstack/karpor \
-  --set server.ai.authToken=YOUR_AI_TOKEN \
-  --set server.ai.baseUrl=https://api.openai.com/v1 \
-  --set server.ai.model=gpt-4o
+  --set server.ai.authToken={YOUR_AI_TOKEN} \
+  --set server.ai.model={YOUR_HUGGINGFACE_MODEL} \
+  --set server.ai.backend=huggingface

-# server.ai.topP and server.ai.temperature can also be modified manually.
+# Custom configuration
helm install karpor-release kusionstack/karpor \
-  --set server.ai.authToken=YOUR_AI_TOKEN \
-  --set server.ai.baseUrl=https://api.openai.com/v1 \
-  --set server.ai.topP=0.5 \
-  --set server.ai.temperature=0.2
+  --set server.ai.authToken={YOUR_AI_TOKEN} \
+  --set server.ai.baseUrl=https://api.openai.com/v1 \
+  --set server.ai.backend=openai \
+  --set server.ai.model=gpt-3.5-turbo \
+  --set server.ai.topP=0.5 \
+  --set server.ai.temperature=0.2
```

## Chart Parameters
