
Commit a2bb83e

feat: oci genai
Signed-off-by: Anders Swanson <[email protected]>
1 parent 4a1fbed commit a2bb83e

File tree

1 file changed: +22, -1 lines


docs/reference/providers/backend.md

Lines changed: 22 additions & 1 deletion
@@ -2,7 +2,7 @@
 
 A Backend (also called Provider) is a service that provides access to the AI language model. There are many different backends available for K8sGPT. Each backend has its own strengths and weaknesses, so it is important to choose the one that is right for your needs.
 
-Currently, we have a total of 11 backends available:
+Currently, we have a total of 12 backends available:
 
 - [OpenAI](https://openai.com/)
 - [Cohere](https://cohere.com/)
@@ -14,6 +14,7 @@ Currently, we have a total of 11 backends available:
 - [Hugging Face](https://huggingface.co)
 - [IBM watsonx.ai](https://www.ibm.com/products/watsonx-ai)
 - [LocalAI](https://github.com/go-skynet/LocalAI)
+- [Oracle Cloud Infrastructure (OCI) Generative AI](https://www.oracle.com/artificial-intelligence/generative-ai/generative-ai-service/)
 - [Ollama](https://github.com/ollama/ollama)
 - FakeAI

@@ -195,6 +196,25 @@ LocalAI is a local model, which is an OpenAI compatible API. It uses llama.cpp a
 k8sgpt analyze --explain --backend localai
 ```
 
+## Oracle Cloud Infrastructure (OCI) Generative AI
+
+[Oracle Cloud Infrastructure (OCI)](https://www.oracle.com/cloud/) Generative AI is a fully managed OCI service that provides a set of state-of-the-art, customizable large language models.
+K8sGPT can be configured to use ready-to-use pretrained models, or to create and host your own fine-tuned custom models based on your own data on dedicated AI clusters.
+
+To authenticate with OCI, create an [OCI SDK/CLI](https://docs.oracle.com/en-us/iaas/Content/API/Concepts/sdkconfig.htm) `config` file in your home directory's `.oci/` directory.
+
+Next, configure the OCI backend for a given model within an OCI compartment:
+```bash
+k8sgpt auth add --backend oci --model <Model OCID> --compartmentId <Compartment OCID>
+```
+
+Analyze using the OCI backend:
+```bash
+k8sgpt analyze --explain --backend oci
+```
+
 ## Ollama (via LocalAI backend)
 
 Ollama is a local model, which has an OpenAI compatible API. It supports the models listed in the [Ollama library](https://ollama.com/library).
@@ -230,6 +250,7 @@ Ollama can get up and running locally with large language models. It runs Llama
 ```bash
 k8sgpt analyze --explain --backend ollama
 ```
 
 ## FakeAI
 
 FakeAI or the NoOpAiProvider might be useful in situations where you need to test a new feature or simulate the behaviour of an AI-based system without actually invoking it. It can help you with local development, testing, and troubleshooting.
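The OCI section in the diff above tells the reader to create an [OCI SDK/CLI](https://docs.oracle.com/en-us/iaas/Content/API/Concepts/sdkconfig.htm) `config` file but does not show one. As an illustration, a minimal `~/.oci/config` follows the documented `[DEFAULT]` profile layout; every value below is a placeholder, not a real credential:

```ini
# Minimal OCI SDK/CLI config sketch (~/.oci/config); all values are placeholders.
[DEFAULT]
user=ocid1.user.oc1..<unique-user-id>
fingerprint=<api-signing-key-fingerprint>
tenancy=ocid1.tenancy.oc1..<unique-tenancy-id>
region=us-ashburn-1
key_file=~/.oci/oci_api_key.pem
```

The `user`, `fingerprint`, `tenancy`, `region`, and `key_file` entries are the required keys per the linked SDK/CLI configuration documentation; the `k8sgpt auth add --backend oci ...` command shown in the diff relies on this file being in place.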

0 commit comments
