This is a no-nonsense async Scala client for the OpenAI API supporting all the available endpoints and params **including streaming**, the newest **ChatGPT completion**, and **voice routines** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:
Note that, in order to be consistent with the OpenAI API naming, the service function names match the API endpoint titles/descriptions exactly, in camel case.
Also, we aimed for the lib to be self-contained with the fewest dependencies possible; therefore, we ended up using only two libs at the top level: `play-ahc-ws-standalone` and `play-ws-standalone-json`. Additionally, if dependency injection is required, we use the `scala-guice` lib as well.
🔥 **New**: This lib also supports "OpenAI-API-compatible" providers such as [FastChat](https://github.com/lm-sys/FastChat), [Azure](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference), or any other service with a custom URL. Check the examples below for more details.
👉 Check out an article about the lib/client on [Medium](https://medium.com/@0xbnd/openai-scala-client-is-out-d7577de934ad).
Also try out our [Scala client for the Pinecone vector database](https://github.com/cequence-io/pinecone-scala), or use both clients together! [This demo project](https://github.com/cequence-io/pinecone-openai-scala-demo) shows how to generate and store OpenAI embeddings (with the `text-embedding-ada-002` model) in Pinecone and query them afterward. The OpenAI + Pinecone combo is commonly used for autonomous AI agents, such as [babyAGI](https://github.com/yoheinakajima/babyagi) and [AutoGPT](https://github.com/Significant-Gravitas/Auto-GPT).
**✔️ Important**: this is a "community-maintained" library and, as such, has no affiliation with the OpenAI company.
## Installation 🚀
The currently supported Scala versions are **2.12, 2.13**, and **3**.
To pull the library, add the following dependency to your *build.sbt*:
```
"io.cequence" %% "openai-scala-client" % "0.4.1"
```
or to *pom.xml* (if you use Maven):
```
<dependency>
    <groupId>io.cequence</groupId>
    <artifactId>openai-scala-client_2.12</artifactId>
    <version>0.4.1</version>
</dependency>
```
If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "0.4.1"` instead.
## Config ⚙️
Then you can obtain a service in one of the following ways.
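For illustration, the default way might look like this (a minimal sketch; it assumes the implicit execution context and Akka `Materializer` required by the underlying Play WS client are in scope, and that the API key is passed explicitly to `OpenAIServiceFactory`):

```scala
import akka.actor.ActorSystem
import akka.stream.Materializer
import io.cequence.openaiscala.service.OpenAIServiceFactory

import scala.concurrent.ExecutionContext

// required by the underlying Play WS client
implicit val ec: ExecutionContext = ExecutionContext.global
implicit val materializer: Materializer = Materializer(ActorSystem())

// create a service with an explicitly provided API key (and optional org id)
val service = OpenAIServiceFactory(
  apiKey = "your_api_key",
  orgId = None // optional organization id
)
```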
- Minimal `OpenAICoreService` supporting `listModels`, `createCompletion`, `createChatCompletion`, and `createEmbeddings` calls - e.g. a [FastChat](https://github.com/lm-sys/FastChat) service running on port 8000 (🔥 new)
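Connecting to such a provider could be sketched as follows (it assumes a FastChat server exposing the OpenAI-compatible REST API at `localhost:8000`, with `OpenAICoreServiceFactory` taking the base URL; the usual implicit execution context and materializer are expected to be in scope):

```scala
import io.cequence.openaiscala.service.OpenAICoreServiceFactory

// point the client at a locally running FastChat instance
val service = OpenAICoreServiceFactory("http://localhost:8000/v1/")
```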
**✔️ Important**: If you want streaming support, use `OpenAIServiceStreamedFactory` from the `openai-scala-client-stream` lib instead of `OpenAIServiceFactory` (in the three examples above). Three additional functions - `createCompletionStreamed`, `createChatCompletionStreamed`, and `listFineTuneEventsStreamed` - provided by [OpenAIServiceStreamedExtra](./openai-client-stream/src/main/scala/io/cequence/openaiscala/service/OpenAIServiceStreamedExtra.scala) will then be available.
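As a sketch of how the streamed variant can be consumed (assuming `createChatCompletionStreamed` returns an Akka Streams `Source` of partial chat responses, with an implicit `Materializer` in scope):

```scala
import akka.stream.scaladsl.Sink
import io.cequence.openaiscala.domain._

val messages = Seq(MessageSpec(role = ChatRole.User, content = "Write a haiku about AI."))

// print the streamed content chunks as they arrive
service
  .createChatCompletionStreamed(messages = messages)
  .runWith(Sink.foreach { completion =>
    print(completion.choices.headOption.flatMap(_.delta.content).getOrElse(""))
  })
```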
- Via dependency injection (requires `openai-scala-guice` lib)
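A minimal sketch of the injected usage (it assumes a Guice module that binds `OpenAIService`; the `bestJoke` helper is purely illustrative):

```scala
import io.cequence.openaiscala.service.OpenAIService
import javax.inject.Inject

import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.Future

class MyClass @Inject() (openAIService: OpenAIService) {

  // illustrative helper; returns the first completion text
  def bestJoke: Future[String] =
    openAIService
      .createCompletion("Tell me a joke about Scala.")
      .map(_.choices.head.text)
}
```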
- Create chat completion for functions
```scala
val messages = Seq(
  ...
)
```

This extension of the standard chat completion is currently supported by the following models: `gpt-3.5-turbo-0613` (default), `gpt-3.5-turbo-16k-0613`, `gpt-4-0613`, and `gpt-4-32k-0613`.
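Since the example above is truncated, here is a hedged sketch of what a function-call request can look like (the `FunMessageSpec`/`FunctionSpec` classes and the `createChatFunCompletion` call are assumed from this lib's 0.4.x API; `get_current_weather` is an illustrative function name):

```scala
import io.cequence.openaiscala.domain._

import scala.concurrent.ExecutionContext.Implicits.global

val messages = Seq(
  FunMessageSpec(role = ChatRole.User, content = Some("What's the weather like in Boston?"))
)

// a single callable function, described with a JSON-schema-like parameter map
val functions = Seq(
  FunctionSpec(
    name = "get_current_weather", // illustrative name
    description = Some("Get the current weather in a given location"),
    parameters = Map(
      "type" -> "object",
      "properties" -> Map(
        "location" -> Map(
          "type" -> "string",
          "description" -> "The city and state, e.g. San Francisco, CA"
        )
      ),
      "required" -> Seq("location")
    )
  )
)

service
  .createChatFunCompletion(messages = messages, functions = functions)
  .map { response =>
    val functionCall = response.choices.head.message.function_call

    println("function call name     : " + functionCall.map(_.name).getOrElse("N/A"))
    println("function call arguments: " + functionCall.map(_.arguments).getOrElse("N/A"))
  }
```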
**✔️ Important Note**: After you are done using the service, you should close it by calling `service.close`. Otherwise, the underlying resources/threads won't be released.
**III. Using multiple services**
- Load distribution with `OpenAIMultiServiceAdapter` - _round robin_ (_rotation_) type
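A hedged sketch of the round-robin setup (the adapter's factory-method name, here `ofRoundRobinType`, is assumed; consult `OpenAIMultiServiceAdapter` for the exact API):

```scala
import io.cequence.openaiscala.service.{OpenAIMultiServiceAdapter, OpenAIServiceFactory}

val service1 = OpenAIServiceFactory(apiKey = "your_api_key1")
val service2 = OpenAIServiceFactory(apiKey = "your_api_key2")

// requests are dispatched to the underlying services in rotation
val service = OpenAIMultiServiceAdapter.ofRoundRobinType(service1, service2)
```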
This module provides the actual meat, i.e. the WS client implementation ([OpenAIServiceImpl and OpenAIServiceFactory](./src/main/scala/io/cequence/openaiscala/service/OpenAIServiceImpl.scala)).
Note that the full project documentation can be found [here](../README.md).
The currently supported Scala versions are **2.12, 2.13**, and **3**.
To pull the library, add the following dependency to your *build.sbt*:
```
"io.cequence" %% "openai-scala-client" % "0.4.1"
```
or to *pom.xml* (if you use Maven):
This is the core module, which contains mostly domain classes and the [OpenAIService](./src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala) definition.
Note that the full project documentation can be found [here](../README.md).
The currently supported Scala versions are **2.12, 2.13**, and **3**.
To pull the library, add the following dependency to your *build.sbt*:
```
"io.cequence" %% "openai-scala-core" % "0.4.1"
```
or to *pom.xml* (if you use Maven):