
Commit 995b650

Version 0.4.1
1 parent 79abe97 commit 995b650

6 files changed (+57, -27 lines)

README.md

Lines changed: 44 additions & 14 deletions
@@ -1,5 +1,5 @@
 # OpenAI Scala Client 🤖
-[![version](https://img.shields.io/badge/version-0.4.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd) ![GitHub CI](https://github.com/cequence-io/openai-scala-client/actions/workflows/continuous-integration.yml/badge.svg)
+[![version](https://img.shields.io/badge/version-0.4.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd) ![GitHub CI](https://github.com/cequence-io/openai-scala-client/actions/workflows/continuous-integration.yml/badge.svg)
 
 This is a no-nonsense async Scala client for OpenAI API supporting all the available endpoints and params **including streaming**, the newest **ChatGPT completion**, and **voice routines** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:
 
@@ -17,11 +17,13 @@ This is a no-nonsense async Scala client for OpenAI API supporting all the avail
 Note that in order to be consistent with the OpenAI API naming, the service function names match exactly the API endpoint titles/descriptions with camelcase.
 Also, we aimed the lib to be self-contained with the fewest dependencies possible therefore we ended up using only two libs `play-ahc-ws-standalone` and `play-ws-standalone-json` (at the top level). Additionally, if dependency injection is required we use `scala-guice` lib as well.
 
-**✔️ Important**: this is a "community-maintained" library and, as such, has no relation to OpenAI company.
+🔥 **New**: This lib also supports "OpenAI-API-compatible" providers such as [FastChat](https://github.com/lm-sys/FastChat), [Azure](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference), or any other service with a custom URL. Check the examples below for more details.
 
 👉 Check out an article about the lib/client on [Medium](https://medium.com/@0xbnd/openai-scala-client-is-out-d7577de934ad).
 
-**🔥 New**: Try out also our [Scala client for Pinecone vector database](https://github.com/cequence-io/pinecone-scala), or use both clients together! [This demo project](https://github.com/cequence-io/pinecone-openai-scala-demo) shows how to generate and store OpenAI embeddings (with `text-embedding-ada-002` model) into Pinecone and query them afterward. The OpenAI + Pinecone combo is commonly used for autonomous AI agents, such as [babyAGI](https://github.com/yoheinakajima/babyagi) and [AutoGPT](https://github.com/Significant-Gravitas/Auto-GPT).
+Also try our [Scala client for Pinecone vector database](https://github.com/cequence-io/pinecone-scala), or use both clients together! [This demo project](https://github.com/cequence-io/pinecone-openai-scala-demo) shows how to generate and store OpenAI embeddings (with the `text-embedding-ada-002` model) in Pinecone and query them afterward. The OpenAI + Pinecone combo is commonly used for autonomous AI agents, such as [babyAGI](https://github.com/yoheinakajima/babyagi) and [AutoGPT](https://github.com/Significant-Gravitas/Auto-GPT).
+
+**✔️ Important**: this is a "community-maintained" library and, as such, has no relation to the OpenAI company.
 
2628
## Installation 🚀
2729

@@ -30,7 +32,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client" % "0.4.0"
+"io.cequence" %% "openai-scala-client" % "0.4.1"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -39,11 +41,11 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-client_2.12</artifactId>
-    <version>0.4.0</version>
+    <version>0.4.1</version>
 </dependency>
 ```
 
-If you want a streaming support use `"io.cequence" %% "openai-scala-client-stream" % "0.4.0"` instead.
+If you want streaming support use `"io.cequence" %% "openai-scala-client-stream" % "0.4.1"` instead.
 
 ## Config ⚙️
 
@@ -83,6 +85,34 @@ Then you can obtain a service in one of the following ways.
   )
 ```
 
+- Minimal `OpenAICoreService` supporting `listModels`, `createCompletion`, `createChatCompletion`, and `createEmbeddings` calls - e.g. a [FastChat](https://github.com/lm-sys/FastChat) service running on port 8000 (🔥 new)
+
+```scala
+val service = OpenAICoreServiceFactory("http://localhost:8000/v1/")
+```
+
+- For Azure with API Key (🔥 new)
+
+```scala
+val service = OpenAIServiceFactory.forAzureWithApiKey(
+  resourceName = "your-resource-name",
+  deploymentId = "your-deployment-id",
+  apiVersion = "2023-05-15", // example
+  apiKey = "your_api_key"
+)
+```
+
+- For Azure with Access Token (🔥 new)
+
+```scala
+val service = OpenAIServiceFactory.forAzureWithAccessToken(
+  resourceName = "your-resource-name",
+  deploymentId = "your-deployment-id",
+  apiVersion = "2023-05-15", // example
+  accessToken = "your_access_token"
+)
+```
+
 **✔️ Important**: If you want streaming support use `OpenAIServiceStreamedFactory` from `openai-scala-client-stream` lib instead of `OpenAIServiceFactory` (in the three examples above). Three additional functions - `createCompletionStreamed`, `createChatCompletionStreamed`, and `listFineTuneEventsStreamed` provided by [OpenAIServiceStreamedExtra](./openai-client-stream/src/main/scala/io/cequence/openaiscala/service/OpenAIServiceStreamedExtra.scala) will then be available.
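Each factory variant in this hunk yields a service handle. As a hedged sketch of how such a handle might then be used — only `listModels` and `service.close` are named in this README; the result handling shown here is an assumption:

```scala
import scala.concurrent.ExecutionContext.Implicits.global

// `service` is assumed to be an instance created by one of the factories
// above, e.g. OpenAICoreServiceFactory("http://localhost:8000/v1/")
service.listModels.map { models =>
  models.foreach(println) // inspect what the provider exposes
  service.close           // release the underlying resources when done
}
```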
 
 - Via dependency injection (requires `openai-scala-guice` lib)
@@ -194,7 +224,7 @@ For this to work you need to use `OpenAIServiceStreamedFactory` from `openai-sca
 }
 ```
 
-- Create chat completion for functions (🔥 new)
+- Create chat completion for functions
 
 ```scala
 val messages = Seq(
@@ -242,9 +272,9 @@ This extension of the standard chat completion is currently supported by the fol
 - `gpt-3.5-turbo-0613` (default), `gpt-3.5-turbo-16k-0613`, `gpt-4-0613`, and `gpt-4-32k-0613`.
 
 
-**✔️ Important Note**: After you are done using the service, you should close it by calling (🔥 new) `service.close`. Otherwise, the underlying resources/threads won't be released.
+**✔️ Important Note**: After you are done using the service, you should close it by calling `service.close`. Otherwise, the underlying resources/threads won't be released.
 
-**III. Using multiple services (🔥 new)**
+**III. Using multiple services**
 
 - Load distribution with `OpenAIMultiServiceAdapter` - _round robin_ (_rotation_) type
@@ -289,10 +319,10 @@ import scala.concurrent.duration.DurationInt
 import scala.concurrent.{ExecutionContext, Future}
 
 class MyCompletionService @Inject() (
-    val actorSystem: ActorSystem,
-    implicit val ec: ExecutionContext,
-    implicit val scheduler: Scheduler
-)(val apiKey: String)
+  val actorSystem: ActorSystem,
+  implicit val ec: ExecutionContext,
+  implicit val scheduler: Scheduler
+)(val apiKey: String)
     extends RetryHelpers {
   val service: OpenAIService = OpenAIServiceFactory(apiKey)
   implicit val retrySettings: RetrySettings =
@@ -315,7 +345,7 @@ class MyCompletionService @Inject() (
 val serviceAux = ... // your service
 
 implicit val retrySettings: RetrySettings =
-  RetrySettings(maxAttempts = 10).constantInterval(10.seconds)
+  RetrySettings(maxRetries = 10).constantInterval(10.seconds)
 // wrap it with the retry adapter
 val service = OpenAIRetryServiceAdapter(serviceAux)
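The hunk above renames `maxAttempts` to `maxRetries`. The underlying retry idea can be sketched in plain Scala — illustrative only; the real `OpenAIRetryServiceAdapter` also waits between attempts (e.g. a constant 10-second interval), which is omitted here:

```scala
import scala.concurrent.{ExecutionContext, Future}

// Retry a Future-producing action up to `maxRetries` additional times,
// re-running it whenever the previous attempt failed.
def retry[T](maxRetries: Int)(action: () => Future[T])(
    implicit ec: ExecutionContext
): Future[T] =
  action().recoverWith {
    case _ if maxRetries > 0 => retry(maxRetries - 1)(action)
  }
```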
build.sbt

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ val scala3 = "3.2.2"
 
 ThisBuild / organization := "io.cequence"
 ThisBuild / scalaVersion := scala212
-ThisBuild / version := "0.4.0"
+ThisBuild / version := "0.4.1"
 ThisBuild / isSnapshot := false
 
 lazy val commonSettings = Seq(

openai-client-stream/README.md

Lines changed: 3 additions & 3 deletions
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Stream Support [![version](https://img.shields.io/badge/version-0.4.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Stream Support [![version](https://img.shields.io/badge/version-0.4.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This module provides streaming support for the client. Note that the full project documentation can be found [here](../README.md).
 
@@ -9,7 +9,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client-stream" % "0.4.0"
+"io.cequence" %% "openai-scala-client-stream" % "0.4.1"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -18,6 +18,6 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-client-stream_2.12</artifactId>
-    <version>0.4.0</version>
+    <version>0.4.1</version>
 </dependency>
 ```

openai-client/README.md

Lines changed: 3 additions & 3 deletions
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Client [![version](https://img.shields.io/badge/version-0.4.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Client [![version](https://img.shields.io/badge/version-0.4.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This module provides the actual meat, i.e. WS client implementation ([OpenAIServiceImpl and OpenAIServiceFactory](./src/main/scala/io/cequence/openaiscala/service/OpenAIServiceImpl.scala)).
 Note that the full project documentation can be found [here](../README.md).
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client" % "0.4.0"
+"io.cequence" %% "openai-scala-client" % "0.4.1"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-client_2.12</artifactId>
-    <version>0.4.0</version>
+    <version>0.4.1</version>
 </dependency>
 ```

openai-core/README.md

Lines changed: 3 additions & 3 deletions
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Core [![version](https://img.shields.io/badge/version-0.4.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Core [![version](https://img.shields.io/badge/version-0.4.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This is the core module, which contains mostly domain classes and the [OpenAIService](./src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala) definition.
 Note that the full project documentation can be found [here](../README.md).
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-core" % "0.4.0"
+"io.cequence" %% "openai-scala-core" % "0.4.1"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-core_2.12</artifactId>
-    <version>0.4.0</version>
+    <version>0.4.1</version>
 </dependency>
 ```

openai-guice/README.md

Lines changed: 3 additions & 3 deletions
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Guice [![version](https://img.shields.io/badge/version-0.4.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Guice [![version](https://img.shields.io/badge/version-0.4.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This module provides dependency injection for the OpenAI Scala client with a help of `Guice` library.
 Note that the full project documentation can be found [here](../README.md).
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-guice" % "0.4.0"
+"io.cequence" %% "openai-scala-guice" % "0.4.1"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-guice_2.12</artifactId>
-    <version>0.4.0</version>
+    <version>0.4.1</version>
 </dependency>
 ```
