This is a no-nonsense async Scala client for the OpenAI API supporting all available endpoints and params **including streaming**, the newest **ChatGPT completion**, and **voice routines** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:
@@ -30,7 +30,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client" % "0.3.3"
+"io.cequence" %% "openai-scala-client" % "0.4.0"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -39,11 +39,11 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-client_2.12</artifactId>
-    <version>0.3.3</version>
+    <version>0.4.0</version>
 </dependency>
 ```
 
-If you want a streaming support use `"io.cequence" %% "openai-scala-client-stream" % "0.3.3"` instead.
+If you want streaming support use `"io.cequence" %% "openai-scala-client-stream" % "0.4.0"` instead.
 
 ## Config ⚙️
@@ -224,7 +224,7 @@ For this to work you need to use `OpenAIServiceStreamedFactory` from `openai-scala-client-stream`
 )
 
 // if we want to force the model to use the above function as a response
-// we can do so by passing: responseFunctionName = Some("set_current_location")
+// we can do so by passing: responseFunctionName = Some("get_current_weather")
 service.createChatFunCompletion(
   messages = messages,
   functions = functions,
@@ -257,7 +257,7 @@ This extension of the standard chat completion is currently supported by the following
 
 service.listModels.map { models =>
   models.foreach(println)
-  service.close
+  service.close()
 }
 ```
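The `service.close` → `service.close()` change above follows the standard Scala convention that side-effecting methods are declared and invoked with empty parentheses, while pure accessors omit them; Scala 3 also enforces that a method declared with `()` is called with `()`. A minimal, self-contained sketch of the convention (the `FakeService` class below is a stand-in for illustration, not part of the openai-scala-client library):

```scala
// Stand-in class illustrating the empty-paren convention behind the
// `service.close` -> `service.close()` change; not the library's API.
class FakeService {
  private var open = true
  def isOpen: Boolean = open           // pure accessor: declared without parens
  def close(): Unit = { open = false } // side-effecting: declared with ()
}

object Demo extends App {
  val s = new FakeService
  s.close()         // call with parens, matching the declaration
  println(s.isOpen) // prints "false"
}
```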
@@ -272,7 +272,7 @@ This extension of the standard chat completion is currently supported by the following

This module provides the actual meat, i.e. the WS client implementation ([OpenAIServiceImpl and OpenAIServiceFactory](./src/main/scala/io/cequence/openaiscala/service/OpenAIServiceImpl.scala)).

Note that the full project documentation can be found [here](../README.md).
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client" % "0.3.3"
+"io.cequence" %% "openai-scala-client" % "0.4.0"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@ or to *pom.xml* (if you use maven)
This is the core module, which contains mostly domain classes and the [OpenAIService](./src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala) definition.

Note that the full project documentation can be found [here](../README.md).
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-core" % "0.3.3"
+"io.cequence" %% "openai-scala-core" % "0.4.0"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@ or to *pom.xml* (if you use maven)