Commit 28aa45e (parent 81d32da)

Version 1.1.2

File tree: 3 files changed, +11 −11 lines

README.md

Lines changed: 8 additions & 8 deletions
@@ -1,5 +1,5 @@
 # OpenAI Scala Client 🤖
-[![version](https://img.shields.io/badge/version-1.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd) ![GitHub CI](https://github.com/cequence-io/openai-scala-client/actions/workflows/continuous-integration.yml/badge.svg)
+[![version](https://img.shields.io/badge/version-1.1.2-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd) ![GitHub CI](https://github.com/cequence-io/openai-scala-client/actions/workflows/continuous-integration.yml/badge.svg)
 
 This is a no-nonsense async Scala client for OpenAI API supporting all the available endpoints and params **including streaming**, the newest **chat completion**, **vision**, and **voice routines** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:

@@ -35,16 +35,16 @@ Also, we aimed the lib to be self-contained with the fewest dependencies possibl
 In addition to the OpenAI API, this library also supports API-compatible providers (see [examples](./openai-examples/src/main/scala/io/cequence/openaiscala/examples/nonopenai)) such as:
 - [Azure OpenAI](https://azure.microsoft.com/en-us/products/ai-services/openai-service) - cloud-based, utilizes OpenAI models but with lower latency
 - [Azure AI](https://azure.microsoft.com/en-us/products/ai-studio) - cloud-based, offers a vast selection of open-source models
-- [Anthropic](https://www.anthropic.com/api) - cloud-based, a major competitor to OpenAI, features proprietary/closed-source models such as Claude3 - Haiku, Sonnet, and Opus. 🔥 **New**: now with cache support!
+- [Anthropic](https://www.anthropic.com/api) - cloud-based, a major competitor to OpenAI, features proprietary/closed-source models such as Claude3 - Haiku, Sonnet, and Opus. 🔥 **New**: now also through Bedrock!
 - [Google Vertex AI](https://cloud.google.com/vertex-ai) - cloud-based, features proprietary/closed-source models such as Gemini 1.5 Pro and flash
 - [Groq](https://wow.groq.com/) - cloud-based provider, known for its superfast inference with LPUs
-- [Grok](https://x.ai/) (🔥 **New**) - cloud-based provider from x.AI
+- [Grok](https://x.ai/) - cloud-based provider from x.AI
 - [Fireworks AI](https://fireworks.ai/) - cloud-based provider
 - [OctoAI](https://octo.ai/) - cloud-based provider
 - [TogetherAI](https://www.together.ai/) - cloud-based provider
 - [Cerebras](https://cerebras.ai/) - cloud-based provider, superfast (akin to Groq)
 - [Mistral](https://mistral.ai/) - cloud-based, leading open-source LLM company
-- [Deepseek](https://deepseek.com/) (🔥 **New**) - cloud-based provider from China
+- [Deepseek](https://deepseek.com/) - cloud-based provider from China
 - [Ollama](https://ollama.com/) - runs locally, serves as an umbrella for open-source LLMs including LLaMA3, dbrx, and Command-R
 - [FastChat](https://github.com/lm-sys/FastChat) - runs locally, serves as an umbrella for open-source LLMs such as Vicuna, Alpaca, and FastChat-T5

@@ -63,7 +63,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To install the library, add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client" % "1.1.1"
+"io.cequence" %% "openai-scala-client" % "1.1.2"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -72,11 +72,11 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-client_2.12</artifactId>
-    <version>1.1.1</version>
+    <version>1.1.2</version>
 </dependency>
 ```
 
-If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "1.1.1"` instead.
+If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "1.1.2"` instead.
 
 ## Config ⚙️
 
@@ -146,7 +146,7 @@ Then you can obtain a service in one of the following ways.
 
 2. [Anthropic](https://www.anthropic.com/api) - requires `openai-scala-anthropic-client` lib and `ANTHROPIC_API_KEY`
    ```scala
-   val service = AnthropicServiceFactory.asOpenAI()
+   val service = AnthropicServiceFactory.asOpenAI() // or AnthropicServiceFactory.bedrockAsOpenAI
    ```
 
 3. [Google Vertex AI](https://cloud.google.com/vertex-ai) - requires `openai-scala-google-vertexai-client` lib and `VERTEXAI_LOCATION` + `VERTEXAI_PROJECT_ID`
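For context, the Anthropic adapter returns a service implementing the same `OpenAIService` interface, so the usual chat-completion call works against Claude models. A hedged sketch, with message and settings types assumed from the project's examples (exact identifiers may vary by version):

```scala
import io.cequence.openaiscala.anthropic.service.AnthropicServiceFactory
import io.cequence.openaiscala.domain._
import io.cequence.openaiscala.domain.settings.CreateChatCompletionSettings
import scala.concurrent.ExecutionContext.Implicits.global

// Reads ANTHROPIC_API_KEY from the environment by default.
val service = AnthropicServiceFactory.asOpenAI()

service
  .createChatCompletion(
    messages = Seq(UserMessage("What is the capital of France?")),
    settings = CreateChatCompletionSettings(NonOpenAIModelId.claude_3_haiku_20240307)
  )
  .map(response => println(response.contentHead))
```

The same call shape should apply to the Bedrock-backed variant introduced in this release, with credentials supplied via the AWS environment instead of `ANTHROPIC_API_KEY`.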

build.sbt

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ val scala3 = "3.2.2"
 
 ThisBuild / organization := "io.cequence"
 ThisBuild / scalaVersion := scala212
-ThisBuild / version := "1.1.1"
+ThisBuild / version := "1.1.2"
 ThisBuild / isSnapshot := false
 
 lazy val commonSettings = Seq(

openai-count-tokens/README.md

Lines changed: 2 additions & 2 deletions
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Count tokens [![version](https://img.shields.io/badge/version-1.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Count tokens [![version](https://img.shields.io/badge/version-1.1.2-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This module provides ability for estimating the number of tokens an OpenAI chat completion request will use.
 Note that the full project documentation can be found [here](../README.md).
@@ -21,7 +21,7 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-count-tokens_2.12</artifactId>
-    <version>1.1.1</version>
+    <version>1.1.2</version>
 </dependency>
 ```
