From d06c5580532e9ce7fceec6d0e666735e8a58a369 Mon Sep 17 00:00:00 2001
From: Hause Lin
Date: Mon, 26 Aug 2024 12:33:02 -0400
Subject: [PATCH] Update site

---
 README.Rmd   |  6 ++++--
 README.md    | 14 +++++++++++---
 _pkgdown.yml | 10 +++++-----
 3 files changed, 20 insertions(+), 10 deletions(-)

diff --git a/README.Rmd b/README.Rmd
index 2f944ed..393b986 100644
--- a/README.Rmd
+++ b/README.Rmd
@@ -23,6 +23,8 @@ knitr::opts_chunk$set(
 The [Ollama R library](https://hauselin.github.io/ollama-r/) is the easiest way to integrate R with [Ollama](https://ollama.com/), which lets you run language models locally on your own machine. Main site: https://hauselin.github.io/ollama-r/
 
+The library also makes it easy to work with data structures (e.g., conversational/chat histories) that are standard for different LLMs (such as those provided by OpenAI and Anthropic). It also lets you specify different output formats (e.g., dataframes, text/vector, lists) that best suit your needs, allowing easy integration with other libraries/tools and parallelization via the `httr2` library.
+
 To use this R library, ensure the [Ollama](https://ollama.com) app is installed. Ollama can use GPUs for accelerating LLM inference. See [Ollama GPU documentation](https://github.com/ollama/ollama/blob/main/docs/gpu.md) for more information.
 
 See [Ollama's Github page](https://github.com/ollama/ollama) for more information. This library uses the [Ollama REST API (see documentation for details)](https://github.com/ollama/ollama/blob/main/docs/api.md).
@@ -254,7 +256,7 @@ resp_process(resp, "text") # text vector
 # or list_models("text")
 ```
 
-#### Utility/helper functions to format and prepare messages for the `chat()` function
+#### Format and prepare messages for the `chat()` function
 
 Internally, messages are represented as a `list` of many distinct `list` messages. Each list/message object has two elements: `role` (can be `"user"` or `"assistant"` or `"system"`) and `content` (the message text). The example below shows how the messages/lists are presented.
 
@@ -265,7 +267,7 @@ list( # main list containing all the messages
 )
 ```
 
-To simplify the process of creating and managing messages, `ollamar` provides utility/helper functions to format and prepare messages for the `chat()` function.
+To simplify the process of creating and managing messages, `ollamar` provides functions to format and prepare messages for the `chat()` function. These functions also work with other APIs or LLM providers like OpenAI and Anthropic.
 
 - `create_messages()`: create messages to build a chat history
 - `create_message()` creates a chat history with a single message
diff --git a/README.md b/README.md
index 0b6530d..038add7 100644
--- a/README.md
+++ b/README.md
@@ -16,6 +16,13 @@ easiest way to integrate R with [Ollama](https://ollama.com/), which lets
 you run language models locally on your own machine. Main site:
 <https://hauselin.github.io/ollama-r/>
 
+The library also makes it easy to work with data structures (e.g.,
+conversational/chat histories) that are standard for different LLMs
+(such as those provided by OpenAI and Anthropic). It also lets you
+specify different output formats (e.g., dataframes, text/vector, lists)
+that best suit your needs, allowing easy integration with other
+libraries/tools and parallelization via the `httr2` library.
+
 To use this R library, ensure the [Ollama](https://ollama.com) app is
 installed. Ollama can use GPUs for accelerating LLM inference. See
 [Ollama GPU
@@ -293,7 +300,7 @@ resp_process(resp, "text") # text vector
 # or list_models("text")
 ```
 
-#### Utility/helper functions to format and prepare messages for the `chat()` function
+#### Format and prepare messages for the `chat()` function
 
 Internally, messages are represented as a `list` of many distinct `list`
 messages. Each list/message object has two elements: `role` (can be
@@ -308,8 +315,9 @@ list( # main list containing all the messages
 ```
 
 To simplify the process of creating and managing messages, `ollamar`
-provides utility/helper functions to format and prepare messages for the
-`chat()` function.
+provides functions to format and prepare messages for the `chat()`
+function. These functions also work with other APIs or LLM providers
+like OpenAI and Anthropic.
 
 - `create_messages()`: create messages to build a chat history
 - `create_message()` creates a chat history with a single message
diff --git a/_pkgdown.yml b/_pkgdown.yml
index f722064..94818c6 100644
--- a/_pkgdown.yml
+++ b/_pkgdown.yml
@@ -18,7 +18,7 @@ reference:
 - title: Ollamar functions
 
 - subtitle: API endpoints
-  desc: Functions to make calls to the Ollama server/API.
+  desc: Make calls to the Ollama server/API.
   contents:
   - generate
   - chat
@@ -33,8 +33,8 @@
   - embeddings
   - ps
 
-- subtitle: Utility functions
-  desc: Functions to work with the Ollama API.
+- subtitle: API helpers
+  desc: Work with the Ollama API.
   contents:
   - resp_process
   - ohelp
@@ -43,8 +43,8 @@
   - test_connection
   - create_request
 
-- subtitle: Manipulate chat/conversation history
-  desc: Functions to manipulate messages and chat history for `chat()` function.
+- subtitle: Manipulate chat history
+  desc: Manipulate chat history for Ollama and other LLM providers.
   contents:
   - create_messages
   - create_message
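The README text edited in this patch describes chat histories as a list of `role`/`content` lists. As a minimal sketch of that structure (the commented helper calls are assumed from the function names excerpted in this patch — check the ollamar reference for exact signatures):

```r
# A chat history per the README: a list of messages, where each message is
# a list with a "role" ("system", "user", or "assistant") and a "content" string.
messages <- list(
  list(role = "system", content = "You are a helpful assistant."),
  list(role = "user", content = "Tell me about the R language.")
)

# The helpers named in the patch could build the same shape, e.g.
# (signatures assumed, not taken from this patch):
# messages <- create_message("Tell me about the R language.", role = "user")

# The second message is the user turn.
messages[[2]]$role     # "user"
```

Because this is plain base-R data, the same list can be passed to other providers' chat APIs (OpenAI, Anthropic) that expect role/content message arrays, which is the portability the patched README text is claiming.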