From 46c582a03a04ff5504d69ed86f4d698ef0b20444 Mon Sep 17 00:00:00 2001
From: Hause Lin
Date: Sat, 24 Aug 2024 15:18:47 -0400
Subject: [PATCH] Update paper

---
 paper/paper.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/paper/paper.md b/paper/paper.md
index da2bd8a..cbbdcf4 100644
--- a/paper/paper.md
+++ b/paper/paper.md
@@ -28,17 +28,17 @@ Large language models (LLMs) have transformed natural language processing and AI
 The increasing importance of LLMs in various fields has created a demand for accessible tools that allow researchers and practitioners to leverage LLMs within their preferred programming environments. Locally deployed LLMs offer advantages in terms of data privacy, security, and customization, making them an attractive option for many users [@Chan2024Aug; @Liu2024Aug; @Lytvyn2024Jun; @Shostack2024Mar]. However, the lack of native R libraries for interfacing with locally deployed LLMs has limited the accessibility of these models to R users, even though R is a popular and crucial tool in statistics, data science, and various research domains [@Hill2024May; @Turner2024Aug]. `ollamar` fills a critical gap in the R ecosystem by providing a native interface to run locally deployed LLMs.

-The `ollamar` R library is a package designed to integrate R with Ollama, allowing users to run large language models locally on their own machines. Although alternative R libraries exist [@Gruber2024Apr], `ollamar` distinguishes itself the features described below.
+The `ollamar` R library is a package that integrates R with Ollama, allowing users to run large language models locally on their machines. Although alternative R libraries exist [@Gruber2024Apr], `ollamar` distinguishes itself through the features described below.

-**User-friendly API wrapper**: It provides an interface to Ollama server and all API endpoints, following closely the official API design. This design makes it easy for R users to understand how other similar libraries (such as in Python and JavaScript) work, while also allowing users familiar with other programming languages to quickly adapt to and use this library. The consistent API structure across languages facilitates seamless transitions and knowledge transfer for developers working in multi-language environments.
+**User-friendly API wrapper**: It provides an interface to the Ollama server and all API endpoints, closely following the official API design. This design makes it easy for R users to understand how similar libraries (such as in Python and JavaScript) work while allowing users familiar with other programming languages to adapt to and use this library quickly. The consistent API structure across languages facilitates seamless transitions and knowledge transfer for developers working in multi-language environments.

-**Consistent and flexible output formats**: All functions that call API endpoints return `httr2::httr2_response` objects by default, but users can specify different output formats, such as dataframes (`"df"`), lists (of JSON objects) (`"jsonlist"`), raw strings (`"raw"`), text vectors (`"text"`), or request objects (`"req"`). This flexibility greatly enhances the usability and versatility of the library. Users can choose the format that best suits their needs, such as when working with different data structures or integrating the output with other R packages or allowing parallelization via the `httr2` library.
+**Consistent and flexible output formats**: All functions that call API endpoints return `httr2::httr2_response` objects by default, but users can specify different output formats, such as dataframes (`"df"`), lists (of JSON objects) (`"jsonlist"`), raw strings (`"raw"`), text vectors (`"text"`), or request objects (`"req"`). This flexibility greatly enhances the usability and versatility of the library. Users can choose the format that best suits their needs, such as when working with different data structures, integrating the output with other R packages, or allowing parallelization via the `httr2` library.

-**Utility functions for managing conversation history**: LLM APIs often expect conversational or chat history data as input, which are often nested lists or JSON objects. Note that this data format is a common for chat-based applications and APIs (not limited to Ollama), such as those provided by OpenAI and Anthropic. `ollamar` provides helper functions to simplify the process of preparing and processing conversational data for input to different LLMs, streamlining the workflow for chat-based applications.
+**Utility functions for managing conversation history**: LLM APIs often expect conversational or chat history data as input, often nested lists or JSON objects. Note that this data format is standard for chat-based applications and APIs (not limited to Ollama), such as those provided by OpenAI and Anthropic. `ollamar` provides helper functions to simplify preparing and processing conversational data for input to different LLMs, streamlining the workflow for chat-based applications.

 # Conclusion

-`ollamar` bridges a crucial gap in the R ecosystem by providing seamless access to large language models through Ollama. Its user-friendly API, flexible output formats, and conversation management utilities enable R users to easily integrate LLMs into their workflows. This library empowers researchers and data scientists across various disciplines to leverage the power of locally-deployed LLMs, potentially accelerating research and development in fields relying on R for data analysis and machine learning.
+`ollamar` bridges a crucial gap in the R ecosystem by providing seamless access to large language models through Ollama. Its user-friendly API, flexible output formats, and conversation management utilities enable R users to integrate LLMs into their workflows easily. This library empowers researchers and data scientists across various disciplines to leverage the power of locally deployed LLMs, potentially accelerating research and development in fields relying on R for data analysis and machine learning.

 # Acknowledgements
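To make the revised "output formats" paragraph concrete for reviewers, here is a minimal R sketch (not part of the patch). It assumes a locally running Ollama server, a model that has already been pulled (named "llama3.1" here for illustration), and the `ollamar` functions `test_connection()`, `generate()`, and `resp_process()`; exact signatures should be checked against the installed version of the package.

```r
# Minimal sketch of ollamar's flexible output formats.
# Assumes Ollama is running locally and a "llama3.1" model has been pulled.
library(ollamar)

test_connection()  # verify the local Ollama server is reachable

# Default: calls return an httr2 response object.
resp <- generate("llama3.1", "Name one use of R in statistics.")

# The same call can return other formats via the `output` argument.
df  <- generate("llama3.1", "Name one use of R in statistics.", output = "df")
txt <- generate("llama3.1", "Name one use of R in statistics.", output = "text")

# A stored response can also be converted after the fact.
resp_process(resp, "text")
```

Returning `httr2` response objects by default (or request objects with `output = "req"`) is what allows the hand-off to `httr2` tooling, including the parallelization the paragraph mentions.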
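Similarly, a brief sketch of the conversation-history data structure that the "utility functions" paragraph refers to. The nested list is built by hand here for clarity; `ollamar`'s helper functions (for example `create_message()`) wrap this pattern, but their signatures are not shown in the patch, so treat the call below as illustrative.

```r
# Sketch of the chat-history format expected by chat endpoints:
# a list of role/content messages, as used by OpenAI- and Anthropic-style APIs.
library(ollamar)

messages <- list(
  list(role = "system", content = "You are a concise assistant."),
  list(role = "user",   content = "What does the R function lm() do?")
)

# Send the history to a locally pulled model and return the reply as text.
chat("llama3.1", messages, output = "text")
```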