Commit

Update urls
hauselin committed Apr 29, 2024
1 parent b27fd54 commit 0013f6e
Showing 2 changed files with 5 additions and 5 deletions.
4 changes: 2 additions & 2 deletions README.Rmd
@@ -16,10 +16,10 @@ knitr::opts_chunk$set(
# Ollama R Library

<!-- badges: start -->
-[![R-CMD-check](https://github.com/hauselin/ollamar/actions/workflows/R-CMD-check.yaml/badge.svg)](https://github.com/hauselin/ollamar/actions/workflows/R-CMD-check.yaml)
+[![R-CMD-check](https://github.com/hauselin/ollama-r/actions/workflows/R-CMD-check.yaml/badge.svg)](https://github.com/hauselin/ollama-r/actions/workflows/R-CMD-check.yaml)
<!-- badges: end -->

-The [Ollama R library](https://hauselin.github.io/ollamar/) provides the easiest way to integrate R with [Ollama](https://ollama.com/), which lets you run language models locally on your own machine. For Ollama Python, see [ollama-python](https://github.com/ollama/ollama-python). You'll need to have the [Ollama](https://ollama.com/) app installed on your computer to use this library.
+The [Ollama R library](https://hauselin.github.io/ollama-r/) provides the easiest way to integrate R with [Ollama](https://ollama.com/), which lets you run language models locally on your own machine. For Ollama Python, see [ollama-python](https://github.com/ollama/ollama-python). You'll need to have the [Ollama](https://ollama.com/) app installed on your computer to use this library.

> Note: You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models.
6 changes: 3 additions & 3 deletions README.md
@@ -5,11 +5,11 @@

<!-- badges: start -->

-[![R-CMD-check](https://github.com/hauselin/ollamar/actions/workflows/R-CMD-check.yaml/badge.svg)](https://github.com/hauselin/ollamar/actions/workflows/R-CMD-check.yaml)
+[![R-CMD-check](https://github.com/hauselin/ollama-r/actions/workflows/R-CMD-check.yaml/badge.svg)](https://github.com/hauselin/ollama-r/actions/workflows/R-CMD-check.yaml)
<!-- badges: end -->

-The [Ollama R library](https://hauselin.github.io/ollamar/) provides the
-easiest way to integrate R with [Ollama](https://ollama.com/), which
+The [Ollama R library](https://hauselin.github.io/ollama-r/) provides
+the easiest way to integrate R with [Ollama](https://ollama.com/), which
lets you run language models locally on your own machine. For Ollama
Python, see [ollama-python](https://github.com/ollama/ollama-python).
You’ll need to have the [Ollama](https://ollama.com/) app installed on