[FEATURE] Add support for red-candle #394

@orangewolf

Description

Scope check

  • This is core LLM communication (not application logic)
  • This benefits most users (not just my use case)
  • This can't be solved in application code with current RubyLLM
  • I read the Contributing Guide

Due diligence

  • I searched existing issues
  • I checked the documentation

What problem does this solve?

We'd like access to in-process, locally running LLMs. These are especially useful for running smaller models inline, or for workloads that would be cost-prohibitive against hosted APIs.

Proposed solution

red-candle lets users run state-of-the-art language models directly from Ruby, with Rust under the hood and hardware acceleration via Metal (Mac) and CUDA (NVIDIA). It leverages the Rust ecosystem, notably Candle and Magnus, to provide a fast and efficient way to run LLMs in Ruby: https://github.com/scientist-labs/red-candle

Why this belongs in RubyLLM

red-candle offers a unique alternative to the existing providers: it runs locally, in process, and with minimal requirements.
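To make the proposal concrete, here is a minimal sketch of what an in-process provider adapter could look like. Everything here is hypothetical: `RedCandleProvider`, the `FakeLocalModel` stand-in, and the `complete` method are illustrative names, not RubyLLM's actual provider interface or red-candle's real API, which would need to be consulted when implementing this.

```ruby
# Stand-in for red-candle's in-process model object (hypothetical;
# the real gem exposes its own loading/generation API).
class FakeLocalModel
  def generate(prompt)
    "echo: #{prompt}"
  end
end

# Minimal adapter sketch: flattens a chat-style message list into a
# single prompt string and returns the local model's completion.
# No network calls are involved -- inference stays in process.
class RedCandleProvider
  def initialize(model)
    @model = model
  end

  def complete(messages)
    prompt = messages.map { |m| "#{m[:role]}: #{m[:content]}" }.join("\n")
    @model.generate(prompt)
  end
end

provider = RedCandleProvider.new(FakeLocalModel.new)
puts provider.complete([{ role: "user", content: "Hello" }])
# prints "echo: user: Hello"
```

The point of the sketch is that an in-process provider can satisfy the same chat-completion shape as the hosted providers, just without API keys or HTTP.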
