
Why LangchainLLMWrapper and LangchainEmbeddingsWrapper are Required for Ollama Models in RAGAS Metrics? #13

@tapasig

Description


While using an Ollama model for RAGAS evaluation metrics, two wrapper classes are required: one for the LLM and another for the embedding model.
This is an interface-compatibility requirement: RAGAS metrics expect LLMs to implement the BaseRagasLLM interface, which Ollama models don't natively support. This may be because of Ollama's response schema (response.content) versus OpenAI's (choices[0].message.content) - kindly verify.

# Ollama models must be wrapped before being passed to a RAGAS metric.
from langchain_ollama import ChatOllama, OllamaEmbeddings
from ragas.llms import LangchainLLMWrapper
from ragas.embeddings import LangchainEmbeddingsWrapper
from ragas.metrics import AnswerCorrectness

# Example Ollama models; substitute whatever models you serve locally.
ollama_llm = ChatOllama(model="llama3")
ollama_embeddings = OllamaEmbeddings(model="nomic-embed-text")

wrapped_llm = LangchainLLMWrapper(ollama_llm)
wrapped_embeddings = LangchainEmbeddingsWrapper(ollama_embeddings)

ANSWER_CORRECTNESS = AnswerCorrectness(
    name="ANSWER_CORRECTNESS",
    weights=[0.90, 0.10],  # [factuality weight, semantic-similarity weight]
    llm=wrapped_llm,
    embeddings=wrapped_embeddings,
)
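The schema mismatch mentioned above can be illustrated with a minimal, self-contained sketch: an adapter normalizes an Ollama/LangChain-style response (a .content attribute) and an OpenAI-style response (choices[0].message.content) to the plain string a metric consumes. All names here are hypothetical illustrations, not RAGAS internals.

```python
from dataclasses import dataclass

@dataclass
class OllamaStyleResponse:
    content: str  # Ollama / LangChain messages expose .content directly

@dataclass
class OpenAIStyleMessage:
    content: str

@dataclass
class OpenAIStyleChoice:
    message: OpenAIStyleMessage

@dataclass
class OpenAIStyleResponse:
    choices: list  # OpenAI exposes choices[0].message.content

def extract_text(response) -> str:
    """Normalize either response shape to the plain string a metric needs."""
    if hasattr(response, "choices"):
        return response.choices[0].message.content
    return response.content

# Both shapes normalize to the same string:
assert extract_text(OllamaStyleResponse("42")) == "42"
assert extract_text(OpenAIStyleResponse([OpenAIStyleChoice(OpenAIStyleMessage("42"))])) == "42"
```

This is the kind of per-provider normalization LangchainLLMWrapper performs, which is why passing a raw Ollama client to a metric fails.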

Metadata

Labels

bug (Something isn't working), enhancement (New feature or request)

Status

Todo

Milestone

No milestone
