Status: Open
Labels: bug (Something isn't working), enhancement (New feature or request)
Description
When using an Ollama model for RAGAs evaluation metrics, two wrapper functions are required: one for the LLM and another for the embedding model.
This is an interface-compatibility issue: RAGAs metrics expect LLMs to implement the BaseRagasLLM interface, which Ollama models don't natively support. It may be because of Ollama's response schema (`response.content` vs. OpenAI's `choices.message.content`) - kindly verify.
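The schema mismatch described above can be sketched with plain mock objects. Both shapes here are illustrative stand-ins, not the real Ollama or OpenAI SDK classes; a wrapper's job is essentially to normalize them:

```python
from types import SimpleNamespace

# Ollama/LangChain-style chat result: the text sits directly on the message.
ollama_style = SimpleNamespace(content="Paris is the capital of France.")

# OpenAI-style chat completion: the text is nested under choices[0].message.
openai_style = SimpleNamespace(
    choices=[
        SimpleNamespace(
            message=SimpleNamespace(content="Paris is the capital of France.")
        )
    ]
)

def extract_text(response):
    """Normalize both response shapes to plain text, as a wrapper would."""
    if hasattr(response, "choices"):
        return response.choices[0].message.content
    return response.content
```

This is roughly the kind of adaptation the RAGAs wrappers perform, which is why metrics fail when handed a raw Ollama model.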
```python
import ragas.metrics
from ragas.llms import LangchainLLMWrapper
from ragas.embeddings import LangchainEmbeddingsWrapper

# Wrap the Ollama LLM and embedding model so they satisfy the
# interfaces that RAGAs metrics expect.
wrapped_llm = LangchainLLMWrapper(ollama_llm)
wrapped_embeddings = LangchainEmbeddingsWrapper(ollama_embeddings)

ANSWER_CORRECTNESS = ragas.metrics.AnswerCorrectness(
    name="ANSWER_CORRECTNESS",
    weights=[0.90, 0.10],
    llm=wrapped_llm,
    embeddings=wrapped_embeddings,
)
```