Local LLM using Ollama without OpenAI #731
---
Hi all, my goal is to use only local models and not to use any OpenAI models at all. My code looks like this:

```python
from paperqa import Settings, ask

# Define the missing EnvironmentState -> was getting Pydantic errors without it
class EnvironmentState:
    def __init__(self, *args, **kwargs):
        pass

# Rebuild the Settings model -> was getting Pydantic errors without it
Settings.model_rebuild()

local_llm_config = {
    "model_list": [
        {
            "model_name": "ollama/llama3.2",
            "litellm_params": {
                "model": "ollama/llama3.2",
                "api_base": "http://localhost:11434",  # Ollama server API base URL
            },
        }
    ]
}

settings = Settings(
    llm="ollama/llama3.2",
    llm_config=local_llm_config,
    summary_llm="ollama/llama3.2",
    summary_llm_config=local_llm_config,
    embedding="ollama/nomic-embed-text",
    paper_directory="papers",
)

answer = ask(
    "What machine learning algorithms can classify heart failure?",
    settings=settings,
)
print(answer)
```

I wonder how to fully decouple this from OpenAI. Instead, running it gives me:

```
AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: foo. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
Received Model Group=gpt-4o-2024-08-06
Available Model Group Fallbacks=None
```

It seems that the code is still trying to access OpenAI. Many thanks for any hints. Cheers :)
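The `Received Model Group=gpt-4o-2024-08-06` line suggests the remaining OpenAI call comes from paper-qa's agent LLM, which has its own default separate from `llm` and `summary_llm`. Below is a minimal sketch of also routing the agent through Ollama; it assumes an `AgentSettings` class with `agent_llm`/`agent_llm_config` fields in `paperqa.settings`, so verify the names against your installed version:

```python
# Sketch, not verbatim from this thread: route paper-qa's agent LLM to Ollama
# as well, so that no role falls back to an OpenAI default.
from paperqa import Settings
from paperqa.settings import AgentSettings  # assumed location of AgentSettings

local_llm_config = {
    "model_list": [
        {
            "model_name": "ollama/llama3.2",
            "litellm_params": {
                "model": "ollama/llama3.2",
                "api_base": "http://localhost:11434",
            },
        }
    ]
}

settings = Settings(
    llm="ollama/llama3.2",
    llm_config=local_llm_config,
    summary_llm="ollama/llama3.2",
    summary_llm_config=local_llm_config,
    embedding="ollama/nomic-embed-text",
    # Assumed fields: without this, the agent step may keep its OpenAI default.
    agent=AgentSettings(
        agent_llm="ollama/llama3.2",
        agent_llm_config=local_llm_config,
    ),
    paper_directory="papers",
)
```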
---
Hey @lindamathez 🌟, I'm here to help you with your technical issues and questions. Let's tackle this bug together and get everything up and running smoothly! To configure paper-qa to use only local models, every model role needs to point at your Ollama server: `llm`, `summary_llm`, and `embedding`, as well as the agent LLM, which otherwise keeps its OpenAI default. By following these steps, you should be able to use paper-qa without any OpenAI API key.
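As a first step, it can also help to confirm that litellm can reach the Ollama server at all, independent of paper-qa. A small check, assuming litellm's `completion` API and the default Ollama port:

```python
# Sanity check: call the local Ollama model directly through litellm,
# bypassing paper-qa, to confirm the api_base is reachable.
from litellm import completion

response = completion(
    model="ollama/llama3.2",
    api_base="http://localhost:11434",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(response.choices[0].message.content)
```

If this call fails, the problem is the Ollama setup rather than the paper-qa settings.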
---
It finally worked by adding …
Hi @lindamathez, yes, this is fixed by #728 on our current `main` branch. It will be released soon in v5.5.1. In the meantime, you can either:
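Two options consistent with the fix living on `main` (an inference, not the verbatim list from the reply): install directly from the repository with `pip install "paper-qa @ git+https://github.com/Future-House/paper-qa"`, or wait for the v5.5.1 release and upgrade with `pip install -U paper-qa`.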